# 1+1=2. This is a true statement. Is there any logical way of making it false?

I've been wondering about certainty and objective truth. Can the equation be made false without changing its components?

## 27 Answers

Er…ho hum? Two is **one** number, not two ones, so those two ones don’t add up unless they merge together like some creepy reversed Siamese plant monster.

No, this equation cannot be made false.

Mathematically speaking it is true. But if you create a story around it, or if it represents something else, then you could probably say it's not true: for example, 1 woman + 1 man = 3 (1 woman, 1 man, 1 baby).

Well. You could presumably redefine the meaning of the symbols and make it false.

As for the logical truth of the *concept*, it is based on some axioms (or unproven assumptions) about natural numbers, see the Peano axioms.

Axioms are logical concepts that are "too simple to prove"; it is assumed that everyone can agree they are true without further justification.
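For anyone curious what "true by construction from the axioms" looks like, here is a toy sketch in Python (the `successor`/`add` names are just illustrative, not any standard library; real Peano arithmetic works on abstract terms, not machine integers):

```python
# Toy Peano-arithmetic sketch: every number is built from zero by
# repeatedly applying a successor function, and addition is defined
# by recursion on the second argument.
def successor(n):
    return n + 1  # stand-in for the abstract successor operation S(n)

def add(a, b):
    # Peano definition: a + 0 = a, and a + S(b) = S(a + b)
    if b == 0:
        return a
    return successor(add(a, b - 1))

one = successor(0)
two = successor(one)
print(add(one, one) == two)  # True: 1 + 1 = 2 follows from the definitions
```

Under this construction, 1+1=2 isn't an observation about the world; it falls out mechanically from how "1", "2", and "+" are defined.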

1 cup oil + 1 cup water = 1.5 cups roughly

Seeing how it's an axiom… no, it can't be, because it is true by definition.

What if you divided both sides by zero? The answer would be undefined. But I suppose that wouldn’t make it false necessarily.

What do you mean? Division by zero is undefined. I was just trying to think of ANY possible way to answer the question.

Truth is a matter of perception. I can say 1+1=2 is not a true statement. That is accurate because you don't know the whole truth, only a partial truth based on your perception. 1 orange + 1 apple = 2 apples is false. You aren't changing the components, simply acknowledging that your perception of them was incomplete.

@cockswain Actually, it’s not undefined because you can *never* divide by zero. It is a consistent rule and simply can’t be done, and is therefore not undefined.

1 half cup of flour + 1 half cup of flour = 1 cup of flour

Adding to the equation other words or symbols is a way of changing the original question.

@Fly I disagree with your logic. I’m viewing it like one would an asymptote in a graph. Do you think that is unreasonable?

@cockswain In regard to asymptotes, the equation itself is not actually dividing by zero. The vertical asymptote of the equation is any number that would make the denominator equal zero, *because* dividing by zero is impossible. The asymptote only exists because one cannot divide by zero. On the graph, it is considered undefined, but algebraically it is still impossible, which is why the asymptote exists. While I disagree with your reasoning, I see where you’re coming from.

With a brain for Math like mine, I could easily make any equation false!

@Fly So back to the original question: is there any possible way that (1 + 1)/0 = 2/0 could be considered true? Would it have to be false if not true? Further, philosophically, what is "undefined"?

If you're a redneck hillbilly's wet dream, then the answer would be 3, or thereabouts.

@cockswain Saw your other question too, which I answered algebraically. If you are considering an asymptote in a graph, it is really a question of divergence. As you take the limit of g(x)=1/x towards the asymptote at 0, the values of g(x) diverge, meaning they become unbounded, or go to infinity if you prefer. Now infinity has conceptual meaning, but it isn't an algebraic value, so you couldn't have an equation such as (1/0)+(1/0)=(3/0). (infinity)+(infinity) doesn't have a meaning, since addition simply isn't defined for infinity.
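A quick numeric sketch of that divergence, for anyone who wants to see it (plain Python, nothing fancy):

```python
# As x shrinks toward 0, g(x) = 1/x grows without bound: there is no
# single value the function approaches, so nothing sensible can be
# assigned at x = 0 itself.
def g(x):
    return 1 / x

for x in [1.0, 0.1, 0.01, 0.001]:
    print(x, g(x))  # the outputs grow without bound as x -> 0

# And Python, like algebra, refuses the operation outright at 0:
try:
    g(0)
except ZeroDivisionError:
    print("1/0 is undefined")
```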

@noodlehead710 Thanks. The more I thought about it, the more I realized I was missing something fundamental about the concept of zero. That other question I started really simplified my view. But in a nutshell, you can't really regard 'infinity' as a value upon which to perform operations. Agreed?

@cockswain Yeah, that’s my take. While infinity is a great concept, I’m always careful when I teach Calculus to emphasize that infinity is just a symbol, not an actual value.

In Binary, 1+1=10

Do I win?

EDIT: Oh darn, didn’t see @ben‘s comment >:|
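In case anyone wants to check the binary claim, Python agrees; note that only the notation changes, not the quantity:

```python
# Binary addition: the bit pattern 1 + 1 carries to 10 in base 2,
# but the underlying value is still two.
a = 0b1
b = 0b1
print(bin(a + b))  # '0b10'
print(a + b == 2)  # True: same number, different numeral
```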

In Boolean Algebra, 1 + 1 = 1.
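That one also checks out, because in Boolean algebra "+" means logical OR, not arithmetic addition. A minimal sketch (the `bool_add` name is just for illustration):

```python
# Boolean algebra: "+" denotes OR on the values {0, 1}, so 1 + 1 = 1.
# The symbols look arithmetic, but the operation is different.
def bool_add(a, b):
    return a | b  # bitwise OR acts as logical OR on 0/1

print(bool_add(1, 1))  # 1, not 2
print(bool_add(0, 1))  # 1
print(bool_add(0, 0))  # 0
```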

Yes, 1+1 is not necessarily 2.

Try fuzzy logic when 1+1=2 might only be a little true.
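To make the fuzzy-logic idea concrete, here's a toy sketch using the standard Zadeh min/max operators for AND and OR; the truth degrees below are made-up example values:

```python
# Fuzzy logic: statements take truth degrees in [0, 1] rather than
# a strict True/False. Zadeh's operators use min for AND, max for OR.
def fuzzy_and(a, b):
    return min(a, b)

def fuzzy_or(a, b):
    return max(a, b)

mostly_true = 0.7   # e.g. "the glass is full" might be 0.7 true
barely_true = 0.2

print(fuzzy_and(mostly_true, barely_true))  # 0.2
print(fuzzy_or(mostly_true, barely_true))   # 0.7
```

So in a fuzzy setting, a statement like "1+1=2" could in principle be assigned a degree of truth rather than being flatly true or false.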
