I don't recall the exact details, but I know I've seen a mathematical demonstration that 1+1=3. It involved writing the original 1+1=2 equation in the form a+b=c and doing some operations on it, and in the end it got to 1+1=3.

Also, I recall (90% certain) that at some point during this demonstration, we have a fraction where the denominator is equal to 0 (something of the form 1/(a-a) or 1/(b-b)).

Anyway, the problem with that is that in my math classes (high school + 2 more years of science), it was stressed repeatedly that all the things we had learned were true only if no fraction involved had 0 as a denominator. Hence you always had to add a disclaimer that a given equation was true as long as *insert denominator's expression* wasn't 0. (Of course, this was in theory; in practice, always adding this disclaimer would have tripled the length of the class and of our papers.)

I'm just wondering, does this restriction go away at some point later down the road, so that 1+1=3 ends up being "true", so to speak, in very advanced math? Or is the whole thing nothing more than flawed reasoning, no matter how you slice it?

posted
I would think that the equation with variables should be a+a=b.

-------------------- "When a stupid man is doing something he is ashamed of, he always declares that it is his duty."--George Bernard Shaw Posts: 19266 | From: Nashville, TN | Registered: Jun 2002
| IP: Logged |

posted
It's been a while since I've taken my Calc classes, but I remember the prof in Calc 2 harping on how anything divided by 0 is "undefined." I was not a math major, but took my fair share as an engineering student. At no time did we accept anything divided by 0 as a value other than 1) undefined or 2) an error (mostly in computer apps).

But 1+1 can = 3 if the value of each "1" is really 1.25: 1.25 and 1.25 give you 2.5, and you can round up to 3.

ETA: Spanked by Doug
Posts: 50 | From: Louisville, KY | Registered: Dec 2005

quote:Originally posted by AnglRdr: I would think that the equation with variables should be a+a=b.

No, the "Proof" that the OP referred to was a+b=c so that the division by zero could be more effectively hidden.

I’m not versed in the mathematics involved, but at “higher” levels, yes, you can get expressions that resemble 1+1=3, involving different planes, geometric shapes, and generally “weird” stuff.
Posts: 287 | From: Sacramento, CA | Registered: Sep 2005

quote:Originally posted by Oualawouzou: Heya there,

I don't recall the exact details, but I know I've seen a mathematical demonstration that 1+1=3. It involved writing the original 1+1=2 equation in the form a+b=c and doing some operations on it, and in the end it got to 1+1=3.

Also, I recall (90% certain) that at some point during this demonstration, we have a fraction where the denominator is equal to 0 (something of the form 1/(a-a) or 1/(b-b)).

a = b (assumed)
a² = ab (multiply both sides by a)
a² - b² = ab - b² (subtract b² from both sides)
(a+b)(a-b) = b(a-b) (factoring)
a+b = b (divide both sides by a-b, which is dividing by zero)

a+a = a (substituting a for b)
2a = a (combining terms)

Finally, dividing both sides by a gives

2 = 1

If you add a to both sides and combine terms on the left hand side before dividing by a, you get 3 = 1 + 1
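As a quick sanity check (my own sketch, not part of the original thread), you can replay the steps in Python with a = b = 1; every line holds until the division, which Python rejects outright:

```python
# Replay the bogus proof with concrete numbers: a = b = 1.
a = b = 1

assert a ** 2 == a * b                     # multiply both sides by a
assert a ** 2 - b ** 2 == a * b - b ** 2   # subtract b² from both sides
assert (a + b) * (a - b) == b * (a - b)    # factor: both sides are just 0

# The fatal step: "divide both sides by a - b" is dividing by zero.
try:
    (a + b) * (a - b) / (a - b)
    step_ok = True
except ZeroDivisionError:
    step_ok = False

assert not step_ok  # the "proof" dies right here
```

Every step before the division is a true statement about zeros; only the cancellation of a-b smuggles in the impossible move.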

quote:Originally posted by Oualawouzou: Anyway, the problem with that is that in my math classes (high school + 2 more years of science), it was stressed repeatedly that all the things we had learned were true only if no fraction involved had 0 as a denominator. Hence you always had to add a disclaimer that a given equation was true as long as *insert denominator's expression* wasn't 0. (Of course, this was in theory; in practice, always adding this disclaimer would have tripled the length of the class and of our papers.)

I'm just wondering, does this restriction go away at some point later down the road, so that 1+1=3 ends up being "true", so to speak, in very advanced math? Or is the whole thing nothing more than flawed reasoning, no matter how you slice it?

Thanks!

Depending on what form of science you took in college, you may have been introduced to calculus and limits. Though you're never allowed to talk about what happens when x=0 in the equation 1/x, you can take the limit of 1/x as x approaches 0, which is usually described as infinity. The closest I can think of where this restriction "goes away" is in the evaluation of indeterminate forms.
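A rough numeric sketch (mine, not the poster's) of what "the limit of 1/x as x approaches 0 is infinity" means in practice: shrink x and watch 1/x grow without bound:

```python
# Take x closer and closer to 0 from the positive side...
xs = [10 ** -n for n in range(1, 7)]   # 0.1, 0.01, ..., 0.000001
recips = [1 / x for x in xs]

# ...and 1/x just keeps growing without bound.
assert recips == sorted(recips)        # strictly increasing
assert recips[-1] > 10 ** 5            # already past 100,000
```

No matter how big a bound you pick, a small enough x pushes 1/x past it; that is all "the limit is infinity" asserts, and it never requires actually plugging in x = 0.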

buf 'but 1 + 1 never actually equals 3' ungla

-------------------- "Pardon him. Theodotus: he is a barbarian, and thinks that the customs of his tribe and island are the laws of nature."

George Bernard Shaw, Caesar and Cleopatra Posts: 4847 | From: Washington, DC | Registered: Jun 2001

quote:Originally posted by TayLow: It's been a while since I've taken my Calc classes, but I remember the prof in Calc 2 harping on how anything divided by 0 is "undefined." I was not a math major, but took my fair share as an engineering student. At no time did we accept anything divided by 0 as a value other than 1) undefined or 2) an error (mostly in computer apps).

But 1+1 can = 3 if the value of each "1" is really 1.25: 1.25 and 1.25 give you 2.5, and you can round up to 3.

ETA: Spanked by Doug

x/0 should be taken as infinity rather than undefined; I seem to remember this fact being important for something in calculus...

0/0 is undefined, however, or possibly a paradox (it should be both 1 and infinity at the same time). I also seem to remember 0^0 being a special case too, in that it simply doesn't exist...
Posts: 824 | From: England | Registered: Mar 2005

posted
Just so everyone is clear, what bufungla posted is wrong. Everybody got that? 1+1 is not, and never will be, equal to 3. (I'm sure bufungla knows it is wrong.)

The division step conflicts with the statement of the problem, so the proof fails.

Calculus does not enter into the problem; this is algebra. What calculus says about limits and the like is totally irrelevant.

So the proof goes like this:

a = b (assumed)
a² = ab (multiply both sides by a)
a² - b² = ab - b² (subtract b² from both sides)
(a+b)(a-b) = b(a-b) (factoring)
a+b = b (divide both sides by a-b, which is dividing by zero)

The proof fails here, since a-b = 0 because a = b.
Posts: 629 | From: Greenwood, IN | Registered: Dec 2005

quote:Originally posted by abigsmurf: x/0 should be taken as infinity rather than undefined; I seem to remember this fact being important for something in calculus...

0/0 is undefined, however, or possibly a paradox (it should be both 1 and infinity at the same time). I also seem to remember 0^0 being a special case too, in that it simply doesn't exist...

Anything divided by zero is undefined. (You can take the limit of, say, 1/x as x goes to zero, but this just means "let x be a very very small number and see what happens.")

0^0 is undefined as well, for similar reasons.

The problems here arise because zero is not what's called a unit: there is no number you can multiply zero by to get 1. If you allow fractions, every number you've ever seen is a unit, except for zero. It turns out that you can construct algebraic systems in which not everything is a unit, and this is where modern (i.e. "abstract") algebra comes from.

Remember, division is really just multiplication by a reciprocal. And the reciprocal of a number is its multiplicative inverse; that is, the number you multiply it by to get 1. And zero isn't a unit, so it doesn't have an inverse, so you can't multiply by its reciprocal since it doesn't have one, so you can't divide by zero.
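To make the "no reciprocal" point concrete, here's a small sketch using Python's exact fractions (my illustration, not part of the post): every nonzero number passes the unit test, and zero alone fails it:

```python
from fractions import Fraction

# A unit is a number n with a reciprocal r such that n * r == 1.
for n in [Fraction(2), Fraction(-3, 7), Fraction(1, 100)]:
    r = 1 / n              # the multiplicative inverse
    assert n * r == 1      # every nonzero fraction is a unit

# Zero is the lone exception: asking for its reciprocal *is* dividing by zero.
try:
    1 / Fraction(0)
    zero_is_unit = True
except ZeroDivisionError:
    zero_is_unit = False

assert not zero_is_unit
```

Exact rationals make the point cleanly: division is only ever multiplication by an inverse that actually exists.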
Posts: 5 | From: Tallahassee, FL | Registered: Sep 2005

posted
In simple terms, this little exercise is just a mathematical trick. It's the sort of thing a math teacher might use as a joke or to make a point about why it's important to be careful with algebra.

If you take away all the fancy stuff, what the whole thing basically boils down to is this:

1 x 0 = 2 x 0

This part is true. One times zero is zero, and two times zero is also zero. They are equal. But if we try to divide by zero (which is a big no-no), we get:

1 = 2

Obviously wrong. It's just a proof of why you can't divide by zero. But since it uses letters instead of numbers, it's easy to miss the trick.

posted
I remember once taking a calculator and punching in 1 divided by 3. Result: .333333333333333 -- which I then multiplied by 3 and got .999999999999999

So yay, I proved that 1 doesn't = 1

I can't quite make 1 + 1 = 3, tho... have fun!
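For what it's worth, you can reproduce that calculator's behavior with Python's decimal module by capping precision at 15 digits (a sketch of mine; the 15-digit display is an assumption about the calculator):

```python
from decimal import Decimal, getcontext

getcontext().prec = 15            # behave like a 15-digit calculator

third = Decimal(1) / Decimal(3)
assert str(third) == "0.333333333333333"

back = third * 3                  # the dropped digits never come back
assert str(back) == "0.999999999999999"

# Python's own binary floats happen to round the error away again:
assert (1 / 3) * 3 == 1.0
```

The calculator isn't proving 1 ≠ 1; it just threw away the infinite tail of 3s the moment it displayed the quotient.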

-------------------- They just don't make crazed, beserk robots like they used to. --Sheen Estevez, Jimmy Neutron, Boy Genius

If I manage to post something swipe-worthy that you would like to make your sig, you may do so with my blessing. Posts: 2486 | From: East Stroudsburg, PA | Registered: Oct 2005

quote:Originally posted by TayLow: It's been a while since I've taken my Calc classes, but I remember the prof in Calc 2 harping on how anything divided by 0 is "undefined." I was not a math major, but took my fair share as an engineering student. At no time did we accept anything divided by 0 as a value other than 1) undefined or 2) an error (mostly in computer apps).

But 1+1 can = 3 if the value of each "1" is really 1.25: 1.25 and 1.25 give you 2.5, and you can round up to 3.

ETA: Spanked by Doug

x/0 should be taken as infinity rather than undefined; I seem to remember this fact being important for something in calculus...

0/0 is undefined, however, or possibly a paradox (it should be both 1 and infinity at the same time). I also seem to remember 0^0 being a special case too, in that it simply doesn't exist...

No, I believe x/0 is undefined, because the limit approaching 0 from the positive side is positive infinity, but the limit approaching 0 from the negative side is negative infinity. If the one-sided limits are unequal, then the result is undefined. I think it might be valid to say that (x/0)^2 is infinity, because the limit from both directions is positive infinity.
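Here's a quick numeric illustration of those two one-sided limits (my own sketch): 1/x runs off in opposite directions depending on which side of 0 you approach from, while squaring makes both sides agree:

```python
# Approach 0 from the right: 1/x climbs toward +infinity.
right = [1 / x for x in (0.1, 0.01, 0.001, 0.0001)]
assert right == sorted(right) and right[-1] > 0

# Approach 0 from the left: 1/x dives toward -infinity.
left = [1 / x for x in (-0.1, -0.01, -0.001, -0.0001)]
assert left == sorted(left, reverse=True) and left[-1] < 0

# Squaring first sends both sides to +infinity together:
assert (1 / 0.0001) ** 2 > 0 and (1 / -0.0001) ** 2 > 0
```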
Posts: 2018 | From: Santa Barbara, California | Registered: Aug 2005

quote:No, I believe x/0 is undefined, because the limit approaching 0 from the positive side is positive infinity, but the limit approaching 0 from the negative side is negative infinity. If the one-sided limits are unequal, then the result is undefined. I think it might be valid to say that (x/0)^2 is infinity, because the limit from both directions is positive infinity.

I would say even then it isn't valid. The limit can be the same from the positive and negative sides, but the actual value can still be undefined.

For example, take the function f(x) = (x+2)(x-2)/(x-2). The limit of f(x) as x approaches 2 from both sides will equal 4, but on the actual graph of f(x) the point at x = 2 will be undefined, since evaluating it there requires dividing by zero.
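Translating that example into Python (my sketch): the function looks perfectly tame on either side of x = 2, yet evaluating it at exactly 2 still divides by zero:

```python
def f(x):
    return (x + 2) * (x - 2) / (x - 2)

# From either side of 2, f(x) hugs the value 4...
assert abs(f(2.001) - 4) < 0.01
assert abs(f(1.999) - 4) < 0.01

# ...but the graph has a hole at x = 2 itself: the formula divides by zero.
try:
    f(2)
    has_hole = False
except ZeroDivisionError:
    has_hole = True

assert has_hole
```

This is the classic "removable discontinuity": the limit exists and equals 4, but the function itself has no value at x = 2.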
Posts: 417 | From: Escondido, California | Registered: Jun 2004