Mistakes regarding proofs

I had a teacher who would repeat, many times and in different classes, "If you write 0 = 0, I'm going to give you a zero on the exam!"

Suppose we want to prove that [math]\displaystyle{ a + b = c + d }[/math]. We begin by saying that [math]\displaystyle{ a + b = x }[/math]. Then we say that [math]\displaystyle{ c + d = x }[/math]. Therefore [math]\displaystyle{ x = x \iff a + b = c + d }[/math]. Eureka! Except that we didn't prove anything. What is wrong with this reasoning? The mistake is that we assumed the very equation we wanted to prove without knowing whether it holds: calling both sides [math]\displaystyle{ x }[/math] already takes them to be equal. We can't make such assumptions; this is circular reasoning.

That's why in linear algebra and calculus many properties hold only if we impose certain conditions. Other times we are presented with counter-examples, which show that a property is true in some cases but not in all of them. To disprove a general statement, it's enough to exhibit one case in which it fails; to prove a statement by contradiction, we assume it's false and show that this assumption leads to something impossible.
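To see concretely why this kind of reasoning fails, here is a classic worked example (the numbers below are my own illustration, not part of the argument above). Start from a false equation and manipulate it until a true one appears:

[math]\displaystyle{ -1 = 1 \implies (-1)^2 = 1^2 \implies 1 = 1 }[/math]

The final statement is perfectly true, yet the starting equation is false. Squaring both sides is not a reversible step, so ending up with a true statement tells us nothing about what we started with. A chain of manipulations that finishes with [math]\displaystyle{ 0 = 0 }[/math] or [math]\displaystyle{ x = x }[/math] only proves the original equation when every single step is an "if and only if".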

Many proofs are done by finding a contradiction: two statements contradict each other when they cannot both be true, nor both false, at the same time. For most purposes this logic is undeniable; we aren't concerned here with the definition of what can and what cannot be denied. That discussion is well beyond what we learn at the undergraduate level.
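As a standard textbook illustration of proof by contradiction (not taken from the text above): suppose we want to prove that there is no largest natural number. Assume the opposite, that some [math]\displaystyle{ N }[/math] is the largest natural number. But [math]\displaystyle{ N + 1 }[/math] is also a natural number, and [math]\displaystyle{ N + 1 > N }[/math], contradicting the assumption that [math]\displaystyle{ N }[/math] is the largest. Since assuming the statement is false leads to an impossibility, the statement must be true.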