$\begingroup$

I've been trying to learn naive set theory through a YouTube series by 'The Bright Side of Mathematics'. So far, I've been able to understand successor maps and the definition of $\mathbb{N}_{0}$. I have now gotten to the point where he defines addition. His definition is the following:

$$\text{Addition is a map } \mathbb{N}_{0} \times \mathbb{N}_{0} \rightarrow \mathbb{N}_{0}$$ $$(m,n) \mapsto m + n$$ He also gives the following: $$m+0:=m$$ $$m + s(n) := s(m+n)$$

I am able to understand how he gets these statements, as he works through it quite well in the video. However, to me this seems like circular reasoning. How can you define addition using the $+$ symbol? Isn't that against the purpose of a definition?

Whilst doing further research, I managed to find the following article: How is addition defined?. The answers on it, however, don't make much sense to me as a complete beginner, and I am wondering whether anyone has a definition of addition that doesn't use $+$ (or whether that's even possible)?

$\endgroup$
  • $\begingroup$ It's done recursively. Once you know what $m+n$ means, you can define what $m+(n+1)$ means. $\endgroup$
    – Jakobian
    Commented Jul 5 at 20:16
  •
    $\begingroup$ You can think of the definition by recursive equations (the ones involving $+$ on the right hand side) as being justified by, or syntactic sugar for, the "real" definition of $+$, which in the language of type theory would be something like $+ = \lambda m. \text{rec}_\mathbb{N}(m, \lambda x. s\ x)$. In other words, the recursive definition is a solution of those equations. $\endgroup$ Commented Jul 5 at 20:22
  • $\begingroup$ "How can you define addition using the + symbol, isn't that against the purpose of a definition?" Good question! But no, it is not circular. The plus is just a symbol used to denote an abstract function. If we call the function $f:\mathbb N_0\times\mathbb N_0\to \mathbb N_0$, defined via $f(m,0)=m$ and $f(m,s(n))=s(f(m,n))$, then $+$ is nothing more than a notation symbol: we define "$a + b$" to mean simply $f(a,b)$. This is intuitively useful notation, as we learn about binary operations before we learn about functions, even though binary operations are just a type of function. $\endgroup$
    – fleablood
    Commented Jul 5 at 21:05

5 Answers

$\begingroup$

The apparent circularity in this definition is in fact illusory. It is possible to prove that there exists a unique function $f:\mathbb N\times\mathbb N\to \mathbb N$ such that \begin{align} f(m,0) &= m \, , \tag{1}\label{1}\\ f(m,s(n)) &= s(f(m,n)) \, , \tag{2}\label{2} \end{align} for all $m,n\in\mathbb N$. Once we have established this fact, it makes perfect sense to define $+$ as this unique function, just as it makes sense to define $e^x$ as the solution to the initial value problem $y'=y,\, y(0)=1$, provided we know that (a) such a solution exists, and (b) it is unique.

So how can we prove the existence and uniqueness of $f$? Since the theorem in question is so fundamental, the answer is sensitive to the foundational system you are using. If you are working in a set theory environment, then you would typically appeal to the following.

Recursion theorem. If $a$ is an element of a set $X$, and $g:X\to X$ is a function, then there exists a unique function $u:\mathbb N\to X$ such that $u(0)=a$ and $u(s(n)) =g(u(n))$ for all $n\in\mathbb N$.

A proof of the recursion theorem can be found in Halmos's book Naive Set Theory (sec. 12, p.48), and in a number of other books on mathematical logic and set theory. Almost all of the work involved in the proof goes into actually constructing the function $u$; the uniqueness assertion is just a routine induction.

Once we have the recursion theorem in hand, we can proceed as follows. For each $m\in\mathbb N$, we let $s_m:\mathbb N\to\mathbb N$ be the unique function satisfying $s_m(0)=m$ and $s_m(s(n))=s(s_m(n))$ for all $n\in\mathbb N$. Then, we define $f:\mathbb N\times\mathbb N\to\mathbb N$ by $f(m,n)=s_m(n)$. It is clear that $f$ satisfies $\eqref{1}$ and $\eqref{2}$, and an induction on $n$ shows that there is at most one function with this property. Thus, our definition of $+$ is coherent.
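To make this construction concrete, here is a small sketch in Python (purely my own illustration: Python integers stand in for $\mathbb N$, `s` is the successor map, and `rec` plays the role of the recursion theorem):

```python
def rec(a, g):
    """Return the unique u : N -> X with u(0) = a and u(s(n)) = g(u(n)),
    playing the role of the recursion theorem."""
    def u(n):
        value = a
        for _ in range(n):   # apply g exactly n times
            value = g(value)
        return value
    return u

def s(n):
    """Successor map, with Python ints standing in for the naturals."""
    return n + 1

def add(m, n):
    """f(m, n) = s_m(n), where s_m = rec(m, s)."""
    s_m = rec(m, s)   # s_m(0) = m and s_m(s(n)) = s(s_m(n))
    return s_m(n)

print(add(3, 4))   # 7
```

Of course, this only mimics the set-theoretic construction; the real content of the recursion theorem is that such a `u` exists and is unique.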

$\endgroup$
  • $\begingroup$ The formal definition of the u in the recursion theorem is prima facie simple. Define u(n)=y where y uniquely satisfies: there exists an n+1 length sequence v where v(0)=a and all k<n have v(s(k))=g(v(k)) and finally v(n)=y. By induction on n, we can prove there's a unique v satisfying the first two properties, at which point it follows that y is unique and thus u is well defined. The proof that u is unique is nearly identical to the proof that the v's are unique; they are both nearly identical inductions. $\endgroup$ Commented Jul 7 at 15:40
  • $\begingroup$ @JadeVanadium: Interesting, the proof in Halmos is quite different to the one you give. He instead considers the set $\mathcal C$ of subsets $A$ of $\mathbb N\times X$ such that $(0,a)\in A$ and $(s(n),g(x))\in A$ whenever $(n,x)\in A$. It can be shown that the intersection of $\mathcal C$ is the desired function $u$. $\endgroup$
    – Joe
    Commented Jul 7 at 15:51
  • $\begingroup$ Very interesting indeed; the intersection definition seems to define recursion "from above" using supersets, whereas my definition defines it "from below" using finite initial segments. I think the initial segments definition is more common in reverse mathematics, since it requires fewer assumptions. For example, first-order Peano Arithmetic can prove Recursion as a theorem schema using finite sequences (where finite sequences can be encoded using Godel's Beta function). The initial segments approach is also the only way ZFC/NBG can prove recursion over the proper class of all ordinals. $\endgroup$ Commented Jul 7 at 16:07
  • $\begingroup$ @JadeVanadium I think this proof is not so simple anymore when you really want to fill the gaps. For example, you need to show at every step that $v$ exists and is indeed a sequence, which seems a bit tedious. And at the end, you only have shown that there is a $y$ for every $n$, but you still have to show how this implies the existence of $u$. All of this is obvious, of course, but so is the recursion theorem itself. I suspect if you want a pedantic proof, Halmos’ one is simpler in the end. $\endgroup$
    – ipsec
    Commented Jul 8 at 17:42
  • $\begingroup$ @ipsec I have worked through it formally, and it's not nearly as tedious as you suggest. Even so, my intent was never to say it's simpler than Halmos' approach, but that it's simple enough to be said explicitly, rather than merely alluding to it. The initial segments approach has other advantages though. It requires fewer assumptions, and the proof neatly generalizes to wellfounded recursion. In NBG, the intersections definition doesn't work for recursion on a proper class, but the initial segments definition still works. $\endgroup$ Commented Jul 8 at 19:10
$\begingroup$

The $+$ symbol denotes an "infix" function $+: \mathbb{N}_0 \times \mathbb{N}_0 \to \mathbb{N}_0$. The only difference from an ordinary function is that for $m,n \in \mathbb{N}_0$, instead of writing $+(m,n)$, we write $m + n$ (infix notation). Note that you could use any symbol other than $+$; for example, just call it $\mathrm{ADD}$. Furthermore, you do not need to use infix notation at all.

The definition is recursive in nature. Let $m \in \mathbb{N}_0$. Say we want to calculate $m + s(0)$. Then by the definition $$ m + s(0) = s(m + 0) = s(m). $$ In the same way (using the above equation for the second equality): $$ m + s(s(0)) = s(m + s(0)) = s(s(m))$$ It is not hard to see that we can calculate $m+n$ in this way for any $n \in \mathbb{N}_0$, simply by expressing $n$ as the $n$-th successor of $0$. Therefore the recursively defined function $+$ is well defined.
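The unwinding above can be mirrored directly in code. The following sketch is purely illustrative (the encoding of numerals as nested tuples and all names are my own choices):

```python
# A natural number is either ZERO or ('s', k) for a smaller numeral k.
ZERO = ()

def s(n):
    """Successor: wrap the numeral one more time."""
    return ('s', n)

def add(m, n):
    """m + 0 = m;  m + s(k) = s(m + k)."""
    if n == ZERO:
        return m
    _, k = n
    return s(add(m, k))

def to_int(n):
    """Decode a numeral to a Python int, for display only."""
    count = 0
    while n != ZERO:
        count += 1
        n = n[1]
    return count

one = s(ZERO)
two = s(one)
print(to_int(add(one, two)))   # 3
```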

$\endgroup$
  •
    $\begingroup$ I understand, but is there a way to write this as a logic statement? In other words, how would you say the $n$-th successor of $0$ with a logic statement? $\endgroup$ Commented Jul 5 at 21:13
  • $\begingroup$ @SpyridonManolidis check this and von Neumann ordinals $\endgroup$
    – powerline
    Commented Jul 5 at 22:32
  • $\begingroup$ @SpyridonManolidis Especially the 2nd link in the linked question above might provide some insight $\endgroup$
    – powerline
    Commented Jul 5 at 22:38
$\begingroup$

We can define it in a similar way to how we define $\mathbb{N}_0$—as the smallest set having certain properties. Let \begin{align} \varphi(u) &= (\forall x \in \mathbb{N}_0) (x, 0, x) \in u\\ &\quad \wedge (\forall xyz \in \mathbb{N}_0)((x, y, z) \in u \to (x, s(y), s(z)) \in u), \end{align} and let $\delta(y) = \varphi(y) \wedge \forall z (\varphi(z) \to y \subseteq z)$. We can then use the axioms of set theory to prove $\exists! y \, \delta(y)$, which justifies defining the symbol $+$ by $y = {+} \leftrightarrow \delta(y)$. We then adopt the shorthand, $x + y = z$ to mean $(x, y, z) \in {+}$.

In words, the above definition says that $+$ is the smallest set of ordered triples of natural numbers such that $(x, 0, x) \in {+}$ for every natural number $x$, and, whenever $(x, y, z) \in {+}$, we also have $(x, s(y), s(z)) \in {+}$.

Using the shorthand notation, the verbal description is even clearer. In that case, $+$ is the smallest set of ordered triples of natural numbers such that $x + 0 = x$ for every natural number $x$, and, whenever $x + y = z$, we also have $x + s(y) = s(z)$.
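One can even watch this least fixed point being built. The sketch below (an illustration of mine, truncated at an arbitrary bound so the set stays finite) starts from the base triples and closes under the second rule until nothing new appears:

```python
BOUND = 10   # arbitrary cutoff; the real + is the infinite least fixed point

def s(n):
    return n + 1

# Base rule: (x, 0, x) for every natural x.
plus = {(x, 0, x) for x in range(BOUND + 1)}

# Closure rule: (x, y, z) in plus  =>  (x, s(y), s(z)) in plus.
changed = True
while changed:
    changed = False
    for (x, y, z) in list(plus):
        t = (x, s(y), s(z))
        if max(t) <= BOUND and t not in plus:
            plus.add(t)
            changed = True

print((3, 4, 7) in plus)   # True:  3 + 4 = 7
print((3, 4, 8) in plus)   # False: minimality rules out junk triples
```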

$\endgroup$
$\begingroup$

This is a recursive definition (closely related to proofs by induction).

To be well-defined, it requires two parts: a base case ($m + 0 := m$) and a recursive case ($m + s(n) := s(m + n)$). If we only had the recursive case, then this would indeed be a circular definition. But the presence of the base case provides a "stopping point", to which any application will ultimately resolve. Note that the base case has no $+$ operator in the defining right-hand expression.

Taking the simplest possible example, consider applying this to $1 + 1$. Knowing the definitions $s(0) = 1$ and $s(1) = 2$, we see: $1 + 1 = 1 + s(0) = s(1 + 0) = s(1) = 2$. Note that the first step rewrites $1$ as $s(0)$, the second applies the recursive case, and the third applies the base case $1 + 0 = 1$.

Other applications would be longer chains of reasoning, but would involve recursively decomposing the second addend in the same way, with the decomposition stopping once it has been reduced to $0$, and then recursively recombining. Note that this effectively formalizes the childhood strategy of addition by "counting on".
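A trace of the two rules makes the "counting on" behaviour visible. This is only a sketch of mine (Python ints stand in for the numerals, with `n - 1` as the predecessor of a nonzero `n`):

```python
def s(n):
    return n + 1

def add(m, n, depth=0):
    """Recursive addition, printing each rule application."""
    indent = "  " * depth
    if n == 0:
        print(f"{indent}{m} + 0 = {m}   (base case)")
        return m
    print(f"{indent}{m} + {n} = s({m} + {n - 1})   (recursive case)")
    return s(add(m, n - 1, depth + 1))

add(1, 1)   # the two rule applications behind 1 + 1 = 2
```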

$\endgroup$
$\begingroup$

The other answers are not technically wrong, but they are conceptually quite misleading when it comes to the foundations of mathematics. The reason is that if you do not already have a model of PA⁻ (the discrete ordered semiring axioms) and claim to be able to construct one, say starting from some structure $N$ with an element $0$ and an injective successor function $s$ on $N$ whose output is never $0$, then you must already be working in some sufficiently strong foundational system. If, for example, you are working in ZFC set theory, then you would get such an initial structure $⟨N,0,s⟩$ from the Infinity axiom and the Specification axiom (schema). In particular, much of the power is hiding in the Infinity axiom; it was added entirely to ensure that we could construct such an initial structure.

And that is actually a big problem from a careful philosophical viewpoint. Note that the Infinity axiom literally says (by fiat) that there exists a set $I$ that includes $∅$ and is closed under $s$ where $s(x) = x∪\{x\}$ for every (set) $x$. We can conceptually imagine this set $I$ as long as we accept that we can indefinitely iterate this operation $s$ and collect together all the results we get. This is not an issue if we accept that we can concatenate arbitrary finite strings (say from a finite alphabet) indefinitely, starting from single-symbol strings. But there does not seem to be any other way of justifying such a conceptual construction! And it turns out that the very idea that there is a set of finite strings that includes "0" and "1" and is closed under concatenation already yields a model of PA⁻ under very weak assumptions.

So it is conceptually circular to define natural number addition using recursion, because the recursion theorem itself cannot be justified without relying on assumptions stronger than merely assuming that ℕ obeys PA⁻. ZFC proves the recursion theorem, of course, but it would not be wrong to say that we would be extremely unlikely to accept any foundational system (whether ZFC or not) that is not capable of constructing a model of PA⁻. In other words, addition on ℕ is a requirement and we would design our foundational system in order to be able to obtain that. Same for multiplication on ℕ.

Even in the field of mathematical logic, PA⁻ has a much more special place than Peano's axioms (without addition and multiplication), and it is considered to be the most suitable base theory for many investigations into weak foundations of mathematics. There is a weaker base theory called Robinson's Q, but even Q has both addition and multiplication built in! I hope that the above explanation has given you some insight into why addition and multiplication are more like fundamental assumptions that we must have than concepts that we can define in the absence of equally strong assumptions.

$\endgroup$
  • $\begingroup$ I am a little sorry to see such a thoughtful answer downvoted with no reasons at all given (as I write), but I suppose it is probably way over the head of someone who stumbles over an implicitly recursive definition! $\endgroup$
    – PJTraill
    Commented Jul 10 at 22:54
  • $\begingroup$ @PJTraill: Perhaps you misunderstand the purpose of Math SE. It is not merely a place to satisfy the asker of each question, but it is supposed to be a repository of knowledge for the millions of other readers. The downvoters here clearly do not grasp the true foundational underpinnings of the natural number structure, and believe (wrongly) that the recursion theorem fully justifies the definition of addition. It does not, and they do not understand, but they do not wish to learn more, therefore they downvote with no comment. $\endgroup$
    – user21820
    Commented Jul 11 at 5:12
  • $\begingroup$ @PJTraill: For an analogous example, consider that many lousy textbooks claim to establish induction for ℕ based on the well-ordering principle for ℕ (both as schemas). Do you know that although they are classically equivalent, induction is actually more fundamental and constructive than well-ordering? This is why in some alternative foundations of mathematics they are no longer equivalent and induction is typically assumed but well-ordering is not. In fact, the correct way to do constructive mathematics requires induction and forbids general well-ordering. [cont] $\endgroup$
    – user21820
    Commented Jul 11 at 5:12
  • $\begingroup$ [cont] Unfortunately due to those lousy textbooks, many people mistakenly believe that well-ordering is more fundamental! The same thing is happening here. When people do not understand all the foundational assumptions, and are told via argument from authority that we can prove the recursion theorem and hence can use it to define addition on ℕ, it leaves a really bitter taste in the mouth of those who actually know. $\endgroup$
    – user21820
    Commented Jul 11 at 5:13
  • $\begingroup$ "it is conceptually circular to define natural number addition using recursion" is at best an opinion and at worst false: see type theory. Your rant is no less misleading than the other answers and it does not answer OP's question. $\endgroup$ Commented Jul 11 at 9:46
