Brief lecture 1 overview: (thread on the progression of our understanding of "Number")

Imagine some birds in a dense forest. They see 1 hunter go in, so they fly up high in the trees. They see 1 hunter leave, they fly back down.

Next 2 hunters go in, birds fly up. When 1 hunter
2/ leaves, the birds stay up. Another 1 hunter leaves, the birds come down.

Next 3 hunters go in, birds fly up. 2 hunters leave, birds stay up. 1 more hunter leaves, birds come down.

4 hunters go in, birds fly up. 3 hunters leave, birds *fly down*!

What can we infer from this
3/ anecdote? Birds know "1" and "2", but after that, it's "many" (or "infinity"). Those little bird brains.

Hey, how do you say this in your language: 1, 2, 3, 4, 5; 1st, 2nd, 3rd, 4th, 5th?

One/first (the words have nothing to do with each other!)
two/second (but!)
three/third
4/ four/fourth
five/fifth, etc.

1+10 = "eleven" (new word!)
2+10= "twelve" (new word!)
3+10= thir-teen
4+10= four-teen
5+10= fif-teen

The same patterns show up in almost all languages worldwide.

We *were* those bird brains!

By the time we could really distinguish 2 from 3, we
5/ already realized we'd need a 4 and 5 and so on.

Such patterns continue in various forms. Ten 1's is called "Ten"; Ten 10's is a "Hundred"; Ten 100's is a "Thousand". No problem inventing new words, but it gets tiresome. So we allow "Ten Thousand" and "Hundred Thousand". But
6/ "Thousand thousand" sounds silly, so we call it a "million". Then a "billion" (or milliard, in some languages), and only then do our human brains overtake the birds, and we get to:

trillion
quadrillion
quintillion, etc.

The point is that it takes a while to understand new numbers,
7/ where they come from and why they're needed.

Then we needed to split stuff up. Five loaves of bread but seven families wanting their "fair share". So we developed fractions, and all the headaches (algorithms) that come with that arithmetic.
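
A tiny sketch of that fraction arithmetic (my own illustration using Python's built-in fractions module, not anything from the thread):

    from fractions import Fraction

    # 5 loaves shared fairly among 7 families: each family gets 5/7 of a loaf.
    share = Fraction(5, 7)

    # The "headaches": adding shares needs common denominators; Python does the
    # bookkeeping that kids (and scribes) had to learn to do by hand.
    print(share + share)   # 10/7
    print(7 * share)       # 5 -- all the bread is accounted for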

Then we needed "precision":
8/ Three isn't accurate enough, zoom in: 3.1. Make it even more precise: 3.14, more: 3.14159 etc. So we need decimals, that is, the real numbers (in modern terms, "Cauchy sequences of rationals", or rather equivalence classes of such).
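
Here is one way to make that concrete (my sketch, not the author's): each decimal truncation of pi is still an honest rational number, and the sequence of truncations is Cauchy.

    from fractions import Fraction

    # The truncations 3, 3.1, 3.14, 3.141, ... written as exact fractions
    # over powers of ten (Python prints them in lowest terms).
    digits = "314159265"
    approximations = [Fraction(int(digits[:k + 1]), 10**k) for k in range(6)]
    for a in approximations:
        print(a, float(a))
    # Later terms differ by less than any tolerance you name (the "Cauchy" part);
    # the real number pi is, roughly, what all such sequences agree on.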

This is basically what we spend K-6 doing!
9/ All the "boring" (but very necessary) preparatory work for "real" stuff to come later (algebra, geometry, trig, calc, etc).

Then in 7th grade, out of nowhere we're slammed with "x". What the hell kind of number is "x"? Why have I never seen this before? Well, you can't even
10/ solve *linear* equations with integer coefficients without knowing how to manipulate fractions (even 2x = 3 already needs x = 3/2). So we had to wait until that was perfected to introduce algebraic thinking.

Wrong! 1st graders have *no problem* solving: Two bananas + 1 coin weigh the same as 1 banana + 3 coins.
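
Spelled out (my wording, not the thread's), the balance reasoning goes:

    2 bananas + 1 coin  =  1 banana + 3 coins
    take a banana off each pan:   1 banana + 1 coin  =  3 coins
    take a coin off each pan:     1 banana  =  2 coins

No "x" in sight, and the answer stays in numbers a 1st grader already knows.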
11/ Etc. We could be introducing *all* of the concepts of middle/high school math (algebra, cartesian coordinates, graphing, tangent lines, areas under curves, trig etc etc) in elementary school, just making sure the solutions of the problems are number systems the kids already
12/ know. Then when they see those concepts "for real" in 7th (or whatever) grade, it's not a "paradigm shift".

In college, they might study Z/nZ, applications to CS, crypto, etc. (Not taught in HS because the applications were only discovered in the 70s/80s, after the curriculum was already set.)
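
A small taste of what that looks like (my own illustration; the thread gives no code):

    # Z/nZ: integers that wrap around like a clock face.
    n = 12
    print((9 + 5) % n)        # 2 -- five hours after 9 o'clock

    # The same wrap-around arithmetic, with a big prime modulus and Python's
    # built-in fast modular exponentiation, is the engine behind a lot of crypto.
    p = 2**61 - 1             # a Mersenne prime
    print(pow(5, 123456789, p))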
13/ In grad school, they might see p-adics. They're in many ways easier than real numbers!
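
One toy example of that claim (mine, not the thread's), using the base-10 cousin of the p-adics:

    # 10-adic numbers are read leftward: knowing a number mod 10**k for bigger
    # and bigger k is like knowing more and more of its 10-adic digits.
    for k in (1, 3, 6, 9):
        print(-1 % 10**k)     # 9, 999, 999999, 999999999
    # So -1 "is" the string ...999999: add 1 and the carries ripple away forever,
    # leaving 0. No signs, no rounding error -- in that sense the arithmetic
    # really is simpler than with infinite decimals.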

General takeaway: there's so much more beautiful math we could (should?) be showing our kids. Let's not be bird brains!...