The Number Sense: How the Mind Creates Mathematics

  • Stanislas Dehaene
Oxford University Press: 1997. Pp. 274. $25

Numbers are one of the great foundation stones of our civilization. Without them, the vast and intricate edifices of technology, science and trade could never have been built. Albert Einstein called them “the symbolic counterpart of the universe”.

Numerals have been used for almost as long as people have had agricultural surpluses to trade, at least 12,000 years. But where did the core concept of number — the idea of the ‘manyness’ of collections of things — come from? It is certainly not an easy idea. As Adam Smith wrote: “numbers are among the most abstract ideas which the human mind is capable of forming.” Did some ancient Ramanujan, sitting perhaps in a mud hut in Ur, invent them, or is a sense of number part of the way we naturally categorize the world?

The answer lies in part in understanding how the human brain deals with numbers. What makes the problem interesting, and this book timely, is that studies of numerical abilities from developmental psychology to functional brain imaging are converging on the idea that humans have inherited brain circuits specialized for numbers. It is one of the great virtues of the book that it draws these findings together succinctly and with verve.

Stanislas Dehaene reviews the numerical abilities of different species of animals, as revealed in ingenious laboratory experiments. There are ‘rat accountants’ that can count the correct number of shocks before responding; a clever parrot that can say the number of red objects on a tray; and primates that can add two pairs of numbers and then select the larger sum. There is no doubt that such behaviour is controlled by the numerosity of the stimuli and not by some covarying factor. But why should animals possess numerical abilities? To understand this, we need evidence, rather than speculation, about how animals use numerosity in the wild, not in the laboratory.

Human infants as young as one day old can distinguish visual arrays on the basis of numbers up to about four, and a few months later can add one and one, and subtract one from two. To do this, Dehaene points out, the ability must be innate, as the babies have not had time to learn it. Dehaene is therefore arguing that the influential Swiss psychologist Jean Piaget was fundamentally wrong. Piaget believed that his experiments showed that children build the concept of number out of other, more basic, logical concepts and processes, such as transitive inference, one-to-one correspondence and an ability to abstract away from the sensory properties of the things being counted. The role of numerals and counting words was marginalized.

Dehaene is clearly angry about the influence that both Piagetians and the formalist mathematicians from the Bourbaki group have had on French education. “The reformers thought that children should be familiar with the general theoretical principles of numeration before being taught the specifics of the base-10 system. Hence, believe it or not, some arithmetic textbooks started off by explaining that 3+4 is 12 — in base 5! It is hard to think of a better way of befuddling a child's thinking.”
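The base-5 example is arithmetically sound, which is exactly why it befuddles: 3 + 4 is seven, and seven written in base 5 is "12" (one five plus two ones). A minimal sketch of the conversion, in Python for illustration (the helper name is my own):

```python
def to_base(n, base):
    """Render a non-negative integer n as a digit string in the given base."""
    if n == 0:
        return "0"
    digits = []
    while n > 0:
        digits.append(str(n % base))  # least significant digit first
        n //= base
    return "".join(reversed(digits))

# 3 + 4 = 7 in ordinary (base-10) notation, but in base 5 it is written "12":
print(to_base(3 + 4, 5))  # → 12
```

The sum itself never changes; only its written form does, which is the distinction the textbooks in question blurred.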

Work on patients with brain lesions, including Dehaene's own patients, shows that there are special areas of the brain, probably the left and right inferior parietal lobes, that could contain the inherited number circuits. They seem to carry out very basic numerical processes such as comparing the magnitude of two numbers. The left parietal lobe alone is responsible for more advanced functions, and when it is damaged patients can suffer from very specific numerical deficits, such as being unable to remember multiplication facts, despite being able to remember non-numerical facts. They may even be able to carry out other mathematical operations, including addition and subtraction. There are even reports of patients who can no longer do simple arithmetic but can still do algebra.

Dehaene thinks that these circuits do not represent numerosities as such but continuous approximations to them that he calls “analogue magnitudes”. The basic mechanism, “the accumulator”, parses the world into discrete units and stores in memory an approximately equal quantity for each unit. The total accumulated quantity then represents the number of units detected. For Dehaene, therefore, the number sense is a kind of quantity sense. To get from these fuzzy representations to the discreteness of integers requires the uniquely human ability to create and use symbols, including number words, numerals and the idea of numerosities.
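The accumulator can be caricatured in a few lines of Python. This is only a sketch of the idea described qualitatively in the book: each detected item adds a noisy increment rather than an exact unit, so the stored total is an analogue magnitude, and nearby numerosities become harder to tell apart. The function names and the Gaussian noise model are my own illustrative assumptions:

```python
import random

def accumulate(n_items, increment=1.0, noise_sd=0.15):
    """Toy accumulator: add one noisy increment per detected item.

    The returned total is an analogue magnitude, not an exact count;
    Gaussian noise per increment is an illustrative assumption.
    """
    total = 0.0
    for _ in range(n_items):
        total += random.gauss(increment, noise_sd)
    return total

def pick_larger(a_items, b_items):
    """Judge which collection is more numerous by comparing magnitudes."""
    return a_items if accumulate(a_items) > accumulate(b_items) else b_items

random.seed(0)
# The magnitudes for 5 and 6 items overlap, so the comparison is only
# approximately reliable, mimicking the fuzziness of the number sense.
print(accumulate(5), accumulate(6))
```

On this picture, comparing two magnitudes is easy and innate, while exact integers require the extra, symbolic step the book describes.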

But why is the brain unable to represent discrete numerosities in the first place, given that perceptual systems are designed to isolate discrete objects in the world? Indeed Dehaene, both in the book and elsewhere, is sometimes happy to talk of “numerosity detectors” that create discrete representations from stimulus information. This process, called “subitizing”, enables even infants to distinguish at a glance, without counting, numerosities up to about four. Whether the brain represents numerosities as discrete or continuous quantities is an important issue, and its resolution will have consequences for our ideas of both brain specialization and the evolutionary history of the number sense.

Our ability to use numbers is as much a fundamental aspect of what makes us human as is our ability to use language. This book provides an excellent introduction to how this ability is organized in our mathematical brain.