(Flock of birds, from Flickr)
An image of a flock of birds fades almost instantaneously from our visual memory - as Borges memorably described in his "Argumentum Ornithologicum." But if you get a chance to count the birds one by one, their exact number (78 in this case?) is represented by a symbol that is easy to remember and easy to manipulate. Numbers help us overcome a basic limitation: without some intermediate representation, none of our memory systems can represent exactly 78 objects as distinct from exactly 79.
For most people, mentally representing an exact quantity like 78 requires (1) knowing the number words and (2) producing the right number words - speaking them, at least in your head - at the moment you want to represent the corresponding quantity. The evidence:
- If your language doesn't have words for numbers (as is true of the languages of several Amazonian groups), you make systematic estimation errors when manipulating large quantities.
- If you are experimentally prevented from using language in the same moment that you need to count (e.g. by verbal interference), you make the same kinds of errors as if you didn't have language for numbers to begin with.
1. It may be possible to prime arithmetic expressions unconsciously.
A recent paper by Sklar et al. uses a clever method called continuous flash suppression to present stimuli to participants while keeping them out of conscious awareness. When shown expressions like "9 - 3 - 4 = " using this method, participants were 10-20 ms faster to speak the correct answer (e.g., 2) when it was presented, compared to an incorrect answer. (Incidentally, an odd fact about the result is that the authors had much more trouble finding unconscious priming effects for addition than for subtraction.)
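Concretely, the priming effect here is just the difference in mean naming latency between trials where the spoken target matches the masked expression's answer and trials where it doesn't. A minimal sketch of that comparison in Python (the RT values are invented for illustration, not Sklar et al.'s data):

```python
# Toy naming latencies (ms) per participant; values are invented for illustration.
# "congruent": the target equals the answer of the masked expression (e.g., 9-3-4 with target 2).
# "incongruent": the target differs from the answer.
congruent_rts = [512, 498, 530, 505, 521]
incongruent_rts = [527, 510, 548, 519, 536]

# Per-participant priming effect: incongruent minus congruent RT.
effects = [i - c for i, c in zip(incongruent_rts, congruent_rts)]
mean_effect = sum(effects) / len(effects)
print(mean_effect)  # 14.8 - a positive value means congruent targets were named faster
```

With real data one would of course test this difference against zero across participants; the toy numbers are just chosen to land in the 10-20 ms range the paper reports.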
I find this result very surprising! My initial thought was that participants might have been faster because they were using their analog magnitude system (indicating approximate rather than exact numerical processes). I wrote to Asael Sklar and he and his collaborators generously agreed to share their original data with me. I was able to replicate their analyses* and verify that there was no estimation effect, ruling out that alternative explanation.
So this result is still a mystery to me. I guess the suggestion is that there is some "priming" - e.g. trace activation of the computations. But I find it somewhat implausible (though not out of the question) that this sort of subtraction problem is the kind of computation that our minds cache. Have I ever done 9 - 3 - 4 before? It certainly isn't as clear an "arithmetic fact" as 2+2 or 7+3.
2. Richard Feynman could count and talk at the same time.
In a chapter from "The Pleasure of Finding Things Out" (available as an article here), Feynman recounts how he learned that he could keep counting while doing other things. I was very excited to read this because I have also noticed that I can count "unconsciously" - that is, I can set a counter going in my brain, e.g. while I hike up a hill, let my mind wander, and check back in to find that the counter has advanced some sensible distance. But I never systematically tested whether my count was accurate.
This kind of test is exactly what Feynman set out to do. He would start counting, then begin another activity (doing laundry, walking around, etc.) and check back in with his internal "counter." To calibrate, he first established that with no interference he could count to about 48 in a minute, with very little variability. So he would do many different things while counting and check how close his count was to 48 after a minute had elapsed; if a task interfered, his count would be off.
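Feynman's procedure amounts to calibrating a baseline rate and then treating deviations from it as evidence of interference. A toy sketch of that logic in Python (the 48-per-minute baseline is from the text; the tolerance and the example counts are invented):

```python
# Baseline: Feynman's undisturbed count after one minute was about 48.
BASELINE = 48
TOLERANCE = 2  # invented tolerance reflecting "very little variability"

def interferes(count_after_one_minute: int) -> bool:
    """Did the concurrent task disturb the internal counter?"""
    return abs(count_after_one_minute - BASELINE) > TOLERANCE

# Invented observations for different concurrent tasks:
print(interferes(47))  # e.g. doing laundry: within tolerance -> False
print(interferes(39))  # e.g. talking aloud: well off baseline -> True
```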
The only thing he found that caused any active interference was talking, especially producing number words:
I started counting while I did things I had to do anyway. For instance, when I put out the laundry, I had to fill out a form saying how many shirts I had, how many pants, and so on. I found I could write down "3" in front of "pants" or "4" in front of "shirts" while I was counting to myself, but I couldn't count my socks. There were too many of them: I'm already using my "counting machine"...

What's even more interesting is that Feynman reports that the statistician John Tukey could count and talk at the same time - by imagining visual images of the numbers turning over. But apparently this prevented Tukey from reading while he counted (which Feynman could do!).
So these observations seem consistent with the hypothesis that exact number requires a particular set of mental resources, whether the resources of speech production (for counting out loud) or of visual working memory (for imagining digits or a mental abacus). But they, along with the Sklar et al. finding, also support the idea that the representation need not percolate up to the highest levels of conscious experience.
3. Ildefonso, a home-signer without language, learned to do arithmetic before learning to sign.
In A Man Without Words, Susan Schaller describes the growth of her friendship with Ildefonso, a deaf, completely language-less man in his 30s. Ildefonso grew up as a home-signer in Mexico and came to the US as an agricultural laborer. Over the course of a short period working with him at a school for the deaf, she introduces him to ASL for the first time. The story is beautiful, touching, and both simply and clearly written.
Here's the crazy part: Before Schaller has succeeded in introducing Ildefonso to language more generally, she diverts him with single digit arithmetic, which he is apparently able to do handily:
To rest from the fatigue of our eye-to-eye search for an entrance into each other's head, we sat shoulder to shoulder, lining up numerals in progressively neater rows. I drew an addition sign between two 1s and placed a 2 underneath. I wrote 1 + 1 + 1 with a 3 under it, then four 1s, and so on. I explained addition by placing the corresponding number of crayons next to each numeral. He became very animated, and I introduced him to an equal sign to complete the equations. Three minutes later the crayons were unnecessary. He had gotten it. I presented him with a page of addition problems, and he was as happy as my nephew with a new dinosaur book. (p. 37)

It would be very interesting to know how accurate his computations were! This observation suggests that language for number may not critically rely on understanding any other aspects of language. Perhaps Ildefonso didn't even treat numerals as words at all (but instead like Gricean "natural" meanings, e.g. "smoke means fire").
All of these examples are consistent with the general hypothesis described above about how language works to represent exact numbers. But all three suggest that our use of numbers need not be nearly as conscious, or as language-like, as I had thought in order to carry exact numerical content.
* With one minor difference: Sklar et al.'s error bars reflect the mean +/- 0.5 standard errors of the mean (SEM), rather than the more conventional +/- 1 SEM. This is partly a semantic issue: under their convention, the full length of the error bar is one SEM, rather than the SEM being the distance from the mean to the end of the bar. Nevertheless, it is not standard practice.
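To make the difference concrete: under the conventional definition the bar extends one SEM above and below the mean (full length 2 x SEM), while under Sklar et al.'s convention the full bar spans a single SEM. A quick numeric sketch in Python (the data values are made up):

```python
import statistics

# Made-up sample of per-participant priming effects (ms).
data = [12, 15, 9, 18, 14, 11, 16, 13]
mean = statistics.mean(data)
sem = statistics.stdev(data) / len(data) ** 0.5  # standard error of the mean

# Conventional error bar: mean +/- 1 SEM, so the full bar is 2 * sem long.
conventional_full_length = 2 * sem
# Sklar et al.'s convention: mean +/- 0.5 SEM, so the full bar is 1 * sem long.
sklar_full_length = 2 * (0.5 * sem)

print(sklar_full_length == sem)  # True: the whole bar spans just one SEM
```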