One of the problems with behavioural science is that for generations its glossaries have been leaky, and some of its worst coinages have seeped into general circulation. When these reach the ears of 'real' scientists, the sniggers can be heard across the quadrangle. Of course, physicists have had their aether and chemists their phlogiston. But solecisms from behavioural science somehow seem funnier, and harder to get rid of.

Take anal fixation, for instance. Sigmund Freud invented it in the early twentieth century, and 'anal' still commonly means too orderly, too conscientious, or tending to collect and hoard. Freud's claim was that all children pass through an anal stage of development — roughly, toddlerhood — when faeces are fascinating and their control a goal. By curious dynamics, these concerns morph into mature adult motives — normal cleanliness or organization. But if your child becomes 'fixated' at the anal stage, watch out. You could end up with an accountant or a hygienist.

Not to worry, though. There is little evidence that a process akin to fixation ever occurs in childhood, least of all in the anal realm. And if we sometimes link coins and ca-ca in our minds, it is probably because Freud's phrase has long since become folklore. Fortunately for civilization, orderly people do exist. The trait runs in families, so it is probably partly genetic, although it undoubtedly also rests on patterns of child-rearing. But potties and dirty nappies? Hardly.

Burrhus Skinner: considered the brain as an effectively empty 'black box'.

If Freud invented a mental process out of whole cloth, the US psychologist B. F. Skinner went to the other extreme with his non-view of the brain. He and his followers grandly ignored brain science, which had become pretty heavy sledding. But some kind of justification for ignoring it was needed, because the brain is, after all, the organ that generates behaviour. So the brain became a black box, and for all intents and purposes an empty one. This didn't faze the behaviourist, because that paragon of rigour measured input and output, and discerned laws relating the two. As the laws of learning were considered to be universal, the stuff inside the skull was paradoxically best understood by ignoring it.

Alas for simplicity, the laws were not universal. Even rats learned some things better than others, linking odours to nausea much more easily than to shock, for example. Species also did a lot of phylogenetic backsliding in the elegant learning paradigms, misbehaving badly according to their instinctual lights — animals can be trained to exchange coins for food, but pigs will push them around with their snouts, raccoons rub them together and chickens peck at them. This 'instinctive drift' distracts them from the task they have learned. Worse, reward itself turned out to be a brain function, drawing erstwhile behaviourists deep into the black box to find out what this pleasure thing was. Now geneticists with their knockouts are picking learning apart at the macromolecular level — and, sure enough, genes for learning do exist.

Meanwhile, social-cognition theorists have come up with a phrase inferential enough to make one almost long for the black-boxers: theory of mind. Freud sought one, Skinner assiduously didn't, and most people don't bother to ask themselves whether they have or need one. Yet there is serious debate as to whether chimpanzees or four-year-olds have a theory of mind. Closely inspected, the phrase seems to mean something like perspective-taking or, when mutual, intersubjectivity. True, a four-year-old can see and act on another person's perspective whereas most three-year-olds can't.

This is fascinating stuff and something we need to understand. But a term such as 'theory of mind' simply stands in the way. It makes for catchy article titles but conveys no meaning. Is the maturing orbitofrontal cortex newly able to calm an impulsive and self-centred limbic circuit? Is there a downregulation of some neurotransmitter receptor, allowing a younger form of social mirror-imaging to grow into identification and parallel perspectives? As long as we are playing with pretty word-coins that substitute for brain functions, we will never know.

Social scientists, too, have their bad words, the worst probably being the organic view of society. Here, instead of turning away from biology, we have an attempt to embrace it. But this inappropriate metaphor ends up keeping natural science at bay. Society is no organism; people are not cells. Cells, unless they are sperm, eggs or cancerous, cannot even partly secede from the organism, whereas individuals are constantly reappraising their membership in society. Many secede, and those who stay cooperate very imperfectly, because cooperation is not their main goal. What are called social pathologies are not derangements of an ideal, harmonious unit, but by-products of the normal flux of a gaggle of individuals going their own way. This is so because evolution by natural selection decrees it, and most social scientists have not come to terms with the fact that people are evolving animals.

This is not to say that biologists have all the answers. Understanding must come at every level of integration, each with its own laws. Physics cannot explain biology, and complexity theory tells us that we will not get the answers we need about mind and culture merely by reasoning upward from below. Still, you can't have a science that doesn't make some sort of integrative peace with the neighbouring, more fundamental sciences in the loose but still meaningful hierarchy of nature. We social scientists should stop our silly word-play and hit the biology books.