What Is Intelligence?
And is intelligence overrated?
Posted Nov 28, 2018
[Article updated on 17 June 2019.]
There is no agreed definition or model of intelligence. According to the Collins English Dictionary, intelligence is ‘the ability to think, reason, and understand instead of doing things automatically or by instinct’. According to the Macmillan Dictionary, it is ‘the ability to understand and think about things, and to gain and use knowledge’.
In seeking to define intelligence, a good place to start might be with dementia. In Alzheimer’s disease, the most common form of dementia, there is disturbance of multiple higher cortical functions including memory, thinking, orientation, comprehension, calculation, learning capacity, language, and judgement. I think it significant that people with dementia or severe learning difficulties cope very poorly with changes to their environment, such as moving into a care home or even into an adjacent room. Taken together, this suggests that, at its broadest, intelligence refers to the functioning of a number of related faculties and abilities that enable us to respond to environmental pressures. Because this is not beyond animals and even plants, they too can be said to be possessed of intelligence.
We Westerners tend to think of intelligence in terms of analytical skills. But in a close-knit hunter-gatherer society, intelligence might be defined more in terms of foraging skills or social skills and responsibilities. Even within a single society, the skills that are most valued change over time. In the West, the emphasis has gradually shifted from language skills to more purely analytical skills, and it was only in 1960, well within living memory, that the Universities of Oxford and Cambridge dropped Latin as an entry requirement. In 1990, Peter Salovey and John Mayer published the seminal paper on emotional intelligence, and E.I. quickly became all the rage. In that same year, Tim Berners-Lee wrote the first web browser. Today, we cannot go very far without having some considerable I.T. skills (certainly by the standards of 1990), and computer scientists are among the most highly paid professionals. What constitutes intelligence, therefore, varies according to our priorities and values.
Contemporary society holds analytical skills in such high regard that some of our political leaders cite their ‘high I.Q.’ to defend their more egregious actions. This Western emphasis on reason and intelligence has its roots in Ancient Greece with Socrates, his pupil Plato, and Plato’s pupil Aristotle. Socrates held that ‘the unexamined life is not worth living’. He typically taught by the dialectic or Socratic method, that is, by questioning one or more people about a particular concept such as courage or justice so as to expose a contradiction in their initial assumptions and provoke a reappraisal of the concept. For Plato, reason could carry us far beyond the confines of common sense and everyday experience into a ‘hyper-heaven’ (Greek, hyperouranos) of ideal forms. He famously fantasized about putting a geniocracy of philosopher kings in charge of his utopian Republic. Finally, Aristotle argued that our distinctive function as human beings is our unique capacity to reason, and therefore that our supreme good and happiness consists in leading a life of rational contemplation. To paraphrase Aristotle in Book X of the Nicomachean Ethics, ‘man more than anything is reason, and the life of reason is the most self-sufficient, the most pleasant, the happiest, the best, and the most divine of all.’ In later centuries, reason became a divine property, found in man because he is made in God’s image. If you struggled with your SATs, or thought they were pants, you now know who to blame.
As I argue in my new book, Hypersanity: Thinking Beyond Thinking, the West’s obsession with analytical intelligence has had, and continues to have, dire moral, political, and social consequences. Immanuel Kant most memorably made the connection between reasoning and moral standing, arguing (in simple terms) that, by virtue of their ability to reason, human beings ought to be treated, not as means to an end, but as ends-in-themselves. From here, it becomes all too easy to conclude that, the better you are at reasoning, the worthier you are of personhood and its rights and privileges. For centuries, women were deemed ‘emotional’, that is, less rational, which justified treating them as chattel or, at best, second-class citizens. The same could be said of non-white people, over whom it was held to be not just the right but the duty of the white man to rule. Rudyard Kipling’s poem The White Man’s Burden (1899) begins with the lines: Take up the White Man’s burden/ Send forth the best ye breed/ Go bind your sons to exile/ To serve your captives’ need/ To wait in heavy harness/ On fluttered folk and wild/ Your new-caught, sullen peoples/ Half-devil and half-child.
People deemed to be less rational—women, non-white people, the lower classes, the infirm, the ‘deviant’—were not just disenfranchised but dominated, colonized, enslaved, murdered, and sterilized, with impunity. Only in 2015 did the U.S. Senate vote to compensate living victims of government-sponsored sterilization programmes for, and I quote, the ‘feeble-minded’. Today, of all people, it is the white man who most fears artificial intelligence, imagining that it will usurp his status and privilege.
According to one oft-cited paper, I.Q. is the best predictor of job performance. But that is not altogether surprising given that ‘performance’ and I.Q. have been defined in similar terms, and that both depend, at least to some extent, on third factors such as compliance, motivation, and educational attainment.
Genius, in contrast, is more a matter of drive, vision, creativity, and luck or opportunity, and it is notable that the threshold I.Q. for genius—probably around 125—is not all that high. William Shockley and Luis Walter Alvarez, who both went on to win the Nobel Prize in Physics, were excluded from the Terman Study of the Gifted on account of… their unremarkable I.Q. scores.
As a footnote, in later life Shockley developed controversial views on race and eugenics, setting off a debate over the use and applicability of I.Q. tests.
Salovey P & Mayer JD (1990): Emotional intelligence. Imagination, Cognition and Personality 9(3):185–211.
Ree MJ & Earles JA (1992): Intelligence Is the Best Predictor of Job Performance. Current Directions in Psychological Science 1(3):86–89.
Saxon W (1989): Obituary: William B. Shockley, 79, Creator of Transistor and Theory on Race. New York Times, August 14, 1989.