“If I believe something then I’ll say it,” Nobel laureate geneticist James Watson recently warned the Times of London’s Charlotte Hunt-Grubbe. And sure enough, the 79-year-old Watson, who co-discovered the structure of DNA in 1953, proceeded to shoot from the lip, bringing his stellar career to a crashing end in the process.
Watson, who is as famous for making controversial comments as he is for his science, said, among other things, that he’s “inherently gloomy about the prospect of Africa” because “all our social policies are based on the fact that their intelligence is the same as ours—whereas all the testing says not really.”
Not content with impugning an entire continent, Watson drove the matter home, remarking that while he hopes everyone is equal, “people who have to deal with black employees find this is not true.”
The reaction to Watson’s latest tirade was swift: His British book tour—to promote his memoir, the aptly titled Avoid Boring People—was abruptly cancelled, and he was suspended from his position as chancellor of New York’s Cold Spring Harbor Laboratory.
Upon returning to the United States, Watson issued an apology, saying “I cannot understand how I could have said what I am quoted as having said. There is no scientific basis for such a belief.”
But the damage having been done, Watson voluntarily retired from the laboratory, which had become a world-class research institution under his leadership. And so the career of a man who had spent more than half a century in the scientific spotlight finally went dark.
While many people applauded these developments, others suggested that Watson was the victim of political correctness. Watson, the latter group argued, is just the latest casualty in an environment that doesn’t allow people to ask dangerous questions—or test dangerous hypotheses, such as ones involving racial difference—even in a university.
Now I’m all for testing any hypothesis, no matter how offensive it might seem, provided the hypothesis is conceptually clear. But hypotheses involving intelligence and race are inevitably hampered by the fact that intelligence and race are themselves unclear concepts.
The concept of intelligence—a complex collection of cognitive capacities—has, for example, been the subject of controversy for many years in many fields. While we still have much to learn, we do know that intelligence probably involves half of all our genes, which interact with each other and with the environment in myriad ways, most of which remain unmapped; our knowledge here is still in its infancy.
Consequently, the practice of testing intelligence has been even more controversial. Some critics have even charged that IQ tests measure nothing more than one’s ability to do well on IQ tests; while that’s probably an overstatement, it reveals the deep distrust of our methods for measuring intelligence.
The concept of race presents even greater difficulties. The debate between “race realists”—those who believe that the concept of race picks out something real in nature—and “social constructionists”—those who argue that race is created by our way of categorizing nature—shows no signs of abating, which isn’t surprising since it’s a species of the ancient, unresolved philosophical debate between realism and nominalism.
Although social constructionism is now associated with postmodernism, which is often anti-scientific, most scientists and scientific organizations have for some time accepted that race is a product of our way of grouping people, not something that exists independently of social classification. Or, as a 2001 New England Journal of Medicine editorial bluntly put it, “race is biologically meaningless.”
The scientific community came to this conclusion, not under the influence of the airy philosophy of the postmodernists, but as a result of evidence from genetics, the very field Watson helped popularize. Perhaps the single most influential figure, though, was former Harvard geneticist Richard Lewontin.
In 1972, Lewontin published a groundbreaking analysis that showed the vast majority of genetic variation occurs within races, while very little variation serves to distinguish the races. More recent analyses have largely replicated Lewontin’s results, though they have shown that intra-race variation is even greater than Lewontin thought.
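The logic of Lewontin’s finding can be illustrated with a toy calculation. The sketch below uses made-up allele counts for a single genetic marker in three hypothetical populations (this is not Lewontin’s data or his exact method, which examined many blood-group and protein markers); it simply partitions the total variation into a within-group and a between-group component, the same kind of decomposition underlying his result.

```python
# Toy illustration (hypothetical data) of partitioning genetic variation
# into within-group and between-group components, in the spirit of
# Lewontin's 1972 analysis.

def variance(values):
    """Population variance of a list of numbers."""
    mean = sum(values) / len(values)
    return sum((v - mean) ** 2 for v in values) / len(values)

# Each population: per-individual allele counts (0, 1, or 2 copies)
# for one marker. The numbers are invented for illustration only.
populations = {
    "pop_a": [0, 1, 1, 2, 1, 0, 2, 1],
    "pop_b": [1, 1, 0, 2, 1, 1, 0, 2],
    "pop_c": [2, 1, 1, 0, 1, 2, 1, 1],
}

everyone = [x for pop in populations.values() for x in pop]
total_var = variance(everyone)

# Within-group component: average variance inside each population.
within = sum(variance(pop) for pop in populations.values()) / len(populations)

# Between-group component: variance of the population means.
# (With equal-sized groups, within + between equals the total variance.)
between = variance([sum(pop) / len(pop) for pop in populations.values()])

print(f"within-group share:  {within / total_var:.1%}")
print(f"between-group share: {between / total_var:.1%}")
```

In this toy example, almost all of the variation falls within the populations rather than between them; Lewontin found the same qualitative pattern for real human genetic data, with roughly 85 per cent of variation occurring within groups.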
Looking at this in a slightly different way, geneticists have long claimed that all human beings, regardless of race, share about 99.9 per cent of their functional genes. And this suggests that, despite obvious differences in how members of different races look, race really must be only skin deep.
Curiously though, recent evidence from genetics seems to have undermined these arguments. With the decoding of a complete human genome earlier this year, scientists have now concluded that we’re not as similar as we thought we were—people share perhaps 99 per cent, rather than 99.9 per cent, of their genes. (The complete genomes of two individuals are currently available on the Web—that of geneticist Craig Venter and that of, you guessed it, James Watson.)
Although Venter still maintains that race is a social fiction, some race realists have suggested that this new evidence confirms that race is indeed a biological reality. To bolster their argument, they note what is undeniably true—that even extremely small genetic differences can result in huge differences in how an organism looks or behaves.
So does this mean that race is a biological reality? To be sure, there are many small genetic differences among human populations, especially those that evolved in isolation from each other. (Watson makes this point in Avoid Boring People; had he left it at that, he would never have gotten himself into trouble.)
But as Kenan Malik, the author of The Meaning of Race, and probably the smartest person writing on race today, notes, “the real debate about race is not whether there are any differences between populations, but about the significance of such differences.”
Indeed, biological differences will always exist, but it will be up to us to give meaning to those differences, to decide whether, and in what context, they are significant.
For example, biological differences might be significant in medicine, in that certain populations are at increased risk of developing certain diseases. And many race realists are physicians who fear that dropping the notion of race will make diagnostic medicine more difficult.
Yet race is a poor predictor of susceptibility to disease. Malik notes that in North America, sickle cell anemia is seen as a black disease. However, while relatively common in equatorial Africa, it’s rare in South Africa, and relatively common in southern Turkey and central India. In other words, the disease doesn’t fit our socially constructed view of the races—it is, rather, common among groups that live in areas with a high incidence of malaria, regardless of what race they are.
Similarly, while different populations might well display different cognitive capacities, there’s no reason to believe those differences will conform to our social view of races. And it’s absurd to group together all the populations of a continent, especially the continent of Africa, whose populations contain all the genetic variation that exists in humanity.
Our folkloric way of dividing up the races—into white, black, Asian and so forth—therefore has little basis in genetics, which spells trouble for those who wish to connect race with intelligence. In fact, genetics, far from underwriting our socially constructed races, might well serve to deconstruct them.
After all, the ideal in both medicine and morality has always been to treat people as individuals. But until recently—until the decoding of Watson’s and Venter’s genomes—that wasn’t possible. Now, geneticists suggest that it will soon be possible to decode any individual’s genome relatively easily and cheaply.
And this means that we will no longer have to group people together on the basis of their race or any other factor. Genetics will render race superfluous. And of all the people we’ll have to thank for that, none will be more important than—you guessed it again—James Watson.