Posted on November 9, 2011

Lightbulb Moment Shows Intelligence Can Be Learnt

Jennifer Oriel, The Australian, November 9, 2011

Intelligence is higher education’s secret vice. Everyone wants more of it, but no one wants to acknowledge it exists.

A 2009 survey of Cambridge University students revealed that 10 per cent were taking drugs to boost their brainpower and recent studies across US campuses have found a quarter of students high on artificial intelligence.

While recent revelations from neuroscience illustrate that intelligence is created through a constant interchange between genes and environment, the IQ wars continue to make scientific exploration of the brain controversial.

Human intelligence was once perceived as a fertile field of scholarly inquiry. French psychologist Alfred Binet created intelligence testing in the early 20th century to effect changes in education, assisting students with disabilities to learn. However, his tests were soon transformed by Stanford University psychologist Lewis Terman into quantitative IQ measures to examine mass populations and prosecute a eugenicist agenda.

By the 1960s, IQ tests were being used to support claims of fixed variations in intelligence based on sex, class and race, research that was revived in the 1990s with the publication of The Bell Curve, by psychologist Richard Herrnstein and political scientist Charles Murray.

Despite mounting public furore in the early 21st century, key educational and scientific figures maintained that IQ was a decisive factor in the relative educational and economic success of human populations. In 2005, then-president of Harvard University Lawrence Summers ignited a storm of controversy by suggesting that innate sex-based differences in aptitude, rather than political discrimination, were the cause of women’s under-representation in science; he resigned the following year.

Two years later, Nobel Prize-winning geneticist James D. Watson was suspended as chancellor of Cold Spring Harbor Laboratory after claiming that IQ tests had proven the existence of race-based differences in intelligence.

The journal Nature captured the furore over academic research into intelligence with a series of articles in 2009 debating whether it should be permitted.

Those seeking an end to the research pointed to the history of science being used to justify crimes against humanity, as in the eugenics policies of Nazi Germany.

As the IQ debate raged past its century mark, neuroscientists were applying advanced technologies to the question of whether and how the brain makes intelligence. Neuroimaging techniques that measure the structure and activity of the brain lent support to the long-standing distinction between two types of intelligence, one crystallised and one fluid, both variably influenced by the interplay between genes and environment. Scanning also revealed that the brain changes its structure in response to external stimuli, corroborating the theory of neuroplasticity.

In the IQ wars, the brain had emerged triumphant and it was a lot smarter than the arguments raging in its name.

The effort to understand intelligence has gained a more popular following during the past two decades because of research illustrating its relationship to educational and life outcomes. University of Edinburgh psychologist Ian Deary and colleagues reported in Nature Reviews Neuroscience that general intelligence is strongly predictive of social mobility, health and life expectancy. But it is the field of neuroscience that has produced the most remarkable findings about humanity’s intellectual capacity.

Neuroscientists Rex Jung of the University of New Mexico and Richard Haier of the University of California, Irvine were among the first researchers to adapt brain-imaging technology to the study of intelligence. By combining MRI scans, which map brain structure, with PET scans, which observe what happens inside the brain as we think, Jung and Haier deduced that intelligence arises from the density of white and grey matter in certain regions, combined with the strength and efficiency of neural connectivity: how well the different parts of the brain communicate.

Haier has proposed that neuroimaging should be used in education to assess student aptitudes, providing for pedagogy and curriculum adapted to observable cognitive strengths. But a related finding of neuroimaging research, neuroplasticity, reveals that the brain students have when they begin a course of education may be markedly different to the one in their heads at graduation.

The best-known advocate of neuroplasticity is psychiatrist Norman Doidge, whose bestseller The Brain That Changes Itself argued that mental experience leaves a direct imprint on the brain, down to the level of genetic expression. Doidge’s work revealed that repeated exposure to positive stimuli created neural pathways that supported healthier brains and behaviour, while exposure to negative stimuli such as addictive substances wired harmful patterns, such as drug addiction, into the brain.

Rutgers University neuroscientist April Benasich’s work on infant learning has advanced neuroplasticity research by introducing neuroeducation to improve the cognitive function of infants with developmental delays. She has found that when the brain detects something new, its neural circuitry is rapidly interrupted as it seeks to recognise and comprehend the stimulus. Neuroimaging has shown that the brain literally lights up as it learns. The lightbulb moment, it turns out, is more than metaphor.

Benasich’s belief in the interactive nature of cognition has led to a pioneering form of neuroeducation in which infants, offered dual screens in their cribs, learn to respond more rapidly to alternating images, coding faster reaction times into their brains. This acquired ability improves their response to visual cues and, therefore, their reading ability.

Benasich says: “My research firmly supports not only the existence, but the critical importance of, neuroplasticity . . . the development of intelligence proceeds in an intensively interactive way that is constantly changing and is influenced by genetic expression, temperament, as well as previous environmental influences.”

The realisation that intelligence can be learnt is appealing, especially given the demands on the human mind in an increasingly interactive and globalised world. Science and technology have transformed our understanding of human potential, with computer screens beaming a spectacular synaptic dance as the brain navigates its regions to acquire knowledge and make meaning from it.

It took science to reveal that the beauty and diversity of the brain as it learns how to be intelligent are not only philosophical but also physical. The questing human mind is a symphony played sotto voce.