Thought patterns

The “cognitive revolution” may not have represented a paradigm shift, but it forever changed how we think about thinking

Illustration by Ben Coy

The mind wasn’t always held in high regard.

For the first half of the 20th century, thinking in experimental psychology was dominated by behaviourism. All researchers needed to do was show that behaviours were responses to surrounding environments. In such explanations, mental states were superfluous.

“The objection to inner states is not that they do not exist, but that they are not relevant in a functional analysis,” wrote B.F. Skinner, the most extreme and famous of the behaviourist bunch.

In line with this thinking, he created the “Skinner box,” which allowed researchers to train animals how to respond to a given stimulus. Within this tightly controlled chamber, rats learnt how many times they needed to press a lever to get a pellet of food, and pigeons could be trained to recognise themselves in mirrors — the classic (though problematic) way of testing whether an animal has self-awareness.

But such a simplistic interpretation of the mind wasn’t to last.

In September 1956, George Miller made the short trek from Harvard to MIT for a symposium on information theory — a mathematical framework for measuring how information is encoded and transmitted. Miller was there to present his findings about the limitations of human memory. But he wasn’t the only one applying the mathematical theory to the human mind: others there had used it to study vision, language, and the brain.

“I left the symposium with a conviction, more intuitive than rational, that experimental psychology, theoretical linguistics, and the computer simulation of cognitive processes were all pieces from a larger whole,” Miller later wrote.

The behaviourist B.F. Skinner (right) at Lausanne Collegiate School in 1964. Lausanne Collegiate School Archives/Wikimedia Commons (CC BY-SA 4.0)

That larger whole would eventually be forged through what’s known as the “cognitive revolution.” To Miller and his intellectual ilk, the mind not only mattered but also processed vast amounts of information, processing that was worth studying in its own right. Behaviourism endured, but by the 1970s, cognitive psychology was flourishing alongside it. So was cognitive science, a field that blended elements of psychology, philosophy, linguistics, computer science, anthropology, and neuroscience.

In the late 1940s, when Skinner was putting rats and pigeons in boxes, Miller came across a new paper from the mathematician Claude Shannon outlining information theory. At its core, Shannon’s theory concerned itself with signals, or messages, sent from one point to another. These could be communicated via wires, radio waves, or even beams of light. The information they carried could be measured in discrete units, called bits.
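To make the unit concrete: in Shannon’s framework, a message chosen from N equally likely alternatives carries log2(N) bits. Here is a minimal sketch of that measure (the example alphabets are mine, not Shannon’s):

```python
import math

# Shannon's measure for equally likely alternatives: a message
# drawn from n options carries log2(n) bits of information.
def bits(n_alternatives: int) -> float:
    return math.log2(n_alternatives)

print(bits(2))   # one binary digit: 1.0 bit
print(bits(10))  # one decimal digit: ~3.32 bits
print(bits(26))  # one letter of the alphabet: ~4.70 bits
```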

Through this lens, Miller saw explanations for how speech perception worked — a problem that had long eluded him. “My life was never again the same,” he wrote in a brief autobiography.

At first, Miller tried the theory on, what else, behaviour; behaviourism was still the school du jour. Along with his colleague Frederick Frick, he showed it was possible to work out how likely a certain response was, given all of the responses that had come before.
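A toy sketch of that kind of sequential analysis, assuming, as a simplification rather than Miller and Frick’s actual method, that only the immediately preceding response matters:

```python
from collections import Counter, defaultdict

# Estimate P(next response | previous response) from an observed
# sequence -- a first-order approximation of the sequential
# statistics Miller and Frick applied to behaviour.
responses = list("LRLLRLRRLLRL")  # e.g. left/right lever presses

transitions = defaultdict(Counter)
for prev, nxt in zip(responses, responses[1:]):
    transitions[prev][nxt] += 1

for prev, counts in sorted(transitions.items()):
    total = sum(counts.values())
    for nxt, count in sorted(counts.items()):
        print(f"P({nxt} | {prev}) = {count / total:.2f}")
```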

By 1956, however, he had immersed himself in the study of short-term memory. You may have heard of this one. He reported that humans could temporarily store around seven discrete chunks of information, plus or minus two. This held regardless of whether people had been asked to remember binary digits, decimal digits, letters, letters mixed with digits, or words.

But there was a workaround to this limit. If what mattered was the number of mental chunks, and not the number of bits of information, then perhaps we could enlarge our memory span by increasing the number of bits per chunk. Indeed, when Miller asked participants to remember 18 digits grouped in pairs, they had no problem reciting back the 9 chunks. Together, the results cast the mind as a bona fide information processor, not unlike the computers of the era.
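The arithmetic behind the workaround, as a rough sketch (the digit string is a made-up example, not one of Miller’s stimuli):

```python
import math

# 18 digits exceed the 7 +/- 2 span if each digit is its own chunk,
# but recoding them into two-digit pairs yields 9 chunks -- within
# the span -- while doubling the information carried per chunk.
digits = "372519480263915847"  # 18 digits, hypothetical example
pairs = [digits[i:i + 2] for i in range(0, len(digits), 2)]

print(pairs)                   # 9 two-digit chunks
print(f"{len(pairs)} chunks, ~{2 * math.log2(10):.1f} bits each")
```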

Other studies pointed to the same idea. That same year, Jerome Bruner, Jacqueline Goodnow, and George Austin published their landmark work on how humans construct concepts. In learning which things belong together and which do not, people seemed to test hypotheses based on previously helpful information. But there were quirks to their learning. For example, they showed a bias for events that confirmed their hypotheses rather than those that refuted them.

To Miller, these shifts in psychology’s focus culminated in that gathering of like-minded colleagues at MIT. A new era of the mind had been ushered in.

 
A painted portrait of Claude Shannon, the 'father of information theory', photographed at the Abode of Chaos near Lyon, France. Thierry Ehrmann/Flickr (CC BY 2.0)

As with plate tectonics, the formative years of cognitive psychology coincided with the rising popularity of Thomas Kuhn’s The Structure of Scientific Revolutions. Perhaps not surprisingly, some cognitive psychologists described the birth of their field in Kuhnian terms.

“Now information-processing psychology is an established paradigm, and it guides the vast bulk of psychological research in human cognition. Our revolution is complete, and the atmosphere is one of normal science,” wrote psychologists Roy Lachman, Janet Lachman, and Earl Butterfield in their 1979 book Cognitive Psychology and Information Processing.

When I was a psychology undergraduate, I was well aware of the cognitive revolution. Upon hearing the phrase for the first time, I immediately thought of the shift experienced by physics as it left behind Newton’s thinking and embraced that of Einstein.

But this period’s name belies its true nature. John Greenwood, a historian and philosopher of psychology at the City University of New York, has argued that the cognitive revolution was not in fact a revolution — at least, not the kind described by Kuhn.

“In a scientific revolution, people try to explain the same things,” says Greenwood. “Einstein explains the things that Newton’s theory explains better than Newton. It covers everything Newton’s covers and resolves its problems.”

Like behaviourists, cognitive psychologists tried to explain behaviour. But they were just as concerned, if not more so, with explaining cognition. They asked questions about how cognition worked — like “Does cognition work like a regular computer?” or “Do we process information in parallel or in series?” — independent of how it shaped behaviour.

“Scientific revolutions are about theoretical paradigms, and you just didn’t have one for behaviourism,” Greenwood tells me. Cognitive psychologists were hardly “reacting to the inadequacies of behaviourist explanations”; they were inspired by innovations happening outside the strict confines of their discipline.

Greenwood isn’t alone in his position. Analysing interviews with key figures from the cognitive revolution, William O’Donohue, Kyle Ferguson, and Amy Naugle concluded in a 2003 study that there was no evidence psychology had witnessed a scientific revolution. The interviewees neither spoke of a critical mass of anomalies amassed during behaviourism’s reign, nor did they see cognitive psychology as a better paradigm that could account for such anomalies.

Miller himself had said “I wouldn’t use words like ‘revolution’”, and an anonymous source told them: “If there was any negative aspect of [behaviourism] that contributed to these developments (but never determined them) it was the lack of attention to major social questions, to complex human behavior and to any kind of innovative theory”.

In Structure, Kuhn himself argued that psychology had yet to become a mature science, equipped with a real paradigm. Without a starting paradigm, it’s hard to see how there could have been a shift between two.

Still, the legacy of the cognitive revolution is hard to ignore. Last this writer checked, Miller’s memory study had been cited 28,192 times. Beyond cognitive psychology and cognitive science, there are now cognitive anthropology, cognitive linguistics, cognitive neuroscience, and cognitive neuropsychology. Read through a modern-day scientific paper and you may at once learn about experiments in a Skinner box and the mechanisms of reward processing.

Greenwood sums up this lasting influence in his history of psychology: “One thing is certain. The cognitive revolution is an ongoing revolution, and theories of cognitive processing continue to be developed in creative and fertile ways.”

Edited by Diana Crow and Tessa Evans