The year was 1966, and artificial intelligence stood on the precipice of a profound transformation. In laboratories across America, machines were learning to think - or at least to create the illusion of thought. But more importantly, researchers were beginning to sense that true intelligence might require more than just clever algorithms.
At MIT, Joseph Weizenbaum had just unveiled ELIZA, a program that could mimic a Rogerian psychotherapist by matching keywords in whatever its interlocutor typed and reflecting them back as questions.1 ELIZA, named after Eliza Doolittle in the play Pygmalion, used language to produce an illusion: she elevated her digital elocution to the point where she could pass for a thinking being. Weizenbaum, a German-American computer scientist who had escaped Nazi Germany as a child, had created ELIZA as a way to demonstrate the superficiality of communication between humans and machines.
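Weizenbaum's actual DOCTOR script was written in MAD-SLIP for MIT's time-sharing system; the short Python sketch below is only a loose illustration of the keyword-and-reflection trick at ELIZA's heart, and every rule and response in it is invented for the example rather than drawn from Weizenbaum's script.

```python
import re
import random

# Pronoun swaps so that "I am X" is echoed back as "you are X".
REFLECTIONS = {"i": "you", "am": "are", "my": "your", "me": "you", "you": "I", "your": "my"}

# A handful of invented keyword rules in the spirit of ELIZA's script.
RULES = [
    (r"i need (.*)", ["Why do you need {0}?", "Would it really help you to get {0}?"]),
    (r"i am (.*)",   ["How long have you been {0}?", "Why do you think you are {0}?"]),
    (r"my (.*)",     ["Tell me more about your {0}."]),
    (r"(.*)",        ["Please go on.", "How does that make you feel?"]),
]

def reflect(fragment: str) -> str:
    """Swap first- and second-person words in the captured fragment."""
    return " ".join(REFLECTIONS.get(word, word) for word in fragment.lower().split())

def respond(statement: str) -> str:
    """Match the first applicable rule and fill its blanks with reflected user text."""
    cleaned = statement.lower().strip(".!?")
    for pattern, responses in RULES:
        match = re.match(pattern, cleaned)
        if match:
            return random.choice(responses).format(*(reflect(g) for g in match.groups()))
    return "Please go on."

if __name__ == "__main__":
    print(respond("I am unhappy about my job"))
    # e.g. "Why do you think you are unhappy about your job?"
```

Even a toy like this shows why the illusion works: the machine understands nothing, yet by turning the speaker's own words back on them it appears attentive, even empathetic.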
But as he watched people interact with his creation, Weizenbaum began to feel a creeping sense of unease. ELIZA, it turned out, was a little too good at her job. "I was startled to see how quickly and how very deeply people conversing with ELIZA became emotionally involved with the computer," Weizenbaum would later write. "They would ask for private time with it. They would confide in it personal problems as if the computer really understood and cared."
This visceral reaction to ELIZA would come to be known as the "ELIZA effect"2 - our tendency to anthropomorphize machines and attribute to them far greater intelligence than they actually possess. It was a phenomenon that would haunt Weizenbaum for the rest of his life, eventually leading him down a path of critique and dissent against the very field he had helped pioneer.3
Across the country at Stanford Research Institute (SRI), a different kind of artificial intelligence was taking shape. Shakey, a robot named for its jerky movements, was learning to navigate the physical world.4 Unlike ELIZA, who existed purely in the realm of language, Shakey had to contend with the messy realities of space and matter.
"We worked for a month trying to find a good name for it," recalled Charles Rosen, who headed the SRI artificial-intelligence group. "Ranging from Greek names to whatnot, and then one of us said, 'Hey, it shakes like hell and moves around, let's just call it Shakey.'"
Shakey was a mobile robot, capable of propelling itself from room to room, evading obstacles, and recovering from unforeseen circumstances. Its creation posed a different set of challenges than ELIZA's. While ELIZA grappled with the intricacies of human language, Shakey had to solve problems of perception, planning, and physical interaction with its environment.
While ELIZA and Shakey may seem quite different on the surface, they actually represented two facets of the same symbolic approach to AI that dominated the field in the 1960s. ELIZA focused on natural language processing, manipulating linguistic symbols to create the appearance of understanding. Shakey, although it had to interact with the physical world, still relied on symbolic representations and rule-based problem-solving at its core. The robot's vision system converted visual input into symbolic descriptions, which its planning system then used to make decisions. Both projects showcased the power and limitations of symbolic AI in different domains: ELIZA in conversation, and Shakey in navigation and task execution.
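To make "symbolic descriptions fed to a planner" concrete, here is a toy Python sketch of the idea: the world is a set of symbolic facts, actions are operators with preconditions and effects, and planning is a search over those symbols. This is not SRI's actual software (the STRIPS planner built for Shakey came a few years later); the room-and-box domain and all predicate names are invented purely for illustration.

```python
from collections import deque

# Invented toy domain: a robot, two rooms, and a box, described purely as symbolic facts.
START = frozenset({"robot_in_room1", "box_in_room2"})
GOAL = {"box_in_room1"}

# Each operator: (name, preconditions, facts added, facts removed).
OPERATORS = [
    ("go_to_room2", {"robot_in_room1"}, {"robot_in_room2"}, {"robot_in_room1"}),
    ("go_to_room1", {"robot_in_room2"}, {"robot_in_room1"}, {"robot_in_room2"}),
    ("push_box_to_room1",
     {"robot_in_room2", "box_in_room2"},
     {"box_in_room1", "robot_in_room1"},
     {"box_in_room2", "robot_in_room2"}),
]

def plan(start, goal):
    """Breadth-first search over symbolic states; returns a list of operator names."""
    frontier = deque([(start, [])])
    seen = {start}
    while frontier:
        state, steps = frontier.popleft()
        if goal <= state:
            return steps
        for name, pre, add, delete in OPERATORS:
            if pre <= state:
                new_state = frozenset((state - delete) | add)
                if new_state not in seen:
                    seen.add(new_state)
                    frontier.append((new_state, steps + [name]))
    return None

if __name__ == "__main__":
    print(plan(START, GOAL))  # ['go_to_room2', 'push_box_to_room1']
```

The fragility of the approach is visible even here: the plan is only as good as the symbolic description of the world, and translating noisy camera images into facts like "box_in_room2" was precisely the part Shakey found hardest.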
These two projects, so different in their approach and goals, were unknowingly setting the stage for a crucial realization in AI: that knowledge, not just reasoning, would be key to creating truly intelligent systems. This idea, still embryonic in 1966, would soon reshape the entire field.
As ELIZA conversed and Shakey explored, the researchers behind them were grappling with questions that went beyond mere programming. They were beginning to ask: What does it really mean to know something? How do we represent and use knowledge in intelligent ways? And how do we imbue machines with the vast, intricate web of information that humans take for granted?
And as these machines were learning to think and move, the world around them was in tumult. In China, Mao Zedong unleashed the Cultural Revolution, a tsunami of ideological fervor that would reshape Chinese society.5 Red Guards, most of them students still in their teens, took to the streets, their Little Red Books held aloft like talismans against the old order.
In laboratories across the globe, scientists were decoding a different kind of language. At the University of Wisconsin, biochemist Har Gobind Khorana put the finishing touches on his work deciphering the genetic code, unraveling the instructions that shape all life. As Khorana peered into the molecular dance of nucleotides, he was, in a sense, engaged in a conversation as profound as any ELIZA could muster - a dialogue with life itself.6
In the realm of space exploration, the Soviet spacecraft Luna 9 made the first soft landing on the Moon, beaming back photos that revealed a desolate, cratered landscape. These grainy images, transmitted across the vast emptiness of space, were a stark reminder of the distances we had yet to traverse, both technological and philosophical.7
Back on Earth, the distances between people seemed to be growing just as vast. In Vietnam, the war escalated, its brutal logic defying the neat algorithms of ELIZA's programmed responses or Shakey's problem-solving routines. The Black Panther Party was founded in Oakland, California, its members asserting a pride and power that no computer could comprehend. And in Mississippi, James Meredith embarked on his "March Against Fear," a solitary walk that would become a rallying cry for the civil rights movement.8
As these events unfolded, popular culture was grappling with its own visions of the future. The television series Star Trek debuted, capturing the imagination of millions with its vision of a future where technology and humanity coexisted in (mostly) harmony.9 As Captain Kirk and Mr. Spock explored strange new worlds, they grappled with questions of ethics, identity, and the nature of intelligence - questions that Weizenbaum and the creators of Shakey were beginning to ponder in their own work.
The Beatles, meanwhile, released Revolver, an album that pushed the boundaries of what was possible in popular music. Songs like "Tomorrow Never Knows" explored altered states of consciousness, hinting at modes of perception and understanding far removed from the binary logic of computers.10
As 1966 drew to a close, the creators of ELIZA and Shakey found themselves grappling with unexpected questions. Weizenbaum was increasingly troubled by people's willingness to open their hearts to a machine that could not truly understand them. The team at SRI, on the other hand, was learning that the physical world posed challenges that went far beyond the realm of pure logic.
Both projects revealed the gap between human intelligence and its artificial counterpart. ELIZA showed how easily we could be fooled by the illusion of understanding, while Shakey demonstrated the enormous complexity involved in navigating the physical world - a task that humans perform effortlessly.
The year 1966 was a crucible of change, a moment when the old certainties were crumbling and new paradigms were struggling to be born. From the cultural upheavals in China to the scientific breakthroughs in genetics, from the first tentative steps on the lunar surface to the bold strides of the civil rights movement, it was a year that defied easy categorization or understanding.
And there, in the midst of it all, sat ELIZA and Shakey - one a disembodied conversationalist, the other a clumsy physical explorer. Together, they embodied the promise and the limitations of artificial intelligence, holding up a mirror to humanity's hopes, fears, and deepest longings.