Michael Kanellos
The foundation of modern computing was something of an accident.
The Intel 4004 microprocessor, which debuted thirty years ago Thursday, sparked a technological revolution because it was the first product to fuse the essential elements of a programmable computer into a single chip.
Since then, processors have allowed
manufacturers to embed intelligence into PCs, elevators, air bags,
cameras, cell phones, beepers, key chains and farm equipment, among other
devices.

But that's not the way the story was supposed to turn out. The 4004 was designed to be a calculator component for a Japanese manufacturer, which initially owned all rights to the chip. At the time, most Intel executives saw little promise in the product. The microprocessor's transformation of Intel and other PC-centric companies into titans of industry instead came through clever bargaining, some fortuitous design decisions and chance.

"I think it gave Intel its future, and for the first 15 years we didn't realize it," said Intel Chairman Andy Grove. "It has become Intel's defining business area. But for...maybe the first 10 years, we looked at it as a sideshow. It kind of makes you wonder how many sideshows there are that never become anything more."

In the past 30 years, of course, microprocessors and microcontrollers (embedded microprocessors with integrated components) have become ubiquitous. In 2000 alone, 385 million microprocessors were shipped and 6.4 billion microcontrollers went out factory doors, according to Mercury Research.

"It is not an exaggeration to say that the microprocessor has made a fundamental impact on everyone's life in this country," said Linley Gwennap, principal analyst at The Linley Group. "Before the microprocessor, computers were these huge things...that filled up a room or at least were file cabinet size."
The chip trio

The 4004 was essentially the brainchild of three engineers: Ted Hoff, Stan Mazor and Federico Faggin.

In April 1969, Busicom, a Japanese calculator manufacturer, contracted with Intel, then specializing in memory, to develop a series of custom chips for five upcoming machines. The concept had been considered inevitable; the difficulty lay in how to do it. Mazor, a former Fairchild Semiconductor engineer, joined Hoff to develop a design.

Economically, a single chip was imperative. Busicom's original specifications "would have taken about 16 different chips," recalled Les Vadasz.

Cost-conscious Intel also required that the calculator chip fit into
the same 16-pin package the company used on its memory products. Pins, the
metallic channels on a pin package, serve as conduits for electrical
signals.
"We were very careful in being minimalistic," Mazor said.
"Management wasn't too interested in (the 4004). We got into the
computer business more or less by mistake."
After Hoff and Mazor completed the conceptual architecture, Intel's
Vadasz lured Faggin from Fairchild in April 1970 to construct the chip.
Like Hoff, Faggin had already established a reputation within the
industry. He had developed silicon gate technology, which allowed designers to abandon the far larger, harder-to-control aluminum transistor gates.
Silicon gate technology "was smaller, faster, more reliable,
cheaper. What more do you want?" Faggin said.
To this day, disagreements swirl over who deserves the most credit for
the 4004. The architecture guaranteed the chips would work, said Mazor,
calling Faggin "the guy who stayed up all night and tested them to
see if they worked."
For his part, Faggin said that "anybody with a college degree
could design an instruction set," a fundamental part of Hoff and
Mazor's work in 1969--an opinion shared by some analysts. Mazor even
admits that he and Hoff borrowed liberally from IBM and Digital
instruction sets. Vadasz, who had a bitter falling out with Faggin in the
1970s, credits Hoff because he came up with the necessary creative
conceptual leaps.
In any event, deadlines had already become a crisis. On Faggin's second
day on the job, Masatoshi Shima, a Busicom engineer, arrived to check on
the project's progress. No work had been done since December. Shima hit
the roof.
"It was very close" to falling apart, Faggin recalled.
"It took me the best part of one week to calm him down."
Nonetheless, Busicom granted an extension to the contract.
Fourteen-hour workdays for Faggin and three drafting assistants followed.
Unlike current designers, who use high-end workstations to design
circuits, Faggin's team laid out circuit patterns with razor-thin strips
of rubylith, a design tape now considered archaic even by newspaper
layout rooms.
While the 4004 became the first microprocessor, Intel's total package
consisted of four chips: the 4004 processor itself; the 4001, a read-only memory (ROM) chip for storing software; the 4002, a random access memory (RAM) chip for data storage; and the 4003, an input-output device. By October,
working samples of the 4001 had been produced--a milestone.
"Before that time, I was under a lot of stress because I didn't
know if there were any 'gotchas,'" Faggin said.
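
As a rough way to picture how those four chips divided the work, here is a deliberately simplified toy model in Python. It is purely illustrative: the opcodes, the program and the data are invented for this sketch and do not reflect the 4004's real instruction set; only the division of roles (processor, program ROM, data RAM, I/O) follows the description above.

```python
# Hypothetical toy model of the four-chip split: the processor (4004 role)
# steps through a program held in ROM (4001 role), keeps working data in
# RAM (4002 role), and sends results out through an I/O port (4003 role).
# The instruction set here is invented for illustration only.

ROM = [            # program store: read-only list of (opcode, operand) pairs
    ("LOAD", 7),   # put the constant 7 into the accumulator
    ("ADD", 0),    # add the value in RAM cell 0
    ("STORE", 1),  # write the accumulator into RAM cell 1
    ("OUT", 1),    # send RAM cell 1 to the output port
    ("HALT", 0),
]
RAM = [5, 0]       # data store: two small cells
outputs = []       # stands in for the I/O chip

acc, pc = 0, 0     # accumulator and program counter inside the "CPU"
while True:
    op, arg = ROM[pc]
    pc += 1
    if op == "LOAD":
        acc = arg
    elif op == "ADD":
        acc = (acc + RAM[arg]) & 0xF   # keep results 4 bits wide
    elif op == "STORE":
        RAM[arg] = acc
    elif op == "OUT":
        outputs.append(RAM[arg])
    elif op == "HALT":
        break

print(outputs)     # [12]: code from ROM, data from RAM, result to I/O
```

The point of the sketch is only the split the article describes: the processor executes code held in the ROM, uses the RAM for working data, and hands results to the I/O stage.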
Despite early success, the first batch of 4004 chips didn't work--a
quick look through a microscope showed the manufacturing team had
forgotten a crucial step. The memory still prompts a big laugh from
Faggin. Although the delays angered Busicom, the extension handed Intel its
first fortunate twist of fate. Some Intel insiders began to comprehend the
power of the invention, helped along by lobbying from the three inventors.
Intel founder Bob Noyce, for instance, started to question whether the
4004 had broader implications, recalled Vadasz.
Meanwhile, the calculator business had become more cutthroat. By the
time Intel finished the 4004, Busicom wanted a discount. Intel made a
counteroffer: It would drastically cut the contract price if Busicom would
grant Intel a license to freely sell the chip outside the calculator
market. Busicom agreed.
Whoops.
Mixed reaction

An article in Electronic News heralded the release of the 4004. It processed 4 bits of data at a time, ran at 108 kilohertz (roughly a tenth of a megahertz) and could perform mathematical calculations. It cost less than $100. Gordon Moore, Intel's CEO at the time, hailed it as "one of the most revolutionary products in the history of mankind."

Others were less excited. "It was interesting, but it certainly wasn't perceived as a threat," said Nathan Brookwood, a processor analyst who was at that point working at Digital Equipment, the then-reigning titan in mini-computers.

Years later, many still failed to grasp the concept. In 1975, a senior engineer at DEC told Brookwood that Intel would "never be a threat...That was the conventional wisdom in the mini-computer business in the mid-1970s to late 1970s."

In April 1972, Intel released the 8008, which could process data in 8-bit chunks. Negotiations once again worked to Intel's advantage. The 8008 chip was designed for Datapoint, a terminal manufacturer in Texas that couldn't pay for it at the end of the contract. To settle, Datapoint granted Intel the rights to the chip, including the instruction set, which Datapoint had developed. The instruction set eventually became part of the basis for the x86 architecture behind Intel chips today. "The irony is that the original instruction set was theirs, and the original motivation was theirs," Mazor said.

The breakthrough moment for microprocessing came in 1974, according to many, with the 8080 processor. Not only did the chip feature a more complex instruction set, it came in a package with 40 pins, two innovations that greatly expanded its capabilities. "With 4-bit processors, the level of complexity is minimal," said Dean McCarron, principal analyst at Mercury Research. "The 8080 was a home run."
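
McCarron's point about 4-bit complexity comes down to arithmetic: any value wider than 4 bits has to be handled a nibble at a time, with the carry shuttled along by hand. The sketch below is a hypothetical illustration in Python (the function names and coding style are invented here, not Intel code, and the real 4004 instruction set differs in detail); it contrasts adding two 8-bit numbers on a 4-bit data path with the single operation an 8-bit chip such as the 8080 can use.

```python
# Hypothetical illustration: adding two 8-bit numbers on a 4-bit data path
# requires working one nibble (4 bits) at a time and propagating a carry,
# whereas an 8-bit processor handles the same sum in one operation.

def add_8bit_on_4bit_machine(x: int, y: int) -> int:
    """Add two 8-bit values using only 4-bit additions, as a 4-bit CPU must."""
    lo = (x & 0xF) + (y & 0xF)              # low-nibble add
    carry = lo >> 4                          # carry out of the low nibble
    hi = (x >> 4) + (y >> 4) + carry         # high-nibble add, plus carry
    return ((hi & 0xF) << 4) | (lo & 0xF)    # result truncated to 8 bits

def add_8bit_on_8bit_machine(x: int, y: int) -> int:
    """The same sum as a single 8-bit operation."""
    return (x + y) & 0xFF

if __name__ == "__main__":
    a, b = 0x3A, 0x58
    assert add_8bit_on_4bit_machine(a, b) == add_8bit_on_8bit_machine(a, b) == 0x92
    print(f"{a:#04x} + {b:#04x} = {add_8bit_on_4bit_machine(a, b):#04x}")
```

Every wider value costs extra instructions and carry bookkeeping on the narrow machine, which is one concrete reason the move to 8 bits so greatly expanded what these chips could practically do.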
So why Intel?

By this time, though, competitors such as RCA, Honeywell, Fairchild and Motorola had come out with microprocessors, and many of them, such as Motorola's 6800 family, provided superior performance. Zilog, whose engineers included Faggin and former Busicom engineer Shima, received rave reviews for its Z80 processor.

So how did Intel emerge as the victor? For one, the company strove to ensure that
adoption was as easy as possible. Along with chips, Intel sold complete
development systems to industrial designers to seed software development. "In a way, through that project, we
had the first PC, but we never capitalized on it," Vadasz said.
"With the emergence of the PC, that business disappeared."
Competitors also miscalculated demand.
National Semiconductor, for instance, marketed an expensive 16-bit chip in
an 8-bit world, recalled Mazor. "Everybody did everything else wrong,
and they did it with great effort," he said.
But most importantly, IBM selected the
Intel 8088 for the first PC in 1981. IBM had two PC projects: one in
Austin, Texas, and one in Florida. The Austin project relied on a Motorola
processor, but delays made IBM favor the Florida project.
"You can't underestimate the
importance of the IBM deal," McCarron said. "If it wasn't for
that, we'd be talking about Motorola vs. AMD."
Or not. In a final twist in the early
years, IBM required that Intel find a second source for the chip. The
company turned to AMD, signing a licensing agreement that effectively
helped create its lead competitor today.
This article was published in CNET News.com Special Reports on November 14, 2001. It has been posted on this Web site with the permission of CNET.

James Redin.