
Moore was known for innovations in semiconductors that helped transform the computer into the defining tool of modern life. His prophecy that computing capacity would grow exponentially — and with decreasing costs — was dubbed Moore’s Law and became the chip industry’s standard for decades.

Intel Corp. co-founder Gordon E. Moore, whose innovations in the design and manufacture of semiconductor chips helped launch Silicon Valley and transform the computer into the ubiquitous, defining tool of modern life, died March 24 at his home in Hawaii. He was 94.

Intel announced the death but did not provide further details.

A central figure in the history of electronics, Dr. Moore famously predicted in 1965 that computer power would double each year for a decade, a forecast he modified in the mid-1970s to every two years. His prophecy that computing capacity would grow exponentially — and with decreasing costs — was dubbed Moore’s Law and became the standard that scientists for decades raced successfully to meet.

Making computers smaller, faster and cheaper meant integrating ever more circuitry onto slivers of silicon. Dr. Moore envisioned that these integrated circuits would “lead to such wonders as home computers — or at least terminals connected to a central computer — automatic controls for automobiles and personal portable communications equipment,” as he put it in the 1965 magazine article where he made his signature prediction.

Moore’s Law became the driving force in computer technology for the next half-century. “It’s what made Silicon Valley,” Carver Mead, the retired California Institute of Technology computer scientist who coined the phrase “Moore’s Law,” told the Associated Press on the law’s 40th anniversary.

“Innovation in electronics has as much to do with vision as it does with tinkering, and Gordon Moore saw the future better than anyone in the last 50 years,” said Michael S. Malone, author of “The Intel Trinity,” a 2014 history of the company. “The industry didn’t measure its performance by Moore’s Law. It designed and targeted its goals based on it, turning the law into a self-fulfilling prophecy.”

Intel led the rapid advance. In 1971, it introduced the first integrated circuit so powerful it could be called a “general-purpose programmable processor” — or microprocessor — the brain of a computer on a single chip. It had 2,300 transistors on a 12-square-millimeter piece of silicon, or a fraction of the size of a thumbnail.

“We are really the revolutionaries in the world today — not the kids with the long hair and beards who were wrecking the schools a few years ago,” Dr. Moore told a reporter at the time. (Today, Intel, still an industry leader, can put about 1.2 billion transistors in the same space.)
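The figures above make Moore's prediction easy to check with back-of-the-envelope arithmetic. A minimal sketch (assuming "today" means roughly 2021, since the article does not date the 1.2 billion figure precisely):

```python
import math

# Figures from the article: Intel's 1971 microprocessor had 2,300
# transistors; a modern chip of the same area holds about 1.2 billion.
transistors_1971 = 2_300
transistors_today = 1_200_000_000
years_elapsed = 2021 - 1971  # assumed endpoint; the article gives none

# How many doublings does that growth represent?
doublings = math.log2(transistors_today / transistors_1971)
years_per_doubling = years_elapsed / doublings

print(f"{doublings:.1f} doublings over {years_elapsed} years")
print(f"about {years_per_doubling:.1f} years per doubling")
```

The result — roughly 19 doublings, or one every 2.6 years — lands close to Moore's revised two-year forecast, which is the sense in which the law held for half a century.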

Dr. Moore knew that increases in computer power achieved by cramming more transistors into smaller chips eventually would run up against the laws of physics, with the size of an atom limiting the ability to shrink the silicon pathways on which electrons travel. But he cautioned against predicting “the end of progress” because scientists, he said, would continue to find ever more ingenious solutions.

“Every time someone declares Moore’s Law dead,” Malone said, “there’s some breakthrough.”

The integrated circuit

Dr. Moore started Intel in 1968 with physicist Robert Noyce. He was also a founder, with Noyce and six others, of Fairchild Semiconductor, established in 1957. Of Fairchild’s many inventions, two stand out as having revolutionized computing, and Dr. Moore had a significant hand in each.

The first was a chemical printing process to produce computer chips in batches rather than one at a time. The other, Noyce’s idea, was to place on one patch of silicon not just one transistor — the on-off switch of computers — but many, along with the wires to connect them. This was the integrated circuit, which evolved at Intel into the microprocessor. (A Texas Instruments scientist, Jack Kilby, simultaneously and independently invented the integrated circuit.)

Integrated circuits and the means to mass produce them set off the scientific and corporate race whose pace was set by Moore’s Law.

Fairchild, headquartered southeast of San Francisco, didn’t give stock options to its staff, and many scientists left to form new companies. Labeled “Fairchildren,” the companies included Advanced Micro Devices, National Semiconductor, LSI Logic and Intel.

The exodus from Fairchild transformed the surrounding countryside’s fruit orchards into Silicon Valley, a mecca for high-technology start-ups. An exhibit at the Computer History Museum in Mountain View has a “family tree” of dozens of the valley’s companies with roots in Fairchild.

“It seemed like every time we had a new product idea, we had several spinoffs,” Dr. Moore said in a 2015 interview for the Chemical Heritage Foundation. “Most of the companies around here even today can trace their lineage back to Fairchild. It was really the place that got the engineer-entrepreneur really moving.”

From left, Andrew S. Grove, Robert Noyce and Dr. Moore in 1978. (Intel Corp.)

At Intel, Dr. Moore focused on moving products quickly from drawing board to customer. He fostered an entrepreneurial mind-set and streamlined operations, practices that became essential traits of Silicon Valley.

“When we set up Intel,” Dr. Moore told PBS talk show host Charlie Rose, “very specifically we did not set up a separate laboratory. We told the development people to do their work right in the production facility. … So we eliminated a step.”

Arthur Rock, who raised the initial financing for Intel and became its first chairman, described Dr. Moore to Fortune magazine in 1997 as a brilliant scientist who “more than anyone else set his eyes on a goal and got everybody to go there.” By contrast, Noyce, Intel’s first chief executive, “had strokes of genius, but he couldn’t stick to anything,” Rock said.

Dr. Moore succeeded Noyce as chief executive in 1975. For the company, critical days lay ahead, when Dr. Moore and his own hard-driving successor, Andrew S. Grove, refocused the company on making microchips that processed information (logic chips, including microprocessors) rather than chips that stored information (memory chips). It proved to be a multibillion-dollar success story for Intel.

A friend’s chemistry set

Gordon Earle Moore was born in San Francisco on Jan. 3, 1929. He grew up in Pescadero, Calif., a farming community in San Mateo County. His father was an assistant county sheriff, and his mother helped run her family’s general store.

He was 10 when his family moved to Redwood City, not far from Menlo Park and Palo Alto. A neighborhood friend got a chemistry set for Christmas and invited young Gordon over to blow things up.

“Most people who knew me then would have described me as quiet,” he once quipped, “except for the bombs.”

Dr. Moore, the first person in his family to attend college, received a bachelor’s degree in chemistry in 1950 from the University of California at Berkeley. Four years later, he received a doctorate in chemistry from the California Institute of Technology, and he began working at Johns Hopkins University’s Applied Physics Laboratory in Laurel, Md.

In 1956, physicist William Shockley recruited Dr. Moore to Shockley Semiconductor Laboratory near Stanford University. That year, Shockley and two other scientists won the Nobel Prize in physics for work they had done at Bell Laboratories, including the invention of the transistor. A smaller, more reliable way to regulate electric currents, transistors would replace bulky, easily broken vacuum tubes in computers and other devices.

Within a year, Shockley’s overbearing management style — and a tendency to claim other people’s work as his own — prompted Dr. Moore and seven other scientists to bolt.

The “traitorous eight,” as Shockley called them, set out to be hired as a group to study and make semiconductors. They were rejected by more than two dozen companies. Finally, Sherman Fairchild, an inventor whose father was a founder of IBM, invested $1.5 million to start Fairchild Semiconductor with the rogue engineers.

Fairchild’s successes were so numerous that by the time the enterprise outgrew its first facility, Dr. Moore wrote in an essay, the tiles in the coffee room ceiling “were peppered with the imprints of all these champagne corks.”

After a management shake-up at Fairchild, Dr. Moore partnered with Noyce to found Intel. He stepped down as chief executive in 1987 and a decade later was named chairman emeritus. He relinquished that role in 2006.

Dr. Moore was a fellow of the Institute of Electrical and Electronics Engineers and a past board chairman of Caltech. His honors included the National Medal of Technology, awarded in 1990. A decade later he and his wife, the former Betty Whitaker, created a foundation with an endowment of more than $6 billion to support grants in conservation, science research and education.

In addition to his wife, whom he married in 1950, survivors include two sons, Kenneth and Steven, and four grandchildren.

Because of his stature in Silicon Valley, Dr. Moore was often called on to prognosticate about the future of science and technology. He liked to say he was not especially well suited for the role, having once dismissed the concept of the personal computer as “something of a joke.”

“The importance of the Internet surprised me,” he told the New York Times in 2015. “It looked like it was going to be just another minor communications network that solved certain problems. I didn’t realize it was going to open up a whole universe of new opportunities, and it certainly has. I wish I had predicted that.”
