Book Review: The Intel Trinity, from David Lilienfeld

The Intel Trinity
by Michael S. Malone

First, this is an outstandingly written book, the post-war industrial biography of the Santa Clara Valley in California. It reads like a novel: Isaac Asimov meets Tom Clancy in the ease of reading. And the story presented is a compelling one. In short, it was an enjoyable read. Let’s dig a little deeper.

One of my majors in college was electrical engineering/computer science. It’s a bygone era. No one remembers much now about Unix Version 6, the first version that allowed the computer, typically a PDP-11, to perform such that one didn’t think there was really a washing machine trapped inside the cabinet of blinking lights. I doubt that many recall when MSI stood for medium-scale integration or LSI for large-scale integration, indicative of the density of transistors on the chip. Ask an engineering student about an 8080 and you’re as likely to be told that that’s a low starting monthly salary for her to receive upon hiring just after graduation. That the 8080 (and arguably the 8008) is the origin of the modern PC is probably something about which she has no idea.

During my senior year, before it became apparent that I might actually be admitted to medical school (though I talked myself out of the admissions when the tuition hikes were communicated to me), I applied for a position at Intel. It was a small, struggling enterprise at that time. I first heard about Intel in the early 1970s. Fortune had carried a story about the start-up and its audacious plans to take on IBM. Think DeLorean taking on GM. Except there’s no Back to the Future to immortalize a brand that failed so thoroughly in the marketplace. At the time, computers filled rooms, needed constant room temperatures, and used a thing called “core memory” as their principal “working memory” while running programs, generally in Fortran II, Algol (for the intelligentsia), or COBOL (which had all of the finesse of a Yugo, or a Lada if you prefer, going over the rough-road section of the Ford test track in Ypsilanti driven by some test driver from Car and Driver doing 50-60 mph; a Lincoln Mercury it was not).

Core memory was about as stable as the temperature of the room in which it was located. From an engineering perspective, it was simple. From a manufacturing perspective, it was a bear, taking lots of effort (read: man-hours) to produce. Many a home in Armonk, NY and the other environs of Westchester County just north of NYC was paid for by IBM’s voracious need for core memory as the company seemingly took over the back-office world of repetitive tasks like tracking accounts. Further, core was limited in its speed, so there was an upper limit to how fast a computer could run. There were integrated circuit chips available for replicating some of what core memory did, but they were expensive to manufacture, and their degree of integration was insufficient to displace core memory.

Enter Gene Amdahl. Amdahl had been the architect of the IBM 360, the Model T of computer systems (it was also the Chevrolet; IBM had learned from the Ford-GM competition forty years before that not all customers wanted black and that some would pay for the privilege of red). To say the 360 was revolutionary doesn’t really describe its impact. It created not merely an industry where none had effectively existed before; it changed lifestyles. Absent the 360, there was probably a real limit to how large a corporation could grow before management would be relying on inadequate information systems and on the limited human wherewithal available to process what information the organization could gather. But IBM wasn’t moving fast enough as an innovator, and so Amdahl left IBM in 1970 to found the Amdahl Corporation. The idea was to craft a machine capable of outperforming an IBM computer while running the same programs the IBM machine would run. Tall order, but he did it.

The idea behind Intel was to compete electronically with magnetic core memory. But I’m getting ahead of myself. The origins of Intel can be traced back to the awarding of the Nobel Prize to William Shockley as co-inventor of the transistor while at Bell Labs. (Whether Shockley actually did invent the transistor or my third cousin once removed did [see: History of the Transistor] is a different matter altogether.) With his mother ailing in Mountain View, California (just south of Palo Alto), Shockley left Bell Labs and moved to the Santa Clara Valley to be near her. He wanted to pursue his work on transistors, but there weren’t many companies in the SF Bay area dealing with transistors. Because Bell Labs was focused on germanium transistors, which are more difficult to manufacture and carry a higher cost of goods than their “brother” semiconductor, silicon, Shockley saw an opportunity to build a silicon transistor behemoth. He convinced fellow Caltech alum and Caltech professor Arnold Beckman, the CEO of Beckman Instruments (founded during the Great Depression to commercialize Beckman’s pH meter), to bankroll a new company focused on this “new” (see above) area of electronics, semiconductors—in particular, transistors. Beckman agreed to do so, and Shockley Semiconductor began life as a division of Beckman Instruments with an office at 391 San Antonio Road in Mountain View, California (at the corner of San Antonio and California, around the point at which one crosses the Caltrain bridge). At least there is a sign there noting that the founding company of the industry had its headquarters, actually its only building, there. The same could be said of the first integrated circuit manufacturing plant (Fairchild’s, of course) on the southwest corner of Fabian Way and Charleston Avenue in Palo Alto, now occupied by a gasoline station; at last check, I didn’t see a sign there.

Shockley was a shrewd assessor of scientific talent, particularly in the field of solid state physics (of which semiconductors were a major component at the time), and he recruited a group of eight young engineers and physicists to staff his company. The notion of working directly with a Nobel laureate was a major appeal of joining this start-up, and the calling of California with its warm winters and seemingly eternal sunshine didn’t hurt, either. This group of eight would conduct the seminal work on how to manufacture transistors profitably. They just wouldn’t do it for Shockley Semiconductor.

By most accounts, Shockley was a brilliant scientist and an equally extraordinarily poor manager. Sometimes seeming paranoid, other times narcissistic, Shockley had difficulty making the timely decisions critical to the success of the enterprise. Fed up with the situation, the eight men discussed their options and concluded that staying at Shockley would be a waste of their time. But what to do? They knew the promise of transistors, and the reality was that profitable manufacturing of this electronic part was challenging some of the major corporations in the US at the time—like AT&T with its conviction that germanium would be the future of electronics. But who would fund such an effort? Eugene Kleiner’s (yes, that Kleiner) father worked in New York City at Hayden Stone, a stock brokerage/quasi-investment bank of the era. Kleiner’s father had a colleague at the firm, Arthur Rock, and Rock knew of someone who might just be interested in funding the new venture: Sherman Fairchild.

Sherman Fairchild was the son of a Congressman who had co-founded IBM. His father, in fact, was the company’s first chairman, and Fairchild himself was IBM’s largest shareholder. Fairchild had developed an interest in photography as a child and, with the help of his father’s connections in Washington, DC, eventually obtained military contracts for photographic equipment (during WW2, 80-90 percent of Allied aerial photography was conducted with Fairchild cameras). Fairchild Camera and Instrument thrived, and Fairchild himself had sufficient funds to take some chances funding American start-ups in industries no one had ever heard of. When approached by Arthur Rock about the eight Shockley refugees (known to all but Shockley as the “Fairchild Eight”—Shockley called them the “Traitorous Eight”), he agreed to underwrite the effort. Thus was Fairchild Semiconductor, a branch of Fairchild Camera and Instrument, born, the result of a $1.5 million investment. Key to Fairchild’s future would be the production techniques the eight developed around the time of Fairchild’s establishment. Were those techniques developed while they were at Shockley or at Fairchild? Because the eight had invested their own money (between $3K and $4K) into the development of the process, they were able, more or less, to leave Shockley without encumbrances.

Who was to lead Fairchild Semiconductor? There was a clear leader among the eight: Bob Noyce. Noyce made the decision that Fairchild would stay focused on silicon, and he oversaw what was in essence the commercialization of this key component of the electronic future. Basically, Noyce ran the front office. He exuded confidence and charisma, and could paint colored pictures across a black chalkboard with only white chalk available. Moreover, he had an uncanny instinct for making the decision that ultimately proved to be the right one—as often as not made impetuously (some would call him rash). He was called on to make a trek to NYC every month to present to the powers that be at Fairchild, something that none of the other members of the band of eight was up for doing.

The company was profitable within its first year of existence. Its first sale was to IBM: 100 transistors at $150 apiece. In a move suggestive of how improvisational the company was prepared to be, those transistors were shipped in a Brillo box. At the same time, Sherman Fairchild’s influence was felt at the company soon after its founding, and the impact would be felt throughout the Santa Clara Valley for decades to come. The story goes that Fairchild made a visit to the semiconductor company’s headquarters on a hot day. He arrived in a chauffeur-driven limousine. When the limo pulled up to the headquarters, the chauffeur opened the car door for Fairchild and, after closing it, stood in full regalia by the car waiting for Fairchild to reappear. The company founders were exasperated as each went to a window and looked out at this sight. Someone was hired to just stand in the sun? This was what corporate America, East Coast variant, was all about? They would have none of this formality. And since then, the Santa Clara Valley has, indeed, had none of it. Informality would reign. (Some of this is better detailed in Leslie Berlin’s The Man Behind the Microchip: Robert Noyce and the Invention of Silicon Valley—which is itself an outstanding scholarly effort.)

Fairchild established a reputation as a technological innovator beyond the Shockley transistor design with the invention of the planar transistor—made on a flat, 2-D surface rather than the 3-D structure that the Shockley-era transistor used. That invention opened the door for what would become the linchpin of the Santa Clara Valley: the creation of the integrated circuit. In 1959, Noyce’s insight led him to invent (or co-invent) the integrated circuit—aka “the chip.” (I say co-invent because Jack Kilby at Texas Instruments had come up with the same basic idea roughly six months earlier, the difference being that Noyce’s chip rested on a silicon base and Kilby’s on germanium. Kilby would be recognized by the Nobel committee four decades later for this invention; by that time, Noyce had been dead for a decade. Nobels are not awarded posthumously, though there is at least one instance in which the laureate died between the time the decision to make the award was made and the time the award was actually given.)

Fairchild was already making lots of profit by the time the integrated circuit came along. But the IC made the profits from transistor manufacturing seem like a minor revenue line. ICs were where the money was to be found, and in dominating the IC marketplace, Fairchild was practically minting it. Along the way, Noyce developed a reputation for showmanship in Silicon Valley that, a quarter century after his death, is still unmatched. At one electronics industry trade association meeting, he proclaimed that henceforth Fairchild would sell its chips for $1 each. His competitors thought he had gone mad. They couldn’t operate with chips priced that low. But Noyce later explained that the trajectory for prices was downward anyway and that chips would reach $1 within a year regardless. This way, Fairchild got some great publicity, it made life tougher for its competitors, and Noyce’s own reputation as an industry leader soared. The latter would be a significant force for the remainder of the story. And no one knew it at the time, but Noyce had just tripped over another defining element of the industry growing within the Santa Clara Valley: Moore’s law—the observation that the cost of a transistor in an integrated circuit would be halved roughly every 18 months (equivalently, that the number of transistors that could economically be put on a chip would double in about that time).
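The economics behind Noyce’s $1 gambit can be sketched with a few lines of arithmetic. The snippet below is my own illustration, not the book’s (the function name and starting price are assumptions), showing how quickly a halving every 18 months erodes a price:

```python
def price_after(months, start_price=1.0, halving_months=18):
    """Price after `months`, assuming cost halves every `halving_months` months."""
    return start_price * 0.5 ** (months / halving_months)

# Starting at $1.00 per transistor, three years (two halvings) later
# the price has fallen to $0.25.
print(round(price_after(36), 2))  # 0.25
```

On that curve, a competitor who cannot survive at $1 today will be unviable within a couple of product cycles anyway; Noyce simply announced the endpoint early.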

The law’s creator, Gordon Moore, didn’t think much of the idea when he wrote about it in a trade-magazine article in the mid-1960s. While Noyce was the public face of Fairchild, Moore ran the R&D department. As extroverted as Noyce was, Moore was an introvert. That’s not to say there wasn’t brilliance behind the silence. And Noyce and Moore understood that each needed the other as part of their individual successes: Noyce’s yin to Moore’s yang. Two members of the trinity.

But if Moore ran R&D and Noyce the front office, who oversaw the gritty work of actually producing the chips? That task fell on the shoulders of a young Hungarian chemist named Andy Grove. Grove became a consummate engineer during his time at Fairchild. He came to know Gordon Moore and developed a relationship with Moore as a mentor. That was in contrast to Grove’s view of Noyce. To Grove, Noyce was everything Moore was not: cavalier, pompous, disheveled, and so on. From Grove’s perspective, R&D (Moore) and Manufacturing (Grove) were the only parts of the organization that were needed. Therefore, whatever Noyce did could be dispensed with without any deleterious effect on the company. There was one aspect of Noyce’s behavior that Grove found particularly troubling. For all of his impetuousness, Noyce was reluctant to make any decision that might diminish Fairchild employees’ affection for him; stated simply, Noyce wanted to be loved by everyone. Perhaps in a small company that is achievable (arguably), but at Fairchild in the mid-1960s, it was impossible.

Noyce’s patent on the integrated circuit gave Fairchild its dominance in the industry, but it was his showmanship that gave him (and by extension Fairchild) credibility in international markets. In Japan, for instance, he was venerated as practically a god of electronics. But for all of Noyce’s talents and Fairchild’s accomplishments, Fairchild was becoming less and less enticing as a place to invest or to work. Many of those who would come to define the Santa Clara Valley business world during the next half century, like Jerry Sanders, would leave Fairchild and set up their own shops throughout the Valley. Those companies would come to be known as the Fairchildren. And all across the Valley were strewn integrated circuit assembly plants. Small wonder, then, that Don Hoefler, a writer for Electronic News, used the term “Silicon Valley” (suggested to him by his friend Ralph Vaerst) in a series entitled “Silicon Valley, U.S.A.,” which began on January 11, 1971. (There is a certain irony in the fact that Electronic News was published by Fairchild Publications, but to my knowledge there isn’t a relationship between the two Fairchilds. Interestingly enough, Hoefler had also worked as a publicist at Fairchild Semiconductor during the 1960s.)

Watching these developments from the catbird seat at Fairchild, Noyce grew weary of being CEO of a company that had grown beyond his grasp and in which any decision he made was sure to offend someone. For all of its external appearances of economic health, Fairchild was getting quite sickly internally, with expenses starting to run out of control. Matters weren’t helped by the semiconductor division’s need for capital to build plants to keep up with demand and its parent’s desire to use it as a cash cow. Add in the frequent cross-country travel, and it’s hardly surprising that Noyce twice sought out Moore with the thought of leaving Fairchild to start a company focused on a particular aspect of electronics, not just the components of electronic assemblies. The second time, Moore agreed. They decided that the new company would be called Intel, short for “integrated electronics.” As for the specific area of electronics Intel would focus on, that would be electronic computer memory, i.e., competing with magnetic core memory. Hearing of Moore’s departure, Grove practically demanded that Moore bring him along to the new company; Moore did so.

That, then, is the Intel Trinity: Noyce, Moore, and Grove. And it would be the three of them running the company until it hit a bump in the road and had to lay off staff. Doing so was something Noyce could not bear, and he subsequently stepped down as CEO and went off to focus on personal interests. What those personal interests were, exactly, for this “Mayor of Silicon Valley” wasn’t clear until his death. In cleaning out his closet, shoeboxes (some suggest it was only one, others more than one) were found full of stock certificates and funding agreements—usually as seed capital—with a range of what were at the time small start-ups that have since grown to define the digital world: companies like Apple and Oracle, among others. By one estimate, had Noyce retained these investments into the present, his net worth would be in the 11 figures—and that assumes he made no other investments along the way. His estate was valued not nearly as highly as some had expected, in part because he invested so much of his personal funds in these start-ups. He was also beneficent, giving enough Intel stock to his alma mater, Grinnell College, that at one time it was the largest shareholder in the company.

During the 1970s, Intel focused—not so successfully—on electronic computer memory (RAM). As Moore’s law predicted, the price of transistors (read: memory) declined by half every 18 months. Further, the Japanese had entered the market and were pummeling the American semiconductor industry. By the late 1970s, it wasn’t clear that the industry—the continuation of which is seen in DC as a matter of national security—would indeed continue. (When I was my college’s IEEE Student Chapter President in 1977-8, I invited a fellow from the Dept of Commerce, George Lindamood, whose expertise was the US semiconductor industry, to speak. Following his presentation on the history of the industry, I asked what he thought might happen in view of the Japanese companies pricing their chips so aggressively even while the US companies had opened their doors to the Japanese. He commented that the companies had refused to listen to the US government about not opening their doors to the Japanese, and that if the US companies couldn’t figure out a way to survive, the US government was going to be challenged in how to deal with such a strategic loss.)

While all of this was unfolding, Intel was approached by a Japanese company (remember, Noyce was still revered in Japan—probably to Grove’s annoyance) to create a set of chips for a calculator. I’ll cut to the chase and note that while the project itself didn’t produce much of note, one of the spin-offs within Intel did. It was the creation of the 4004, arguably the first microprocessor chip. The next step was the 8008. It was rudimentary, but it was a true 8-bit microprocessor. These developments, though, were not the focus of the company’s efforts, so they were conducted about as far out of the light of day as one might imagine. The 8008 didn’t have the efficiency needed to open a market that didn’t exist at the time. Hence the development of the 8080, a far more capable 8-bit microprocessor. For those of us who built the Altair 8800/IMSAI 8080 and who cut our programming teeth running a BASIC version of STAR TREK painfully entered onto paper tape with an ASR 33, the 8080 may not have had much memory, but it could actually execute a program. No surprise, therefore, that when Radio Shack introduced the TRS-80 in the late 1970s (based around a successor chip to the 8080, Zilog’s Z-80), one of the first programs introduced for it was a spreadsheet. In short order, it was noted in Armonk, NY, the citadel of IBMland, that IBM engineers were buying these machines and taking them home. The rest of the story behind the PC I’ll leave for another time, and I’m sure there are some alternate versions. Hence, the 8080 was the portal into our modern digital age. But getting there for Intel wasn’t easy. For one thing, the company tried to create what it thought was a sure thing: a watch using Intel chips. As a fashion accessory, it looked like something that might have been designed by Ed Wood. And what irked Intel engineers about the lack of commercial success was that they couldn’t understand why everyone wasn’t excited about keeping time with an accuracy of 0.000001%.
Didn’t everyone want that? Even if the watch looked hideous? Wasn’t it worth having that watch? Well, actually, no. The market wasn’t impressed. Intel would later go into commercial branding with the “Intel Inside” campaign, but it hasn’t tried to launch a consumer product with much in the way of financial commitment since then. Lesson learned, I guess. That was the point at which I declined Intel’s job offer.
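For perspective, that accuracy figure works out to well under a second of drift per year. The back-of-the-envelope arithmetic below is mine, not the book’s:

```python
# What does a timing accuracy of 0.000001% mean in practice?
SECONDS_PER_YEAR = 365 * 24 * 60 * 60  # 31,536,000

accuracy_fraction = 0.000001 / 100     # 0.000001% as a fraction (1e-8)
drift_per_year = SECONDS_PER_YEAR * accuracy_fraction
print(round(drift_per_year, 3))        # about 0.315 seconds per year
```

Impressive engineering, but as the market made clear, not something buyers of an unattractive watch were willing to pay a premium for.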

By the mid 1980s, Intel was barely holding on. In a fit of frustration, Grove went into Moore’s office (Noyce is out of the scene by now and Moore is the CEO) and said something like, “If we weren’t in memory chips right now, I wouldn’t have put us into them.” Moore heard the statement and fired back at Grove something to the effect of “Let’s pretend that we’re not in them and move from there.” Intel made itself over at that point, pouring all of its resources into microprocessors. Wintel was born.

I’ve probably gone on way too long already, but I wanted to give a sense of the flavor of this outstanding book, one which reads more like a novel than a corporate biography. It’s simply a wonderful read. Malone is to be congratulated on it.
