Intel: Semiconductors & Strategic Inflection Points
Few companies in the world can match Intel's rich history of pivotal moments. From Shockley Semiconductor Laboratory and Fairchild Semiconductor to the start of Silicon Valley as we know it, Intel has, through a series of difficult strategic decisions, been a centerpiece in the evolution of the American semiconductor industry. Join us as we tell the story of Intel and how the innovations of its legendary founders came to power the brains of modern computers.
Key Insights
Industry legends: Robert Noyce, Gordon Moore, and Andrew Grove built Intel and transformed the company into one of the world's most successful during the 1990s.
Bold decision: Facing intense competition in the memory business, Intel shifted its focus to its promising processor segment, a move that proved to be a defining inflection point in the company's history.
Apple proposal: Intel famously declined Apple's offer to supply chips for the first-generation iPhone, due to a significant underestimation of the product's future demand and impact.
Silicon Valley, Shockley Laboratories, and Fairchild Semiconductor
The story of Intel is deeply intertwined with the development of the American semiconductor industry and the transformation of Silicon Valley into a global innovation hub. While the formation of this entrepreneurial center on the southern San Francisco Peninsula began earlier, events accelerated after WWII as the pace of technological advancement and innovation increased rapidly.
One of these advancements was the invention of the transistor in 1947 at Bell Laboratories in Murray Hill, New Jersey, by a group of American physicists that included a man named William Shockley. After receiving the Nobel Prize in Physics in 1956, Shockley parted ways with his fellow researchers to start his own company. This new company, Shockley Semiconductor Laboratory, performed research into silicon semiconductor devices and positioned itself both strategically and geographically (in what would become Silicon Valley) to attract top talent from the nation's leading universities.
Shockley, who acted as manager of the company he had founded, proved to be difficult to work with and was often described as authoritarian and ill-tempered by his employees. The friction this caused led eight of the most prominent researchers to leave the company and start their own venture. The group, which became known as the “Traitorous Eight,” went on to form Fairchild Semiconductor Corporation in 1957. The company quickly became a pioneering force in the semiconductor industry, playing a pivotal role in shaping the technological landscape of Silicon Valley.
Robert Noyce and Gordon Moore
Among Shockley's "traitors" were two key figures central to this story: Robert Noyce and Gordon Moore, both well-known for their joint achievements in the semiconductor industry as well as their individual successes.
In 1959, Robert Noyce made a technological breakthrough by co-inventing the integrated circuit (IC). An IC interconnects multiple electronic components, such as transistors, resistors, and capacitors, on a single tiny chip. This innovation significantly reduced the size and cost of electronic devices while simultaneously enhancing performance. Noyce's invention revolutionized the electronics industry as it opened the doors to countless new possibilities.
The other key figure, Gordon Moore, made a lasting theoretical contribution with his observation known as Moore's Law, first articulated in 1965. This empirical prediction stated that the number of transistors on a microchip would double approximately every two years, while the cost of computing would fall. Originally an observed trend, Moore's Law evolved into a guiding principle for innovation across the semiconductor and tech industries, shaping long-term planning and R&D targets. The observation has influenced the rapid miniaturization of technology, making electronic devices smaller, faster, and more affordable over time.
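Moore's Law lends itself to a quick back-of-the-envelope calculation. The sketch below projects the two-year doubling cadence forward from an illustrative baseline (the Intel 4004's roughly 2,300 transistors in 1971); the cadence is an idealization and the baseline figure is rounded, so treat the output as an order-of-magnitude estimate rather than a forecast:

```python
def moores_law(base_transistors: int, base_year: int, target_year: int,
               doubling_years: float = 2.0) -> float:
    """Project a transistor count forward, assuming one doubling
    every `doubling_years` (the idealized Moore's Law cadence)."""
    doublings = (target_year - base_year) / doubling_years
    return base_transistors * 2 ** doublings

# Starting from the Intel 4004 (~2,300 transistors, 1971),
# 30 years gives 15 doublings:
projected = moores_law(2_300, 1971, 2001)
print(f"{projected:,.0f}")  # prints 75,366,400
```

The projection lands in the tens of millions for 2001, the same order of magnitude as the actual flagship CPUs of that era, which is about as much accuracy as the "law" ever promised.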
The Fairchildren and Founding Intel
Fairchild Semiconductor paved the way for the creation of dozens of corporations that have had a major impact on the American semiconductor industry. These spin-offs, either directly or indirectly tied to Fairchild, became known as the “Fairchildren”. The most notable of these are AMD, National Semiconductor (acquired by Texas Instruments in 2011), LSI Logic (acquired by Avago Technologies, now Broadcom, in 2014), and the subject of this story: Intel.
In 1968, Noyce and Moore teamed up with investor and venture capitalist Arthur Rock to form NM Electronics. A month later the newly formed company changed its name to Intel, derived from INTegrated ELectronics. Intel's third employee was a man named Andrew Grove, a chemical engineer, who would come to have a strong influence on the company throughout the coming years.
Further reading: Texas Instruments: From Calculators to Master Capital Allocators
The Semiconductor Memory Market
From the outset, Intel identified the rapidly growing semiconductor memory market as its primary focus. The industry was undergoing significant transformation, with ICs beginning to replace traditional magnetic-core memory. With the rising demand for large-scale institutional computers, such as mainframes and minicomputers, competition intensified as companies raced to develop more efficient memory chips.
In 1970, Intel launched the Intel 1103, the first commercial dynamic random-access memory (DRAM) IC, which quickly became a bestseller across various applications. The success of the 1103 propelled Intel's growth throughout the 1970s. During these early years, Intel also created the first commercially available microprocessor, the 4004, released in 1971, a diversification that would prove critical later on.
Increasing Competition and Strategic Success
Up until the 1980s, Intel had enjoyed great success in its memory business. However, market dynamics were changing quickly as several Japanese semiconductor manufacturers, such as NEC, Hitachi, and Toshiba, entered the market. The supply side was soon flooded with high-quality chips from these advanced competitors, who were ready to do whatever it took to offer the cheapest memory on the market. Memory, once a unique and complex product that was difficult to develop, was becoming a commodity, and Intel found itself at a crossroads, needing to rethink its strategy to remain relevant.
After essentially building its entire business around memory products, shifting away from this once-successful division was anything but straightforward for Intel. But thanks to its small yet promising presence in the growing microprocessor market, Intel shifted its focus. The timing of this pivot proved crucial, as the PC industry was on the verge of explosive growth, with little competition in central processing unit (CPU) development. A few years earlier, in 1981, IBM had selected Intel's 8088, an x86 microprocessor, for its first personal computer, starting what would become a near-monopolistic era for Intel in the processor market.
As IBM was now looking to scale up production, it required a second-source manufacturer to ensure supply stability. This led to a pivotal agreement between Intel and fellow Fairchild spin-off AMD: a 10-year technology exchange allowing AMD to second-source Intel's pioneering processors. With other PC manufacturers also turning to Intel for their processors, demand soared. To meet it, Intel not only collaborated with AMD but also brought in other second-source suppliers, ensuring a consistent supply of its processors on the market.
Further reading: AMD: Shaping the Future of Semiconductor Processors
In 1985, Intel unexpectedly decided to cut ties with its second-source suppliers for the newest iteration of its processors, the 386 chip, which was expected to be a big success. By taking this approach, Intel violated the terms of its agreement with AMD, leading to lawsuits and eventual damage payments. Despite the legal challenges, the outcome ultimately favored Intel's long-term business, allowing it to maintain control over the 386 chip design. The decision proved to be a major success as Intel scaled production and became the sole supplier of processors for the entire PC industry for several years, while AMD struggled to replicate the breakthrough chip. Although AMD eventually developed a competing product six years later, by that time Intel had launched another key initiative, further solidifying its market dominance.
Intel Inside
In 1991, as the second wave of the PC revolution gained momentum, Intel's CEO at the time, Andrew Grove, devised a brilliant marketing strategy. With multiple PC manufacturers vying for the best processors, Intel launched its now-iconic “Intel Inside” campaign. The company offered to cover half the marketing costs for PC makers, provided they prominently featured the "Intel Inside" logo. This campaign significantly raised consumer awareness of Intel's processors, making it a must-have for PC manufacturers to stay competitive in the booming market.
During these successful years, Intel's revenue increased from $1.9 billion in 1987 to over $26 billion by 1998, underscoring the massive success of Intel's shift to microprocessors a decade earlier. Andrew Grove, who led the company through these transformative years, was named Time magazine's Man of the Year in 1997, a testament to the impact he had on Intel and the American semiconductor industry.
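That growth can be sanity-checked with a quick compound-annual-growth-rate calculation. The sketch below uses the rounded revenue figures quoted above ($1.9B in 1987, $26B in 1998), so the result is approximate:

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two values over `years`."""
    return (end / start) ** (1 / years) - 1

# Intel revenue, rounded: $1.9B (1987) to $26B (1998), 11 years.
rate = cagr(1.9, 26.0, 1998 - 1987)
print(f"{rate:.1%}")  # prints 26.9%, i.e. roughly 27% per year
```

Sustaining a compounding rate near 27% for over a decade is what turned a mid-sized chipmaker into one of the world's most valuable companies of the 1990s.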
“The person who is the star of a previous era is often the last one to adapt to change, the last one to yield to the logic of a strategic inflection point, and tends to fall harder than most.”
From Andrew Grove's Only the Paranoid Survive: How to Exploit the Crisis Points that Challenge Every Company and Career. This quote perfectly captures the trap Intel managed to avoid at this pivotal moment, but it also foreshadows the one the company would later fall into.
Strategic Failure
Although Intel had incredible success in the PC market after shifting away from the memory business, not all of its strategic decisions turned out as well. With Intel's reliable processors powering almost every computer, Apple's Mac being no exception, a business relationship was already established when Steve Jobs approached Intel with an offer to supply chips for the first iPhone. Intel's CEO at the time, Paul Otellini, declined the offer, believing the mobile phone processor market was too small to justify the significant R&D investment required. Designing chips is an expensive process, and Intel assumed that sales volumes wouldn't be sufficient to cover the costs.
In hindsight, this might be one of the greatest missed opportunities in business history, as Intel underestimated the iPhone's demand by nearly 100x. Furthermore, it not only proved to be a missed opportunity in smartphone processors but also spurred the start of Apple Silicon, which in due time would lead to Intel losing Apple as a customer for computer CPUs as well.
Further reading: Apple: From Garage Upstart to Global Giant
Intel Today
The decision to pass on manufacturing chips for the iPhone can in retrospect be seen as a strategic inflection point, to quote Grove. Following the rejection of the offer, Intel faced several challenging years marked by failed diversification attempts and declining processor sales, driven by weakening demand and intensifying competition from AMD. What was once viewed as an Intel monopoly has increasingly shifted toward a duopoly between the two Fairchildren.
Further reading: Monopolies and Duopolies: Competition is for Losers?
IDM Model
In the semiconductor industry, Intel operates as an Integrated Device Manufacturer (IDM), managing the entire chip-making process from design to fabrication. This model lets Intel control its production roadmap and drive innovations in manufacturing technology, such as EUV lithography and advanced packaging.
Today, Intel's primary focus remains on its x86 CPUs, where its Core and Xeon processor lines are central to its business. Intel's Core processors are widely used in consumer laptops, desktops, and high-end workstations, with the company maintaining strong relationships with major PC manufacturers such as Dell, HP, and Lenovo. The company's Xeon processors power enterprise servers and the data centers of major cloud providers such as Amazon Web Services, Microsoft Azure, and Google Cloud.
Intel's value chain strategy includes Intel Foundry Services (IFS), which offers its manufacturing capabilities to external companies. Through IFS, Intel provides advanced semiconductor fabrication for clients that design their own chips but lack manufacturing capacity, competing directly with advanced foundries like TSMC and Samsung.
Additionally, Intel is advancing in AI and high-performance computing through its acquisition of Habana Labs and the development of specialized chips like its Gaudi processors, designed for AI workloads.
In Conclusion
The story of Intel is deeply rooted in the American semiconductor industry, tracing its origins to the pioneering company Fairchild Semiconductor. While many are familiar with the pivotal moment when Intel turned down Apple, fewer know of the strategic decisions that allowed it to pursue and dominate a rapidly growing market for decades. With industry legends like Robert Noyce, Gordon Moore, and Andrew Grove at the helm, the company transformed into a near-monopolistic force in computing whose benefits it reaps to this day.
Further reading: ASML: Architecting Earth's Most Complex Machines