Texas Instruments: From Calculators to Master Capital Allocators

1 minutes reading time
Published 11 Oct 2024
Reviewed by: Oliver Hamrin

Few companies can claim to have left a mark on nearly every facet of modern technology, but Texas Instruments has done just that. What started as a small oil exploration business during the Great Depression has grown into one of the largest semiconductor companies in the world, churning out billions of chips every year. The almost 100-year-old history of TI is nothing short of remarkable, including a Nobel Prize for one of humanity's greatest inventions, a near-monopoly in calculators, and a long-term pursuit of capital allocation perfection.

Key Insights

  • Inventing the chip: In 1958, Texas Instruments engineer Jack Kilby reshaped the future of technology with his groundbreaking invention of the integrated circuit (IC).

  • History of the iconic graphing calculator: How Texas Instruments developed a near-monopoly and became a staple of mathematics education worldwide.

  • TI 2000: The strategic initiative that radically transformed TI's semiconductor business, sharpening its focus and consolidating its operations.

  • IDM business model: The reasoning behind the company's vertically integrated business model, from chip design to in-house manufacturing.

  • Strategic long-term priority: A deep dive into Texas Instruments' capital allocation framework through the lens of its acclaimed Capital Management presentation.

Detecting Oil During the Great Depression

This fascinating story dates back nearly a century, to 1930, when John Karcher and Eugene McDermott founded what would eventually become Texas Instruments (TI). While the company's headquarters remain in Dallas, Texas, everything else has changed multiple times since then.

Drawing on the founders' extensive backgrounds in geophysics, the newly established Geophysical Service Incorporated (GSI), as it was called back then, became the first company to offer reflection seismograph exploration services to the oil industry. Developed by Karcher, this innovative technology employed signal processing to detect underground oil, positioning GSI as a pioneer in advanced geophysical techniques.

Despite the economic challenges of the Great Depression, GSI managed its way through the 1930s in the volatile oil market before its business took an unexpected change of direction. As World War II broke out, companies around the world faced a new reality. Some were forced to shift production to support the war effort, while others saw their businesses crumble under the weight of the conflict. In the United States, although the economy was not as severely impacted as in other regions, the war still brought significant changes, with its ripple effects creating challenges but also opportunities in certain industries.

Equipping the U.S. Military During WWII

As the U.S. entered the war in 1941, GSI opportunistically expanded its business in the wartime economy. The expertise the company had developed in seismic technology was redirected toward producing electronic equipment for the military. Specifically, GSI repurposed its oil detection technology to create anti-submarine sonar detectors, capable of detecting submarines from aircraft. Soon thereafter, GSI secured military contracts from both the U.S. Navy and the Army Signal Corps.

The wartime experience helped the company recognize the potential outside of the oil industry. After the war, GSI steadily expanded beyond its legacy oil detection business, with a growing emphasis on consumer electronics. Patrick Haggerty, a former Navy lieutenant, joined the company in 1945 and played a crucial role in shaping its future venture into the world of consumer electronics – a market that Haggerty believed offered them even greater opportunities than the military market.

In 1951, the company restructured under the name General Instruments but soon discovered that the name was already in use by another American company in the same industry. This led to another and, to this day, final name change – Texas Instruments was born. The rebranding reflected the company's new focus beyond geophysical services. In the reorganization, GSI lived on as an oil exploration subsidiary, which it remained until 1988, when most of it was sold to Halliburton.

A Texas Instruments sign on the Dallas Expressway, 1957.

The Innovative 1950s and TI's Major Breakthrough

The 1950s was an eventful and successful decade for TI. After merging its way onto the New York Stock Exchange and embarking on an acquisition spree, the company gained the capital and momentum to explore a new avenue: semiconductor manufacturing. In 1952, TI acquired a license from Western Electric, a subsidiary of AT&T, to manufacture its newly patented germanium transistor. The investment quickly paid off, and within two years, TI was mass-producing high-frequency germanium transistors. In fact, TI developed and launched the first-ever transistor radio in 1954, the Regency Radio. It sold over 100,000 units and became a runaway success.

Thanks to its stable military business and its early success in expanding into other electronics, the company's revenue grew drastically in the 1950s, from $20 million to $92 million. This strong foundation fueled internal optimism, with high hopes that TI could establish itself as a global giant in the consumer electronics industry.

Nearing the end of a highly successful 1950s for Texas Instruments, one of the company's – well, frankly, the world's – most pivotal moments was about to unfold. In 1958, a newly hired engineer named Jack Kilby invented what would come to be called the integrated circuit, or IC for short. It was a revolutionary advancement that completely reshaped the world of electronics – the biggest semiconductor leap since the invention of the transistor a decade earlier.

What Kilby figured out was that components like transistors, resistors, and capacitors could be integrated onto a single piece of germanium, as opposed to the labor-intensive assembly of individual components used at the time. The practical application of this insight dramatically reduced the size, cost, and power consumption of electronic devices. Remarkably, Kilby did not receive his Nobel Prize for the invention until the year 2000.

Another highly influential figure who can't be left out of the story of the integrated circuit is Bob Noyce. At the time, Noyce was working at Fairchild Semiconductor, which he had founded in 1957 with a group of scientists who had all left Shockley Semiconductor (founded and run by William Shockley, one of the co-inventors of the transistor). Shortly after Kilby's germanium-based IC, Noyce developed his own silicon-based version. Silicon's abundance, superior thermal properties, and ability to form a stable oxide layer made it more suitable for high-performance applications.

Some experts argue that it was Noyce's silicon version that made the technology as ubiquitous and powerful as it is today, viewing him as the "real" inventor of the two. However, the consensus seems to be that both Kilby and Noyce contributed to the breakthrough in their own ways. Kilby acknowledged Noyce's contribution in his Nobel acceptance speech. And as if that were not enough, Noyce went on to co-found Intel a decade later – but that's another story.

Jack Kilby's first integrated circuit prototype.

In the early days of ICs, manufacturing was complex and fraught with challenges, resulting in high costs that limited their potential applications and slowed broader market adoption. Despite these obstacles, TI built the first computer to use silicon ICs for the U.S. Air Force in 1961, demonstrating the superior processing power silicon made possible. Around the same time, the Air Force also utilized TI's ICs in its missile program, and TI's technology supported the deployment of America's intercontinental ballistic missiles, helping to bridge the so-called missile gap with the Soviet Union during the Cold War.

Thanks largely to the U.S. government's heavy spending on technology, chipmakers like TI could invest more aggressively in both R&D and CapEx, which in turn led to smaller, more powerful ICs and higher manufacturing efficiency. This efficiency quickly translated into sharp price drops across the industry, allowing more and more advanced chips to make their way into consumer products by the mid-1960s. A significant milestone for TI came in 1969, when IBM (by then the largest company in the world by market cap) started integrating TI's chips into most of its computers.

The Space Race and Beyond

After proving its capabilities with anti-submarine sonar detectors during World War II and helping close the missile gap – the perceived disparity in missile capabilities that drove the U.S. government's efforts to enhance domestic missile technology and stockpiles to ensure a credible deterrent against potential Soviet aggression – during the Cold War, Texas Instruments was once again tasked by the U.S. government with supercharging the nation's efforts in the space race.

In fact, TI's involvement in space dates back to the very beginning of the space race. As early as 1958, Explorer 1, the first American satellite, was equipped with TI semiconductor components. Additionally, in 1967, the company developed the first solid-state radar unit – a small, reliable radar made possible by IC technology and usable in airborne communication systems.

Two years later, on July 20, 1969, TI's contributions to man's history in space became even more notable when several of its semiconductor components and radar units were integrated into the spacecraft for Apollo 11's successful mission to the Moon. TI's involvement in the historic moon landing helped establish its credibility in the semiconductor and electronics industry. The use of ICs in such a high-profile, critical application was yet another demonstration of the reliability of the technology.

The Explorer 1 and Apollo 11 missions marked the beginning of a long and successful partnership between TI and NASA, as well as other space agencies. Over the years, TI's chips have been key ingredients in famous space exploration achievements such as the International Space Station, the Mars rovers, and the James Webb Space Telescope, to name a few.

Developing a Math Essential

While most people around the world utilize Texas Instruments technology in some way every day without even knowing it, many are familiar with the brand thanks to its iconic graphing calculators. What has now become a near-monopoly began a few years after the invention of the integrated circuit, as TI sought to bring this innovation into the consumer market. In 1967, Jack Kilby, along with two colleagues, created a battery-powered handheld calculator known internally as the "Cal-Tech."

In the early 1970s, Texas Instruments developed the first single-chip microcontroller, the TMS1000, which proved invaluable for their new invention. This tiny microcontroller combined all the elements of computing onto a single piece of silicon and could be used to create a host of computer-controlled appliances, making it ideal for digital calculators. In 1972, TI introduced the Datamath pocket calculator. By incorporating only one IC, TI drastically reduced manufacturing costs, enabling mass production.

The success of the Datamath was instant and lifted sales of TI calculators from 3 million units in 1971 to 45 million in 1975.

Texas Instruments' iconic Datamath calculator from 1972.

TI's vision was for its calculators to become permanent fixtures in classrooms worldwide. However, there was some resistance in the early years, as calculators were seen as preventing students from learning to calculate on their own. Throughout the 1980s, TI built a relationship with the National Council of Teachers of Mathematics, which had recommended that mathematics programs take full advantage of calculators; TI hoped the close collaboration would lead to its calculators becoming the educational standard.

In less than a decade calculators went from being banned to being mandated, and in 1994, their place in the U.S. educational system was cemented when the SAT allowed the use of calculators. Since then, TI has become a cornerstone of mathematics education, effectively establishing a monopoly. Major textbooks worldwide feature illustrations of Texas Instruments calculators, emphasizing not only the quality of the product but also the strength of the relationship between TI and educational institutions.

TI's graphing calculators, capable of plotting graphs and solving functions, were first introduced in 1990 and have since been a consistent cash cow. Although TI does not report its calculator segment separately, some reports suggest that even conservative estimates place the profit margin of graphing calculators at well over 50%. Reports from a few years ago indicated that the entire calculator portfolio accounts for less than 5% of TI's total revenue.

Another educational tool from TI was the iconic Speak & Spell. Introduced in 1978, this handheld device used voice synthesis to help children learn spelling and pronunciation through interactive games. It became incredibly popular, was made available in several languages, and left a lasting impression by appearing in the 1982 film E.T. the Extra-Terrestrial.

Expansive Decades and Streamlining the Semiconductor Division

Jumping back to the 1970s: while the U.S. government remained TI's primary customer, this quickly began to change as chips found widespread adoption through IBM's mainframe computers and demand skyrocketed. Anticipating this development, TI's management had already established manufacturing plants in Europe, Latin America, and Asia, positioning the company to fully capitalize on the surge in demand.

Building on its success with ICs, Texas Instruments expanded its semiconductor business into several rapidly growing industry segments. This included establishing a presence in areas like Dynamic Random-Access Memory (DRAM), Digital Light Processing (DLP), and Digital Signal Processing (DSP). TI's diversification saw its chips powering a wide range of electronic devices worldwide – from audio equipment to projectors and telecommunications systems.

During this expansive period, TI also ventured into the booming home computer revolution – a space led at the time by IBM. For a few years, TI experienced great success, driven by rebate programs and selling around a million computers. Then, in 1983, sales declined sharply amid intense competition, ultimately leading TI to exit the market.

Texas Instruments home computer adverts from the 1980s.

Similarly, TI faced challenges in its other markets during the 1980s due to increasing competition and what critics saw as poorly thought-out expansion plans by management. The company was accused of arrogance and of neglecting shifts in market conditions. Since its diversification into consumer electronics in the 1950s, TI had maintained its ambition of becoming an electronics conglomerate with businesses spanning multiple segments. As it began losing share not only in consumer electronics but also in semiconductors, the company faced a critical decision.

To address the difficulties, the company launched a strategic initiative in 1989 called "TI 2000" to radically transform Texas Instruments and re-architect its semiconductor business. The goal was to shift TI's focus from top-line growth at any cost – which had led the company into various low-quality product segments – to long-term sustainable margins. A key element of the plan was also to foster a more innovative environment, something that had been lost during the rapid expansion into new segments. This led to several strategic partnerships with key industry players such as Hitachi, Samsung, Ericsson, Sony, and General Motors.

The initiatives started under TI 2000 eventually marked the end of an era. In 1997 – more than five decades after its first military contracts during the GSI era – the company divested its defense division. The divestment kicked off a decade-long stretch of mergers and acquisitions aimed at streamlining operations and narrowing the company's focus to fewer semiconductor segments. During this period, TI acquired numerous companies to strengthen its market share within its refocused areas, ultimately becoming the semiconductor powerhouse known today, with a distinct emphasis on analog and embedded processing.

Investing Through a Crisis

Following its transformation, TI faced the Global Financial Crisis. What began in 2007 shook the global economy to its core, leaving industries and entire economies first in uncertainty and eventually in recession – and the cyclical semiconductor industry was no exception, experiencing a steep decline in demand. The period was very challenging for TI, leaving management no choice but to lay off over 10% of its workforce, but it also taught the company lessons so important that they would reshape its operating model. To gain insight into the company's reasoning during this period, one can read the CEO letters from former CEO Richard Templeton.

In his 2008 letter, Templeton emphasized the challenges TI faced in the "impossible economy." However, he viewed it from a different perspective, insisting the turbulence would give the company "... the opportunity to move faster and become stronger." Besides predicting that 2008 would be a game-changing year in TI's history, he also made clear that the company's actions would ground TI in Analog and Embedded Processing – the two current divisions of TI, which we will dig deeper into later on.

Over the next two years – while many of its competitors and industry peers were in dire straits – TI invested against the cycle in areas that would further strengthen its market position. The same goes for TSMC, which entered the crisis from a similar position of strength and could invest its way through it to come out even stronger. Facing a revenue drop of roughly 40% in six months, TI prepared its operations for what it foresaw on the other side of the crisis. By expanding manufacturing capacity while temporarily idling parts of its operations, TI was ready to respond, eventually doubling its production output within a few months as demand started to return.

In hindsight, these years turned out to be a turning point for Texas Instruments' business operations, with the company's new business divisions significantly outpacing their respective markets in 2010. Additionally, this critical period marked a strategic, almost philosophical, shift for the business, which we will discuss more later on.

A Vertically Integrated Chip Powerhouse

After decades of venturing in and out of different business segments – ranging from military sonar equipment to personal computers – TI truly found its niche in the semiconductor industry. While many of its U.S. industry peers carved out niches in specific parts of the unbundled value chain, TI's niche centers on two types of chips it has mastered throughout the years: analog chips and embedded processors.

While Texas Instruments' analog chips and embedded processors differ in their use cases, they complement each other in many ways and are often sold to the same customers. Briefly, analog chips specialize in interfacing with the physical world by handling continuous signals that can take any value within a range, not just binary values like 0 and 1. Embedded processors, on the other hand, are designed for task-specific computation and control in digital systems. A typical use case where these chips work together is in automotive response systems. For example, an analog chip detects low tire pressure by measuring continuous signals from a pressure sensor; this information is then processed by an embedded processor, which alerts the driver through the car's dashboard system.
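To make the division of labor concrete, here is a minimal, purely illustrative sketch in C of how the embedded side of such a system might look. Everything here is hypothetical – the function names (read_pressure_adc, adc_to_kpa, dashboard_warn), the threshold, and the scaling are invented for illustration and are not TI APIs – and the analog chip's role of conditioning and digitizing the sensor signal is reduced to a single stubbed function.

```c
#include <stdint.h>
#include <stdio.h>

/* Hypothetical warning threshold: warn below ~200 kPa (~29 psi). */
#define PRESSURE_WARN_KPA 200u

/* Stand-in for the analog front end: in a real system, an analog chip
 * conditions the sensor's continuous voltage and an ADC digitizes it.
 * Here we simply return a fixed 12-bit code for illustration. */
static uint16_t read_pressure_adc(void)
{
    return 1800; /* pretend reading from the pressure sensor */
}

/* Convert the raw ADC code to kilopascals (made-up linear scaling). */
static uint32_t adc_to_kpa(uint16_t raw)
{
    return ((uint32_t)raw * 350u) / 4095u;
}

/* Stand-in for the dashboard alert the embedded processor would trigger. */
static void dashboard_warn(uint32_t kpa)
{
    printf("Low tire pressure: %u kPa\n", (unsigned)kpa);
}

int main(void)
{
    /* The embedded processor's task-specific job: poll, convert, decide. */
    uint32_t kpa = adc_to_kpa(read_pressure_adc());
    if (kpa < PRESSURE_WARN_KPA) {
        dashboard_warn(kpa);
    }
    return 0;
}
```

The point of the sketch is simply the hand-off: the analog chip turns a continuous physical signal into a number, and the embedded processor applies task-specific logic to that number.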

TI's chips are used in a vast array of industries, including automotive, aerospace, medical, industrial, telecommunications, consumer electronics, and more. To put one of these customer groups into perspective, today's vehicles contain anywhere from 200 to 2,000 semiconductors. TI offers close to 3,000 automotive products, providing solutions for systems such as braking control, engine management, and navigation, to name a few.

Two divisions might sound narrow, but the vast breadth of TI's product portfolio makes it one of the most diversified chipmakers in the world. TI offers roughly 80,000 products to over 100,000 customers, leading to a production of over 10 billion chips every year – reaching hundreds of millions, possibly even billions, of end-consumers. This almost incomprehensible reach has many claiming TI is a good indicator for the economy as a whole.

Texas Instruments operates as an Integrated Device Manufacturer (IDM). Unlike fabless companies such as NVIDIA or Qualcomm, or foundries like TSMC, TI handles both the design and manufacturing of chips in-house.

Its wide product portfolio also sets TI apart from many other companies in an industry where constant innovation is typically key – ironically enough for a company that literally invented part of the technological foundation the entire industry relies on. Most of TI's chips sit in categories less exposed to technological disruption, with more price stability and less need for updates, because their use cases do not change over time. These dynamics allow TI to keep a large inventory, as the shelf life of many of its products is very long. They also give customers fewer reasons to switch suppliers, with long-term reliability and price being the most important factors.

The life cycles of many of its products typically range from 10 to 15 years, with some extending even further. As a result, many of TI's products are sold under long-term contracts, with continuous delivery of chips to customers over multiple years. For example, with automotive customers – whose vehicle models typically have production runs lasting over five years – both parties usually commit to contracts covering the entire production cycle. The stable, long-term nature of TI's customer demand – apart from the inevitable industry down-cycles caused by fluctuations in the broader economy, of course – makes the IDM business model ideal for TI.

Having control of the entire process, from design to manufacturing, means that TI can avoid large parts of the potential external disruptions associated with high-volume semiconductor manufacturing. It can also easily grow existing customer accounts, either by quickly developing a new chip for a specific end-user-driven purpose or simply by selling more of its existing products. Its vast scale – a footprint spanning 15 manufacturing fabs around the world that would cost tens of billions of dollars, possibly even hundreds including land and ancillary infrastructure, to replicate – also means that it can offer competitively low prices while enjoying high margins.

Ensuring Stability in the Apple Ecosystem

One of Texas Instruments' most fruitful business relationships is its long-standing role as a chip supplier for Apple's popular products. TI, for example, supplies chips for the Apple Pencil Pro, Vision Pro, and iPhone 15 – whose key chip suppliers we have visualized below. With millions of these devices sold every year, this represents one of many high-volume revenue streams for TI.

The exceptionally high standards Apple sets for its suppliers, along with the scale required to meet its demand, exemplify the strength and resilience of TI. The risk associated with switching away from a supplier of tens or even hundreds of millions of components can be immense, making such transitions uncommon and reinforcing long-term partnerships – as long as the supplier delivers as promised and doesn't price gouge.

Key chip suppliers for Apple's iPhone 15.

Masters of Capital Allocation

As hinted earlier, TI's success in navigating the Great Financial Crisis became an internal blueprint for effectively managing challenging periods. It laid the foundation for the company's capital allocation framework – a defining characteristic of Texas Instruments. This disciplined pursuit aims to enhance shareholder value over the long term, encapsulated by the quote featured on the company's investor relations page:

“The best measure to judge a company's performance over time is growth of free cash flow per share, and we believe that's what drives long-term value for our owners.”

Texas Instruments' FCF per share and shares outstanding between 2002 and 2022.

For many companies, such statements might be clichés meant to impress the market, but in the case of Texas Instruments, the past decades have shown these are not empty words – the company truly lives by them. In fact, between 2002 and 2022 (before the elevated CapEx cycle it is currently in), TI grew its FCF per share by 827%. And to further hammer home the importance of taking the long-term view and allocating capital wisely, the company has held its iconic Capital Management presentation every year since 2013 – a rare kind of transparent presentation designed to give investors insight into its long-term strategy, financial performance, and capital allocation priorities.
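To put that growth figure in perspective, a simple back-of-the-envelope conversion – assuming the 827% refers to cumulative growth over the full 20-year span from 2002 to 2022 – implies an annualized growth rate of roughly 12%:

\[
\text{CAGR} = (1 + 8.27)^{1/20} - 1 \approx 0.118 \approx 11.8\% \text{ per year}
\]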

In the Capital Management presentation from August 2024, CEO Haviv Ilan started off in typical fashion by reminding listeners of TI's four sustainable competitive advantages "that in combination provide tangible benefits and are difficult to replicate". He then gave a thorough update on the company's strategic investments during its current elevated CapEx cycle, the business drivers for each division, and its FCF per share expectations going forward.

TI's long-term strategy for maximizing shareholder value has three elements, one of which is the four competitive advantages that Haviv referred to. The other two are disciplined capital allocation and efficiency – the latter of which is defined as “getting our investments in the most impactful areas to maximize the growth of long-term free cash flow per share”.

The four sustainable competitive advantages that Texas Instruments' business model is built upon.

The disciplined capital allocation part of the strategy can be divided into three buckets: capital expenditures (including both R&D investments tied to product development and the build-out and refurbishment costs of expanding its manufacturing footprint), dividends, and share repurchases – all of which the company employs depending on the business cycle. Regarding the latter two, TI has increased its dividend every year for the past two decades while simultaneously buying back shares, reducing shares outstanding by close to 50% since 2004. True to its focus on long-term shareholder value creation, the company's buyback activity is of course highly dependent on management's view of the share's valuation.
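The effect of that shrinking share count on the north star metric is easy to see with a simplified, purely illustrative calculation (not TI's actual figures): if free cash flow stays flat while the share count is cut in half, free cash flow per share doubles on buybacks alone.

\[
\frac{\text{FCF}}{\text{shares}/2} = 2 \times \frac{\text{FCF}}{\text{shares}}
\]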

While there has been no financial collapse on the scale of 2008 since, the semiconductor industry has faced several downturns owing to its cyclical nature, most recently in 2023. Demand has been, to say the least, volatile in recent years depending on where in the semiconductor value chain you operate. While the pandemic created long-lasting supply chain disruptions – driven by large-scale order cancellations and staffing constraints – across much of the market, the surge in demand for AI applications instead created a huge undersupply in other parts. TI's business has not been immune to this down-cycle, but management, true to its long-term objective, has stayed disciplined.

Just like during the Great Financial Crisis, the company has directed its resources toward areas it believes will most improve long-term performance and shareholder value. In this current downturn, TI decided to launch an elevated capital expenditure cycle lasting six years, aiming to “uniquely position TI for the next 10 to 15 years.” The cycle is expected to continue for one to two more years, primarily involving ramped-up investments in new fabrication facilities, mainly in the U.S.

A key aspect of this strategy is the development of 300mm wafer fabs, which are crucial for producing analog and embedded processing chips at scale. These wafers are larger than the traditional 200mm wafers, allowing for more chips per wafer and thereby improving efficiency and reducing costs over time. The potential $30 billion investment is supported by funding from the U.S. CHIPS Act, which aims to make the U.S. more self-reliant in semiconductor production. With Texas Instruments' successful track record and transparency in its capital allocation strategy, there is strong trust in the company's management to once again come out even stronger on the other side, with its north star metric, FCF per share, leading the way.
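The per-wafer advantage mentioned above follows from simple geometry (ignoring edge losses and any yield differences): a 300mm wafer has 2.25 times the area of a 200mm wafer, so it can hold roughly 2.25 times as many dies of a given size.

\[
\frac{\pi\,(300/2)^2}{\pi\,(200/2)^2} = \frac{150^2}{100^2} = 2.25
\]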

Closing Thoughts

The story of Texas Instruments is a story of inventions, reinventions, and resilience. The company has in some way, shape, or form been involved in many, if not all, of mankind's major technological achievements since the invention of the transistor. TI has also reinvented itself many times over, growing from a small oil exploration services company to one of the largest chipmakers in the world. Whether it's the smartphone in your hand, the car you drive, or the microwave you heat your lunch in – if it runs on electricity, there's a very high chance it's powered by a certain company from Texas.

