Antipodes: Opportunities in AI are expanding beyond Nvidia

Markets have chosen Nvidia as the ultimate AI winner, but Alison Savas says other companies are stealing its thunder

By Alison Savas, investment director at Antipodes Partners

Nvidia continues to defy gravity. Despite being one of 2023’s best-performing global equities, up almost 240%, its share price has climbed a further 130% this year.

With a market cap of $2.8 trillion, Nvidia is now the third-largest company in the MSCI ACWI behind Microsoft and Apple, and its runway for growth is strong thanks to its near-monopoly over AI chips and its pricing power.

The market has blessed the stock as the ultimate AI winner, but we know that with any non-linear change, the landscape will shift over time.

We are currently in an arms race to build more capacity and train increasingly sophisticated models. But the question is, how sustainable is the current level of spending?

Current estimates suggest spending will continue to grow at around 70% a year, from $45bn in 2023 to more than $400bn by 2027. To support this growing base of AI hardware, the surrounding infrastructure also needs to be upgraded – we estimate an additional $350bn will need to be invested alongside that $400bn, taking total data centre investment to $750bn by 2027.
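For readers who want to check the compounding, a minimal Python sketch is below. The $45bn base, the roughly 70% growth rate and the $350bn infrastructure figure are the estimates quoted above; the exact year-by-year path is an assumption for illustration.

```python
# Illustrative sanity check on the AI spending estimates quoted above.
base_2023_bn = 45          # estimated AI hardware spend in 2023, $bn
growth_rate = 0.70         # ~70% a year (quoted estimate)
years = 4                  # 2023 -> 2027
infrastructure_bn = 350    # additional surrounding infrastructure by 2027, $bn

hardware_2027_bn = base_2023_bn * (1 + growth_rate) ** years
total_2027_bn = hardware_2027_bn + infrastructure_bn

# At exactly 70% a year this lands near $376bn; a rate closer to 73% reaches
# the quoted "more than $400bn", which with infrastructure gives ~$750bn.
print(f"AI hardware spend by 2027: ~${hardware_2027_bn:.0f}bn")
print(f"Total data centre investment by 2027: ~${total_2027_bn:.0f}bn")
```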

These numbers are staggering, and to justify this level of spend, companies will need to find ways to monetise their models. As they look to scale their AI models, the spotlight is shifting squarely to reducing the total cost of compute.

Competition is building from Nvidia’s traditional semiconductor rivals as well as from the cloud giants, which are developing in-house accelerator chips (an alternative to GPUs) to support both training and inference workloads (the workloads generated when a model is deployed and used).

Researchers are also working on ways to increase the algorithmic efficiency of the AI models to get better use out of existing chips; startups are contemplating using alternative GPUs to run AI models once they are deployed into the real world; and smaller models are being deployed to run locally on devices without the need to use GPUs in data centres.

All these methods aim to reduce the cost of compute, so the phenomenal growth that Nvidia has experienced is not guaranteed into the future.

The capability of these large language models is transformational, and there will be more than one winner from this cycle of innovation despite the way the market is behaving today.

AI has the potential to transform traditional parts of the economy, where it can drive revenue or significantly reduce costs, as well as change the way consumers interact with the digital and physical world.

Investors should be looking for pragmatic value exposure to AI – stocks that can benefit from the AI investment cycle but are mispriced relative to their business resilience and growth profile. Two such ideas are Taiwan Semiconductor Manufacturing (TSMC) and Qualcomm (QCOM).

TSMC is the picks-and-shovels play of AI given its critical role in the supply chain. It is the largest and most sophisticated foundry in the world, with a near-monopoly over the manufacture of the most advanced semiconductor chips. The GPU or accelerator chips currently deployed in data centres are more than likely manufactured by TSMC.

Its competitive strength is evidenced by Intel’s challenges in scaling its foundry business, and by Samsung Electronics’ inability to mass-produce leading-edge chips at the same volume, quality and cost as TSMC.

Explosive demand for AI chips puts TSMC in pole position to convert this investment cycle into growth and profitability. We see the company growing earnings by 15% to 20% a year, yet it is priced at only 14 times earnings.
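As purely illustrative arithmetic on what that combination of growth and multiple implies, a short sketch follows. The 15% to 20% growth range and the 14 times multiple come from the paragraph above; the five-year horizon is an assumption, not a forecast.

```python
# Illustrative only: what 15-20% annual earnings growth implies for a stock
# bought at 14x current earnings, over an assumed five-year horizon.
pe_today = 14
horizon_years = 5  # assumed holding period, not from the article

for growth in (0.15, 0.20):
    earnings_uplift = (1 + growth) ** horizon_years
    implied_pe = pe_today / earnings_uplift
    print(f"{growth:.0%} growth: earnings ~{earnings_uplift:.1f}x higher, "
          f"today's price equals ~{implied_pe:.1f}x year-{horizon_years} earnings")
```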

Geopolitical risks do exist, but given TSMC’s critical role, both superpowers are still very dependent on the company.

Beyond the first-order beneficiaries, investors should also be thinking about edge applications. Qualcomm is a global leader in low-power compute and connectivity chips, and that expertise allows it to design AI chips for devices like phones and laptops.

Some of Microsoft’s new Surface tablets and laptops, for example, will be able to run certain AI tasks locally on the device, powered by chips from Qualcomm.

Running AI models locally results in lower cost (no data centre required), better security (sensitive information is not being sent to the cloud) and a better user experience from lower latency (avoids internet lag).

Unlike a new app, these AI features cannot simply be downloaded – users need to upgrade their hardware to access them. By bringing AI from the cloud to the device, Qualcomm benefits from this refresh cycle as well as from delivering more semiconductor content per device.

Nvidia is today’s undisputed AI leader, but the landscape will shift. There are obvious parallels to the dotcom bubble.

The fibre-optic communications boom of the late 1990s produced game-changing technology that ultimately enabled much of what we take for granted today. But capacity was overbuilt in the short term, which led to a period of digestion as investment receded and stock valuations came back down to earth.
