The transformative power of AI and LLMs
There has been a lot of excitement about AI recently, and for good reason. AI is a transformative technology for consumers and enterprises, creating opportunities that were previously unimaginable. Narrow AI, meaning systems designed to perform a single task such as voice recognition or product recommendation, has been with us for some time. It is Generative AI, however, and more specifically text-generating Large Language Models (LLMs), that has captured investors’ attention recently.
The distinguishing factor of LLMs is their deep understanding of language, which allows them to perform a wide range of language-based tasks and sets them apart from older narrow AI systems. The shift is happening now because of a convergence of trends in model architecture, data and compute, which together have driven exponential growth in capability.
From autopilot to copilot – a paradigm shift in AI
During a recent interview with Time magazine, Microsoft CEO Satya Nadella discussed the evolution of AI and how it is going to unleash a new wave of productivity and remove drudgery from our lives:
“AI itself is very much present in our lives. But if anything, it’s moving from being autopilot to being a copilot that helps us at our work. You put the human in the centre, and then create this tool around them so that it empowers them.”
Addressing the productivity crisis through AI
Australia is facing the largest fall in productivity on record, and accelerating wage growth will push prices higher absent an increase in productivity. At the same time, enterprises are looking to do more with less as they face labour shortages in the wake of the pandemic. Although tech products can often be solutions in search of problems, this one is quite the opposite, which is why there is palpable excitement among tech leaders.
Amazon CEO Andy Jassy, on the last earnings call, commented:
“But we’re not close to being done inventing in AWS. Our recent announcement on Large Language Models and generative AI and the chips and managed services associated with them is another recent example. And in my opinion, few folks appreciate how much new cloud business will happen over the next several years from the pending deluge of machine learning that’s coming.”
Gaining exposure to AI – noteworthy companies
There are several ways investors can gain exposure to AI, which we have previously written about here. As excitement about AI builds, many companies are talking up AI, and it can be difficult to separate the true beneficiaries from those merely leveraging the hype. So we have narrowed the list down to a group of companies with moats we feel comfortable assessing as sustainable over the long term. However, moats shift daily, and these investments must be monitored closely.
- ASML Holding NV: manufactures extreme ultraviolet (EUV) lithography systems used to print patterns on silicon, helping to make chips faster and more efficient.
- Taiwan Semiconductor Manufacturing Co Ltd: manufactures most of the high-end GPUs used in the leading data centres around the world. They use ASML equipment in the fabrication process.
- Nvidia Corporation: designs and sells GPUs and most of their high-end chips are manufactured by TSMC. Nvidia has also built a full-stack AI solution meaning it can also build the entire data centre for you and provide the software to run it.
- Alphabet, Amazon, Microsoft: their cloud service providers (CSPs), namely Google Cloud Platform, Amazon Web Services and Microsoft Azure, purchase Nvidia GPUs for use in their data centres.
Geopolitical factors influencing AI
Semiconductors have become a key battleground in the growing rivalry between the US and China, and Washington’s bid to curb exports of leading-edge technology to China has placed land mines around some of these companies. Export controls driven by Washington have blocked the sale of EUV lithography systems to China, catching ASML in the crosshairs. At the same time, TSMC produces most of the highest-end logic chips from its manufacturing base in Taiwan, and this carries a certain level of geopolitical risk.
In the most recent quarter, Nvidia shares saw a historic surge, increasing by 24.4% and nearly reaching a market value of $1 trillion. The record-breaking rally was fuelled by a stronger than expected sales forecast of $11 billion for the current quarter, a 64% increase from the previous year. This forecast exceeded Wall Street’s expectations, contributing to the stock’s rise.
Nvidia CEO, Jensen Huang, highlighted the growing demand for artificial intelligence applications and computational power as a key driver behind the company’s success, with Nvidia’s chips playing a crucial role in AI system creation. The excitement about AI has raised Nvidia’s share price to record levels, and the valuation is full, absent the release of any new products.
The case for investing in cloud service providers (CSPs)
We believe investors will do well to invest in cloud service providers. CSPs have been under pressure over the last 12 months as enterprise customers looked to reduce their cloud spending as they sought business efficiencies. CSPs have accommodated these requests, and as a result, their revenue growth has slowed, providing an opportunity for long-term investors. One of the primary benefits of hosting applications in the cloud is the ability to scale up and down as required, and it is not surprising enterprises have chosen to lower their costs.
However, over the next decade enterprises will move their workloads from on-premise to the cloud as they strive to become more agile and productive. Their capacity to improve efficiency, enhance decision-making processes and provide innovative solutions to complex problems based on new AI services will demand expansive and flexible cloud storage. This places the CSPs in an exceptionally strong position.
Democratising AI: a new growth avenue for CSPs
In addition to the ongoing cloud migrations, generative AI presents an exceptionally large growth opportunity for CSPs that we believe is underappreciated. Generative AI models are democratising AI. In the past, businesses needed a team of computer scientists to build their own AI solutions; however, with the advent of foundational LLMs, businesses of all sizes now have access to state-of-the-art models that can be easily customised with their own data.
These CSPs are building their platforms to allow users to select, train and deploy their fine-tuned models in the cloud. This will lead to an explosion of custom LLMs that, owing to their size and complexity, need to be stored and run in the cloud. And due to security issues around data, businesses will be inclined to use the services offered where their data is already located, putting the CSPs, again, in a favourable position.
The big three in AI: Microsoft, Google, and Amazon
Microsoft and Google have been leading the charge, with Microsoft investing in OpenAI and Google investing billions in AI R&D over the last decade. Both companies will benefit from continued cloud growth, and both have software distribution to billions of end users through which they can deploy their own generative AI solutions. Despite lacking these touchpoints, Amazon, as the largest CSP, has a material opportunity to participate in the growth of AI in the coming years.
As we embark on a significant transformation, it’s an exciting time for technology investors.