Investing in the Age of AI: A Primer for Investment Professionals
The rise of artificial intelligence (AI), particularly large language models (LLMs)i, marks potentially one of the most transformative shifts of our era. It has implications not just for how we work and conduct business, but for how we thinkii and interact as humans. The industry is still in its infancy, however, and even leading professionals in the field disagree on the pace, ultimate capabilities, and eventual impactiii of the ‘AI revolution.’ That said, the question seems less ‘Will it have an impact?’ and more ‘How far-reaching will this impact be?’
This new technology has already begun reshaping industries, creating burgeoning opportunities. For investors, this first wave can usefully be divided into three distinct stages: the rise of large language models, the emergence of AI-native applications, and the search for power.
The Rise of Large Language Models
Since the launch of ChatGPT in November 2022, the capabilities of LLMs have advanced rapidly, albeit subject to the ‘AI scaling law’: performance improves with larger models, more data, and more computing power, but with diminishing returns relative to cost and efficiency. Nonetheless, capabilities continue to expand, notably in reasoning and reliabilityiv. Early models offered basic conversational responses; the newest models solve complex logical problems, summarize intricate research papers across disciplines, and generate advanced code with a high degree of accuracy.
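A stylized version of the scaling relationship helps make the ‘diminishing returns’ point concrete. The form below follows the power-law fits commonly reported in the research literature; the constants and exponents are illustrative placeholders rather than figures from any specific study.

```latex
% Stylized empirical scaling laws: test loss L falls as a power law in
% parameter count N, dataset size D, and training compute C.
% N_c, D_c, C_c and the exponents \alpha_* are fitted constants (illustrative only).
L(N) \approx \left(\frac{N_c}{N}\right)^{\alpha_N}, \qquad
L(D) \approx \left(\frac{D_c}{D}\right)^{\alpha_D}, \qquad
L(C) \approx \left(\frac{C_c}{C}\right)^{\alpha_C}
```

Because the exponents are small positive numbers, each additional order of magnitude of model size, data, or compute buys a progressively smaller reduction in loss, which is precisely the trade-off between capability gains and cost that the scaling law describes.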
Reasoning refers to AI’s ability to synthesize information, understand relationships, and draw
conclusions.
Reliability essentially means consistency: can AI deliver high-quality, relevant results every
time, or just occasionally? This may have implications beyond baseline functionality: do you
choose your favorite coffee shop because it serves a consistently just-above-average
cappuccino every morning, or because one cappuccino in five is among the best you’ve had, with the
other four middling to poor?
For investors, this stage is already yielding notable returns. Recent gains in the S&P 500 have disproportionately favored companies benefiting from the LLM revolution; companies like NVIDIA have seen meteoric rises in stock value as demand for AI chips surges. These chips are the engines powering LLMs, processing vast amounts of data at lightning speed and making it possible for AI systems to learn and perform complex tasks in everything from healthcare to autonomous driving. As AI adoption accelerates and scales across industries, demand for these chips has skyrocketed and is widely expected to keep rising.
Investing in AI chips is akin to the "picks and shovels" strategy during a gold rush: regardless of which industries emerge as winners or losers in the AI revolution, the infrastructure enabling the technology will remain indispensable. By focusing on this essential layer, investors can benefit from the broader AI wave without being tied to the performance of specific industries.
In some ways, this is reminiscent of the early days of the internet boom, when investments in infrastructure, such as broadband and server technologies, paved the way for the rise of companies like Google, Amazon, and Facebook.
The Rise of AI-Native Applications
But how are these LLMs actually being put to work today? The answer lies in the rise of AI-native applications, which fall into two broad types:
Productivity-Enhancing Applications: These tools are designed to enhance the productivity of individuals and teams. For instance, GitHub Copilot helps developers write code faster, while enterprise offerings like Microsoft Copilot streamline workflows across entire organizations, reducing costs and improving efficiency.
Scaling Operations Through AI: Historically, scaling a $100 billion company into a $1 trillion enterprise required expanding the workforce, outsourcing operations, or pursuing mergers and acquisitions, sometimes at the cost of quality and efficiency. AI-native applications introduce a new paradigm with potentially widespread consequences, both positive and negativev: scaling with AI instead of human labor. This ability to scale without human bottlenecks would fundamentally change the rules of growth.
However, despite its genuine promise (and potentially inflated hype), most Fortune 500 companies are not ready for this transition.
Knowledge and Familiarity Gap:
Many organizations lack the AI-native talent to build roadmaps for transitioning to AI-aligned operations; anything new is inherently disruptive, and change takes time at both the organizational and the personal level. (A new type of car with a beautiful exterior needs mechanics who know their way around the engine and aren’t afraid to get their hands dirty when something goes wrong under the hood; otherwise, you’re left with a useless, massive, expensive paperweight.)
Data Foundations:
Enterprise data systems are often fragmented, akin to scattered puzzle pieces: some neatly organized (structured data), others in random piles (unstructured data), and none designed to work harmoniously with LLMs, which need the pieces to fit together seamlessly. Without that integration, a model struggles to make sense of the information (an illustrative sketch of what such integration can look like follows this list).
Lagging Infrastructure:
Rapid advancements have left enterprise architecture and middleware struggling to keep pace, making it difficult to move AI applications from experimentation to meaningful production. Overcoming these hurdles requires strategic investment in talent, data modernization, and infrastructure upgrades. For investors, companies making these investments are well positioned to lead in the AI era.
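To make the ‘puzzle pieces’ point above concrete, here is a minimal, deliberately simplified sketch of how a structured record and an unstructured note might be normalized into plain-text chunks that an LLM-based retrieval pipeline could consume. All field names, records, and file names are hypothetical examples, not an actual enterprise schema.

```python
# Illustrative sketch: unifying structured and unstructured enterprise data
# into plain-text "chunks" that an LLM or retrieval pipeline could consume.
# All field names and records below are hypothetical examples.

from dataclasses import dataclass

@dataclass
class Chunk:
    source: str   # where the text came from (table name, file name, ...)
    text: str     # plain-text content an LLM can read

def chunk_from_structured(table: str, row: dict) -> Chunk:
    """Flatten one database row into a readable, sentence-like string."""
    fields = ", ".join(f"{key}: {value}" for key, value in row.items())
    return Chunk(source=table, text=f"[{table}] {fields}")

def chunk_from_unstructured(file_name: str, document: str, max_chars: int = 500) -> Chunk:
    """Trim a free-text document into a chunk of manageable size."""
    return Chunk(source=file_name, text=document[:max_chars])

# Hypothetical structured record (e.g., a CRM row) ...
crm_row = {"customer": "Acme Corp", "segment": "Enterprise", "renewal_date": "2025-06-30"}
# ... and a hypothetical unstructured note about the same account.
note = "Call notes: Acme is evaluating an AI copilot rollout for its support team."

chunks = [
    chunk_from_structured("crm.accounts", crm_row),
    chunk_from_unstructured("acme_call_notes.txt", note),
]

# In a real pipeline these chunks would be embedded and indexed for retrieval;
# here we simply print them to show the unified format.
for chunk in chunks:
    print(chunk.source, "->", chunk.text)
```

In practice this normalization step is where much of the ‘data foundations’ work lives: deciding which systems feed the pipeline, how records are flattened, and how the resulting chunks are indexed for retrieval.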
The Search for Power
As AI-native applications move into production and begin creating value, a new choke point is emerging: energy. Historically, economic growth was driven by population growth, because humans were the primary unit of work; with AI systems, energy demands are becoming the limiting factor. These models are extraordinarily energy-intensive, requiring vast computational resources, and training a single LLM can consume as much energy as hundreds of thousands of households, placing significant strain on power infrastructure.
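For a sense of where comparisons like this come from, the back-of-envelope sketch below estimates the energy of a single large training run. Every constant (accelerator count, per-device power draw, run length, data-center overhead, household consumption) is an illustrative assumption rather than a figure for any particular model, and the household comparison shifts dramatically depending on whether it is framed in days or years of household use.

```python
# Back-of-envelope estimate of the energy used to train a large language model.
# Every constant below is an illustrative assumption, not a measured figure.

NUM_ACCELERATORS = 10_000    # assumed GPU/TPU count for the training run
POWER_PER_DEVICE_KW = 0.7    # assumed average draw per device, in kilowatts
TRAINING_DAYS = 90           # assumed wall-clock duration of the run
PUE = 1.2                    # assumed data-center overhead (power usage effectiveness)

# Total electrical energy for the run, in megawatt-hours.
training_energy_mwh = (
    NUM_ACCELERATORS * POWER_PER_DEVICE_KW * TRAINING_DAYS * 24 * PUE / 1_000
)

# Assumed average household consumption of ~27 kWh per day (~10 MWh per year).
HOUSEHOLD_KWH_PER_DAY = 27
household_days = training_energy_mwh * 1_000 / HOUSEHOLD_KWH_PER_DAY

print(f"Estimated training energy: {training_energy_mwh:,.0f} MWh")
print(f"Roughly one day of electricity for {household_days:,.0f} households")
```

Under these assumptions the run totals roughly 18,000 MWh, about one day of electricity for several hundred thousand households; change the assumptions and the headline comparison moves with them.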
This energy demand is reshaping how companies approach AI deployment, with a growing focus on efficiency and sustainable power sources. Companies like Microsoftvi and Metavii are already positioning themselves near abundant power sources to meet the energy demands of their AI operations, and access to reliable, scalable energy is becoming as critical as talent and data. Power could emerge as a key driver of economic growth in the AI era, shaping the next frontier of technological and economic competition. Access to reliable, efficient, and sustainable energy will determine not only the scalability of AI systems but also the economic competitiveness of nations and industries; as AI-driven technologies demand unprecedented computational power, innovations in energy generation, storage, and distribution could redefine global leadership in the digital economy.
Creation of Renewable Energy Markets
Companies specializing in renewable energy production, storage, and infrastructure (solar, wind,
and hydro) are poised to experience significant growth, driving innovation and job creation.
However, such rapid development of renewable infrastructure could also face resistance over
unchecked land use and environmental concerns, and it could crowd out otherwise innovative
smaller businesses unable to scale quickly enough to compete as costs rise sharply.
Emergence of Carbon Credit and Pollution Mitigation Markets
The substantial energy requirements of AI systems are giving rise to secondary markets focused
on offsetting emissions, such as carbon credit trading and innovations in pollution capture and
mitigation. This trend incentivizes greener practices across industries and supports companies
offering carbon-neutral solutions. On the other hand, this may encourage "pay-to-pollute"
practices, where companies invest in offsets rather than reducing emissions, potentially delaying
meaningful environmental progress.
Development of More Energy-Efficient Technologies
The push to reduce energy consumption is spurring markets for cutting-edge energy-efficient
hardware, including AI-specific chips, energy-optimized data centers, and smart grid
technologies. These advancements can cascade into other sectors, promoting more sustainable
energy use overall. However, the high initial costs of developing and adopting these technologies
could leave behind or fully exclude smaller firms or nations with limited resources, further
widening the technological divide between developed and developing regions.
So What’s Next?
All of the above developments could redefine the economic landscape and lead to substantial returns for early movers. To fully capitalize on AI’s potential, organizations must strategically invest in these technologies and cultivate the necessary skills within their workforce. For investors, the key lies in staying disciplined and focusing on companies that not only harness the transformative potential of AI but also demonstrate sound fundamentals and leadership, a willingness to evolve, practices based on sustainability, and a capacity for responsible growth. By prioritizing sectors and firms with resilience and adaptability, investors can build portfolios that do more than chase trends: they can contribute to innovation that drives both meaningful returns and lasting impact.

i Large language models are advanced artificial intelligence systems designed to understand and generate human-like text based on vast datasets. They use deep-learning techniques, particularly transformer architectures, to analyze language patterns, enabling tasks such as translation, content creation, and complex problem-solving. Examples include OpenAI’s GPT models, Google’s Bard, and Meta’s LLaMA.
ii https://bigthink.com/thinking/the-mechanized-mind-ais-hidden-impact-on-human-thought/
iii https://onezero.medium.com/a-i-isnt-as-advanced-as-you-think-eeeaf4b085cf; https://www.noemamag.com/the-danger-of-superhuman-ai-is-not-what-you-think/
iv OpenAI’s GPT-4 exhibits “human-level performance” on professional benchmarks - Ars Technica
v https://eng.vt.edu/magazine/stories/fall-2023/ai.html
vi https://www.nytimes.com/2024/09/20/climate/three-mile-island-reopening.html
vii https://www.datacenterdynamics.com/en/news/meta-signed-geothermal-energy-deal-to-power-data-centers-in-us/