How Can Companies Turn AI Promise into Profits?

Equity investors should look beyond the hype for companies with clear strategies to profitably monetize the benefits of generative AI.

In any technology revolution, the transition from hope and hype to productivity and profits is rarely smooth. While commercial uses of generative artificial intelligence (AI) are just starting to emerge, equity investors can map out strategies to find companies that are best positioned to reap business advantages.

Ever since ChatGPT was released to the public in November 2022, generative AI has been seen as a groundbreaking technology. Companies across industries are talking up its business potential and investors are on the lookout for early leaders. Spending on AI-related software, services and infrastructure is projected to surge (Display). Yet for all the amazement and excitement about a new catalyst for innovation, the path to profitable monetization of the technology remains unclear.

Display: Generative AI Market Is Poised for Explosive Growth

How to Monetize Generative AI: Productivity vs. Pricing

Companies can make money from generative AI in several ways. Users of the technology can find ways to improve productivity with AI. Providers of the technology to those users—the “platforms”—will profit if they can achieve favorable price points. And “picks and shovels” suppliers sell the underlying hardware needed to run the technology. These paths to monetization are intricately linked.

The market has quickly identified “picks and shovels” winners—as seen in this year’s performance of NVIDIA, which makes graphics processing units (GPUs) that are essential for AI. Sorting out the winning strategies among platforms and users is a lot murkier. But we’re starting to see these companies take various approaches to turn productivity gains into profits.

Some companies have predicted that AI could unlock productivity improvements of 20%–30%. While a few high-profile cases have centered on worker redundancies, far more have focused on increasing output with the same employee base. For example, AI can potentially perform many menial, time-consuming tasks, freeing up professionals to add more value for their employers.

Whether those productivity gains are worth paying for will depend on the technology’s cost. For example, a company seeking to improve the productivity of a $100,000-a-year employee by 25% is effectively buying about $25,000 of additional output a year, so the value proposition looks entirely different if the AI technology for that worker costs $5,000 or $20,000. As a result, at this stage of the technology’s evolution, many investors are concentrating their attention on how AI vendors will price the technology.

For AI platforms, finding the right price point is partly dictated by the cost of computing infrastructure. AI-enabling technology is very expensive, as the supply of critical infrastructure such as GPUs remains extremely limited. As a result, AI vendors must balance their customers’ productivity expectations against their own cost of servicing them.

Three Broad Pricing Strategies

While the commercialization of AI is still in its infancy, we’re already seeing three key pricing strategies. Understanding the dynamics of these strategies can help investors analyze whether different types of companies are on track to profit from the technology.

Subscriptions: Companies that can integrate AI features to enhance existing products will have instant access to a potentially lucrative customer base. Microsoft is already doing this by charging $30 per user per month for a service called Copilot, which adds AI capabilities to applications within its Microsoft 365 suite. Some investors had anticipated a much lower price point. So why did Microsoft charge more than expected? Were customers willing to pay more because productivity gains are already exceeding expectations? Or was the technology proving more expensive for Microsoft than it expected? It’s too soon to say, but it may be a bit of both. Google is going down a similar path, having recently announced pricing of $30 per user per month for its Duet AI service for Google Workspace enterprise applications.

A la carte: As more companies adopt AI technologies, they’ll need more computing infrastructure to run their AI queries. We believe that many will choose to leverage the native AI platforms of cloud vendors such as Amazon.com, Google and Microsoft. Because their usage may be sporadic, and because AI infrastructure costs so much, the cloud vendors will likely charge them on an a la carte basis, in our view. OpenAI has pioneered this consumption model by charging enterprise customers according to the number of “tokens” they use, with 1,000 tokens representing roughly 750 words. Microsoft, OpenAI’s infrastructure partner (and minority investor), has said that about two percentage points of its Azure cloud growth in the third quarter will come from generative AI consumption.

As a feature: Some providers might integrate AI capabilities into their products without initially charging for the enhanced services. Instead, the strategy would aim to enhance the value of the product, with AI added as a feature. Eventually, the company might impose across-the-board price increases, justified by the value that has been added. Adobe has historically used this approach with its Creative Cloud and Acrobat products. It often makes the most sense for products sold to consumers and small businesses, who may balk at paying more for a feature they might not use. Once the new AI capabilities become part of their workflows, it can be easier for them to accept a price increase later.

The Consumer-Facing Conundrum

Investors looking for meaningful profits from consumer-facing chatbots may be disappointed. Basic query-and-response engines such as ChatGPT and Google’s Bard are already becoming commoditized. Companies that dominate consumer-facing spaces—consumer devices, internet search engines, social networks—will be pressed to show how they can creatively deploy AI to create value for consumers. Apple is reportedly developing its own chatbot, while Google is considering using AI chatbots to help deliver a more focused list of responses to search queries.

These AI products will be hard to monetize directly, in our view; it’s more likely that they will be used to serve targeted ads, just as these platforms do today. In practice, they will function as enhancements that keep consumers within those vendors’ ecosystems rather than as an incremental stream of revenue.

As the technology develops, pricing strategies will evolve too. With a roadmap of monetization strategies at hand, investors will be better equipped to separate companies that are proficient at puffery from those poised to generate AI-driven profits that can support investment returns.

Authors
Michael Walker
Portfolio Manager/Senior Research Analyst—Concentrated Growth
James T. Tierney, Jr.
Chief Investment Officer—Concentrated US Growth

The views expressed herein do not constitute research, investment advice or trade recommendations and do not necessarily represent the views of all AB portfolio-management teams.
