Nvidia on Wednesday delivered quarterly results highlighted by record-breaking revenue and better-than-expected profits, something that has become the norm for the company.
And while top-notch performance across Data Center, Gaming and Professional Visualization serves as the backdrop to the more-than-doubling of the company's share price this year, perhaps the most interesting story is one that is out of this universe: Nvidia's growing influence and opportunity to be the technology behind the metaverse.
The awakening to a blended future where our physical and digital worlds collide has been teased over decades of mostly hyped-up virtual reality: bulky headsets, user-unfriendly software and, frankly, a jarring experience.
The recent name change by Facebook to Meta, and CEO Mark Zuckerberg's commitment of at least $10 billion to building the metaverse, has brought the topic back to the mainstream. But the vision Meta presented, while provocative in its demonstration, was just a cool graphics rendering of an idea.

Nvidia, by contrast, has been at the center of building our blended future by engaging a massive developer ecosystem and addressing practical use cases: collaboration, digital twins and a platform for developing lifelike avatars that can exist in a digital world.
So while Facebook, or Meta, or whatever we call the company may have reinvigorated our interest in a blended digital and physical universe, Nvidia has an actual product, strategy and trajectory in Omniverse, one with the opportunity to catalyze growth both as its own category and across Nvidia's broader portfolio.
And for a company growing as quickly as Nvidia, with so much momentum, that should be exciting to shareholders.
Evolution of Omniverse
The earnings call late Wednesday provided a clear indicator of the company's commitment to the metaverse. CFO Colette Kress spent a significant amount of time highlighting the recent GTC launches for the company's Omniverse platform.
The first of two significant announcements was Omniverse Replicator, a service that lets developers seamlessly intertwine real-world and fully synthetic data to simulate physically accurate 3D worlds. That capability will be important for developing autonomous vehicles and robotics, and for rendering the most compelling lifelike graphics on the silver screen.
The other announcement that grabbed headlines was Omniverse Avatar, the company's platform for generating interactive AI avatars. The platform ties together a number of core Nvidia SDKs spanning AI, speech, computer vision, natural-language processing, simulation and recommendation engines.
The immediate applications for Omniverse Avatar include collaboration, customer service and content creation. At GTC, the company highlighted the platform's advancements by showing how an avatar could interact intelligently with a customer ordering food at a restaurant.
The technology further enables collaboration by removing barriers to interaction, including language and environment. The collaboration capabilities were demonstrated by showing how an interaction between two locations, with noisy backgrounds and different languages, could be seamlessly turned into a quiet, fully intelligible conversation.
How Nvidia will make money with Omniverse
The question that takes the Omniverse out of simulation and back to reality for investors is whether this is a moneymaker that can propel Nvidia's growth prospects forward. The answer is an emphatic yes. And unlike Meta's implicit admission that it has no clear route to monetization, Omniverse has a more immediate path to revenue, which CEO Jensen Huang highlighted in the Q&A portion of the analyst call.
Licensing will be a significant path to revenue. While Huang’s response in the Q&A to the first question, which was Omniverse-centric, was primarily philosophical, he clearly articulated licensing as an important route to monetizing Omniverse.
With licensing, Nvidia can monetize an estimated 40 million creators and designers, the potential user base for the Omniverse platform. Huang used a figure of $1,000 per user per year.

He further emphasized that those 40 million creators and designers would be augmented by intelligent agents such as vehicles. If there are 100 million vehicles with digital-twin-like avatars, those could also carry a per-vehicle, per-year licensing arrangement.
Customer service bots, which will require millions of intelligent bots to deliver smart retail and warehouse interactions, also share this licensing potential. Take hundreds of millions of intelligent bots and beings, and multiply by a licensing fee, and the business model becomes anywhere from interesting to game-changing for Nvidia.
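To put rough numbers on the licensing scenarios above, here is a minimal back-of-the-envelope sketch. The 40 million creator count and the $1,000 annual fee come from Huang's remarks; the vehicle count of 100 million is the hypothetical from the call, and the $100 per-vehicle fee is purely an assumption for illustration, not a disclosed figure.

```python
# Back-of-the-envelope Omniverse licensing math using the illustrative
# figures from the earnings call; these are examples, not forecasts.

def annual_licensing_revenue(seats: int, fee_per_seat: int) -> int:
    """Annual revenue for a per-seat, per-year license."""
    return seats * fee_per_seat

creators = 40_000_000   # creators and designers cited by Huang
creator_fee = 1_000     # Huang's example: $1,000 per user per year

print(f"Creators alone: ${annual_licensing_revenue(creators, creator_fee) / 1e9:.0f}B/yr")
# → Creators alone: $40B/yr

# Hypothetical add-on: 100 million vehicles with digital-twin avatars,
# at an assumed (not disclosed) fee of $100 per vehicle per year.
vehicles = 100_000_000
vehicle_fee = 100       # assumption for illustration only

print(f"Vehicles: ${annual_licensing_revenue(vehicles, vehicle_fee) / 1e9:.0f}B/yr")
# → Vehicles: $10B/yr
```

Even at modest per-seat fees, the multiplication across hundreds of millions of users and agents is what moves the model from interesting to game-changing.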
Perhaps even more encouraging is that Omniverse will serve as a catalyst of growth across Nvidia's entire portfolio, because building the metaverse requires tying together a series of fundamental technologies powered by immense compute resources. Nvidia offers all of them, whether GPUs in PCs, in the cloud or on dedicated Omniverse servers. In short, to build the Omniverse, computer graphics, simulation and AI have to be combined to create these multidimensional universes.
Huang went on in his remarks to suggest the revenue impact of Omniverse would likely be about 50% hardware and 50% licensing, numbers that will probably skew as the business evolves.
However, the point here is that even if the licensing model is still nascent, developers will need some serious hardware to build all of this, and much of it Nvidia will be ready to provide. That makes the Omniverse one more component of Nvidia's route to a trillion-dollar market value.
Daniel Newman is the principal analyst at Futurum Research, which provides or has provided research, analysis, advising, and/or consulting to Microsoft, Zoom, Salesforce, AWS and dozens of other companies in the tech and digital industries. Neither he nor his firm holds any equity positions with any companies cited. Follow him on Twitter @danielnewmanUV.