NVIDIA’s Omniverse Announcements At GTC Further Validate The Merging Of Our Physical And Digital Worlds

Over the past month, we have been inundated with visions of a future where our physical and digital worlds collide. In the wake of a litany of bad press, Mark Zuckerberg, founder of Facebook, now Meta, wagged the proverbial dog by renaming the company and showing us a future where we immerse ourselves as avatars in a digital world of work and play bound by no physical or geographical limitations: it will just be us, anywhere, with anyone, doing just about anything.

To get to this next phase, Zuckerberg claims Meta is carving out at least $10 billion, and that is likely a conservative estimate. The company will need to develop what it refers to as the Metaverse, which will take a massive onslaught of technology and development. At its recent Ignite event, Microsoft also laid the groundwork for its version of a metaverse, most notably built around its productivity platform, Microsoft Teams. With 200 million-plus monthly active users hosting meetings on Teams and a world that has become uniquely familiar with remote meetings, the idea of entering a hybrid world where our digital selves can meet, collaborate, produce, and feel more connected makes sense. Per a recent joint announcement, Microsoft plans to partner with Meta's Workplace platform to accomplish this.

However, to build this future where the digital and physical worlds meet, there is a massive requirement for software, frameworks, data (real and synthetic), and models to create a lifelike, high-quality experience. And while Meta and Microsoft may be garnering many headlines, NVIDIA came out of this week's GTC event with a compelling case that the company will be a critical contributor to the Meta, ahem, Omniverse. Moreover, NVIDIA is quite adamant that the term Metaverse is far from the accepted nomenclature for this next-generation technology.

While Mr. Zuckerberg was talking about the long road to this future state, NVIDIA was busy at this year's GTC rolling out new offerings in its Omniverse portfolio that will undoubtedly be instrumental in developing these parallel worlds.

Synthetic Data Created in a Virtual World, for Real-World Applications

The first announcement from NVIDIA was around synthetic data. The company calls it the “Omniverse Replicator Synthetic Data-Generation Engine for Training AIs.” A mouthful for sure, but to reach a state where lifelike avatars look, sound, feel, and engage like us, we not only need an enormous volume of data, we must also create an entire parallel world of digital twins where we can optimize autonomous vehicles and robots. That will require even more data, specifically data that can be generated in a virtual world.

The Omniverse Replicator lets autonomous vehicles and robots train against diverse and rare scenarios and conditions that would be difficult to replicate in the real world. These digital twins generate synthetic data that fills gaps in real-world data, data that will be critical to the evolution of AI. Building and optimizing technology in a virtual environment first also brings down the cost of developing it in the real world. Autonomous vehicles and robots are, as I mentioned, two easily understandable examples. And because we are talking about a fully digital environment, creating an onslaught of invaluable data in a condensed time frame is highly achievable.
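To make the idea concrete, here is a minimal sketch of what domain-randomized synthetic data generation can look like. To be clear, this is not the Omniverse Replicator API; SceneParams, sample_scene, render_scene, and every other name below are hypothetical stand-ins for a simulator's real interfaces. The point is the pattern: randomize the rare conditions, render them, and harvest perfectly labeled samples at whatever volume you need.

```python
# Hypothetical sketch of domain-randomized synthetic data generation.
# None of these names come from NVIDIA's tooling; they illustrate the
# randomize -> render -> label loop that a replicator-style engine automates.
import json
import random
from dataclasses import dataclass, asdict


@dataclass
class SceneParams:
    time_of_day: float      # hour of day, 0-24
    rain_intensity: float   # 0 (dry) to 1 (downpour)
    pedestrian_count: int
    camera_height_m: float


def sample_scene(rng: random.Random) -> SceneParams:
    """Randomize the rare conditions that are hard to capture on real roads."""
    return SceneParams(
        time_of_day=rng.uniform(0.0, 24.0),
        rain_intensity=rng.betavariate(0.5, 2.0),  # mostly dry, occasional storms
        pedestrian_count=rng.randint(0, 30),
        camera_height_m=rng.uniform(1.2, 1.8),
    )


def render_scene(index: int, params: SceneParams) -> dict:
    """Stand-in for the simulator's renderer: in a virtual world the
    ground-truth labels come for free alongside every frame."""
    return {
        "image": f"frames/{index:06d}.png",
        "labels": {
            "pedestrians": params.pedestrian_count,
            "wet_road": params.rain_intensity > 0.3,
        },
        "params": asdict(params),
    }


if __name__ == "__main__":
    rng = random.Random(42)
    # Thousands of labeled, reproducible samples in seconds, no test track required.
    dataset = [render_scene(i, sample_scene(rng)) for i in range(1000)]
    print(json.dumps(dataset[0], indent=2))
```

In a real pipeline the renderer would be a physically based simulator and the labels would include bounding boxes, segmentation masks, and depth, but the economics are the same: edge cases become a parameter sweep rather than a road trip.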

Your Avatar is Waiting

The second announcement from NVIDIA will be more easily digestible for the technologically normal who want to understand how we get inside the machine. With the advent of NVIDIA Omniverse Avatar, the company is now offering a technology platform for generating interactive AI avatars. The name effectively gives it away, but this stuff is technical enough, so no complaints here. Bottom line: in order to have a metaverse, or Omniverse, or whatever we ultimately settle upon from a nomenclature standpoint, we must first create intelligent virtual versions of ourselves and others. NVIDIA CEO Jensen Huang referred to them as intelligent virtual assistants, and in the company's release he said, “The dawn of intelligent virtual assistants has arrived.”

Much like the company's early inference engines Jarvis and Merlin, developed for conversation and recommendation respectively, NVIDIA has come up with two real-world examples: Project Tokkio for customer support and Project Maxine for video conferencing. Demonstrations included avatars conversing on climate science and a customer service avatar engaging with restaurant customers to take their orders. Project Maxine showed the power of what collaboration could look like, and something I imagine the Meta/Teams tie-up above would want to consider. In one example, the company showed how people using collaboration technology in noisy environments and speaking different languages could hold a seamless conversation, down to the inflection in each participant's voice.
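Conceptually, that demo chains together several stages: suppress the noise, transcribe the speech, translate it, and resynthesize it in the speaker's own voice. The sketch below is purely illustrative of that pipeline shape; every function in it is a hypothetical placeholder rather than an NVIDIA API, and a real system would back each stage with trained models running in real time.

```python
# Illustrative pipeline for a noisy, multilingual video call.
# All functions are hypothetical placeholders, not NVIDIA APIs; they exist
# only to show the denoise -> transcribe -> translate -> re-voice flow.
from dataclasses import dataclass


@dataclass
class AudioFrame:
    samples: list[float]   # raw waveform chunk
    language: str          # e.g. "fr"
    speaker_id: str


def denoise(frame: AudioFrame) -> AudioFrame:
    # Placeholder: a real stage would run a neural noise-suppression model.
    return frame


def transcribe(frame: AudioFrame) -> str:
    # Placeholder for speech-to-text in the speaker's source language.
    return "<transcript>"


def translate(text: str, source: str, target: str) -> str:
    # Placeholder for neural machine translation.
    return f"[{source}->{target}] {text}"


def synthesize(text: str, speaker_id: str, target: str) -> AudioFrame:
    # Placeholder for TTS conditioned on the original speaker's voice and inflection.
    return AudioFrame(samples=[], language=target, speaker_id=speaker_id)


def relay(frame: AudioFrame, listener_language: str) -> AudioFrame:
    """End-to-end path for one chunk of audio from speaker to listener."""
    clean = denoise(frame)
    text = transcribe(clean)
    translated = translate(text, clean.language, listener_language)
    return synthesize(translated, clean.speaker_id, listener_language)


if __name__ == "__main__":
    incoming = AudioFrame(samples=[0.0] * 160, language="fr", speaker_id="alice")
    outgoing = relay(incoming, listener_language="en")
    print(outgoing.language, outgoing.speaker_id)
```

The hard part, of course, is running all of these stages with imperceptible latency while preserving each participant's voice, which is where GPU-accelerated models come in.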

Omniverse, Metaverse, Regardless, NVIDIA Will Be a Massive Player

NVIDIA's arrival as an essential contributor to this emerging space, whatever we wind up calling it, shouldn't surprise anyone. The company has all but been anointed the king of AI training, and its GPUs are held in the highest regard by gamers and enterprises alike. Furthermore, NVIDIA regularly sees its GPUs come out on top of the MLPerf benchmarks, an important and highly regarded indicator of ML performance for the most popular accelerated workloads. NVIDIA has also been a staple in the development of autonomous vehicle technology, and its hardware has powered popular Tesla models. Just this week at GTC, the company announced NVIDIA DRIVE Hyperion 8, its platform aimed at bringing fully autonomous driving to the mainstream.

In summary, the physical and digital worlds we live in are quickly colliding. We are only at the beginning of this journey, and we are already hearing endless tech companies narrate their intentions to contribute to this ecosystem. NVIDIA, however, has been building toward this with Omniverse for a long time. The company is not only going to be a part of this story; it will contribute to it in a meaningful way, setting a high bar for others to try to emulate.

Disclosure: Futurum Research is a research and advisory firm that engages or has engaged in research, analysis, and advisory services with many technology companies, including those mentioned in this article. The author does not hold any equity positions with any company mentioned in this article.

Author Information

Daniel is the CEO of The Futurum Group. Living his life at the intersection of people and technology, Daniel works with the world’s largest technology brands exploring Digital Transformation and how it is influencing the enterprise.

From the leading edge of AI to global technology policy, Daniel makes the connections between business, people, and tech that are required for companies to benefit most from their technology investments. Daniel is a top-five globally ranked industry analyst, and his ideas are regularly cited or shared in his television appearances on CNBC and Bloomberg, in the Wall Street Journal, and by hundreds of other outlets around the world.

A seven-time best-selling author, most recently of “Human/Machine,” Daniel is also a Forbes and MarketWatch (Dow Jones) contributor.

An MBA and former graduate adjunct faculty member, Daniel is an Austin, Texas transplant after 40 years in Chicago. His speaking takes him around the world each year as he shares his vision of the role technology will play in our future.
