
NVIDIA GTC 2022

The Six Five team discusses NVIDIA’s GTC 2022 event.

If you are interested in watching the full episode you can check it out here.

Disclaimer: The Six Five Webcast is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we do not ask that you treat us as such.

Transcript:

Daniel Newman: First and foremost, GTC was a little over a week ago, and I had the chance to jump on CNBC and talk a little bit about it. So I’ll put that in the show notes. But listen, this is a huge developer conference. It’s a big moment, and if you haven’t been paying attention, NVIDIA’s been under a little bit of pressure over the last few quarters. After a meteoric rise that took it to nearly a trillion-dollar market cap, the company has come back to earth, probably after finding out, one, a lot of its GPU business was more closely tied to crypto than people expected, and two, as the supply chain has somewhat normalized, there’s been an inventory glut, and over the last quarter you might have seen a little revision to NVIDIA’s numbers.

Now we’re not here to talk about numbers, but I thought it was worth pointing out where we are. So the question I was asked is, well, did NVIDIA do enough at GTC to stimulate growth and get it back on a better path? Now the macro environment is crappy. Terrible policy all around. Hopefully some changes in November. But they did announce a new GPU architecture, Ada Lovelace, which is going to power its new GeForce RTX 4090. It’s got new ray tracing technology. They launched a whole new DRIVE platform, which, if you saw, and we’ll talk about the Qualcomm event later, seems like they need that. Now I want to focus, because there are too many things to talk about at once. I’m going to dive in and just talk a little bit about what the company announced with the Omniverse. Pat, for me, the most exciting thing was the company’s migration of Omniverse to the cloud.

Now we hear companies like Meta talking about a future in the metaverse. Well, the reality is that for us to get to a point where people can consume these immersive experiences, you need to be able to develop the software to do it. There’s a lot of compute that’s required to do this. There’s a lot of development that’s required to do this, and NVIDIA is very well positioned. So if you’re not familiar with the Omniverse, that is the answer that Jensen and the NVIDIA team have been talking about to basically enable developers to build applications for the metaverse. The company took it one step further with this announcement of their cloud services. So basically they’re going to democratize the capabilities and build out more tools that developers can utilize to build for a metaverse and an immersive future. And they’re calling it the Omniverse Cloud.

Now, there’s a bunch of little announcements. They have Nucleus Cloud, App Streaming, a Replicator, a Farm, and Isaac Sim, they call it. So first, with Nucleus, they’re focused on enabling teams of 3D designers to collaborate from anywhere and access scene descriptions, 3D scenes, and data. So when you’re developing these ecosystems, you’re going to have developers all over working together, like Figma, Pat, but for development environments. Then you’ve got App Streaming, which basically allows for the streaming of Omniverse reference apps. You’ve got the Replicator, which, as you know, in order to create simulations, you need to basically replicate real-world data, which can then be generated in this digital environment. So they created a tool for that. They have Farm for scaling work across multiple cloud instances on the Omniverse, and then they also did Isaac Sim, a scalable robotics simulation app. Now, try not to read all this.

It’s just so much at one time. So many things happening at once. Yeah, by the way, if I have one complaint, it’s that too many things were announced, and I feel like we’re not going to be able to do it justice because I’m not going to be able to talk about all the things that were announced. Now, the one thing I guess I’ll just say about the metaverse and the Omniverse solution as a whole is, I think we tend to think a lot about the applications like Facebook or whatever it’s going to be, but the real opportunity, in my opinion, is much more industrial. I’m thinking about the Omniverse, the Replicator, simulation. This is for designing buildings that are going to be able to withstand massive environmental challenges, for instance. This is for the ability to drive vehicles for millions of miles in a simulated real-world environment where we can test the stability of a system.

This is for the ability to collaborate on the design of complex products and services with teams around the world. This is where I really see the big opportunity path for the Omniverse in the metaverse. Of course, we’re going to get the applications where we’re going to be able to go into retail stores and be able to tour a new home. But I’m thinking more and more about the idea that you can have photorealistic, 3D, synthetic environments created that are going to be utilized. Is the future digital simulation? Think about healthcare applications, Pat, the ability to test surgical procedures. We’ve heard about this for a while. Let’s move in this direction. NVIDIA’s building a lot of tools for it. And with the further democratization of Omniverse, I think it’s going to be more available and we’re going to see more to come. It’s one of the most exciting areas, in my opinion, for NVIDIA’s future.
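To make the Replicator idea discussed above a little more concrete, here is a minimal, hypothetical sketch of the domain-randomization workflow that synthetic-data tools of this kind rely on. Every name in it (SceneParams, randomize_scene, render_scene) is an illustrative stand-in, not NVIDIA’s actual Omniverse Replicator API; the point is simply that you randomize a virtual scene, render it, and keep the ground-truth labels that come along for free.

```python
# Illustrative sketch of the domain-randomization idea behind a
# "replicator"-style synthetic data tool. This is NOT NVIDIA's Omniverse
# Replicator API; every class and function here is a hypothetical stand-in
# used only to show the workflow: randomize a scene, render it, and save
# the result together with its ground-truth label.

import json
import random
from dataclasses import dataclass, asdict


@dataclass
class SceneParams:
    """Parameters a replicator-style tool would randomize per frame."""
    object_class: str       # the thing we want labeled training data for
    position: tuple         # (x, y, z) placement in the virtual scene
    rotation_deg: float     # yaw rotation of the object
    light_intensity: float  # simulated lighting variation
    camera_distance: float  # how far the virtual camera sits


def randomize_scene(rng: random.Random) -> SceneParams:
    """Sample one randomized scene configuration (domain randomization)."""
    return SceneParams(
        object_class=rng.choice(["pallet", "cart", "forklift"]),
        position=(rng.uniform(-2, 2), rng.uniform(-2, 2), 0.0),
        rotation_deg=rng.uniform(0, 360),
        light_intensity=rng.uniform(200, 1200),
        camera_distance=rng.uniform(1.5, 6.0),
    )


def render_scene(params: SceneParams) -> dict:
    """Hypothetical stand-in for a renderer call; a real tool would return
    an image plus annotations (bounding boxes, segmentation masks, etc.)."""
    return {
        "image_ref": f"{params.object_class}_{params.rotation_deg:.0f}.png",
        "label": params.object_class,  # ground truth is known by construction
        "meta": asdict(params),
    }


def generate_dataset(num_frames: int, seed: int = 0) -> list:
    """Generate num_frames of labeled synthetic samples."""
    rng = random.Random(seed)
    return [render_scene(randomize_scene(rng)) for _ in range(num_frames)]


if __name__ == "__main__":
    dataset = generate_dataset(num_frames=5)
    print(json.dumps(dataset, indent=2))
```

Because the scene is generated rather than captured, the labels are exact and rare or dangerous scenarios, like the millions of simulated driving miles mentioned above, can be over-sampled without ever collecting them in the physical world.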

Patrick Moorhead: Yeah, that’s good stuff, Daniel. It’s funny, I’m thinking 30 years into the future, and I’m wondering if the Omniverse and metaverse are just this short stop until we get to full automation. Because if you think about it, we feel good that we’re having a digital twin because we are creating something digitally that is actually controlling something in the real world. And I’m just thinking, hey, if I look to the future and you look at autonomy, it’s not the human necessarily pulling the switch in this Omniverse, it’s just happening automagically. So anyways, I’m going to bring us back to ground here and talk about a few things that NVIDIA did on the edge. And it’s interesting, NVIDIA’s one of the few companies that can really lean into the future, and they came out with two entire lines of edge products that cover robotics, all the way from entry-level robots, think vacuum cleaners, to high-end robots that are in distribution centers moving carts around, which are called AMRs, autonomous mobile robots.

So the first announcement I want to talk about is Jetson Orin Nano, and think of this as entry-level robotics. They increased the performance by 8x, which is just absolutely nuts if you think about it. And that’s really close to an order of magnitude higher. Essentially what that means is that these little entry-level robots have to rely less on the cloud and can make more decisions while being unconnected. Think about drones for a second. You wonder, hey, how big is this market? Well, the Jetson AGX Orin has over a thousand customers and partners. Think of companies like Canon, John Deere, even Azure for the edge. So it truly is incredible how quickly this market is growing. You had made a comment about the Omniverse and medical, and looking at data. Well, NVIDIA brought out a brand new product called IGX for medical edge AI, which, in my opinion, has the capability to fundamentally change not only how we figure out what’s wrong with people, the diagnosis, but also preventative care.

I don’t know if you’ve seen this, but there are a bunch of startups popping up who are essentially scanning your body, and within half an hour or an hour are giving you the results of where they potentially found a tumor that’s no bigger than one centimeter. So hats off to NVIDIA. Think of the capabilities for a whole-body scan, or the ability to speed up time to diagnosis, where today you might have to, let’s say, ship the images off to an x-ray outsourcer or an image outsourcer, maybe in India or something like that. Imagine the ability of having the combined intelligence of a hundred thousand doctors to be able to read that data and get it back to you within five minutes, before you leave.

The psychological effects of knowing what’s going on, I’m sure you can empathize with the, hey, you go in, you get something done and hey, come back in a week and we’ll tell you if you have something awful. And then if you’re anything like me, I’m thinking about what’s wrong, what’s wrong, what’s wrong. But again, hats off to NVIDIA. They came out with this NVIDIA Clara Holoscan, built on IGX, which is a great example of these new types of platforms.

Daniel Newman: Yeah. Pat, the medical implications are huge. Love it. It’s a huge market. We saw Oracle buy Cerner, we’re seeing more and more of a vertical veneer, and then eventually it’s going to be depth, and this is going to help provide that depth.

Author Information

Daniel is the CEO of The Futurum Group. Living his life at the intersection of people and technology, Daniel works with the world’s largest technology brands exploring Digital Transformation and how it is influencing the enterprise.

From the leading edge of AI to global technology policy, Daniel makes the connections between business, people and tech that are required for companies to benefit most from their technology investments. Daniel is a top 5 globally ranked industry analyst and his ideas are regularly cited or shared in television appearances by CNBC, Bloomberg, Wall Street Journal and hundreds of other sites around the world.

A 7x best-selling author, including his most recent book, “Human/Machine,” Daniel is also a Forbes and MarketWatch (Dow Jones) contributor.

An MBA and former graduate adjunct faculty member, Daniel is an Austin, Texas transplant after 40 years in Chicago. His speaking takes him around the world each year as he shares his vision of the role technology will play in our future.

