
We are Live! Talking SAP, Dell, Microsoft, NVIDIA, Marvell, Lenovo

On this episode of The Six Five Webcast, hosts Patrick Moorhead and Daniel Newman discuss the tech news stories that made headlines this week. The six handpicked topics for this week are:

  1. SAP Sapphire 2023
  2. Dell Technologies World 2023
  3. Microsoft Build 2023
  4. NVIDIA Q1 Earnings
  5. Marvell Q1 Earnings
  6. Lenovo Q4 Earnings

For a deeper dive into each topic, please click on the links above. Be sure to subscribe to The Six Five Webcast so you never miss an episode.


Disclaimer: The Six Five Webcast is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we ask that you do not treat us as such.

Transcript:

Daniel Newman: Hey, everybody. We’re back, and this is another episode of The Six Five Podcast, the weekly show. It’s episode 169. We were away for a week and a half, but don’t hold it against us. We’re flying around the world, we’re telling stories, we’re singing songs. It is the week ahead of The Six Five Summit, but you know what, Pat, I missed everybody, I missed you, and we had to get back and I’m so glad to be here. How are you today?

Patrick Moorhead: I’m doing great. We really did have to get back, and staying on that weekly cadence is easy except when you’re, in addition to your day job, shooting 62 videos for The Six Five Summit 2023 coming up June 6th to June 8th. It keeps you busy, but there are very few venues where you and I can riff on so many important topics. Well, maybe that’s not true at all, I guess, social media, CNBC, stuff like that, where it’s not like a full-blown white paper or some column or blog. First, I like to get on as an excuse to just see you, and then I just stare at myself for the entire video.

Daniel Newman: That’s normal, by the way. There’s nothing narcissistic about the fact that you have the self-view up and you spend a lot of time looking at it. Everybody, do Pat a favor. Jump on Twitter. Tell him this is normal. He is a handsome guy. I wouldn’t have ever recorded with Pat if I hadn’t noticed the handsomeness. No, in all seriousness though, we have a great show this week. There is so much to cover, and we did miss a week, so we are playing a little bit of catch up. We won’t catch up on everything, but if you want to see what we’ve been doing and thinking, check out our Twitter streams. We pretty much leave it all out there for you. There’s a lot there. You can even see me riding a bull if you want, but this week, we’re going to hit a few of the events over the past few weeks.

There was another wave of earnings. So we’ve got SAP Sapphire. We’ve got Dell Tech World 2023. We’ve got Microsoft Build. We’ve got NVIDIA’s earnings. We’ve got Marvell’s earnings. We’ve got Lenovo’s earnings. There were some other things going on, and like we said, we do have our Six Five Summit coming up. We hope you’ll check it out. We put a link in the show notes. Register for it. It’s going to be kicked off this year by the CEO of Broadcom, Hock Tan. He’s going to be talking about the VMware acquisition, which is very interesting, and some other things about Broadcom.

Pat, we got a ton of content to cover. I know you hate when I drag on the intros too long, so I’ll do one more thing. This show is for information and entertainment purposes only. So while we will be talking about publicly traded companies, please do not take anything we say as investment advice. Got that? Do the opposite or actually don’t do anything. Just listen and be entertained.

Pat, I know that your favorite thing to say when you’re hosting and you’re the lead host on the show is, “I’m going to go ahead and call my own number on the first one.” So that’s what I’m going to do. I’m going to go ahead and call my own number, and let’s start off talking a little bit about SAP Sapphire 2023.

Now, we were moving, shucking, and shaking that week. I was at Lattice Semi’s Investor Day. I also went to ServiceNow’s Knowledge event, and I was virtually attending SAP Sapphire. SAP put on a big Sapphire conference this year. Christian Klein, the CEO, kicked it off and really focused on business AI. So as we’ve heard generative AI come to the surface, and we’ve heard that term come to the surface, I think the big inflection for all of us is, “Okay. These big large language models are table stakes now. We’ve already heard Google and Microsoft say they don’t necessarily have a moat with their big chat solutions for the open internet,” but we’re all leaning on a common narrative that the business or enterprise AI opportunities are huge.

We went to IBM Think. We came back with watsonx. Well, SAP’s Sapphire event was all about business AI. So the company really did want to focus in on telling a story, across its massive ecosystem of hundreds of thousands of SAP customers, that it is building what it likes to call AI for business.

A couple of things that I took note of that were important. First of all, the company was really big on partnerships at this year’s event. So they announced partnerships with Microsoft and they announced partnerships with Google. So they’re going all in on choose your own adventure as it pertains to connecting to the public cloud. Now, remember, SAP is all about platforming, so you can use the public cloud of your choice and they’re not necessarily steering you toward any one of them, well, you can’t get any hyperscale infrastructure as a service from SAP anyway. So that was the first thing that they were doing.

They were also very focused on their sustainability footprint through generative AI. They were basically saying that they’ve extended their generative tools to help companies manage carbon better, and that’s been a big part of the SAP story, the sustainability footprint management, which is a new tool. It’s something that they launched at this year’s event.

Then I think the third thing that was really interesting was they did announce a very cool Microsoft 365 Copilot and Viva Learning integration with SAP SuccessFactors. So as you know, one of the great applications, I think, for generative AI is going to be HR. You have the ability to understand and profile the employees and personnel in your organization. You can create deterministic factors of success and less likelihood of success to help identify better candidates, to be able to more immediately address skilling, to have more successful review processes, and to make sure that we’re putting people on the right path.

We know that retention of talent is a huge issue in this world, and so AI and generative AI and the tools they could create seem to have a really big implication there, and taking SAP data and combining what Microsoft is doing with SuccessFactors was a great example of an AI for business use case.

By the way, this seemed to be thematic throughout the whole event. So the company is focusing on BTP, its Business Technology Platform, adding AI capabilities, delivering on partnerships, creating generative tools that are going to be based upon a rich set of meaningful proprietary data, and that’s where generative AI is going to really show its value in the enterprise.

The last thing I’ll say is I believe SAP has created a nice moat in terms of architecture with business AI because it seems like everyone else is saying enterprise AI, and I like the idea that maybe SAP can really hug the business AI line and utilize it as something that’s definitive to SAP’s value proposition, putting proprietary data, plus AI, plus open source large language models together to drive meaningful productivity and better outcomes for business.

Patrick Moorhead: Gosh, you did a great summary there. Net-net, at the show, SAP wanted to make the case, and I think it did a pretty good job, that, “Hey, we’re helping customers transform their business models in the cloud, get greater visibility around sustainability, and improve supply chain resilience,” and around that was this required adder of AI. There’s nothing new about AI, but what’s new is implementing it for hardcore business applications and also using the recent generative AI to do that.

So whether it’s ERP, whether it’s digging into that in the supply chain, I liked some of the examples of how AI is being used. So for instance, predictive replenishment as an example, intelligent product recommendations. Obviously, SAP has a pretty big HRM system with SuccessFactors, and they talked about improvements to align people with projects, people with opportunities. That just makes total sense. Again, while AI is not new, I think integrating it with the veracity that these companies are headed to on private content, as opposed to the Encyclopedia Britannica, I think, is big.

I don’t think we’ve scratched the surface of companies’ desire to get their arms around this, not only to deal with the short term financial headwinds that they have, but even looking into the future at how they can get advantages on their competitors and serve their customers better. I’m working on a writeup. It should be out next week. So check it out.

Daniel Newman: Hey, buddy. Good stuff. It was a good event. Like I said, time is flying. It’s actually already a couple weeks in the past, but I think that’s going to be a big future direction and business driver for SAP. I think they’re one of those key companies in terms of unlocking the potential of all that enterprise data.

Pat, we had a big week last week. This one is still fresh in the memory. We were out in Las Vegas for three days for Dell Technologies World, another great event. Why don’t you kick it off?

Patrick Moorhead: Yeah. So it was a good show overall. I think this is my 11th Dell Tech World, going back to when it was called Dell World. Sorry about the dog barking in the background. It was one of the clearest Dell Tech Worlds that I’ve attended in a while. The first thing I do when I come to a show like this is gauge the confidence of the senior management team, whether it’s Michael Dell, Chuck Whitten, or Jeff Clarke. They were confident, very confident in everything that they were saying. If there’s any company that knows how to weather a storm, the ups and downs, it’s Dell. They have the ability to do that.

More times than not, on the other end of the funnel, Dell has done something to put themselves in the lead in a certain market segment. I would say the big takeaway for me, and I wasn’t surprised because I’ve been advocating for the hybrid multi-cloud for more than a decade, I was doing it before it was cool, and it was good that I got that in, Dan, but it was good to see that was really the headline: putting in managed services to be able to best take advantage of the cloud.

Now, strategically, what the company is doing is using storage as its core starting point, and that makes sense, right? It’s a position of strength. If you can parlay storage into data and data into AI and making money across the multi-cloud, I think you have a good chance of cranking out a ton of business. There was a lot of talk about Apex, which is the overall as-a-service brand for pretty much everything as a service, even if it’s workforce solutions or future of work cloud. There was also, as you would expect, as we’ve seen at every show, AI discussions, and Jensen was on video from Nvidia with Jeff Clarke going through a new project that they’re working on to really enable companies, like we said in the prior piece on SAP, to work on private data sets.

Moving forward, I think Dell’s biggest challenge will be to convince people why these operations, these AI operations, are done best on the private cloud and on the edge. The edge is not going away anytime soon, but I think enterprises are trying to figure out, “Hey, where’s the best place to put this training and this inference?” Who knows? Maybe it’s training in one place and inferring in a different place. What I do know is the industry still hasn’t figured out federated learning at the edge, where essentially you could do everything between the cloud and the edge and just spread the inference goodies out there across everybody and make incremental changes.

All in all, I thought it was a really solid event. I think Dell likely scored some points, and with its competitors’ shows coming up afterwards, its more primary competitors, it’ll be interesting to see if this influences how their content is packaged or not. I had one senior executive tell me, I won’t name who that is, “Yeah, we created some problems for our competition going into this.” I saw it as more than just a bunch of ax wielding or ax shaking; I noticed it too, and that doesn’t make it true, but again, I’m really looking forward to seeing what the next few shows have in store and, hopefully, it’s multi-cloud. Hopefully, it’s multi-cloud fabrics because that is the future.

Daniel Newman: Yeah. So Pat, I think you called it out. There was the overarching theme that is AI, which is just finding its way into any and every conversation right now. Then there are the underpinnings, the picks, axes, shovels, and everything else that’s going to be required to actually deploy this. We’ll talk about Nvidia later, but the big boom right now is all about an arms race to deploy the volumes of infrastructure that are going to be required to train all these models and to deploy enterprise AI, probably starting with a lot of hyperscalers doing it as a service. Then of course, enterprises, larger ones, are going to have to determine a way to do it on their own premises.

Of course, data privacy, security, governance, these are going to be huge topics. So companies like Dell have a big opportunity to play in that plumbing, in that picks and axes part of the universe. Obviously, Dell has this big demarcation right now because you’ve got the devices part of Dell, which is huge, and it’s a little bit in a … You could say that it’s not Dell that’s in a rut, but the whole device ecosystem is in a bit of a rut. We had that huge buying driven by the 2020 and 2021 work from home events, and now you have a shift away and everything’s more infrastructure focused.

So Dell has a ton of potential there, A, just continuing its leadership in storage, servers, all the things that the company has to sell to enterprises, and then, B, they have a richer set with Dell Apex, which is their as-a-service and on-prem as-a-service offering. That’s moving forward. Of course, there’s a lot of competition there. This is the multi-cloud universe, and what’s happening is big hyperscale cloud is moving towards prem and the prem providers are moving towards the cloud, and everyone’s meeting at a different destination.

Now, Dell did 102 billion dollars in revenue last year, 102 billion, yet the market cap of the company is under 40 billion, which is just a remarkable gap in terms of value. So that makes Dell one of the more undervalued assets in the marketplace given its size, scope, scale, and revenue. Pat, we talked to the head of services, Doug Schmitt. They have 60,000 people just to provide global service to their customer base. It is a tremendously large organization.

Pat, Project Helix is interesting. It sounds to me like another move for these companies to pre-configure and build out imaged servers and devices that can be quickly deployed to be leveraged and utilized for generative AI applications. I expect that to be something that will be launched across the industry, but some of it was good, some of it was interesting. Of course, Jeff Clarke’s demo with Gen, the AI, was very funny. He was talking to Jen Felch, their CIO, but at the same time, he did a really great demo where he was talking to Gen, with a G, the AI.

My take, Pat, as a whole is Dell has a lot of the core technologies and requirements to be a big player in this transition. They also have a lot of proving to do in this particular moment in time because the cloud providers are definitely trying to make it very lucrative. They’re trying to make it very subscription driven, and all of the on-prem consumption services are still light compared to what the public cloud providers are able to offer. I think Dell made some good progress and I think that the fact that multi-cloud will not just be multiple hyperscale clouds but will be multiple clouds, edges, telco does create a compelling argument for the real architecture that most companies are trying to build against, and Dell does have a broad set of solutions to support those customers.

So also just a quick note of appreciation to the leadership, Michael, Jeff.

Patrick Moorhead: Made themselves available, didn’t they?

Daniel Newman: They really did, Pat. We spent a lot of time with Sam. You did a great interview with John Roese, the CTO. There was just a lot of access. I can never get over just how generous Michael Dell is with his time, and he does really value the perspective of the analysts and he takes his time to make sure the analysts are given the information they need.

All right. Let’s keep running here. Let’s move to the third topic, Microsoft Build. That was another overlapping event. I’m pretty sure you and I landed in Vegas and jumped on our laptops to watch the Analyst Day at Microsoft Build. Look, this year, Pat, I don’t even know what to say anymore. Should we just call our show like AI Six Five or Six Five on AI?

Patrick Moorhead: At least now, and listen, analysts do well in times of pandemonium and we’re in pandemonium right now, right?

Daniel Newman: Lord, Lord, yes. So this was an AI-centric event. It was an AI layer cake of everything. It spread across Microsoft’s portfolio. It spread from consumer, a lot of focus on that. So some highlights, ChatGPT got Bing integration. Bing added a bunch of new plugins. Windows 11 now has a ChatGPT-powered AI assistant. Windows 11 got some cloud OS backup and restore. Then I’m going to leave this one for you because I know you do a lot of devices, but there were some pretty big Windows 11 on Arm updates too that came out of Build, if you want to cover that one. I don’t have the depth on that one, but I thought there were some moves that were made there.

I also noted that there were a few things for the enterprise, Pat, that were notable too. Azure AI put out a number of governance updates for content safety. There was, I guess, a GA now of the ready-to-use document and conversation summarization for Azure Cognitive Service for Language. Then Microsoft, and this is probably one that you and I will definitely want to come back and talk more about at some point, had the reveal of Microsoft Fabric.

So as we know, we’re going to have a lot of disparate data that’s going to live across the enterprise or organizations, public, private, hybrid. You’re going to have structured, unstructured. We’ve been saying for a long time that in order to do the most important ML and analytics, you needed to have the right fabric. We often talk about companies like Cloudera that have very interesting hybrid fabrics, and we talk about this, but you know the public cloud providers are going to start to build this.

They’re going to start to create a more seamless fabric for, A, data management, and then of course, B, there’s going to be a ton of requirements for fabric for networking because for all this stuff to happen, it’s going to have to happen very efficiently, very low latency. There’s going to have to be cost management because it’s going to be extraordinarily expensive to be building, training, and inferencing all this data. So Microsoft, as you could expect, is launching a new fabric.

So my take, Pat, across the board on the Microsoft Build event is it was all about what the company’s done with its OpenAI investment. There were a number of additional Copilot capabilities announced, ChatGPT capabilities announced, Azure AI capabilities. This is the year of AI for Microsoft. If you don’t see it involved in some part of the portfolio, you’d have to ask yourself why, but as of right now, Microsoft is all in from device to the cloud, to the edge, and this is another event to reinforce its strategy.

Patrick Moorhead: After getting the generative AI jump a few months ago, it’s amazing. I think it was February. You and I were at the unveiling up in Redmond. They announced here at Build exactly what you would expect, which is, how do I extend generative AI to my developer community? It’s a very diverse developer community. On one side, you have Azure, and on the other side, you have Windows and devices, and then you have everything in the middle. So what did they do that you would’ve expected? Growing this AI plugin ecosystem. How do I plug into and leverage ChatGPT services inside of Microsoft? That’s Bing, that’s Dynamics 365 Copilot, that’s Microsoft 365 Copilot. How do I plug in to that environment? So that was obviously a big one.

Then some enhancements to Azure AI Studio to increase the simplicity, and I’ll call it power, of plugging into all of Azure AI, including content safety. Now, one of the questions that I know enterprises have is, “Hey, how do I manage safety here? Is it going to be Microsoft doing this or is there going to be the ability for me to fine tune?” I think the answer was there all the time. You have different states, you have different countries, you have different regions, you have different ways that companies want to moderate content. I think the more that Microsoft takes itself out of the loop, the better. Otherwise, it gets very political.

I’m going to go on to Microsoft Fabric. I want to dive into that. What I’m trying to figure out is what it really is and what it isn’t. Is this a repackaging of some of the tools that they’re doing? By the way, I like fabrics. I get excited about fabrics. I can’t contain myself when we’re talking about fabrics, but there’s a lot of work that I have to do to get underneath this.

So Microsoft always has to watch how much content they apply to each area. Now, I thought there would be more discussion about on-device AI, but I think Microsoft weighed that heavily. Now, Panos Panay, the Chief Product Officer, had a very good blog that came out and, I think, crystallized what they’re doing. This is a thing called the hybrid AI loop that supports AI development across platforms, across Azure and clients, supporting AMD, Intel, Nvidia, and Qualcomm.

I knew, looking in my crystal ball, that the company would have to support these four vendors even though right now, from a performance per watt on the right type of TOPS, I think Qualcomm has the lead. Microsoft did their studio effects first and deeper on Qualcomm. While we don’t have exact specifics about the next generation Oryon platform based on the Nuvia core, I am hearing that it is gigantic, and then there’s the ability to operate in a hybrid AI mode. You have the cloud on one side, you have the device on the other, and then you have a hybrid mode in the middle.

I think it’s pretty clear how you architect for and build for the ends of the spectrum, but I think this hybrid element where, “Hey, let’s just say you don’t have enough oomph at the endpoint and you need to, for lack of a better term, burst to the cloud,” that is going to take some architecture on determining when you burst. There might be some workloads that aren’t burstable because you don’t want to be sending in any information into the cloud, but I do believe that that is the future of computing, which is this hybrid element of AI.

Qualcomm and Intel had some very focused posts out there showing how they’re supporting it. Qualcomm talked about Oryon, and Intel talked about Meteor Lake and how it was enabling the VPU, I can’t stand that name, the VPU, I think a video processing unit, but talking about how it’s going to be supporting it. I do think in the end, from a performance per watt in the right type of AI, Qualcomm has the early lead here, and I do appreciate the entire company rallying around edge AI when a lot of companies who you would expect to be rallying around it and really driving it are not, but, hey, Qualcomm has the lead, they’re going to drive it.

By the way, for Qualcomm, it’s not just about the PC, right? It’s about the smartphone, it’s about the PC, it’s about the industrial IoT edge, it’s about the car. I know that I’m going to be doing a lot more writing on this in the future.

Net-net, I think it’s good for consumers and I think this is good for businesses, and if nothing else, it is going to shake up the landscape like we haven’t seen in probably more than a decade. Daniel, I had my doubts early on whether this was the smartphone moment, and I say that tongue in cheek because the smartphone moment actually happened 10 years before the smartphone moment that I think most of us think about, but it is going to change everything. It’s not that it’s about me, but it’s changed my workflows. I know your company, The Futurum Group, is doubling down on, what do you call it, an AI analyst capability.

Daniel Newman: AI analyst, analytics, and content.

Patrick Moorhead: Yeah, and I have a couple engineers looking at a few options for my company as well. Exciting stuff.

Daniel Newman: It is exciting, Pat. By the way, I do think you leaned in a lot on the devices, and with Qualcomm, I really do believe that it’s going to be on these processor companies, Intel and Qualcomm, to really show the value of the content in their chips in terms of how it’s going to empower this AI continuum and all these hybrid AI applications. Pat, it’s not that different than 5G in my mind. What the market needs, if you want to see the boost in valuation for these device companies or these companies that provide processors for devices, is for us to start to be able to quantify how these devices are actually going to take on and do these workloads, because if it can all be done in the cloud on some light client, then the value of all that on-device processing is less. So we’ve got to show how that’s going to work. I think that’s going to be something we’ll start to hear more and see more of from Intel, Qualcomm, AMD, and others.

Pat, your number. I was just trying to fill a little air there because you went for a while on Microsoft, give you a chance to breathe, but let’s go on to Nvidia because this is the topic du jour, the topic du jour of the week. It’s yours.

Patrick Moorhead: I know. I know. Just when you think you’ve got the topic du jour, a new one comes along. Let me hit the lead upfront. Nvidia upped their forecast for the next quarter by four billion dollars.

Daniel Newman: With a B.

Patrick Moorhead: With a B, which increased their market cap by more than their competitors are worth. That got a little bit of attention. The side story is they beat revenue by 10%. They beat EPS by about 20%, but it was that upward guide that got literally everybody talking out there. Forget that their gaming market is down precipitously, right? Forget about Professional Workstation being down. Auto was up, good. Nobody cared. Everybody cared about what generative AI meant. Let me try to break it down for you, and hopefully it’s pretty obvious if you’ve listened to the pod, that you need something to first of all train these large models, and these large models can be text, which is an LLM. It can be based on images like Stable Diffusion, and it can also be videos, where you can pull down a model from Hugging Face and have fun.

You probably saw the Joe Rogan fake video podcast that was cranked out by essentially putting in a text script and ingesting hours and hours of his video, and boy did it look real, right? So just back of the envelope, think of these types of models, these large models, taking 10x the capabilities, the GPU capabilities, to train. The reason I say GPU and not ASIC is that, quite frankly, Nvidia is the standard right now for training as it’s just so flexible. You want to do 27 different types of models, GPUs are your best bet.

Now, Nvidia does put little tiny ASICs in there to accelerate things like transformer models and things like that, but in the end, it’s just the raw horsepower of that GPU and the flexibility that keeps that going. I was on Street Signs last night, CNBC Street Signs, and there was a lot of debate on, how long does this go? Let’s just say for a second the training of these new generative AI models is only with the largest of companies. I can see that. The inference though, when you then take those models and you run those models and you blend those against your private data, that’s going to be an opportunity that I think is going to be there for everybody, whether it’s Intel, AMD and, of course, Nvidia.

I think with Nvidia’s training, we’re probably looking at a three-quarter phenomenon. I’m not going to say before we’re done, but before we see that getting back to what I would call normal growth. There are other companies then, as we’ll talk about, like Marvell, who are seeing the benefit of that, if nothing else, to string together these GPUs. So again, nothing else mattered other than that data center number, and that data center number was driven by generative AI training this quarter.

I think that the way that I look at this is give Nvidia time for these AAA games to come out, and also for these tough year over year comparisons on the gaming side. If I look at gaming, last year they were all in the three billion range. In fact, a year ago, that gaming number was 3.6 billion. Now, it’s 2.2 billion. NFTs flamed out, and even though Nvidia did its best to segment gaming cards just for that, they neutered gaming cards for crypto, what happened is, given that there were so many different types of NFT and so many different types of crypto, the GPU just became, even if it was neutered for, let’s say, Ethereum, the ultimate device for all the other versions of it. So what we’re seeing again is a recalibration of Nvidia’s gaming number. There we go.

Daniel Newman: All right. So there you have it. Metaverse is on pause, but AI is all the rage. Now, in the future, I do see there’s going to be a coming together, by the way, of these two things. People aren’t fully appreciating it just yet, but there will be, and this will, of course, fuel a bit of a web3 revolution too because I do believe we’re going to need to have some way to create tokenization of content and assets because as fake becomes easier to create, how do we separate real from what’s not real? That’s going to be increasingly difficult, and I think that will be an opportunity for some of these different applications.

Look, killer quarter. Don’t mistake it, two years ago I said Nvidia would be the next trillion dollar company, when I wrote that on MarketWatch. It wasn’t a mistake. It was obvious. AI had the biggest upside. It’s going to revolutionize industries, and Industry 5.0 will be the industrial revolution fueled by AI. Last summer I was beaten up by the whole Twitter universe for going on CNBC and saying, “Hold on.” I picked two. It was at their absolute lows. It was Nvidia and Microsoft. Nvidia had fallen like 75% and I said something along the lines of, “This is going to be the opportunity of a lifetime.”

Pat, I don’t know what I would do if I didn’t spend at least 10% of my day doing victory laps, but I do like when I get it right. So what we’ve gotten right here is that every company is buying every single piece of silicon they can get their hands on in order to be ready for the inference boom. So right now, you leaned into training, Pat. This is what it’s about. Right now, everybody understands that there’s a huge requirement to be able to train all their data, and that means tons and tons of hardware GPUs will be required. You can probably glean that the largest swath of this massive four billion dollar guide up is hyperscale, meaning these folks are going to be building out the infrastructure to be able to support the next wave of infrastructure platforms and software that are going to be built on top of things like Azure and AWS.

Of course, we will see different flavors and varieties, but there is no end-to-end system in the market that’s ready right now the way that Nvidia is. So Nvidia has a huge lead and a huge gain, and this has been on the backs of … AMD has not been able to get the whole stack and the software right. Intel’s lagging behind. There are some ASICs. Google’s built their TPUs, and Meta’s building some proprietary AI capabilities through hardware. We know companies, Pat, you and I have tracked Groq and SambaNova and the companies that are building some specialized ASICs and accelerators. There are some efficiencies to be created there, but Nvidia has the corner on this particular market right now.

The interesting thing has been the second wave and we’ll talk about Marvell and Broadcom and some of the others in a future segment of all the other companies that are getting a similar boost to what Nvidia got. Look, Nvidia added … Pat, I don’t know if you said this. I was trying to find the stat while you were talking. So something along the lines of 260 billion dollars in market cap in 24 hours. I just want to pause, just take a breath. Okay. Let’s add up what that is. That’s Dell, Intel, and AMD. I think that’s the three of them together. That is the entire market cap of-

Patrick Moorhead: Intel is 120. Actually, AMD is 204.

Daniel Newman: Okay. AMD went up big though after this all happened. The day of, AMD was 160 at that time because I looked it up. IBM’s 120 or 110, but my point is it’s bigger than Qualcomm and Intel together right now. So the fact of the matter is that’s not the … and it breached a trillion dollars in a day. So the fact of the matter is that we’re just seeing the beginning of what this is going to be. I’m pretty sure that Nvidia will have a hard time keeping up with its demand for many, many quarters to come on the enterprise side to try to support the build out at both the enterprise and cloud scale.

Interestingly enough, it secretly covered for the fact that gaming is soft, ProViz is soft, Omniverse is soft, crypto is soft, automotive is soft. So it’s interesting because this one particular category right now is taking all the weight. Truly, nobody else cared. The only other thing I’m going to say here is I’m just going to seed this idea. Nvidia, I have to imagine, at some point is going to be under some interesting pressure with its end-to-end closed ecosystem when everything gets built on it. It seems like it’s ripe for some regulatory oversight. I’m not sure. I just mean we’re spending months trying to pass something like a Broadcom VMware deal that has almost no definitive antitrust or anti-competitive attributes to it and certainly nothing that couldn’t be handled with a couple of concessions.

You now have closed software and hardware and, by the way, maybe some bundling things going on with the fact that, I’m pretty sure, from what I understand, Pat, you can’t buy an A100 as just the silicon anymore. You’re buying systems. Everybody’s saying you have to buy the whole systems top to bottom now. So I’m just saying it’ll be interesting to see if that gets explored.

Patrick Moorhead: A few points on that. Something has to give, Daniel, and that could be regulatory, it could be a big competitive move. As we discussed on the show too, there’s DGX Cloud and the way that that’s operating, where it’s more the gaming model where Nvidia creates all the value and then it’s distributed to others. Something’s going to change, and it could be a combination of all three, but Nvidia does need to watch themselves in how they move forward, particularly given the regulatory environment.

Daniel Newman: Well, Pat, I think it’ll be an interesting thing to watch and explore as they’re trying to regulate the large language models. The only way these things are being trained and built is in Nvidia’s sandbox. I mean, that’s where this is all happening right now. So like I said, you cannot easily move any of these workloads off the Nvidia platform. At this point, it’s still very difficult. Will it get looked at? I don’t know. I think our regulatory bodies don’t seem to pay attention to the most obvious antitrust issues. So I wouldn’t be surprised if this one floats under, but I guess I’m saying it was an amazing quarter for Nvidia, but if you’re looking for maybe something to be cautious about, it’s the fact that they are the only game in town right now.

Patrick Moorhead: They’re the only game in town for training. That is, for production-level, highest-performance training.

Daniel Newman: Yup. There you go, and thanks for qualifying all of that. All right. So speaking of companies that have gotten massive boosts, let’s talk a little bit about Marvell, Pat. You and I went out to San Jose and actually had time to spend some time with the executive team there. We didn’t talk about earnings, of course, because it was the day before, but it’s always good to see Chris Koopmans and team out there, and we appreciate their hospitality, but what we were asking about is, is there an AI story here for Marvell? So obviously, Marvell saw a bit of a Cinderella story over the course of the ’20-’21 years as the custom silicon provider for cloud and networking.

The company really has a ton of the content required for anything optical, and it’s been a huge success in that particular industry, but as the market pulled back and semiconductors slowed down, they were a company that got stuck with a little bit of inventory. They had some slowdowns related to the companies they were supplying, and they saw their stock fall by 50%, 60% during that period of time. So it was this big peak and then this big valley. The question marks were, what’s going to be next for the company? I think everybody had a question, is there an AI opportunity?

So this quarter, Matt Murphy, CEO, came out and the company had a beat, but it wasn’t a massive beat. I think it was a few cents on the earnings. It was a few dollars on the revenue, but they came out and they basically said they have about 200 million dollars right now in revenue related to AI, and they believe that they’re going to be able to double that in the next year. Now, that’s okay, but it was 200 million to 400 million. You’re talking about a company that did about 1.3 billion this quarter. So it’s still a fairly small part of the company’s overall revenue stream.

Having said that though, I think the signal of intent was really big. So first of all, the claim to double, much like Jensen’s four billion dollar up on the guide, is probably a very conservative number. I think that they know they can do a lot better than that. Pat, I think the thing is Nvidia, with InfiniBand and with NVLink, has a fairly tight grasp on the networking of all of the GPUs, but in the future, there are going to be two things happening. One is we’ve got to connect all these racks up for all this AI, and second of all, there are going to be second, third, and fourth players that are going to enter the market that are going to need high performance optical connectivity that’s going to run rack to rack.

Marvell is positioned really well there to grow and to gain business. Like I said, since you’re really talking about growth from a fairly small start at about 200 million, I think that that 400 million is conservative and they can easily see that double or triple in the years ahead. These earnings though, Pat, they got a 25% bump on this result. Now, let’s be candid about what this is about. This isn’t all about Marvell’s story yet, but this is all about the fact that Marvell was successfully able to connect its role in the future of AI to the growth that you’re seeing for Nvidia and for AI training and ultimately for inference.

I think that’s really important, the company’s acquisition strategy, its overtures into fiber, its ability to really grow in the data center and, of course, the fact that it has connective tissue across the stack. They’ve moved heavily into 5G. They’ve moved heavily into automotive. The ability to build silicon for networking and high performance networking is going to be a big thing.

Pat, I think you and I really sat down, and even as you and I talked, we were like, “There is a tailwind and a pull-through effect that all this demand for AI training is going to have, and companies like Marvell, like Broadcom, are likely going to be the biggest beneficiaries.” So I think the market agreed with us. I think everything else about this quarter’s earnings was just business as usual, but the fact is everyone’s ears were open following the drafting tailwinds of Nvidia’s earnings. Could Marvell capitalize? Matt Murphy, Chris Koopmans, and team were able to capitalize. They saw 20-plus percent gains following their earnings, and I think it’s just the start for their AI story.

Patrick Moorhead: Yeah. Great analysis there, Dan. I was on Street Signs last night talking Nvidia, and they also asked me what companies are related. I brought up Marvell and I tried to very simply explain that when it comes to training, GPUs in isolation don’t add a lot of value. You have to network these things together to make them look like one planar surface. Those interconnections between those GPUs need to be really, really fast, and that’s where Marvell comes in: they’re the company that networks the GPUs together.

I wrote a column on Meta as an example. Their research cloud is 16,000 GPUs that are networked together. I’m not saying that that’s Marvell or it’s not, but that just gives you an idea of how important it is to pull those together. Marvell did double down on growth in certain areas and was crushing it in those highest growing areas, whether that was the cloud, whether that was 5G, and they do really well in automotive as well, and that’s chugging along just nicely, but this AI, I’ll call it training now, inference later, boost is a good one.

Now, it’s not just networking, okay? Marvell has capabilities not only in custom SoCs, but also opportunities in full scale AI ASICs. So I know the company is talking a little bit about what to expect with these, but they have design wins in both ASICs and in AI-focused SoCs, which I think is going to surprise a lot of people out there. That’s just the gift that keeps on giving, right? That’s a multi-year engagement. That’s multi-year benefits coming down the road. Oh, by the way, with all that custom work on the front end, you’ve got NRE. So it’s not like you get to the altar and, even if they pull the plug, there’s zero. At least you’re making some money on the NRE, not nearly as much as if they would’ve taken over the line, but I can’t see any US-based CSP with a custom piece of AI silicon that helps lower the cost of AI pulling the plug on that. So congrats, Marvell. It’s good to see it happening, and I’m looking forward to covering this in the future.

Daniel Newman: Yeah, and thanks for pulling that together on the ASICs and custom SoCs. That is a big part of the business. It’s probably a little farther down the road before it’s going to be as impactful on the AI side, but it is something, I think I saw some numbers, and into ’25 it could be a very interesting part of the business as well. Pat, let’s call this final number here.

Patrick Moorhead: Yeah, let’s hit Lenovo.

Daniel Newman: Let’s talk about Lenovo Q4 earnings, Pat.

Patrick Moorhead: So every Q4 that Lenovo brings out is hard to dissect, okay? When I ask the company why they focus on the full year versus the quarter, I like their explanation, which is, “Our investors want to see the long term,” which I totally get, but I think it’s very important that we hit the quarter, and probably 98% of the presentation and the release were about the year. So overall, the quarter is exactly what you would expect from a company that’s 57% PC revenue in a really troubled PC market where all elements of the value chain are pulling back. You’ve got soft demand. You have them needing to decrease their finished goods inventory, WIP inventory, and everything in between, but I do think it’s important to go peel the onion back.

Overall, the revenue was down 24%, which would make sense given that the PC group is down 33%. EPS was down big time, 73%, but again, let me drill down. We had a chance to talk to Ken Wong, who runs SSG, and his team, who graciously briefed us strategically on what’s happening there. They were up 18%. As you would expect, and as we’ve seen the entire year, 50% of that business was outside hardware break fix. So that includes things like managed services. You have all of the as-a-service fitting underneath here, things like TruScale. The company has also created what I think is a fairly comprehensive set of digital workplace solutions that, again, is not just PCs. We’re talking phones, we’re talking PCs, we’re talking office UC equipment.

I think if there’s any company who is going to do well in this, it’s likely going to be them if nothing else, given the sheer broad base of what they do. I’m interested to see also, obviously, how HP pulls that together given their acquisition of Poly. You got the chance to talk to Kirk Skaugen. I unfortunately was sick, couldn’t get my butt out of bed that morning…

Daniel Newman: Glad you’re better though, buddy.

Patrick Moorhead: No, it’s going to be back. My nose is starting to fill up here. I’m going to power through this.

Daniel Newman: My allergies are horrible today. I’ve got the itchiest face. It’s Texas allergies, man.

Patrick Moorhead: Yeah, at least you’re not getting bit by a tarantula.

Daniel Newman: Did you see that thing?

Patrick Moorhead: I did now. Yeah. Dan sent me a picture last night of a tarantula, tarantulas and scorpions. If you live in the hill country or anywhere near a forest, you will find them or they will find you. Anyway, the data center group crushed it, up 56%. Okay, folks, 56%. Now, think about that. Compare that to the supplier base, how AMD and Intel are doing. They’re not growing 56% in the data center group, which clearly means that these guys are taking share. So while it was hard to break out exactly what the puts and takes were in the data center group, it likely continued to gain market share in server, storage, and software.

A few reminders about the business. Lenovo’s number one in the TOP500 high performance computing list. They actually added more slots to that dominance, which, I know we don’t cover a lot of HPC on here, but I think is a big bragging right for a lot of things. They’re number one in entry and mid-level storage, what’s called price bands one to four. I’ll give IDC some kudos for defining that, but five years ago, people thought that Lenovo’s data center business would be dead. Now, they’re crushing it, crushing it in server, crushing it in storage, and doing a really good job in software. I’m interested in the future to see the impact of generative AI given that they’re such a huge supplier to the hyperscalers. Again, I would think that them being inside of every hyperscaler cloud that’s using generative AI, they would almost have to see an uplift based on that.

Daniel Newman: So there you go. Is that all you got? Is that it?

Patrick Moorhead: That’s it, baby. I just talked about the entire company. What more do you want, dude?

Daniel Newman: Listen, Pat, you hit it really well. You could not be a company that has over half your revenue tied to PCs and have a great fiscal year. It just wasn’t possible. It was such a tough year, but this is really a company that, in the future here, especially if this AI at the edge and on-device thing takes off, will be a massive beneficiary. The company is progressing in its categories, its service categories under Ken Wong, its infrastructure categories under Kirk Skaugen, and then, of course, PC is cyclical. I don’t care what the circumstance is, it just maybe has the most cyclical boom, but if you do not believe that AI content is going to drive a huge wave of form factor and new device purchases over the next few years, you’re probably not paying attention. This is a trend line. This is being drawn for you.

So depending on your horizon of when you’re looking for things to happen, we’re probably a year or two out for seeing a really big boom in PCs again because all this generative AI stuff is going to need more horsepower. So that’s the good side of it. I think that overall, the company’s done a good job of maintaining its growth, its margin in the server space. It’s got a story that’s starting to bubble up around generative AI. It’s going to probably look similar in some ways to the stories you’re hearing from the other large hardware OEMs. It’ll be their own version of it. The SSG business is definitely ramping up. I see them pivoting from a core set of service offerings to a bit more of an outcome driven approach for things like digital workplace, for things like on-demand consumption services. I think that we’re going to see that continue to materialize over the next couple of years.

It’s hard to look at the numbers holistically and think it was a great year because, as a whole, I think their earnings were down 70% on a year on year basis, but I think when you look at how this year was seeded and where growth came from, growth came in services, growth came in infrastructure. As companies, and we’ve seen this movie play out before, Pat, as companies pivot away from a core business that they have been hyperdependent on, that has this cyclicality, and they build up these second and third moats of higher margin, higher performance, and diversified revenue, they become better companies, they become more capable companies, and that’s what I think we have with Lenovo.

Super excited, by the way, to have EVP Kirk Skaugen as one of our day openers. Pat, you and I are going to jet off, and if you’re watching this, maybe it’s today, but we’re going to jet off and actually capture that content in the near future. I think Lenovo is a very interesting company. It’s one to watch. Pat, it was one that, like I said, there were bright spots, but you definitely had to pick them out between some tougher numbers that were mostly driven by this year’s very steep PC decline.

So there we have it. That’s our podcast. That’s our week, buddy. By the way, it was the Six Ten again. We did six topics, five minutes each person per topic, and it’s an hour and I don’t think there’s a thing we can do about it.

Patrick Moorhead: No. I think I talked more than normal.

Daniel Newman: Well, you had a lot to say. You had some good stuff. We know a week off, we missed you, but it’s good to be back. For everyone out there, we appreciate you being part of the family, you tuning in, you signing up for the June 6th to June 8th Six Five Summit. Pat and I are so excited to bring you all the best content. So many great minds and thinkers. Register, sign up now. Share it with your friends. Make this thing go viral. We appreciate y’all tuning in. Pat, any last words of advice for our friends, fans, viewers, watchers, critics, all the people that are out there participating in our show today?

Patrick Moorhead: No. Whether you love us or hate us, we appreciate you, and I appreciate you tuning in and we missed a week, but we’re back, and maybe we’ve made up for it with the Six Ten.

Daniel Newman: Six Ten, baby. Six Ten. Stay with us, subscribe, be part of our community for this week, for this episode, for Patrick Moorhead, Daniel Newman, myself. We’ll see you all later. Bye-bye now.

Author Information

Daniel is the CEO of The Futurum Group. Living his life at the intersection of people and technology, Daniel works with the world’s largest technology brands exploring Digital Transformation and how it is influencing the enterprise.

From the leading edge of AI to global technology policy, Daniel makes the connections between business, people and tech that are required for companies to benefit most from their technology investments. Daniel is a top 5 globally ranked industry analyst and his ideas are regularly cited or shared in media appearances on CNBC, Bloomberg, the Wall Street Journal, and hundreds of other sites around the world.

A 7x best-selling author, including his most recent book, “Human/Machine,” Daniel is also a Forbes and MarketWatch (Dow Jones) contributor.

An MBA and Former Graduate Adjunct Faculty, Daniel is an Austin Texas transplant after 40 years in Chicago. His speaking takes him around the world each year as he shares his vision of the role technology will play in our future.
