
We are Live! Talking Google, Salesforce, Lenovo, Groq, T-Mobile, and Microsoft – The Six Five Webcast

On this episode of The Six Five Webcast, hosts Patrick Moorhead and Daniel Newman discuss the tech news stories that made headlines this week. The six handpicked topics for this week are:

  1. Google Goes Generative AI for Enterprises
  2. Salesforce Goes GPT
  3. Lenovo Storage Announcements
  4. Groq Goes LLaMA
  5. T-Mobile Buys Mint, Plum, Ultra
  6. Microsoft Announces Copilot for Microsoft 365

For a deeper dive into each topic, please click on the links above. Be sure to subscribe to The Six Five Webcast so you never miss an episode.

Watch the episode here:

Listen to the episode on your favorite streaming platform:

Disclaimer: The Six Five Webcast is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we ask that you do not treat us as such.

Transcript:

Patrick Moorhead: Hi, this is Pat Moorhead, and we are back for another Six Five live weekly podcast with my bestie, Daniel Newman. We are flying all over the place. I am broadcasting from… I don’t know. Who knows? Gosh, might have something to do with horses. Could you see anything that might be horse-related to my background, Dan?

Daniel Newman: I don’t know. Is it a bunker in the shores of the Mediterranean where you are? No, no, that doesn’t look right. Is it a mountain cabin where you’re going to go? No, no, that doesn’t look right either. Where are you, Pat? Where in the world is Patrick Moorhead?

Patrick Moorhead: I’m in Ocala, Florida. I am fresh off a nice analyst event in Poughkeepsie, New York with IBM. Got to see some really cool stuff, including the Quantum Data Center. It’s actually online and customers are using it. I was told we were the first non-IBMers to get a view of this, but maybe we were the first non-IBMers, non-customers, non-clients that were in there, but it was great to reconnect with Ross Mauri. I saw one of your awesome analysts, Steven, there. We broke bread, I think, every night.

But I’m glad to be here. This is my favorite thing to do of the week. We just kind of sling it and make it happen. If this is your first time to the Six Five, first of all, I need to ask you why and what’s wrong with you. But what we do is we do cover five topics – six topics, five to 10 minutes each, depending on how much blather we can come up with. We hit the news to get context, but we’re trying to bring our viewers and listeners the best context out there.

We’re going to talk about public companies, but don’t take anything we say as investment advice. And, Dan, we actually don’t have any earnings and I’m pretty glad, not that earnings annoy me, but I’d rather hit new product launches.

Daniel Newman: There were a few earnings, but not too many. There were earnings from, I think, Adobe, they crushed it, but SVB Bank was probably the big news of the week. But you and I don’t need to cover it because every other person on the planet became a banking expert this week. They went from social experts to digital to crypto bros to NFTs to another… the banking expert crowd. But you had a little fun with that, Pat, you had a little fun with the fact that-

Patrick Moorhead: Well, totally. And I turned on the show that all your friends are on, CNBC, that you were back-slapping with at the ServiceNow party and, literally, the only thing they were talking about was banking. So boring.

Daniel Newman: It was pretty wild, though. I think two weeks ago, Jim Cramer made a call to buy SVB Bank at like $330. I mean, obviously this was a hard one to call, but, I mean, it’s just when you play that back in arrears you’re like, “Wow.”

Patrick Moorhead: Did you ask Cramer about that when you saw him?

Daniel Newman: I didn’t. I didn’t. This was a very loving crowd. It didn’t feel like the right place for contention.

Patrick Moorhead: Well, hey, we have a great show today and we’re not talking about the banking crisis, but actually we just did. We’re talking about Google going Generative AI for Enterprises. We’re talking about Salesforce going GPT. We talked a little bit about Salesforce last week, but we’re going to fill in and dedicate part of the show to that. Lenovo made some pretty big storage announcements, not just product announcements, but also related to share and related to business, and I love that.

We’re going to talk about Groq going LLaMA, and we’re going to talk about T-Mobile, basically, buying everybody out there. And then we’re going to end this up with another Generative AI topic. Microsoft announced its Copilot for Microsoft 365. Wow, look at that: One, two, three, four out of six topics on AI. But, hey, this is real. This could be the big bend in the curve going up and to the right.

Daniel Newman: It certainly could be.

Patrick Moorhead: I’m going to call my own number here on Google going Generative AI for the enterprise. Google had an event early in the week where they had disclosed… Well, I mean, actually let me step back. Google had talked about Bard and, related to that, they showed a bunch of, what I would consider, “consumery” scenarios, and they are the leader in search by a mile. But this event went in and talked about their enterprise play. Google has Google Cloud. Obviously, they have Google Workspace. The first thing that came out was an API for developers called PaLM. And I hate to think of face palm. I can’t help it. I didn’t name this, but the big L means large and the big M means model.

And Google is really good at creating APIs. I mean all the way back, gosh, when Google Search started, they led the charge in many respects with API-based computing. They also, in addition, announced what’s called Maker Suite. And Maker Suite is – what they’re trying to create is an easy-to-use platform that I would say normals can use, where instead of getting right in and having to do what my son does with C++, it’s a little bit more drag-and-drop and text-based. It’s pretty much everything you would expect to hear from Google.

The other thing that they brought out is that Google is notorious for – not notorious – but they’re really good in Google Cloud at doing the land-and-expand related to data for enterprise customers. And they have a tool called Vertex AI, which is the east-to-west total platform for AI, all the way from ingest to running inference models and everything in between. And then, finally, they showed some really slick demonstrations of Google Workspace. Google is the number two productivity package on the planet. A lot of students, a lot of enterprises, use them and they’re in direct competition with Microsoft.

Holistically, I think the big picture here is that Generative AI is creating incredible new use cases that help enterprises (a) increase revenue; (b) decrease costs; (c) increase velocity of getting things done; and I would say (d) getting closer to your customer in a more natural way that satisfies them. So I’m super excited that Google threw their axe into the sea and we have something to compare against everybody else like Microsoft and Salesforce.

Daniel Newman: There was a lot that came out from Google this week, and I think the market was pretty reasonable to have expected it, Pat. I mean the Bard initial launch, I think we would agree, did not go to plan. It did not go to plan for Google. And this is a little bit of what happens when you have years and years of work being prepared for a certain moment in time, and then it’d be like Tesla’s about to launch its next vehicle in 24 months and then the competition comes out with something that’s going to destroy it in a month and then all of a sudden they had to announce the car a week later.

And so Google has been working on this, and I think you and I have been pretty outspoken about the fact that Google, for a long time, has really led this category. It was not something that the market necessarily expected. Microsoft sort of changed the timeline, changed the trajectory, pulled a lot of things forward, and I think Google now is looking across its portfolio, its R&D, its research and its go-to-market plan to figure out, “What can we bring to market quickly to let the industry, let enterprise, let everyone in the cloud space know that we are not just going to lie down and allow Microsoft to have the entire sort of Generative AI narrative in the market.” And so that means they have to hit it from a few ends. They have to hit it for the developers, they have to hit it for the cloud and the tooling. And then, of course, they need to hit it at the app level.

Where Microsoft has been very successful straight away, and we’re going to talk more about Microsoft later, so I don’t want to over-rotate to them, but this is where the competition lies right now, it really is with Microsoft, is that they’ve been very successful in showing and demonstrating at the app level ways that these tools and technologies are going to be available and usable for your everyday knowledge workers. And so this is what Google really needed to lean into with this set of announcements around Workspace: what are the sort of things that everyday users… there’s hundreds of millions – is there a billion Workspace users? – it’s a huge number of Workspace users, Pat. I don’t know off the top of my head, but effectively it’s this massive number of users, and what are they going to be able to do with it? Well, if you recall, for some time now, if you use Gmail – my company, we use Google Workspace. So-

Patrick Moorhead: Yeah. And my backend is Google Workspace too, and I do Microsoft front end, but there’s some things that Workspace is just quicker at.

Daniel Newman: And the same here. We use a lot of productivity tools from Microsoft, but we use Google for our email and a lot of our other workspace. And the point is, for a time now it’s been doing some of the generative stuff. This is the thing that I think people are missing now because they’re seeing generative in a new light, but you started typing a message, “Dear Pat, I think we need to…” And it would say maybe, “Move the…” And it would fill in, “Meeting.” It’s been doing this kind of generative thing for us for some time, but now we’re seeing this at a new level. And so Google’s really releasing kind of, “Hey, this is what it’s going to be able to do.” Maybe it would be to quickly reply to something based upon your email. Or, by the way, you know how it would send you reminders, Pat? Like, “You need to respond to this.” That was being done with a level of AI, machine learning and generative to understand what’s important and what’s not, what needs to be at the front.

So I guess the real analysis I want to give here is that Google’s been doing this for a while. This is not new. They’re pulling features forward and they’re making them more advanced. For instance, the ability to maybe conversationally give a concept to your doc in Google Docs or have it proofread or have it edit and rewrite something for you is pretty interesting. We’ve all seen the Stable Diffusion demos, but Google’s also offering things to be able to do auto-generated images based on inputs from you that could be utilized in Google presentations. So Google Workspace is doing a lot of things that are going to sort of mirror what we’re going to talk about later with M365, and that, I think, is really the pull forward.

So the tools and the plumbing and the picks and the axes, Pat, that were announced are pretty important, but what I really think the market needed to see and hear one more time is that Google is actually not new to this, has been doing it for a while, and now you’re starting to see it pull forward. I do think they were put on their heels, but I also think they will catch up in time and it will be a very competitive race, which I always say, Pat, is good for everybody.

Patrick Moorhead: Yeah, competition is really good and I think many of us got bored with AI and Generative AI. Natural language models really kind of woke us up from that slumber. So let’s talk a little bit more about AI related to Salesforce. Salesforce is going GPT. They did this last week.

Daniel Newman: Well, as you know, when Microsoft initially announced its GPT for Dynamics 365, it also announced that Salesforce is going to be able to utilize OpenAI’s advanced AI models out of the box. So that’s kind of why you ended up with Salesforce GPT. This is using OpenAI. This is using the same platform and technology that you’re hearing about from Microsoft. So they’re calling it Einstein GPT, and they’re saying it’s the world’s first Generative AI CRM technology, delivering AI-created content across sales, service, marketing, commerce and IT interaction at hyperscale. Microsoft would contest that, to be accurate, but I think that maybe the way they wrote the claim to say sales, service, marketing, commerce and IT, they maybe had found a loophole that made it an accurate statement. But, nonetheless, this is sort of the difference between an arms race, Pat, in marketing and an arms race in technology and, in the end, the best technology with the best marketing will win. So, right. There you go. There you have it.

But really what you’ve got going on with Einstein GPT is that Salesforce has its own proprietary models and, of course, using OpenAI, combining it, this is where it starts to get really interesting, Pat, when we talk about things like where does proprietary data play a role? This is one of those great examples of a system of record providing deep customer insights that are proprietary to a single organization that has things like meeting data, interactions, recordings from Zoom calls or Teams meetings and being able to utilize all that data together and do something like draft a sales email and get a proposal put together quickly.

This is really interesting. As you and I both know, running companies, response rate and time and speed to response is super important to our customers. Being able to ingest from a meeting what a customer’s request might be for more information – say an RFI or an RFQ comes in – being able to ingest that from a meeting or from an email, understand what’s being desired, pull from a pricing index or a pricing model that lives inside your CRM, generate a quote, write up an email and then send that email takes a lot of time off the plate of the salesperson and provides a lot of productivity and efficiency inside of an organization.

And so this is really where I think the integration from kind of using the open internet for things like GPT, which is what most of us so far have experienced, is starting to incorporate where proprietary and unique data becomes really interesting. Salesforce is the most utilized cloud-based CRM on the planet. It has a large user base, lots of data, it has lots of workflows and processes, and when you start to look at how ML, AI and generative can enable a salesperson or a service worker to be more efficient in responding, how e-commerce could move faster, how proposals, quotations, execution, delivery can be monitored, managed and automated, you start to see a lot of deflationary value here, Pat.

So I think this is early days. I think Salesforce – this is a great example of a company that had some progress. It’s been doing the Einstein thing for a while, tying it together, starting to help people visualize it and, of course, taking advantage of its leading market position becomes opportunistic for Salesforce. I also think it’s really early days, so I’d like to see a little bit more. I want to watch the workflows, the quality of the interactions it creates, kind of understanding the data sets that it’s leveraging outside of the OpenAI ones and what kind of lives inside of the proprietary Salesforce models. But, Pat, I don’t know about you, but if our sales team has a meeting and that meeting automatically enabled the generation of a list of requirements, created a proposal, drafted up an email and sent it, I’m in, man. I am in. And, like I said, Salesforce won’t be the only company to do it, but this is where it’s going and there’s a lot to like about that.

Patrick Moorhead: Yeah, CRM is just a CRM, CX, marketing, sales. I mean, I read about a study in the Wall Street Journal that talked about how low the quality of service has become and how pissed off consumers are. And we all know the frustration of sitting on a call or trying to chat with a really bad chatbot that really doesn’t know much about anything. So maybe this leads to fewer handoffs and things like that.

So, hey, let’s dive into an infrastructure topic and that is Lenovo. So people have been seeing Lenovo’s prowess in their data center business, led by Kirk Skaugen, but most of the focus has been around compute, where they’re driving revenue and gaining market share. But little do people know that Lenovo actually has a very large and growing storage business. And what I thought was interesting is that they are, literally, number one in storage market share under $25,000, and that’s 61% of the market.

So you’re going to drive some big numbers and that does include things like JBOD and stuff like that. It’s not necessarily NAS and SAN, but their higher-level systems have some growth too. They saw 22% growth in the mid-range which, by the way, is an area that Dell Technologies takes super seriously. All-Flash had over a hundred percent year-on-year growth, but overall, they’re the number five storage player on the planet, which has to wake people up, has to surprise people, and, no, this isn’t a China/Asia thing. This is very well distributed across the world.

And the third thing that they brought out was TruScale. They gave a number for growth, which was 600% year-on-year. I don’t know what the absolute number is. My assumption is that in comparison to their overall data center number, it’s low, but it’s good to see them come out and disclose the growth. So Matt Kimball, newly appointed Storage Analyst, will be doing the write-up on its new high-density JBOD enclosure and also the ransomware protection updates that they brought to their suite. But hats off to Kirk and team there. Not only are you driving it in compute but also driving it in storage.

Daniel Newman: The success story continues and I continue to believe that the company has the ability to grow across the infrastructure domain because, one, it’s willing to take chances; two, it continues to link together its big portfolio, its global reach, its very robust supply chains. And it’s also been able to really defy sort of the perception of being tied to China. And I think that’s because it’s handled and looped in the right messaging around its security and its ability to be a good partner to government. I know that it’s gotten some FedRAMP approvals lately, and these things are all really important to note, Pat. The storage opportunity is really significant. I believe it’s so significant that I went out and bought one of the largest analyst firms that was dedicated and focused on this particular domain. And the opportunity for it to grow and continue to grow in this space is very lucrative.

We’ve talked over the last few weeks about the growth of Pure Storage. We’ve talked over the last several years about the stronghold that Dell has on the market. But, look, there’s a correlation that is going to be very symbiotic, I would even say, between storage and compute, and all of this rapid growth and onset of Generative AI is going to be so infrastructure-intensive that companies that are supplying infrastructure have a huge growth opportunity, cloud scale, enterprise scale. And, of course, Lenovo has shown with its work in compute, it can accomplish this. So I have no doubt given the relationship that exists between compute and storage, that Lenovo’s in a really good place here, Pat.

So you hit a lot of the high notes. I don’t think I need to dwell too much here, but Kirk Skaugen and team continue to do a really good job. Your tweet was definitive. Good data in it. Hope it gets a little bit more attention. I’m going to retweet it right now. And, like I said, I think Lenovo is going to be one to continue to watch.

Patrick Moorhead: Good stuff. In fact, I believe that compute and storage are so interlocked that I combined swim lanes under Matt, so I am all in on that. Let’s move to the next topic. Surprise. It’s going to be about AI. Groq is going LLaMA. Let’s reintroduce our audience to Groq and we’re going to have to explain what this LLaMA thing is too.

Daniel Newman: So LLaMA is Meta’s large language model that has been worked on and was released last month by Meta Platforms, Facebook’s parent, and, just like, oh, what’s going on with ChatGPT? It’s their iteration and attempt to power bots and generate human-like effects. So Groq is another company we work with. It’s a chip startup focused on AI. The company’s kind of ethos is bringing the cost of compute down to zero, which is very interesting because the cost of compute with Generative AI is going which way, Pat?

Patrick Moorhead: Well, the cost of doing it is going down over time, but people want more, so it’s going up.

Daniel Newman: So the cost per query on GPT is significantly higher. It makes me think about a rollercoaster going up where it’s like tic tic tic… What you’re getting is you’re getting mass adoption of generative capabilities. We know with Bing, if everybody went tomorrow and started using Bing, the amount of demand on Microsoft’s data centers and compute would be exponential. And, yes, you’re absolutely right, Pat, over time the market would figure out how to do it for less. But, overall, the amount of compute resources required to do a Generative AI query is substantially higher than a traditional search query.

Most of that Generative AI compute, by the way, is using GPUs, and it’s mostly being trained on NVIDIA. I think they have about 90% of that market right now. And GPUs are relatively inefficient. They suck a lot of power. And we’re in a world where sustainability is one of the underlying governances of every business: we want to be water-positive. We’ve heard AWS. We’ve heard Microsoft. We want to lower our carbon footprint. Well, we’ve got about 1% of the world’s energy right now being consumed by data centers and that number is going up.

So, anyways, that’s kind of a runaround of what’s going on here with Groq. Well, the interesting thing is GPUs and compute and this relatively rapid correlation of growth in compute utilization means that we’re going to see all these challenges about costs go up. How do you deliver to our customers? What is Microsoft, Salesforce, Google going to spend? How do they build out their data centers to support this? And a company like Groq becomes kind of interesting because it has very unique capabilities of software and compiling to be able to take a model like LLaMA – and this is what it did – it took the model, moved it from the NVIDIA GPUs, recompiled the code and started running it on its own chips. And the findings were that they could do it more efficiently and with lower utilization of power.

And this becomes an interesting question mark: are there other chip players besides NVIDIA that have a chance to really be influential and, potentially, be disruptive? We know AMD is leaning hard in on AI. I’ve been in a lot of conversations with Intel. Intel, with Habana Gaudi and open-source oneAPI, is looking at an approach. But these startups, companies like Groq, like Cerebras, like SambaNova, are trying to build very specific, application-specific chips that could, potentially, be disruptive to NVIDIA, run a model more efficiently, but the hard part, Pat, is when you have CUDA and you have all these developers building on it, moving models from one hardware set to another is really difficult.

And so Groq did this in just a few days, and that’s kind of the really interesting thing. I talked to their CEO about it: the ability to use their compiler, without tons of developers having to optimize code, to be able to move it from one piece of hardware to another was really, really interesting and should be exciting to the market because we have to solve those two problems, Pat. We have to solve the cost problem. I mean, NVIDIA’s going to make a fortune on this Generative AI movement, and we’ve seen its stock rip because of it, but there needs to be a challenger here.

Because the other side of it, Pat, the sustainability side of it, is we need to look at doing it more efficiently. I know you and I are all about measurable sustainability. We talk about this all the time. Not doing it for the sake of greenwashing and marketing. Do it for the sake of the fact that we really have a challenge of creating enough energy to support all this growth. So using inefficient chips to do things like Generative AI long term is not the answer. So either the GPUs need to become more efficient or we need to look at this custom silicon, these ASICs, that could, potentially, run these large models at a lower cost.

Patrick Moorhead: Good analysis there, Daniel. And Groq is one of the players that I do think is going to be left standing, it looks like. I mean, it appears to me they’ve managed their cash, their investments. And one of the biggest problems, if you talk to end users, people who try to use this, is the software, and they would like a more flexible software infrastructure and they do want more competition. The benefit of a GPU is that as these models change so much, their programmability, the trade-off from being the most efficient, kind of rears its head, right? And that’s one of the benefits. And, heck, people even do training and inference on CPUs. Actually, people do more training and inference on CPUs than they do on GPUs, and that’s when the data center is dark and they’re trying to use resources. So different strokes for different folks.

At some point these models will change. And I don’t know… I keep thinking it’s going to be five years, but look at the growth, look at the size of these models. This didn’t come out of nowhere. In fact, the industry had been talking about these large models, natural language models, for forever. So I’d like to see Groq roll out some customers on this, as well. I do applaud them, though, for doing this disclosure and giving information out. The company doesn’t typically disclose information like this, but I hope it gets them some attention and other people evaluating and using their silicon.

Daniel Newman: It’s the moment, Pat. I mean, if companies like these aren’t talking in this moment, when is the moment? And it was nice to see Reuters… I think it was Stephen Nellis maybe that picked it up, covered it? Yeah, it was-

Patrick Moorhead: Yeah.

Daniel Newman: Yeah. I’m just saying it was good to see someone pick it up because, like I said, it’s almost as if this last month that Microsoft and NVIDIA are the only two companies in this space and there are other companies that we need to pay attention to that are going to disrupt. So-

Patrick Moorhead: Moving on to something unrelated to AI. Basically, T-Mobile is buying everybody that’s left now.

Daniel Newman: I heard it was a deep fake of Ryan Reynolds.

Patrick Moorhead: Yeah. So carriers for years have worked with MVNOs, which is, if I have excess capacity, I’m going to let somebody ride on the back of it. Come up with maybe a prepaid or a challenger brand that enables that. And if the network gets congested, they get bumped to the back of the line in terms of how fast it can go. So this has been in there for years and Mint has always ridden on the back of T-Mobile’s 5G network. So T-Mobile bought Mint, that’s their prepaid play. Ultra, which is an international-heavy play, South America. And then Plum, which is a wholesaler that sells to other people at bulk, and they went in there and bought all three in one fell swoop. Cash and stock deal. Less cash, more stock. They’re going to operate these as separate brands.

And I have to talk about just the hilarious quote that Ryan Reynolds had where he talked about – he did some corporate stuff but he said, “Hey, we are so happy T-Mobile beat out an aggressive last-minute bid for my mom, Tammy Reynolds.” So, I mean, it just shows how much fun the Un-carrier can have on the consumer side. I sometimes wonder how this impacts their T-Mobile-for-business side. I do think you can be cool, edgy and enterprise. I saw Sun do it back in the day when, in a sea of black and gray boxes… They came in with purple boxes and just made it fun. I think you see a lot of companies out there that have fun in storage. I mean, EMC was notorious back in the day for having fun and being cool and being enterprise grade.

But, anyways, hats off to T-Mobile. Before 5G even came out, I wrote that they would be number one in 5G, and they have had the lead in the US on 5G networks, and the company keeps making me look smart every day. And this is, basically, a way to soak up capacity.

Daniel Newman: Yeah, it was a pretty smart move. And I think he’s just got gravity. Whether it’s been Wrexham AFC or Mint Mobile, Ryan Reynolds is not just an actor. He may be a better entrepreneur than actor. I’m being honest about it because he’s done quite well. T-Mobile has had a few challenges with things like security that have been notable, but its expansion on 5G has been significant, rapid and has quickly catapulted the company, in my opinion, to the number one mobile company. I know there are different metrics they use to assess number one, but I think its coverage and, of course, its connection to Deutsche Telekom makes it a very pleasurable experience for me, at least, traveling around the globe with T-Mobile.

I will say, though, Pat, I was in Destin or in that area this week and I had endless blackout spots down on the Gulf Coast of Florida. And I’m not talking about three bars of 5G. I’m on T-Mobile by the way, my whole company’s on T-Mobile. So maybe a little service request. I could not get a text out with, supposedly, three bars of 5G, and it was very stressful for me.

Having said that, these buys, Pat, continue to show incredible ambition, and beyond the fact of my own little whining for maybe not being able to get connectivity in a certain spot-

Patrick Moorhead: That’s a big deal. That’s not whining. I mean, that’s a big deal. You have to have connectivity, and I don’t know if it’s just sheer luck, but that was an issue that I had with T-Mobile before and the reason that I didn’t have a T-Mobile phone. My second phone is T-Mobile now. But, no, they need to look at that. Could have been an over-saturated network. Whatever it is, it can’t happen. Did you have a chance to ask if Verizon and AT&T folks were having a challenge?

Daniel Newman: There was a fairly robust set of complaints running in this little area. I was in that little area. You remember the movie The Truman Show in Seaside? And there was this cute little quaint town and it’s like I, literally, could not get on a phone call and get audio through. Sometimes a Zoom call doesn’t go well or a Teams meeting when you don’t have great bandwidth. But I had two or three bars and nobody could hear me. So it was actually funny. I was able to hear in. So it was almost like… Remember those old video calls where you could only hear but you couldn’t see because of the encryption? It was really frustrating, though.

So it also shows, by the way, that we left Mobile World, Pat, and you and I kind of laughed about how people were saying we were in the late stages of 5G and it’s like, no, no, no, no. Not only do we have a lot of opportunity ahead with enterprise 5G, but we still have a long way to go on the consumer side. But, like I said, I’m very bullish on T-Mobile. There’s some cleanup to do, but its acquisition and M&A strategy and its overall Un-carrier persona has definitely transcended what has long been a well-disliked industry by most people and it’s created a cool sort of buzzy brand. So good for T-Mobile. Work to do, but, heck, Pat, isn’t there always?

Patrick Moorhead: There really is. I mean, we’re not there. And I remember saying, “Hey, we’re 50% through the build out.”

Daniel Newman: Oh, yeah.

Patrick Moorhead: I want to revise that. I think we’re 35% through the build out when I look at massive MIMO and some of the things that you said are out there. I think the industry way over-promised on 5G and now it’s paying the price for it.

Daniel Newman: When it works, it’s awesome. When you have a really good millimeter wave connection and you’re in a city, blazing, it’s amazing. It’s just that half the time it isn’t better than my advanced LTE experiences, where it probably is advanced. All right, we got one more, right? So is it my turn or your turn?

Patrick Moorhead: Are you going to be the host now?

Daniel Newman: You know I do this sometimes. Sometimes you move the chyrons when I’m supposed to move them and sometimes I take the hosting role when you’re supposed to do it.

Patrick Moorhead: When you get bored with the topic, you just want to move to the next. But let’s move to the next topic. And we’re doing bookends here for AI at the beginning, the middle part of the chapter and the end. But Microsoft had a gigantic announcement yesterday bringing out Copilot for Microsoft 365. Dan, do you want to lead this one?

Daniel Newman: Yeah, Pat, because we didn’t do the GPT-4 announcement here because there were so many other announcements. But, by the way, GPT-4 came out, which is kind of a big deal. That’s going to be a lot of the foundation here of taking what we kind of experienced to the next level. Oh, and we also forgot to mention Honeywell and the new CEO. I’m just squeezing that into the Microsoft segment. Okay, so… It totally fits, right? I just mean things that didn’t get talked about because there were so many things to talk about. And so we went to the event, what? In early February, Pat, and the Bing-powered Edge browser, kind of, open internet GPT, was disclosed. And then quickly after, we’ve now seen Dynamics come out and a big set of announcements around Dynamics 365 and ChatGPT and OpenAI. And this week we saw the Copilot being announced for M365, which means Teams, all the Office tools, they’re all getting supercharged with GPT.

Now, Pat, you and I, and a few others, have had a chance to be briefed and see some of these demos, and I’m blown away. I just want to say I’m blown away, one, at the pace of innovation that’s coming out of Redmond. The company is doing an incredibly good job of taking that $10 billion investment in OpenAI and supercharging its entire portfolio. If you can’t start to kind of predict what’s ahead for you, take your head out of the sand. It’s not that hard to figure out.

Now, the exact sets of features that will be launched, this gets really interesting, Pat. Microsoft was able to show demonstrations in which you could take a technical marketing document and from that technical marketing document you could, say, basically, create a press release. And, voila. And I read this thing and there’s a lot of kind of controversy about the accuracy and the quality because it’s so exciting to see the generative thing happen that sometimes you forget to actually read and be like, “Is this factual? Is this accurate?” And I think we do all have to remember, though, that this is reinforcement. So as we continue to do more of this, as we continue to create more of this and continue to provide more data to the models, the models will get better and they will continue to work better. But what I read when I saw this demonstration and the press release it created was pretty impressive.

And then, of course, it has customization. So they call it Copilot, by the way, because the whole idea is this is not supposed to be a displacement tool. And this is going to be a really interesting conversation we’re going to be having with the market, with Wall Street, with employees, with advocacy firms. I wrote a book called Human Machine. I don’t know if you remember that? I don’t talk much about the books I’ve written. I wrote seven, by the way. So-

Patrick Moorhead: That’s one of the things you don’t talk about.

Daniel Newman: Yeah, one. I rarely talk about my accomplishments because I don’t have any. No, I’m kidding. But the whole idea with Human Machine was sort of this convergence, and we were kind of calling this – this was four or five years ago – that this was going to happen. But what we’re seeing now is this sort of augmentation process. So what this Microsoft Copilot does, is it augments the worker. So your comms team used to have to sift through hundreds of technical documents, meet with product managers, product marketing teams, meet with comms, meet with executives, strategy teams, come together and figure out, “How do we take two years of work on developing a new product and put it into four paragraphs in 500 words and send it out on a news wire?” Well, now you don’t have to do any of that. You can take a single spec document and write something that I think gets you 90, 95% of the way there. And that’s really the way Copilot is being positioned.

Now, in the real world that means we’re upskilling and we’re bringing our workforce to a higher level of productivity and a higher level of cognitive quality in the work that they’re going to do. Of course, there’s also markets like Elance and Upwork that are entire communities of people that do what, Pat? Write press releases for you, create a PowerPoint. By the way, the same marketing document can draft out a really impressive five-slide PowerPoint. And you can say things to it like, “I want slide three to be a visual showing a split of three main bullet points with three pictures and three bullets beneath each,” and it will do that for you.

And I don’t know about you, Pat, but I hate creating PowerPoint decks. And so what did I used to do? I used to hire people to create my decks for me. And now I do wonder, “Would I still do that or would I use Copilot?” And I think the answer is you’d probably do a little bit of both, but where you might have had to hire someone and take a week to get you a PowerPoint deck, Pat, now you have Copilot create it and you have someone spend an hour making it look perfect for you. So these are things that are going to create great efficiencies. It’s really exciting. You can kind of see how this fits in. It’s in Edge. By the way, great article on Forbes by a senior contributor named Patrick Moorhead where he talks about the AI Copilot for 365. There were advancements made in Teams, there were advancements made in other areas.

I will pause here because I want to give you oxygen, but my big thing to watch has been upskilling, how this work changes that knowledge work category and, of course, the accuracy that’s going to continue to improve with the utilization of these models. But, Pat, I was really impressed.

Patrick Moorhead: Man, I got to tell you, these demos speak to me in the way that I do work, but I know the way that billions of others get stuff done. What I liked about it too is you can go from a PowerPoint deck to a Word doc. You can go from a Word doc to a PowerPoint deck. You can string together two or three pieces of input and it cranks something out.

Now, I want to use this thing. First off, hats off to Microsoft for unleashing this and letting people other than what Google calls trusted testers use it. You and I were using this on day one in Redmond for basic Bing Chat. And I can’t wait. Microsoft has not opened this up to analysts, that I’m aware of, or press. I’m really looking forward to that. This could fundamentally change the way that we do work, that my team does work. And it’s not just n equals one. It’s n equals a billion people. Check out the article that I did, a tag team analysis with Melody Brue and myself, my VP of Modern Work.

But what’s next, Dan, I mean, for these folks? They’ve kind of shown what they’ve done. Now they have to deliver and there’s going to be mistakes. Nothing aggravates me more than people saying, “Hold on. Slow down.” I mean, it’s not like we’re talking about self-driving cars that can kill people. We’re talking about PowerPoint, Word and Excel. I was at AltaVista when Google was forming, and I remember people loved Google because the search was parsed. And back then, for our engineers, that was a dangerous thing. It’s not the real internet. People didn’t want the fricking real internet. It was ugly. It brought up a bunch of college essays as its top return. They wanted the parsed internet. And Google, they got taken to the mat on everything. Every new feature that Google Search added, they got taken to the mat.

So I urge people to be cautious. Just don’t be stupid. Be cautious. But I’ve been a big believer in this in my entire work career: if you’re not breaking some glass or pissing somebody off, you’re not leaning in hard enough. It’s just the degree of the risk that comes out. So hats off to Microsoft. That is our last topic of the day. Daniel, we just ripped through there, didn’t we?

Daniel Newman: And can I just pile one thing onto your last comment?

Patrick Moorhead: Sure.

Daniel Newman: So Search isn’t always right either. And it’s kind of interesting to me that everyone’s being super scrutinizing of all these GPT capabilities and looking for inaccuracies. Like, look, misinformation has been spreading for multiple years now. And, by the way, some of it’s been done by the machine and some of it’s been human-created. I mean, we’ve, literally, decided to filter out factual data for political reasons and over time. What I mean is there’s a lot of reasons searches don’t return accurate info. GPT will probably be more accurate in time because of reinforcement and training and learning, but all technology, as it rolls out, is flawed. So, to your point, there’s flaws. It gets better. It gets improved. With something like this, like the first PowerPoint it creates, it might not be the one I would take and do on a stage in front of 10,000 people, but you know what? You got to play with this stuff. You got to invest. That’s how we got gaming to where it is today. That’s how we got to self-driving capabilities. That’s how we got the internet to where it’s at.

It is interesting, and I think you said something really profound. There’s no reason to slow down. We, as humans, have qualities that AI will never have and we just need to continue to speed up our own progress alongside the compute. I think we got a really fascinating road ahead. So I thought it was very profound what you said, and I just wanted to pile on just a little.

Patrick Moorhead: No, and listen, we don’t have to have a super duper regimented podcast here. Let’s talk how we want to talk. Let’s do this. Riffing off what you said, I do find it fascinating that we, humans, expect machines to be better than us, not the same, but a lot better. Statistically speaking, a lot of the autonomous driving technologies can save lives, particularly around dozing off, drunk driving, or driving on prescription meds. But we expect more. I totally understand that. But I think the reason that we have this creep in computing in the car is because of those increased expectations.

I remember a semiconductor company, which will remain unnamed, that was telling me that they could do this on a smartphone nine years ago, okay? And now we’ve got twin supercomputers with ASICs driving this, and it still isn’t enough for AD. Anyways, Dan, great show. We cranked through a lot. Heck, we didn’t bloviate like we normally do here. I don’t know if it’s lack of sleep or kind of wanting to get it in. All I know is I’ve got, I don’t know, nine meetings to crank through today as my family is off riding horses.

Daniel Newman: Have a good one, my friend. I’m following suit. Looking forward to the weekend. I don’t know about you, but my vacation, I tried to take one this week. I just want everyone to know I work half days, which was good, only 12 hours a day, and it was a great week. So always a great week. So much going on in tech, Pat. Love the show. Love the show. Love on.

Patrick Moorhead: I just want to thank our audience for coming in here and spending time with us. Hopefully, you can see that it’s Dan’s and my favorite time of the week. We appreciate you. If you like what you heard, hit that subscribe button. We will see you next weekend – sorry, next week. Same bat time. And don’t forget about the Six Five Summit coming up. What was the date again, Dan? June?

Daniel Newman: Six to eight.

Patrick Moorhead: Yeah, June six to eight. Sign up for it. It’s going to be bigger. It’s going to be badder.

Daniel Newman: Announcements coming. We have to start announcing some of these big speakers, Pat. The demand is off the charts.

Patrick Moorhead: I know. It’s great stuff. Anyways, thanks. We love you. Take care. Have a great weekend.

Author Information

Daniel is the CEO of The Futurum Group. Living his life at the intersection of people and technology, Daniel works with the world’s largest technology brands exploring Digital Transformation and how it is influencing the enterprise.

From the leading edge of AI to global technology policy, Daniel makes the connections between business, people and tech that are required for companies to benefit most from their technology investments. Daniel is a top 5 globally ranked industry analyst and his ideas are regularly cited or shared in television appearances by CNBC, Bloomberg, Wall Street Journal and hundreds of other sites around the world.

A 7x best-selling author, including his most recent book “Human/Machine,” Daniel is also a Forbes and MarketWatch (Dow Jones) contributor.

An MBA and former graduate adjunct faculty member, Daniel is an Austin, Texas transplant after 40 years in Chicago. His speaking takes him around the world each year as he shares his vision of the role technology will play in our future.

