The Six Five On the Road: IBM Systems — Benefits of Fundamental Science & Technology Innovation

On this episode of The Six Five – On The Road, hosts Daniel Newman and Patrick Moorhead had the opportunity to sit down with key executives across IBM to talk about their full-stack infrastructure and the future of computing.

In this interview segment, Patrick and Daniel were joined by IBM’s Ross Mauri, GM IBM Z and LinuxONE, to explore the benefits of fundamental science and technology innovation.

Watch their other IBM conversation segments:

IBM’s semiconductor vision and ecosystem with Mukesh Khare, VP Hybrid Cloud at IBM Research

How IBM’s Cloud fits into their full stack and impacts the future of computing with Hillery Hunter, GM, Cloud Industry Platforms & Solutions, CTO IBM Cloud, and IBM Fellow

Distributed infrastructure, AI, and how IBM’s vision of the future of computing extends to Edge with Nicholas (Nick) Fuller, VP, Distributed Cloud at IBM Research

How Quantum Computing is shaping the future of IT with Jay Gambetta, IBM Fellow & VP Quantum Computing at IBM Research

Watch the full episode: IBM’s Full Stack Approach to the Future of Computing

To learn more about IBM Research, check out their website.

Watch our interview here and be sure to subscribe to The Six Five Webcast so you never miss an episode.

Disclaimer: The Six Five Webcast is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we do not ask that you treat us as such.

Transcript:

Patrick Moorhead: And that’s all right. And next up, we’re going to be talking to Ross Mauri who runs the Z division. He’s going to be talking holistically about systems and it’s a great segue coming from Mukesh who talked about semiconductors and what the company is doing in that area.

Daniel Newman: Always great to talk to Ross, connect the dots.

Patrick Moorhead: Ross, it’s great to see you again. I think we bumped into each other at one of the first IBM Thinks that was back in person again. I love that venue, by the way. But we’re here in Albany right now, on kind of this IBM road tour, talking about the future of computing and really getting underneath what IBM is doing, looking at your top to bottom approach. And here we are. So it’s great to see you.

Ross Mauri: Thanks, Pat. It’s great to see you too. And I love being here in Albany. The researchers and scientists here really make a difference for the future of my platform, the Z platform. And so it’s great to be in this setting and to see both of you here today.

Daniel Newman: Yeah. The road to Albany is beautiful. It’s very green, very lush. If you haven’t been here, I think people should probably visit sometime. But we’ve had the chance to speak to you, and now a number of your colleagues, and we’re really kind of looking at this future of compute narrative and this full stack story, and IBM is such an interesting, compelling story. There’s so much history, and then obviously so much innovation going on. And I want to start there with you, Ross. We just got finished. We talked to Mukesh, looking at the whole kind of semiconductor research part of the business, where that’s heading. And of course, you’re leading a big part of the systems business and IBM Z. I want to kind of talk about what we talked with Mukesh about with the semiconductor and the research and how that really informs and drives the business and systems for you.

Ross Mauri: So we’ve had a very long, decades long partnership with IBM Research. And yes, it is around the silicon. It is around the chips. It is around fundamental computing paradigms, and research is really… I mean, I would say they’re an essential part of the Z business. And one of the things, I know that heterogeneous computing and full stack integration is coming into vogue now, but that’s something the mainframe platform has basically taken advantage of for decades. And it does start at the silicon, because the fundamental performance, reliability, and security of the system is baked in from the chip packaging to the central electronics complex, and including the memory and operating system. I can stack it up.

But again, research is fundamental to most layers of the Z system. And again, back to semiconductors. What research has been able to push for us, not only in terms of density and speed of circuitry and things that allow you to pack more horsepower and more capabilities into a smaller area, but also some fundamental breakthroughs, like around security, like post-quantum crypto. And we’ve seen the recent [inaudible 00:20:29] announcement. It was really thrilling that all four of the first accepted algorithms had IBM researchers participating in them. And two of them were led by IBM. But that type of partnership and leveraging of innovation is key to my business. And I love the fact that research is looking 5, 10, 15, 20 years down the pike. They’re solving hard problems with us, for our clients, even before our clients run into those problems, so to speak. So I think that’s one of the great things. And again, to me, it all really starts at the semiconductor.

Patrick Moorhead: It’s a great time right now. I would say if I could look back 20 years ago, people were talking about semiconductors being this commodity, and I’ve always been a big believer that people allow themselves to be commoditized. And here we are now where I think everybody knows what a semiconductor is, and I’ve always loved your fit-for-purpose approach that seems to be in vogue right now. So I don’t know if you started the trend or you showed people how it could be done, but a lot of companies have jumped on the bandwagon. And here we are talking about this. One other thing that people might not know as well is that IBM Z and the mainframe are very much cloud enabled. Your clients obviously know this, but a lot of people don’t. So I’m curious though, how does Z fit into the overall IBM cloud story?

Ross Mauri: So I think we fit very, very well. I’m happy to say… I mean, you can take cloud at different layers of the stack again. You could take it, I would say at the highest layer where we have IBM mainframes and IBM power systems fully integrated into the virtual private layers of the IBM public cloud, so that you can access a mainframe capability, but through a [inaudible 00:22:33] service, through a web service.

Patrick Moorhead: In fact, you give a big-time security service.

Ross Mauri: That’s right. That’s right. In particular, what clients want is they want to keep their own crypto keys. They don’t want anyone else touching them. And the mainframe has, I would say, the best cryptographic capability of any system in the world. And so by putting some mainframes in the IBM public cloud, all the services, whether they’re running… They’re not necessarily running on the mainframe, they might be running VMs on x86, but they’re accessing those crypto services. So again, at the highest level, the mainframe integrates well with the cloud, but then there’s all different layers that I would say are just as important in a multi-cloud hybrid cloud world.

Connecting applications via something, a common platform like Red Hat OpenShift, and having containers be more of a fundamental way to develop applications, not just for portability, but I would say for better manageability, being more agile. And the mainframe integrates with every key Red Hat product in the OpenShift stack. And again, there’s layer after layer, at the middleware layer, at the service layer, like a Kubernetes OpenShift layer down deeper. We’re integrated all the way. And IBM has two strategies really that we talk about at the highest level, hybrid cloud and AI, and we’re integrating at all levels on both of those.

Patrick Moorhead: Okay. Real quick. Wait, are you telling me that the IBM systems can… They run Linux and they run containers and you can address them in the public cloud. And they’re part of a complete hybrid cloud architecture. I’m being a little snarky, of course, but it’s like-

Ross Mauri: Yes, absolutely.

Patrick Moorhead: Not many people know this, but I think it’s a real testament to a lot of the success that you’ve been having over the past few years.

Ross Mauri: Thanks. And our clients have been telling me that they want us to better integrate. And again, with some common software platforms, that’s been very easy. Though we always have to put a little bit of the mainframe into what we do if something does become distributed, say the security model and things like that. Because again, our clients, especially the ones in regulated industries, are basically counting on the highest level of security possible that our systems have for their applications and their data.

Daniel Newman: Something interesting that we had the chance to speak about in one of our briefing interactions, Ross, was sort of how the chip technology has advanced so much, the design, the packaging, to be able to do more on chip. And in your newest z16 iteration, which sort of ties into the whole hybrid story that we’re trying to tell here, you brought AI much closer. Is that a great example in your mind of the relationship between research and systems and how you develop products? Is that maybe a concrete example in this most recent generation of what’s really happening in that relationship?

Ross Mauri: I think it’s a fantastic example because research had been looking at AI and doing AI silicon for a number of years, and they were doing test beds and demonstrations. And we saw one of the engines that they had built. And it dawned on some of our lead technical folks about six years ago: what if we were to actually embed an inference engine, an inference accelerator, right onto the microprocessor, kind of on a different side of the bus than everybody else has AI implemented? What could we change? And so research worked with us, and it was actually their logic for the AI inference engine that we took. And then we adapted it and made it really robust, with a lot of error checking and all that as you’d expect in a mainframe, and integrated it onto silicon. And so now we can do things that no other system can do. With a guaranteed millisecond or less latency, we can do 300 billion inferences a day on one system, on one mainframe system.

So we’re bringing high performance, low latency inferencing into the decision process of something like a transaction that a bank or a payment company might do. Game changing.

Patrick Moorhead: By the way, I guess the fancy word we use for anything that’s not, let’s say, a general purpose CPU is heterogeneous computing. You talked about AI acceleration, but you do a lot more too. You have crypto, you have FPGAs. You were doing [inaudible 00:27:12] before it was cool. Okay. Now it’s cool too, but let’s dial out a little bit. How is this changing the way that you’re recommending your clients look at the overall estate of their enterprise architecture?

Ross Mauri: Well, it’s in a number of ways, but I would say one that’s really fundamental to what they do today is that as they use the mainframe as a transactional engine, the operational data that’s created is really core business data that’s relevant for many uses within a business after it’s created. And so two decades ago, or maybe more, people started copying the data off the mainframe, because they didn’t think they could do things like advanced analytics or AI on a mainframe.

So they’d copy it off. When you copy data, you open up a security risk because you now have multiple copies of usually highly valuable data. There’s complexity. There’s cost to it. And one of the things clients are saying is they want to be able to use this data more real time. So that means as opposed to copying it off and post-processing, let’s do more real time. And that’s, again, the partnership with research, listening to clients, and then trying to bring the value of the integrated stack back to the middle of our clients’ businesses. And we’re doing that.

Daniel Newman: That’s great. Yeah. There’s a lot of excitement here. Clearly, this most recent cycle has shown a lot of momentum. It’s always very interesting. And Pat, I think you’d share my sentiment, to sort of hear how these different threads come together, how the research feeds in. You and I always talk about how we need more R and not just D. And there’s a lot of that going on here in Albany where we’re talking, Ross, and of course it’s always great to sit down with you. Thanks so much for your time.

Ross Mauri: Absolutely.

Patrick Moorhead: Yeah, I appreciate it. Yeah, you did the Six Five Summit two years, so you’re not a newcomer, but here we are again, Six Five on the road in Albany.

Ross Mauri: I’ll be back.

Patrick Moorhead: Yep. Thanks so much. Thank you.

It’s always great to talk to Ross, and I’m just getting to know his team and the Power team. They were doing cool SoCs, application-specific integrated circuits, and accelerators before it was cool. And I think they deserve a lot of credit for that. And it’s not just for doing it; they and their clients actually get business value out of it.

Daniel Newman: It’s a lot of fun when we have the chance to talk to Ross about that too. All that business value and some of those specific customer cases. Look forward to maybe someday sharing those a little bit more publicly. But when you hear how the research turns into value, I think that’s when I as an analyst get really excited because sometimes you hear about a concept and a concept and a concept and you go, what’s the intrinsic value? What’s the market value? What’s the consumption value? And this is what we’re starting to feel is that these investments and really what is a full stack future of computing architecture are starting to come to fruition.

Patrick Moorhead: No, it is good. I mean, you’ve got research to the development, to the product platform, to clients and delivering value. You don’t see that in too many companies, but it is good to see. There’s not enough of it out there, but I always love talking to Ross.

Author Information

Daniel is the CEO of The Futurum Group. Living his life at the intersection of people and technology, Daniel works with the world’s largest technology brands exploring Digital Transformation and how it is influencing the enterprise.

From the leading edge of AI to global technology policy, Daniel makes the connections between business, people and tech that are required for companies to benefit most from their technology investments. Daniel is a top 5 globally ranked industry analyst and his ideas are regularly cited or shared in television appearances by CNBC, Bloomberg, Wall Street Journal and hundreds of other sites around the world.

A 7x best-selling author, his most recent book is “Human/Machine.” Daniel is also a Forbes and MarketWatch (Dow Jones) contributor.

An MBA and former graduate adjunct faculty member, Daniel is an Austin, Texas transplant after 40 years in Chicago. His speaking takes him around the world each year as he shares his vision of the role technology will play in our future.
