
Confidential Computing: How Vendors are Looking to Deploy Trust-Based Computing Models – Futurum Tech Webcast

In this episode of the Futurum Tech Webcast, I was joined once again by my partner, Daniel Newman, for the next installment in our series on Confidential Computing and how vendors are looking to deploy trust-based computing models, specifically in areas such as Trusted Execution Environments, Enclaves, and homomorphic encryption, as well as a discussion on what’s ahead. These conversations are a precursor to a research brief we are in the midst of completing and will hopefully serve to whet your appetite for a deeper dive very soon.

Confidential Computing: How Vendors are Looking to Deploy Trust-Based Computing Models

To recap last week’s conversation, Daniel and I touched on recent cybersecurity breaches in the news, the average cost of a data breach, the career impact that a data breach inevitably causes, as well as current legislation that’s been introduced in the U.S. requiring that data breaches be reported in a specific and timely manner.

Today, our conversation centered around:

Revisiting Operational Trust vs. Technical Trust

Operational Trust is the kind of trust within an organization that we’re accustomed to. It revolves around the idea that better and more regular training, stricter rules, compliance, certifications, and the like are what will keep an organization safe. While that may be true in part, Technical Trust, which focuses on removing people from the security equation altogether by deploying technological solutions instead, is where we need to be heading.

What is the Goal of Confidential Computing and the Complexities Around the States of Data

We discussed the goal of Confidential Computing, which at its most basic is to reduce the ability of a platform’s systems administrator to access data and code inside Trusted Execution Environments, sufficiently that this path is not an economically or logically viable attack during execution. Data exists in one of three states: at rest on a storage device, in transit between two locations across a network, and in use as it’s being processed by applications. Confidential Computing is the protection of data in use by performing computation in a hardware-based Trusted Execution Environment, and it covers software attacks, protocol attacks, cryptographic attacks, and base physical attacks.
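To make the three states concrete, here is a minimal, purely illustrative sketch. It uses a trivial XOR "cipher" (not real cryptography, and not any vendor's implementation) to show why data in use is the hard case: bytes can stay encrypted at rest and in transit, but a conventional CPU has to see plaintext in memory before it can compute on it, and that exposure is the gap Confidential Computing aims to close.

```python
# Toy illustration (a trivial XOR "cipher", not real cryptography) of the
# three states of data, and of why data in use is the hard case.
KEY = 0x5A

def xor_crypt(data: bytes) -> bytes:
    # XOR is its own inverse, so this both "encrypts" and "decrypts"
    return bytes(b ^ KEY for b in data)

record = b"salary=90000"
at_rest = xor_crypt(record)      # state 1: stored encrypted on disk
in_transit = at_rest             # state 2: sent encrypted over the network
assert record not in at_rest     # an interceptor sees only ciphertext

# State 3, data in use: to compute on it, we decrypt into ordinary memory,
# where an administrator or attacker could read it. Confidential Computing
# closes this gap by decrypting only inside a hardware-isolated TEE.
in_use = xor_crypt(in_transit)
assert in_use == record          # exposed in the clear at this point
print(in_use.decode())           # salary=90000
```

The XOR step stands in for real encryption purely so the state transitions are visible; the point is the last two lines, where conventional computing has no choice but to hold the plaintext in ordinary memory.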

The Confidential Computing Consortium

The Confidential Computing Consortium is a group founded under the Linux Foundation and comprised of some of the biggest names in technology, who have partnered to focus on securing data in use with hardware-based TEEs and on accelerating the adoption of Confidential Computing through open collaboration.

The Role Hardware Plays in Security (and Confidential Computing)

When security is your end game, rooting security in silicon and working outward should be the foundation of your strategy.

This was a quick but wide-ranging conversation. Daniel and I spoke more about the specifics of Trusted Execution Environments, and about what is in scope as it relates to Fully Homomorphic Encryption, a class of encryption methods first envisioned in the 1970s and now a fundamental part of Confidential Computing. Whether you’re a senior leader focused on making security a fundamental part of business strategy (and we hope that you are), or a CISO charged with keeping your organization safe, this is a conversation you won’t want to miss.

Be sure to be on the lookout for our soon-to-be-published research brief on the topic of Confidential Computing — we think you’ll find great value in it. Until then, dig into this webcast by watching the video here:

Or listening to the podcast here:

And while you’re there, be sure to subscribe to our YouTube or podcast channels so that you won’t miss an episode. We are generally pretty interesting — we promise.

Disclaimer: The Futurum Tech Podcast is for information and entertainment purposes only. Over the course of this podcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we do not ask that you treat us as such.

Other insights from Futurum Research:

Oracle And Red Bull Racing Partner To Advance F1 Analytics 

Benu Networks Upgrades BNG: SASE And 5G AGF Additions Bring Cloud-Native Edge To A Carrier Near You

Automation Anywhere, Google Cloud Partner To Expand RPA Use In The Enterprise

Transcript: 

Daniel Newman: Hey everybody, good morning and welcome to the Futurum Tech Webcast, live today on LinkedIn, live on Facebook, and maybe live in other places, but also for those of you that are catching this later recorded or on-demand, we appreciate everybody here.

Welcome to the show. Shelly Kramer, my esteemed colleague/partner at Futurum Research, and of course me, Daniel Newman, Principal Analyst at Futurum Research as well, talking confidential computing as a multi-part series, and this is the second of a three-part. And it’s going to be part of a report that we’ll be putting out very soon, very soon, be patient with us here. And if you followed our show last week, you will know exactly what we’re talking about. But if not, we’ll give you a quick rundown in a second. But before I get started, before we jump into the show, Shelly Kramer, good morning, how are you?

Shelly Kramer: Good morning. I’m great. Thanks.

Daniel Newman: Yeah. So, of course we always let people know what time it is, that really helps them in case they’re listening to this at night. But in all seriousness, we really like this series, this topic, this webcast. It has been a crazy couple of months in cybersecurity. And a lot of you have probably heard about these hacks, SolarWinds, the recent Microsoft Exchange hack, and this has driven a lot of attention to the industry. We all like a headline, let’s just put it that way, we all like a story and a headline, but as analysts, it is really our job and our responsibility to think a little bit beyond the headline, to think about what we can talk to the market about to help them better understand the situation, better understand how to invest in their technology, whether it’s Cloud migration, whether it’s application or device security, minimizing threat surfaces. But one thing I know I feel, and I know you’ve always been a big cybersecurity person, Shelly, is that this is a topic that, no matter how much we talk about it, we need to talk about it some more.

Shelly Kramer: It’s kind of like data privacy, right? It’s this gigantic alligator that we are never done wrestling.

Daniel Newman: It does not end. And by the way, that’s a great example. And this sort of overlaps with privacy, by the way.

Shelly Kramer: It does, absolutely.

Daniel Newman: Because in order to truly provide privacy, you need to secure data, so this stuff does tie together. So let’s do a quick recap. Last week, you and I walked through cybersecurity in the headlines, which I mentioned. We ran down what we can learn from what’s happened over the past few months. On the cost of data breaches, we talked about not just the legit hard costs, but the soft costs, whether that could be the employment issues for CIOs and CISOs that were at the helm when these breaches happened, all the way to long-term customer trust, and what that can mean for a business, for its customer attrition, for its net promoter scores.

Of course, we know there’s a lot of focus in legislation, you wrote a great piece on Futurum, we’ll put it into the show notes, but you were talking about some bi-partisan, yeah, I know, crazy, huh? Bipartisan legislation that’s being developed to make companies more immediately disclose breaches, which by the way, a lot of people don’t recognize this, but by the time you hear about a breach, it’s not uncommon that these companies have had people in their systems poking around, extracting data, for weeks, months, or even in some cases, years. And that’s something a lot of people don’t understand. So this is a good moment for government intervention.

And then of course, we ran through a bunch of other case studies, from Target, for instance, and others, that sort of stemmed this whole discussion. But today, and by the way, I really want to cross the chasm from this broader cybersecurity and jump into this confidential computing discussion. I want to look at what vendors are trying to accomplish, the reason for trust-based computing models, especially in these trusted execution environments, what’s going on with Enclaves, and then of course, homomorphic encryption. I’m practicing that word.

Shelly Kramer: I’m very proud of you.

Daniel Newman: Did I say it right?

Shelly Kramer: You did.

Daniel Newman: I’ve read it for years, but I just don’t say it out loud very often.

Shelly Kramer: It’s a tough one. Yeah.

Daniel Newman: So let’s revisit this, operational trust and technical trust, Shelly. I talked about it at the end. This is really, in my eyes, what confidential computing is all about.

Shelly Kramer: Yeah. And I think sometimes most of us don’t break these down in our heads, but operational trust is the thought that better and regular training, and stricter rules, and compliance and certification, all of those things, those things are important. We’ve worked with many clients in the space of providing training and compliance and all that sort of thing. Those things are important. But today, operational trust alone is not enough. And so then we’ll shift and talk about technical trust, and that’s really where we need to head.

Technical trust is the focus on removing people from the security equation and deploying technology solutions rather than the training, and the processes, and the compliance, and the certification. And the industry as a whole needs to make it possible to run applications on somebody else’s computer, but where the owner of the computer can’t influence or observe what’s happening. It sounds kind of weird, I know, but this can be achieved through the deployment of technology that has no reliance on human intervention. And that’s really what we’re talking about when we’re talking about confidential computing, and what we’re talking about when we look at what’s the next gen of security protection for organizations.

Daniel Newman: Yeah, absolutely. And the genesis of all this is, as we’ve moved to Cloud, companies have had to rethink who can access the data, and how they’re able to access the data, and why they need to access the data. If you think about some of the biggest threat surfaces inside of an organization, it’s often people.

Shelly Kramer: Right.

Daniel Newman: And you talked about that with operational assurance, Shelly, but oftentimes the people have the capacity, because they’re administrators of the systems, to also be able to view or extract or take a snapshot of an application and the data, and that data can get migrated. It’s like anytime you have a PC that’s company owned and data has been sent around, oftentimes it’s sent around in an application.

But we often say, let’s download the CSV, we want to manipulate this data, play with this data. Well, all of a sudden this data is no longer in the secured environment, it’s now on someone’s machine.

Shelly Kramer: Right.

Daniel Newman: And administrators often have no reason to need to look at data, especially if you think about it in some highly regulated type spaces where you have things like credit card and financial data, you have HIPAA type data, and so we’ve had to build more hardened systems. But confidential computing as a whole, I guess we’ve talked around it a lot, but it really comes down to the ability to protect data in kind of all three states, right? We’ve got data at rest, you got data in transit, and we’ve gotten pretty good at that in terms of protecting it. But what about when data is being used in an application, and being able to manage it in all three states? That’s a pretty big problem. But as a whole, Shelly, how do you think about, when you’re explaining confidential computing, how do you think about defining it for somebody? Because it isn’t as easy as it sounds sometimes.

Shelly Kramer: You know what? I was looking at my notes and I totally didn’t even pay attention to the question you asked me.

Daniel Newman: Oh.

Shelly Kramer: What question did you ask me?

Daniel Newman: Oh, no, I was saying when you’re trying to explain confidential computing, and I know you do a pretty elegant job of explaining things, sometimes I like to dance around, but how do you sort of introduce confidential?

Shelly Kramer: Okay. So it’s really all about managing, more accurately and more safely, data that’s in use. Okay. It’s doing that in a hardware-based, trusted execution environment. And it sounds really weird, I know, to be able to think about. And back to what you said, Daniel, we can protect data when it’s at rest by encrypting it, and we can protect data in transit also by encrypting it, it’s a little trickier, but we can do that. But, this is the ability to actually work with data in a hardware-based, trusted execution environment, so that as the data is being used, it’s protected. That’s not where we are. That’s where we need to be.

Daniel Newman: Yeah, absolutely. We’re going to talk a little bit more, by the way, about what trusted execution environments are, because I think it would probably be worthwhile for our listeners to dig into that. But I think you hit it pretty well, in terms of the protection of data in use. And I mean, that’s been the area where often data has to be decrypted in order to be used in an application. So like you said, when it’s sitting at rest, you encrypt it, no one can do anything with it. So if you take a log file or you download a big customer data file while you’re hacking someone’s system, it’s a lot of hashed data, it’s encrypted, you can’t do a lot with it. You’d have to be a cryptography expert, and it would maybe take years, or a quantum computing machine in partnership with some high-power computing, to make that happen. It’s just not realistic. Same thing when that data is in flight.

But when we actually want to use the data, that often is a vulnerable moment for data to be messed with. And so we have to figure out a way to fix that, and that’s where trusted execution environments come in.

I want to take a step back because we’ve talked about what it is, and we’ve talked about what’s going on in this whole state, but this confidential computing, it’s not new, it’s been around for a little while, and it’s actually become more of a community approach. And in our research paper that we’re putting together, we outline this, we provide sort of a broader introduction to all the companies. But this really starts back with the Linux Foundation, which has been very involved in this whole thing, as these open standards try to develop ways to protect this data. But you’ve got a lot of companies, and there’s what’s called the Confidential Computing Consortium. Just a quick rundown of who’s involved: you’ve got Alibaba, you’ve got Arm, you’ve got Baidu, ByteDance, Fortinet, Google Cloud, Huawei, IBM, Intel, Microsoft, Red Hat, Swisscom, Tencent, and VMware. That’s a who’s who list.

Shelly Kramer: Right.

Daniel Newman: And I will say, there’s a noticeable omission of one of the biggest Cloud companies on the planet, Amazon. And we’ll talk more about this later, but Amazon has built its own approach, what’s called Nitro, and then an extended capability for compartmentalizing the data called Nitro Enclaves, and we’ll talk more about that at a later time. But if you really look at that list, it is the who’s who of Cloud, and software, and hybrid Cloud companies that are really trying to deal with this.

Shelly Kramer: Right. Well, and what they’re doing is so cool. So really, whether on the public Cloud, on-prem servers, or the Edge, the consortium is making it easier to run and move quickly between computing environments. And they’re supporting confidential computing, they’re hosting technical open source projects and open specs. They’re bringing hardware vendors, Cloud providers and developers together to grow its market value. They’re setting up regulatory standards, which is much needed, and they’re building the right kind of open source tools for trusted execution environment development. So they’re really working on a lot of things. And again, confidential computing is really in its nascent stages, but even in those nascent stages, there’s a lot of big name involvement, big tech involvement, and a lot of really cool things that are happening.

Daniel Newman: Yeah. And that’s a really nice rundown. I mean, consortiums are important, and sometimes they don’t move as quickly as people would like, but building standards, you hear sometimes about 3GPP, which is building standards for 5G. Well, this is another area where you’ve got different stakeholders. You’ve got the Cloud players, you’ve got hardware and infrastructure builders and developers, and you’ve got actual software developers. And these three groups have to work together concurrently to make something like this hum. It can’t do just one thing because ultimately if developers can’t build apps and code using this technology, it won’t get used.

Shelly Kramer: Right.

Daniel Newman: And vice versa. So, that’s really important to know. And the other thing that is sort of important is, there is different layers. So you hear, we mentioned like Intel, for instance. So there is a system and chip layer, there is a hardware and core layer, and then, like I said, there’s a software and edge layer, and hardware plays a really big role in this.

Shelly Kramer: Absolutely. Absolutely.

Daniel Newman: So I’d love for, and I don’t know if you want me to jump in, but I’d love to just chat a little bit about that. What do you think? You want me to take that or do you?

Shelly Kramer: Well, I think that, I’ll do it. So the basic premise here is that the security of the entire stack is only as strong as the layers below it, okay? Just like the foundation of your house, right? I mean, you build a house on a strong foundation. So when you’re focused on implementing sort of the ultimate layer of security, you’ve got to focus on the lowest elements in the stack. And in that case, that means focusing on the silicon components of hardware. When you do this, you remove vendors from many layers of the stack, the operating system, the device driver, the platform, the vendors. And then you look at who operates the system, who administers the system? Are they on-prem? Is it in the Cloud? You’ve got service providers, and their admins, who are off the list of trusted parties. And what you’re doing is you’re just reducing the overall exposure to potential compromise at any point in that system life cycle.

Daniel Newman: Yeah, absolutely. I think you hit that really well. There are all these different parts, and again, chip layer, hardware layer, software, application, development, pipeline layer, and all these things really have to work together, but you can not for a moment, not consider hardware. And as we move to Cloud, so often we sort of omit that. We kind of, “Oh, hardware is hardware,” but there’s a difference between running it in a virtualized environment and on bare metal. There’s a difference between what’s running on-prem and in the Cloud. And all these different things create a different requirement for being able to manage this. So, let’s get back to what we were talking about earlier here though, and define a little bit that trusted execution environment, Shelly.

Shelly Kramer: I’m going to let you take that one.

Daniel Newman: Okay.

Shelly Kramer: I’ve been talking a lot.

Daniel Newman: I know, I like hearing you. You’re good at this. So, really what it comes down to is it’s an environment for executing code. And so, in the trusted execution environment, which by the way sounds, “Oh, that’s simple. I just explained the terminology in itself.” But really, the surrounding environment needs to be able to withstand threats. It needs to have that high level of trust. It needs to be able to deliver data confidentiality. It needs to be able to maintain integrity, so that you cannot have unauthorized changes. Someone has actually likened this to the blockchain, you can’t make a change without it being noticed.

Shelly Kramer: Right.

Daniel Newman: Or noted. And then the code integrity itself, so, you kind of want to break it down, and this is how the Linux Foundation breaks it down, so I won’t take credit. But when they basically look at what a trusted execution environment is, there’s three properties that they talk about. They talk about data confidentiality, so meaning unauthorized entities cannot view data while it is in use within a trusted execution environment. So you can use it, but you can’t see it.

The second is data integrity, unauthorized entities cannot add, remove or alter data while it is in use within the environment. That’s like what I said, that’s kind of why blockchains become popular, anytime something gets changed, there’s a log of it. Well, in this case, you simply just cannot do it.

And then the third is the integrity of the code. And that’s basically that unauthorized entities cannot add, remove or alter code executing inside of the trusted execution environment. And it’s really important, I think, as we look at confidential computing to understand what that is, because at the heart of it, the trusted execution environment is what confidential computing is all about.
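The three properties Daniel lists can be loosely illustrated in code. The sketch below models only the third, code integrity, in the spirit of enclave measurement and attestation: code is hashed before it runs, and execution is refused if the measurement doesn’t match the expected value. It is a toy in ordinary Python, not a real TEE (there is no hardware isolation here), and every name in it is made up for illustration.

```python
# Toy sketch of code integrity via measurement (the idea behind enclave
# attestation). NOT a real TEE: ordinary Python offers no hardware isolation.
import hashlib

def measure(code: str) -> str:
    """Hash the code, the way a TEE 'measures' what it is asked to load."""
    return hashlib.sha256(code.encode()).hexdigest()

trusted_code = "result = 2 + 2"
EXPECTED_MEASUREMENT = measure(trusted_code)  # recorded at build time

def run_if_trusted(code: str) -> int:
    """Refuse to execute anything whose measurement doesn't match."""
    if measure(code) != EXPECTED_MEASUREMENT:
        raise PermissionError("code integrity check failed")
    scope = {}
    exec(code, scope)
    return scope["result"]

print(run_if_trusted(trusted_code))              # the approved code runs: 4
try:
    run_if_trusted("result = 2 + 2  # altered")  # one changed byte is caught
except PermissionError as e:
    print(e)                                     # code integrity check failed
```

In real hardware the measurement is taken and enforced below the operating system, which is exactly why the administrator of the surrounding platform cannot quietly swap the code.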

Shelly Kramer: Absolutely. And so what our goal here is, is really to reduce the ability, we touched on this earlier, for a systems administrator of a platform to access data and code inside that trusted execution environment, so that this path is not in any way economically viable. Like, there’s nothing in it for me, I can’t get in there. And so what it does is it limits software attacks, protocol attacks, cryptographic attacks, and base physical attacks. What this does not do, what is not in scope for confidential computing, and I think this is also important, are sophisticated physical attacks, obviously, and upstream hardware supply chain attacks, like an attack on a CPU at the chip manufacturer, that sort of thing. So, that’s really kind of the scope there: the goal of confidential computing is to remove the ability and the access for a systems administrator to do anything nefarious.

Daniel Newman: Yeah. That’s a great point. And it’s good to make sure everyone out there that’s listening and paying attention just kind of notes, it solves a lot of problems, it doesn’t solve all the problems.

Shelly Kramer: Right.

Daniel Newman: And there is some standard and some elevation in terms of making sure that data in all three states are being protected. It certainly mitigates a lot of unsophisticated or baseline level attacks. The most sophisticated attacks are going to look at every opportunity to get across and touch any surface that’s vulnerable in terms of the threat. And so that’s why it’s important to sort of note what’s outside of the scope.

It’s also important to note, though, that some companies are doing more than just that base layer of confidential computing. So, all those companies involved that we mentioned, all the Cloud providers mentioned, are offering confidential computing at some level, but then you get to that next layer. You look at IBM’s Hyper Protect services, a specialized set of services that sits on top of confidential computing and adds additional layers and additional compartmentalization of data, requiring different key access, that makes it even harder than the limitations that are created through standard confidential computing. And this, by the way, is exactly what AWS has done with Nitro and its Nitro Enclaves. And again, Amazon is not part of the Confidential Computing Consortium, but Enclaves is the Amazon AWS approach, and you simply cannot ignore what the largest Cloud provider in the world is doing.

Shelly Kramer: Absolutely.

Daniel Newman: They’re taking their own route. So, as we sort of wrap up here, we can’t not talk about fully homomorphic encryption, Shelly.

Shelly Kramer: We can’t, especially because that word is sometimes just such a challenge. Or, you could call it FHE. And that’s really a class of encryption models that was envisioned in the ’70s and first constructed by a gentleman named Craig Gentry, who at the time worked at IBM.

And what that is, is that with normal encryption, the sender encrypts with a public key, the recipient decrypts using the secret key, performs the computation on the data, re-encrypts, and sends it back. If you’re involved in security operations and you’re transferring data, this probably sounds familiar to you. The data is scrambled for transmission, and then if it’s intercepted, it cannot be stolen in clear form.

What homomorphic encryption does, it differs in that it allows computational tasks to be performed directly on the encrypted data without requiring access to a secret key. We talked about this before, using data securely that you can’t see. And the result is that the computation remains in its encrypted form, and it can later be revealed by the owner of the data with a secret key. Again, when we talk about Bitcoin, right? Secret key, secret access, there are some similarities there. So talk a little bit, explain for our audience, Daniel, if you would, a little bit about how fully homomorphic encryption can be applied.
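Shelly’s description of computing on data you can’t see can be made concrete with a toy additively homomorphic scheme. The sketch below is a Paillier-style cryptosystem with deliberately tiny, insecure parameters, assumed purely for illustration; it is not fully homomorphic (it supports only addition, not arbitrary computation) and is not any vendor’s product. The key point is the last step: a third party multiplies two ciphertexts, which adds the underlying plaintexts, without ever holding the secret key.

```python
# Toy Paillier-style additively homomorphic encryption. Illustrative only:
# the primes are tiny and fixed, so this is NOT secure.
import math
import random

p, q = 293, 433                 # toy primes; real schemes use ~1024-bit primes
n = p * q
n2 = n * n
g = n + 1                       # standard simple choice of generator
lam = math.lcm(p - 1, q - 1)    # Carmichael's function of n (secret)
# mu is the modular inverse used during decryption (secret)
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)

def encrypt(m: int) -> int:
    """Encrypt with only the public key (n, g); randomness r blinds m."""
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    """Recover the plaintext using the secret values (lam, mu)."""
    return ((pow(c, lam, n2) - 1) // n * mu) % n

# A third party holding only ciphertexts can add the plaintexts by
# multiplying the ciphertexts, without ever seeing 17, 25, or the secret key.
c_sum = (encrypt(17) * encrypt(25)) % n2
print(decrypt(c_sum))  # 42
```

Fully homomorphic schemes like Gentry’s extend this idea to support both addition and multiplication on ciphertexts, which is what makes arbitrary computation on encrypted data possible.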

Daniel Newman: Yeah. So, there’s essentially three ways, what do we call it? Three main domains, and we outline this pretty comprehensively in our research. And I’m just going to keep teasing that Shelly, it’s going to be a motivation for everybody to come back. It’s going to be motivation for you and I to finish this thing.

Shelly Kramer: Absolutely.

Daniel Newman: But we’ve really dug into this. And there are these domains, and one is privacy. So remember I said you cannot talk about security without privacy. So, FHE provides big improvements in privacy, and that’s because data can be processed by third parties without divulging the actual data or the insight that the processing creates. The second is regulatory compliance; it enables stricter compliance. And you know we’re going to see more and more government intervention, compliance and regulatory policy being created for privacy. Well, this is a technology that can help organizations process workloads with encrypted data and not expose unencrypted and sensitive information. So that’s two.

And then the third is Cloud security. Look, a lot of people have wanted to debate whether Cloud is as secure as prem. And I think overall these big Cloud providers have proven, overwhelmingly, they’re able to secure data every bit as well as prem. But we always want more Cloud, we want to move more highly regulated workloads, things that sat on mainframes, things that have been protected for a long time, onto Cloud, or even to just be connected to Cloud; certain data can be mobile between Cloud and these more secure environments. You need to be able to make sure that that data can be protected, even in a public Cloud environment, and that is something that fully homomorphic encryption enables.

So, by the way, this is a lot. And, you and I are technical, and we work in this space, and we get to be briefed and analyze. I will say, I’ve done briefings on confidential computing now with multiple of these companies, these vendors, I’ve also done an in-depth briefing with AWS, Amazon and these vendors, but Shelly, this stuff, I joke when I got off one of the briefings I go, “I think I need a nap.”

It’s complex. And for an engineer that is entirely focused on cryptography and building secure environments, this stuff is, it’s Thanksgiving dinner conversation. But even if you’re a CIO or you’re a technology leader or an analyst, an influencer or someone that’s in the media, and you’re writing about these topics, this is a lot to come together.

Shelly Kramer: It is.

Daniel Newman: But, coming full circle, we cannot any longer ignore the technological ability to build trust.

Shelly Kramer: Correct.

Daniel Newman: We’ve focused entirely on operations for so long.

Shelly Kramer: And we’re failing. I mean, we’re failing operationally, and some of it is a processes thing, some of it’s a legacy infrastructure thing, some of it’s a skill set thing. There’s a very well-known dearth of highly skilled IT talent, especially in the cybersecurity space. I mean, it’s a pressure cooker of a job, by the way. I mean, immense responsibility for the safety of an organization is a huge thing. The threats, we have no idea the depth of the SolarWinds attack. We have no idea the depth of the attack on the Microsoft Exchange servers. Those are still being discovered. Those are still going on. I mean, hacking is a full-time, very, very lucrative business. So really stepping back from that, we’ve been preaching, for so long, the fundamental tenets of security, that it’s about training, ongoing training, and the right processes and being certified, and compliance, and bottom line, it’s not enough.

Daniel Newman: Yeah. Absolutely. Well, listen, I think this has been great. We’ve got one more part where we are actually going to look a little bit more in depth at the vendor landscape. We’re going to talk a little more specifically about what some of the different vendors that are participating are doing, probably from the chip and core layer all the way up to the software and development layer, because, again, we think this topic warrants more than one cast, more than one podcast, webcast, video cast, live stream, whatever you want to call it, we don’t care, we just appreciate you watching and listening.

So for this show, Shelly, for the Futurum Tech Webcast, for this live stream, for this morning, or afternoon or evening, whenever you listen to this, thanks for tuning in. Hit that subscribe button. Join us for the rest of our shows. Keep an eye on the show notes, we referenced a bunch of resources. Shelly’s article, the white paper, the other episodes we’ve done, we want you to keep reading and keep with us. But for now, we’re going to say goodbye for the live viewers, we’ll see you later. For everyone listening on demand, thanks.


Author Information

Shelly Kramer is a Principal Analyst and Founding Partner at Futurum Research. A serial entrepreneur with a technology centric focus, she has worked alongside some of the world’s largest brands to embrace disruption and spur innovation, understand and address the realities of the connected customer, and help navigate the process of digital transformation. She brings 20 years' experience as a brand strategist to her work at Futurum, and has deep experience helping global companies with marketing challenges, GTM strategies, messaging development, and driving strategy and digital transformation for B2B brands across multiple verticals. Shelly's coverage areas include Collaboration/CX/SaaS, platforms, ESG, and Cybersecurity, as well as topics and trends related to the Future of Work, the transformation of the workplace and how people and technology are driving that transformation. A transplanted New Yorker, she has learned to love life in the Midwest, and has firsthand experience that some of the most innovative minds and most successful companies in the world also happen to live in “flyover country.”
