For this special episode of the Futurum Tech Webcast, Principal Analyst and host Daniel Newman welcomes Alexis Crowell, IoT Marketing Global Lead at Intel, to discuss the AI at the edge journey many SMEs are experiencing right now, including how Intel can provide the solutions and expertise needed to gain a competitive edge in the market.
Their conversation includes a look at:
- Why putting AI closer to the data source can improve response time, efficiency, and data security.
- The technologies needed to deliver AI at the edge that Intel specializes in, including CPUs, VPUs, FPGAs, networking, and memory and storage.
- Integrating AI into edge strategies, especially the value AI can bring when extracting insights from mountains of data.
- Case studies of companies that have successfully integrated AI at the edge and the benefits they have reaped, including GE Healthcare, which used AI at the edge to glean insights from its MRI machines, improving performance and efficiency.
This was a fascinating discussion about an ever-relevant topic, and we’re glad to have you as part of it. Check it out below:
Disclaimer: The Futurum Tech Podcast is for information and entertainment purposes only. Over the course of this podcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we do not ask that you treat us as such.
Daniel Newman: Welcome to Futurum Tech TV. I’m Daniel Newman, your host today for this conversation with Alexis Crowell at Intel. We’re going to be talking about AI at the Edge. I’m Daniel Newman, your host, like I said, principal analyst and founding partner at Futurum Research and Alexis, welcome to Futurum Tech TV.
Alexis Crowell: Thank you. Thanks so much for having me.
Daniel Newman: Yeah, it’s a lot of fun. During the COVID-19 pandemic, we’ve taken our analysis to TV and our conversations to video, with more personality, getting more face to face in a world where you cannot get face to face. But I’m excited about this conversation, Alexis, talking about AI at the Edge. Before we do that, for everyone out there taking the time to spend with us, tell everybody about yourself and what you do at Intel.
Alexis Crowell: Oh, sure. I’d be happy to. So for the last three and a half, four years, I’ve focused on AI. I’ve been on the product marketing and marketing side of telling our AI story and bringing technology to all of the different customer bases, because AI is underpinning what will happen going forward. So I’ve been really fortunate to be at the forefront of that, both helping customers implement, but also then getting to go tell their stories back into the market. And right now I’m focused purely on IoT, and how we’re bringing the entire IoT story together, but explicitly AI, given how much AI, especially inference, is going to happen at the Edge.
Daniel Newman: Yeah, there’s no doubt about it. The AI and IoT story, some of… partners of yours are using terms like AIoT. In fact, you’re seeing that out there right now, but we are really seeing that with the vast data creation, collection, and ingestion that’s going on, that’s putting a lot of strain on resources. That’s putting a lot of pressure on companies to try to utilize and maximize it. We know the best companies are differentiating themselves by really aggregating all that data and using it in the way they interact with customers, the way they buy products. So that means if the best are doing it, everyone else is going to be required to keep up, and they’re going to need the tools and the technologies to do so.
And you sort of alluded to it in your introduction, and no question, your passion for the technology, Alexis, comes through the screen; you can just get it in your smile. You’re beaming when you talk about it. But in all seriousness, it’s an exciting space. I mean, you literally cannot come across a technology right now it doesn’t touch, whether it’s consumer stuff, the way we watch our movies and the way we get information, to when we shop online, to the way we take data from parking lots and people walking down streets to optimize retailers or the way a government is managing its city. But take a step back: AI at the Edge. What is it? Why is AI at the Edge? What’s the discussion right now around AI at the Edge, as opposed to just AI in general?
Alexis Crowell: Sure. You know, we think about AI as really being a foundational technology. So it’s not so much about a workload. It’s about how we get more insights, more actionability out of data. We’ve all heard that in the last two years more data has been created than in the last 200, or whatever the figure is. It’s huge. But only about 2% of it has actually been analyzed or acted on. So how do we flip that? What AI really allows us to do, by moving that algorithm, those insights, closer to where the data is created, is act faster and more readily on what we’re seeing.
So let’s take an example: computer vision. Having a robot or a computer see what’s in front of it or what’s happening around it is a use case that can be applied on factory floors, in retail, in restaurants, all over the place. But let’s say you’ve got a robot on a factory floor that’s maybe moving a machine from one spot to another. You don’t want that computer vision algorithm running in the cloud or running far away, because that adds latency when you potentially need to stop that robot if somebody walks in front of it. How do you add that safety feature if you’re having to deal with the latency of running it back to the cloud? So not only can you act faster, because there’s no round-trip latency, but you also get insights that are more localized to what you’re seeing around you, so that you can continue to move your business forward faster.
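The safety argument here is really a latency-budget argument: the robot has a hard reaction deadline, and a cloud round trip can consume that budget while local inference fits inside it. As a rough sketch (not Intel code, and the millisecond figures below are purely hypothetical):

```python
# Hypothetical latency budget for a robot safety stop.
# All numbers are illustrative, not measurements of any real system.

SAFETY_DEADLINE_MS = 100.0  # the robot must react within this budget

def meets_deadline(inference_ms: float, network_rtt_ms: float = 0.0) -> bool:
    """True if detection plus any network round trip fits the safety budget."""
    return inference_ms + network_rtt_ms <= SAFETY_DEADLINE_MS

# Cloud path: fast inference on big hardware, but a long round trip.
cloud_ok = meets_deadline(inference_ms=30.0, network_rtt_ms=120.0)

# Edge path: somewhat slower inference on local hardware, no network hop.
edge_ok = meets_deadline(inference_ms=45.0)

print(cloud_ok, edge_ok)  # the cloud path misses the deadline; the edge path makes it
```

The point of the sketch is that the network term, not the compute term, dominates the budget, which is exactly why the inference moves to the edge.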
Daniel Newman: Yeah. There’s a symbiotic partnership that is going to take place between Edge and Cloud. There’s no way you can do all the compute for every piece of data being created at the Edge. But there are certain workloads, unquestionably. You think about autonomous vehicles, for instance, or, you mentioned computer vision, in the post-pandemic or COVID world, where we’re going to have an extra amount of focus on hygiene, safety, social distancing. You’re seeing a lot more companies looking at… like in retail, as retailers want to open in certain capacities, well, how do we manage that? Well, computer vision could essentially look at a space and be like, “Oh, we have the right amount, or too many, or not enough people in a space.” How do we set up the store, or a restaurant, to make sure people can distance? So those are a lot of great applications that require the right compute and horsepower, and delivering that in more real time is going to be critical.
Alexis Crowell: Exactly.
Daniel Newman: Go ahead.
Alexis Crowell: Or think about the engagement. If you take your retail example, if you use all of these autonomous elements to set up your store, you still want that personal touch for the most part, especially certain brands as you walk in the store, when we can. So using things like NLP and natural language to help have kind of an interactive, “Hey, I’m really looking for a new pair of jeans, point me to the right spot.” But it doesn’t necessarily have to be a human interaction in that moment, but it still feels a little bit more personal to the customer walking in.
Daniel Newman: Yeah. I liked that a lot. And I do think it’s a combination. I think some of what we’re seeing in this kind of, what I call, the Pandemic Playbook is going to be incrementally. It’s not going to be phased out. I think we’re going to learn from these things and continue, but we’re going to start to return more towards experience.
Alexis Crowell: I think so.
Daniel Newman: People come into a restaurant, it’s not just going to be about, are you sitting far enough apart? It’s about how we really make people happy and deliver the best offer in the right moments. Or in the workplace, how do we make meetings better? Not just because we want people to distance, but because, why were we doing that thing in the first place? Why were people walking in the room, trying to use some very confusing touch panel, when they could just walk in and be like, “Call Alexis”? And it would know from conversational AI that I mean you. And it can take data off my Edge device, my phone connected to that network locally, to know who the most popular Alexis might be in my phone or who I’d be most likely to be calling from work. So those are some great examples.
I’d love to ask you, too, because you guys at Intel, and we’ve advised and worked with the company for a long time, focus a lot on the customer, really being an advisor. So despite the fact that you’re a manufacturer, there’s a lot of consulting going on. And so, as you’re talking to these customers about implementing AI at the Edge, what are you suggesting? Because I think it could be a lot to absorb, whether it’s a government or a retailer. I think the options must feel limitless, and that must be creating a little bit of fear in terms of action. And I imagine you’re trying to help with that.
Alexis Crowell: Yeah, absolutely. I mean, I haven’t talked to a company whose board of directors hasn’t said, “You need an AI strategy.” And then the CEO kind of goes into paralysis by analysis. Like, “Oh my gosh, where do I start?” So there’s a couple of things that we try to help walk people through that ease the concern. First off, don’t think about AI as this big, massive, scary thing. One of the best things that you can really think of as a business owner is, what’s the intersection between scarcity, where you might not have enough resources to go do something, and knowledge?
So I’ll put a really tangible example around that. If you’re a rural hospital and you’ve got an MRI machine and an X-ray machine, but you don’t have a radiologist on staff, how can you still help your patient without a three- or four-day turnaround time because you’re sending those images off somewhere else? You can put AI at the Edge in that instance and help get to a diagnosis faster. Now, you’d still want a radiologist to check it, but it gets you started down the path. So think about that intersection: scarcity, because we don’t have a radiologist, and knowledge, because you need a specialty to really read those images. That’s a great place to start. And if you think about it that way, you take bite-sized chunks, and it makes the implementation easier.
The other piece that I think is really important is, we’re super proud of the fact that Intel and our hardware go forward and backward on compatibility and use cases. And a lot of times you don’t actually have to go get new systems and new equipment to test and try something. We’ve worked with a number of companies where we said, “Hey, do you know your Intel Xeons actually have inference boost capabilities, where you could run a lot of what you’re trying to do on the hardware you’ve already implemented?” So it doesn’t have to be a huge cash outlay either, in terms of investment, to get started. And if you break those pieces down, then you’re giving folks step-by-step, tangible ways to implement that don’t seem so daunting.
Daniel Newman: And then you can sort of test concepts too. You could do it in the cloud, or on your prem in your data center, knowing that you can remove latency by putting hardware at the Edge for a certain thing. But they can test it on Xeon, they can use their current ERPs or their current big data tools, and essentially accelerate using DL Boost. That’s really interesting.
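The "inference boost" capability referenced here is Intel DL Boost, which accelerates low-precision int8 math on Xeon CPUs; toolchains such as OpenVINO quantize a trained model so existing hardware can run it faster. As a library-free sketch of the idea behind int8 quantization (pure Python, illustrative only, not how any Intel tool actually implements it):

```python
# Minimal sketch of symmetric int8 quantization, the low-precision technique
# behind CPU inference acceleration features like Intel DL Boost.
# Pure Python, for illustration only.

def quantize_int8(values):
    """Map floats to int8 range [-127, 127] with a shared scale.

    Returns (quantized_ints, scale) so values can be approximately recovered.
    """
    max_abs = max(abs(v) for v in values) or 1.0
    scale = max_abs / 127.0
    q = [max(-127, min(127, round(v / scale))) for v in values]
    return q, scale

def dequantize(q, scale):
    """Recover approximate floats from quantized ints."""
    return [v * scale for v in q]

weights = [0.52, -1.30, 0.07, 0.99]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)

# Rounding error is bounded by half a quantization step (scale / 2) per value,
# which is why int8 inference typically loses little accuracy.
max_err = max(abs(a - b) for a, b in zip(weights, restored))
print(q, round(max_err, 4))
```

The trade is precision for throughput: int8 operands are a quarter the size of float32, so the CPU moves and multiplies four times as many of them per instruction, while the bounded rounding error usually costs little model accuracy.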
By the way, you made me laugh when you talked about the radiology example. I’ve had multiple cases now where my kids, who are athletes, have broken a bone and we’ve had it misread one way or the other. Where it was misread in the emergency room and then the radiologist called and said, “No, it is broken.” Or vice versa. As AI becomes so much more pervasive, mistakes like that are going to be made less frequently.
Alexis Crowell: Exactly.
Daniel Newman: They’re going to have so much more data to be able to see, this is why this got misread. This is how this got misread. It can actually give an indicator to that person making that initial read like, “We’ve seen this before.”
Alexis Crowell: Exactly.
Daniel Newman: But that latency, that differentiation, is small, and yet it creates a really big difference in patient experience. And when you’re a hospital or a hospital system, that experience is the moneymaker, and it’s actually measured.
Alexis Crowell: Exactly. Or, well, if you’re the concerned parent. Either side changes how you then want to go forward with that healthcare system. And there’s a ton of healthcare systems out there. So how are we helping people’s very scary moments become a little bit more… seem a little bit more empathetic? GE Healthcare is one of our big partners in this space. They’ve already started that with some of our tech. And if you start to then pile on tools like OpenVINO, or our DevCloud, so to your point on how you test and know where it landed, you can do it there. There are all of these great pieces in place that we can help walk customers through, so that it’s much more accessible than it might sound when your board of directors says, “Hey, go get me an AI strategy.”
Daniel Newman: Yeah, amazingly, I feel a lot like it’s where big data was three or four years ago where it was, “Just do it.” And it was like, “Do what?” And it was like, “Do one thing, do two things, do three.” And that’s the point is what are the most important workloads to improve employee safety, customer experience, accessibility, compliance, whatever it is. Those couple of things that you can do, do one, nail it and then rinse and repeat. And that’s kind of what we ran up against in big data.
Alexis Crowell, I want to thank you so much for spending a few minutes with me here on Futurum Tech TV. It’s a great discussion everybody. Check out the links below. If you’re looking at this on YouTube and we’ve got some links where you can learn more about what Alexis and I talked about here today on Futurum Tech TV. If you hit us on the Twitters and you’re watching this as a piece of a video on Twitter, go ahead and click into the profile and click onto the link. Because we’d love for you to see the whole video, but for now, for this episode of Futurum Tech TV, thanks Intel for partnering up on it. Thanks Alexis. We got to go. Bye bye now.
Alexis Crowell: Thank you.