The Cost of The Next Big Thing – Artificial Intelligence

The News: The Information recently reported that the cost of running ChatGPT is estimated at around $700K per day. While the dollar amount itself is staggering, it also raises a broader question about the cost of AI – including the resource- and energy-intensive nature of AI programs. See more from The Information’s reporting here.

Analyst Take: AI has long been a buzzword that technology vendors have proclaimed to be “the next big thing.” Often, I have taken the phrase “AI-powered” to mean somewhere between nothing at all and some form of limited machine learning – but certainly a long way from true artificial intelligence. Over the past few months, however, the whole world has been seemingly obsessed with ChatGPT – and for good reason. The AI chatbot is arguably the first step toward a real, intelligent AI. ChatGPT goes well beyond a niche machine learning model or baseless “AI-powered” marketing messaging, and with the rapid rate of development – both of ChatGPT itself and of newly emerging competitors – it is beginning to feel as if AI is truly becoming the next big thing.

While ChatGPT has been the hottest topic around, and there have been countless takes both for and against the development of AI, one of the more interesting things I have read concerns the cost of operating ChatGPT. When I first read about The Information’s findings, I said to myself “wow, $700K per month – that’s a lot.” Only upon re-reading the article did I realize that it was not $700K per month – it was $700K per day. This daily cost is staggering – and it covers only ChatGPT; the competitors racing to develop their own solutions are certainly tallying hefty sums as well.
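
To put the reported figure in perspective, here is a minimal back-of-envelope sketch in Python. The $700K-per-day number is the estimate from The Information’s report; the daily query volume below is a purely hypothetical placeholder, included only to show how a per-query cost would be derived.

```python
# Back-of-envelope scale check on the reported ChatGPT operating cost.
# The daily cost comes from The Information's reporting; the query volume
# is a hypothetical placeholder for illustration, not a reported figure.

DAILY_COST_USD = 700_000                  # reported estimate (~$700K per day)
HYPOTHETICAL_DAILY_QUERIES = 10_000_000   # assumption for illustration only

annualized_cost = DAILY_COST_USD * 365
cost_per_query = DAILY_COST_USD / HYPOTHETICAL_DAILY_QUERIES

print(f"Annualized cost: ~${annualized_cost / 1e6:.0f}M per year")
print(f"Illustrative cost per query: ~${cost_per_query:.3f}")
```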

What gives me pause, however, is not solely the astronomical dollar amount being poured into AI – but rather the total cost associated with these programs. While I don’t have a full itemized breakdown of the $700K daily bill, it is not hard to imagine that quite a bit is spent on energy, given the high-powered servers, GPUs, and massive storage capacities used in AI applications. This creates a bit of a paradox: the IT industry as a whole has recently been pushing to reconcile its large energy usage and carbon footprint, and yet the area deemed “the next big thing” appears to be something of an environmental disaster. It is this environmental cost that calls the value of AI into question.
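
As a rough illustration of why energy is likely a meaningful slice of that bill, the sketch below estimates the daily electricity cost of a hypothetical GPU fleet. Every input – fleet size, per-GPU power draw, data center PUE, and electricity price – is an assumption chosen for illustration, not data from The Information’s report.

```python
# Hypothetical illustration of how an inference fleet's energy cost might be
# estimated. None of these inputs come from The Information's report; they
# are placeholder assumptions to show the shape of the calculation.

NUM_GPUS = 20_000          # hypothetical fleet size
WATTS_PER_GPU = 400        # hypothetical average draw per GPU (W)
PUE = 1.5                  # hypothetical power usage effectiveness (cooling, etc.)
PRICE_PER_KWH = 0.10       # hypothetical electricity price (USD/kWh)

facility_kw = NUM_GPUS * WATTS_PER_GPU * PUE / 1_000   # total facility draw (kW)
daily_kwh = facility_kw * 24                           # energy used per day (kWh)
daily_energy_cost = daily_kwh * PRICE_PER_KWH

print(f"Estimated daily energy cost: ~${daily_energy_cost:,.0f}")
```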

While ChatGPT is certainly an impressive technical feat, when considering the huge costs involved – including computational resources, energy expenditure, and of course actual cash – the question becomes: is the value of AI actually worth the cost? That question, of course, is simply too early to answer. It is fairly easy to argue that this current iteration of AI – essentially chatbots that provide a mostly accurate alternative to a Google search – might not justify the overall cost. But it should also be considered that AI is still in a very early stage of development, and that development is moving quite quickly. As development accelerates, it is reasonable to expect the value that AI provides to grow as well.

Personally, I am not a definitive expert on AI applications. So, to consider its future use, I asked a source quite close to the subject matter – ChatGPT. When asked how AI might be used in the future, ChatGPT listed a number of areas where AI could have a significant impact, including healthcare, education, finance, transportation, agriculture, entertainment, and, interestingly enough, environmental sustainability. On sustainability specifically, ChatGPT stated that “AI can be used to monitor and manage environmental risks, predict and prevent natural disasters, and optimize resource consumption.”

When weighing the cost and value of AI, two things stand out to me. First, the cost of AI today is much too high to sustain. And second, the potential value of AI is much too great to ignore. A world in which AI can be used to detect and treat diseases, automate transportation, prevent environmental disasters, and more is truly worth striving for. At the same time, a “neat chatbot” that costs $700K a day and likely uses excessive amounts of energy is a bit hard to stomach.

Development of AI is not going away, whether it is currently too costly or not – it is the next big thing. I would hope, however, that moving forward, the cost of these AI programs is ultimately brought into balance. As AI is developed to be more intelligent, more useful, and more practical, it also needs to be developed to be more energy efficient and less resource intensive. Just as there is a focus on responsibly developing AI without bias, there must also be initiatives to responsibly develop sustainable AI.

Disclosure: The Futurum Group is a research and advisory firm that engages or has engaged in research, analysis, and advisory services with many technology companies, including those mentioned in this article. The author does not hold any equity positions with any company mentioned in this article.

Analysis and opinions expressed herein are specific to the analyst individually and data and other information that might have been provided for validation, not those of The Futurum Group as a whole.

Other insights from The Futurum Group:

Weka Launches Sustainability Initiative for AI, ML, and HPC

The Environmental Impact of Accelerators

Storage, Sustainability, and ESG Reporting

Author Information

Mitch comes to The Futurum Group through the acquisition of the Evaluator Group and is focused on the fast-paced and rapidly evolving areas of cloud computing and data storage. Mitch joined Evaluator Group in 2019 as a Research Associate covering numerous storage technologies and emerging IT trends.

With a passion for all things tech, Mitch brings deep technical knowledge and insight to The Futurum Group’s research by highlighting the latest in data center and information management solutions. Mitch’s coverage has spanned topics including primary and secondary storage, private and public clouds, networking fabrics, and more. With ever changing data technologies and rapidly emerging trends in today’s digital world, Mitch provides valuable insights into the IT landscape for enterprises, IT professionals, and technology enthusiasts alike.
