
DeepSeek’s cheaper AI inference costs will actually lead to higher total spending, says Amazon CEO

If we consider the bigger picture, that is

By now, everyone is likely aware of DeepSeek, the Chinese AI LLM that has been framed as a game changer due to its low cost, reduced computational requirements, and open-source accessibility. While many compare DeepSeek to other LLMs like ChatGPT, highlighting how it cost only a fraction as much to build, Amazon’s CEO has a different perspective on the “cheap” frenzy surrounding it.

According to a post shared by The Transcript on X, Amazon CEO Andy Jassy addressed the DeepSeek situation during Amazon's Q4 2024 earnings call, saying that cheaper AI inference costs will lead to higher total spending, not less. Citing AWS's own history, he believes that as AI becomes more affordable, companies will simply increase their usage.


DeepSeek is making AI more accessible but may fuel more spending

To support his point, Jassy pointed to Amazon's own cloud computing arm, AWS. When it launched in 2006, many assumed that the lower cost of cloud storage and compute power would lead companies to spend less on infrastructure. What actually happened was that businesses, encouraged by the affordability, reinvested their savings into building more advanced and scalable systems, “ultimately spending much more in total technology.” Jassy believes the same pattern will apply to AI inference.

“The cost of inference will substantially decrease. What you heard in the last couple of weeks at DeepSeek is just one piece of this, but everyone is working on it. I believe the cost of inference will meaningfully decline, making it much easier for companies to integrate inference and generative AI into all their applications.”

Amazon CEO, Andy Jassy

As for DeepSeek and its low-cost AI inference, Jassy believes that rather than reducing total tech spending, lower inference costs will drive even broader AI adoption, with companies using those savings to expand their AI-driven operations.

The end of closed-source AI is already happening

Jassy’s statements broadly align with what experts like Michael Pound are saying about DeepSeek, which has changed the game by taking a more open approach. That transparency makes it easier to examine aspects of the model, such as its datasets, model parameters, and scale. As a result, companies no longer have to rely on technology from tech giants like OpenAI; thanks to DeepSeek, they can now build on that technology for free to enhance their own chatbots. Or, as Jassy puts it, it becomes much easier for companies to integrate inference and generative AI into all their applications. On top of that, the lower hardware requirements are a plus, with smaller versions of DeepSeek already running on a Raspberry Pi.


About the Author

Hassam boasts over seven years of professional experience as a dedicated PC hardware reviewer and writer.