Expert claims DeepSeek could bring “the end of closed-source AI”
DeepSeek, a new AI model hailing from China, has been making all kinds of headlines lately. The sudden plunge in stock value for several US tech companies, including Nvidia, made sure everyone took notice. It is seen as one of the largest open-source large language models to date, and it achieved that without costing billions – the model reportedly cost around $6 million to train.
And now, tech and AI experts have been reflecting on the sudden impact DeepSeek’s R1 model has made. In a recent video from the popular YouTube channel Computerphile, one expert explains why DeepSeek is a game changer for AI.
Closed-source AI may no longer be a viable option
Michael Pound, a University of Nottingham researcher who focuses on machine learning and deep learning, shared his thoughts on DeepSeek in a recent video. On the issue of open versus closed-source, he says, “personally I think openness is a good thing”. He adds that DeepSeek has “changed the game a little bit” by showing “you can train with more limited hardware”.
Whereas US tech giant OpenAI, creator of ChatGPT, keeps its models behind the scenes, exposing them only through an API or web interface, DeepSeek has taken a much more open approach – something Meta has already been doing with its open-source Llama models. This openness makes it easy to see aspects of the model such as data sets, model parameters, and scale.
However, even if you have access to an AI model – as in the case of Llama – that doesn’t mean you can train it from scratch. That currently takes a lot of processing power, likely in the form of an Nvidia GPU server farm. Nvidia has made a fortune from its AI chips, the latest of which (built on the Blackwell architecture) is “progressing smoothly”, according to CEO Jensen Huang.
“We could be seeing the end of closed-source AI, because it may just not be viable.”
Michael Pound, Researcher, University of Nottingham
What DeepSeek has achieved with what Pound calls “essentially consumer hardware” seems nothing short of a miracle, and it only adds to the case for openness – “it’s a leveler,” says Pound. Yes, the hardware DeepSeek is using is no doubt expensive, but it looks like a step toward leveling the playing field: you don’t need a supercomputer with over 100,000 Nvidia GPUs.