Expert claims DeepSeek could bring “the end of closed-source AI”

DeepSeek, a new AI model hailing from China, has been making all kinds of headlines lately. The sudden plunge in stock value for several US tech companies, including Nvidia, made sure that everyone took notice. It is seen as the largest open-source large language model to date, and it achieved that without costing billions – the model reportedly cost just $6 million to train.
And now, tech and AI experts have been reflecting on the sudden impact DeepSeek’s R1 model has made. In a recent video from the popular YouTube channel Computerphile, one expert explains why DeepSeek is a game changer for AI.
Closed-source AI may no longer be a viable option
Michael Pound, a researcher at the University of Nottingham who focuses on machine learning and deep learning, shared his thoughts on DeepSeek in a recent video. On the issue of open versus closed-source, he says, “personally I think openness is a good thing”. He adds that DeepSeek has “changed the game a little bit” by showing “you can train with more limited hardware”.
Whereas US tech giant OpenAI, creator of ChatGPT, keeps its models behind closed doors, exposing them only through an API or web interface, DeepSeek has taken a much more open approach – something Meta has already been doing with its open-source Llama AI models. This makes it easy to see aspects of the model such as training data sets, model parameters, and scale.
However, even if you have access to an AI model – as in the case of Llama – that doesn’t mean you can train it from scratch. That currently takes a lot of processing power, likely in the form of an Nvidia GPU server farm. Nvidia has made a fortune from its AI chips, the latest of which (built on the Blackwell architecture) is “progressing smoothly” according to CEO Jensen Huang.
“We could be seeing the end of closed-source AI, because it may just not be viable.”
Michael Pound, Researcher, University of Nottingham
What DeepSeek has achieved with what Pound calls “essentially consumer hardware” seems nothing short of a miracle and only strengthens the case for openness – “it’s a leveler,” says Pound. Yes, the hardware DeepSeek is using is no doubt expensive, but it is a step toward leveling the playing field: you don’t need a supercomputer with over 100,000 Nvidia GPUs.