We all know that GPT-3 cost more than $4 million to develop. But what about its predecessor, GPT-2? How much did that language model cost to build?
In this article, we’ll talk about the development cost of GPT-2. So, let’s begin the discussion.
What’s GPT-2’s Development Cost?
You might be surprised to learn that the GPT-2 project was funded by OpenAI, a research organization co-founded by tech moguls like Elon Musk and Sam Altman.
According to reports, the development cost of GPT-2 was just $50,000, which is almost nothing in comparison to GPT-3's price tag. That budget gap is reflected in the models' scale: GPT-2 tops out at about 1.5 billion parameters, while GPT-3 has 175 billion.
Hardware Cost
Because GPT-2 was a new, first-of-its-kind effort, OpenAI needed a great deal of computing power to train it. The organization used a combination of CPU and GPU clusters to run the massive computations required for training.
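For intuition, here is a minimal back-of-envelope sketch of how a training bill in that range could add up on rented cloud hardware. Every number below (cluster size, hourly rate, training duration) is a hypothetical assumption chosen purely to illustrate the arithmetic; it is not drawn from any published breakdown of OpenAI's actual spending.

```python
# Rough cloud training cost estimate:
# total cost = (number of accelerators) x (hourly rate) x (training hours).
# All figures are illustrative assumptions, not OpenAI's real numbers.

num_accelerators = 32      # hypothetical cluster size
hourly_rate_usd = 2.50     # hypothetical per-accelerator price, USD/hour
training_hours = 24 * 25   # hypothetical ~25 days of training

total_cost = num_accelerators * hourly_rate_usd * training_hours
print(f"Estimated compute cost: ${total_cost:,.0f}")  # -> $48,000
```

With those assumed inputs, the estimate lands near the reported $50,000 figure, which shows how a relatively modest cluster running for a few weeks can plausibly produce a bill of that size.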
Researcher Salaries
The researchers who worked on the GPT-2 project were some of the top minds in artificial intelligence, and they commanded high salaries. At the time, hiring top talent was essential for developing cutting-edge AI technology like GPT-2.
Conclusion
Developing the GPT-2 language model wasn't all that expensive, but the effort behind it was massive. It was a first-of-its-kind model, and there weren't many existing resources to build on.
OpenAI's investment in the project covered hardware, researcher salaries, and other development costs. The result was a language model with the potential to revolutionize the field of natural language processing.
GPT-2 has already been used in various applications, from language translation to content generation. It’s safe to say that the investment in GPT-2 was well worth it.