Amazon may be working on a new large language model (LLM). Codenamed Olympus, the generative AI model is rumored to have two trillion parameters, more than GPT-4 is believed to have. If the rumors are true, the Amazon Olympus LLM will come out of the starting gate competing with OpenAI’s GPT-4 Turbo for the title of ‘world’s most powerful AI chatbot’. So, what will this titan of artificial intelligence be capable of, and when might it arrive?
What is Amazon Olympus AI?
Amazon Olympus AI is a rumored large language model in development under former Head of Alexa Rohit Prasad, who reports directly to CEO Andy Jassy. Prasad, now Amazon’s SVP and Head Scientist of Artificial General Intelligence, has declined to comment on the project, and the sources who described it spoke on condition of anonymity while such an ambitious large language model is still in development.
The LLM may not be called Olympus at release, and the AI chatbot it powers may be named differently again; few LLMs share a name with their respective chatbot. For now, we only know Olympus as the codename for an AI model that Amazon is currently training.
We also know that Amazon reportedly plans to integrate Olympus into its Alexa voice assistant, which could give Alexa the most advanced natural language processing capabilities of any voice assistant, ahead of Apple’s Siri and Google Assistant. An Olympus-powered Alexa AI is predicted to compete with, or even dethrone, OpenAI’s GPT-4 model in 2024.
The huge datasets behind these top models make it essentially impossible for an AI startup to compete with the tech giants’ homegrown models. Bigger AI models mean higher training costs: top-performing models cost millions of dollars to train, on top of the millions invested in the (most likely NVIDIA) hardware required. Those dedicated resources are no issue for existing tech giants, but AI startups are unlikely to ever catch up with the exponential pace of today’s largest models.
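To make that scaling concrete, here is a minimal back-of-envelope sketch in Python, using the widely cited approximation that training a dense transformer takes roughly 6 floating-point operations per parameter per training token. The token count, GPU throughput, and hourly price below are illustrative assumptions, not figures reported for Olympus.

```python
# Back-of-envelope training-compute estimate (illustrative only).
# Approximation: training a dense transformer costs ~6 * N * D FLOPs,
# where N is the parameter count and D is the number of training tokens.

def training_cost_estimate(params, tokens, flops_per_gpu_hour, dollars_per_gpu_hour):
    total_flops = 6 * params * tokens              # ~6 FLOPs per parameter per token
    gpu_hours = total_flops / flops_per_gpu_hour   # how many GPU-hours that takes
    return gpu_hours, gpu_hours * dollars_per_gpu_hour

# Hypothetical inputs: a 2-trillion-parameter model trained on 4 trillion tokens,
# on GPUs sustaining ~400 TFLOP/s (1.44e18 FLOPs per hour) at $2 per GPU-hour.
gpu_hours, dollars = training_cost_estimate(
    params=2e12, tokens=4e12,
    flops_per_gpu_hour=400e12 * 3600, dollars_per_gpu_hour=2.0,
)
print(f"~{gpu_hours:,.0f} GPU-hours, ~${dollars:,.0f}")
```

Even with these rough, assumed numbers, a two-trillion-parameter model lands in the tens of millions of dollars for compute alone, which is why only the largest companies can train at this scale.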
Is two trillion parameters a lot?
Two trillion of anything is a lot. But in all seriousness, the speculated two trillion parameters of Amazon Olympus would put it ahead of GPT-4’s rumored parameter count, and potentially ahead of the cutting-edge OpenAI model GPT-4 Turbo, which was only just announced at OpenAI DevDay.
We can think of parameters as something like the synapses in a human brain, the connections between neurons. They are the learned weights that encode relationships between different data points in the training data. Broadly speaking, the more parameters an AI model has, the more capacity it has to capture context and nuance, which is why larger models produce responses that feel closer to human intelligence.
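For a sense of where a parameter count like this comes from, here is a minimal sketch of the standard rule of thumb that a dense transformer carries roughly 12 × d_model² weights per layer, plus an embedding matrix. The example dimensions are hypothetical, not reported details of Olympus.

```python
# Rough parameter-count estimate for a dense transformer (illustrative only).
# Each layer contributes about 4*d^2 weights for attention and 8*d^2 for the
# feed-forward block (4x expansion), i.e. ~12*d^2 per layer, plus an
# embedding matrix of vocab_size * d_model.

def transformer_params(n_layers: int, d_model: int, vocab_size: int) -> int:
    per_layer = 12 * d_model ** 2
    embeddings = vocab_size * d_model
    return n_layers * per_layer + embeddings

# Hypothetical configuration that lands near the two-trillion-parameter mark.
print(f"{transformer_params(n_layers=160, d_model=32_000, vocab_size=100_000):,}")
```

In other words, a model reaches the trillion-parameter range by stacking very wide layers very deep; none of the specific dimensions above are known for Olympus.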
By comparison, the human brain contains roughly 85 to 100 billion neurons (if you’re lucky), linked by an estimated 100 trillion synapses, around 50 times the rumored parameter count of Amazon Olympus. Viewed that way, a two-trillion-parameter model is still impressively close to biological scale.
When will the Amazon Olympus LLM be released?
There is no confirmed release date for the Amazon Olympus LLM, and Rohit Prasad, who leads the project, has declined to comment. However, an announcement is anticipated in December, ahead of a probable 2024 release.
Amazon Olympus vs Titan AI
Amazon has previously trained a smaller language model called Titan. Its launch was initially delayed because it didn’t perform well compared to ChatGPT. That smaller model is now available via AWS (Amazon Web Services), and the new, much more powerful AI is expected to be released separately.
The Seattle, Washington-based tech giant has also invested $4 billion into AI safety research firm Anthropic, which also had representatives at the UK AI Safety Summit at Bletchley Park. Anthropic is responsible for the AI chatbot Claude, which runs on the Claude 2 LLM, itself more powerful than the free version of OpenAI’s ChatGPT.