Amazon is “investing millions in training a large, ambitious language model,” Reuters reports, “hoping it can compete with top models from OpenAI and Alphabet, two people familiar with the matter told Reuters.”
The model, named “Olympus,” has 2 trillion parameters, the sources said, which could make it one of the largest models currently being trained. OpenAI’s GPT-4 model, one of the best models available, is said to have a trillion parameters…
The team is led by Rohit Prasad, former head of Alexa, who now reports directly to CEO Andy Jassy. “Amazon believes that having its own models could make its offerings more attractive on AWS, where enterprise customers want access to the highest-performing models,” people familiar with the matter said, adding that there was no specific timetable for the release of the new model.
“While the number of parameters does not automatically mean that Olympus will outperform GPT-4, it is a safe bet that it will, at a minimum, be very competitive with its OpenAI rival, as well as Google’s nascent AI projects,” says a financial editor at the Motley Fool.
Amazon could have a key advantage over its competitors, one that CEO Andy Jassy discussed in the company’s third-quarter earnings call. Jassy said: “Customers want to integrate models with their data, not the other way around. And much of this data resides in AWS (Amazon Web Services), as the undisputed leader in the cloud infrastructure market segment….”
Amazon will likely exploit Olympus in other ways as well. For example, the company could make its generative AI coding companion CodeWhisperer more powerful. Jassy noted on the Q3 call that all of Amazon’s “significant businesses are working on generative AI applications to transform their customer experiences.” Olympus could make these initiatives even more transformative.
The Motley Fool editor also points out that Amazon’s profits more than tripled in the third quarter of 2023 compared to 2022.
And Amazon’s stock price has already jumped more than 40% in 2023.