News
May 10, 2024

Microsoft's MAI-1: 500B Parameters Set Against OpenAI?

Microsoft is building a massive AI model called MAI-1 to compete with the likes of OpenAI's ChatGPT, potentially shaking up the race for the most powerful language technology.

Microsoft Enters the Ring with MAI-1

According to reports, Microsoft is developing its own artificial intelligence model, independent of OpenAI. This presents an opportunity for CEO Satya Nadella to demonstrate that his company can succeed in the AI competition without relying on the maker of ChatGPT.

MAI-1 is said to be another monstrous 500-billion-parameter language model. On paper, this powerhouse should outclass smaller models like Llama 3 and Mixtral 8x22B, which have 70 billion and roughly 39 billion active parameters, respectively. The development is overseen by Microsoft AI CEO Mustafa Suleyman, co-founder of DeepMind and Inflection AI.

The development of MAI-1 could ease concerns that Microsoft leans too heavily on its ongoing collaboration with OpenAI, particularly as the race to build superior generative AI technologies intensifies.

Real-world applications

Such a model would most likely power Microsoft's in-house efforts around Copilot and AI features in Windows. Microsoft's entry intensifies competition, spurring internal development and pushing both companies to innovate faster.

A Collision Course: Microsoft vs. OpenAI?

So, does this model race mean OpenAI is under more pressure? Well, Microsoft CTO Kevin Scott says no; it's business as usual: "We build big supercomputers to train AI models; our partner OpenAI uses these supercomputers to train frontier-defining models; and then we both make these models available in products and services so that lots of people can benefit from them. … There's no end in sight to the increasing impact that our work together will have."

A Chance at First Place

Although Microsoft recently released Phi-3, a highly efficient small language model, the top spot will be decided among large language models. With powerful new competitors like Claude 3 Opus entering the ranks, it makes sense to build MAI-1, which could become Microsoft's in-house flagship AI model.

The Future of MAI-1

According to a source cited by The Information, the specific purpose of MAI-1 has yet to be determined, even within Microsoft; its optimal use will depend on how well it performs. To train the model, Microsoft has been using a large cluster of servers equipped with Nvidia GPUs. It has also been gathering training data from various sources, including text generated by OpenAI's GPT-4 and publicly available internet data.

Additional details about MAI-1 might be disclosed at the upcoming Microsoft Build conference, which is scheduled to be held in Seattle from May 21 to 23.

Footnotes

Try the GPT-4 API today with the AI/ML API Playground.
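
As a minimal sketch of what such a call can look like, here is one way to reach a GPT-4-class model through an OpenAI-compatible client. The base URL and the model identifier below are assumptions for illustration; substitute the values shown in your own AI/ML API dashboard.

```python
# Minimal sketch: querying a GPT-4-class model via an OpenAI-compatible endpoint.
# The base_url and model name are assumptions; replace them with the values
# from your AI/ML API account.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.aimlapi.com/v1",  # assumed endpoint
    api_key="YOUR_API_KEY",                 # replace with your own key
)

response = client.chat.completions.create(
    model="gpt-4o",  # assumed model identifier; check the Playground for the exact name
    messages=[
        {"role": "user", "content": "Summarize what is known about Microsoft's MAI-1."},
    ],
)

print(response.choices[0].message.content)
```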

We're excited to see what amazing projects you will bring to life. Happy coding!

Written by Sergey Nuzhnyy

Get API Key