Microsoft is developing an artificial intelligence chip to power large language models, according to a report by The Information.
According to two people with direct knowledge of the project, Microsoft has been working on the chip, internally codenamed Athena, since as early as 2019. The chips are already available to a small group of Microsoft and OpenAI employees, who are testing the technology, one of the employees said.
Image source: Pexels
Microsoft hopes the chip will outperform the chips it currently buys from other suppliers, saving it time and money on its costly AI development efforts, sources said. Other tech giants, including Amazon, Google and Facebook, are already making their own AI chips as well.
“Microsoft wants to use large language models in all of its applications, including Bing, Microsoft 365 and GitHub,” SemiAnalysis principal analyst Dylan Patel told The Information. “Large-scale deployment using off-the-shelf hardware would cost tens of billions of dollars.”
More than 300 people are already said to be working on Athena, and Microsoft expects the AI chip to be ready next year, although it’s unclear whether it will be open for use by Azure customers. It’s also unclear what this means for Microsoft’s supercomputer partnership with Nvidia.
Separately, two sources said that Microsoft’s next-generation Surface PCs will include an NPU to accelerate artificial intelligence and machine learning (ML) on the device. Microsoft currently uses processors from AMD, Intel and Qualcomm, all of which have added or will add NPUs to their chipsets. (IT House note: the Qualcomm-based Surface Pro 9 is the first Surface PC with an NPU.)