Reluctant to depend on Nvidia, Microsoft will launch its first AI chip next month
TEMPO.CO, Jakarta – Microsoft is reportedly set to launch its first self-produced artificial intelligence (AI) chip next month, according to tech publication The Information. The move is intended to reduce the company's reliance on Nvidia graphics processing unit (GPU) chips, which are in high demand but limited supply.
Microsoft’s AI chip, called Athena, is designed for data center servers. The chip “is expected to compete with Nvidia’s flagship H100 GPU, currently used by Microsoft and other cloud providers to support large language models (LLMs) and other AI applications,” as reported by Gizmochina on October 8, 2023.
News about Athena first emerged in April 2023. The chip is expected to be introduced at the Microsoft Ignite conference, taking place November 14-17, 2023.
The development of Athena comes at a time when demand for AI chips is on the rise. Large language models (LLMs), in particular, require enormous computing power to train and run. The result is a shortage of AI chips that has driven up prices.
Microsoft-backed OpenAI is also reportedly exploring the possibility of developing its own AI chips. By doing so, OpenAI hopes to reduce its dependence on Nvidia and other chipmakers.
An in-house chip could also help Microsoft reduce costs and improve the performance of its cloud services. Microsoft continues to roll out AI-powered features across Bing, Office apps, GitHub, and elsewhere.
Tech giants like Google and Amazon are also reportedly developing their own AI chips. Beyond that, many other companies still rely on Nvidia chips to power the latest large language models, a sign that the AI chip market is set to grow rapidly.
Quoted from multiple sources