Microsoft Built a Supercomputer to Power OpenAI's ChatGPT
Microsoft CEO Satya Nadella and OpenAI CEO Sam Altman at the Microsoft campus in Redmond, Wash. on July 15, 2019. (Photography by Scott Eklund/Red Box Pictures)

Microsoft and OpenAI have built a new supercomputer specifically for ChatGPT and other generative AI technologies, with plans for multiple iterations as chips improve. The supercomputer is located in Redmond, Washington, and contains over 285,000 processor cores and 10,000 graphics cards. When Microsoft first announced the system almost three years ago, the company claimed it was one of the largest supercomputer clusters in the world.

Each graphics processing unit (GPU) that powers the AI supercomputer costs roughly $10,000. Although GPUs were originally designed for rendering video games, they can perform thousands of mathematical calculations simultaneously, which makes them well suited to AI workloads.
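As a loose illustration of that "many calculations at once" idea, the NumPy sketch below runs on a CPU rather than an actual GPU, but shows the same vectorized style of arithmetic that GPUs accelerate: one batched operation instead of a per-element loop. The array size and values here are arbitrary, chosen only for demonstration.

```python
import numpy as np

# GPUs excel at applying the same arithmetic to many values at once.
# NumPy illustrates the idea on a CPU: the single expression below
# performs 100,000 multiply-add operations as one batched operation,
# rather than looping over each element individually.
weights = np.random.rand(100_000)
inputs = np.random.rand(100_000)
bias = 0.5

outputs = weights * inputs + bias  # 100,000 multiply-adds in one call

print(outputs.shape)  # (100000,)
```

On a real GPU, frameworks such as PyTorch or CUDA express the same batched operations, but execute them across thousands of hardware cores in parallel.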

Hopper, the codename for NVIDIA's successor to the A100 GPU architecture, is now shipping as the "H100." These new GPU chips are expected to power future Microsoft Azure supercomputers behind Bing AI.

Scaling ChatGPT for Bing AI: $4 billion

Financial analysts estimate Microsoft’s Bing AI chatbot, which is powered by an OpenAI ChatGPT model, needs at least $4 billion of infrastructure to serve responses to all Bing users.

Training an advanced large language model (LLM) could cost "high single-digit millions of dollars," according to Rowan Curran, a Forrester Research analyst.

The cost of these tools is likely to continue to rise as they become more powerful and more widely used. This could limit the adoption of generative AI technologies by businesses and individuals.

OpenAI's ChatGPT surpassed 100 million users in January, which analysts estimate cost the company over $40 million in infrastructure.

The NVIDIA A100 Tensor Core GPU is used by Microsoft Azure supercomputers to run elastic, resource-intensive artificial intelligence workloads. Microsoft uses tens of thousands of these chips to support OpenAI's ChatGPT services for Bing and Azure. (Source: NVIDIA)

The supercomputer is expected to improve the performance of ChatGPT, a chatbot that can generate text, translate languages, and write many kinds of creative content. ChatGPT was trained on a dataset of roughly 500 billion tokens, and the text it generates can be difficult to distinguish from human writing.

The Azure supercomputer will be used to train ChatGPT on larger amounts of data, improving its ability to generate text, translate languages, write creative content, and answer questions in an informative way. Microsoft hopes that ChatGPT will eventually be able to replace human customer service representatives.

The supercomputer is also being used to train OpenAI's other AI systems, such as Dactyl, a reinforcement-learning system that taught a robotic hand to manipulate physical objects, and DALL-E 2, which generates images from text descriptions.

Microsoft commits $10 billion to OpenAI for access to artificial intelligence

Microsoft and OpenAI have been working together since 2019, when Microsoft invested $1 billion in OpenAI. The two companies have since collaborated on a number of joint projects.

Microsoft has since extended the partnership and has now committed over $10 billion to OpenAI for ChatGPT and broader AI development.

This also marks a significant investment by Microsoft in AI research and tooling. OpenAI has released several widely used open-source AI projects, and Microsoft has been a major contributor to OpenAI's work. Building Azure supercomputers allows Microsoft to further support that work and make its AI technology more widely available.
