CEO Elon Musk has announced that 100,000 Nvidia H100 GPUs will be used to train "Grok 3," the chatbot from his artificial intelligence startup xAI. Grok 3 is expected to arrive later this year.
Business Insider reported on the 1st (local time) that Elon Musk said on X (Twitter), "Something great will come out when we complete Grok 3 training with 100,000 H100s at the end of this year."
The H100 is at the core of AI development, and the price per chip is estimated at $30,000 to $40,000. That would put the company's hardware investment in training the new model at roughly $3 billion to $4 billion.
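For reference, the estimate follows directly from the per-chip price range cited above; it is a back-of-the-envelope calculation implied by the article's figures, not a number disclosed by xAI:

```latex
% Rough hardware-cost range for 100,000 H100 GPUs
% at an estimated price of $30,000--$40,000 per chip
100{,}000 \times \$30{,}000 = \$3\text{ billion}
\qquad
100{,}000 \times \$40{,}000 = \$4\text{ billion}
```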
However, it is unclear whether the company has actually secured 100,000 H100 units; it may instead be renting computing capacity from a cloud service provider. In fact, it was reported in May that xAI had committed about $10 billion (about 14 trillion won) to renting cloud servers from Oracle.
Last month, it was also reported that CEO Musk had diverted $500 million (about 700 billion won) worth of H100 GPUs originally purchased for Tesla to xAI.
Earlier, in April, CEO Musk said about 20,000 H100 GPUs were needed to train "Grok 2," which is scheduled for release in August this year.
Meanwhile, big tech companies beyond xAI are also racing to secure GPUs.
The best-known example is Meta. CEO Mark Zuckerberg has said the company will purchase 350,000 H100 GPUs by the end of this year and plans to secure a total of 600,000 AI chips, including other chip types, an outlay put at about $18 billion.