Key to achieving that goal is xAI’s Colossus supercomputer. The impressive computing powerhouse was built in Memphis, Tennessee, to train the third generation of Grok. Grok is xAI’s large language model, much like OpenAI’s ChatGPT, and it is available to premium X (formerly Twitter) subscribers.
xAI completed Colossus in just 122 days and began training its first models 19 days after installation. According to Nvidia, systems of this scale typically take many months, or even years, to build.
Much like ChatGPT, Grok’s large language models are trained by analyzing massive amounts of data, which requires vast computing power. The training data includes text, images, and other content, most of it sourced from the internet.