
Elon Musk reveals xAI relied on OpenAI models “partly” to train Grok

Elon Musk confirmed in court that xAI partly used OpenAI models to train Grok, shedding light on how AI systems are being developed.

Edited By: Shubham Arora | Published By: Shubham Arora | Published: May 01, 2026, 05:10 PM (IST)


The ongoing legal battle between Elon Musk and Sam Altman has already brought out several details about how AI companies operate. One of the more notable moments came during Musk’s testimony, where he was asked about how xAI trained its chatbot Grok.

During questioning in a California federal court, Musk acknowledged that xAI has used models from OpenAI as part of its training process. When asked directly, he did not give a straightforward answer at first, saying that such practices are common across the industry. When pressed further, he confirmed it had been done “partly.”

This is one of the few instances where a company has openly spoken about using a competitor’s AI systems in this way, especially during an active court case.

What the admission is about

The discussion in court was centred around a concept known as model distillation. This involves using one AI model to help train another. In simple terms, a more advanced system can be used as a reference to improve the responses of a newer model.
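The idea can be illustrated with a minimal sketch. The example below is purely hypothetical and is not xAI’s or OpenAI’s actual pipeline: a “student” model adjusts its output distribution over a few next-token candidates to match a “teacher” model’s soft probabilities, by gradient descent on the KL divergence between the two. All numbers and names here are invented for illustration.

```python
import math

def softmax(logits):
    """Convert raw scores into a probability distribution."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kl_divergence(p, q):
    """KL(p || q): how far the student's distribution q is from the teacher's p."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical teacher output over 3 next-token candidates for some prompt.
teacher_probs = [0.7, 0.2, 0.1]

# The student starts with uninformed (uniform) logits. The gradient of
# KL(p || softmax(z)) with respect to the student logits z is (q - p),
# so each step nudges the student's distribution toward the teacher's.
student_logits = [0.0, 0.0, 0.0]
learning_rate = 0.5
for _ in range(200):
    q = softmax(student_logits)
    grad = [qi - pi for qi, pi in zip(q, teacher_probs)]
    student_logits = [z - learning_rate * g for z, g in zip(student_logits, grad)]

# After training, the student's distribution approaches the teacher's.
print([round(p, 2) for p in softmax(student_logits)])
```

In practice this matching is done at scale across many prompts, often using a teacher’s API outputs as the soft targets, which is why providers restrict large-scale querying in their terms of service.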

Musk explained the idea in general terms, saying that companies often use other AI systems to validate or improve their own models. When asked if this applied to xAI and OpenAI, he agreed to an extent.

The statement suggests that Grok was not built entirely from scratch and may have taken cues from existing AI systems. It also reflects a broader shift in the field, where companies lean on existing models to improve their own faster rather than starting everything from zero.

Why this is being discussed now

The admission surfaced during the ongoing case between Musk and OpenAI, in which both sides are already arguing over how the company’s core values have changed over time. While the case itself is focused on governance and control, the testimony is also revealing how AI development works behind the scenes.

At the same time, concerns around model distillation have been increasing across the industry. Companies like OpenAI and Anthropic have already raised questions about how their systems are being used by others, especially when it involves large-scale querying to replicate behaviour.

There have also been broader discussions around whether this kind of usage falls within acceptable limits or crosses into violations of platform rules.

The legal and technical grey area

Model distillation is not a new concept, and it is often used within companies to create smaller or more efficient versions of their own models. The issue becomes more complex when it involves using systems developed by competitors.

In many cases, this does not clearly fall under existing laws but may go against the terms under which these AI tools are made available. That is why companies have started putting more restrictions on how their models can be accessed or used at scale.

Musk’s statement adds to this conversation, especially because it confirms that such practices are not limited to one region or a specific group of companies.

Where xAI stands in the AI space

During the same testimony, Musk also spoke about how he sees xAI compared to other players. He described it as a smaller company with a limited team, especially when compared to larger AI labs.


He also shared his view on the current landscape, placing Anthropic ahead, followed by OpenAI and Google. This gives some context to why xAI may be using available tools and techniques to build its own systems faster.