OpenAI says it has evidence China’s DeepSeek used its model to train competitor

The San Francisco-based ChatGPT maker told the Financial Times it had seen some evidence of ‘distillation’, a technique developers use to obtain better performance from smaller models by training them on the outputs of larger, more capable models. This allows them to achieve similar results on specific tasks at a much lower cost.
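For readers unfamiliar with the technique, the core of distillation can be sketched as follows: a smaller "student" model is trained to match the softened output distribution of a larger "teacher" model rather than only the hard labels. This is a minimal illustrative sketch using NumPy, not the method any particular lab uses; the temperature value and logits are hypothetical.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T produces a softer
    # distribution that exposes the teacher's relative preferences.
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(teacher_logits, student_logits, T=2.0):
    # Cross-entropy between the teacher's softened distribution and
    # the student's: the standard distillation training objective.
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    return -np.sum(p_teacher * np.log(p_student + 1e-12))

# A student whose logits track the teacher's incurs a lower loss,
# so minimizing this loss pulls the student toward the teacher.
teacher = [4.0, 1.0, 0.5]      # hypothetical teacher logits
close_student = [3.5, 1.2, 0.4]
far_student = [0.2, 3.0, 2.5]
print(distillation_loss(teacher, close_student)
      < distillation_loss(teacher, far_student))  # True
```

In practice the student is trained by gradient descent on this loss (often combined with a standard label loss), but the objective above is the essential ingredient.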
