OpenAI says it has evidence China’s DeepSeek used its model to train competitor

The San Francisco-based ChatGPT maker told the Financial Times it had seen some evidence of ‘distillation’, a technique used by developers to obtain better performance on smaller models by using outputs from larger, more capable models. This allows them to achieve similar results on specific tasks at a much lower cost.
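The article does not describe how such distillation would be carried out. For illustration only, the sketch below shows the classic soft-label formulation of knowledge distillation (Hinton et al., 2015), in which a smaller student model is trained to match a larger teacher's softened output distribution; the function name `distillation_loss`, the temperature value, and the PyTorch framing are assumptions, not details attributed to OpenAI or DeepSeek.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Soft-label distillation loss: trains the student to match the
    teacher's softened output distribution (a standard technique; this
    is not a description of any specific company's method)."""
    # Soften both distributions with a temperature > 1 so the student
    # also learns from the teacher's relative probabilities, not just
    # its top prediction.
    soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
    log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    # KL divergence between teacher and student distributions, scaled
    # by T^2 to keep gradient magnitudes comparable across temperatures.
    return F.kl_div(log_probs, soft_targets,
                    reduction="batchmean") * temperature ** 2
```

In practice, distilling from a model behind a commercial API would typically rely on sampled text completions as training targets rather than raw logits, since closed endpoints generally do not expose internal probability distributions.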