The San Francisco-based ChatGPT maker told the Financial Times it had seen some evidence of “distillation”, which it suspects to be from DeepSeek. The technique is used by developers to get better performance from smaller models by training them on the outputs of larger, more capable ones, allowing them to achieve similar results on specific tasks at a much lower cost.
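For readers unfamiliar with the term, the sketch below illustrates the classic form of knowledge distillation: a small “student” model is trained to match the softened output distribution of a larger “teacher”. This is a generic, hypothetical example, not a description of how OpenAI, DeepSeek, or any particular lab works; in an API setting, distillation more often means fine-tuning a smaller model on text generated by the larger one. All model sizes, names, and hyperparameters here are placeholders.

```python
# Minimal knowledge-distillation sketch (illustrative only).
# A small student learns to imitate a larger teacher's output distribution.
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Hypothetical stand-ins: a large "teacher" classifier and a much smaller "student".
teacher = nn.Sequential(nn.Linear(32, 256), nn.ReLU(), nn.Linear(256, 10))
student = nn.Sequential(nn.Linear(32, 16), nn.ReLU(), nn.Linear(16, 10))

optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
temperature = 2.0  # softens the teacher's distribution so it carries more signal

for step in range(100):
    x = torch.randn(64, 32)  # placeholder inputs; real use would draw from task data

    with torch.no_grad():
        teacher_logits = teacher(x)  # the "outputs from the larger model"

    student_logits = student(x)

    # KL divergence between the softened teacher and student distributions.
    loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

The temperature term is a common design choice: dividing the logits before the softmax spreads probability mass across classes, so the student learns from the teacher’s full ranking of answers rather than only its top prediction.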