At 175 billion parameters, it's the largest model of its kind. And with a memory footprint exceeding 350GB, it's also one of the priciest, costing an estimated $12 million to train. Fortunately for competitors, experts believe that while GPT-3 and similarly large systems are impressive in their performance, they don't move research forward. Rather, they're prestige projects that simply demonstrate the scalability of existing techniques.