GPT-3 is the most powerful language model ever built. This is due more than anything to its size: the model has a whopping 175 billion parameters. To put that figure into perspective, its predecessor GPT-2—which was considered state-of-the-art and shockingly large when it was released last year—had 1.5 billion parameters.
There is no question that GPT-3 is an impressive technical achievement. It has significantly advanced the state of the art in natural language processing, and its uncanny ability to generate language in all sorts of styles will unlock exciting applications for entrepreneurs and tinkerers. Yet a realistic view of GPT-3's limitations is important if we are to make the most of the model. In a welcome dose of realism, OpenAI CEO Sam Altman made the same point earlier today on Twitter: "The GPT-3 hype is way too much....AI is going to change the world, but GPT-3 is just a very early glimpse."
https://www.forbes.com/sites/robtoews/2020/07/19/gpt-3-is-amazingand-overhyped/#7a274cc81b1c