GPT-3 could change the web dramatically. It's a language AI that can generate complete articles with one click; the only input it needs is a short beginning of the article. GPT-3 stands for "Generative Pre-trained Transformer 3". It's the third version of GPT and was released in 2020. With 175 billion parameters, the neural network behind GPT-3 is very powerful.
The results are astonishingly accurate in many languages. This is also one reason why access to GPT-3 was initially limited: OpenAI, the company behind GPT-3, was concerned that the system could be exploited, for example to generate massive amounts of fake news or spam. In the meantime, however, OpenAI has implemented safeguards that should prevent this. So let's take a closer look at GPT-3:
How you can use GPT-3
Registration for the GPT-3 API is now open, with no waiting list. You only have to agree to the usage policies, which state that you won't use GPT-3 for fraud or political influence. Access is also denied to dictators 😉
Get started here: https://openai.com/
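Once you have an API key, generating a text takes only a few lines of Python. Here is a minimal sketch using the official `openai` package; the model name, prompt, and parameter values are just illustrative examples, so check OpenAI's documentation for the current options:

```python
# A minimal sketch of a GPT-3 completion request (pip install openai).
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]  # your secret API key

# GPT-3 only needs a short beginning of an article and continues from it.
response = openai.Completion.create(
    engine="davinci",      # illustrative: the largest GPT-3 model
    prompt="The history of the World Wide Web began in 1989, when",
    max_tokens=200,        # upper bound on the length of the continuation
    temperature=0.7,       # higher values -> more creative, less predictable text
)

print(response.choices[0].text)
```

The `prompt` is the "short beginning" mentioned above; everything after it is generated by the model.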
Payment
You pay for API access per 1,000 tokens; a token is roughly a word or a punctuation mark. The price ranges from 0.08 to 6 US cents per 1,000 tokens, depending on the model.
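To get a feeling for these numbers, here is a rough cost estimate in Python. The prices are the ones quoted above (0.08 US cents for the smallest model, 6 US cents for the largest, per 1,000 tokens) and may have changed since; see OpenAI's pricing page for current rates:

```python
# Approximate prices in US dollars per 1,000 tokens (from the text above).
PRICE_PER_1K_TOKENS = {
    "smallest": 0.0008,  # 0.08 US cents
    "largest": 0.06,     # 6 US cents
}

def estimate_cost(tokens: int, model: str) -> float:
    """Return the approximate cost in US dollars for a given token count."""
    return tokens / 1000 * PRICE_PER_1K_TOKENS[model]

# Example: a short article of about 1,000 tokens.
print(f"${estimate_cost(1000, 'largest'):.4f}")   # -> $0.0600
print(f"${estimate_cost(1000, 'smallest'):.4f}")  # -> $0.0008
```

Even with the largest model, a complete article costs only a few cents.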
Challenges of the texts created with GPT-3
As I said, it is very easy to generate new texts automatically. The big problem: you may get facts, but also fictional passages. It's not like a journalist writing your content; it's more like a novelist writing a story for you. The text reads as if it was written by a human, but its content doesn't have to be true. There is also no explanation of why or how GPT-3 arrives at a result. So you can't blindly trust GPT-3-generated texts.
What's also important to know: even if GPT-3 reflects real facts, the foundation for these might be outdated. The reason is simple: the training data that forms the basis of the language model grows more outdated month by month. The world is changing, so the model must also be adapted from time to time. But every new training run costs a lot of resources and therefore money (approximately 5 million dollars). So if the last training run was already a few months ago, new events, people, or facts can't be reflected by the model.
GPT-3 alternatives
Of course, GPT-3 is a very powerful model, but it's not the only one. There are a few alternatives, like Gopher from DeepMind, which has 280 billion parameters. Microsoft's Megatron-Turing NLG has almost twice as many, with 530 billion parameters. But even that doesn't come close to the Chinese language model Wu Dao 2.0 from Beijing: with 1,750 billion parameters, it is ten times larger than GPT-3. Wu Dao 2.0 is also capable of processing text and images simultaneously.