OpenAI’s new general-purpose natural language processing model has just entered private beta, and early testers are already lavishing it with praise.
The model, known as the Generative Pre-trained Transformer 3, or simply GPT-3, is a tool that analyzes a sequence of text and expands on it to produce an original output, such as an article or a snippet of code.
The potential of this technology was recently demonstrated by Zeppelin Solutions CTO Manuel Araoz, who used a short prompt as a guideline to generate an entire article about a faux experiment on the popular Bitcointalk forum.
In the piece, titled "OpenAI's GPT-3 may be the biggest thing since bitcoin," GPT-3 produced a 746-word blog entry describing how it deceived Bitcointalk forum members into believing its comments were genuine. At several points, the text also lists possible use cases for language prediction models, noting that they could be used for "mock news, 'researched journalism', advertising, politics, and propaganda."
Aside from a handful of minor issues, including an omitted table and screenshots that were referenced but never shown, the piece is practically indistinguishable from one written by a person.
This text was generated using just a title, a handful of tags, and the short summary below:
"I share my early experiments with OpenAI's new language prediction model (GPT-3) beta. I explain why I think GPT-3 has disruptive potential comparable to that of blockchain technology."
The Argentine computer scientist also used GPT-3 to make complex texts more understandable, to write poetry in the style of Borges (in Spanish, no less), and to compose music in ABC notation, among other things. Similar results were achieved by Debuild.co founder Sharif Shameem, who got GPT-3 to write JSX code from a basic description of a website layout.
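Shameem's demo reportedly relied on the same completion mechanism, primed with a few examples before the model is asked to continue the pattern. The prompt below is a hypothetical illustration of that shape, not his actual code; it could be passed to the same Completion.create call sketched above.

```python
# Few-shot prompt: a couple of hypothetical description -> JSX pairs,
# ending with an open description for the model to complete.
prompt = (
    "description: a button that says hello\n"
    "code: <Button>hello</Button>\n\n"
    "description: a red heading that reads Welcome\n"
    "code: <h1 style={{color: 'red'}}>Welcome</h1>\n\n"
    "description: a page with a navbar and a centered signup form\n"
    "code:"
)
```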
The latest version of the Generative Pre-trained Transformer appears to blow away the capabilities of its predecessors, packing 175 billion parameters that allow the model to be turned to practically any language task. That makes GPT-3 by far the largest language model today, roughly an order of magnitude larger than Microsoft's 17-billion-parameter Turing-NLG.
Access to the GPT-3 API is currently invite-only, though there is a waitlist for the full version, and pricing has yet to be decided.