Artificial intelligence research outfit OpenAI Inc. recently made GPT-3, the latest version of its general-purpose natural language processing model, available in private beta, and its capabilities are astounding early testers. Input any text, and GPT-3 will attempt to complete it.
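To make that input-to-completion loop concrete, here is a minimal sketch using the Python client OpenAI shipped alongside the private beta. The engine name, prompt, and sampling parameters are illustrative assumptions, not details from the coverage below.

```python
# Minimal sketch of GPT-3's core mechanic: send a prompt, get a completion.
# Assumes private-beta API access; engine name and parameters are guesses.
import openai

openai.api_key = "YOUR_API_KEY"  # issued with private-beta access

response = openai.Completion.create(
    engine="davinci",   # assumed: the largest GPT-3 engine in the beta
    prompt="Once upon a time, in a datacenter far away,",
    max_tokens=64,      # how much new text to generate
    temperature=0.7,    # higher values yield more varied completions
)

# The API returns one or more candidate continuations of the prompt.
print(response["choices"][0]["text"])
```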
- In July, OpenAI, an artificial-intelligence research lab based in San Francisco, began allowing limited access to new software called GPT-3, claimed to be by far the most powerful “language model” ever created. GPT stands for “generative pre-trained transformer.”
- A language model is an artificial intelligence system that has been trained on an enormous corpus of text; with enough text and enough processing, the machine begins to learn probabilistic connections between words, so that given a passage so far it can predict which word is likely to come next. The practical upshot is that GPT-3 can read and write strikingly well (a toy sketch of this next-word idea appears after this list).
- GPT-3 is a machine learning system that has been fed 45TB of text data, an unprecedented amount. Training allows it to generate written content: stories, code, legal jargon, all based on just a few input words or sentences. And the beta test has already produced some jaw-dropping results.
- However, after some initially promising results, GPT-3 is facing more scrutiny. The model drew criticism recently when Facebook’s AI head Jerome Pesenti called out biased output from a program built with GPT-3. The program was a tweet generator: anyone could type in a word, and the AI would come up with a relevant sentence of 280 characters or fewer.
- But these issues are expected to be rectified soon. Once it is officially launched, GPT-3 could be enormously useful, and it is expected to change the way a lot of things are done today.
- Machines that can understand and respond to humans in our own language could create more helpful digital assistants, more realistic video game characters, or virtual teachers personalized to every student’s learning style.
- GPT-3’s flexibility is a big advantage. Matt Shumer, the chief executive of a company called OthersideAI, is using GPT-3 to build a service that responds to email on your behalf: you write the gist of what you’d like to say, and the computer turns your bullet points into a full, nuanced, polite email (a sketch of this pattern appears after this list).
- From a single sentence, or even a few words, it can generate five full, well-written paragraphs. It unleashes a lot of creativity.
- GPT-3 has 175 billion parameters and was trained on text from across the internet, including Google Books, Wikipedia, and coding tutorials.
- GPT-3 is the third generation of OpenAI’s Generative Pre-trained Transformer, a general-purpose language algorithm that uses machine learning to translate text, answer questions, and predictively write text. It works by analyzing a sequence of words, text or other data, then expanding on those examples to produce entirely original output such as an article or a story (a few-shot prompting sketch appears after this list).
- After originally publishing its GPT-3 research in May, OpenAI gave select members of the public access to the model last week via an API. And over the past few days, a number of samples of text generated by GPT-3 have begun circulating widely on social media.
- A company called Latitude is using GPT-3 to build realistic, interactive characters in text-adventure games. It works surprisingly well: the software is not only coherent but can also be quite inventive, absurd, and even funny (a sketch of such an interactive character loop appears after this list).
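The “probabilistic connections between words” mentioned above can be illustrated with a toy next-word model. The sketch below merely counts which word follows which in a tiny, made-up corpus; GPT-3 learns vastly richer statistics with a neural network, but the core move of predicting a plausible next word is the same.

```python
# Toy illustration of learning probabilistic connections between words:
# count successors in a corpus, then sample the next word from the counts.
import random
from collections import Counter, defaultdict

corpus = "the cat sat on the mat and the cat slept on the mat".split()

# next_words["the"] counts every word observed immediately after "the".
next_words = defaultdict(Counter)
for current, following in zip(corpus, corpus[1:]):
    next_words[current][following] += 1

def sample_next(word):
    """Pick a successor of `word`, weighted by observed frequency."""
    counts = next_words[word]
    return random.choices(list(counts), weights=list(counts.values()))[0]

# Generate a short continuation, one probable word at a time.
word, output = "the", ["the"]
for _ in range(8):
    word = sample_next(word)
    output.append(word)
print(" ".join(output))
```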
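The OthersideAI-style service described above can be approximated with careful prompt construction alone. In this hedged sketch, the prompt wording, the bullet points, and the parameters are all assumptions made for illustration.

```python
# Sketch of a bullet-points-to-email workflow: build a prompt from the
# user's notes and let GPT-3 draft the full message. All values assumed.
import openai

openai.api_key = "YOUR_API_KEY"

bullets = [
    "can't make Thursday's meeting",
    "propose moving it to Friday morning",
    "ask for the latest deck beforehand",
]

prompt = (
    "Turn the following bullet points into a full, nuanced, polite email.\n\n"
    "Bullet points:\n"
    + "\n".join(f"- {b}" for b in bullets)
    + "\n\nEmail:\n"
)

response = openai.Completion.create(
    engine="davinci",
    prompt=prompt,
    max_tokens=200,
    temperature=0.6,
)
print(response["choices"][0]["text"].strip())
```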
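The translation and question-answering abilities mentioned above come out of the same next-word machinery, steered by “few-shot” prompts: show the model a handful of examples of a task and it continues the pattern, as described in OpenAI’s GPT-3 paper. The examples and parameters below are assumptions for illustration.

```python
# Few-shot prompting sketch: two worked examples teach the task in-context,
# and GPT-3 completes the third. No task-specific training involved.
import openai

openai.api_key = "YOUR_API_KEY"

prompt = (
    "Translate English to French.\n\n"
    "English: cheese\nFrench: fromage\n\n"
    "English: good morning\nFrench: bonjour\n\n"
    "English: where is the library?\nFrench:"
)

response = openai.Completion.create(
    engine="davinci",
    prompt=prompt,
    max_tokens=20,
    temperature=0.0,  # keep the output as deterministic as possible
    stop=["\n"],      # stop at the end of the translated line
)
print(response["choices"][0]["text"].strip())
```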
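Finally, a Latitude-style interactive character can be sketched as a loop that keeps a running transcript and lets GPT-3 write the character’s next line. The persona, stop sequence, and parameters here are assumptions, not details of Latitude’s actual implementation.

```python
# Sketch of an interactive game character: append each player line to a
# transcript and have GPT-3 continue it in the character's voice.
import openai

openai.api_key = "YOUR_API_KEY"

transcript = (
    "You are talking to Barnaby, a cheerful innkeeper in a fantasy "
    "text-adventure game.\n\n"
)

while True:
    player = input("You: ")
    if not player:
        break  # empty input ends the conversation
    transcript += f"You: {player}\nBarnaby:"
    response = openai.Completion.create(
        engine="davinci",
        prompt=transcript,
        max_tokens=60,
        temperature=0.8,  # a little randomness keeps the character inventive
        stop=["You:"],    # keep the model from writing the player's lines
    )
    reply = response["choices"][0]["text"].strip()
    print(f"Barnaby: {reply}")
    transcript += f" {reply}\n"
```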