GPT-3 OpenAI: 3 demos that will make you rethink AI capabilities

OpenAI GPT-3 is the 3rd generation of OpenAI's Generative Pre-trained Transformer models. During training it was fed a large portion of the content available on the internet. It's one of the largest neural networks ever trained, with 175 billion parameters.

But what can it do with all this data and computational power? Let me show you 3 demos that will make you rethink AI capabilities.

 

The core capability of the GPT-3 OpenAI model series is to "complete" your input prompt: given some starting text, the model tries to guess how the text should continue. This is also the main difference from most other natural language processing (NLP) services, which are designed for a single task, such as sentiment classification or named entity recognition. The completions endpoint, instead, can be used for virtually any task, including content or code generation, summarization, conversation or creative writing.
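
To make this concrete, here is a minimal sketch of a call to the completions endpoint using the openai Python package (v0.x, the version available at the time of writing); the prompt text and parameter values are illustrative, not the exact ones used in the demos below.

import os
import openai

# The API key comes from your OpenAI account (free grant available, see the link at the end)
openai.api_key = os.environ["OPENAI_API_KEY"]

response = openai.Completion.create(
    model="text-davinci-002",   # GPT-3 model used for the first two demos
    prompt="Write a short poem about the sea.",
    max_tokens=128,             # upper bound on the length of the completion
    temperature=0.7,            # higher values make the output more varied
)

# The model's continuation of the prompt
print(response.choices[0].text)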

 

1. Creative writing: can OpenAI write a sonnet?

Let's start by asking the model to write an original sonnet in a style similar to Shakespeare's sonnets.

[Screenshot: the input prompt followed by the sonnet generated by the model, highlighted in green]

The very first line of text in the screenshot is my input request, while the green highlighted text is the output from the model. And what an impressive output! It's well structured as a poem, and it's hard to believe it was not written by a human.

 

2. Language translation: can OpenAI translate from English to Italian?

Let's now challenge the model with a new task: take the first stanza of the poem it just generated and ask it to translate it into Italian (only because I am a native Italian speaker; pick your own language for your tests!).

[Screenshot: the translation prompt and the Italian translation generated by the model]

 

Another awesome result! The translation is very accurate, and it also mirrors the spacing and punctuation of the original text.
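
The exact wording of my prompt is the one visible in the screenshot; expressed through the API, a translation request is just another completion. A hedged sketch, assuming the same setup as the earlier snippet:

import openai  # API key configured as in the first snippet

first_stanza = "..."  # paste here the first stanza generated in the previous demo

prompt = (
    "Translate the following English text into Italian, "
    "keeping the original line breaks and punctuation:\n\n"
    + first_stanza
)

response = openai.Completion.create(
    model="text-davinci-002",
    prompt=prompt,
    max_tokens=256,
    temperature=0,   # translation benefits from deterministic output
)
print(response.choices[0].text)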

 

3. Software development: can OpenAI write a Python function?  

I truly believe that being a developer implies some sort of creativity. That's why the next test asks OpenAI to write some Python code: first, define a dictionary of 3 cities (with their corresponding locations), and then a function that loops over the dictionary and counts how many locations contain a 'u' character.

 

[Screenshot: the comment lines used as input and the Python code generated by the model]

The comment lines were written by me as input, while the Python code is the model's output. Another great result! We got two chunks of correct, clean Python code, starting from natural language.
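
The model's exact output is visible only in the screenshot; as a point of reference, a plausible reconstruction of the two chunks it produced (city names and locations here are illustrative, not the ones from my demo) could look like this:

# Define a dictionary of 3 cities with their corresponding locations
cities = {
    "Rome": "Italy",
    "Budapest": "Hungary",
    "Lisbon": "Portugal",
}

# Loop over the dictionary and count how many locations contain a 'u' character
def count_locations_with_u(city_dict):
    count = 0
    for location in city_dict.values():
        if "u" in location:
            count += 1
    return count

print(count_locations_with_u(cities))  # 2 for the sample data above ("Hungary" and "Portugal")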

 

For the first 2 demos I used the "text-davinci" model, which is the most capable model of the GPT-3 series. For the third demo I used the "code-davinci" model, which is the most capable model of the Codex series, GPT-3's successor trained on GitHub data. In both cases I didn't customize the models with any domain data.
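
On the API side, switching to Codex only means changing the model name. A minimal sketch, assuming the same openai setup as before and code-davinci-002 as the Codex model name available at the time of writing:

import openai  # API key configured as in the first snippet

response = openai.Completion.create(
    model="code-davinci-002",   # Codex model, trained on GitHub code
    prompt="# Define a dictionary of 3 cities and their locations\n",
    max_tokens=200,
    temperature=0,   # deterministic output suits code generation
)
print(response.choices[0].text)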

 

Whether you are still skeptical or these examples have intrigued you, test it yourself by signing up for a free grant at Overview - OpenAI API. Be aware that, even using the same prompts, you may obtain slightly different results than mine, since the model generates different text every time.

 

OpenAI GPT-3 models introduced a new way to interact with deep learning models: any task can be expressed in terms of natural language descriptions, requests, and examples, and you refine the input prompt until the model "understands" and meta-learns the new task from the high-level abstractions it acquired during pretraining. Designing the prompt becomes, in a certain sense, a new programming paradigm.
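
To illustrate that idea, a task can be "programmed" purely through the prompt, for example by showing the model a couple of input/output pairs before the new input. A sketch with illustrative wording, not taken from the demos above:

# A few-shot prompt: the task is defined by examples instead of code
prompt = """Extract the city from each sentence.

Sentence: I moved to Rome last year.
City: Rome

Sentence: She studied in Lisbon before starting her job.
City: Lisbon

Sentence: The conference will be held in Budapest this autumn.
City:"""

# Sent to the completions endpoint, the model would typically continue with "Budapest",
# generalizing the pattern from the two examples.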
