What is GPT-3?

Short & Sweet AI - A podcast by Dr. Peper

Some have called it the most important and useful advance in AI in years. Others call it crazy accurate AI. GPT-3 is a new tool from the AI research lab OpenAI. It was designed to generate natural language by analyzing thousands of books, Wikipedia entries, social media posts, blogs, and anything in between on the internet. It's the largest artificial neural network ever created. In this episode of Short and Sweet AI, I talk in more detail about how GPT-3 works and what it's used for.

In this episode, find out:
- What GPT-3 is
- How GPT-3 can generate sentences independently
- What supervised vs. unsupervised learning is
- How GPT-3 shocked developers by creating computer code
- Where GPT-3 falls short

Important Links and Mentions:
- Meet GPT-3. It Has Learned to Code (and Blog and Argue)
- GPT-3 Creative Fiction
- Did a Person Write This Headline, or a Machine?

Resources:
- Disruption Theory - GPT-3 Demo: New AI Algorithm Changes How We Interact with Technology
- Forbes - What Is GPT-3 And Why Is It Revolutionizing Artificial Intelligence?

Episode Transcript:

Today I'm talking about a breathtaking breakthrough in AI which you need to know about.

Some have called it the most important and useful advance in AI in years. Others call it crazy accurate AI. It's called GPT-3, which stands for Generative Pre-trained Transformer 3, meaning it's the third version to be released. One developer said, "Playing with GPT-3 feels like seeing the future."

Another Mind-Blowing Tool from OpenAI

GPT-3 is a new AI tool from an artificial intelligence research lab called OpenAI. This neural network has learned to generate natural language by analyzing thousands of digital books, Wikipedia in its entirety, and a trillion words found on social media, blogs, news articles, anything and everything on the internet. A trillion words. Essentially, it's the largest artificial neural network ever created. And with language models, size really does matter.

It's a Language Predictor

GPT-3 can answer questions, write essays, summarize long texts, translate languages, and take memos; basically, it can create anything that has a language structure. How does it do this? Well, it's a language predictor: if you give it one piece of language, its algorithms are designed to predict what the most useful piece of language to follow it should be.

Machine learning neural networks study words, their meanings, and how those meanings change depending on the other words used in the text. The machine analyzes words to understand language. Then it generates sentences by taking words and sentences apart and rebuilding them itself.

Supervised vs. Unsupervised Machine Learning

GPT-3 is a form of machine learning called unsupervised learning. It's unsupervised because the training data is not labelled as a right or wrong response, so it's free from the limits imposed by using labelled data. This means unsupervised learning can detect all kinds of unknown patterns. The machine works on its own to discover...
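
To make the "predict what comes next" idea concrete, here is a minimal, purely illustrative sketch in Python. It is not OpenAI's code and is far simpler than GPT-3's transformer architecture: it just counts which word tends to follow which in a tiny made-up corpus, then predicts the most frequent follower. The corpus text and function names are invented for this example.

```python
from collections import Counter, defaultdict

# Tiny made-up corpus for illustration only.
corpus = (
    "the cat sat on the mat . "
    "the dog sat on the rug . "
    "the cat chased the dog ."
).split()

# Count which word follows which: a crude stand-in for "learning" what
# piece of language usually comes next.
follower_counts = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    follower_counts[current_word][next_word] += 1

def predict_next(word):
    """Return the most frequently observed word following `word`."""
    if word not in follower_counts:
        return None
    return follower_counts[word].most_common(1)[0][0]

print(predict_next("the"))  # -> "cat" (seen most often after "the")
print(predict_next("sat"))  # -> "on"
```

GPT-3 replaces this frequency table with a huge neural network trained on roughly a trillion words, so its predictions generalize far beyond phrases it has literally seen, but the core task is the same: given some text, predict what should come next.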
