What is GPT-3? | Tinkered Thinking

Check out the Episode Page & Show Notes

Key Takeaways

  • GPT-3 is a new tool from OpenAI. The acronym stands for Generative Pre-trained Transformer (the 3 denotes the third generation of the model).
    • “We’ve probably reached an inflection point in the progress of artificial intelligence” – Tinkered Thinker
  • “The ramifications for how this advance might rattle through society seem ubiquitous.  Language is the fabric of society.  Our use and misuse of language dictates the rise and fall of all our endeavors, and a computer just got really really good at imitating our language.”
  • GPT-3 was trained on text from the internet, far more text than any single human being could ever read
    • “Think of every comment, every post, every description, every pdf book freely available, all the tweets and blogs, manuals, dissertations, threads, rantings – all of it. That was the block of text that was given to the neural net for its training.” – Tinkered Thinker

What Is GPT-3?

  • GPT-3 is a new tool from OpenAI. The acronym stands for Generative Pre-trained Transformer (the 3 denotes the third generation of the model).
    • It’s a program that accepts some text and then generates a continuation of this text (see the sketch after this list)
      • “We’ve probably reached an inflection point in the progress of artificial intelligence” – Tinkered Thinker
  • GPT-3 mimics the style, tone and character of whatever text you feed it
    • GPT-3’s contributions get better and better as the entire context of the story grows
  • “GPT-3 was trained to generate text using a computational model that bears a lot of similarity to our jumble of neurons we call a brain” – Tinkered Thinker
    • This computational model is referred to as a ‘neural net’
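To make “accepts some text and generates a continuation” concrete, here is a minimal sketch of how GPT-3 was typically called through the original (pre-1.0) openai Python client. The API key, engine name, and prompt are placeholders, and newer versions of the client rename these calls, so read this as an illustration of the pattern rather than current copy-paste-ready code.

```python
# Minimal sketch: send a prompt to GPT-3 and print the generated continuation.
# Assumes the original (pre-1.0) openai Python client; newer releases rename
# these calls, and the API key / "davinci" engine are placeholders.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

prompt = "Once upon a time, a curious tinkerer asked a machine to finish this sentence:"

response = openai.Completion.create(
    engine="davinci",   # the largest GPT-3 engine available at launch
    prompt=prompt,      # the text GPT-3 will continue
    max_tokens=60,      # how much continuation to generate
    temperature=0.7,    # higher = more varied, lower = more predictable
)

# The model returns one or more candidate continuations; print the first.
print(prompt + response.choices[0].text)
```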

How Does GPT-3 Work?

  • GPT-3 was trained on text from the internet, far more text than any single human being could ever read
    • “Think of every comment, every post, every description, every pdf book freely available, all the tweets and blogs, manuals, dissertations, threads, rantings – all of it. That was the block of text that was given to the neural net for its training.” – Tinkered Thinker
  • GPT-3 has become remarkably good at answering one question: what word would make sense to come next?
    • The training of the neural net essentially asked this question over and over. Imagine the neural net reading 99 words and guessing the 100th word, then imagine it doing this an astronomical number of times. After a while, it gets pretty good at this game (see the toy sketch after this list).
  • “GPT-3 doesn’t actually know any words…Through all of this weighted calibration using embedded language, GPT-3 has ‘learned’ the subtle rules that dictate how we humans pick our words in different contexts.” – Tinkered Thinker
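The “guess the next word” game described above can be imitated in miniature. The toy below is not how GPT-3 actually works internally (GPT-3 adjusts billions of neural-net weights, while this just counts which word follows which in a tiny corpus), but it shows the shape of the training objective: read some context, guess the next word, check the answer.

```python
# Toy version of the next-word guessing game that GPT-3's training is built on.
# A real model learns billions of weights; this only counts word pairs,
# but the objective -- predict the next word from context -- is the same idea.
from collections import Counter, defaultdict

corpus = (
    "language is the fabric of society and our use of language "
    "dictates the rise and fall of all our endeavors"
).split()

# Count which word tends to follow each word in the corpus.
next_word_counts = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    next_word_counts[current][nxt] += 1

def guess_next(word):
    """Guess the word most often seen after `word` in the corpus."""
    counts = next_word_counts.get(word)
    return counts.most_common(1)[0][0] if counts else None

# Play the guessing game: read a word, guess the next one, compare to the truth.
correct = 0
for current, actual in zip(corpus, corpus[1:]):
    if guess_next(current) == actual:
        correct += 1
print(f"Guessed {correct} of {len(corpus) - 1} next words correctly")
```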

Thoughts on GPT-3

  • Why all the hype around GPT-3?
    • “It does an astonishingly good job at that guessing game.  It’s so good that often you just can’t tell that it’s generated text, that it wasn’t written by a human.” – Tinkered Thinker
      • “The ramifications for how this advance might rattle through society seem ubiquitous.  Language is the fabric of society.  Our use and misuse of language dictates the rise and fall of all our endeavors, and a computer just got really really good at imitating our language.”
  • Two example use cases for GPT-3:
    • If you gave it enough information about your life, it could become a therapist in your pocket that’s available at any time and remembers everything about you
    • If you’re a human rights activist, it could read a 2,000-page bill, summarize it into 5 pages, and identify any sentences that you should be concerned about (a prompt sketch follows this list)
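As a sketch of that second use case, the pattern would be to paste a chunk of the bill into the prompt and ask for a summary. This reuses the same hypothetical pre-1.0 openai client call as the earlier sketch; note that a real 2,000-page bill far exceeds what the model can read at once, so each chunk would have to be summarized separately.

```python
# Sketch of the bill-summarization use case, assuming the same pre-1.0
# openai client as the earlier example. A 2,000-page bill exceeds what the
# model can read in one go, so each chunk would be summarized separately.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

bill_chunk = "...one section of the bill's text goes here..."  # placeholder

prompt = (
    "Summarize the following section of a bill in plain language, and list "
    "any sentences a human rights activist should be concerned about:\n\n"
    + bill_chunk + "\n\nSummary:"
)

response = openai.Completion.create(
    engine="davinci",
    prompt=prompt,
    max_tokens=200,
    temperature=0.3,  # keep the summary relatively conservative
)

print(response.choices[0].text.strip())
```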

Notes By Alex Wiec
