Jul 22, 2020 - Technology

Meet the AI that can write

Illustration: Eniola Odetunde/Axios

A new general-purpose language model is pushing the boundaries of what AI can do.

Why it matters: OpenAI's GPT-3 system can reasonably make sense of and write human language. It's still a long way from genuine artificial intelligence, but it may be looked back on as the iPhone of AI, opening the door to countless commercial applications — both benign and potentially dangerous.

Driving the news: After announcing GPT-3 in a paper in May, OpenAI recently began offering a select group of people access to the system's API to help the lab explore the AI's full capabilities.

  • The reaction from many who tried GPT-3 was nothing short of ecstatic. "Playing with GPT-3," tweeted developer Arram Sabeti, "feels like seeing the future."

How it works: GPT-3 operates on the same basic principle as predecessors like OpenAI's GPT-2 and Google's BERT, analyzing huge swathes of the written internet and using that information to predict which words tend to follow one another (a toy sketch of the idea follows the list below).

  • What sets GPT-3 apart is the vast amount of data it was trained on: roughly half a trillion words.
  • And the program has 175 billion parameters, the values a model adjusts during training, which is 10 times more than its closest competitor.
  • The result is what Suraj Amonkar of the AI company Fractal Analytics calls "the best language model in the field of AI."
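
For readers who want to see the underlying idea in miniature, here is a hypothetical sketch, not OpenAI's code: a toy "bigram" model in Python that counts which word follows which in a tiny made-up corpus and then predicts the most frequent continuation. The corpus and function names are illustrative only.

```python
from collections import Counter, defaultdict

# Tiny made-up corpus standing in for "huge swathes of the written internet."
corpus = (
    "the model predicts the next word . "
    "the model learns from text . "
    "the model writes like a person ."
)

# Count how often each word follows each other word (bigram counts).
follow_counts = defaultdict(Counter)
tokens = corpus.split()
for current_word, next_word in zip(tokens, tokens[1:]):
    follow_counts[current_word][next_word] += 1

def predict_next(word):
    """Return the word that most often followed `word` in the toy corpus."""
    if word not in follow_counts:
        return None
    return follow_counts[word].most_common(1)[0][0]

print(predict_next("the"))    # "model" -- the most frequent continuation of "the" here
print(predict_next("model"))  # one of the verbs seen after "model" in the corpus
```

GPT-3 replaces the simple counting with a 175-billion-parameter neural network trained on hundreds of billions of words, but the objective is the same guess-the-next-word game, scaled up by many orders of magnitude.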

Details: As early testers begin posting about their experiments, what stands out is both GPT-3's range and the eerily human-like quality of some of its responses.

  • Want to write poetry like Walt Whitman? Prompt it with a few lines of "O Captain! My Captain!" and GPT-3 will fill out the rest, generating new poetry in the style of the Bard of Brooklyn (a hedged API sketch follows this list).
  • Looking to write workable computer code? GPT-3 is your geek squad.
  • Care to engage in a philosophical debate about the nature of God with an AI? GPT-3 is the stoned freshman-year roommate you never had.
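
To give a flavor of what early testers are doing, here is a minimal sketch of prompting GPT-3 with the opening of "O Captain! My Captain!" via OpenAI's 2020-era Python client. The engine name, parameters and placeholder key reflect the private beta as an assumption, not official documentation, and may differ in later versions.

```python
import openai

# Placeholder key; real access requires joining OpenAI's private beta.
openai.api_key = "sk-..."

# Feed the model the opening lines of Whitman's poem and let it continue.
prompt = (
    "O Captain! my Captain! our fearful trip is done,\n"
    "The ship has weather'd every rack, the prize we sought is won,\n"
)

# Parameter names follow the 2020 beta of the Completions endpoint (assumed).
response = openai.Completion.create(
    engine="davinci",   # the largest GPT-3 engine exposed in the beta
    prompt=prompt,
    max_tokens=64,      # length of the generated continuation
    temperature=0.8,    # higher values produce more adventurous verse
)

print(response.choices[0].text)
```

The same pattern, with a different prompt, drives the code-writing and philosophy-debating demos making the rounds.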

Yes, but: Give it a prompt longer than a few paragraphs and GPT-3 will quickly lose the thread of an argument, sometimes with unintentionally hilarious results, as Kevin Lacker showed when he gave GPT-3 the Turing Test.

  • GPT-3 can use its vast dataset to predict words, but as Amonkar notes, "It most probably does not even have any semantic understanding of the underlying words."

The big picture: Just because GPT-3 lacks real human intelligence doesn't mean that it lacks any intelligence at all, or that it can't be used to produce remarkable applications.

  • Technologist and entrepreneur Azeem Azhar argues that GPT-3 should be seen as a major step forward in knowledge manipulation. Unlike Google's search engine, which organizes the web's information but still requires you to find the answer yourself, GPT-3 aims to produce a direct answer to a query, wrapped in a nice bow.
  • It's still far from perfect, but focusing on those imperfections risks missing GPT-3's true significance. "I'm pretty sure Gutenberg's first press wasn't as good as the modern press," says Azhar.
  • GPT-3 is a "contender for the most spectacularly newsworthy happening of 2020," economist Tyler Cowen writes for Bloomberg. He imagines how it could "generate an entire ecosystem" of spinoffs and services.

Of note: OpenAI has already begun partnering with commercial companies on GPT-3, including Replika and Reddit, though pricing is still undecided.

The catch: As OpenAI itself noted in the introductory paper, "internet-trained models have internet-scale biases." Because GPT-3 is trained on text from the internet, it picks up the internet's biases, including stereotypes around gender, race and religion.

  • A system that can generate near-human quality writing could easily be used for misinformation, phishing and other hacking efforts. And while malicious humans already do all of those things, GPT-3 could represent a step change, like going from a pistol to an AK-47.

The bottom line: Humans who assemble letters for a living aren't out of a job — yet. But we may look back upon GPT-3 as the moment when AI began seeping into everything we do.
