Illustration: Eniola Odetunde/Axios

A new general-purpose language model is pushing the boundaries of what AI can do.

Why it matters: OpenAI's GPT-3 system can reasonably make sense of and write human language. It's still a long way from genuine artificial intelligence, but it may be looked back on as the iPhone of AI, opening the door to countless commercial applications — both benign and potentially dangerous.

Driving the news: After announcing GPT-3 in a paper in May, OpenAI recently began offering a select group of people access to the system's API to help the lab explore the AI's full capabilities.

  • The reaction from many who tried GPT-3 was nothing short of ecstatic. "Playing with GPT-3," tweeted developer Arram Sabeti, "feels like seeing the future."

How it works: GPT-3 works the same way as predecessors like OpenAI's GPT-2 and Google's BERT — analyzing huge swaths of the written internet and using that information to predict which words tend to follow one another. (A toy sketch of that idea follows the bullets below.)

  • What sets GPT-3 apart is the vast amount of data it was trained on: half a trillion words.
  • And the program has 175 billion parameters — the values an AI aims to optimize during training — which is 10 times more than its closest competitor.
  • The result is what Suraj Amonkar of the AI company Fractal Analytics calls "the best language model in the field of AI."
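
Here is a toy sketch of that prediction idea, in Python. It is nothing like GPT-3's actual machinery — the real model learns its 175 billion parameters with a neural network rather than counting word pairs — but it shows what "predicting which word comes next" means in practice. The sample sentence is invented for illustration.

    # Toy next-word predictor: count which words follow which in some text,
    # then "predict" the most frequent follower. GPT-3 does something far more
    # sophisticated, but the underlying task is the same.
    from collections import Counter, defaultdict

    text = "the cat sat on the mat and the cat ate"
    words = text.split()

    followers = defaultdict(Counter)
    for current_word, next_word in zip(words, words[1:]):
        followers[current_word][next_word] += 1

    def predict_next(word):
        # Return the word most often seen right after `word` in the sample text.
        if word not in followers:
            return None
        return followers[word].most_common(1)[0][0]

    print(predict_next("the"))  # -> 'cat' (seen twice after "the", vs. "mat" once)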

Details: As early testers begin posting about their experiments, what stands out is both GPT-3's range and the eerily human-like quality of some of its responses.

  • Want to write poetry like Walt Whitman? Prompt it with a few lines of "O Captain! My Captain!" and GPT-3 will fill out the rest, generating new poetry in the style of the Bard of Brooklyn. (A rough sketch of what that request looks like in code follows this list.)
  • Looking to write workable computer code? GPT-3 is your geek squad.
  • Care to engage in a philosophical debate about the nature of God with an AI? GPT-3 is the stoned freshman-year roommate you never had.
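
For the testers with beta access, trying one of those prompts boils down to sending it to the API and reading back the continuation. Below is a minimal sketch, assuming the openai Python client from the 2020 beta; the engine name, parameters and environment variable are illustrative assumptions rather than OpenAI's documented setup.

    import os
    import openai

    # Hypothetical setup: beta access key read from an environment variable.
    openai.api_key = os.environ["OPENAI_API_KEY"]

    # A prompt in the spirit of the Whitman example above: give the model the
    # opening lines and let it continue in the same style.
    prompt = (
        "O Captain! my Captain! our fearful trip is done,\n"
        "The ship has weather'd every rack, the prize we sought is won,\n"
    )

    response = openai.Completion.create(
        engine="davinci",   # assumed name for the full GPT-3 model in the beta
        prompt=prompt,
        max_tokens=64,      # how much new text to ask for
        temperature=0.7,    # higher values give more varied continuations
    )

    print(response.choices[0].text)  # the model's continuation of the poem
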

Yes, but: Give it more than a few paragraphs of text prompts and GPT-3 will quickly lose the thread of an argument — sometimes with unintentionally hilarious results, as Kevin Lacker showed when he gave GPT-3 the Turing Test.

  • GPT-3 can use its vast dataset to predict words, but as Amonkar notes, "It most probably does not even have any semantic understanding of the underlying words."

The big picture: Just because GPT-3 lacks real human intelligence doesn't mean that it lacks any intelligence at all, or that it can't be used to produce remarkable applications.

  • Technologist and entrepreneur Azeem Azhar argues that GPT-3 should be seen as a major step forward in knowledge manipulation. Unlike Google's search engine, which organizes the web's information but still leaves you to find the answer yourself, GPT-3 aims to produce a direct answer to a query, wrapped in a nice bow.
  • It's still far from perfect, but focusing on those imperfections risks missing GPT-3's true significance. "I'm pretty sure Gutenberg's first press wasn't as good as the modern press," says Azhar.
  • GPT-3 is a "contender for the most spectacularly newsworthy happening of 2020," economist Tyler Cowen writes for Bloomberg. He imagines how it could "generate an entire ecosystem" of spinoffs and services.

Of note: OpenAI has already begun partnering with commercial companies on GPT-3, including Replika and Reddit, though pricing is still undecided.

The catch: As OpenAI itself noted in the introductory paper, "internet-trained models have internet-scale biases." A model trained on the internet, as GPT-3 was, will absorb the internet's biases, including stereotypes around gender, race and religion.

  • A system that can generate near-human quality writing could easily be used for misinformation, phishing and other hacking efforts. And while malicious humans already do all of those things, GPT-3 could represent a step change, like going from a pistol to an AK-47.

The bottom line: Humans who assemble letters for a living aren't out of a job — yet. But we may look back upon GPT-3 as the moment when AI began seeping into everything we do.
