Illustration: Eniola Odetunde/Axios
Why it matters: The deal gives many companies access to the technology while seemingly letting Microsoft set guardrails and parameters for how it can be used.
The big picture: GPT-3, which was trained on half a trillion words and tunes a staggering 175 billion parameters, has generated all kinds of buzz in recent months, with MIT Technology Review declaring it "shockingly good."
Between the lines: While the algorithm doesn't actually "know" much of anything as fact, it's capable of writing surprisingly clear text on just about anything, by analyzing huge swathes of the written internet and using that information to predict which words tend to follow one another.
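For readers curious what "predicting which words follow one another" means in practice, here is a deliberately tiny sketch, not OpenAI's code: a bigram model that counts word pairs in a sample sentence and suggests the most common follower. GPT-3 does something conceptually similar at vastly larger scale, with a neural network instead of raw counts.

```python
from collections import Counter, defaultdict

# Hypothetical toy corpus; GPT-3's training data was ~half a trillion words.
corpus = "the cat sat on the mat and the cat slept".split()

# Count, for each word, which words follow it and how often.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict_next(word):
    # Return the most frequently observed follower, or None if unseen.
    counts = follows[word]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # "cat" follows "the" twice, "mat" once -> "cat"
```

The toy model can only parrot pairs it has seen; the leap with GPT-3 is that its 175 billion parameters let it generalize far beyond memorized sequences.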
Flashback: Microsoft said in May it was building a supercomputer within its Azure cloud specifically for OpenAI and has also invested $1 billion in the San Francisco-based company.
Go deeper: Meet the AI that can write