Illustration: Caresse Haaser/Axios

Big data got us here, but small data will get us the rest of the way. That's the mantra coming from AI researchers at the forefront of their field, who are casting about for the next big breakthrough.

Details: Inspired by how children learn, they are experimenting with methods that will allow them to train up AI systems with a tiny fraction of the inputs required today — and then set the systems loose on a new problem that they've never seen before.

Background: The deafening fuss around AI is driven by deep learning, a technique that allows machines to pick out subtle patterns from enormous datasets.

  • It's great for all sorts of lucrative and interesting tasks, like driving cars and reading brain scans. And it can get better and better as it eats up more data.
  • But amassing and labeling vast amounts of data is cumbersome and slow — or even impossible, when there's just not much information available.

The next frontier is AI that learns on its own, rather than being explicitly fed information, and algorithms that can take what they know in one arena and apply it to another — like kids learning how the world works.

Driving the news: A panel of leading AI scientists laid out the state of the art at Stanford on Monday, at the launch of the university's Institute for Human-Centered AI. Among the various stabs at solving the data problem:

  • Curiosity-based AI, which would find gaps in its knowledge and gather the missing data itself — like a two-year-old finding her way about the world, according to Berkeley psychology professor Alison Gopnik.
  • Transfer learning, the long-sought but still out-of-reach principle that an AI system can apply what it's learned in one domain to a similar one (a rough code sketch follows this list).
    • "Just like children, we think that to learn things about the world properly you need to be an active learner," said DeepMind CEO Demis Hassabis.
    • "I really think that's the direction we need to be going in as the field: How do we actually build more general systems that can take … a new task and do well on that," said Jeff Dean, head of Google AI.
  • Compositional knowledge, the idea that computers can put together disparate experiences and pieces of information into a larger whole.
    • "The kind of thinking that Daniel Kahneman refers to as 'thinking slow' — that's the kind of thinking that we haven't really worked out how to get artificial intelligence to do," said Stanford computer scientist Christopher Manning.
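
For readers who want a concrete picture of transfer learning, here is a minimal sketch, assuming PyTorch and torchvision (0.13 or later) are installed and a small, hypothetical image dataset sits at ./small_dataset with one subfolder per class. It reuses a network pretrained on a large dataset and retrains only the final layer for the new task; it illustrates the general technique, not the panelists' specific systems.

```python
# Minimal transfer-learning sketch (illustrative only).
# Assumptions: PyTorch + torchvision >= 0.13, and a small labeled image
# dataset at ./small_dataset arranged as one subfolder per class.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

# Start from a network pretrained on a large dataset (ImageNet).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the pretrained layers so their general-purpose features are
# reused rather than relearned from scratch.
for param in model.parameters():
    param.requires_grad = False

# Swap in a fresh final layer for the new, small task
# (a hypothetical 5-class problem).
model.fc = nn.Linear(model.fc.in_features, 5)

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
])
train_data = datasets.ImageFolder("./small_dataset", transform=preprocess)
loader = torch.utils.data.DataLoader(train_data, batch_size=16, shuffle=True)

# Only the new layer's weights are updated.
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(3):  # a handful of passes over a few hundred examples
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
```

Because the frozen layers carry most of the learned "knowledge" over from the original large-data training run, a few hundred labeled examples can be enough for the new task, which is exactly the small-data regime the panel was describing.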
