Axios Future

October 26, 2019
Welcome back! Thanks for reading. Get in touch with me by replying or emailing [email protected]. Erica, who writes Future on Wednesdays, is at [email protected].
📺 Next on “Axios on HBO”: Iraqi President Barham Salih gives an exclusive interview (sneak preview), IMF Managing Director Kristalina Georgieva talks socialism and taxes, and Sen. Kamala Harris gives her take on 2020.
- Plus, explore the only offshore wind farm in the U.S., and see a debate over the severity of the next recession.
- Tune in Sunday at 6 p.m. on HBO.
Today's issue is 1,808 words, which shouldn't take more than 7 minutes to read.
1 big thing: The hidden costs of AI

Illustration: Eniola Odetunde/Axios
In the most exclusive AI conferences and journals, systems are judged largely on their accuracy: How well do they stack up against human-level translation or vision or speech?
Yes, but: In the messy real world, even the most accurate programs can stumble and break. Considerations that matter little in the lab, like reliability or computing and environmental costs, are huge hurdles for businesses.
Why it matters: Some stumbles that have tarred high-profile AI systems — like facial recognition that fails more often on darker faces, or medical AI that gives potentially harmful advice — have resulted from unanticipated real-world scenarios.
The big picture: Research often guns for incremental advances, juicing an extra percentage point of accuracy out of a previously proposed model.
- "We feel like this is the only thing that gets rewarded nowadays in the community," says Roy Schwartz, a researcher at the Allen Institute for Artificial Intelligence. "We want to argue that there are other things to consider."
- In a paper posted this summer, Schwartz and several co-authors proposed that researchers present not just the accuracy of their models but also the computing cost it took to get there — and the resulting environmental toll.
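For a rough sense of what that kind of reporting could look like, here's a minimal Python sketch; the wrapper function, the assumed GPU power draw and the emissions factor are illustrative assumptions, not figures from the Schwartz paper.

```python
# A minimal sketch of reporting compute cost next to accuracy, in the spirit
# of the proposal above. Names and numbers are illustrative assumptions.
import time

def train_and_report(train_fn, flops_per_step, num_steps, gpu_watts=250):
    """Run a training function and report accuracy alongside rough compute costs.

    train_fn:       callable that trains a model and returns its test accuracy
    flops_per_step: analytically estimated floating-point ops per training step
    num_steps:      total number of optimization steps
    gpu_watts:      assumed average power draw of the hardware (illustrative)
    """
    start = time.time()
    accuracy = train_fn()
    hours = (time.time() - start) / 3600

    total_flops = flops_per_step * num_steps   # total compute budget
    kwh = gpu_watts * hours / 1000             # rough energy estimate
    co2_kg = kwh * 0.45                        # ~0.45 kg CO2 per kWh, a rough U.S. grid average

    print(f"accuracy:      {accuracy:.3f}")
    print(f"total FLOPs:   {total_flops:.2e}")
    print(f"energy (kWh):  {kwh:.2f}")
    print(f"CO2 (kg):      {co2_kg:.2f}")
    return accuracy, total_flops, kwh
```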
Computing cost is just one of several hidden considerations that can drive up the expense of creating an AI model that works in the real world. Among the other important factors accuracy doesn't capture:
- The cost of labeling training data, which is still mostly done by humans.
- The reliability of a system or, conversely, its tendency to make critical mistakes when it sees a new situation it hasn't been trained to deal with.
- Vulnerability to adversarial examples, a special kind of attack that can cripple certain kinds of AI systems, potentially making an autonomous car blind to a fast-approaching stop sign (a minimal sketch of one such attack follows this list).
- Bias, or whether a model's accuracy varies depending on the kind of person or thing it is evaluating.
- Interpretability of a model's results, or how easy it is to understand why an AI system made a particular prediction.
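For a concrete sense of how an adversarial attack works, here's a minimal sketch of the well-known "fast gradient sign" method; the toy classifier and the perturbation size are illustrative assumptions, not drawn from any system named above.

```python
# Illustrative sketch of a fast-gradient-sign adversarial example.
import torch
import torch.nn.functional as F

def fgsm_example(model, image, label, epsilon=0.03):
    """Return a copy of `image` nudged to push `model` toward a wrong answer."""
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), label)
    loss.backward()
    # Move each pixel a small step in the direction that increases the loss.
    perturbed = image + epsilon * image.grad.sign()
    return perturbed.clamp(0, 1).detach()

# Usage (illustrative): a toy linear classifier stands in for a real vision model.
model = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(3 * 32 * 32, 10))
image = torch.rand(1, 3, 32, 32)   # a stand-in for a camera frame
label = torch.tensor([0])          # the class the image is "supposed" to be
adversarial = fgsm_example(model, image, label)
```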
"The machine learning model is just a tiny piece of the machine learning product," says Andrew Ng, founder of Landing.ai, a startup that helps companies set up AI processes.
- "One of the challenges is that you ship a system and then the world changes in some weird and unpredictable way," says Ng, who previously started Google's and Baidu's AI operations.
- "Anyone can download code from GitHub, but that's not the point," Ng tells Axios. "You need all these other things."
One byproduct of the hidden costs behind increasingly accurate AI systems: they can make it hard for a new startup or a cash-strapped university lab to compete.
- "The AI models that are being designed and developed don't always contemplate the application of the model," says Josh Elliot, director of AI at Booz Allen Hamilton.
- Big companies offer pre-packaged AI programs — like one for detecting street signs, say — but users often lack the computing firepower required to tweak them (a sketch of what that tweaking looks like follows this list).
- That means the reigning players get to decide the important problems worth trying to solve with AI, Schwartz says, and everyone else has to go along.
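To illustrate what "tweaking" a pre-packaged model involves, here's a minimal sketch that swaps the final layer of an off-the-shelf image classifier for a new street-sign task; the specific model, class count and training setup are assumptions for the example, and running the training on real data is where the computing bill shows up.

```python
# Illustrative sketch: adapting a pre-trained, off-the-shelf vision model
# to a new task by replacing its final layer. The model choice (ResNet-18)
# and the class count are assumptions, not a recommendation from the story.
import torch
import torchvision

model = torchvision.models.resnet18(pretrained=True)   # downloadable, pre-packaged weights

# Freeze the pre-trained layers so only the new head gets trained; cheaper,
# though full fine-tuning (the expensive part) often performs better.
for param in model.parameters():
    param.requires_grad = False

num_sign_classes = 43   # e.g., a street-sign dataset with 43 categories
model.fc = torch.nn.Linear(model.fc.in_features, num_sign_classes)

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
# From here, the real cost begins: iterating over labeled images on capable hardware.
```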
What's next: Schwartz, Ng and others propose putting more effort toward solving problems more efficiently rather than making a slightly more accurate model at an enormous cost.
Go deeper: How not to replace humans
2. Teaching robots to see

Illustration: Sarah Grillo/Axios
Machine vision is a crucial missing link in the robotization of industries like manufacturing and shipping. But even as that field advances rapidly, a larger hurdle still blocks widespread automation: machine understanding.
Why it matters: Up against a shortage of workers, those sectors stand to benefit hugely from automation. But the people working in warehouses and factories could find their jobs changed or eliminated if vision technology sees new breakthroughs.
The big picture: Machine vision can help robots navigate spaces previously closed off to them, like a crowded warehouse floor or a cluttered front lawn. And it's critical for tasks that require dexterity, like packing a box with oddly shaped objects.
- Plus, AI can help make sense of the avalanche of video footage recorded daily, which far outstrips humanity's ability to digest it.
- Companies are scrambling to make use of that data to understand how people and vehicles move, or to check for tiny imperfections in new products.
- The rise of AI-monitored cameras is also making surveillance inescapable at work and in public spaces.
Driving the news: In a report first shared with Axios, LDV Capital, a venture firm that invests in visual technologies, predicts an upheaval in manufacturing and logistics, driven primarily by computer vision.
- "The majority of global factories, ports, and warehouses are understaffed and ill-equipped to meet still-rising requirements," the report reads. Visual technologies will help change that, LDV argues.
- In China, some "lights-out" factories have been built to operate without a single human present. But the U.S. will largely see robots employed in factories and warehouses not custom-built for robots, says Abby Hunter-Syed, VP of operations at LDV.
Yes, but: It'll take more than just high-fidelity cameras and fast AI perception to make an intelligent robot.
- A big unsolved challenge is imbuing robots with a deeper understanding of the world around them, so that they can interpret what they see and react to it.
- "Domestic robots, for example, are just not going to arrive until machines can interpret scenes well," says Gary Marcus, co-founder of robotics company Robust.ai. "You can do Roomba, but not Rosie the Robot."
A broad understanding of the world helps us humans avoid confounding errors when we look around.
- Even if we see a cloud perfectly shaped like a horse, we never actually think it's a flying horse because we get how clouds work.
- The same ability helps us handle objects easily — even ones we've never seen before. Humans can generally guess how to place an item on a surface so that it stays upright, for example, rather than tipping over.
- "We've built physics models in our heads, and we've not quite been able to transfer them to robots," says Avideh Zakhor, a Berkeley professor who studies computer vision.
The big question: How much of the problem is solvable with incremental improvements in machine vision, before robots need better common sense?
- Evan Nisselson, a partner at LDV, argues that industry can get 85% or 90% of the way toward lucrative automation with better machine vision.
- But that depends on how much warehouses and factories can remove variability and chaos from the areas where robots are working.
The bottom line: "The Rubicon here, which we haven't crossed yet, is to not just be able to see objects," says Marcus. "It's interpreting scenes that will be the breakthrough."
3. An uncertain future for future jobs

Chart: Cognizant's "jobs of the future" index vs. overall U.S. hiring
Despite recession worries, demand for workers in the U.S. has climbed ever higher — and with it, the outlook for a bundle of jobs that could dominate future economies.
Why it matters: The fate of this seemingly future-proof work, much of it centered on interactions with increasingly intelligent machines, has looked rosy since 2016, but it's not clear how it would weather an economic downturn.
Driving the news: The latest numbers come from a quarterly report from IT consulting company Cognizant, provided first to Axios.
- For a year, Cognizant has been tracking U.S. hiring for 50 jobs that it deems forward-looking, with statistics going back to 2016 pulled from the Bureau of Labor Statistics via Burning Glass, a jobs database.
- The index includes jobs in sectors such as AI and automation, transportation, health care and work culture.
What's happening: Earlier this year, the "jobs of the future" index flattened along with larger hiring trends, suggesting that this work isn't entirely insulated from the rest of the economy.
- Now, the index has picked back up — but so, too, has hiring across all industries. After a split in early 2017, as seen in the chart above, the "jobs of the future" index has largely tracked hiring in general.
- This suggests there might not be some special magic in these jobs — or, perhaps, that the makeup of the leading edge of work has already changed since Cognizant first assembled the bundle a year ago.
- Rob Brown, VP of Cognizant's Center for the Future of Work, argues there's no particular reason for the parallel movement.
Most future-oriented jobs are increasingly in demand, like health information managers and robotics engineers. But, surprisingly, tech consultants are among the few job categories that fell in Cognizant's most recent analysis.
- Ben Pring, director of the Center for the Future of Work, says that could be because big companies, recognizing the long-term value of tech experts, are hustling to hire them rather than relying on consultants from outside firms like Accenture or, yes, Cognizant.
- Plus, AI models that once required a PhD to set up are becoming easier to use without specialized training, potentially diminishing the value of pricey consultants.
Cognizant picked mostly white-collar jobs for the index, but several jobs outside that category are also good candidates for the list, either because they're byproducts of the tech industry or because they're difficult to automate with today's technology.
- One example: workers who label data that trains AI. That's a boring, repetitive job, but one that's in huge demand because labeled data is critical for AI systems. It is, however, in danger of being automated itself down the line.
- Manual labor that varies from day to day — like plumbing or construction, for example — is extremely hard for robots (for some of the reasons outlined in item 2 above). People will continue doing that work for some time.
The big question: What happens when the economy eventually tanks?
- If jobs of the future keep tracking wider hiring trends, they might evaporate, too.
- But Pring argues it's more likely employers will intensify their focus on hiring for jobs of the future.
- That could come at the cost of hiring for other, potentially more accessible jobs.
4. Worthy of your time

Illustration: Eniola Odetunde/Axios
Detroit's gamble on the future (Joann Muller - Axios)
Google's contested claim of quantum supremacy (Elizabeth Gibney - Nature)
The AI face scans that make some hiring decisions (Drew Harwell - Washington Post)
Can you really be addicted to video games? (Ferris Jabr - NYT Magazine)
5G experiments on military bases (Roxana Tiron & Travis J. Tritten - BGov)
1 fun thing: SF is expensive, even for the Warriors

Photo: Tayfun Coskun/Anadolu/Getty
When the Golden State Warriors moved across the bay from Oakland to San Francisco this year, their players went apartment hunting.
Now, they have a familiar complaint, the San Francisco Chronicle's Connor Letourneau reports: The rent is too damn high.
- “You hear about the crazy prices out here. But until you actually see those numbers add up on paper, it doesn’t feel real.” That's Warriors guard Jacob Evans, who says he's spending $7,900 a month on rent and utilities. (He'll make $1.9 million this year.)
- Forward Glenn Robinson III rented a three-bedroom mansion in Detroit last year for $3,500 a month. Now he pays "more than twice that for a fraction of the space," Letourneau reports. (He just signed a two-year, $3.9 million contract.)
Fun fact: Thirteen of the 14 Warriors have moved to San Francisco. The other one, Steph Curry, recently bought a $31 million property in Atherton, home to the country's most expensive zip code and an hour-plus trafficky commute to the new arena.