Aug 15, 2018

Axios Future

By Bryan Walsh

Welcome back to Future. Thanks for subscribing. Consider inviting your friends and colleagues to sign up. And if you have any tips or thoughts on what we can do better, just hit reply to this email or shoot me a message at steve@axios.com.

Let's start with ...

1 big thing: The Parkland effect...

Jammal Lemy and Emma González in Chicago on June 15, the start of the summer campaign. Photo: Jim Young/AFP/Getty

The surprising endurance of the U.S. student anti-gun violence movement is usually traced to the February murder of 17 teens and teachers in Parkland, Fla., and the rage of a tight group of hard-edged, culture-wise classmates there.

The long-term impact of their campaign, if any — such as whether there will be a broader "Parkland generation" with a lasting legacy — can't yet be known. But there already are signs of a Parkland effect:

  • The movement, called March for Our Lives, has swollen to include youth from many of the nation's major cities, driven by grief and anger over a two-decade failure to secure their safety, and a determination to stop the bloodletting. (See the next post)
  • They befriended one another along a 59-day bus journey through two dozen states, ending Sunday, and now have fanned out back home.
  • Their explicit aim: to bulk up typically anemic youth voting, and in November oust national and state lawmakers who they say are tools of the National Rifle Association, their stated foe.

Early data suggest they are having an impact both on laws and voter registration:

  • Tom Bonier, CEO of TargetSmart, a political consultancy, says a survey of 39 states shows a surge of youth voter registration.
  • The most dramatic shift is in Pennsylvania, a battleground state: In the 75 days prior to Parkland, people younger than 30 were 45% of all new registered voters there. But in the 75 days afterward, they were 61.4%. If history holds, these new registrants will vote Democrat 2-1, Bonier says.
  • Since Parkland, some 50 new gun laws have been passed across the country, including in 14 states with Republican governors, according to Pew. "They have turned tragedy into a civic state of mind," Bonier tells Axios. "Suddenly, younger people are forced to care about who their elected officials are."

The bottom line: The Marjory Stoneman Douglas High School students and their national network have animated U.S. politics. They are well-funded, having raised at least $5.7 million combined from a GoFundMe campaign and $2 million in contributions from Hollywood personalities. To the extent they are able to mobilize youth on Election Day, they could decide numerous close races.

Go deeper: The Parkland generation has huge plans this fall.

2. ...and its army

Bria Smith, left, a Milwaukee senior, in Los Angeles with March for Our Lives on July 20. Photo: Emma McIntyre/Getty

Invited and sometimes self-inviting, teens from across the country hopped on a bus carrying students from Marjory Stoneman Douglas High on a tour of some 80 cities and towns.

I followed the group on the last three days of the campaign. Here are stories about some of the students on the trip, starting with Jaclyn Corin...

Jaclyn Corin, in orange. Photo: Emilee McGovern/March for Our Lives

Today, Stoneman Douglas students returned to class. Corin is senior class president, and last year was junior class president, too. When the Valentine's Day shooting happened, she had been delivering carnations sold to fund the junior prom. Afterward, she naturally assumed leadership of the Parkland anti-gun violence movement.

On Sunday in Newtown, Conn., where the bus tour ended, Corin sounded hopeful about the November elections but clear-eyed about the difficulty of significantly changing U.S. gun laws.

"It's going to take a cultural shift," she said. "And a cultural shift always takes a generation or two. I hope my kids know we don't need weapons of war on the street."

We spoke by phone late this morning. Corin was in the cafeteria for lunch, surrounded by friends. It's hard to escape the memory of what happened given the swarm of reporters on the street, the clear view of now-closed Building 1200 — where 17 of their classmates and teachers were murdered — and how some teachers are handling the first day back.

"Unfortunately teachers are immediately saying where you can hide, and how the alarm system is going to work this year," she said. "I just got out of Holocaust History. And the teacher — four kids died in her class, and she identified herself as just that and how she risked her life to save others. She talked about whether the windows were bulletproof."

Read more of the students' stories.

3. AI and 3D eye exams

AI segments a 3D eye scan into sections representing different types of tissue. Animation: DeepMind

A new artificial intelligence program that reads and interprets 3D eye scans has gained the trust of some doctors — largely because it shows its work and provides an assessment of how much confidence should be placed in its recommendation.

Axios' Kaveh Waddell reports: Google's DeepMind says the system can identify more than 50 eye problems and recommend a course of action with expert accuracy.

  • AI systems are usually too opaque to explain their reasoning, making them risky to deploy in high-stakes environments like hospitals.
  • "One of the reasons we're putting so much effort into explainability and interpretation is that we desperately want to build trust with nurses and doctors," DeepMind co-founder Mustafa Suleyman tells Axios.

DeepMind's approach emulates how humans work, breaking the process into two steps:

  • First, a neural network segments the scan, which humans have difficulty reading in its raw form, into colored areas representing different types of tissue.
  • Then, a second neural network analyzes the segmentation map, identifies signs of disease and recommends a course of action.
  • The system pairs the results with a percentage reflecting its confidence in its diagnosis or recommendation.

The system’s 5.5% error rate matches or betters the performance of human eye experts, DeepMind and University College London researchers write in a paper published in Nature Medicine.
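For a concrete picture of the two-step pipeline described above, here is a minimal, hypothetical sketch in Python/PyTorch. It is not DeepMind's published architecture: the tissue-class count, diagnosis categories, layer sizes and scan dimensions are invented stand-ins, and the networks are untrained. It only illustrates how a segmentation stage can produce an inspectable tissue map that a separate diagnosis stage turns into a recommendation plus a confidence percentage.

```python
# Toy two-stage sketch of the segment-then-diagnose idea (not DeepMind's model).
import torch
import torch.nn as nn

N_TISSUE_TYPES = 15   # assumed number of tissue classes in the segmentation map
N_DIAGNOSES = 4       # assumed number of referral decisions (e.g. urgent ... observation)

class SegmentationNet(nn.Module):
    """Stage 1: raw 3D scan -> per-voxel tissue-type probabilities."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv3d(16, N_TISSUE_TYPES, kernel_size=1),
        )

    def forward(self, scan):                  # scan: (batch, 1, depth, height, width)
        return self.net(scan).softmax(dim=1)  # tissue map: (batch, classes, D, H, W)

class DiagnosisNet(nn.Module):
    """Stage 2: tissue map -> probabilities over referral decisions."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(N_TISSUE_TYPES, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1), nn.Flatten(),
        )
        self.classifier = nn.Linear(32, N_DIAGNOSES)

    def forward(self, tissue_map):
        return self.classifier(self.features(tissue_map)).softmax(dim=1)

if __name__ == "__main__":
    scan = torch.randn(1, 1, 32, 64, 64)       # toy stand-in for a 3D eye scan
    tissue_map = SegmentationNet()(scan)       # intermediate result a clinician can inspect
    decision_probs = DiagnosisNet()(tissue_map)
    confidence, decision = decision_probs.max(dim=1)
    print(f"recommended action {decision.item()} with {confidence.item():.0%} confidence")
```

Keeping the two stages separate is what lets a clinician look at the intermediate tissue map — the system "showing its work" — before deciding how much weight to give the final recommendation and its confidence score.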

Go deeper: Read Kaveh's whole post.

4. From our mailbox: Mulling AI

Illustration: Sarah Grillo/Axios

On Sunday, Kaveh wrote two posts about how researchers are contemplating their responsibility for the big dangers inherent in the new computer science age. In one, researchers were pushing their peers to assess and mitigate the negative effects of their own work. The other asked whether dangerous research should be suppressed.

On Twitter, Jack Clark, OpenAI's strategy and communications director, sparked a lively discussion.

What they're saying, excerpted:

  • Andrew Kemendo, founder of an augmented reality company: The "every technology can be used for bad" argument isn't good enough. Critics have shown time and again that they can't differentiate between the technologies and how users behave with them.
  • Johannes Klingebiel, a journalist at Germany's Süddeutsche Zeitung: You don't need bad actors to do bad things with technology. One man's utopia is another one's dystopia and so on. One other aspect worth considering might be who is shaping the public's image of what AI is. There should definitely be more scientists involved.
  • Ryan Khurana, director of the nonprofit Institute for Advancing Prosperity: I really dislike this bifurcation between scientists and philosophers. For most of history the lines between them were blurry (and still are in cognitive science and neuroscience). AI is a General Purpose Technology — the science and engineering can’t be removed from policy.

Another Future reader, who asked not to be identified beyond that she works in AI, shared a nearly 50-year-old paper by the scientist Arthur Galston that grappled with similar ethical conundrums:

  • As a grad student in the 1940s, Galston discovered properties of a chemical compound that was later used to create Agent Orange. In 1965, Galston, by then the head of Yale's botany and biology department, began lobbying the government to stop using the chemical agent.
  • His research and activism led to President Nixon's 1971 ban on the substance.
  • Galston reflected on his role in the 1972 paper linked above. In the conclusion, he wrote:
What are the morals of this story for the scientist interested in insuring useful social applications of his findings?
First, it would appear that no discovery is immune from the danger of misuse. This means that every scientist must be on the alert to the possibility that his discoveries, however ethically neutral or benign they may seem, can be perverted to antisocial ends.
Second, scientific societies must somehow be made to realize and act on their social responsibilities. Those individual scientists who recognize the importance of such actions must be prepared for long, frequently tedious educational and political campaigns in societies that determinedly seek to avoid any social entanglements.
Finally, the individual must be prepared to take what action he can, either alone or with a few like-minded colleagues, to ensure that the perversion of science does not go unchallenged.

Go deeper: When Kaveh was at The Atlantic, a computer scientist told him that cryptographers' reluctance to fight surveillance was a moral failure.

5. Worthy of your time

Illustration: Sarah Grillo/Axios

How we got from Facebook to Trump (Zeynep Tufekci – MIT Tech Review)

The U.S. recycling problem (Keerthi Vedantam, Jessie Li, Caresse Haaser – Axios video)

Ice cream and "experience retail" (Emily McCormick – Bloomberg Businessweek)

Hackers are rejoicing over 5G (Joe Uchill – Axios)

The ideas conundrum (Diane Coyle – FT)

Big Tech should get in front of regulation (David McCabe – Axios)

6. 1 cinema thing: A really big shark

Photo: Warner Bros. Pictures

When you make a film that everyone will call "that big shark movie," it’s important that the shark be satisfyingly enormous. So the team behind a new film called The Meg brought on a software company that specializes in creatures to craft the megalodon.

Kaveh writes: Computer-generated imagery has been a staple of big-budget films for decades, but computer animation is expensive and time-consuming work. By contrast, the AI-powered system that created the megalodon makes it easy for animators to tweak the shark in ways small and large once a model has been created.

First, the bottom line: Great shark, awful plot. If you want to watch a good, thoughtful movie about the human condition, you’re looking in the wrong place. The Meg’s shark is cool, and the action scenes are fairly exciting, but the storyline is tired and the dialogue extremely canned.

On the other hand: Who ever went to a shark movie for the dialogue?

The details: After the film was shot, it became clear that the story was going to be changed, and that there would be a lot of back-and-forth with the director about the shark animations, said Mohsen Mousavi, the visual effects supervisor at Scanline, the company behind the movie’s effects.

  • Scanline brought in Ziva, an animation company that specializes in creating virtual characters that move realistically. Its software uses AI to compile a creature model that can be animated quickly and automatically, with the help of some heavy compute power in the form of 2,500 Intel Xeon processors.
  • Consulting anatomy books to understand the properties of a great white shark’s body, the animators created a skeleton and a muscle system, layered it with fat, and wrapped it in sharkskin.
  • Ziva uses a physics engine that models how the physical properties of these layers interact, so animators don't have to make the creature's virtual muscles fire manually. (A toy sketch of the general idea follows below.)
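To make the "physics engine instead of hand-keyed muscles" idea concrete, here is a toy sketch in Python with NumPy. It is emphatically not Ziva's software or The Meg's actual rig: the spring-damper constants, the two "flesh" points and the straight-line bone path are all invented for illustration. The point is only that the animator keyframes the skeleton, and a simulation step makes the attached tissue follow and jiggle on its own.

```python
# Toy physics-driven "flesh follows bone" sketch (not Ziva's software).
import numpy as np

STIFFNESS = 40.0   # how strongly flesh is pulled back toward the bone (made up)
DAMPING = 4.0      # how quickly the jiggle dies out (made up)
DT = 1.0 / 24      # one film frame

def simulate_flesh(bone_path, rest_offsets):
    """Advance flesh points attached to a moving bone, frame by frame."""
    flesh = bone_path[0] + rest_offsets           # start in the rest pose
    velocity = np.zeros_like(flesh)
    frames = []
    for bone_pos in bone_path:
        target = bone_pos + rest_offsets          # where the flesh "wants" to be
        accel = STIFFNESS * (target - flesh) - DAMPING * velocity
        velocity += accel * DT                    # explicit Euler integration
        flesh += velocity * DT
        frames.append(flesh.copy())
    return np.array(frames)

if __name__ == "__main__":
    # The animator keyframes only the bone: a straight swim along x.
    bone_path = np.stack([np.array([t * 0.5, 0.0, 0.0]) for t in range(48)])
    rest_offsets = np.array([[0.0, 1.0, 0.0], [0.0, -1.0, 0.0]])  # two flesh points
    frames = simulate_flesh(bone_path, rest_offsets)
    print(frames.shape)  # (48 frames, 2 flesh points, 3 coordinates)
```

The design payoff described above is the same as in this toy: once the layered model exists, changes to the skeleton's motion propagate automatically through the simulation, which is what made the late revisions to the shark practical.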

Go deeper: Read Kaveh's whole post.

Bryan Walsh