Welcome back to Future. Thanks for subscribing. Consider inviting your friends and colleagues to sign up. And if you have any tips or thoughts on what we can do better, just hit reply to this email or shoot me a message at firstname.lastname@example.org.
Let's start with ...
Jammal Lemy and Emma González in Chicago on June 15, the start of the summer campaign. Photo: Jim Young/AFP/Getty
The surprising endurance of the U.S. student anti-gun violence movement is usually traced to the February murder of 17 teens and teachers in Parkland, Fla., and the rage of a tight group of hard-edged, culture-wise classmates there.
The long-term impact of their campaign, if any — such as whether there will be a broader "Parkland generation" with an important legacy — can't yet be known. But there already are signs of a Parkland effect:
Early data suggest they are having an impact both on laws and voter registration:
The bottom line: The Marjory Stoneman Douglas High School students and their national network have animated U.S. politics. They are well-funded, raising at least $5.7 million, combining a GoFundMe campaign with $2 million from Hollywood personalities. To the extent they are able to mobilize youth on Election Day, they could decide numerous close races.
Go deeper: The Parkland generation has huge plans this fall.
Bria Smith, left, a Milwaukee senior, in Los Angeles with March for Our Lives on July 20. Photo: Emma McIntyre/Getty
Invited and sometimes self-inviting, teens from across the country hopped on a bus carrying students from Marjory Stoneman Douglas High on a tour of some 80 cities and towns.
I followed the group on the last three days of the campaign. Here are stories about some of the students on the trip, starting with Jaclyn Corin...
Today, Stoneman Douglas students returned to class. Corin is senior class president, and last year was junior class president, too. When the Valentine's Day shooting happened, she had been delivering carnations sold to fund the junior prom. In the aftermath, she naturally assumed leadership of the Parkland anti-gun violence movement.
On Sunday in Newtown, Conn., where the bus tour ended, Corin sounded hopeful about the November elections but clear-eyed about the difficulty of significantly changing U.S. gun laws.
"It's going to take a cultural shift," she said. "And a cultural shift always takes a generation or two. I hope my kids know we don't need weapons of war on the street."
We spoke by phone late this morning. Corin was in the cafeteria for lunch, surrounded by friends. It's hard to escape the memory of what happened given the swarm of reporters on the street, the clear view of now-closed Building 1200 — where 17 of their classmates and teachers were murdered — and how some teachers are handling the first day back.
"Unfortunately teachers are immediately saying where you can hide, and how the alarm system is going to work this year," she said. "I just got out of Holocaust History. And the teacher — four kids died in her class, and she identified herself as just that and how she risked her life to save others. She talked about whether the windows were bulletproof."
Read more of the students' stories.
A new artificial intelligence program that reads and interprets 3D eye scans has gained the trust of some doctors — largely because it shows its work and provides an assessment of how much confidence should be placed in its recommendation.
Axios' Kaveh Waddell reports: Google's DeepMind says the system can identify more than 50 eye problems and recommend a course of action with expert accuracy.
DeepMind's approach emulates how humans work, breaking the process into two steps:

- One network converts the raw 3D scan into a map of the eye's tissue types — an intermediate result a doctor can inspect.
- A second network reads that map to diagnose conditions and recommend a course of action.
The system’s 5.5% error rate matches or beats the performance of human eye experts, DeepMind and University College London researchers write in a paper published in Nature Medicine.
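The "show its work" design described above can be sketched as a two-stage pipeline: one stage produces an inspectable intermediate map, and a second stage turns that map into a recommendation with a confidence score. This is a minimal toy illustration of that architecture, not DeepMind's actual system — the function names, class, and threshold rules below are hypothetical stand-ins for the real segmentation and classification networks.

```python
# Toy two-stage pipeline: scan -> tissue map -> recommendation + confidence.
# All names and rules here are illustrative placeholders, not DeepMind's API.
from dataclasses import dataclass

@dataclass
class Recommendation:
    diagnosis: str
    urgency: str        # e.g. "urgent referral" vs. "observation"
    confidence: float   # how much trust to place in the recommendation

def segment(scan):
    """Stage 1 (hypothetical): label each voxel with a tissue type.
    A simple threshold stands in for the real segmentation network."""
    return [["fluid" if v > 0.5 else "retina" for v in row] for row in scan]

def classify(tissue_map):
    """Stage 2 (hypothetical): read the intermediate map, not the raw scan."""
    cells = [c for row in tissue_map for c in row]
    fluid_fraction = cells.count("fluid") / len(cells)
    if fluid_fraction > 0.25:
        return Recommendation("fluid buildup (toy label)", "urgent referral",
                              confidence=min(1.0, fluid_fraction * 2))
    return Recommendation("no pathology (toy label)", "observation",
                          confidence=1.0 - fluid_fraction)

def diagnose(scan):
    tissue_map = segment(scan)   # intermediate result a doctor can inspect
    return tissue_map, classify(tissue_map)
```

Because the intermediate tissue map is a separate, human-readable artifact, a doctor can check the system's reasoning rather than trusting an opaque end-to-end prediction — the property the paragraph above credits with earning doctors' trust.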
Go deeper: Read Kaveh's whole post.
Illustration: Sarah Grillo/Axios
On Sunday, Kaveh wrote two posts about how researchers are contemplating their responsibility for the big dangers inherent in the new computer science age. In one, researchers were pushing their peers to assess and mitigate the negative effects of their own work. The other asked whether dangerous research should be suppressed.
On Twitter, Jack Clark, OpenAI's strategy and communications director, sparked a lively discussion.
What they're saying, excerpted:
Another Future reader, who asked not to be identified beyond that she works in AI, shared a nearly 50-year-old paper by the scientist Arthur Galston that grappled with similar ethical conundrums:
What are the morals of this story for the scientist interested in insuring useful social applications of his findings?
First, it would appear that no discovery is immune from the danger of misuse. This means that every scientist must be on the alert to the possibility that his discoveries, however ethically neutral or benign they may seem, can be perverted to antisocial ends.
Second, scientific societies must somehow be made to realize and act on their social responsibilities. Those individual scientists who recognize the importance of such actions must be prepared for long, frequently tedious educational and political campaigns in societies that determinedly seek to avoid any social entanglements.
Finally, the individual must be prepared to take what action he can, either alone or with a few like-minded colleagues, to ensure that the perversion of science does not go unchallenged.
Go deeper: When Kaveh was at The Atlantic, a computer scientist told him that cryptographers' reluctance to fight surveillance was a moral failure.
Illustration: Sarah Grillo/Axios
How we got from Facebook to Trump (Zeynep Tufekci – MIT Tech Review)
The U.S. recycling problem (Keerthi Vedantam, Jessie Li, Caresse Haaser – Axios video)
Ice cream and "experience retail" (Emily McCormick – Bloomberg Businessweek)
Hackers are rejoicing over 5G (Joe Uchill – Axios)
The ideas conundrum (Diane Coyle – FT)
Big Tech should get in front of regulation (David McCabe – Axios)
Photo: Warner Bros. Pictures
When you make a film that everyone will call "that big shark movie," it’s important that the shark be satisfyingly enormous. So the team behind a new film called The Meg brought on a software company that specializes in creatures to craft the megalodon.
Kaveh writes: Computer-generated imagery has been a staple of big-budget films for decades, but computer animation is expensive and time-consuming work. By contrast, the AI-powered system that created the megalodon makes it easy for animators to tweak the shark in ways small and large once a model has been created.
First, the bottom line: Great shark, awful plot. If you want to watch a good, thoughtful movie about the human condition, you’re looking in the wrong place. The Meg’s shark is cool, and the action scenes are fairly exciting, but the storyline is tired and the dialogue extremely canned.
On the other hand: Who ever went to a shark movie for the dialogue?
The details: After the film was shot, it became clear that the story would change, and that there would be a lot of back-and-forth with the director over the shark animations, said Mohsen Mousavi, the visual effects supervisor at Scanline, the company behind the movie’s effects.
Go deeper: Read Kaveh's whole post.