Axios Navigate

January 23, 2019
Good morning! Thanks for reading. Please share this newsletter and tell your friends they can subscribe here. If you have tips or feedback, just reply to this email.
Today Expert Voices contributor Chris Gundling looks at the process of simulating AV sensors and perception systems.
1 big thing: Lidar startups will soon detect the end of the road

Illustration: Aïda Amer/Axios
A big shakeout is coming among providers of a key autonomous vehicle technology: lidar sensors. Dozens of companies (estimates range from 50 to more than 100) have popped up in recent years to build them.
Why it matters: The startups all promise to deliver better technology to help self-driving cars see their environments. But deploying self-driving cars at scale has proven more challenging than the industry anticipated, and many companies will run out of steam before they find customers for their technology.
The big picture: In the gold rush that has characterized the race to autonomy, perhaps no underlying technology has attracted as much startup activity — or as much investment capital — as lidar.
- Since 2015, lidar startups have raised $1.1 billion in funding, according to CB Insights, which tracks venture capital activity.
- In 2018, a total of $324 million flowed into lidar companies across 26 deals, per CB.
- Some lidar companies plan to keep riding the wave, with additional fundraising rounds likely this year, says Duncan Williams, partner at Greentech Capital Advisors in San Francisco.
Yes, but: Hardly any of these companies are shipping more than a handful of sensors to customers for testing, making it difficult to determine which companies can actually survive in the long run.
- "That will just make the shakeout that much uglier," Williams says.
Background: Lidar, which detects objects by sending out laser beams and measuring how long it takes for the light to bounce back, is seen as the most promising technique for AVs to sense the world around them.
- Velodyne has long been the 800-pound gorilla in lidar, as detailed in Forbes, but now faces competition from all sides.
- Many of the newcomers, including Innoviz, Luminar and Ouster, are working on solid-state lidar systems, which have fewer moving parts and are less expensive.
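The time-of-flight principle behind all of these sensors reduces to a few lines of arithmetic. Here is a toy Python sketch, using illustrative numbers rather than any vendor's actual parameters:

```python
# Toy time-of-flight calculation: a lidar return's distance is half the
# round-trip travel time multiplied by the speed of light.

SPEED_OF_LIGHT_M_PER_S = 299_792_458

def range_from_round_trip(round_trip_seconds: float) -> float:
    """Return target distance in meters for a measured round-trip time."""
    # The pulse travels out and back, so halve the total path.
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2

# A pulse that returns 200 nanoseconds after firing hit something
# roughly 30 meters away.
print(range_from_round_trip(200e-9))  # ~29.98 meters
```

The hard engineering is everything around this equation: steering the beams, rejecting noise and repeating the measurement at enormous rates.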
A reckoning is coming. AV developers have spent the past few years sampling a multitude of lidar systems in their vehicle test fleets, but decision time is nearing.
- To meet expected targets for bringing their first self-driving cars to market by 2021 or 2022, manufacturers must commit to suppliers within the next 12 to 18 months, if not earlier.
- Lidar firms that win production contracts will be able to use those revenues to keep their businesses going until AV volumes pick up 5 to 10 years from now.
The bottom line: Long term, the industry only needs 5 to 10 lidar suppliers, experts say. That means the majority of existing lidar startups won't make the cut.
2. Waymo to build self-driving cars near Detroit

Waymo will partner with Magna to assemble self-driving vehicles in Michigan. Here's their robotaxi depot in Arizona. Photo: Waymo
Waymo, which in December launched the nation's first paid robotaxi business, is taking the next step toward commercialization with a deal to assemble self-driving cars at a new factory in Michigan.
Why it matters: Detroit frets constantly about Silicon Valley usurping its claim as the current and future home of the auto industry. But it turns out the Motor City, with its deep talent pool, is the perfect place for Alphabet's self-driving vehicle unit to assemble its futuristic cars.
What's new: Waymo said yesterday it plans to invest $13.6 million to open a factory in southeast Michigan.
- This will create at least 100 jobs, potentially reaching 400 new positions for engineers, operations experts and fleet managers.
- The company will receive a grant of up to $8 million from the Michigan Economic Development Corp., if it meets the 400-person hiring target.
- That would equate to $20,000 in state aid per job, per Reuters.
- Waymo doesn't build cars, so it partnered with Magna, a giant auto supplier with loads of experience as a contract manufacturer, to integrate its self-driving technology into Fiat Chrysler and Jaguar vehicles.
The big picture: Waymo has deals to buy up to 62,000 Chrysler Pacifica minivans and 20,000 Jaguar I-Pace luxury crossovers for its robotaxi service. It also has agreements with Avis and AutoNation to provide maintenance and fleet management services.
Yes, but: So far, Waymo has just 600 vehicles in its fleet, most of them still in testing mode. CEO John Krafcik has said the company plans to expand its fleet to as many as 20,000 vehicles by 2022, which seems like a stretch, given the slow roll of AV progress today.
3. Why AV sensors need simulations of their own

Illustration: Rebecca Zisser/Axios
Simulation has been critical to speeding up AV development, especially for motion planning and control algorithms.
But in these simulations, vehicles are either not perceiving the simulated world at all or perceiving only scenarios that have been previously encountered on the road, AV software engineer Chris Gundling writes for Axios Expert Voices.
Why it matters: With these approaches, developers must gather vast amounts of on-road data to probe the edges of a perception module's capabilities. An additional layer of simulation for AV sensors and perception systems could use synthetic data to accelerate AV development in several ways, as sketched after this list:
- Optimizing sensor placement on vehicles. Being able to position and visualize the output from sensors is easier and faster using software than real hardware.
- Creating data sets to train for rare situations. Simulations can model any imaginable scenario, however unlikely.
- Feeding perception algorithms with synthetic sensor data. This allows engineers to continuously create new scenarios and test their algorithms, rather than waiting for new on-road data.
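As a rough illustration of that last point, the toy Python below fabricates labeled scene objects the way a simulator might, deliberately oversampling rare actors that real driving logs almost never capture. Every name here is hypothetical, invented for illustration; it is not Applied Intuition's or anyone else's actual API:

```python
import random

# Toy sketch of synthetic training data: we invent scene objects with
# ground-truth labels attached, something a real-world pipeline can
# only get through slow, expensive manual annotation.

RARE_ACTORS = ["deer", "mattress_on_highway", "overturned_truck"]  # hypothetical labels

def synthesize_scene(num_objects: int) -> list:
    """Generate labeled objects, deliberately oversampling rare cases."""
    scene = []
    for _ in range(num_objects):
        scene.append({
            "label": random.choice(RARE_ACTORS),   # ground truth comes for free
            "x_m": random.uniform(-50.0, 50.0),    # lateral offset, meters
            "y_m": random.uniform(5.0, 120.0),     # distance ahead, meters
        })
    return scene

print(synthesize_scene(3))
```

The point is that ground-truth labels come for free in simulation, whereas real-world sensor data must be annotated by hand before a perception model can learn from it.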
The catch: Sensor simulation for perception algorithms is difficult because each sensor type (lidar, radar, camera, infrared) has different levels of complexity.
- These simulation platforms have to be capable of creating and rendering large photorealistic environments in real time.
- Sensor tasks add to the load: Properly simulating radar or lidar can require casting millions of synthetic rays per second.
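To get a feel for that number, here is a minimal back-of-the-envelope sketch in Python, assuming a hypothetical 64-channel spinning lidar. Real simulators run far more detailed models, typically on GPUs:

```python
# Back-of-the-envelope count of intersection tests a simulator must run
# to mimic one spinning lidar. All constants are assumed, not measured.

CHANNELS = 64          # vertical beams on the assumed sensor
AZIMUTH_STEPS = 1800   # horizontal samples per revolution (0.2-degree spacing)
REVS_PER_SECOND = 10   # a common 10 Hz spin rate

def rays_per_second() -> int:
    """Intersection tests needed per simulated second for one sensor."""
    return CHANNELS * AZIMUTH_STEPS * REVS_PER_SECOND

# 64 * 1800 * 10 = 1,152,000 rays per second, before modeling
# reflectance, beam divergence or multiple returns.
print(f"{rays_per_second():,} rays/s")
```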
The bottom line: Sensor simulation for perception is still a new process, but could fill an important gap in the development cycle of AV software.
Gundling is a senior software engineer at Applied Intuition, an AV simulation software company.
4. Driving the conversation
Bullied: Why do we hurt robots? (Jonah Engel Bromwich — The New York Times)
- If, like me, you're curious about the abuse that Waymo's self-driving minivans have taken in Arizona, you'll find this story interesting.
- Oddly, "our tendency to dehumanize robots comes from the instinct to anthropomorphize them."
- But some researchers think the abuse of less humanoid machines, like self-driving cars, might have more to do with fear of unemployment.
O, Canada: Ontario allows public road testing of autonomous cars — without someone behind the wheel (Peggy Lam — CBC News)
- Why it matters: The University of Waterloo in Ontario is a hotbed of AV development, so it's no surprise the province is changing the rules of its AV pilot program to allow testing on public roads without a safety driver, similar to Arizona and California.
Named: Daimler-BMW Mobility Services Joint Venture To Be Named "Jurbey" (Edward Niedermeyer — The Drive)
- The big picture: Daimler and BMW decided last March to combine their efforts on ride-hailing, car-sharing and other mobility services, a sign of just how expensive and distracting it can be to transform a business. Now that they have regulatory approval, they are expected to unveil details of the joint venture in February.
- But with a name like Jurbey, it's hard to keep a secret.
5. 1 bug thing

Studio image of an Ectemnius wasp. Photo: John Hallmen/Barcroft Media/Getty Images
Engineers want to make AVs see as clearly as human drivers. But maybe they should be benchmarking creatures with better eyesight, according to a new study in ACS Nano, the American Chemical Society's journal.
Why it matters: Single-lens eyes, like those in humans and many other animals, can create sharp images.
- But the compound eyes of insects and crustaceans have an edge when it comes to peripheral vision, light sensitivity and motion detection, the study says.
What's happening: Scientists are developing artificial compound eyes for use in AVs and robots, among other applications.
- Compound eyes are made up of tiny receptors, called ommatidia, each consisting of a lens, cornea and photoreceptor cells.
- Some insects have thousands of units per eye, and creatures with more ommatidia have increased visual resolution.
Yes, but: It's expensive and difficult to create artificial compound eyes in the lab.
- They tend to be too large and sometimes include only a fraction of the ommatidia and nanostructures typical of natural compound eyes.
- Some groups are using lasers and nanotechnology to generate artificial bug eyes in bulk, but quality is inconsistent and the resulting eyes often produce distorted vision.
The latest: Chinese scientists have come up with a simple, low-cost method for creating better bug eyes.
- First, they shot a laser through a double layer of acrylic glass. This caused the lower layer to swell, creating a convex dome shape.
- They then created an array of these tiny lenses and bent them along a curved structure to create the artificial eye.
- They grew nanostructures on top of the convex glass domes that — if you look closely — resemble a shag carpet.
- The result: microlenses with better anti-reflective and water-repellent properties.
My thought bubble: But how would they look on the front of a car?