As things stand today, the driverless car of the future can’t handle more than a dusting of snow.
It’s a known problem in the field, and vaguely embarrassing when the end result is supposed to be robots sophisticated enough to navigate the uncertainties of traffic and improve on lackluster human perception. In Boston, where NuTonomy has been road testing autonomous vehicles in cooperation with city planning officials, snow and seagulls have emerged as two of the biggest obstacles. “Snow not only alters the vehicle’s traction but also changes how the vehicle’s cameras and sensors perceive the street,” concluded a study by the World Economic Forum and the Boston Consulting Group.
For the local breed of unflappable seagulls—which can stop autonomous cars by simply standing on the street, unbothered by NuTonomy’s quiet electric cars—engineers programmed the machines to creep forward slightly to startle the birds. There’s not yet a solution for Boston snow.
After years of testing, with hundreds of cars and vans deployed on public streets and private facilities, even the best autonomous-driving efforts still struggle with inclement weather. The ultimate hurdle to the next phase of driverless technology might not come from algorithms and artificial intelligence—it might be fog and rain.
Another Boston-area startup is promising a way to solve these weather woes, just as leading players race to launch viable businesses. WaveSense has built a radar system to scan what’s below the road, down where there’s no snow at all, rather than parse wintry mix on top.
The cluster of sensors, bolted underneath a chassis like a skid plate, can scan 10 feet beneath the ground to reveal soil, water, roots, and rocks. Once WaveSense has scanned an entire roadway, it creates a map of the subsurface strata that can determine the location of a vehicle within a few centimeters.
Ground-penetrating radar isn’t that novel; archaeologists and land surveyors use it all the time. WaveSense claims it’s the first to use the technique to locate something above ground, and it can do so at speeds up to 65 miles an hour.
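WaveSense hasn't published how its matching works, but the basic idea of fixing a position against a prebuilt subsurface map can be sketched in a few lines: slide the live radar scan along the stored map and pick the offset where they correlate best. Everything here, from the synthetic one-dimensional "map" to the noise level and the `localize` helper, is invented for illustration.

```python
import numpy as np

def localize(prior_map: np.ndarray, live_scan: np.ndarray) -> int:
    """Return the offset (in map samples) where the live scan best
    matches the prior subsurface map, via normalized cross-correlation."""
    n = len(live_scan)
    scan_centered = live_scan - live_scan.mean()
    best_offset, best_score = 0, -np.inf
    for offset in range(len(prior_map) - n + 1):
        window = prior_map[offset:offset + n] - prior_map[offset:offset + n].mean()
        norm = np.linalg.norm(window) * np.linalg.norm(scan_centered)
        # Normalizing makes the match robust to overall amplitude changes
        score = np.dot(window, scan_centered) / norm if norm > 0 else -np.inf
        if score > best_score:
            best_offset, best_score = offset, score
    return best_offset

# Toy example: a synthetic subsurface profile, and a noisy scan
# taken at offset 40 along it
rng = np.random.default_rng(0)
prior = rng.normal(size=200)
scan = prior[40:60] + rng.normal(scale=0.05, size=20)  # slight sensor noise
print(localize(prior, scan))  # → 40
```

A real system would work in two dimensions and fuse the match with odometry, but the core remains correlation against a stored map rather than interpretation of the live scene, which is why snow on the surface doesn't matter.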
The startup vows to keep autonomous vehicles on a road, regardless of weather, while other sensors look out for traffic, stop lights, and careless pedestrians. “The sector to date has been really focused on replicating how humans drive,” says Tarik Bolat, WaveSense’s chief executive officer. “You actually should start from a clean piece of paper.”
The company just raised $3 million in seed capital in a round led by Rhapsody Venture Partners and has signed two customers to test the sensor on self-driving vehicles. WaveSense declined to name them.
At the moment, autonomous cars rely on a patchwork of sensors: GPS, traditional cameras, radar, and lidar technology that bounces lasers off nearby cars and pedestrians. Mother Nature essentially sidelines two of those four: Cameras are useless in fog and heavy snow, and lidar lasers careen wildly off raindrops and snowflakes. The remaining two have major deficiencies of their own. GPS connections can be slow and spotty, and radar is lousy at distinguishing obstacles—is that a pedestrian or a seagull?
The trick is programming a vehicle to make decisions on a blend of the best information from each system while ignoring the rest—what autonomous driving engineers call sensor fusion. “What’s evolving for everybody is basically the algorithms that understand what is being looked at,” says Gartner analyst Mike Ramsey. “This is a big part of the intellectual property of these companies right now.”
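One simple version of the blending Ramsey describes is inverse-variance weighting: each sensor's estimate counts in proportion to how much it can currently be trusted, so a weather-degraded sensor is effectively ignored. The sensors, positions, and variance figures below are made up for the sketch; production systems use far more elaborate filters.

```python
def fuse(estimates):
    """Inverse-variance weighted blend of (position_m, variance_m2) pairs:
    the noisier a sensor's estimate, the less it contributes."""
    weights = [1.0 / var for _, var in estimates]
    blended = sum(w * pos for (pos, _), w in zip(estimates, weights))
    return blended / sum(weights)

# Clear day: lidar is precise (low variance), so it dominates the blend.
clear = [(10.2, 0.01), (10.6, 0.5), (9.8, 4.0)]   # lidar, radar, GPS
# Heavy snow: lidar returns scatter off snowflakes, so its variance is
# inflated and the fusion leans on radar and GPS instead.
snow = [(12.9, 25.0), (10.6, 0.5), (9.8, 4.0)]

print(round(fuse(clear), 2))  # → 10.21, hugging the lidar estimate
print(round(fuse(snow), 2))   # → 10.55, mostly radar despite lidar's 12.9
```

The hard intellectual-property question Ramsey points to is upstream of this arithmetic: deciding, frame by frame, what those trust weights should be.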
Waymo, widely seen as leading the self-driving vanguard, says it has made progress teaching its software to better filter out “noise” from precipitation. Minivans modified with its autonomous technology are conducting road tests in 25 cities, including snowy Detroit and the rainy Seattle suburbs. Still, the Alphabet Inc. unit is only ferrying passengers who don’t work for the company on robo-taxi trips around the sunny Phoenix area, where it plans to launch a driverless ride-hailing business later this year.
WaveSense is trying to position itself at the end of a widening pipeline of money as transportation giants try to crack the code on what could be a $7 trillion business. Last year, investors poured some $350 million into lidar alone. With its Google origins and deep-pocketed parent company, Waymo can afford to build its own sensors. Many rival programs, even those at companies such as Ford Motor Co. that can spend billions on research, often rely on partners.
“You read about another lidar company every day,” says Rhapsody Managing Partner Carsten Boers. “I don’t know if FOMO is the right word, but there’s this incredible anxiety in the space right now.”
Much of the WaveSense research and development cost has already been covered by the U.S. military. The company hatched from the Massachusetts Institute of Technology's Lincoln Laboratory, a federally funded center that does national security-related research. The U.S. Army has been using ground-penetrating radar to find land mines for years.
Byron Stanley, now WaveSense’s chief technology officer, started tinkering with underground radar around 2009 while researching self-driving systems for armored trucks. His team at MIT showed the military a working prototype in 2012; a year later, it was jutting from the bumpers of nine-ton military trucks as they caromed through remote areas of Afghanistan. The roads in the Registan Desert look remarkably like the surrounding terrain—a problem shared with virtually every snowed-under U.S. city in February.
“Figuring out that it was even possible wasn’t unique,” Stanley says. “But it was something that no one had ever been able to do.”
WaveSense says it’s ready for the private sector now that it has managed to shrink the hardware to a manageable size. The box of electronics is 5 feet long, 2 feet wide, and 3 inches high, and the startup expects to deliver an even smaller version in coming months. Eventually, with enough demand, prices could fall to around $100 apiece.
Xavier Mosquet, senior partner at the Boston Consulting Group, thinks it will be at least two years before even an elite player such as Waymo is able to code a car brain to consistently handle the challenges of a Boston-like climate. “If you aren’t able to drive one hour a week, people will understand,” he says. “But if it’s four or five hours a day, it doesn’t work for a market.”
That’s why the earliest self-driving vehicles set loose on the world will be mostly restricted to sunny, dry cities that look like the places they’re most often tested. The weather, along with minimal government oversight, is a large part of the reason why the companies trying to perfect autonomous vehicles have flocked to Arizona.
“At the moment, even the most advanced players will call any kind of weather ‘out of scope,’ ” Bolat says.
Weather is a problem for the next generation of cars in multiple ways. Electric vehicles can be hamstrung by cold weather because battery power is needed to heat the cabin and the battery pack itself, whose chemistry operates efficiently only when warm. A deep chill can sap about 30 percent of the potential mileage from a battery.
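The back-of-the-envelope effect of that 30 percent figure is straightforward; the 250-mile rated range below is a made-up example, not a claim about any particular vehicle.

```python
# Applying the ~30 percent cold-weather penalty cited above
rated_range_miles = 250        # hypothetical EPA-rated range
cold_penalty = 0.30            # deep-chill range loss
cold_range = rated_range_miles * (1 - cold_penalty)
print(cold_range)  # → 175.0
```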
Ramsey at Gartner predicts that the first robo-taxis to roll out in a rain-slicked city such as Seattle will be bespoke iterations beefed up with additional sensors and computational power. “In the early days of autonomy, these things will not be functioning in anything heavier than a light rain,” he says. “These companies will be extremely risk-averse.”
Nuro, a company developing a fleet of self-driving delivery vehicles, is working to "future proof" its silver pods for groceries. A desert dust storm in July halted testing near Phoenix, highlighting how weather weakness can afflict even work in sunny climates. Nuro has other priorities, says co-founder Dave Ferguson, such as making sure the front end will crumple correctly on impact and programming sensors to handle the steep hills of San Francisco.
Weather is on the list, just not at the very top. “We have a lot of confidence that with our sensor suite, we can solve these problems, but no one has really done it yet,” Ferguson says of the weather. “And until it’s done, it’s not done.”
WaveSense will busy itself this winter by driving its ground-piercing sensors around the Boston area. A winter full of snow, and even the occasional obstinate seagull, will help with product development. “Eventually, we expect this to be something that’s on every vehicle that drives autonomously,” WaveSense’s Bolat says.