Technical difficulties are embarrassing — especially when you're supposed to be regulating high-tech companies.
That was the situation Tuesday, when a U.S. Senate committee hosted executives from General Motors, Google, ridesharing giant Lyft and auto supplier Delphi, along with a Duke University robotics researcher, to discuss an issue decidedly more nuanced than day-to-day IT troubles: integrating self-driving cars into society.
"Something very real and fundamental is shifting here," said Joseph Okpaku, vice president of government relations for Lyft, during the Senate Committee on Commerce, Science and Transportation hearing. "Concepts that could once only be imagined in science fiction are on the verge of becoming a reality."
A struggle at the hearing to tee up an old Google promotional video highlighted a primary point of contention: how government officials far removed from R&D hubs in Detroit or Silicon Valley are supposed to effectively regulate fast-moving transportation technology that raises big questions about safety, cybersecurity, data privacy and legal liability, among other issues.
Still, self-driving cars are often hailed as a prime opening for the U.S. to flex its advanced manufacturing muscle while creating jobs, increasing transportation efficiency and cutting smog-inducing traffic — a refrain echoed in other fields navigating industrial technology breakthroughs, such as drones and so-called smart cities.
"There are clear potential economic and safety advantages," said Mary Louise Cummings, director of the Humans and Autonomy Lab and Duke Robotics at Duke University. "How can we get there with minimal risk?"
States such as California are at the forefront of localized efforts to create new systems for testing self-driving cars. But as commercialization looms larger, businesses warn that outdated rules and a "growing patchwork" of new local, state and federal laws may stifle innovation and undercut U.S. competitiveness in the industry.
"America is currently in very much a leadership position," said Chris Urmson, director of self-driving cars at Google X. "Not a day goes by where a company in China isn't trying to recruit engineers from our team."
Although lawmakers at the hearing, mostly Democrats, pressed the issue of minimum standards for cybersecurity and privacy, Urmson contended, in Silicon Valley's signature libertarian tradition, that "the best action is to take no action."
Sen. Ed Markey (D-Mass.) pointed to auto industry history to argue for a more hands-on approach.
"Witnesses sat here 30 years ago and said the same thing about seat belts and air bags," Markey said. "We need minimal standards."
Care to share a self-driving car?
Automakers and tech companies such as Apple and Google say that fully autonomous vehicles could hit the market in a matter of years — a sharp contrast to more skeptical calculations that it could take decades — but there are also parallel shifts in transportation already underway.
Fast-growing cities have fueled the growth of shared transportation services, such as Zipcar, Uber and Lyft, calling into question how long personal car ownership will be the norm for a majority of consumers. And how many cars of the future will run on gasoline versus electric engines?
"We believe that the next logical step toward public availability of high-level automated vehicles will be controlled ride-sharing projects, such as what we are planning with Lyft," said Mike Ableson, GM's vice president of strategy and global portfolio planning.
Although the high costs of radar and camera systems for self-driving cars are likely to persist in the immediate future, Ableson said that introducing autonomous vehicles as ridesharing fleets could make deployment more economically feasible. In addition to maximizing the usage of expensive early autonomous vehicles and easing consumers into the idea, GM's top brass also "thinks it's very interesting" to make those shared cars electric, he added.
"We would introduce it originally as vehicles with drivers … within the next couple of years," Ableson said.
GM also isn't banking on retrofitting existing cars with fully autonomous systems, largely because of cybersecurity and other vulnerabilities that Ableson said are unlikely to be bridged with a stop-gap fix.
"We need to design these vehicles with this in mind," he said. "It would be a new car."
While GM and the other big guns in the automotive industry know minimum safety standards and testing requirements well, newer entrants in the space are also entering the political fray in a big way.
Ridesharing companies such as Lyft are no strangers to regulatory battles that pit fast-spreading technologies against slow-moving government and outdated laws, often amid fights with the taxi industry and disputes over background checks and pay standards for drivers.
"Three years ago, only one state had issued a regulatory framework for the ridesharing industry," said Lyft's Okpaku. "Today, 30 states have enacted legislation for this industry."
Among the regulatory issues at hand are how many hours self-driving cars should have to log on a test track and how state and federal agencies will balance oversight of the autonomous vehicle industry. Urmson and Ableson said that current legal gray areas could make it impossible to drive an autonomous car across state lines and could extend the time it takes to get the vehicles to market.
Cummings warned that the behavior of human drivers will ultimately be one of the toughest nuts to crack when it comes to rolling out self-driving cars. Complications are already arising as companies begin publicizing partially automated systems for freeway driving and other seemingly straightforward scenarios.
"Recently, Tesla suffered from one of [its] drivers getting in the back seat of the car while on autopilot," she said. "If humans just think the car is pretty good, then their behavior is going to be even worse."
In the most heated exchange of the two-hour hearing this week, Sens. Markey and Richard Blumenthal (D-Conn.) took issue with executives from all four companies for evading questions about whether they would support minimum standards for cybersecurity and privacy.
"The credibility that this technology has may become exceedingly fragile if people can’t trust standards that are uniform and mandatory," Blumenthal said.