Danger to Watch Out for in Autonomous Cars



Autonomous cars are still being developed, and an ever-growing quantity of data is amassed through cars' navigational technologies. Recently, questions regarding privacy, ownership, cybersecurity and public safety have made quite the buzz, as heavily guarded mapping data is collected and leveraged by private companies.
These companies may have intellectual property and other economic interests in protecting geospatial data. However, local governments, private citizens and other actors also have an interest in using the data to inform decisions on traffic, future construction projects, the allocation of public funds and other projects, all possibly holding public interest.
“Self-driving cars have the potential to transform our transportation network and society at large. This carries enormous consequences given that the data and technology are likely to fundamentally reshape the way our cities and communities operate,” explains Luis F. Alvarez León, an assistant professor of geography at Dartmouth, who has written a study on the subject.
“Right now, the geospatial data obtained by a self-driving car exists in technological and corporate black boxes. We don’t know who can see the data or profit from it. With insufficient government regulation of data from self-driving cars, this raises significant concerns regarding privacy, security and public safety,” Alvarez León added in an interview for phys.org. The author discusses how legislation, open source design and hacking are avenues that can be leveraged to help open the black box, enabling consumers and the government to gain access to this corporate-collected information. Each of these three approaches can help frame the public debate on the ownership and use of geospatial data from self-driving cars.
Autonomous cars rely on computerized systems, and user access to their data proves difficult when it is locked in closed networks controlled by automobile manufacturers. The study looks at how legislation could help make this data more accessible. Open source design is another avenue: Udacity, an online education company, offers a Self-Driving Car Engineer Nanodegree program in which students learn, develop and refine code for autonomous systems. Although there may be economic and intellectual property trade-offs for the manufacturers, open source design plays an important role in allowing for greater transparency, according to the study.
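To give a flavor of the kind of code students in such open-curriculum programs write, below is a minimal sketch of a PID lane-keeping controller, a standard introductory exercise in self-driving coursework. The class names and the toy one-dimensional simulation are illustrative assumptions, not taken from any specific curriculum.

```python
class PIDController:
    """Minimal PID controller, of the kind used for steering in lane-keeping exercises."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def step(self, error, dt):
        """Return a steering command that counteracts the current cross-track error."""
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return -(self.kp * error + self.ki * self.integral + self.kd * derivative)


def simulate(controller, initial_offset, steps=200, dt=0.1):
    """Toy 1-D simulation: the car's lateral offset responds to the steering command."""
    offset, velocity = initial_offset, 0.0
    for _ in range(steps):
        steering = controller.step(offset, dt)
        velocity += steering * dt   # steering acts on lateral acceleration
        offset += velocity * dt
    return offset
```

With gains chosen for damping (for example `kp=1.0, ki=0.0, kd=2.0`), a car starting one meter off the lane center converges back toward zero offset over the simulated run. Publishing and refining exactly this kind of control logic in the open is what makes open source design a transparency mechanism.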
In addition to legislation and open source design, hacking is both a systemic risk for autonomous vehicles and an approach that has been deployed to make car data and automated systems more transparent while holding self-driving car companies more accountable. In 2013 and 2015, two security experts hacked into a Toyota Prius and Ford Escape, and then remotely into a 2014 Jeep Cherokee, respectively, demonstrating the security flaws in vehicles that were not autonomous. Security vulnerabilities are likely to run much deeper with fully autonomous vehicles.
Precisely because hacking is a generalized risk for autonomous vehicles, certain instances of hacking in the context of research and advocacy have shown the importance of building secure systems. “If we’re going to adopt self-driving cars, then we should really make absolutely sure that they are secure. This requires input from parties outside of the corporations who are building those very systems, such as government, advocacy groups and civil society at large,” says Alvarez León.
In the U.S., Arizona, California and Michigan are currently some of the most hospitable states for self-driving vehicles, serving as testing areas for companies such as Waymo, which started as Google’s Self-Driving Car Project. While there are local regulatory battles, other states may open their doors to this new mode of transportation in the future.