The societal move toward automation is happening all around us. Physical maps and self-navigation are things of the past; computers, software and robots have taken over jobs—often frustratingly so (has anyone else screamed for a real person during an automated phone call?)—and little is required of us now that Siri tends to our daily needs and commands. But for former rally race driver Alex Roy, self-driving cars represent the pinnacle of this cultural shift and make clear the humanity that's at risk when everything becomes automated.
Roy, now an editor at TheDrive.com, spoke with Salon's Alli Joseph about driving innovation and its twofold future: what we gain and what we inherently give up.
Here are some highlights from their conversation. Watch the video for more on autonomous driving.
On the freedom driving affords us:
Technology is only as good as our understanding of it. There were accidents using cruise control when it first came out. We sell cars with 300, 400, 500 horsepower to kids whose parents pay for them; they crash and kill people all the time, and we're okay with it, because people want an element of freedom, and there's a cathartic element to driving that people crave beyond all the hassles and expense. But if you release technologies that require people to be smarter, or more thoughtful, than they often are, bad things will happen. So autonomy is a good thing, but semi-autonomy requires a parallel approach, not a series approach.
On the future of autonomous driving for young drivers:
I think what's going to happen is you're going to have over time—let's skip forward 20 years—this bifurcation, so culturally there'll be people, there'll be kids who grew up never owning a car. They may have driver's licenses, but their skills are very limited. If they're lucky, they played video games, driving games, so they have some basic skills, but they will probably spend a lot of time in self-driving cars and do little to no driving. And if they have a car that requires them to take over, they will be unsafe.
But over time, the equilibrium point where people have never driven and they're in self-driving cars will rise, and that'll be good. However, I think that there's a deeper cultural problem, which is that there is a valuable cathartic element to driving a car, to using any machine or device that amplifies human inputs and desires toward an outcome. As we move from a world of analog hobbies, pleasures and expressions to a digital and virtual one, people need an outlet. If you remove from people's choices their ability to decide how they go and where they go, and then you limit where you go to options within a corporate, marketed, advertised series of GPS choices and Yelp reviews on your screen, and there's no opportunity to make a mistake, and get lost, and discover something new, and there's no opportunity to take control of the machine and actually learn a skill, then you have removed an element of what it is to be human. So there is a moral and ethical problem to be solved in reducing road deaths. If we could reduce them without removing the catharsis derived from learning skills and commanding machines and automation, that would be ideal. But if we can't do that, then people will have consequence-free lives, in which case nothing is learned and life has no value.