The truth is that we’ve actually expected driverless cars for some time, with the first prototypes appearing in the 1950s. The idea isn’t so far-fetched, really – modern autonomous vehicles already harvest wheat, roam the wastelands of Mars, and vacuum our floors, so why not navigate city streets and the highways that connect them?
Driverless cars are already being tested in many parts of the world, and they’ve managed to prove their metal – see what I did there? – cruising highways, navigating stop-and-go traffic, returning from their own parking spots, even racing. Still, autonomous vehicle technology has a long way to go, and there’s no such thing as a fully autonomous vehicle on the road, if not because legislation requires human drivers, then because driverless cars are still “learning” how to drive. A “responsible adult” still needs to be ready to take over when the car needs help.
Are We Too Trusting Too Soon?
The precursors to driverless cars have been here for years, and many people have already become accustomed to features such as adaptive cruise control, electronic stability control, autonomous braking, parking sonar, even anti-lock brakes – every one of these systems takes some control out of the driver’s hands. The result is that we’ve put more and more trust in machines, so much so that our own driving skills have atrophied. Then, when called upon to “drive,” we’re in no condition to do so!
Then, as highlighted by a recent fatal accident involving a semi-autonomous vehicle, we see what happens when that trust is taken too far. The late Joshua Brown’s Tesla Model S drove under a tractor-trailer, which Tesla Autopilot was unable to “see” and react to, as Brown himself might have. Ultimately, we have to ask, “Is it too early to trust driverless car technology?” I believe Brown’s family would agree that it is indeed too early.
Driverless Car Ethics
In a recent study published in Science magazine, people were asked how driverless cars should be programmed. Imagine the scenario: a driverless car needs to choose between embedding itself in a tree, possibly injuring or killing the driver, or blasting through a crowded crosswalk, possibly injuring or killing a dozen people.
The question is, essentially, “Who should be protected – the driver or the pedestrians?” Admirably, most respondents chose to protect the pedestrians. Interestingly, when asked “Would you buy a car that protects the driver instead of the pedestrian?” these same people chose to protect themselves. Of course, this is just a thought experiment, but it has real-world implications which legislators and programmers have yet to address.
Apparently, neither the technology nor humanity is yet capable of making this decision, which means we’re not ready. Will future driverless cars, programmers, and passengers be able to answer this question satisfactorily?