Waymo is letting people ride in cars without touching a wheel, a gearshift, or a pedal. Its promotional video features a legally blind man being driven through local traffic by current AI self-driving tech. This will have major implications for driving, the auto industry, and related industries, so I want to address some of the common questions asked about it along with my thoughts.
Won't this be dangerous if people can't control the vehicle?
Not really. I think the reverse question should be asked, since this one assumes people are completely competent drivers. Humans are not perfect, and neither is AI, which is why the question comes up at all, but the imperfections are not on the same scale. Humans get tired, drunk, emotional, and distracted, and have far slower reaction times and less awareness than the technology does. You have to dart your eyes between the rear-view mirror and side mirrors, turn your body for a lane change, check your speed, ignore passenger distractions, and monitor everything moving on the road all at once, while a self-driving car has a 360-degree view of its surroundings, precision navigation, and reacts consistently. It's only a matter of time before autonomous vehicles go from being better than the average driver to being better than the best driver, and at that point it will seem odd that people were ever in charge of driving.
AI may be better than us, but it isn't perfect and never will be. What happens when an accident occurs? How will the ethics work? Who is to blame?
People are rightfully concerned with how autonomous vehicles will make ethical decisions. Should a car swerve to avoid a group and kill a single pedestrian instead? Should a car swerve into a wall and kill its passengers rather than hit a group of people? The answer may be up to you. Since cars sit unused most of the time, it's speculated that people will stop owning them and simply call for a car from an autonomous fleet. At that point you would be asked to agree to the terms of being a passenger, which would include the ethical rules the car will follow. People would then vote with their wallets, and it may work out that nobody wants to get into a car that prioritizes other things too far above the passenger. That same agreement could also cover liability in the event of an accident, killing two birds with one stone. Your experience with a self-driving car may even start with a test like this one, making you responsible for the decisions made if the scenario ever occurs: http://moralmachine.mit.edu/
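To make that a bit more concrete, here is a minimal sketch of what such a rider agreement could look like as a data record. Everything in it (the RideAgreement class, the policy names, the accept_terms helper) is hypothetical and only illustrates the idea of consenting to an ethics policy and liability terms before the car is dispatched:

```python
# Hypothetical sketch of a ride agreement a fleet service might ask a rider
# to accept before dispatch. All names and fields are illustrative, not from
# any real Waymo or fleet-operator API.
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass(frozen=True)
class RideAgreement:
    rider_id: str
    ethics_policy: str    # e.g. "minimize-total-harm" or "prioritize-passenger"
    liability_terms: str  # which party carries liability in an at-fault accident
    accepted_at: datetime


def accept_terms(rider_id: str, ethics_policy: str, liability_terms: str) -> RideAgreement:
    """Record the rider's consent; in practice the fleet operator would store this."""
    return RideAgreement(
        rider_id=rider_id,
        ethics_policy=ethics_policy,
        liability_terms=liability_terms,
        accepted_at=datetime.now(timezone.utc),
    )


# Example: a rider opts into a harm-minimizing policy before the car arrives.
agreement = accept_terms("rider-42", "minimize-total-harm", "operator-assumes-liability")
print(agreement)
```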
What about hackers, spying/tracking, and control of where you can go?
Consider the adversary. If a terrorist were to hack your car, or a whole fleet of cars, and cause mass crashes that kill thousands or even millions of people, isn't there cause for alarm? Yes and no. Every wave of digitization is met with this question. Take online banking as an example: hackers could in theory steal money from around the world, crash markets, and do all kinds of damage. That massive-scale attack hasn't happened, although individual crimes like identity theft occur often and are increasing rapidly. The real problems are security flaws and centralization. I honestly think blockchain tech could cover both, and it could be applied to the security of autonomous vehicles, the communication between them, and the agreements I alluded to in the question above. A blockchain would make it incredibly hard, virtually impossible, for a mass hack to occur or for companies to control you while you use the car. Government actors could pose a problem by disallowing such technology specifically to keep access to data (consider the San Bernardino terrorist's iPhone). Even so, a blockchain would still be hard to disrupt because of its decentralized and cryptographic nature.
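As a rough illustration of why tampering gets so hard, here's a minimal hash-chained message log in Python. It's only a sketch of the core blockchain property (each entry commits to the hash of the previous one, so rewriting history is detectable), not a real vehicle-security protocol; a production system would also need signatures and distributed consensus across many nodes.

```python
# Minimal hash-chained log: altering any past entry breaks every hash after it.
import hashlib
import json
import time


def _hash(body: dict) -> str:
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()


def append_message(chain: list, sender: str, payload: str) -> None:
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    block = {"sender": sender, "payload": payload, "time": time.time(), "prev": prev_hash}
    block["hash"] = _hash({k: v for k, v in block.items() if k != "hash"})
    chain.append(block)


def verify(chain: list) -> bool:
    """Return True only if every link and every block hash is intact."""
    for i, block in enumerate(chain):
        expected_prev = chain[i - 1]["hash"] if i else "0" * 64
        body = {k: v for k, v in block.items() if k != "hash"}
        if block["prev"] != expected_prev or block["hash"] != _hash(body):
            return False
    return True


chain: list = []
append_message(chain, "car-A", "braking hard at Main & 3rd")
append_message(chain, "car-B", "acknowledged, rerouting")
print(verify(chain))              # True
chain[0]["payload"] = "all clear" # an attacker rewrites history...
print(verify(chain))              # False: the tampering is detectable
```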
Those three questions seemed to be the biggest and most common, so I stopped there before this got too long. I also didn't want to get into what will happen to jobs, because that one could go either way. These are just my opinions on how things will work out in the future, but I'd like to know what everyone else thinks in the comments!
There's no doubt self-driving cars are the near future. The secret is the neural network: millions of cars talking to each other, learning from each other's mistakes in real time. The good old question of "which way should the car swerve, who should it kill?" would become obsolete in itself, since there would simply be no accidents barring severe malfunctions. But yes, it's a good question to answer while there are still human drivers on the road.
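One simplified way to picture that fleet-wide learning is a federated-averaging-style loop, where each car reports the model update it learned from its own driving and a server averages them into the shared model. This toy Python sketch only illustrates the idea; it is not how Waymo actually trains its system:

```python
# Toy fleet-learning sketch: average per-car model updates into shared weights.
from typing import List


def average_updates(updates: List[List[float]]) -> List[float]:
    """Average per-parameter updates reported by each car."""
    n_params = len(updates[0])
    return [sum(u[i] for u in updates) / len(updates) for i in range(n_params)]


def apply_update(weights: List[float], update: List[float], lr: float = 0.1) -> List[float]:
    """Nudge the shared weights in the direction of the averaged update."""
    return [w + lr * u for w, u in zip(weights, update)]


# Shared model weights (toy example with three parameters).
global_weights = [0.0, 0.0, 0.0]

# Each car reports the update it learned locally, e.g. after a near-miss.
car_updates = [
    [0.2, -0.1, 0.0],   # car 1
    [0.4, -0.3, 0.1],   # car 2
    [0.3, -0.2, 0.1],   # car 3
]

global_weights = apply_update(global_weights, average_updates(car_updates))
print(global_weights)   # the whole fleet benefits from each car's experience
```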
On a different note, the Waymo car is hideous. It's almost as if some computer nerds rigged it together with no real car designer on the team.