On Sunday, March 18, one of Uber’s self-driving cars struck a woman who was crossing the street outside of a crosswalk in Tempe, Arizona, USA, during a test of Uber’s self-driving technology. Few details about the accident have been released so far. A human safety operator was monitoring the car but apparently did not intervene; the operator may have been distracted, or there may be some other reason.
Immediately after the accident, Uber suspended all self-driving car testing until further notice. Uber has not released a full report of the accident, so little is known about its causes or about Uber’s complete response.
This is a bad sign for the self-driving car industry, especially since other companies are set to begin operating self-driving cars on public roads without a human supervisor in the vehicle starting April 2. The accident may delay the development and deployment of future self-driving cars, but we have yet to see reactions from other companies, notable figures, or the government.
Even if the human operator was distracted, this was still a self-driving car. It should have been programmed to detect the woman and stop.
This is a catastrophic failure that should not have happened. It makes me wonder how exactly the software is set up.
Should self-driving vehicles be held to a higher standard than human drivers? If a human couldn't have stopped in time, are the programmers to blame when an autonomous vehicle fails to stop? I don't have the answers to these questions, but there's a lot to consider.
Yes. Robotic manipulators, for example, do a better job than humans in a factory: they are faster, more precise, and hardly ever make mistakes.
This is because once the programming is in place, the robot does not deviate from it; it only knows what it has been programmed to do. It also never gets tired, since it has a constant source of energy.
The same can be said about autonomous robots. If the computer vision system detects an obstacle ahead, and that obstacle is moving, the software should assign a high probability that it is a living thing. In that case, all four brakes should have been applied, the car should have steered in a different direction, or it should have done a combination of both.
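To make that idea concrete, here is a minimal sketch of the kind of decision logic described above. This is not Uber's actual software, and nothing is known publicly about how its system is structured; the `Obstacle` type, the function names, and the 30-meter threshold are all hypothetical illustrations.

```python
# Minimal sketch of the decision logic described above.
# NOT Uber's actual system: all names, fields, and thresholds
# here are hypothetical illustrations.

from dataclasses import dataclass

@dataclass
class Obstacle:
    distance_m: float   # distance ahead of the vehicle, in meters
    is_moving: bool     # did the perception system observe it moving?

def plan_response(obstacle: Obstacle, left_lane_clear: bool) -> str:
    """Decide how to react to a detected obstacle.

    A moving obstacle is treated as likely being a living thing,
    so the car brakes hard and, if the adjacent lane is clear,
    also steers away.
    """
    if obstacle.is_moving:
        # High probability of a living thing: brake with full force,
        # and steer away too if there is room to do so safely.
        if left_lane_clear:
            return "full_brake_and_steer_left"
        return "full_brake"
    if obstacle.distance_m < 30.0:
        # Static obstacle close ahead: slow down as a precaution.
        return "moderate_brake"
    return "continue"

# Example: a moving obstacle 25 m ahead, with a clear left lane.
print(plan_response(Obstacle(distance_m=25.0, is_moving=True),
                    left_lane_clear=True))
```

The point of the sketch is that treating any moving obstacle as potentially alive is a cheap, conservative default: braking for a plastic bag costs a few seconds, while failing to brake for a pedestrian can cost a life.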
This accident points to a poorly programmed robot, or perhaps one that needs more sensors.