Should self-driving vehicles be held to a higher standard than human drivers? If a human couldn't have stopped in time, are the programmers to blame when an autonomous vehicle fails to stop? I don't have the answers to these questions, but there's a lot to consider.
Yes. For example, robotic manipulators already do a better job than humans in a factory: they are more precise, faster, and hardly ever make mistakes.
This is because once the programming is in place, the robot rarely deviates from it; that programming is all it knows. It also doesn't get tired, since it has a constant source of energy.
The same can be said of autonomous robots. If the computer vision system detects an obstacle ahead, and that obstacle is moving, the software should infer a high probability that it is a living thing. In that case it should engage all four brake calipers, steer in a different direction, or do a combination of both.
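To make that decision rule concrete, here's a minimal sketch in Python. Everything in it is hypothetical: the `Detection` class, the 10 m threshold, and the action names are just stand-ins to illustrate the logic, not how any real vehicle's software works.

```python
from typing import Optional
from dataclasses import dataclass

@dataclass
class Detection:
    """One obstacle reported by a (hypothetical) vision system."""
    distance_m: float   # distance ahead of the vehicle, in metres
    speed_mps: float    # the obstacle's own speed; > 0 means it is moving

def choose_action(det: Optional[Detection]) -> str:
    """Pick a response: a moving obstacle is treated as probably a
    living thing, so the vehicle brakes hard, steers away, or both."""
    if det is None:
        return "continue"                  # nothing detected
    if det.speed_mps > 0:                  # moving -> likely a living thing
        if det.distance_m < 10.0:          # too close to stop by braking alone
            return "brake_and_steer"       # combine both responses
        return "full_brake"                # engage all four calipers
    return "slow_and_reassess"             # static obstacle, reduce speed

# A moving obstacle 8 m ahead triggers the combined response.
print(choose_action(Detection(distance_m=8.0, speed_mps=1.4)))
```

The point of the sketch is that the conservative branch (moving obstacle, probably alive, so brake and evade) is a few lines of logic, which is what makes the failure in this case hard to excuse.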
This accident shows us a poorly programmed robot, or perhaps one that needed more sensors.