The same wonky things can happen to human drivers, and they happen all the time with DUI and fatigue. This is why roads have changed since the horse-riding days (you can think of a horse as a dumb self-driving vehicle that could get a drunk rider home with lane departure): we have traffic lanes marked on the roads, curbs separating people from vehicles, signals, laws regulating right of way, and so on.

I've spent many years operating programmed robots, and they are not flawless. In the old days (early '80s), when we had problems with them, the service department figured sun spots could have caused some of the unexplainable failures.
I remember one machine crash that was caused by a small metal chip that somehow got into the electronics cabinet and bridged a switch. How and why, nobody could say; at the time I was just the lucky one who had to deal with it.
Just like any computer can occasionally funk out for an unknown reason, or from a nearby lightning strike, and then need a complete shutdown or reboot, the same can happen to the most expensive new robotic industrial machines. Been there, done that.
I don't trust any electronic device.
Autonomous cars and trucks? There have been cases of ordinary, over-computerized cars effectively going autonomous and running away, and it can be especially bad in cars where every system is run through a computer of some sort and the driver controls are in essence just joysticks for a game controller.
Robots aren't perfect, and that's why assembly lines built for robots tend to be laid out differently than human assembly lines, to make the robots work better. I wouldn't be surprised if some roads ended up marked differently and self-driving cars took different routes than regular human-driven cars. The simplest way to avoid crashes in a self-driving car is to program it to follow the car in front and keep a safe distance; if human drivers did that, it could probably avoid 99.99% of accidents.
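To make that follow-and-keep-distance idea concrete, here's a minimal sketch of a constant-time-gap following rule, the kind of logic behind basic adaptive cruise control. All the function names, gains, and limits here are my own illustration, not any real car's control system:

```python
def desired_gap(speed_mps, time_gap_s=2.0, min_gap_m=5.0):
    """Target following distance: a fixed time gap plus a standstill buffer.

    At 20 m/s (~45 mph) with a 2 s gap this asks for 45 m of clearance.
    """
    return min_gap_m + speed_mps * time_gap_s

def follow_accel(gap_m, own_speed, lead_speed, k_gap=0.2, k_speed=0.5):
    """Simple proportional controller: close the gap error and match the
    lead car's speed, clamped to plausible braking/acceleration limits."""
    gap_error = gap_m - desired_gap(own_speed)   # positive = too far back
    rel_speed = lead_speed - own_speed           # positive = lead pulling away
    accel = k_gap * gap_error + k_speed * rel_speed
    return max(-5.0, min(2.0, accel))            # clamp to [-5, +2] m/s^2
```

With the gap exactly at target and speeds matched, the controller commands zero acceleration; if the gap shrinks, it brakes. The catch, of course, is everything this sketch leaves out: the lead car sensor glitching, a chip bridging a switch, or the computer funking out for no known reason.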