Former Lyft eng here. From my vantage point, as an industry we're nowhere near where we should be on the safety side. The tech companies are developing driving tech privately instead of openly. Why can a private, for-profit company "test" its systems on public roads? The public is at serious risk of getting run over by a hacked or buggy guidance / decision system. Even when a human operator has their hands hovering an inch off the steering wheel and a foot on the brake, if the car decides to gas it and swerve into a person, it is probably too late for the human crash-test driver to take over. This is going to keep happening.

The counterargument is that it's a net win for the transportation system if the number of crashes and deaths is statistically lower than with human-operated cars. I see this as the collision of what's possible with what's feasible, and I think we are years away from any of this being close to a good idea. :( Very sad for the family and friends.
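
To put a rough number on that, here is a back-of-the-envelope sketch in Python. The figures (25 mph, roughly 1.5 seconds for a disengaged human to notice and react, about 0.7 g of braking) are my own generic assumptions, not data from any specific incident:

    # How far does the car travel before a "hands hovering over the wheel"
    # safety driver can actually do anything? All numbers are assumptions.
    speed_mph = 25
    speed_ms = speed_mph * 0.44704                  # mph -> m/s (~11.2 m/s)

    reaction_time_s = 1.5                           # notice, decide, grab the wheel/brake
    braking_decel = 0.7 * 9.81                      # ~0.7 g on dry pavement, in m/s^2

    reaction_distance = speed_ms * reaction_time_s
    braking_distance = speed_ms ** 2 / (2 * braking_decel)

    print(f"travelled before reacting: {reaction_distance:.1f} m")
    print(f"travelled while braking:   {braking_distance:.1f} m")
    print(f"total stopping distance:   {reaction_distance + braking_distance:.1f} m")

That works out to roughly 26 m, several car lengths, before the vehicle is stopped, and that assumes the system isn't actively accelerating or swerving during the reaction window.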

Even worse: There are people who test their DIY self-driving hacks on public roads:

https://youtu.be/GzrHNI6eCHo?t=100

Using https://github.com/commaai/openpilot , which is a cool project, but not one to be testing on public roads.

That is indeed reckless, but that guy is testing open source self-driving technology.

Given that all the major car companies are keeping their technology private, I don't see how open-source systems are supposed to keep up without people doing this.

I think you can also contrast this with the far larger number of people who decide to drink alcohol and then drive.

Unless things have changed, the important parts aren't open source -- neither the vision pipeline nor the decision pipeline.

As far as I'm aware, this is the bit that needs training: https://github.com/commaai/openpilot
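
For anyone unfamiliar with what "the bit that needs training" means: end-to-end driving models generally learn to map camera frames to driving commands from recorded human driving. Here is a toy PyTorch sketch of that general idea; it is a made-up illustration, not openpilot's actual model or API:

    import torch
    import torch.nn as nn

    class ToySteeringNet(nn.Module):
        """Toy end-to-end model: camera frame in, steering angle out."""
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
                nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),
            )
            self.head = nn.Linear(32, 1)            # predicted steering angle

        def forward(self, frames):
            return self.head(self.features(frames).flatten(1))

    model = ToySteeringNet()
    frames = torch.randn(4, 3, 160, 320)            # stand-in for a batch of camera frames
    human_steering = torch.zeros(4, 1)              # stand-in for recorded human steering
    loss = nn.functional.mse_loss(model(frames), human_steering)
    loss.backward()                                 # one training step's worth of gradients
    print(loss.item())

The model itself is the easy part; collecting and validating enough diverse driving data is where the real work is.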