Regular reminder that any form of driving automation that requires driver oversight and does not explicitly transfer legal liability from the human driver to the system/developer is not "autonomous" or "self-driving." https://twitter.com/RCMPAlberta/status/1306600570791301123
If an employee trained by Uber can be criminally charged for failing to pay sufficient attention to an in-development driving automation system, Tesla owners can be too. https://twitter.com/Tweetermeyer/status/1306002308950495232
Here's the part that's really worrying: we know that driver inattention when using these kinds of systems is not some freakish aberration, but the norm. Given enough time, everyone tunes out. Please, I beg of you, listen to these experts! https://youtu.be/45mJGYiqrxY
I am deeply concerned about what we'll see when Tesla releases what it is calling "feature-complete Full Self-Driving" but still requires human driver oversight. This encourages untrained "testers" to push the limits of an unknown capability, while making them liable for failure.
AV developers have built up a considerable body of knowledge around best practices for on-road testing of AVs. These include multiple highly trained operators per vehicle, camera-based driver monitoring, and regular rotation of roles and partners. Teslas have none of these things.
Roads are a public space, full of people who did not choose to make the riskiest thing they do every day even riskier. Nobody deserves to be injured or killed so a company can cut corners in the development of a driving automation system. This is a line we should not cross.