• There are so many questions we have to ask ourselves about autonomous systems.
    Which rules designed for human situations simply don't apply in autonomous ones?  Take vehicle stopping distances at a given speed: part of that distance comes from the time it takes the human to react and begin to apply the brakes.  A human will be thinking about the road, but also listening to the radio, wondering what's for tea tonight and a thousand other things, all of which extend that reaction time.  An autonomous vehicle will be thinking about the job of piloting the vehicle 100% of the time and will be indefatigable.  It may not even have to detect and react to the vehicle in front; it may be told explicitly by the vehicle in front that braking is about to take place.  And it may have any number of sensors going far beyond those of a human.
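    To make the reaction-time point concrete, here is a minimal sketch of the standard stopping-distance arithmetic (reaction distance plus braking distance).  The speed, reaction times and deceleration figure are illustrative assumptions, not measured values.

```python
# Illustrative sketch: how reaction time contributes to total stopping distance.
# Total distance = reaction distance (v * t_react) + braking distance (v^2 / 2a).
# All figures below are assumptions for illustration, not measured values.

def stopping_distance(speed_mph: float, reaction_s: float, decel_ms2: float = 7.0) -> float:
    """Return total stopping distance in metres for a given speed and reaction time."""
    v = speed_mph * 0.44704                        # convert mph to m/s
    reaction_distance = v * reaction_s             # distance covered before braking starts
    braking_distance = v ** 2 / (2 * decel_ms2)    # distance covered while braking
    return reaction_distance + braking_distance

if __name__ == "__main__":
    speed = 70.0  # mph, an assumed motorway speed
    human = stopping_distance(speed, reaction_s=1.5)      # ~1.5 s: assumed distracted human
    machine = stopping_distance(speed, reaction_s=0.1)    # ~0.1 s: assumed sensing/compute latency
    signalled = stopping_distance(speed, reaction_s=0.0)  # braking announced by the vehicle in front
    print(f"Human driver:   {human:.1f} m")
    print(f"Autonomous:     {machine:.1f} m")
    print(f"V2V-signalled:  {signalled:.1f} m")
```

    On those assumed figures, the reaction component alone is tens of metres at motorway speed, and that is the part an autonomous or vehicle-to-vehicle system can largely eliminate.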
    The precision in driving that such technology makes possible might allow road widths to shrink to the point where cars travelling at 100mph in opposite directions need only 6" of clearance on either side, rather than the several feet we would allow human drivers (if we trusted humans to drive at 100mph on normal roads at all).

    In terms of the believability of data, I think there is a tendency to use data to prop up a preconceived view; hence the joke about using data the way a drunk uses a lamp post: for support rather than illumination.  As long as the data tells you what you expect, it goes unchallenged.  The instant it challenges what you need it to say, then "the data must be wrong".