How soon will they come, and what are the liability issues?
These are the sorts of things that will be a drag on flying cars as well.
Self-driving cars will be the greatest advance in car bomb technology since the invention of the car – and the bomb.
It’s a big nut to crack, since product flaws usually mean products liability claims, which are strict liability (meaning you’re at fault, regardless of negligence or intent).
Because of the liability hurdle, I think it’s going to take some very sophisticated networking and on-board computers before it can happen on a large scale.
About flying cars: I think that doesn’t happen on a large scale, either, until we have truly autonomous cars. No way people are going to be okay with teenagers flying into houses.
We tolerate teenagers driving into houses.
If planes were cheaper (or helicopters were quieter), I like to think that we could maintain our appreciation for freedom.
I’m all for the freedom we no longer quite have; just commenting on the likely legal problems. If it isn’t litigated to death first, the states and/or the federal government would probably feel the need to overregulate.
Safely piloting an airplane is at least 1 or 2 orders of magnitude more difficult than driving a car. Aircraft will therefore always be more complex (and dangerous) than autos.
Tie in the application to buses, trucks, and garbage trucks and the fight won’t even make it to liability issues.
The fights will be tectonic.
I’m going to bet that if this is allowed at all, it will very quickly become apparent that robo-drivers are vastly superior to human drivers. The only accidents that will occur will be due to circumstances that no human driver could possibly cope with, and time and again it will be seen that minor accidents would have been major with a human at the wheel.
More importantly, it will be seen that in two car accidents, the fault will lie with the human driver, not the robo-driver “victim”.
At that point, the real fight will begin, and humans will fairly quickly lose the right to drive on public roads.
See Daniel Keys Moran’s The Long Run.
One more reason to regret the demise of Intrade is that there is now no place I can easily cover that bet.
We have better than twenty years of experience with unmanned aircraft, and the damn things still keep crashing about an order of magnitude more often than piloted aircraft flying similar missions. And the accidents which occur are mostly ones that any competent pilot could cope with. This in an environment orders of magnitude simpler, less crowded, and more algorithm-friendly than a Los Angeles freeway during rush hour.
Yes, one of the things computers are really good at is precisely executing the “How to escape Calamity X” script. Turns out one of the things computers really suck at, and humans are pretty good at, is taking a list of thousands of anticipated potential calamities and, in about two seconds based on incredibly fuzzy data, figuring out which of the thousand calamities they are actually dealing with.
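The diagnosis problem described above can be sketched in a few lines. This is purely illustrative (the calamity names and sensor signatures are invented): matching a noisy reading against a library of anticipated-calamity signatures often leaves the top candidates nearly tied, which is exactly where the "escape script" approach breaks down:

```python
# Hypothetical sketch (names and numbers invented): executing a known
# escape script is easy; deciding WHICH calamity you are in, from fuzzy
# sensor data, is the hard part. Each signature is a rough expected
# sensor profile on a 0-1 scale.
calamity_signatures = {
    "tire_blowout":  {"yaw_rate": 0.8, "decel": 0.3, "vibration": 0.9},
    "ice_patch":     {"yaw_rate": 0.7, "decel": 0.1, "vibration": 0.2},
    "brake_failure": {"yaw_rate": 0.1, "decel": 0.0, "vibration": 0.3},
}

def diagnose(reading):
    """Rank known calamities by squared distance to the noisy reading."""
    def distance(sig):
        return sum((sig[k] - reading[k]) ** 2 for k in sig)
    return sorted(calamity_signatures,
                  key=lambda name: distance(calamity_signatures[name]))

# A fuzzy reading that sits between two signatures: the ranking barely
# separates "ice patch" from "tire blowout", yet the right escape
# maneuver for one is wrong for the other.
noisy = {"yaw_rate": 0.75, "decel": 0.2, "vibration": 0.5}
print(diagnose(noisy))
```

Scale the library up to thousands of signatures with overlapping profiles and a two-second decision budget, and the commenter's point about humans versus computers becomes concrete.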
See the real world, which is not a science fiction novel.
I have checked the laws in several states and, with the exception of Nevada, which recently passed laws about self-driving cars, there do not seem to be any laws that prohibit self-driving cars. The laws tend to deal with the person driving a vehicle having various restrictions, but not a word about an autonomous vehicle.
That is the thing about an even marginally free country: anything can be done until there is a law to restrict it.
The fact that Nevada has such laws is the result of lobbying by Google. Another time they forgot to follow their motto.
My wife and I have owned two Chrysler Pacificas, 2005 and 2008. The second one turned out to have a few problems, most annoying of which is the back-up warning beeper – get too close to something while backing and it beeps bloody murder.
Now, on the first car this wasn’t so bad. There were not many false alarms. On the second car, it goes off all the time, often at nothing.
My suspicion is that they purposely raised the sensitivity on the thing in order to avoid lawsuits. After all, if you ignore the beeper and back into something, well, that’s not their fault, is it? They tried to keep you safe. You were just stupid. And, with such legal thinking, a moderately useful and helpful device becomes worse than useless. Once they hook the things directly to the brakes, none of us will be going anywhere.
Which is my point, of course. Once we get fully automated cars, they will start slowing them down, making them uber-cautious, stopping for air pockets, and the like. And once they make automated cars mandatory, we won’t be going anywhere, certainly not anywhere fast.
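The beeper story above is really a claim about a tuning tradeoff. As a toy model (all numbers and names invented for illustration), here is how pushing an alarm threshold in the "lawsuit-averse" direction drives up false alarms on a noisy distance reading, even when nothing is actually close:

```python
# Toy model of a backup-warning beeper (all numbers invented).
# The sensor reads true distance plus noise; the beeper fires when the
# reading falls below the alarm threshold. Raising that threshold to
# "err on the safe side" mostly buys false alarms.
import random

random.seed(0)

def beeps(true_distance_m, alarm_threshold_m, noise_m=0.3):
    """Return True if one noisy reading triggers the beeper."""
    measured = true_distance_m + random.uniform(-noise_m, noise_m)
    return measured < alarm_threshold_m

def false_alarm_rate(threshold_m, clear_distance_m=1.0, trials=10_000):
    """Fraction of beeps when the nearest object is a full meter away."""
    hits = sum(beeps(clear_distance_m, threshold_m) for _ in range(trials))
    return hits / trials

cautious = false_alarm_rate(threshold_m=0.8)   # conservative tuning
jumpy    = false_alarm_rate(threshold_m=1.2)   # lawsuit-averse tuning
print(cautious, jumpy)  # the jumpy tuning beeps several times as often
```

With these made-up numbers the "jumpy" tuning fires on roughly five times as many clear-path readings, which is the commenter's "going off all the time, often at nothing."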
Starting around 1967, Piper manufactured a simple retractable-gear plane called the Arrow. It had an interesting feature: a simple system would automatically lower the landing gear (after sounding an alarm) if the pilot forgot to do so. You could throw a switch to disable the auto-extend feature if you were performing stall practice. The system was simple and quite reliable, with the possible exception of flight in icing conditions (the plane wasn’t certified for known icing and had no deicing equipment), when the auto-extend pitot could ice over. The system was standard on all Arrows made for decades.
You can guess what happened.
Apparently, it’s true that no good deed goes unpunished. Pilots and insurance companies alike hailed the automatic extension system as one of the most important innovations in modern aviation history. It was even copied by both Beech and Bellanca. Unfortunately, Piper was forced to discontinue the automatic gear feature after a pilot ran out of fuel near an airport, was apparently set up for landing, but got too slow during the dead-stick approach and had the gear drop out. The resulting drag compromised the airplane’s glide so severely that the pilot couldn’t make the airport and crashed. There was the inevitable lawsuit, and Piper was forced to forgo automatic gear from then on.
No amount of testing can prove a piece of software as elaborate as a self-driving car is free of bugs. It simply isn’t possible. Any self-driving car will be involved in accidents because no software, no matter how well thought out, can anticipate every possible driving condition and deal with every other idiot driver. Accidents are inevitable, and when they happen, everyone involved will be sued. Not even Google has enough money to defend against every scumbag tort lawyer that’ll go after them.
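The "can't test everything" claim is easy to make concrete with back-of-envelope arithmetic. The condition categories and counts below are invented, and deliberately modest; the point is only that independent conditions multiply:

```python
# Back-of-envelope illustration (categories and counts invented) of why
# exhaustive testing of a self-driving system is hopeless: even a small
# set of independent driving conditions multiplies into millions of
# distinct scenarios.
conditions = {
    "weather": 6,
    "lighting": 4,
    "road_type": 8,
    "traffic_density": 5,
    "nearby_driver_behavior": 20,
    "pedestrians": 10,
    "vehicle_faults": 12,
}

scenarios = 1
for options in conditions.values():
    scenarios *= options

print(scenarios)  # 6*4*8*5*20*10*12 = 2,304,000 combinations, and this
                  # still ignores timing, geometry, and event sequences
```

Over two million scenarios from seven coarse categories, before considering any interaction between them over time, which is where the real bugs live.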
How will lawyers react to self-driving ambulances?
Hopefully by driving in front of them.
[…]
Did I type that out loud?
I must not have — if I had, I would have typed “diving,” not “driving.”
I concur that the transition from automated cars being allowed to automated cars being mandatory is likely to be swift. Look for a period of unmanned delivery vehicles working out the kinks. It won’t be long; I doubt that I’ll be doing much, if any, driving in retirement (I’m 53).
What about a personal car that you can still drive yourself (probably by “wire”) but that also has a self-driving mode, maybe called “auto chauffeur” or something? Consider the possibilities…you could go to a party or bar, get blotto, and instead of a cab or designated driver simply have your car in “auto drive” mode take you home, or anywhere else.
Ya know that’s already illegal in most parts of the world? Just like sleeping in your car when drunk: you are “in charge of a motor vehicle while intoxicated”.
The cars in I, Robot (the Will Smith movie) had that. His character was teased for keeping control more than “normal.”
Tim,
You just hit on the largest flaw in the self-driving scheme. There will be a tendency for human operators, who are lazy by nature, to allow the vehicle to have total control over their destiny. We already see that with people who blindly follow GPS steering, or who set their cruise controls and refuse to think about what they might be doing to traffic around them by inching past vehicles in the slow lane while completely blocking the fast lane.
As a USAF pilot I instructed night low-altitude auto-TFR operations. HAL could fly the aircraft quite well and avoid the rocks. He was NEVER in control; at best we had an uneasy partnership. Many pilots who forgot to “pilot” and became passengers instead met an untimely demise when the system exceeded its limits unnoticed. If people take the attitude that self-driving technology is an “assist to” rather than a “replacement of” the operator, then the concept will work. Unfortunately, seeing how my fellow citizens drive, I think that’s an unlikely scenario. Liability will be directed towards the manufacturer instead of the operator, where it likely belongs.
Yep. If people can go to sleep behind the wheel they will go to sleep behind the wheel. Pilots on trans-Atlantic flights are a case in point. There comes a point where making the vehicle more autonomous makes it less safe.
If the automation in that car crashes as much as Google Navigator does on my phone, they’ll have to pry my steering wheel from my cold dead hands.
Yeah, I can already see the jokes coming, but you know what I mean.
Could be worse. Could be Windows.
“Google crashing Windows” is now of ambiguous context. Interesting times.
Actually Windows 7 has been quite stable for me. Only problem I’ve had was with nVidia drivers. Took two attempts at upgrades to get that fixed.
http://www.bloomberg.com/news/2013-05-07/tesla-ceo-talking-with-google-about-autopilot-systems.html
Tesla CEO Talking With Google About ‘Autopilot’ Systems
Flying cars indeed!
http://www.terrafugia.com/tfx-vision