Tom Pittman's WebLog

2016 February 13 -- Robot Cars & Law

The house rag for the IEEE (a professional organization I belong to) comes every month, but I rarely take time to read much of it -- usually none at all. This month, however, the cover story announces:
It is the year 2023, and for the first time, a self-driving car strikes and kills a pedestrian. A lawsuit is sure to follow. But exactly what laws will apply? Nobody knows.
The author is identified as a lawyer, but his commitments are not disclosed. However, the internet is wonderful: I found that the law firm where he practices includes as clients
some of the largest multinational corporations engaged in industries such as medical devices, semiconductors, software, electronic systems and pharmaceuticals.
In other words, he mostly lawyers for the guys who will be making the robot cars, not the people they kill. Y'all already know I'm not a big fan of lawyers, but I didn't tell you why. It's a very Christian thing, something that not even most people who call themselves "Christian" would think about, but Jesus taught it as "the Second Great Commandment" (second behind giving God first place in your life, 1C), which I usually abbreviate "2C". In everything you do, you need to give the other person at least as much deference as you want for yourself. That sort of destroys competition, as I pointed out seven years ago.

The American legal system is adversarial (the legal term for competition), which basically means that the smartest lawyer -- the guy who can convince the judge that the law is on his side, or who makes the most entertaining presentation to the jury -- wins, and the other guy loses. The jury gets to decide for any reason or no reason at all; it's called "nullification" because the law doesn't matter, only their opinion. If you ever start to think the jury has any other method in mind, consider my sister, who votes for the Presidential candidate whose face she likes. People do that.

Anyway, every lawyer believes it's his God-given (or "evolved", same thing) duty to scrape off the table into his clients' lap everything that isn't nailed down. It's absolutely antithetical to 2C. So it's his (and his firm's) business to make sure the cookies come off the table into the laps of Google and General Motors and Tesla and the other robot car wannabes, and not the families of somebody unlucky enough to have their child killed by a robot car.

He claims (and I believe) that well-made robot cars will be safer than human drivers. Certainly safer than Texas drivers (see "Back up, Texas Style"). Probably. He proposes limiting the liability of robot cars to no more than what a human driver would incur in comparable circumstances. The problem he didn't mention is that when a human driver runs over a pedestrian and kills him, he goes to jail for involuntary manslaughter. It's not as bad as running over the guy while DUI, but it's still jail time. Every driver knows that. They may not alter their driving much -- at least not in Texas -- but it is a motivating influence. What does that mean when a robot car runs over a pedestrian? That the car goes to jail? Who cares about that? That the carmaker goes to jail? That's what this lawyer is trying to prevent. If a bridge or a skyscraper collapses and people die, the builder or the engineer (or both) go to jail unless they can prove they used "generally accepted" practices to design and build it. This guy is trying to keep a million lines of computer code out of court. He said so.

You see, the problem is that carmakers need to balance the cost of litigation (and losing) against the cost of more testing and better designs. Remember the Pinto scandal? Ford had decided the cost of litigating would be less than the cost of making a safer design. If this lawyer has his way, the laws will set that cost of litigating artificially low, so the cars will be less safe -- maybe even less safe than human drivers. Does your iPhone never freeze up? Why do you need anti-virus software? Has your computer never crashed? A robot car has ten or a hundred times more code than a simple operating system like Linux or Windows or the iPhone's, and a thousand times more different ways it can break and kill people. Debugging that code costs a lot of money, and the makers will trade off that cost against litigating a few deaths. Bet on it.

Write your legislator and make sure they understand what the carmakers and their lawyers are trying to do to us. The vehicle codes are state by state, not Federal (yet). You vote. Cars don't. Car manufacturers in some other state or country don't vote here. The legislator needs to know that. But mostly they need to know that the money they get from Google or GM doesn't save lives here (in your state). The carmaker lobbyists are spending that money because legislators are cheaper than lawsuits. Don't be. We want them to work harder at their robot cars than Apple and Microsoft work at making their computers, you know the ones, which crash all the time and whose "EULA" promises nothing. We want them to promise something useful before we hand them a "Get out of jail free" card. We don't want to give them that GOOJF card at all.

Links

"The Problem with 21st Century AI"
Complete Blog Index
Itty Bitty Computers home page