Desert of My Real Life

{March 14, 2010}   Toyota Software

Toyota has been much in the news lately because of questions about the performance of a variety of their cars.  In the interest of full disclosure, I should say that I’ve owned four cars in my lifetime–two Fords and two Toyotas (my current car is a Scion, a brand made by Toyota).  When I was 14, I learned to drive a standard using my father’s company trucks, teeny tiny Toyota pickups.  So I have long been a fan of the company.

But I think the company’s response to reports of unexpected, uncontrolled acceleration has been quite problematic.  And that response made me realize how scary a situation we are all in when it comes to our cars, no matter the make or model.

I’ve understood for a long time that our cars are increasingly controlled by computers.  That realization came upon me gradually over the years, as my check engine light came on and computers could increasingly read whatever the problem was from diagnostic codes.  I was a software developer for a long time, and I believe we are placing too much trust in software; the Toyota issue is just another piece of evidence of that.
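
For the curious, here is a minimal sketch of what reading those diagnostic codes looks like.  It assumes the third-party python-OBD library and an ELM327-style adapter plugged into the car’s OBD-II diagnostic port–neither is mentioned above, so treat this as one possible setup, not the only one.

    # A minimal sketch of reading diagnostic trouble codes (DTCs), assuming
    # the third-party python-OBD library and an ELM327-style OBD-II adapter.
    import obd

    connection = obd.OBD()  # auto-detects the adapter's serial port

    # GET_DTC asks the engine control unit for its stored trouble codes,
    # the same codes a mechanic reads when the check engine light comes on.
    response = connection.query(obd.commands.GET_DTC)

    if not response.is_null():
        for code, description in response.value:
            print(code, "-", description)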

When I first started hearing about the Toyota recalls, the discussion was all about the unexpected acceleration being caused by gas pedals that get stuck or by floor mats that get wedged against the gas pedal.  These mechanical explanations for the problem are comforting because they can be fixed fairly easily.  Just replace the gas pedal assembly or the floor mats and the problem goes away.  Toyota would certainly like us to believe that the problem is mechanical and not a problem with the software.  They have implied that the National Highway Traffic Safety Administration’s report indicates that misplaced floor mats have caused all of the acceleration problems.  Unfortunately for Toyota, the NHTSA’s report simply said that they had found no other problems–yet.

So here’s the thing.  Software is complex.  The way software interacts with hardware is even more complex.  Finding bugs in software is sometimes incredibly difficult because it is impossible, in a complex system, to anticipate and test every single combination of conditions.  As a result, bugs in software can rear their ugly heads intermittently for years before they are discovered.  Software developers should understand this.
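
To make the combinatorial point concrete, here is a back-of-the-envelope sketch.  The subsystem and state counts are invented for illustration, not figures from any real vehicle.

    # Why exhaustive testing of a complex system is hopeless: the number of
    # condition combinations explodes.  All numbers here are hypothetical.
    subsystems = 30            # imagined number of interacting subsystems
    states_per_subsystem = 4   # imagined number of states each can be in

    combinations = states_per_subsystem ** subsystems
    print(f"{combinations:.3e} combinations to test")  # about 1.153e+18

    # Even at a million tests per second, exhaustive coverage takes ages:
    seconds = combinations / 1_000_000
    years = seconds / (60 * 60 * 24 * 365)
    print(f"roughly {years:,.0f} years of testing")    # tens of thousands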

Whenever possible, if lives depend on the integrity of a software system, an override should be built into the system.  In the case of Toyota vehicles (and cars of any other make), this means that there should be some sort of mechanical override of the software system.  An easy override: when the key is turned off (which should be a mechanical process unmediated by software–this is clearly not the case), the computer should shut down and the brakes should fall back into a purely mechanical mode.  Braking would then be more difficult, but it would still be possible.  Apparently, this does not happen in Toyota vehicles, since a driver in California recently had a high-profile case of unexpected acceleration.  Scrutiny has since turned to that driver’s history (he has had significant financial problems), but even if his case turns out to be a hoax, Toyota should seriously reconsider any decision to rely exclusively on its software.  Even the best software has bugs.
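
Here is an illustrative sketch of the kind of fail-safe I mean.  To be clear, this is hypothetical Python, not a description of Toyota’s (or anyone’s) actual braking system: a watchdog notices when the software side stops checking in and reverts to unassisted mechanical braking.

    # Hypothetical fail-safe sketch: if the software controller stops
    # sending heartbeats, braking falls back to a direct mechanical path.
    import time

    WATCHDOG_TIMEOUT = 0.1  # seconds without a heartbeat; invented value

    class BrakeController:
        def __init__(self):
            self.last_heartbeat = time.monotonic()

        def heartbeat(self):
            # Called by the engine-control software on every healthy cycle.
            self.last_heartbeat = time.monotonic()

        def software_assisted_braking(self, pedal_force):
            return pedal_force * 1.5  # hypothetical power assist

        def mechanical_braking(self, pedal_force):
            return pedal_force        # direct hydraulic link, no assist

        def apply_brakes(self, pedal_force):
            # If the software has not checked in recently, assume it is
            # hung and bypass it: harder braking, but braking nonetheless.
            if time.monotonic() - self.last_heartbeat > WATCHDOG_TIMEOUT:
                return self.mechanical_braking(pedal_force)
            return self.software_assisted_braking(pedal_force)

The point of the design is that the fallback path does not depend on the very component being monitored: even if the software is hung, the mechanical path still works.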



One of my favorite shows on NPR is On The Media.  Each week, the hosts examine a variety of topics related to the media, mostly in the US.  I hear the show on Sunday mornings on New Hampshire Public Radio.  On February 26, 2010, the show aired a story called “The Watchers.”  It brought me back to my graduate school days and my academic roots in computer science, specifically in pattern recognition and machine learning.

The story was about the value of the massive amounts of data that each of us leaves behind as we go about our daily electronic lives.  In particular, John Poindexter, convicted of numerous felonies in the early 1990’s for his role in the Iran-Contra scandal (convictions later reversed on appeal), had the idea that the US government could use computers to troll through this data, looking for patterns.  When I was in graduate school, deficit hawks were interested in this idea as a way to find people who were scamming the welfare system, and credit card companies were interested in using it to ferret out credit card fraud.  Then George Bush became president and 9/11 occurred.  Suddenly, Poindexter’s ideas became hot within the Defense Department.

In 2002, Bush appointed Poindexter head of the Information Awareness Office, part of DARPA, and Poindexter pushed the agenda of “total information awareness,” a plan to use software to monitor the wide variety of electronic data that we each leave behind with our purchases and web browsing and cell phone calls and all of our other modern behaviors.  The idea was that by monitoring this data, the software would be able to alert us to potential terrorist activity–in other words, to detect the activities of terrorists as they plan their next attack.

The On The Media story described the problems with this program, problems that we knew about way back when I was in graduate school in the early 1990’s.  The biggest problem is that the software is overwhelmed by the sheer volume of data currently being collected–something akin to information overload in humans.  The software can’t make sense of so much data, and “making sense” of the data is a prerequisite for being able to find patterns within it.
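
A quick back-of-the-envelope calculation suggests the scale of the problem.  Every number below is invented for illustration; none comes from the story.

    # All figures invented for illustration: a rough sense of the daily
    # data volume a total-information-awareness system would face.
    population = 300_000_000         # roughly the US population in 2010
    events_per_person_per_day = 50   # purchases, calls, page views, ...

    events_per_day = population * events_per_person_per_day
    print(f"{events_per_day:,} events per day")        # 15,000,000,000

    # Even if the software flags only 0.01% of events as "interesting"
    # and an analyst can review 500 flags a day, the humans drown too:
    flags = events_per_day * 0.0001
    analysts_needed = flags / 500
    print(f"{flags:,.0f} flags, {analysts_needed:,.0f} analysts needed")

However the flagging threshold is tuned, someone or something still has to make sense of whatever comes out the other end, which is exactly the overload problem.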

Why do we care about this issue?  There are a couple of reasons.  The first is that we’re spending a lot of money on this software.  In a time when resources are scarce, it seems crazy to me that we’re wasting time and money on a program that isn’t working.  The second is that data about all of us is needlessly being collected, so our privacy is potentially being invaded (if anyone, or any software, happens to look at the data).  Poindexter’s original idea was that the data would be “scrubbed” so that identifying information was removed unless a problematic pattern was identified.  That requirement has since been dropped, so our identifying information is attached to each piece of data as it is collected.  But I think the main reason we should care about this wasted program is that it is another example of security theater, which I’ve written about before.  It does nothing to make us actually safer; it is instead a way of pretending that we are safer.

When I was in graduate school, I would never have thought that we would still be talking about this idea all these years later.  Learning from the past isn’t something we do well.


