Ever since making my reservation and tuning in to the world of Tesla, I've repeatedly heard about this technology called autopilot. Before that I knew little to nothing about self-driving car technology.
My first reaction to autopilot was "meh". My second reaction was one of slight surprise upon reading through forum threads and other sources about how incredibly popular the notion of autopilot is, particularly in a Tesla. Many Model 3 reservationists seem thrilled about this feature.
My "meh" reaction comes from having worked with computers for over 30 years. A nugget of wisdom that came forth early on was that the moment you let a computer start make decisions for you, you are opening yourself up for trouble. Computers traditionally have not been good at making decisions where there is any sort of ambiguity involved... anything requiring human-like judgement. I have to wonder if even with all the iterations of programming algorithms slowly (maybe) pushing us toward -god forbid- artificial intelligence... has anything changed?
When I first heard the term "autopilot" I assumed that meant the car drove itself. Meaning, get in the car, program in a destination -kind of like Han Solo does in the Millennium Falcon- then kick back while the car drives itself to that destination... leaving you free to engage in other activities such as reading, sleeping, or even better, having a little Kissee-poo session with your passenger(s).
But nooooooooooo. As Elon himself has pointed out, autopilot works in cars the way it does on an airplane... it flies a sort of fixed course, but the pilot must stay alert to changing conditions and be prepared to take over at a moment's notice. Not being a pilot, I did not know that. To me, what is being called "autopilot" seems more like "driver assist". But that's just me. I've learned that the term "autonomous driving" describes what I was actually taking autopilot to be. My bad.
But after reading a few autopilot-related crash stories, something is starting to crystallize. Aside from sensor failures and other foul-ups with the technology (if any), I'm starting to get the sense that drivers will treat autopilot like autonomous driving. Are some of the reported accidents due to people relying too much on autopilot? If so, it's human nature. We get used to the idea that autopilot will, for example, slow down the car if it spots an obstacle on the road. That facility works great 5... 10... maybe 100 times in a row. We get used to that behavior. And adapt. We keep our eyes on the newspaper we're reading for longer and longer stretches. We turn our heads to make eye contact with the rear passenger for longer periods at a time. And quite likely, especially with teenagers, we engage in longer and longer Kissee-poo sessions (amazingly, "Kissee-poo" is already in the dictionary of this forum).
Bottom line, we are lulled into complacency... a false sense of security. Again, human nature.
So for me, the jury is still out on autopilot. One would suspect it's not going to take too many overconfidence-in-autopilot fender-benders before lawmakers go Pencils Up to try to address the issue. I'll bet anyone a nickel on that.
Moral of the story: Maybe the current technology actually should be called Driver Assist after all. Maybe we should have a soft voice gently remind us each time we engage autopilot that autopilot is NOT autonomous driving. Maybe this. Maybe that. I don't know. Maybe I'm just a Luddite.
One thing is for certain: when I get my 3, I'm DISABLING the autopilot before letting any young people drive the car!
Oh. Am I going to use autopilot when I get my 3? Most likely. Only human, you know.