What’s up with that Tesla autopilot crash?

I assume you’ve heard about this.

The Woodlands Fire Department, the Montgomery County Hospital District, and Cypress Creek EMS were dispatched around 9 p.m. Saturday to a fire in the woods in the Carlton Woods subdivision on Hammock Dunes Place.

Several neighbors had called to report a fire in the woods and that a car had crashed and exploded, said Palmer Buck, chief of The Woodlands Fire Department.

When the responding units arrived at the scene, firefighters discovered the bodies of two men in the 2019 Tesla Model S, according to the Montgomery County Police Reporter. One was in the front passenger seat and the other in the rear passenger seat.

Harris County Precinct 4 Constable Mark Herman told the Associated Press on Monday that investigators are “100% sure” that no one was driving the car.

Federal investigators are on the scene to determine what happened. The claim that no one was driving the car at the time is of course of interest, for all the obvious reasons. Elon Musk has publicly disputed this assertion, claiming it's not possible because Tesla's Autopilot function will shut down if no one is in the driver's seat. It turns out that's not exactly true.

Consumer Reports engineers easily tricked our Tesla Model Y this week so that it could drive on Autopilot, the automaker’s driver assistance feature, without anyone in the driver’s seat—a scenario that would present extreme danger if it were repeated on public roads. Over several trips across our half-mile closed test track, our Model Y automatically steered along painted lane lines, but the system did not send out a warning or indicate in any way that the driver’s seat was empty.

“In our evaluation, the system not only failed to make sure the driver was paying attention, but it also couldn’t tell if there was a driver there at all,” says Jake Fisher, CR’s senior director of auto testing, who conducted the experiment. “Tesla is falling behind other automakers like GM and Ford that, on models with advanced driver assist systems, use technology to make sure the driver is looking at the road.”

Our demonstration comes as federal and local investigators continue to probe the cause of a fatal crash Saturday in Texas in which an apparently driverless 2019 Tesla Model S struck a tree, killing the vehicle’s two occupants. Harris County Precinct 4 Constable Mark Herman, who was on scene at the crash, told CR that he’s almost certain that no one was in the driver’s seat when the vehicle crashed. (The Model S in the crash and our Model Y are different models, but they both have Autopilot.)

We tried to reach Tesla to ask about the Texas crash but did not hear back. Tesla CEO Elon Musk tweeted Monday evening that data logs recovered from the crashed Model S “so far show Autopilot was not enabled,” and he suggested that it would not be possible to activate Autopilot on the road where the crash took place because of the lack of painted lane lines. The National Highway Traffic Safety Administration and the National Transportation Safety Board are investigating the crash, which occurred on a winding road in Spring, Texas, outside of Houston.

CR wanted to see whether we could prompt our own Tesla to drive down the road without anyone in the driver’s seat. So Fisher and Kelly Funkhouser, CR’s program manager for vehicle interface testing, took our 2020 Tesla Model Y out on our test track. Funkhouser sat in the rear seat, and Fisher sat in the driver seat on top of a buckled seat belt. (Autopilot will disengage if the driver’s seat belt is unbuckled while the vehicle is in motion.)

Fisher engaged Autopilot while the car was in motion on the track, then set the speed dial (on the right spoke of the steering wheel) to 0, which brought the car to a complete stop. Fisher next placed a small, weighted chain on the steering wheel, to simulate the weight of a driver’s hand, and slid over into the front passenger seat without opening any of the vehicle’s doors, because that would disengage Autopilot. Using the same steering wheel dial, which controls multiple functions in addition to Autopilot’s speed, Fisher reached over and was able to accelerate the vehicle from a full stop. He stopped the vehicle by dialing the speed back down to zero.

“The car drove up and down the half-mile lane of our track, repeatedly, never noting that no one was in the driver’s seat, never noting that there was no one touching the steering wheel, never noting there was no weight on the seat,” Fisher says. “It was a bit frightening when we realized how easy it was to defeat the safeguards, which we proved were clearly insufficient.”

There’s video at the link, and I also recommend listening to Friday’s What Next TBD podcast, which discusses this crash, Tesla’s spotty record with its Autopilot feature, AI and the driverless car question, and more. Tesla is not currently cooperating with the NTSB on this, which has drawn some ire from Rep. Kevin Brady, who represents The Woodlands. I probably won’t follow this obsessively, but as driverless cars are an interest of mine I will keep an eye on it.

This entry was posted in Planes, Trains, and Automobiles.

3 Responses to What’s up with that Tesla autopilot crash?

  1. Alan Coovert says:

    Electric cars are vaporware and are just a marketing ploy by the auto industry to sell regular vehicles.

  2. Dan Wallach says:

    Right now, self-driving features are in the “uncanny valley”: they’re good enough that you trust them more than you really should, even though the failure modes are wildly non-obvious.

    The AI experts outside of Tesla still see this as an open research problem. But the more restricted universe of freeway driving on divided roads with good lane markings? That’s now coming from a bunch of car companies.

  3. Pingback: Tesla disputes official account of that autopilot crash – Off the Kuff

Comments are closed.