Tesla owner explains Autopilot behavior at Model X accident scene

Discussion in 'Model X' started by simonalvarez0987, Apr 1, 2018.

  1. simonalvarez0987

    simonalvarez0987 Active Member

    Dec 21, 2017
    #1 simonalvarez0987, Apr 1, 2018
    Last edited by a moderator: Aug 1, 2018


    A Tesla owner recently shared a theory on factors that might have led up to the fatal Model X accident near Mountain View, CA, on March 23. Driving the same stretch of road on Autopilot, the Tesla owner observed deviations in the street's markings as well as repair cuts in the pavement, features that might have caused the electric car's sensors to misread the highway's lanes.

    The 36-second clip was uploaded and shared on YouTube by Privater, who added annotations to the video highlighting his observations. At the 0:05 mark, the Tesla owner noted that the road markings deviated from their original line at the beginning of a repair cut. Further along (0:12 into the clip), Privater noted that the repair cuts in the road became very prominent. These cuts could have confused Autopilot into reading them as a lane, especially under the direct glare of the sun.

    As the barrier where the fatal Model X accident took place came into view (0:23 into the video), Privater noted that the section of the road leading up to the crash cushion was marked by solid white lines. As can be seen in the Tesla owner's clip, the lines were spaced almost widely enough apart to form a lane, which Autopilot could also have misread.

    The Tesla owner noted that he had been driving the same stretch of road on Autopilot for almost two years. During that time, Privater stated, his car had misread the road markings and nearly collided with the crash cushion once or twice. He described his experiences in a response to a comment on his YouTube video.

    Tesla Model X accident theory [Credit: Privater/YouTube]

    “On the video, my car is on Autopilot. I drive the same section for nearly two years, (and) 99.9% of (the) time, I’m on Autopilot. However, this kind of error only happened to me once or twice. It’s scary enough for me to keep high alert on this intersection,” he wrote.

    In an update to its first statement about the fatal Model X accident, Tesla confirmed that the ill-fated electric SUV was on Autopilot when it collided with the highway barrier. According to Tesla, the Model X driver had received several visual warnings and one audible hands-on warning earlier in the drive, and had not placed his hands on the steering wheel for the 6 seconds before the fatal accident. Overall, the driver had about 5 seconds and 150 meters of unobstructed view in which to steer the car away from the highway divider before the collision occurred.

    In a statement to Reuters, NTSB spokesman Chris O'Neil expressed the agency's disagreement with the Elon Musk-led company's decision to release information about the investigation to the public.

    “The agency needs the cooperation of Tesla to decode the data the vehicle recorded. In each of our investigations involving a Tesla vehicle, Tesla has been extremely cooperative on assisting with the vehicle data. However, the NTSB is unhappy with the release of investigative information by Tesla,” O’Neil said.

    As we noted in a previous report, the Model X crash was so severe because a crash attenuator, a highway safety device designed to absorb the impact of a colliding vehicle, had not been repaired by Caltrans since a 2010 Toyota Prius smashed into it 11 days before the Tesla accident. In a statement to ABC7 News, Caltrans said that the standard timeline for repairing a crash attenuator is 7 days, or 5 business days, after an accident. The safety device's repairs were delayed, however, due to storms in the area.

    Watch Privater’s Autopilot drive-by in the video below.


    Article: Tesla owner explains Autopilot behavior at Model X accident scene
    • Useful x 2
  2. imipsgh

    imipsgh New Member

    Jun 26, 2017
    Tesla\'s \"Autopilot\" is not remotely safe.

    Setting aside the fact that public shadow driving for AI development and testing, as opposed to proper aerospace-level simulation, will never lead to L4 and will cause thousands of needless casualties, a system that has to ignore stationary objects in the lane and cannot tell pavement markings from lane markings is beyond reckless. These vehicles should not be on the road.

    There is a far better way.

    Aerospace Systems Engineer-Member SAE AV Testing Task Force

    Impediments to Creating an Autonomous Vehicle
  3. J.Taylor

    J.Taylor Active Member

    Feb 13, 2017
    Reaching level 5 autonomy is still a way off. Someday cars will be able to drive themselves, but for now, a human driver is still a necessary part of a safe trip.
    • Like x 1
    • Agree x 1

Share This Page