Tesla Autopilot “nag” complaints prompt upcoming screen alert adjustments

Discussion in 'Model S' started by simonalvarez0987, Jun 12, 2018.

  1. simonalvarez0987

    simonalvarez0987 Active Member

    Joined:
    Dec 21, 2017
    Messages:
    860
    #1 simonalvarez0987, Jun 12, 2018
    Last edited by a moderator: Jun 13, 2018

    Elon Musk has noted that Tesla would be adjusting the screen alerts on its electric cars, addressing complaints from Model S, Model X, and Model 3 owners about the persistent Autopilot nag in the software’s current iteration.

    Musk’s Twitter update came as a response to Peter McCullough, a Tesla owner who aired his grievances about Autopilot v8.1 (2018.21.9). According to the Tesla owner, he now needs a “white-knuckle grip” on his car’s steering wheel to keep the system’s nags at bay. Musk responded promptly, stating that Tesla would be adjusting its vehicles’ alerts to clarify what drivers need to do to address the system’s warnings.



    Autopilot nags were introduced with the rollout of v8.0 in September 2016, and Tesla has been refining its alert system ever since. With the recent release of v8.1 (2018.21.9), however, Autopilot’s nags have become more frequent, with some drivers seeing the “Hold Steering Wheel” prompt 15-20 seconds after taking their hands off the steering wheel.

    The more frequent nags were flagged by Tesla’s user base as soon as the update started rolling out. One driver even noted that the increased frequency of the AP nags had discouraged him from using Autopilot altogether, since the “Hold Steering Wheel” alert would not disappear even when his hand was gripping the wheel. In response to the complaint, Musk noted that an upcoming update should address the issue.



    While the increased frequency of Autopilot’s nags is perceived as an annoyance by a number of Tesla drivers, the driver-assist system’s changes are ultimately designed to maximize safety on the road. If anything, most of the complaints about the software’s current iteration seem to stem from a misunderstanding of how Tesla’s system detects drivers’ interaction with the vehicle. Tesla vehicles’ sensors do not measure the intensity of a driver’s grip on the steering wheel; instead, the system detects resistance that drivers apply to the steering wheel. This is highlighted in Tesla’s Owner’s Manual for its vehicles (page 81 in the Model S manual).

    “Autosteer detects your hands by recognizing light resistance as the steering wheel turns, or from you manually turning the steering wheel very lightly (i.e., without enough force to retake control). When your hands are detected, the message disappears and Autosteer resumes normal operation.”

    A Model 3 owner applies resistance by holding the steering wheel in the 3 o’clock position while Autopilot is engaged. [Credit: zxn11/Reddit]
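
    To make the distinction concrete, here is a minimal, purely hypothetical sketch of torque-based hand detection as the manual describes it. The threshold value, units, and function name below are invented for illustration and do not reflect Tesla’s actual implementation.

    ```python
    # Hypothetical sketch only -- NOT Tesla's code; the threshold and names
    # are invented to illustrate the "resistance, not grip strength" idea.

    TORQUE_THRESHOLD_NM = 0.3  # assumed detection threshold (made-up value)


    def hands_detected(resistance_torque_nm: float) -> bool:
        """Return True if the steering column senses enough counter-torque."""
        return abs(resistance_torque_nm) >= TORQUE_THRESHOLD_NM


    # A white-knuckle grip that simply rides along with the wheel applies
    # roughly zero resistance, so it goes undetected:
    print(hands_detected(0.0))   # False -> "Hold Steering Wheel" nag persists

    # Light, steady counter-pressure (e.g., a hand resting at 3 o'clock)
    # produces a small measurable torque, which clears the alert:
    print(hands_detected(0.4))   # True -> message disappears, Autosteer resumes
    ```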

    Tesla owners who have vehicles equipped with Autopilot are set to see a number of considerable improvements to their vehicles’ performance in the coming months. During Tesla’s 2018 Annual Shareholder Meeting, Elon Musk noted that Autopilot’s reliability and capability would increase exponentially over the next 6-12 months. Musk also highlighted that the improvements to the system would be rolled out quickly.

    Even more recently, Elon Musk stated that the first Full Self-Driving features of the company’s electric cars would be introduced in August, with the rollout of Software Version 9. In his announcement, Musk noted that Autopilot resources have so far been focused on increasing safety, but that once Software Version 9 comes out, Tesla could start enabling FSD features.

    The new and upcoming Autopilot features will not be limited to owners who purchased Tesla’s Enhanced Autopilot system, however. During the shareholder meeting, Musk also confirmed that Autopilot free trials would “hopefully” be reintroduced next month, giving drivers who have not purchased the system an opportunity to try out the intelligent driver-assist feature.

    Article: Tesla Autopilot “nag” complaints prompt upcoming screen alert adjustments
     
  2. joeski1

    joeski1 Member

    Joined:
    Dec 15, 2016
    Messages:
    205
    Location:
    Voorhees, Nj
    My AP hasn't nagged me once in over 2 years...

    I disabled it and haven't missed it one bit.

    30,000+ miles accident-free with AP, Autopark, Summon, lane change, high beam, and lane centering ALL disabled.

    ALL totally unnecessary for the driving I do... and I do some driving.

    They could eliminate all that garbage in a "hands-on only" vehicle, sell the vehicle for $25K less, and most folks wouldn't miss the dents any of these systems will eventually incur on their rolling Fabergé eggs that cost mega bucks to collision repair.

    Respectfully...

    Learn how to drive.

    Your passengers, other drivers, and your vehicle will thank you for it.
     
  3. imipsgh

    imipsgh New Member

    Joined:
    Jun 26, 2017
    Messages:
    12
    Location:
    US
    Elon Musk is the Pied Piper of Autonomous Vehicles
    The press recently reported that Tesla does not have a feature in its Autopilot system that properly monitors whether a driver is paying attention to the road or is able to take over from the AP. While that handover process cannot actually be made consistently safe (more on that below), this is yet another example of Elon Musk misleading his customers and their families into paying for the privilege of risking their lives as guinea pigs in order to create an autonomous vehicle. The problem is that the process Tesla and most AV makers use will never get remotely close to producing a truly autonomous vehicle. Just as with the driver who was banned from driving in the UK a couple of weeks ago for using Tesla's Autopilot (AP) from the passenger seat, this demonstrates that Tesla wants its drivers to let go of the steering wheel no matter what its legalese states. If Tesla actually wanted people to keep their hands on the wheel, its system wouldn't allow this to happen. Of course, the reason is that Tesla needs its customers not only to let go of the wheel but to get into the very same accidents it will later scapegoat and abandon them for getting into. Elon Musk needs to mislead, use, and then abandon his paying customers so he can use them and their families as guinea pigs to train the AI in the "Autopilot" system. The problem is that this approach is futile and kills people for no reason.
    Public shadow driving is the process where a driver alternates between hands-on driving to train the AI and letting go of the wheel so the system can be evaluated. Far too many other companies do this as well, Uber being another example. However, Tesla, comma.ai, and some others take it further by using their paying customers and their families, not paid drivers, as guinea pigs. The process is futile because it is impossible to drive and re-drive, stumble and re-stumble on, enough scenarios enough times to get anywhere close to L4. Toyota estimates the miles needed are one trillion. My very conservative calculation, assuming cheap cars, sensors, and drivers and no other costs, says that is $300B over ten years. To make matters worse, the process is unnecessarily dangerous. The process of the vehicle handing control back to the driver cannot be made consistently safe by any monitoring and control system, especially in complex and dangerous scenarios, because such a system cannot provide the 6-45 seconds needed to gain the proper level of situational awareness and take the right action. Beyond this, the AV makers will have to move from public shadow driving in benign or simple scenarios to scenarios that are complex and dangerous, eventually running thousands of accident scenarios thousands of times each. That will mean thousands of casualties.
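    (As a rough back-of-envelope check of the scale implied by those numbers: the trillion-mile target and the $300B ten-year cost are the figures cited above; the per-vehicle annual mileage is an assumption added purely for illustration.)

    ```python
    # Back-of-envelope check of the scale implied by the figures above.
    # The one-trillion-mile target and $300B ten-year cost come from the post;
    # the per-vehicle annual mileage is an illustrative assumption only.

    TOTAL_MILES = 1e12                 # Toyota's cited estimate of miles needed
    TOTAL_COST_USD = 300e9             # the ten-year cost estimate cited above
    YEARS = 10
    MILES_PER_CAR_PER_YEAR = 50_000    # assumption: a heavily used test vehicle

    implied_cost_per_mile = TOTAL_COST_USD / TOTAL_MILES          # $0.30/mile
    fleet_size = TOTAL_MILES / (YEARS * MILES_PER_CAR_PER_YEAR)   # 2,000,000 cars

    print(f"Implied cost per mile: ${implied_cost_per_mile:.2f}")
    print(f"Dedicated test vehicles needed over ten years: {fleet_size:,.0f}")
    ```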
    Unfortunately, Elon Musk doesn't get any of this. He has actually said the lives lost in the pursuit of autonomous vehicles are a means to an end: that some lives need to be lost now to save many more later. The problem is that this is not accurate where public shadow driving is concerned. Musk has also said he only needs 6B miles to get to L4; I believe 4B have been driven so far. That has resulted in Tesla bragging it can now drive straight across a narrow bridge, killing Walter Huang by having his car follow pavement lines instead of street markings directly into a barrier, and driving one of its cars into a stopped firetruck. (That last one because these systems cannot reliably see non-moving objects directly in front of them, leaving them to ignore such objects and drive right through them.) On the other side of this, though, Elon Musk would like to avoid accountability and liability, so Tesla built some legal CYA into its manuals and contracts telling the driver not to let go of the wheel, and if drivers do, they will be blamed for the accidents they are involved in.
    Elon wants it both ways. He wants to convince his overly trusting customers to pay $5,000 for the privilege of risking their lives as guinea pigs to create a system that will supposedly save lives, appealing to their desire to be part of a higher good. But he also wants to avoid blame and liability, so he creates a legal protection scheme that abandons his customers when they get into the very accidents Tesla needs them to get into so the AI can be properly tested. This is why you can see videos of him in his cars with his hands off the wheel. He sends mixed messages to fool people into being his beta testers, all while using a process that will never get close to creating the systems he has convinced his customers and the public will eventually save lives.
    What is needed here is for the industry to switch from public shadow driving to aerospace/DoD-level simulation, and for someone to step in and force Tesla and Elon Musk to do the right thing, not unlike what NASA did in 2011 when it said SpaceX code failed safety testing. SpaceX was forced to regroup, hire aerospace engineers, and do things right; NASA saved Elon from himself. NHTSA should put a stop to this practice, or at least impose a moratorium while these issues are properly vetted. The problem there is that NHTSA is part of the problem. It stated in 2015 that handover can be made safe, but the study it used to support that position was poorly executed: it deemed full control of the vehicle to be attained, after the driver had been distracted, by the driver simply grabbing the wheel and facing forward. It purposefully chose to ignore the quality of the actions taken after gaining "total control," the situational-awareness time needed to perform them properly, and whether that time could be provided by the monitoring and alert system.
    As you can see, we have a perfect storm here: the watchdogs are part of the problem. So what is the real solution, short of more people, especially the first child or family, dying needlessly? We need people, especially those in the press and at various levels of government, to stop being overly impressed by the folks who make the apps on their devices or the games that they play, to do their due diligence, and to police an industry that is clearly unable to police itself.
    For more information on this, please see my other articles:
    Impediments to Creating an Autonomous Vehicle
    https://www.linkedin.com/pulse/impediments-creating-autonomous-vehicle-michael-dekort/
    Autonomous Levels 4 and 5 will never be reached without Simulation vs Public Shadow Driving for AI
    https://www.linkedin.com/pulse/autonomous-levels-4-5-never-reached-without-michael-dekort
    DoT, NHTSA and NTSB are Enabling Autonomous Vehicle Tragedies
    https://www.linkedin.com/pulse/dot-nhtsa-ntsb-enabling-autonomous-vehicle-tragedies-michael-dekort/
    Open Letter to Elon Musk - Please consider changing your approach in creating autonomous vehicles
    https://www.linkedin.com/pulse/open-letter-elon-musk-michael-dekort/
     
  4. joeski1

    joeski1 Member

    Joined:
    Dec 15, 2016
    Messages:
    205
    Location:
    Voorhees, Nj
    I will NEVER pay one cent more for any of this auto BS again. They will have to give it away for free after my experience with it.

    You forgot to note the latest AP-induced incident: an MS slamming into a parked police cruiser in Laguna Beach just last week. The Tesla shot across the road and appears to have T-boned the parked police SUV in the driver's-side front door area.

    The sheriff noted this is the 2nd such Tesla accident on that stretch of road in 2 years.

    Coincidence?

    Bad code?

    A TESLA HAL1000 suicide mode?

    DISABLED

    100% safer now.
     
  5. Michael Russo

    Michael Russo Moderator

    Joined:
    Dec 16, 2016
    Messages:
    1,830
    Location:
    Pau, France
    @joeski1, same comment here as for the assessment of the interior of the 3: opinions vary on this one, though the safety aspect is obviously less prominent there!

    Many love AP, even more so those who understand how to use it right...
     
    • Agree x 1
