My wife and I had access to Tesla Full Self Driving (Beta) and used it obsessively for at least 6 months, but then we struck out 5 times and lost access to FSD Beta for the second time. We were told that access would be restored with a future software update. Tesla has never described specifically what a failure (a "strike") is, and sometimes we seemed to get two or three strikes at the same time. According to Tesla, we are among the ~285,000 people who have paid from $5,000 (April 2019) to $15,000 (since September 2022) for Full Self Driving (FSD). We are supposedly among the ~160,000 who have passed the safety test and are actively using the system.
Tesla is very proud of data showing that drivers using FSD have far fewer accidents per mile than drivers who don't. If that is true, it seems very counterproductive — even immoral — to unnecessarily suspend access to the system for anyone who has paid for it. [Editor's note: On the flip side, it could be the hypersensitivity about who is allowed to use FSD that has led to the good stats, presuming they do still show fewer crashes per mile driven. —Zach Shahan]
The "five strikes, you're out" policy, followed by 60+ days in purgatory, seems to me like a very bad system. Every Tesla driver who uses the standard Automatic Lane Assist knows that if you screw up by torquing the steering wheel too little, too much, or (mainly) too infrequently, you lose access to it for "the rest of the drive." If you are driving on a limited-access road, you must wait until the next exit, pull off, come to a stop, and put the car into park; then your access is restored. It seems to me that this would be an excellent system for FSD as well: it causes significant inconvenience, and the punishment fits the crime. Why would Tesla remove access to its wonderful safety system for any longer than this?
Tesla has been using the "torque the wheel" test since the earliest days of Automatic Lane Assist. It determines whether your hands are on the wheel, but it doesn't determine whether you are paying attention to the road. For at least 4 years now, Tesla has installed a camera above the rearview mirror in its cars that can determine whether the driver's face, maybe even eyes, are directed toward the road while the FSD software is in use. Since the camera method is far superior to the wheel-torquing method, I don't understand why Tesla needs to put the driver in double jeopardy by using both. It has even been reported that Tesla may eliminate the wheel-torquing method if the driver can drive safely using FSD for 1,000 miles. Really? Another test of questionable benefit? Just be done with steering wheel torquing for all Tesla cars that have the cabin camera.
Last night, after waiting ~60 days, we got our third software update (2022.44.30.10) since we lost access. This time it included FSD V10.69.25.2. Not only was our access to FSD restored, but we were also allotted 5 new strikes.
Note: Purgatory aside, for over 300 days, my wife and I have obsessively used and observed Tesla FSD in our Model 3 through 9 versions of the software: V10.5, V10.8, V10.10, V10.11.2.1, V10.12.2, V10.69, V10.69.2.4, V10.69.3.1, and now V10.69.25.2.
My wife and I are going to try very hard to retain access longer this time! In the past, we used FSD obsessively whenever possible. This time, we are going to turn FSD off except when we can give total concentration to keeping access to the software. No more eating, navigating, adjusting controls, distractions from grandkids, cross-country driving, etc., when using FSD. There is not much reason to use FSD when driving cross-country anyway, because regular FSD gives you Automatic Lane Assist on limited-access roads and will take the exit to the Superchargers. Cross-country driving gives you many hours to lose concentration and lose one of your 5 allotted strikes. Ironically, when an experienced driver is at the wheel, most driving is done reflexively and the stress level is low. Unfortunately, your concentration level must be higher with FSD, because you can't always be absolutely certain what it is going to do.
With FSD, the camera above the rearview mirror watches your face and knows if you have looked down at your phone, looked at the screen to your right, or perhaps shut your eyes too long. Most would agree that the driver should be concentrating on the road directly ahead. But the first warning is a flashing blue band at the top of the screen to your right. If you are concentrating on the road ahead, you may miss that warning in your peripheral vision. The secondary warning is an audible signal. However, by the time you get the audible signal, you are well on your way to a forced disengagement (i.e., a strike).
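To make that escalation sequence concrete, here is a minimal sketch in Python of how a driver-monitoring loop like the one described above might step from a visual warning to an audible warning to a forced disengagement (a strike), with five strikes triggering a lockout. This is purely illustrative; all names, tick counts, and thresholds are my assumptions, not anything Tesla has published.

```python
# Purely illustrative sketch of the escalation described above.
# All names, timings, and thresholds are assumptions, not Tesla's code.

class DriverMonitor:
    MAX_STRIKES = 5  # per the "five strikes, you're out" policy

    def __init__(self):
        self.strikes = 0
        self.inattentive_ticks = 0  # consecutive ticks with eyes off the road

    def update(self, eyes_on_road: bool) -> str:
        """Called once per monitoring tick; returns the current action."""
        if eyes_on_road:
            self.inattentive_ticks = 0
            return "ok"
        self.inattentive_ticks += 1
        if self.inattentive_ticks < 3:
            return "visual_warning"    # flashing blue at the top of the screen
        if self.inattentive_ticks < 6:
            return "audible_warning"   # the secondary, audible signal
        # Warnings ignored: forced disengagement, i.e., a strike.
        self.strikes += 1
        self.inattentive_ticks = 0
        if self.strikes >= self.MAX_STRIKES:
            return "fsd_locked_out"    # ~60 days until an update restores access
        return "forced_disengagement"
```

In a scheme like this, the fix proposed earlier would amount to replacing the `fsd_locked_out` branch with the "rest of the drive" reset that Automatic Lane Assist already uses.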
It has been reported that Tesla is being sued over deaths that occurred while FSD was active. I have now been driving my Model 3 safely for over three years and almost 90,000 miles, using FSD obsessively for the 6 months I've had it and Automatic Lane Assist obsessively the rest of the time. I follow the warnings in the FSD and ALA instructions, and I am prepared to intervene instantly whenever the automatic software fails.
Tesla Automatic Lane Assist is a very powerful system! It will keep your car in the center of the lane better than all but the most expert drivers can. It's far superior to the lane assist I had on my 2018 Nissan Leaf. It will slow down automatically and accurately make turns as sharp as those marked 25 mph. Tesla FSD extends this to hairpin turns marked 15 mph and will also navigate a roundabout (rotary) accurately.
Tesla FSD is very predictable except when it's not. Faced with a recurring situation in which it moves into the turn lane when following the navigation requires staying in the center through lane, it makes the same mistake every time. After exiting I-15 in Orem, Utah, onto 1600 North, we have observed this behavior numerous times through many software updates, and it persists in the latest version. A driver encountering it for the first time, however, would never expect it. In general, I don't trust FSD in heavy traffic except at traffic lights. It will frequently react too slowly when merging onto a busy street or highway; at that point, I give it a nudge with the accelerator. Even in light traffic at a stop sign, FSD will often stop too early and creep slowly forward until it determines it is safe to proceed. Meanwhile, unless you are patient, you will be annoyed, and the driver behind you certainly will be.
One of the most disturbing characteristics of FSD is that it sometimes picks the wrong lane, whether a turn lane or a wide bike lane. This behavior also persists in the latest version.
I have used FSD Beta now through nine versions of the software (see versions above). I enjoy being part of an artificial intelligence experiment. I really like being able to set an address into the navigation system, pull out into the street in front of my house, and have my car drive to that address — without intervention in some cases. Sometimes, I like to patiently test the performance in light traffic. Other times, I will only use FSD (particularly in heavy traffic) when I am confident that it will work.
Source: CleanTechnica