Tesla released a new version of its controversial "Full Self-Driving Beta" software last month. Among the updates in version 11.4 are new algorithms governing the vehicle's behavior around pedestrians. What's troubling, however, is a video posted to Twitter over the weekend showing that although Tesla's system can see pedestrians crossing the road, the car may choose not to stop, or even slow down, as it drives past them.
The video was posted by the Whole Mars Catalog account, a high-profile pro-Tesla account with more than 300,000 followers. The tweet, viewed 1.7 million times, featured a five-second video with the accompanying text:
One of the most uplifting/exciting things I've seen in Tesla Full Self-Driving Beta 11.4.1.
It detected a pedestrian, but instead of slamming on the brakes, it kept going, like a human driver who knew there was enough time to do so.
The person who posted the video went on to note that it was filmed in San Francisco and that anyone who isn't okay with this driving behavior must not be familiar with city life. (As someone who has lived in big cities her entire life, I'm certainly not okay with cars not stopping for pedestrians at crosswalks.)
Most partially automated driving systems, like General Motors' Super Cruise or Ford's BlueCruise, are geofenced to a controlled operating domain, usually limited-access divided highways. Tesla has taken a different approach, letting customers engage FSD Beta on surface streets.
Not everyone is comfortable with Tesla drivers road-testing unfinished software around other road users. In February, the National Highway Traffic Safety Administration required Tesla to issue a recall covering approximately 363,000 vehicles with the software installed.
The agency had four main complaints, including that "the FSD Beta system may allow a vehicle to behave unsafely around intersections, such as traveling straight through an intersection while in a turn-only lane, entering a stop sign-controlled intersection without coming to a complete stop, or proceeding into an intersection during a steady yellow traffic signal without due caution."
The April 11.4 update was supposed to improve the cars' behavior, but now there's more evidence that FSD Beta is still causing Teslas to break traffic laws. Section 7 of the California Driver's Handbook, which covers laws and rules of the road, classifies pedestrians as vulnerable road users and states that "pedestrians have the right-of-way in marked or unmarked crosswalks. If there is a limit line before a crosswalk, stop at the limit line and allow pedestrians to cross the street."
This isn't the first time Tesla's software has been programmed to break traffic laws, either.
FSD is “make or break” for Tesla
Tesla CEO Elon Musk has spoken repeatedly about the importance of FSD to his company, saying that it is "make or break" for Tesla and that it is the difference between Tesla being "worth a lot of money or worth basically zero."
FSD Beta has been implicated in a number of crashes and is the subject of several open federal investigations into Tesla's electric cars. The option now costs $15,000, and every time the automaker declares another feature "complete," it is able to recognize some of the deferred revenue it has been collecting as payment for the software.
Despite this bold public stance, Tesla has been far more cautious when dealing with regulators. In 2020, it told the California Department of Motor Vehicles that it did not expect FSD to become significantly more capable and that it would not progress past what is known as SAE Level 2, which requires a vigilant human in the driver's seat to remain responsible for the vehicle's actions.
Or, as author Ed Niedermeyer put it more succinctly: Full Self-Driving is not, and will not be, actually self-driving.
Tesla holds its annual shareholder meeting later today in Texas.