Tesla is working hard to make its assisted driving system, Autopilot, drive more and more like a human being. The goal is for the car to react more naturally and less rigidly, with ongoing updates to the Full Self-Driving (FSD) software aimed at smoother driving and better decision-making. However, it has emerged that, in some cases, the system may not always follow traffic laws to the letter.
It appears that Tesla is training the system to disregard certain road rules in order to make its driving feel less robotic. For example, there are reports that FSD may ignore signs such as "No Turn on Red" to better mimic the behavior of a human driver. This approach is meant to make the driving experience more flexible, although it could raise safety concerns in complex traffic situations.
Criticism has also come from former Tesla employees, who say that when training the system the company prioritized certain videos, such as content that had gone viral online. This could skew how Autopilot is refined, potentially at the expense of the more ordinary situations that all users encounter.
In short, in its effort to make assisted driving more "human", Tesla may be exposing its customers to unintended risks.