FSD v12 Breaks the Rules

Author Walter Isaacson has been giving sneak previews of his new biography of Tesla CEO Elon Musk - and in one recent interview with CNBC, Isaacson relayed some new information on Tesla's current design philosophy for its Full Self-Driving (FSD) system.

According to Elon, the company's new approach to training FSD is to throw out hard-coded rules and treat it as a learning system instead.

The original idea was reportedly explained to Musk by Dhaval Shroff, who works for Tesla’s Autopilot division. He told Elon to think of it,

[...] like ChatGPT, but for cars,
— Dhaval Shroff, Tesla Autopilot division

And the idea sounds dangerous - after all, we have road laws for a reason - but how often do human drivers strictly follow those rules? How many times has someone been rear-ended because they slowed too quickly for a yellow light instead of safely accelerating through, for instance?

There are a bunch of unwritten rules about the way we drive that keep us safe, but the second you put a computer program that can't understand nuance into the mix, accidents happen - like some of the ones we have seen across previous editions of the FSD tech.

Instead, the Autopilot division pitched the idea of throwing away the hard-coded rules of the road and letting the system learn from incident recordings - of which Tesla has thousands of hours. By watching other drivers, the system should pick up that it needs to stop at a stop sign or a red light - even if it also picks up the creeping stop that everyone does but isn't strictly legal.
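To make the contrast concrete, here is a toy sketch - not Tesla's actual pipeline, and the names (`rules_based_policy`, `LearnedPolicy`, the situation/action labels) are all invented for illustration - of the difference between a rules-based policy and one that simply imitates what recorded drivers did:

```python
from collections import Counter, defaultdict

# Rules-based approach: every situation has to be anticipated and
# hard-coded by a programmer ahead of time.
def rules_based_policy(light: str) -> str:
    if light == "red":
        return "stop"
    if light == "yellow":
        return "slow"
    return "go"

# Learning approach: no rules are written down. The system tallies what
# recorded drivers actually did in each situation and imitates the majority.
class LearnedPolicy:
    def __init__(self):
        self.observations = defaultdict(Counter)

    def train(self, clips):
        # Each "clip" is a (situation, action) pair extracted from driver video.
        for situation, action in clips:
            self.observations[situation][action] += 1

    def act(self, situation):
        # Imitate the most common human behaviour seen for this situation.
        return self.observations[situation].most_common(1)[0][0]

# Feed it recordings: most drivers stop at red, some roll through yellow.
clips = (
    [("red", "stop")] * 95 + [("red", "go")] * 5
    + [("yellow", "slow")] * 60 + [("yellow", "go")] * 40
)

policy = LearnedPolicy()
policy.train(clips)
print(policy.act("red"))  # "stop" - picked up from data, never hard-coded
```

The upside is that nobody has to enumerate every edge case; the downside, as the article notes, is that the system inherits whatever the majority of drivers actually do - legal or not.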

Needless to say, Elon took to this idea very quickly, and by the beginning of this year, the new network had already worked through 10 million clips of driver video from “best-case” scenarios. 

In fact, some of you might remember that livestream of Elon testing out FSD version 12 in his car - that drive was running the new algorithm. The trip was reportedly great, apart from nearly running a red light - the only instance of driver intervention for the whole drive.

And remember, the version Elon was using in this video was not programmed with rules for stop lights, lane changes, or parking spots. It learned those behaviours from driver data.

Obviously, switching away from hard-coding the driving laws into the automated system is making the people at the National Highway Traffic Safety Administration (NHTSA) a little twitchy. FSD has been heavily scrutinised since it launched, and regulators regularly express their concerns about it. They are predictably not thrilled with this new approach.

Regardless, Tesla intends to launch FSD v12 as soon as the administration gets around to approving it - which might happen within the year, since the NHTSA is looking to study how this sort of system actually works in the wild.

And as the hardware for FSD evolves, this sort of training will get better and better. For now, though, it looks like FSD beta testers will have to keep driving their law-abiding systems - while gathering the data used to train the more organic one.
