Tesla Inc. terminated an employee six days after he posted a YouTube video of his car running into a traffic pylon while using Full Self-Driving, or FSD, the carmaker’s controversial driver-assistance system.
John Bernal, who worked on the data-annotation team for Tesla’s Autopilot system, received a separation agreement from the company on Feb. 11, just under a week after he posted a video that now has more than 180,000 views. About 3½ minutes in, Bernal’s Model 3 makes a right turn too sharply and runs into a green pylon separating a road and bike lane in downtown San Jose, California.
Bernal, 26, said in a phone interview that while his manager refused to put the reason for his firing in writing, he was told it was in part due to improper use of FSD. Tesla said in January the beta software was running on almost 60,000 vehicles in the U.S.
The carmaker, which disbanded its public relations department in 2020, didn’t respond to a request for comment. CNBC reported Bernal’s termination earlier on Tuesday.
Tesla’s effort to limit information-sharing by FSD beta users drew scrutiny from the U.S. National Highway Traffic Safety Administration four months before the company fired Bernal. The regulator expressed concern in October about reports that participants in an FSD early-access program had been subject to non-disclosure agreements that discouraged portraying the feature negatively.
Chief Executive Officer Elon Musk joked about dropping the NDAs just before NHTSA sent its letter to Tesla, tweeting that the agreements would be “available in perforated rolls.” The agency has opened two investigations into possible defects involving Autopilot since August.
Bernal, whose YouTube account AI Addict has more than 8,300 subscribers, said another reason his manager gave for the termination was that his video channel was a conflict of interest. In addition to sharing an unsigned copy of his separation agreement, he shared a photo of the screen of his Model 3 showing that FSD beta had been suspended based on his recent driving data.
Tesla initially made FSD beta available to members of an early access program made up of employees and vocal fans of Musk and the company. Vice’s Motherboard reported in September that the program’s agreement instructed members to share their experience with the software on social media “responsibly and selectively.”
“Do remember that there are a lot of people that want Tesla to fail,” the company’s agreement read, according to Motherboard. “Don’t let them mischaracterize your feedback and media posts.”
Bernal’s AI Addict channel stood out in part because he narrated his videos with a mix of positive and negative feedback on how Tesla’s FSD software handled city streets.
About 2 minutes into the roughly 9-minute video in which his Model 3 ran into the traffic pylon, he praises the system for slowing down to let another car go by and for moving from a far-right lane to a left-turn lane in time to make a traffic light. As the Model 3 completes the turn, however, it has trouble finding the correct lane to turn into.
About 2 minutes and 40 seconds in, the car runs a red light, turning right without stopping. A passenger notes that Tesla had just disabled a setting that let FSD beta users roll slowly through intersections without coming to a complete stop when no other cars or pedestrians were present. The carmaker determined a recall was necessary after meeting with NHTSA about the functionality in January.
Bernal’s Model 3 runs into the traffic pylon less than a minute later.
“We hit that,” he said just after the collision, which scuffed up the Tesla’s front bumper. “It’s the first, for me, to have actually hit an object on FSD.”
Later in the video, Bernal takes manual control of the Model 3 twice after it attempts to steer down sets of railroad tracks. Just after he praises the car for patiently waiting for pedestrians to cross a street, around the 7-minute, 40-second mark, the vehicle veers toward two more sets of pylons.
Source: www.autoblog.com