Tesla’s FSD Beta - An Experiment On Public Roads
We're going to make this left turn. And I'm getting honked at. I just. Now it's just turned right.
We're supposed to go left. We just ran a red light. We just ran a red light. I'm going to get a ticket for this. This is Taylor Ogan.
He's the founder and CEO of Snow Bull Capital, a hedge fund that invests in green and high-tech sectors. Ogan is a self-proclaimed Tesla fan and says he's owned the stock since the company went public. Recently, he invited CNBC to ride along with him in his 2020 Tesla Model Y as he tested out Tesla's FSD Beta 10.8.1 software in Brooklyn, New York. Geez! I don't know if you realize what happened, but that's the closest thing I've ever seen. Ogan is not a professional test driver, nor is he a Tesla employee. Instead, he's one of the over 50,000 customers Tesla has allowed to access FSD Beta.
FSD Beta, which stands for "Full Self-driving Beta," can best be summarized as a host of new driver-assistance features that are still being debugged. Chief among them is "autosteer on city streets," which lets the car automatically navigate complex urban environments without the driver turning the steering wheel. Ok, I'm going to take over there.
So it wanted to change lanes in the middle of an intersection there and cut off the guy behind me. The system is anything but "self-driving" and still requires constant attention and intervention from the driver, a caveat that Tesla highlights on its website. With it being a beta, you know, I've seen that at any time the car could just make a mistake, and I have to be ready for that. Right now, my stress levels go up, not down, from using Full Self-driving, but that's the cost of making it better.
Tesla's FSD Beta program is heavily scrutinized by regulators and has earned Tesla side-eye from competitors, who usually have professionally trained drivers, not customers, test driver-assistance features in their vehicles. But for now, the program is still available for thousands of Tesla owners to access, without the knowledge of drivers and pedestrians around them. Tesla is pushing the bounds of what is possible with self-driving, and they've got to be applauded for that. On the other hand, they're taking a huge, huge gamble with this by putting an
incomplete or buggy piece of software on the roads. CNBC went for a ride with three FSD Beta testers in different parts of the country to see how the system performs in the real world and explore what this program could mean for the future of vehicle automation. Before we get into our ride-alongs, it's important to understand what FSD Beta can and can't do.
The way Tesla markets its driver-assistance packages has historically been confusing. Tesla's standard driver-assistance package is marketed as Autopilot. Its functionality includes automatic emergency braking, forward-collision warning, lane keeping and adaptive cruise control, which matches the speed of your car to that of the surrounding traffic. For an extra $12,000, or $199 per month, Tesla sells an option it calls Full Self-driving, or FSD. The premium package includes more sophisticated features like automatic lane changing, automatic parking and "smart summon," which lets drivers call their car from a parking spot to come pick them up.
FSD Beta takes this one step further with unfinished features like "autosteer on city streets" and "traffic light and stop sign control," which identifies stop signs and traffic lights and automatically slows your car to a stop upon approach. Although they may sound advanced, none of these packages come even close to making Tesla's vehicles autonomous. When we're talking about FSD Beta here, we're talking really about a system that is an SAE level 2 system that is designed to assist the driver in a wide variety of different situations. But the driver at all times is responsible for the control of the vehicle. By comparison, a robotaxi would be level 4 or higher. It's unbelievable. It's unbelievable that people think that this is somehow going to
be a robotaxi. Listening to Tesla CEO Elon Musk, who's been promising driverless cars since 2016, it's easy to see why people may believe that. My personal guess is that we'll achieve full self-driving this year at a safety level significantly greater than a person.
Over time, we think Full Self-driving will become the most important source of profitability for Tesla. If you run the numbers on robotaxis, it's nutty good from a financial standpoint. In order to gain access to FSD Beta, participants first have to allow Tesla to monitor their driving for a period of time, until they earn a high enough safety score.
The score is calculated based on five metrics specified by Tesla. Those who qualify are allowed to download FSD Beta via an over-the-air update. While some testers say that earning a high score was hard and required driving even more cautiously than they typically would, Ogan says he was able to easily game the system. The way I got a 100 on the safety score was I drove like a grandma for 103 miles, and then I didn't drive it for a week. And then I had a 100 safety score, and then I was granted access to the beta software.
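The gating mechanism Ogan describes (score the five metrics, grant access above a threshold) can be sketched in a few lines. The five factor names below match Tesla's published Safety Score factors, but the weights, scaling, and cutoffs here are invented for illustration; Tesla's actual formula differs.

```python
# Illustrative sketch of Safety Score-style gating for FSD Beta access.
# Factor names follow Tesla's published Safety Score; all weights and
# thresholds below are hypothetical, chosen only to show the mechanism.
from dataclasses import dataclass

@dataclass
class DrivingMetrics:
    fcw_per_1k_miles: float       # forward collision warnings per 1,000 miles
    hard_braking_pct: float       # % of braking events above a g-force cutoff
    aggressive_turning_pct: float # % of turns above a lateral-g cutoff
    unsafe_following_pct: float   # % of driving time following too closely
    forced_disengagements: int    # Autopilot disengagements for ignored warnings

def safety_score(m: DrivingMetrics) -> float:
    """Map the five metrics to a 0-100 score (higher = safer). Hypothetical weights."""
    penalty = (1.5 * m.fcw_per_1k_miles
               + 0.8 * m.hard_braking_pct
               + 0.5 * m.aggressive_turning_pct
               + 0.6 * m.unsafe_following_pct
               + 10.0 * m.forced_disengagements)
    return max(0.0, 100.0 - penalty)

def eligible_for_beta(score: float, miles_logged: float,
                      min_score: float = 98.0, min_miles: float = 100.0) -> bool:
    # A short stretch of very cautious driving dominates the score --
    # exactly the "drive like a grandma for ~100 miles" strategy.
    return score >= min_score and miles_logged >= min_miles

cautious = DrivingMetrics(0.0, 0.0, 0.0, 0.0, 0)
print(eligible_for_beta(safety_score(cautious), miles_logged=103))  # → True
```

With zero recorded events the penalty is zero, so the score stays at 100 and a little over 100 careful miles unlocks access; the sketch makes clear why a brief cautious streak can game such a gate.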
Newer Tesla vehicles have in-cabin cameras, which are supposed to monitor drivers and alert them if they're not paying attention. If a driver gets enough warnings, the car can disable driver-assistance features temporarily. After several warnings, a driver may get kicked out of the FSD Beta program entirely. But not all Teslas have camera-based driver monitoring, and those with cameras are far from perfect. Kelly Funkhouser has been testing Tesla's Autopilot and FSD Beta on a closed route for Consumer Reports, and says that a lot of the time, Tesla's driver monitoring systems simply didn't work. In our testing, we weren't able to really get that camera to be sufficient for monitoring attention, because we could cover it and there would be no difference, as if the camera wasn't even there.
And as long as you have your hands on the wheel, even if you're looking at your phone or totally distracted or asleep, you'd never get any of those warnings to pay attention. For Full Self-driving Beta, Tesla claims that they have these additional messages that say things like 'the camera's blocked' or 'please pay attention,' but we've never experienced any of those warnings. We decided to run our own tests with the in-cabin camera using Ogan's Model Y. The first time, we covered the camera before attempting to engage FSD Beta and were unable to do so. When we covered the camera after FSD Beta
was already engaged, we were able to drive for about a block before the system turned off and gave us a strike. During our drive in Brooklyn, FSD Beta struggled with pedestrians. Here's a pedestrian. We have the right of way. We have a green, and look at that! It just slammed on... She wasn't crossing. She was waiting to cross. Traffic lights. Ok, I don't know why we're stopping here. And now we just missed the light. Sorry. Yeah, this woman's laughing at me.
You can't help but laugh. And complex intersections. It doesn't want to go. There's a bus behind us, or it's just not going anywhere.
And now it's creeping. Now it's trying to go right. We're in the middle of an eight-way intersection, just stuck in the middle. So of course, I have to take over.
At one point during the drive, Ogan restarted the system and was able to engage FSD even with the screen still black. Yeah. The screen is black and the car is driving itself. I have no idea what it can see. I don't know if it's engaged or not.
You can see the wheel turning. I don't know which way it wants to turn. It just slammed on the brakes.
There's a school bus behind me. I just got honked at. Look at that. You see that, Pete Buttigieg? FSD Beta did not fare much better in San Francisco, where we also tested version 10.8.1. Complex intersections...
Hold on one sec. Yeah, OK, so that's me getting honked at again. It got a little bit confused by that intersection. Pedestrians. Oh, OK, so it's worried about those pedestrians over... That was a pretty hard stop.
And traffic lights all seemed to pose problems. Ok, so now it's confused. It's stopping in the middle.
Ok, so. Yeah, so that guy behind me is, like, justifiably upset, because I stopped in the middle of an intersection when the light was green. When I'm using the FSD, I'd say I'm less worried about hitting a pedestrian than I am about being the victim of, like, a road rage incident, because I'm, you know, just not driving in a courteous way. Roundabouts were also tricky. Ok, here's our roundabout. Let's see what happens here.
Ok. Whoa. All right. I don't think it understood how to use the roundabout, because we're basically going the wrong way now. In fact, yeah, it's kind of diverting itself a block so that it can avoid that roundabout.
So I guess it doesn't understand that yet. Though a second roundabout proved more successful. Ok. This time it got it and found its way through. The car also seemed to struggle with bollards, at one point almost plowing into one while taking a right turn. Whoa! Ok. In nearby San Jose, popular YouTuber AI Addict posted a video that showed his car actually running over a bollard while on version 10.10.
Oh, oh, oh. We hit that. We actually hit it. Wow. Where the system seems to perform better is in a more suburban setting.
We have a four-way stop. It's always done well with those. I've never had an issue where it didn't handle the four-way stops. Nice and clean.
Intersections were also better. Here it's trying to yield, but these cars are turning, so it'll eventually make up its mind to go. OK. Kevin Smith is a software engineer for a print-on-demand book manufacturer. Smith says he's been testing FSD Beta for around 5,000 miles, and that most of the time, updates mean solving one issue only to encounter another.
So basically, before the 10.5 update and including 10.5, most of the issues that we were having were related to, like, turn lanes and odd street markings and things like that. With 10.8, all those little turn lane issues, those all kind of went away. So they fixed those little things. But I started to have more and more disengagements for cars it's not yielding to, or something like that. Something else that Smith noticed FSD Beta struggle with is driving in low visibility, such as during rain or heavy snow.
It pretty much is nonfunctional. It just tells you that it can't see and you need to take over. The other issue I had was on our kind of rural roads. Whereas our main roads, they'll plow them, the rural roads, they just clear the middle of the road. This car, because it could see the center yellow line, it wanted to stay in its lane. And that included keeping two tires in that big pile of ice and snow that you wouldn't drive on as a human. A major concern that experts point to when it comes to FSD Beta is a lack of transparency and oversight in pretty much all aspects of the program, starting with who gets to participate.
It's Tesla deciding who is good and who is bad here. There is no third party. It can't come down to the company alone to decide who is safe and who is not.
Bryan Reimer agrees that there needs to be more transparency. He heads the Advanced Vehicle Technology Consortium at MIT, which is made up of a number of insurance providers, research organizations and automakers. Show me the data and prove it to me. Prove how you're using this data, day in and day out, and that this is an efficient system enhancing the risk-benefit model on our roads. Much like the FDA requires drug manufacturers to prove the benefits of a pharmaceutical intervention inherently outweigh the risks.
That is what I am looking for. Destiny Thomas says we need to think beyond just the safety of drivers. I think the problem with the auto industry is that we center everything around the person behind the wheel. What we lose is the human element and the impact to human beings who are not behind the wheel. Thomas is the founder and CEO of Thrivance Group, a for-profit urban planning organization with a specific focus on marginalized communities.
We need to make sure that there's representation within private industry to make sure that the folks who are developing this technology understand the very important nuances of what it means to travel through space in a body that's not normative. How is this car going to recognize someone using an assistive device that maybe isn't a wheelchair and doesn't look like one? How is this technology going to be able to recognize someone who has purple undertones in their skin that don't react to the sensors that are in the car? Currently, children under the age of 16 are more likely to be killed by cars while walking down the street or riding their bikes. People with disabilities are more likely to be killed because of the speed at which cars are traveling.
Government agencies are increasing scrutiny of FSD Beta. In a letter to Tesla in October 2021, the Department of Transportation's National Highway Traffic Safety Administration reprimanded Tesla for its use of non-disclosure agreements with early participants of FSD Beta, calling the practice "unacceptable" and saying that it "adversely impacts NHTSA's ability to obtain information relevant to safety." In a second letter to Tesla released that same day, the agency
requested more details on how Tesla decides who gets beta access, and specific information on the people involved. The NDA asked participants to keep their experience in the program confidential and to refrain from sharing any information regarding the program on social media sites or forums. Additionally, participants were asked not to use their vehicles for services like Lyft, Uber, Turo or Scoop while enrolled in the Early Access Program. Tesla eventually dropped the NDA requirement for beta participants, but Ogan thinks many are still nervous to talk about the system's issues for fear that they'll get the software taken away. I think the most dangerous part about this are the people who are posting videos of flawless drives with zero interventions, zero disengagements, and that is incredibly misleading.
I think that's really detrimental to the Tesla shareholders and to the other enthusiasts of these technologies, because they are led to believe that this system is a lot more capable than it is, by a significant margin. Tesla has been criticized for using terminology that exaggerates the capability of its vehicles. It's clear that if you're marketing something as full self-driving and it is not full self-driving, and people are misusing the vehicles and the technology, then you have a design flaw, and you have to prevent that misuse. FSD Beta is already being blamed for some accidents.
In November, NHTSA announced that it was looking into a driver's complaint in which a 2021 Tesla Model Y was hit by another car after the Tesla went into the wrong lane while making a left turn. According to the complaint, the car was using FSD Beta at the time. Another consumer complaint to the agency, released in February, states that a 2021 Tesla Model S misjudged the road and took too wide of a turn, which resulted in the car going slightly off the road. The complaint says FSD Beta was engaged at the time.
Tesla recently issued a recall of its FSD Beta software for more than 53,000 vehicles to remove a feature that allowed Teslas to automatically roll past stop signs without first coming to a stop. The recall won't require a visit to the shop; Tesla can fix the issue with a free over-the-air software update. During our drive in San Francisco, we experienced our own stop sign incident. Ok, that's a stop sign that we just almost ran through. That's not good.
Public confusion over the availability of autonomous vehicle technology is also a problem. One survey of 4,000 vehicle owners found that 19 percent of customers thought fully autonomous vehicles are already available for sale. If a respondent said that they believe it is in the marketplace right now, we asked them to give us an example, and we did this in an open-ended fashion. The most commonly mentioned word in those verbatim comments is "Tesla." The fact that Tesla is very top of mind really demonstrates that the brand is shaping consumer expectations, even though they're not currently producing a fully automated self-driving vehicle.
So I think we really have to be careful in how we're messaging things as an industry. Most of the experts we spoke to agree that there needs to be more oversight of FSD Beta, as well as all autonomous vehicle testing in general. There are no federal laws regulating the operation or testing of autonomous vehicles on public roads. Instead, states have taken a piecemeal approach to regulation through a mix of local legislation and executive orders with varying degrees of strictness. The best thing that government can do when it comes to regulations is to set guardrails for companies and the industry that put hard stops in place beyond which the companies and the technology shouldn't go. But they also give them a lot of latitude and agility to develop the technology within those guardrails. And at that point, we see the social benefits and social rewards of it while
avoiding some of the more egregious risks. But some argue that regulation doesn't have to strictly come from the government. I am of the belief that proper checks and balances don't have to always happen by way of local government, that there are community interest groups that would be happy to support Tesla and anyone else looking to innovate in this way to make sure that the folks whose interests they represent truly are safe once these vehicles make it onto the roadway. Meanwhile, even the testers that we spoke to are split on whether testing using consumers is safe. I don't think it's right that customers are able to just test this.
By using the FSD Beta on public streets, I don't feel I'm increasing the assumed risk that people are putting themselves in by also being on those public streets. We share those streets with people who are, you know, using a car for the first time with their learner's permit. I believe if someone is not paying attention and they're using their cell phone and they're doing these things, then there is an increased risk. But that same individual likely is doing those same things when they're not on FSD Beta. If I thought there was a risk of injuring someone, I wouldn't be doing it. I understand why Tesla might want to get, you know, novice testers out there experiencing it, but I definitely think that that's a huge risk that consumers take when they are doing this on public roads.
2022-02-20 12:09