CNN Tried Tesla's FSD Beta System, And Here's How It Went

As you may have expected as soon as you read the title, CNN’s time with Tesla’s Full Self-Driving Beta system was … interesting. While the car’s semi-autonomous technology handles some situations quite well and is impressive in many ways, it’s clearly not ready for a full public launch.

Throughout the short journey, the Tesla Model 3 with FSD Beta tries to drive on the wrong side of the road, it almost crashes into a construction site, and it tries to turn directly into a stopped truck. Note the words “almost” and “tries.” Fortunately, there are no accidents on the trip, but only because the driver is doing his job: remaining alert, engaged, and ready to take control at a moment’s notice.

Tesla warns that its Autopilot and FSD Beta systems require constant supervision, and it seems they also need regular human intervention. The likely reason we haven’t heard about any FSD Beta accidents is that beta testers are staying engaged and taking over when necessary. However, it’s hard to know what would happen if they didn’t. Would the car figure it out and proceed safely, or would the results be catastrophic?

CNN also pointed out that other drivers don’t seem happy about the FSD Beta technology. Much like a student driver, it hesitates often, it doesn’t always seem sure of what’s happening next, and it’s hard for other drivers to predict what the system might do at any given moment. CNN writes:

“The Model 3’s ‘full self-driving’ needed plenty of human interventions to protect us and everyone else on the road. Sometimes that meant tapping the brake to turn off the software, so that it wouldn’t try to drive around a car in front of us. Other times we quickly jerked the wheel to avoid a crash.”

The publication did note that the car stopped and waited for children to cross the road, and that it noticed an approaching bicycle the driver didn’t immediately see and waited for it as it should have. Its wide range of capabilities can be impressive at times, and the on-screen renderings of what the car “sees” are pretty incredible. However, it’s not yet consistent, and it’s certainly not full self-driving technology. It may be able to “fully” drive the car in some cases, but in others, it seems outright confused.

CNN writes that Tesla owners admit the technology, while impressive, has its issues. The car can be driving itself spectacularly one minute, and the next minute you have to take over or things might not end well.

The video is only four minutes long, and it’s clearly an edited compilation of “highlights,” but it’s still eye-opening on many levels. Sure, the driver wasn’t familiar with the tech, and we have no idea what his Tesla Safety Score would be, but that shouldn’t really matter. He drove the car with the tech engaged, and he did his job based on Tesla’s warnings and expectations. What do you think?

Sources: CNN, CNN (YouTube)
