
What happens when our Tesla Model Y's cameras can't see? Nothing good.


Tesla — its CEO in particular — is obsessed with cameras. According to The New York Times, Elon Musk has told his employees that if people can drive with just eyes, cars should, too. Unfortunately, it just doesn’t work that way, and as with eyeballs, when the cameras can’t see, the tech that relies on them degrades or stops working altogether. Losing, even temporarily, the most valuable features of our 2023 Tesla Model Y Long Range is understandably frustrating and makes us question their worth and the decisions that got us here.

All the hardware you need?

At one point, Tesla advertised all new models as being equipped with all the hardware they’d ever need to achieve real autonomous driving, or “Full Self-Driving” in Tesla’s parlance. Actually getting there, the company claimed, was simply a matter of sufficiently training the software. The company has since backed away from this messaging.

Around the same time, Tesla was busy deactivating and even removing hardware from its vehicles. Forward-facing radar was deactivated in existing cars, and installation stopped on new ones. Later, ultrasonic parking sensors were removed. In both cases, Tesla and Musk argued those sensors were unnecessary thanks to software advancements, and that cameras were all that was needed.

Every other company we’ve spoken to in the autonomous vehicle field disagrees. Outside of Tesla, it’s generally accepted that true autonomous driving requires multiple types of sensors, each with its own strengths and weaknesses, providing overlapping and redundant information. A single sensor type, the thinking goes, can too easily break, get obscured, or miss crucial information, leaving the whole system temporarily blinded.

Nature proves the point

The limitations of Tesla’s approach and the wisdom of its autonomous driving competitors have been made apparent to us on multiple occasions — most often, first thing in the morning.

Where we live, overnight and morning fog, sometimes devolving into a heavy mist, is common from late fall to early spring. It tends to condense on car windows, which can be cleared with the defrosters or wipers, or by rolling them down. It also tends to collect on camera lenses and their protective covers. When this happens, the only way to clear the cameras on our Model Y is to walk around the car and wipe them off by hand. Forget to do this before you get in and attempt to drive, and you’ll quickly discover that Park Assist, Tesla’s name for its camera-based parking sensors, is “degraded” and not to be trusted because the cameras are obscured.

This is a somewhat minor annoyance, but it’s frustrating to know that ultrasonic parking sensors suffer no such limitation (though they have their own issues) and that they used to be installed on these cars.

It’s not just manual parking, either. Tesla’s self-parking and “Summon” technologies also rely on the cameras, so if you’re hoping to have the car pull itself out of its parking space or even drive to you, you’re going to be disappointed.


It’s also indicative of the larger problem at hand: Tesla hasn’t engineered any way for the car to clean its cameras when they’re misty or dirty, so the job falls entirely on the driver. Instead, the company has tried to place its cameras where they’re unlikely to get obscured, but once they are, there’s nothing the car can do but figuratively throw its hands up. Aside from certain off-roaders, most other companies don’t routinely install camera washers on their cars, either, but those companies also don’t rely exclusively on cameras for their cars to operate as intended.

Obscured cameras are one thing when backing out of the driveway; they’re another in foul weather. Be it heavy fog or rain, the cameras’ inability to see quickly shuts down the “Full Self-Driving (Supervised)” system, and with it Autosteer lane centering and Autopilot adaptive cruise control, in the name of safety. All it takes to completely neutralize the car’s flagship $15,000 (at the time of our purchase) tech feature is a little rain. We’re happy the vehicle is erring on the side of caution, but it’s frustrating to know the steps that could’ve been taken to address these issues have been ruled out.

Here again, we must note that every vehicle with autonomous aspirations struggles with weather to some degree. Teslas, however, are uniquely susceptible because of their total reliance on cameras. Competitors address the issue with other sensors, like radar, which isn’t affected by fog. With recent advancements, lidar, which Musk has continually talked down, has also been shown to see better in fog than cameras do. Though icing is still possible with radar and lidar, vehicles using those systems already incorporate heating elements to melt snow and ice and keep the sensors unobstructed.

Why won’t Tesla use more sensors?

Ignore Musk’s eyes-to-cameras analogy; scientists who study the human eye and visual cortex say cameras and computers don’t work the same way at all. From the way they collect light to the way they process it, there’s little comparison between the two beyond the obvious.

The real reason, per Musk, is cost. He’s long rejected lidar as too expensive. It’s not just lidar, though. Removing radar (a single module of which can cost three times as much as a camera) and ultrasonic sensors from the parts list saves Tesla money on materials, R&D, and assembly. Not having to design, manufacture, and program those parts, teach the computer to fuse and interpret data from multiple sensor types, or absorb the added time and complexity on the assembly line potentially saves the company a significant amount of money. If cameras could actually do the same things as those sensors, it would be a brilliant business move. Tesla’s real-world experiment, though, hasn’t turned out that way.

Tesla agrees

Recent reports show Tesla’s attitude may be changing. Numerous court cases and exposés have shown that Tesla’s own engineers disagree with the claims made in the company’s marketing. When pressed, they’ve had to walk back company claims about the capabilities of their cameras and software, and have even admitted to faking promotional videos that oversold the system’s functionality. In the most recent court case, Tesla’s lawyers even argued before a judge that lidar is necessary to achieve autonomous driving, and that the current cars can’t actually drive themselves and won’t be able to without it.

Filings with government regulators, meanwhile, show Tesla has applied for licenses to put radar back in its cars. (Radar uses radio frequencies and is therefore regulated by the Federal Communications Commission.) The company has also recently purchased several hundred lidar units, likely for R&D purposes, which may indicate it has changed its position on that technology as well.


Where does that leave us?

In all likelihood, we’re stuck with what we have. Tesla will no doubt continue to work on its software and send more OTA (over-the-air) updates intended to improve our car's functionality. Although the company does give owners of certain older models the option of upgrading their car’s computer for a price, it does not retrofit hardware like radar units or cameras. That’s just the risk of buying a Tesla, though — everything from the price to the hardware to the features might change the next day.

For more on our long-term 2023 Tesla Model Y Long Range:

  • We Bought A 2023 Tesla Model Y Long Range For A Yearlong Test
  • The Supercharger Difference
  • How Far Can You Tow With A Tesla?
  • What Changed After The Tesla Autosteer Recall? Not Much.
  • Are The Tesla Model Y’s Third-Row Seats Worth It?

Photos by MotorTrend