Tesla FSD Handles Dense Fog With Ease — Watch the Footage

30-Second Brief

The News: A video shared by Whole Mars Catalog shows Tesla's Full Self-Driving system navigating confidently through dense fog, with the car holding its lane, managing its speed, and tracking obstacles without driver intervention.

Why It Matters: Fog is one of the most challenging conditions for any driver — human or AI. This footage adds to a growing body of real-world evidence that FSD's camera-based perception is more capable in low-visibility weather than many critics assume.

Source: @wholemars on X


By BASENOR Editorial • March 26, 2026

Dense fog is the kind of driving condition that makes most people grip the wheel a little tighter. Visibility drops, lane markings disappear, and reaction time becomes everything. It's also the kind of scenario that critics of Tesla's camera-only Full Self-Driving approach have long pointed to as a fundamental weakness — the argument being that without radar or lidar, FSD would struggle when its visual inputs are compromised.

This morning, Whole Mars Catalog posted a video that makes a strong counter-argument.

[Image: Whole Mars Catalog tweet showing Tesla FSD navigating dense fog — Source: @wholemars, March 26, 2026]

▶ Watch Video on X

What the Video Actually Shows

The footage captures a Tesla running FSD through conditions where visibility is severely reduced — the kind of fog where headlights reflect back at you and the road ahead vanishes into grey. Despite this, the vehicle holds its lane, adjusts its speed appropriately, and continues navigating without prompting the driver to take over.

This is notable for a few reasons. FSD relies entirely on cameras — eight of them — to build its understanding of the world around the car. In fog, those cameras receive degraded visual input, the same way human eyes do. What the system has to lean on instead is its trained understanding of road geometry, lane markings where they remain partially visible, and the positions of other vehicles.

The fact that it handles this smoothly — without a disengagement — is exactly the kind of real-world validation that matters more than any controlled test environment.

Why the Camera-Only Debate Matters Here

Tesla's decision to build FSD around cameras — and to remove ultrasonic sensors from newer vehicles — has been one of the more polarizing technical choices in the autonomous driving space. The conventional wisdom in the industry has long favored sensor fusion: combining cameras with radar and lidar to create redundancy, especially in conditions where cameras underperform.

Elon Musk has consistently argued the opposite: that the human visual system navigates the world using eyes alone, and that a sufficiently trained neural network operating on camera data can do the same. The argument is that radar and lidar are crutches that add cost and complexity without proportional safety gains — and that the real solution is better AI, not more sensors.

Videos like this one don't settle the debate definitively, but they do chip away at the assumption that camera-only systems are inherently fragile in adverse weather. For our FSD coverage, this represents a meaningful data point.

🔭 The BASENOR Take

Timeline: This is a real-world demonstration captured in current production FSD — not a future promise or a staged demo.

Impact Level: Medium-High — reinforces the case for FSD's real-world reliability in conditions that were previously considered edge cases.

Confidence: High — the video is primary footage from a trusted, well-known Tesla community account with a long track record of accurate reporting.

One video doesn't define a system's capabilities across millions of miles and infinite weather permutations. But it does reflect something that long-time FSD users have been reporting anecdotally for months: the system has gotten meaningfully better at handling conditions that used to reliably trigger disengagements.

The pattern matters. Each time a credible piece of footage surfaces showing FSD operating confidently in rain, snow, construction zones, or fog, it adds another data point to a picture that's becoming harder to dismiss. Tesla's neural network training pipeline — fed by data from millions of vehicles on real roads — appears to be doing exactly what it was designed to do: improve continuously through exposure to edge cases.

For owners who have been hesitant to engage FSD in anything other than ideal conditions, this is worth paying attention to. The system is not limited to clear-sky, dry-road scenarios. Whether you're comfortable relying on it in fog is still a personal judgment call — and staying alert remains non-negotiable — but the underlying capability is clearly there.

The broader industry implication is also worth watching. If Tesla can demonstrate consistent, reliable FSD performance across adverse weather conditions using cameras alone, it significantly strengthens the case for their approach at a time when competitors are still investing heavily in lidar-based stacks. That has downstream consequences for cost, scalability, and ultimately who wins the autonomous driving race.
