eyes on the road

US to probe Texas-based Tesla's self-driving system after pedestrian killed in low-visibility conditions

The U.S. government's road safety agency is investigating Tesla's “Full Self-Driving” system after getting reports of crashes in low-visibility conditions, including one that killed a pedestrian.

The National Highway Traffic Safety Administration said in documents that it opened the probe last week after the company reported four crashes when Teslas encountered sun glare, fog and airborne dust.

In addition to the pedestrian's death, another crash involved an injury, the agency said.

Investigators will look into the ability of “Full Self-Driving” to “detect and respond appropriately to reduced roadway visibility conditions, and if so, the contributing circumstances for these crashes.”

The investigation covers roughly 2.4 million Teslas from the 2016 through 2024 model years.

A message was left Friday seeking comment from Tesla, which has repeatedly said the system cannot drive itself and human drivers must be ready to intervene at all times.

Last week Tesla held an event at a Hollywood studio to unveil a fully autonomous robotaxi without a steering wheel or pedals. CEO Elon Musk, who has promised autonomous vehicles before, said the company plans to have autonomous Models Y and 3 running without human drivers next year. Robotaxis without steering wheels would be available in 2026 starting in California and Texas, he said.

The investigation's impact on Tesla's self-driving ambitions isn't clear. NHTSA would have to approve any robotaxi without pedals or a steering wheel, and it's unlikely that would happen while the investigation is in progress. But if the company tries to deploy autonomous vehicles in its existing models, that likely would fall to state regulators. There are no federal regulations specifically focused on autonomous vehicles, although they must meet broader safety rules.

NHTSA also said it would look into whether any other similar crashes involving “Full Self-Driving” have happened in low-visibility conditions, and it will seek information from the company on whether any updates affected the system’s performance in those conditions.

“In particular, this review will assess the timing, purpose and capabilities of any such updates, as well as Tesla’s assessment of their safety impact,” the documents said.

Tesla reported the four crashes to NHTSA under an order from the agency covering all automakers. An agency database says the pedestrian was killed in Rimrock, Arizona, in November of 2023 after being hit by a 2021 Tesla Model Y. Rimrock is about 100 miles (161 kilometers) north of Phoenix.

The Arizona Department of Public Safety said in a statement that the crash happened just after 5 p.m. Nov. 27 on Interstate 17. Two vehicles collided on the freeway, blocking the left lane. A Toyota 4Runner stopped, and two people got out to help with traffic control. A red Tesla Model Y then hit the 4Runner and one of the people who exited from it. A 71-year-old woman from Mesa, Arizona, was pronounced dead at the scene.

The collision happened because the sun was in the Tesla driver's eyes, so the Tesla driver was not charged, said Raul Garcia, public information officer for the department. Sun glare also was a contributing factor in the first collision, he added.

Tesla has twice recalled “Full Self-Driving” under pressure from NHTSA, which in July sought information from law enforcement and the company after a Tesla using the system struck and killed a motorcyclist near Seattle.

The recalls were issued because the system was programmed to run stop signs at slow speeds and because it disobeyed other traffic laws. Both problems were to be fixed with online software updates.

Critics have said that Tesla’s system, which uses only cameras to spot hazards, doesn’t have proper sensors to be fully self-driving. Nearly all other companies working on autonomous vehicles use radar and laser sensors in addition to cameras to see better in the dark or in poor visibility conditions.

Musk has said that humans drive with only eyesight, so cars should be able to drive with just cameras. He has called lidar (light detection and ranging), which uses lasers to detect objects, a “fool's errand.”

The “Full Self-Driving” recalls arrived after a three-year investigation into crashes in which Tesla's less-sophisticated Autopilot system struck emergency and other vehicles parked on highways, many with warning lights flashing.

That investigation was closed last April after the agency pressured Tesla into recalling its vehicles to bolster a weak system that was supposed to make sure drivers were paying attention. A few weeks after the recall, NHTSA began investigating whether the recall was working.

NHTSA began its Autopilot crash investigation in 2021, after receiving 11 reports of Teslas using Autopilot striking parked emergency vehicles. In documents explaining why the investigation was ended, NHTSA said it ultimately found 467 crashes involving Autopilot, resulting in 54 injuries and 14 deaths. Autopilot is a fancy version of cruise control, while “Full Self-Driving” has been billed by Musk as capable of driving without human intervention.

The investigation that was opened Thursday enters new territory for NHTSA, which previously had viewed Tesla's systems as assisting drivers rather than driving themselves. With the new probe, the agency is focusing on the capabilities of “Full Self-Driving" rather than simply making sure drivers are paying attention.

Michael Brooks, executive director of the nonprofit Center for Auto Safety, said the previous investigation of Autopilot didn't look at why the Teslas weren't seeing and stopping for emergency vehicles.

“Before they were kind of putting the onus on the driver rather than the car,” he said. “Here they're saying these systems are not capable of appropriately detecting safety hazards whether the drivers are paying attention or not.”

A View From HETI

Corrosion under insulation (CUI) accounts for roughly 60% of pipeline leaks in the U.S. oil and gas sector. Yet many operators still rely on outdated inspection methods that are slow, risky, and economically unsustainable.

This year, widespread budget cuts and layoffs across the sector are forcing refineries to do more with less. Efficiency is no longer a goal; it’s a mandate. The challenge: maintaining safety and reliability without overextending resources.

Fortunately, a new generation of technologies is gaining traction in the oil and gas industry, offering operators faster, safer, and more cost-effective ways to identify and mitigate CUI.

Hidden cost of corrosion

Corrosion is a pervasive threat, with CUI posing the greatest risk to refinery operations. Insulation conceals damage until it becomes severe, making detection difficult and ultimately leading to failure. NACE International estimates the annual cost of corrosion in the U.S. at $276 billion.

Compounding the issue is aging infrastructure: roughly half of the nation’s 2.6 million miles of pipeline are over 50 years old, which increases both the urgency and the cost of inspections.

So, the question is: Are we at a breaking point or an inflection point? The answer depends largely on how quickly the industry can move beyond inspection methods that no longer match today's operational or economic realities.

Legacy methods such as insulation stripping, scaffolding, and manual nondestructive testing (NDT) are slow and hazardous, and they offer incomplete coverage. With maintenance budgets tightening, these methods are no longer viable.

Why traditional inspection falls short

Without question, what worked 50 years ago no longer works today. Traditional inspection methods are slow, siloed, and dangerously incomplete.

Insulation removal:

  • Disruptive and expensive.
  • Labor-intensive and time-consuming, with a high risk of process upsets and insulation damage.
  • Limited coverage. Often targets a small percentage of piping, leaving large areas unchecked.
  • Health risks. Exposes workers to hazardous materials such as asbestos or fiberglass.

Rope access and scaffolding:

  • Safety hazards. Falls from height remain a leading cause of injury.
  • Restricted time and access. Weather, fatigue, and complex layouts limit coverage and effectiveness.
  • High coordination costs. Managing multiple contractors, complex schedules, and oversight, along with continuous monitoring, documentation, and compliance assurance across vendors and protocols, drives up costs.

Spot checks:

  • Low detection probability. Random sampling often fails to detect localized corrosion.
  • Data gaps. Paper records and inconsistent methods hinder lifecycle asset planning.
  • Reactive, not proactive. Problems are often discovered only after damage has occurred.

A smarter way forward

While traditional NDT methods for CUI such as Pulsed Eddy Current (PEC) and Real-Time Radiography (RTR) remain valuable, robotic systems, sensors, and AI are transforming how inspections are conducted, reducing reliance on manual labor and enabling broader, data-rich asset visibility for better planning and decision-making.

ARIX Technologies, for example, has introduced pipe-climbing robotic systems capable of full-coverage inspections of insulated pipes without insulation removal. Its robot, Venus, delivers 360° CUI data across both vertical and horizontal pipe circuits without magnets or scaffolding, capturing high-resolution visuals and PEC data simultaneously so operators can review inspection video and analyze corrosion insights in one integrated workflow. This streamlines data collection, speeds up analysis, and keeps personnel out of hazardous zones, making inspections faster, safer, and far more actionable.
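How does PEC turn a magnetic pulse into a corrosion insight? In the simplified plate model, the pulse's eddy currents decay more slowly in thicker steel, so the late-time decay constant of the received signal tracks remaining wall thickness. The sketch below illustrates that relationship only; the signal model, calibration values, and function names are assumptions for demonstration, not ARIX's actual processing pipeline.

```python
import numpy as np

# Minimal, hypothetical sketch of the PEC principle: the late-time decay
# constant tau of the received signal scales with wall thickness squared
# (tau ~ mu * sigma * d^2 in a simple plate model), so thickness can be
# estimated relative to a calibration point of known thickness.

def decay_constant(t, signal, tail_fraction=0.5):
    """Fit an exponential to the late-time tail and return its time constant."""
    n = len(t)
    tail = slice(int(n * (1 - tail_fraction)), n)
    # Log-linear fit: log(V) = log(V0) - t / tau, so slope = -1 / tau.
    slope, _ = np.polyfit(t[tail], np.log(signal[tail]), 1)
    return -1.0 / slope

def estimate_thickness(tau, tau_ref, d_ref_mm):
    """Relative thickness from the d ~ sqrt(tau) scaling, given a reference."""
    return d_ref_mm * np.sqrt(tau / tau_ref)

# --- Demo on synthetic decay curves (values are illustrative only) ---
t = np.linspace(0.001, 0.05, 500)   # seconds after the pulse
tau_nominal = 0.010                 # reference: sound wall, 10 mm
tau_corroded = 0.004                # faster decay suggests a thinner wall

ref_signal = np.exp(-t / tau_nominal)
meas_signal = np.exp(-t / tau_corroded)

tau_ref = decay_constant(t, ref_signal)
tau_meas = decay_constant(t, meas_signal)

d = estimate_thickness(tau_meas, tau_ref, d_ref_mm=10.0)
print(f"Estimated remaining wall: {d:.1f} mm")  # ~6.3 mm (sqrt(0.4) * 10)
```

Real measurements are noisier and averaged over the probe's footprint, which is part of why repeatable, grid-based coverage matters.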

These integrated technology platforms are driving measurable gains:

  • Autonomous grid scanning: Delivers structured, repeatable coverage across pipe surfaces for greater inspection consistency.
  • Integrated inspection portal: Combines PEC, RTR, and video into a unified 3D visualization, streamlining analysis across inspection teams (a sketch of such a unified record follows this list).
  • Actionable insights: Enables more confident planning and risk forecasting through digital, shareable data rather than siloed, static records.
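To make the idea of a unified record concrete, here is a minimal, hypothetical sketch of a data structure that ties PEC thickness readings and video frames to grid locations on a pipe circuit and flags cells with excessive wall loss. The class names, fields, and threshold are illustrative assumptions, not ARIX's portal schema.

```python
from dataclasses import dataclass, field

# Hypothetical unified inspection record: one grid cell on a pipe circuit
# links the PEC thickness estimate to the matching video frame, so review
# and corrosion analysis can happen in a single workflow.

@dataclass
class GridReading:
    axial_pos_m: float           # distance along the pipe
    circumferential_deg: float   # position around the pipe (0-360)
    wall_thickness_mm: float     # PEC-estimated remaining wall
    video_frame: int             # index into the inspection video

@dataclass
class PipeCircuitInspection:
    circuit_id: str
    nominal_thickness_mm: float
    readings: list[GridReading] = field(default_factory=list)

    def flag_corrosion(self, loss_threshold=0.20):
        """Return readings whose wall loss exceeds the threshold fraction."""
        return [
            r for r in self.readings
            if 1 - r.wall_thickness_mm / self.nominal_thickness_mm > loss_threshold
        ]

# --- Demo with illustrative values ---
inspection = PipeCircuitInspection("UNIT-4-P-117", nominal_thickness_mm=9.5)
inspection.readings += [
    GridReading(2.0, 90.0, 9.3, video_frame=1200),
    GridReading(2.5, 180.0, 6.8, video_frame=1450),  # ~28% wall loss
]
for r in inspection.flag_corrosion():
    print(f"Flag at {r.axial_pos_m} m / {r.circumferential_deg} deg: "
          f"{r.wall_thickness_mm} mm (frame {r.video_frame})")
```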

Real-world results

Petromax Refining adopted ARIX’s robotic inspection systems to modernize its CUI inspections, and its results were substantial and measurable:

  • Inspection time dropped from nine months to 39 days.
  • Costs were cut by 63% compared to traditional methods.
  • Scaffolding use was cut by 99%, reducing safety risks and labor demands.
  • Data accuracy improved, supporting smarter maintenance planning.

Why the time is now

Energy operators face mounting pressure from all sides: aging infrastructure, constrained budgets, rising safety risks, and growing ESG expectations.

In the U.S., downstream operators are increasingly piloting drone and crawler solutions to automate inspection rounds in refineries, tank farms, and pipelines. Over 92% of oil and gas companies report that they are investing, or plan to invest soon, in AI or robotic technologies to modernize operations.

The tools are here. The data is here. Smarter inspection is no longer aspirational; it's operational. Petromax and others are showing what's possible: not a leap, but a step forward.

---

Tyler Flanagan is director of service & operations at Houston-based ARIX Technologies.

