The U.S. National Highway Traffic Safety Administration has raised concerns about Tesla's public messaging on its "Full Self-Driving" system. Photo via tesla.com

The U.S. government's highway safety agency says Tesla is telling drivers in public statements that its vehicles can drive themselves, conflicting with owner's manuals and briefings with the agency that say the electric vehicles need human supervision.

The National Highway Traffic Safety Administration is asking the company to “revisit its communications” to make sure messages are consistent with user instructions.

The request came in a May email to the company from Gregory Magno, a division chief with the agency's Office of Defects Investigation. It was attached to a letter seeking information on a probe into crashes involving Tesla's “Full Self-Driving” system in low-visibility conditions. The letter was posted Friday on the agency's website.

The agency began the investigation in October after getting reports of four crashes involving “Full Self-Driving” when Teslas encountered sun glare, fog and airborne dust. An Arizona pedestrian was killed in one of the crashes.

Critics, including Transportation Secretary Pete Buttigieg, have long accused Tesla of using deceptive names for its partially automated driving systems, including “Full Self-Driving” and “Autopilot,” both of which owners have viewed as fully autonomous.

The letter and email raise further questions about whether Full Self-Driving will be ready for use without human drivers on public roads, as Tesla CEO Elon Musk has predicted. Much of Tesla's stock valuation hinges on the company deploying a fleet of autonomous robotaxis.

Musk, who has promised autonomous vehicles before, said the company plans to have autonomous Models Y and 3 running without human drivers next year. Robotaxis without steering wheels would be available in 2026 starting in California and Texas, he said.

A message was sent Friday seeking comment from Tesla.

In the email, Magno writes that Tesla briefed the agency in April on an offer of a free trial of “Full Self-Driving” and emphasized that the owner's manual, user interface and a YouTube video tell humans that they have to remain vigilant and in full control of their vehicles.

But Magno cited seven posts or reposts by Tesla's account on X, the social media platform owned by Musk, that Magno said indicated that Full Self-Driving is capable of driving itself.

“Tesla's X account has reposted or endorsed postings that exhibit disengaged driver behavior,” Magno wrote. “We believe that Tesla's postings conflict with its stated messaging that the driver is to maintain continued control over the dynamic driving task.”

The postings may encourage drivers to view Full Self-Driving, which now carries the word “supervised” next to it in Tesla materials, as a “chauffeur or robotaxi rather than a partial automation/driver assist system that requires persistent attention and intermittent intervention by the driver,” Magno wrote.

On April 11, for instance, Tesla reposted a story about a man who used Full Self-Driving to travel 13 miles (21 kilometers) from his home to an emergency room during a heart attack just after the free trial began on April 1. A version of Full Self-Driving helped the owner "get to the hospital when he needed immediate medical attention,” the post said.

In addition, Tesla says on its website that use of Full Self-Driving and Autopilot without human supervision depends on “achieving reliability” and regulatory approval, Magno wrote. But the statement is accompanied by a video of a man driving on local roads with his hands on his knees, with a statement that, “The person in the driver's seat is only there for legal reasons. He is not doing anything. The car is driving itself,” the email said.

In the letter seeking information on driving in low-visibility conditions, Magno wrote that the investigation will focus on the system's ability to perform in low-visibility conditions caused by “relatively common traffic occurrences.”

Drivers, he wrote, may not be told by the car when it is their responsibility to decide where Full Self-Driving can safely operate, and they may not fully understand the capabilities of the system.

“This investigation will consider the adequacy of feedback or information the system provides to drivers to enable them to make a decision in real time when the capability of the system has been exceeded,” Magno wrote.

The letter asks Tesla to describe all visual or audio warnings that drivers get that the system “is unable to detect and respond to any reduced visibility condition.”

The agency gave Tesla until Dec. 18 to respond to the letter, but the company can ask for an extension.

That means the investigation is unlikely to be finished by the time President-elect Donald Trump takes office in January. Trump has said he would put Musk in charge of a government efficiency commission to audit agencies and eliminate fraud. Musk spent at least $119 million in a campaign to get Trump elected, and Trump has spoken against government regulations.

Auto safety advocates fear that if Musk gains some control over NHTSA, the Full Self-Driving and other investigations into Tesla could be derailed.

Musk has even floated the idea of helping to develop national safety standards for self-driving vehicles.

“Of course the fox wants to build the henhouse,” said Michael Brooks, executive director of the Center for Auto Safety, a nonprofit watchdog group.

He added that he can't think of anyone who would agree that a business mogul should have direct involvement in regulations that affect the mogul’s companies.

“That’s a huge problem for democracy, really,” Brooks said.

Investigators will look into the ability of “Full Self-Driving” to “detect and respond appropriately to reduced roadway visibility conditions, and if so, the contributing circumstances for these crashes.” Photo courtesy of Tesla

US to probe Texas-based Tesla's self-driving system after pedestrian killed in low-visibility conditions

eyes on the road

The U.S. government's road safety agency is investigating Tesla's “Full Self-Driving” system after getting reports of crashes in low-visibility conditions, including one that killed a pedestrian.

The National Highway Traffic Safety Administration said in documents that it opened the probe last week after the company reported four crashes when Teslas encountered sun glare, fog and airborne dust.

In addition to the pedestrian's death, another crash involved an injury, the agency said.

Investigators will look into the ability of “Full Self-Driving” to “detect and respond appropriately to reduced roadway visibility conditions, and if so, the contributing circumstances for these crashes.”

The investigation covers roughly 2.4 million Teslas from the 2016 through 2024 model years.

A message was left Friday seeking comment from Tesla, which has repeatedly said the system cannot drive itself and human drivers must be ready to intervene at all times.

Last week Tesla held an event at a Hollywood studio to unveil a fully autonomous robotaxi without a steering wheel or pedals. Musk, who has promised autonomous vehicles before, said the company plans to have autonomous Models Y and 3 running without human drivers next year. Robotaxis without steering wheels would be available in 2026 starting in California and Texas, he said.

The investigation's impact on Tesla's self-driving ambitions isn't clear. NHTSA would have to approve any robotaxi without pedals or a steering wheel, and it's unlikely that would happen while the investigation is in progress. But if the company tries to deploy autonomous vehicles in its existing models, that likely would fall to state regulations. There are no federal regulations specifically focused on autonomous vehicles, although they must meet broader safety rules.

NHTSA also said it would look into whether any other similar crashes involving “Full Self-Driving” have happened in low-visibility conditions, and it will seek information from the company on whether any updates affected the system’s performance in those conditions.

“In particular, this review will assess the timing, purpose and capabilities of any such updates, as well as Tesla’s assessment of their safety impact,” the documents said.

Tesla reported the four crashes to NHTSA under an order from the agency covering all automakers. An agency database says the pedestrian was killed in Rimrock, Arizona, in November of 2023 after being hit by a 2021 Tesla Model Y. Rimrock is about 100 miles (161 kilometers) north of Phoenix.

The Arizona Department of Public Safety said in a statement that the crash happened just after 5 p.m. Nov. 27 on Interstate 17. Two vehicles collided on the freeway, blocking the left lane. A Toyota 4Runner stopped, and two people got out to help with traffic control. A red Tesla Model Y then hit the 4Runner and one of the people who exited from it. A 71-year-old woman from Mesa, Arizona, was pronounced dead at the scene.

The collision happened because the sun was in the Tesla driver's eyes, so the Tesla driver was not charged, said Raul Garcia, public information officer for the department. Sun glare also was a contributing factor in the first collision, he added.

Tesla has twice recalled “Full Self-Driving” under pressure from NHTSA, which in July sought information from law enforcement and the company after a Tesla using the system struck and killed a motorcyclist near Seattle.

The recalls were issued because the system was programmed to run stop signs at slow speeds and because the system disobeyed other traffic laws. Both problems were to be fixed with online software updates.

Critics have said that Tesla’s system, which uses only cameras to spot hazards, doesn’t have the proper sensors to be fully self-driving. Nearly all other companies working on autonomous vehicles use radar and laser sensors in addition to cameras to see better in the dark or in poor visibility conditions.

Musk has said that humans drive with only eyesight, so cars should be able to drive with just cameras. He has called lidar (light detection and ranging), which uses lasers to detect objects, a “fool's errand.”

The “Full Self-Driving” recalls arrived after a three-year investigation into Tesla's less-sophisticated Autopilot system crashing into emergency and other vehicles parked on highways, many with warning lights flashing.

That investigation was closed last April after the agency pressured Tesla into recalling its vehicles to bolster a weak system for making sure drivers pay attention. A few weeks after the recall, NHTSA began investigating whether the recall was working.

NHTSA began its Autopilot crash investigation in 2021, after receiving 11 reports of Teslas using Autopilot striking parked emergency vehicles. In documents explaining why the investigation was ended, NHTSA said it ultimately found 467 crashes involving Autopilot, resulting in 54 injuries and 14 deaths. Autopilot is a fancy version of cruise control, while “Full Self-Driving” has been billed by Musk as capable of driving without human intervention.

The investigation that was opened Thursday enters new territory for NHTSA, which previously had viewed Tesla's systems as assisting drivers rather than driving themselves. With the new probe, the agency is focusing on the capabilities of “Full Self-Driving” rather than simply making sure drivers are paying attention.

Michael Brooks, executive director of the nonprofit Center for Auto Safety, said the previous investigation of Autopilot didn't look at why the Teslas weren't seeing and stopping for emergency vehicles.

“Before they were kind of putting the onus on the driver rather than the car,” he said. “Here they're saying these systems are not capable of appropriately detecting safety hazards whether the drivers are paying attention or not.”


Houston energy-focused AI platform raises $5M in Mercury-led seed round

fresh funding

Houston-based Collide, a provider of generative artificial intelligence for the energy sector, has raised $5 million in seed funding led by Houston’s Mercury Fund.

Other investors in the seed round include Bryan Sheffield, founder of Austin-based Parsley Energy, which was acquired by Dallas-based Pioneer Natural Resources in 2021; Billy Quinn, founder and managing partner of Dallas-based private equity firm Pearl Energy Investments; and David Albin, co-founder and former managing partner of Dallas-based private equity firm NGP Capital Partners.

“(Collide) co-founders Collin McLelland and Chuck Yates bring a unique understanding of the oil and gas industry,” Blair Garrou, managing partner at Mercury, said in a news release. “Their backgrounds, combined with Collide’s proprietary knowledge base, create a significant and strategic moat for the platform.”

Collide, founded in 2022, says the funding will enable the company to accelerate the development of its GenAI platform. GenAI creates digital content such as images, videos, text, and music.

Originally launched by Houston media organization Digital Wildcatters as “a professional network and digital community for technical discussions and knowledge sharing,” the company says it will now shift its focus to rolling out its enterprise-level, AI-enabled solution.

Collide explains that its platform gathers and synthesizes data from trusted sources to deliver industry insights for oil and gas professionals. Unlike platforms such as OpenAI, Perplexity, and Microsoft Copilot, Collide’s platform “uniquely accesses a comprehensive, industry-specific knowledge base, including technical papers, internal processes, and a curated Q&A database tailored to energy professionals,” the company said.

Collide says its approximately 6,000 platform users span 122 countries.

CenterPoint reports progress on grid improvements ahead of 2025 hurricane season

grid resilience

As part of an ongoing process to make Houston better prepared for climate disasters, CenterPoint Energy announced its latest progress update on the second phase of the Greater Houston Resiliency Initiative (GHRI).

CenterPoint reported that it has completed 70 percent of its resiliency work and that all GHRI-related actions are expected to be complete before the official start of the 2025 hurricane season.

"Our entire CenterPoint Houston Electric team is focused on completing this historic suite of grid resiliency actions before the start of hurricane season,” Darin Carroll, Senior Vice President of CenterPoint's Electric Business, said in a news release. “That is our goal, and we will achieve it. To date, we have made significant progress as part of this historic effort.”

CenterPoint’s resiliency solutions include clearing higher-risk vegetation across thousands of miles of power lines, adding thousands more automation devices capable of self-healing, installing thousands of storm-resistant poles, and undergrounding hundreds of miles of power lines.

CenterPoint's GHRI efforts, which entered a second phase in September 2024, aim to improve overall grid resiliency and reliability and are estimated to reduce outages for customers by more than 125 million minutes annually, according to the company. The company has undergrounded nearly 350 miles of power lines, about 85 percent of the way toward its target of 400 miles, which will help improve resiliency and reduce the risk of outages. CenterPoint also aims to install the first of 100 new local weather monitoring stations by June 1.

In March, CenterPoint cleared 655 miles of high-risk vegetation near power lines, installed 1,215 automated reliability devices capable of self-healing, and added 3,300 more storm-resilient poles.

In April, CenterPoint will begin building a network of 100 new weather monitoring stations, which will provide 24/7 weather monitoring and storm response preparation.

“We will continue to work every day to complete these critical improvements as part of our company's goal of building the most resilient coastal grid in the country,” Carroll added in the release.

ExxonMobil, Rice launch sustainability initiative with first project underway

power partners

Houston-based ExxonMobil and Rice University announced a master research agreement this week to collaborate on research initiatives on sustainable energy efforts and solutions. The agreement includes one project that’s underway and more that are expected to launch this year.

“Our commitment to science and engineering, combined with Rice’s exceptional resources for research and innovation, will drive solutions to help meet growing energy demand,” Mike Zamora, president of ExxonMobil Technology and Engineering Co., said in a news release. “We’re thrilled to work together with Rice.”

Rice and Exxon will aim to develop “systematic and comprehensive solutions” to support the global energy transition, according to Rice. The university will draw on its prowess in materials science, polymers and catalysts, high-performance computing and applied mathematics.

“Our agreement with ExxonMobil highlights Rice’s ability to bring together diverse expertise to create lasting solutions,” Ramamoorthy Ramesh, executive vice president for research at Rice, said in the release. “This collaboration allows us to tackle key challenges in energy, water and resource sustainability by harnessing the power of an interdisciplinary systems approach.”

The first research project under the agreement focuses on developing advanced technologies to treat desalinated produced water from oil and gas operations for potential reuse. It's being led by Qilin Li, professor of civil and environmental engineering at Rice and co-director of the Nanosystems Engineering Research Center for Nanotechnology-Enabled Water Treatment (NEWT) Center.

Li’s research employs electrochemical advanced oxidation processes to remove harmful organic compounds and ammonia-nitrogen, aiming to make the water safe for applications such as agriculture, wildlife and industrial processes. Additionally, the project explores recovering ammonia and producing hydrogen, contributing to sustainable resource management.

Additional projects under the agreement with Exxon are set to launch in the coming months and years, according to Rice.