
The U.S. government's road safety agency is investigating Tesla's “Full Self-Driving” system after getting reports of crashes in low-visibility conditions, including one that killed a pedestrian.

The National Highway Traffic Safety Administration said in documents that it opened the probe last week after the company reported four crashes when Teslas encountered sun glare, fog and airborne dust.

In addition to the pedestrian's death, another crash involved an injury, the agency said.

Investigators will look into the ability of “Full Self-Driving” to “detect and respond appropriately to reduced roadway visibility conditions, and if so, the contributing circumstances for these crashes.”

The investigation covers roughly 2.4 million Teslas from the 2016 through 2024 model years.

A message was left Friday seeking comment from Tesla, which has repeatedly said the system cannot drive itself and human drivers must be ready to intervene at all times.

Last week Tesla held an event at a Hollywood studio to unveil a fully autonomous robotaxi without a steering wheel or pedals. CEO Elon Musk, who has promised autonomous vehicles before, said the company plans to have autonomous Models Y and 3 running without human drivers next year. Robotaxis without steering wheels would be available in 2026, starting in California and Texas, he said.

The investigation's impact on Tesla's self-driving ambitions isn't clear. NHTSA would have to approve any robotaxi without pedals or a steering wheel, and it's unlikely that would happen while the investigation is in progress. But if the company tries to deploy autonomous vehicles in its existing models, that likely would fall to state regulations. There are no federal regulations specifically focused on autonomous vehicles, although they must meet broader safety rules.

NHTSA also said it would look into whether any other similar crashes involving “Full Self-Driving” have happened in low visibility conditions, and it will seek information from the company on whether any updates affected the system’s performance in those conditions.

“In particular, this review will assess the timing, purpose and capabilities of any such updates, as well as Tesla’s assessment of their safety impact,” the documents said.

Tesla reported the four crashes to NHTSA under an order from the agency covering all automakers. An agency database says the pedestrian was killed in Rimrock, Arizona, in November of 2023 after being hit by a 2021 Tesla Model Y. Rimrock is about 100 miles (161 kilometers) north of Phoenix.

The Arizona Department of Public Safety said in a statement that the crash happened just after 5 p.m. Nov. 27 on Interstate 17. Two vehicles collided on the freeway, blocking the left lane. A Toyota 4Runner stopped, and two people got out to help with traffic control. A red Tesla Model Y then hit the 4Runner and one of the people who exited from it. A 71-year-old woman from Mesa, Arizona, was pronounced dead at the scene.

Because the sun was in the Tesla driver's eyes, the Tesla driver was not charged, said Raul Garcia, public information officer for the department. Sun glare also was a contributing factor in the first collision, he added.

Tesla has twice recalled “Full Self-Driving” under pressure from NHTSA, which in July sought information from law enforcement and the company after a Tesla using the system struck and killed a motorcyclist near Seattle.

The recalls were issued because the system was programmed to run stop signs at slow speeds and because the system disobeyed other traffic laws. Both problems were to be fixed with online software updates.

Critics have said that Tesla’s system, which uses only cameras to spot hazards, doesn’t have the proper sensors to be fully self-driving. Nearly all other companies working on autonomous vehicles use radar and laser sensors in addition to cameras to see better in the dark or in poor visibility conditions.

Musk has said that humans drive with only eyesight, so cars should be able to drive with just cameras. He has called lidar (light detection and ranging), which uses lasers to detect objects, a “fool's errand.”

The “Full Self-Driving” recalls arrived after a three-year investigation into Tesla's less-sophisticated Autopilot system crashing into emergency and other vehicles parked on highways, many with warning lights flashing.

That investigation was closed last April after the agency pressured Tesla into recalling its vehicles to bolster a weak system meant to make sure drivers are paying attention. A few weeks after the recall, NHTSA began investigating whether the recall was working.

NHTSA began its Autopilot crash investigation in 2021, after receiving 11 reports of Teslas using Autopilot striking parked emergency vehicles. In documents explaining why the investigation was ended, NHTSA said it ultimately found 467 crashes involving Autopilot, resulting in 54 injuries and 14 deaths. Autopilot is a fancy version of cruise control, while “Full Self-Driving” has been billed by Musk as capable of driving without human intervention.

The investigation that was opened Thursday enters new territory for NHTSA, which previously had viewed Tesla's systems as assisting drivers rather than driving themselves. With the new probe, the agency is focusing on the capabilities of “Full Self-Driving” rather than simply making sure drivers are paying attention.

Michael Brooks, executive director of the nonprofit Center for Auto Safety, said the previous investigation of Autopilot didn't look at why the Teslas weren't seeing and stopping for emergency vehicles.

“Before they were kind of putting the onus on the driver rather than the car,” he said. “Here they're saying these systems are not capable of appropriately detecting safety hazards whether the drivers are paying attention or not.”


Houston researcher dives into accessibility of public EV charging stations

EV equity

A Rice University professor wants to redraw the map for the placement of electric vehicle charging stations to level the playing field for access to EV power sources.

Xinwu Qian, assistant professor of civil and environmental engineering at Rice, is leading research to rethink where EV charging stations should be installed so that they’re convenient for all motorists going about their day-to-day activities.

“Charging an electric vehicle isn’t just about plugging it in and waiting — it takes 30 minutes to an hour even with the fastest charger — therefore, it’s an activity layered with social, economic, and practical implications,” Qian says on Rice’s website. “While we’ve made great strides in EV adoption, the invisible barriers to public charging access remain a significant challenge.”

According to Qian’s research, public charging stations are more commonly located near low-income households, as these residents are less likely to be able to afford or have access to at-home charging. However, the stations are often far from where those residents conduct their everyday activities.

The Rice report explains that, in contrast, public charging stations are geographically farther from affluent suburban areas. However, they often fit more seamlessly into these residents' daily schedules. As a result, low-income communities face an opportunity gap, where public charging may exist in theory but is less practical in reality.

A 2024 study led by Qian analyzed data from over 28,000 public EV charging stations and 5.5 million points across 20 U.S. cities.

“The findings were stark: Income, rather than proximity, was the dominant factor in determining who benefits most from public EV infrastructure,” Qian says.

“Wealthier individuals were more likely to find a charging station at places they frequent, and they also had the flexibility to spend time at those places while charging their vehicles,” he adds. “Meanwhile, lower-income communities struggled to integrate public charging into their routines due to a compounded issue of shorter dwell times and less alignment with daily activities.”
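As a purely hypothetical illustration of that gap (not the study's methodology or data), the short sketch below contrasts a proximity-only access score with one that asks whether charging actually fits into the stops a person already makes; every number and rule in it is invented for illustration.

```python
from dataclasses import dataclass

# Hypothetical illustration of the proximity-vs-practicality gap described
# above. The trips, distances, and scoring rules are invented for this
# sketch and are not taken from Qian's study.

@dataclass
class Stop:
    distance_to_charger_km: float   # distance from this stop to the nearest public charger
    dwell_minutes: float            # how long the person typically stays there

def proximity_score(home_to_charger_km: float) -> float:
    """Access measured only by distance from home (closer = higher score)."""
    return 1.0 / (1.0 + home_to_charger_km)

def activity_score(stops: list[Stop], min_charge_minutes: float = 30.0) -> float:
    """Access measured by whether charging fits into daily stops:
    a stop only counts if it is near a charger and the dwell time
    covers a full charging session."""
    usable = [s for s in stops
              if s.dwell_minutes >= min_charge_minutes
              and s.distance_to_charger_km <= 0.5]
    return len(usable) / len(stops) if stops else 0.0

# A charger sits near a lower-income household, but daily stops are short
# or far from chargers; a wealthier household is farther from chargers at
# home but dwells long where chargers exist (office, gym, grocery store).
low_income_day = [Stop(3.0, 15), Stop(2.5, 10), Stop(0.4, 20)]
higher_income_day = [Stop(0.3, 480), Stop(0.2, 60), Stop(0.4, 45)]

print("proximity-only:", proximity_score(0.5), "vs", proximity_score(4.0))
print("activity-aligned:", activity_score(low_income_day),
      "vs", activity_score(higher_income_day))
```

In this toy example the proximity-only score favors the lower-income household, while the activity-aligned score flips the result, mirroring the opportunity gap the research describes.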

To make matters worse, businesses often target high-income people when they install charging stations, Qian’s research revealed.

“It’s a sad reality,” Qian said. “If we don’t address these systemic issues now, we risk deepening the divide between those who can afford EVs and those who can’t.”

A grant from the National Science Foundation backs Qian’s further research into this subject. He says the public and private sectors must collaborate to address the inequity in access to public charging stations for EVs.

Energy expert: Unlocking the potential of the Texas grid with AI & DLR

guest column

From bitter cold and flash flooding to wildfire threats, Texas is no stranger to extreme weather, bringing up concerns about the reliability of its grid. Since the winter freeze of 2021, the state’s leaders and lawmakers have more urgently wrestled with how to strengthen the resilience of the grid while also supporting immense load growth.

As Maeve Allsup at Latitude Media pointed out, many of today’s most pressing energy trends are converging in Texas. In fact, a recent ERCOT report estimates that power demand will nearly double by 2030. The spike is driven largely by power-hungry industries, including AI data centers, seeking capacity. To meet this growing demand, Texas has abundant natural gas, solar and wind resources, making it a focal point for the future of energy.

Several new initiatives are underway to modernize the grid, but they take a long time to complete. Building new power generation facilities and transmission lines is necessary, yet these projects can take 10-plus years to finish. None of these approaches delivers both significantly expanded generation and the transmission capacity needed to carry it in the near future.

Beyond “curtailment-enabled headroom”

A study released by Duke University highlighted the “extensive untapped potential” in U.S. power plants for powering up to 100 gigawatts of large loads “while mitigating the need for costly system upgrades.” In a nutshell: There’s enough generating capacity to meet peak demand, so it’s possible to add new loads as long as they don’t add to the peak. New data centers would need to connect flexibly, curtailing or leaning on limited on-site generation or storage to cover those few peak hours. This is what the authors mean by “load flexibility” and “curtailment-enabled headroom.”
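To make the idea concrete, here is a minimal sketch of how curtailment-enabled headroom can be estimated. The hourly load shape, the capacity figure and the 0.5 percent curtailment budget are illustrative assumptions, not ERCOT data or numbers from the Duke study.

```python
import numpy as np

# Sketch: how much constant new load fits under a fixed generating capacity
# if that load agrees to be curtailed during the few peak hours?
# All numbers below are made up for illustration.

rng = np.random.default_rng(0)
hours = 8760
base_load = (50_000
             + 12_000 * np.sin(np.linspace(0, 2 * np.pi, hours))  # seasonal swing, MW
             + rng.normal(0, 2_000, hours))                       # hourly noise, MW
capacity = 70_000                                                  # MW, assumed firm capacity

def fits(new_load_mw, max_curtailed_share=0.005):
    """True if a constant new load can be served while being curtailed
    in at most 0.5% of hours (i.e., it runs 99.5% of the time)."""
    curtailed_hours = np.sum(base_load + new_load_mw > capacity)
    return curtailed_hours / hours <= max_curtailed_share

# Largest constant load that stays within the curtailment budget.
headroom = max(mw for mw in range(0, 30_000, 100) if fits(mw))
print("approximate curtailment-enabled headroom:", headroom, "MW")
```

The point of the exercise is simply that a small curtailment allowance during peak hours can unlock a large block of otherwise "unavailable" capacity on paper.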

As I shared with POWER Magazine, while power plants do have significant untapped capacity, the transmission grid might not. The study doesn’t address transmission constraints that can limit power delivery where it’s needed. Congestion is a real problem already without the extra load and could easily wipe out a majority of that additional capacity.

To illustrate this point, think about where you would build a large data center. Next to a nuclear plant? A nuclear plant already operates flat out and will not have any extra capacity. The “headroom” is available on average across the whole system, not at any single power plant. A peaking gas plant might indeed be idle most of the time, but not 99.5% of the time, the threshold the Duke authors use. Your data center would need to draw the extra capacity from a number of plants, which may be hundreds of miles apart, and the transmission grid might not be able to cope with that.

However, the transmission grid itself also has headroom that has gone untapped so far. Grid operators have not been able to get the most out of their lines because the technology to do so has not existed.

The problem with existing grid management and static line ratings

Traditionally, power lines are given a static rating throughout the year, which is calculated by assuming the worst possible cooling conditions of a hot summer day with no wind. This method leads to conservative capacity estimates and does not account for environmental factors that can impact how much power can actually flow through a line.

Take the wind-cooling effect, for example. Wind cools down power lines and can significantly increase the capacity of the grid. Even a slight wind blowing around four miles per hour can increase transmission line capacity by 30 percent through cooling.

That’s why dynamic line ratings (DLR) are such a useful tool for grid operators. DLR enables the assessment of individual spans of transmission lines to determine how much capacity they can carry under current conditions. On average, DLR increases capacity by a third, helping utilities sell more power while bringing down energy prices for consumers.
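For intuition on why a light breeze matters so much, here is a minimal sketch of a conductor heat balance. It is a deliberately simplified stand-in for the IEEE 738 / CIGRE thermal models that real DLR systems use, and every constant in it is an illustrative assumption rather than a real conductor datasheet value.

```python
import math

# Minimal sketch of how wind cooling raises a line's ampacity, using a
# simplified steady-state heat balance: I^2 * R = convective + radiative
# cooling - solar gain. The correlations and constants are illustrative
# assumptions only, not the full IEEE 738 / CIGRE treatment.

R_AC = 7.3e-5          # ohm per meter, assumed AC resistance at max temperature
T_COND = 75.0          # deg C, assumed maximum allowed conductor temperature
DIAMETER = 0.028       # m, assumed conductor diameter
EMISSIVITY = 0.8       # assumed surface emissivity
SOLAR_GAIN = 15.0      # W per meter, assumed solar heating
SIGMA = 5.67e-8        # Stefan-Boltzmann constant

def ampacity(t_ambient_c, wind_speed_ms):
    """Allowable current (A) for one meter of conductor under these assumptions."""
    dt = T_COND - t_ambient_c
    # Crude convective coefficient: natural-convection floor plus a wind term.
    h = 4.0 + 4.0 * math.sqrt(max(wind_speed_ms, 0.0))   # W/m^2/K, illustrative
    q_convective = h * math.pi * DIAMETER * dt
    q_radiative = (EMISSIVITY * SIGMA * math.pi * DIAMETER
                   * ((T_COND + 273.15) ** 4 - (t_ambient_c + 273.15) ** 4))
    q_net = q_convective + q_radiative - SOLAR_GAIN
    return math.sqrt(max(q_net, 0.0) / R_AC)

static_rating = ampacity(t_ambient_c=40.0, wind_speed_ms=0.0)   # hot day, still air
dynamic_rating = ampacity(t_ambient_c=40.0, wind_speed_ms=1.8)  # roughly a 4 mph breeze
print(f"static-style rating: {static_rating:.0f} A")
print(f"with a light breeze: {dynamic_rating:.0f} A "
      f"(+{100 * (dynamic_rating / static_rating - 1):.0f}%)")
```

Even with these toy numbers, the breeze case comes out on the order of a third higher than the still-air case, which is the same order of magnitude as the gains described above.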

However, DLR is not yet widely used. The core problem is that weather models are not accurate enough for grid operators. Wind is very dependent on the detailed landscape, such as forests or hills, surrounding the power line. A typical weather forecast will tell you the average conditions in the 10 square miles around you, not the wind speed in the forest where the power line is. Without accurate wind data at every section, even a small portion of the line risks overheating unless the line is managed conservatively.

DLR solutions have been forced to rely on sensors installed on transmission lines to collect real-time weather measurements, which are then used to estimate line ratings. However, installing and maintaining hundreds of thousands of sensors is extremely time-consuming, if not practically infeasible.

The Elering case study

Last year, my company, Gridraven, tested our machine learning-powered DLR system, which uses an AI-enabled weather model, on 3,100 miles of 110-kilovolt and 330-kilovolt lines operated by Elering, Estonia’s transmission system operator, predicting ratings at 15,000 individual locations. The power lines run through forests and hills, where conventional forecasting systems cannot predict conditions with precision.

From September to November 2024, our average wind forecast accuracy saw a 60 percent improvement over existing technology, resulting in a 40 percent capacity increase compared to the traditional seasonal rating. These results were further validated against actual measurements on transmission towers.

This pilot not only demonstrated the power of AI solutions against traditional DLR systems but also their reliability in challenging conditions and terrain.

---

Georg Rute is the CEO of Gridraven, a software provider for Dynamic Line Ratings based on precision weather forecasting available globally. Prior to Gridraven, Rute founded Sympower, a virtual power plant, and was the head of smart grid development at Elering, Estonia's Transmission System Operator. Rute will be onsite at CERAWeek in Houston, March 10-14.

The views expressed herein are Rute's own. A version of this article originally appeared on LinkedIn.

Energy co. to build 30 micro-nuclear reactors in Texas to meet rising demand

going nuclear

A Washington, D.C.-based developer of micro-nuclear technology plans to build 30 micro-nuclear reactors near Abilene to address the rising demand for electricity to operate data centers across Texas.

The company, Last Energy, is seeking permission from the Electric Reliability Council of Texas (ERCOT) and the U.S. Nuclear Regulatory Commission to build the microreactors on a more than 200-acre site in Haskell County, about 60 miles north of Abilene.

The privately financed microreactors are expected to go online within roughly two years. They would be connected to ERCOT’s power grid, which serves the bulk of Texas.

“Texas is America’s undisputed energy leader, but skyrocketing population growth and data center development is forcing policymakers, customers, and energy providers to embrace new technologies,” says Bret Kugelmass, founder and CEO of Last Energy.

“Nuclear power is the most effective way to meet Texas’ demand, but our solution—plug-and-play microreactors, designed for scalability and siting flexibility—is the best way to meet it quickly,” Kugelmass adds. “Texas is a state that recognizes energy is a precondition for prosperity, and Last Energy is excited to contribute to that mission.”

Texas is home to more than 340 data centers, according to Perceptive Power Infrastructure. These centers consume nearly 8 gigawatts of power and make up 9 percent of the state’s power demand.

Data centers are one of the most energy-intensive building types, according to the U.S. Department of Energy, and account for approximately 2 percent of total U.S. electricity use.

Microreactors are 100 to 1,000 times smaller than conventional nuclear reactors, according to the Idaho National Laboratory. Yet each Last Energy microreactor can produce 20 megawatts of thermal energy.

Before announcing the 30 proposed microreactors to be located near Abilene, Last Energy built two full-scale prototypes in Texas in tandem with manufacturing partners. The company has also held demonstration events in Texas, including at CERAWeek 2024 in Houston. Last Energy, founded in 2019, is a founding member of the Texas Nuclear Alliance.

“Texas is the energy capital of America, and we are working to be No. 1 in advanced nuclear power,” Governor Greg Abbott said in a statement. “Last Energy’s microreactor project in Haskell County will help fulfill the state’s growing data center demand. Texas must become a national leader in advanced nuclear energy. By working together with industry leaders like Last Energy, we will usher in a nuclear power renaissance in the United States.”

Nuclear energy is not a major source of power in Texas. In 2023, the state’s two nuclear power plants generated about 7% of the state’s electricity, according to the U.S. Energy Information Administration. Texas gets most of its electricity from natural gas, coal, wind, and solar.