WASHINGTON – Both car and driver contributed to the 2018 crash of a Tesla in California that left a father of two dead, and federal regulators have shown a “lack of leadership” in addressing safety problems with partially automated vehicles, investigators said Tuesday.

The investigation underscored questions about the safety and marketing of Tesla’s “Autopilot” system.

Robert Sumwalt III, chairman of the National Transportation Safety Board, pointed to a “lack of system safeguards to prevent foreseeable misuses of technology.”

“Industry keeps implementing technology in such a way that people can get injured or killed,” Sumwalt said. “If you own a car with partial automation, you do not own a self-driving car. Don’t pretend that you do.”

The NTSB also cited “shortfalls” in how the U.S. Department of Transportation has overseen partial automation in cars made by Tesla and other manufacturers. NTSB officials said while the National Highway Traffic Safety Administration has numerous open investigations into Tesla vehicles, the regulator has been “misguided” in its largely hands-off approach.

The March 23, 2018, crash that killed Walter Huang after he dropped his son off at preschool was just one of 36,560 road deaths in the United States that year.

But it drew broad – and, to Tesla, unwelcome – attention to the company and potential problems with the high-end technology that is so central to the Silicon Valley electric car pioneer’s brand. After the crash, Tesla placed blame on Huang, an Apple engineer, saying “the only way for this accident to have occurred is if Mr. Huang was not paying attention to the road.”

But days before Huang’s Tesla drove off Highway 101 and into a barrier in Mountain View, California, the car’s “Autopilot” had made a “left steering movement toward” the same area, the NTSB found. Huang caught it in time, investigators said, and he told relatives the same problem had occurred in the same place multiple times before.

Tesla executives did not respond to questions about the potential flaw or what had been done to fix it.

Despite that history, Huang was over-reliant on the “Autopilot” system, the NTSB found, noting that he used his iPhone 8 while behind the wheel and that a strategy game called “Three Kingdoms” was “active during his commute to work.”

The NTSB cited weaknesses in Tesla’s technology for monitoring whether drivers are paying attention, and said limitations in “processing software” were likely associated with Huang’s Tesla driving off the road.

“According to the family, Mr. Huang was well aware that Autopilot was not perfect and, specifically, he told them it was not reliable in that exact location, yet he nonetheless engaged Autopilot at that location,” Tesla said in a 2018 statement.

Huang’s family has sued Tesla, saying that “based on Tesla’s advertising and promotional material” Huang “reasonably believed the 2017 Tesla Model X vehicle was safer than a human-operated vehicle because of Defendant’s claimed technical superiority regarding the vehicle’s autopilot system . . .”

“The vehicle should not leave a marked travel lane and accelerate, without the input of the operator, in such a way as to cause damage, harm or injury,” according to the lawsuit filed in Santa Clara County Superior Court.

Tesla executives have bristled at the scrutiny from the NTSB, which led to an unusual public dust-up and flashes of mutual frustration.

In a rare move, Tesla was removed as a party to the investigation. The NTSB, an independent federal body that investigates airplane, boat and highway crashes, generally brings manufacturers, government officials and others together for the painstaking process of deciphering the probable causes of major crashes. That process can take a year or two, an eternity in the tech world, but the results are widely trusted.

In April 2018, the NTSB revoked Tesla’s party status after the automaker released information about the crash. Tesla says it chose to withdraw and accused the safety agency of being “more concerned with press headlines than actually promoting safety.”

At a November hearing on the federal investigation into the deadly crash of a self-driving Uber SUV, Sumwalt pointed to what he called Uber’s openness with investigators, and contrasted it with the approach of Tesla chief executive Elon Musk.

“I did notice when I talked to their CEO he did not hang up on me,” Sumwalt said, referring to Uber.

“We chose to withdraw from the agreement and issued a statement to correct misleading claims that had been made about Autopilot – claims which made it seem as though Autopilot creates safety problems when the opposite is true,” Tesla said in a 2018 statement.

The company also noted that NHTSA, which is part of the U.S. Department of Transportation, regulates automobiles, adding that the company has “a strong and positive relationship” with NHTSA.

“Lot of respect for NTSB, but NHTSA regulates cars, not NTSB, which is an advisory body. Tesla releases critical crash data affecting public safety immediately & always will. To do otherwise would be unsafe,” Musk tweeted in April 2018.

Huang’s crash was not the first time a Tesla driver was killed while using “Autopilot,” nor would it be the last.

In 2016, a speeding Tesla driver in Williston, Florida, crashed into a truck that turned in front of him. At the time, Tesla said “neither Autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied.”

Musk later said that it was “very likely” a new radar-based braking system would have prevented the crash.

In 2017, NHTSA said it found no defects in the “Autopilot” system in use at the time of that crash. The regulator’s broader federal review of dozens of Autopilot crashes did point to industry-wide challenges.

“Many of the crashes appear to involve driver behavior factors, including traveling too fast for conditions, mode confusion, and distraction,” the investigators wrote. Mode confusion, uncertainty over whether the car or the driver is in control, occurred “during attempted Autopilot activations” and “after inadvertent overrides.”

In 2019, a driver using Autopilot in Delray Beach, Florida, crashed into a tractor-trailer that drove out from a private driveway and slowed in the middle of a highway, blocking the Tesla’s path, the NTSB said.

Neither the Tesla driver nor the Autopilot system “executed evasive maneuvers,” and the car’s roof sheared off as it drove under the truck, according to the NTSB.