U.S. auto safety regulators are investigating Tesla Inc.’s advanced driver-assistance system known as Autopilot after a series of crashes at emergency scenes.

The National Highway Traffic Safety Administration said in a document made public Monday that it had identified 11 crashes since early 2018 in which a Tesla vehicle that had been using the company’s driver-assistance system struck one or more vehicles involved in an emergency-response situation. Four of those crashes happened this year.

“Most incidents took place after dark and the crash scenes encountered included scene control measures such as first responder vehicle lights, flares, an illuminated arrow board, and road cones,” the agency said.

NHTSA is studying the Autopilot system in some 765,000 Tesla vehicles from the 2014 through 2021 model years.

NHTSA has been looking more closely at advanced driver-assistance systems such as Autopilot as they have become increasingly common in vehicles on the road. The agency recently began requiring companies to regularly report crashes involving advanced driver-assistance or automated driving systems.

Tesla didn’t immediately respond to a request for comment. The company has long said that driving with Autopilot engaged is safer than doing so without it. The system is designed to help with tasks such as steering and keeping a safe distance from other vehicles on the road.

NHTSA has launched more than two dozen investigations into crashes thought to be related to Tesla’s advanced driver-assistance system.

Earlier this year, two U.S. senators urged NHTSA to develop recommendations for improving advanced driver-assistance systems such as Tesla’s Autopilot. Their comments followed a fatal crash in Texas, though another federal safety agency, the National Transportation Safety Board, has since raised doubts that Autopilot was involved.

Tesla’s driver-assistance system has drawn scrutiny for how some drivers misuse the technology, using it to operate the vehicle without their hands on the wheel, for example, against company instructions.

The company recently recalled more than 285,000 vehicles in China to address a cruise-control-related safety issue. China’s State Administration for Market Regulation said Tesla’s cruise-control system could be activated accidentally, potentially causing sudden acceleration. Most of the vehicles affected by the recall were made at Tesla’s factory in Shanghai, and the software fix could be completed remotely. Tesla apologized to car owners in connection with the recall and said it would continue to improve safety in accordance with national requirements.

In the U.S., Tesla agreed early this year to recall roughly 135,000 Model S luxury sedans and Model X sport-utility vehicles over touch-screen failures. Tesla, at the time, said it disagreed that the issue constituted a defect in the vehicles, but that it was going ahead with a recall to conclude the investigation and provide a better experience for customers.

Write to Rebecca Elliott at rebecca.elliott@wsj.com