U.S. Department Of Transportation Issues Simplified Automated Driving Guidelines

U.S. Secretary of Transportation Elaine Chao paid a visit to Ann Arbor, Mich. today to announce a revised set of guidelines for automated driving systems. A year ago, the Obama administration announced the first set of federal guidelines for companies developing automated driving systems. The new framework is called Automated Driving Systems (ADS) 2.0: A Vision for Safety.

To no one’s surprise, the updated guidelines have been scaled back from the original proposal. Neither the original iteration nor the new guidelines are formal regulations. Both government and industry largely agree that it would be premature to write formal regulations while the technology is still evolving so rapidly.

The main purpose of these guidelines is to serve as a framework for how the companies involved can report on what they are doing as well as to give guidance to regulators and legislators at other levels of government. In her remarks, Chao emphasized that “safety is primary.”

The single biggest change in the updated guidelines is to the so-called safety assessment for the automated vehicles that companies want to test and deploy. In 2016, the Federal Automated Vehicles Policy outlined a 15-point safety assessment. Manufacturers could voluntarily submit the assessment of their vehicles and systems to the National Highway Traffic Safety Administration (NHTSA). NHTSA would review and then publish these safety assessment letters in a process that could take six months or more. However, since the process was entirely voluntary, this almost never happened.

Manufacturers no longer need to submit these safety assessment letters to NHTSA, and the guidelines clarify that no federal approval is required. The assessment has also been simplified from 15 points to 12, and DOT no longer covers so-called Level 2 automated vehicles in these guidelines. Level 2 automation, as defined by the SAE J3016 standard, controls steering, braking, and acceleration but still requires the driver to stay attentive to the road and be prepared to take over.

Tesla’s Autopilot and General Motors’ Super Cruise fall into Level 2. The ADS 2.0 guidelines are concerned only with Level 3, 4, and 5 vehicles, where the driver is not required to remain attentive while the system is operating.

The original policy also included a model state policy intended to guide states in setting up their own regulations for automated vehicles. This too has been replaced with a set of best practices that more clearly defines the roles of state and federal regulators. Traditionally, the federal government has been responsible for defining motor vehicle safety standards such as crash protection, braking systems, and lighting requirements. States have been responsible for licensing and registration of vehicles as well as local rules of the road, such as whether vehicles are allowed to turn right at a red light.

This may well still pose a problem as we get closer to deployment of highly automated vehicles. States have always been responsible for testing and issuing licenses to drivers. However, when the driver is a stack of software, sensors, and computers, does that responsibility still fall to the states? Or does verifying that a virtual driver is competent fall under federal safety standards?