Congressional Drunk Driver Detection Mandate Raises Privacy Questions

The vague mandate leaves the door wide open for intrusion and the collection of sensitive data.

Congress has mandated that starting later this decade, all new cars must have a built-in ability to detect drunk drivers and disable their vehicles. However, Congress gave the Department of Transportation wide latitude to decide how best to implement such a technology, creating a very real potential that we’ll end up with a system that is a privacy disaster.

The measure, which was included in the $1.2 trillion infrastructure bill signed by President Biden last week, says vehicles must be equipped with “advanced drunk and impaired driving prevention technology.” What is that? Nobody really knows, but Congress defines it as a system that can either “passively monitor the performance of a driver” to detect if they are impaired, or “passively and accurately detect” whether the driver’s blood alcohol level is above the legal limit. If impairment or a blood alcohol level above the legal limit is detected, the system is required to “prevent or limit motor vehicle operation.”

Driving under the influence of alcohol is a serious problem that results in thousands of preventable deaths every year. But of course, mandating something called “advanced drunk driving technology” doesn’t mean that Congress can conjure such a capability into existence, and it’s far from clear how it would work — or how well.

One key word in the measure is “passively.” Some states already require in-car breathalyzers for people with DUI convictions. Known as “ignition interlock devices,” they require drivers to blow an alcohol-free breath into a tube before their car will start. But Congress has ordered that technology produced to meet this mandate must “passively” detect impairment or intoxication. That means no breathalyzer tubes; Congress wants a system that works automatically, without drivers having to do anything.

One possibility is that such a system would involve video analytics. Some automakers have already begun equipping their cars with AI cameras that warn drivers if they appear distracted or drowsy. Employers such as Amazon have imposed similar machine-vision nannies on the workers who drive for them. This kind of ignition interlock system would raise a lot of questions:

  • How would it work? Video analytics technology (as we discussed in this report) has made great strides but continues to work poorly in many respects. In particular, a number of driver monitoring products are based on “emotion recognition” algorithms that are so problematic as to basically constitute snake oil. The visual detection of intoxication would seem to be an even harder problem.
  • Would such a system falsely classify people with certain disabilities as being intoxicated?
  • Such a system would require every car to have a built-in camera focused on the driver. Would that video be stored, or processed in real-time? Would that camera be available for other applications? If so, would the data all flow to the same place?
  • Would the system check the driver when they start their car, or continuously monitor them while they’re behind the wheel? The latter concept would involve the collection of far more data. It would also raise questions about how a car that is in motion — and potentially in the middle of merging onto a highway — could be safely disabled.
  • Will the system err on the side of false negatives (letting some drunk drivers slip through) or false positives (missing fewer drunk drivers, but preventing more sober people from starting their cars)? Every system has errors, but depending on how sensitive you make it, you can tilt the balance between false positives and false negatives.
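
The trade-off in that last question can be sketched numerically. The snippet below is a purely hypothetical illustration, assuming a detector that outputs an impairment score between 0 and 1 and using made-up, overlapping score distributions for sober and impaired drivers; it is not based on any real product. The point is only that moving the decision threshold shifts errors from one column to the other:

```python
import random

random.seed(0)

# Hypothetical scores: sober drivers tend to score low, impaired drivers
# high, but the distributions overlap -- so no threshold is error-free.
sober = [random.gauss(0.3, 0.15) for _ in range(10_000)]
impaired = [random.gauss(0.7, 0.15) for _ in range(10_000)]

def error_rates(threshold):
    """Fraction of sober drivers blocked (false positives) and
    impaired drivers missed (false negatives) at a given threshold."""
    false_positives = sum(s >= threshold for s in sober) / len(sober)
    false_negatives = sum(s < threshold for s in impaired) / len(impaired)
    return false_positives, false_negatives

for t in (0.4, 0.5, 0.6):
    fp, fn = error_rates(t)
    print(f"threshold={t:.1f}  sober blocked={fp:.1%}  impaired missed={fn:.1%}")
```

Raising the threshold blocks fewer sober drivers but misses more impaired ones, and vice versa; the regulator or automaker has to choose where on that curve the system sits.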

It’s also possible that DOT or automakers (if DOT issues performance-based regulations that leave it up to carmakers to select their own technology) could turn to some kind of system that remotely analyzes the driver’s breath or gathers other physiological information.

In any case, any technology imposed to fulfill Congress’s mandate will involve sensors that collect data about drivers’ bodies, and no technology should be implemented that doesn’t strongly protect that data. Cars today are basically computers on wheels, and the state of privacy of those computers/cars is shameful, with automakers collecting all sorts of data without the meaningful knowledge or consent of drivers. It would be utterly unacceptable for data from AI interlock devices to become part of that data stream. Any system should be required to be designed at an architectural level to prevent the sharing of data. No data should be permitted to be collected that isn’t necessary for the operation of the system or stored any longer than necessary. The purpose of this system is not forensic — it is not to help catch and prosecute drunk drivers. The purpose is to prevent drunk people from driving at all.

This is not some free online ad-supported service that people are choosing to pay for with their privacy or can opt out of; it would be mandated by the federal government. Privacy protection must be included.

Congress directed that regulations implementing this mandate be issued within three years of the bill’s enactment, with the option of another three-year extension if necessary. That means there will likely be many years in which to consider this issue and to debate how it’s implemented. We will be carefully watching every step of the way.