Automated License Plate Readers (ALPRs) are being used by a growing number of public and private entities to track drivers' movements and locations. The technology is almost entirely unregulated and open to abuse, so some communities are implementing policies aimed at limiting the potential damage.
In a policy memo, Noah Stein, a research assistant in the Science, Technology, and Public Policy program at the Ford School, examines the landscape and suggests a variety of policy options to better control the use of ALPRs and the data they collect. The project is a part of STPP's Community Partnerships Initiative (CPI).
ALPRs use a combination of high-speed cameras and computer software to log every license plate that passes by the camera. ALPR software compares each plate against a “hot list” of vehicles, including those reported stolen or believed to have been at a recent crime scene, and even those involved in low-level offenses. Police have also used license plate readers to target locations where people have a constitutional right to assemble, such as mosques and political rallies, or where they are engaging in legal activities, such as gun shows.
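To make the matching step concrete, the sketch below shows, in simplified Python, how a plate read might be checked against a hot list. Everything here is illustrative: the PlateRead fields, the normalize step, the confidence threshold, and the hot-list entries are assumptions for the sake of example, not any vendor's actual design.

```python
# Illustrative sketch of ALPR hot-list matching (hypothetical; not any
# vendor's actual code). An ALPR camera produces a plate string plus an
# OCR confidence score; the system checks it against a "hot list" of
# plates flagged for various reasons.

from dataclasses import dataclass
from datetime import datetime

@dataclass
class PlateRead:
    plate: str          # text recognized by the OCR stage
    confidence: float   # OCR confidence, 0.0-1.0
    timestamp: datetime
    camera_id: str

def normalize(plate: str) -> str:
    """Strip spaces/dashes and uppercase, since OCR output varies."""
    return "".join(ch for ch in plate.upper() if ch.isalnum())

# Hypothetical hot list: plate -> reason it was flagged.
HOT_LIST = {
    "ABC1234": "reported stolen",
    "XYZ9876": "wanted in connection with a recent crime",
}

def check_hot_list(read: PlateRead, min_confidence: float = 0.9):
    """Return the flag reason if the read matches the hot list.

    Note the failure mode discussed in the text: if the OCR misreads
    even one character, an innocent driver's plate can match a flagged
    entry, or a flagged plate can be missed entirely.
    """
    if read.confidence < min_confidence:
        return None  # too uncertain to act on
    return HOT_LIST.get(normalize(read.plate))

read = PlateRead("abc 1234", 0.97, datetime.now(), "cam-42")
reason = check_hot_list(read)
if reason:
    print(f"ALERT: {read.plate} flagged: {reason}")
```

The comment in check_hot_list points at the failure mode discussed below: a single misread character, or a stale hot-list entry, can flag an innocent driver's vehicle.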
ALPRs are already in widespread use throughout the United States. A 2013 report sponsored by the Department of Justice found that 77% of police departments serving populations over 100,000 used ALPR technology, and subsequent studies show that use has increased rapidly in the years since. The two principal vendors, Motorola and Flock Safety, collectively claim to serve over 3,000 communities across the US.
At the federal level, regulation of ALPRs is nonexistent, and only sixteen states have enacted some form of regulation. Because most jurisdictions have no laws governing ALPRs, law enforcement and private actors can use the technology however they wish.
Recent studies examining the accuracy of ALPRs show that they often misread license plates, leading to disastrous real-world consequences, including violent arrests of innocent people. ALPR errors arise not only from shortcomings in the technology itself but also from the hot lists it depends on for matches.
Even when ALPRs work as intended, the vast majority of images captured are not connected to any criminal activity. Because most jurisdictions have no retention limits, many agencies keep these scans of innocent people indefinitely. This allows the government to maintain a pervasive, potentially unconstitutional level of surveillance and opens the door to abuse.
In some instances, officers have misused confidential databases “to get information on romantic partners, business associates, neighbors, journalists and others for reasons that have nothing to do with daily police work.” Professional abuse includes targeting religious minorities and communities of color. Reproductive rights advocates are now raising alarms about the ways police and others could use ALPRs to target abortion clinics in the wake of the Supreme Court’s Dobbs decision overturning Roe v. Wade.
Given the potential risks associated with ALPRs, Stein outlines the policy options open to communities: a ban or moratorium on law enforcement use of the technology, or implementation of a range of safeguards.
Many jurisdictions across the country have chosen a complete ban, including cities in California, Indiana, and New York. Recently, the City Council of Ypsilanti, Michigan, voted to ban the technology. Where it is difficult to generate enough political will to counter law enforcement’s likely support of ALPRs, a moratorium can serve as a compromise, giving policymakers time to develop policies that reduce the harms posed by the technology.
Short of a ban, other safeguards can be implemented. In communities where ALPR technology is already in use, proactive oversight and limits can be established, such as capping how long a police department may retain ALPR records (sketched below) or requiring warrants to access the data. Those policies should be transparent to the community and subject to comment and feedback. Additionally, ongoing monitoring and periodic audits should take place to identify both misuse of the technology and any disparate impacts it is having.
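To illustrate what a retention limit means in practice, here is a minimal sketch of an automated purge job over a hypothetical ALPR database. The plate_reads schema and the 30-day window are assumptions for illustration, not values drawn from the memo.

```python
# Illustrative sketch of enforcing a retention limit on ALPR records.
# The schema and the 30-day window are hypothetical, assumed for the
# sake of example rather than taken from any agency's actual policy.

import sqlite3
from datetime import datetime, timedelta

RETENTION_DAYS = 30  # assumed policy window

def purge_expired_reads(conn: sqlite3.Connection) -> int:
    """Delete plate reads older than the retention window and
    return the number of records removed."""
    cutoff = datetime.now() - timedelta(days=RETENTION_DAYS)
    cur = conn.execute(
        "DELETE FROM plate_reads WHERE captured_at < ?",
        (cutoff.isoformat(),),
    )
    conn.commit()
    return cur.rowcount

conn = sqlite3.connect("alpr.db")
conn.execute(
    """CREATE TABLE IF NOT EXISTS plate_reads (
           plate TEXT, captured_at TEXT, camera_id TEXT)"""
)
removed = purge_expired_reads(conn)
print(f"Purged {removed} expired ALPR records")
```

In a real deployment, a job like this would run on a schedule, and its results would be logged so that the periodic audits described above could verify the retention policy is actually being followed.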