AI in law enforcement is growing, but needs work
AI for police spans a range of analytics, machine learning and natural language processing technologies, including facial recognition and automated transcription tools.
While a true robotic crime fighter might not exist yet, federal, state and local law enforcement officials have routinely turned to analytics and AI technologies to help prevent crime and track and arrest criminals.
AI in law enforcement has so far centered primarily on tools that automatically identify suspects in video evidence and on analytics and machine learning models that predict the likelihood of a crime occurring in a particular area. The practice has had some successes, but it has also prompted significant concerns.
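In rough terms, that kind of area-based risk scoring can be thought of as a classifier trained on historical incident counts for each map grid cell. The sketch below, written in Python with scikit-learn on entirely synthetic data, is only an illustration of the general idea, not any department's actual system; the feature names and thresholds are hypothetical.

```python
# Minimal illustration of area-based crime risk scoring (synthetic data only).
# Each row stands for a map grid cell; the features are recent incident counts
# and the label marks whether an incident occurred there in the following week.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical history: [incidents_last_week, incidents_last_month] per cell
X = rng.poisson(lam=2.0, size=(500, 2)).astype(float)
# Toy label: cells with more recent incidents are more likely to see another one
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=500) > 4).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

# Risk scores (probability of an incident) for unseen grid cells
risk = model.predict_proba(X_test)[:, 1]
print("Mean predicted risk:", risk.mean().round(3))
```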
In Baltimore, a city with high crime rates, law enforcement officials rely on many different AI-powered tools and technologies to stop crime. The technologies are relatively new, according to Kevin Davis, the city's police commissioner from 2015 to 2018, but have proven useful.
AI, facial recognition in Baltimore
"In the city of Baltimore, there are surveillance cameras, city watch cameras; we have facial recognition, we have license plate readers, we've got Wi-Fi, all these things that collect data that we've never had before," Davis said.
However, Davis, speaking at the AI World Government conference in Washington, D.C., in June, noted that AI in law enforcement can't fully automate the work of well-trained officers and still requires human oversight.
That can sometimes create problems, he said.
In Baltimore, which has some 1,000 city watch cameras, "we have humans watching the technology," said Davis, who is now chief security officer at Armored Things, a security technology vendor based in Boston.
"I'd like to get to a place where we have technology watching the technology, so you take the subjectivity out of it because when the humans are watching the technology, he or she is applying his or her experiences and discretion," he said.
"Hopefully we can get to a place where technology can look at an environment and identify anomalies" on its own, Davis continued.
Davis noted that while that situation would be ideal, taking humans out of the process entirely would be nearly impossible, even in the future, because laws can have subjective components.
AI problems
AI tools and technologies in law enforcement have also proven to have problems. Analysts have consistently pointed to inadvertent bias in machine learning models as a significant flaw in some AI technologies used by law enforcement, especially facial recognition software.
Earlier this year, for example, Amazon shareholders called for the company to stop selling its facial and object recognition platform, Rekognition, to U.S. public safety departments after studies showed that the platform identified women and people with darker skin less accurately.
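Audits like the studies cited above typically work by breaking a model's error rate out by demographic group on a labeled benchmark. The short sketch below, using pandas with hypothetical column names and placeholder data, shows the basic shape of that comparison rather than any published methodology.

```python
# Minimal sketch of a demographic accuracy audit for a face recognition model.
# The groups, columns and values are hypothetical placeholders; real audits
# compare error rates across groups on labeled benchmark images.
import pandas as pd

results = pd.DataFrame({
    "group":   ["lighter_male", "lighter_female", "darker_male", "darker_female"] * 25,
    "correct": [True, True, True, False] * 25,  # placeholder model outcomes
})

# Accuracy broken out by group: large gaps point to biased performance
per_group_accuracy = results.groupby("group")["correct"].mean()
print(per_group_accuracy)
```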
A recent survey conducted by Blind, the anonymous professional social networking platform, found that employees at major tech vendors were willing to build AI systems for law enforcement, but were also concerned about misuse of that technology.
The survey, which drew 3,826 unique responses, found that more than half of the respondents who worked at Microsoft and Amazon, and about half of those who worked at Google and Salesforce, said they would be willing to build AI software for law enforcement. Those respondents also overwhelmingly said they have concerns that AI and facial recognition technologies could be misused.
Automated transcription
Law enforcement agencies use other AI-powered technologies in addition to facial recognition tools.
VIQ Solutions, a vendor of audio and video capture and management software, regularly works with law enforcement departments around the world.
Headquartered in Mississauga, Canada, VIQ supplies its law enforcement customers with a range of technologies, including a semi-automated transcription system that is cloud platform agnostic.
The transcription software uses machine learning and natural language processing to automatically transcribe audio provided by officers, said Sebastien Pare, VIQ's president and CEO. The audio generally comes from recorded interviews and body cameras.
With human transcribers and editors working alongside the software, turnaround times are quick, Pare said, and accuracy levels have consistently hit higher than 90%.
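VIQ's system isn't publicly documented, but the semi-automated pattern Pare describes can be sketched with the open-source SpeechRecognition library: a machine pass produces a draft, and anything the recognizer can't handle gets routed to a human. The file name and workflow below are hypothetical and stand in for the real product.

```python
# Generic sketch of a semi-automated transcription step, not VIQ's actual system.
# Uses the open-source SpeechRecognition library; "interview.wav" is hypothetical.
import speech_recognition as sr

recognizer = sr.Recognizer()

with sr.AudioFile("interview.wav") as source:
    audio = recognizer.record(source)  # read the full recorded interview

try:
    # Automatic pass: machine transcription of the recorded audio
    draft = recognizer.recognize_google(audio)
except sr.UnknownValueError:
    # Unintelligible audio gets flagged for a human transcriber instead
    draft = None

if draft is None:
    print("Routed to a human transcriber for a full manual pass.")
else:
    # In a semi-automated workflow, a human editor reviews and corrects the draft
    print("Draft transcript for human review:\n", draft)
```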
"There's a lot of data flowing around," Pare said.