DOJ Expands AI Use for Law Enforcement, FBI Doubles AI Applications in 2025
New information shows how the Department of Justice (DOJ) is deploying artificial intelligence (AI) tools in its law enforcement practices.
Overall, DOJ listed 315 total AI entries with 188 active use cases in its 2025 inventory. That’s a 31 percent increase from 2024. Law enforcement applications accounted for 62 percent of deployed use cases, with administrative and IT-related systems making up 23 percent.
DOJ noted its inventory is incomplete with some information “withheld due to recognized information sharing restrictions in compliance with existing laws, policies, and regulations.”
FBI Use Cases
A closer look at the numbers shows that the Federal Bureau of Investigation (FBI) more than doubled its AI applications over the past year.
In 2025, the FBI reported 50 AI use cases in total, 27 of them categorized as law enforcement related. That’s up from 19 use cases overall in 2024, 15 of which were designated as aiding “law and justice.”
The FBI is deploying AI for a wide variety of tasks, including nine designated as high-impact. Those include facial recognition and data-mapping technology, biometric matching across large datasets, next-generation location, and automatic vehicle identification.
Low-impact use cases include summarization and transcription tools, redaction tools, and data extraction tools.
However, the report notes that the FBI has not yet completed the required impact assessments on issues like bias, safety, and performance. The Trump Administration set a deadline of April 2026 for all federal agencies’ high-impact AI use cases to meet minimum risk management requirements.
Other DOJ agencies have also not completed the required assessments, but say the work is in progress.
Privacy Advocates Voice Concerns
The rapid adoption of AI technology, and the way DOJ is reporting it, are raising concerns among privacy advocates.
“There’s a level of ambiguity in how they’ve chosen to describe some of the things in the document itself,” said Patrick Eddington, senior fellow in homeland security and civil liberties at the Cato Institute. “It’s extremely difficult to get any kind of definitive sense in terms of how good it is, who’s conducting the oversight or auditing or stress testing.”
The AI inventory is required under a 2020 executive order from President Trump’s first term.