Defense & National Security
Case Study: London Metropolitan Police

London Metropolitan Police plan to automatically detect illegal and inappropriate material on confiscated devices

The London Metropolitan Police plan to implement AI software that can identify inappropriate and illegal content, such as child pornography, stored on confiscated devices. Their current software can identify only guns and drugs.

Context

"The Metropolitan Police''s digital forensics department, which last year trawled through 53,000 different devices for incriminating evidence, already uses image recognition software but it is not sophisticated enough to spot indecent images and video, Mark Stokes, the Met''s head of digital and electronics forensics, told the Telegraph. ''We have to grade indecent images for different sentencing, and that has to be done by human beings right now, but machine learning takes that away from humans,'' he said. Handing this work over to computers could save forensics specialists who spend their career trawling through pictures from psychological strain."

The Project

According to The Telegraph, the "digital forensics team uses bespoke software that can identify drugs, guns and money while scanning someone's computer or phone. But it has proven problematic when searching for nudity." However, "Artificial intelligence will take on the gruelling task of scanning for images of child abuse on suspects' phones and computers so that police officers are no longer subjected to psychological trauma within 'two to three years'."

Results

Planned; results not yet available
