It’s like something out of Minority Report…
A new development in the world of artificial intelligence software could lead to a future where criminals can’t get away with lying to police.
After astounding breakthroughs in facial recognition software, law enforcement may be getting one of the most helpful tools ever created.
Psychologist Paul Ekman’s vast research on facial expressions and their ties to lying created an entire network of human lie detectors more than four decades ago, but these new advancements might be able to take over that work. While the main company developing the technology is based in the UK, the United States may soon have the tool at its command.
So how does it work?
Facesoft, the UK startup behind the technology, has reportedly built a database of over 300 million images of faces. The system can identify emotions such as fear and anger based on ‘micro-expressions’ that are often invisible to the naked eye.
The software would monitor a subject while they were being questioned by authorities and could alert investigators when the subject gives a “tell.”
“If someone smiles insincerely, their mouth may smile, but the smile doesn’t reach their eyes — micro-expressions are more subtle than that and quicker,” said co-founder and Chief Executive Officer Allan Ponniah.
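Facesoft has not published how its system works, but the defining feature of a micro-expression is that it is both brief and subtle. As a purely illustrative sketch, suppose a hypothetical classifier produced a per-frame score for an emotion; flagging micro-expressions could then amount to finding short-lived spikes in that score (the function name, threshold, and data below are all invented for illustration):

```python
# Hypothetical sketch: treat a "micro-expression" as a brief spike in a
# per-frame emotion score. Facesoft's actual model is proprietary; this
# only illustrates the idea of detecting short-lived signals.

def flag_micro_expressions(scores, threshold=0.7, max_len=3):
    """Return (start, end) frame ranges where the score stays at or
    above `threshold` for at most `max_len` consecutive frames.
    Longer runs are treated as ordinary, sustained expressions."""
    flagged = []
    run_start = None
    for i, s in enumerate(scores):
        if s >= threshold:
            if run_start is None:
                run_start = i
        elif run_start is not None:
            if i - run_start <= max_len:
                flagged.append((run_start, i - 1))
            run_start = None
    # Handle a run that continues to the final frame.
    if run_start is not None and len(scores) - run_start <= max_len:
        flagged.append((run_start, len(scores) - 1))
    return flagged

# Invented per-frame "fear" scores: a 2-frame flash and a sustained run.
fear = [0.1, 0.2, 0.9, 0.8, 0.1, 0.7, 0.7, 0.7, 0.7, 0.2]
print(flag_micro_expressions(fear))  # → [(2, 3)]
```

Only the two-frame flash at frames 2–3 is flagged; the longer run at frames 5–8 is discarded as a sustained expression, mirroring the distinction Ponniah draws between an ordinary insincere smile and a genuinely fleeting micro-expression.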
Today on #LetItRip… Using facial recognition technology to solve crimes. Detroit Police Chief James Craig says it works. But critics believe it violates constitutional rights. Our panel debates if DPD should be allowed to use the technology? The debate starts at 930 a.m. pic.twitter.com/vKEV9h8rC9
— FOX 2 Detroit (@FOX2News) June 30, 2019
So, it’s basically like the show ‘Lie to Me’ but with a computer instead of the lead character. Instead of trained law enforcement officials becoming experts in micro-expressions, the technology could replace that part of the job altogether.
But with new technology comes new problems.
Researchers warn that while the technology could help solve an immense number of crimes by stopping liars in their tracks, it could also misread innocent expressions as deception, flagging a “tell” where there isn’t one.
The algorithms underlying this type of software may also be ‘biased’ and ‘opaque’, according to a study of technology that police currently use to determine who should be granted bail, parole or probation.
As the technology grows, police have access to more information than ever before. But installing these platforms within our criminal justice system will be no easy feat: a number of citizens and companies have already raised concerns that it infringes on the right to privacy.
For now, it’s unclear when, or whether, these problems will be solved. Will the new technology become standard in police departments? We’ll have to wait and see.