It’s like something out of Minority Report…

A new development in the world of artificial intelligence software could lead to a future where criminals can’t get away with lying to police.

That’s right.

After astounding breakthroughs in facial recognition software, law enforcement may be getting one of the most helpful tools ever created.

New facial recognition software could stop liars in their tracks, changing the game when it comes to criminal investigations. (Pixabay)


While psychologist Paul Ekman’s vast research on facial expressions and their ties to lying created an entire network of human lie detectors more than four decades ago, these new advancements might be able to take over. The main company developing this technology is based in the UK, but the United States may soon have this tool at its disposal.

So how does it work?

Facesoft, the UK startup, has reportedly created a database of over 300 million images of faces. The system can identify emotions such as fear and anger based on ‘micro-expressions’, which are often invisible to the naked eye.
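Facesoft hasn’t published how its system works under the hood, but a rough, purely illustrative sketch in Python gives a sense of what “reading micro-expressions” could involve. The classify_emotions function below is a hypothetical stand-in for any model that scores a single video frame for basic emotions; the only logic shown is flagging emotions that spike and vanish within a fraction of a second, the defining trait of a micro-expression.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

# Hypothetical: a model that scores one video frame for basic emotions.
# Facesoft has not published its method; this stands in for any such classifier.
EmotionScores = Dict[str, float]          # e.g. {"fear": 0.82, "anger": 0.05, ...}
FrameClassifier = Callable[[bytes], EmotionScores]

@dataclass
class MicroExpression:
    emotion: str
    start_frame: int
    end_frame: int
    peak_score: float

def find_micro_expressions(
    frames: List[bytes],
    classify_emotions: FrameClassifier,
    fps: float = 30.0,
    threshold: float = 0.7,       # score above which an emotion counts as "showing"
    max_duration_s: float = 0.5,  # micro-expressions last roughly 1/25 to 1/2 second
) -> List[MicroExpression]:
    """Flag emotions that spike above the threshold and disappear again quickly."""
    scores = [classify_emotions(frame) for frame in frames]
    flagged: List[MicroExpression] = []
    emotions = list(scores[0].keys()) if scores else []

    for emotion in emotions:
        start = None
        for i, frame_scores in enumerate(scores):
            above = frame_scores.get(emotion, 0.0) >= threshold
            if above and start is None:
                start = i                        # expression begins
            elif not above and start is not None:
                duration = (i - start) / fps     # expression just ended
                if duration <= max_duration_s:   # brief enough to count as "micro"
                    peak = max(s.get(emotion, 0.0) for s in scores[start:i])
                    flagged.append(MicroExpression(emotion, start, i - 1, peak))
                start = None
    return flagged
```

In practice the classifier, the frame rate and the thresholds would matter far more than this control flow; the point is only that “micro” means a change too brief and too subtle for an observer to catch.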



The software would monitor a subject while they are being questioned by authorities and could alert investigators whenever the subject gives a “tell.”

“If someone smiles insincerely, their mouth may smile, but the smile doesn’t reach their eyes — micro-expressions are more subtle than that and quicker,” said co-founder and Chief Executive Officer Allan Ponniah.
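Ponniah’s example points at one concrete, checkable signal: a smile that moves the mouth but not the eyes. In Ekman’s Facial Action Coding System, a genuine smile engages both the lip-corner puller (AU12) and the cheek raiser around the eyes (AU6). Whether or not Facesoft measures it this way, a minimal sketch of that single check, assuming hypothetical per-frame action-unit intensities as input, might look like this:

```python
from typing import Dict

# Hypothetical per-frame facial action-unit intensities in [0, 1].
# AU12 = lip corner puller (mouth smile), AU6 = cheek raiser (eye involvement).
# The codes come from Ekman's Facial Action Coding System; how Facesoft
# actually measures a smile is not public.
ActionUnits = Dict[str, float]

def looks_insincere_smile(aus: ActionUnits,
                          mouth_threshold: float = 0.6,
                          eye_threshold: float = 0.3) -> bool:
    """Flag a frame where the mouth smiles strongly but the eyes barely move."""
    mouth = aus.get("AU12", 0.0)   # strength of the mouth smile
    eyes = aus.get("AU6", 0.0)     # strength of the eye/cheek involvement
    return mouth >= mouth_threshold and eyes < eye_threshold

# Example: a broad mouth smile with flat eyes would be flagged.
print(looks_insincere_smile({"AU12": 0.8, "AU6": 0.1}))  # True
```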


So, it’s basically like the show ‘Lie to Me’ but with a computer instead of the lead character. Instead of trained law enforcement officials becoming experts in micro-expressions, the technology could replace that part of the job altogether. 

But with new technology comes new problems.

Researchers caution that while these advancements could help solve an immense number of crimes by stopping liars in their tracks, the software could also misfire, flagging deception when in fact there is none.

The algorithms behind this type of software may also be ‘biased’ and ‘opaque’, according to a study of technology that police already use to help decide who should be granted bail, parole or probation.

With the growing technology, police have access to more information than ever before. But installing these platforms within our criminal justice system will be no easy feat. A number of citizens and companies have already raised concerns that such tools infringe on the right to privacy.


For now, it’s unclear when these issues will be worked out. Will these new advancements become the standard in police departments? We’ll have to wait and see.

