‘Poker face’ stripped away by new-age tech

By AFP - Apr 15,2018 - Last updated at Apr 15,2018

VANCOUVER — Dolby Laboratories chief scientist Poppy Crum tells of a fast-coming time when technology will see right through people no matter how hard they try to hide their feelings.

Sensors combined with artificial intelligence can reveal whether someone is lying, infatuated, or poised for violence, Crum detailed at a big ideas TED Conference.

“It is the end of the poker face,” Crum said.

“We broadcast our emotions. We will know more about each other than we ever have.”

Eye dilation reveals how hard a brain is working, and heat radiating from the skin signals whether we are stressed or even romantically piqued.

The amount of carbon dioxide exhaled can signal how riled up someone, or a crowd, is getting. Micro-expressions and chemicals in breath reveal feelings.

The timing of someone’s speech can expose whether they are at risk of dementia, diabetes, multiple sclerosis, or bipolar disorder, according to the neuroscientist.

Brain waves can indicate whether someone’s attention is elsewhere in a room, even when their gaze is locked on the person in front of them.

Technology exists to read such cues and, combined with artificial intelligence that can analyse patterns and factor in context, can magnify empathy if used for good or lead to abuses if used to oppress or manipulate, said Crum.

“It is really scary on one level, but on another level it is really powerful,” Crum said.

“We can bridge the emotional divide.”

She gave examples of a high school counsellor being able to tell whether a seemingly cheery student is having a hard time, or police quickly knowing whether someone acting bizarrely has a health condition or is criminally violent.

One could skip scanning profiles on dating apps and, instead, scan people for genuine interest.

Artists would be able to see the emotional reactions people have to their creations.

“I realise a lot of people are having a hard time with people sharing our data, or knowing something we didn’t want to share,” Crum said.

“I am not looking to create a world where our inner lives are ripped open, but I am looking to create a world where we can care about each other more effectively.”

With emotion-reading rooms, smart speakers, or accessories on their way, Crum is keen to see rules in place to make sure benefits are equally available to all while malicious uses are prevented.

“It is something people need to realise is here and is going to happen; so let’s make it happen in a way we have control over,” Crum told AFP.

“We will be able to know more about each other than we ever have. Let’s use that for the right reasons rather than the wrong ones.”
