Dolby Laboratories chief scientist Poppy Crum, pictured in April 2018, says sensors and artificial intelligence are joining forces to reveal whether someone is lying, infatuated, or poised for violence.
Vancouver (AFP) - Dolby Laboratories chief scientist Poppy Crum tells of a fast-coming time when technology will see right through people no matter how hard they try to hide their feelings.
Sensors combined with artificial intelligence can reveal whether someone is lying, infatuated, or poised for violence, Crum detailed at a big ideas TED Conference.
“It is the end of the poker face,” Crum said.
“We broadcast our emotions. We will know more about each other than we ever have.”
Pupil dilation reveals how hard a brain is working, and heat radiating from the skin signals whether we are stressed or even romantically piqued.
The amount of carbon dioxide exhaled can signal how riled up someone, or a crowd, is getting. Micro-expressions and chemicals in breath reveal feelings.
The timing of someone’s speech can expose whether they are at risk of dementia, diabetes, multiple sclerosis, or bipolar disorder, according to the neuroscientist.
Brain waves can indicate whether someone’s attention is elsewhere in a room, even when their gaze is locked on the person in front of them.
Technology already exists to read such cues, Crum said, and combined with artificial intelligence that can analyze patterns and factor in context, it can magnify empathy when used for good, or enable abuse when used to oppress or manipulate.
“It is really scary on one level, but on another level it is really powerful,” Crum said.
“We can bridge the emotional divide.”
She gave examples of a high school counselor being able to tell whether a seemingly cheery student is having a hard time, or police quickly knowing if someone acting bizarrely has a health condition or is criminally violent.
One could skip scanning profiles on dating apps and, instead, scan people for genuine interest.
Artists would be able to see the emotional reactions people have to their creations.
“I realize a lot of people are having a hard time with people sharing our data, or knowing something we didn’t want to share,” Crum said.
“I am not looking to create a world where our inner lives are ripped open, but I am looking to create a world where we can care about each other more effectively.”
With emotion-reading rooms, smart speakers, and accessories on the way, Crum is keen to see rules put in place to make sure the benefits are equally available to all while malicious uses are prevented.
“It is something people need to realize is here and is going to happen; so let’s make it happen in a way we have control over,” Crum told AFP.
“We will be able to know more about each other than we ever have. Let’s use that for the right reasons rather than the wrong ones.”