
REVEALED: The TSA’s New Computerized ‘Facial and Emotional’ Recognition System

21st Century Wire says…

More Orwellian technology is being rolled out, not just to make you into more of a commodity than you already are, but also to ‘profile’ your emotions.

With no regulation on this issue, corporations are basically writing the privacy rules as they go along. Do you trust them? Where is this really heading?

(Image: Android Headlines)

In reality, they have no control over how third parties using their software might use images of people’s faces and digital signatures of their ’emotions’, storing, sharing and selling that data across macro platforms. Sure, it’s just another new wing of Big Data. Even social media data trawlers like Facebook have already begun moving into facial recognition of their users.

Governments are already using face recognition technology at ports of entry and exit – at airports, sea ports and land ports (train and bus). Some road cameras are also equipped with these systems. The UK is already using software to detect your individual ‘walk’ signature. But that’s only the surface of it…

Now let’s read between the mainstream media lines…

Deception #1

The first layer of this onion of media deception is how the corporate media story (below) makes it sound as if this new digital ‘mood recognition’ technology is all about ‘advertising’.

In actuality, this computerized detection system is heading straight to the Transportation Security Administration (TSA). How do we know that? Because the man behind the facial and behavioral recognition software being rolled out by tech firms Emotient Inc., Affectiva Inc. and Eyeris (see full report below) is none other than Dr. Paul Ekman from the University of California, author of Emotions Revealed.

What the Wall Street Journal has conveniently left out of their story is that back in 2008, Ekman trained human TSA officers to discern the emotions of anybody who, they believed, had a fear of being discovered. We’re told this was done by looking for ‘strange clothing’, perspiration and ‘strange’ facial expressions that indicated ‘impatience’ or ‘anger’. So confident was Ekman of his 2008 pre-crime system that he erroneously claimed back then that his 1984 psycho-screening would have stopped the 9/11 hijackers. Really?

As we speak, Ekman is probably busy helping the TSA and DHS to implement the new computerized version of his old ‘human’ system.

Deception #2

Here is the second layer of the onion. The government and media will eventually admit this is being used by the TSA, and they will insist that it’s only to “catch terrorists”. You can already hear the technocrats and police state apologists harping on the usual talking points, saying something like, “Somewhere out there the needle in the haystack is a bad guy. If our behavior detection systems give us better odds of finding that needle, we’re going to use every tool we can”.

Yes, it’s all about saving lives.

What they will not tell you is that this will be extended to regional and local law enforcement and security firms in order to detect people with “outstanding warrants” – in other words, a reason to be nervous.

This will eventually be rolled out to every area of society, including child support payments, tax collection, fines, parking tickets, and who knows – maybe even auto or health insurance. There’s your technocracy, or Scientific Dictatorship, right there, in full color.

With the Full Body Airport X-Ray Scanners, travelers can opt in or opt out. However, with this new advanced regime, it’s no longer a question of complying or not complying; you will have no choice or say in what the state or a private corporation can do to you.

What’s clearer than ever in 2015 is that such technology will only keep rolling forward until the public actually rejects it…


‘Unmasking Your Emotions’: Using Psychology and Data Mining to Discern Emotions as People Shop, Watch Ads; Breeding Privacy Concerns


Elizabeth Dwoskin and Evelyn M. Rusli
Wall Street Journal

Paul Ekman, perhaps the world’s most famous face reader, fears he has created a monster.

The 80-year-old psychologist pioneered the study of facial expressions in the 1970s, creating a catalog of more than 5,000 muscle movements to show how the subtlest wrinkling of the nose or lift of an eyebrow reveal hidden emotions.

Now, a group of young companies with names like Emotient Inc., Affectiva Inc. and Eyeris are using Dr. Ekman’s research as the backbone of a technology that relies on algorithms to analyze people’s faces and potentially discover their deepest feelings. Collectively, they are amassing an enormous visual database of human emotions, seeking patterns that can predict emotional reactions and behavior on a massive scale.

Dr. Ekman, who agreed to become an adviser to Emotient, says he is torn between the potential power of all this data and the need to ensure it is used responsibly, without infringing on personal privacy.

So far, the technology has been used mostly for market research. Emotient, a San Diego startup whose software can recognize emotions from a database of microexpressions that happen in a fraction of a second, has worked with Honda Motor Co. and Procter & Gamble Co. to gauge people’s emotions as they try out products. Affectiva, an emotion-detection software maker based in Waltham, Mass., has used webcams to monitor consumers as they watch ads for companies like Coca-Cola Co. and Unilever PLC.

The evolving technology has the potential to help people or even save lives. Cameras that could sense when a trucker is exhausted might prevent him from falling asleep at the wheel. Putting cameras embedded with emotion-sensing software in the classroom could help teachers determine whether they were holding their students’ attention.

But other applications are likely to breed privacy concerns. One retailer, for instance, is starting to test software embedded in security cameras that can scan people’s faces and divine their emotions as they walk in and out of its stores. Eyeris, based in Mountain View, Calif., says it has sold its software to federal law-enforcement agencies for use in interrogations.

The danger, Dr. Ekman and privacy advocates say, is that the technology could reveal people’s emotions without their consent, and their feelings could be misinterpreted. People might try to use the software to determine whether their spouse was lying, police might read the emotions of crowds or employers might use it to secretly monitor workers or job applicants.

“I can’t control usage,” Dr. Ekman says of his catalog, called the Facial Action Coding System. “I can only be certain that what I’m providing is at least an accurate depiction of when someone is concealing emotion.”…

Continue this story at Wall Street Journal

READ MORE SCI-TECH NEWS AT: 21st Century Wire Sci-Tech Files


