Identifying Emotions on the Basis of Manual Activation

Arash Eshghi started what turned out to be a very productive fight on our CogSci listserv with this press release: Carnegie Mellon Researchers Identify Emotions Based on Brain Activity and its attendant paper: Identifying Emotions on the Basis of Neural Activation.

I came up with a press release of my own; I might at some point get round to doing the Atlantic Salmon paper on the subject.


Press Release: University Researchers Identify Emotions Based on Finger Activity

New Study Extends “Palm Reading” Research to Feelings by Applying Machine Learning Techniques to Keyboard Data

For the first time, scientists at a university have identified which emotion a person is experiencing based on finger activity.

------------------

 :)      happy

 :(      sad

-----fig 1--------

The study combines keyboards and machine learning to measure finger signals to accurately read emotions in individuals. The findings illustrate how the finger categorizes feelings, giving researchers the first reliable methods to evaluate them.

“Our big breakthrough was the idea of testing typists, who are
experienced at expressing emotional states digitally. We were fortunate,
in that respect, that EECS has so many superb typists,”

said a professor.

For the study, typists were shown the words for 9 emotions: anger, disgust, envy, fear, happiness, lust, pride, sadness, and shame, and were recorded typing them multiple times in random order.

------------------
   :§      8(>_<)8
      8^O
            =)
x-(    ;-b...
 _ _ 
( " )   :-c
             :-@
    (-.-)

 DX        >:(
     :-S     
             ;-)
:0=     :-)

  :(     *:-}

  !-}       (-_-)
-----fig 2--------

The computer model, using statistical information to analyse keyboard activation patterns for the 9 emotion words, was able to guess the emotional content of photos being viewed using only the finger activity of the viewers.
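For the methodologically curious, here is a toy sketch of the kind of classifier the press release gestures at: a nearest-centroid model over per-key press counts. This is not the study's actual model, and the training data below is entirely made up for illustration.

```python
# Toy sketch (not the study's actual model): a nearest-centroid
# classifier over per-key press counts. Each "typing sample" is
# just the literal string the typist produced.
from collections import Counter
import math

def key_vector(sample):
    """Count how often each key (character) was pressed."""
    return Counter(sample.lower())

def centroid(samples):
    """Average key-press counts across several typing samples."""
    total = Counter()
    for s in samples:
        total += key_vector(s)
    return {k: v / len(samples) for k, v in total.items()}

def distance(vec, cent):
    """Euclidean distance between two sparse key-count vectors."""
    keys = set(vec) | set(cent)
    return math.sqrt(sum((vec.get(k, 0) - cent.get(k, 0)) ** 2 for k in keys))

def train(labelled):
    """labelled: dict mapping emotion -> list of typed samples."""
    return {emotion: centroid(samples) for emotion, samples in labelled.items()}

def classify(model, sample):
    """Return the emotion whose centroid is nearest to the sample."""
    vec = key_vector(sample)
    return min(model, key=lambda e: distance(vec, model[e]))

# Hypothetical training data: typists repeatedly typing emotion words.
model = train({
    "happiness": ["happiness", "happiness :)", "happiness"],
    "sadness": ["sadness", "sadness :(", "sadness"],
})
print(classify(model, "hapiness"))  # prints "happiness"
```

Even a misspelled sample lands near the right centroid, which is roughly the sense in which "different people tend to manually encode emotions in remarkably similar ways".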

“Despite manifest differences between people’s psychology, different
people tend to manually encode emotions in remarkably similar ways,”

noted a graduate student.

A surprising finding from the research was that almost equivalent accuracy could be achieved even when the computer model used activation patterns from only one of several subsections of the keyboard.

“This suggests that emotion signatures aren’t limited to specific
regions such as the qwerty parentheses cluster, but produce
characteristic patterns throughout a number of keyboard regions,”

said a senior research programmer.