This was a quick, random project.
What you see is coloured by how you feel. The Emotional Mirror brings you face to face with this phenomenon by analysing your facial expression and reflecting it back to you. Along with your own image, it displays your thoughts as tweets that embody the emotion you're experiencing. While you're looking into the mirror, the feedback loop between sensation and perception becomes more visible: if you smile, happy tweets appear; if you frown, you see sad tweets.
The installation uses computer vision to identify the facial expression of the user. It then searches Twitter for random tweets and performs sentiment analysis on their text, categorising each as emotionally positive or negative. Tweets that correspond to the detected expression are presented as clouds that appear around the user's head and then float away. Smiling brings happy tweets, while a sad face brings sad ones.
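The expression-detection step can be illustrated with a simple geometric heuristic: in image coordinates (where y grows downward), a smile lifts the mouth corners above the mouth centre, and a frown pulls them below it. This is a minimal sketch of that idea in Python; the actual installation uses the ofxFaceTracker landmark model in C++, and the function names, landmark inputs, and threshold here are illustrative assumptions, not the project's code.

```python
def classify_expression(mouth_corners, mouth_center, threshold=2.0):
    """Classify a face as happy, sad, or neutral from mouth landmarks.

    mouth_corners: list of (x, y) points for the left/right mouth corners.
    mouth_center:  (x, y) point for the centre of the mouth.
    threshold:     pixel margin before we commit to an emotion (assumed value).
    """
    avg_corner_y = sum(y for _, y in mouth_corners) / len(mouth_corners)
    # Positive delta: corners sit above the centre (a smile, since y grows downward).
    delta = mouth_center[1] - avg_corner_y
    if delta > threshold:
        return "happy"
    if delta < -threshold:
        return "sad"
    return "neutral"


# Corners well above the centre read as a smile:
print(classify_expression([(30, 48), (70, 48)], (50, 55)))  # → happy
# Corners below the centre read as a frown:
print(classify_expression([(30, 60), (70, 60)], (50, 55)))  # → sad
```

In practice a landmark tracker's smile estimate is noisier than this, so the real system would smooth the signal over several frames before switching which tweets it shows.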
The project was created using Python and openFrameworks/C++, making use of the ofxFaceTracker addon to track faces, the VADER library for sentiment analysis of tweets, and Tweepy to interface with the Twitter stream. The source for the display and the tweet-analyser backend can be found on our GitHub.
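The tweet-matching logic can be sketched as follows: score each tweet's text, then keep only the tweets whose polarity matches the detected expression. This toy version uses a tiny hand-written word list purely for illustration; the actual backend uses VADER's full lexicon and scoring, and the word sets and function names below are assumptions of this sketch.

```python
import re

# Tiny stand-in lexicon; the real project uses VADER's sentiment scores.
POSITIVE = {"happy", "love", "great", "awesome", "joy", "wonderful"}
NEGATIVE = {"sad", "hate", "terrible", "awful", "angry", "lonely"}


def sentiment(text):
    """Label text as positive, negative, or neutral by counting lexicon hits."""
    words = re.findall(r"[a-z']+", text.lower())
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"


def tweets_for_expression(tweets, expression):
    """Pick the tweets whose sentiment matches the user's facial expression."""
    wanted = "positive" if expression == "happy" else "negative"
    return [t for t in tweets if sentiment(t) == wanted]


tweets = ["I love this sunny day", "feeling so lonely tonight", "lunch was fine"]
print(tweets_for_expression(tweets, "happy"))  # → ['I love this sunny day']
print(tweets_for_expression(tweets, "sad"))    # → ['feeling so lonely tonight']
```

Filtering on the backend like this means the display only ever receives tweets that are safe to show for the current emotion, which keeps the mirror responsive when the expression changes.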
We worked with Michele Panegrossi and Laura Pedroni, who assisted with the hardware and conceptual design.