Final Project Prototype 4: The Emotion Reader

In developing this prototype, I went back and re-evaluated the original hypothesis from my undergraduate thesis: an American Sign Language interpreter built with the Microsoft Kinect.

The approach broke down into these steps (a rough code sketch follows the list):

  1. Detect the face and set it as the baseline for a grid system.
  2. Record expressions and train the machine on emotions (joy, sadness, anger, etc.).
  3. Store this information in a SQL database.
  4. Run the device and check incoming expressions against the stored ones with a universal if statement.
  5. Tweak the matches based on the speed of the expression.
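
To make steps 4 and 5 concrete, here is a minimal sketch of the matching logic. I've written it in TypeScript rather than the original C++, and the feature grid, emotion labels, and threshold values are all hypothetical stand-ins for what the trained system would actually store:

```typescript
// Hypothetical sketch of steps 4 and 5: compare a live face-grid reading
// against stored emotion templates. All names and numbers are illustrative.

type EmotionTemplate = {
  label: string;      // e.g. "joy", "sadness", "anger"
  features: number[]; // grid offsets captured during training (step 2)
};

const BASE_THRESHOLD = 0.15; // assumed match tolerance
const SPEED_WEIGHT = 0.05;   // assumed weight for step 5's speed tweak

// Euclidean distance between a live face-grid reading and a stored template.
function distance(a: number[], b: number[]): number {
  return Math.sqrt(a.reduce((sum, v, i) => sum + (v - b[i]) ** 2, 0));
}

// Step 4's "universal if statement": report the first stored emotion whose
// template is close enough to the live reading, loosening the tolerance for
// faster expressions (one possible reading of step 5).
function recognize(
  live: number[],
  speed: number, // expression speed, e.g. grid units per second
  templates: EmotionTemplate[],
): string | null {
  const tolerance = BASE_THRESHOLD + SPEED_WEIGHT * speed;
  for (const t of templates) {
    if (distance(live, t.features) < tolerance) {
      return t.label;
    }
  }
  return null; // no emotion recognized
}
```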

One of my early thoughts while working through the idea was that if businesses could sense emotion through technology at all times, they could capitalize on it to sell to the consumer in the actual moment. This would lead us toward machine emotional intelligence, with huge consequences for startups, healthcare, wearables, education, and much more. One of the early lists I developed of practical applications for the tool:

  • Helping to better measure TV ratings.
  • Adding another layer of security at malls, airports, sports arenas, and other public venues to detect malicious intent.
  • Wearables that help people on the autism spectrum discern emotion.
  • Checkout counters and virtual shopping.
  • Creating new virtual reality experiences.

The earliest version of the code was written in C++, and I was able to get it running. However, I wanted to see whether I could run the same system in JavaScript, so that in my next iteration I could run it in a chat-based interface.

Luckily, after hours of late-night work, I was able to get it running using an online server to host the data; a sketch of what that looks like follows.
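
As a rough illustration of that setup, the JavaScript-side flow might look something like the following (again in TypeScript, reusing the EmotionTemplate type and recognize function from the sketch above; the endpoint URL and response shape are assumptions, not the actual service I hosted):

```typescript
// Hypothetical sketch: load trained emotion templates from a hosted endpoint
// and classify a live reading. The URL and JSON shape are illustrative.

async function loadTemplates(url: string): Promise<EmotionTemplate[]> {
  const res = await fetch(url);
  if (!res.ok) throw new Error(`Failed to load templates: ${res.status}`);
  return (await res.json()) as EmotionTemplate[];
}

async function main() {
  const templates = await loadTemplates("https://example.com/api/emotions");
  // A live reading would come from the camera; hard-coded here for the demo.
  const liveReading = [0.1, 0.4, 0.2, 0.9];
  const emotion = recognize(liveReading, /* speed */ 2, templates);
  console.log(emotion ?? "no emotion detected"); // e.g. "joy"
}

main().catch(console.error);
```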