Cognitive Emotion

Uses the Microsoft Cognitive Emotion cloud service to analyze camera images, returning an emotion description and a confidence value for use in speech or scripts (requires an internet connection).

How to add the Cognitive Emotion robot skill

  1. Load the most recent release of ARC (Get ARC).
  2. Press the Project tab from the top menu bar in ARC.
  3. Press Add Robot Skill from the button ribbon bar in ARC.
  4. Choose the Camera category tab.
  5. Press the Cognitive Emotion icon to add the robot skill to your project.

Don't have a robot yet?

Follow the Getting Started Guide to build a robot and use the Cognitive Emotion robot skill.


How to use the Cognitive Emotion robot skill

The Synthiam ARC Cognitive Emotion robot skill uses Microsoft's cloud-based emotion service to detect emotions in a camera image (for example: happy, sad, surprised, or neutral). The camera image comes from a Camera Device skill that you add to your ARC project.

If you are using an EZ-B over WiFi, your PC may be connected directly to the robot's access point and lose internet access. Make sure the computer running ARC still has a working internet connection. Many users either:

  • Configure the EZ-B for WiFi Client Mode (robot joins your home/office WiFi), or
  • Add a second USB WiFi adapter so one connection can be dedicated to the robot and the other to the internet.

Beginner Overview

The basic flow is:

  1. Add a Camera Device skill
  2. Add the Cognitive Emotion skill
  3. Point the Emotion skill at your camera
  4. Use the emotion variables in scripts (e.g., make ARC speak the result)
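Step 4 can be sketched in EZ-Script, ARC's built-in scripting language. This is a minimal sketch, assuming the Cognitive Emotion skill populates the $EmotionDescription variable as described later on this page:

# Wait until the Cognitive Emotion skill updates its result,
# then speak it through the PC speaker.
WaitForChange($EmotionDescription)
Say("I think you look " + $EmotionDescription)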

What Can You Do?

Once the Cognitive Emotion skill is running, it updates variables you can use in ARC scripts. A simple first project is to have ARC speak the emotion it detects from the computer’s speaker.

What you need
  • ARC installed and running
  • A working Camera Device skill in your project (USB camera, IP camera, etc.)
  • The Cognitive Emotion skill added to the project
  • An active internet connection
Step-by-step (Beginner)
  1. Add a Camera Device
    In ARC, click Project → Add Robot Skill → search for Camera Device. Configure it so you can see a live video image.
  2. Add the Cognitive Emotion skill
    Click Project → Add Robot Skill → search for Cognitive Emotion.
  3. Select the camera source
    Open the Cognitive Emotion skill’s configuration and choose the Camera Device you want it to analyze. (If you have more than one camera, this step is important.)
  4. Make sure it’s updating
    With the camera pointed at a face, the skill should begin returning an emotion description and a confidence value. If you are not getting results, confirm your internet connection and that the camera image is working.
  5. Add a script to speak the result
    In the Emotion skill’s configuration (or in a separate script), add the example code below.

Here is an example script line that will speak the detected emotion and confidence:

say("You are " + $EmotionConfidence + " percent " + $EmotionDescription)
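To speak every time the detected emotion changes rather than just once, you can wrap that line in a loop. This is a sketch using standard EZ-Script labels and WaitForChange(); SayWait() is used so one phrase finishes before the next begins:

:loop
# Block until the skill reports a new emotion, then announce it
WaitForChange($EmotionDescription)
SayWait("You are " + $EmotionConfidence + " percent " + $EmotionDescription)
Goto(loop)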

See It In Action

DJ Sures created a fun example project using an EZ-Robot JD humanoid. The robot detects a person’s emotion and responds during conversation. This is a great way to see how emotion recognition can be combined with speech and other skills.

If you have an EZ-Robot JD Humanoid, you can try the same sample project:

Download DJ's testemotion.EZB project

A project layout may look similar to this (your skills may vary depending on your robot and camera):

Example ARC project layout showing the Emotion skill and Camera Device

Troubleshooting (Common Beginner Issues)

  • Robot doesn't speak: Check that your PC speakers are working and the volume is up. Also confirm the script is actually running (for example, triggered by a timer, button, or auto-start option in the skill).
  • No emotion results: Make sure the Camera Device is showing a live image, the Emotion skill is set to the correct camera, and ARC has internet access. If you are connected directly to the robot's WiFi, you may need Client Mode or a second WiFi adapter so the PC still has internet.
  • Inaccurate or inconsistent results: Lighting and camera angle matter. Try brighter lighting, keep the face larger in the frame, and reduce motion blur. Results are "best effort" and may vary by environment.
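When diagnosing, it can also help to print the raw variables so you can confirm the skill is updating them at all. A simple EZ-Script sketch (variable names assume the defaults used in the example above):

:check
# Print the current values every 2 seconds to verify updates
Print("Emotion: " + $EmotionDescription + " (" + $EmotionConfidence + ")")
Sleep(2000)
Goto(check)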


#1  

I see the same eight emotions listed for both Cognitive Emotion and Cognitive Face. Does this skill duplicate the emotion detection functionality of the Cognitive Face skill? If not, how are they different when it comes to reading someone's emotions?

Thanks,

Thomas

#3  

FYI, the testemotion example that you can download on this page no longer works. The plugins are out of date.