Robot Knows Your Face and Converses

The first thing you will want to do is experiment with which chatbot you'd like to use. There are a number of them on Synthiam's platform, as you can see in the skill store: https://synthiam.com/Products/Controls/Artificial-Intelligence.

I would recommend using the AIMLBot because it is very configurable and has a feature you need for this project: it can be told who the camera sees looking at the robot. So, install the AIMLBot here: https://synthiam.com/Support/Skills/Artificial-Intelligence/AimlBot?id=16020.

Make Chatbot Speak
The chatbot won't speak by default; it only displays its output in the log window. Let's edit the chatbot configuration and add some Python code that speaks the response out of the PC speaker or the EZB speaker, whichever you choose. Open the AIMLBot configuration and select the response script.
User-inserted image


Now, under the Python tab, add one of these, depending on whether you want the audio out of the PC or the EZB.

Code:


# speak out of the PC
Audio.Say(getVar("$BotResponse"))

# speak out of the EZB
Audio.SayEZB(getVar("$BotResponse"))


User-inserted image
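
If you want the robot to stay quiet when the chatbot has no reply, you can wrap the speak command in a quick check. This is only a minimal sketch, assuming $BotResponse is a plain string that may be empty when there is no match.

Code:


# minimal sketch: only speak when the chatbot actually produced a reply
# (assumption: $BotResponse is an empty string when there is no match)
response = getVar("$BotResponse")

if response != None and len(response) > 0:
    Audio.Say(response)       # PC speaker
    # Audio.SayEZB(response)  # or use the EZB speaker instead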



Speech Recognition
Now you need to speak to the robot. There are dozens of speech recognition robot skills, but Bing Speech Recognition is preferred because it is very reliable and configurable for a project like this. You can install it here: https://synthiam.com/Support/Skills/Audio/Bing-Speech-Recognition?id=16209.

Connect Speech Recognition to Chatbot
Now you need to connect the speech recognition to the chatbot so that when you speak, the detected phrase is pushed into the AIML chatbot. Open the Bing Speech Recognition configuration screen and add this code to the All Recognized Scripts. Since you're using Python, use the Python tab.
User-inserted image


User-inserted image


Code:


ControlCommand("AimlBot", "SetPhrase", getVar("$BingSpeech"))
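
If background noise occasionally produces an empty or whitespace-only phrase, you can skip forwarding it to the chatbot. A minimal sketch, assuming $BingSpeech is a plain string:

Code:


# minimal sketch: only forward useful phrases to the chatbot
# (assumption: $BingSpeech is an empty string when nothing was recognized)
phrase = getVar("$BingSpeech")

if phrase != None and len(phrase.strip()) > 0:
    ControlCommand("AimlBot", "SetPhrase", phrase)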


Once you save that configuration setting, you can start talking to the robot, and the chatbot will print responses back. Now that you have the chatbot working, let's add the camera so the robot knows who it is seeing...

Add Camera Device
Add the Camera Device robot skill so the robot can see. Install it from here: https://synthiam.com/Support/Skills/Camera/Camera-Device?id=16120
- Select your camera input, whether an EZB camera or USB
- Select your resolution
User-inserted image



Detect and Remember a Face
There are a few ways to remember and recognize faces. The most popular is the Cognitive Face robot skill, which remembers the face and also reports emotion, age, etc. It will allow your robot to recognize how happy or sad someone is. So we'll go ahead with Cognitive Face. Add the Cognitive Face robot skill from here: https://synthiam.com/Support/Skills/Camera/Cognitive-Face?id=16210.

The other method is to train the face as an object using object training in the camera device. You will have to read the camera device manual for information on that: https://synthiam.com/Support/Skills/Camera/Camera-Device?id=16120.

Now, when you press the DETECT button, the information about the person will be displayed. If the robot does not know you yet, press the LEARN button to teach it who you are.
User-inserted image



Now we can have the camera tell the chatbot who is talking to it. Press the configuration button on the Cognitive Face skill and add this Python code to the script. It will send the name of the currently detected face to the AIMLBot chatbot by setting the NAME parameter.
User-inserted image


Code:


ControlCommand("AimlBot", "SetValue", "name", getVar("$FaceName"))

User-inserted image
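
You can also extend that script so the robot greets the person out loud when it recognizes them. A minimal sketch, reusing the $FaceName variable and the Audio command from earlier; the "Unknown" check is an assumption about what the skill reports for an unrecognized face, so adjust it to whatever you see in the variable watcher.

Code:


# minimal sketch: tell the chatbot the name and greet the person
# (assumption: $FaceName is empty or "Unknown" when the face isn't recognized)
name = getVar("$FaceName")

if name != None and len(name) > 0 and name != "Unknown":
    ControlCommand("AimlBot", "SetValue", "name", name)
    Audio.Say("Hello " + name)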



Make Camera Device Detect Face
Now we need to tell the camera device to track faces. On the camera device, switch to the tracking tab and select FACE.
User-inserted image



Now let's add a script that tells the Cognitive Face skill to detect the face. This is like pushing the DETECT button every time a face is detected. Edit the configuration of the camera device and add this script to TRACKING START.

Code:


ControlCommand("Cognitive Face", "Detect")

User-inserted image
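
Cognitive Face sends the image to a cloud service, so you may not want to run Detect on every single tracking event. Below is a minimal throttle sketch; it assumes ARC's Python scripts can use the standard time module, that getVar() returns None for a variable that has never been set, and that a companion setVar() is available.

Code:


import time  # assumption: the standard library is available to ARC's Python scripts

# minimal sketch: only run Detect if it hasn't run in the last 5 seconds
last = getVar("$LastFaceDetect")

if last == None or time.time() - float(last) > 5:
    setVar("$LastFaceDetect", time.time())
    ControlCommand("Cognitive Face", "Detect")
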
There, now you're done with the AI.

The next step is to make the head move. The Auto Position robot skill is the best tool for that. Install it from here: https://synthiam.com/Support/Skills/Servo/Auto-Position-Gait?id=20314
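
If you later want a script (for example, the chatbot's response script) to trigger one of your head animations, the Auto Position skill can be driven with a ControlCommand. A minimal sketch; check the Cheat Sheet tab in the script editor for the exact ControlCommand your skill exposes, and note that "Head Nod" is a hypothetical action name you would first create in the Auto Position skill.

Code:


# hypothetical example: run an Auto Position action called "Head Nod"
# (create an action with this name in the Auto Position skill first, and verify
#  the ControlCommand name in ARC's Cheat Sheet)
ControlCommand("Auto Position", "AutoPositionAction", "Head Nod")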

There's more than enough information in the Auto Position manual to explain how to create your animations. Here is the project all working: aiml chatbot.EZB. All you need to add next is the Auto Position actions for moving the robot head's servos. You can also configure the camera device to move the robot head's eyes to follow the face; those are just a few settings in the camera device.

Select the Track by relative position option if the camera is stationary and not moving. Here are the camera settings you need to have the robot's eyes move and follow the face.
User-inserted image




User-inserted image