Asked
Resolved by EzAng!

Blockly Workspace

Hi Fellow Roboters! I have a question - I am a newbie and I have followed the precise instructions for emotion recognition using Blockly. However, the Cognitive Emotion plugin states it does not recognize a face. Also, in Blockly, Six will only say "I think you are feeling" - that's it. I added the EmotionDescription variable, but it does not say any emotion. It says "I think you are feeling zero", or once it said "I think you are feeling neutral". Suggestions?


Related Hardware Six Hexapod
Related Control Cognitive Emotion


PRO
USA
#1   — Edited

Cognitive Emotion:

Step 1 - start the camera

Step 2 - start Cognitive Emotion

Step 3 - add this to the Cognitive Emotion EZ-Script (next to the Blockly tab):

for PC

Say("I see " + $EmotionDescription)

or

for EZ-Robots

SayEZB("I see " + $EmotionDescription)
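If the variable hasn't been filled yet (which looks like the "I think you are feeling zero" symptom in the question), you can guard the speech first. This is only a rough sketch, not from the skill's documentation - the empty / "0" check is an assumption about what the variable holds before a face is detected:

# hedged sketch - assumes $EmotionDescription is empty or "0" until a face is detected
IF ($EmotionDescription != "" AND $EmotionDescription != "0")
  SayEZB("I think you are feeling " + $EmotionDescription)
ELSE
  SayEZB("I cannot see a face right now")
ENDIF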

Cognitive Vision:

Step 1 - start the camera

Step 2 - start Cognitive Vision

Step 3 - add this to the Cognitive Vision EZ-Script (next to the Blockly tab):

(for PC)

Say("I see " + $VisionDescription)

for EZ-Robots

SayEZB("I see " + $VisionDescription)

In the Camera skill, click Object, then "Train New Object", have it record your face, and name the object (your name).

Save.

That's it.
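Optionally, if you run both skills, you could combine the two variables into one sentence - just a variation on the scripts above, assuming both skills have already run and filled their variables:

SayEZB("I see " + $VisionDescription + " and I think you are feeling " + $EmotionDescription)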

#2  

"I think you are feeling" is a very strong statement for a robot though!!! :D

PRO
Synthiam
#3  

Lol Mickey - the OP stumbled across sentience

#4  

Thank you so much EZang60! I will try that. My apologies for the slow thanks - I have a lot to do in preparation for college graduation. I appreciate your time and help. ~Blessings

PRO
USA
#5   — Edited

OK, that should work - I tried it myself. If not, let me know. Good luck with your college graduation. What are your current goals?

#6   — Edited

Hi EZang60, the Cognitive Vision works great! However, the Emotion Description does nothing at all. Is there some code besides SayEZB("I see " + $EmotionDescription) that I should be adding to the EZ-Script in the Cognitive Emotion skill? It keeps giving an error message: no face recognition. Cognitive Vision gives a description of what I look like, including the fact that I wear eyeglasses, so all good there, but the emotion description is what I really want. Here's what I would like it to do: when a random face appears in the camera, tell what emotion the person is showing, e.g. angry, happy, sad, etc. If the random person asks the robot how they feel, it will say the emotion. Hope I have explained it clearly enough for you. I graduate in 2020, but shoo, they got lots of lit tasks to complete prior to graduation lol. I plan to be a Digital Forensics expert... :-) But I am finding working with robots both challenging and interesting lol. So we shall see.

PRO
USA
#7   — Edited

Put SayEZB("I see " + $EmotionDescription) in the Cognitive Emotion script for your robot.

make sure the variables are:

detected emotion = $EmotionDescription
confidence = $EmotionConfidence

Start the camera, then in the camera's tracking settings, click "Face".

correct, put your face in front of the camera, lol - assume nothing
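Since the skill also sets $EmotionConfidence, you could announce the emotion only when the confidence is high enough. This is only a sketch - I'm not certain whether the variable is a 0-1 fraction or a 0-100 percentage, so adjust the threshold to match what your skill reports:

# hedged sketch - the threshold scale (0-1 vs 0-100) is an assumption, adjust as needed
IF ($EmotionConfidence > 0.6)
  SayEZB("I think you are feeling " + $EmotionDescription)
ELSE
  SayEZB("I am not sure how you are feeling")
ENDIF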

Digital Forensics expert, sounds good -  working with robots is the now and the future

PRO
Synthiam
#8  

You must have your face in front of the camera for the control to recognize and detect your emotion. The control cannot detect emotion if there is no human face on the camera image. Also, ensure you are close enough to the camera to have your face in it.

Here's a good tutorial by EZ-Robot to get you started. There are a lot of great tutorials on their website that I highly recommend. Follow this link: https://synthiam.com/Community/Tutorials/92

PRO
USA
#9  

correct, put your face in front of the camera, lol - assume nothing

PRO
Synthiam
#10   — Edited

LOL - well, that message is returned by the cognitive service engine if there's no face detected. So....

#11  

Thank you both. I have followed all these instructions and it still says the same thing, sooo... guess I'm not smarter than a first grader, because if I get my face in the camera any more I will be the robot lol. It has a trained image of my face also, so I'm not sure what more I can do. It was a nice idea tho. I will leave this to you experts; as I said, I am a newbie, but reading is fundamental and comprehension isn't lost on me. I'm not sure how it could tell emotion IF a face wasn't in camera view. I thought this was where I should ask any questions, dumb or otherwise - my apologies.

PRO
USA
#12   — Edited

All questions are welcome - asking questions is how you learn.

Did you follow - https://synthiam.com/Community/Tutorials/92

Just follow the video

Can you send me a picture of what is happening?

PRO
USA
#13  

@bonreed: simple things first: you have a project with a Camera and a Cognitive Emotion skill, then you start the camera.

User-inserted image

Then you press the "Describe Image" button. In my case the camera is not pointing at a face. Then I point the camera at a face and press "Describe Image" again

User-inserted image

and the service returns the results. Can you do a similar test (Pic 2) and post a screen capture.

PRO
USA
#14   — Edited

User-inserted image

I followed the video. I heard the robot say "I think you are feeling happiness" - it read 100% happiness.

PRO
USA
#15  

Your screenshot says 99.4% neutral.

Did you hear the robot speak?

PRO
USA
#16   — Edited

User-inserted image

Leave a space after "feeling", then click Save.

here is the breakdown:

User-inserted image
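The trailing space matters because the variable is joined directly onto the quoted text - without it the robot runs the words together. Spelled out, it is the same line from earlier in the thread:

SayEZB("I think you are feeling " + $EmotionDescription)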

PRO
USA
#17  

Gee, thanks - must be that! LOL

BTW, you are insisting on EZ-Scripts / Blockly when the problem seems to be the absence of a face.

PRO
USA
#18   — Edited

Now that you got to "face the facts" - a face in front of the camera, lol - a simple question: did you hear the audio from the robot?

PRO
USA
#19  

I can't hear anything - no sound, nada!

PRO
USA
#20   — Edited

I thought so.

Always use a freshly charged robot.

Step 1 - adjust the volume on your computer

Step 2 - open ARC, go to the Audio tab, click on Soundboard (EZB), and raise the volume to the top

Step 3 - again in the Audio tab, select Speech Synthesis Settings - I use David - and set your voice / tone there

Try that and let me know - there's a quick audio test below.
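A quick way to check both audio paths after adjusting the volumes (just a suggestion, using the same two commands from earlier in the thread):

# plays through the computer's speakers
Say("Testing the PC speaker")
# plays through the robot's EZ-B speaker
SayEZB("Testing the EZ-B speaker")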

PRO
USA
#21  

Here is a screenshot

User-inserted image

PRO
USA
#22  

Don't give up, my young friend.

#23   — Edited

Hi EZang60! After licking my wounds and tightening my willingness-to-listen-and-learn straps... I am proud to report that the program works! Hallelujah! Apparently I have one of those faces the Six robot doesn't like lol. Seriously, I tried the program on everyone in my class and it worked great! And then it got to me - it wouldn't give me an emotion, and then it happened: after 10 tries... it said I was feeling happiness! Thank you so much for your patience, perseverance and encouragement toward me. It really mattered and is greatly appreciated. Thanks to you too, ptp, for the questions - they made me revisit some things I added that I really didn't need. Oh, I did download the Microsoft cognitive emotion tho; couldn't get cognitive to work for me. I'm sure some errors on my part. Happy Thanksgiving to you all! See you in cyberspace or robotics land.

PRO
USA
#24   — Edited

Perseverance = continuing in spite of difficulty - it's the name of the game. Great going!

Perseverance = continuing in spite of difficulty - use it in all aspects of your life.

#25   — Edited

Thank you, my new mentor ;) Will do!