
I Am Unable to Hear the Sound from the EZ-B After Getting a Response from Microsoft Cognitive Services

I am unable to hear the voice output from the Microsoft Cognitive Vision service. I am getting a response, but the voice does not come out of the EZ-B. What is the problem here?




Follow the Robot Program episode for the plugin. You have not added any code to have it speak. You must follow the instructions.


I followed the Robot Program to configure it, but I am facing the same issue.


Please follow the Robot Program instructions. In the Robot Program, code is added to the plugin. That code is required to have the robot speak. If the instructions are not completed, the robot will not speak. The point of the Robot Program instructions is to teach you how to program, so please follow them to begin learning.

Perhaps starting with an earlier Robot Program episode would be the best way to begin learning how to use the product. The emotion plugin may be slightly too advanced for your current skill set. But do not fear, because the Robot Program episodes will build your skills to that level if you start at the beginning.


My JD is unable to speak back with Cognitive Emotion. It is not giving the response of the Cognitive Emotion API. Can anyone help me resolve this issue?


@karikatla Have you filled in the script in the Cognitive Emotion control settings (as seen in the picture below)? You need to tell JD to speak (using a say command) when he uses controls like Cognitive Emotion or Cognitive Vision. As DJ suggested, please follow the tutorials.

User-inserted image
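For reference, a minimal sketch of the kind of script that goes into that control's settings, assuming (as in the tutorial) that the Cognitive Emotion control fills the $EmotionDescription variable after it analyzes a face:

SayEZBWait("I think you are looking " + $EmotionDescription)

SayEZBWait() plays the synthesized speech through the EZ-B's speaker and waits for it to finish before the script continues.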


Yes, I filled it in. It is saying "I think you are looking". I am using this script:

SayEZBWait(("I think you are looking" + $EmotionDescription))

JD is unable to get the response output. Is there an issue with the service, or something else?



Your code is not correct... you have...

SayEZBWait(("I think you are looking" + $EmotionDescription))

It should be the following, with single parentheses:

SayEZBWait("I think you are looking" + $EmotionDescription)
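As a further sketch (assuming the control sets $EmotionDescription, and may leave it empty when no face or emotion is detected), you can also guard the say command so JD stays silent instead of speaking a dangling sentence:

# Only speak when the Cognitive Emotion control returned a description
if ($EmotionDescription != "")
  SayEZBWait("I think you are looking " + $EmotionDescription)
endif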


I just followed the tutorial; the code there is exactly like that. I changed the code to what you suggested, but in that case the response still does not come out of the EZ-B.


Does the speaker on your EZ-B v4 actually work? Does it make the start-up chime sound? Can you play other basic "say" commands through it?


The sound is coming out of the EZ-B, but the Cognitive Emotion response is not.


Are you in Client mode or AP mode with your EZ-B v4? You need to be in Client mode (so you are connected to the internet and the EZ-B v4 at the same time) for this to work. It can also work in AP mode if you have an additional WiFi USB dongle, where one adapter is connected to the internet and the other is connected to the EZ-B v4.


I am in Client mode. Cognitive Vision is working fine, but Cognitive Emotion is not working for me.