
Please, Can We Have Detection Of Smile Or Frown

Hey, Justin or anyone.

I just got an idea. How hard would it be to allow your facial software to detect emotions? We have software to demonstrate emotions and can make the robot act like it has emotions. But can we interface emotion-detecting software into the EZB program? If a robot could sense posture, body language, and even a simple smile or frown, it could be much more accepted into a household. People are about feelings. I realized that when I saw the Pepper robot demonstration. I would like to hear if this would be a possibility for you in the future. I would like for my sunshine to make eye contact and KNOW if I am happy or sad.

Please give me your thoughts on these matters.




I think NAO does this with pictures of your face.


Yes, NAO and Pepper will do it. I am looking to do it with an EZB robot. I wish to purchase a Pepper when it comes out in the States, but I have to literally sell everything I own in order to get it. We will see what happens.


He reminds me of my puppy. If he had a robot "accident" on the floor I'd have to frown. eek


Moviemaker, using the NAO/Pepper program costs 250 euros a year. If you have any broken parts, maintenance costs some 700 euros, excluding shipping.

This video shows how ARC recognizes pictures.



Reading emotions from facial expressions is tricky. Unless you exaggerate the face you're making, even other humans have problems understanding what you feel. I can see smile recognition being a fun feature, but the others would seem impractical.

The "picture matching" idea isn't bad, but it would need to be trained for every person.


Nomad, thanks for the idea to RENT or LEASE.


The camera would need to be high resolution to detect changes as subtle as smiling or frowning, and the lighting would also need to be very good to detect these changes.

Even then, it doesn't work too well on my camera or on my phone, both of which have smile detection and blink detection.


The woman in the first video has something on her head; maybe that's what you are looking for.


Nomad, when you say 250 euros, do you mean just the software or the whole package? Very important.




That is just to use the apps.


I was hoping you were talking about Pepper and the apps for that amount; I could raise that. Choregraphe looks like Microsoft Robotics Studio, but I cannot read the crg files with MS Robotics Studio. It is going to be a real challenge to raise the money to buy a Pepper. It would help if the price in the USA were the same as in Japan when they release it, but I know that it will be much higher, possibly cost prohibitive.


@Mel... You might have to sell your house for this one... Tell you what... Give me the money you would have otherwise spent buying the Pepper robot and I will design, program and build one for you using the EZB4.... As a bonus, I won't even charge you annual fees for use of the software....:P


@nomad, did you say you had a QBO as well as a NAO?


Sorry for the late response; I seem to be getting no mails from here.

What's a QBO?


This is very doable by using trained custom objects. I just trained 3 objects of my mouth for smile, frown, and tongue out, and each is detected very well. Give it a try, it's rather fun.


Justin, is that using your brain program? I have to wait until I get my new camera in. It is HD and autofocus.

I am happy this is already done. You have made me very happy.


I have another question for JUSTIN.

OK, so you train the image and the robot knows YOU are happy. How can we make it work on everybody? Couldn't we map a certain set of vectors? Like dots on the brows, dots on the lips? It really seems like it could be done. Everyone has brows, everyone has lips. Could we make this work with a sort of frame? For instance, ANGRY could be \ / for the brows.
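The dots-and-vectors idea above can be sketched with plain geometry. This is only an illustration under made-up assumptions: the landmark names, coordinates, and thresholds below are all hypothetical, and a real system would get those points from a facial-landmark detector rather than hard-coded values.

```python
# Hypothetical sketch of the "dots on the lips/brows" idea: given a
# few 2D landmark points (x, y) in image coordinates (y grows
# downward), classify the expression from simple geometry.

def classify_mouth(left_corner, right_corner, center):
    """Smile if both mouth corners sit above the mouth center,
    frown if both sit below it, otherwise neutral."""
    avg_corner_y = (left_corner[1] + right_corner[1]) / 2.0
    if avg_corner_y < center[1]:   # corners higher than center
        return "smile"
    if avg_corner_y > center[1]:   # corners lower than center
        return "frown"
    return "neutral"

def brows_look_angry(left_brow, right_brow):
    """Rough '\\ /' test: the inner end of each brow sits lower
    than the outer end. Each brow is (outer_point, inner_point)."""
    (lo, li), (ro, ri) = left_brow, right_brow
    return li[1] > lo[1] and ri[1] > ro[1]

# Example with made-up coordinates:
print(classify_mouth((30, 48), (70, 48), (50, 52)))  # smile
print(classify_mouth((30, 56), (70, 56), (50, 52)))  # frown
print(brows_look_angry(((20, 30), (40, 34)), ((80, 30), (60, 34))))  # True
```

Because only the relative positions of the dots matter, a frame like this would not need per-person training the way picture matching does, though it would still need a reliable way to find the dots on any face.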


@Mel, the object training I did was in ARC. It was very easy and straightforward, and works well for one person (maybe a couple more).

For software to recognize whether multiple people are happy or sad, you'd need to invent something new. OpenCV has a custom haar cascade for the face like you mentioned; it's called haarcascade_mcs_mouth.xml. I have found a couple of software companies that have posted demos of using this haar to create a custom application. I don't even want to ask what they would charge for their code or to create an application.
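As a starting point cheaper than a commercial application, OpenCV itself ships a smile cascade (haarcascade_smile.xml) alongside its face cascades. Here's a minimal sketch, assuming the opencv-python package; the detectMultiScale parameter values are tuning guesses, not a tested recipe, and the import is guarded so the pure decision step runs even without OpenCV installed.

```python
# Guarded import: the decision logic below works without OpenCV;
# only the actual detection needs cv2 (opencv-python) installed.
try:
    import cv2
except ImportError:
    cv2 = None

def label_expression(smile_count):
    """Pure decision step: any smile hit inside a face -> 'smiling'."""
    return "smiling" if smile_count > 0 else "neutral"

def detect_smiles(gray_frame, face_cascade, smile_cascade):
    """Return one label per detected face in a grayscale frame."""
    labels = []
    faces = face_cascade.detectMultiScale(gray_frame, 1.3, 5)
    for (x, y, w, h) in faces:
        # search for smiles only in the lower half of the face box
        roi = gray_frame[y + h // 2 : y + h, x : x + w]
        smiles = smile_cascade.detectMultiScale(roi, 1.7, 20)
        labels.append(label_expression(len(smiles)))
    return labels

if __name__ == "__main__" and cv2 is not None:
    data = cv2.data.haarcascades  # directory of bundled cascade files
    face = cv2.CascadeClassifier(data + "haarcascade_frontalface_default.xml")
    smile = cv2.CascadeClassifier(data + "haarcascade_smile.xml")
    cam = cv2.VideoCapture(0)  # default webcam
    ok, frame = cam.read()
    if ok:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        print(detect_smiles(gray, face, smile))
    cam.release()
```

Since the cascades were trained on many faces, this works across people without per-person training, which is exactly the limitation of the picture-matching approach discussed above; the trade-off is that it only distinguishes smile from not-smile, not a full range of emotions.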


What do you think of this?


Justin R, thank you for responding. Can I learn more about this? Sounds great.

Patrick, Thanks for sharing examples of emotion. Steve S


@Steve... as Justin said it's easy, you can use object training in ARC. You just need a camera, ARC and your face...;)


@nomad, just do a search on ebay for personal family robots and it should be the first one up. I have two robots for sale. It will be green.


I had a look; ah, that's a QBO. I don't have money to buy a robot.

I just got mail from an Aldebaran member; he bought the 3 latest new NAOs. 2 are already broken.


that sounds expensive. What they need is an EZB inside!

A lot of people think they are made for children. But they are made for adults. Children will tear them up if they are not careful. Well, they ARE made for autistic children.


I guess they made the arms of the NAO longer, which means more load for the motors. I hear they all break at the shoulders.