Hey, Justin or anyone.
I just got an idea. How hard would it be to allow your facial-recognition software to detect emotions? We already have software to demonstrate emotions and can make the robot express them. But can we interface emotion-detecting software into the EZB program? If a robot could sense posture, body language, and even a simple smile or frown, it would be much more readily accepted into a household. People are all about feelings. I realized that when I saw the Pepper Robot demonstration.
I would like to hear whether this would be a possibility for you in the future. I would love for my sunshine to make eye contact and KNOW whether I am happy or sad.
Please give me your thoughts on this.