hi guys and girls,
My daughter somehow got me into this robot obsession. (Being a Chemical Engineer, I'm normally biased towards chemistry sets for kids, though a robot looked more house-trained and offered a challenge in mechanical and computer engineering.)
Of course I probably jumped in too early and purchased a Meccanoid, and I have already started to see its limitations. Big plans already to lobotomize the brain and upgrade to the EZ-Robot platform. Further down the track I'd like to try tacking on an Alan-AI head (that thing is cool).
But I digress. I have seen some demos of the Dexter GoPiGo robot where Google's Cloud Vision and Speech APIs were running. You could ask questions in many different languages and get answers to nearly anything (somewhat like Alexa). It also looks like Alan has this capability (he appears to be running EZ-Robot).
I'd like to hear from anyone who has this working on EZ-Robot (particularly Google Cloud Speech). I'm a bit reluctant to launch into learning Python, Raspberry Pi and whatever else is needed at the moment; time available and the necessary learning curve are the constraints, and my wife would divorce me if I dedicated the time required. I've already been warned about spending more time talking to an AI robot than to her.
It has been an interesting journey so far, and this site is fantastic. It's got me thinking about the complexities of the human body and how nature, through evolution, has refined and perfected our nervous system and the programmed neurons that fire to make it all work. Just seeing how arm and hand movements work is a very deep and complex subject, without even taking into account all the other biological feedback happening at the same time. The journey to imitate this is a long way off from what I've seen, though that is what makes it exciting, and many new things are yet to come. So I look forward to hopefully sharing the journey with others on this website.