Asked January 15 2017

Community Question

Has anyone seen or heard of any hacks for the Echo yet? Currently, I have my "personal assistant, Antonn" using Alexa. He issues key voice commands to operate her when she can perform a task better than he can. My goal is to eventually change Alexa's voice to a male one. I have seen some voice changes done on YouTube, but they are beyond my skill level.

Has anyone heard of a voice service offering Alexa's voice?

I think Alexa will integrate well with an EZ-B in the future. Any opinions?

May 26 2017
API.AI also interfaces with Google Home and with Google Assistant. I haven't looked into that, but with Google now owning API.AI, it would be logical.

Also, if memory serves me right, it has an IFTTT channel, but I could be wrong. I am trying to remember how we did some of the things we did with EZ-AI. Memory gets fuzzy when you aren't actively working on something anymore. I believe we were using IFTTT to control the OpenHAB instance, but I can't remember.

So, here is the chain: you capture text from the MS Speech recognition tools and send it to API.AI, which acts as a brain and determines what to do with what was said. From there it can return text to be spoken, call an Echo skill (like "say [whatever was returned from API.AI]"), and a lot more. The issue is that a robot would only have the Alexa voice. If API.AI decided it needed to do something like "Turn on my living room light", it would use IFTTT to pass the request to OpenHAB, which would complete the request, and so on.
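For illustration, here is a minimal Python sketch of that middle hop, assuming the legacy API.AI `/v1/query` REST endpoint; the client access token is a placeholder, not a real key:

```python
import json
import urllib.request

API_AI_URL = "https://api.api.ai/v1/query"  # legacy API.AI v1 endpoint
CLIENT_TOKEN = "YOUR_CLIENT_ACCESS_TOKEN"   # placeholder, use your own token


def build_query(text, session_id, lang="en"):
    """Build the JSON body API.AI expects for a text query."""
    return {"query": text, "lang": lang, "sessionId": session_id}


def ask_api_ai(text, session_id):
    """Send recognized speech text to API.AI; return (speech, action)."""
    body = json.dumps(build_query(text, session_id)).encode("utf-8")
    req = urllib.request.Request(
        API_AI_URL,
        data=body,
        headers={
            "Authorization": "Bearer " + CLIENT_TOKEN,
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        result = json.loads(resp.read())["result"]
    # "speech" is the text for the robot to say out loud;
    # "action" is the intent name you branch on (light, lock, etc.)
    return result["fulfillment"]["speech"], result["action"]
```

The returned action string is what downstream logic (EZ-Builder, IFTTT, OpenHAB) would branch on.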

It is a pretty powerful addition to EZ-Builder for sure. I just need to get back to it, since things change quickly, as is evident from the new features in EZ-Builder. That rapid change is ultimately what caused me to bow out of EZ-AI: keeping up with everything happening in this arena would have been a full-time job. I wish I were wealthy, for no other reason than to be able to focus on one thing instead of many.
May 26 2017
One other thing... OpenHAB2 is out now. It is much easier to use than base OpenHAB.
May 26 2017
Okay, one more thing... API.AI has conversational logic built in, so something like...

"Turn off the living room light and lock the doors" would work.
Then "turn it back on" would be recognized as "Turn the living room light back on" and would also work.

The conversational logic is a really cool piece of this. Also, something like
Who wrote "To Kill a Mockingbird?"
and then
"What year was she born?"
and then
"What happened that year?"
and then
"What other books did she write?"

would all work.
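Those follow-up questions work because API.AI tracks context per session. A rough Python sketch of the idea (the `Conversation` class is my own illustration; only the `/v1/query` body shape comes from the API):

```python
import uuid


class Conversation:
    """Keeps one API.AI sessionId so follow-up questions share context.

    As long as every query reuses the same sessionId, API.AI can
    resolve references like "she" or "that year" against earlier turns.
    """

    def __init__(self):
        self.session_id = str(uuid.uuid4())

    def payload(self, text, lang="en"):
        # Same shape as the /v1/query body; sessionId ties turns together.
        return {"query": text, "lang": lang, "sessionId": self.session_id}


convo = Conversation()
turns = [convo.payload(q) for q in (
    'Who wrote "To Kill a Mockingbird?"',
    "What year was she born?",
)]
# Both turns carry the same sessionId, so the second can refer back.
assert turns[0]["sessionId"] == turns[1]["sessionId"]
```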
May 26 2017
Thanks David. I didn't realize the power of API.AI. I have to do my homework and get it going.

Another thing to load into Antonn. It sounds like this will be the main AI driver I will need for him.
May 27 2017
Yeah guys, I could not agree more: API.AI is the key to bringing natural language to your robot! It is a very powerful tool!

As @CochranRobotics pointed out, you send text to API.AI, where it gets processed and returned to EZ-Builder. This can be done easily by using the powerful plugin that @DJSures provided!
Along with the speech, an action variable is also returned to EZ-Builder... this variable can be used to build logic or trigger events within EZ-Builder!
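As a sketch of how you might branch on that action variable (the action names and handlers below are made up for illustration; in practice they would match the intents you define in API.AI):

```python
def handle_lights_off(params):
    # Placeholder: here you would run your EZ-Builder script
    # or IFTTT/OpenHAB call for the light.
    return "lights_off"


def handle_wave(params):
    # Placeholder: here you would trigger an AutoPosition animation.
    return "wave_animation"


# Hypothetical action names -> robot-side behavior
ACTION_HANDLERS = {
    "smarthome.lights.off": handle_lights_off,
    "robot.wave": handle_wave,
}


def dispatch(action, params=None):
    """Route the action variable returned by API.AI to robot logic."""
    handler = ACTION_HANDLERS.get(action)
    if handler is None:
        return None  # unknown action: just speak the returned text
    return handler(params or {})
```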

If you would like to call IFTTT, you could use the new EZ-Builder IFTTT plugin.
A certain returned action variable would trigger a certain IFTTT call!
If you want the robot to perform a certain move, you could trigger an AutoPosition... different action variables could trigger different robot animations.
There are a lot of ways to get stuff done, and integration is very easy!
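If you ever wanted to fire IFTTT yourself rather than through the plugin, IFTTT's Webhooks (Maker) channel exposes a simple trigger URL. A minimal Python sketch, with a placeholder key and a made-up event name:

```python
import json
import urllib.request

IFTTT_KEY = "YOUR_WEBHOOKS_KEY"  # placeholder, use your own key


def ifttt_url(event, key):
    """Build the IFTTT Webhooks (Maker) trigger URL for an event."""
    return "https://maker.ifttt.com/trigger/%s/with/key/%s" % (event, key)


def trigger_ifttt(event, value1=None):
    """Fire an IFTTT event; an applet can forward it on to OpenHAB."""
    body = None
    if value1 is not None:
        # Webhooks accepts an optional JSON body with value1..value3
        body = json.dumps({"value1": value1}).encode("utf-8")
    req = urllib.request.Request(
        ifttt_url(event, IFTTT_KEY),
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")
```

For example, `trigger_ifttt("living_room_light_off")` would fire whatever applet you attached to that event name.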

If you are using EZ-Builder, you can obviously also change your TTS voice in Windows. If you are using the Google Assistant, Google already provides the AIY kit, where you can hack into Actions on Google to give your robot a specific voice!
I don't know about Alexa skills, but I guess that is well integrated too!

Let's put some personality into your beloved little toys! :D