
Amazon Echo Hacks?

Has anyone seen or heard of any hacks for the Echo yet? Currently, I have my "personal assistant," Antonn, using Alexa. Key voice commands from him operate her when she can perform a task better than he can. My goal is to eventually change the Alexa voice to a male one. I have seen some voice changes done on YouTube, but they are beyond my skill level.

Has anyone heard of a voice service that offers a different voice for Alexa?

I think Alexa will be a great addition to an EZ-B setup in the future. Any opinions?

Ron

#18  
The Echo Dot uses the same speaker for everything. I've been experimenting with live voice changers that change pitch and gender, but I run into the issue you're bringing up: if you want to play music, be prepared for some interesting results :/

...and all your robots will be named Echo, Alexa, or Amazon...
#19  
They just added "Computer" as a name.:)

Using a voice modifier was what I was going to try, so there goes that idea out the window. Unless there's a filter switch that could turn the modifier on or off when the voice engages.
#20  
In my best Scotty voice, "Computer, on..."
#22  
One of my favorite scenes from ST IV. :D
#23  
Has anyone done more with Alexa? I continue to hand commands off from my robot to Alexa for weather, music, radio, news, and setting timers or alarms, but that is it.
#25  
@CochranRobotics could not agree more on this!:)
#26  
I am hoping for more features to hack. The phone features may become interesting. A futuristic "video phone call" generated by a robot might be cool. I have to see where it goes first. (It could be cool to have a robot as an "answering machine," using vision to recognize who is calling or answering.)

I'm just stirring the pot. Maybe someone will get some other ideas. I hope others in the community will post how they are using these services with their robots.
#27  
@CochranRobotics I did not know that api.ai could call Alexa Skills.

I thought you had misstated that until I looked; under Integrations on the api.ai website, it sure enough does have that option built in! LOL :D

There are so many fantastic software options now, it's like an embarrassment of riches!
#28  
It also interfaces with Google Home and with Google Assistant. I haven't looked into that, but with Google now owning them, it would be logical.

Also, if memory serves me right, it has an IFTTT channel, but I could be wrong. I am trying to remember how we did some of the things we did with EZ-AI. Memory gets fuzzy when you aren't actively working on something anymore. I believe we were using IFTTT to control the OpenHAB instance, but I can't remember.

So, here is the chain... You can capture text from the MS Speech recognition, then send it to API.AI, which acts as a brain and determines what to do with what was said. From there it can return text to be spoken, call an Echo skill (like "say [whatever was returned from API.AI]"), and a lot more. The issue is that the robot would only have the Alexa voice. If API.AI decided it needed to do something like "Turn on my living room light," it would use IFTTT to pass the request to OpenHAB, which would then complete the request, and so on.
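For anyone who wants to see the shape of that chain, here is a minimal sketch assuming the api.ai v1 REST endpoint as it existed around the time of this thread (api.ai has since become Dialogflow, so the endpoint and auth may differ today); the client access token and the example query are placeholders, not anything taken from ARC or this thread:

```python
# Minimal sketch of the chain above, assuming the api.ai v1 REST API.
# CLIENT_ACCESS_TOKEN is a placeholder for your agent's token.
import uuid
import requests

API_AI_URL = "https://api.api.ai/v1/query?v=20150910"
CLIENT_ACCESS_TOKEN = "YOUR-API-AI-CLIENT-ACCESS-TOKEN"  # placeholder
SESSION_ID = str(uuid.uuid4())  # one session per conversation

def ask_api_ai(text):
    """Send recognized speech text to api.ai; return (speech, action)."""
    resp = requests.post(
        API_AI_URL,
        headers={"Authorization": "Bearer " + CLIENT_ACCESS_TOKEN},
        json={"query": text, "lang": "en", "sessionId": SESSION_ID},
    )
    result = resp.json().get("result", {})
    speech = result.get("fulfillment", {}).get("speech", "")
    action = result.get("action", "")
    return speech, action

# Example: text captured by speech recognition goes in, a reply and an
# action name come back. What the action is called depends on your agent.
speech, action = ask_api_ai("Turn on my living room light")
print("say:", speech, "| action:", action)
```

From there, the returned speech text can go to the robot's TTS and the action name can be handed off to IFTTT/OpenHAB, as described above.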

It is a pretty powerful addition to ARC for sure. I just need to get back to it, because things change quickly, as is evident from the new features in ARC. That rapid change is ultimately what caused me to bow out of EZ-AI; keeping up with everything happening in this arena would need to be a full-time job. I wish I were wealthy, for no other reason than to be able to focus on one thing instead of many.
#29  
One other thing... OpenHAB2 is out now. It is much easier to use than base OpenHAB.
#30  
Okay, one more thing... API.AI has conversational logic built in, so something like...

"Turn off the living room light and lock the doors" would work.
Then, "turn it back on" - would be recognized as "Turn the living room light back on" and would work.

The conversational logic is a really cool piece of this. Also, something like
Who wrote "To Kill a Mockingbird"?
and then
"What year was she born?"
and then
"What happened that year?"
and then
"What other books did she write?"

would all work.
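
To illustrate the mechanism behind those follow-ups (not David's exact setup): api.ai keeps contexts per session, so reusing the same sessionId from the sketch above is what lets "she" and "that year" resolve against the earlier question, provided the agent's intents and contexts are set up for it:

```python
# Continuing the earlier sketch: reusing the same SESSION_ID lets api.ai's
# contexts carry "she" / "that year" over from the previous question.
# Whether each follow-up actually resolves depends on the agent's setup.
for question in [
    'Who wrote "To Kill a Mockingbird"?',
    "What year was she born?",
    "What happened that year?",
    "What other books did she write?",
]:
    speech, _ = ask_api_ai(question)
    print(question, "->", speech)
```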
#31  
Thanks David. I didn't realize the power in API.AI. I have to do my homework and get it going.

Another thing to load into Antonn. It sounds like this will be the main AI driver I will need for him.
#32  
Yeah guys, I could not agree more on the subject... api.ai is the key to bringing natural language to your robot! It is a very powerful tool!

As @CochranRobotics pointed out, you send text to api.ai, where it gets processed and returned to ARC. This can be done easily by using the powerful plugin that @DJSures provided!
Along with the speech, an action variable is also returned to ARC... this variable can be used to build logic or trigger events within ARC!


If you would like to call IFTTT, you could use the new ARC IFTTT plugin:
a certain returned action variable would trigger a certain IFTTT call!
If you want the robot to perform a certain move, you could trigger an AutoPosition... different action variables could trigger different robot animations.
There are a lot of ways to get stuff done, and integration is very easy (see the sketch below for the general idea)!
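
As a rough illustration of that dispatch pattern (plain Python rather than ARC script, with made-up action names, event names, a placeholder key, and an AutoPosition stand-in):

```python
# Not ARC script -- just the general dispatch pattern: map the action string
# returned by api.ai to an IFTTT Webhooks call or a robot animation.
# The event names, key, and play_animation() are placeholders.
import requests

IFTTT_KEY = "YOUR-IFTTT-WEBHOOKS-KEY"  # placeholder

def trigger_ifttt(event):
    # IFTTT Webhooks (formerly Maker) trigger URL
    requests.post(f"https://maker.ifttt.com/trigger/{event}/with/key/{IFTTT_KEY}")

def play_animation(name):
    print("would run AutoPosition action:", name)  # stand-in for ARC's AutoPosition

ACTION_HANDLERS = {
    "smarthome.lights.on": lambda: trigger_ifttt("living_room_light_on"),
    "robot.wave":          lambda: play_animation("Wave"),
}

def handle_action(action):
    handler = ACTION_HANDLERS.get(action)
    if handler:
        handler()
```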

If you are using ARC, you can obviously also change your TTS voice in Windows. If you are using the Google Assistant, they already provide Google AIY, where you can hack into Actions on Google to give your robot a specific voice!
I don't know about Alexa Skills, but I guess they are well integrated too!

Let's put some personality into your beloved little toys! :D
#33  
Andy Roid,

In your 2017 video, how did you add the jaw movement on Antonn to speak the Alexa response?
#34  
I use a Scary Terry board and set up 2 inputs to allow it to move the jaw.
#35   — Edited
I use the Talk Servo in ARC; it works well.

But how do I connect it to Alexa?

On the back of my Echo there is a:

3.5 mm audio input or output
Configure in the Alexa app

Can you give me a screenshot?
#36  
Are you still there?
#37  
@EzAng, the audio jack on Alexa is an output. Just get a double-ended 3.5 mm cable, plug one end into Alexa and the other end into a Scary Terry board, and that is it, but you will still have the Alexa voice. I tried, as per the instructions, but was unable to get Alexa to use a male voice. I added a phone jack to my Google unit, which was not an easy job, but Google makes it easy to add a male voice. I find Google better anyway, since you don't have to keep saying "Hey Google"; if you keep talking, it stays on. I am still working on how to change the wake-up command to the robot's name instead of "Hey Google." Lots to do, so little time.