
Hi all
Is there a way to fire a voice command... like say "Hi", then the script will say "What is your name?"... then you say your name, and the voice recognition holds that variable (e.g. your name) and the EZ-B says the variable back?
Related Hardware EZ-B v4
Thanks all for your help.
I found what I was looking for based on this thread and another thread. I'm not looking for a full-blown convo, as my robot (as Dave knows) has limited phrases, so I wanted to run a call word (like "Hey Siri"), in this case "Hey B9"... then listen to my voice input, store that voice input, and, based on what I said, run a script or sound. I got it working; this is what I pieced together. Obviously I'm just testing with two simple words, but I will expand greatly on this. This is using Bing:
saywait("How can I help you")
sleep(1000)

controlCommand("Bing Speech Recognition", "StartListening")
controlCommand("Bing Speech Recognition", "UnpauseListening")
sleep(3000)
controlCommand("Bing Speech Recognition", "PauseListening")
controlCommand("Bing Speech Recognition", "StopListening")
#sleep(2000)

print($BingSpeech)
sleep(1000)
if ($BingSpeech = "Tom")
  ControlCommand("Convo", "Track_4")
  print("track 4")
Elseif ($BingSpeech = "Terry")
  ControlCommand("Convo", "Track_3")
  print("track 3")
endif
Nice work! You got it!
Because you know what to expect, it might be easier to use the WaitForSpeech command.
All you do is pass the parameters that you want to wait for, and you'll get the response.
Here's an example based on your code. I added an ELSE so that the robot would say it didn't understand if the timeout occurred.
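A minimal sketch of that pattern, reusing the Convo tracks from the code above ($heard is just a placeholder variable name and the timeout value is arbitrary):

saywait("How can I help you")

# Wait up to 10 seconds for one of the expected names
$heard = WaitForSpeech(10, "Tom", "Terry")

if ($heard = "Tom")
  ControlCommand("Convo", "Track_4")
Elseif ($heard = "Terry")
  ControlCommand("Convo", "Track_3")
Else
  # Timeout or unrecognized answer
  saywait("Sorry, I did not understand")
endif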
Outstanding Terry! Nicely done. I know how exciting this is.
I love having a wake word. I found early on that if I let Bing just run and listen all the time for the phrases I wrote in the control, I would get a lot of false positives. I also had trouble with it understanding me if there was background chatter from other people. Another thing I didn't like was letting an SR run, listen to everything being said in the room, and send it off to God knows where or to whom.
I solved this by having two SRs installed: the Speech Recognition skill and the Bing skill.
* Like you, I start the project with Bing paused from listening, but keep the Speech Recognition skill listening for only one word: "Robot".
* Once the SR skill hears "Robot", it will run a script I wrote and unpause Bing (see the sketch after this list). It will also start the belly lights in a scanner sequence for about 5 seconds. After 5 seconds that script will pause Bing from listening again.
* During those 5 seconds Bing will be listening for a phrase and will run a script depending on whether it understands or not.
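A minimal sketch of that handoff, assuming the script is attached to the "Robot" phrase in the Speech Recognition skill; "Belly Scanner" is a hypothetical name standing in for the actual light-sequence script:

# Unpause Bing, start the (hypothetical) belly-light script, allow a
# 5 second listening window, then pause Bing again
controlCommand("Bing Speech Recognition", "UnpauseListening")
ControlCommand("Scripts", "ScriptStart", "Belly Scanner")
sleep(5000)
controlCommand("Bing Speech Recognition", "PauseListening")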
It works really well and I love it, even in noisy rooms. However, I'm always trying and looking for better ways. I'm actively working on my B9 now and will be looking at your and others' methods for using a wake word.
When you have time it would be great to see a vid of your method in action. I saw your Stock Price video and that was really amazing.
Here's a vid I took about 5 months ago of my wake word method in action.
Thanks Dave... I am using VAD (in the Bing voice), but yes, he was picking up too much noise... I tried throwing in a PauseListening, but then realized I had to turn off my other EZ Voice module (lol).
But here is a snippet of what I am doing using Bing and AimlBot (if we had deepfake audio, we could take the else statement and output the AimlBot words in B9's voice). I cropped out all the other ElseIfs to make the code easier to see. So far he is working well about 90% of the time. Does your vid show how to code the wake word? That was my next section.
controlCommand("AimlBot", "SetPhrase", $BingSpeech)
controlCommand("B9 Voice", "Stop")
controlCommand("Bing Speech Recognition", "PauseListening") sleep(2000) ControlCommand("Scripts", "ScriptStart", "RadarMoveL")
if ($BingSpeech = "How are you")
  # I am swell
  ControlCommand("BingChat", "Track_0")
Elseif ($BingSpeech = "Tell me a joke")
  ControlCommand("BingChat", "Track_13")
  sleep(15000)
Else
  sleep(6000)
  ControlCommand("BingChat", "Track_10")
  sleep(3000)
  # SayEZB($BotResponse)
  # sleep(5000)
endif

controlCommand("Bing Speech Recognition", "UnpauseListening")
controlCommand("Bing Speech Recognition", "StartListening")
Dave, thanks for the kind words on the Stock Quote. I looked at your date and weather scripts, modified those, then tweaked them into a stock API.
Dave, I would also like to understand how you ran the motors (UART, I think) into the EZB. Mine are just on a random sequence on an Arduino, but I love how you did them (e.g. the ears).
Hi Terry, I started using H-Bridges to control motors back and forth. However I had limited position and speed control and no feedback of where the motors were. I was able to set speeds with the PWM skill. I soon realized I needed better precision and control over most of my motors. I started using Sabertooth Motor controllers with a Kangaroo daughterboard attached. This gave me precise position and speed control. The Kangaroo can use either a pot or an encoder for feedback. I found that for my needs an encoder was the way to go.
I started switching my motors over to ones that already had encoders attached to the motor shaft. ServoCity has a great selection. I controlled the Sabertooth/Kangaroo by sending Simple Serial commands through the UART ports of an EZB from ARC.
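A minimal sketch of that idea, assuming UART port 0 on EZ-B index 0; the baud rate, channel number, and command strings below are placeholders that depend on the Sabertooth/Kangaroo configuration and the Kangaroo Simple Serial documentation:

UARTInit(0, 0, 9600)                   # open UART port 0 on EZ-B 0 (placeholder baud rate)

UARTWrite(0, 0, "1,start", 13)         # enable Kangaroo channel 1; 13 is a carriage-return terminator
UARTWrite(0, 0, "1,p2000 s500", 13)    # placeholder: move channel 1 to position 2000 at speed 500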
I do have a couple of small DC motors (like the ear sensors) that I run directly off the EZB's digital pins. I simply turn the signal pin on and off to run the motor. A PWM skill in ARC lets me adjust the speed of these little motors.
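For those little direct-drive motors, a sketch of the on/off plus PWM idea (D0 is a placeholder pin):

Set(D0, ON)     # motor on at full speed
sleep(2000)
Set(D0, OFF)    # motor off

PWM(D0, 60)     # or run at a reduced duty cycle (0-100) for a slower speed
sleep(2000)
PWM(D0, 0)      # stop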
I hope this helps.