
leonardo46

I added a mouth to my robot, and I'm trying to make it move while the robot is speaking.
I tried the "Talk Servo" control together with Speech Synthesis. It works, but I found two problems:
1) At the end of the sentence the mouth always remains open. The servo positions are:
min = 110 (mouth closed), max = 140 (mouth open).
I tried checking "invert" in the Talk Servo configuration window, but then the mouth doesn't move at all. How should "invert" be used? Can I somehow force the mouth closed at the end of the sentence?
2) For long sentences the mouth stops moving well before the sentence ends.
The sync between speech and servo movement is not good. Could it be made better?
Would Sound Servo with a Soundboard v4 give better results?
110 = mouth open, 140 = mouth closed
Nobody has replied yet. In the meantime I found the solution to problem 1) myself, so I don't need any more help with that one. The solution is:
To get the mouth closed at the end of the sentence, I had to check the "invert" box in the Talk Servo window. I discovered that this option only works if min is < 90 and max is > 90, so I had to refit the servo bracket accordingly.
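If refitting the bracket isn't an option, another possible workaround is to speak from a script and drive the servo back to the closed position yourself afterwards. This is only a sketch: the port D0, the variable name and the closed position are assumptions, so adjust them to your own setup.
Code:
# Sketch only: speak a sentence, then force the mouth closed.
# Set $mouthClosed to whichever position is "closed" on your servo.
$mouthClosed = 110
# SayWait blocks until the speech has finished,
# so the next line runs only after the sentence ends.
SayWait("Hello, I am your robot.")
Servo(D0, $mouthClosed)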
I still need help with the sync between speech and mouth movements (problem 2).
With short sentences (1-2 words) it's OK, but with longer sentences (10 words) the mouth keeps moving for seconds after the speech has ended, or vice versa, the mouth stops moving before the speech ends. It differs from sentence to sentence in an apparently unpredictable way.
Any solution?
I'll be more patient. Usually community members reply very quickly; this time I was quicker than them!
What about problem 2)?
Why is there such a large and (to me) unpredictable difference between the duration of the speech and the duration of the servo movement?
For some sentences the servo stops well before the speech ends; for others it keeps moving long after the speech has ended.
How can I control this?
Thx
The problem is with Speech Synthesis and Talk Servo, which moves the robot's mouth to follow the speech.
There are three sentences. For the first sentence the mouth is fairly well synchronised with the speech. For the other, shorter sentences the mouth keeps moving even after the speech has ended. This could probably be adjusted by setting the delays in the Talk Servo configuration window, but that setting is the same for all sentences.
How can I achieve good sync for all sentences?
roby.EZB
I used the 3 robot laws from iRobot to set up my mouth movement. I had to play with each setting to get it to stop every time my robot said the 3 laws; after that, most any short or long phrases seem to work. It might be because your servo is inverted, or... not sure?
Here are my 3 laws... put this in a script to run.
Code:
You have to play with each of the red-circled settings to get it right. Remember there is a question mark "?" next to each setting in the Talk Servo config that tells you what it does.
Could you take the speech you have in your Speech Synthesis controls and put it in your Speech Recognition control instead? That way you can use the SayWait command. Using SayWait allows the speech to finish before a different command thread starts.
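As an illustration only (this is not the script posted above, and the spoken text is just Asimov's three laws, so the exact wording doesn't matter), a SayWait-based script could look something like this:
Code:
# Each SayWait blocks until its sentence has finished speaking,
# so the lines run one after another, in step with the speech
# that drives the Talk Servo.
SayWait("One. A robot may not injure a human being, or, through inaction, allow a human being to come to harm.")
SayWait("Two. A robot must obey the orders given to it by human beings, except where such orders would conflict with the first law.")
SayWait("Three. A robot must protect its own existence, as long as such protection does not conflict with the first or second law.")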
It would help to see a picture of the mouth to get a better idea of how it's driven.
Hope this helps; maybe others will make suggestions too.
Cheers,
His mouth uses a micro servo that moves the lower lip.
I had to use the invert option in the Talk Servo configuration because of how the servo is mounted and its direction of rotation; otherwise the mouth would always have been open except while speaking. This should have no effect on the timing of the servo movement.
I put the three laws and your settings into the project. The speech lasts 25 seconds, but the mouth stops moving after only 20 seconds, while the speech goes on for another 5 seconds. The timing depends on those settings. With the three laws and my own settings, the mouth kept moving for 10 seconds after the speech had finished!
The problem I'm trying to solve is this:
Each sentence needs accurate settings to look realistic, but those settings may not be good for a different sentence. A compromise is required.
Good luck and let us know if you get it working.
You can reverse the direction of your servo by opening it up, reversing the motor wires, and reversing the two outside wires on the pot. Just another thought.
https://m.youtube.com/watch?v=T_BvPGEn-to
In this video I am using my init script and my Talk Servo settings.
My mouth shuts before the end, but I know that with a little more tweaking I could probably get it right on.
https://youtu.be/eO3egGgr1kk
Here is an example of setting your servo min & max and then your servo speed.
Code:
Give it a try
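As a rough sketch of that idea in EZ-Script (not the exact code from the post above; the port D0 and the 110/140 positions are assumptions carried over from earlier in the thread):
Code:
# Sketch only: move the servo to its two end positions first,
# then set the speed.
Servo(D0, 110)
Sleep(500)
Servo(D0, 140)
Sleep(500)
Servo(D0, 110)
# The "2" is just an example value; check the "?" help for
# ServoSpeed and pick what looks natural for your mouth.
ServoSpeed(D0, 2)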
The only way to get good sync seems to be adjusting the settings in the Talk Servo configuration page.
But a configuration that's good for a long sentence is not good for a short one, and vice versa. I think I'll have to avoid long sentences and use only sentences of similar length.
I wonder why it's so difficult to get good sync between speech and mouth movement.
Is it possible to improve this, or does the PC itself, being a non-real-time system, make the job difficult or even impossible?
Some hardware solution (connected to the EZ-B speaker) might work better.
Thanks to merne, who suggested some tests that unfortunately didn't work.
You are welcome.
Cheers
The EZ-B is great. With it you can bring your robots to life with a few clicks, without complex hardware or software.
I'm asking DJ whether any improvement to the EZ-B software is planned that might make Talk Servo work with any sentence, without having to fine-tune everything each time you add a new sentence.