Unknown Country
Asked

Language Setting for TTS

I want to program JD Humanoid via EZ-Script so he will speak the words I want him to say, in Dutch. How do I do that?



Related Hardware (view all EZB hardware)
JD Humanoid by EZ-Robot
JD humanoid robot kit - WiFi-enabled, 16 DOF with metal-gear servos; easy, fun, educational, available from the EZ-Robot online store.
Wi-Fi / USB
Servos: 24
Camera
Audio
UART: 3
I2C
ADC: 8
Digital: 24

Related Robot Skills (view all robot skills)
Accurate Bing cloud speech-to-text for ARC: wake-word, programmable control, $BingSpeech output, Windows language support, headset compatible
Azure TTS for ARC robots: generate natural, customizable neural voices for companion, educational, assistive, and entertainment applications.


PRO
Canada
#1  

To program your JD Humanoid to speak in Dutch using the Azure Text To Speech robot skill, you'll need to set the language to Dutch and choose a Dutch voice provided by the Azure Text to Speech service. Here's how you can do it:

  1. Add the Azure Text To Speech Robot Skill:

    • If you haven't already, add the Azure Text To Speech robot skill to your ARC project.
  2. Configure the Azure Text To Speech Robot Skill:

    • Open the configuration menu of the Azure Text To Speech robot skill.
    • In the Neural Voice field, enter the name of a Dutch voice provided by Azure. You can find the list of available voices on the Microsoft Azure website. For Dutch, you might find voices like "nl-NL-FennaNeural" or "nl-NL-CoenNeural".
    • Save the configuration.
  3. Write Your EZ-Script:

    • Open the EZ-Script editor in ARC.
    • Use the ControlCommand() function to interact with the Azure Text To Speech robot skill. You'll need to use the Speak command followed by the text you want the robot to say.

Here's an example EZ-Script that makes JD Humanoid speak a phrase in Dutch:

# Set the language to Dutch and select a Dutch voice
ControlCommand("Azure Text To Speech", "SetVoice", "nl-NL-FennaNeural")

# Make JD Humanoid say "Hallo, hoe gaat het?" which means "Hello, how are you?" in Dutch
ControlCommand("Azure Text To Speech", "Speak", "Hallo, hoe gaat het?")

Replace "nl-NL-FennaNeural" with the Dutch voice you prefer from the Azure Text To Speech service. The SetVoice command is used to change the voice, and the Speak command is used to make the robot speak the text.

  4. Run Your Script:
    • Execute the script in ARC to hear JD Humanoid speak in Dutch.

Remember to ensure that your Azure subscription has access to the Dutch language neural voices and that you have entered the correct voice name in the script. If you encounter any issues, double-check the voice names on the Azure website and ensure that your Azure Text To Speech API key is correctly configured in ARC.


I am a customer support GPT-powered AI and still learning. If you have doubts about my response, please use the support documentation to verify my answer.
PRO
Iraq
#2  

Okay, I'll use this method for the Arabic language and move the robot's mouth using a single servo motor. Thank you.

PRO
Canada
#3  

Great - that will work. Below are step-by-step instructions to make JD speak Arabic with the Azure Text To Speech skill and move the mouth with a single servo via the Talk servo v2 skill. I recommend using an ARC Script (JavaScript) rather than EZ-Script - I can’t create EZ-Script for you, but I’ll provide ready-to-paste JavaScript and explain the skill configuration and tuning you’ll need.

What you’ll do (high level)

  • Configure Azure Text To Speech to use an Arabic neural voice.
  • Configure Talk servo v2 with the servo port (e.g., d0) and min/max positions.
  • Use a Script (JavaScript) to set the voice and trigger both the Azure TTS speak and the Talk servo animation (Talk servo v2 accepts a Speak ControlCommand to animate mouth movement from a text string).


Step 1 - pick an Arabic voice in Azure

  1. Open the Azure Text To Speech robot skill config (press the kebab/hamburger or Config on the skill window).
  2. Click View List and find Arabic voices from Microsoft. The Synthiam manual references the Microsoft voice list: https://learn.microsoft.com/en-us/azure/ai-services/speech-service/language-support?tabs=tts
  3. Copy the "voice id" (the grey text in the Microsoft chart) for the Arabic voice you want, e.g. a voice id might look like:
    • ar-EG-SalmaNeural (Egyptian Arabic example)
    • ar-SA-ZariyahNeural (Saudi Arabic example) (Use the Microsoft list to pick the exact voice id you prefer.)
  4. Paste that id into the Neural Voice field and Save.

Step 2 - configure Talk servo v2

  1. Open Talk servo v2 config (press the kebab / Config on the Talk servo v2 skill).
  2. Set the servo port to the EZB port you’re using (for example d0). Remember servo port constants are not in quotes (d0).
  3. Set Minimum and Maximum positions for the mouth servo (experiment: try min ~ 60 and max ~ 120 as a starting point; tune for your servo/jaw geometry).
  4. Leave Pause unchecked (or use ControlCommand to unpause).
  5. Tune Start Delay, Constant Delay, Vowel Delay and Period Delay to sync the animation to the speech. Defaults (Start Delay ~460ms, Constant ~90ms, Vowel ~40ms) are a good starting point; you’ll likely reduce Start Delay for shorter phrases.

Important: you must be connected to the EZB for servo motion to work.

Step 3 - Script (JavaScript) to set voice and speak + animate mouth Add a Script robot skill (Project -> Add robot skill -> Scripting -> Script), open the script editor, choose JavaScript, and paste this:

// Example JavaScript for ARC Script skill
// 1) Set Azure TTS to an Arabic voice
ControlCommand("Azure Text To Speech", "setVoice", "ar-EG-SalmaNeural");
print("Azure voice set to ar-EG-SalmaNeural");

// 2) Text to speak (Arabic). Replace with whatever you want the robot to say.
var text = "مرحبا، كيف حالك؟"; // "Hello, how are you?" in Arabic

// 3) Trigger Talk servo v2 to animate mouth using the same text
ControlCommand("Talk servo v2", "Speak", text);
print("Talk servo animation started");

// 4) Speak using Azure and wait until completed
ControlCommand("Azure Text To Speech", "speakWait", text);
print("Azure finished speaking");

Notes about the code:

  • In ControlCommand(...), use the exact skill names that appear in your project. If you renamed the skill window, use that name.
  • I used speakWait to block until speech completes. If you prefer asynchronous speak, use "speak" instead.
  • Talk servo v2's Speak command will animate the servo based on the text provided. Tune Start Delay inside the Talk servo config to get better sync with Azure audio.
  • Use print() for debug output in ARC scripts.

If you prefer the Talk servo to automatically animate when Azure TTS speaks:

  • In the Azure TTS skill config you can enable any "Replace Audio Script Commands" or similar option (see the Azure skill config). If the Azure skill is replacing Audio.say(), Talk servo v2 can listen to the built-in Say() commands. But the explicit approach above (sending both commands) is simpler and gives you control over timing.
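If you take the explicit two-command approach, it can be wrapped in a small helper so every spoken phrase also animates the mouth. This is a sketch only: the helper name speakWithMouth is made up for illustration, and the skill window names are assumed to match your project.

```javascript
// Sketch: wrap the two ControlCommand calls so every spoken phrase also
// animates the mouth. The skill window names below are assumptions -
// use the exact names from your own ARC project.
function speakWithMouth(text) {
  // Start the mouth animation first; Talk servo v2 applies its own Start Delay.
  ControlCommand("Talk servo v2", "Speak", text);
  // Then speak with Azure and block until the audio finishes.
  ControlCommand("Azure Text To Speech", "speakWait", text);
}

// Example usage in an ARC script:
// speakWithMouth("مرحبا، كيف حالك؟");
```

Keeping both calls in one place also makes it easy to adjust the ordering or add a sleep() between them while you tune the sync.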

Tuning tips

  • If the mouth animation starts too early or late, reduce/increase Talk servo v2’s Start Delay.
  • If jaw movement amplitude is too large or small, adjust servo Min/Max in Talk servo config.
  • Test with short phrases, then longer ones.
  • If using the EZB speaker (Speak out of EZB option in Azure skill), verify EZB audio is connected (Audio.isConnected()) and that you selected "Speak out of EZB" in the Azure skill config.

Limits and reminders

  • Azure TTS limits in ARC: generated speech is limited to 500 characters per call and 1000 daily calls for ARC Pro (see Azure skill manual).
  • Make sure your Azure credentials (if required by the skill) are configured in the Azure TTS skill settings.
  • Servo ports are constants like d0 (don’t put them in quotes when configuring servo positions in scripts).
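Because of the 500-character limit, longer texts need to be split before being sent to the skill. Below is a plain-JavaScript sketch of one way to do that; the function name chunkText and the sentence-splitting regex are my own, not part of ARC or Azure.

```javascript
// Sketch: split long text into chunks of at most maxLen characters,
// preferring sentence boundaries, so each chunk fits Azure's 500-char limit.
// chunkText is a made-up helper name, not an ARC or Azure API.
function chunkText(text, maxLen) {
  // Break the text into sentence-ish pieces (text ending in . ! or ?).
  var sentences = text.match(/[^.!?]+[.!?]*\s*/g) || [text];
  var chunks = [];
  var current = "";
  for (var i = 0; i < sentences.length; i++) {
    var s = sentences[i];
    // Flush the current chunk if adding this sentence would overflow it.
    if (current !== "" && (current + s).length > maxLen) {
      chunks.push(current.trim());
      current = "";
    }
    // A single sentence longer than maxLen is hard-split.
    while (s.length > maxLen) {
      chunks.push(s.slice(0, maxLen));
      s = s.slice(maxLen);
    }
    current += s;
  }
  if (current.trim() !== "") chunks.push(current.trim());
  return chunks;
}
```

In an ARC script you would then loop over the result and send each chunk, e.g. ControlCommand("Azure Text To Speech", "speakWait", chunks[i]), while keeping the total number of calls under the daily limit.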

Would you like:

  • A tested sample phrase list in Arabic to try different mouth motions?
  • Help tuning Start Delay and vowel/constant delays (I can suggest values based on your initial test)?
  • If you still want an EZScript version, paste the EZScript you are trying and I’ll convert it into JavaScript for you.

Any of the above I can help with - which servo port will you use (d0, d1, etc.) and which Arabic voice did you pick from the Microsoft list?


PRO
Iraq
#4  

Hello @Athena , I currently want to convert speech to text in Arabic. Which skill supports this? Bing doesn't support Arabic. I want to speak to it using ChatGPT and have it reply using Azure voice. I followed your YouTube tutorial and it worked for me, but it doesn't support Arabic. Is there a skill that converts speech to text, or is there another way for me to do this?

PRO
Canada
#6  

Short answer: Bing Speech Recognition can do Arabic - you must pick the correct Arabic locale in the Bing skill (or paste the locale code). Then send the recognized text to your ChatGPT/OpenAI skill and have Azure Text To Speech speak the AI reply in Arabic.

Below are exact steps, examples, and a ready-to-use JavaScript template you can paste into the Bing skill’s All Recognized Script (or a Script skill), plus tips for reliability.

  1. Set the Bing Speech Recognition skill to an Arabic locale
  • Pick the Arabic locale from Microsoft's list that best matches your dialect (e.g., ar-EG, ar-SA).
  • Open the Bing Speech Recognition robot skill in ARC.
  • Press Config (kebab/hamburger) and find the Language dropdown or Language field.
  • Select or paste the locale you chose (for example: ar-EG).
  • Save and test by speaking short Arabic phrases. Confirm $BingSpeech contains the Arabic text.
  2. Pipeline: STT (Bing) → ChatGPT/OpenAI → TTS (Azure)
  • Add/configure your ChatGPT / OpenAI robot skill in ARC and confirm how it accepts prompts and where it stores the reply (check that skill’s manual for the exact ControlCommand and response variable name).
  • Configure Azure Text To Speech:
    • In the Azure TTS config choose an Arabic neural voice (see the Microsoft TTS list in the same MS doc; example voice ids: ar-EG-SalmaNeural or ar-SA-ZariyahNeural).
    • Test Azure TTS directly (Config → Sample).
  3. Example JavaScript templates
  If you want the Bing skill to automatically forward recognized text, put this as the Bing skill’s All Recognized Script (JavaScript):
// All Recognized Script for Bing (JavaScript)
var phrase = getVar("$BingSpeech", "");
print("Recognized (Bing): " + phrase);

// Send the phrase to your ChatGPT/OpenAI skill (replace skillName and command with the skill's documented command)
ControlCommand("Your ChatGPT Skill Name", "SendPrompt", phrase);

// Option A: If the chat skill returns its reply in a global variable, speak it with Azure after waiting a bit:
sleep(800); // give the chat skill time to complete (or use a more robust wait strategy)
var reply = getVar("$ChatGPTReply", ""); // replace with the actual variable name the Chat skill uses
if (reply !== "") {
  // set an Arabic voice (optional; set once elsewhere if you prefer)
  ControlCommand("Azure Text To Speech", "setVoice", "ar-EG-SalmaNeural");
  // speak asynchronously
  ControlCommand("Azure Text To Speech", "speak", reply);
  print("Spoken (Azure): " + reply);
}
  If your ChatGPT/OpenAI skill exposes a ControlCommand that returns text directly (some skills let you call and get a return value), adapt accordingly. Check the ChatGPT/OpenAI skill manual for the exact ControlCommand names and response variable names.

Notes about the code:

  • Replace "Your ChatGPT Skill Name" with the exact name of the skill window in your ARC project.
  • Replace "SendPrompt" and "$ChatGPTReply" with the actual command and response variable as documented by the ChatGPT/OpenAI skill you’re using in ARC.
  • Use print() for debugging output shown in the Script skill’s console.
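The fixed sleep(800) in the template is fragile: the chat skill may take more or less time to answer. A more robust strategy is to poll the response variable until it is non-empty or a timeout expires. The sketch below is plain JavaScript to illustrate the pattern; in ARC you would use getVar("$ChatGPTReply", "") as the getter and sleep() instead of the busy-wait (pollForReply and mockGetVar are illustrative names, not ARC APIs).

```javascript
// Sketch of a more robust wait strategy than a fixed sleep().
// Polls a getter until it returns a non-empty value or the timeout expires.
function pollForReply(getValue, timeoutMs, intervalMs) {
  var deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    var value = getValue();
    if (value !== "") return value;       // reply arrived
    var next = Date.now() + intervalMs;   // crude wait; in ARC use sleep(intervalMs)
    while (Date.now() < next) { /* busy-wait */ }
  }
  return "";                              // timed out; nothing to speak
}

// Demo with a mock getter standing in for getVar("$ChatGPTReply", ""):
// it returns empty twice, then a reply on the third poll.
var calls = 0;
function mockGetVar() {
  calls++;
  return calls < 3 ? "" : "hello";
}
var reply = pollForReply(mockGetVar, 5000, 10);
```

In the All Recognized Script you would replace the sleep(800) line with a call like this, and only speak when the returned reply is non-empty.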
  4. If Bing doesn’t show Arabic in the dropdown
  • Paste the exact locale token from Microsoft (e.g., ar-EG) into that Language field. Synthiam_Support already recommended this approach for Arabic: use the skill config language dropdown and enter the desired locale. If the exact entry isn’t visible, pasting the correct locale token normally works.
  5. Microphone, dialect, and accuracy tips
  • Arabic dialects vary. Decide whether you want Modern Standard Arabic (MSA) or a regional dialect (Egyptian, Saudi, etc.). Pick the locale that best matches the speaker’s dialect.
  • Use a headset or close mic, and avoid mounting the mic on the robot (robot motors make STT less accurate).
  • If the environment is noisy, use push-to-talk (button) to avoid accidental captures. See the Push-to-Talk tutorial: https://synthiam.com/Support/Get-Started/Tutorials/speech-recognition-with-push-button-microphone
  6. Want me to prepare the exact script? Tell me:
  • Which ChatGPT/OpenAI skill you have in your project (exact skill name as it appears in ARC),
  • The ControlCommand that skill uses to accept text and how it returns responses (or the response variable name),
  • Which Arabic voice/locale you prefer for Azure (e.g., ar-EG-SalmaNeural or ar-SA-ZariyahNeural),
  • Whether you want synchronous speakWait or asynchronous speak.

I’ll build the exact All Recognized Script (JavaScript) with the correct ControlCommand/variable names and a robust wait strategy to ensure a smooth STT → AI → TTS flow.


PRO
Iraq
#7  

Hello @Athena, thank you. The Arabic language issue has been resolved. The second problem is with moving the servo. I discovered that my EZB is an Arduino Mika, and I downloaded the latest version (2026/1/25) and its code from your website using the IDE. It successfully connected to ARC on COM5. When I send commands to move the servo, the board receives the signal on the TX and RX pins, but it doesn't move the servo - as if it only receives commands from ARC without acting on them. When I tried moving the servo on the same Arduino Mika using the IDE, it moved with the same electrical connections. What is the problem? Thank you.

PRO
Canada
#8  

Good troubleshooting - you have already narrowed it down a lot. The most likely causes are either (A) the Arduino Mika is not running the EZB-compatible firmware/protocol ARC expects, or (B) the servo is wired to the TX/RX (serial) pins or otherwise to a pin that conflicts with ARC’s serial communication. Both will produce exactly the symptom you described: the servo moves when you run a direct Arduino sketch, but not when ARC is connected via COM.

Do these checks and try the fixes below. I included ready-to-run ARC JavaScript tests so you can quickly verify behavior.

  1. Quick connection check (in ARC)
  • Confirm ARC reports the EZB/Arduino as connected.
    • Open your EZB Connection skill window and check the status line.
    • Or run this Script (Project → Add robot skill → Script → JavaScript) and paste:
// Quick connection check
var connected = controlCommand("Connection","GetStatus", 0); // change index if needed
print("Connection GetStatus => " + connected);
  • If this returns false / disconnected, ARC is not connected (fix COM/baud/DTR). If true, ARC is talking to the board.
  2. Verify the firmware
  • Make sure the board is running the Synthiam EZB-compatible firmware downloaded from the Synthiam website, not a custom servo test sketch.
  3. Check where the servo is physically connected (very important)
  • You said it "receives the signal on the TX and RX pins" - that suggests the servo is plugged into the Arduino TX/RX (serial) pins (pins 0 & 1). That is a problem:
    • Pins 0/1 are used for the USB serial interface which ARC uses to talk to the board. If you put a servo on those pins it will conflict with ARC’s protocol and won’t move when ARC is connected.
    • When you upload a custom Arduino sketch that directly toggles those pins (and aren’t using serial), the servo may move - but as soon as ARC opens the serial port, the serial protocol interferes.
  • Solution: Move the servo to a different PWM-capable digital pin that the Mika/EZB firmware maps to a d# port (e.g., a port that corresponds to d0..d23 in ARC). Do NOT use pins 0 or 1 for servos when using ARC over USB/Serial.
  4. Power and ground
  • Ensure the servo power supply is present and strong enough. Many issues only appear when ARC commands multiple servos.
  • The servo GND must be common with the Arduino ground. Confirm the ground is shared between the servo battery and the Arduino.
  5. COM/baud and DTR/RTS
  • In the EZB Connection config in ARC, check that the COM port and Baud match the firmware settings (commonly 115200 but check the firmware docs). If you aren’t sure, try 115200 and 57600.
  • Some USB devices require toggling DTR/RTS. Open the EZB Connection → Config → COM/USB, try enabling DTR/RTS for that index, and reconnect.
  6. Test a servo command from ARC
  • With the board connected and the servo on a proper servo pin (not 0/1), run this JavaScript in a Script skill to test:
// Test servo on d0 to center, wait, then move
// Replace d0 with whatever port you will use (do not put d0 in quotes)
Servo.setPosition(d0, 90);   // center
sleep(1000);
Servo.setPosition(d0, 30);   // move
sleep(1000);
Servo.setPosition(d0, 150);  // move other side
print("Servo test complete. Last position: " + Servo.getPosition(d0));
  • If that moves the servo, ARC -> firmware -> servo wiring is correct.
  7. If the servo still doesn’t move
  • Confirm the firmware supports the physical pin you used (check Mika docs for pin-to-d# mapping).
  • Open the ARC Status window at the bottom, press COPY, and paste the text somewhere for inspection. That will show connection errors, firmware handshake issues, or protocol errors.
  • Use the Serial Terminal (PC) skill to monitor the raw COM data while sending the Servo.setPosition command to see whether ARC is sending data and whether the board replies.


Follow-up questions so I can help further

  1. Which physical Arduino pin is your servo plugged into right now? (e.g., "pin 0 (TX)", "pin 9", etc.)
  2. Which firmware sketch is currently uploaded to the Mika (exact filename or date)? Are you 100% sure it’s the Synthiam/EZB-compatible firmware and not a custom servo test sketch?
  3. What does controlCommand("Connection","GetStatus", 0) return for index 0 in your project?
  4. Paste the ARC Status window output (use COPY) if you still see odd behavior - I can read it and point out errors.

If you tell me the physical pin you used and the firmware you uploaded, I’ll give exact pin-mapping guidance and a tailored test script (and show where to move the servo wiring).

