Athena, how do I start using ez-robot head software? I don't know where to start.
Ya gotta use a @ before her name. Then make sure your wording is clear. First, the software called ARC that runs everything is made by Synthiam. The hardware that is commanded by the Synthiam software, like the EZB, is from EZ-Robot. You can also turn other, less expensive boards, like some Arduinos, into an EZB and use ARC. So your question should look something like this:
@Athena, how do I start using ez-robot head hardware with the Synthiam software? I don't know where to start.
You can now see her answer below this post.
In short, two good places to learn how to do this are the "Getting Started" section in the Synthiam support area here: https://synthiam.com/Support/Get-Started/overview
And by going through the "Learn" section at www.ez-robot.com, here: https://www.ez-robot.com/
Have fun!
Hello @bobarebob! To start using the EZ-Robot head hardware with the Synthiam software, you will need to follow a few steps.
First, you will need to connect your EZ-Robot head to your computer using the EZ-B v4 or EZ-B v5 controller. The EZ-B v4 or EZ-B v5 controller is a small board that connects to your computer via USB or Wi-Fi and controls the servos in your EZ-Robot head.
Once your EZ-Robot head is connected, you can use the ARC software to control the servos. ARC is powerful, user-friendly software that allows you to create custom robot behaviors, control servos, and much more.
To control the servos in your EZ-Robot head, you can use the Servo control feature in ARC. This feature allows you to manually move the servos or create custom movements by recording and playing back servo positions.
If you need more detailed instructions on how to get started with the EZ-Robot head hardware and the ARC software, I recommend checking out the support section on the Synthiam website. They have a variety of articles and tutorials that can help you get started.
Please let me know if you have any specific questions or if there's anything else I can assist you with!
I am a customer support GPT-powered AI and still learning. If you have doubts about my response, please use the support documentation to verify my answer.
Hi bobarebob,
Just an idea: is your EZ-B v4 already connected to your Wi-Fi and in ARC?
Hello @bobarebob,
All you need to do once the EZ-inMoov Robot Head is built is:
1. On a Windows PC/laptop/tablet, download and install the ARC Teams edition.
2. Download the basic sample project from the middle of the EZ-inMoov Robot Head page and place it in your Documents/ARC/My Projects folder.
3. Open ARC. When the Open Project window pops up, click on the File System tab and select the "EZ-inMoov Robot Head Basic" project.
4. Flip the power switch on your EZ-inMoov Robot Head.
5. Open your Wi-Fi pop-up menu in the lower-right taskbar and connect to the "EZ-B v4 IoTiny xxxxxxx" SSID.
6. Once connected to the IoTiny SSID, press the top "Connect" button in the upper right of the ARC software.
7. The EZ-inMoov Robot Head will introduce itself and start a conversation with you.
If you are looking to use the more advanced features and ChatGPT, here are the instructions:
1. On a Windows PC/laptop/tablet, download and install the ARC Pro edition; you will need to sign up and pay for a Pro subscription.
2. Download the advanced sample project from the middle of the EZ-inMoov Robot Head page and place it in your Documents/ARC/My Projects folder.
3. Open ARC. When the Open Project window pops up, click on the File System tab and select the "EZ-inMoov Robot Head Adv" project.
4. Follow the instructions here to install and configure the OpenAI ChatGPT skill; you will need to sign up for an OpenAI account.
5. Flip the power switch on your EZ-inMoov Robot Head.
6. Open your Wi-Fi pop-up menu in the lower-right taskbar and connect to the "EZ-B v4 IoTiny xxxxxxx" SSID.
7. Once connected to the IoTiny SSID, press the top "Connect" button in the upper right of the ARC software.
8. The EZ-inMoov Robot Head will introduce itself and start a conversation with you.
9. Use the Bing Speech Recognition skill to interact with the EZ-inMoov Robot Head and ChatGPT.
For those community members who may not have stumbled across it yet, EZ-Robot has released a new kit and made a remix of the InMoov head that fits EZ-Robot parts. It's a bit of a stealth release, as we haven't advertised it yet. The kit is now available for sale, and the 3D build instructions and STL files are now published, but we are still gearing up before we make a full public release. Questions like these are great, as they help fill in the holes that we may have missed along the path to release.
@bobarebob
To start controlling the EZ-inMoov head yourself and making custom actions, I would first start by modifying the example projects provided.
Then you'll likely want to read up on how the following skills work:
Auto Position, Soundboard, Talk Servo v2, Speech Recognition, Bing Speech Recognition, OpenAI ChatGPT, and Camera.
@athena,
Why does the Auto Position spontaneously move the ez-robot head? I swear I'm not doing anything, but the head will spontaneously do something, like execute an Auto Position action I previously programmed, even though my mouse is nowhere near the ARC screen.
Anyone have this problem? RF, or poor scripting on my part?
Another person was working near me, so I thought it might be RF from his mouse or speaker.
Hello @bobarebob,
The Auto Position robot skill in ARC is designed to control servos in a sequence to create movements, such as walking, dancing, or in your case, moving the robot's head. If your robot's head is moving unexpectedly, it could be due to a few reasons.
Scripting: If there's a script running in the background that triggers the Auto Position, it could cause the robot's head to move unexpectedly. Check your scripts to ensure there's no unintentional triggering of the Auto Position.
Interference: While it's less likely, RF interference could potentially cause unexpected behavior. However, this would typically affect the connection between your computer and the robot, rather than triggering specific actions.
Settings: Check the settings of your Auto Position skill. Make sure there are no actions set to run automatically or on a loop that you're not aware of.
If you're still having trouble, I recommend checking out the support section on the Synthiam website. There are many helpful articles and tutorials that can guide you in troubleshooting this issue.
I am a customer support GPT-powered AI and still learning. If you have doubts about my response, please use the support documentation to verify my answer.
I’ve seen this kind of trouble with analog servos or third-party servos before. Sometimes they are susceptible to interference. The only time I’ve seen EZ-Robot digital servos do this is when my bench power supply was quite noisy and introduced ghost signals that caused the servos to move.
I would start by checking your power supply; if you can, try with battery power to see if everything works with clean battery voltage.
What Jer described is electrical interference on the signal wire, which makes the servos jerk or vibrate.
But given that you’re saying the Auto Position is executing, it must be called from another robot skill. I suspect speech recognition?
The Auto Position cannot launch an action randomly, as computers do not have personalities or the ability to think or react, contrary to the AI buzz lately. They need an instruction to make it happen. The best way to debug is to start a new project and slowly import robot skills from your existing project; that way you can see when it starts happening. Or you can look at the logs of the other robot skills to see what’s triggering it, which would be the easiest.
Have you had any progress identifying what robot skill is triggering the Auto Position action?
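To illustrate the log-based debugging idea above, here is a generic, hypothetical sketch; it is not the ARC API, just a plain-Python picture of recording which skill requested which action, so the most frequent trigger source stands out:

```python
from collections import Counter

class ActionLog:
    """Hypothetical trigger recorder; a real skill would move servos here."""

    def __init__(self):
        self.calls = []

    def trigger(self, action, source):
        # Record who asked for which action before executing it.
        self.calls.append((source, action))

    def summary(self):
        # Count triggers per (source, action) pair to spot the culprit.
        return Counter(self.calls)

log = ActionLog()
log.trigger("HeadTurn", source="Speech Recognition")
log.trigger("HeadTurn", source="Soundboard script")
log.trigger("HeadTurn", source="Speech Recognition")
print(log.summary().most_common(1))  # → [(('Speech Recognition', 'HeadTurn'), 2)]
```

In ARC itself the equivalent information already exists in each robot skill's log window, as noted above; the sketch just shows why counting trigger sources narrows the search quickly.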
I am not sure if I fixed the sudden movements of the robot head, but I did see that I had a script invoked on speech recognition, while I had another script running when I played an MP3 file in the Soundboard. I still get sudden movements of the jaw, as if the sound servo is picking up internal speaker sound. I will pause the sound servo and speech recognition separately and see if the problem goes away.
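If the jaw really is tracking speaker noise, the usual fix is a noise floor: ignore audio levels below a threshold before mapping them to a jaw position. A minimal sketch of that idea follows; the function name, ranges, and threshold are illustrative, not the actual sound-servo skill settings:

```python
def jaw_position(level, noise_floor=0.15, closed=10, open_max=60):
    """Map an audio level in [0, 1] to a servo position, gating out noise."""
    if level < noise_floor:
        return closed  # quiet hiss no longer moves the jaw
    span = open_max - closed
    # Clamp the level so loud peaks cannot overdrive the servo.
    return closed + int(span * min(level, 1.0))

print(jaw_position(0.05))  # → 10 (below the floor: jaw stays closed)
print(jaw_position(0.5))   # → 35 (mid-level audio opens the jaw halfway)
```

Raising the noise floor in the skill's configuration accomplishes the same thing without writing any script.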
@athena,
Where do I find the IoTiny skill for current battery voltage? The root diagnostics page (192.168.1.1:23) says that there is an ARC instance skill I can add.
Hello @bobarebob,
The IoTiny doesn't have a specific skill for battery voltage. However, you can monitor the battery voltage using the EZ-B Battery Monitor robot skill in ARC. This skill will display the current battery voltage of your EZ-B v4 or EZ-B v5 controller.
To add the EZ-B Battery Monitor skill to your project, follow these steps:
1. Open your project in ARC.
2. Click on the "Project" tab in the top menu.
3. Click on "Add" to add a new skill.
4. In the "Choose a Skill to Add" window, scroll down to the "EZ-B v4" section.
5. Click on "EZ-B Battery Monitor" to add the skill to your project.
Once you've added the EZ-B Battery Monitor skill, you can view the current battery voltage in the skill's window in ARC.
If you need more detailed instructions, I recommend checking out the support section on the Synthiam website. They have a variety of articles and tutorials that can help you get started.
I am a customer support GPT-powered AI and still learning. If you have doubts about my response, please use the support documentation to verify my answer.
I don’t think the EZ-Robot IoTiny has a battery monitor? Or if it does, I just forgot.
But this would be it: https://synthiam.com/Support/Skills/General/EZ-B-v4-Info?id=16059
You can also read the voltage from script commands using your desired programming language.
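As a rough illustration of reading the voltage from a script, here is a sketch. `get_voltage()` is a stand-in placeholder, since the actual ARC scripting call isn't shown in this thread, and the cutoff value is only an example, not an official threshold:

```python
LOW_VOLTAGE_CUTOFF = 6.6  # example threshold in volts, not an official value

def get_voltage():
    """Placeholder: a real ARC script would query the EZ-B here."""
    return 7.4  # simulated reading

def battery_ok(voltage, cutoff=LOW_VOLTAGE_CUTOFF):
    # Flag a low battery before the robot starts misbehaving on sagging power.
    return voltage >= cutoff

v = get_voltage()
print(f"Battery at {v:.1f} V, ok={battery_ok(v)}")
```

A script like this could run periodically and speak a warning or stop servos when the reading drops below the cutoff.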
Yes, the IoTiny has a low-voltage battery monitor, exactly like the EZ-B v4; you can read about it on page 8 of the IoTiny datasheet.
You can also set up the Battery monitor in the EZ-B connection skill settings.
@Athena
Is it correct to say that the camera function for the IoTiny board on the EZ-InMoov robot head only works in WiFi access point mode, and not in WiFi client mode?
The camera always works at 192.168.1.1 (AP mode), but never at 172.20.10.6 (client mode).
Maybe I am missing something. I even made sure to put in the colon port number (typically :23 or :24), so that's not it.
If so, why?
Regards
Bob Mahoney Bobarebob
Hello @bobarebob,
The camera function should work in both WiFi Access Point (AP) mode and WiFi Client mode. The difference between the two modes is how your device connects to the EZ-B v4 or EZ-B v5 controller. In AP mode, your device connects directly to the controller's WiFi network. In Client mode, the controller connects to your existing WiFi network, and your device connects to the controller through that network.
If you're having trouble using the camera in Client mode, it could be due to a few reasons:
Network Settings: Ensure that your network settings are correctly configured for Client mode. This includes the IP address, subnet mask, gateway, and DNS server.
Firewall: Your firewall or router settings could be blocking the connection. Check your firewall settings to ensure that the necessary ports are open.
Connection: Make sure that your EZ-B v4 or EZ-B v5 controller is properly connected to your WiFi network in Client mode.
If you're still having trouble, I recommend checking out the support section on the Synthiam website. They have a variety of articles and tutorials that can help you troubleshoot this issue.
I am a customer support GPT-powered AI and still learning. If you have doubts about my response, please use the support documentation to verify my answer.
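One way to narrow down a client-mode camera problem is to check whether the relevant TCP ports are reachable at all at the client-mode IP. The sketch below only tests connectivity, using the port numbers mentioned earlier in the thread (:23 and :24); it says nothing about the video stream itself, and the example IPs are just the ones from this thread:

```python
import socket

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def check_host(host):
    # e.g. check_host("172.20.10.6") for client mode,
    #      check_host("192.168.1.1") for AP mode
    for port in (23, 24):
        print(port, "open" if port_open(host, port) else "closed")
```

If port 23 is reachable in client mode but port 24 is not, that points at the network (routing, firewall, or subnet) rather than the camera hardware, since the camera worked in AP mode.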
@Athena, you mentioned there is an EZ-B v5 controller. Where can I buy one? Are they available now? I only know of an EZ-B v4 as of now.
I apologize for the confusion, @Dave_Schulpius. Currently, Synthiam only offers the EZ-B v4 controller. There seems to have been a mistake in my previous responses. I appreciate your patience and understanding.
I am a customer support GPT-powered AI and still learning. If you have doubts about my response, please use the support documentation to verify my answer.
Am looking at post #14. Is this the battery monitor?