
PRO
Synthiam
#2  

The poster might be asking because he is coming from the mindset of writing a program from scratch.

Much like newer design suites, you don't write programs with ARC. You use pre-existing modules and link them together rather than reinventing the wheel each time.

So if you just want to move a servo, it can be done with the Python command found in Jeremie's link. Moving a servo from a script can occasionally be useful, but generally it isn't necessary, because existing robot skills already move servos for you, such as the Auto Position skill, or even the Camera robot skill for tracking an object with servos.
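For a sense of scale, a servo move in an ARC script is a one-line call. The sketch below stubs out that call so it runs anywhere; the `Servo.setPosition` name and the 1..180 range are my assumptions here, and the real command is the one in Jeremie's link.

```python
# Stand-in for ARC's servo binding, so this sketch runs outside ARC.
# The method name setPosition is an assumption -- inside ARC, use the
# actual command from Jeremie's link instead of this stub.
class Servo:
    positions = {}

    @staticmethod
    def setPosition(port, degrees):
        # Servo positions are bounded; clamp to a typical 1..180 range.
        degrees = max(1, min(180, degrees))
        Servo.positions[port] = degrees
        return degrees

# Move the servo on port D0 to its centre position.
Servo.setPosition("D0", 90)
print(Servo.positions)  # {'D0': 90}
```

The point of the robot-skill approach is that you rarely write even this much: the Auto Position or Camera skill issues these calls for you.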

I recommend the Getting Started guide, because it explains this in more detail.

While Python and other languages exist in ARC, they are there to fill the gaps between robot skills, such as specific programmatic things you need the robot to accomplish.

Australia
#4  

Thank you guys for the reply.

I have read all the topics you recommended but could not find how to link my Python script (with ARC Python API functions) to ARC. My understanding is that I have to write the script and somehow let ARC execute it. Where do I put my own script?

By the way, I have already used existing robot skills to move the servos. The reason I want to use Python is for specific tasks for the robot, but I do not know how to do it.

PRO
Synthiam
#5  

Python is different for every robot framework.

As Jeremie said, you could write Python in a Script robot skill if you need to accomplish something that doesn't already exist in another robot skill.

Can you tell us what your Python script does? Or what you're wanting to achieve? Or paste the script for us to see?

The concept of ARC and creative design tools is that you reuse existing modules to accomplish something, rather than writing all the code to do what already exists. You can use Python to link the logic between robot skills.

Australia
#6  

Python as a programming language is the same on any computer, microcontroller, robot platform, etc. It is, as you know, a portable language. How you implemented it in the ARC framework is a different story. If I know Python programming, I do not need to learn Python for your platform. As I understand it, I need to know the ARC Python API and how to use it. I had a look at the Python API functions and found some of the ones I need.

Now, my "robot" description: I am creating a human head with speech and face recognition. It will hold conversations with the people around it. It is a big task. I spent a lot of time studying different robot platforms and software frameworks and found EZ-Robot and ARC the best for the task. The EZ-Robot hardware for my project is already on my desk, and ARC is on my PC. I have spent a week (not enough yet!) studying ARC and "playing" with the hardware, just to become familiar with the tools for my project.

What I need to do first (not priority 1) is to use the camera skills to make my "head" capable of recognizing a face among the faces stored in a database on the PC. Next, by talking to the recognized person, my "head" has to acquire crucial data about that person for further conversation when it meets that person again. The data has to be stored in the persons database.

As I said, the first step is to implement the face recognition task. I did that already on the Raspberry Pi with OpenCV and Python, which is probably what caused my confusion about the Python implementation in ARC. I just wanted to make a simple script to move servos for the sake of learning the Python API. My real task is face recognition using EZ-Robot hardware and ARC software. The final task is the intelligent head with EZ-Robot and ARC.
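The "acquire data about a person and store it for the next conversation" part of this project maps naturally onto a small key-value store. Here is a minimal runnable sketch in plain Python using the standard library's sqlite3; the table and column names are my own invention for illustration, not anything ARC prescribes.

```python
import sqlite3

# Minimal persons database: one row per (person, fact key, fact value).
conn = sqlite3.connect(":memory:")  # use a file path for persistence
conn.execute("CREATE TABLE facts (person TEXT, key TEXT, value TEXT)")

def remember(person, key, value):
    """Store one fact learned during conversation."""
    conn.execute("INSERT INTO facts VALUES (?, ?, ?)", (person, key, value))

def recall(person):
    """Return everything known about a person as a dict."""
    rows = conn.execute(
        "SELECT key, value FROM facts WHERE person = ?", (person,))
    return dict(rows)

# After face recognition identifies "Alice", store what she tells the robot.
remember("Alice", "favourite_colour", "green")
remember("Alice", "hometown", "Sydney")
print(recall("Alice"))
```

The same `remember`/`recall` calls could sit inside an ARC script, triggered once the camera skill sets the recognized face's name.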

Hope this is enough for now. Also, I hope you will help me during my "journey". PS: The topic title "Python servo Control" now looks misleading (at least). Since I disclosed my project, I think I should change it to "Humanoid head robot" or similar.

PRO
USA
#7  

Quote:

What I need to do first (not priority 1) is to implement the Camera skills in order to make my "head" capable to recognize a face among faces stored in the PC database.

https://www.ez-robot.com/learn-robotics-getting-started-humanoid-robot-kit.html

I don't see Python in the list, but it must be similar:

Camera Input

  1. Introduction to the EZ-B Camera
  2. Face Detection with RoboScratch
  3. Face Detection with Blockly
  4. Face Detection with EZ-Script
  5. Color Tracking with Servos
  6. Color Tracking with Movement
  7. Detecting Multiple Colors
  8. Line Following with Roli, AdventureBot and Shell-E
  9. Vision - Object Training & Recognition
  10. Glyphs to Control Robot Movement
  11. Detecting Glyphs & Augmented Reality
  12. QR Code Detect
  13. Microsoft Cognitive Emotion
  14. Microsoft Cognitive Vision
PRO
Synthiam
#8   — Edited

Great - thanks for sharing what your objective is. That can be achieved easily with a few mouse clicks, and I'll tell you how now.

But first, I want to clarify, so there's no confusion, my statement that Python is different per robot framework. There are many libraries and modules for Python to interact with physical hardware (i.e., servos, switches, cameras, etc.), and every framework uses a different method of communicating with that hardware. This has nothing to do with changing Python's syntax - rather, the API for each robot framework accepts different commands to move servos. My inquiry about your program was to understand which framework was being used to move servos, so the calls could be easily translated to ARC servo commands. However, what you're looking to achieve is quite simple and doesn't require much scripting - all hail re-usable robot skill modules.:)

ChatBot
The first thing you will want to do is experiment with which chatbot you'd like to use. There are a number of them on Synthiam's platform, as you can see in the skill store: https://synthiam.com/Products/Controls/Artificial-Intelligence.

I would probably recommend using the AIMLBot because it is very configurable and has a feature that you require to understand who is looking at the robot by the camera. So, install the AIMLBot here: https://synthiam.com/Support/Skills/Artificial-Intelligence/AimlBot?id=16020.

Make Chatbot Speak
The chatbot won't speak by default; it'll display the output in the log window. Let's edit the chatbot to add some Python code to make it talk out of the PC speaker OR the EZB speaker - whichever you choose. View the AimlBot configuration and select the response script.

User-inserted image

Now, under the Python tab, add one of these, depending on whether you want the audio out of the PC or the EZB:


# speak out of the PC
Audio.Say(getVar("$BotResponse"))

# speak out of the EZB
Audio.SayEZB(getVar("$BotResponse"))

User-inserted image

Speech Recognition
Now you need to speak to the robot. There are dozens of speech recognition modules, but Bing Speech Recognition is preferred; it is very reliable and configurable for things like this. You can install it here: https://synthiam.com/Support/Skills/Audio/Bing-Speech-Recognition?id=16209.

Connect Speech Recognition to Chatbot
Now you need to connect the speech recognition to the chatbot, so that when you speak, the detected phrase is pushed into the AIML chatbot. View the Bing Speech Recognition configuration screen and add this code to the All Recognized Scripts. Since you're using Python, I used the Python tab.

User-inserted image

User-inserted image


ControlCommand("AimlBot", "SetPhrase", getVar("$BingSpeech"))

Once you save that configuration setting, you can start talking to the robot, and the chatbot will print responses back.
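To summarize the wiring above: Bing Speech sets `$BingSpeech`, the ControlCommand hands it to the AimlBot, and the bot's response script speaks `$BotResponse`. Here is a runnable sketch of that data flow, with stand-ins for ARC's `getVar`/`setVar`, `ControlCommand`, and `Audio.Say`; the canned bot reply is obviously a placeholder for the real AIML engine.

```python
# Stand-ins for ARC's global variable store.
_vars = {}
def setVar(name, value): _vars[name] = value
def getVar(name): return _vars[name]

spoken = []
class Audio:
    @staticmethod
    def Say(text):
        # In ARC this plays through the PC speaker; here we just record it.
        spoken.append(text)

def aimlbot_set_phrase(phrase):
    # Placeholder for the AIML engine: produce a canned reply.
    setVar("$BotResponse", "You said: " + phrase)
    # The chatbot's response script then speaks the reply.
    Audio.Say(getVar("$BotResponse"))

def ControlCommand(skill, command, arg):
    # Route the command to the target skill, as ARC would.
    if (skill, command) == ("AimlBot", "SetPhrase"):
        aimlbot_set_phrase(arg)

# 1. Bing Speech Recognition detects a phrase:
setVar("$BingSpeech", "hello robot")
# 2. The "All Recognized Script" forwards it to the chatbot:
ControlCommand("AimlBot", "SetPhrase", getVar("$BingSpeech"))
print(spoken)  # ['You said: hello robot']
```

None of these stub definitions are needed inside ARC itself; there, only the single `ControlCommand(...)` line goes in the Bing script and the `Audio.Say(...)` line in the AimlBot response script, exactly as shown in the screenshots.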