
Gale
Australia
Asked

How do I start using Python in ARC? I could not find how to create and run a Python app in ARC. Ta
Related Hardware EZ-B v4
Now that you have the chatbot working, let's add the camera stuff so the robot knows who it is seeing...
Add Camera Device Add the camera device so the robot can see using the camera. Install it from here: https://synthiam.com/Support/Skills/Camera/Camera-Device?id=16120
Detect and remember a Face There are a few ways to remember and know faces. The most popular is using Cognitive Face, which will remember the face and emotions, age, etc. It will allow your robot to recognize how happy or sad someone is. So we'll go ahead with the cognitive face. Add the cognitive face robot skill from here: https://synthiam.com/Support/Skills/Camera/Cognitive-Face?id=16210.
The other method is to train the face as an object using object training in the camera device. You will have to read the camera device manual for information on that: https://synthiam.com/Support/Skills/Camera/Camera-Device?id=16120.
Now, when you press the DETECT button, the information about the person will be displayed. If the robot does not know you yet, press the LEARN button to learn who you are.
Now we can have the camera tell the chatbot who is talking to it. Press the configuration button on the Cognitive Face skill and add this Python code to the script. It will send the currently detected face to the AimlBot chatbot and specify the NAME parameter.
Make Camera Device Detect Face Now we need to tell the camera device to run the detection when a face is detected. On the camera device, switch to the tracking tab and select FACE.
Now let's make a script run that will tell the Cognitive Face to detect the face. This is like pushing the DETECT button every time a face is detected. Edit the configuration of the camera device and add this script to TRACKING START.
There, now you're done with the AI.
The next step is to make the head move. Just use the Auto Position robot skill, and that will be best. Install it from here: https://synthiam.com/Support/Skills/Servo/Auto-Position-Gait?id=20314
There's more than enough information in the Auto Position manual to explain how to create your animations.
Here is the project all working: aiml chatbot.EZB. All you need to add next is the Auto Position for having the robot head servos move around. You can also configure the camera device to move the robot head's eyes to follow the face. That takes just a few settings in the camera device.
Here are the camera settings you need to have the robot's eyes move and follow the face. Select the Track by relative position option if the camera is stationary and not moving.
Jees!!! What a great platform the ARC is!! I am going to try it right now!
PS I have a feeling that ARC Pro is needed for this project. Am I right D(r.)J Sures?
Most likely, because there will be a number of third-party robot skills. You can still code the whole thing yourself in Python and avoid upgrading - but Pro support is what helps us keep making the software better.
Thank you for the kind words
Before upgrading, a question: when I create "my" head, does it mean it will work only as long as I am subscribed to ARC Pro? In other words, if I do not renew my subscription, does that mean I can not run "my" head any more (head "dies")?
In my last post, I forgot to thank you for all you have done so far. Thank you very much now.
When you’re finished programming, use ARC runtime. It’ll run your project without needing a subscription. The subscription is for programming. It’s further documented on the ARC download page or the support section for subscriptions here: https://synthiam.com/Support/Install/subscription-plans/am-i-locked-in
It was sheer laziness on my part not to read the Subscription plan and licensing thoroughly. Sorry about it. I'll try my best not to rush to ask you a question before I read the ARC documentation thoroughly. I'll subscribe to have full ARC features.