Welcome to Synthiam!

Program robots using technologies created by industry experts. ARC is our free-to-use robot programming software that makes features like vision recognition, navigation, and artificial intelligence easy.




I set up in ARC to track a red ball and it works quite well. I followed the tutorial for tracking one color. I was playing around with RoboScratch and it is not only fun but educational.

However, when I add a Camera: Wait For Color block set to red, followed by a movement command to the left, a 3-second sleep, a movement command to the right, another 3-second sleep, and finally a movement stop command, unexpected things happen when I start the program.
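For reference, the movement portion of that sequence corresponds to roughly the following EZ-Script (a sketch; the 3-second delays match the sleeps described above):

```
Left()        # begin turning left
Sleep(3000)   # keep turning for 3 seconds
Right()       # then begin turning right
Sleep(3000)   # keep turning for 3 seconds
Stop()        # stop all movement
```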

When I put the red ball in front of the camera view, the robot moves forward as if it were still in tracking mode, makes some other random movements, and then eventually moves left for 3 seconds, right for 3 seconds, and stops as expected. I'm wondering if there is something in the camera tracking and setup screens that I should be enabling or disabling. I wouldn't think so, as the first thing the program does is stop tracking and enable red color detection via the camera control command.

I was wondering what other users have found when using the Wait For Color block in RoboScratch. Detection seems very erratic compared to my experience with color tracking. Thanks much.....Rick B.

You will have to disable the movements that you have configured in the camera control. As your post says, you have configured the robot to track a red ball. That requires a few settings in the camera control, such as "Movement Tracking".

Unselect Movement Tracking from the config in the camera control.

You can configure the camera control for other features, which can then be used in RoboScratch. In your case, you had forgotten about the feature you enabled, even though you mentioned it in your post. So you simply weren't connecting the two :)

Now in the future, you can see the power of being able to use camera control settings to make RoboScratch do even more.

However, in your case right now - simply uncheck the "Movement Tracking" checkbox from within the camera control config screen.

I mentioned that when I looked at the code for the camera in RoboScratch, it looked like the code disabled tracking, enabled detecting color red and before exiting turned tracking back on.

Am I confused about what the RoboScratch camera code is actually doing while waiting for the color red before continuing? I'm sure that's a real possibility.....Rick B.
It disabled tracking types (color, motion, face, etc.). It did not disable tracking reactions (servo, movement, etc.).

For definition of tracking types press the ? question mark on the camera control to view the camera control manual. Here is a direct link: http://www.ez-robot.com/Tutorials/Help.aspx?id=80

You are correct, as per the description and the help - the RoboScratch feature in question "Waits for the camera". The RoboScratch UI has question marks that you can hover over for additional information.

1) Move your cursor over the ? (question mark) of the RoboScratch control in question on your workspace (Wait For Color)

2) A bubble will appear

3) Read the contents of the bubble

4) The bubble will disappear when you move the cursor away from the question mark
Thanks DJ,

I will disable movement tracking in camera controls and will give it a whirl.

RoboScratch will be great for kids especially, and it has something for everyone who wants to improve their programming skills. I can't wait to demo the adventure robot and ARC at the STEM Faire coming up, not this Saturday but the following Saturday! It is incredible the power that the software and hardware have! Great work! Rick B.
Nice! It also works on your mobile device, so try it there as well :)

I updated the manual for the camera control to be more specific regarding your question with the difference between Tracking Types and Tracking Reactions. You will find this update in the Camera Control manual by pressing the ? (question mark)


Servo/Movement Tracking
The above section describes tracking types; that is, what the robot is looking for. This section describes how the robot reacts when it detects an object of the selected tracking type. The reaction types are Servo, Movement, Script, or any combination of the three. The Config menu of the Camera Control has many options; on its first page you will find the options for the servo and movement tracking reaction types.

If servo tracking is checked, the control assumes the camera is mounted on the specified servo. The servo will be moved left, right, up, and down to track the colored object based on the servo settings that you have provided.

First, do not confuse Movement with Motion. The Motion tracking type (as discussed earlier) is for detecting a moving object in the camera view. The Movement reaction type (as discussed here) has your robot physically move toward an object. If Movement is checked, the robot will follow the object whenever it is visible, using the project's Movement Panel. The robot will move forward, left, or right to follow the desired object. A robot has only one way to "move", and that is defined in the project's Movement Panel, which may be an Auto Position, HBridge, or more. Selecting this Movement option will configure the robot to move toward the detected object by controlling the Movement Panel.

On the Script tab of the Camera control config menu is a Tracking Start and Tracking End script. When an object is detected from the tracking type, the Tracking Start script will execute. When the camera is no longer tracking an object, the Tracking End will execute.
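For example, the Tracking Start script could set a flag and announce the detection, and the Tracking End script could clear the flag (a sketch; the $objectSeen variable name and the spoken phrase are made up for illustration):

```
# Tracking Start script: runs when an object is first detected
$objectSeen = 1
SayWait("I see the object")
```

```
# Tracking End script: runs when the camera loses the object
$objectSeen = 0
```

Other scripts in the project can then check $objectSeen to know whether the camera currently has the object in view.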

Thanks DJ,

Would I be correct in assuming at this point that color tracking, etc., and speech recognition won't work at this time in the mobile app?

I have the RGB animation display and the camera tilt servo working in the mobile app.

With regard to the mobile app and the ultrasonic sensor, it seems one could add a button and display area (much like the temperature and battery level areas) and write a script to display the distance from the ultrasonic sensor. Would this be feasible?

Thanks Again DJ......Rick B.
The question mark (?) on the Mobile Interface control will display the help manual for it. That will give a list of controls supported by the app.

Your ultrasonic idea will work if you assign its value to a variable and display it in a Script Label. However, there is no advantage to displaying the value other than for curiosity's sake. Ultrasonic sensors are used for object avoidance by detecting distance. There are tutorials in the learn section on using the ultrasonic sensor for object avoidance with scripts. You can run those scripts on your mobile device.
Here's the easiest way to perform autonomous navigation with the ultrasonic sensor on your adventure bot. Simply create a button on your mobile interface and add this as the code...



:loop
$distance = GetPing(d11, d12)
IF ($distance < 50)
  # Too close to an obstacle: turn away (speed and duration are example values)
  Left(255, 750)
ELSE
  Forward()
ENDIF
Goto(loop)

So when that button is pressed, the robot will begin driving and navigating around on its own. When the button is pressed again, the script will stop and the robot will no longer be navigating. You will have to press the STOP button because the robot will most likely still be moving forward. But that will work.

To display the current distance data, simply add a Script Label and have it display the $distance variable.

The $distance variable will not be initialized until the script runs, so it's a good idea to set the $distance variable to 0 or "n/a" at connection. In the Connection control, add this to the EZB #1 connection script...


$distance = "n/a"

That will initialize the $distance variable when a connection is established to the robot.

Thanks a bunch ! Rick B.