This lesson will demonstrate how to use EZ-Script to have the robot wave once it detects a face. At the end of this lesson, readers will be able to enable facial detection and code a basic script using EZ-Script. Follow along with The Robot Program Episode 022: Detect Face and Wave - EZ-Script. View the video episode here: https://www.ez-robot.com/Tutorials/Lesson/102
Professor E's Overview
This lesson demonstrates how to enable facial detection and how to trigger an action using EZ-Script.
Always start with a fully charged, disconnected robot. Load EZ-Builder and connect to the robot. Open the bare robot project, which provides a clean workspace without unnecessary controls.
Add the Camera Device control and test the camera view. The camera acts as a peripheral, supplying external visual input that scripts can respond to.
In the Tracking tab of the Camera Device, select Script and enable the execution checkbox. A tracking script can be executed in one of two scenarios: either when tracking begins or when tracking ends.
Click on the Pencil icon next to Tracking Start to access the Blockly workspace, and then change the tab to EZ-Script.
There are multiple ways to add code in the EZ-Script workspace. Right-click to view options, scroll through the Cheat Sheet, or start typing to be prompted by Intellisense. Line numbers are provided on the left-hand side for debugging and organization. It is recommended practice to use a consistent naming convention when coding.
Add the ControlCommand("Auto Position", AutoPositionAction, "Wave") line of code, and then add SayEZB() along with the desired text to be spoken. The text will be stored as a string of characters. Review the code to understand how it will execute. Save the script and return.
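The finished Tracking Start script might look like the following sketch. The control name "Auto Position" and the action name "Wave" assume the defaults in the bare robot project, and the spoken text is only an example; adjust both to match your own project:

```
# Runs each time the camera begins tracking a face
ControlCommand("Auto Position", AutoPositionAction, "Wave")

# Speak through the EZ-B's speaker; the text is stored as a string
SayEZB("Hello, I can see you!")
```

Because the script is attached to Tracking Start, it will run again each time a new face is acquired, so keep it short.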
When a face is detected, the two lines of code will be executed, causing the robot to wave and speak the chosen text.
Remember to disconnect, power off, and charge the robot when finished.