
2D Environment Mapping Code

I have almost completed my Omnibot code as an autonomous pet. I am about to describe an elegant and efficient solution to AI. Of course, giving the impression of awareness is all in reactive behaviour. By combining a few concepts and sensors, Omnibot now acts like he has a mind of his own.

For example, when he is in Sleep (or Stop) mode, his camera always follows motion or a color. This means he will turn towards the TV and watch it, or follow a person with his head.

His decisions are based on two random generators, plus some environmental input. Primarily, though, he decides what to do by a random mode selection followed by a random time selection. The random mode chooses between

  • autonomous (long, short)
  • follow motion
  • follow a color (red, green, or blue)
  • body language (turn head, nudge forward, nudge left, beep, etc)

The length of time to run each mode is randomized.
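The mode-then-time selection above can be sketched as follows. The mode names and the time range are assumptions for illustration; the real ARC script and timings will differ per robot.

```python
import random

# Hypothetical mode names mirroring the list above.
MODES = [
    "autonomous_long", "autonomous_short",
    "follow_motion",
    "follow_red", "follow_green", "follow_blue",
    "body_language",
]

def pick_behavior():
    """Randomly choose a mode, then randomly choose how long to run it."""
    mode = random.choice(MODES)
    duration = random.randint(5, 60)  # seconds; assumed range
    return mode, duration

mode, duration = pick_behavior()
print(f"Running {mode} for {duration}s")
```

Two independent random draws (what to do, and for how long) are enough to keep the behaviour from looking scripted.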

The microphone is always listening for voice commands. I set the voice command recognition threshold to 90%, so he will recognize words that aren't exact. This adds some interesting behaviour, because he responds to words that are merely similar. For example, I was having a conversation with my friend while Omnibot appeared to be watching TV. My friend said something and the robot turned around and looked at him. When he moved, the robot's head followed him. This, of course, freaked him out. Why? Because the phrase "turn around" was recognized at near 90% confidence. He didn't actually say that, but the robot thought he did and reacted. This made it appear that Omnibot did it on his own.
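The effect of a 90% threshold can be approximated with simple string similarity. This is only a sketch of the idea (ARC's recognizer works on audio, not text); the command list and scoring function here are stand-ins.

```python
from difflib import SequenceMatcher

COMMANDS = ["turn around", "follow me", "go to sleep"]
THRESHOLD = 0.90  # the 90% recognition setting from the post

def phrase_confidence(heard, command):
    """Rough similarity score between 0.0 and 1.0."""
    return SequenceMatcher(None, heard.lower(), command.lower()).ratio()

def match_command(heard):
    """Return the best-matching command if it clears the threshold."""
    best = max(COMMANDS, key=lambda c: phrase_confidence(heard, c))
    if phrase_confidence(heard, best) >= THRESHOLD:
        return best
    return None
```

A near-miss like "turn aroun" still scores above 0.90 against "turn around", which is exactly how a similar-sounding phrase in conversation can trigger a reaction.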

Now for the autonomous exploring 2D mapping code...

The concept is quite simple to implement, but requires much testing for your robot's size and speed. Some of my other robots use a much more complicated version of this concept with a third dimension in the map.

The distance sensor is attached to a servo on Omnibot's chest. The servo sways back and forth. As it scans, it records each distance in an array from left to right (or vice versa, depending on the direction). When the servo reaches the end of the scan, it performs a quick calculation on the array.
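A minimal sketch of the sweep-and-record step, assuming a sweep range of -60 to +60 degrees; `read_distance()` is a hypothetical stand-in for the real sensor call on the robot.

```python
SCAN_ANGLES = range(-60, 61, 10)  # assumed servo sweep, in degrees

def read_distance(angle):
    """Placeholder for the distance sensor reading at a servo angle."""
    return 100  # cm; replaced by real sensor input on the robot

def sweep():
    """Sweep the chest servo and record one distance per angle,
    left to right, into an array."""
    return [read_distance(a) for a in SCAN_ANGLES]

distances = sweep()  # one reading per sweep position
```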

It sums the right-side and left-side distance values across the respective positions of the scan, then compares the left and right totals against constants determined by the robot's speed and size during testing. From that comparison, it decides how to properly align itself down a hallway or enter a doorway.
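The comparison step might look like the following. The threshold constants here are made up; as the post says, on the real robot they come from testing against the robot's size and speed.

```python
CLEAR_AHEAD = 120  # assumed: minimum total clearance to keep driving
TURN_MARGIN = 30   # assumed: side-to-side difference that forces a turn

def decide(distances):
    """Split the scan array in half, sum each side, and steer toward
    the more open side (or go straight when roughly balanced)."""
    mid = len(distances) // 2
    left = sum(distances[:mid])
    right = sum(distances[mid:])
    if left + right < CLEAR_AHEAD:
        return "reverse"          # boxed in: back out
    if abs(left - right) < TURN_MARGIN:
        return "forward"          # balanced: centred in the hallway
    return "turn_left" if left > right else "turn_right"
```

When both sides read roughly equal, the robot is centred (a hallway); a large imbalance means an opening, such as a doorway, on the more distant side.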

Download the source code on his robot page to get a good look at the concept: synthiam.com/Robots/Tomy-Omnibot-V1



#9  

I've got the same error.

Did you find a solution?

#10  

Nope, I didn't. I tried lots of different ideas, then gave up as it was frustrating me. However, there have been so many developments with ARC that it's keeping me busy, and it just gets better and better. My Omnibot is in pieces at present, as he is going through a major upgrade program. I have found a solution to silence his motors, which includes anti-vibration mounts for the motor box; I'm just finishing that off.

#11  

DJ, I checked his page for the code and did not see it. Am I missing it, or was it removed?

Synthiam
#12  

It's an old post. The code for all projects (if you're looking for the source code rather than ARC project files) is in the SDK package.

Otherwise, ARC examples are in ARC under the File menu.