Perfect, got it. I'll have a look through shortly.
I was gonna use it directly in front of my Doombot's claw, amongst other areas, so it opens and closes when an object is in front of it.
Oh and @Richard, don't sell yourself short. Think logically, follow the logic, and you'll be writing awesome scripts. The only difference between me and anyone else who can follow the logic in a script is that I know pretty much all the script syntax and controls from memory.
DJ has done an awesome job with EZ-Script and ARC, so it really is very easy to do most things (so easy that I often find myself over-complicating things, only to find a simpler method after I've finished).
The hardest part of programming using ARC is the typing.
As I said, I'm insanely busy with work at the moment, so I may not have time to do much (making hay while the sun shines and all that; I need to earn lots more money to afford a 3D printer so I can get all these ideas in my head out of my head, lol). However, I am always on hand for quick and simple guidance or advice.
Yea, on my Omnibot I have distance sensors in the claw to tell when an item like a cup or can of Coke is inside, triggering it to close.
So what is it you are working on?
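The claw behaviour Josh describes boils down to a simple threshold test on the distance reading. Here is a minimal sketch of that logic in plain Python (not EZ-Script); the threshold value and function names are my own illustrative assumptions, and a real build would read the sensor and drive the servo through ARC:

```python
# Sketch of a distance-triggered claw close (assumed values, not from the post).
CLOSE_THRESHOLD_CM = 8  # hypothetical: object inside the claw reads this close or nearer

def claw_should_close(distance_cm, threshold_cm=CLOSE_THRESHOLD_CM):
    """Return True when something (a cup, a can) sits inside the claw."""
    return distance_cm <= threshold_cm

# A can at 5cm triggers a close; an empty claw reading 30cm stays open.
print(claw_should_close(5))   # True
print(claw_should_close(30))  # False
```

In practice you would poll this in a loop and add a small debounce so a hand passing by doesn't snap the claw shut.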
@Fxrtst one of my goals for my B9 is to have him extend his claw and shake hands when he detects a face in front of him. I am thinking that it would be a good idea for him to ping first and make sure that the person is not too close before he sticks his arm out and clocks someone.
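That "ping first, then extend" idea is just a gate on two conditions: a face is detected, and the ping range falls in a safe band. A hedged Python sketch of that check follows; the distance limits are illustrative assumptions (the post gives no numbers), and face detection and arm motion are stubbed out:

```python
# Sketch of the B9 handshake safety gate (assumed distances, not from the post).
SAFE_MIN_CM = 50   # hypothetical: person must be at least arm's length away
SAFE_MAX_CM = 120  # hypothetical: but close enough to actually reach the hand

def safe_to_extend(face_detected, ping_cm):
    """Only extend the claw when a face is seen at a safe distance."""
    return face_detected and SAFE_MIN_CM <= ping_cm <= SAFE_MAX_CM

print(safe_to_extend(True, 80))   # True  - face at a comfortable distance
print(safe_to_extend(True, 30))   # False - too close, would clock someone
print(safe_to_extend(False, 80))  # False - no face, keep the arm in
```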
Here is the first test of the EZ:1 wide-beam sonar array, using 4 low-cost (less than £10 for the four) automobile ultrasonic transducers. It has 4 sense zones where the beams slightly overlap each other, so most objects in front (even slim chair legs) can be detected, along with their position. In this photo, a 25mm-wide piece of plastic trunking is placed 60cm away in the inner right sense zone (it's on the right because the robot's leg section is upside down for the tests).
As you can see, only that particular sense zone detects the trunking; all the other zones show "clear", which means there are no obstacles (within the 2.5m range) in their respective sense zones.
Here I come close to the sonar array (0.5m) and all zones pick me up (I have a pretty wide chest!). You can also see the new "sense field" monitor (fourth line at the bottom of the display), which works on 3 distance zones covering 360 degrees around the robot body and detects all objects, humans, animals, etc. The two systems working together make a really nice sensor array for the robot.
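The per-zone readout described above can be modelled as a simple mapping from the four raw ranges to status strings, with anything beyond the 2.5m limit reported as "clear". A sketch in plain Python, assuming zone names and a display format of my own (only the 2.5m limit and the 60cm trunking example come from the post):

```python
# Sketch of the four-zone sonar readout (zone names/format are assumptions).
MAX_RANGE_CM = 250  # 2.5m detection limit, as stated in the post

def zone_status(readings_cm):
    """Map the four raw transducer ranges to per-zone status strings."""
    zones = ["outer left", "inner left", "inner right", "outer right"]
    return {zone: ("clear" if r > MAX_RANGE_CM else f"{r}cm")
            for zone, r in zip(zones, readings_cm)}

# Trunking at 60cm in the inner right zone, everything else out of range:
print(zone_status([300, 310, 60, 280]))
```

With the beams slightly overlapping, an object near a zone boundary would show up in two adjacent zones, which is what lets the array localise slim objects like chair legs.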
@Fxrtst, if I can be of any help to you on your project, please let me know. I would be happy to help.
Tony
I see Paris, I see France, I see EZ:1's underpants LOL
Sorry, childish I know
@Justin, LOL. @Tony, thanks very much. I'm sure there will be some upcoming questions. @Josh, was that question directed toward me, or Doombot?