Back to the community after a long time away for my graduation final exams. I'm excited to start building robots again...
I want to make a new robot that can move along a defined path on the floor. How can that be done?
I would also like the robot to take photos and automatically upload them to a Facebook page when I give the voice command "Upload to Facebook!" Can that be done?
Kindly give me suggestions about how to proceed...
Can the Kangaroo X2 be used with the EZ-B?
Just like the Record Video option, can we have a click photo option with ARC?
Can the wireless camera that comes with the EZ-B Developer's Kit take HD pictures, or are there other wireless cameras that can?
Sorry for too many questions...
While you can't upload to Facebook directly through ARC, you can use HTTPGet() or Exec() in conjunction with another application or web page to do it. You could also try linking Facebook and Twitter together so that ARC's post-to-Twitter command is mirrored to Facebook. You'd need to look into that, though; I'm not sure it's that straightforward.
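To make the Exec() route concrete, here is a rough sketch of an external helper script ARC could launch. It targets Facebook's Graph API page-photo endpoint as I understand it from Facebook's documentation; the page ID, access token, and script name are placeholders, so treat this as an illustration rather than a working uploader.

```python
# Sketch of a helper script that ARC's Exec() could launch to post a photo
# to a Facebook Page. The Graph API endpoint and token handling here are
# assumptions based on Facebook's docs -- verify them before relying on this.
import sys

GRAPH_URL = "https://graph.facebook.com"  # Graph API base URL

def build_upload_request(page_id, access_token, caption):
    """Build the URL and form fields for a Page photo upload."""
    url = "%s/%s/photos" % (GRAPH_URL, page_id)
    data = {"access_token": access_token, "caption": caption}
    return url, data

def upload_photo(page_id, access_token, caption, photo_path):
    # Imported lazily so the pure helper above still works without the
    # third-party 'requests' package installed.
    import requests
    url, data = build_upload_request(page_id, access_token, caption)
    with open(photo_path, "rb") as f:
        resp = requests.post(url, data=data, files={"source": f})
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__" and len(sys.argv) >= 4:
    # e.g. triggered from ARC after the "Upload to facebook!" voice command:
    # Exec("python", "fb_upload.py <page-id> <token> <photo-path>")
    page, token, photo = sys.argv[1:4]
    print(upload_photo(page, token, "Uploaded by my robot!", photo))
```

The split between building the request and sending it keeps the Facebook-specific details in one small function you can adjust once you've checked the current API version.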
Yes, the Kangaroo X2 can be used with the EZ-B; however, the V3 cannot receive serial data, so the EZ-B V3/ARC is unaware of the Kangaroo's feedback. The V4 can receive serial data, so it's not a problem there. This is quite a hot topic at the moment.
The Floor Map control isn't really going to help much unless I've missed something with it.
What do you mean by click photo option? You can have ARC take a photo, that's no problem. ControlCommand("Camera", CameraSnapshot) for instance is one way of doing that.
Which camera? The old one or the new one? Both are capable of taking photos, however neither is HD. Any HD camera that Windows can see would be suitable, though.
On recognizing Glyph1, move forward.
On recognizing Glyph3, turn right, then move forward.
You could put arrows on the ground and have your camera looking at the floor.
You can even make glyphs that symbolize rooms. You might be able to make a script like (once again not real code)
ControlCommand (Start glyph tracking) #This will start him moving
On Glyph Detect (Living room) Stop
I'm sure there is an example project that shows how to assign actions to recognized glyphs. I haven't messed with them at all, so I'm going off of theory here.
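The glyph-to-action idea above can be sketched as a plain dispatch table. The glyph names and the move/turn/stop actions here are stand-ins I made up; in ARC, the camera control would do the actual glyph detection and fire a script per glyph.

```python
# Plain-Python sketch of the glyph -> action mapping described above.
# Glyph names and actions are hypothetical stand-ins, not ARC API calls.

def on_glyph(glyph):
    """Map a detected glyph to a list of drive actions."""
    actions = {
        "Glyph1": ["forward"],            # arrow: keep moving forward
        "Glyph3": ["right", "forward"],   # arrow: turn right, then forward
        "LivingRoom": ["stop"],           # room marker: destination reached
    }
    # Unknown glyph (or lost tracking): stop for safety.
    return actions.get(glyph, ["stop"])
```

So `on_glyph("Glyph3")` returns `["right", "forward"]`; the same table idea extends to room glyphs like the "Living room" marker mentioned above.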
These are just core ideas and need a little more code than I have alluded to, but I think it's very doable. I have been known to make mistakes from time to time.
And just so you don't have to go digging for the glyph video: in this video the glyphs trigger a speak command. I don't see why you couldn't change that to any command you want.
And all of the camera info can be found here
Camera Device Manual
@Rich Everything I'm asking about refers only to the V4. I can only imagine the potential of the V4 controller. I shall now start my R&D on wireless HD cameras.
@Antron007 I really liked the idea of using glyphs. That can be my last resort if I can't use encoders or the Kangaroo X2.
I will keep updating the work status in this thread.
I don't know enough about encoders (until now I've pretty much dismissed them), but would they work with the Omnibot/Hearoid motors and drive train? My thinking is, if I can upgrade Melvin's drive train, then I can justify the purchase of a Kangaroo/Sabertooth/encoder setup.
With 64000 ticks per revolution, it's very accurate!
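A quick back-of-the-envelope check of that figure. The wheel diameter below is my assumption (I don't know the Omnibot's actual wheel size), but it shows how fine the resolution is:

```python
# Rough linear resolution of a 64000 tick/rev encoder.
# The 70 mm wheel diameter is a made-up, Omnibot-ish guess.
import math

TICKS_PER_REV = 64000  # figure quoted above

def distance_per_tick(wheel_diameter_mm):
    """Linear travel per encoder tick for a given wheel size."""
    circumference = math.pi * wheel_diameter_mm
    return circumference / TICKS_PER_REV

# With a hypothetical 70 mm wheel, each tick is roughly 0.0034 mm of
# travel -- far finer than any slop in the drive train, which is why
# the accuracy claim is believable.
```

In practice the drive train's backlash, not the encoder, would be the limit on real-world accuracy.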