I have been working on my telepresence robot in hopes of one day projecting myself and my thoughts to a remote location.
One of my roles is to describe my thought process, and I do this in the form of notation: I write, I draw, I depict my thoughts and concepts on a whiteboard, blackboard, or other physical medium.
As I build my telepresence robot, I feel it needs to possess the same skills that I have: to be able to pick up a marker and draw out a vision, based on my vision, that everyone else can see.
Question:
How do I build, teach, program, and inspire a telepresence robot to draw on a whiteboard exactly what I would draw if I were there in person?
Give it an arm with many axes
On the end of the arm, put a distance sensor so it knows how far to stay from the whiteboard
Make it draw whatever you draw on a canvas
It'll actually be quite easy once you build the arm and map the input canvas X/Y coordinates to whiteboard X/Y coordinates
The EZ-Script MAP() function would come in handy. Otherwise, I'd recommend building this as a plugin; there are similar Map-style functions for scalars in EZ_B.Functions and EZ_Builder.Common
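To make the mapping idea concrete, here is a minimal sketch in Python of that Map-style linear rescaling. The function names and the canvas/board dimensions (800x600 pixels, 1200x900 mm) are placeholder assumptions, not anything from EZ-Builder; EZ-Script's MAP() does the same rescaling in a single call.

```python
def map_value(value, in_min, in_max, out_min, out_max):
    """Linearly rescale value from [in_min, in_max] to [out_min, out_max]."""
    return (value - in_min) * (out_max - out_min) / (in_max - in_min) + out_min

def canvas_to_whiteboard(cx, cy, canvas_w=800, canvas_h=600,
                         board_w_mm=1200.0, board_h_mm=900.0):
    """Map a canvas pixel (cx, cy) to a whiteboard position in millimetres.
    Canvas and board sizes are placeholder assumptions for illustration."""
    bx = map_value(cx, 0, canvas_w, 0.0, board_w_mm)
    by = map_value(cy, 0, canvas_h, 0.0, board_h_mm)
    return bx, by

# Example: the centre of the canvas lands at the centre of the board.
print(canvas_to_whiteboard(400, 300))  # (600.0, 450.0)
```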
You know, actually I think I made something like this 3 or 4 years ago that converted an input X/Y drawing on a canvas to a 3- or 4-DOF servo arm...
Let me see if I can find it
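While that's being dug up, here is a rough illustration of the canvas-to-servo-angle conversion, using textbook two-link planar inverse kinematics. This is not DJ's original code; the link lengths are placeholder assumptions, and a real 3- or 4-DOF arm would add a base and wrist joint on top of this.

```python
import math

def two_link_ik(x, y, l1=200.0, l2=150.0):
    """Joint angles (radians) placing a 2-link planar arm's tip at (x, y).
    l1, l2 are link lengths in the same units as x, y (placeholders here).
    Returns the elbow-down solution; raises if (x, y) is out of reach."""
    d2 = x * x + y * y
    # Law of cosines gives the elbow angle from the target distance.
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if not -1.0 <= cos_elbow <= 1.0:
        raise ValueError("target out of reach")
    elbow = math.acos(cos_elbow)
    # Shoulder angle: direction to target, minus the offset the bent elbow adds.
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow
```

Feeding each mapped whiteboard point through something like this, then converting the resulting radians into servo positions, is the core of the canvas-to-arm conversion described above.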
Thanks for the insight DJ.
I have been playing with Dynamixels, but I think some AI and sensors in the hand (like the depth sensors and cameras you mentioned) will be needed to help make this possible.
Replicate something like this:
It seems complicated and $$$
This is not part of the robot:
Something like this attached to an arm: https://makezine.com/2015/09/24/tiny-adorable-whiteboard-drawbot/
Why not add a projector to your robot and project your handwriting or digital documents onto a whiteboard?
Thanks @ptp, a pico projector may be a better, easier solution, as you suggest. Sometimes we get caught up in the past as we try to implement the future.