Had an idea....
With keyboards being replaced by voice commands, and the increasing likelihood that we'll be able to put a PC on board our bots, I was thinking about ways to improve the sensor grid and make it a bit more advanced than the few micro switches and wires we use at present.
Are we able to take input from the keyboard into our ARC application?
My thinking is to take the contacts under the keys of a keyboard and use them as a touch-sensor grid on the outer skin of a bot.... We would have to program it so we can return to the normal entry method, of course, but it could be useful to map key presses to specific actions in EZ-Builder. ...
(I know I could map them using EventGhost, but I would rather map them within ARC and reduce the need to work in another application.... keeping it all in house, so to speak!)
Something along these lines in EZ-Script (where $Keyboard_input is a hypothetical variable holding the last key pressed):

if ($Keyboard_input = "g")
  # e.g. react to a touch on the left panel
elseif ($Keyboard_input = "h")
  # e.g. react to a touch on the right panel
endif
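Just to illustrate the dispatch idea outside of EZ-Script, here's a minimal Python sketch of the same key-to-action mapping, including a "passthrough" flag for returning to normal typing mode. The key characters and action functions are made up for illustration; they are not ARC APIs:

```python
# Sketch: map single key presses (from a repurposed keyboard matrix)
# to bot actions. Action functions are placeholders, not ARC calls.

def touch_left_panel():
    return "left panel touched"

def touch_right_panel():
    return "right panel touched"

# Hypothetical mapping of key characters to touch-sensor actions.
KEY_ACTIONS = {
    "g": touch_left_panel,
    "h": touch_right_panel,
}

def handle_key(key, passthrough=False):
    """Dispatch a key press to a touch action.

    If passthrough is True (normal typing mode), the mapping is
    ignored so the keyboard behaves as a regular input device.
    """
    if passthrough:
        return None
    action = KEY_ACTIONS.get(key)
    return action() if action else None

print(handle_key("g"))                    # left panel touched
print(handle_key("x"))                    # None (unmapped key)
print(handle_key("g", passthrough=True))  # None (normal typing)
```

The dictionary lookup keeps the mapping in one place, so adding a new "touch zone" is just one more entry rather than another elseif branch.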
What are your thoughts, oh amassed body of wisdom out there?