United Kingdom

Artificial Intelligence

Hoping this will spark a huge discussion on what everyone is looking for when it comes to their robot's AI.

AI is something I've been working on since before I even learned of EZ-Robots. My JARVIS replica from Iron Man is coming up on 3 years old this December and, while it wasn't started in ARC, over the last few months I've been porting parts of it over to ARC; anything beyond ARC's capabilities is integrated via Telnet. This includes voice-controlled media playback, voice-activated control of appliances, lights and so on, and, to be honest, far more than I can really explain right now.

Basically, up until now it has been built entirely around home automation and automated media acquisition, storage, playback and logging. Recently I have been integrating and porting parts of it into ARC, and where ARC is not capable of carrying out the actions itself, integration happens via Telnet so that ARC (and its scripts) are aware of everything they need to be aware of. For example, when media playback starts, EventGhost sends ARC the script command $mediaplayback = 1, and when it finishes it sends $mediaplayback = 0 (that's a very simple example; it also sends more information about the media). This will be demonstrated soon by Melvin when I get around to making the video of him knowing what's on TV.
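For anyone curious what that Telnet integration looks like, here's a minimal Python sketch of pushing an EZ-Script variable assignment to ARC over a TCP socket. The host, port and CR-LF line framing here are my assumptions, not confirmed values; check your own ARC connection settings before relying on them.

```python
import socket

ARC_HOST = "127.0.0.1"   # machine running ARC (assumed)
ARC_PORT = 6666          # assumed script-server port; check ARC's settings

def frame(command: str) -> bytes:
    """Assume script lines are sent as plain text terminated by CR-LF."""
    return (command + "\r\n").encode("ascii")

def send_script(command: str) -> None:
    """Push one script line (e.g. a variable assignment) to ARC."""
    with socket.create_connection((ARC_HOST, ARC_PORT), timeout=5) as sock:
        sock.sendall(frame(command))

# EventGhost-style notifications (uncomment with ARC running):
# send_script("$mediaplayback = 1")   # playback started
# send_script("$mediaplayback = 0")   # playback finished
```

The point is just that the external program only needs a raw socket and one line of text per event; ARC's scripts then react to the variable as if it were set locally.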

Like I said, so far it's mainly based around Media and Home Automation. What I want to discuss is...

What do you want in your robot's AI?

What do you want him/her to be able to do without human interaction? What do you want him/her to react or respond to? What do you want the AI to enhance? Why do you want AI?

And, for anyone who already has some kind of AI running; What does your AI add to your robot?

Hopefully this will spark up some interesting conversation, get some ideas out there, and inspire others (and myself) to push on with AI and make robots more intelligent. :)



#33  

@Toymaker

Your AI software sounds truly amazing! The ability to learn visually is an incredible leap in A.I. learning ability.

Will you make it available in the near future?

#34  

For my current work with AI I'd say my goal and design path are pretty close to what Tony described for his AI's memory association and tutor learning.

My previous AI work focused on emotional responses and chatty types of interactions. What I ended up with was something that could only learn a few things well before it became too slow.

My revised AI efforts are not focused on emotional responses. I have come to prefer designs which feature only moderate displays of human-type emotion. My main focus is on memory: how to store it, how to quickly search through it looking for connections between things over time, and how to make predictions which can be used to direct the robot. Along with that, I want my AI to have an awareness of when it needs help, which sounds similar to Tony's tutor learning.
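To make that memory-plus-prediction idea concrete, here is a toy Python sketch (purely illustrative, not anyone's actual design): it counts which observations tend to follow which, then predicts the most frequent follow-up for a given cue. The example observations ("doorbell", etc.) are made up.

```python
from collections import defaultdict

class AssociativeMemory:
    """Toy associative store: count co-occurrences over time and
    predict the most frequently seen follow-up for a cue."""

    def __init__(self):
        # links[first][then] = how often `then` followed `first`
        self.links = defaultdict(lambda: defaultdict(int))

    def observe(self, first: str, then: str) -> None:
        self.links[first][then] += 1

    def predict(self, cue: str):
        followers = self.links.get(cue)
        if not followers:
            return None   # no experience with this cue: ask for help
        return max(followers, key=followers.get)

mem = AssociativeMemory()
mem.observe("doorbell", "owner_at_door")
mem.observe("doorbell", "owner_at_door")
mem.observe("doorbell", "delivery")
# mem.predict("doorbell") now returns the most common association
```

The `predict` returning `None` for an unknown cue is one simple way to model "awareness of when it needs help": no stored association means the robot should ask its tutor rather than guess.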

United Kingdom
#35  

I have to say that HAL lasted all of 30 minutes on my PC before I removed it when I was checking out what it was and how good it was. I was not impressed in the slightest. It seemed like nothing more than a chatbot, and when Pandora is free to use, simple to set up and customise, and can be integrated directly into ARC and run script commands, I couldn't see a single reason why anyone would want to pay for something that's already free elsewhere (and, in my opinion, better).

Like I've mentioned before, chatting is only one part of AI and, to be honest, it's not the part I am looking at the most (it's a part, but not the total package).

I've yet to see something that wins me over with things like: if the robot knows you are out of the house and detects movement, it alerts you (Twitter/NMA/email/SMS) and then investigates; if the robot detects a rapid increase in heat, noise, movement, light, etc., it acts on it; automatic mapping and recording of where it has been, how to get back there, where the obstacles are, and so on.

This side of AI seems to be something that's rarely touched upon, unless I'm missing a lot of videos and websites in my searches.
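The "alert when you're out" behaviour described above is, at its core, a couple of conditionals around some callbacks. A hedged Python sketch, where `notify` and `investigate` are hypothetical hooks (send an SMS, drive the robot toward the motion, etc.):

```python
def should_alert(owner_home: bool, motion_detected: bool) -> bool:
    """Only raise an alarm when motion is seen while the owner is away."""
    return motion_detected and not owner_home

def handle_motion(owner_home: bool, motion_detected: bool,
                  notify, investigate) -> bool:
    """notify/investigate are placeholder callbacks supplied by the
    robot's own platform. Returns True if an alert was fired."""
    if should_alert(owner_home, motion_detected):
        notify("Movement detected while you are out!")
        investigate()
        return True
    return False
```

The hard part, of course, is not this logic but the sensing and navigation underneath it; the sketch just shows that the decision layer itself can stay trivially simple.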

#36  

@Rgordon, where is it that you tell UltraHal that his response is not correct, and how do you give him the correct answer? How do you activate his learn mode?

@Tony, I really love your A.I. program. A.I. has been a "hot spot" for me for over 50 years, but I have almost always been disappointed in the offerings. Denise has come closer than any program I have experienced so far. I like the ability to open up the web and seek out the information if the bot does not know it. Arial seems to do even more than Denise.

Mel

;)

#37  

So Toymaker when will this robot / these products be available?

PRO
United Kingdom
#38  

Mel, thanks. There is a lot of stuff I have not shown yet, like the NLP (natural language processing) front end. With this you can ask for things using many different phrasings and it will still work, like "turn the TV on", "turn TV on", "TV on", "switch the TV on" etc. This is handy, as before we had this I used to always forget some of the exact phrases that had to be spoken, and often said the wrong phrase; with NLP this is not a problem. Our semantic parser is now very advanced and can sift through textual data and extract knowledge elements. The other thing we have developed for the AI is "mini expert systems", where it can become a mini expert on certain subjects that the primary user is interested in. A lot of this data is taught to the AI by the tutors; it's a bit like the way a child learns. When I get time I will do more videos of this and some of the other functionality.
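The multi-phrasing robustness Tony describes can be approximated, at its very simplest, with keyword spotting: ignore filler words and look only for a known device and action. This Python sketch is just an illustration of the general idea, not how Tony's NLP front end actually works, and the vocabularies are made up.

```python
import re

# Hypothetical vocabularies; every other word ("turn", "switch",
# "the", "please", ...) is simply ignored.
DEVICES = {"tv", "lights", "radio"}
ACTIONS = {"on", "off"}

def parse(utterance: str):
    """Map many phrasings to one (device, action) intent, or None."""
    words = re.findall(r"[a-z]+", utterance.lower())
    device = next((w for w in words if w in DEVICES), None)
    action = next((w for w in words if w in ACTIONS), None)
    return (device, action) if device and action else None

# "turn the TV on", "TV on" and "switch the TV on"
# all parse to the same intent: ("tv", "on")
```

A real semantic parser does far more (grammar, context, knowledge extraction), but even this toy version shows why the user no longer has to remember one exact phrase.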

@rgordon, Integration of the AI core into the EZ:2 is a major part of the EZ:2 development, and it will be made available when the EZ:2 is rolled out.

@jstarne1, Because I had to completely redesign the 5DOF arm and develop the new ultra-high-torque servos, I am now approximately 2 months behind schedule for the start of beta testing, which should still be some time in the first half of next year. This is a huge project (the EZ:2 robot) and our team is small, so development delays such as these are inevitable with such a ground-breaking product. If beta testing goes well, it is likely that the EZ:2 robot will start to become available in the latter part of next year.

Hopefully this week I will be putting up a video of the new 5DOF (smart servo) arm, which I am now really pleased with.

Tony

#39  

@ToyMaker, I can't wait to see more. :-( Sorry about the delay.

@Rgordon, now I understand. Good work. A little problem with grammar, however.

@Rich. The reason UltraHal seemed dumb and stupid was because he WAS. When you first start, he doesn't know much of anything. You have to train him like you would a child; after that, he becomes more intelligent. But he does have the ability to LEARN, and that in itself is exciting!

#40  

I went to the Applied Machine Intelligence website and found this: "The robot also has the ability to map the areas that it's working in, with a system that I developed called 'volume occupancy mapping'. Sensors in the robot include thermal imaging and IR ranging." This could be an answer to the robot knowing where it is in the house. The website is: www.appliedmachineintelligence.co.uk/robots.html