
A Note On Ez-Ai/Rafiki

EZ-AI development is on hold right now, well kind of...

We are in the process of working with some services that will make the capabilities of EZ-AI far better than they currently are. These include Wolfram|Alpha and IBM BlueMix/Watson. Speech recognition will be performed through Nuance Cloud services, and advanced vision features will be available through OpenCV. A quick search of these services will show you the end goal of what we are doing. These will be part of the Rafiki project, which is the primary focus for CochranRobotics at this time. We will release a limited-use version for free which will replace EZ-AI. All of the current features of the EZ-AI database will be available through the new version, and all of the services provided by EZ-AI will be available through REST queries and exposed services. This will allow ARC plugins to use these services.

There have been huge changes in what is possible since I first started working on EZ-AI. This shift toward improved technologies has made it necessary to rework EZ-AI so that it can continue to grow and mature.

We are also toying with the idea of allowing programmers to write their own business logic layer within Rafiki. This would allow a programmer to be able to use the core services of Rafiki/EZ-AI and write their own applications with the data that is returned. It will probably be a while before this is implemented, but it is something that we are trying to make happen.

I have probably said too much, but I wanted to give you all a picture of what is happening and why EZ-AI isn't being worked on directly. We hope to have our new AI available around the end of the year. There are multiple developers working on this while I come up with solutions to other problems that arise.

As far as Rafiki goes, the pods are functioning great and additional code/abilities are being added almost daily. The models for the pods are being tweaked to expose the HDMI, USB, and network ports to the outside of the case. This will allow someone to connect a mouse, keyboard, and monitor to the pod and use it as a computer if they would like. The Rafiki Bot is about 1/3 of the way printed. I am making modifications to the models and reprinting some of the parts as needed. There will be 6 subsystems on this robot. 3 of these subsystems have been written and are ready to use. The other 3 subsystems can't be worked on until more of the Rafiki Bot has been printed; they are all very similar and handle motor control. I hope to have these ready in a couple of weeks. I should be able to show a demo of the Rafiki Bot in about a month, and then all of the robot programming starts. I will work on the charger base shortly after the robot has been completed and the pods are all working with their full functionality.

One more thing on EZ-AI... As a part of this rewrite, you will only need to have Java installed on your computer to install and use EZ-AI. The days of the huge install will be behind us, which should make it far better in the long run. This also allows robot platform independence. I will be working on modules in ARC to make the features of EZ-AI far more easily accessible. This will probably not be worked on until December at the earliest.

Okay, going back to my robot cave. Have a great day all.



#9  

@David.... I went to your website (haven't been in a while) and was reading up on Rafiki (especially the mobile robot platform) and its planned abilities.... Dude, if you can deliver this my Mastercard is all yours....:)... Outstanding!

#10  

Thanks Richard. We have some lofty goals. Luckily I found someone with deep pockets to fund everything. I also found a lot of people who are not only very talented, but also are excited to work on Rafiki.

There is also a lot of new technology that has come out at the right time for us.

We tested our server today with a lot of load and it performed great. This was a concern for me, but it performed far better than I expected. It worked so well for the hardware it was on (sub-$300 total for hardware and software) that I had to question if it was even really working. It was passing requests to our BlueMix application, where most of the work was happening, but the coding is really good and the traffic is really light and fast. I was blown away.

This is a fun project to work on. I really look forward to being able to share this. The work is part of the reward for me. When advances are made I really enjoy it. It makes all of the little frustrations worth it.

#11  

Here are a few pics of the control box inside the prototype Rafiki bot.

This is the brain of the Rafiki Bot. It allows simple commands to be passed from the EZ-B to the Rafiki control box to do things with the NeoPixel, move motors, get sensor readings, get the positions of the servos, and more. When we go live, the goal is to have a single board handling all of this.

The feeds to the distant parts of Rafiki are handled through USB and HDMI cables. There was one failed print in this, but I used it anyway; the way the print failed really didn't matter for what I was doing.

The power comes in the back of the box and is fed to all of the devices in the box. Keeping all of these wires enclosed makes for a much cleaner setup.

Enjoy:)

Inside the control box

User-inserted image

top of the control box

User-inserted image

connections out of the control box

User-inserted image

side of the control box

User-inserted image

#12  

Looking cool. What's the size of that control box?

#13  

It's about 12 inches long, 8 inches wide, and 4 inches tall.

The purpose of this box is to have all motor commands and sensor readings handled in a very usable, simple-language form. Here are some sample commands:

Move forward 10 inches
Drive forward 2 feet
Move forward 4 yards
Move backward 7 cm
Drive backward 2 meters
Turn right 19 degrees
Turn left 18 degrees
Raise right arm 5 degrees
Lower right arm 5 degrees
Move arms in or out
Raise both arms 60 degrees
Look up 30 degrees
Look left 4 degrees
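To give a feel for how the receiving side might handle commands like these, here is a minimal Python sketch of a parser. The grammar is my assumption from the examples above, not the actual Rafiki firmware, and it only covers the "verb target amount unit" shape (so a command like "move arms in or out" would need its own rule):

```python
import re

# Hypothetical grammar assumed from the sample commands; the real
# control box's command language may differ.
COMMAND_RE = re.compile(
    r"^(move|drive|turn|raise|lower|look)\s+"
    r"(both arms|right arm|left arm|forward|backward|left|right|up|down)\s+"
    r"(\d+(?:\.\d+)?)\s*"
    r"(inch(?:es)?|feet|yards?|cm|meters?|degrees?)$",
    re.IGNORECASE,
)

def parse_command(text):
    """Split a command like 'Move forward 10 inches' into its parts."""
    m = COMMAND_RE.match(text.strip())
    if not m:
        raise ValueError(f"unrecognized command: {text!r}")
    verb, target, amount, unit = m.groups()
    return verb.lower(), target.lower(), float(amount), unit.lower()
```

Once a command is reduced to (verb, target, amount, unit), dispatching to the right motor routine becomes a simple lookup rather than string matching scattered through the code.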

I am working on light ring commands but there are so many that making these "common speech" commands is probably not going to happen.

The sensors return text like:

Object detected 18 inches sensor B
Drop off detected front/back/right wheel

All "servo" motors can be queried for the current position of the pot in the motor.
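A quick sketch of what consuming those plain-English sensor strings could look like on the EZ-B side. The exact wording the control box emits is an assumption based on the two examples above:

```python
import re

# Assumed report formats, inferred from the sample strings in this post.
OBJECT_RE = re.compile(r"^Object detected (\d+) (inches|cm) sensor (\w+)$")
DROPOFF_RE = re.compile(r"^Drop off detected (front|back|left|right) wheel$")

def parse_sensor_report(line):
    """Return a dict describing the event, or None if unrecognized."""
    m = OBJECT_RE.match(line.strip())
    if m:
        return {"event": "object_detected",
                "distance": int(m.group(1)),
                "unit": m.group(2),
                "sensor": m.group(3)}
    m = DROPOFF_RE.match(line.strip())
    if m:
        return {"event": "drop_off", "wheel": m.group(1)}
    return None
```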

When a request is sent to this box, it will report back when the action is completed or if there was an issue completing the action. This is done by querying the position of the pot or encoder of the motor. It is also smart enough to stop an action if a sensor detects a drop off, and then recalculate a route to get to the location that you requested. If a route can't be found, the EZ-B will be notified via a serial command.
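That send-a-command, wait-for-the-result flow could be sketched like this in Python. The reply strings "done" and "failed: ..." are my assumptions, not the actual wire format, and `transport` stands in for whatever serial port object is used in practice:

```python
class CommandError(Exception):
    """Raised when the control box reports it could not complete an action."""

def send_and_wait(transport, command):
    """Send one plain-language command and block until the control box
    reports the result. `transport` is anything with write()/readline()
    (a serial port in practice). The 'done'/'failed: ...' replies are
    assumptions about the protocol."""
    transport.write((command + "\n").encode("ascii"))
    while True:
        reply = transport.readline().decode("ascii").strip()
        if reply == "done":
            return
        if reply.startswith("failed"):
            raise CommandError(reply)
        # Anything else is treated as an interim status line; keep waiting.
```

Because the box only replies once per request, a script can fire a command and simply block on this call instead of polling sensor values itself.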

This box is also running the code for the room mapping and path finding that the robot uses. You will be able to say "Hey Rafiki, come here." The pod that is closest to you will pick up this speech, and the robot will know which room it is in and which room it needs to go to. All of these features are outside of ARC, so customizing them will not be possible. You will simply send serial commands to this device in a simple language from the EZ-B. The EZ-B will be notified when the task is complete. All sensor readings will come back to the EZ-B in plain English with the information from the sensors.

The communication back to the EZ-B will be very light and will not be repeated multiple times. This should keep the EZ-B very expandable with any other sensors that the user of Rafiki wants to add to it. These sensors can be programmed through ARC without having to worry about how someone would then tie actions of a robot back into these sensor readings. Creating scripts for use with this control module will be very easy to do.

#14  

Very cool. That's a lot of progress!

#15  

I for one am looking forward to your release. It sounds incredible!

:-)

#16  

David, I went to your site. It let me register, but did not let me sign in.

Mel

P.S. glad you got funding. You Really deserve it.