Rubik's Cube Solving Robot

bhouston

Canada

I found this robot on Thingiverse (www.thingiverse.com/thing:2471044) and thought it would be a fun project. Check it out; they've really created a great robot. Printing it takes about 70 hrs., and then you need to go shopping for baking supplies (Raspberry Pi's and such). So I thought, why not make it run with EZ-Robot hardware (most people here have lots of that) and software? I modified the camera holder to accept an EZ-Robot camera and the arms to accept EZ-Robot HDD servo horns, and connected everything to an IoTiny. Building it was pretty straightforward.
Programming it to solve the cube was another matter, so I got ahold of forum member ptp and asked if he would be interested in helping out, and he was. He doesn't have a 3D printer, however, so I built him the robot and sent it to him. He has been busy working on an EZ-Robot plugin to solve the cube as well as calibrate the arms and grippers.
We are hoping to have the plugin ready to share by the end of the month, so start printing. This would be a fun project for both kids and adults.
We'll keep you posted.



PRO
Synthiam
#1  
Ooooh, I saw you post an EZ-Cloud app about this and wondered when we'd hear more. This is very exciting :D
United Kingdom
#2  
@Bob, I saw this on Thingiverse a while ago and thought about building it. It wasn't until I read into it more that I realized that to get the software to make it fully functional you had to subscribe and pay for it.

It’s absolutely fantastic you have modified it for the EZB and that ptp is creating the plugin and the calibration for this, as it looks extremely complex.

I take my hat off to both of you guys, this is GREAT work!

Chris.
PRO
Canada
#3  
Cool project. I wonder whether the EZ-Robot Six, fitted with new claws, would have the range of motion and clearance to solve a cube. Remove two legs, move the extra four servos to the other four legs (one each), move the camera, then add the new claws and a stand.
Canada
#4  
Very nice, Bob. That's a lot of work. I'll definitely put that on my build list. I see a discarded InMoov arm off to the right. Did your InMoov have to donate his hand servos? :)
PRO
USA
#5  
Thanks everyone for the support!

A special thanks to @Bob for a great challenge, and thank you for your donation (print time, print materials, and shipping!).

Another thank you to the original creators (otvinta3d).

The build is well documented. Bob made some adaptations for the EZ-Camera, adjusted the grippers, and is still working on more adjustments.

I hit some showstoppers:

I recommend the EZ-Robot HDD (digital) servos. Unfortunately I ran out of EZ-Robot HDD servos, so I'm using half and half and testing some alternatives. So far the best option is the EZ-Robot HDDs: no buzz and smooth operation.

I broke an EZ-Robot camera trying to fit it inside a shell case, so please be careful when adapting the existing hardware.

The initial grippers are narrow; I filed them, but they are not even, so they deform the cube.

Plugin road-map:
I'm working on solving the cube, but the plugin can also be used to execute cube rotations, query vision details, and develop your own solving solution.
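For the "execute cube rotations" part of the roadmap, here's a minimal sketch (in Python, purely illustrative; the real plugin is C#) of the move layer: turning standard Singmaster notation into (face, quarter-turn) primitives a gripper sequencer could consume. `parse_moves` and its output format are my invention, not the plugin's API.

```python
# Hypothetical sketch: parse standard cube notation ("R U R' U2") into
# (face, quarter_turns) pairs. A gripper sequencer would translate each
# pair into grip/rotate/release actions; that hardware layer is not shown.

def parse_moves(sequence):
    moves = []
    for token in sequence.split():
        face = token[0]
        if face not in "UDLRFB":
            raise ValueError("unknown face: " + token)
        if token.endswith("'"):
            quarter_turns = 3      # counter-clockwise = three CW quarter turns
        elif token.endswith("2"):
            quarter_turns = 2      # half turn
        else:
            quarter_turns = 1      # single clockwise quarter turn
        moves.append((face, quarter_turns))
    return moves
```

For example, `parse_moves("R U R' U2")` yields `[("R", 1), ("U", 1), ("R", 3), ("U", 2)]`, which is enough for both replaying scrambles and executing a solver's output.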
PRO
USA
#6  
@Nink

Regarding the Six and other similar ideas: nothing is impossible, but we can't forget we are using hobby servos without any position/torque feedback and with limited accuracy.

Under those conditions, it is almost impossible to dynamically align the movements on the fly.

If you can create a setup and replicate repetitive movements without losing position, we can move on to the next problem: software.
PRO
Canada
#7  
OK, great point @ptp. I was just wondering how others could participate easily. I can't envision EZ-Robot producing a robot cube solver, but I could envision a claw that snaps onto an existing model.

The calibration issue is a concern. The Dynamixel sensors make it easy to calibrate, but adding pots to obtain sensor data from hobby servos is not a simple process. I was able to pull position data from the Meccanoid servos, but they have no torque, so they're not ideal.

I think we need to solve the calibration issue in general. The home position that EZ-Robot uses is great, but it is a complex manual process to align every servo, and they quickly become misaligned.

There are smart folks here. How can we auto-calibrate hobby servos? Maybe it is an optical sensor alignment add-on: the servos move until we get a lock. This could be as simple as an infrared reflector on the servo arm that an infrared LED fires at and a sensor picks up. (This could be quite accurate.) Maybe a rough estimate could be done with visual recognition and the EZ-B camera. Triangulation may be as simple as two cameras, or perhaps one camera and two markers on the servo arm.

I have not given the external calibration issue any real thought, but I have to think it is solvable. Maybe not for this problem, but an auto-calibrate when ARC powers on would be ideal.
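The optical-lock idea above can be sketched very simply: sweep the servo across a small window, sample the reflectance sensor at each step, and take the peak as the true center. Everything here is hypothetical; `move_servo` and `read_ir` stand in for whatever real hardware calls you have.

```python
# Hypothetical sketch of IR-reflector auto-calibration: sweep the servo
# around its nominal center, record where the reflectance reading peaks,
# and return the offset to apply to every future position command.

def find_offset(move_servo, read_ir, center=90, window=10):
    best_pos, best_reading = center, -1
    for pos in range(center - window, center + window + 1):
        move_servo(pos)          # step the servo one degree at a time
        reading = read_ir()      # sample the reflectance sensor
        if reading > best_reading:
            best_pos, best_reading = pos, reading
    return best_pos - center     # correction to apply to every command

# Simulated usage: pretend the reflector actually sits at 93 degrees.
if __name__ == "__main__":
    true_center = 93
    state = {"pos": 90}
    def move_servo(p): state["pos"] = p
    def read_ir(): return 100 - abs(state["pos"] - true_center)
    print(find_offset(move_servo, read_ir))  # prints 3
```

A real version would sweep more finely near the peak and average several sensor samples per step to reject noise.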
#8  
I have done this project using the Pololu servo controller, but I'm willing to help out where I can here.

If you're looking for serial/feedback servos, take a look at the LewanSoul LX-16A Full Metal 17 kg High Torque Serial Bus Servo (I don't mean to take away from the good HDD servos from EZ-Robot).
PRO
USA
#9  
Welcome back:)

This is Bob's idea; thanks to him, I have another project to spend my time on :)

The main objective is to use only EZ-Robot hardware, and the HDD servos are the best fit.

Regarding your tip: it seems very interesting for a good price, although I don't know if they have standard servo dimensions. Serial is nice (fewer cables), and the feedback is very interesting, though so far there's no need for it. Still, 240 degrees of rotation would help the gripper rotate both ways (CW & CCW).

Parallax released a new continuous servo with 360-degree feedback. Very interesting: you can run multiple turns and obtain an absolute position from 0 to 360, or, if you hack the servo and reprogram the sensor, you can obtain 12-bit resolution (0-4095 positions). I'm working on a Parallax Propeller firmware and an EZB/ARC skill plugin.

Bob is working on the gripper model to be able to obtain full 180 degrees.

An update is coming soon...
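For readers curious how that feedback servo reports position: it emits a PWM feedback line whose duty cycle encodes the shaft angle. The sketch below shows the decoding math; the `DC_MIN`/`DC_MAX` constants are approximate nominal values and should be treated as assumptions to calibrate against the real hardware, not guaranteed figures.

```python
# Sketch of decoding a duty-cycle feedback servo (e.g. the Parallax
# Feedback 360). The duty-cycle endpoints below are rough nominal values
# (my assumption); calibrate them on real hardware before trusting them.

DC_MIN = 0.029   # assumed duty cycle at 0 degrees
DC_MAX = 0.971   # assumed duty cycle just before wrapping back to 0

def duty_to_angle(high_us, cycle_us):
    """Convert one measured pulse (high time / full cycle) to 0..360."""
    dc = high_us / cycle_us
    dc = min(max(dc, DC_MIN), DC_MAX)            # clamp to the valid range
    return (dc - DC_MIN) * 360.0 / (DC_MAX - DC_MIN)

def track_turns(angle, prev_angle, turns):
    """Accumulate full turns so multi-rotation positions stay absolute."""
    if prev_angle > 270 and angle < 90:
        turns += 1                                # wrapped forward past 360
    elif prev_angle < 90 and angle > 270:
        turns -= 1                                # wrapped backward past 0
    return turns, turns * 360.0 + angle
```

Accumulating turns this way is what gives the "absolute position over multiple rotations" behavior mentioned above: the wrap detection only works if you sample faster than a quarter turn per reading.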
#10  
For the cube solver, I just used DS3218 and HS311
PRO
Canada
#11  
@69developer I took a look at the LewanSoul LX-16A. It looks like it uses a proprietary bus controller. Were you able to get the EZB to talk to their bus controller (the controller appears to support TX/RX serial, so theoretically we should be able to connect via the UART port), or could you get the servos to work directly off the EZB controller (Dynamixel compatible?)?

Edit: I ordered a starter pack (5 servos and a controller) from Amazon for $110 Canadian (on amazon.com it's $89 US). The servos are $20 CDN each if you buy them separately ($14 US). I will report back with my findings, but if they work directly off the EZB it would be huge and would solve a lot of problems.
PRO
USA
#12  
@Nink,
Those servos are not new here in the forum; the first time I spotted them was in this thread:
https://www.ez-robot.com/Community/Forum/Thread?threadId=10643

To avoid hijacking Bob's thread (cube), let's use the above thread for information related to the LewanSoul servos.

Today I'm in the kitchen cooking (Thanksgiving), so I have limited computer access.
PRO
USA
#13  
Testing Color Detection:


ARC Project:
TestCubeVision01.EZB

Automatic Color Detection is not easy.

No servos are needed to run this project.

Feedback is welcome.

Soon I'll start again on the Rubik's cube (ROBOT).

I would like to thank:

1) Bob, for the new grippers; I damaged the initial ones.

2) A member of the forum, for a generous donation: an EZ-Robot camera and 2 HDD servos.

The above tests were done with a Microsoft USB camera; the next step is to test the new EZ-Robot camera (I'll be more careful this time!)

Thanks for reading!
PRO
Synthiam
#14  
Are you using the multicolor or the YCbCr color detection? Try enabling the histogram on those detections; that'll assist with lighting conditions and angles.

The good news is that the camera and cube would always be the same angle in the real version - unlike holding it with your hand at random angles. So lighting would be more consistent.
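To illustrate why a histogram option helps with lighting (the internals of ARC's option aren't documented here; this is just the general idea): equalizing the brightness channel stretches a cramped exposure over the full 0..255 range, so fixed color thresholds behave more consistently across lighting conditions. A pure-Python sketch:

```python
# Sketch of histogram equalization on a single brightness channel.
# A dark, low-contrast image (values bunched in a narrow band) gets
# remapped so its values span the full 0..255 range.

def equalize(pixels):
    """Histogram-equalize a flat list of 0..255 integer values."""
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    cdf, total = [], 0
    for h in hist:                 # cumulative distribution of intensities
        total += h
        cdf.append(total)
    cdf_min = min(c for c in cdf if c > 0)
    n = len(pixels)
    # Standard equalization formula: map cdf linearly onto 0..255.
    lut = [round((c - cdf_min) * 255 / (n - cdf_min)) if c else 0 for c in cdf]
    return [lut[p] for p in pixels]
```

Feeding it a dim strip of values 50..100 returns values spanning 0..255, which is exactly the "more consistent thresholds under varying lighting" effect described above. Real pipelines use an optimized library routine, but the math is the same.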
#15  
I am looking forward to seeing an InMoov solving the puzzle!
PRO
USA
#16  
@DJ,
The detection is done for each square.

I'm using HSV colors:

User-inserted image


User-inserted image


Hue is stored as 360/2 = 0..179 (to fit in a byte)

Challenges:
1) Detecting red/orange/yellow: they are very close, e.g. the distance from orange to red and to yellow is only 7.5 points.

2) White
There is no white on the color wheel, so we need to look at saturation and value too; that is why blue is being recognized as white.

3) The middle square with the brand sticker.
It's not only a color issue but a contour issue too... what a pain in the neck.

I haven't tried combining the HSL model:
User-inserted image


Regarding the angles:
Assuming 0 is the middle, I need to detect the perspective; the problem is telling left from right. I need to review the calculation: I'm using an average angle, but after a specific rotation the cube corners (TL-BR) swap direction and the global perspective is affected.

Quote:


The good news is that the camera and cube would always be the same angle in the real version - unlike holding it with your hand at random angles. So lighting would be more consistent.


Correct, I'm testing the worst-case scenario.

Regarding the colors: the problem is not finding 6 different colors; the main problem is giving each one a name.

I saw some Rubik's cube videos on the internet where they scan the 6 faces and do the color clustering afterwards. That makes it easier: divide the stickers into colors 1-6 by color distance and present a BGR color square on the screen, instead of identifying each color by name.

I believe I'll handle both approaches:

A) Cube robot: color distances (A-B-C-D-E-F) for solving.

B) Free movements: color recognition. If I can't find an automatic model, the user will need to adjust the intervals.

This is like magic... if the robot solves the cube without ever naming the colors, you still get the impression it identifies each color.
PRO
USA
#17  
@Andy,

The InMoov project is almost a lifetime project. I still have some questions:
1) Finger control & usability
2) Are the legs for walking?
#18  
I know this is a challenging project. Way over my head if I had tried to conceive it. However I looked at your coding to try to learn something. You make it look so easy. What a wonderful process. :)
PRO
USA
#19  
@Dave,

Thanks for the words.

The idea is to make other challenges more EZ :)

I believe you are talking about the EZ-Script:

Code:


:loop
WaitForChange($CubeVisionSequenceCounter)
SayWait("I see ")
REPEAT($x, 0, 8, 1)
  $cell = $CubeVisionColors[$x]

  if ($cell = 1)
    $color = "Blue"
  ElseIf ($cell = 2)
    $color = "Red"
  ElseIf ($cell = 3)
    $color = "Orange"
  ElseIf ($cell = 4)
    $color = "Yellow"
  ElseIf ($cell = 5)
    $color = "Green"
  ElseIf ($cell = 6)
    $color = "White"
  endif

  SayWait($color)
ENDREPEAT
goto(loop)


I used the KISS principle (https://en.wikipedia.org/wiki/KISS_principle) :)

The $CubeVisionSequenceCounter is incremented every time a completely different sequence is found.

So the idea is to monitor that variable and do something after.

@DJ:
I couldn't find a Blockly equivalent for WaitForChange.
PRO
USA
#20  
I always tell my friends that robots are more software than hardware.

There are so many sensors, but what makes them unique is the software (Firmware , API, Framework or Program).

One good example is the 3d camera:
1) PrimeSense/ Microsoft Kinect1/ Asus (1st generation) (Gone)
2) Microsoft Kinect2 (Will be gone after 2018)
3) Intel Realsense Cameras
4) Orbbec Cameras
5) Google Tango

All of them have more or less the same sensors, but... only the 1st generation and the Microsoft Kinect V2 handle skeleton tracking.

So the software is the real thing.
#21  
KISS, yes. That's a concept I struggle with. Most of the time I overbuild and overthink. It always works out better when I later reverse engineer and remove all the complexity I worked so hard to implement. LOL *eek*

EZ-Robot and the rest of us are fortunate that guys like you have chosen to stick around here and show us amateurs the path. ;)

Respect. ;)
PRO
Synthiam
#22  
Use either one of these, and they can include white. They have a histogram option, which may provide more consistency in varying lighting conditions.

1) Simply create a profile for each color, adjust the sliders to isolate the color and give it a name.
2) The camera control can be configured to specify a maximum number of objects to detect (in this case 9)
3) Your plugin can watch the variables and see the location and color of each variable.
4) Sort the variables by X ascending and Y ascending using LINQ

User-inserted image
PRO
Synthiam
#23  
Voila, here you go. Once you create a profile for each color and give them a name, this will sort them for you...

Code:


class Square {

public enum ColorListEnum {
Red,
Green,
Blue,
White
}

public ColorListEnum Color;
public int X;
public int Y;

public Square(string color, int x, int y) {

if (color == "red")
Color = ColorListEnum.Red;
else if (color == "green")
Color = ColorListEnum.Green;
else if (color == "blue")
Color = ColorListEnum.Blue;
else if (color == "white")
Color = ColorListEnum.White;
else
throw new Exception(string.Format("Unknown color {0}", color));

X = x;
Y = y;
}
}

Square[] GetCubeColors() {

if (EZ_Builder.Scripting.VariableManager.GetVariable("$CameraIsTracking") != "1")
throw new Exception("Camera has not detected any objects");

int count = Convert.ToInt16(EZ_Builder.Scripting.VariableManager.GetVariable("$CameraObjectCount"));

if (count != 9)
throw new Exception(string.Format("Camera only detected {0}/9 squares", count));

// Get all the variables and put them into our class of colors with X and Y
Square[] squareList = new Square[9];
for (int i = 0; i < count; i++) {

string colorName;
int x;
int y;

if (i == 0) {

colorName = EZ_Builder.Scripting.VariableManager.GetVariable("$CameraObjectColor");
x = Convert.ToInt16(EZ_Builder.Scripting.VariableManager.GetVariable("$CameraObjectX"));
y = Convert.ToInt16(EZ_Builder.Scripting.VariableManager.GetVariable("$CameraObjectY"));
} else {

colorName = EZ_Builder.Scripting.VariableManager.GetVariable("$CameraObjectColor_" + i.ToString());
x = Convert.ToInt16(EZ_Builder.Scripting.VariableManager.GetVariable("$CameraObjectX_" + i.ToString()));
y = Convert.ToInt16(EZ_Builder.Scripting.VariableManager.GetVariable("$CameraObjectY_" + i.ToString()));
}

squareList[i] = new Square(colorName, x, y);
}

// Sort the list of colors by X ascending and Y ascending
var sorted = from o in squareList orderby o.X ascending, o.Y ascending select o;

return sorted.ToArray();
}
PRO
Synthiam
#24  
One last thing: you can also do this entirely inside the plugin, rather than depending on people creating Color Profiles. If you'd like to see how, let me know and I can show you the EZ_B.CameraDetection namespace.

Simply do something like this...

Code:


EZ_Builder.UCForms.FormCameraDevice _cameraControl;

void detach() {

if (_cameraControl != null) {

EZ_Builder.EZBManager.Log("Detaching from {0}", _cameraControl.Text);

_cameraControl.Camera.OnNewFrame -= Camera_OnNewFrame;

_cameraControl = null;
}

Invokers.SetEnabled(button3, true);
Invokers.SetText(button3, "Attach Camera");
}

void attach() {

detach();

Control[] cameras = EZ_Builder.EZBManager.FormMain.GetControlByType(typeof(EZ_Builder.UCForms.FormCameraDevice));

if (cameras.Length == 0) {

MessageBox.Show("There are no camera controls in this project.");

return;
}

foreach (EZ_Builder.UCForms.FormCameraDevice camera in cameras)
if (camera.Camera.IsActive) {

_cameraControl = camera;

_cameraControl.Camera.OnNewFrame += Camera_OnNewFrame;

EZ_Builder.EZBManager.Log("Attached to: {0}", _cameraControl.Text);

Invokers.SetEnabled(button3, true);

Invokers.SetText(button3, "Detach Camera");

return;
}

MessageBox.Show("There are no active cameras in this project. This control will connect to the first active camera that it detects in the project");
}

class Square {

public enum ColorListEnum {
Red,
Green,
Blue,
White
}

public ColorListEnum Color;
public int X;
public int Y;

public Square(string color, int x, int y) {

if (color == "red")
Color = ColorListEnum.Red;
else if (color == "green")
Color = ColorListEnum.Green;
else if (color == "blue")
Color = ColorListEnum.Blue;
else if (color == "white")
Color = ColorListEnum.White;
else
throw new Exception(string.Format("Unknown color {0}", color));

X = x;
Y = y;
}
}

Square [] CubeColors;

// Set up your own color definition profiles or let the user adjust them
EZ_B.Classes.CustomColorConfig [] _colorConfigs = new EZ_B.Classes.CustomColorConfig[] {
new EZ_B.Classes.CustomColorConfig("red",true, 10, 10, 50, 0.2f, 0.2f, 0.1f, 0.9f),
new EZ_B.Classes.CustomColorConfig("green",true, 10, 10, 50, 0.2f, 0.2f, 0.1f, 0.9f),
new EZ_B.Classes.CustomColorConfig("white",true, 10, 10, 50, 0.2f, 0.2f, 0.1f, 0.9f),
new EZ_B.Classes.CustomColorConfig("blue",true, 10, 10, 50, 0.2f, 0.2f, 0.1f, 0.9f)
};

void Camera_OnNewFrame() {

if (_isClosing)
return;

List<EZ_B.ObjectLocation> detectedObjects = new List<EZ_B.ObjectLocation>();

foreach (var colorConfig in _colorConfigs)
detectedObjects.AddRange(_cameraControl.Camera.CameraCustomColorDetection.GetObjectLocationByColor(true, colorConfig));

if (detectedObjects.Count != 9)
throw new Exception(string.Format("Camera only detected {0}/9 squares", detectedObjects.Count));

// Get all the variables and put them into our class of colors with X and Y
Square[] squareList = new Square[9];
for (int i = 0; i < detectedObjects.Count; i++) {

var detectedObject = detectedObjects[i];

squareList[i] = new Square(detectedObject.ColorName, detectedObject.CenterX, detectedObject.CenterY);
}

// Sort the list of colors by X ascending and Y ascending and assign to a global variable for using elsewhere
CubeColors = (from o in squareList orderby o.X ascending, o.Y ascending select o).ToArray();
}
PRO
USA
#25  
@DJ,

Thanks, the information is useful.

I didn't want to reinvent the wheel... I had the idea to explore the color profiles, so we are on the same page :)

btw I'm already using the camera guidelines to reduce the ROI area.
PRO
Synthiam
#26  
Oh, almost forgot... to edit the color filters for each color, this is how you do it...

Code:


EZ_B.Classes.CustomColorConfig GetColor(EZ_B.Classes.CustomColorConfig color) {

string colorName = color.ColorName;

using (EZ_Builder.UCForms.FormCameraCustomColor fc = new EZ_Builder.UCForms.FormCameraCustomColor(_cameraControl.Camera)) {

fc.Configuration = color;

if (fc.ShowDialog() == DialogResult.OK) {

// set the color name to the original in case the user changed it
fc.Configuration.ColorName = colorName;

return fc.Configuration;
}

return color;
}
}
PRO
USA
#27  
Guys,

The plugin is 99% done.

Demo:


PS: Sound level is low.
PRO
Synthiam
#28  
Woooo! Now THAT is a plugin. Maaaaan, what a great implementation. You have blown me away, man.
#29  
@ptp,

Great Job As Always...., You are unbelievable, and you have surpassed yourself, AGAIN....

Ron
#30  
Ya, looks great, Pedro. I'll be playing with it today.
PRO
USA
#31  
@DJ/All,
Thanks for the incentive.

The EZ-Robot HDD servos are a must for this project.

Initially I used only 4; after a few minutes I changed my mind and borrowed (no way back) the other 4 from another project. The robot/machine sits near my monitor and I code during the nights; I could not concentrate with the analog servos buzzing in my ears.

All,
Thanks for watching. It's a cool robot to demo...

I'm involved in a few meetups, so it's a good opportunity to talk about robots and discuss "boring" stuff, e.g. C#, C++, OpenCV.

It's amazing what you can build with a 3D printer, 8 servos, a camera, a WiFi controller, and ARC!
#32  
ptp, I can't expand the control window for the plugin to see all of the controls. Any thoughts?
User-inserted image
PRO
USA
#33  
Last-minute change... fixed! Please update the plugin.
I'm on Skype if you need help.
PRO
Canada
#34  
Wow! This is amazing. Great work PTP