
VR Options?

Does anyone know if the Oculus Rift SDK would work with the EZ-B if a plug-in were developed for it? It doesn't really need to be something as big and bulky as that. It's very unfortunate that Vuzix no longer offers support with the EZ-B; that would have worked great.

Like the Vuzix, I'd like to pull the IMU data to move servos and transfer the camera feed from the EZ camera to the goggles across two completely separate computers in different places.

Any ideas or suggestions on how to POC this would be appreciated.



PRO
USA
#2  

Hmm, wonder how I missed this? It hasn't been updated in some time. I'll have a look and see if this will work for me.

If this was for the DK1, what version did you have and test with?

Canada
#3  

The latest one I tried was the 0.8.0.0 runtime

https://developer.oculus.com/downloads/package/oculus-runtime-for-windows/

Sorry DJ, I have the DK2 version, not the DK1.

PRO
USA
#5  

I don't like that you have to tilt your head for rotational values. Wonder if there is a better solution.

PRO
Synthiam
#6  

There is a better solution if you use the tracking sensors. They're external sensors that you set up around the environment. Googling the products you're looking at will give more detail. The only headset I'm aware of that doesn't use external sensors is the Windows Mixed Reality one.

You'd have to look into their SDK to create a plugin for ARC.

PRO
USA
#7  

I've been looking at these FatShark headsets for flying drones. They have built-in pan and tilt head tracking for 2 servos and a camera feed. Wonder if these could be modified to work like the other headset plug-ins? Not so cheap tho.

PRO
USA
#8  

Ha, we posted at the same time! Ok, sounds good, I will look them up.

PRO
USA
#9  

Creating a POC with Alan and Alena to be used as long distance Avatars. Business guy in Tokyo calls into corporate headquarters for meeting. At the headquarters is an Alan in the meeting room. Guy in Tokyo has on VR headset and can see EZ robot camera image from Alan live, and as the guy moves his head, Alan moves his head in unison so he can see everybody at the meeting. As he speaks so does Alan using our audio servo control board.

PRO
Synthiam
#10  

Ya, you'd need a plugin to do that. I'd probably start with a viewer compiled with Unity that sends servo movements and receives camera data to and from ARC.

Unity is like ezrobot but for graphics programs. Lots of people call us the Unity of robotics. If you can use ezrobot, you can use Unity:) so they say!

All I know is that Unity has a pile of built-in VR support.

PRO
USA
#11  

Ok great. I’m all about graphics. I have a version installing now. I also have a couple of local coders that are helping with this. Will post the plugin when finished.

PRO
USA
#12  

Unity also has a lot of great support for BVH motion files, and I might be able to get the whole motion capture thing happening within the Unity/EZ-Robot environment.

PRO
Synthiam
#13  

That’ll be awesome! I’d use that plugin for sure.

PRO
Canada
#14  

Unity would be awesome, especially if we could simulate our builds before we build them. I used to work a lot in OpenSim (an open source version of Second Life), which allows you to create 3D models and interact with them using things like the Vuzix, Oculus, etc. It has a modeling program and a physics engine, and is written in C#/.NET.

What I like about OpenSim or Second Life is you can quickly and easily build 3D models using prims, then script them and integrate with them using an API, so building and scripting servos, sensors, etc. is quite easy. You can also import your own 3D models. Support for other avatars is available and there is a fairly large virtual world community.

PRO
Synthiam
#15  

Quite a bit off topic:) but you just described Blender... and it's free!

Import STL files (ezrobot provides all of ours for every product) and voila, simulate in Blender!

PRO
USA
#16  

Blender is a full-featured program built on Python. It's very powerful and can be extended with plug-ins written in Python.

Unity, after taking a look, is more C++ and C# based. I did find a plug-in for Unity that ties into an Arduino and will allow you to control servos from primitives and vice versa. Pretty darn cool, and all live in real time. And it writes the code for you. Check out some of these examples. We should be able to control this with the EZ-B?!

PRO
Synthiam
#17  

If it ties to Arduino, then it'll connect to ARC. Post the project details and a plugin would be really easy to make. It'll connect via a serial connection, which we can hijack as well.

PRO
USA
#18  

This is pretty exciting. I’m officially on spring break with family. But will get to it when I get back!

PRO
USA
#19  

OK, got this working today. I will make a video or do a write-up to get this going. It's pretty straightforward with Arduino, but if you are unfamiliar with 3D programs, finding things can be a pain. So some sort of visual reference will help.

Everything is free (Unity and the plug-in). I'll post a download link to the project file for Unity soon, but it's better to get a jump on it and install Unity (make an account, etc.) and then grab the plug-in.

Unity

Ardunity Plug in

PRO
Synthiam
#20  

I don’t see any code - as requested. If you want this to work with ezrobot, share the Arduino code or communication protocol. Otherwise, this is the wrong forum :)

PRO
USA
#21  

Some days it's just impossible to post here, today being one of them. I already said in my last post that I was getting the files together, which would be available as a download. I suggested downloading Unity and the plug-in for you to test on. Without them it would be difficult to understand how they work together.

And for the record

Quote:

Post the project details
does not say post the code.

PRO
USA
#22  

This code does you no good without the .h libraries and the .cpp files for the plug-in, as suggested in the download links.

#include <Servo.h>
#include "Ardunity.h"
#include "GenericServo.h"

GenericServo servo0(0, 2, false);

void setup()
{
  ArdunityApp.attachController((ArdunityController*)&servo0);
  ArdunityApp.resolution(256, 1024);
  ArdunityApp.timeout(5000);
  ArdunityApp.begin(115200);
}

void loop()
{
  ArdunityApp.process();
}
PRO
Synthiam
#23  

you're right, that code won't help :)

I won't be installing Unity or reproducing what you have done. What I will do is look at the communication protocol from their code and reproduce it in a plugin.

PRO
USA
#24  

This will really open up a whole new world of robotics if you can get a plug-in going and get the EZ-B to connect with Unity. Unity is free, and everyone from kids up to directors of feature films is using it.

I'd certainly be up for creating a video series on using them together for robotics.

PRO
Synthiam
#25  

Nice!

Are there any libraries that the Unity asset came with, other than the code you posted? Because if it communicates to an Arduino over serial, I can easily replicate it.

PRO
USA
#26  

Yes:

Here is an image with all the libs. I will add them one by one as code in the following posts:

User-inserted image

PRO
USA
#27  
/*
  Ardunity.cpp - Ardunity Arduino library
  Copyright (C) 2015 ojh6t3k.  All rights reserved.
*/


//******************************************************************************
//* Includes
//******************************************************************************

#include "Ardunity.h"
#include "HardwareSerial.h"

extern "C" {
#include <string.h>
#include <stdlib.h>
}

//#define DEBUG_ARDUNITYAPP

//******************************************************************************
//* Constructors
//******************************************************************************

ArdunityAppClass::ArdunityAppClass()
{	
	firstController = 0;
    connected = false;
    timeoutMillis = 5000; // default timeout
}

//******************************************************************************
//* Public Methods
//******************************************************************************

void ArdunityAppClass::begin()
{
	commSocket = 0;
    bypassSocket = 0;
	readyReceived = false;
    bypassProcessUpdate = 0;
	processUpdate = 0;
    connected = false;
	Reset();
    
#ifdef DEBUG_ARDUNITYAPP
    Serial.begin(9600);
    Serial.println("\nDebug ArdunityApp");
#endif
}

void ArdunityAppClass::begin(long speed)
{
	begin();
	Serial.begin(speed);
	commSocket = (Stream*)&Serial;
	
	commSocket->write(CMD_RESET);
    commSocket->flush();
}

void ArdunityAppClass::begin(Stream *s)
{
	begin();
	commSocket = s;

	commSocket->write(CMD_RESET);
    commSocket->flush();
}

void ArdunityAppClass::begin(Stream *s, Stream *s1)
{
    begin();
    commSocket = s;
    bypassSocket = s1;
    
    commSocket->write(CMD_RESET);
    commSocket->flush();
}

void ArdunityAppClass::resolution(int pwm, int adc)
{
#if defined(__SAM3X8E__) || defined(_VARIANT_ARDUINO_ZERO_) // only DUE or ZERO
	maxPWM = pwm - 1;
	int bit = 0;
	int value = maxPWM;
	while(value > 0)
	{
		value = (int)(value >> 1);
		bit++;
	}
    analogWriteResolution(bit);
    
    maxADC = adc - 1;
	bit = 0;
	value = maxADC;
	while(value > 0)
	{
		value = (int)(value >> 1);
		bit++;
	}
    analogReadResolution(bit);    
#else
	maxPWM = 255;
	maxADC = 1023;
#endif	
}

void ArdunityAppClass::timeout(unsigned long millisec)
{
	timeoutMillis = millisec;
}

void ArdunityAppClass::process(void)
{
	process(commSocket);
}

void ArdunityAppClass::process(Stream *s)
{
	ArdunityController* controller;
	Stream *backup = commSocket;
	commSocket = s;

	if(commSocket != 0)
	{
		while(commSocket->available() > 0)
		{
			byte bit = 1;
			int inputData = commSocket->read(); // this is 'int' to handle -1 when no data

			if(inputData >= 0)
			{
				if(inputData & 0x80)
				{
					if(inputData == CMD_PING)
					{
#ifdef DEBUG_ARDUNITYAPP
                        Serial.println("RX:CMD_PING");
#endif
						commSocket->write(CMD_PING);
                        commSocket->flush();
#ifdef DEBUG_ARDUNITYAPP
                        Serial.println("TX:CMD_PING");
#endif
					}
					else if(inputData == CMD_START)
					{
#ifdef DEBUG_ARDUNITYAPP
                        Serial.println("RX:CMD_START");
#endif
                        connected = true;
                        preTime = millis();

						if(startCallback != 0)
							(*startCallback)();
                        
						controller = firstController;
						while(controller != 0)
						{
							controller->start();
							controller = controller->nextController;
						}
                        
                        if(bypassSocket != 0)
                        {
                            bypassSocket->write(CMD_START);
                            bypassSocket->flush();
                        }
                        
						commSocket->write(CMD_READY);
                        commSocket->flush();
#ifdef DEBUG_ARDUNITYAPP
                        Serial.println("TX:CMD_READY");
#endif
					}
					else if(inputData == CMD_EXIT)
					{
#ifdef DEBUG_ARDUNITYAPP
                        Serial.println("RX:CMD_EXIT");
#endif
                        connected = false;
                        
						controller = firstController;
						while(controller != 0)
						{
							controller->stop();
							controller = controller->nextController;
						}
                        
                        if(bypassSocket != 0)
                        {
                            bypassSocket->write(CMD_EXIT);
                            bypassSocket->flush();
                        }

						if(exitCallback != 0)
							(*exitCallback)();
					}
					else if(inputData == CMD_READY)
					{
#ifdef DEBUG_ARDUNITYAPP
                        Serial.println("RX:CMD_READY");
#endif
						readyReceived = true;
                        preTime = millis();
                        
                        if(bypassSocket != 0)
                        {
                            bypassSocket->write(CMD_READY);
                            bypassSocket->flush();
                        }
                    }
					else if(inputData == CMD_EXECUTE)
					{
#ifdef DEBUG_ARDUNITYAPP
                        Serial.println("RX:CMD_EXECUTE");
#endif
						if(processUpdate > 0)
						{
							controller = firstController;
							while(controller != 0)
							{
								controller->execute();
								controller = controller->nextController;
							}
                            
							processUpdate = 0;
						}
                        
                        if(bypassSocket != 0)
                            bypassSocket->write(CMD_EXECUTE);
					}
			
					if(inputData == CMD_UPDATE)
                    {
#ifdef DEBUG_ARDUNITYAPP
                        Serial.println("RX:CMD_UPDATE");
#endif
						processUpdate = 1;
                        
                        if(bypassSocket != 0)
                            bypassSocket->write(CMD_UPDATE);
                    }
					else
                    {
#ifdef DEBUG_ARDUNITYAPP
                        if(inputData >= CMD_UNKNOWN)
                        {
                            Serial.print("RX:Unknown (");
                            Serial.print(inputData);
                            Serial.println(")");
                        }
#endif
						Reset();
                    }
				}
				else if(processUpdate > 0)
				{
					if(processUpdate == 1)
					{
						ID = inputData;
						processUpdate = 2;
					}
					else if(processUpdate == 2)
					{
						numData = inputData;
						if(numData > MAX_ARGUMENT_BYTES)
							Reset();
						else
						{
							processUpdate = 3;
							currentNumData = 0;
						}
					}
					else if(processUpdate == 3)
					{
						if(currentNumData < numData)
							storedData[currentNumData++] = inputData;

						if(currentNumData >= numData)
						{
							// Decoding 7bit bytes
							numData = 0;
							for(int i=0; i<currentNumData; i++)
							{
								if(bit == 1)
								{
									storedData[numData] = storedData[i] << bit;
									bit++;
								}
								else if(bit == 8)
								{
									storedData[numData++] |= storedData[i];
									bit = 1;
								}
								else
								{
									storedData[numData++] |= storedData[i] >> (8 - bit);
									storedData[numData] = storedData[i] << bit;
									bit++;
								}
							}

							currentNumData = 0;
						
							controller = firstController;
							while(controller != 0)
							{
								if(controller->update(ID) == true)
									break;
								controller = controller->nextController;
							}
						
							processUpdate = 1;
						}
					}
                    
                    if(bypassSocket != 0)
                        bypassSocket->write(inputData);
				}
				else
					Reset();
			}
		}
        
        if(bypassSocket != 0)
        {
            while(bypassSocket->available() > 0)
            {
                int inputData = bypassSocket->read(); // this is 'int' to handle -1 when no data
                if(inputData >= 0)
                {
                    if(inputData & 0x80)
                    {
                        if(inputData == CMD_UPDATE)
                        {
                            if(readyReceived == true)
                            {
#ifdef DEBUG_ARDUNITYAPP
                                Serial.println("TX:CMD_UPDATE");
#endif
                                commSocket->write(CMD_UPDATE);
                                
                                controller = firstController;
                                while(controller != 0)
                                {
                                    controller->flush();
                                    controller = controller->nextController;
                                }
                            }
                            bypassProcessUpdate = 1;
                        }
                        else if(inputData == CMD_EXECUTE)
                        {
                            if(readyReceived == true)
                            {
#ifdef DEBUG_ARDUNITYAPP
                                Serial.println("TX:CMD_EXECUTE");
                                Serial.println("TX:CMD_READY");
#endif
                                commSocket->write(CMD_EXECUTE);
                                commSocket->write(CMD_READY);
                                commSocket->flush();
                                readyReceived = false;

                            }
                            bypassProcessUpdate = 0;
                        }
                        else
                            bypassProcessUpdate = 0;
                    }
                    else if(bypassProcessUpdate == 1)
                    {
                        if(readyReceived == true)
                            commSocket->write(inputData);
                    }
                    else
                        bypassProcessUpdate = 0;
                }
            }
        }
        else
        {
            if(readyReceived == true)
            {
#ifdef DEBUG_ARDUNITYAPP
                Serial.println("TX:CMD_UPDATE");
#endif
                commSocket->write(CMD_UPDATE);
                
                controller = firstController;
                while(controller != 0)
                {
                    controller->flush();
                    controller = controller->nextController;
                }
#ifdef DEBUG_ARDUNITYAPP
                Serial.println("TX:CMD_EXECUTE");
                Serial.println("TX:CMD_READY");
#endif
                commSocket->write(CMD_EXECUTE);
                commSocket->write(CMD_READY);
                commSocket->flush();
                readyReceived = false;
            }
        }
	}

	commSocket = backup;

	controller = firstController;
	while(controller != 0)
	{
		controller->process();				
		controller = controller->nextController;
	}
    
    if(connected)
    {
        unsigned long curTime = millis();
        if(curTime >= preTime) // if overflow then skip
        {
            if((curTime - preTime) > timeoutMillis) // check timeout
            {
                connected = false;
                
                controller = firstController;
                while(controller != 0)
                {
                    controller->stop();
                    controller = controller->nextController;
                }
            }
        }
        else
            preTime = curTime;
    }
}

void ArdunityAppClass::select(byte id) 
{
	commSocket->write(id & 0x7F);
	numArgument = 0;
}

void ArdunityAppClass::flush() 
{
	float a = numArgument / 7;
	float b = numArgument % 7;
	byte addedNum = (byte)a;
	if(b > 0)
		addedNum++;

	commSocket->write((numArgument + addedNum) & 0x7F);
	// Encoding 7bit bytes
	byte bit = 1;
	byte temp = 0;
	for(byte i=0; i<numArgument; i++)
	{
		commSocket->write((temp | (storedArgument[i] >> bit)) & 0x7F);
		if(bit == 7)
		{
			commSocket->write(storedArgument[i] & 0x7F);
			bit = 1;
			temp = 0;
		}
		else
		{
			temp = storedArgument[i] << (7 - bit);
			if(i == (numArgument - 1))
				commSocket->write(temp & 0x7F);
			bit++;
		}		
	}
}

void ArdunityAppClass::attachController(ArdunityController* controller)
{
	ArdunityController* c = firstController;
	while(true)
	{
		if(c == 0)
		{
			firstController = controller;
			break;
		}
		if(c->nextController == 0)
		{
			c->nextController = controller;
			break;
		}

		c = c->nextController;
	}
	
	controller->setup();
}

void ArdunityAppClass::detachController(ArdunityController* controller)
{
	ArdunityController* c = firstController;
	ArdunityController* c1 = 0;

	while(c != 0)
	{
		if(c == controller)
		{
			if(c1 == 0)
				firstController = controller->nextController;
			else
				c1->nextController = controller->nextController;
			break;
		}
		
		c1 = c;
		c = c->nextController;
	}
}

void ArdunityAppClass::attachCallback(byte command, callbackFunction newFunction)
{
	switch(command)
	{
    	case CMD_START:
			startCallback = newFunction;
			break;

    	case CMD_EXIT:
			exitCallback = newFunction;
			break;
  	}
}

void ArdunityAppClass::detachCallback(byte command)
{
	switch(command)
	{
    	case CMD_START:
			startCallback = 0;
			break;

    	case CMD_EXIT:
			exitCallback = 0;
			break;
  	}
}

boolean ArdunityAppClass::push(byte* value, int size)
{
	if((MAX_ARGUMENT_BYTES - numArgument) < size)
		return false;
		
	for(int i=0; i<size; i++)
		storedArgument[numArgument++] = value[i];
	
	return true;
}

boolean ArdunityAppClass::push(UINT8 value)
{
	return push((byte*)&value, 1);
}

boolean ArdunityAppClass::push(INT8 value)
{
	return push((byte*)&value, 1);
}

boolean ArdunityAppClass::push(UINT16 value)
{
	return push((byte*)&value, 2);
}

boolean ArdunityAppClass::push(INT16 value)
{
	return push((byte*)&value, 2);
}

boolean ArdunityAppClass::push(UINT32 value)
{
	return push((byte*)&value, 4);
}

boolean ArdunityAppClass::push(INT32 value)
{
	return push((byte*)&value, 4);
}

boolean ArdunityAppClass::push(FLOAT32 value)
{
	return push((byte*)&value, 4);
}

boolean ArdunityAppClass::push(STRING value)
{
	byte size = 0;
	while(size < 255)
	{
		if(value[size] == '\0')
			break;
		size++;
	}
	
	if((MAX_ARGUMENT_BYTES - numArgument) < (size + 1))
		return false;
	
	storedArgument[numArgument++] = size;	
	for(int i=0; i<(int)size; i++)
		storedArgument[numArgument++] = value[i];
	
	return true;
}

boolean ArdunityAppClass::pop(byte* value, int size)
{
	if((numData - currentNumData) < size)
		return false;
	
	for(int i=0; i<size; i++)
		value[i] = storedData[currentNumData++];

	return true;
}

boolean ArdunityAppClass::pop(UINT8* value)
{
	return pop((byte*)value, 1);
}

boolean ArdunityAppClass::pop(INT8* value)
{
	return pop((byte*)value, 1);
}

boolean ArdunityAppClass::pop(UINT16* value)
{
	return pop((byte*)value, 2);
}

boolean ArdunityAppClass::pop(INT16* value)
{
	return pop((byte*)value, 2);
}

boolean ArdunityAppClass::pop(UINT32* value)
{
	return pop((byte*)value, 4);
}

boolean ArdunityAppClass::pop(INT32* value)
{
	return pop((byte*)value, 4);
}

boolean ArdunityAppClass::pop(FLOAT32* value)
{
	return pop((byte*)value, 4);
}

boolean ArdunityAppClass::pop(STRING value, int maxSize)
{
	if((numData - currentNumData) < 1)
		return false;
	
	byte size = storedData[currentNumData++];
	
	if((numData - currentNumData) < size)
		return false;
		
	for(int i=0; i<(int)size; i++)
	{
		if(i < maxSize)
			value[i] = (char)storedData[currentNumData];

		currentNumData++;
	}
	
	if(size > maxSize)
		size = maxSize - 1;
	
	value[size] = '\0';
	return true;
}


//******************************************************************************
//* Private Methods
//******************************************************************************

// resets the system state upon a SYSTEM_RESET message from the host software
void ArdunityAppClass::Reset(void)
{
	if(processUpdate > 0)
    {
		commSocket->write(CMD_READY);
        commSocket->flush();
    }
	
	processUpdate = 0;
	numData = 0;
	currentNumData = 0;
	numArgument = 0;
}


ArdunityAppClass ArdunityApp;


PRO
USA
#28  
/*
  Ardunity.h - Ardunity Arduino library
  Copyright (C) 2015 ojh6t3k.  All rights reserved.
*/

#ifndef Ardunity_h
#define Ardunity_h

#include "Arduino.h"
#include "ArdunityController.h"

// Do not edit below contents
#define MAX_ARGUMENT_BYTES    116
#define CMD_START         0x80 // start
#define CMD_EXIT	      0x81 // exit
#define CMD_UPDATE        0x82 // update
#define CMD_EXECUTE       0x83 // execute
#define CMD_READY         0x84 // ready
#define CMD_PING	      0x85 // ping
#define CMD_RESET	      0x86 // reset
#define CMD_UNKNOWN	      0x87 // unknown

extern "C" {
  typedef void (*callbackFunction)(void);
}


class ArdunityAppClass
{
public:
	ArdunityAppClass();

	// for application
	void begin();
	void begin(long speed);
    void begin(Stream *s);
	void begin(Stream *s, Stream *s1);
    void resolution(int pwm, int adc);
    void timeout(unsigned long millisec);
    void process(void);
	void process(Stream *s);
    void attachController(ArdunityController* controller);
	void detachController(ArdunityController* controller);
	void attachCallback(byte command, callbackFunction newFunction);
	void detachCallback(byte command);

    // for module
    int maxPWM;
    int maxADC;
    
	void select(byte id);
	void flush();
	boolean push(UINT8 value);
	boolean push(INT8 value);
	boolean push(UINT16 value);
	boolean push(INT16 value);
	boolean push(UINT32 value);
	boolean push(INT32 value);
	boolean push(FLOAT32 value);
	boolean push(STRING value);
	boolean pop(UINT8* value);
	boolean pop(INT8* value);
	boolean pop(UINT16* value);
	boolean pop(INT16* value);
	boolean pop(UINT32* value);
	boolean pop(INT32* value);
	boolean pop(FLOAT32* value);
	boolean pop(STRING value, int maxSize);

private:
    Stream* commSocket;
    Stream* bypassSocket;
	ArdunityController* firstController;

	callbackFunction startCallback;
	callbackFunction exitCallback;

    boolean connected;
    unsigned long preTime;
    unsigned long timeoutMillis;
	boolean readyReceived;
    byte bypassProcessUpdate;
	byte processUpdate;
	byte ID;
	byte numData;
	byte currentNumData;
    byte storedData[MAX_ARGUMENT_BYTES + (MAX_ARGUMENT_BYTES / 8) + 1];
	byte numArgument;
	byte storedArgument[MAX_ARGUMENT_BYTES];
	
    void Reset(void);
	boolean push(byte* value, int size);
	boolean pop(byte* value, int size);
};

extern ArdunityAppClass ArdunityApp;

#endif
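
For anyone wanting to reproduce this from the PC side, here is a rough, untested C# sketch of the host half of the handshake and one servo update, pieced together from the command constants above and the flush()/process() code. The port name, servo id and keep-alive behaviour are my own reading of the code, not anything documented by the plug-in.

using System;
using System.Collections.Generic;
using System.IO.Ports;
using System.Threading;

// Host-side sketch of the ArdUnity serial protocol (from Ardunity.h/.cpp above). Untested;
// COM3 and servo id 0 are assumptions taken from the example sketch earlier in the thread.
class ArdunityHostSketch
{
    const byte CMD_START = 0x80, CMD_UPDATE = 0x82, CMD_EXECUTE = 0x83, CMD_READY = 0x84;

    static void Main()
    {
        using (var port = new SerialPort("COM3", 115200))
        {
            port.Open();
            Thread.Sleep(2000);                       // the board resets when the port opens
            port.Write(new[] { CMD_START }, 0, 1);    // sketch answers with CMD_READY

            // Move servo id 0 (GenericServo servo0(0, 2, false)) to 90 degrees.
            // Payload = [angle, enableUpdate]; wire bytes carry 7 bits each (high bit is reserved for commands).
            byte[] wire = Pack7Bit(new byte[] { 90, 1 });
            port.Write(new byte[] { CMD_UPDATE, 0x00, (byte)wire.Length }, 0, 3);
            port.Write(wire, 0, wire.Length);
            port.Write(new[] { CMD_EXECUTE }, 0, 1);

            // The sketch drops the link after 5 s of silence; sending CMD_READY now and then
            // appears to reset that timeout (the device replies with its own UPDATE/EXECUTE/READY).
            port.Write(new[] { CMD_READY }, 0, 1);
        }
    }

    // Same packing as ArdunityAppClass::flush(): every wire byte keeps its high bit clear.
    static byte[] Pack7Bit(byte[] data)
    {
        var outBytes = new List<byte>();
        int bit = 1; byte temp = 0;
        for (int i = 0; i < data.Length; i++)
        {
            outBytes.Add((byte)((temp | (data[i] >> bit)) & 0x7F));
            if (bit == 7) { outBytes.Add((byte)(data[i] & 0x7F)); bit = 1; temp = 0; }
            else
            {
                temp = (byte)(data[i] << (7 - bit));
                if (i == data.Length - 1) outBytes.Add((byte)(temp & 0x7F));
                bit++;
            }
        }
        return outBytes.ToArray();
    }
}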

PRO
USA
#29  
/*
  ArdunityController.cpp - Ardunity Arduino library
  Copyright (C) 2015 ojh6t3k.  All rights reserved.
*/

//******************************************************************************
//* Includes
//******************************************************************************
#include "ArdunityController.h"
#include "Ardunity.h"


//******************************************************************************
//* Constructors
//******************************************************************************

ArdunityController::ArdunityController(int id)
{
	_id = (UINT8)id;
    started = false;
	updated = false;
	dirty = false;
    _enableUpdate = 1;	
	nextController = 0;
}

//******************************************************************************
//* Public Methods
//******************************************************************************
void ArdunityController::setup()
{
	OnSetup();
}

void ArdunityController::start()
{
    started = true;
	updated = false;
	dirty = true;
	OnStart();
}

void ArdunityController::stop()
{
    started = false;
	OnStop();
}

void ArdunityController::process()
{
	OnProcess();
}

boolean ArdunityController::update(byte id)
{
	if(_id == (UINT8)id)
	{
		OnUpdate();
		ArdunityApp.pop(&_enableUpdate);
		return true;
	}

	return false;
}

void ArdunityController::execute()
{
	if(updated)
	{
		OnExecute();		
		updated = false;
	}
}

void ArdunityController::flush()
{
    started = true;
	if(canFlush && (_enableUpdate == 1) && dirty)
	{
		ArdunityApp.select(_id);
		OnFlush();
		dirty = false;
		ArdunityApp.flush();
	}
}

PRO
USA
#30  
/*
  ArdunityController.h - Ardunity Arduino library
  Copyright (C) 2015 ojh6t3k.  All rights reserved.
*/

#ifndef ArdunityController_h
#define ArdunityController_h

#include "Arduino.h"

extern "C" {
typedef unsigned char UINT8;
typedef char INT8;
typedef unsigned short UINT16;
typedef short INT16;
typedef unsigned long UINT32;
typedef long INT32;
typedef float FLOAT32;
typedef char* STRING;
}


class ArdunityController
{
public:
	ArdunityController* nextController;

	ArdunityController(int id);

	void setup();
	void start();
	void stop();
	void process();
	boolean update(byte id);
	void execute();
	void flush();

protected:
    boolean canFlush;
    boolean started;
	boolean updated;
	boolean dirty;
	
	virtual void OnSetup() {}
	virtual void OnStart() {}
	virtual void OnStop() {}
	virtual void OnProcess() {}
	virtual void OnUpdate() {}
	virtual void OnExecute() {}
	virtual void OnFlush() {}

private:
	UINT8 _id;
	UINT8 _enableUpdate;
};

#endif

PRO
USA
#31  
/*
  GenericServo.cpp - Ardunity Arduino library
  Copyright (C) 2015 ojh6t3k.  All rights reserved.
*/

//******************************************************************************
//* Includes
//******************************************************************************
#include "Ardunity.h"
#include "GenericServo.h"


//******************************************************************************
//* Constructors
//******************************************************************************

GenericServo::GenericServo(int id, int pin, boolean smooth) : ArdunityController(id)
{
	_pin = pin;
	_smooth = smooth;
	_first = false;
    canFlush = false;
}

//******************************************************************************
//* Override Methods
//******************************************************************************
void GenericServo::OnSetup()
{
}

void GenericServo::OnStart()
{
	_servo.attach(_pin);
	_first = true;
}

void GenericServo::OnStop()
{
	_servo.detach();
}

void GenericServo::OnProcess()
{
	if(started && _smooth)
	{
		if(_curAngle != _endAngle)
		{
			float t = (float)(_endTime - millis()) / (float)(_endTime - _startTime);
			if(t <= 0)
				_curAngle = _endAngle;
			else
			{
				float a = (float)(_endAngle - _startAngle) * (1 - t);
				_curAngle = _startAngle + (int)a;
			}

			_servo.write(_curAngle);
		}
	}
}

void GenericServo::OnUpdate()
{
	UINT8 newAngle;
	ArdunityApp.pop(&newAngle);
	if(_angle != newAngle)
	{
		_angle = newAngle;
		updated = true;
	}
}

void GenericServo::OnExecute()
{
	if(_smooth)
	{
		if(_first)
		{
			_first = false;
			_curAngle = (int)_angle;
			_startAngle = _curAngle;
			_endAngle = _startAngle;
			_servo.write(_curAngle);
			_endTime = millis();
		}
		else
		{
			_startAngle = _curAngle;
			_endAngle = (int)_angle;
			_startTime = millis();
			_endTime = (_startTime - _endTime) + _startTime;
			if(_endTime <= _startTime)
			{
				// Timer overflow
				_curAngle = _endAngle;
				_startAngle = _endAngle;
				_servo.write(_curAngle);
				_endTime = _startTime;
			}
		}
	}
	else
	{
		_servo.write(_angle);
	}
}

void GenericServo::OnFlush()
{
}

//******************************************************************************
//* Private Methods
//******************************************************************************

PRO
USA
#32  
/*
  GenericServo.h - Ardunity Arduino library
  Copyright (C) 2015 ojh6t3k.  All rights reserved.
*/

#ifndef GenericServo_h
#define GenericServo_h

#include <Servo.h>
#include "ArdunityController.h"


class GenericServo : public ArdunityController
{
public:
	GenericServo(int id, int pin, boolean smooth);	

protected:
	void OnSetup();
	void OnStart();
	void OnStop();
	void OnProcess();
	void OnUpdate();
	void OnExecute();
	void OnFlush();

private:
    int _pin;
	boolean _smooth;
	boolean _first;
	int _startAngle;
	int _endAngle;
	int _curAngle;
	unsigned long _endTime;
	unsigned long _startTime;

	Servo _servo;
	UINT8 _angle;
};

#endif

#34  

@fxrtst What makes you think that Unity is the better option compared to e.g. Blender?

Also, if you are just looking for an option to drive and record servos out of a virtual environment, we already have two free options at hand... which are 3ds Max and Maya!

As far as I understood, you accomplished driving servos using Maya?

I am currently using 3ds Max to drive my servos and it works just like the Unity examples you posted...

In any 3D application you will have to find a way to extract the servo angles out of your imported .bvh or .fbx; there is no black magic that will do this for you!

I think the key to success is always to build an exact replica of the mechanics within the 3D environment to get it all working correctly!

But if there is an option to link ARC with Unity, it would for sure open up a ton of new options and I would be very happy to put time in for learning and testing! :D

#35  

I double-checked the clips... Unity could actually be a bit better for live control; 3ds Max is a bit slow for this! :)

#36  

Also with Unity you can add Machine Learning to your robot...I will have a look!

#37  

Correct me if I am wrong, but I think since ARC has the option to retrieve data through the Custom HTTP Server, there is no need for a plugin! Unity can connect to ARC by using a GET or POST request, right?

No need for an Arduino either! ;)

https://answers.unity.com/questions/11021/how-can-i-send-and-receive-data-to-and-from-a-url.html

https://docs.unity3d.com/Manual/UnityWebRequest-SendingForm.html

PRO
USA
#38  

Yes! Check out this $20 Unity plug-in. This guy works for a robotics company and wrote an incredible IK solver. Watch this video and then watch his other, longer setup videos. It's pretty easy to set up the IK. Then there is always FBX if you wanted to bake out your animations in 3ds Max and then use Unity to port the animation out to your robots.

I have to look at your links; it would be amazing to hook through the http server!

Edit: added a longer video with VR goggles at around 4:00

PRO
USA
#39  

It looks like the http server could in fact be used... from what I understand from the links you posted. I did see that the post from the guy who shared code was from 2010, and someone at the bottom said it was not compiling with the next version, which at that time looked like 4.2. Hopefully it can be altered to work with today's version of Unity.

This would open up a whole new world for how you could use EZ products.

PRO
USA
#40  

I have a series of servos moving live within the Unity environment right now using the plug-in I linked above. I also recorded motion and was able to play it back using the timeline. It took a while to figure out as the docs are lacking, but I managed to get enough info from their videos. It is buggy and crashed Unity often. It also kept losing the COM port after a certain amount of time, and then you would have to reconnect. Definitely feels like unfinished software.

Another problem I had was when I tried to set up IK for a series of servos. I found out there is no built-in IK in Unity, only joints, i.e. hinge etc. That's when I stumbled onto the Bio IK plug-in in the asset store. I will try to set up an IK chain with the servos with it this weekend.

Maybe someone can look into connecting Unity and ARC via "POST" and "GET", cuz I can tell you it's beyond my skill set. We would have to figure out how to convert arbitrary rotation and translation values to 1-180 degrees for the servo positions?!
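
For the conversion question, here is a minimal sketch of one way it could be done in Unity. The class name, the axis and the -90..+90 joint range are assumptions for illustration; use your own rig's limits.

using UnityEngine;

// Illustrative helper: map a joint's local Euler angle onto the 1-180 range a servo position expects.
public static class ServoMap
{
    public static int AngleToServo(float localAngle, float minAngle = -90f, float maxAngle = 90f)
    {
        if (localAngle > 180f) localAngle -= 360f;                   // Unity reports 0..360; re-centre to -180..180
        float t = Mathf.InverseLerp(minAngle, maxAngle, localAngle); // normalise to 0..1
        return Mathf.Clamp(Mathf.RoundToInt(Mathf.Lerp(1f, 180f, t)), 1, 180);
    }
}

Something like ServoMap.AngleToServo(head.localEulerAngles.y) would then give a value ready to send on to the servo.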

#41  

@fxrtst I will look at it...looks promising, I am just downloading Unity now!

PRO
USA
#42  

It's quite different from what we're used to. It's obviously a gaming engine, but there are parts of it that are familiar from 3D animation. The great thing is there is a lot to be found on the internet via YouTube etc. to help fill in the blank spots.

Also, I replaced the Bio IK video in post #39 with another one that shows VR use and manipulation at around the 4:00 minute mark.

PRO
USA
#43  

BTW, I was able to get a simple scene from Lightwave into Unity. It had a Unity export plug-in. I'm sure they have one for Max.

PRO
USA
#44  

Also found a great auto-rigging and bones FK/IK system in the asset store, super easy to use. You can rig anything, not just bipeds. So you could create a robot of any size or shape, bring it in, and rig it for auto or manual IK rigging. Very, very simple to use. It's called Puppet 3D.

#45  

Yes, I could export my model with all the hierarchical dependencies into Unity... It's a bit tricky to get used to, but I agree that it might be very much worth exploring!

I will check out what you found! It all looks very good! The guy who made the plugin lives near my hometown... funny! :D

PRO
USA
#46  

Yes, it is definitely a gaming engine and not so much a 3D package. But once you add a few plug-ins it works great. I will continue to test.

#47  

Great, let me know how far you get... I am up for exploring more! Just trying to send some values to ARC's Custom HTTP Server, but I will have to call it a day soon! sleep

#48  

So I guess it should work; you can send values out from Unity to ARC. This one, for example, connects the EZ-B in ARC on startup. The same way you could control servos...


using UnityEngine;
using System.Collections;
using UnityEngine.Networking;

public class MyScript : MonoBehaviour
{
    void Start()
    {
        StartCoroutine(GetText());
    }

    IEnumerator GetText()
    {
        UnityWebRequest www = UnityWebRequest.Get("http://192.168.178.20/Exec?password=admin&script=ControlCommand(%22Connection%22,%20Connect0)");
        yield return www.SendWebRequest();

        if (www.isNetworkError || www.isHttpError)
        {
            Debug.Log(www.error);
        }
        else
        {
            // Show results as text
            Debug.Log(www.downloadHandler.text);

            // Or retrieve results as binary data
            byte[] results = www.downloadHandler.data;
        }
    }
}

#49  

Something strange happened when I pasted the code; it should look like this! I guess the formatting did it... :)

User-inserted image

PRO
Synthiam
#50  

That Arduino code is very useful, but not in the way you think, because it's the most terrible implementation of a protocol I've seen. I think it's best to design our own. So it looks like we'll revisit supporting other software - unless you think Unity is the way to go.

PRO
USA
#51  

@mickey For you when you get up! I think you are 9 hours ahead of me?!

Yeah, that's strange the code did not paste correctly. Were you able to connect ARC and Unity using that code in the second image? If so, we are halfway there!

@DJ, yeah, I can say that code is really half the story, since the other half is the lackluster plug-in. What I can reiterate is that the plug-in is unstable, unfinished, poorly documented and not well supported (the last post from them on the Google Plus forum for the plug-in was 7/17). It seems he might be in litigation with his two colleagues over the software.

But it seems Mickey has found a way to hook Unity to ARC via the http server. That cuts out their flaky plug-in I've been testing on.

I personally think Unity is the way to go as far as choosing a program, especially if it's accessible from ARC. I think it's important that we choose free software so everyone can use it. Other software for animation can be thousands of dollars per user license - unreachable for many.

That really leaves us with Blender and Unity.

Thoughts, comments?

PRO
USA
#52  

After doing a bit more research on Blender and watching some videos, it looks like it uses its own game engine (BGE) to connect to the outside world - PySerial and a few Python scripts to get it going.

My only concern is that it seems to be a slower connection than Unity, kind of like the difficulty Mickey was having with 3ds Max. I have no idea why that would be or how to get around it.

PRO
Canada
#53  

If we look at your original use case

"Creating a POC with Alan and Alena to be used as long distance Avatars. Business guy in Tokyo calls into corporate headquarters for meeting. At the headquarters is an Alan in the meeting room. Guy in Tokyo has on VR headset and can see EZ robot camera image from Alan live, and as the guy moves his head, Alan moves his head in unison so he can see everybody at the meeting. As he speaks so does Alan using our audio servo control board. "

Are you really looking at controlling yaw, pitch and roll of your robot remotely and feeding video back into a headset? I think if that is the case, the 3D rendering that mimics this is a bit of a red herring. If you have the data you could also feed it into a 3D space so you have an avatar that mimics the movement for others, but is it really needed for the use case?

Maybe all you really need is just an old Vuzix as you previously stated: pull the data off the Vuzix and feed it into servos, and pull the video feed off the EZ-B camera and feed it into the Vuzix. https://synthiam.com/Support?id=166

#54  

@fxrtst Hahaha, I should be... but it seems like I am stuck on Japanese time! This is why I am up again already! :D

I was getting really excited when I gave the whole Unity concept a second thought, since it offers one thing that all the other options don't, which is portability!

When I was posting my success driving my robot's servos within 3ds Max, I was kind of wondering why nobody else was up to follow that path, but the truth is... nobody wants to go through the whole process of installing some new software and trying to understand a scene someone else created in some unknown environment, while having to learn how the user interface is handled beforehand.

Well, with Unity you do not have to; the final product can be exported as a standalone version which everybody can use out of the box!

You are so right about this @fxrtst, Unity is the key, I will put some work in! We can all still reuse our work previously made in Blender, Maya, 3ds Max etc. and connect it through Unity with ARC!

@DJSures This could be a great option to build new interactive ways to explore your product... the least of it would be visual feedback of JD's movement, either being dragged around in the viewport for live control, or interacting with objects in a virtual setting! I have seen only a few robotic platforms, like e.g. the Nao robot, offering this! I think it would have a huge impact on consumers!

Anyways, I managed to connect Unity to ARC - no plugin required, NO ARDUINO! :D

I will wire up the Virtual JD, I still have it fully rigged somewhere... so we can have a common playground!

#55  

I managed to get the JD with all the hierarchical dependencies into Unity. My custom controls do not work, but the goal would be to re-rig him in Unity to learn about the process anyway! ;)

User-inserted image

PRO
USA
#56  

@mickey Ah, still on Japan time! Thought you were back! I'm still up, 11 pm west coast USA time. I think you sleep like I do, or rather don't sleep much stress

So glad to hear the test was a success (getting Unity to chat with the EZ-B). I brought in a rigged Lightwave object and ran into the same issue - just have to re-rig. You can bake motions and bring them in via FBX if you animate in Max. But if you want to animate in Unity, you've gotta re-rig. It's fun and easy with the IK rigger I mentioned earlier.

I'm glad we are both on the same page with Unity. I agree with you on portability. And it's free, etc. etc. And I'm glad you can see what this can mean for EZ products. Nothing has been commercially available like this... ever. It opens up a massive world for robotics, including machine learning for robots.

@nink This does apply to my original title of this thread, but we have wandered back to a path that crosses over with a prior thread started by Mickey about using animation programs for moving robots. For me this goes waaayyy past VR and avatars to include motion capture, machine learning and advanced robot movements. As stated before, this has been a 30-year pursuit. All my POCs lead in one direction and for the very first time I can just about reach it with my fingertips. Most exciting!

#57  

@fxrtst Same here... this could be the key! I was always trying to figure out how to implement machine learning into moving my robot; this seems to be a door to a lot of options! Also, baking animations, setting up joints and pivot points within your favorite 3D animation package and then exporting them to be used in Unity is actually a very good thing! You can still play back the different motions created, which is also something desirable...

Well, let's roll up our sleeves and get going! ;)

P.S. I am already back in Germany, but I guess my internal clock is not!

PRO
USA
#58  

Oh haha, still on Japan time! Oh, I know how that is. Traveling sucks sometimes! Btw, I saw the picture of you with the Pepper robot. Where did you see it?

My sleeves are rolled up! Just in time for bed! Haha. Chat more soon!

#59  

Most of the SoftBank stores use Pepper these days... but the smaller shops feature a Pepper every now and then too! :)

One more thing: do you think the IK rigger is of good use for us? The main thing we need is a good way to extract rotations out of IK joints... gimbal locking and Euler rotation order are tricky things!

Bio IK seems to be the best option since it looks like it avoids those weird IK jumps which make your robot punch itself in the head... :P

PRO
USA
#60  

Yeah, I do, but I need to look more into painting weights. Since all of our mechanical parts will be 100% influence, I want to make sure that's part of the program. It makes sense that it should be, as all his examples are robots! Lol.

If not, the other IK one, Puppet, is a good contender. It's an auto-rigger, but you can create bones for anything, and it has weight painting and an easy UI.

PRO
Canada
#61  

@Mickey666Maus this is very cool. I love the digital twin of JD in Unity. Now if only we had the lidar module with Roli....

PRO
USA
#62  

Quote:

I think it's important that we choose free software so everyone can use it.
I read "free" a few times... so unless I'm wrong, it's important to clarify:

Unity is a framework, not an end product; basically you have two options:

  1. develop/toy around e.g. (Evaluating) with Visual Studio and/or Unity developer tools.
  2. build a product e.g. (Developing, Testing, Publishing)

The first option is really free to everyone, if you are evaluating and learning.

The second option (building a product) is only free if you as a developer (person) or company don't have more than $100K in income/revenue.

https://en.wikipedia.org/wiki/Unity_(game_engine)

PS: It does not matter if your Unity game/product is free and your income/revenue comes from selling shoes or selling services.

PRO
USA
#63  

Paying for a license is not a bad thing; the important step is to understand Unity's business model, and you need to make the decision that is right for you (Unity's client) too.

I believe you will need a license (free/$100K, or other) to develop and publish a Unity plugin.

This is not uncharted territory... the Qt framework: people start developing with free versions, and only before publishing their product/game do they buy a pro license. Or they release a free/open source product and then they want to release a commercial product.

Unity is everywhere:

  1. Google: Tango (dead), ARCore
  2. Apple: ARKit
  3. Microsoft: Windows Mixed Reality

ARC supporting Unity is really a good thing.

PRO
USA
#64  

Another great thing is that the Unity asset store will allow anyone to develop and monetize a plugin. Anyone! But we may not need one, as Mickey has hooked Unity to ARC with the http server.

When I was saying free, I meant for an end user, not as a developer. If I were to develop something where I was using Unity to promote or use it in conjunction with Alan, or under Robomodix where I was making $100K or more, then a support license would be the way to go.

My free comment was strictly for the EZ-Robot end user, most of whom I assume (not all) are hobbyists. A free program vs. a paid $800-$2000 professional 3D software package: the professional 3D software would most likely be out of reach for most hobbyists. Free software would likely get more people interested in trying or using it in conjunction with EZ products.

#65  

I managed to wire a slider to my Virtual JD, but with moderate success... There is a lot of stuff to learn! I realized that Unity is really a developer's app.

But I still think it is worth exploring... :)

PRO
USA
#67  

Yeah, I saw that robot engine. It's a few years old, but they implemented a lot into it. It doesn't seem to have continued development, but it might hold some keys to getting values out of Unity.

I'll post my progress tomorrow with the other plug-in. I'm playing with IK rigs now. Trust me, there is a bit of a learning curve with Unity, but it's possible to do everything with plug-ins and no coding.

We can set up and rig all the EZ-Robots digitally in Unity and have them available for everyone to try. Then others won't have to go through what we are doing for the EZ-Robots.

I said earlier I'd be happy to do a whole video series on this once the bugs are worked out.

I'd start by downloading the free ArdUnity plug-in I suggested at the beginning of this thread; it's free and at least you can get some servos working. Maybe you can figure out how to capture the outgoing rotational values from within the plug-in and send them to ARC?!

#68  

Hey, I just wanted to edit my post to say it's not up to date with the latest Unity engine! :D

Please post the stuff you figured out so far; it'll help me save time and get our POC up and running as fast as possible... I won't mind spending a few bucks on plugins! ;)

Is Bio IK still top of the list? I wanted to give it a try today... I guess you really do not need to use the Arduino plugin; it'll just cost time to set it all up!

Let's find out how to share projects. Would be cool to have a Virtual JD for the forum? It could be on GitHub or maybe straight out of Unity?

PRO
USA
#69  

We can set up a Dropbox if you want to share project stuff, or I think there is cloud-based collaboration in the top right part of the Unity screen. Not sure how that works with dependencies like plug-ins etc.

I bought both IK plug-ins and am messing around with a simple rig to control a couple of servos. 3D Puppet is my favorite for ease of use - easy weight painting - but I'm looking into limits etc. at the moment. Bio IK has a really easy way to set limits... so I'm still deciding. I'm setting up a scene for each plug-in, then I'll see where I run into problems, if any.

EDIT: I do like the ease-in and ease-out of the motion from Bio IK.

PRO
Canada
#70  

You could try OpenSim or Second Life; it has a good scripting language and you can make external calls.

If I get a chance I will try to make a remote control in Second Life that talks to the ARC http server plugin. This way you could have a remote control that drives both a virtual model of the robot and the real robot at the same time, so any avatar in Second Life could control the robot in the real world. You could probably also feed the video from the robot into Second Life so the person controlling the robot (and any other avatar in-world) could see what the robot sees. http://wiki.secondlife.com/wiki/LlHTTPRequest

User-inserted image

PRO
USA
#71  

Hey guys, I put together a 12 min video of where I've been able to go so far. Please bear with me, as this is the first screen capture I've done; the microphone needs to be worked out.

I've managed to connect two servos to a character's head and got them to move based on rotational values. The character was rigged with IK (inverse kinematics) in 3D Puppet. I used the flaky plug-in (ArdUnity) until we can get a good pipeline going.

Problems or 'to do' list:

I am looking into why I cannot get rotational values from Bio IK rigs. (Edit: working now)
Need to look into animating motion on a timeline.
Need to get Unity to talk to ARC and get a servo to move on the EZ-B.
Do we need to set up limits in Unity or do it on the EZ side?
Do we need a Unity EZ plug-in? Do we need an ARC plug-in for Unity?

PRO
USA
#72  

@fxrtst: What is Unity's role? A modeling tool?

PRO
USA
#73  

@Mickey666Maus: Post #56:

Blender:
  1. Can you export your 3D/nodes JD representation to a Blender format?

Unity:
  2. Are the JD nodes (e.g. servos) movable or static?
  3. How did you set up the JD - did you import it from a 3D format?
PRO
USA
#74  

@PTP, no, not modeling. It is an animation engine. The concept here is to be able to animate your entire robot, not just one pose at a time. So you can build very complex animations using some very sophisticated tools, like inverse kinematics and machine learning, and apply them to motion on your robots.

This is different from ARC's Auto Position, which is pose-based animation. This is a UI-based visual representation of your robot, where you can either record and play back or animate your robot in real time.

The JD that Mickey has produced in 3D Studio Max is hinged at the joints and can be moved when rigged correctly. That's why we are rigging it with IK in Unity, so we will have an avatar for the real JD.

#75  

@ptp Unity can import 3ds Max scenes directly, or .fbx would be the next best option. I can upload the Virtual JD as .fbx so you can use it in Blender!

Once the .fbx is imported in Blender or Unity, all the parts are in the correct order, so if you move e.g. the shoulder, the arm will follow... also all the pivot points are set up correctly, so rotation of all parts will be correct if you set your rotation to local!

You will have to rig the Virtual JD to make it move in Unity and in Blender, since custom controls and custom rigs do not work cross-platform... but you can bake animation onto your model in Blender and export the model with the animation to Unity. Not sure if that would be a good thing to do though, since it could be tricky to extract the rotational values that way.

I will make a demo of the Virtual JD and EZ-B connection later to demonstrate what can be done with Unity and ARC! :)

PRO
USA
#76  

Did you watch the video above? I managed to get IK working with both 3D Puppet and Bio IK. Did you get either of those plug-ins, or are you just working out the rotational values to EZ-Builder?

PRO
USA
#77  

@Mickey666Maus: An fbx would be nice.

Some noob questions... I'm not a 3D guy, although I believe there is a lot of documentation; please be patient while I try to get a quick idea of how all the stuff works...

Can you guys describe (not verbosely) how you assemble the JD from the STL files into a single object?

I presume the fbx contains skeleton elements (bones/joints); can that be done with Blender?

Does the fbx contain information regarding the bones/joints rotation orientations and angle limits, e.g. (only tilt between 90-130 degrees)?

If the fbx contains the above details and is imported into Unity, are all those details preserved so less work is needed, or not?

PRO
USA
#78  

@PTP. Check your email for details.

To answer the question regarding assembling the STL files: you would download all the parts, then import them into a free program like Blender. Not knowing exactly how to use Blender, I can say you would drag and position the parts into a JD form. Then you would position the joints - this is where the part pivots around, i.e. does the joint just move like an elbow or swivel like a shoulder joint. These pivot points are very important, because you cannot move them inside Unity; Unity will add a pivot point in the middle of the object.

But you don't have to set up bones unless you want to use an advanced motion setup like inverse kinematics, where the bones and parts are connected together in a chain - like you pull a finger and the rest of the body moves because it is attached by bones (like a real body).

If you just want to use animation tools in Unity to move the robot, i.e. like pose animation in ARC, you don't need bones.

The FBX contains pivot point info for each part, its location in world space X,Y,Z and local space X,Y,Z, the parts themselves, any textures you may have added to the robot, and any animation you may have baked onto the model (from Blender).

I do not believe the fbx contains angle limits. In my example the rotation limits are set in the free plug-in I was using within Unity.

You will need to do a little work in Unity; the fbx will not contain everything you need unless you are just animating WITHOUT bones.

#79  

@ptp This is the .fbx of the Virtual JD. There is some useless stuff still floating around, which used to be the custom rig in 3ds Max; just delete it and use the model to get used to working in your 3D environment of choice! :)

Unity offers a great range of options for customizing the environment with custom C# code. Since you have always been great at helping to code, it would be awesome to have you on board! :D

PRO
USA
#81  

I used Mickey's JD model to demonstrate the Unity process from beginning to end. Fairly quick process.

Edit: I watched the video and I refer to JD as DJ at the beginning; sorry for the confusion.

Edit: Obsolete. Watch the next video from Mickey666Maus; this plug-in is no longer needed. He has connected Unity and ARC via the http server.

:P

#82  

@fxrtst et voilà! :D

I am happy we are making progress here! Once we have the Virtual JD as common ground, it will be the development platform for future ideas! Bringing Unity up was a very cool thing... I am having a lot of fun already!

Getting a full IK setup going is something I will only believe once we have it. Gimbal lock and Euler angle order are tricky things to resolve, but I think we might be able to get it done!

You should place a camera in your scene so you will have the correct view once you hit the play button, since the final product will be the game view. This can be used as a standalone, so everybody on the forum will be able to have fun with a virtual JD! :)

PRO
USA
#83  

Beautiful. Ok you’ve connected the rotation to the sliders. So how did you connect to the EZB do you have a script in ARC? How are you using the rotational values as real positions 1-180? Can you share the ARC and unity project files?

PRO
USA
#84  

I'm pretty sure I can get the IK working. I knew Unity was the way to go after you discovered the HTTP server hooks.

I'm going to set up an arm from JD with IK and see what happens :)

PRO
USA
#85  

I finally got the hang of the IK. I reset your arm (merged) in Lightwave, reset the pivot points, and exported it as FBX for Unity. This shows Realistic mode, which creates a sort of mass for the objects. I have not yet tried to connect servos, but there's no real reason to yet (because you have the EZ connection). I do need to check that the actual angles of each part are available to hook into. I'll try with ArdUnity.

BTW it turns out (as far as I can tell) that you have to run IK in scene mode, because the goal object can't be moved in game mode.

#86  

@fxrtst This looks VERY promising! Good to see that Bio IK gets the job done! :D

Could you share the new merged JD file as an .fbx? It'll save me time and we will have a common model to work off... Btw, which screen capturing software are you using? I would like to use it for future clips too!

The ARC side of handling Unity is pretty straightforward: all you need to do is connect your EZ-B with your preferred Wifi mode, start the Http Server and have the servos connected as you are used to.

In Unity it is a bit more tricky; I wrote some code to take the values from the slider and send them to ARC every time a slider is changed. So I guess the next step is to rig our JD in Bio IK like you did and see how to implement that code in tandem with ARC and Bio IK...

I will buy the plugin later today, if you can share your scene I can save some time, but I can also try it myself...I will have to learn how it works anyways! ;)

This looks very good...great progress, and I guess we are really onto something this time! :D

PRO
Canada
#87  

Wow, that is really cool @fxrtst. I am wondering if you are going to need some form of feedback loop back into the model to know when the instructions have been executed on the EZ-B. We don't have position data from HDD servos, so if we move the model with the assumption that the servo completed the move in time before we make the next move, we could get all sorts of strange results. Maybe some form of delay needs to be "calculated" based on the complexity of the move requested.

PRO
USA
#88  

@Nink, There are a few use case examples for how this could be used. One is "live", where the user would just move sliders around to move the robot like Mickey has demonstrated, i.e. "pose animation". Then you have an IK rigged robot like my demonstration, less pose, but still kinda just moving the robot around (follow the ball). The power comes in the animation process (not yet shown). After using a combination of pose, IK, FK and recording the motions, you could then fine tune the animation by simply moving keys around.

The servos only move at a finite max speed. So if using the model in Real Time or Live mode, you would be able to adjust the speed of the live playback to match the real robot servos, so you would be less likely to get strange results. In both our videos where we move servos, I can tell you that the movement is very very close to 1:1, no real lag.

@mickey, yeah the code for Unity, both the get/post portion and the sliders, was what I was interested in looking at; if you could post the Unity scene that would be great. How do you want to share these assets? Dropbox or?

I'm in Mr mom mode this week as my wife is travelling for work. So will be in Kid mode for a good portion of the week. I will try and get some things done as time permits.

I'm using Bandicam for screen capture

Last night's success: capturing angle data from each part. There is no worry about the IK hijacking the rotational values; they are passed along normally. Yay.

In this video: a two servo setup, one for the upper arm and the other for the lower arm.

#89  

@fxrtst This is great! If you only knew how many hours I spent getting the rotational values right in 3ds max, and how impossible it was to rig my robot correctly! After almost one year, changing the tool made it happen in less than a week! So good you brought up Unity!

Besides live motion and playback, we will be having a ton of options for our robots now! One of them being a fully IK controlled robotic arm that is controlled via Oculus Rift controllers...which was the first idea of this thread!

I am pretty sure we can get a simple script going which can be used easily within Unity so you would just have to drag and drop objects to send the rotations...

It might even be good to keep parts of that servo plugin you are using right now?

Anyways, I will clean up my messy code, and upload the scene once I am back from work...this is great!:D :D

PRO
USA
#90  

Haha yes, rigging can be very frustrating. I hurt my back in 1997 and could not work as a makeup effects artist for a year. I taught myself computer animation and worked from home. Unfortunately it was Lightwave and not Maya or I might have switched careers! But I worked for a company that created a TV show called Roughnecks: Starship Troopers Chronicles. It was a kids' TV series, one of the first CGI kids' TV shows. But you had to be an all-rounder and do everything: modeling, lighting, rigging, painting.

Anyways it is exciting to look at the potential for this system. VR, IK and robots. What more could you want?! Game changer. It's very exciting.

Getting it all connected (ARC to Unity) and capturing the rotational values to send via HTTP will be tricky (well, for me anyways, ha!). It would be nice to be able to capture translated values as well. I have some ideas for that too!

Hopefully I’ll get that model cleaned up and rigged this week. So we can have a good model to test with. Make sure you grab Bio Ik so the scene will work when we share!

#91  

Quote:

Hopefully I’ll get that model cleaned up and rigged this week. So we can have a good model to test with. Make sure you grab Bio Ik so the scene will work when we share!

Good idea...I will get myself the plugin and start learning about its functionality, and once you are done rigging the virtual JD I will hook it up to ARC!

Capturing the translated values should not be a problem either; we can build a script to do so in Unity... :)

PRO
USA
#92  

Guys, thanks for the videos and the information. I still need more time, I feel lost inside Unity; it's like being in an airplane cockpit, a lot of buttons:) I'll need to schedule some time to learn the foundations.

@Mickey: JD fbx: me and Blender, we are not blending:) When you have time can you clean up the file and remove all the non-relevant stuff? There are also a lot of dots everywhere. Can you also confirm the nodes/joints are correct? Not urgent.

PRO
USA
#93  

@PTP Mickey's model has a lot of extra stuff due to the way that 3D Studio Max compounds parts. I am rebuilding the JD model with all the correct joints, so that it will work inside Unity. Hopefully I will have that done soon.

Yes, Unity is not the easiest program to grasp if you have no background in 3d packages. That said, it is a gaming engine and that works very differently from 3d animation packages! You don't need to know all the buttons, just the ones we will be using. One of my videos explains the 3 or 4 sections we will be using.

Once we have scenes/projects and a model to share then we can work together to figure out a pipeline.

@everyone I did find that all the C# code is hidden under the UI, in the stack. There is a little gear icon next to every item (far right); if you click on that it opens the script in Visual Studio for editing (see the code for the servo control with the plugin below). I have no idea if this helps our cause, just pointing it out. User-inserted image

using UnityEngine;
using System.Collections.Generic;

using UINT8 = System.Byte;


namespace Ardunity
{
	[ExecuteInEditMode]
	[AddComponentMenu("ARDUnity/Controller/Motor/GenericServo")]
    [HelpURL("https://sites.google.com/site/ardunitydoc/references/controller/genericservo")]
	public class GenericServo : ArdunityController, IWireOutput<float>
	{
		public int pin;
		public bool smooth = false;

		[Range(-45, 45)]
		public int calibratedAngle = 0;
		[Range(-90, 90)]
		public int minAngle = -90;
		[Range(-90, 90)]
		public int maxAngle = 90;
		[Range(-90, 90)]
		public float angle = 0;

		public Transform handleObject;


		private int _preCalibratedAngle = 0;
		private int _preMinAngle = -90;
		private int _preMaxAngle = 90;
		private float _preAngle = 0;
		
		protected override void Awake()
		{
			base.Awake();
			
			enableUpdate = false; // only output.
		}

		void Start()
		{
			_preCalibratedAngle = calibratedAngle;
			_preMinAngle = minAngle;
			_preMaxAngle = maxAngle;
			_preAngle = angle;
		}

		void Update()
		{
			if(handleObject != null)
			{
				angle = handleObject.localRotation.eulerAngles.y;
				if(angle > 180f)
					angle -= 360f;
			}

			if(_preCalibratedAngle != calibratedAngle)
			{
				calibratedAngle = Mathf.Clamp(calibratedAngle, -45, 45);
				if(_preCalibratedAngle != calibratedAngle)
				{
					_preCalibratedAngle = calibratedAngle;
					SetDirty();
				}
			}

			if(_preMinAngle != minAngle)
			{
				minAngle = Mathf.Clamp(minAngle, -90, _preMaxAngle);
				if(_preMinAngle != minAngle)
				{
					_preMinAngle = minAngle;
					if(angle < _preMinAngle)
						angle = _preMinAngle;
				}
			}

			if(_preMaxAngle != maxAngle)
			{
				maxAngle = Mathf.Clamp(maxAngle, _preMinAngle, 90);
				if(_preMaxAngle != maxAngle)
				{
					_preMaxAngle = maxAngle;
					if(angle > _preMaxAngle)
						angle = _preMaxAngle;
				}
			}

			if(_preAngle != angle)
			{
				angle = Mathf.Clamp(angle, _preMinAngle, _preMaxAngle);
				if(_preAngle != angle)
				{
					_preAngle = angle;
					SetDirty();
				}
			}
		}
		
		protected override void OnPush()
		{
			Push((UINT8)Mathf.Clamp(_preAngle + _preCalibratedAngle + 90, 0, 180));
		}
		
		public override string[] GetCodeIncludes()
		{
			List<string> includes = new List<string>();
			includes.Add("#include <Servo.h>");
			return includes.ToArray();
		}
		
		public override string GetCodeDeclaration()
		{
			string declaration = string.Format("{0} {1}({2:d}, {3:d}, ", this.GetType().Name, GetCodeVariable(), id, pin);
			if(smooth)
				declaration += "true);";
			else
				declaration += "false);";
			
			return declaration;
		}
		
		public override string GetCodeVariable()
		{
			return string.Format("servo{0:d}", id);
		}
		
		float IWireOutput<float>.output
        {
			get
			{
				return (float)angle;
			}
            set
            {
				if(value > 180f)
					value -= 360f;
				else if(value < -180f)
					value += 360f;
				angle = (int)value;
            }
		}
		
		protected override void AddNode(List<Node> nodes)
        {
			base.AddNode(nodes);
			
            nodes.Add(new Node("pin", "", null, NodeType.None, "Arduino Digital Pin"));
            nodes.Add(new Node("angle", "Angle", typeof(IWireOutput<float>), NodeType.WireTo, "Output<float>"));
        }
        
        protected override void UpdateNode(Node node)
        {
            if(node.name.Equals("pin"))
            {
				node.updated = true;
                node.text = string.Format("Pin: {0:d}", pin);
                return;
            }
			else if(node.name.Equals("angle"))
            {
				node.updated = true;
                return;
            }
            
            base.UpdateNode(node);
        }
	}
}
#94  

@fxrtst this is exactly what I was thinking to do... So we will try to build a script that sends the rotational values over to ARC's HTTP Server by using the object rotations. Or we will hijack the servo plugin and modify the code so that rather than sending the values out to the COM port it sends them to the HTTP Server; all of this should be fairly easy! :)
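
As a very rough sketch of that first idea (the host, password and port D0 are placeholders; the Exec URL format is the same one used later in this thread), reading an object's local rotation in Unity and pushing it to ARC's HTTP server could look something like this:

using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

// Rough sketch: read this object's local Y rotation and send it to ARC's
// HTTP server as a servo position. Host, password and port D0 are placeholders.
public class SendRotationToArc : MonoBehaviour
{
    public string arcHost = "192.168.1.100";

    IEnumerator Start()
    {
        while (true)
        {
            float y = transform.localEulerAngles.y;
            if (y > 180f) y -= 360f;                              // map 0..360 to -180..180
            int position = Mathf.Clamp(Mathf.RoundToInt(y) + 90, 1, 180);

            string url = string.Format(
                "http://{0}/Exec?password=admin&script=Servo(D0,{1})", arcHost, position);
            using (UnityWebRequest www = UnityWebRequest.Get(url))
            {
                yield return www.SendWebRequest();                // fire and forget
            }
            yield return new WaitForSeconds(0.05f);               // crude rate limit
        }
    }
}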

PRO
USA
#95  

Excellent! My wife is back today, so hopefully I can get going on the IK and rebuild the model. Now I wish I had a JD robot to connect this to! Now that I think about it, I could just print one!

PRO
Canada
#96  

I wonder if you could use the MQTT plugin to connect. Looks a little old (last update 2014) but may provide more reliable connectivity versus the HTTP server. https://github.com/vovacooper/Unity3d_MQTT https://synthiam.com/redirect/legacy?table=plugin&id=169

PRO
USA
#97  

Good find. But I am very unfamiliar with this process. What do you fellas say?

#98  

No need for any plugin, the connection between Unity and ARC works out of the box...no plugin, no Arduino required! ;)

Today is Robot Sunday, I will post my progress later today...

But @Nink it is absolutely possible to send data out of Unity to any kind of server; I will connect my robot using a REST server running on a Raspberry Pi. But first of all we should concentrate on getting a stable build of the Virtual JD done, so we will have a common platform here within the forum which we can use later to explore new ways to drive our robots! I saw you mentioning Vuzix over at the other thread...it will surely be possible to use data from any device eg Vuzix or Oculus Rift to drive our Virtual Robot and have Unity send the rotational values over to ARC!

PRO
USA
#99  

Looking forward to your Robot Sunday results. I'm heading to bed. I'll prolly have the JD ready sometime Monday. Busy family weekend with very little work accomplished.

#100  

@Mickey

Curious, what Rest Server do you use?

Thanks

PRO
Canada
#101  

"@nink .it will be surely possible to use data from any device eg Vuzix or Oculus Rift to drive our Virtual Robot and have Unity send the rotational values over to ARC! "

@Mickey I am thinking of a human exoskeleton using something like Dynamixels to pick up the position details of the servos; this, coupled with a head mounted display, and we are now in the driver's seat.

#102  

@HerrBall, I am using Blynk Local Server...fast, reliable and easy to install on Windows or Linux, e.g. a Raspberry Pi!

@Nink, I would love to use the Oculus controllers to drive my IK rigged robot, and I guess we are heading in the right direction here!

Anyways I am kind of stuck since the Unity plugin is messing up my rotations...a bit frustrated at the moment, but I will surely find a way to get it to work!

@ptp, I had to update my Bio IK plugin to make it work, did you have to do the same? It also looks different from the version featured in the YouTube tutorials, and my rotations are getting messy! I will try to reset my transformations in 3ds max and am hoping this will do the trick! Otherwise I am looking forward to your scene and maybe a quick "how to" tutorial! Might just be a problem of my transformations or hierarchy...let's see!

PRO
USA
#103  

@Mickey was the last one for PTP or me :)

PRO
USA
#104  

@mickey, no, I did not update the plugin. It seems to work fine here.

You are correct, the current version of Bio IK is completely different from the YouTube videos. You have to use the included readme file to see the current controls and implementation.

It bases all joints on the pivot you set inside your 3d package. When I moved them in Lightwave, brought the model into Unity and applied the plugin, it worked great. You cannot move pivot points inside Unity; you must do it in an external program.

Once i get the JD finished I will do a quick tutorial on the new version of Bio IK and how to set it up.

#105  

@fxrtst Oops...got mixed up! :D Which version of Unity are you currently using? I could not get it to work without updating and was a bit afraid this could have caused the mess! Looking forward to your scene! :)

I will contact the guy who made the plugin about known issues...

PRO
USA
#106  

I'm using Unity 2017.4.0f1 and Bio IK v2.

#107  

So I guess my problem was those animation controllers that I added in 3ds max... Bio IK is really cool, but there is still a good deal of tweaking to be done to get the rotations right!

I just made a quick clip to show how full IK can be sent to ARC using its native HTTP server, no plugin required! There is still some work to be done in Unity to get the rotations right, but it seems to be working, so we can progress! :)

PRO
USA
#108  

Did you manage to clean up the code so we can share that function? Can't wait to try it out!

I have very little to no cleanup in Bio IK. I just set the limits on the joints and create goals. It works without any more fiddling. What do you feel you need to do extra?

#109  

I guess the best route would be if you could share the scene; I will then try to write the code for all servos involved and post it back to the thread so everyone can see how far we've got...

It might take some time to get it running cleanly...I still need to check how to get the correct rotations, since the object rotations are not matching the servo positions!

This test just shows that values can be extracted and can be sent to ARC for driving our servos!

Maybe someone good at coding C# could accomplish this way faster than I can, since my coding skills are still lousy... :)

#110  

I wrote to the author of the plugin...it would be a great help if we could call the target value slider of those Bio IK joints! I am not the greatest at scripting; I guess it can be called easily if one knows a little more about C# than I do... :)

User-inserted image

PRO
USA
#111  

Cool! I have the JD completely rebuilt and have it in Unity. I'm working on getting all the rotational values set for all the joints in Bio IK. It should be done soon and then I will test the rig and see if I need to make adjustments. I will also add end effectors for the goals.

I'm not sure how to create sliders like you did, but maybe you can add them for the claws? I have the correct pivots for them; they just need to be rotated with a slider to open and close. I think they should not be included at the ends of the IK chain; I did that in my tests and they are just floppy and have no real control.

User-inserted image

#112  

I used this tutorial to create the sliders...but I can add them to the scene if you do not want to. One thing about the sliders: they will only work in Game Mode, not in Scene Mode. So there is still some stuff to sort out!

The guy who made the Bio IK plugin replied, I will check what I can accomplish! :)

PRO
USA
#113  

Well darn...that's interesting. I'll have to find out if there is a way to move the effectors within the game mode live. That would solve that problem.

BTW I had to rebuild the pivots; Bio IK adds a rotation if the pivots are not in alignment with one another. No big deal, it just added more time.

#114  

Ah, I had weird flips in my rotations too...I added another effector to balance it out, but now that you say this, I guess I should go back and check my pivot points too! :D

PRO
USA
#115  

Fixed it. Pivots have to be in perfect alignment; I was just attempting it by eye as I didn't think it was critical, and was off by a fraction. So this time I aligned them with numerical accuracy; works perfectly.

#116  

Sweet, I got the script to extract the rotations ready...it needs to be attached to all child nodes which have a running BioIK joint! :)

#117  

using System.Collections;
using UnityEngine;

namespace BioIK {
    public class TransformValue : MonoBehaviour {

        public void Update() {
            Debug.Log(ReadValues());
        }

        private Vector3 ReadValues() {
            BioJoint joint = this.GetComponent<BioJoint>();
            if(joint != null) {
                return new Vector3((float)joint.X.GetTargetValue(), (float)joint.Y.GetTargetValue(), (float)joint.Z.GetTargetValue());
            } else {
                Debug.Log("No joint found for " + name + ".");
                return Vector3.zero;
            }
        }

    }
}

PRO
USA
#118  

Nice!

Here is the video of the current setup. Sorry, I'm out of it from being tired; I had to rebuild the rig 4 times, so I am droning on a bit in the video.

PRO
Canada
#119  

This is very cool. For the head, neck, shoulder and arm joints this will work very well, but I think there will be some challenges with moving his legs, knees, feet etc. The physics required to ensure JD doesn't face plant into the ground will be quite complex. When JD walks, stands up, sits down etc., it is a sophisticated balancing act of movements, so we are going to need a physics engine or some form of machine learning algorithm that understands the relationships between components to define how, when and where the legs can be moved to ensure he remains in a stable position, or we will lift one leg and he will simply fall down.

Amazing progress.

PRO
USA
#120  

Unity does indeed have a built in physics engine. I'm currently looking into how to implement it with IK. A floor will be helpful!

#121  

@fxrtst Could you quickly let us know the issues that made you re-rig the JD...one thing you mentioned was having to align the pivot points with great care! Just being curious, so we all know what to watch out for! Looks great; I guess you can eliminate a lot of the swinging motion by just setting the motion type to Instantaneous. I guess it will also help a great deal to study the setup of the examples within the plugin, which at least I have not done yet...

It is so good that you found out how to move the BioIK Objectives while Unity is in Game mode; this way we can also use the sliders for the grippers, and most importantly...we can export the whole thing as a standalone version, so all members of the forum will be able to use it without going through the process of setting up and learning about Unity! :D

Naming convention is very critical in the setup, so please make sure you name each joint in Bio IK corresponding to the servo the joint will be driving... So instead of calling the joint "JD_Head_horizontal" it should be named "D0", because the script will pick up the joint's name to send it along with the value to the server! As you have experienced while setting up BioIK, it can enable X, Y or Z motion on joints; since we only need one axis for our servo rotations, I will make three scripts which will extract either the X, Y or Z motion and send it to the server. Those scripts will have to be attached to the joints, so setup should be very easy...just check which axis is enabled on the joint and drag and drop the corresponding script!

@Nink This was the greatest challenge when setting up my Virtual JD in 3ds max; I guess since Unity is a game engine and I saw an implementation of this feature in BioIK, we will also be able to achieve this! But since JD has no hips and no ankles, it is kind of difficult to make him walk. And I guess if you create motion for him, he should always be wired up so you can watch him and make sure he does not tumble over or exceed the limits of his rotations! :)

PRO
USA
#122  

What happens when you don't align the pivots when you design it in a program like Lightwave is that when the IK rig is generated, if there is a deviation in their line-up, say vertically, it will introduce a twist in some part of the rig. In my example I was getting a twist from the knee down. So to fix it, all the pivots need to be aligned (all the red arrows show the alignments).

User-inserted image

It's easy to rename the joints by right clicking on the part, so I will leave that to you. I don't know which servo goes to what as I don't have a JD. I will also let you play around with the constraints and create new goals. This is definitely a WIP as this rigger is quite different from other rigging systems. This may not be the correct one for the job, but at least it gets us started with testing. I will try rigging it in 3D Puppet as well.

It would be great if we could also get translate values from the script. When I test with motion capture, some items like facial motion capture will use translation values of small boxes, not rotational values.

I will upload v_04 of the Unity Scene here with a Dropbox link.

EDIT: I can't find where to export the whole project file...anyone know how? I can export the scene file but it doesn't load any assets. The only thing I could find was creating a team and sharing assets?! But that would not create our personal copies? It makes global changes?

User-inserted image

#123  

Quote:

It would be great if we could also get translate values from the script. When I test with motion capture, some items like facial motion capture will use translation values of small boxes, not rotational values.

I guess it should be fairly easy since the BioIK Objectives also have an option called Displacement, but I will have to look into this...how would you use those movements to describe servo rotations, though?

Anyways, I would not be bothered to try another plugin, but at the moment I am very happy with BioIK...let me know if you think you have found another promising lead! Since I do not own a JD myself this might end up just being a scene to check if everything works correctly in tandem with ARC...it would be good if someone with a JD could test whether things work as expected!

I will check how to export a scene, I guess all you need to do is upload the folder and we are good to go... :)

#124  

OK...Just upload the folder and we are good!

Here is a GitHub project of a robotic arm; it has the same folder structure. You do not have to care about exporting at all, just take the folder and throw it on Dropbox or WeTransfer etc... :)

GitHub example

PRO
USA
#125  

Both are available in the objectives...translation and rotation. I would use, let's say, a small 1 cm box moving in the y axis; the up/down values would be converted into 1-180 just like rotational values. Think of it like a slider pot: it's linear, but can be translated to servo rotation.

User-inserted image

Which folder?

PRO
USA
#127  

Ok, I didn't clean it up, so there are a lot of redundant files in there. I guess I should have created a nice and neat project; it would have been a smaller file size.

Edit: deleted

#128  

@fxrtst I will check the file, don't worry if there is a lot of trash floating around...it's just to get us going! ;)

As far as your request for extracting translations goes, I guess I was not thinking far enough...there is no need for BioIK if I get it right? All the stuff is contained in the .fbx file, which consists of a bunch of moving boxes?

Send me an example .fbx if possible...

If it is the way I think, then you should be able to just take the positional data of the y axis. Take the highest y value, subtract the lowest y value to get the range, then scale the current offset from the lowest value over that range to 0-180 to get your servo position? :)
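
A quick worked example of that mapping (the numbers are made up): if the box moves between yMin = 0.00 and yMax = 0.02, a reading of y = 0.015 gives (0.015 - 0.00) / (0.02 - 0.00) * 180 = 135. In Unity that is basically one line with InverseLerp; a small sketch:

using UnityEngine;

// Sketch: map a tracked object's local Y position onto 0-180 for a servo.
// yMin/yMax are placeholder limits you would measure from your own mocap data.
public class TranslationToServo : MonoBehaviour
{
    public float yMin = 0.00f;
    public float yMax = 0.02f;

    public int GetServoPosition()
    {
        float t = Mathf.InverseLerp(yMin, yMax, transform.localPosition.y); // clamped 0..1
        return Mathf.RoundToInt(t * 180f);
    }
}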

PRO
USA
#129  

Sounds good. I'll upload a bvh or fbx of the facial mocap. Yes, I guess this would be without Bio IK! My mistake! A script that could use any value in a scene would be helpful to me. I'm thinking not only of setting up full body and facial mocap but also control rigs, for use to puppet-control robots, VR etc!

PRO
USA
#130  

Got a facial mocap scene together in Unity; it's best to observe the motion in scene mode so you can watch the points being moved for each part of the face.

Link : Facial mocap Unity

User-inserted image

#131  

@fxrtst I do not have time on my hands now...but I will check on it asap! :)

PRO
USA
#132  

Ok No problems! Thanks!

#133  

Hey, I added the scripts to the Virtual JD...This is just to show how the whole thing works, so at the moment it is just sending the rotational values of JD's first two arm joints to servos D0 and D1 of your EZ-B. It seems to be working great; I experienced no lag, it seems to be realtime control!

As far as the setup goes...I created three different scripts: RotX, RotY and RotZ. Drag and drop those scripts onto the objects that you created joints for in BioIK. Always check which rotation you want to extract in BioIK first and use the corresponding script! The object's name should be changed to just the number of the servo that you want to drive in ARC.

In ARC just start the HTTP Server, set up the servos as usual...done! :)

If you are experiencing any difficulties, let me know!

edit...I assumed that the setup in Unity has the BioIK slider values set to -90/+90; if this is any different, you need to change the script!

#134  

For the facial mocap, I could not get this to work...I tried to access the y position of the eyebrow, and I could retrieve the first position, but it does not update! I tried different approaches but none of them worked for me! Maybe I am doing something wrong, but I could not figure out what it is!

Also, since the face is moving in space, I guess there will be difficulties in getting this scene to work correctly...

edit

I could not resist trying one more option...so now it works! I took another Game Object and linked it to the position of the eyebrow. I attached a scene where you can see a cube linked to the eyebrow's movement and how the y position gets sent to the EZ-B...but I guess it's a hell of a lot of work to find all the upper and lower limits for those facial controls, and I doubt they will stay the same across changing scenes? If you find the value limits it can easily be translated to servo movement, but I don't know if all this is heading in the right direction.

I guess the best route would be to create a rig in your favorite 3D application which contains all the servos that will drive the facial animation, and rig the facial animation to move your virtual robot inside your 3D application; playback can then be done in Unity!

Believe me, I tried to map mocap to a robot; it does not work this way. The key is to create a mechanical rig which is a representation of the real world robot. That way motion can be created, or mapped, for your robot! :)

https://we.tl/uT8tLNO2WN

PRO
USA
#135  

Ok, I will give it a spin tonight. Nice job on the virtual JD. I'll let you know if I have any trouble setting it up. Glad to hear it's realtime; I suspected Unity was going to drive everything in real time when I saw the ArdUnity plugin working!

Thanks for trying with the face mocap. I figured I would have to create a retarget so all the positional data would be remapped to a robot stand-in or avatar, which would then have consistent data. I will create a real world example and see how we can solve the issues.

#136  

Yes, I guess if you remap the data so there are consistent limits, plus only those values you want to use to drive the servos, it should not be a problem to wire it up in Unity! :)

PRO
USA
#137  

One last try using the plugins from ArdUnity. I immediately noticed that the face is super small, so all positional data is fractional. There is a multiplier built into the plugin but I went up to 10,000 and still got no movement from a servo. So retargeting will be the way to go. I'll read up more on retargeting in Unity.

#138  

Yes, or try to do it in Maya, if it gets exported the right way it should not be a problem to set everything up in Unity! :)

#139  

Hey, I just realized there is some issue with my script...I declared some variables wrong, so it gets messed up if you start to add more servos! I will fix this asap!

@fxrtst

Quote:

There is a multiplier built into the plugin but I went up to 10,000 and still got no movement from a servo

If you just want to see servo movement, crank up the value inside the script by multiplying it...that should let you see some servo movement! But retargeting will finally be the only solution to the problem!

PRO
USA
#140  

Ok, I'll wait till you fix the script before I try and mess around with it. My wife and I are celebrating our 10th wedding anniversary, so I will be gone till Sunday. I'll check it out then!

PRO
USA
#141  

Question: do you think it would be possible to have data going into Unity via a web socket and out of Unity, say to the HTTP server in ARC, at the same time? I'm really interested in trying to figure out how to stream live data into Unity and then out to the EZB. But I don't know enough about web sockets.

#142  

@fxrtst Congratulations! Enjoy! :)

And yes, this is exactly what I am aiming for next! It should not be a problem at all! It works in a way that we will have a server, e.g. ARC's native server or any other server, and this server will be used to process the data. So Unity can get data from the server or put data to the server...and so can ARC! This way we can drive the Objectives in Unity by, e.g., camera data from the EZ-B, or use IFTTT or the rotation sensor in your phone and all the other fun stuff to drive the IK Goals in Unity!

I had it all set up in 3ds max already, but it was painfully slow...thanks to Unity we will have a way better connection! :D

#143  

I have the scripts almost fixed; I will try to implement a function to save the servo rotations to a .txt file or store them in an array for later use in a standalone playback build, but first we need to get live mode running correctly!
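
Just as a sketch of what that save function could look like (the file name and line format are made up, not the actual implementation): append one line per servo change so a recording can be replayed later.

using System.IO;
using UnityEngine;

// Sketch: append one line per servo change ("time;servoPort;position")
// so a recording can be replayed later from a standalone build.
public class ServoRecorder : MonoBehaviour
{
    public string filePath = "servo_recording.txt";   // placeholder path

    public void Record(string servoPort, int position)
    {
        string line = string.Format("{0:F3};{1};{2}", Time.time, servoPort, position);
        File.AppendAllText(filePath, line + "\n");
    }
}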

PRO
USA
#144  

Excellent! Great news!

#145  

This is a bit off topic, but it shows where we can get to using Unity... Believe me, I do not fully understand what is going on in the YouTube clip, but I am positively sure that a virtual environment is a great benefit for our robots! :)

https://unity3d.com/de/machine-learning

#146  

@fxrtst It was so great you brought Unity into the game! I am done reworking the scripts, and tested everything on my robot! Live control works great! Finally I can run full IK on my robot; I will make a clip soon to show how I got it to work! :D

PRO
USA
#147  

Fantastic! Looking forward to video!

PRO
USA
#148  

This guy has a great series. I watched some of his videos a couple of months ago. He knows a lot about deep learning and AI. It would be amazing to be able to use some of these ideas in robots!

Check out his channel... he raps about AI!

PRO
Canada
#149  

This is really what we need to prevent the robot from falling, navigate complex items and optimize gait when moving. Example: Lidar scans the environment, SLAM collects data and renders a point cloud, then run a model on how to navigate the environment and quickly go through all possible options; when the optimum model is found, send the commands to the servos to perform the motion. This would be groundbreaking.

#150  

There are a lot of ways to train your intelligent agents; I guess we will come up with some fun stuff soon...

#151  

Btw @fxrtst what were your problems with BioIK? I am getting really good live feedback from Unity right now, using 8 servos...but the IK system is kind of weird; it seems like my model gets mirrored or something, hard to explain! I might contact the guy who wrote it! confused

PRO
USA
#152  

My only issue is it's not like any other IK system. Like, I could control the feet better with a traditional IK system with the ability to lock his feet to the ground. I guess I need to read up a bit more on switching FK/IK on/off in BioIK.

PRO
Canada
#153  

Love it, very cool.

#154  

So I corrected the scripts, and here is the WIP Virtual JD, back with more progress to come! I set up one arm so the rotations will be sent to ARC; the gripper will have to have its own controls in the future! I realized the JD was set up a bit strangely...all rotations should range from -90/90 within BioIK, which I corrected, but this made the IK act a bit strange! So I guess we will have to set it up again. I just thought I would upload the scene once more so the new scripts are included and anyone can try to get stuff working correctly!

I experienced dropouts from the Wifi with my EZ-B IoTiny, but I am not sure if this is due to server flooding or just a bad Wifi channel...I haven't had this issue before, so please confirm if this is a general issue!

I am having no issues with my other robot being controlled with another server/board, so please let me know about any issues so we can resolve this! :)

Unity is great and the Virtual JD will be a huge benefit for ARC! Let's keep on exploring what it can do for us... :)

If you experience any difficulties setting up Unity in tandem with ARC, please let me know!

https://we.tl/Wznfe7QDWz

#155  

I did some further checking and I guess it is not a Wifi related problem...as soon as I am sending more than one value over to ARC, the board drops out! So since it is working well with the sliders, I guess it must be an issue with flooding... I set up the scripts in a way that values are only sent if there is a value change, so I would need to know how to send them correctly without the EZ-B dropping out!

If there is anyone here that could help, I guess we can go on...otherwise we are stuck!

My other robot is connected via Ethernet and runs a local REST server, and this scenario works flawlessly...so I guess we will just have to find a way to send the data correctly, without the EZ-B or ARC's server dropping out! :)

#156  

This is the code I am using to send the values over to ARC's HTTP server...

using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

namespace BioIK
{
    public class RotNew : MonoBehaviour
    {

        public float Value = 0f;
        public int position;
        public bool Changed = true;

        IEnumerator GetText()
        {
            while (true)
            {
                if (Changed)
                {
                    var request = (string.Format("http://192.168.178.20/Exec?password=admin&script=Servo(D{0},{1})", name, position + 90));
                    print(request);
                    UnityWebRequest www = UnityWebRequest.Get(request);
                    yield return www.SendWebRequest();
                    if (www.isNetworkError || www.isHttpError)
                    {
                        Debug.Log(www.error);
                    }
                    else
                    {
                        // Show results as text
                        Debug.Log(www.downloadHandler.text);

                        // Or retrieve results as binary data
                        byte[] results = www.downloadHandler.data;
                    }

                    Changed = false;


                }
                yield return 0;
            }
        }

        void Start()
        {
            StartCoroutine(GetText());
        }

        void Update()
        {
            BioJoint joint = this.GetComponent<BioJoint>();
            double value = joint.Y.GetTargetValue();
            position = Mathf.RoundToInt((float)value);

            if (Value != position)
            {
                Value = position;
                Changed = true;
            }
        }
    }
}
#157  

I even tried using a really long delay between sending those values...and did not get it to work. I hope somebody out there has an idea how to fix this! confused
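
One thing that might be worth trying (just a sketch, not tested against an EZ-B): decouple reading the joint from sending, so Update() only remembers the latest position and a coroutine pushes the most recent value out at a fixed, slower interval instead of firing a request per change. The IP, password and the BioJoint access are taken from the script above; the interval is a placeholder.

using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

namespace BioIK
{
    // Sketch: Update() only stores the newest target value; a coroutine sends the
    // latest position at a fixed interval so the HTTP server is not flooded.
    public class RotThrottled : MonoBehaviour
    {
        public float sendInterval = 0.1f;   // seconds between requests (placeholder)

        private int latest = 90;
        private int lastSent = int.MinValue;

        IEnumerator Start()
        {
            while (true)
            {
                if (latest != lastSent)
                {
                    string url = string.Format(
                        "http://192.168.178.20/Exec?password=admin&script=Servo(D{0},{1})", name, latest);
                    using (UnityWebRequest www = UnityWebRequest.Get(url))
                    {
                        yield return www.SendWebRequest();
                    }
                    lastSent = latest;
                }
                yield return new WaitForSeconds(sendInterval);
            }
        }

        void Update()
        {
            BioJoint joint = GetComponent<BioJoint>();
            if (joint != null)
                latest = Mathf.RoundToInt((float)joint.Y.GetTargetValue()) + 90;
        }
    }
}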

PRO
USA
#158  

@mickey, do you have any of your EZBs set up with the usb/serial direct connection? You could try that, and if it works maybe you are getting buffer overflow on wifi?

#159  

I have never known about a connection like that...I only connected my EZ-B using Wifi! But if you have this connection available please check if the scene works for you!

PRO
USA
#160  

@Mickey666Maus:

  1. Download EZB Mono SDK, extract the EZ_B.dll and add to the Project assets.

  2. Replace your RotNew class (post #157) with this code:


namespace BioIK
{
    using System;
    using EZ_B;
    using UnityEngine;

    public class RotNew : MonoBehaviour
    {
        public void Start()
        {
            EZBController.Instance.Connect("192.168.1.1" );
        }

        public void OnApplicationQuit()
        {
            EZBController.Instance.Disconnect();
        }

        public void Update()
        {
            BioJoint joint = this.GetComponent<BioJoint>();
            var value = joint.Y.GetTargetValue();
            var position = Mathf.RoundToInt((float)value);

            if (value == position)
            {
                return;
            }

            EZBController.Instance.SetServo(this.name, position + 90 );
        }

        private class EZBController
        {
            private EZB ezb;
            private object lockObject = new object();

            private EZBController()
            {
                this.ezb = new EZB("MyEzb" );
                this.ezb.OnConnectionChange += this.EzbOnConnectionChange;
            }

            public static EZBController Instance
            {
                get { return Creator.Singleton; }
            }

            public void Connect(string hostname)
            {
                lock (this.lockObject)
                {
                    if (!this.ezb.IsConnected)
                    {
                        Debug.Log("Connecting to EZB hostname:" + hostname);
                        this.ezb.Connect(hostname);
                    }
                }
            }

            public void Disconnect()
            {
                lock (this.lockObject)
                {
                    if (this.ezb.IsConnected)
                    {
                        this.ezb.Disconnect();
                    }
                }
            }

            public void SetServo(string port, int position)
            {
                if (!this.ezb.IsConnected)
                {
                    Debug.LogWarning("EZB is not connected, SetServo will be ignored");
                    return;
                }

                try
                {
                    var servoPort = (Servo.ServoPortEnum)Enum.Parse(typeof(Servo.ServoPortEnum), "D" + port);
                    this.ezb.Servo.SetServoPosition(servoPort, position);
                }
                catch (Exception ex)
                {
                    Debug.LogError(ex);
                }
            }
            private void EzbOnConnectionChange(bool isConnected)
            {
                Debug.Log("EZB connection changed to: " + (isConnected ? "Connected" : "Disconnect"));
            }

            private sealed class Creator
            {
                private static readonly EZBController instance = new EZBController();

                internal static EZBController Singleton
                {
                    get { return instance; }
                }
            }
        }
    }
}

Note1: Check the EZB address and correct if necessary

Good Luck !

PRO
USA
#161  

Thanks PTP. I was hoping the mono sdk would work. Thank you for looking into it. I will try and set up today.

#162  

@ptp so good to have you here...I could not have figured it out by myself! I will try it right now! :)

#163  

Well, it is kind of funny: if I add the EZ_B.dll to my project it gives me no errors, but the CS0246 error still shows in Unity...how do I save or refresh everything correctly in Visual Studio?

PRO
USA
#164  

Working? I haven't had the chance to try it out.

PRO
USA
#165  

Walk me through exactly how you have it set up, so I can replicate it.

PRO
USA
#166  

Can you print/post more details regarding the CS0246 error?

#167  

So first I am getting... "are you missing a using directive or an assembly reference?"

If I add a reference to the EZ_B.dll in my assets folder, the errors are gone in Visual Studio, but they still persist in Unity!

#168  

No errors in Visual Studio...

User-inserted image

But still in Unity...

User-inserted image

PRO
USA
#169  

@Mickey666Maus, hmm...it seems Unity expects .NET Framework 2.0/3.5 class libraries.

Unity version?

Can you check if this option is available? https://docs.unity3d.com/Manual/ScriptingRuntimeUpgrade.html

The other solution is to create the required functionality (connect to the EZB and send the servo command) ourselves.

#171  

It works...I would have never figured it out by myself! :D

PRO
USA
#172  

It works? Can you be more specific?

Did you change the framework? Is the assembly/dll loaded with no more CS0246? Are you successfully communicating with the EZB? Are the servos moving?

#173  

@ptp This is great, everything works perfectly! This had been the point where, without you, the Virtual JD would have been a dead end! :D

It would be great if you could also install Unity and run the scene, so you will have a better overview of what is going on with the setup! Each part of the Virtual JD has a BioIK joint attached to it, so for the arm of JD we will have to run 3 scripts...one for each DOF in JD's arm plus one later on for the gripper!

So what is happening right now is: the first script works great, but as soon as the second one gets added it gives me an error about not being able to connect to the EZ-B...since obviously the first one is already connected!

Would there be any solution for this? :)

#174  

@ptp

Quote:

Did you change the framework?

Yes... https://stackoverflow.com/questions/42513782/why-cant-i-see-net-4-6-option-for-api-compatibility-level?rq=1

Quote:

Is the assembly/dll loaded with no more CS0246?

No errors in Unity...no errors in Visual Studio!

Quote:

Are you successfully communicating with the EZB?

Yes!

Quote:

Are the servos moving?

Yes!

#175  

Instead of getting the values out of the individual joints, we could probably try getting them all at once with a script extracting them from the Root Object?

so

float value = <BioIK Component>.FindSegment(name).Joint.X.GetTargetValue();

instead of

float value = <BioJoint Component>.X.GetTargetValue();

so we would not have to have multiple scripts trying to connect to the EZ-B?

Or there is also this parameter?

double[] solution = <BioIK>.Evolution.GetSolution();
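
Just to illustrate that idea (a rough sketch only: it assumes the BioIK component exposes FindSegment() returning a segment with a Joint, exactly as in the snippet above, and the segment names are made up), a single manager on the root could gather all the joint values in one Update and hand them to whatever ends up talking to the EZ-B:

using System.Collections.Generic;
using UnityEngine;

namespace BioIK
{
    // Sketch: one script on the root gathers all joint target values per frame,
    // so only a single component would need to own the EZ-B connection.
    public class JointCollector : MonoBehaviour
    {
        // Placeholder segment names; they would match the joints in the scene.
        public string[] segmentNames = { "0", "1", "2" };

        void Update()
        {
            BioIK bioIK = GetComponent<BioIK>();
            var values = new Dictionary<string, float>();

            foreach (string segmentName in segmentNames)
            {
                var segment = bioIK.FindSegment(segmentName);
                if (segment == null || segment.Joint == null)
                    continue;
                values[segmentName] = (float)segment.Joint.Y.GetTargetValue();
            }

            // Hand the whole batch to one sender instead of one request per joint.
            foreach (var kv in values)
                Debug.Log(string.Format("Joint {0}: {1}", kv.Key, kv.Value));
        }
    }
}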

PRO
USA
#176  

You guys are flying way over my head, but I will ask: how do we choose X,Y,Z axis for all the joints from the root IK?

The Ardunity plugin doesn't seem to have an issue with a script running on each joint in unison with Bio IK.

#177  

I will just try something here...and this could work! And hopefully @ptp can come up with a codeable solution for my idea! :D

PRO
USA
#178  

Obs:

  1. You have a single connection to the EZB. EZBController is a "Singleton", it's shared, so all the RotNew instances call Connect; the first call connects, the other calls are ignored. The EZB only accepts one TCP/Serial client connection. You can't have Unity, ARC or a custom SDK application all connected at the same time to the same EZB.

  2. The function SetServo (EZB servo position) sends 2 bytes via TCP/Wifi to the controller. No EZ-Script is involved; the only thing faster is sending the same 2 bytes via Serial Port (EZB wired).

  3. The concurrency is handled inside the EZB library. No problem if you have 2 or 20 joints (RotNew instances) sending values at the same time; all the values will be serialized (FIFO - First In First Out) and transmitted via the existing connection (point 1).

And yes, you can have a single Unity script capturing the joint values and sending them all with a single API call:


EZB's method signature:
public void SetServoPosition(ServoPositionItem[] servos);

The advantage is a single TCP payload message (e.g. 10 servo positions, 10 x 2 bytes = 20 bytes), versus 20 individual messages of 2 bytes.

Don't expect a huge improvement, unless you have a bad WiFi connection.

I assume you have configured a RotNew instance per joint; if you have a way to assign a single instance script, I can create another script.
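
To illustrate the batched call (a sketch only: the SetServoPosition(ServoPositionItem[]) signature is the one quoted above, but the ServoPositionItem constructor arguments are my assumption and should be checked against the EZ_B SDK):

using System.Collections.Generic;
using EZ_B;

// Sketch: collect the changed servo positions and push them to the EZ-B in one
// SetServoPosition() call instead of one TCP message per servo.
public class BatchedServoSender
{
    private readonly EZB ezb;

    public BatchedServoSender(EZB ezb)
    {
        this.ezb = ezb;
    }

    public void Send(Dictionary<Servo.ServoPortEnum, int> positions)
    {
        if (!ezb.IsConnected || positions.Count == 0)
            return;

        var items = new List<ServoPositionItem>();
        foreach (var kv in positions)
        {
            // Assumed constructor (port, position) - verify against the EZ_B SDK docs.
            items.Add(new ServoPositionItem(kv.Key, kv.Value));
        }

        // Single payload for all servos, as described above.
        ezb.Servo.SetServoPosition(items.ToArray());
    }
}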

#179  

@ptp thanks for explaining in detail how this works!

1. I understood! 2. Very good, the faster the better for us! :)

Quote:

No problem if you have 2 or 20 joints (RotNew instances) sending values at the same time; all the values will be serialized

When I hook up two instances of the script to the BioJoints, the board drops out.

Quote:

I assume you have configured a RotNew instance per joint; if you have a way to assign a single instance script, I can create another script.

This is correct...but there will be three separate scripts: one for X, one for Y, and one for Z rotations. When I have one script, let's say RotNew_Y, and one RotNew_Z, the second script gives me an error saying it cannot connect to the EZ_B... This is why I guess we will have to have one script which can either extract all rotations at once, or works in a way that you can change which rotation it extracts?

PRO
USA
#180  

@Fxrtst:

Quote:

how do we choose X,Y,Z axis for all the joints from the root IK?

Each servo can only represent a single axis (X/Y/Z) rotation e.g. (Yaw/Pitch/Roll).

Mickey666Maus script:


var value = joint.Y.GetTargetValue();
var position = Mathf.RoundToInt((float)value);

is only looking at the Y part, and then (only Mickey666Maus knows why :)) he adds +90:


EZBController.Instance.SetServo(this.name, position + 90 );

So only Y+90 is sent to the EZB Servo.

I'm not an expert in 3D Math... but some points to think:

  1. Are joint.X.GetTargetValue(), joint.Y.GetTargetValue(), joint.Z.GetTargetValue() positions or rotation values?

  2. If they are positions, that does not help. You will need at least 2 reference points and will have to calculate the angles for each rotation.

  3. If they are rotation values, are they Radians or Degrees?

  4. Are the rotation values absolute, or relative to the parent joint?

  5. Rotation system: Left Hand or Right Hand; you need to understand where X,Y,Z are pointing and whether the value increases to the left/right, up/down or CW/CCW.

  6. Each servo represents a joint rotation, for example the JD ankle has two servos: S0 (rotation on axis X), the foot servo, and S1 (rotation on axis Y), the next servo up. (ROS/Right Hand System)

So for the servo S0 you want the X rotation value, for the servo S1 you want Y.

  7. Servo rotation is not related to a specific standard (Left/Right hand). Rotation 0 = the servo's 90 degrees; if the value goes positive to the left but the servo goes to the other side, you need to invert the value.

To summarize, if you are communicating directly with the EZB servo, this means you need to find a way to set up (in Unity), for each servo (not per joint), the following information:

  1. Joint name
  2. Axis value X,Y,Z
  3. Inverted calculation or a Formula

So you have some homework to do :)

PRO
USA
#181  

Is the drop out specific to EZB/ ARC Wi-Fi? I Don't have a FTDI breakout board or I would test.

PRO
USA
#182  

https://answers.unity.com/questions/38924/unity-is-a-left-handed-coordinate-system-why.html Unity is Left Handed.

https://www.evl.uic.edu/ralph/508S98/coordinates.html

Please verify the rotation directions.

ROS on the other side is Right hand:

Quote:

The base coordinate frame in ROS follows the right hand rule and is essentially "engineering" frame as used by most mathematics and engineering professors I've encountered. This would be x-axis forward, y-axis left, and z-axis up.

PRO
USA
#183  

Quote:

Is the drop out specific to EZB/ ARC Wi-Fi? I Don't have a FTDI breakout board or I would test.

I don't know, but the Update method is called every frame, and you can't expect servos to respond that fast.

I think the best option is to print the values in a debug window and see the values and how fast they are being collected (updated).
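
A trivial sketch of that kind of instrumentation: log the joint's value together with the time and frame number, so you can see how often Update() actually fires and how quickly the values change (BioJoint and GetTargetValue() as used in the scripts above):

using UnityEngine;

namespace BioIK
{
    // Sketch: print the joint's Y target value with a timestamp and frame number,
    // to see how fast values are being collected before anything is sent.
    public class JointRateLogger : MonoBehaviour
    {
        void Update()
        {
            BioJoint joint = GetComponent<BioJoint>();
            if (joint == null)
                return;

            Debug.Log(string.Format("t={0:F3}s frame={1} {2} Y={3:F1}",
                Time.time, Time.frameCount, name, joint.Y.GetTargetValue()));
        }
    }
}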

#184  

@ptp The joint values are rotations in degrees...you get Radians if you pass GetTargetValue(true). During the setup stage in BioIK, @fxrtst and I both set up our models with angle values ranging from -90/+90 degrees, so I just put a lazy +90 in the script; it has no other meaning...we could set up the 3D model differently and the +90 would be obsolete! :D

#185  

Quote:

Are the rotation values absolute, or relative to the parent joint?

They are relative to their parent's values!

PRO
USA
#186  

@Mickey666Maus, So 0 degrees => servo 90 degrees

Next question: X,Y,Z - where do they point? (when you increase the value): (CW/CCW, Up/Down, Left/Right)

#187  

Quote:

  1. Joint name

  2. Axis value X,Y,Z

  3. Inverted calculation or a Formula

  1. Is always just the number of the servo that will be addressed!

  2. Is either joint.X.GetTargetValue(), joint.Y.GetTargetValue(), or joint.Z.GetTargetValue(); during the setup in BioIK the axis to be extracted is specified and then later on the corresponding script is attached!

  3. Is actually just 180 - TargetValue :)

#188

Quote:

X,Y,Z - where do they point? (when you increase the value): (CW/CCW, Up/Down, Left/Right)

This is handled by the BioIK plugin, no need to think about up vectors, Gimbal Lock, or any of this type of stuff...I spent countless hours trying to solve those problems, and I am so happy that finally there is a solution at hand! At setup we can restrict the axis and the rotational freedom in angles for each joint; BioIK uses an algorithm to calculate the underlying math!

PRO
USA
#189

The previous script has a bug!

Try this one. Look at the debug window!

    
    namespace BioIK
    {
        using System;
        using EZ_B;
        using UnityEngine;
    
        public class RotNew : MonoBehaviour
        {
            private int[] lastRotation = new [] { int.MinValue, int.MinValue, int.MinValue };
    
            private void Start()
            {
                EZBController.Instance.Connect(&quot;192.168.1.1&quot;);
            }
    
            private void OnApplicationQuit()
            {
                EZBController.Instance.Disconnect();
            }
    
            private void Update()
            {
                BioJoint joint = this.GetComponent&lt;BioJoint&gt;();
                var rotation = new int[]
                {
                    Mathf.RoundToInt((float)joint.X.GetTargetValue()),
                    Mathf.RoundToInt((float)joint.Y.GetTargetValue()),
                    Mathf.RoundToInt((float)joint.Z.GetTargetValue()),
                };
    
                if (rotation[0] == this.lastRotation[0] &amp;&amp; rotation[1] == this.lastRotation[1] &amp;&amp; rotation[2] == this.lastRotation[2])
                {
                    //nothing changed
                    return;
                }
    
                if (rotation[0] != this.lastRotation[0])
                {
                    Debug.Log(string.Format(&quot;Joint:[{0}] X changed New=[{1}] Prev=[{2}]&quot;, this.name, rotation[0], this.lastRotation[0]));
                }
                if (rotation[1] != this.lastRotation[1])
                {
                    Debug.Log(string.Format(&quot;Joint:[{0}] Y changed New=[{1}] Prev=[{2}]&quot;, this.name, rotation[1], this.lastRotation[1]));
                }
                if (rotation[2] != this.lastRotation[2])
                {
                    Debug.Log(string.Format(&quot;Joint:[{0}] Z changed New=[{1}] Prev=[{2}]&quot;, this.name, rotation[2], this.lastRotation[2]));
                }
    
                this.lastRotation = rotation;
    
                EZBController.Instance.SetServo(this.name, rotation[1] + 90);
            }
    
            private class EZBController
            {
                private EZB ezb;
                private object lockObject = new object();
    
                private EZBController()
                {
                    this.ezb = new EZB(&quot;MyEzb&quot; );
                    this.ezb.OnConnectionChange += this.EzbOnConnectionChange;
                }
    
                public static EZBController Instance
                {
                    get { return Creator.Singleton; }
                }
    
                public void Connect(string hostname)
                {
                    lock (this.lockObject)
                    {
                        if (!this.ezb.IsConnected)
                        {
                            Debug.Log(&quot;Connecting to EZB hostname:&quot; + hostname);
                            this.ezb.Connect(hostname);
                        }
                    }
                }
    
                public void Disconnect()
                {
                    lock (this.lockObject)
                    {
                        if (this.ezb.IsConnected)
                        {
                            this.ezb.Disconnect();
                        }
                    }
                }
    
                public void SetServo(string port, int position)
                {
                    if (!this.ezb.IsConnected)
                    {
                        Debug.LogWarning(&quot;EZB is not connected, SetServo will be ignored&quot; );
                        return;
                    }
    
                    try
                    {
                        var servoPort = (Servo.ServoPortEnum)Enum.Parse(typeof(Servo.ServoPortEnum), &quot;D&quot; + port);
                        this.ezb.Servo.SetServoPosition(servoPort, position);
                    }
                    catch (Exception ex)
                    {
                        Debug.LogError(ex);
                    }
                }
                private void EzbOnConnectionChange(bool isConnected)
                {
                    Debug.Log("EZB connection changed to: " + (isConnected ? "Connected" : "Disconnect"));
                }
    
                private sealed class Creator
                {
                    private static readonly EZBController instance = new EZBController();
    
                    internal static EZBController Singleton
                    {
                        get { return instance; }
                    }
                }
            }
        }
    }
    
    
    #190  

    I have it all working, and I wanted to make a clip today to show how well it works with live control of a robot with 8 DOF, no lag...full IK!

    But now that we are getting going with ARC integration, I want to participate of course! Especially now that we have you here to get things going! :D

    PRO
    USA
    #191  

    Yes, we are extracting math that has already been calculated by the Bio IK plugin. It converts the angles to values between -90 and +90; those are the only values produced for X, Y and Z rotation or translation. We just need to get those values to the servos in the form of 1-180 degrees.
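
    For illustration, here is a minimal sketch of that mapping (the class and method names are hypothetical; ptp's script later in this thread does the same thing in its FixRotation helper):

    public static class ServoMath
    {
        //Maps a Bio IK angle in the range -90..+90 to a servo position in
        //the range 0..180; set inverted to flip the direction of travel.
        public static int ToServoPosition(int angle, bool inverted = false)
        {
            int position = 90 + angle;   // -90 -> 0, 0 -> 90, +90 -> 180
            return inverted ? 180 - position : position;
        }
    }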

    PRO
    USA
    #193  

    @mickey, I think things are getting lost in translation. In post #191, where you say you have 8 DOF working, is that with another "board" and NOT working with ARC? LOL, I'm so confused when you say "I have it working".

    #194  

    @fxrtst Yes, this is correct! And I wanted to show this to prove it can be done, so maybe some more attention gets drawn to this thread!

    But now we have @ptp, so we are blessed! :D

    #196  

    @ptp This script is great! I just cannot believe how fast you got this going!

    So it works with JD's first two joints, while for whatever reason it seems to show the extracted values and also the amount of change for both joints, but only one servo is moving...I will have to check my wiring to make sure it is not just a loose wire somewhere!

    But as soon as I hook up the third joint, the EZ-B drops out again...

    PRO
    USA
    #197  

    I'm preparing another script.

    PRO
    USA
    #199  
    
    namespace BioIK
    {
        using System;
        using System.Collections.Generic;
        using System.Linq;
        using System.Runtime.CompilerServices;
        using EZ_B;
        using UnityEngine;
    
        public class RotNew : MonoBehaviour
        {
            private static ServoDefinition[] definitions = new ServoDefinition[]
            {
                //DEFINE YOUR SERVOS HERE
                //Start with one only, but you can assign RotNew to all Joints
                //Remember "0" is the joint name, you can define a proper name, the name must match the definition.

                //map joint "0"/X to D0
                new ServoDefinition("0", AxisType.X, Servo.ServoPortEnum.D0, false),

                //map joint "0"/Y to D1 but invert the servo value
                new ServoDefinition("0", AxisType.Y, Servo.ServoPortEnum.D1, true),
    
            };
    
            private int[] lastRotation = new[] { int.MinValue, int.MinValue, int.MinValue };
    
            private void Start()
            {
                EZBController.Instance.Connect("192.168.1.1");
            }
    
            private void OnApplicationQuit()
            {
                EZBController.Instance.Disconnect();
            }
    
            private void Update()
            {
                //Note: start with a conservative (high) value here;
                //a higher value updates the servos less often.
                const int interval = 30;
                if (Time.frameCount % interval != 0)
                {
                    //don't run yet
                    return;
                }
    
                BioJoint joint = this.GetComponent<BioJoint>();
                var rotation = new int[]
                {
                    Mathf.RoundToInt((float)joint.X.GetTargetValue()),
                    Mathf.RoundToInt((float)joint.Y.GetTargetValue()),
                    Mathf.RoundToInt((float)joint.Z.GetTargetValue()),
                };
    
                if (rotation[0] == this.lastRotation[0] && rotation[1] == this.lastRotation[1] && rotation[2] == this.lastRotation[2])
                {
                    //nothing changed
                    return;
                }
    
                var servosToMove = new List<EZ_B.Classes.ServoPositionItem>();

                var jointServos = definitions.Where(js => js.JointName.Equals(this.name, StringComparison.OrdinalIgnoreCase)).ToArray();
    
                if (rotation[0] != this.lastRotation[0])
                {
                    Debug.Log(string.Format("Joint:[{0}] X changed New=[{1}] Prev=[{2}]", this.name, rotation[0], this.lastRotation[0]));

                    foreach (var servo in jointServos.Where(s => s.Axis == AxisType.X))
                    {
                        servosToMove.Add(new EZ_B.Classes.ServoPositionItem(servo.Port, FixRotation(rotation[0], servo.Inverted)));
                    }
                }
    
                if (rotation[1] != this.lastRotation[1])
                {
                    Debug.Log(string.Format("Joint:[{0}] Y changed New=[{1}] Prev=[{2}]", this.name, rotation[1], this.lastRotation[1]));

                    foreach (var servo in jointServos.Where(s => s.Axis == AxisType.Y))
                    {
                        servosToMove.Add(new EZ_B.Classes.ServoPositionItem(servo.Port, FixRotation(rotation[1], servo.Inverted)));
                    }
                }
                if (rotation[2] != this.lastRotation[2])
                {
                    Debug.Log(string.Format("Joint:[{0}] Z changed New=[{1}] Prev=[{2}]", this.name, rotation[2], this.lastRotation[2]));

                    foreach (var servo in jointServos.Where(s => s.Axis == AxisType.Z))
                    {
                        servosToMove.Add(new EZ_B.Classes.ServoPositionItem(servo.Port, FixRotation(rotation[2], servo.Inverted)));
                    }
                }
    
                this.lastRotation = rotation;
    
                if (servosToMove.Count > 0)
                {
                    EZBController.Instance.SetServosPositions(servosToMove.ToArray());
                }
            }
    
            private int FixRotation(int deg, bool inverted, int middleAngle = 90)
            {
                //Shift the -90..+90 Bio IK angle into the 0..180 servo range,
                //mirroring it around the middle angle when the servo is inverted.
                return !inverted ? middleAngle + deg : (middleAngle * 2 - (middleAngle + deg));
            }
    
            private class EZBController
            {
                private EZB ezb;
                private object lockObject = new object();
    
                private EZBController()
                {
                    this.ezb = new EZB("MyEzb");
                    this.ezb.OnConnectionChange += this.EzbOnConnectionChange;
                }
    
                public static EZBController Instance
                {
                    get { return Creator.Singleton; }
                }
    
                public void Connect(string hostname)
                {
                    lock (this.lockObject)
                    {
                        if (!this.ezb.IsConnected)
                        {
                            Debug.Log("Connecting to EZB hostname:" + hostname);
                            this.ezb.Connect(hostname);
                        }
                    }
                }
    
                public void Disconnect()
                {
                    lock (this.lockObject)
                    {
                        if (this.ezb.IsConnected)
                        {
                            this.ezb.Disconnect();
                        }
                    }
                }
    
                public void SetServoPosition(Servo.ServoPortEnum servoPort, int position)
                {
                    if (!this.ezb.IsConnected)
                    {
                        Debug.LogWarning("EZB is not connected, SetServoPosition will be ignored");
                        return;
                    }
    
                    try
                    {
                        this.ezb.Servo.SetServoPosition(servoPort, position);
                    }
                    catch (Exception ex)
                    {
                        Debug.LogError(ex);
                    }
                }
    
                public void SetServosPositions(EZ_B.Classes.ServoPositionItem[] servos)
                {
                    if (!this.ezb.IsConnected)
                    {
                        Debug.LogWarning("EZB is not connected, SetServosPositions will be ignored");
                        return;
                    }
    
                    try
                    {
                        this.ezb.Servo.SetServoPosition(servos);
                    }
                    catch (Exception ex)
                    {
                        Debug.LogError(ex);
                    }
                }
    
    
    
                private void EzbOnConnectionChange(bool isConnected)
                {
                    Debug.Log("EZB connection changed to: " + (isConnected ? "Connected" : "Disconnect"));
                }
    
                private sealed class Creator
                {
                    private static readonly EZBController instance = new EZBController();
    
                    internal static EZBController Singleton
                    {
                        get { return instance; }
                    }
                }
            }
    
            private enum AxisType
            {
                X = 0,
                Y = 1,
                Z = 2
            }
            private class ServoDefinition
            {
                public Servo.ServoPortEnum Port;
                public string JointName;
                public AxisType Axis;
                public bool Inverted;
                public ServoDefinition(string jointName, AxisType axis, Servo.ServoPortEnum port, bool inverted)
                {
                    this.Port = port;
                    this.JointName = jointName;
                    this.Axis = axis;
                    this.Inverted = inverted;
                }
            }
    
        }
    }
    
    

    Please look to the code comments.

    PRO
    USA
    #201  

    Check the script post again; I've changed the Update rate.

    Relevant code:

    
     private void Update()
            {
                //Note: start with a conservative (high) value here;
                //a higher value updates the servos less often.
                const int interval = 30;
                if (Time.frameCount % interval != 0)
                {
                    //don't run yet
                    return;
                }
    
    

    The above code runs the update once every 30 frames. Start with a conservative value, then lower it until you reach the breaking point :)

    PRO
    USA
    #203  

    If you have the RotNew assigned to joints "0" and "2" ...

    Why the question? Is it not compiling, or not moving?

    *** EDITED ***

    1. Don't forget the conservative interval, to avoid clogging Unity.
    2. Check the joint names and values in the debug window; confirm the joint names are correct and that the values are between -90 and +90.
    #204  

    Also, using your definitions, I am getting this if I rename the second BioJoint... I changed the name of Joint 1 to "Hello World"; Joint 0 is still named 0.

    User-inserted image

    #205  

    This is the naming convention... So Joint 0_X shows as 0, Joint 0_Y shows as Hello World...which is the name of the next joint in the hierarchy, and Joint 2 does not show at all!

    User-inserted image

    PRO
    USA
    #206  

    So "0" is being matched or not ? "HelloWorld" is not defined, so you see the values, but no servo movement.

    #207  

    Edit... Joint 2 shows with its correct name, Joint 2! :)

    #209  

    Could you insert the new void Update() so I do not mess up...it is getting kind of late here in Germany already! :)

    I will check again for your instructions and try to test as best as I can! :)

    PRO
    USA
    #210  

    Wait! I'm lost: is the RotNew assigned per joint or per joint/axis?

    PRO
    USA
    #211  

    Post a link or email me (my email is in my profile) so I can download your project.

    I'll try to see if I can debug it (without having Unity skills) :)

    #212  

    Quote:

    I'm lost is the RotNew assigned per joint or per Joint/Axis ?
    It is being attached to the BioJoint Object!

    #213  

    I am really impressed by your coding skills...all of this went so fast! I will have to check if I answered everything you asked!

    But first of all I will create an online version of the project on GitHub!

    PRO
    USA
    #214  

    @ptp joint/axis. Bio IK has X/Y/Z axes per joint. Mickey's script must be applied per axis per joint; correct me if I'm wrong, Mickey.

    #215  

    @fxrtst this is not correct, it needs to be applied once per BioJoint Object!

    #216  

    Ah...too bad, since the project was set to collab I cannot put it up on Github, hope this .zip works! :)

    As far as I understand, @ptp wrote the script in the following way. First you declare all the joints and which axes you want to extract from those joints, plus whether the movement should be reversed. Second, if the attached object's name matches a declared joint, the script checks for a change in the rotational value of the given axis and sends the value to the EZ-B.
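
    As a purely hypothetical example (the joint names and D-port assignments below are placeholders and must match your own scene and wiring), the declaration part of ptp's RotNew script could look like this:

    private static ServoDefinition[] definitions = new ServoDefinition[]
    {
        //joint named "Head", Y axis -> head pan servo on D0
        new ServoDefinition("Head", AxisType.Y, Servo.ServoPortEnum.D0, false),

        //joint named "LeftShoulder", X axis -> shoulder servo on D1, direction inverted
        new ServoDefinition("LeftShoulder", AxisType.X, Servo.ServoPortEnum.D1, true),
    };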

    The joint names with scripts attached are showing in the debug window. I cannot confirm whether only the recognized joints are moving; at the moment there is no movement at all...

    All values are displayed correctly as far as I can see...

    #217  

    Here is the link for the Unity project...I will sadly have to go and sleep now! It's been a pleasure, I learned a lot!

    I am positively sure we will get this to work, now that @ptp is with us!

    For any help on the Unity side of things, @fxrtst will be more than happy to assist; so good to have you helping with the coding, @ptp!

    PRO
    USA
    #218  

    Well heck, I must have read something wrong way back in the beginning when you were working on the scripts!

    PRO
    USA
    #219  

    The ServoDefinition name must match the name you see in the log. And the Axis X/Y/Z must match the axis value you see in the log.

    this code gets the joint:

    
    BioJoint joint = this.GetComponent<BioJoint>();
    
    

    and this gets the rotation values:

    
    Mathf.RoundToInt((float)joint.X.GetTargetValue()),
    Mathf.RoundToInt((float)joint.Y.GetTargetValue()),
    Mathf.RoundToInt((float)joint.Z.GetTargetValue()),
    
    

    If you are inside a RotNew instance and you run this.GetComponent<BioJoint>, I'm assuming the instance should be assigned per joint.

    But it can work either way.

    If you assign per axis: if you have 20 joints * 3 axes, that is 60 instances... plus, when you are inside the instance, you don't know the axis, only the joint, so you still need to map the axis parameter; a little redundant or illogical.

    If you bind to an axis, it makes sense to return an Axis object linked to the Joint object.

    #220  

    @ptp Sorry about having to leave for a night of sleep...it was a big leap towards bringing the Virtual JD to life!

    All the above is correct, and everything seems to be working...except that the extracted values are not being sent to the EZ-B! I will take a look, but I guess you will be able to figure it out in no time!

    Just some stuff not being properly passed down for some reason, I guess?

    Did you get the scene? Could you unzip it? Unity is great, it does not take long to learn the basic user interface...I can make a quick tutorial if you want me to? :)

    PRO
    USA
    #221  

    No files. Did you send an email?

    PRO
    Canada
    #222  

    @ptp Mickey posted a RAR file link

    PRO
    USA
    #223  

    @Nink,

    Thanks i missed that.

    PRO
    USA
    #224  

    @PTP, let me know if you need any help locating anything inside Unity.

    PRO
    USA
    #225  

    @fxrtst, so far no problem. I did most of the changes (code) remotely; it looks good via the remote camera and JD on a power supply.

    @Mickey666Maus, Can you drop me an email ? i would like to ask some questions regarding existent code files.

    PRO
    USA
    #226  

    @PTP, that's great!

    PRO
    USA
    #228  

    PS: No sound

    PRO
    Canada
    #229  

    Hi PTP, VERY COOL. I just installed Unity and paid my $20 for Bio IK; JD_rigged_V4 is up and running. Could you share the EZ_B script please? Thanks

    PRO
    USA
    #230  

    @Nink,

    Folder: https://github.com/ppedro74/ezrobot-playground/tree/master/Unity/Assets

    Script: https://github.com/ppedro74/ezrobot-playground/blob/master/Unity/Assets/Scripts/EZNew.cs

    PRO
    USA
    #231  

    Great job as always. Hats off to you, good sir. FYI, the way I rigged JD, if you select the objects in the left pane called right arm goal or left arm goal and move those around in the main window, JD will move to follow.

    PRO
    USA
    #232  

    I’m sure I will have a thousand questions for you.

    PRO
    USA
    #233  

    Two points:

    1. The Head joint (Y rotation) is not being captured; I believe the joint is assigned to the Z axis.

    2. The Gripper is a single servo, but it is configured using two joints; it must be changed to a single joint tracking the angle over the Y axis.

    PRO
    USA
    #234  

    Ok, I’ll take a look. @mickey moved around some joint angles or flipped some axes around after I posted the file. He said something was not performing correctly. Both are a quick fix.

    PRO
    USA
    #235  

    The gripper issue is that there are no real physics at play, so there are no gears moving against gears to get them to open. If I reduce it to one joint it will move the real gripper servo, but it will not be reflected in the live animation until I can figure out how to attach a separate animation triggered by a slider, which is probably the best way.

    PRO
    USA
    #236  

    BTW, the update speed at 100 ms is near real time...that is amazing. And you almost lost your JD!

    PRO
    USA
    #237  

    I'll update the script to support both WiFi and wired (serial) connections.

    Another option to improve the speed.

    PRO
    USA
    #238  

    Do we have to add the Mono SDK with this setup? We have been through so many setups. Can somebody please make a list of the setup steps from the very beginning to the end?

    PRO
    USA
    #239  

    Yes, you need to add the Mono SDK's EZ_B.dll. It is the same one I have here: https://github.com/ppedro74/ezrobot-playground/tree/master/Unity/Assets

    PRO
    USA
    #240  

    Thanks! Yes wired and wireless would be great!

    PRO
    USA
    #241  

    Files have been deleted, read your email.

    #242  

    @Nink Yes, BioIK is great; it helped us a lot to get full IK running on our robotics project, and Sebastian Starke, the author, is a great guy. I was not aware that uploading the Unity scene might lead to the misuse of his work. I paid for the plugin and I would like anyone using it to do the same, to respect his work!

    So please remove the link from post #223

    :)

    #243  

    @all This is great, now the biggest challenge is done...Full IK is being transferred in real time to the EZ-B! THIS IS AWESOME!

    I could not resist laughing a little when I saw the clip of poor JD, because the IK setup is terrible...I was uploading the scene in a rush, just as a POC so we can get things going! This setup is far, far away from being usable! :D

    I changed the rotation limits within BioIK so they all range from -90 to +90; this is where the mysterious +90 is originating from... I did this to have consistency in servo degrees, but the IK setup has to be changed to make JD move the way we want him to move, and not look like somebody broke his arm! :D

    Also, the Grippers do not belong to the IK chain...I guess the best option will be to either make a slider for opening and closing the Grippers, or attach them to the left/right mouse buttons for live control!

    #244  

    So the next step will be to have a neat IK setup for JD, including Gripper control...which is a good thing, we will learn a lot about hooking things up in Unity and about how to properly use BioIK!

    Another thing we need to start thinking about is the leg controls for JD; we should try to set up the rig in a way that the feet react to the ground...BioIK has examples of how to do this, I have not looked into them yet.

    And lastly we should make a standalone application, so everyone owning a JD will be able to drive their real JD using the Virtual JD we created! I guess we would need a text field to enter the IP address, and we will have to make sure that our servo wiring corresponds to the real-life JD's wiring!

    Great progress, this is huge! :D

    Edit: Just checked the new revision of the script, and @ptp thoughtfully included the naming convention for JD's servo setup!

    #245  

    I checked the script and everything works as expected...this is such a great leap forward! Let's keep on developing the Virtual JD! :)

    PRO
    Canada
    #246  

    Thanks @PTP I will test tonight after work. I updated my post.

    PRO
    USA
    #247  

    So we should now have control of min and max, and also for inverting the servo direction, within the plugin? Are we completely bypassing ARC? Or will any constraints we set on servos in ARC be adhered to?

    PRO
    USA
    #248  

    Also, is the script just built for JD, or is it universal for any character? In other words, will we need to make a new script for every new character, like Six?

    #249  

    #248 Unity is directly communicating with the EZ-B; the way it is set up now it will bypass ARC, so all constraints will have to be applied directly within Unity!

    #249 The script is universal, although you will obviously have to build a model inside of Unity for every robot, plus you will have to register the servos in the script!

    PRO
    Canada
    #250  

    @fxrtst ptp's script is well documented. You just need to add a new ServoDefinition for each ServoPortEnum for Six (it will be a cut and paste after doing one leg).

    PRO
    USA
    #251  

    Ok great. I have a lot going on and am struggling to find the time to download and test.

    PRO
    USA
    #252  

    Quote:

    So we should now have control for min and max and also inverting the servo direction within the plug in? Are we completely bypassing the ARC? Or will any constraints we set in ARC on servos be adhered too?

    Please check my post #179

    The current implementation uses the Mono SDK's EZ_B.DLL. The connection is established between Unity and the EZ-B controller, so ARC is not involved.

    ARC EZ-Script Min Max e.g.:

    
    # Left Gripper
    SetServoMin(d6, 30)
    SetServoMax(d6, 90)
    
    

    equivalent using the EZ_B SDK (DLL):

    
    this.ezb.Servo.SetServoMin(Servo.ServoPortEnum.D6, 30);
    this.ezb.Servo.SetServoMax(Servo.ServoPortEnum.D6, 90);
    
    

    I didn't use those EZ_B functions, but that is not the equivalent of doing an "ARC bypass". Those functions implement the same logic I did, although if you have an application with multiple components and multiple calls to SetServoPosition(s), it makes more sense to initialize the limits once so that all the calls will respect the min/max.

    I'll fix that later to enforce good practices.
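
    For illustration, a minimal sketch of what that could look like, setting the limits once right after connecting (the port and the 30/90 range below are placeholders, reusing the same Servo.SetServoMin/SetServoMax calls shown above):

    public void Connect(string hostname)
    {
        lock (this.lockObject)
        {
            if (!this.ezb.IsConnected)
            {
                this.ezb.Connect(hostname);

                //example only: clamp the left gripper once, so every later
                //SetServoPosition call respects the same range
                this.ezb.Servo.SetServoMin(Servo.ServoPortEnum.D6, 30);
                this.ezb.Servo.SetServoMax(Servo.ServoPortEnum.D6, 90);
            }
        }
    }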

    I'll improve the script to support Serial Connections and via ARC.

    To summarize you have 3 options:

    1. Connection to the EZB using a hostname/port (EZB SDK or via custom code)

    Pro: Versatile mode: wireless, no cables.

    Cons: You can't share the EZB with ARC. WiFi speed is very variable (interference, distance), and there is no 5 GHz WiFi, only 2.4 GHz.

    2. Connection to the EZB using a serial port physically connected to the camera port

    Pro: Has the minimum entropy; you send 2 bytes per servo directly to the controller.

    Cons: You can't share the EZB with ARC. You need a physical connection, and you lose the camera port / camera.

    3. Connection to ARC's HTTP server

    Pro: Versatile mode: wireless, no cables. You can share the EZ-B with ARC; you can use Auto Position, EZ-Scripts, all that "boring" stuff that is keeping you from coding bare metal C or C++ :)

    Cons: The WiFi issues from option 1. More entropy: Unity/Custom App->WiFi->ARC HTTP Server->ARC execution machine->EZ_B.DLL->EZB.

    The protocol is HTTP, a verbose protocol: much more than 2 bytes per servo.

    You pick your poison.

    PRO
    USA
    #253  

    Ok, thank you PTP, that's a lot to digest and it answers a lot of my questions. Obviously each of us has their own project ideas and how we are going to implement this into our robots.

    I personally will be looking into how to get facial mocap to move servos, to work with your Perception Neuron plugin. That way I have a full solution for face and body motion capture.

    I will also be diving deep into Unity, setting up animations keyframe by keyframe, using IK/FK to set up motions like gaits quickly.

    Both of these will likely be playback from files, not live. Your solution is complete. Thanks again, as always, for your valuable time to help us solve this!

    PRO
    USA
    #254  

    The ROS framework uses a URDF (XML) description to describe a robot. That description is useful for identifying the physical robot boundaries used in navigation.

    Using a description model, you separate the application from the robot.

    Maybe Unity allows a "model-plugin-component" or some method to describe the model outside of the Unity project.

    Regarding the JD (Humanoid):

    1. How to improve the walk?

    Per leg: +1 servo between the Ankle (Z rot) and the Knee (X rot); the new servo becomes the Ankle (X rotation). +1 servo between the Hip (X rot) and the body; the new servo becomes the Hip (Z rot). DJ created an STL to attach 2 rotation servos, and it is available for download. +1 (optional?) Hip servo (Y rot); this one should be inside the body, with a new adapter to the previous Hip servo.

    I believe both Nao and the Bioloid have those extra servos. Having them will allow more fluid movements.

    2. Unity

    We don't have an ARC kinematics API; everything is done through scripting/Auto Position.

    This is not an easy task.

    The ROS framework provides the MoveIt framework; the calculations are heavy and slow, but once you wire everything up you don't script the movements.

    Can Unity with plugins help in these areas:

    2.1) Avoid collisions between body parts (you have servo limits, but they are not enough)

    2.2) Restrict the movements when mimicking human poses, e.g. the elbow bending backwards, etc.

    2.3) Move the gripper to a specific position, calculating all the other joints to achieve the final goal?

    PRO
    USA
    #255  

    Quote:

    1) How to Improve the walk ?

    1. I agree that for a better "walk" experience there are missing joints that exist in the human form. More joints will help. There are other ways to create a walk cycle, but it would be trial and error to get it working correctly. There need to be constraints for the feet so you can lock them flat to the ground; that is a missing component in Bio IK. Most other rigging systems have this. I said this before: Bio IK may not have been the best choice for a full IK system. That's why I purchased 3D Puppet, which is more like a conventional IK system; I just never got that far with testing it. A workaround is to add more goals for the knees.

    2. Agreed, and this is why I have been trying to get a visual UI representation for animation for years. FK and IK are simply there to aid in the animation process. Creating an animation allows you to use the IK rig to pose your character to create keyframes. You can (as we are doing now) create a realtime floppy avatar to play with. To me this is NOT where the power in Unity comes from. The leverage is in using IK to build animations. You can create buttons and sliders to control and blend animations using a state machine called Mecanim. This allows you to transition from one animation to the next, i.e. stand idle > sit down > stand up > do a push up, etc.

    2.1) Yes, every part of an object has a collision body already associated with it, meaning if you use the physics engine built into Unity you can keep parts from intersecting one another in the animation window. These can be adjusted until there are no collisions on JD. Keep in mind the more complex a system, the slower it will run.

    2.2) When I first set up the rig on JD I set all the limits of the joints in Bio IK. So there was no way to bend an elbow backwards. Watch my original videos of my tests; the elbow does not go backwards. These limits can all be set in Bio IK.

    2.3) If you watch Sebastian's videos he shows a humanoid robot moving its waist to try and reach the goals. JD has no waist, so his body must be the "root" when creating the IK rig. In a human body you have a "root" AND a waist joint. JD was not created this way; this is a limitation of the actual physical robot, not of my rigging or a limitation in Bio IK.

    #256  

    I think first of all we need to not let ourselves be weighed down by JD's DOF restrictions; we should focus on the main goal...which is creating a Virtual JD, so everyone who has a JD can use Unity to move him as well as the IK system allows!

    The second task is to explore what Unity can do for us. Having a virtual environment with a representation of our real robot should let us describe every movement possible and transfer this in real time, or as playback, to our robots!

    Let's not forget, JD is a testing platform for all that is yet to come! Once we have him set up, we can use JD as a common ground to explore all the possibilities that Unity will offer in the future!

    And yes, Unity offers a lot more than just dragging IK goals in the viewport! I will make an example scene asap!

    And if you are asking me, option one is fine! This is the easiest setup for all of us...Unity to EZ-B, as easy as it can be!

    PRO
    Canada
    #257  

    If you go with option 1, are we giving up all the existing functionality in EZ-Robot (the ability to script, use existing peripherals, plugins etc)? Is there an option 4? MQTT from Unity to EZ-Robot. It has the pros and cons of option 3, but I think we are going to need a protocol to interface not just from Unity => MQTT or HTTP etc => EZ-Robot => EZB, but also from the EZB (ultrasonic sensor, images etc) => EZ-Robot (as well as all other peripherals attached to EZ-Robot like head mounted displays, mice, controllers etc) => MQTT => Unity.

    I can't imagine I will always want to be clicking a ball with a mouse to move the model and robot. The ultimate goal would be a bidirectional digital twin (move the virtual JD and the physical robot moves; move the physical robot's position with EZ-Robot and the virtual robot changes position). With digital servos like Dynamixels you can even go one step further and track the position of the servo => EZB => EZ-Robot => MQTT => Unity, so you can program the virtual model just by moving the robot's limbs. The MQTT protocol at that point could also transmit to other robots (move one physical robot arm and 3 physical robot arms plus the virtual robot arm move): multi-robot synchronized movements...

    User-inserted image

    #258  

    @Nink I totally agree with you; my point was just that the more we get involved in a heavy setup...the slower the progress toward reaching the first goal will get!

    We all have different things in mind; @ptp just mentioned we should add more joints to JD to get better DOF...

    I think your idea is great, but we should first of all get JD properly running; second would be to integrate ARC using its native HTTP server, since it is a single click to add it to everyone's projects...

    After this is accomplished, we can move on to integrating our ideas, like e.g. the MQTT protocol, raising JD's DOF, or Oculus Rift control...there are a million possibilities out there! :)

    #259  

    @all I already started to look into animating objects in Unity. It works well to animate the IK goals, but the calculation of the joints will only be seen once Unity jumps into play mode. I am not sure yet if there is a way to actually see your robot make those moves while you are animating; I might ask the author of BioIK about this...

    I will look deeper into this, and also into setting up constraints in Unity, so we will not experience any weird flips or jumps in servo rotation when setting up animations for JD! :)

    PRO
    Canada
    #260  

    No worries @Mickey666Maus, I was just making sure we all knew we were making a conscious decision to move off EZ-Robot as the platform. There are a lot of plugins etc for Unity3D; it is cross platform and the leading 3D game dev engine, so it would make a great development platform. It also gets us off our dependence on Windows and .NET.

    If we go back to the original post in this thread, @fxrtst's goal was a virtual robot connected to an Oculus Rift and the EZB. Unity supports Oculus, so that would solve his goal as well. I guess this would be our new architecture.

    User-inserted image

    #261  

    @Nink I really love those visual flowcharts you are making; is it an app or do you just make them up by yourself? It makes it very easy to see how stuff branches out and to not lose focus on what has to be addressed to keep up a good pace of progression!

    I got kind of excited about the idea that even if we were only a small team, like the four of us who are mainly participating in this thread...so you, @ptp, @fxrtst and me...and each of us had the equivalent of one hour a day to put into this, we would have at least 28 hours a week to work on getting this ahead!

    @ptp set up a project at Bitbucket where we can keep things up to date with Git, and we could try to find the best fields for each of us, to divide up what has to be solved!

    I guess @fxrtst and I would be good for modeling and animating, and you guys could help with coding?

    So for this I guess it would be very good to have clean, structured goals! :)

    This seems to be very promising, and a big leap for all of us who are into robotics! :D

    PRO
    USA
    #262  

    @all, just my opinion. After reading through the options that PTP has laid out, I guess I was hoping that we had an option #3 with the ability to have a tethered serial mode for onboard computers. I definitely would miss using the abilities of ARC. A mixed-mode environment is important to me, i.e. the ability to use both regular RC servos and Dynamixels, especially for advanced robots.

    As far as advancement in the model, I'm with Mickey on this one. Getting everything to work as a POC is more important than having everything rigged properly with working dynamics etc. No reason to spend energy on perfecting the model if we are unable to use Unity and robots the way we want.

    The next step is animation. If we can't get that to work correctly, then Unity will only be useful as a live puppet-playing machine. It would solve the title of this thread, but I think we all agree we are way past that with our current exploration.

    A small digression, but an important one for me. Let me say I'm in awe of anyone that can play an instrument, read sheet music, or anyone who can code. I struggle with understanding these things. My head is full of all the thousands of things I can do proficiently, like 3D animation, 3D printing, makeup and effects, animatronics, video editing, motion capture, and on and on.

    As an artist I'm a visual learner. I'm challenged daily by sequences, and when I ask a question for clarity or an explanation about something, imagine you are standing in front of a child who knows nearly nothing. As an example, I need to know what every component is and where it goes in sequence. In our little project here, I have no idea where the C# script that PTP made goes or how it is applied to the scene. I also struggle with setting up or understanding how the HTTP server works.

    A detailed description, step by step, of the procedures would be helpful to me, i.e. "first.....place this here...second add this there...then connect this to that." Usually after I study something, I eventually get it.

    Imagine if I handed you a paint brush, prosthetic, make up and glue, then pushed you into a room and said "Now age Brad Pitt to 75 years old, oh and you have three hours to complete it." Where do you start?

    #263  

    Yes, this is exactly what I would like us to do: get the Virtual JD running first...in a way that we will all just have to open the project in Unity and hit play!

    So the way @ptp is setting it up now is very direct...there is no way to have it not working, because Unity is talking directly to the EZ-B! I hope I can get to work on the scene a bit tonight or tomorrow, it's almost there. And you do not need to understand every aspect of the script, since @ptp coded it in a way that once set up it does not need to be attended to in any way...

    Once everything runs smoothly, we can progress to tethering by cable, MQTT or the HTTP server...but this should be the step AFTER we have all the rest sorted out, otherwise we will have too many loose ends.

    Correct me if there is a better approach... :)

    PRO
    USA
    #264  

    @nink, love the visual flow charts!

    #265  

    This project plugin looks awesome. I'm super excited; this sounds exactly like what I've been wanting to do!

    PRO
    Synthiam
    #266  

    Rather than using the HTTP server, I recommend a plugin that binds to a TCP or UDP socket and accepts a binary protocol. This will be faster.

    And, in the plugin you can add a filter to smooth the servos. And you can add a recording feature. And you can export to frames in Auto Position if you want.

    Plugin gives you more control than simple http
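
    To make that idea concrete, here is a rough sketch of what the Unity-side sender of such a binary protocol could look like. The port number and the two-byte packet layout (servo index, position) are assumptions, and the matching ARC plugin that listens on the socket would still have to be written:

    using System.Net.Sockets;

    //Hypothetical sender: packs each servo update as two bytes and pushes it
    //over UDP to an ARC plugin that would listen on the given port.
    public class ServoUdpSender
    {
        private readonly UdpClient client = new UdpClient();
        private readonly string host;
        private readonly int port;

        public ServoUdpSender(string host, int port)
        {
            this.host = host;
            this.port = port;
        }

        public void Send(byte servoIndex, byte position)
        {
            var packet = new byte[] { servoIndex, position };
            this.client.Send(packet, packet.Length, this.host, this.port);
        }
    }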

    #267  

    Yes, after we have fixed all the issues in Unity, an integration with ARC would be great!

    I started working on a better IK solution...it's far from done, but some progress has been made! :)

    @ptp I would like to share the scene, but I do not want to have the problem of BioIK being included in the upload. Could you advise me on how to use Git correctly to update the Bitbucket project? I don't want to mess it up! :D

    #268  

    I guess next things to do would be...

    1. Setting up Gripper controls for JD
    2. Setting up a text field to input the IP of the EZ-B
    3. Setting up IK for JD's legs...

    I guess one essential thing would also be to look at Puppet 3D; since we are getting very involved in BioIK, it would be great to check what other systems offer... Just to make sure we are not heading into a dead end! :)

    PRO
    USA
    #269  

    Sorry guys, I've been pretty busy with the premiere of Avengers: Infinity War. I hope to get back involved later next week. You should contact Sebastian and see how he produced the walking video where the character is walking over rough terrain. There must be a way to create a collision and/or lock down the feet on any surface.

    Yeah, the reason I was hoping for a separate plugin like ArdUnity is that it's independent and could be added to any game object. I think we have a problem depending on Bio IK. It needs to be independent so we have the freedom to try other plugins like 3D Puppet or the multitude of other plugins.

    PRO
    Canada
    #270  

    Nice work @mickey very smooth (Mine still jumps around and has a broken arm thing going on :-)

    @DJ I am not sure a hard-wired plugin is the right approach here. We need something that is open and extensible and that can integrate with multiple systems. The 3D platform and the robot need to interact with the world around them and work with other systems. Closed systems and proprietary protocols are going the way of the dinosaur. Yesterday IFTTT received another $24M in funding (total $63M), so even the big players realize interoperability is essential. This is why I thought MQTT was a good approach, especially as you already support a client and server.
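
    For reference, a rough sketch of what publishing a servo position over MQTT could look like from C#; this assumes the M2Mqtt library and a made-up topic layout, and is not an existing ARC or Unity integration:

    using System;
    using System.Text;
    using uPLibrary.Networking.M2Mqtt;

    //Hypothetical publisher: sends a servo position to a broker topic such as
    //"jd/servo/D0"; a subscriber on the ARC (or Unity) side would map the
    //topic back to a port and apply the position.
    public class ServoMqttPublisher
    {
        private readonly MqttClient client;

        public ServoMqttPublisher(string brokerHost)
        {
            this.client = new MqttClient(brokerHost);
            this.client.Connect(Guid.NewGuid().ToString());
        }

        public void Publish(string port, int position)
        {
            this.client.Publish("jd/servo/" + port, Encoding.UTF8.GetBytes(position.ToString()));
        }
    }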

    #271  

    I can already think of a dozen clients in DC who might appreciate this software.

    PRO
    Synthiam
    #272  

    Then create a plugin that uses MQTT. Even with the significant performance challenges of that protocol's overhead for a data-stream application, it’ll work. Either way, the point of my suggestion of a plugin as the receiver is specifically about what to do with the data. I’d recommend rereading my previous response to save me from retyping :)

    A plugin is very easy and gives you complete control over what to do with the data.

    PS, this is an impressive, one-of-a-kind project. Proud of it originating here!

    PRO
    USA
    #273  

    It's a team effort with some very promising results.

    PRO
    USA
    #274  

    @mickey I'm going to do some testing with 3D Puppet. It performs more like a traditional rigging system. Just need to add 100% weights to all the objects so the hard surfaces don't bend.

    #275  

    This is a great forum that brought us all together! :)

    @fxrtst I am really looking forward to seeing what can be done in Puppet 3D. I am aware that the greatest disadvantage of BioIK is that we have to enter play mode to see the IK being carried out! On the other hand, most IK systems that I tried gave me a really hard time setting up axis restrictions the way I needed them. Plus extracting the rotation values was a nightmare! BioIK does all this for us!

    I am very curious about what you will find out! :)

    #276  

    I added Gripper control via the mouse wheel. The IK goals also got a translucent material, so you can see the Gripper better. I also added a head constraint, just to show more motion! :)
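
    For anyone curious, the mouse wheel part is roughly this kind of thing (a sketch only: the D6 port, the step size, and the assumption that the EZBController singleton from ptp's script is made accessible outside RotNew are all placeholders):

    using EZ_B;
    using UnityEngine;

    //Hypothetical gripper control: accumulates mouse-wheel input into a
    //0..180 servo position and pushes it to the EZ-B on every change.
    public class GripperControl : MonoBehaviour
    {
        private float position = 90f;

        private void Update()
        {
            float scroll = Input.GetAxis("Mouse ScrollWheel");
            if (Mathf.Approximately(scroll, 0f))
            {
                return;
            }

            this.position = Mathf.Clamp(this.position + scroll * 30f, 0f, 180f);
            EZBController.Instance.SetServoPosition(Servo.ServoPortEnum.D6, Mathf.RoundToInt(this.position));
        }
    }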

    PRO
    USA
    #277  

    Very nice. I like the idea of connecting the grippers to the mouse wheel vs a slider, so you can move the goal and open and close the gripper at the same time.

    I'm still working through 3D Puppet and will publish my findings soon.

    #278  

    I added leg IK and also an input field for the EZ-B's IP address...

    I have been racking my brain about this, but I cannot figure out any way to have JD interact with the floor...this might be just me not getting it done, or it might be due to JD's DOF limits? If anyone has an idea on how to do this, please give it a shot...I cannot!

    What we could do is animate a walk cycle by hand, and have something like a remote-controlled walk/stop/turn left/turn right thing hardcoded within the app?

    Any thoughts on this? :)

    PRO
    USA
    #279  

    Looking good!

    I worked all weekend with 3D Puppet and I can say that because it's designed for single-skinned organic objects, it does not play well with hard-surface objects. There is also the issue of not being able to clearly turn off axes easily. There are a lot of issues, and I decided to scrap it due to how difficult it would be to set up a character.

    The only thing we are missing in Bio IK is the ability to move from IK to FK, and the ability to lock a goal without it floating. In a standard rig you have goals locked to, say, an ankle, and this ankle/foot has its own controls. That makes animating a walk cycle much easier using IK.

    Sebastian has shown the robot character walking over land. But this in fact could be another plugin at work. I found one called Mec Foot Placer in the Asset Store. Might be worth looking into.

    But to the point: I have tried everything to get the IK to interact with the floor with the built-in physics, but no luck yet. I tried adding rigid bodies to each of the feet. It must have to do with how his IK system works.

    #280  

    Thanks for spending the time checking out 3D Puppet, I already figured it might not work...but since we are spending a lot of time with BioIK, it is not the worst thing to check what other plugins might offer! I spent countless hours trying to build a mechanical rig for my robot in 3ds Max, it was a nightmare....and extracting rotations was horrible. This is why BioIK was such a great find; I think this was our breakthrough to get stuff done!

    You are right that we cannot have an IK/FK blend, but actually locking IK is the opposite problem: the IK seems to always be locked to the objectives...but it does not bother me that much, I think we can animate the model traditionally pretty well!

    I think the problem is more that JD has no hips and no ankles...this was how far I got before I abandoned the idea to have JD interact with the floor! :)

    #281  

    Sorry I'm not contributing as much as I'd like, but I'm extremely new to all this and would likely get in the way more than anything. So is the idea to be able to model a robot in Unity, then map servos for control through ARC? Oculus support?

    #282  

    Yes, this is the goal...Oculus support is also something I would really like to add to this project! :)

    #283  

    @fxrtst I found out how Sebastian made the IK work for reacting to the ground...first he made a traditional animation, then he created a bone system in Unity and BioIK is set up as usual. Lastly a Projection objective is added. This object will modify the walk cycle through BioIK in a way that the ground is always taken care of!

    So we should use our 3D animation package to create walking motion traditionally, and then we can use BioIK to modify those motions!

    I will create an example using JD... :)

    #284  

    So cool! I envisioned using a telepresence robot with an Oculus headset and Oculus Touch (or gloves) to control the hands! Think of the possibilities if this works!

    #285  

    I was able to get my robot's head to move around by using Bigscreen to log in through Oculus to my desktop and then manually manipulating the servos through ARC. A ridiculously clunky and inefficient way of doing telepresence, but it just shows this concept IS very possible!

    PRO
    USA
    #286  

    @mickey That was the missing piece. He did not include documentation for the other objectives; I was wondering how/what the projection objective was for.

    JD is definitely a challenge for the first setup in our tests, in part due to his physical limitations and missing joints. I’m really looking forward to setting up some animations and getting a real robot to reproduce them live.

    #287  

    The next step for the Virtual JD will be adding reference joints, so Unity will recognize him for further animations. After doing so we can hardcode animations, like e.g. a walk cycle...which can be altered live by the BioIK plugin.

    Also I will add a recorder, so motion can be recorded and played back... :)

    #288  

    I tried to find out if Microsoft's augmented reality headsets could be linked to EZ-Robot, since it is Microsoft software and hardware. If it has previously been discussed, ignore this reply.

    Thanks Ellis

    PRO
    USA
    #289  

    Here is a test with a hexapod I'm designing for my son. DJ is definitely on the right path by suggesting an export/import plugin for the Unity/ARC bridge. Or maybe a live feed into ARC, and then you could record the motions with the servo recorder? You can create looping walk cycles in Unity easily.

    Anyways here is a quick test.

    #290  

    Yeeeeeeeaaaaaahhhhh! Looks awesome!

    I guess all the different routes will be something that we can benefit from! We can record and play back motion straight out of Unity; I am working on a standalone .exe with a UI... But also going through ARC, or through a Raspberry Pi using the Mono SDK, will be a very sweet option to explore all the different options given! :)

    PRO
    USA
    #291  

    I have to be honest, I did not try out PTP's scripts and the EZB. I need some details on setting it all up. This is with ArdUnity and Bio IK. I look forward to your standalone solution!

    #293  

    @fxrtst Well, it's more about getting used to Unity at the moment I guess...all of the stuff we learn there can be translated later! :)

    #294  

    Actually, as far as I understand Unity, the best practice is to rig and animate your model using 3rd party applications and use Unity to blend and modify those motions! I am struggling to get JD rigged properly; I cannot figure out how to create the proper constraints! Anyone out there know how to do this?

    I started to explore BioIK more...it is really interesting that you can use full body IK, which means, if you move the Objective e.g. to the ground, the robot will bend his upper body to reach out for it! :)

    PRO
    USA
    #295  

    That’s good. I was missing how to do full body IK with Bio IK. Did you move the root to another location?

    I agree that Bio IK needs constraints as well as objectives. I did manage to add a rotational objective to the last joint in the hex leg. This almost immediately stopped the annoying jerking that happens. To be clear, I added a box as a child to the last joint. This box was then chosen as the objective for the rotation of that last joint, but I do not move this goal. This somehow makes the IK chain behave better, as seen in the video.

    #296  

    @fxrtst You can add a displacement Objective to your root...this makes a huge difference to those nasty jumps too! I already wrote to Sebastian asking how to lock joints to a certain position!

    I also added a box object as a root to the hierarchy; if you do so you will get full body IK! :)

    PRO
    USA
    #297  

    Ahh perfect, I missed adding a game object as the root! Cool. I’ll try messing with the displacement objectives!

    #298  

    Hey guys, anything new here? I have no time on my hands at the moment; my daytime job is really not leaving any spare time for robotics...but I hope it will change soon, and I can continue helping to build this addon for EZ-Robot! :)

    PRO
    USA
    #299  

    I’ve been pretty busy printing up parts for my son's hexapod, troubleshooting printers, and dealing with some family issues. It’s going to be slow going for me as well. I'm pretty sure all of June will be busy too.

    Once I finish this hex I’m going to rig it in Lightwave and animate it there. It’s an environment I’m used to, and I will be able to easily create looping animations to test with. I will then export as FBX and import into Unity. Hopefully you guys can walk me through setting up the EZB with Unity. This hex uses every digital port on the EZB!

    Anyways, hopefully we can continue moving forward. I’d like to see a capture record/playback for exporting from Unity and importing into ARC!

    Below is the box of parts... only about 50% done :(

    User-inserted image

    #300  

    Man that print looks awesome, where did you get the model from... Did you design it yourself?

    I guess rigging and animating it in Lightwave is the way to go; all my research showed me this is best practice! I might also have to get some personal things sorted out before I can get back to robotics, but it's itching...so let's see! :)

    PRO
    USA
    #301  

    Yeah, I created the hex in ZBrush for my son. Here are a couple of pics of the concept and then the final model. So many parts...what was I thinking?! It's so huge I'm not sure if it will walk, but I'm going to give it a try!

    User-inserted image

    User-inserted image

    User-inserted image

    #302  

    OMG, how did you manage to pull this off so quickly? Great model! I cannot wait to see it in action! :D

    PRO
    USA
    #303  

    Just about 2 weeks designing, and another week til all the parts are completed printing. Then I will assemble and test the legs and add weight to it to see if it will stand up!

    #304  

    It looks amazing! Which type of servos will you be using...will the body and its arms also be moving? :D

    Great job!

    Is there an enclosure for the EZ-B in the back of its body? ;)

    PRO
    USA
    #305  

    I’m using the HDD servos from EZ-Robot. His body will rotate and tilt; his head will move up and down and side to side...well, as much as he can with his design. Then the arm with the cannon goes up and down; the other arm is stationary (with a hand). LEDs are in the eyes and cannon. The EZB fits inside the chassis. I had to add a piece below for the battery to fit inside of, and the device on top houses the speaker. He will be loaded with all kinds of sounds!

    PRO
    USA
    #306  

    @fxrtst: Nice build!

    Are you developing new servo brackets ?

    PRO
    USA
    #307  

    Thanks! I've developed the brackets just for this build. They would be useless for anything other than this robot. I wanted the servos to look as though they are part of the design. I've printed them in PETG, and I will be running a power test next week to see if they will hold up to the weight. At this point I think it will weigh about 12 pounds. Hopefully with the weight distributed over 18 servos it will manage. Because of its size and the flex in PETG, I'm afraid it might oscillate as it walks too. Test, test, test. Time will tell!

    PRO
    USA
    #308  

    As the 3D printers finish up the last parts over the next few days, I want to play around with the idea of recording and playing back animations from Unity. I still need the functionality of EZB and ARC for a full featured experience on this Hexapod.

    I'm thinking that a simple plugin that records the animation data and plays it back seems like the most likely (easiest?) way to demonstrate Unity and ARC working together-ish. I don't think it is important for this demonstration to have a "live" connection. But this could be a great demonstration of recording and playback.

    @ptp, is this something that is possible? Does anyone know if a CSV file is the best way to go? Then create a plugin in ARC that reads the recorded data?

    PRO
    USA
    #309  

    I've done a bit of research...and Unity XML will save out a Vector3 with translation and rotational values...is this something that could be used instead of CSV? Looking at options.
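
    As one possible minimal approach on the Unity side (a sketch only; the file location and column layout are arbitrary choices, and it reuses the BioJoint accessors from the earlier scripts):

    namespace BioIK
    {
        using System.Globalization;
        using System.IO;
        using UnityEngine;

        //Hypothetical recorder: appends one CSV line per frame with the elapsed
        //time and this joint's X/Y/Z target rotations, for later playback.
        public class JointCsvRecorder : MonoBehaviour
        {
            private StreamWriter writer;

            private void Start()
            {
                this.writer = new StreamWriter(Path.Combine(Application.persistentDataPath, this.name + "_motion.csv"));
                this.writer.WriteLine("time,x,y,z");
            }

            private void Update()
            {
                BioJoint joint = this.GetComponent<BioJoint>();
                this.writer.WriteLine(string.Format(CultureInfo.InvariantCulture, "{0:F3},{1},{2},{3}",
                    Time.time,
                    Mathf.RoundToInt((float)joint.X.GetTargetValue()),
                    Mathf.RoundToInt((float)joint.Y.GetTargetValue()),
                    Mathf.RoundToInt((float)joint.Z.GetTargetValue())));
            }

            private void OnApplicationQuit()
            {
                this.writer.Close();
            }
        }
    }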

    #310  

    @fxrtst Good luck with the PETG, I never used this for printing...but I heard a lot of good things about it! I started printing my robot parts in PLA and found that over time the shoulders started deforming from the weight...so I switched to ABS. Which in turn is not the most forgiving material to print; warping is a big issue. The Prusa printer has no enclosure, which does not make life easier!

    I heard that Polymaker PC-Max should be one of the strongest filaments out there, but it is quite pricey!

    Getting the rotational values formatted into XML or CSV should not be a problem! Once you get it all set up, a test to check if live playback out of Unity works would be fun too!

    If you ever have time to do a tutorial, your workflow from ZBrush to printing would still interest me a lot...how do you get the exact measurements for your servo brackets in ZBrush? It is so hard to build correctly scaled geometric things within ZBrush...and then have them printed out correctly! I could never figure out how to do this!

    #311  

    @fxrtst You will have zero problems with the PETG; my entire InMoov robot is printed with that stuff, and InMoov is heavier than 12 lbs. Even Bob Houston's waist, which holds the entire upper body, does not flex. The only brand I use is MG Chemicals; other brands I tried were garbage. You bond the parts using Weld-On 3; it is just like using acetone with ABS.

    PRO
    USA
    #312  

    @Mickey I will indeed do a tutorial on my process. I've learned a lot since starting 6 years ago with Alan. ZBrush has come a very long way and can now do polygonal modeling (box modeling), but I still cut my models like clay with DynaMesh. It gets rough and the models look like cr*p up close, but when you print at 100, 200 or 300 microns none of that matters. The model resolution is higher than the printer can print, so it turns out fine!

    Yes, I'm up for any test once it gets finished. And XML/CSV out of Unity/ARC will be much appreciated!

    @nallycat Yes, I've used PETG for the past two years and it is very strong. But I've not tried MG Chemicals...now I gotta look! I've used eSun and it has not jammed on me or delaminated etc. The stuff is super strong and affordable. The only problems that might arise would be from poor design work or servos not being able to handle the weight. LOL.

    PRO
    USA
    #313  

    2 more models to print, then it's test time. I'll likely just rig the legs to the chassis, slowly add weight to the chassis to try to reach the total weight of the completed model, and then walk it.

    Here are a couple of progress shots, unsanded, right off the machines.

    User-inserted image

    User-inserted image

    User-inserted image

    User-inserted image

    User-inserted image

    #314  

    @fxrtst Now I am getting really jealous!

    Btw...what are those metal things near the servos, bearings?

    PRO
    USA
    #315  

    Which metal things are you referring to?

    User-inserted image

    #316  

    Hahaha...not the ones in the box, the things you attached to those servos! :)

    PRO
    USA
    #317  

    Ah! Those are extra strong servo horns by a company called Actobotics. They have so many cool modular parts for robots. I’m using a lot of them for Alan’s arm and torso.

    User-inserted image

    User-inserted image

    PRO
    Synthiam
    #318  

    Man - this is gonna be amazing. Will you be creating a Project Showcase for it? So it can be included in our newsletters and features?

    #320  

    Thanks for the info... I will check out those parts, they look strong and helpful for building! :)