Use Meta Quest 2 VR headsets to control robots by viewing their cameras and manipulating servos with natural hand gestures or controllers. This skill brings a futuristic touch to remote robot operation, enhancing safety and efficiency in various work environments.
How to add the Oculus Quest Robot robot skill
- Load the most recent release of ARC (Get ARC).
- Press the Project tab from the top menu bar in ARC.
- Press Add Robot Skill from the button ribbon bar in ARC.
- Choose the Virtual Reality category tab.
- Press the Oculus Quest Robot icon to add the robot skill to your project.
Don't have a robot yet?
Follow the Getting Started Guide to build a robot and use the Oculus Quest Robot robot skill.
How to use the Oculus Quest Robot robot skill
Use the Oculus Meta Quest 2 VR headset to view any robot's camera and control any robot's servos with your hands or controllers. Servos can be attached to finger movements, hand movements, controllers, and buttons. Additionally, a built-in remote desktop controls ARC without removing the VR headset. This robot skill requires the companion Meta Quest 2 app (see the download section below) to control a robot.

Transform your robot interaction experience with the Oculus Quest robot skill for Synthiam ARC, which enables immersive control of robot cameras and servos using the Meta Quest 2 VR headset. This skill offers a hands-on approach to robotics, complete with a remote desktop feature for seamless ARC control, promising a future where remote work and safety in hazardous environments are enhanced through virtual reality.
Why Control Robots From VR?
While it is exciting to use this robot skill and have a robot mimic your movements with VR accessories, it is also a preview of future workplace technology. Robots can perform many jobs, but the computing power to fully automate those tasks is not yet available. Today, VR technology is advanced enough to allow humans to control robots remotely and accomplish tasks. This means, for example, that warehouse workers can work from the comfort and safety of their homes while improving outcomes by remotely controlling robots. As artificial intelligence improves, robots will require less human intervention, but humans will still be needed to help a robot when it gets stuck. This means one person can control many robots, complementing Synthiam's approach with our Exosphere product.

Imagine people working from home while operating robots remotely in dangerous locations, such as mining, disaster recovery, or space exploration. With Synthiam's agnostic approach, our platform can program any robot. With the addition of virtual reality remote control, we are saving lives and improving job safety. There is no better time to familiarize yourself with this technology at home, school, or the workplace.
Downloading Quest Robot Controller App
Get the Oculus Quest Robot App: Oculus Quest Robot.apk (v0.10, June 25, 2022).

*Note: Due to complications with Meta's review process, this app must be side-loaded while we work with Meta to have it accepted into their store. Meta is currently disallowing this app because it connects to a PC. We're working to convince Meta that this useful VR app should be accepted through their review process so it can be used by students, robot builders, and inventors. Stay tuned!
Stereoscopic Camera View
This robot skill and the Oculus Quest VR app have optional support for two cameras (left and right eye), providing the user with a 3D stereoscopic view. If two cameras are not required, a single camera can be used instead. The position of the two cameras is critical to prevent eye strain for the viewer. It is generally advised to measure the distance between your pupils (interpupillary distance) and use that measurement as the spacing between the centers of the two camera lenses.

Using the Oculus Meta Quest App with Virtual Reality
When the Oculus Meta Quest robot control app is loaded, you will find yourself on the bridge of a spaceship. The left controller moves your view around the ship to explore. The right controller interacts with the menu that connects to the robot.
Enter the IP address of the Synthiam ARC software. The robot skill displays the public and private IP addresses you can enter in this field. If using this robot skill on your local network, use the local IP address. If connecting over the internet through a forwarded port on your router, use the public IP address.
The port can be edited, but the default value is 8082.
Pressing the connect button will load the robot control view.
About The Oculus Meta Quest 2
The Oculus Meta Quest 2 (marketed since November 2021 as Meta Quest 2) is a low-cost virtual reality (VR) headset developed by Facebook Reality Labs (formerly Oculus). It is the successor to the company's previous headset, the Oculus Quest. The Quest 2 was unveiled on September 16, 2020, during Facebook Connect 7.

As with its predecessor, the Quest 2 can run as a standalone headset with an internal, Android-based operating system, not requiring a PC or high-cost graphics card. It is a refresh of the original Oculus Quest with a similar design but lighter weight, updated internal specifications, a display with a higher refresh rate and per-eye resolution, and updated Oculus Touch controllers.
Main Window
The main window displays real-time tracking information from the Quest input devices. Either the controllers or hand detection can be used. If hand tracking is used, each finger position will be displayed, starting with the thumb, along with the hand's position. If controllers are used, the controller location, analog stick, button, and trigger positions will be displayed.

Controller Inputs
Both hand tracking and controller tracking can control their respective servos. When the controllers are set aside and hand tracking is detected, the configured hand servos are enabled. When the controllers are picked up, the hand servos are disabled and the controller servos are enabled. This switches automatically based on whichever control method you are using, and switching can be done on the fly.

Configuration
Press the configure button on the robot skill to view its configuration screen.

Headset
The servos for X (horizontal), Y (vertical), and Z (tilt) can be assigned. These servos will move with your headset to look around. Generally, you will want these servos attached to a camera so you can look around at the environment with the headset.
This option pauses the headset servos while any Movement Panel is moving the robot. When checked, the headset servos are only active while the robot is not moving. This is useful for reducing motion sickness while the robot is in motion.
Left Hand & Right Hand
The position tracking group will bind servos to the hand's position. The X is horizontal, Y is vertical, and Z is the distance outward from your body.
The finger tracking group allows assigning servos to each finger. Curling each finger into a fist moves the respective servos. This is useful when using a robot hand, such as the InMoov's, to control each finger individually.
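As a rough sketch of how a tracked finger could drive a servo, the curl can be treated as a normalized value and scaled into the servo's configured limits. The function name, ranges, and the clamping behavior below are illustrative assumptions, not the robot skill's internal code:

```python
def curl_to_servo(curl, servo_min, servo_max, invert=False):
    """Map a normalized finger curl (0.0 = open, 1.0 = fist)
    to a servo position inside the configured min/max range."""
    curl = max(0.0, min(1.0, curl))  # clamp out-of-range tracking noise
    if invert:
        curl = 1.0 - curl
    return round(servo_min + curl * (servo_max - servo_min))

# Example: a hypothetical finger servo limited to 10..170 degrees
open_pos = curl_to_servo(0.0, 10, 170)    # open hand -> servo_min
fist_pos = curl_to_servo(1.0, 10, 170)    # closed fist -> servo_max
```

Inverting is useful when the servo's rotation direction opposes the finger's curl direction on a given hand design.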
Left Controller & Right Controller
The position tracking will bind servos to the hand's position. The X is horizontal, Y is vertical, and Z is the distance outward from your body.
Triggers and analog sticks can also control servos. The triggers are index trigger and hand trigger. The analog stick for each controller can be assigned servos.
Scripts can be assigned to the A and B buttons when pressed or released.
The TCP ports for the servo server and remote desktop are read-only; they cannot be changed.
Enable/Disable the remote desktop feature. This feature has no password protection, so be cautious if you enable it. Do not enable this feature on public networks.
The number of frames to skip when transmitting servo positions. If too many servo positions are set at once, the EZB may brown out from an insufficient power supply.
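The skip setting can be pictured as a counter that only lets every (N+1)th tracking frame through to the EZB. This is an illustrative sketch of that throttling idea, not the skill's actual implementation:

```python
class FrameSkipper:
    """Forward only every (skip + 1)th servo frame to the EZB,
    reducing command traffic and the risk of power brown-outs."""

    def __init__(self, skip):
        self.skip = skip      # frames to drop between each sent frame
        self._count = 0

    def should_send(self):
        """Return True when this frame should be transmitted."""
        self._count += 1
        if self._count > self.skip:
            self._count = 0
            return True
        return False
```

With `skip = 2`, two of every three incoming frames are dropped; with `skip = 0`, every frame is sent.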
Left Camera (or single camera if using only one camera)
Right camera (optional if using only the left/single camera)
Remote Desktop
The remote desktop lets you move the mouse with the right controller. You can use the trigger to left-click and the hand-grab to right-click. The remote desktop feature does not provide authentication, so only use this on private networks.
Press the menu button on the left controller to exit the remote desktop mode.
The analog stick on the left controller can be used to move around the floating screenshot window.
Robot Control
When viewing the robot control, the menu button on the left controller will load the menu to enable and disable specific servos.
In the popup menu, you can adjust the camera depth for comfort.
The return to lobby option will do just that, return you to the main menu lobby.
The remote desktop can remotely control ARC through the VR headset if the robot skill configuration is enabled.
We are waiting for the app to be available in the Oculus store. A link will be provided in the robot skill manual above when approved.
The Oculus Quest Robot App can be downloaded on the Oculus Quest 2 headset from here: https://www.oculus.com/experiences/quest/5090145164400422
Also see a link in the manual description above
Anyone have one of these that could shoot a video? Would love to see it in action before I spend $620 on another VR headset. Is the 128GB headset sufficient, or do you need 256GB?
I don’t think the games will impress you if you come from Steam VR with a real GPU. The 128GB is sufficient because the games are tiny for it. It’s about as powerful as a 2015 iPhone doing VR, IMO. The games are very cartoony and do not have much detailed texture.
But, as for controlling robots, it’s super great - probably the best. The hand tracking is marvelous. No controllers are needed - and no computer is needed (other than for arc).
There are a few other features we are throwing in soon. Such as Remote Desktop into an ARC so you can control it from the headset. As it is now, it’s super stable and surprisingly impressive as an all-contained unit.
I should also add that finger tracking is perfect for inmoov owners. I think that’s super awesome.
OK thanks, I will wander over to Best Buy in the morning and get one. Hopefully I can finally throw out my Wii controllers and the Vuzix 920VR I use to control my robot (I think you and I are the only people who still own a Vuzix anyway). Now I just need to upload my robot to the metaverse...
Oh gosh, that's funny - I forgot about the Vuzix! Yeah, that can be donated, haha. It'll be pretty amazing once you can control ARC from this headset. I'm excited about it.
I think there's also a scan ability being added to the app. So it can scan to find arc. I haven't seen the feature list in-depth, but it's pretty incredible what Synthiam is doing with it.
Definitely excited to start using this! I'll download it tonight and let you know how it goes!
I have the 128GB Version and enjoy the games on it. Moss is an amazing game on there and I enjoy playing Walkabout mini-golf too
Beat Saber is a good game if you're looking for exercise! I don't think I've ever not broken a sweat playing that game.
EDIT: After looking more closely, it seems that I'm hoping for way too much out of this. It looks like one can't actually edit ARC using a virtual keyboard and mouse, but only move servos. I've been watching way too much Star Trek Discovery lately. LOL.
Well, I did my absolute best to try and find this app on my Quest 2 and my phone/laptop but I just couldn't find it.
First off, I tried the link provided (on a laptop, Quest 2, and phone) and it only goes "Whoops! It looks like you're lost" Not found page. I'm assuming this is because the app isn't officially accepted into the main store yet.
It seems that there isn't a direct (Quest) Meta App Lab interface. App Lab apps can be searched through the Quest app store (on the Quest 2) or through Sidequest (on a phone or computer) but I just couldn't find the Synthiam app. I'm guessing the App Lab interface is on the developer side.
I tried searching for: "Synthiam, ARC, Robot control, Robot skill, Robot, 5090145164400422" on Sidequest and the Quest app store but I didn't have any luck with finding the Synthiam app.
Any suggestions?
Haha really? That’s funny - Not sure why the app is in limbo. I can post the apk and you can load it from the oculus developer hub. But that’s not a long term solution. I’ll have to reach out and find out what’s going on
I was running firmware version 37 so I upgraded to 39 but I didn’t see any difference in terms of ability to see the Synthiam app, unfortunately.
I can’t see how the firmware would make a difference for available apps. But the update does have improved hand tracking! I’m guessing the availability of the app has something to do with a backlogged review process of the store? I can only guess because there’s no communication from them
I wonder if it is like the Apple store. You submit, pray and wait. Then wait some more.
If you can sideload the APK, here it is while we wait for the review (v0.5): Oculus Quest Robot.apk
Here is the latest APK for sideloading (v0.7) while we wait for Meta to review our app submission: Oculus Quest Robot.apk
Right on! I'll try to sideload the app tonight. I believe I already activated developer mode so it shouldn't be too difficult to install.
Do you guys have a tutorial that you use to sideload quest apps? Figured I'd ask before I go looking.
I just press the upload button in the oculus developer hub.
To expand on DJ's response, the developer hub can be downloaded from here: https://developer.oculus.com/downloads/package/oculus-developer-hub-win/
Synthiam does not support adding apps to the Quest using this method, so there will not be a formal tutorial. Once Meta reviews and accepts the app, it will be available to add to your Quest directly from the store.
In the meantime, if you use the developer hub software, the apk can be installed by selecting your Quest from the devices menu. There will be an upload button that will allow selecting an APK to be added. I hope that helps!
Robot skill updated to v4
Fix for hand servo configuration not saving
Added remote desktop server (configurable in General tab)
Robot skill updated to v6
Meta Quest APK updated to v0.8 (Oculus Quest Robot.apk)
I'm excited, I got the Synthiam app sideloaded onto my quest last night! I used the SideQuest app on my laptop and loaded the APK into the headset via a USB-C cable. I found the app in the "Unknown Sources" section of the app menu on the Quest.
I used this skill and the camera skill in ARC to check out the camera view. It's so weird to be looking at yourself when you use a laptop camera!! A true out-of-body experience!
Is anyone else trying this out? I'd love to compare notes!
I'm wondering which robot would be the quickest to make/buy that has fingers? Is it still the inMoov?
Cool I still haven't purchased a quest yet was holding off until someone other than dev had it working. Did you try with JD and his grippers?
Right on @Nink, No I haven't tried that yet but I guess I could. I was just thinking that full finger movement would definitely be next level, but I guess I could start off small
I used JD with the grippers. I assigned my index finger to the gripper, and it worked well.
I'd like to 3d print only an inmoov arm and attach it to a mount or something. I don't need the full inmoov but just an arm. What would be really wild is an inmoov arm attached to a base that you can drive around. Then you could remotely operate a hand from anywhere in the world with it.
Picked up the Oculus, then sideloaded the ARC APK. Connected JD and configured head track, and that seemed to work, but when I went to do controller track and then connect, it kept locking up. I just did a couple of servos on one arm. I set min/max on the servos for conservative movement.
If anyone has a working profile for a JD with Oculus would be good since this is a common robot for this community we can all use as a great starting point.
What do you mean it locked up?
Screen just froze and the camera disconnected; head track stopped working and JD stopped responding. I went back to the lobby, then disconnected the EZB in ARC and reconnected. Arm control still didn't work. I disabled servos and repeated the disconnect/reconnect, and head track worked. I can try another JD and play again this weekend if the honey-do list isn't too long.
I had one lock up once as well. The meta app was still functional except the Remote Desktop and camera screens were white. The ARC software froze and then when I clicked on the screen an error briefly showed up and ARC closed.
Hmmmm I wonder if the commands are being sent too fast and flooding the channel. I’ll take a look this weekend.
ARC never died on me. I just lost comms. I am using an overclocked gaming PC hardwired to an 802.11ac WiFi router in the same room, so hopefully no bottlenecks. My hunch is I need a starter pose that the robot and I both stand in, so our joints and servos are at least close to aligned and everything doesn't try to move at once.
You’re thinking the jd ezb browned out?
updated to v8
We have updated this robot skill with an option to skip servo commands being sent to the EZB. This is a setting that you can configure. If you are experiencing lockups with the EZB or disconnects, increase this value to skip sending every received frame command.
Cool, gonna try this when I get home tonight! I'll try it with JD.
Unfortunately, I've been too busy with hardware development to test this out. BUT, I did stumble across an open-source 3D-printable hand design by Open Bionics that seems promising, although the servos are like $70 apiece. The hand does seem more solid than the InMoov hand.
If you're curious, check it out here: https://www.thingiverse.com/thing:3000641
OK, I've decided to go all-in on this because I feel like this skill is the future!
I need to have a robot with arms and fingers so I've decided to print an inMoov.
I'm going to need all the help I can get so I'm going to start another thread to get advice from the great community here!
I just checked out that inMoov hand you pointed to. Looks amazing! Is that the version you are going to print? You mentioned the servos. I couldn't find any listed in that link. However I did see a discontinued very small PQ12-63-12-P LINEAR ACTUATOR. Wow, I would love to find out how that works. So small and compact. Not much info on that page about it though. I guess google is my friend. Off to research stuff like this. Thanks.
EDIT: I found this little actuator for sale with a nice description at The Robot Shop here: Little Linear Actuator. Too bad it's going away. It's even 12v with a potentiometer already installed. They do offer this in a 6v version that I think you were looking at. You're correct though, it's expensive! Ouch!
Hi Dave, thanks for doing a bit more research into that! I've come to the conclusion that I will stick with the original InMoov hands because that brings the price down significantly and gives me full-finger articulation. From the looks of it, the Open Bionics hand has some of the fingers tied together, and I'm looking to take advantage of individual control of each finger.
Small error discovered, the scripts that can be created under the buttons (Button A Pressed ....) are not saved. If you save the config and open it again, they are gone...
Greetings Uwe
I have a question; ARC is still quite new to me. How can I use the two servo values (sticks / joystick X/Y) to control an H-Bridge to drive my rover? There is an option for this in, e.g., the Joystick skill (Movement Control); how do I do it here? I helped myself with a script, but I think this is definitely not the right way (excuse my rudimentary scripting skills):
Greetings Uwe
This robot skill is currently under development - so many of the features you are asking about haven't been completed yet. However, these items are on the list to be completed. I don't have an estimated date, but I know they're actively working on finishing this robot skill.
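In the meantime, a script-based workaround along the lines of what you describe is reasonable: read the two stick values and translate them into movement commands yourself. A rough sketch of that mapping (the deadzone value and function name are illustrative assumptions; in an ARC script you would then call the corresponding movement function for your Movement Panel, e.g. the `Movement` functions in ARC's scripting):

```python
DEADZONE = 0.2  # assumed stick-noise threshold; tune for your controller

def stick_to_direction(x, y):
    """Translate analog stick values (-1..1) into a movement command
    name that a Movement Panel script could act on."""
    if abs(x) < DEADZONE and abs(y) < DEADZONE:
        return "stop"
    # Whichever axis is deflected more wins
    if abs(y) >= abs(x):
        return "forward" if y > 0 else "reverse"
    return "right" if x > 0 else "left"
```

A fancier version could scale motor speed by the stick magnitude instead of using fixed commands, but the threshold-based mapping above is the simplest starting point.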
If you want to test my stereoscopic attempts, here is the APK file for the Oculus Quest and this skill.
ARC_3D_Test.zip
It's a test with only rudimentary functions. Only the video is received; no servo data is sent.
You can test it with a side by side video, e.g. here to download https://stereoimage.de/download-video?download=90:europa just send the video to the camera skill using the "Stream All The Video Sources" skill
ARC must run under the IP 192.168.1.75. The "Oculus Quest Robot" skill must use servo server port 8082.
Right controller button A toggles Mono/Stereo. Right controller button B toggles Cam/Test (use the left and right sticks).
I'm really surprised how good the latency is when I integrate the Stereopi as a USB cam on the Camera Skill.
Greetings Uwe
Do you have to use the stereo pi? Or can you use two cameras? What’s special about the stereopi that couldn’t be done with two cameras?
I'm asking because if we add dual camera support, we'd need to understand what the StereoPi offers that two USB cameras don't. It's a lot of overhead to have a Raspberry Pi running only for cameras.
The StereoPi synchronizes both cameras; the two cameras must be in sync. If you can sync the cams, there shouldn't be a problem.
Yes, an extra Raspberry Pi is a bit of overhead, but I was hoping that the StereoPi would stream by itself. It does, but it is far too weak; the latency is impossible. There is a possibility via UDP - the latency was good, but it has many other disadvantages (cannot be reached from the internet...). Originally I also wanted to control 2 servos directly via the StereoPi; theoretically possible, but as written, it is too weak to implement everything on the Pi.
That's how I ended up with you and ARC and I'm very enthusiastic so far.
That’s great information, thanks. I’ll take a look at what you’ve done and see if it can be added to this robot skill. I’m confident it can. The frame sync won’t be an issue because ARC uses the win32 low level video capture per frame.
There are a few other developments going on at the moment, so it might take a week or more for someone to get around to it. Please share any advancements you make in the meantime.
This skill here is based on the skill "Servo And Camera Server" - isn't it? Otherwise I couldn't receive the video data in Unity.
Is it possible to get the Unity script from this skill, where I can see what format the controller data is sent to ARC in? Then I could get on with my Unity tinkering (very, very rudimentary; I'm new to Unity). I'd love to work with this skill and not the other one.
Of course I'll keep you up to date. My AtomicPi Rock Pi X will be delivered in the next few days, so I'm curious how it handles a USB cam, for example...
Greetings Uwe
I created a Unity project for the stereo display as an example. This is how I solved it; I'm really new to Unity, so I have no idea if there are other ways to display it. I only found paid assets, but they all require a real video. Unity itself can display a side-by-side 180/360 video on the skydome, but this was not the solution either.
The structure, briefly explained:
- A new layer "Right" and a new layer "Left"
- One camera with Target Eye: Right and the "Left" layer removed from its Culling Mask; one camera with Target Eye: Left and the "Right" layer removed from its Culling Mask
- A GameObject plane for the video display
- A GameObject plane as the right cover, on layer "Right"; a GameObject plane as the left cover, on layer "Left"

Arrange them accordingly: the plane for the video display, with the two cover planes for left/right in front of it. Each eye sees only its corresponding half; the other half is always covered. This means nothing has to be converted or broken down in the video stream.
Here are the Unity project data:
Stereo_ARC.zip
Greetings Uwe
Played around with Unity, really fun...
Whoa, roli looks pretty wild in unity - good job! I think we're starting on some of the changes to this robot skill next week. Your changes are a tremendous influence on that!
Just to let you know these changes are in development. Hopefully there will be an update in the next week or so. Stay tuned
I'm curious what's coming, I'm looking forward to it. May I ask again quietly, could I get the script lines where you send the controller data from Unity to ARC ....
The unity project is too large to add as an attachment. But the C# scripts are easy enough to identify.
This is every C# script in the project. Most scripts are for various buttons on the scenes. But the ones that matter most use the COMM folder. Here it is: RobotView.zip
great, thank you very much. I checked it out, it's exactly what I need...
Take a look at the robot view folder and the file robotview.cs has the information for sending the controller data.
Yes, I've found it, let's see how I can integrate it...
it works, I was able to customize the scripts, great! Now it can go on, my Rock Pi has also been delivered ...
Take a look at this Dual Camera servo Server robot skill: https://synthiam.com/Support/Skills/Virtual-Reality/Dual-Camera-Servo-Server?id=21198
The app and robot skill have been updated to include support for single or dual cameras. Dual cameras provide a stereoscopic 3D view. This has been tested with UweP's Microsoft LifeCam rig: https://synthiam.com/Community/3DParts/Sensors-Cameras/Stereocamera_rig--f0adc1f0-9df2-4d50-8588-e9670093b89b
Get the APK in the description above to sideload. Meta/Oculus/Facebook has rejected our app for the AppStore because it controls robots.
Dang, that feels short sighted that they rejected the app. Thanks for the Stereoscopic support, that’s awesome! I’m definitely going to try using 2 cameras in the inMoov eyes!
Hi Jeremie,
This is why it started for me. I installed exactly these 2 cameras in my InMoov (MS LifeCam HD-3000). It has always been my dream to operate the InMoov via VR, and thanks to this skill I have come a good deal closer to that dream.

Right on @UweP, it's a shared dream then! I also want to do the same thing, as I feel that humans piloting robots through VR headsets (with gloves or controllers) is going to be a part of our near future. It's going to allow remote work to blossom and enable next-level exploration.
Humans piloting robots paves the way for the robots to one day take over repetitive and low skill tasks as well once they are fully trained. This is DJ’s exosphere idea and I can completely see VR + Humanoid operation playing an important part.
@DJ I was wondering if this might help:
Although your app wasn’t accepted in the official Meta store, maybe you could add it to the Sidequest store. Sidequest is like a third party App Store for the Quest. There aren’t as many users there but still a pretty decent amount!
It requires approval for that as well. I have a ton of threads on their developer forum about this as well. Because their original documentation says it doesn’t need approval but it does
Crazy! I thought Sidequest was pretty open and anything goes with their platform. There are some pretty low-quality games on there. I expected the approval process to be pretty much non-existent, LOL That's really too bad.
SideQuest gets its apps from App Lab, which is a sort of reviewed state for apps. But we don't even qualify for that because our VR robot app requires a robot - that's what their feedback has been. So we added the remote desktop and explained that it connects to a PC, not a robot. That makes it the same as Virtual Desktop, which has already been approved, in which case they must accept it now. Time will tell.
This is a fantastic skill - and I love it. I am now rigging up a linear actuator based Stewart platform for a robot neck, paired with Oculus Quest 2. Things are looking good.
Any idea how to set the servos though for X-Y-Z if it's 6 DoF parallel actuators? Been fiddling with them and can't figure it out...
Anybody smarter than me understand what a servo profile for a Stewart Platform might look like?
Otherwise I gotta go back to just 4 DoF, even though I already built the platform...
Any thoughts would be great!!!
@DJSures?
What's a Stewart platform? The photo of that thing you posted?
You'll probably need to use multi-servos for each axis. To look down, you'd need to raise servos in the back while lowering servos in the front. So multi servo with one servo inverted. And to tilt left and right would be the opposite. The trouble is that there's no tilt value from the headset. So you'd need to rotate left or right, in which case you'd do the opposite of looking up or down.
Multi servo is the way to go. Every robot skill allows multiple servos per set. It's just in the advanced button, and then you can add as many servos per axis as you wish. You can share servos across multiple axes as well. That might take a bit of tweaking to get the multiplier correct, so it doesn't jerk between the two positions. But it can be done with trial and error or sitting down with a pencil and lots of math. I go by trial and error in testing
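To illustrate the "pencil and lots of math" route, the mixing for a six-actuator platform can be sketched as projecting pitch and roll onto each actuator's angular position around the platform. The layout angles and gain below are made-up placeholders, not a real rig's geometry; measure your own platform:

```python
import math

# Hypothetical actuator placement: six attachment points by angle
# around the platform center, in degrees.
LAYOUT = [30, 90, 150, 210, 270, 330]

def head_pose_to_offsets(pitch, roll, gain=1.0):
    """Mix headset pitch and roll (degrees) into a length offset for
    each of the six actuators. Positive offset = extend."""
    offsets = []
    for angle in LAYOUT:
        rad = math.radians(angle)
        # Pitch extends actuators on one side of the platform and
        # retracts the opposite side; roll does the same at 90 degrees.
        offsets.append(gain * (pitch * math.cos(rad) + roll * math.sin(rad)))
    return offsets
```

In ARC terms, each offset would become a per-servo multiplier/inversion on the headset X and Y axes; proper 6-DoF inverse kinematics would also account for yaw and translation, but this linear mix is often enough to stop the axes from fighting each other.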
Thanks! I've been playing with different sets of multi-servos with no luck all week. Didn't realize the headset doesn't have tilt value (that's probably been my problem). It must have tilt? The multiplier might be the ticket though! Hadn't thought to try that. I'll keep you posted. Stewart platforms are most commonly used in aircraft simulators because they can roll, tilt, twist in so many different positions. Made sense to re-create it for humanoid neck.
I'm going to unveil my 8 year long ARC project in the next few months...
https://acrome.net/product/stewart-platform
Experimented last night. Got the Y axis working fine. There definitely is a tilt value being fed from the Oculus Quest 2, but even with multipliers (negative and positive) I can't get it to do more than jerk around when using the Y and Z axes together. I might have to just go back to 4 DoF, but the 6 parallel actuators allow for so many smooth, organic, biomimetic poses!
Hardy Dynamics
Interesting site,
Also, I am interested in the 8-year-long ARC project you've been working on; keep us posted, thanks.
Thanks! I've been working on a full humanoid for 8 years. I finally got it working in 3D-printed PLA, and now I'm going back and redoing the whole thing in aluminum. It will be quite interesting when I get the new version presentable!
sounds great
So, you have an aluminum 3D printer? Want to see that...
Hi, any plans to update this skill to Quest 3?
Meta has completely dropped support and assistance for controlling third-party hardware products, so I'm not sure if we'll see an update for that. The only update I'd like to see in this skill is kinematics for controlling arms. What advantages does the Quest 3 have over the 2? Meta (Facebook) won't even approve this app because it controls third-party hardware...
Does it not work on Quest 3? I thought about buying the 3, but apart from the color AR transparency, I couldn't see the value in it.
Perhaps, if this robot skill had kinematics, I might utilize this robot skill more frequently. However, I have reservations about VR due to its excessive level of immersion. In my opinion, VR falls short because it can be an isolating experience. Additionally, I believe VR's potential is being undermined by Meta's focus on gaming for a predominantly younger audience, neglecting other valuable use-cases for their technology (such as robotics)
Agreed, but I just got the Quest 3 and it is awesome; virtual displays, hand recognition, and interaction with virtual objects, for example, are very good. I think a new wave of applications will come, as this is much better than its predecessors.
That's great news that the product has seen improvement! I would be delighted to witness Meta's support for third-party hardware, as this could significantly streamline Synthiam's development efforts for their hardware-related products. Initially, this robot skill was crafted with education in mind, responding to numerous requests from schools seeking such a solution. However, after its creation, Meta (formerly Facebook) became unresponsive and ultimately rejected our application. It has been a year, and they have not displayed any interest in applications that facilitate third-party hardware, particularly in the context of robotics. We did make an appeal on their forum, but it seemingly went unnoticed, possibly because Meta does not actively monitor their support forum.
Understood, I hope they change their strategy in the near future as this is a great field to explore and develop.
Hey I was just wondering if someone could help me with an issue that I'm running into. I'm trying to use this skill with the Quest 2 and I got it to connect but am running into some issues.
When I use the Connect Single option, I can see out of the Camera device but in the Main Window, I can't see any of the positional information updating.
When I use the Connect Dual option, I can't see anything out of the camera (just a gray screen) but the positional information on the Main Window updates, allowing me to control motors.
When I try to save the script on the button presses, they don't save properly. This one is more minor but thought it would be good to point out.
The main issue that I'm running into is that some of the features work separately but not together (controls as well as vision). Do you think you could help me troubleshoot the issues? Thanks
That's an interesting scenario. I must load and test it to see what could cause that - nothing comes to mind as a cause right now as it works for everyone else who uses it. I do know the position data isn't sent until the PAUSE checkbox is unchecked in the robot skill.
As for the grey screen with dual camera. Do you have both cameras selected in the robot skill configuration screen?
So I've got the dual camera setup to work (just had to add a second camera), but I'm having some issues with trying to control the fingers. Essentially, when I close my entire fist, the fingers aren't reading the maximum angle (i.e., a finger is fully retracted at 150 deg, but the skill only reads about 70 deg). I found a setting to set the servo ratio to a different value, but for some reason it's not saving. I want to set the ratio to something like 2 or 1.5 so that 70 = 140, but when I save it, it just reverts back to 1. Do you know if I'm doing anything wrong?
Servo ratio only works with slave servos: synthiam.com/Support/ARC-Overview/Servo-Controls
Increase your MAX or decrease your MIN servo position. That will move the servo further for the range of the finger. The Oculus robot skill scales between the MIN and MAX values - at least, it should internally. I know this robot skill is getting an overhaul in the next 2 months, so there are probably several changes being made.
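Widening MIN/MAX amounts to a linear rescale of the tracked range onto the servo's travel. As an illustrative sketch (the numbers mirror the 70-degree reading described above; none of this is the skill's internal code):

```python
def rescale(value, in_min, in_max, out_min, out_max):
    """Linearly rescale a tracked reading (in_min..in_max) onto the
    servo's configured travel (out_min..out_max), clamping outliers."""
    value = max(in_min, min(in_max, value))
    span_in = in_max - in_min
    return round(out_min + (value - in_min) * (out_max - out_min) / span_in)

# If a full fist only ever reports ~70 degrees, mapping 0..70 onto the
# servo's 10..150 travel recovers the full range of motion.
full_fist = rescale(70, 0, 70, 10, 150)
```

The same math is why increasing MAX (or decreasing MIN) in the configuration stretches the finger's limited reading across the servo's whole range.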
Is there support for wrist movement in this skill? I was able to control a hand with a Meta Quest 2, but I was curious whether this skill also allows wrist movement, or if there is another skill for that / I'll have to script my own.
There isn't, but we can add it in the next version. The next version will be released after ARCx, in a little over a month or so. All robot skills are being recreated for ARCx, and I know this one has a list of new improvements. Wrist can easily be one of them for you!