Intel Realsense D435i

by Intel

Connect the D435i Intel Realsense depth cameras to the ARC navigation messaging system (NMS) for distance detection and mapping.

Requires ARC v19 (Updated 3/29/2022) Hardware Info

How to add the Intel Realsense D435i robot skill

  1. Load the most recent release of ARC (Get ARC).
  2. Press the Project tab from the top menu bar in ARC.
  3. Press Add Robot Skill from the button ribbon bar in ARC.
  4. Choose the Navigation category tab.
  5. Press the Intel Realsense D435i icon to add the robot skill to your project.

Don't have a robot yet?

Follow the Getting Started Guide to build a robot and use the Intel Realsense D435i robot skill.

How to use the Intel Realsense D435i robot skill

Connect the D435i Intel RealSense camera to the ARC Navigation Messaging System (NMS) for distance detection and mapping. This should also work with other Intel RealSense depth camera models because the FOV and other parameters are read from the device.

User-inserted image

Main Screen

User-inserted image

  1. Select the device by serial number. This allows multiple Intel RealSense devices to be added to a project.

  2. START/STOP button for connecting to and disconnecting from the RealSense device.

  3. Log window for status and errors.

Image Align

The Image Align tab allows selecting the area of the image that you wish to use for point-cloud detection. For example, setting the detection area too low will detect the floor at a close distance. Ideally, set the bar high enough to avoid detecting the floor.

User-inserted image

  1. Adjust the top of the detected image range.

  2. Adjust the bottom of the detected image range.

  3. The detected image range is highlighted in purple.
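The image-align crop can be sketched in a few lines of Python. This is a hypothetical stand-in for the skill's internal behavior (the actual slider units and implementation are not documented here); it simply keeps the rows of the depth frame between the top and bottom sliders.

```python
# Sketch (assumption): only depth rows between the top and bottom sliders are
# fed into the point cloud, so floor pixels near the bottom can be excluded.

def crop_rows(frame, top_frac, bottom_frac):
    """Keep rows from top_frac..bottom_frac of the image height (0.0 = top)."""
    h = len(frame)
    return frame[int(h * top_frac):int(h * bottom_frac)]

# Fake 640x480 depth frame (values in mm), just to show the shape change.
depth = [[500] * 640 for _ in range(480)]
roi = crop_rows(depth, 0.25, 0.60)  # skip the top quarter and the floor-heavy bottom 40%
print(len(roi), len(roi[0]))        # 168 640
```

Raising `bottom_frac` toward 1.0 widens the band downward, which is when the floor starts registering as a near obstacle.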

Configuration

User-inserted image

  1. Offset of the sensor from the center of the robot. A positive number means the sensor is toward the rear; a negative number means the sensor is toward the front. The measurement is taken from the center of the robot. The sensor must be aligned with the center of the robot, not offset to the left or right, and must face a heading of 0 degrees.

  2. Sensor resolution during initialization. If the values are changed from the default, obtain them from the Intel RealSense Viewer. Changing these values may be necessary if using a different sensor than the D435i. Set both width & height to 0 for auto-size, which isn't the most efficient for performance.

  3. The number of depth pixels to skip when processing the depth data. This is useful for lower CPU PCs or when high-resolution depth is not required. Since there is a limited field of view (FOV) on the sensor, it may not be necessary to process every depth pixel. If the sensor resolution is 1280x720, the CPU would be processing 921,600 depth points for every scan of the specified framerate. At 1280x720 @ 15 FPS, that's 13,824,000 depth points per second. If the Skip Scan Points for X & Y are set to 10 for that resolution/fps, the CPU would only need to process 9,216 depth points per scan (or 138,240 per second).

The editor will recommend a value for the best balance between performance and detection based on the sensor FOV.

  4. The camera video from the RealSense can be pushed into a selected Camera Device. In the selected camera device, choose CUSTOM as the device type. This will accept the video feed from this robot skill. Ensure START is pressed on the selected camera device as well.
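The skip-scan arithmetic from the configuration notes above can be verified with a short sketch, assuming the described behavior: every Nth depth pixel in X and Y is processed and the rest are skipped.

```python
# Sketch of the Skip Scan Points arithmetic (assumed behavior: keep every
# skip_x-th column and skip_y-th row of the depth frame).

def depth_points_per_scan(width, height, skip_x=1, skip_y=1):
    """Depth pixels processed per frame when skipping."""
    return (width // skip_x) * (height // skip_y)

def depth_points_per_second(width, height, fps, skip_x=1, skip_y=1):
    return depth_points_per_scan(width, height, skip_x, skip_y) * fps

# 1280x720 with no skipping: 921,600 points/scan; at 15 FPS, 13,824,000 points/s.
print(depth_points_per_scan(1280, 720))                 # 921600
print(depth_points_per_second(1280, 720, 15))           # 13824000

# Skip Scan Points X = Y = 10: 9,216 points/scan, 138,240 points/s.
print(depth_points_per_scan(1280, 720, 10, 10))         # 9216
print(depth_points_per_second(1280, 720, 15, 10, 10))   # 138240
```

This is why skipping helps so much on low-power PCs: the work drops with the product of both skip values, a 100x reduction at 10 and 10.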

Troubleshooting

If you experience any issues, ensure the USB 3 port is used on the computer. Second, load the Intel RealSense Viewer. There may be a prompt to "Enable USB Meta Data," which is a system-wide change. Ensure you answer ENABLE to that option. Lastly, update the firmware if there is a prompt to do so.

If the sensor works in the Intel RealSense Viewer, it will work with this robot skill. Depending on the version of the sensor, you may need to configure the capture width/height/framerate. This can be done in the robot skill configuration screen. The RGB and Depth camera resolution values must be the same. You can use the Intel RealSense Viewer to see what values work for your camera. If the RGB camera is 640x480 @ 15 FPS, the depth camera must also have a supporting resolution with the same settings. This is because this robot skill will parse the depth and RGB data together.
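The same-resolution rule above can be expressed as a small pre-flight check. This is a hypothetical sketch; the mode sets below are illustrative examples only, not the camera's real mode table (use the Intel RealSense Viewer for the actual list).

```python
# Hypothetical pre-flight check mirroring the troubleshooting rule: RGB and
# depth streams must use identical (width, height, fps), and the pair must be
# supported by both sensors. These example sets are placeholders.

SUPPORTED_RGB = {(640, 480, 15), (640, 480, 30), (1280, 720, 6), (1280, 720, 15)}
SUPPORTED_DEPTH = {(640, 480, 15), (640, 480, 30), (1280, 720, 6)}

def validate_streams(rgb_mode, depth_mode):
    if rgb_mode != depth_mode:
        return "RGB and depth must use the same width/height/framerate"
    if rgb_mode not in SUPPORTED_RGB or depth_mode not in SUPPORTED_DEPTH:
        return "Mode not supported by both sensors; check the RealSense Viewer"
    return "ok"

print(validate_streams((640, 480, 15), (640, 480, 15)))  # ok
print(validate_streams((640, 480, 15), (1280, 720, 6)))  # mismatch error
```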

Variables

This skill will create five global variables that scripts can reference.

User-inserted image

$D435IsRunning - whether the robot skill is currently connected to the sensor.

$D435FurthestDistanceCM - the furthest distance detected, in centimeters.

$D435FurthestDistanceDegree - the degree at which the furthest distance was detected.

$D435NearestDistanceCM - the nearest distance detected, in centimeters.

$D435NearestDistanceDegree - the degree at which the nearest distance was detected.
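A minimal sketch of how a script might react to these variables. The `get_var` stub and its hard-coded values stand in for ARC's script-variable getter so the example is self-contained; in an actual ARC script you would read the variables through the script engine instead.

```python
# Stub variable store standing in for ARC's global variables (illustrative values).
_vars = {
    "$D435IsRunning": 1,
    "$D435NearestDistanceCM": 42,
    "$D435NearestDistanceDegree": -10,
}

def get_var(name):
    """Hypothetical stand-in for ARC's script-variable getter."""
    return _vars[name]

def obstacle_warning(stop_below_cm=50):
    """Report the nearest obstacle if it is closer than the threshold."""
    if not get_var("$D435IsRunning"):
        return "sensor not running"
    dist = get_var("$D435NearestDistanceCM")
    deg = get_var("$D435NearestDistanceDegree")
    if dist < stop_below_cm:
        return f"obstacle at {dist} cm, {deg} deg"
    return "clear"

print(obstacle_warning())  # obstacle at 42 cm, -10 deg
```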

ARC Navigation Messaging System

This skill is part of the ARC Navigation Messaging System (NMS). It is encouraged to read more about the messaging system to understand the available skills HERE. This skill is in Level #3 Group #1 in the diagram below and contributes telemetry positioning to the cartesian positioning channel of the NMS. Combine this skill with Level #3 Group #2 skills for obstacle avoidance; for Level #1, The Navigator works well.

User-inserted image



PRO
Synthiam
#1   — Edited

Fixed Intel's bug so the device can be started and stopped

There is a small performance improvement as well, which will be noticeable on SBCs

Also, this only displays depth sensors in the drop-down

PRO
Synthiam
#2   — Edited

Pretty big bug fix that correctly calculates the point-cloud distances into distance scans. You may have noticed that flat objects, such as walls, were rounded. This is no longer the case.
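One plausible reading of this fix (an assumption, not confirmed by the post): the sensor reports z-depth along the camera axis per pixel, and using that value directly as a polar scan range renders a flat wall as an arc of constant radius. Projecting to the true ray range keeps the wall flat:

```python
import math

def ray_range_cm(z_depth_cm, theta_deg):
    """Convert perpendicular z-depth to range along a ray theta degrees off-axis."""
    return z_depth_cm / math.cos(math.radians(theta_deg))

# A flat wall 100 cm in front of the camera: the ray range grows with angle,
# but the forward component stays 100 cm, so the wall plots as a straight line.
for theta in (0, 15, 30):
    r = ray_range_cm(100, theta)
    x = r * math.sin(math.radians(theta))  # lateral offset
    y = r * math.cos(math.radians(theta))  # forward distance (stays 100)
    print(f"theta={theta:>2} deg  r={r:6.1f} cm  forward={y:5.1f} cm")
```

Using the raw z-depth as the scan range would instead give r = 100 at every angle, which is exactly the rounded-wall artifact described above.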

Here you can see the wall, corner of the wall, and the edge of the monitor. 

User-inserted image

And here's a low scan of just the table in front - where it's flat and no longer curved.

User-inserted image

PRO
Synthiam
#3   — Edited

Updated with a huge performance improvement. I didn't realize the Intel sample code was copying the depth data into a managed array on every frame, ouch. So I'm referencing the unmanaged memory pointer directly. Should have known better than to copy that bit of code from their sample :D

You should experience a 10x improvement. I went from 27% CPU to 2% CPU.

User-inserted image

PRO
Synthiam
#4  

Like The Navigator, there are variables for the nearest and furthest object detected... just in case this is being used without an NMS skill like the navigator

User-inserted image

PRO
Canada
#5  

Thank you so much for making this. Just installed it on the Rock Pi X. It seems to work well, although it took me a bit of time to get it working. I have to run it off the USB 3.0 port as I get an error on USB 2.0, but the T265 seems to work on the USB 2.0 port OK.

A couple of questions: Did you manage to get it to work off the USB 2.0 port, or do we not have enough bandwidth? If it can run off USB 2.0, could it work with 2 or more D435i? If you mount the camera sideways, could you set the image range vertically instead of horizontally?

PRO
Synthiam
#6   — Edited

The device is usb c which requires usb 3. I’m sure there’s more information about that in the product manual

PRO
Canada
#7   — Edited

OK, it works with USB 2.0 in the RealSense Viewer app provided by Intel.

I guess I could try using the USB-C port for one and the USB 3.0 port for another.

PRO
Synthiam
#8  

I don't know how that could be. It's USB-C, which is USB 3. If it's true, then maybe their app is doing some wizardry that uses a different driver and somehow defies USB-C specifications, lol

PRO
Synthiam
#9   — Edited

I googled for you:

User-inserted image

They somehow magically use the USB 2 channel over USB-C and then internally have a different driver that operates at a lower speed.

PRO
Synthiam
#10  
  • performance improvement
#11  

Is the IMU function of this skill used, or would it work with the less expensive d435?

Although I am currently satisfied with the T265 and Lidar, I am realizing that on a larger robot the Lidar will miss items near the floor, so I might want something mounted lower facing forward instead or to augment the Lidar.

PRO
Synthiam
#12  

You don’t need to mount it lower, and I wouldn’t recommend doing so. You adjust the area of the image to be detected. Check through live hacks for examples of what’s documented in the manual

There is no IMU data stream returned from the device with Intel's SDK. I have tried to get the IMU data as well, with no success.

I believe it will work with the earlier models. I can’t speak on the performance differences between models.

#13  

Thanks.  I believe the d435 and d435i are identical except for the IMU as far as specs, but I will confirm.  Not even sure I am going to get one yet, but it intrigues me...

Alan

PRO
Synthiam
#14  

If you do mount it low, angle it up. I think I read somewhere once about the preferred angle.

#15  

I confirmed from the Intel site that the d435 and d435i are identical specs except for the IMU, so I ordered a d435 to play with. Interestingly, one of the Amazon listings and a TigerDirect listing show a "30 pack" for just a few dollars more than a single unit. I asked TigerDirect online to confirm whether they are really shipping 30 units and they didn't respond, so for $10 more, I took the chance. If I get 30 of them, I'll sell them cheap or trade for other robot gear with members here. (They did say the shipping weight is 4.5 lbs, and I know the device is only a few ounces, so it is possible that they actually did bulk packaging and someone mis-priced it. But it happened at multiple sites: Walmart and a few others also had it listed the same way, but out of stock. One even said "bulk pack," which leads me to think I may actually get 30 of them.)

#16  

I got my d435 (it was not actually a 30 pack, but it had clearly come from one: no box, manual, tripod, etc.). So, I spent more for less. I should send it back, but I am anxious to get using it. Which leads to the problem.

Downloaded the driver and RealSense Viewer app from Intel.  Plugged it in, tested it, and it works perfectly in the viewer.

Added the plugin to the project, and the camera object can see the RGB camera, but the plugin gives the following error when I attempt to connect (whether or not the RGB camera is connected; I also tried in a new project after restarting ARC with only the plugin added, with the same results):

Starting Intel RealSense D435... Couldn't resolve requests

No errors showing up in debug, just in the plugin window.

User-inserted image

#17  

@DJ, I hate to comment just to bounce, but any idea why this works in the viewer and not in ARC? Should I return it and get the 435i even though the SDK should be identical, or can you provide a debug version with enough logging to figure out what is wrong? I expect to have time on Sunday for some testing if you want me to debug it.

PRO
Synthiam
#18   — Edited

I'm thinking it might be resolution related. I added a todo to have the robot skill accept a drop-down of resolutions in the config menu.

It's the weekend, so no one is working to debug anything for you. But I'm guessing sometime next week there will be an update to the skill for ya to try.

PRO
Synthiam
#19  

Hey, can you use the Intel program viewer thingy and tell me what resolutions it supports? There are separate ones for the RGB camera and tracking.

#20  

Will do tonight. I think I had it set to 640x480, but it supports higher and lower. I'll get the list tonight or in the morning.

If you want me to test a debug version with extra logging, happy to do so as usual.

Alan

PRO
Synthiam
#21  

Ya, 640x480 is 4:3, which is nearly square. The 435i uses widescreen 16:9 or something close. So it's probably an unsupported resolution.

#22  

I didn't get a chance to test this weekend, but I will after work today. Thanks. The resolution thing makes sense. I was planning on using the RGB camera as the only camera on the bot, so I set it to a fairly low res so I wouldn't have an issue with object recognition, but I can stick an EZ-B camera on for that if 16:9 uses too much CPU for object recognition and tracking.

Alan

PRO
Synthiam
#23  

Updated to allow specifying resolution. Read the manual above for instructions

User-inserted image

#24  

I got it to work only at the very highest resolution which only supports 6 frames per second, but it was taking 20 seconds or more to detect a distance change in the skill (again, it worked fine in the Intel viewer), so the robot was running into walls.  Most of the resolutions gave the same error as before, without your added text about checking the USB port and supported resolutions.  640x480 gave an error pointing to a particular line of code, but I ran out of troubleshooting time, so I didn't catch it for you.  Will try again tonight.

I'll gather more detailed info for you tonight.  If you want me to try a debug build with extra logging, I am always happy to do that.

PRO
Synthiam
#25  

When you "test" in the RealSense app, are the RGB camera and the sensors both at the same resolution?

#26  

I have set them the same and different resolutions, and it works regardless.

Alan

PRO
Synthiam
#27   — Edited

Your tests in the RealSense app would need to use the same resolution for both the sensor and RGB. Otherwise, the image and the detected distance would be out of sync for mapping in ARC.

What resolution and format for both RGB and sensor in the RealSense app provide the best CPU usage?

PRO
Synthiam
#28   — Edited

Also, try setting the width & height in the latest version to 0. That will use the auto-size, or the last size used (i.e. from intel realsense viewer)

#29  

Will try after work tonight and let you know.

I am sure we'll get this figured out.

Should I run the calibration app that came with the driver download too? I didn't bother since you didn't mention it in the documentation. Also, there are several profiles in the viewer app for what you are expecting to see (i.e., hand signals, far distances, close distances, etc.). Is there a preferred one, and does the device remember it from when it is used in the app to when the skill accesses it?

Alan

PRO
Synthiam
#30   — Edited

I don't know anything about a calibration app. That's most likely a question for the intel company that makes the product. You're somewhat exceeding my knowledge of the device haha:)

#31  

I have tried everything I could think of. I removed the Intel drivers I had installed and just tried whatever Windows installed. It still worked in the viewer, but the same issues in ARC. I tried multiple resolutions. The only 2 that the RGB and the Stereo have in common are 640x480, which gives this error (the same as 0x0, no matter what the resolution in the viewer was):

User-inserted image

or 1280x720, which works, but only when set to 6 frames per second, and it actually only updates the distance about every 20 seconds, so the robot drives into walls.

All other resolutions give some variation of:

User-inserted image

Here are screenshots of the supported resolutions:

User-inserted image

and

User-inserted image

I also tried removing the Intel driver and letting Windows find its own driver in case it was a driver issue, with the same results. If this is not what you see in the viewer, I will return the device and get a 435i. Otherwise, I am going to go back to Lidar for a while so I can work on navigation scripts that work, until we have a 435 skill that actually works. (That sounded more pissy than I meant. My mom is in the hospital with pneumonia. Thank G-d it is not Covid, but I am stressed.)

As always, if you want to provide a debug version with a high or highest logging level set to figure out what is causing the issue, I am glad to install and test, although depending on my Mom's condition and whether they send her home, I may be offline for a week or so... They won't let me visit the hospital despite the fact that I am fully vaccinated, so at least for the next few days I should be available to test.

#32  

I have a question about the Intel device. Can it work outdoors as well? It would be interesting to have an indoor/outdoor sensor on the robot.

PRO
Synthiam
#33  

That's super weird - wonder what they're doing differently. I'm using their examples from the SDK.

Did you set width and height to 0? And same error?

I know you keep asking about a debug version - but you already have it :). The error message that you see is the most verbose there is.

This is why I dislike working with any Intel products... discontinued products with discontinued support and an incomplete SDK.

PRO
Synthiam
#34  

One more thing ....

I noticed your screenshots do not have the RGB sensor or depth sensor active. Can your future screenshots include a real-world test scenario? Sometimes things can be missed if liberties are taken.

I'd like to know a few things when you get an ideal resolution working with the RealSense Viewer. Make sure the RGB and depth resolutions & framerate are identical and both sensors are active. Once that's done, tell me these things...

  • the common resolution
  • the common frame rate
  • the depth format (ie z16)
  • the rgb format (ie rgb24)
#35  

I'll test again tonight. Yes, I tried 0x0. It gave the same error as the 640x480 test (first screenshot in my last message).

There are only 2 common resolutions available: 640x480 and 1280x720. 1280x720 is the only one that kind of works in ARC, but not in a usable way.

PRO
Synthiam
#36  

Understood. Let me know the frame rate and formats for 640x480

PRO
Synthiam
#37   — Edited

In the latest version, you may be able to use 1280x720 and specify a higher number of skipped depth points. Try 10 or even 20 and see what your performance is like (or use the recommended performance values). This isn't a long-term solution, but it's necessary even for slow PCs with low depth resolutions.

*Note: Read about this setting in the manual at the top of this page.

User-inserted image

#38  

Thanks for the update, DJ. I have a family emergency going on and couldn't test last night. I might be able to this weekend, but then I will probably be unable to do anything for a week or so.

Alan

PRO
Synthiam
#39  

Sorry to hear that Alan - take care of your fam:)

#40  

Thanks DJ. Family first, always. I'll share more (in a different thread) after I know more about what is really going on. Hearing what my Mom thinks she hears vs. what the doctors actually say is not always the same thing, but the situation is not good; it just may not be as dire as I first thought.

PRO
Synthiam
#41  

Updated to latest intel library that apparently includes bug fixes - as they say:)

#42  

Damn, I really may just need to find one. I was always curious about this one.

PRO
Synthiam
#43  

v18 - hides error that Intel driver sometimes raises when stopping sensor

PRO
Synthiam
#44  

v19 updated

  • default resolution values are 640x480x6fps to match the latest firmware settings

  • fix for the offset value that wasn't being calculated correctly

PRO
Colombia
#45  

Thanks. Regarding the earlier question: would it be possible for the RS 435 to stop feeding the NMS and be used for other purposes (like an arm cam)? What do you think?

PRO
Synthiam
#46  

I can quickly change this robot skill so the depth data is not pushed to the NMS. How would you access the data from the sensor? There are 307,200 data points at 640x480 resolution; that would be a vast array. Do you have any ideas about how you would like to access the data?

PRO
Colombia
#47  

For now, I was talking about the basic option of disabling the NMS feed so it can also be used as a cam only.

One idea to take advantage of the depth information (outside the NMS) is to use it, for example, on the arm to have better precision when manipulating and recognizing objects. The data-point volume could be managed in a similar way as in the current skill with the image gap selection. Thanks.

PRO
Synthiam
#48  

I understand making it not send to the NMS. However, I am struggling to figure out a way to present 307,200 data points to the user. Can you give me an example of how they would be used? Would you want to know the furthest and closest distance of each frame? Or how else can I make the data available to you?

#49   — Edited

I am intrigued by this product as they sell an outdoor waterproof style as well. I am planning on buying one but want to verify that it works ok within ARC. @TechGuru did you get it working ok? Can it be used as a standalone product without having to go through NMS. I'd like to manipulate the variables in different ways.