
RTSP Stream Compatibility Issues


I am attempting to use a Reolink E1 Pro camera with ARC's Camera Device robot skill using its RTSP stream. I'm connecting it through an IoTiny controller.

  • Camera IP (local): 192.168.178.10
  • RTSP links (without password):
    • Main stream: rtsp://192.168.178.10:554/h264Preview_01_main
    • Sub stream: rtsp://192.168.178.10:554/h264Preview_01_sub

Both streams work fine when tested with VLC, but when I input the RTSP URL into the Camera Device, I encounter the following errors:

System.ApplicationException: Failed creating device object for moniker
System.Collections.Generic.KeyNotFoundException

It appears the Camera Device does not recognize the RTSP stream directly. Could you please assist with the following:

  1. Is there a supported workaround for using RTSP cameras with ARC?
  2. Is there a recommended method to make the Reolink RTSP feed compatible with the Camera Device in ARC?

Thank you for your guidance.



Related Hardware
EZ-B Iotiny by EZ-Robot
Control IoTiny with Synthiam ARC: 100MHz Cortex‑M4 Wi‑Fi IoT/robot controller with camera, servo, ADC, I2C and digital I/O support for easy internet automation
Wi-Fi / USB
Servos 8
Camera
Audio
UART
I2C
ADC
Digital

Related Robot Skills
Camera Device by Synthiam
Use PC or robot cameras for vision tracking: color, QR, glyph, face, object, motion; record video/data, control servos and movement, run scripts
Stream video from any URI/protocol (RTMP, RTSP, HTTP, UDP, etc.) to a selected ARC camera device for real-time network feed playback.


PRO
Germany
#1  

Sorry, I just saw a skill called "Stream All The Video Sources" and it works with that skill. Nevertheless, can you tell me if we can get the video stream without the "Stream All ..." skill, as in my first question?

PRO
Canada
#2  

Great - glad Stream All The Video Sources solved it. To answer your follow-up: yes, you can get the RTSP stream into ARC without that skill, but it depends on how ARC/Camera Device can access RTSP on your Windows PC. Below are the practical options, ordered from easiest to most reliable.

Short answer

  • Easiest: keep using Stream All The Video Sources (recommended).
  • If you want to avoid that skill: try starting the RTSP URL directly from the Camera Device (ControlCommand). If you hit the KeyNotFound/moniker error (a DirectShow codec problem), use one of these workarounds: install a DirectShow codec (LAV Filters), or restream RTSP to an HTTP MJPEG feed (via ffmpeg or VLC) and point the Camera Device at the HTTP URL.

Details + step-by-step

  1. Try Camera Device directly (fast test)
  • From a script or the ARC scripting console you can try:
// start camera using the RTSP URL
ControlCommand("Camera", "CameraStart", "rtsp://192.168.178.10:554/h264Preview_01_sub");

// check whether the camera is active (returns true/false)
var active = ControlCommandWait("Camera", "IsCameraActive");
print("Camera active: " + active);
  • If this works, you’re done. If you get the same error (Failed creating device object / KeyNotFoundException) it usually means Windows/ARC cannot open that RTSP source via DirectShow because the system lacks the needed codec/filter.
  2. Make Windows able to open RTSP (DirectShow filter approach)
  • Install LAV Filters (Open-source DirectShow filters that add RTSP/h264 support) or a trusted codec pack that includes an RTSP-capable DirectShow source.
  • After installing, restart your PC and try ControlCommand CameraStart with the RTSP URL again.
  • This will let the Camera Device use RTSP directly (no extra skill) if the system DirectShow filter can handle the stream.
  3. Restream RTSP to an HTTP MJPEG stream (most reliable & avoids installing codecs)
  • Use ffmpeg or VLC to convert the RTSP feed into an HTTP MJPEG stream that Camera Device accepts as a "JPEG Snapshot HTTP Video Device".
  • Example ffmpeg command (listen mode - creates an MJPEG HTTP stream on port 8090):
ffmpeg -rtsp_transport tcp -i "rtsp://192.168.178.10:554/h264Preview_01_sub" \
  -f mjpeg -q:v 5 -r 15 -s 640x480 -listen 1 "http://0.0.0.0:8090/"
  • Example VLC command (restreams as an HTTP MPEG-TS stream on port 8080):
"C:\Program Files\VideoLAN\VLC\vlc.exe" "rtsp://192.168.178.10:554/h264Preview_01_sub" \
  --sout "#standard{access=http,mux=ts,dst=:8080}" --no-sout-all --sout-keep
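For context on what the MJPEG restream delivers: it is a stream of complete JPEG images, each delimited by the JPEG start-of-image (FF D8) and end-of-image (FF D9) markers. A minimal Python sketch of how a client could split such a byte stream into individual frames (an illustration of the format only, not how Camera Device is actually implemented):

```python
def extract_jpeg_frames(data: bytes) -> list:
    """Split a raw MJPEG byte stream into individual JPEG frames.

    Each frame starts with the JPEG SOI marker (FF D8) and ends with
    the EOI marker (FF D9); bytes between frames (multipart boundaries,
    headers) are skipped.
    """
    frames = []
    pos = 0
    while True:
        start = data.find(b"\xff\xd8", pos)
        if start == -1:
            break
        end = data.find(b"\xff\xd9", start + 2)
        if end == -1:
            break  # incomplete trailing frame - wait for more data
        frames.append(data[start:end + 2])
        pos = end + 2
    return frames


# Two tiny fake "frames" separated by multipart-style filler.
stream = (b"--boundary\r\n\xff\xd8JPEGDATA1\xff\xd9"
          b"--boundary\r\n\xff\xd8JPEGDATA2\xff\xd9")
print(len(extract_jpeg_frames(stream)))  # 2
```

This is also why the snapshot-style HTTP source is so robust: each frame is independently decodable, so no DirectShow demuxer or H.264 codec is needed on the receiving side.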

Why restreaming is useful

  • It sidesteps DirectShow codec issues and is very stable.
  • Stream All The Video Sources essentially does this internally (it supports many protocols) - that's why it worked for you.
  4. Other options (virtual camera)
  • Use OBS + Virtual Camera or other "virtual webcam" drivers to make the RTSP feed appear as a local webcam device. Camera Device can then use that webcam device index. More setup and CPU usage, but possible if you prefer a virtual webcam.

Notes about IoTiny and network

  • IoTiny is the EZ-B controller for servos/I/O. The RTSP camera is a separate IP device on your LAN, and ARC (running on your PC) must be able to reach that IP. The IoTiny does not need to be involved in the camera network path.
  • Make sure firewall rules on the PC allow ffmpeg/VLC/ARC to bind/listen or connect.

Recommendation

  • If you want the simplest supported path and you already have Stream All The Video Sources working - keep using it.
  • If you need to avoid that skill, try ControlCommand CameraStart first. If it fails, use the ffmpeg restream to HTTP MJPEG - it’s robust and quick to set up.

Want me to walk you through one of these?

  • Do you run ARC on Windows? Which option do you want to try next (direct CameraStart, installing LAV Filters, or an ffmpeg restream)? I can give exact, step-by-step instructions for your chosen path.

I am a customer support GPT-powered AI and still learning. If you have doubts about my response, please use the support documentation to verify my answer.
PRO
Germany
#3  

If this works, you’re done. If you get the same error (Failed creating device object / KeyNotFoundException) it usually means Windows/ARC cannot open that RTSP source via DirectShow because the system lacks the needed codec/filter.

I got the same error. I will stay with the skill "Stream All The Video Sources", it is easier. I will play with the camera and see how to get the best tracking using object, glyph, color ...

PRO
Canada
#4  

Great - sounds like a good plan to keep using Stream All The Video Sources if it already works for you. Below are focused tips and quick control commands to help you get the best tracking results (object, glyph, color) with ARC + Camera Device. If you want, I can walk you step-by-step through tuning one tracking type.

Quick control commands (JavaScript examples)

  • Start camera with the Camera Device (if you ever test direct URLs):
ControlCommand("Camera", "CameraStart", "rtsp://192.168.178.10:554/h264Preview_01_sub");
var active = ControlCommandWait("Camera", "IsCameraActive");
print("Camera active: " + active);
  • Toggle preview type (Off / Original / Processed):
ControlCommand("Camera", "CameraSetPreviewType", "Processed");
  • Skip display frames (help CPU):
ControlCommand("Camera", "CameraSkipDisplayFrames", 5); // show every 5th frame in preview
  • Enable/disable servo or movement tracking:
ControlCommand("Camera", "CameraServoTrackEnable");
ControlCommand("Camera", "CameraServoTrackDisable");
ControlCommand("Camera", "CameraMovementTrackEnable");
ControlCommand("Camera", "CameraMovementTrackDisable");
  • Snapshot:
ControlCommand("Camera", "CameraSnapshot"); // auto filename

Practical tuning tips by tracking type

  • General (applies to all tracking)

    • Lighting: bright, even lighting and an uncluttered background matter more than camera resolution.
    • Resolution/FPS: reduce resolution to 320x240 or 320x180 and 10-15 FPS if CPU is limited. Lower resolution reduces noise and speeds processing.
    • Preview & CPU: if ARC UI slows, set Preview to Off or Processed and use CameraSkipDisplayFrames to save CPU.
    • Use Variable Watch to monitor $Camera* variables while tuning.
    • If camera is on the robot, avoid streaming high-resolution over congested WiFi - reduces frame rate and tracking stability.
  • Color / Multi-Color / YCbCr

    • Use Multi-Color or YCbCr when background has similar hues. YCbCr often performs better under varying lighting.
    • Train/define colors using the training tool and adjust Saturation/Luminance/Min Size while holding the object in front of the camera.
    • Increase Minimum Object Size to ignore small noise blobs.
    • If color detection flickers, increase smoothing by requiring detections across multiple frames (the Camera settings have min detection count/delay).
  • Object Tracking (AVM)

    • Train only unique details (logo, label, distinct texture), not the entire object. Follow the Camera Device manual’s training workflow: tilt/rotate the object while training.
    • Train from multiple angles and distances.
    • Use "Learn While Tracking" cautiously - it can drift and start learning the background over time.
    • Increase Min Detection Count so it requires several frames before reporting a detection.
  • Glyph / QR Code

    • Print glyphs / QR codes at recommended sizes and use a plain background.
    • Glyph detection is reliable but CPU intensive; set a lower resolution and allow more frames for detection.
    • Use the CameraClearLastGlyph and CameraResetQRCode control commands if you need repeated triggers from the same code.
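The minimum-detection-count / smoothing settings mentioned above boil down to simple debouncing: only report a detection once it has persisted for N consecutive frames, so single-frame noise blips are ignored. An illustrative Python sketch of that idea (my own simplification, not ARC's actual code):

```python
class DetectionDebouncer:
    """Report a detection only after it persists for min_count frames.

    Mirrors the idea behind a minimum-detection-count setting:
    a lone noisy frame never triggers, and any missed frame
    resets the streak.
    """

    def __init__(self, min_count: int):
        self.min_count = min_count
        self.streak = 0  # consecutive frames with a detection

    def update(self, detected_this_frame: bool) -> bool:
        if detected_this_frame:
            self.streak += 1
        else:
            self.streak = 0
        return self.streak >= self.min_count


deb = DetectionDebouncer(min_count=3)
frames = [True, True, False, True, True, True]
print([deb.update(f) for f in frames])
# [False, False, False, False, False, True]
```

Raising min_count trades responsiveness for stability, which is exactly the flicker-vs-lag trade-off you will feel when tuning color or object tracking.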

Camera on robot (pan/tilt)

  • If you have pan/tilt servos, configure the Camera Device servos (min/max positions and increments) and use CameraServoTrackRelativeEnable for incremental servo motion.
  • Tune horizontal/vertical increment steps to balance smoothness vs responsiveness.
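The increment tuning above amounts to proportional stepping: the farther the object is from image center, the larger the servo step, clamped to the configured servo limits. A hedged Python sketch of that idea (the function name, the 320-pixel frame width, and the step scaling are illustrative assumptions, not ARC's internals):

```python
def next_servo_position(current_pos: int, object_x: int,
                        frame_width: int = 320,
                        max_step: int = 5,
                        servo_min: int = 1, servo_max: int = 180) -> int:
    """One relative-tracking step: move toward the object, clamped to limits."""
    center = frame_width // 2
    error = object_x - center              # pixels off-center, signed
    # Scale the error to a step of at most max_step servo positions.
    step = round(max_step * error / center)
    return max(servo_min, min(servo_max, current_pos + step))


# Object at x=256 in a 320-wide frame is 96 px right of center:
print(next_servo_position(90, 256))  # 93
```

A larger max_step makes tracking snappier but jerkier; a smaller one is smoother but may lose a fast-moving target, which is the same balance the horizontal/vertical increment settings control.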

If you want a suggested workflow to tune one tracking type now

  1. Pick the tracking type you want to optimize (object / color / glyph).
  2. Set resolution to 320x240 and turn the processed preview off (or set skip display frames = 3).
  3. Configure the Camera skill's min object size and detection count conservatively (so noise is ignored).
  4. Train or adjust color/objects while watching $CameraObjectCenterX/Y and $CameraIsTracking in Variable Watch.
  5. Adjust lighting and thresholds until detection is stable.

Would you like me to:

  • Walk you step-by-step through tuning object tracking for your Reolink camera?
  • Provide an example script to auto-start Stream All The Video Sources + Camera when your project loads?
  • Or give specific settings (resolution, skip frames, min sizes) for your PC specs?

Also, if you attach screenshots of errors or settings, note that I can't view images. Please paste the exact text or settings you see and I'll help interpret them.


I am a customer support GPT-powered AI and still learning. If you have doubts about my response, please use the support documentation to verify my answer.