
jp15sil24 (Germany) asked:
Roomba Movement Panel Sensor Interaction
Hi Athena,
I'm curious if the "iRobot Roomba Movement Panel" skill in Synthiam ARC utilizes the Roomba's wheel encoders during navigation? Understanding how the skill interacts with the Roomba's sensors will help me optimize its movement in my project. Thank you!
Related Hardware: Roomba
Related Control: iRobot Roomba Movement Panel
Hi there!
Yes, the iRobot Roomba Movement Panel skill in ARC does make use of the Roomba’s wheel encoders for navigation, but with an important condition.
How It Works:
Bidirectional Communication Requirement:
The skill uses the wheel encoder data only if you have a bidirectional connection (either via the PC COM Port or the hardware UART on the EZB). These connection types let ARC both send commands and read sensor data concurrently. If you’re using a one-way connection (such as Software Serial with only the RX line), the sensor data, including the wheel encoder counts, won’t be available.
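For context, here is a minimal sketch of what the bidirectional half looks like at the Open Interface (OI) level if you were talking to the Roomba directly with pyserial rather than through ARC (which handles all of this for you). The port name is a placeholder, and the 115200 baud default applies to Create 2 / 600-series models:

```python
# Minimal sketch of a bidirectional Roomba Open Interface (OI) session using
# pyserial, per the public OI spec. The port name is a placeholder, and the
# 115200 baud default applies to Create 2 / 600-series models.
import serial

ser = serial.Serial("/dev/ttyUSB0", 115200, timeout=1)  # hypothetical port

ser.write(bytes([128]))  # START: enter the OI (passive mode)
ser.write(bytes([131]))  # SAFE: allow sensor/actuator commands

# Stream opcode 148: ask for packets 43 (left encoder) and 44 (right encoder).
# The Roomba then pushes a sensor frame every 15 ms over the same link;
# this reply traffic is exactly what a one-way (RX-only) wiring can't carry.
ser.write(bytes([148, 2, 43, 44]))
```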
Wheel Encoder Integration:
When using a bidirectional setup, the Roomba’s onboard wheel encoder counts (the encoders produce roughly 508.8 ticks per wheel revolution) are streamed back to ARC. The Movement Panel skill then uses this odometry data to calculate the robot’s current position in Cartesian coordinates. This data can be pushed into ARC’s Navigation Messaging System (NMS) for mapping and further navigation improvements.
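To illustrate the kind of math involved, here is a generic differential-drive dead-reckoning sketch (not necessarily ARC’s exact implementation), assuming the documented 508.8 ticks per revolution and nominal Create 2 geometry (72 mm wheel diameter, 235 mm wheel base):

```python
# A minimal dead-reckoning sketch, assuming the documented Roomba encoder
# resolution (508.8 counts per wheel revolution) and nominal Create 2
# geometry (72 mm wheel diameter, 235 mm wheel base). ARC's internal math
# may differ.
import math

TICKS_PER_REV = 508.8
WHEEL_DIAMETER_MM = 72.0
WHEEL_BASE_MM = 235.0
MM_PER_TICK = math.pi * WHEEL_DIAMETER_MM / TICKS_PER_REV  # ~0.44 mm

class Odometry:
    def __init__(self):
        self.x = 0.0      # mm
        self.y = 0.0      # mm
        self.theta = 0.0  # radians, 0 = facing +x

    def update(self, d_left_ticks, d_right_ticks):
        """Integrate one pair of encoder tick deltas into the pose."""
        d_left = d_left_ticks * MM_PER_TICK
        d_right = d_right_ticks * MM_PER_TICK
        d_center = (d_left + d_right) / 2.0
        d_theta = (d_right - d_left) / WHEEL_BASE_MM
        # Advance along the average heading over the interval
        self.theta += d_theta / 2.0
        self.x += d_center * math.cos(self.theta)
        self.y += d_center * math.sin(self.theta)
        self.theta += d_theta / 2.0
```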
Optimizing Navigation:
With the wheel encoder data available, you can better monitor and optimize your robot’s movement. For example, if you plan to use waypoint navigation or integrate mapping skills (like The Navigator), be sure to enable sensor streaming in the configuration. This ensures that the encoder values continuously update the internal odometry, which helps compensate for drift during longer runs.
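For the curious, here is a sketch of how those streamed frames could be parsed by hand and fed into the odometry above; ARC’s Movement Panel does the equivalent internally:

```python
# A sketch of consuming the 15 ms stream frames requested earlier and taking
# wrap-safe encoder deltas, per the OI spec frame layout:
# [19][n][packet-id][data]...[checksum], where all bytes sum to 0 mod 256.
# This simple parser assumes every streamed packet is 2 bytes, which holds
# for packets 43 and 44 (cumulative signed 16-bit encoder counts).
import struct

def read_frame(ser):
    """Block until one valid stream frame arrives; return {packet_id: value}."""
    while ser.read(1) != bytes([19]):    # hunt for the frame header byte
        pass
    n = ser.read(1)[0]                   # number of payload bytes
    payload = ser.read(n)
    checksum = ser.read(1)[0]
    if (19 + n + sum(payload) + checksum) & 0xFF != 0:
        return {}                        # corrupt frame; caller should retry
    packets, i = {}, 0
    while i < len(payload):
        pid = payload[i]
        packets[pid] = struct.unpack(">h", payload[i + 1:i + 3])[0]
        i += 3
    return packets

def wrapped_delta(new, old):
    """Signed 16-bit difference that survives counter wraparound."""
    return (new - old + 32768) % 65536 - 32768
```

Each cycle you would take wrapped_delta() of packets 43 and 44 against the previous frame and pass the two results to Odometry.update().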
For more details on sensor streaming and the required connection types, you may want to review our Roomba support documentation here:
Roomba Hardware Support.
Do you have any further questions about setting up the sensor streaming or optimizing your Roomba's navigation? Happy to help!
I am a customer support GPT-powered AI and still learning. If you have doubts about my response, please use the support documentation to verify my answer.
Does that also mean we can’t use the Intel T265 in this case? Both sensors, the wheel encoder and the Intel T265, are in the same group (L3G2).
If you’re asking Athena, you have to tag her name in the message. But I can respond: yeah, you can’t use both pose hint types.
I mean, you could try, but the data is gonna be wonky. The wheel encoders slip and the T265 drifts, so even if they were calculated off each other, both are always incorrect.
The reason the SLAM can generate an accurate pose is that it uses known landmark characteristics, so comparing against the SLAM pose is going to give you a near-perfect pose value. The pose hint gives the SLAM an approximate place to begin looking for where the robot could be on the map; the SLAM does the rest.
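To make that "place to begin looking" idea concrete, here is a toy illustration (not The Navigator’s actual algorithm): candidate poses are sampled only in a small window around the hint, and each is scored by how well the observed landmark ranges agree with the known map:

```python
# A toy illustration (not The Navigator's actual algorithm) of how a pose
# hint narrows a SLAM localization search: candidates are sampled only in a
# small window around the hint and scored against known landmark positions.
import math

def score(pose, observations, landmarks):
    """Lower is better: sum of range errors between map-predicted and observed."""
    x, y, theta = pose
    err = 0.0
    for lm_id, obs_range in observations.items():
        lx, ly = landmarks[lm_id]
        err += abs(math.hypot(lx - x, ly - y) - obs_range)
    return err

def localize(hint, observations, landmarks, window=0.3, step=0.05):
    """Grid-search candidate poses within `window` meters of the pose hint."""
    hx, hy, htheta = hint
    best, best_err = hint, float("inf")
    steps = int(window / step)
    for i in range(-steps, steps + 1):
        for j in range(-steps, steps + 1):
            cand = (hx + i * step, hy + j * step, htheta)
            e = score(cand, observations, landmarks)
            if e < best_err:
                best, best_err = cand, e
    return best
```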
We’re still thinking about the T265, wondering if the position needs to be reset to home every few seconds and whether that would help compensate for the drift.
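In principle, that reset idea could also be done in software by re-anchoring a correction offset to the trusted SLAM pose every few seconds instead of resetting the camera itself. A rough sketch, with hypothetical names and 2D translation only for brevity:

```python
# One way the periodic-reset idea could look in principle (hypothetical,
# 2D translation only for brevity): keep a correction offset that is
# re-anchored to the trusted SLAM pose every few seconds, so T265 drift
# never accumulates beyond one interval.
import time

class DriftCorrector:
    def __init__(self, interval_s=5.0):
        self.interval_s = interval_s
        self.offset = (0.0, 0.0)
        self.last_anchor = 0.0

    def correct(self, t265_pose, slam_pose=None):
        now = time.monotonic()
        if slam_pose is not None and now - self.last_anchor >= self.interval_s:
            # Re-anchor: the offset absorbs whatever drift has accumulated
            self.offset = (slam_pose[0] - t265_pose[0],
                           slam_pose[1] - t265_pose[1])
            self.last_anchor = now
        return (t265_pose[0] + self.offset[0],
                t265_pose[1] + self.offset[1])
```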
We have seen the T265 slowly drift so badly that our robot thinks it’s outside. That’s why you generally see commercial robots using outside-in pose sensors (cameras or sensors mounted in the room). The other option for inside-out pose sensors is having multiple scan angles and combining multiple SLAM maps, but that starts to get super CPU- or GPU-intensive.