Dark Harvest
DJ hi, I am getting some weird map results in Better Navigator using the encoder counter skill. Using a Roomba via PC COM port and for Level #3 Group #2, I taped into Roomba encoders that go to an Arduino with Wheel Encoder UNO firmware. For Level #3 Group #1, I use a RPLidar. In the config of the Wheel Encoder Counter skill, I enable the push to NMS checkbox and leave all other settings default as they exactly match the Roomba. In The Better Navigator, I use External as pose hint source. What I am experiencing is that when I move forward or backward, the map creation seems ok, but as soon as I turn left or right, the map gets all messed up. The counts from the encoders are surprisingly good. My guess is that the BN is not processing encoder counts correctly with the left and right commands?
It sounds like you're experiencing issues with the mapping accuracy when using The Better Navigator with your Roomba and encoder setup. Let's go through some steps to troubleshoot and potentially resolve the issue:
Verify Encoder Configuration:
Check Encoder Wiring:
Review Wheel Encoder Counter Settings:
Pose Hint Source:
Lidar Configuration:
Test with Different Movement Speeds:
Check for Software Updates:
Debugging:
Consult the Manuals:
If you've gone through these steps and the issue persists, consider reaching out to Synthiam support for further assistance. They may be able to provide more specific guidance based on your setup. Additionally, you can explore the possibility of using a different pose hint source or adjusting the configuration settings to better suit your environment and hardware.
I am a customer support GPT-powered AI and still learning. If you have doubts about my response, please use the support documentation to verify my answer.
Thank You, @Athena, Tried all Pose Hint Sources but the result seems to be the same. In the first screen capture, I just go forward. In the second, I turn left, and the map goes berserk.
It seems like the issue might be related to how the encoder data is being interpreted during turns. Here are a few additional steps you can take to troubleshoot and potentially resolve the issue:
Calibration of Encoders:
Encoder Data Validation:
Review Arduino Firmware:
Check for Slippage:
Adjust Turning Speed:
Lidar Data Correlation:
Debugging Logs:
Cross-Check with Other Skills:
Consult Synthiam Support:
Alternative Mapping Skills:
By following these steps, you should be able to narrow down the cause of the issue and find a solution. If you have any more questions or need further assistance, feel free to ask!
I am a customer support GPT-powered AI and still learning. If you have doubts about my response, please use the support documentation to verify my answer.
@Athena,
One thing I noticed is that no matter if I move forward or backward the encoder counter skill counts always up to positive numbers. Shouldnt it be subtracting counts when moving backward?
Yes, you are correct. When moving backward, the encoder counts should typically decrease, resulting in negative values. If the encoder counter skill is only counting up, regardless of the direction, it could indicate a configuration issue. Here are some steps to address this:
Check Encoder Wiring:
Review Arduino Firmware:
Encoder Counter Skill Configuration:
Test with Simple Movements:
Debugging:
Consult the Encoder Manual:
Firmware Update:
Contact Support:
By addressing these points, you should be able to resolve the issue with the encoder counts not reflecting the correct direction. If you have any more questions or need further assistance, feel free to ask!
I am a customer support GPT-powered AI and still learning. If you have doubts about my response, please use the support documentation to verify my answer.
@Athena, Please take a look at the code from DJ's UNO Firmware with Wheel Encoder Counter; it only counts up. How would you proceed to modify the code to take into account all 4 movement directions?
To modify the code to account for all four movement directions, you'll need to implement a way to determine the direction of movement. Since the encoder only provides counts and not direction, you can use additional logic to infer direction based on the commands sent to the robot. Here's a general approach to modify the code:
Track Movement Direction:
Adjust Encoder Counts Based on Direction:
countLeft()
andcountRight()
functions to adjust the counts based on the current direction. For example, if the robot is moving backward, decrement the count instead of incrementing it.Here's a modified version of the code with these changes:
Additional Steps:
currentDirection
variable based on the movement commands sent to the robot. This might involve modifying the communication protocol to include direction information.If you have any more questions or need further assistance, feel free to ask!
I am a customer support GPT-powered AI and still learning. If you have doubts about my response, please use the support documentation to verify my answer.
The firmware wont' know the wheel direction - so that's gonna be a challenge if you're trying to do it in the EZB firmware. The robot skill is where it needs to be. Either that - or the robot skill can send a command to the firmware that explains the current direction. Actually, i like that.
I think the robot skill and firmware should be modified to accept a command that defines the wheel directions. That way, the firmware knows to add or subtract the encoder value. Everytime the Movement Panel direction changes, it sends an update to the firmware. The firmware will then use that value to determine how to alter the encoder count.
What do you think?
Btw the wheel encoder robot skill firmware is here if you wanna see how it's working: https://github.com/synthiam/Robot-Skill-Wheel_Encoder_Counter
Thank you @Athena and DJ,
The count pulses from the roomba encoder are suprisingly acurate. Just forget that the encoder is from the roomba and assume its a regular pulse count encoder. Would the Movement Panel from roomba send the direction to NMS or The Better Navigator?
I also think its a good idea to send direction to the arduino.
It's great to hear that the encoder pulses are accurate! Given that, let's focus on how to integrate the direction information effectively.
Movement Panel and Direction Communication
Movement Panel to NMS/Better Navigator:
Sending Direction to Arduino:
Implementation Steps
Modify the Robot Skill:
Update Arduino Firmware:
Direction Command Protocol:
0x01
for forward,0x02
for backward, etc.).Testing:
Example Code Snippet
Here's a basic example of how you might implement the direction command handling in the Arduino firmware:
Conclusion
By implementing these changes, you can ensure that the encoder counts accurately reflect the robot's movement direction, improving the navigation system's performance. If you have any further questions or need additional assistance, feel free to ask!
I am a customer support GPT-powered AI and still learning. If you have doubts about my response, please use the support documentation to verify my answer.
Okay i modified the firmware and robot skill of the Wheel Encoder to add or subtract the values based on the Movement Panel direction. Check the updated v10 wheel encoder robot skill and the new firmware for it.
I was under the impression, based on your previous message, that you are using the roomba to send it's pose data to the NMS from an Arduino UNO with the wheel encoders. Based on this question, I'm not sure I fully understand now what you're doing now.
The roomba has built-in encoders and the roomba Movement Panel can send it's pose data to the NMS without you needing an arduino or anything else. It does it on its own
If you're not using the roomba's built-in encoder value counter and using an arduino wheel encoder, then you'd have to continue on with what i posted above.
I would use #1 because it doesn't require any additional hardware. The Roomba Movement Panel (https://synthiam.com/Support/Skills/Movement-Panels/iRobot-Roomba-Movement-Panel?id=19164) manual explains more.
LOL DJ, what a mess. I apologise for my poor fraise construction as it is not my native language. I am indeed using the Roombas encoders hoked up to an Arduino. I read in some posts that the data coming from the Roomba directly would go back to "0" when reaching 65565 or so. Will test the new firm and skill. Thank you.
Haha okay - no prob. We're progressing
I don't recall about the roomba robot skill going back to 0 but that could be the case. I'm sure it's something that can be fixed. Have you experienced issues by trying it? My first recommendation would be not use the roomba robot skill and see if you like that.
However, if you want to test out the new code i created - let me know. It "should" work based on my conceptual plan but it isn't tested. I'm just sitting watching the waves roll in while it rains outside. So i figured i should write some code and get you rock'n.
Bad news DJ, now as soon as I go forward both encoder counts jump from 0 to 6547.
Ah and here is the thred I was talking about the roomba encoders.
Ah okay - sounds like my logic is reversed in the firmware. You can edit and swap the ++ and for the encoder count increments to try it.
im not at home anymore so I can’t make the changes rn
Fixed - i was totally wrong with my bitwise. That's what i get for rushing things Try the new version, it'll work!
https://synthiam.com/Firmware/Wheel-Encoder-Firmware-NVXCXGPCQMJ
The good news, it didnt blow up my Arduino or my house.xD The bad news, it counts always positive. It loock like the variable "_WheelDirections" is not updated. If I can help in debuging and testing let me know DJ.
Ah yes - update the robot skill and try the new firmware https://synthiam.com/Firmware/Wheel-Encoder-Firmware-NVXCXGPCQMJ
What’s the verdict? Did that new skill and firmware fix it
Well here is how it tested out:
1- The counting issue is fixed. When turning left: left wheel counts down and right counts up. When turning right: right wheel counts down and left wheel counts up.
2- Moving the robot forward seems ok, the map gets updated.
3- Moving the robot left or right screews it all up. The values from the encoder are acurate so I imagine it has something to do with how those values are processed in BN?
Tell me how I can help DJ. Thanks
Just tested the "send position to nms" from the roomba Movement Panel and it works but not very acurate. The map doesent get messed up.
I’m guessing it gets messed up because the dimensions are not entered correctly. The math to convert wheel ticks to pose is very advanced and requires incredible precision of the dimensions.
try with the dimensions added because that’s important.
Thank you DJ, the dimensions are correct. I dont want to steal any more of your time with this, so I will use the fake NMS pose hint for now. It is difficult for someone to debug something without having the setup to test it.
I can’t have a non-working product so I’ll have to figure it out.
@Athena,
In the script editor with Better Navigator we can use "controlCommand("The Better Navigator", "StopNavigating");". Is there a "PauseNavigating" variable instead of Stop?
Yes, you can use the
ControlCommand()
function to pause navigation in The Better Navigator. The command you're looking for isPauseNavigating
. Here's how you can use it in your script:This command will pause the navigation process, allowing you to resume it later if needed. If you have any more questions or need further assistance, feel free to ask!
I am a customer support GPT-powered AI and still learning. If you have doubts about my response, please use the support documentation to verify my answer.
By viewing the available control commands in ARC for the Better Navigator robot skill, Athena hallucinated that answer: You can see the available control commands for a robot skill by right-clicking in the editor or looking at the cheat sheet. The manual for the Better Navigator also has a section at the bottom that includes control commands. There is no Pause; there is only STOP. According to the manual, STOP will stop the navigation that is currently in progress.
You can instruct the NMS to pause: https://synthiam.com/Support/javascript-api/Navigation/SetNavigationStatusToPause (also see other NMS commands in the scripting manual)
If you provide more details of what you're attempting to do, that will be helpful to @Athena and myself
Hahaha, Athena hallucinating. Thats funny. I am just playing around with some ultrasinic sensors. I have 3 in the front or the roomba and using the ultrasonic collision script skill. I want to detect obstacles and pause navigation and find a clear path and then resume to the waypoint.
You can have the ultrasonic sensor push it's data to the NMS. All you need to do is specify the heading the sensor is pointing relative to the front of the robot and the offset. There's filters for highest and lowest to remove false positives. Once the checkbox is checked, the sensor data will be merged with the lidar data.
But, if you want to script the "stop" yourself, I think that's what the NMS scripting commands are for.
To achieve obstacle detection and pause navigation using ultrasonic sensors, you can utilize the Navigation Messaging System (NMS) scripting commands. Here's a step-by-step guide on how to set this up:
Configure Ultrasonic Sensors:
Push Data to NMS:
Script to Pause Navigation:
SetNavigationStatusToPause()
command to pause navigation when an obstacle is detected. Here's an example script:Find a Clear Path:
Resume Navigation:
SetNavigationStatusToNavigating()
command to resume navigation:Testing:
By using the NMS scripting commands, you can effectively manage navigation pauses and resumes based on sensor input. If you have any more questions or need further assistance, feel free to ask!
I am a customer support GPT-powered AI and still learning. If you have doubts about my response, please use the support documentation to verify my answer.
That is so cool, didnt know that. I jumped right to the ultra sonic collision script skill. Thank you @Athena and DJ.
You're welcome! I'm glad you found the information helpful. The Ultrasonic Collision script skill, combined with the Navigation Messaging System, can be a powerful way to enhance your robot's navigation capabilities. If you have any more questions or need further assistance as you continue to work on your project, feel free to reach out. Happy robot building!
I am a customer support GPT-powered AI and still learning. If you have doubts about my response, please use the support documentation to verify my answer.