Asked — Edited

What Is Really Needed For Indoor Navigation?

I have read about many ways to develop indoor navigation. The hardware recommended varies from an IPS system to a lidar-based system, even straight scripts, the use of cameras, encoders, and maybe even a compass. There are simple methods given in tutorials, but they are not the "real" navigation I'm after. I end up in total confusion. I can see the validity of each device, and even combinations of devices with added safety sensors such as drop-off sensors, bumper sensors, etc. Sophistication is needed, but I end up with mystification.

My main question is: what works, what is needed, and what makes sense? Anyone want to discuss this subject?

I know there are many who want to build the base for an InMoov or their favorite robot build, or just get an Adventure Bot to travel throughout the house using a single verbal command. I would.

If DJ wants to do a Hack Night on this subject, I think many would be interested.


Related Hardware EZ-B v4


#1   — Edited

Order this Oculus ROV and learn ROS so we can all be done discussing this, because EZ-Robot's hardware for this simply isn't coming. And only .01% of the people on here are smart enough to create something like this.... Heck, most of the people on here can barely program anyway, so an "out of the box" solution is not looking good... No offense. I think you are "beating a dead horse."

I am happy to eat a crow sandwich if EZ-Robot proves me wrong, but I am not holding my breath.

#3  

Yep (but I didn't buy it, someone I know did)... I used it for STEM lessons and it performed very well at room mapping, auto charging, and of course navigating... You want the SLAM navigator version... You are also going to need to know a little Linux, as ROS runs in that environment...

#4  

OK, that is option 1. I will look further into its functionality to see how it works and what it uses (lidar, etc.). I am not ready to spend the $1,500.00 yet.

What else can be done using more familiar devices like the ones I mentioned, even if it is less sophisticated?

PRO
Synthiam
#6   — Edited

That's the point of Synthiam, Richard. A high percentage of users, including yourself, cannot program with ROS. That's because it's not sustainable for an industry to require C++ engineer-level programmers. Although anyone who argues that point would fit right in with the 1970s Unix hippies:)

I can't speak for EZ-Robot, but I imagine the lidar and IPS do not align with their target market. That being said, all Synthiam reference design hardware is open source, and there are a number of companies looking to manufacture it.

The main challenge with a lidar configuration is that the robot needs to be designed for a very specialized case. For instance, that expensive robot Richard posted is practically useless for any real robot application because it's designed to demonstrate pre-installed navigation features. It doesn't align with the creative aspect of DIY robots.

My initiatives are always to make solutions modular. I've done that with the Synthiam hardware reference designs. You can make it yourself, or wait until our partner team figures out who should make it for sale.

In the meantime, don't let Richard's crudeness affect your creative efforts. An industry that makes you feel like an outcast because you're not a programmer would not be a sustainable industry. Fortunately, like all technologies, our efforts will prevail by removing the barriers of engineering requirements. You know, like HTML did for the internet or Windows did for the PC...

Back to the original question: the IPS is what I believe to be the most reliable solution for indoor navigation.

#7  

Thanks, DJ. I understand Richard's response. The slow advancement and release of products which will do the job, without someone having to be an engineer, programmer, designer, etc. to make them work, is frustrating.

I see the advantages of a modular device which can be added to projects that are already a work in progress. You have been working on this for quite a while. I hope to see it develop and happen soon.

I thank you for the advice to continue to look into the IPS. Ron

PRO
Synthiam
#8  

To be clear - Synthiam will never manufacture and sell hardware:)

#10   — Edited

Well, sadly the truth hurts.... I wasn't being crude, just a realist... I will give you that ROS is not the future of DIY robot programming.... It's seriously complicated with a steep learning curve... Like what Windows did for the PC (making PCs so that the average person can use them, not just the super smart), something is still needed to make robots more accessible for average people to use and program.... EZ-Robot "is/was" filling that role. @DJ do you still own the rights to ARC or did you lose it in the divorce (-:? Maybe we can keep this "train" a rolling....

PRO
USA
#11   — Edited

@DJ:

Quote:

The IPS is what I believe to be the most reliable solution for indoor navigation.
AFAIK the IPS is an IR beacon with an IR camera, so each IPS device will tell you whether the bot/beacon is in the room, visible, and not hidden behind an obstacle, and only then do you have the X/Y beacon position in a picture. So you will need multiple IPS devices hung on the walls, furniture, or ceiling, spread around the house/apartment, plus power cables. You will need to explain to your family members why there are cameras everywhere (privacy issues), and then you need to calibrate the sensors' X/Y to distances.

IR beacons are prone to daylight interference, so you need to close some blinds to block the daylight, e.g., at windows, balconies, etc.

Only after you wire all the sensors can you script all the positions/distances.
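
To be concrete about what "script all the positions/distances" means, here is a minimal sketch of the kind of calibration I'd expect to write, assuming the IPS hands you the beacon's pixel X/Y and you measure a few reference points on the floor by hand. The names are made up, not actual IPS code, and a homography would be more accurate than this affine fit for a wall-mounted, angled camera:

```python
# Rough sketch of calibrating pixel X/Y to floor positions (hypothetical names).
import numpy as np

# (pixel_x, pixel_y) -> (floor_x_cm, floor_y_cm), measured with a tape measure
reference_points = [
    ((120,  80), (  0,   0)),   # e.g. charging dock
    ((510,  95), (300,   0)),   # 3 m to the right
    ((140, 420), (  0, 250)),   # 2.5 m forward
    ((530, 440), (300, 250)),
]

# Fit an affine map [floor] = [px, py, 1] @ A with least squares.
pixels = np.array([[px, py, 1.0] for (px, py), _ in reference_points])
floor  = np.array([xy for _, xy in reference_points])
A, *_  = np.linalg.lstsq(pixels, floor, rcond=None)

def pixel_to_floor(px, py):
    """Convert a beacon pixel position to approximate floor coordinates (cm)."""
    return tuple(np.array([px, py, 1.0]) @ A)

print(pixel_to_floor(320, 260))   # somewhere near the middle of the area
```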

So basically you give up your privacy and live like a vampire (no daylight)... (exaggerating).

Is this simple and reliable, or am I missing something?

Why IR beacons? How is this different from a camera with robot/object tracking?

#12  

I don't like it personally, but until something else comes along, ROS and SLAM are pretty much still the only game in town for effective indoor navigation....

Portugal
#13  

Wavefront is a cheap alternative for indoor navigation. There is a plugin from rz90208, but it is incomplete. ptp, you have the skills to develop such a plugin.
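
For anyone who hasn't seen it, wavefront planning is just a breadth-first flood fill over an occupancy grid from the goal outward, then walking downhill from the robot to the goal. A minimal sketch of the idea (a toy grid, not the rz90208 plugin):

```python
# Minimal wavefront planner sketch on a toy occupancy grid (1 = obstacle).
from collections import deque

def wavefront(grid, goal):
    """Flood-fill distances from the goal over free cells (4-connected)."""
    rows, cols = len(grid), len(grid[0])
    dist = [[None] * cols for _ in range(rows)]
    dist[goal[0]][goal[1]] = 0
    queue = deque([goal])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 \
                    and dist[nr][nc] is None:
                dist[nr][nc] = dist[r][c] + 1
                queue.append((nr, nc))
    return dist

def path(dist, start):
    """Walk downhill from the start cell to the goal (value 0)."""
    rows, cols = len(dist), len(dist[0])
    if dist[start[0]][start[1]] is None:
        return []                       # goal is unreachable from here
    route, (r, c) = [start], start
    while dist[r][c] != 0:
        neighbours = [(r + dr, c + dc)
                      for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
                      if 0 <= r + dr < rows and 0 <= c + dc < cols
                      and dist[r + dr][c + dc] is not None]
        r, c = min(neighbours, key=lambda rc: dist[rc[0]][rc[1]])
        route.append((r, c))
    return route

grid = [[0, 0, 0, 0],
        [0, 1, 1, 0],
        [0, 0, 1, 0],
        [1, 0, 0, 0]]
d = wavefront(grid, goal=(0, 3))
print(path(d, start=(3, 1)))   # [(3,1), (3,2), (3,3), (2,3), (1,3), (0,3)]
```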

PRO
Canada
#15   — Edited

I'm definitely interested in this conversation:)

I know there are a few different indoor navigation products out there now that can be purchased at RobotShop. Some use ultrasonic sensors that communicate with each other. Others use infrared patterns on the ceiling. These systems don't use ROS, but they are involved to set up in their own right. By contrast, with the Synthiam IPS reference design it's as simple as plugging in a USB device and clicking a mouse. I sincerely hope the project will be made by EZ-Robot or another partner company, as it is so easy to use. While the technology behind those systems I mentioned is very cool, the cost is their biggest detriment. If produced, the IPS reference design would likely be the most cost-effective solution on the market.

@Ptp I can speak a bit on how it works in the practical sense. Since it's infrared based, it can be disrupted by direct sunlight but isn't affected much by reflected sunlight. The image contrast can be adjusted manually to block false positives. It also has a pretty dark daylight filter on the infrared camera side to block as much daylight as possible. The IPS itself can be placed on a wall and aimed toward the floor, which cuts out the majority of any infrared interference that could possibly get picked up. Using it with the curtains open shouldn't be a big deal as long as you don't have many shiny, reflective items at ground level. Yay! You don't have to be a vampire!

You are correct in saying that you would need one IPS per area if you would like to have full coverage. Full coverage would likely only be feasible with an on-board system.

In terms of getting trapped under obstacles, you would just have to map out your robot's path ahead of time with the app to avoid those kinds of areas.

As for privacy, I am certain the IPS could be used without the real-time camera if needed. Really, once set up in a permanent location, the IPS could simply use a screenshot for the real-time image; the navigation is all done with the infrared camera, so essentially the real-time camera could be turned off. I don't believe that functionality exists in the app, but I'm sure this kind of feature could be explored.

In the case of the IPS, the IR beacon is used because it is tiny and can still be picked up from upward of 16 feet away. Object/glyph/color recognition is much harder to do at this kind of range, unless you have a huge glyph/color sample.

Portugal
#16  

The Rovio robot used the same concept with its NorthStar beacons. Jeremie, have you seen the video I posted about the new Intel camera? What do you think?

PRO
Synthiam
#17   — Edited

Well said, Jeremie. I'll add a bit. The IPS uses the visible-light camera to map the areas the robot cannot drive (obstacles like sofas and chairs). The system also maps the waypoints, which are used for navigation.

For example, a waypoint could be the kitchen or a doorway to the kitchen. You can then use a ControlCommand to have the robot navigate to the waypoint.

You only require one IPS per room. This is because the lens of an IPS covers 170 or so degrees. The higher it is mounted, the better. The IPS also has USB or serial output, so it can connect to an IoTiny for WiFi connectivity or via USB directly to a PC.
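
To put rough numbers on the coverage claim (idealized geometry only; this ignores lens distortion, mounting angle, and the beacon's practical detection range): for a sensor aimed straight down from height h with field of view θ, the floor radius it can see is roughly h·tan(θ/2).

```python
# Back-of-the-envelope floor coverage for a downward-aimed sensor.
# Idealized: ignores lens distortion, mounting angle, and beacon range limits.
import math

def coverage_radius(height_m, fov_deg):
    return height_m * math.tan(math.radians(fov_deg / 2))

print(round(coverage_radius(2.5, 170), 1))   # ~28.6 m with a 170-degree lens
print(round(coverage_radius(2.5, 100), 1))   # ~3.0 m with a 100-degree lens
```

With a very wide lens, the geometric footprint quickly exceeds the size of any room, so in practice the limit is how far away the beacon can still be detected, not the lens angle.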

The positive experience arises when comparing it to lidar, because the IPS operates far more successfully for indoor placement in a home. You will have to experience it to believe it.

Also, the IPS combines stellar, 100%-successful navigation with a swarm arrangement - meaning you can have hundreds of robots (or just two) navigating and participating in the environment. Sharing is caring!

The suggestion of buying a robot with ROS navigation, blah blah, is silly. If you want a DIY navigation robot that is custom built to your requirements - go to school for 10 years and learn ROS and C++, lol. Or wait for the alternative that Synthiam will provide shortly, which will be far more amazing.

I used to keep the proven template of standard industry economics related to the maturity of this industry to myself, because I feared others would copy us. But at this stage it seems others are far too determined to prove economics wrong and keep making robotics more complicated rather than accessible - even at the cost of Jibo, Anki, Kuri, Baxter, etc.... So, that's where I come in. I wait patiently for a technology to be proven, then I adapt it and make it accessible to people with ideas who lack programming skills. Hey, quote me now... that paragraph will belong in history books:). I predicted the bankruptcy of Kuri, Baxter, Anki AND Jibo on many, many occasions. There's one other approaching that will make you say, wow, I thought warehousing robots were all successful. But I won't say who it is. I'll keep you guessing.

Back on topic...

We're developing something to leverage existing low-cost navigation products to hack for development - such as the cost-effective old Neato lidar-equipped vacuums - and also a number of RobotShop lidars with our new SLAM device driver, which you'll see in about a month (sooner in beta). It lets many popular lidars connect to ARC's new navigation subsystem using a standardized protocol that can be adapted by any lidar hardware.
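
For the hackers following along: the old Neato LDS serial output has been reverse-engineered and documented by the hobbyist community, so a rough reader is only a few lines. The sketch below is based on those community write-ups, not on our driver or its standardized protocol, so verify the field layout against your own unit (checksum handling is omitted here):

```python
# Rough sketch of parsing the community-documented Neato XV-11 lidar frames.
# 22-byte packets: 0xFA, index (0xA0-0xF9), speed (2 bytes), 4 readings, checksum.

def parse_packet(pkt: bytes):
    """Parse one 22-byte LDS packet into (angle_deg, distance_mm, strength) tuples."""
    if len(pkt) != 22 or pkt[0] != 0xFA:
        return []                      # not a valid frame start
    base_angle = (pkt[1] - 0xA0) * 4   # each packet carries 4 consecutive degrees
    readings = []
    for i in range(4):
        b = pkt[4 + i * 4 : 8 + i * 4]
        invalid = bool(b[1] & 0x80)    # bit set when the reading is bad
        distance_mm = ((b[1] & 0x3F) << 8) | b[0]
        strength = b[2] | (b[3] << 8)
        if not invalid:
            readings.append((base_angle + i, distance_mm, strength))
    return readings

if __name__ == "__main__":
    # Synthetic example frame: angles 0..3, all four readings 1000 mm.
    demo = bytes([0xFA, 0xA0, 0x00, 0x4B] + [0xE8, 0x03, 0x64, 0x00] * 4 + [0, 0])
    print(parse_packet(demo))
```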

Remember, Synthiam doesn't have the overhead of hardware manufacturing. This means high-speed implementation of software features... stay tuned!

Needless to say - ROS and others pave the groundwork for Synthiam to identify what's useful and what isn't, so we can make it accessible to everyone.

Richard - it's never been about who's first, which is why coming in last is a good thing in this business. ;)

Portugal
#18  

What if someone stands in front of the beacon? In my humble opinion the future is in cameras and robot vision, and the future is now. With this I do not want to contradict your sharp eye for the tech business; it's just my opinion.

#19   — Edited

What if someone stands in front of the camera or lidar? It's a similar issue. In my opinion, if the component prices are realistic, a combined navigation system using both optical and lidar sensors will work well. The only issue I have is that my need is now. I have three platforms ready to go.

#20  

@Andy Roid "What if someone stands in front of the lidar? It's a similar issue." ROS on the Oculus ROV will navigate around obstacles even if they change position.... In other words, moving furniture around won't affect SLAM navigation...

#21  

Again, I agree, but even my Arduino can work out such a situation.

#22   — Edited

@Andy Roid Well, if your Arduino can do it, then your indoor navigation problem seems solved, right?

What I was trying to say is that SLAM (the Oculus ROV can, anyway) can handle a constantly changing environment and still navigate to its destination or next waypoint... It is way too complicated and expensive, imho, however. Another "reliable," simpler, and cheaper solution is needed...

#23  

Agreed! Let's hope the NEAR future satisfies our needs.

PRO
Synthiam
#24  

There's a lot of work to be done in the SLAM space. I'm intrigued by it and hope it's a success, because what matters to me is that one of the solutions works - I don't care which one:). Beacon technology is used most in warehousing robots, or wherever reliability is required. SLAM and lidar have also been tried for outdoor navigation, and they've been getting better.

The visual SLAM stuff I find most intriguing, because the math must be nuts!

If someone stands in front of an IPS camera sensor, the robot stops and waits for the beacon to be detected again.

#25   — Edited

@DJ... OK, so no hardware manufacturing, got that.... Can I ask if you/Jeremie would do custom boards for users who would pay for them (albeit at a higher price)?... You have the IPS already working, so any chance you guys would make these board(s) on the side for some members willing to pay?

#26   — Edited

Or maybe a DIY kit supplier, with chips pre-programmed with firmware? My uncle had a similar business.

PRO
USA
#27   — Edited

I'm really intrigued by the simplicity and the results...

I don't mind building one, although a PCB/KIT would reduce the build time ....

@DJ: I see there is a BOM and firmware. Can we start? Is the plugin ready to use? Otherwise the challenge does not make sense.

PRO
Canada
#28  

@ptp

The only snag I can foresee is that EZ-Robot has a custom-sized ribbon PCB for their camera, so you won't be able to order the camera directly. It is much shorter than the standard camera you can buy off the shelf. That being said, if you had 2x EZ-Robot cameras (the second generation), you could transplant the camera units themselves onto the IPS. Oh, and the cameras have a 100-degree wide-angle lens installed; those could be a bit tricky to source. One of those lenses also needs the IR filter removed. I will upload some more documentation for the cameras to GitHub today.

I can let @DJ answer your other question. I believe there is a bit more work to be done on the plugin.

PRO
USA
#29  

@Jeremie:

Q1) Can you confirm the wide-angle lens degrees?

Quote:

This is because the lens of an IPS covers 170 or so degrees.
Q2) Is the regular camera (non-wide, with IR filter) also required?

PRO
Canada
#30  

@ptp

Q1. It is a 100-degree wide-angle lens, which covers a much wider view than a normal lens would.

Q2. The documentation that I will be uploading later today will make it clearer.

#31  

Can an adapter board or cable be built to make it fit?

PRO
USA
#32  

@Jeremie: Are the JTAG pins available on the PCB? Can you generate/upload the Gerber files?

PRO
Canada
#33   — Edited

@Andy Roid, unfortunately not. The original (OEM, not EZ-Robot) camera's flex cable is just too long for the IPS.

@ptp, no, the JTAG pins aren't available, but the SWD programming pins are broken out. SWD is the ARM Serial Wire Debug interface used by STMicro parts and only requires 4 pins. SWD works with the ST-Link V2 programmer ($30), which is much more affordable than most JTAG programmers.

I will upload the Gerbers for all the hardware projects to GitHub today.

#34   — Edited

I was just curious. I was wondering if the EZ cameras needed to be attached to the board, or could be remote-mounted as long as they were kept together (side by side), or if the original standard dual cameras could be remotely located. Then the longer cable could be used?

PRO
Canada
#35  

@Andy Roid, I guess that may be technically possible but you would also need something to mount them to in order to keep them flat against a surface.

#36  

A remote mount built into a picture frame was my thought.

#37  

Thanks for the answers. Just my mind running again.

PRO
USA
#39  

@Jeremie: Thanks, the gerber files worked. I ordered the PCBs.

Portugal
#40  

@ptp Are you trying to build the IPS? Regards.

PRO
Synthiam
#41   — Edited

I think what would be useful for all of you is to work on a desired outcome. The conversations about navigation seem to be about autonomy. What's the purpose of your navigation needs? What do you actually want the robot to do?

Remember, a solution requires an actual problem.

Maybe the answer is "I don't know. I want the robot to move around and charge itself and make my breakfast and walk my kids to school." If that's the case, the answer to your navigation needs is still in the exploration stage, and therefore you simply need a navigation device to experiment with.

#42  

Hi DJ, my goal is to be able to have my robot travel to a specified location, perform a function, and return to the home position. There can be multiple locations, determined by a verbal command. A second command would select the function, and then an execute command would start it. An example of a function: speak a greeting, scan a face using an Omron camera for facial recognition, respond, and then return to the home location (a greeter at the front door, or travel to the playroom to tell the kids to be quiet). This is just what comes to mind right now. Does this help? Ron
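
To make it concrete, the flow I picture is something like the sketch below. The names are all made up; in ARC I assume each step would end up being a ControlCommand to the relevant navigation, speech, or camera skill:

```python
# Sketch of the "go there, do a function, come home" flow (hypothetical names,
# not real ARC code). Stand-in functions simply print what the real skills would do.

def speak(text):
    print("ROBOT SAYS:", text)                       # stand-in for the speech skill

def navigate_to(name):
    print("NAVIGATING TO:", name, WAYPOINTS[name])   # stand-in for the navigation skill

WAYPOINTS = {"front door": (0, 450), "play room": (300, 120), "home": (0, 0)}

FUNCTIONS = {
    "greet":    lambda: speak("Hello! Welcome to our home."),
    "be quiet": lambda: speak("Please keep it down in here."),
}

def handle_command(location, function):
    """Verbal command -> go to the location, do the function, return home."""
    navigate_to(location)
    FUNCTIONS[function]()
    navigate_to("home")

handle_command("front door", "greet")
handle_command("play room", "be quiet")
```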

#43   — Edited

Quote:

I think what would be useful for all of you is to work on a desired outcome. The conversations about navigation seem to be about autonomy. What's the purpose of your navigation needs? What do you actually want the robot to do?
I have a dream:

  1. Robot wanders the house (or at least the first floor; stair climbing is probably outside my budget) without getting stuck or lost or banging into walls and furniture.
  2. Robot reacts to family, guests, and intruders, but not to pets that it encounters (I'll leave the details of the specific reaction to a later conversation, but I think this is a use case that all users who have been interested in navigation share on some level).
  3. When I call it (either by voice if within range, or with a request from my computer, phone, or smartwatch), the robot comes to me using the shortest possible unobstructed route. 3a) I want the robot to follow me without bumping into obstructions, even in areas it is not familiar with (I have discussed beacon following before and received a bunch of suggestions, but haven't had time to experiment yet. The obstruction avoidance is why I am interested in lidar for this; I find the ultrasonic and IR distance sensors are not reliable enough).
  4. Robot self-charges when the battery begins to run low and resumes wandering when the charge is complete.
  5. Robot retrieves recognized cat (or dog or child) toys that it has been trained to recognize and brings them to me upon request.

My Neato vacuum can do #1, #3, and #4, and any EZ-Robot can do #2. #5 I can currently do with Roli, but it takes too much human interaction to find the toy and get it picked up. It is more of a remote arm than an autonomous tool, but I am sure if I spent more time with it I could probably make it happen with existing scripting and auto-actions. Maybe some hardware changes so the gripper doesn't need such precise positioning, and I might need specific toys with brighter colors.

So, I think this is all possible with technology that exists; the question is whether it can be done at DIY prices. (I don't want to spend more than about $400 on the navigation piece.)

Alan

Portugal
#44  

Hey DJ, I understand your question but am a bit confused why someone as creative and visionary as you would need a reason for localization in robotics. Well, what is the reason for having a hobby or home robot at this stage in the first place? For me it is all about having fun, exploring new things in robotics, making the most of what exists in this field, and evolving new ideas and possibilities. If someone like me likes autonomous robots (I think we all do), then having the ability to give the robot that awareness is one big step toward autonomy and more. Regards, Paulo

PRO
Synthiam
#45   — Edited

These are great answers.

Thanks for the compliment. My creativity and vision are influenced by your challenges and desired outcomes. I need to understand your true needs before I can provide a solution. As I mentioned, a solution requires a problem... right now there's just "I want navigation." I require more details about the expected outcome. It hasn't been very clear what the outcomes are.

So far it kind of looks like you'd want the robot to randomly navigate around the home and pick up things or scan faces?

The random navigation and exploring might be interesting because it would be like a pet. And when it sees someone it knows, it can say hello, lol.

I wouldn't use the Omron camera, though. The Cognitive Face skill control is super smart: https://synthiam.com/Software/Manual/Cognitive-Face-16210

PRO
USA
#46  

I agree with most of these requests. An autonomous robot moving about like my Neato, without running into things in a fully furnished home or workplace. I believe any company or individual would benefit from this kind of system on their robots.

My Neato app maps the room, and then I can simply use my finger in the app to mark off "do not enter" areas.

User-inserted image

PRO
USA
#47   — Edited

I also like the idea of using something like Azure services to analyze the video stream in real time to detect people moving around the room and/or to recognize and approach them.

#48  

Yes, I believe a robot able to come when called, travel randomly throughout the house, perform limited tasks like take-to and fetch, recognize a face, and basically be a "smart" pet is what is desired, and why. Program-wise, I would like to identify key locations; I pick one and the robot goes there. Self-mapping would be great.

#50  

As hobbyists, we are looking at the same picture. Our technical sophistication level is lower, but not by much. Our pockets aren't as deep, but we still want to accomplish the same results. With the software development and hardware available here, we could do what they are saying, though to a lesser degree.

I'm excited!

#51   — Edited

Reviewing the posts, I think I would like to see a navigation system using an IPS and lidar combination. The IPS would be the main control, with the lidar used for obstacle avoidance and for tracking around objects when they are detected.
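
Roughly, the control loop I picture is sketched below, with made-up helper names (this is not real ARC code): the IPS supplies the heading to the next waypoint, the lidar vetoes or steers around anything close, and the robot stops if the beacon is lost.

```python
# Sketch of the IPS + lidar split (hypothetical helpers, not real ARC code):
# the IPS supplies the goal heading, the lidar handles obstacle avoidance.
import random, time

def ips_reading():
    """Pretend IPS: returns (visible, bearing_to_waypoint_deg, distance_cm)."""
    return True, random.uniform(-20, 20), random.uniform(30, 400)

def lidar_min_distance_ahead():
    """Pretend lidar: closest return (cm) in a cone in front of the robot."""
    return random.uniform(20, 200)

def drive(speed, turn_deg):
    print(f"drive speed={speed:.1f} turn={turn_deg:+.1f}")

def stop():
    print("stop")

for _ in range(5):                      # one iteration per control tick
    visible, bearing, dist = ips_reading()
    if not visible:
        stop()                          # beacon blocked: wait until it's seen again
    elif dist < 25:
        stop()                          # close enough to the waypoint
    elif lidar_min_distance_ahead() < 50:
        drive(0.3, 45)                  # obstacle ahead: slow down and steer around it
    else:
        drive(1.0, bearing)             # clear path: steer toward the beacon
    time.sleep(0.1)
```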

With this in mind, I am beginning to modify my Adventure Bot, which already has an EZ-Robot camera mounted on it. I am adding 3 ultrasonic sensors around the base and a platform for a lidar. I will place EZ-Bits near the rear to allow for the IPS transmitter when it becomes available.

I look forward to updates when they become available.

#52  

Hello DJ, it has been a while since we gave our ideas on what our desires and reasons were. Based on these, do you have any further thoughts? Also, you had mentioned doing a Hack Night showing lidars and how to use them. Any progress?

I really don't want to see this thread go dry, and I understand this is not an easy task. This is a huge endeavor. I hope you can come up with a plan.