
Collaboration of mobile robot and surveillance camera - classic localization (still) needed?

Robotics Asked by mrsing on October 3, 2021

I just started with the topic of mobile robotics, so I'm still working on the concept and doing a little programming, but I have not set up or tested anything yet. I have a differential-drive mobile robot (Lego Mindstorms) and an external camera mounted on the ceiling of my test area, looking top-down. The robot is equipped with a bumper for collision detection and an ultrasonic sensor. The area contains a few (3-4) static obstacles, and there will be another Lego robot moving around as a dynamic obstacle. There are no markers or colors on the ground.

I plan to do a camera self-calibration using odometry data, and to track the robot's position by template matching. The map is created by image processing: edge filtering and then conversion to a grid map.
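As an illustration of the template-matching step, here is a minimal NumPy sketch of sum-of-squared-differences matching (a slow toy stand-in for what OpenCV's `cv2.matchTemplate` does; the image, template, and sizes are all invented):

```python
import numpy as np

def match_template(image, template):
    """Return the (row, col) top-left corner where `template` best
    matches `image`, using sum-of-squared-differences (SSD)."""
    H, W = image.shape
    h, w = template.shape
    best, best_pos = np.inf, (0, 0)
    for r in range(H - h + 1):
        for c in range(W - w + 1):
            patch = image[r:r + h, c:c + w]
            ssd = np.sum((patch - template) ** 2)
            if ssd < best:
                best, best_pos = ssd, (r, c)
    return best_pos

# toy example: a bright 2x2 blob (the "robot") in a dark image
img = np.zeros((10, 10))
img[4:6, 7:9] = 1.0
tmpl = np.ones((2, 2))
print(match_template(img, tmpl))  # -> (4, 7)
```

In practice you would crop the template from a camera frame showing the robot and search each new frame near the last known position to keep it fast.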
So next, localization:
As I get deeper into the subject, I ask myself how best to do the localization while still keeping it a collaborative system. Is it possible, or even sensible, to use a particle filter on the robot's sensor data?
Or is my localization already given by the template matching and image-based position extraction? The problem I see is the second robot, which will make the localization much more complex.

Another question: how can I achieve an iterative map update? The map should react to events, such as the moving robot (dynamic obstacle).

I was reading about layering occupancy/grid maps and updating a master map. Combining that with A* path planning should make it dynamic, I guess?
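To illustrate the layered-map idea, here is a hedged sketch (all grids and positions are invented): a static layer and a dynamic layer are merged into a master occupancy grid, and A* runs on the result. Re-running A* whenever the dynamic layer changes is one simple way to make the planner react to the moving robot:

```python
import heapq
from itertools import count
import numpy as np

def a_star(grid, start, goal):
    """4-connected A* on a binary occupancy grid (1 = occupied)."""
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan
    tie = count()                       # tiebreaker for the heap
    open_set = [(h(start), next(tie), 0, start, None)]
    came_from, g_cost = {}, {start: 0}
    while open_set:
        _, _, g, cur, parent = heapq.heappop(open_set)
        if cur in came_from:
            continue                    # already expanded with lower cost
        came_from[cur] = parent
        if cur == goal:                 # reconstruct path back to start
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        r, c = cur
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < grid.shape[0] and 0 <= nc < grid.shape[1]
                    and grid[nr, nc] == 0):
                ng = g + 1
                if ng < g_cost.get((nr, nc), np.inf):
                    g_cost[(nr, nc)] = ng
                    heapq.heappush(
                        open_set, (ng + h((nr, nc)), next(tie), ng, (nr, nc), cur))
    return None                         # no path exists

# layered maps: static obstacles plus the other robot's current cell
static_layer = np.zeros((5, 5), dtype=int)
static_layer[2, 1:4] = 1                # a wall across the middle
dynamic_layer = np.zeros((5, 5), dtype=int)
dynamic_layer[2, 4] = 1                 # the second robot right now
master = np.maximum(static_layer, dynamic_layer)

path = a_star(master, (0, 0), (4, 4))
print(path)                             # detours around the wall
```

When the camera reports a new position for the second robot, you rebuild `dynamic_layer`, re-merge, and re-plan.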

Thanks and best regards

2 Answers

I am also working on visual tracking using overhead imagery. What I propose is to train a YOLOv3 object detector on an edge device (a camera with a wide-angle lens) to first identify and classify the objects within a scene. Once that is done, the objects can be tracked within the scene using SORT, DeepSORT, or a similar algorithm.

Now regarding navigation: I have experimented with a genetic algorithm (GA), but for a 2D static environment (I will release the code and the graduate report soon). Maybe you could start there; rather than hardcoding the static environment, you may want to train the vision system to build the 2D static map. Once that information is passed to the robot, it can use the GA to find the best path and then move across the map. You mentioned a moving obstacle; since the vision system can track multiple objects, another idea is to run the GA repeatedly and re-optimize your robot's path as it traverses a feasible path while avoiding collision with the companion robot. In my humble opinion, it would be best to fix the scope of your project first, because, as you will soon see in the literature, just this scenario has hundreds of different implementations, each with an increasing level of difficulty.
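As a rough illustration of the GA idea (not my released code; the world, path encoding, and all parameters here are invented for the sketch): each candidate path is one row index per grid column, and fitness penalizes vertical travel plus a large cost for every cell that collides with an obstacle:

```python
import random

# toy world: 8 columns x 8 rows; obstacle cells listed as (col, row)
COLS, ROWS = 8, 8
obstacles = ({(3, r) for r in range(0, 6)} |
             {(5, r) for r in range(3, 8)})
start_row, goal_row = 4, 4

def fitness(path):
    """Lower is better: vertical travel plus a heavy collision penalty."""
    cost = abs(path[0] - start_row) + abs(path[-1] - goal_row)
    for col in range(COLS):
        if (col, path[col]) in obstacles:
            cost += 100                          # collision penalty
        if col > 0:
            cost += abs(path[col] - path[col - 1])  # vertical travel
    return cost

def evolve(pop_size=60, generations=200, mut_rate=0.2, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randrange(ROWS) for _ in range(COLS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[:pop_size // 2]            # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, COLS)
            child = a[:cut] + b[cut:]            # one-point crossover
            if rng.random() < mut_rate:          # point mutation
                child[rng.randrange(COLS)] = rng.randrange(ROWS)
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

Re-running `evolve` each time the tracked companion robot moves (i.e. with an updated `obstacles` set) is the "re-optimize as you go" loop described above.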

On a concluding note, have a look at these papers; I hope they are of some use to you.

  1. Overhead View Person Detection Using YOLO -- https://doi.org/10.1109/UEMCON47517.2019.8992980
  2. Implementation of Path Planning using Genetic Algorithms on Mobile Robots -- https://ieeexplore.ieee.org/document/1688529

Good luck

Answered by Azmyin Md. Kamal on October 3, 2021

You ask several questions, but I will focus mostly on the localization part.

Is it possible or even clever to use a particle filter out of robot sensor data?

Yes. You can use a particle filter, but I don't understand what part of your system is collaborative. Rather, you have a robot operating in a dynamic environment (with a remote sensor). See my next comment.
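For illustration, here is one minimal particle-filter step under invented noise values: particles are propagated by the odometry increment, reweighted by a Gaussian likelihood around the camera fix, and resampled:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1000                                     # number of particles

# particles hold (x, y) pose hypotheses, initially spread uniformly
particles = rng.uniform(0, 10, size=(N, 2))
weights = np.full(N, 1.0 / N)

def predict(particles, odom, motion_noise=0.05):
    """Move every particle by the odometry increment plus noise."""
    return particles + odom + rng.normal(0, motion_noise, particles.shape)

def update(particles, weights, cam_xy, meas_noise=0.3):
    """Reweight particles by how well they explain the camera fix."""
    d2 = np.sum((particles - cam_xy) ** 2, axis=1)
    w = weights * np.exp(-d2 / (2 * meas_noise ** 2))
    return w / w.sum()

def resample(particles, weights):
    """Draw a new particle set; weights reset to uniform."""
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

# one filter step: odometry says we moved (0.5, 0.0);
# the ceiling camera reports the robot near (3.0, 4.0)
particles = predict(particles, np.array([0.5, 0.0]))
weights = update(particles, weights, np.array([3.0, 4.0]))
particles, weights = resample(particles, weights)
estimate = particles.mean(axis=0)
print(estimate)   # concentrates near the camera fix
```

In your setup the camera measurement could be the template-matching result, and the predict step would use the wheel encoders.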

Or is my localization given, just by my template matching and image position extracting?

You can get position (and maybe orientation) measurements from the camera, but the measurements are corrupted by error, so I suggest fusing the measurements from the encoders and the camera in a filter. You mentioned a particle filter, but I would use a Kalman filter (though a particle filter will also work).
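A minimal sketch of such a fusion, assuming a linear position-only state and made-up noise covariances: odometry drives the prediction and the overhead-camera fix drives the update:

```python
import numpy as np

# state: planar position [x, y]; odometry drives the prediction,
# the overhead camera supplies noisy absolute position fixes
x = np.zeros(2)               # initial estimate
P = np.eye(2) * 1.0           # initial covariance
Q = np.eye(2) * 0.02          # odometry (process) noise covariance
R = np.eye(2) * 0.10          # camera (measurement) noise covariance
H = np.eye(2)                 # camera measures position directly

def kf_step(x, P, odom, cam):
    # predict: dead-reckon with the odometry increment
    x_pred = x + odom
    P_pred = P + Q
    # update: blend in the camera fix via the Kalman gain
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (cam - H @ x_pred)
    P_new = (np.eye(2) - K @ H) @ P_pred
    return x_new, P_new

# simulate: the robot walks right; encoders drift, camera is noisy
rng = np.random.default_rng(3)
true_pos = np.zeros(2)
for _ in range(50):
    step = np.array([0.1, 0.0])
    true_pos = true_pos + step
    odom = step + rng.normal(0, 0.05, 2)      # drifting encoders
    cam = true_pos + rng.normal(0, 0.1, 2)    # noisy camera fix
    x, P = kf_step(x, P, odom, cam)
print(x, true_pos)   # the estimate tracks the true position
```

A full version would add heading to the state (making the motion model nonlinear, hence an EKF), but the structure stays the same: predict with odometry, correct with the camera.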

Another Question is, how can I achieve an iterative map update?

I don't know what you mean. I suggest looking at the literature on Simultaneous Localization and Mapping (SLAM). I think you want to do something along these lines, but doing it is not easy.

I was reading about layering the occupancy/grid maps and update the master map. So combining it with A* Path Planning should make it dynamic I guess?

I am not sure what you are asking here. "Dynamic" usually refers to environments that are not static: in a dynamic environment, obstacles or landmarks can change position, while in a static one they do not.

I hope this is helpful. The questions are very general, so they are hard to answer specifically.

Answered by Ralff on October 3, 2021
