Introduction


   We propose a fully autonomous system for mobile robot exploration in unknown environments. Our system employs a novel frontier detection algorithm based on the fast front propagation (FFP) technique and uses parallel path planning to reach the detected frontier regions. Given a 2D occupancy grid map, possibly updated online, our algorithm finds all the frontier points that allow a mobile robot to visit unexplored regions and maximize exploratory coverage. Our FFP method is six to seven times faster than the state-of-the-art wavefront frontier detection algorithm at finding frontier points without compromising detection accuracy, and it can be accelerated further by simplifying the map, again without degrading accuracy. To expedite locating the optimal frontier point, we also eliminate spurious points using an obstacle filter and a novel frontier region (FR) filter. In addition, we parallelize the global planning phase using branch-and-bound A*, where the search space of each thread is confined by the best bound discovered during the parallel search. As a result, our parallel path-planning algorithm running on 20 threads is about 32 times faster than a vanilla exploration system running on a single thread. Our method is validated through extensive experiments, including autonomous robot exploration in both synthetic and real-world scenarios. In the real-world experiment, we demonstrate an autonomous navigation system on a human-sized mobile manipulator robot equipped with a low-end embedded processor, fully integrating our FFP and parallel path-planning algorithms.
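
To make the notion of a frontier concrete: a frontier point is a free cell of the occupancy grid that borders at least one unknown cell. The Python sketch below computes this set by a brute-force scan; it illustrates only the definition, not the FFP algorithm itself, whose speedup comes from propagating a front through free space instead of scanning every cell. The cell labels follow the ROS OccupancyGrid convention (0 = free, 100 = occupied, -1 = unknown).

    import numpy as np

    # Cell labels following the ROS OccupancyGrid convention.
    FREE, OCCUPIED, UNKNOWN = 0, 100, -1

    def find_frontier_points(grid: np.ndarray) -> np.ndarray:
        """Return the (row, col) indices of all frontier cells.

        A frontier cell is a FREE cell with at least one UNKNOWN cell
        among its 8-connected neighbors. This brute-force scan is
        O(HW); FFP computes the same set far faster.
        """
        rows, cols = grid.shape
        unknown = np.pad(grid == UNKNOWN, 1, constant_values=False)
        near_unknown = np.zeros((rows, cols), dtype=bool)
        for dr in (0, 1, 2):          # offsets into the padded mask
            for dc in (0, 1, 2):
                if dr == 1 and dc == 1:
                    continue          # skip the cell itself
                near_unknown |= unknown[dr:dr + rows, dc:dc + cols]
        return np.argwhere((grid == FREE) & near_unknown)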

Autoexplorer (SW architecture)

Figure 1: Autoexplorer overview

The frontier detector finds a set of candidate frontier points, which are subsequently fed into the global planner to identify the optimal frontier point.
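
A minimal sketch of this hand-off, with a single breadth-first search standing in for the paper's parallel branch-and-bound A*: each candidate is ranked by its shortest obstacle-free path length from the robot's cell, and the cheapest reachable candidate is returned as the optimal frontier point.

    from collections import deque
    import numpy as np

    FREE = 0  # free-cell label, as in the detector sketch above

    def pick_optimal_frontier(grid, start, candidates):
        """Pick the candidate frontier point with the cheapest path.

        A single BFS from `start` computes 4-connected path lengths
        over FREE cells; the actual system ranks candidates with a
        parallel branch-and-bound A* instead.
        """
        rows, cols = grid.shape
        dist = np.full((rows, cols), -1, dtype=int)
        dist[start] = 0
        queue = deque([start])
        while queue:
            r, c = queue.popleft()
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                rr, cc = r + dr, c + dc
                if (0 <= rr < rows and 0 <= cc < cols
                        and dist[rr, cc] < 0 and grid[rr, cc] == FREE):
                    dist[rr, cc] = dist[r, c] + 1
                    queue.append((rr, cc))
        reachable = [tuple(p) for p in candidates if dist[tuple(p)] >= 0]
        return min(reachable, key=lambda p: dist[p], default=None)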

Experiments and Results

Experiments in Synthetic Environments

Figure 2: The images in the left-most column are Tesse-Unity environments; their exploration results are shown in the two right columns, where the robot's trajectories are superimposed on top of the explored maps.


Figure 3: AWS Gazebo environments and the corresponding exploration results


We tested our autonomous exploration system in two different virtual environments: 1) the Tesse-Unity simulator and 2) the AWS Gazebo simulator. Each simulator has its own strengths for testing mobile robot navigation tasks: Tesse-Unity offers photo-realistic rendered scenes, while AWS Gazebo provides rich sensory data that interacts with the environment. Our method successfully explored all of these synthetic environments. Figures 2 and 3 show examples of the explored areas with the robot's trajectories.

Real-world experiments (large-scale benchmark)

Figure 4: The detected frontier regions on a large-scale map image from the Deutsche Museum bag file


We tested our system on the Deutsche Museum bag file to evaluate our method on pose-graph-optimization-based SLAM. This dataset covers a very large space whose map grows to 18M cells and contains many potential frontier regions. Our system successfully completed the dataset. Figure 4 shows that the FFP result on a doubly down-sampled map image is qualitatively comparable to the other cases, including DFD.
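
One way to obtain such a down-sampled map is to aggregate each 2x2 block of fine cells into one coarse cell. The sketch below uses a conservative labeling rule (occupied dominates, then unknown, then free); this rule is our assumption for illustration, not necessarily the map simplification used in the paper. Applying it twice yields a doubly down-sampled map.

    import numpy as np

    FREE, OCCUPIED, UNKNOWN = 0, 100, -1

    def downsample_2x(grid: np.ndarray) -> np.ndarray:
        """Conservative 2x map simplification.

        A coarse cell is OCCUPIED if any of its four fine cells is
        occupied, else UNKNOWN if any is unknown, else FREE. (This
        aggregation rule is an assumption made for illustration.)
        """
        h, w = (grid.shape[0] // 2) * 2, (grid.shape[1] // 2) * 2
        blocks = (grid[:h, :w]
                  .reshape(h // 2, 2, w // 2, 2)
                  .transpose(0, 2, 1, 3)
                  .reshape(h // 2, w // 2, 4))
        coarse = np.full((h // 2, w // 2), FREE, dtype=grid.dtype)
        coarse[(blocks == UNKNOWN).any(axis=2)] = UNKNOWN
        coarse[(blocks == OCCUPIED).any(axis=2)] = OCCUPIED
        return coarse

    # A doubly down-sampled map, as in Figure 4:
    # coarse = downsample_2x(downsample_2x(grid))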

Real-world experiments with a real robot

Figure 5: Autonomous navigation in the real-world environment: (a) the Fetch robot approaches the first frontier point; (b) the robot arrives at the first target and then locates the second frontier point; (c) the Fetch arrives at the third goal, subsequently discovering new areas. (d), (e), and (f) are the views from the Fetch robot's perspective at (a), (b), and (c), respectively.
Figure 6: The covered area of the Ewha-SK Telecom Center, where the sequence of blue dots represents the robot's exploration trajectory


Lastly, we tested our SW stack introduced in Figure 1 on the Fetch robot's onboard CPU (Intel i5 Haswell) and 16GB memory. In this experiment, the Fetch robot operated in a fully autonomous mode, relying on our frontier-point detection module, while the ROS navigation stack generated plans to drive the robot toward the detected frontier points. We deployed the Fetch robot on an entire floor of the Ewha-SK Telecom Center. Figure 5 shows the exploration process, and Figure 6 shows the robot's trajectory and the covered area.
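
For reference, a minimal ROS (Python) sketch of how a detected frontier point can be handed to the navigation stack as a move_base goal. The node name and the example coordinates are placeholders; in the actual system, the detector output is wired into the planner as in Figure 1.

    import rospy
    import actionlib
    from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

    def drive_to_frontier(x, y, frame="map"):
        """Send one frontier point to move_base and wait for the result."""
        client = actionlib.SimpleActionClient("move_base", MoveBaseAction)
        client.wait_for_server()
        goal = MoveBaseGoal()
        goal.target_pose.header.frame_id = frame
        goal.target_pose.header.stamp = rospy.Time.now()
        goal.target_pose.pose.position.x = x
        goal.target_pose.pose.position.y = y
        goal.target_pose.pose.orientation.w = 1.0  # arbitrary final heading
        client.send_goal(goal)
        client.wait_for_result()
        return client.get_state()

    if __name__ == "__main__":
        rospy.init_node("frontier_goal_sender")   # hypothetical node name
        drive_to_frontier(2.0, 1.5)               # placeholder coordinates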

Publications

Contact Info

Ewha Graphics Lab
Department of Computer Science & Engineering, Ewha Womans University
  52, Ewhayeodae-gil, Seodaemun-gu, Seoul, Korea, 03760
  +82-2-3277-3925

  Kyung M. Han, hankm@ewha.ac.kr
  Young J. Kim, kimy@ewha.ac.kr