Mobile robot

SLAM (Simultaneous Localization and Mapping) system with four RGB-D cameras

A SLAM system using four RGB-D cameras (Microsoft Kinect) has been developed. 3D environmental structures are captured in real time while the robot moves. For precise localization, CPS-SLAM is used: the robot's position is measured by a parent robot equipped with a total station (laser range finder).
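The map-building step can be pictured as a chain of rigid transforms: each camera's point cloud is brought into the robot frame through its fixed extrinsic calibration, and then into the world frame using the precise robot pose reported by the parent robot's total station. A minimal sketch in Python/NumPy is given below; the data layout and the helper names (rigid_transform, build_global_map) are illustrative assumptions, not the actual CPS-SLAM implementation.

```python
import numpy as np

def rigid_transform(points, R, t):
    """Apply a rigid transform (3x3 rotation R, 3-vector t) to an Nx3 point array."""
    return points @ R.T + t

def build_global_map(camera_clouds, camera_extrinsics, robot_pose):
    """Fuse point clouds from four RGB-D cameras into the world frame.

    camera_clouds     : list of four Nx3 arrays, one per Kinect (camera frame)
    camera_extrinsics : list of four (R_cam, t_cam) pairs, camera frame -> robot frame
    robot_pose        : (R_robot, t_robot), robot frame -> world frame, as measured
                        by the parent robot's total station
    """
    R_robot, t_robot = robot_pose
    world_points = []
    for cloud, (R_cam, t_cam) in zip(camera_clouds, camera_extrinsics):
        in_robot = rigid_transform(cloud, R_cam, t_cam)                    # camera -> robot
        world_points.append(rigid_transform(in_robot, R_robot, t_robot))  # robot -> world
    return np.vstack(world_points)
```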

Photos: SLAM using four Kinect cameras / System configuration / Measurement results

Papers

Fast 3D localization for mobile robot using Normal Distributions Transform

We propose an efficient 3D global localization and tracking technique for a mobile robot in a large-scale environment, using a 3D geometric map and an RGB-D camera. With the rapid development of high-resolution 3D range sensors, high-speed processing of large amounts of 3D data has become an urgent challenge in robotic applications such as localization. To tackle this problem, the proposed technique uses an ND (Normal Distributions) voxel representation. First, the 3D geometric map, represented as point clouds, is converted into ND voxels, and local features are extracted and stored as an environmental map. Range data captured by the RGB-D camera are likewise converted into ND voxels, and their local features are calculated. For global localization and tracking, the similarity between ND voxels in the environmental map and in the sensory data is evaluated using the local features or the Kullback-Leibler divergence, and the optimal position is determined within a particle filter framework.
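A minimal sketch of the two core operations, assuming a simple dictionary-of-voxels layout, is shown below: converting a point cloud into ND voxels (per-voxel mean and covariance) and scoring the agreement between map and sensor voxels with the Kullback-Leibler divergence between their Gaussians. The voxel size, the regularization, and the exponential weighting are illustrative choices rather than the exact implementation.

```python
import numpy as np

def to_nd_voxels(points, voxel_size=1.0, min_points=5):
    """Convert an Nx3 point cloud into ND voxels: {voxel index: (mean, covariance)}."""
    voxels = {}
    keys = np.floor(points / voxel_size).astype(int)
    for key in np.unique(keys, axis=0):
        pts = points[np.all(keys == key, axis=1)]
        if len(pts) >= min_points:
            mean = pts.mean(axis=0)
            cov = np.cov(pts.T) + 1e-6 * np.eye(3)   # regularize for invertibility
            voxels[tuple(key)] = (mean, cov)
    return voxels

def kl_divergence(mu0, cov0, mu1, cov1):
    """KL divergence between 3D Gaussians N(mu0, cov0) and N(mu1, cov1)."""
    inv1 = np.linalg.inv(cov1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(inv1 @ cov0) + diff @ inv1 @ diff - 3
                  + np.log(np.linalg.det(cov1) / np.linalg.det(cov0)))

def particle_likelihood(map_voxels, sensor_voxels):
    """Score one pose hypothesis: low KL divergence in overlapping voxels -> high weight."""
    score = 0.0
    for key, (mu_s, cov_s) in sensor_voxels.items():
        if key in map_voxels:
            mu_m, cov_m = map_voxels[key]
            score += np.exp(-kl_divergence(mu_s, cov_s, mu_m, cov_m))
    return score
```

In the full filter, the sensor cloud would be transformed by each particle's pose hypothesis before voxelization, so that hypotheses consistent with the environmental map receive the largest weights.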

Photos: Kinect color image / Kinect depth image / Localization process

Papers

Spatial change detection using voxel classification by normal distributions transform

Detection of spatial changes around a robot is indispensable in several robotic applications, such as search and rescue, security, and surveillance. This work proposes a fast spatial change detection technique for a mobile robot using an on-board RGB-D/stereo camera and a highly precise 3D map created by a 3D laser scanner. The technique first converts the point clouds of the map and of the measured data into grid data (ND voxels) using the normal distributions transform and classifies the ND voxels into three categories. The voxels in the map and in the measured data are then compared according to the category and the features of the ND voxels. Overlapping and voting techniques are also introduced to detect spatial changes more robustly. We conducted experiments with a mobile robot equipped with real-time range sensors to confirm the performance of the proposed real-time localization and spatial change detection techniques in indoor and outdoor environments.
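The sketch below illustrates the voxel-comparison idea in Python/NumPy, reusing the ND-voxel dictionary format from the localization sketch above. The three categories used here (sphere, plane, line, derived from the eigenvalue structure of each voxel's covariance) and the change-detection rule are assumptions for illustration, not the paper's exact classification or voting scheme.

```python
import numpy as np

def classify_nd_voxel(cov, ratio=0.2):
    """Assign an ND voxel to one of three shape categories from its covariance eigenvalues."""
    l1, l2, l3 = np.sort(np.linalg.eigvalsh(cov))[::-1]   # l1 >= l2 >= l3
    if l3 > ratio * l1:
        return "sphere"   # points spread in all directions
    if l2 > ratio * l1:
        return "plane"    # points spread mainly in two directions
    return "line"         # points spread mainly along one direction

def detect_changes(map_voxels, sensed_voxels):
    """Flag voxels that appear only in the sensed data or whose category differs from the map."""
    changed = []
    for key, (mu_s, cov_s) in sensed_voxels.items():
        if key not in map_voxels:
            changed.append(key)                       # newly occupied cell
            continue
        mu_m, cov_m = map_voxels[key]
        if classify_nd_voxel(cov_s) != classify_nd_voxel(cov_m):
            changed.append(key)                       # same cell, different local shape
    return changed
```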

Photos: Mobile robot with Kinect / Detected differences / ICRA 2019 video

Papers

Tour guide robot and personal mobility vehicle

We are developing a tour guide robot and a personal mobility vehicle, which utilize not only on-board sensors but also ambient sensors installed along the streets.

Photos: Personal vehicle / Wheelchair-type personal vehicle / Tour guide robot, Qurin / Tour guide robot, Qurin 2
Photos: Tour guide robot navigation using QZSS and 5G / Real-time transmission of 4K 360-degree video via a 5G network / Navigation using LiDAR and 5G / Tour guide experiment in a theme park / Guide experiment in a hospital

Papers

Autonomous lawn-mowing robot

We are developing an autonomous lawn-mowing robot. The robot is equipped with a QZSS (MICHIBIKI) receiver for centimeter-level positioning using CLAS, and a 3D LiDAR for obstacle detection. Navigation2 in ROS 2 is adopted for automatic path planning and motion control toward multiple targets.
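As an illustration of multi-target navigation with Navigation2, the sketch below sends a list of mowing waypoints through the nav2_simple_commander Python API. The frame name and waypoint coordinates are placeholders, and the robot-specific setup (CLAS-based positioning, LiDAR costmaps) is assumed to be configured elsewhere in the Nav2 stack.

```python
import rclpy
from geometry_msgs.msg import PoseStamped
from nav2_simple_commander.robot_navigator import BasicNavigator, TaskResult

def make_pose(navigator, x, y):
    """Build a PoseStamped in the map frame (orientation left as identity)."""
    pose = PoseStamped()
    pose.header.frame_id = 'map'
    pose.header.stamp = navigator.get_clock().now().to_msg()
    pose.pose.position.x = x
    pose.pose.position.y = y
    pose.pose.orientation.w = 1.0
    return pose

def main():
    rclpy.init()
    navigator = BasicNavigator()
    navigator.waitUntilNav2Active()          # wait for localization and Nav2 servers

    # Placeholder mowing targets in the map frame (coordinates are illustrative).
    targets = [(5.0, 0.0), (5.0, 2.0), (0.0, 2.0), (0.0, 4.0)]
    waypoints = [make_pose(navigator, x, y) for x, y in targets]

    navigator.followWaypoints(waypoints)     # Nav2 plans and drives to each target in turn
    while not navigator.isTaskComplete():
        feedback = navigator.getFeedback()   # progress could be monitored here

    if navigator.getResult() == TaskResult.SUCCEEDED:
        print('All mowing waypoints reached.')

if __name__ == '__main__':
    main()
```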

Photos: 1st model autonomous lawn-mowing robot / 2nd model autonomous lawn-mowing robot

Papers