This is a robot localisation package for lidar-map based localisation using multi-sensor state estimation.

Overview

A ROS-based NDT localizer with multi-sensor state estimation

This repo provides ROS-based multi-sensor robot localisation. An NDT localizer is loosely coupled with wheel odometry and an IMU for continuous global localisation within a pre-built point cloud map.

Prerequisites

You will need the robot_localization package. The multi-sensor configuration of our robot is detailed in cfgs/global_ekf.yaml and cfgs/local_ekf.yaml.
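
For reference, a robot_localization EKF configuration has roughly the shape sketched below. This is only an illustrative sketch: the topic names and the choice of fused fields are assumptions, and the authoritative values are the ones in cfgs/local_ekf.yaml and cfgs/global_ekf.yaml.

frequency: 30
two_d_mode: false
map_frame: map
odom_frame: odom
base_link_frame: base_link
world_frame: odom            # the global EKF would use "map" here

odom0: /odom                 # wheel odometry topic (assumed name)
odom0_config: [false, false, false,
               false, false, false,
               true,  true,  false,
               false, false, true,
               false, false, false]

imu0: /imu/data              # IMU topic (assumed name)
imu0_config: [false, false, false,
              true,  true,  true,
              false, false, false,
              true,  true,  true,
              true,  true,  true]

Each *_config matrix toggles, in order, [x, y, z, roll, pitch, yaw, vx, vy, vz, vroll, vpitch, vyaw, ax, ay, az].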

Localization in a point cloud map (pcd)

A demo video on the court yard dataset:


How to use

Build in your ROS workspace

Clone this repo into your ROS workspace's src/ directory, then build with catkin_make (or catkin build):

cd catkin_ws/src/
git clone https://github.com/FAIRSpace-AdMaLL/ndt_localizer.git
cd ..
catkin_make

Get a pcd map

You need a point cloud map (pcd format) for localization. You can get an HD point cloud map from an HD map supplier, or you can build one yourself (its accuracy will of course not be as high as a commercial HD map's).

We use our offline version of lio-sam to build the point cloud map: https://github.com/FAIRSpace-AdMaLL/liosam_mapper
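
If you build the map yourself, it is worth sanity-checking the resulting .pcd before wiring it into the localizer. With the pcl-tools package installed you can view it with, for example (the path here is illustrative):

pcl_viewer ~/maps/court_yard_map.pcd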

Previously-generated maps can be downloaded from here.
The court_yard data (rosbags) for mapping or testing ndt_localizer can be downloaded here: Court Yard Data.
The beach data (rosbags and previously-generated maps) can be downloaded here: Beach Data.

Setup configuration

Config map loader

Move your map pcd file (.pcd) to the map folder inside this project (ndt_localizer/map), then change pcd_path in map_loader.launch to your pcd path, for example:

<arg name="pcd_path"  default="$(find ndt_localizer)/map/court_yard_map.pcd"/>

Then, in ndt_localizer.launch, modify the trajectory path (used as a height map for initialization):

<arg name="path_file" default="$(find ndt_localizer)/map/court_yard_map.csv" doc="Mapping trajectory as height map" />

You also need to configure the submap parameters:

<arg name="submap_size_xy" default="50.0" />
<arg name="submap_size_z" default="20.0" />
<arg name="map_switch_thres" default="25.0" />
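
The intended interaction of these parameters can be pictured with the small sketch below. It is an assumption about the behaviour, not the package's actual map_loader code: a submap of roughly submap_size_xy x submap_size_xy x submap_size_z metres is cropped around the current position, and a new crop is requested once the vehicle has moved more than map_switch_thres metres from the submap centre.

// Illustrative submap-switching logic (assumed behaviour, not the package's actual source).
#include <Eigen/Dense>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/filters/crop_box.h>

pcl::PointCloud<pcl::PointXYZ>::Ptr
cropSubmap(const pcl::PointCloud<pcl::PointXYZ>::ConstPtr& full_map,
           const Eigen::Vector3f& center, float size_xy, float size_z)
{
  pcl::CropBox<pcl::PointXYZ> box;
  box.setInputCloud(full_map);
  box.setMin(Eigen::Vector4f(center.x() - size_xy / 2.0f, center.y() - size_xy / 2.0f,
                             center.z() - size_z / 2.0f, 1.0f));
  box.setMax(Eigen::Vector4f(center.x() + size_xy / 2.0f, center.y() + size_xy / 2.0f,
                             center.z() + size_z / 2.0f, 1.0f));
  pcl::PointCloud<pcl::PointXYZ>::Ptr submap(new pcl::PointCloud<pcl::PointXYZ>);
  box.filter(*submap);
  return submap;
}

// A new submap is cropped once the vehicle leaves the switch radius:
//   if ((current_position - submap_center).head<2>().norm() > map_switch_thres) { re-crop }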

Config point cloud downsample

Configure your lidar point cloud topic in launch/points_downsample.launch:

<arg name="points_topic" default="/os_cloud_node/points" />

If your lidar data is sparse (e.g. VLP-16), set a smaller leaf_size in launch/points_downsample.launch, such as 1.0. If your lidar point cloud is dense (VLP-32, Hesai Pandar40P, HDL-64, etc.), keep leaf_size at 2.0.
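
leaf_size is the voxel edge length (in metres) of the downsampling filter: each voxel of that size is reduced to a single point, so a smaller value keeps more points. A minimal sketch of such a filter, assuming a standard PCL voxel grid:

// Voxel-grid downsampling sketch (assumed to mirror what the downsample node does).
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/filters/voxel_grid.h>

pcl::PointCloud<pcl::PointXYZ>::Ptr
downsample(const pcl::PointCloud<pcl::PointXYZ>::ConstPtr& input, float leaf_size)
{
  pcl::VoxelGrid<pcl::PointXYZ> voxel;
  voxel.setInputCloud(input);
  voxel.setLeafSize(leaf_size, leaf_size, leaf_size);  // e.g. 1.0 for VLP-16, 2.0 for dense lidars
  pcl::PointCloud<pcl::PointXYZ>::Ptr output(new pcl::PointCloud<pcl::PointXYZ>);
  voxel.filter(*output);
  return output;
}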

Config static tf

There is a static transform from /world to /map:

<node pkg="tf2_ros" type="static_transform_publisher" name="world_to_map" args="0 0 0 0 0 0 map world" />
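
Once the launch file is running, you can verify that the transform is available with the standard tf tooling, e.g.:

rosrun tf tf_echo world map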

Config ndt localizer

You can configure the NDT parameters in ndt_localizer.launch. The main parameters of the NDT algorithm are:

<arg name="trans_epsilon" default="0.05" doc="The maximum difference between two consecutive transformations in order to consider convergence" />
<arg name="step_size" default="0.1" doc="The newton line search maximum step length" />
<arg name="resolution" default="3.0" doc="The ND voxel grid resolution" />
<arg name="max_iterations" default="50" doc="The number of iterations required to calculate alignment" />
<arg name="converged_param_transform_probability" default="3.0" doc="" />
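
These launch arguments correspond to the setters of the NDT registration class. The sketch below shows how they are typically wired in, assuming the PCL (or pclomp) NormalDistributionsTransform implementation:

// How the launch parameters typically map onto PCL's NDT API (illustrative sketch).
#include <pcl/point_types.h>
#include <pcl/registration/ndt.h>

void configureNdt(pcl::NormalDistributionsTransform<pcl::PointXYZ, pcl::PointXYZ>& ndt)
{
  ndt.setTransformationEpsilon(0.05);  // trans_epsilon
  ndt.setStepSize(0.1);                // step_size
  ndt.setResolution(3.0);              // resolution
  ndt.setMaximumIterations(50);        // max_iterations
  // After ndt.align(output, initial_guess), the result is typically accepted only when
  // ndt.getTransformationProbability() exceeds converged_param_transform_probability (3.0 here).
}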

These default parameters work well with 64- and 32-beam lidars.

Run the localizer

Once you get your pcd map and configuration ready, run the localizer with:

cd catkin_ws
source devel/setup.bash
roslaunch ndt_localizer ndt_localizer.launch

Wait a few seconds for the map to load, then you can see your pcd map in rviz.

Give an initial pose of the current vehicle with 2D Pose Estimate in rviz.

This operation sends an initial pose to the topic /initialpose. You will then see the localization result:
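
If you prefer the command line to rviz, an equivalent initial pose can be published as a geometry_msgs/PoseWithCovarianceStamped on /initialpose (the pose values below are placeholders):

rostopic pub -1 /initialpose geometry_msgs/PoseWithCovarianceStamped \
  '{header: {frame_id: map}, pose: {pose: {position: {x: 0.0, y: 0.0, z: 0.0}, orientation: {w: 1.0}}}}'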

Then play the rosbag in another terminal (e.g. rosbag play --clock court_yard_wed_repeat_night_2021-03-03-19-07-18.bag).

The robot will start localization:

The final localization message is published on /odometry/filtered/global by the multi-sensor state estimator, which fuses wheel odometry, IMU and lidar localisation.
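
You can inspect the fused output directly from the command line, for example:

rostopic echo -n 1 /odometry/filtered/global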

The localizer also publishes the tf between map and base_link:

---
transforms: 
  - 
    header: 
      seq: 0
      stamp: 
        secs: 1566536121
        nsecs: 251423898
      frame_id: "map"
    child_frame_id: "base_link"
    transform: 
      translation: 
        x: -94.8022766113
        y: 544.097351074
        z: 42.5747337341
      rotation: 
        x: 0.0243843578881
        y: 0.0533175268768
        z: -0.702325920272
        w: 0.709437048124
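
If you want to consume this transform in your own node, a minimal tf2 lookup sketch (standard tf2_ros usage, not code taken from this package) looks like:

// Minimal listener that looks up base_link in the map frame (illustrative sketch).
#include <ros/ros.h>
#include <tf2_ros/transform_listener.h>
#include <tf2/exceptions.h>
#include <geometry_msgs/TransformStamped.h>

int main(int argc, char** argv)
{
  ros::init(argc, argv, "map_to_base_link_listener");
  ros::NodeHandle nh;

  tf2_ros::Buffer buffer;
  tf2_ros::TransformListener listener(buffer);

  ros::Rate rate(10.0);
  while (ros::ok())
  {
    try
    {
      geometry_msgs::TransformStamped tf_msg =
          buffer.lookupTransform("map", "base_link", ros::Time(0));
      ROS_INFO("base_link in map: x=%.2f y=%.2f z=%.2f",
               tf_msg.transform.translation.x,
               tf_msg.transform.translation.y,
               tf_msg.transform.translation.z);
    }
    catch (const tf2::TransformException& ex)
    {
      ROS_WARN("%s", ex.what());
    }
    rate.sleep();
  }
  return 0;
}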

Acknowledgement

Thanks to AdamShan's autoware_ros implementation: https://github.com/AbangLZU/ndt_localizer.git.
