Direct LiDAR Odometry: Fast Localization with Dense Point Clouds

Overview

DLO is a lightweight and computationally-efficient frontend LiDAR odometry solution with consistent and accurate localization. It features several algorithmic innovations that increase speed, accuracy, and robustness of pose estimation in perceptually-challenging environments and has been extensively tested on aerial and legged robots.

This work was part of NASA JPL Team CoSTAR's research and development efforts for the DARPA Subterranean Challenge, in which DLO was the primary state estimation component for our fleet of autonomous aerial vehicles.


Instructions

DLO requires an input point cloud of type sensor_msgs::PointCloud2 with an optional IMU input of type sensor_msgs::Imu. Note that although IMU data is not required, it can be used for initial gravity alignment and will help with point cloud registration.
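To illustrate what initial gravity alignment from an IMU involves (a minimal sketch under assumed conventions, not DLO's actual implementation), the rotation that takes a static accelerometer average onto the world +z axis can be computed as follows:

```python
# Illustrative sketch of initial gravity alignment from a static
# accelerometer average. This is NOT DLO's implementation; function
# names and conventions here are assumptions for illustration only.
import math

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]

def gravity_align_quaternion(accel_avg):
    """Quaternion (w, x, y, z) rotating the sensed gravity direction
    onto the world +z axis."""
    g = normalize(accel_avg)   # at rest, the accelerometer senses +g "up"
    z = [0.0, 0.0, 1.0]
    axis = cross(g, z)         # rotation axis perpendicular to g and z
    s = math.sqrt(sum(x * x for x in axis))
    c = g[2]                   # dot(g, z)
    if s < 1e-9:               # already aligned (or exactly anti-aligned)
        return (1.0, 0.0, 0.0, 0.0) if c > 0 else (0.0, 1.0, 0.0, 0.0)
    angle = math.atan2(s, c)
    axis = [x / s for x in axis]
    hw, hs = math.cos(angle / 2.0), math.sin(angle / 2.0)
    return (hw, axis[0]*hs, axis[1]*hs, axis[2]*hs)

def rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    u = [x, y, z]
    t = [2.0 * c for c in cross(u, v)]
    ut = cross(u, t)
    return [v[i] + w * t[i] + ut[i] for i in range(3)]

# A tilted IMU at rest measures gravity off the z axis:
q = gravity_align_quaternion([1.0, 0.0, 9.0])
up = rotate(q, normalize([1.0, 0.0, 9.0]))
# `up` is now approximately [0, 0, 1]
```

Averaging the accelerometer over a short stationary window (hence the brief calibration period mentioned below) suppresses noise before computing this rotation.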

Dependencies

Our system has been tested extensively on both Ubuntu 18.04 Bionic with ROS Melodic and Ubuntu 20.04 Focal with ROS Noetic, although other versions may work. The following configuration with required dependencies has been verified to be compatible:

  • Ubuntu 18.04 or 20.04
  • ROS Melodic or Noetic (roscpp, std_msgs, sensor_msgs, geometry_msgs, pcl_ros)
  • C++14
  • CMake >= 3.16.3
  • OpenMP >= 4.5
  • Point Cloud Library >= 1.10.0
  • Eigen >= 3.3.7

Installing the binaries via apt should work:

sudo apt install libomp-dev libpcl-dev libeigen3-dev 

Compiling

Create a catkin workspace, clone the direct_lidar_odometry repository into the src folder, and compile via the catkin_tools package (or catkin_make if preferred):

mkdir ws && cd ws && mkdir src && catkin init && cd src
git clone https://github.com/kennyjchen/direct_lidar_odometry.git
catkin build

Execution

After sourcing the workspace, launch the DLO odometry and mapping ROS nodes via:

roslaunch direct_lidar_odometry dlo.launch \
  pointcloud_topic:=/robot/velodyne_points \
  imu_topic:=/robot/vn100/imu

Make sure to edit the pointcloud_topic and imu_topic input arguments with your specific topics. If IMU is not being used, set the dlo/imu ROS param to false in cfg/dlo.yaml. However, if IMU data is available, please allow DLO to calibrate and gravity align for three seconds before moving. Note that the current implementation assumes that LiDAR and IMU coordinate frames coincide, so please make sure that the sensors are physically mounted near each other.
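For reference, disabling the IMU in cfg/dlo.yaml would look something like the fragment below (illustrative only; the exact key layout may differ between versions):

```yaml
dlo:
  imu: false   # set to true to enable gravity alignment and IMU-aided registration
```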

Test Data

For your convenience, we provide example test data here (9 minutes, ~4.2GB). To run, first launch DLO (with default point cloud and IMU topics) via:

roslaunch direct_lidar_odometry dlo.launch

In a separate terminal session, play back the downloaded bag:

rosbag play dlo_test.bag


Citation

If you found this work useful, please cite our manuscript:

@article{chen2021direct,
  title={Direct LiDAR Odometry: Fast Localization with Dense Point Clouds},
  author={Chen, Kenny and Lopez, Brett T and Agha-mohammadi, Ali-akbar and Mehta, Ankur},
  journal={arXiv preprint arXiv:2110.00605},
  year={2021}
}

Acknowledgements

We thank the authors of the FastGICP and NanoFLANN open-source packages:

  • Kenji Koide, Masashi Yokozuka, Shuji Oishi, and Atsuhiko Banno, “Voxelized GICP for Fast and Accurate 3D Point Cloud Registration,” in IEEE International Conference on Robotics and Automation (ICRA), IEEE, 2021, pp. 11054–11059.
  • Jose Luis Blanco and Pranjal Kumar Rai, “NanoFLANN: a C++ Header-Only Fork of FLANN, A Library for Nearest Neighbor (NN) with KD-Trees,” https://github.com/jlblancoc/nanoflann, 2014.

License

This work is licensed under the terms of the MIT license.


Issues
  • jacobian

    Hello authors, I want to derive the Jacobian used in the code. I see that a global perturbation is used for the update. When I try this myself, I run into a problem (see attached image) and would appreciate your help: using the Woodbury matrix identity seems to make the formula more complicated, so I am stuck. Thanks in advance!

    Best regards, Xiaoliang Jiao

    opened by narutojxl 6
  • the transformation between your lidar and imu?

    I use a VLP-16 LiDAR and a MicroStrain IMU (3DM-GX5-25) to run your program, but I failed. I think my transformation between the IMU and LiDAR is different from yours, so could you share your exact LiDAR-to-IMU transformation? The attached picture shows my configuration. Thank you very much!

    opened by nonlinear1 6
  • some trivial questions

    Hello authors, I have some trivial questions and would appreciate your help. Thanks!

    opened by narutojxl 4
  • update key frames

    Thanks for fully and quickly answering my last question! If the feature information of the environment changes significantly after mapping, the accuracy and stability of localization will suffer. To avoid this, updating exactly those keyframes whose feature information has changed seems like the better solution. Following your suggestion, how can keyframes whose feature information has changed be updated precisely? Thanks.

    opened by qweesiic 2
  • Covariances Calculation in updatekeyframe

    Great job!

    I found that covariances are calculated in the function updatekeyframe(). If the code in the attached picture is executed after updatekeyframe(), could the time spent on the covariance calculation inside updatekeyframe() be saved?

    opened by qweesiic 2
  • how to localize on a given pcd map?

    I have trouble understanding how to localize on a given 3D PCD map; can anyone explain the steps?

    I have a PCD map that I can load into RViz, along with LiDAR points and IMU data. How can I use this package for localization?

    opened by srinivasrama 2
  • getSubmapKeyframes()

    Hello authors, thanks for sharing this excellent work. In getSubmapKeyframes() we call computeConvexHull() to find the indices of keyframes whose positions (xyz) form the convex hull, and push those indices into keyframe_convex. Suppose keyframe_convex = [3, 6, 8, 10, 13]. We then push the distance from the current scan position to each convex-hull keyframe, in order, into convex_ds; suppose convex_ds = [1.0, 0.2, 1.4, 0.3, 3.0]. In pushSubmapIndices() we then select the submap_kcv_ smallest elements, i.e. the submap_kcv_ convex-hull keyframes nearest the current scan. Suppose submap_kcv_ = 2. The distances 0.2 and 0.3 are selected, and indices 1 and 3 are pushed into submap_kf_idx_curr; but I think we should instead push the corresponding entries of keyframe_convex, namely 6 and 10, which are the keyframe indices for distances 0.2 and 0.3. Is that right?
    If I have misread the code, please correct me. If I am right, the same problem also exists in the pushSubmapIndices() call for computeConcaveHull(). By the way, this->pose in L1252 is the last scan's position in the global frame; why not use the current scan's initial guess in the global frame, this->T_s2s, obtained from this->propagateS2S(T_S2S)? Perhaps since the motion between scans is small at 10 Hz LiDAR rate, the effect is negligible. Thanks very much for your time and help!

    opened by narutojxl 2
  • how to set a different initialpose?

    Hi, thanks for your team's great work. I have tested it with my own bag, and the result is very close to my ground truth. I want to set a different initial pose, such as (100, 100, 0, 0, 0, 1, 0.026), but it doesn't work: robot/dlo/odom always begins at (0, 0, 0, 0, 0, 0, 1). I changed the following code in odom.cc (the original default initialization is commented out):

    this->odom.pose.pose.position.x = 3972.02;
    this->odom.pose.pose.position.y = 3497.76;
    this->odom.pose.pose.position.z = 0;
    this->odom.pose.pose.orientation.w = 0.026;
    this->odom.pose.pose.orientation.x = 0.;
    this->odom.pose.pose.orientation.y = 0.;
    this->odom.pose.pose.orientation.z = 1;
    this->odom.pose.covariance = {0.};

    this->T = Eigen::Matrix4f::Identity();
    this->T_s2s = Eigen::Matrix4f::Identity();
    this->T_s2s_prev = Eigen::Matrix4f::Identity();

    this->pose_s2s = Eigen::Vector3f(0., 0., 0.);
    this->rotq_s2s = Eigen::Quaternionf(1., 0., 0., 0.);

    this->pose = Eigen::Vector3f(3972.02, 3497.76, 0.);
    this->rotq = Eigen::Quaternionf(0.026, 0., 0., 0.);

    I would really appreciate it if someone could help me.

    opened by numb0824 2
  • Relative gyro propagation quaternion dynamics

    Hi, thank you for your excellent work! My question: the quaternion computation from gyro measurements in the code (void dlo::OdomNode::integrateIMU()) is inconsistent with the one in the paper. Please help me resolve this. Thank you.

    opened by CogmanCVN 0
  • Map generation

    Hi, how is the map generated? What algorithm is used? Can we upload a PCD map of the environment and then apply DLO just for localization? I assumed DLO was a localization algorithm, but it seems to take in the point cloud data from the bag file and generate a map on its own.

    opened by Srichitra-S 9
  • Coordinate Frame

    Hi,

    Thank you for sharing this great work.

    I have trouble identifying the coordinate system of the odometry. Which frame do you use for odometry, NED or body? I assumed NED, so I expected x to increase when I move the LiDAR + IMU north, but the behavior was not stable; likewise, moving east did not produce a positive increase on the y axis. I suspect you use the body frame or some other coordinate frame.

    I also wonder how to send the odometry to a drone autopilot. I used mavros, selected the related odometry topic, and forwarded the messages unchanged. The autopilot receives the odometry, but x and y appear swapped on the autopilot side even though they are not swapped in the DLO output.

    In summary: which coordinate frame do you use for odometry? How can I set the odometry frame to NED? x and y decrease when I move the LiDAR + IMU north and east respectively, and z looks reversed by 180 degrees. Should I negate x and y, or apply a rotation matrix, to fix this?

    How can I ensure that the autopilot and your odometry use the same coordinate frame?

    opened by danieldive 3
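The indexing concern raised in the getSubmapKeyframes() issue above can be sketched in a few lines. This is illustrative Python, not the actual DLO code; the variable names mirror the ones used in that issue:

```python
# Sketch of the submap index selection discussed in the
# getSubmapKeyframes() issue (not the actual DLO implementation).
# keyframe_convex holds keyframe indices on the convex hull;
# convex_ds holds the distance from the current scan to each of them.
keyframe_convex = [3, 6, 8, 10, 13]
convex_ds = [1.0, 0.2, 1.4, 0.3, 3.0]
submap_kcv = 2  # number of nearest convex-hull keyframes to keep

# Positions of the smallest distances *within* convex_ds:
nearest_positions = sorted(range(len(convex_ds)),
                           key=lambda i: convex_ds[i])[:submap_kcv]
# nearest_positions == [1, 3]

# The fix proposed in the issue: map those positions back through
# keyframe_convex to recover the actual keyframe indices:
submap_kf_idx = [keyframe_convex[i] for i in nearest_positions]
# submap_kf_idx == [6, 10]
```

Pushing `nearest_positions` directly would select keyframes 6 and 8 instead of the intended 6 and 10, which is exactly the off-by-mapping behavior the issue describes.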
Releases: v1.4.0

Owner: VECTR at UCLA (Verifiable & Control-Theoretic Robotics Laboratory)