Fixes some extrinsic parameter import problems. A 6-axis IMU now works, and a lidar without a "ring" field now works.

Overview

LVI-SAM-MODIFIED

This repository is a modified version of LVI-SAM.


Modifications

  • Added a function to load extrinsic parameters from the config files. The original code assumes there is no translation between the sensors and hard-codes their parameters; with this change, both our own datasets and the official LVI-SAM datasets work well.
  • Added "lidar to imu extrinsics" to params_camera.yaml.
  • Added "Extrinsics (lidar -> cam)" to params_lidar.yaml.
  • Used MahonyAHRS to calculate the orientation quaternion, so you no longer need a 9-axis IMU (see the first sketch after this list).
  • Added a lidar ring calculation method: it works whether or not your lidar provides "ring" information (see the second sketch after this list).
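
The README does not include the filter code itself; below is a minimal sketch of the standard Mahony AHRS update for a 6-axis IMU (gyroscope + accelerometer only), following the widely used public-domain MahonyAHRS reference implementation. The class name, gain values, and dt-based interface are illustrative assumptions, not this repository's exact code.

```cpp
// Sketch: Mahony AHRS orientation update from gyro (rad/s) and accel (any scale).
// Follows the public-domain MahonyAHRS reference implementation; names and the
// dt-based interface are illustrative, not this repository's exact code.
#include <cmath>

struct MahonyAHRS {
    float q0 = 1.0f, q1 = 0.0f, q2 = 0.0f, q3 = 0.0f;  // orientation quaternion
    float twoKp = 1.0f;                                 // 2 * proportional gain
    float twoKi = 0.0f;                                 // 2 * integral gain
    float ifbx = 0.0f, ifby = 0.0f, ifbz = 0.0f;        // integral feedback terms

    void updateIMU(float gx, float gy, float gz,
                   float ax, float ay, float az, float dt) {
        // Use the accelerometer only if it provides a usable gravity direction.
        if (!(ax == 0.0f && ay == 0.0f && az == 0.0f)) {
            float n = std::sqrt(ax * ax + ay * ay + az * az);
            ax /= n; ay /= n; az /= n;

            // Estimated gravity direction (half magnitude) from the quaternion.
            float vx = q1 * q3 - q0 * q2;
            float vy = q0 * q1 + q2 * q3;
            float vz = q0 * q0 - 0.5f + q3 * q3;

            // Error is the cross product of measured and estimated gravity.
            float ex = ay * vz - az * vy;
            float ey = az * vx - ax * vz;
            float ez = ax * vy - ay * vx;

            if (twoKi > 0.0f) {  // integral feedback on the gyro rates
                ifbx += twoKi * ex * dt; ifby += twoKi * ey * dt; ifbz += twoKi * ez * dt;
                gx += ifbx; gy += ifby; gz += ifbz;
            }
            gx += twoKp * ex; gy += twoKp * ey; gz += twoKp * ez;  // proportional
        }

        // Integrate the quaternion rate: q += 0.5 * q * (0, g) * dt.
        gx *= 0.5f * dt; gy *= 0.5f * dt; gz *= 0.5f * dt;
        float qa = q0, qb = q1, qc = q2;
        q0 += -qb * gx - qc * gy - q3 * gz;
        q1 +=  qa * gx + qc * gz - q3 * gy;
        q2 +=  qa * gy - qb * gz + q3 * gx;
        q3 +=  qa * gz + qb * gy - qc * gx;

        // Renormalize to keep a unit quaternion.
        float n = std::sqrt(q0 * q0 + q1 * q1 + q2 * q2 + q3 * q3);
        q0 /= n; q1 /= n; q2 /= n; q3 /= n;
    }
};
```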
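A common way to recover a missing "ring" index is to bin each point's vertical angle, as LeGO-LOAM-style range-image projection does. The sketch below assumes a Velodyne-style lidar with uniformly spaced beams (VLP-16 numbers shown); the function name and default parameters are illustrative, not this repository's exact code.

```cpp
// Sketch: derive the ring (scan row) index from a point's vertical angle when
// the driver does not publish a "ring" field. Assumes uniformly spaced beams;
// the defaults are VLP-16 values (16 rings, -15 deg bottom beam, 2 deg step).
#include <cmath>

int computeRing(float x, float y, float z,
                int n_scan = 16, float ang_bottom = 15.0f, float ang_res_y = 2.0f) {
    // Vertical angle of the point in degrees (57.29578 = 180 / pi).
    float vert = std::atan2(z, std::sqrt(x * x + y * y)) * 57.29578f;
    // Shift by the bottom beam angle and bin by the vertical resolution.
    int ring = static_cast<int>((vert + ang_bottom) / ang_res_y + 0.5f);
    return (ring >= 0 && ring < n_scan) ? ring : -1;  // -1: outside the FOV
}
```

For other lidars, ang_bottom and ang_res_y must match the datasheet; the ang_y entry in params_lidar.yaml (vertical angular span divided by the number of scan lines) plays the same role as ang_res_y here.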

Notes

  • Most of the changes are marked with the comment "# 修改的地方 modified" (修改的地方 = "modified place").
  • If you are using VSCode and "#include xxx" reports an error, press Ctrl+Shift+P, run C/C++: Edit Configurations (UI), and add the following paths to Include path (a sample of the resulting configuration file is shown after this list): /opt/ros/melodic/include/** and /usr/include/**
  • Please make sure you have the same versions of the dependencies as LVI-SAM. If you have problems installing or importing multiple versions of a dependency, you can refer to this blog.
  • You need to download and compile yaml-cpp (a minimal sketch of loading extrinsics with it also follows this list).
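
For reference, after running C/C++: Edit Configurations (UI), the generated .vscode/c_cpp_properties.json might look roughly like the following; the "name" field and the workspace glob are the usual defaults and are shown here as assumptions:

```json
{
  "configurations": [
    {
      "name": "Linux",
      "includePath": [
        "${workspaceFolder}/**",
        "/opt/ros/melodic/include/**",
        "/usr/include/**"
      ]
    }
  ],
  "version": 4
}
```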
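And here is a minimal sketch of what loading extrinsics with yaml-cpp can look like, composing a translation and roll/pitch/yaw angles into an Eigen transform. The key names follow the "lidar to camera extrinsic" block quoted in the comments below; the function name, the Z-Y-X rotation order, and the assumption that the angles are in radians are all illustrative, not this repository's exact loader. Note that OpenCV-style files beginning with %YAML:1.0 may need that directive stripped before yaml-cpp will parse them.

```cpp
// Sketch: read lidar->camera extrinsics from a YAML file with yaml-cpp and
// build an Eigen transform. Key names follow the params_camera.yaml block
// quoted below; rotation order and units are assumptions, not the repo's code.
#include <string>
#include <yaml-cpp/yaml.h>
#include <Eigen/Dense>

Eigen::Affine3d loadLidarToCamExtrinsic(const std::string& path) {
    YAML::Node cfg = YAML::LoadFile(path);

    // Translation in meters.
    Eigen::Vector3d t(cfg["lidar_to_cam_tx"].as<double>(),
                      cfg["lidar_to_cam_ty"].as<double>(),
                      cfg["lidar_to_cam_tz"].as<double>());

    // Rotation given as roll/pitch/yaw (assumed radians), composed Z * Y * X.
    double rx = cfg["lidar_to_cam_rx"].as<double>();
    double ry = cfg["lidar_to_cam_ry"].as<double>();
    double rz = cfg["lidar_to_cam_rz"].as<double>();
    Eigen::Matrix3d R =
        (Eigen::AngleAxisd(rz, Eigen::Vector3d::UnitZ()) *
         Eigen::AngleAxisd(ry, Eigen::Vector3d::UnitY()) *
         Eigen::AngleAxisd(rx, Eigen::Vector3d::UnitX())).toRotationMatrix();

    Eigen::Affine3d T = Eigen::Affine3d::Identity();
    T.linear() = R;       // rotation part
    T.translation() = t;  // translation part
    return T;
}
```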

Acknowledgement

This repository is based on the original LVI-SAM.

Comments
  • Are you sure about the camera coordinate frame?

    Hello, I have a question about the coordinate frames in the figure https://github.com/skyrim835/LVI-SAM-modified/blob/d035a501fb3bb04ff074a7e1f21a8bedf44cf3bf/images/LVI-SAM-original-coordinates.png. According to the original LVI-SAM code, https://github.com/TixiaoShan/LVI-SAM/blob/9b9ba85e300c382a272347b32af397ca3e8a78a5/src/visual_odometry/visual_feature/feature_tracker.h#L142, the lidar frame should match your figure, but the camera frame should be x to the right, z forward, and y down. Are you sure about the camera coordinate frame in the figure?

    opened by h-k8888 7
  • Question about flashing in Rviz when running LVI-SAM-modified

    @skyrim835
    Thanks for your great work!
    I have some questions about flashing in Rviz.

    When I run LVI-SAM-modified with my custom dataset (Velodyne + RealSense D435i), the map blinks as shown below; the mapping is unstable.

    When I run Faster-LIO it works fine, so I don't think the dataset is the cause.

    Did the same phenomenon occur when you ran LVI-SAM-modified?
    If not, what part should I fix to solve the problem?

    I'd appreciate it if you could give me an answer.

    Thanks,

    opened by Taeyoung96 2
  • Problems setting up my own sensors

    Hi! I am using a ZED 2 and an RS-LiDAR-16. I calibrated the ZED 2 camera with kalibr, then calibrated the left camera against the lidar with [calibration_camera_lidar], and changed all the relevant parameters in config/params_camera.yaml, config/params_lidar.yaml, config/cutom/params_camera.yaml, and config/cutom/params_lidar.yaml, including the camera intrinsics and the imu-camera-lidar extrinsics, but it still fails at runtime. Where did I go wrong? Error description:

    ```
    [ INFO] [1660833435.146233507]: ----> Visual Feature Tracker Started.
    OpenCV Error: Assertion failed (0 <= _rowRange.start && _rowRange.start <= _rowRange.end && _rowRange.end <= m.rows) in Mat, file /build/opencv-L2vuMj/opencv-3.2.0+dfsg/modules/core/src/matrix.cpp, line 483
    terminate called after throwing an instance of 'cv::Exception'
      what():  /build/opencv-L2vuMj/opencv-3.2.0+dfsg/modules/core/src/matrix.cpp:483: error: (-215) 0 <= _rowRange.start && _rowRange.start <= _rowRange.end && _rowRange.end <= m.rows in function Mat
    [lvi_sam_visual_feature-7] process has died [pid 6043, exit code -6, cmd /home/shangwei/prac2_ws/devel/lib/lvi_sam/lvi_sam_visual_feature __name:=lvi_sam_visual_feature __log:=/home/shangwei/.ros/log/0af21f40-1e32-11ed-9117-80fa5b3e83f4/lvi_sam_visual_feature-7.log]. log file: /home/shangwei/.ros/log/0af21f40-1e32-11ed-9117-80fa5b3e83f4/lvi_sam_visual_feature-7*.log
    ```

    params_camera.yaml:

    ```yaml
    %YAML:1.0

    # Project
    project_name: "lvi_sam"

    # common parameters
    imu_topic: "/zed2/zed_node/imu/data_raw"
    image_topic: "/zed2/zed_node/left_raw/image_raw_color"
    point_cloud_topic: "lvi_sam/lidar/deskew/cloud_deskewed"

    # Lidar Params
    use_lidar: 1                      # whether to use depth info from lidar or not
    lidar_skip: 3                     # skip this amount of scans
    align_camera_lidar_estimation: 1  # align camera and lidar estimation for visualization

    # lidar to camera extrinsic
    lidar_to_cam_tx: -0.0553684
    lidar_to_cam_ty: -0.081193
    lidar_to_cam_tz: -0.0244271
    lidar_to_cam_rx: -1.22575643
    lidar_to_cam_ry: 1.20940651
    lidar_to_cam_rz: -1.21497081

    # imu to lidar extrinsic
    imu_to_lidar_tx: -2.59153
    imu_to_lidar_ty: 0.0203358
    imu_to_lidar_tz: -0.107606
    imu_to_lidar_rx: -1.22046289
    imu_to_lidar_ry: -1.23098981
    imu_to_lidar_rz: -1.19244752

    # camera model
    model_type: PINHOLE
    camera_name: camera

    image_width: 1280
    image_height: 720
    distortion_parameters:
       k1: 0.0497547
       k2: 0.0217677
       p1: -0.000146471
       p2: -0.000630983
    projection_parameters:
       fx: 534.86
       fy: 534.645
       cx: 634.875
       cy: 356.9945

    # imu parameters. The more accurate parameters you provide, the better the performance
    acc_n: 1.4e-03   # accelerometer measurement noise standard deviation
    gyr_n: 8.6e-05   # gyroscope measurement noise standard deviation
    acc_w: 8.0e-05   # accelerometer bias random walk noise standard deviation
    gyr_w: 2.2e-06   # gyroscope bias random walk noise standard deviation
    g_norm: 9.805
    imu_hz: 400      # frequency of imu

    # Extrinsic parameter between IMU and Camera
    estimate_extrinsic: 0  # 0  Have accurate extrinsic parameters. We will trust the following imu^R_cam, imu^T_cam; don't change it.
                           # 1  Have an initial guess about extrinsic parameters. We will optimize around your initial guess.
                           # 2  Don't know anything about extrinsic parameters. You don't need to give R,T. We will try to calibrate it. Do some rotation movement at the beginning.
    # Rotation from camera frame to imu frame, imu^R_cam
    extrinsicRotation: !!opencv-matrix
       rows: 3
       cols: 3
       dt: d
       data: [-0.00948391,  0.01912883,  0.99977205,
              -0.99985211,  0.01416312, -0.00975565,
              -0.0143465,  -0.99971671,  0.01899168]
    # Translation from camera frame to imu frame, imu^T_cam
    extrinsicTranslation: !!opencv-matrix
       rows: 3
       cols: 1
       dt: d
       data: [-0.00477194, 0.02212476, 0.05040654]

    # feature tracker parameters
    max_cnt: 150       # max feature number in feature tracking
    min_dist: 20       # min distance between two features
    freq: 20           # frequency (Hz) of publishing the tracking result. At least 10 Hz for good estimation. If set to 0, the frequency will be the same as the raw image
    F_threshold: 1.0   # ransac threshold (pixel)
    show_track: 1      # publish tracking image as topic
    equalize: 1        # if the image is too dark or too bright, turn on equalization to find enough features
    fisheye: 0         # if using a fisheye camera, turn it on. A circular mask will be loaded to remove edge noise points

    # optimization parameters
    max_solver_time: 0.035   # max solver iteration time (ms), to guarantee real time
    max_num_iterations: 10   # max solver iterations, to guarantee real time
    keyframe_parallax: 10.0  # keyframe selection threshold (pixel)

    # unsynchronization parameters
    estimate_td: 0              # online estimate time offset between camera and imu
    td: -0.004492987156480011   # initial value of time offset. unit: s. read image clock + td = real image clock (IMU clock)

    # rolling shutter parameters
    rolling_shutter: 0     # 0: global shutter camera, 1: rolling shutter camera
    rolling_shutter_tr: 0  # unit: s. rolling shutter readout time per frame (from data sheet)

    # loop closure parameters
    loop_closure: 1        # start loop closure
    skip_time: 0.0
    skip_dist: 0.0
    debug_image: 0         # save raw image in loop detector for visualization purposes; you can disable this by setting 0
    match_image_scale: 0.5
    vocabulary_file: "/config/brief_k10L6.bin"
    brief_pattern_file: "/config/brief_pattern.yml"
    ```

    params_lidar.yaml:

    ```yaml
    # project name
    PROJECT_NAME: "lvi_sam"

    lvi_sam:

      # Topics
      pointCloudTopic: "/velodyne_points"      # Point cloud data
      imuTopic: "/zed2/zed_node/imu/data_raw"  # IMU data

      # Heading
      useImuHeadingInitialization: false       # if using GPS data, set to "true"

      # Export settings
      savePCD: false                                     # https://github.com/TixiaoShan/LIO-SAM/issues/3
      savePCDDirectory: "/software/LVI-SAM/picresults/"  # in your home folder, starts and ends with "/". Warning: the code deletes the "LOAM" folder then recreates it. See "mapOptimization" for implementation

      # Sensor Settings
      N_SCAN: 16          # number of lidar channels (i.e., 16, 32, 64, 128)
      Horizon_SCAN: 1800  # lidar horizontal resolution (Velodyne: 1800, Ouster: 512, 1024, 2048)
      ang_y: 1.0          # vertical angular span of the beams divided by the number of scan lines (ang/N_SCAN)
      timeField: "time"   # point timestamp field, Velodyne - "time", Ouster - "t"
      downsampleRate: 1   # default: 1. Downsample your data if there are too many points, i.e., 16 = 64 / 4, 16 = 16 / 1

      # IMU Settings
      imuAccNoise: 1.4e-03
      imuGyrNoise: 8.6e-05
      imuAccBiasN: 8.0e-05
      imuGyrBiasN: 2.2e-06
      imuGravity: 9.80511
      imuHz: 400

      # Extrinsics (IMU -> lidar)
      extrinsicTrans: [-2.59153, 0.0203358, -0.107606]
      extrinsicRot: [ 0.218016, -1.18145, -53.0919,
                     -0.996897, -0.132405, 0.763035,
                     -0.109347, -1.00026, -0.504798]
      extrinsicRPY: [ 0.218016, -1.18145, -53.0919,
                     -0.996897, -0.132405, 0.763035,
                     -0.109347, -1.00026, -0.504798]

      # LOAM feature threshold
      edgeThreshold: 1.0
      surfThreshold: 0.1
      edgeFeatureMinValidNum: 10
      surfFeatureMinValidNum: 100

      # voxel filter params
      odometrySurfLeafSize: 0.4   # default: 0.4
      mappingCornerLeafSize: 0.2  # default: 0.2
      mappingSurfLeafSize: 0.4    # default: 0.4

      # robot motion constraint (in case you are using a 2D robot)
      z_tollerance: 1000          # meters
      rotation_tollerance: 1000   # radians

      # CPU Params
      numberOfCores: 4             # number of cores for mapping optimization
      mappingProcessInterval: 0.15 # seconds, regulates mapping frequency

      # Surrounding map
      surroundingkeyframeAddingDistThreshold: 1.0   # meters, regulates keyframe adding threshold
      surroundingkeyframeAddingAngleThreshold: 0.2  # radians, regulates keyframe adding threshold
      surroundingKeyframeDensity: 2.0               # meters, downsample surrounding keyframe poses
      surroundingKeyframeSearchRadius: 50.0         # meters, within n meters scan-to-map optimization (when loop closure disabled)

      # Loop closure
      loopClosureEnableFlag: true
      surroundingKeyframeSize: 25          # submap size (when loop closure enabled)
      historyKeyframeSearchRadius: 20.0    # meters, a key frame within n meters of the current pose will be considered for loop closure
      historyKeyframeSearchTimeDiff: 30.0  # seconds, a key frame n seconds older will be considered for loop closure
      historyKeyframeSearchNum: 25         # number of history key frames fused into a submap for loop closure
      historyKeyframeFitnessScore: 0.3     # icp threshold, the smaller the better the alignment

      # Visualization
      globalMapVisualizationSearchRadius: 1000.0  # meters, global map visualization radius
      globalMapVisualizationPoseDensity: 10.0     # meters, global map visualization keyframe density
      globalMapVisualizationLeafSize: 1.0         # meters, global map visualization cloud density
    ```

    opened by sw-Tom 1