
BMI055 Calibration: Running VINS-Fusion with a RealSense T265 on Ubuntu 16.04

I. Prerequisites

System: Ubuntu 16.04 + ROS Kinetic

It is assumed that you are already familiar with basic Ubuntu commands and basic ROS operations.

II. Testing the RealSense T265 SDK

Official getting-started page: https://www.intelrealsense.com/get-started-tracking-camera/

Follow the installation steps in https://github.com/IntelRealSense/librealsense/blob/development/doc/distribution_linux.md and make sure that running realsense-viewer brings up the viewer interface.
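For reference, once Intel's apt repository has been added as described in that guide, the install and sanity check come down to roughly the following (package names are taken from the librealsense apt distribution; the linked guide remains the authoritative reference):

sudo apt-get install librealsense2-dkms librealsense2-utils
realsense-viewer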

III. Calibrating the RealSense T265

1. Preparation (while building the packages below you may see errors about missing dependencies; this is normal, just install the dependencies named in the error messages and rebuild)

1.1 Download and build Ceres

git clone https://github.com/ceres-solver/ceres-solver

cd ceres-solver

mkdir build

cd build

cmake ..

make

sudo make install
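If cmake or make stops because of missing dependencies, the packages Ceres typically needs on Ubuntu 16.04 are, for example, the following (a non-exhaustive list; install whatever the error message actually names):

sudo apt-get install cmake libgoogle-glog-dev libgflags-dev libatlas-base-dev libeigen3-dev libsuitesparse-dev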

1.2 Download and build code_utils

First, install the dependency:

sudo apt-get install libdw-dev

Then build code_utils:

cd ~/catkin_ws/src

git clone https://github.com/gaowenliang/code_utils

cd ..

catkin_make

source ~/catkin_ws/devel/setup.bash

If you get the error "fatal error: backward.hpp: No such file or directory", locate sumpixel_test.cpp inside code_utils, change #include "backward.hpp" to #include "code_utils/backward.hpp", and run catkin_make again.
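As a shortcut, the include can also be patched from the command line; the path below assumes sumpixel_test.cpp sits in code_utils/src/ (adjust it if your checkout differs):

sed -i 's|#include "backward.hpp"|#include "code_utils/backward.hpp"|' ~/catkin_ws/src/code_utils/src/sumpixel_test.cpp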

I have also run into "No module named XXX" errors; they mean the corresponding dependency is missing, so install it and run catkin_make again.

1.3 Download and build imu_utils

cd ~/catkin_ws/src

git clone https://github.com/gaowenliang/imu_utils.git

cd ..

catkin_make

source ~/catkin_ws/devel/setup.bash

Note: build code_utils first and imu_utils afterwards; they cannot be built together in one pass.

1.4 Download and build kalibr

First install the dependencies:

sudo apt-get install python-setuptools

sudo apt-get install python-setuptools python-rosinstall ipython libeigen3-dev libboost-all-dev doxygen

sudo apt-get install ros-kinetic-vision-opencv ros-kinetic-image-transport-plugins ros-kinetic-cmake-modules python-software-properties software-properties-common libpoco-dev python-matplotlib python-scipy python-git python-pip ipython libtbb-dev libblas-dev liblapack-dev python-catkin-tools libv4l-dev

Next, build kalibr:

cd ~/catkin_ws/src

git clone https://github.com/ethz-asl/Kalibr.git

cd ..

catkin_make

source ~/catkin_ws/devel/setup.bash

Other blog posts mention possible Python-related issues; I used the Python 2.7.12 that ships with Ubuntu 16.04 and had no problems during the build.

1.5 Download and build realsense-ros

First install the dependency:

sudo apt-get install ros-kinetic-ddynamic-reconfigure

Next, build realsense-ros:

cd ~/catkin_ws/src

git clone https://github.com/IntelRealSense/realsense-ros

cd ~/catkin_ws

catkin_make

source ~/catkin_ws/devel/setup.bash

With that, the preparation work is done.

2. Calibration procedure

2.1 Change the read/write permissions of the files in the workspace, otherwise later steps will report errors:

cd ~/catkin_ws

sudo chmod 777 src/ -R

2.2 Modifying the relevant files

Open the rs_t265.launch file in the realsense-ros/realsense2_camera/launch directory and modify the original code accordingly; a sketch of the kind of change involved is given right below.
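A minimal sketch, assuming the goal here is to keep both fisheye streams enabled and to merge the gyro and accelerometer streams into a single /camera/imu topic for imu_utils. The argument names come from realsense2_camera and their defaults vary between releases, so adapt them to your copy of rs_t265.launch:

<arg name="enable_fisheye1"  default="true"/>
<arg name="enable_fisheye2"  default="true"/>
<arg name="unite_imu_method" default="linear_interpolation"/>  <!-- publishes sensor_msgs/Imu on /camera/imu -->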

Then create t265_imu.launch in ~/catkin_ws/src/imu_utils/launch, along the lines of the sketch below.
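This follows the pattern of the sample launch files shipped with imu_utils; the topic name, IMU name, and paths below are assumptions to adjust. In particular, data_save_path controls where BMI055_imu_param.yaml is written, so point it at the folder you will be working in, and max_time_min should match how long you plan to record:

<launch>
    <node pkg="imu_utils" type="imu_an" name="imu_an" output="screen">
        <param name="imu_topic"      type="string" value="/camera/imu"/>
        <param name="imu_name"       type="string" value="BMI055"/>
        <param name="data_save_path" type="string" value="$(find imu_utils)/data/"/>  <!-- change to the folder you created -->
        <param name="max_time_min"   type="int"    value="60"/>
        <param name="max_cluster"    type="int"    value="100"/>
    </node>
</launch>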

2.3 IMU calibration

将realsenseT265插上電腦後,建議在你喜歡的地方建立一個檔案夾,比如我是在桌面上建立了一個檔案夾,然後在其中打開終端,輸入以下指令

roslaunch realsense2_camera rs_t265.launch

roslaunch imu_utils t265_imu.launch
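Both launch files need to be running at the same time. You can confirm in a third terminal that IMU messages are actually arriving (the topic name should match the one configured in t265_imu.launch):

rostopic hz /camera/imu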

Note that seeing "wait for imu data" during the process is normal. After roughly 60 minutes the result is produced: a BMI055_imu_param.yaml file appears in the folder you created. It lists the noise_density (parameters with the suffix n) and random_walk (suffix w) for the three axes of the accelerometer and gyroscope, together with their averages; these averages are needed later for the joint IMU + camera calibration.
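The file has one block per sensor with per-axis and averaged values. An abridged sketch of its shape, reusing the averaged numbers that appear later in this post (your per-axis entries and exact numbers will of course differ):

%YAML:1.0
---
type: IMU
name: BMI055
Gyr:
   unit: " rad/s"
   avg-axis:
      gyr_n: 8.163e-03
      gyr_w: 3.470e-05
Acc:
   unit: " m/s^2"
   avg-axis:
      acc_n: 8.003e-02
      acc_w: 5.382e-03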

2.4 Camera calibration

Download the official april_6x6_80x80cm_A0.pdf (or another calibration target). Print it or display it on a screen, measure its actual dimensions, and then create apriltags.yaml in the folder you created earlier. Mine looks like this:

target_type: 'aprilgrid'   # gridtype
tagCols: 6                 # number of apriltags
tagRows: 6                 # number of apriltags
tagSize: 0.16              # size of an apriltag, edge to edge [m]
tagSpacing: 0.3125         # ratio of space between tags to tagSize
                           # example: tagSize=2m, spacing=0.5m --> tagSpacing=0.25[-]

Then open a terminal in the folder you created and start the RealSense T265:

roslaunch realsense2_camera rs_t265.launch

Throttle the topic publishing rates:

rosrun topic_tools throttle messages /camera/fisheye1/image_raw 10.0 /fisheye1

rosrun topic_tools throttle messages /camera/fisheye2/image_raw 10.0 /fisheye2

Record a bag. Move the camera slowly during recording and make sure the complete calibration target stays sharp and fully visible (you can open rviz before recording and subscribe to the image topics to judge where to move).
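For a quick look at what the camera sees, rviz with an Image display works, or, assuming the image_view package is installed, simply:

rosrun image_view image_view image:=/fisheye1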

rosbag record -O cameras_calibration /fisheye1 /fisheye2

Run kalibr to compute the intrinsic and extrinsic parameters of each camera:

kalibr_calibrate_cameras --target ./apriltags.yaml --bag ./cameras_calibration.bag --bag-from-to 2 35 --models omni-radtan omni-radtan --topics /fisheye1 /fisheye2

You do not have to use the whole recording: 2 and 35 are the desired start and end times (in seconds) and can be changed.

If you hit the error "Cameras are not connected through mutual observations, please check the dataset. Maybe adjust the approx. sync. tolerance." (i.e. the cameras are treated as unsynchronized), you can either change the topic publishing rates or append --approx-sync 0.04 to the kalibr command.
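For example, with the tolerance appended, the command above becomes:

kalibr_calibrate_cameras --target ./apriltags.yaml --bag ./cameras_calibration.bag --bag-from-to 2 35 --models omni-radtan omni-radtan --topics /fisheye1 /fisheye2 --approx-sync 0.04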

This finally produces camchain-.cameras_calibration.yaml, results-cam-.cameras_calibration.txt, and report-cam-.cameras_calibration.pdf.

2.5 Camera-IMU joint calibration

In the folder you created earlier, create an imu.yaml file as follows (fill in the parameters from your BMI055_imu_param.yaml):

#Accelerometers
accelerometer_noise_density: 8.003e-02   #Noise density (continuous-time)
accelerometer_random_walk:   5.382e-03   #Bias random walk

#Gyroscopes
gyroscope_noise_density: 8.163e-03   #Noise density (continuous-time)
gyroscope_random_walk:   3.470e-05   #Bias random walk

rostopic:    /imu    #the IMU ROS topic
update_rate: 200.0   #Hz (for discretization of the values above)

Modify the two relevant lines in rs_t265.launch; a sketch follows.
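A hedged sketch of what those two lines typically end up as, assuming the intent is a single, time-synchronized /camera/imu stream for the joint calibration (again, the argument names come from realsense2_camera and may differ between versions, so adapt them to your launch file):

<arg name="unite_imu_method" default="linear_interpolation"/>
<arg name="enable_sync"      default="true"/>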

Go back to the folder you created, open a terminal, and start the T265:

roslaunch realsense2_camera rs_t265.launch

Throttle the publishing rates:

rosrun topic_tools throttle messages /camera/fisheye1/image_raw 20.0 /fisheye1

rosrun topic_tools throttle messages /camera/fisheye2/image_raw 20.0 /fisheye2

rosrun topic_tools throttle messages /camera/imu 200.0 /imu

Record a bag:

rosbag record -O imu_cameras_calibration /fisheye1 /fisheye2 /imu

Run kalibr to compute the IMU-camera extrinsics:

kalibr_calibrate_imu_camera --target ./apriltags.yaml --cam ./camchain-.cameras_calibration.yaml --imu ./imu.yaml --bag ./imu_cameras_calibration.bag --bag-from-to 2 35 --max-iter 30 --show-extraction

This outputs four files: camchain-imucam-.imu_cameras_calibration.yaml, imu-.imu_cameras_calibration.yaml, results-imucam-.imu_cameras_calibration.txt, and report-imucam-.imu_cameras_calibration.pdf. You can judge the accuracy of your calibration from the PDF report.

At this point the camera and IMU calibration is complete.

IV. Running VINS-Fusion

First download and build VINS-Fusion:

cd ~/catkin_ws/src

git clone https://github.com/HKUST-Aerial-Robotics/VINS-Fusion

cd ..

catkin_make

source ~/catkin_ws/devel/setup.bash

In the VINS-Fusion/config folder, create a folder named realsense_t265 and create three files in it: fisheye1.yaml, fisheye2.yaml, and stereo_imu.yaml, with the contents below (note that the parameters must be adapted to your own results from BMI055_imu_param.yaml, results-cam-.cameras_calibration.txt, and results-imucam-.imu_cameras_calibration.txt):
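For example:

cd ~/catkin_ws/src/VINS-Fusion/config
mkdir realsense_t265
cd realsense_t265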

fisheye1.yaml

%YAML:1.0
---
model_type: MEI
camera_name: camera
image_width: 848
image_height: 800
mirror_parameters:
   xi: 1.75347951
distortion_parameters:
   k1: 0.03509054
   k2: -0.3590536
   p1: 0.00250588
   p2: 0.00058101
projection_parameters:
   gamma1: 788.10829678
   gamma2: 788.19861186
   u0: 416.33019648
   v0: 404.21771229

fisheye2.yaml

%YAML:1.0
---
model_type: MEI
camera_name: camera
image_width: 848
image_height: 800
mirror_parameters:
   xi: 1.95209328
distortion_parameters:
   k1: 0.18993538
   k2: -0.8234989
   p1: 0.00337246
   p2: 0.00013959
projection_parameters:
   gamma1: 843.77832442
   gamma2: 843.19711184
   u0: 419.54774026
   v0: 406.36245572

stereo_imu.yaml

%YAML:1.0

#common parameters
#support: 1 imu 1 cam; 1 imu 2 cam; 2 cam
imu: 1
num_of_cam: 2

imu_topic: "/camera/imu"
image0_topic: "/camera/fisheye1/image_raw"
image1_topic: "/camera/fisheye2/image_raw"
output_path: "~/output"

cam0_calib: "fisheye1.yaml"
cam1_calib: "fisheye2.yaml"
image_width: 848
image_height: 800

# Extrinsic parameters between IMU and camera.
estimate_extrinsic: 1   # 0: accurate extrinsic parameters are provided; the following imu^R_cam, imu^T_cam are trusted as-is
                        # 1: only an initial guess is provided; the optimizer will refine it around your initial guess

body_T_cam0: !!opencv-matrix   # pose of cam0 in the body (IMU) frame
   rows: 4
   cols: 4
   dt: d
   data: [-0.9999306582373404, 0.001093290620038494, 0.01172533294445286, 0.5201790241280906,
          -0.0011878229478030982, -0.9999668259020238, -0.008058298336212454, 0.04796159845159734,
          0.011716133905124508, -0.008071667179143842, 0.9998987850754022, -0.05434762530417168,
          0., 0., 0., 1.]

body_T_cam1: !!opencv-matrix
   rows: 4
   cols: 4
   dt: d
   data: [-0.9999985320918137, 0.0011840662046389947, -0.0012384673758261621, -0.10570692194161202,
          -0.001173065649638605, -0.999960187777609, -0.008845720815767944, 0.048863128512499784,
          -0.0012488919887611945, -0.008844255027525165, 0.9999601089152549, -0.04892047362600982,
          0., 0., 0., 1.]

#Multiple thread support
multiple_thread: 1

#feature tracker parameters
max_cnt: 150            # max feature number in feature tracking
min_dist: 30            # min distance between two features
freq: 10                # frequency (Hz) of publishing the tracking result; at least 10 Hz for good estimation; if set to 0, the frequency equals the raw image rate
F_threshold: 1.0        # RANSAC threshold (pixel)
show_track: 1           # publish tracking image as a topic
flow_back: 1            # perform forward and backward optical flow to improve feature tracking accuracy

#optimization parameters
max_solver_time: 0.04   # max solver iteration time (ms), to guarantee real time
max_num_iterations: 8   # max solver iterations, to guarantee real time
keyframe_parallax: 10.0 # keyframe selection threshold (pixel)

#imu parameters: the more accurate the parameters you provide, the better the performance
acc_n: 8.00274e-02      # accelerometer measurement noise standard deviation.  #0.2  0.04
gyr_n: 8.16326e-03      # gyroscope measurement noise standard deviation.      #0.05  0.004
acc_w: 5.38231e-03      # accelerometer bias random walk noise standard deviation.  #0.002
gyr_w: 3.470378e-05     # gyroscope bias random walk noise standard deviation.      #4.0e-5
g_norm: 9.805           # gravity magnitude

#unsynchronization parameters
estimate_td: 0          # online estimation of the time offset between camera and IMU
td: 0.0                 # initial value of the time offset, unit: s; read image clock + td = real image clock (IMU clock)

#loop closure parameters
load_previous_pose_graph: 0        # load and reuse a previous pose graph; load from 'pose_graph_save_path'
pose_graph_save_path: "~/output"   # save and load path
save_image: 1                      # save images in the pose graph for visualization purposes; set to 0 to disable

Then open a terminal and run the following (each command in its own terminal):

roslaunch realsense2_camera rs_t265.launch

rosrun vins vins_node ~/catkin_ws/src/VINS-Fusion/config/realsense_t265/stereo_imu.yaml

roslaunch vins vins_rviz.launch

and you can happily run VINS-Fusion.

