librealsense: T265 wheel odometry calibration_odometry.json configuration. Wheel odometry fusion is not working.
| Required Info | Value |
|---|---|
| Camera Model | T265 |
| Firmware Version | 0.1.0.279 |
| Operating System & Version | Linux (Ubuntu 18.04.3 LTS bionic) |
| Kernel Version (Linux Only) | 4.15.0-65-generic |
| Platform | Intel NUC i3 |
| SDK Version | 2.29.0 |
| ROS Version | Melodic Morenia 1.14.3 |
| ROS RealSense Version | 2.2.8 |
| Language | C++ |
| Segment | Robot |
Issue Description
I have configured the calibration_odometry.json on my robot to fuse the wheel odometry with the T265 visual-inertial odometry. The T265 camera is mounted on the back of the robot, looking backwards and in a vertical position (rotated 90 degrees) with the cable pointing downwards.
I have read many issues, pull requests, the source code and the documentation about wheel odometry fusion on the T265, but I could not make it work. Here are just a few of them: #3462, Wheel Odometry Calibration Questions, Wheel odometry calibration file format.
For the T265 frame I followed this:
T265 has the following coordinate system:
- Positive X direction is towards the right imager
- Positive Y direction is upwards toward the top of the device
- Positive Z direction is inwards toward the back of the device
The image below shows how the camera is mounted on my robot:
The T265 frame and the base_footprint frame are aligned in the same plane (the center of the robot), so only two translation axes are considered.
The frame used for the wheel odometry data is base_footprint.
So I configured my calibration_odometry.json:
To configure the “W” field, I used this site (rotation_converter) to convert Euler angles (roll, pitch, yaw) to the axis-angle orientation (in radians) used in the json. Because the format used in the json has only 3 parameters, I used the "Axis with angle magnitude (radians) [x, y, z]" representation instead of the "Axis-Angle {[x, y, z], angle (radians)}" representation, which has 4 parameters.
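To double-check that conversion, here is a small C++ sketch using Eigen (just for illustration, Eigen is not part of my setup) that produces the same axis-with-angle-magnitude vector from roll/pitch/yaw:

```cpp
#include <Eigen/Geometry>
#include <cmath>
#include <iostream>

// Convert roll/pitch/yaw (radians, ZYX convention) into the
// axis-with-angle-magnitude vector [x, y, z] expected for "W" in the json.
Eigen::Vector3d rpy_to_axis_angle(double roll, double pitch, double yaw)
{
    Eigen::Quaterniond q = Eigen::AngleAxisd(yaw,   Eigen::Vector3d::UnitZ())
                         * Eigen::AngleAxisd(pitch, Eigen::Vector3d::UnitY())
                         * Eigen::AngleAxisd(roll,  Eigen::Vector3d::UnitX());
    Eigen::AngleAxisd aa(q);
    return aa.axis() * aa.angle();
}

int main()
{
    // Example: a pure roll of -pi/4, as in example 2 of the T265 documentation.
    Eigen::Vector3d w = rpy_to_axis_angle(-M_PI / 4.0, 0.0, 0.0);
    std::cout << w.transpose() << std::endl;  // prints approximately: -0.785398 0 0
    return 0;
}
```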
For both the translation and the orientation, I assumed the T265 as the parent and base_footprint as the child in the “transform”.
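For reference, this is the overall structure I am following, taken from the librealsense wheel-odometry calibration file format documentation (field names as I understand them from that page). The numbers below are only placeholders to show the layout, not my actual measurements:

```json
{
    "velocimeters": [
        {
            "scale_and_alignment": [1.0, 0.0, 0.0,
                                    0.0, 1.0, 0.0,
                                    0.0, 0.0, 1.0],
            "noise_variance": 0.1,
            "extrinsics": {
                "T": [0.0, -0.3, 0.5],
                "T_variance": [1e-6, 1e-6, 1e-6],
                "W": [0.0, 0.0, 1.5707963267948966],
                "W_variance": [1e-6, 1e-6, 1e-6]
            }
        }
    ]
}
```

Here "T" is in meters and "W" is the axis-with-angle-magnitude rotation in radians described above.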
Test
To check the results, I changed the noise_variance parameter in the json:

"noise_variance": 0.00000004445126050420168,
With this low value for the variance, the system that computes the odometry treats the wheel odometry as the most trustworthy data, so it computes the motion based more on the wheel odometry than on the T265 images; in practice, it considers only the wheel odometry. But in this case, the resulting odometry does not register the forward/backward translation when I move the robot in those directions.
I have created a launch file in ROS. When I execute the roslaunch, the calibration_odometry.json is read correctly, the odom_in topic carries the wheel odometry data, and the T265 node is subscribed to this topic. I verified this by debugging the source code.
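For completeness, this is roughly the path the data takes under the hood: the node uploads the json to the device and then streams each wheel-odometry sample to it. A minimal standalone librealsense sketch of that flow (independent of my launch file, and only illustrative) would be:

```cpp
#include <librealsense2/rs.hpp>
#include <fstream>
#include <iterator>
#include <vector>

int main()
{
    // Read the calibration json as raw bytes (the path is just an example).
    std::ifstream in("calibration_odometry.json", std::ios::binary);
    std::vector<uint8_t> calib((std::istreambuf_iterator<char>(in)),
                                std::istreambuf_iterator<char>());

    rs2::pipeline pipe;
    rs2::config cfg;
    cfg.enable_stream(RS2_STREAM_POSE, RS2_FORMAT_6DOF);

    // Resolve the config to get the device, grab its wheel-odometer interface
    // and upload the calibration before starting the pipeline.
    rs2::pipeline_profile profile = cfg.resolve(pipe);
    rs2::wheel_odometer wo = profile.get_device().first<rs2::wheel_odometer>();
    wo.load_wheel_odometery_config(calib);  // note the "odometery" spelling in the API

    pipe.start(cfg);

    // Feed one wheel-odometry sample: sensor index 0, a frame counter, and the
    // translational velocity (m/s) expressed in the velocimeter frame.
    rs2_vector velocity{0.2f, 0.0f, 0.0f};
    wo.send_wheel_odometry(0, 0, velocity);

    return 0;
}
```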
Also, I created a different json config with other (deliberately wrong) values, just as a test. The result was interesting: the odometry computed a translation along the y axis instead of the x axis (considering the base_footprint frame). This makes me think that I am not setting up the calibration_odometry.json correctly. But the “good news” is that the system is taking the wheel odometry data into account.
Can someone help me set up the calibration_odometry.json?
About this issue
- State: closed
- Created 5 years ago
- Comments: 15 (1 by maintainers)
@RealSenseCustomerSupport, sorry, I will be clearer. I might be wrong, but for the translation we have to consider one coordinate frame as a reference, and following many examples using the T265 and the previous documentation, we assume that reference coordinate frame is the “t265 pose frame”. Also, the first bullet point in https://github.com/IntelRealSense/librealsense/blob/development/doc/t265.md#extrinsic-calibration-for-wheeled-odometry-examples says:
I assume “t265 pose frame” = “T265 origin frame”.
So if this frame rotates (example 2), the translation reference will rotate too.
In example 1, the translation parameters are very clear:

"T": [
    0.0,   // No Translation.
    -0.92, // 0.92m below (-Y) the camera.
    0.44   // 0.44m behind (+Z) the camera.
],

The comments in the documentation also explain this:
In example 2, the “W” parameter that accounts for the rotation between frames is right, assuming the same logic. The first component is (-pi/4) because, going from the t265 frame to the robot origin, the rotation is -pi/4 around the X axis. It is negative because of the right-hand rule (X+ points out of the screen).
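To make that concrete: the axis-with-angle-magnitude form is just the unit rotation axis scaled by the angle, so a rotation of -pi/4 rad about the X axis is

`W = [1, 0, 0] * (-pi/4) = [-0.7853981633974483, 0.0, 0.0]`

which is where the -pi/4 first component comes from.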
I had similar doubts concerning the integration of the wheel odometry. I put the robot on a platform so that the wheels could turn freely without the base actually moving. Then I also reduced the wheel variance a lot to make sure that the wheel odometry contribution would be more evident.
This is the robot configuration:
The corresponding configuration file looks like this:
Here the axis-angle rotation is the representation of the rotation from the RealSense frame to the base frame (which corresponds to the velocimeter location and orientation).
To get consistent behavior I had to change the scaling. In particular, if it was set to the identity matrix (the default), moving the wheels while keeping the base still caused the output z position to change (in the /camera/pose/odom msg), while I was expecting the x component to change.
This suggested that the motion output from the wheel odometry is still expressed in the VR frame, and I thought the scaling could help. In fact, changing it to the one reported here gave consistent outputs and I started observing displacement in the x direction. Namely, the wheels spinning positively gave a negative displacement (consistent with the ROS frame convention in which such estimates are expressed).
One additional thing: the wheel odometry message is expressed in the velocimeter frame, thus positive x speed = robot moving forward.
I have never seen the scaling changed in the configuration file in previous posts or issues. It would be nice to get some confirmation/clarification on this.
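For clarity, the scaling I am referring to is the 3x3 row-major scale_and_alignment matrix inside the velocimeter entry, whose default is the identity. As a purely hypothetical illustration (not my actual values, and assuming the matrix is applied directly to the wheel velocity vector), a +90 degree rotation about Y, which sends a velocity along +x onto -z, would look like:

```json
"scale_and_alignment": [ 0.0, 0.0, 1.0,
                         0.0, 1.0, 0.0,
                        -1.0, 0.0, 0.0]
```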