[SW-1245] Custom Gripper Calibration for Spot (#131)
# Change Overview
Related PR for ROS2 Functionality: bdaiinstitute/spot_ros2#455 (comment)

Add a custom automatic calibration routine to allow Spot to calibrate its own hand cameras.

# Testing Done
Ran the entire calibration, end-to-end, several times on a few different Spots and calibration boards in the Spot lab. Also ran different calibrations on the same collected datasets.

Note: Open3D is added as a dependency so that the republisher in the related PR, which shares the same requirements.txt, can do point-cloud-to-image conversions.
glvov-bdai authored Aug 22, 2024
1 parent 72adf67 commit 8d7033f
Showing 7 changed files with 2,042 additions and 0 deletions.
1 change: 1 addition & 0 deletions requirements.txt
@@ -7,6 +7,7 @@ bosdyn-mission==4.0.2
grpcio==1.59.3
image==1.5.33
inflection==0.5.1
open3d==0.18.0
protobuf==4.22.1
pytest==7.3.1
pytest-cov==4.1.0
245 changes: 245 additions & 0 deletions spot_wrapper/calibration/README.md
@@ -0,0 +1,245 @@
# Automatic Robotic Stereo Camera Calibration Utility with Charuco Target (a.k.a Multi-Stereo Madness)
TODO: placeholder for reference image not yet approved for release, hopefully added soon
![spot eye in hand cal](spot_eye_in_hand_cal_thumbnail.jpg)

# Table of Contents

1. [***Overview***](#overview)
2. [***Adapting Automatic Data Collection and Calibration to Your Scenario***](#adapting-automatic-data-collection-and-calibration-to-your-scenario)
3. [***Calibrate Spot Manipulator Eye-In-Hand Cameras With the CLI Tool***](#calibrate-spot-manipulator-eye-in-hand-cameras-with-the-cli-tool)
- [Robot and Target Setup](#robot-and-target-setup)
- [Example Usage](#example-usage-aka-hand-specific-live-incantations)
- [Improving Calibration Quality](#improving-calibration-quality)
- [Using the Registered Information with Spot ROS 2](#using-the-registered-information-with-spot-ros-2)
4. [***Using the CLI Tool To Calibrate On an Existing Dataset***](#using-the-cli-tool-to-calibrate-on-an-existing-dataset)
5. [***Understanding the Output Calibration Config File from the CLI***](#understanding-the-output-calibration-config-file-from-the-cli)
6. [***Recreate the Core Calibration CLI Tool Without Depending On Spot Wrapper***](#recreate-the-core-calibration-cli-tool-without-depending-on-spot-wrapper)

# Overview
This utility streamlines automatic
camera calibration to **solve for the intrinsic and extrinsic parameters of two or more
cameras mounted in a fixed pose relative to each other on a robot**
by moving the robot to view a Charuco target from different poses. If you already
have an existing dataset of synchronized stereo (or multi-stereo) photos of a Charuco target from different viewpoints,
you can use the CLI tool to compute the intrinsic/extrinsic parameters. Additionally,
the CLI tool can store multiple unique calibration runs, along with their calibration metadata, in one configuration file,
to document related runs with different setups or parameters.

This was developed to calibrate the two cameras at the
end of Spot's optional manipulator payload, while being **as general as possible
so that it can be easily adapted to new robots and camera configurations. The core calibration utilities depend only on NumPy and OpenCV.**

This utility works by sampling viewpoints relative to the Charuco calibration board,
visiting those viewpoints, and snapping photos with all cameras at each viewpoint. Then,
these images are used to calibrate the desired cameras both individually and with respect to each
other. This utility is particularly good at handling partial views of the board
due to its use of a Charuco board. For more info, see ```calibration_util.py```, where the functions
```get_multiple_perspective_camera_calibration_dataset``` and ```multistereo_calibration_charuco```
do most of the heavy lifting.
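
For intuition about what those functions do internally, here is a minimal single-camera sketch of the Charuco workflow using the legacy ```cv2.aruco``` contrib API (OpenCV < 4.7; newer versions expose an equivalent object-oriented API). This is illustrative only, not the utility's actual implementation; the board proportions and the image folder mirror the examples later in this README.
```
import glob

import cv2

# Board proportions mirror the example config at the end of this README
# (9x4 checkers, 0.115 m squares, 0.09 m markers, DICT_4X4_50).
aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
board = cv2.aruco.CharucoBoard_create(9, 4, 0.115, 0.09, aruco_dict)

all_corners, all_ids, image_size = [], [], None
for path in sorted(glob.glob("existing_dataset/0/*.png")):
    image = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    marker_corners, marker_ids, _ = cv2.aruco.detectMarkers(image, aruco_dict)
    if marker_ids is None:
        continue  # board not found in this view
    # Charuco corner interpolation tolerates partial views of the board.
    num_corners, charuco_corners, charuco_ids = cv2.aruco.interpolateCornersCharuco(
        marker_corners, marker_ids, image, board
    )
    if num_corners > 3:
        all_corners.append(charuco_corners)
        all_ids.append(charuco_ids)
        image_size = image.shape[:2][::-1]  # (width, height)

# Solve for this camera's matrix and distortion coefficients.
rms, camera_matrix, dist_coeffs, _, _ = cv2.aruco.calibrateCameraCharuco(
    all_corners, all_ids, board, image_size, None, None
)
print(f"RMS reprojection error: {rms:.4f}")
```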

# Adapting Automatic Data Collection and Calibration to Your Scenario

Assuming that you have a Charuco board you'd like to use to automatically calibrate your cameras/robot:

**To calibrate a new robot with new cameras using the utility**, implement the abstract class ```AutomaticCameraCalibrationRobot``` from ```automatic_camera_calibration_robot.py```
(only five methods, which are likely analogous to what's needed for automatic calibration even if this utility isn't used;
in Spot's case, excluding comments, it's under ~250 lines of code, see ```spot_in_hand_camera_calibration.py```),
and pass the implemented class as an argument to ```get_multiple_perspective_camera_calibration_dataset``` from ```calibration_util.py``` (see ```calibrate_spot_hand_camera_cli.py``` for an example). A minimal skeleton is sketched below.
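
For orientation, here is a minimal skeleton of that pattern. The import path is assumed from this repository's layout, and only ```capture_images``` (which this README references elsewhere) is named; the full set of five abstract methods and their real signatures are defined in ```automatic_camera_calibration_robot.py```.
```
from spot_wrapper.calibration.automatic_camera_calibration_robot import (
    AutomaticCameraCalibrationRobot,
)


class MyRobotCalibration(AutomaticCameraCalibrationRobot):
    """Sketch of a robot-specific implementation; see spot_in_hand_camera_calibration.py
    for the complete, real Spot example."""

    def capture_images(self):
        # Return one synchronized image per camera, in a fixed order; that order
        # becomes each camera's index in the saved calibration file.
        raise NotImplementedError

    # ... implement the remaining abstract methods (moving to sampled viewpoints,
    # localizing the board, returning to a safe state, ...) as declared in
    # automatic_camera_calibration_robot.py ...


# The implemented robot is then handed to the dataset collector, e.g.
#   images = get_multiple_perspective_camera_calibration_dataset(MyRobotCalibration(...), ...)
# See calibrate_spot_hand_camera_cli.py for the actual argument list.
```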

**Adding a new camera to register with Spot's existing hand cameras** is as easy as appending
the new camera's image in ```SpotInHandCalibration.capture_images``` in ```spot_in_hand_camera_calibration.py``` to the existing
list of images obtained with the default cameras (assuming that the new camera
is fixed relative to the existing cameras), as in the sketch below.
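
A hypothetical illustration of that pattern (the names here are placeholders, not actual APIs in this repo):
```
import numpy as np


def capture_images_with_extra_camera(default_images: list, new_camera_frame: np.ndarray) -> list:
    # default_images: frames already captured from Spot's default hand cameras.
    # new_camera_frame: a synchronized frame from the newly mounted camera (placeholder).
    images = list(default_images)
    images.append(new_camera_frame)  # its position in the list becomes its camera index
    return images
```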


# Calibrate Spot Manipulator Eye-In-Hand Cameras With the CLI Tool

There is an [existing method to calibrate the gripper cameras with the Spot Api](https://dev.bostondynamics.com/protos/bosdyn/api/proto_reference#bosdyn-api-spot-GripperCameraCalibrationCommandRequest). However, the existing method doesn't offer
as much control and extensibility as this custom procedure.

In Spot's hand, there is an RGB camera as well as a ToF camera, which ```calibrate_spot_hand_camera_cli.py```
automatically co-registers using the calibration utility via the default calibration board included
with most Spots. You can also use a different board if you'd like; just set the right CLI parameters
(see the example usage below).

## Robot and Target Setup
Have Spot sit on the ground with the arm stowed such that nothing is within a meter of the robot.
Spot should NOT be on its charging dock.

No program should have robot control authority. Make sure Spot is sitting
with the arm stowed before releasing robot authority.
If you are using the ROS 2 Driver, disconnect from Spot. If you are using the tablet to control Spot,
select power icon (top) -> Advanced -> Release control.

Place the Spot default calibration board on the ground leaning against something in front of Spot, so
that the calibration board's pattern is facing Spot's front (where the stowed arm points).
The up arrow should point in the direction of the sky. The board should be tilted away from
Spot at a 45 degree angle, so that the bottom of the board is closer to Spot than the top of the board. The ground beneath
the board should be at about a 45 degree angle from the board. The board's bottom should be about a meter away
from the front of Spot while sitting. Nothing should be within a meter of the robot.

**See the reference image at the top of this README to see good board placement relative to Spot.**
TODO: reference image not yet approved

When calibrating, Spot will stand up, ready its arm, lower its base slightly, and
lower its arm slightly. As soon as this happens, if Spot
can see a Charuco board it will start to move the arm around for the calibration.

If the calibration board isn't placed at the right location, or there is more
than one board visible to the robot, this may lead to potentially unsafe behavior, so be ready to
E-Stop the robot at any moment. If the robot starts exhibiting unexpected behavior, stopping the program,
turning off the computer running the calibration, and hijacking control from the tablet can help stop robot movement.

It is possible to calibrate at further distances
(see the ```--dist_from_board_viewpoint_range``` arg), but the sampled viewpoint
distances from the board must be reachable from where the base of the
robot starts relative to the board. The previously recommended Spot and target setup is what worked well
in testing for the default viewpoint distance range.

After the calibration is finished, Spot stows its arm and sits back down. At this point,
it is safe to take control of Spot from the tablet or ROS 2, even if the calibration script is still
running. Just don't stop the script, or it will stop calculating the parameters :)

## Example Usage (a.k.a Hand Specific Live Incantations)
For all possible arguments to the Hand Specific CLI tool, run ```python3 calibrate_spot_hand_camera_cli.py -h```.
Many parameters are customizable.

If you'd like to calibrate depth to RGB, with RGB at the default resolution, saving photos to ```~/my_collection/```
and the calibration result to ```~/my_collection/calibrated.yaml```, here is an example CLI command template, under the default tag (recommended for a first run):
```
python3 calibrate_spot_hand_camera_cli.py --ip <IP> -u user -pw <SECRET> --data_path ~/my_collection/ \
--save_data True --result_path ~/my_collection/calibrated.yaml --photo_utilization_ratio 1 --stereo_pairs "[(1,0)]" \
--spot_rgb_photo_width=640 --spot_rgb_photo_height=480 --tag default
```
If you'd like to load those photos and run the calibration with slightly different parameters,
while saving both the results and the parameters to the same config file as in the previous example,
here is an example CLI command template (from recorded images, no data collection):
```
python3 calibrate_spot_hand_camera_cli.py --data_path ~/my_collection/ --from_data \
--result_path ~/my_collection/bottle_calibrated.yaml --photo_utilization_ratio 2 --stereo_pairs "[(1,0)]" \
--spot_rgb_photo_width=640 --spot_rgb_photo_height=480 --tag less_photos_used_test_v1
```
If you'd like to calibrate depth to RGB at a greater resolution, while sampling
viewpoints at finer X-rotation steps relative to the board, and slightly further from the board
with finer distance steps, here is an example CLI command template. Also,
to demonstrate the stereo pairs argument, let's assume that you also want to find RGB to depth (redundant, for
demonstration purposes), while writing to the same config file as above:
```
python3 calibrate_spot_hand_camera_cli.py --ip <IP> -u user -pw <SECRET> --data_path ~/my_collection/ \
--save_data True --result_path ~/my_collection/calibrated.yaml --photo_utilization_ratio 1 --stereo_pairs "[(1,0), (0,1)]" \
--spot_rgb_photo_width=1920 --spot_rgb_photo_height=1080 --x_axis_rot_viewpoint_range -10 10 1 \
--dist_from_board_viewpoint_range .6 .9 .1
```

## Improving Calibration Quality
If you find that the calibration quality isn't high enough, try a longer calibration
with a wider variety of viewpoints (decrease the step sizes, increase the bounds).
The default calibration viewpoint parameters are meant to facilitate a quick calibration
even on inexpensive hardware, and as such use a minimal number of viewpoints.
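
For example, a longer run might widen the X-rotation bounds and shrink the viewpoint step sizes relative to the defaults (flag names as in the examples above; the values here are illustrative only):
```
python3 calibrate_spot_hand_camera_cli.py --ip <IP> -u user -pw <SECRET> --data_path ~/my_collection/ \
--save_data True --result_path ~/my_collection/calibrated.yaml --stereo_pairs "[(1,0)]" \
--x_axis_rot_viewpoint_range -20 20 .5 --dist_from_board_viewpoint_range .5 1.0 .05 --tag long_run_v1
```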

## Using the Registered Information with Spot ROS 2
If you have the [Spot ROS 2 Driver](https://github.com/bdaiinstitute/spot_ros2) installed,
you can leverage the output of the automatic calibration to publish a depth image registered
to the RGB image frame. For more info, see the ```Optional Automatic Eye-in-Hand Stereo Calibration Routine for Manipulator (Arm) Payload```
section in the [spot_ros2 main README](https://github.com/bdaiinstitute/spot_ros2/blob/main/README.md).

# Using the CLI Tool To Calibrate On an Existing Dataset
To use the CLI tool, please ensure that you have one parent folder,
where each camera has a folder under the parent (numbered from 0 to N). Synchronized photos
for each camera should appear in their respective directories, where matching photos across cameras
share the same id, ascending from 0. The file structure should appear something like the following:
```
existing_dataset/
├── 0/
│ ├── 0.png # taken at viewpoint 0
│ ├── 1.png
│ └── 2.png
├── 1/
│ ├── 0.png # taken at viewpoint 0
│ ├── 1.png
│ └── 2.png
├── 2/
│ ├── 0.png # taken at viewpoint 0
│ ├── 1.png
│ └── 2.png
```
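
As a sanity check, a dataset laid out this way can be walked with a few lines of Python (a sketch for illustration, not the CLI's actual loader):
```
import glob
import os

import cv2


def load_dataset(root):
    """Return {camera_index: [image_0, image_1, ...]} for a dataset laid out as above."""
    dataset = {}
    for cam_dir in sorted(glob.glob(os.path.join(root, "[0-9]*"))):
        cam_idx = int(os.path.basename(cam_dir))
        # Matching viewpoints across cameras share the same numeric filename.
        frames = sorted(
            glob.glob(os.path.join(cam_dir, "*.png")),
            key=lambda p: int(os.path.splitext(os.path.basename(p))[0]),
        )
        dataset[cam_idx] = [cv2.imread(frame) for frame in frames]
    return dataset


dataset = load_dataset(os.path.expanduser("~/existing_dataset"))
```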

To see all possible arguments for calibration, please run ```python3 calibrate_multistereo_cameras_with_charuco_cli.py -h```.
Many parameters, such as board proportions and the ArUco dictionary, are customizable.

If you'd like to register camera 1 to camera 0, and camera 2 to camera 0, you could do the following:
```
python3 calibrate_multistereo_cameras_with_charuco_cli.py --data_path ~/existing_dataset/ \
--result_path ~/existing_dataset/eye_in_hand_calib.yaml --photo_utilization_ratio 1 --stereo_pairs "[(1,0), (2, 0)]" \
--tag default --unsafe_tag_save
```

# Understanding the Output Calibration Config File from the CLI

A calibration produced with ```multistereo_calibration_charuco``` from ```calibration_util```
can be saved as a ```.yaml``` file with ```save_calibration_parameters``` from ```calibration_util```.
Here is an example output calibration config file:

```
default:
  intrinsic:
    1:
      camera_matrix: flattened_3x3_camera_matrix_now_9x1
      dist_coeffs: flat_5x1_opencv_distortion_coeffs
      image_dim: [height, width]
    0:
      camera_matrix: flattened_3x3_camera_matrix_now_9x1
      dist_coeffs: flat_5x1_opencv_distortion_coeffs
      image_dim: [480, 640]
  extrinsic:
    1: # primary camera index (origin/base frame), first sublevel
      0: # reference camera index, second sublevel
        R: flattened_3x3_rotation_matrix_from_primary_to_reference_camera_now_9x1
        T: 3x1_translation_matrix_from_primary_to_reference_camera
  run_param:
    num_images: 729
    timestamp: '2024-08-19 03:43:06'
    stereo_pairs:
    - [1, 0]
    photo_utilization_ratio: 2
    num_checkers_width: 9
    num_checkers_height: 4
    dict_size: DICT_4X4_50
    checker_dim: 0.115
    marker_dim: 0.09
```

Each calibration run
that creates or modifies a config file is tagged with a unique name to allow tracking of several experiments;
the tag is the top-level key (in the above case, ```default```).
Under the tag are the fields relevant to the calibration.
Under ```intrinsic```, the intrinsics (camera matrix, distortion coefficients, and image height/width)
for each camera are recorded, with matrices flattened,
under the camera's index number as it appears in the list of images
returned by ```capture_images```. For example, if ```capture_images``` produces a list that is
```[image_by_primary_camera, image_by_reference_camera]```, then the intrinsics for the ```primary_camera```
would be stored under ```0```, and the intrinsics for ```reference_camera``` would be stored under ```1```.

Under ```extrinsic```, the first sublevel is again a camera index: the camera that serves as the origin
for the ```extrinsic``` transform between the two cameras (the first
field in a requested stereo pair). The second sublevel is the index of the camera that
the ```extrinsic``` maps to from the origin camera. See the comments in the example config file
above for more information.
Additionally, arguments from an argparser can also be dumped into the yaml, where they are saved
under ```run_param``` (excluding the ```password``` and ```username``` args).
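
As a consumption sketch, a file saved by ```save_calibration_parameters``` can be read back with PyYAML and the flattened matrices reshaped; the key names follow the example above, and the stereo pair matches the ```"[(1,0)]"``` runs shown earlier. This is not a loader shipped with the utility.
```
import numpy as np
import yaml

# Read one tagged run ("default") from the saved config file.
with open("calibrated.yaml") as f:
    calib = yaml.safe_load(f)["default"]

# Intrinsics for camera 0: flattened 9x1 camera matrix back to 3x3.
camera_matrix_0 = np.array(calib["intrinsic"][0]["camera_matrix"]).reshape(3, 3)
dist_coeffs_0 = np.array(calib["intrinsic"][0]["dist_coeffs"])
height, width = calib["intrinsic"][0]["image_dim"]

# Extrinsic from camera 1 (origin) to camera 0, as requested with --stereo_pairs "[(1,0)]".
R_1_to_0 = np.array(calib["extrinsic"][1][0]["R"]).reshape(3, 3)
T_1_to_0 = np.array(calib["extrinsic"][1][0]["T"]).reshape(3, 1)
```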

# Recreate the Core Calibration CLI Tool Without Depending On Spot Wrapper
If you like the core tools of this utility and would like a more portable version (```standalone_cli.py```)
for use independent of Spot, you can recreate
the CLI tool with no dependency on Spot Wrapper with the following command:
```
cat calibration_util.py <(tail -n +3 automatic_camera_calibration_robot.py) <(tail -n +21 calibrate_multistereo_cameras_with_charuco_cli.py) > standalone_cli.py
```
The core capability above depends primarily on NumPy, OpenCV and standard Python libraries.

# Contributors
Gary Lvov

# Special Thanks To...
Michael Pickett, Katie Hughes, Tiffany Cappellari, Andrew Messing,
Emmanuel Panov, Eric Rosen, Brian Okorn, Joseph St Germain and Ken Baker.