TeamG F24 Wiki Entry DJI Payload SDK Guide #181

Status: Open. Wants to merge 2 commits into base branch master.
2 changes: 2 additions & 0 deletions _data/navigation.yml
@@ -179,6 +179,8 @@ wiki:
url: /wiki/state-estimation/Cartographer-ROS-Integration/
- title: Externally Referenced State Estimation for GPS Lacking Environments
url: /wiki/state-estimation/gps-lacking-state-estimation-sensors.md
+ - title: Setting Up OptiTrack Motion Capture System with ROS
+   url: /wiki/state-estimation/optitrack-mocap.md
- title: Programming
url: /wiki/programming/
children:
94 changes: 94 additions & 0 deletions wiki/common-platforms/dji-enterprise-payload-sdk.md
@@ -0,0 +1,94 @@
---
date: 2024-05-01 # YYYY-MM-DD
title: Indoor UAV Navigation with DJI Enterprise Payload SDK (PSDK)
---
This section covers implementing indoor control of DJI Enterprise series drones. Unlike the other section in this knowledge base, it does not use the Onboard SDK, which DJI is phasing out; instead, it uses the **Payload SDK** (PSDK). The Payload SDK is aimed at enterprise users of DJI drones and can be used to control the drone, change flight modes, read sensors, and stream data.


By the end of this article, you should be able to:
1. Set up a Payload App for your DJI Enterprise drone
2. Fly your DJI Enterprise drone using the Payload SDK
3. Fly your DJI Enterprise drone using the ROS2 wrapper for the Payload SDK

## Requirements
To get started with the DJI Payload SDK, you will need the following **hardware** components:
1. A DJI Enterprise Drone (e.g. DJI Mavic 3 Enterprise)
2. DJI Remote Controller
3. DJI E-Port Lite Kit
4. A compute module (e.g. Raspberry Pi Zero, NVIDIA Jetson Orin)
5. A TTL-to-USB converter module (e.g. FT232RL)
6. (Optional, for low-latency video streaming) HDMI capture card
7. (Optional, for camera feed streaming) USB-C to Ethernet module

Once you have gathered these components, you can proceed with the rest of the document.

## Getting Started
To use the DJI Payload SDK, you first need to get your Payload Application approved by DJI. To do this, create an account on the DJI Developer portal at <https://developer.dji.com/user/apps/#all>.

The payload details do not need to be precise, but the application does require a brief overview of what the payload will do. After entering these details, submit the Payload SDK application approval form. Your application should be approved within 2-3 days, and you will receive a confirmation email from DJI.

Once approved, you should be able to log in to the DJI Developer Portal and see the details of your payload, which should match the following format:

| Name | Value |
|----------|----------|
| SDK Type | Payload SDK |
| App Name | Your_App_Name |
| App ID | *six-digit number* |
| App Key | *32-byte ID* |
| App License | *long alphanumeric hash* |
| Apply Status | accepted |

## Creating the Payload Application

The most straightforward way to run the Payload Application is to run the sample scripts from GitHub. The code is available at <https://github.com/dji-sdk/Payload-SDK>. It is written in C and C++ and can be compiled using CMake. Before compiling, however, you need to edit the configuration files to include the App ID, App Key, and App License that you received from DJI.
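For example, to fetch the samples:

```bash
git clone https://github.com/dji-sdk/Payload-SDK.git
cd Payload-SDK
```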

Before you run the sample code, you will need to make changes to both the C++ samples and the C samples.

For the C++ samples, you will need to change the following files:
1. `samples/sample_c++/platform/linux/manifold2/application/dji_sdk_config.json`
2. `samples/sample_c++/platform/linux/manifold2/application/dji_sdk_app_info.h`

For the C samples, you will need to change the following file:
1. `samples/sample_c/platform/linux/manifold2/application/dji_sdk_app_info.h`

If you are running on an NVIDIA Jetson, use the path `samples/sample_c/platform/linux/nvidia_jetson/application/dji_sdk_app_info.h` instead.

Make sure to follow the existing structure of each file: the App ID, App Key, and App License must be entered in the appropriate format for the JSON and header files, as sketched below. Once you have made these changes, you can compile the code using CMake.
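As a sketch of the header edits (the macro names below follow the public PSDK samples but may differ between releases; keep whatever names are already in your copy of the file and substitute your portal values):

```c
/* Sketch of samples/sample_c/platform/linux/manifold2/application/dji_sdk_app_info.h.
 * Substitute the values shown on the DJI Developer Portal. */
#define USER_APP_NAME          "your_app_name"
#define USER_APP_ID            "123456"                 /* six-digit App ID        */
#define USER_APP_KEY           "your_32_byte_app_key"   /* App Key from the portal */
#define USER_APP_LICENSE       "your_app_license"       /* long alphanumeric hash  */
#define USER_DEVELOPER_ACCOUNT "your_dji_account_email"
#define USER_BAUD_RATE         "460800"
```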

Now that you have set up these IDs, you will need to choose the type of communication to use. The Payload SDK supports three options:
1. UART Communication
2. UART + USB Bulk Communication
3. UART + Ethernet Communication

Choose the bulk or Ethernet option depending on whether you need a data stream, but UART communication is **always** required.
These settings are made in `samples/sample_c/platform/linux/manifold2/application/dji_sdk_config.h`, as sketched below.
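A sketch of that selection, assuming the option names used in the public PSDK samples (verify them in your release):

```c
/* Sketch of samples/sample_c/platform/linux/manifold2/application/dji_sdk_config.h */
#define DJI_USE_ONLY_UART                (0) /* UART only                 */
#define DJI_USE_UART_AND_USB_BULK_DEVICE (1) /* UART + USB bulk streaming */
#define DJI_USE_UART_AND_NETWORK_DEVICE  (2) /* UART + Ethernet streaming */

/* Select exactly one mode; UART is always required. */
#define CONFIG_HARDWARE_CONNECTION       DJI_USE_ONLY_UART
```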

![architecture](assets/drone_architecture.png)
## Running the Payload Application
Compile the code using CMake and run the resulting executable to launch the sample application. First, connect the DJI Enterprise drone to the E-Port Lite Kit using DJI's proprietary USB OTG cable. (This cable looks like a regular USB-C OTG cable, but it is not reversible and does not work when flipped around.) Make sure sides A and B are connected correctly: the left side on the drone is A, and the top side of the E-Port is also A.

Connect the USB-to-UART module to the E-Port Lite Kit using jumper cables, and connect its other side to the compute module where the code was built. Build and run the code as shown below, and you should see a menu for running the different functions of the Payload SDK.
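A typical out-of-source CMake build looks like the following; the executable name is indicative, so check `build/bin` for the name your PSDK version actually produces:

```bash
cd Payload-SDK
mkdir -p build && cd build
cmake ..
make -j"$(nproc)"
# Run the sample menu (executable name may differ across PSDK releases)
./bin/dji_sdk_demo_linux_cxx
```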

### Using the Ethernet Module
This is a bit more involved than UART-only communication. Connect the USB-C to Ethernet module to the E-Port Lite Kit, then connect the Ethernet cable to the compute module. Next, edit `samples/sample_c/platform/linux/manifold2/application/dji_sdk_config.h` to set the communication mode to `DJI_USE_UART_AND_NETWORK_DEVICE`.

You will also need the identifiers of the Ethernet adapter you are using. Connect the adapter to a Linux system and run `lsusb -v` to get its vendor ID (VID) and product ID (PID); run `ifconfig` to find its network interface and address. Update these IDs in the `samples/sample_c++/platform/linux/manifold2/application/dji_sdk_config.json` file.
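For example (the adapter and IDs below are illustrative; yours will differ):

```bash
# List USB devices; the "ID xxxx:yyyy" field is vid:pid
lsusb
# e.g. Bus 001 Device 004: ID 0bda:8153 Realtek Semiconductor Corp. RTL8153
#      -> vid = 0x0bda, pid = 0x8153

# List network interfaces to find the adapter's interface name and address
ifconfig
```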

## ROS2 Wrapper for Payload SDK
The ROS2 wrapper for the Payload SDK lets you control a DJI Enterprise drone through ROS2 services and topics. The `psdk_ros2` wrapper, documented at <https://umdlife.github.io/psdk_ros2/>, provides this functionality.
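A rough usage sketch (the package and launch-file names below are assumptions; check the psdk_ros2 documentation for the names in your version):

```bash
# Launch the wrapper node (names assumed; see the psdk_ros2 docs)
ros2 launch psdk_wrapper wrapper.launch.py

# Inspect the topics and services the wrapper exposes
ros2 topic list
ros2 service list
```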


## Summary
This article provided instructions for implementing indoor control of DJI Enterprise series drones using the Payload SDK (PSDK). It covered setting up a Payload App, assembling the required hardware, modifying the sample code, and using a ROS2 wrapper for extended functionality.

## See Also
- [DJI Payload SDK Documentation](https://developer.dji.com/payload-sdk/)
- [ROS2 Wrapper for Payload SDK](https://umdlife.github.io/psdk_ros2/)

## Further Reading
- [DJI forum article (in Chinese) explaining the USB network-adapter setup for the PSDK port on M30(T) and M3E models](https://sdk-forum.dji.net/hc/zh-cn/articles/15754783739545-M30-T-M3E%E6%9C%BA%E5%9E%8BPSDK%E7%AB%AF%E5%8F%A3USB-%E7%BD%91%E5%8D%A1%E8%AE%BE%E7%BD%AE)

## References
- [1] “Payload SDK Documentation,” DJI Developer. <https://developer.dji.com/payload-sdk/> (accessed May 1, 2024).
- [2] “psdk_ros2 Wrapper Documentation,” UMDLife. <https://umdlife.github.io/psdk_ros2/> (accessed May 1, 2024).
94 changes: 94 additions & 0 deletions wiki/state-estimation/optitrack-mocap.md
@@ -0,0 +1,94 @@
---
date: 2024-12-02
title: Setting Up OptiTrack Mocap System
---

Motion capture (mocap) systems, such as OptiTrack, provide highly accurate positional and orientational data, often regarded as ground truth, for rigid and deformable objects. This capability is invaluable in robotics for debugging subsystem modules, verifying controller designs, or comparing perception systems. By employing an array of overhead cameras and reflective markers, mocap systems accurately track movements and provide real-time feedback.

Two major brands dominate the mocap market: VICON and OptiTrack. Both systems operate on similar principles but offer distinct software ecosystems and hardware configurations. This guide focuses on the OptiTrack system, detailing its setup, its integration with ROS, and the fusion of its data into PX4 for aerial robots.

## Overview of Motion Capture Systems

Motion capture systems rely on reflective markers affixed to objects or bodies. These markers are tracked by high-speed cameras equipped with infrared (IR) lights.

### Working Principles
- **Rigid Bodies**: For rigid objects like drones or ground robots, the relative positions of markers remain constant, allowing accurate calculation of the object’s position and orientation.
- **Deformable Bodies**: For flexible or articulated objects such as wires or human bodies, individual marker positions are tracked to reconstruct motion.

### Applications in Robotics
- **Controller Tuning**: Validate robot control algorithms by comparing mocap data to expected outcomes.
- **Perception System Evaluation**: Benchmark sensor-based localization against mocap-generated ground truth.
- **Real-time Motion Planning**: Use mocap data for precise control in dynamic environments.

**Note**: OptiTrack is particularly popular in the robotics community due to its affordability and ease of integration with open-source tools like ROS.

## OptiTrack System Setup

### Hardware Requirements
Before setting up the system, ensure your workspace meets the following requirements:
- **Camera Placement**: Arrange OptiTrack cameras to achieve overlapping fields of view. This maximizes tracking accuracy and avoids occlusions.
- **Reflective Markers**: Use pre-stitched marker configurations for rigid bodies or distribute markers evenly for deformable objects.
- **Environmental Considerations**: Reduce IR interference (e.g., sunlight or other IR sources) and minimize reflective surfaces that may cause false positives.

### Software Setup: Motive Software in Windows
Motive is the core software used for managing OptiTrack motion capture systems, including configuration, tracking, visualization, and analysis of motion data. It supports various applications such as biomechanics studies, animation, VR, and robotics.

#### Minimum System Requirements
- **Operating System**: Windows 10 or later (64-bit)
- **Processor (CPU)**: Intel Core i5 or equivalent
- **Memory (RAM)**: 8 GB
- **Graphics (GPU)**: Dedicated graphics card with DirectX 11 support (e.g., NVIDIA GeForce GTX 1050 or AMD Radeon RX 560)
- **Storage**: 500 GB or more SSD
- **USB Ports**: USB 3.0
- **Network**: Gigabit Ethernet
- **Display Resolution**: Minimum 1920x1080 (Full HD)
- **Other Requirements**: DirectX 11 or higher

[Installation and Activation Guide](https://docs.optitrack.com/v3.0/motive/installation-and-activation)

### Pulling Mocap Data to a Linux Computer and ROS Workspace
The OptiTrack system is typically operated via a Windows computer connected to the cameras. The data can be streamed to a Linux machine running ROS for further integration.

#### Steps:
1. Configure rigid bodies and complete the system setup using the Motive software on the Windows machine.
2. Install the ROS driver `mocap_optitrack` on the Linux computer (a configuration sketch follows this section). These tutorials walk through the process:
- [OptiTrack and ROS Tutorial](https://tuw-cpsg.github.io/tutorials/optitrack-and-ros/)
- [ROS Wiki: mocap_optitrack](http://wiki.ros.org/mocap_optitrack)
3. Ensure both systems are on the same network.
4. Stream data from Motive software by selecting the object name. Adjust broadcast frequency as needed (default: <100 Hz; maximum: 1000 Hz).

> **Note**: The mocap data published by the driver uses an ENU (East-North-Up) frame. This differs from the Motive visualization, where Y is up.
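A minimal configuration sketch for `mocap_optitrack` (topic and frame names are placeholders; the rigid-body ID must match the streaming ID assigned in Motive):

```yaml
# config/mocap.yaml -- sketch, following the mocap_optitrack config layout
rigid_bodies:
  '1':                               # streaming ID of the rigid body in Motive
    pose: robot1/pose                # geometry_msgs/PoseStamped output topic
    pose2d: robot1/ground_pose       # geometry_msgs/Pose2D output topic
    child_frame_id: robot1/base_link
    parent_frame_id: world
optitrack_config:
  multicast_address: 224.0.0.1       # must match Motive's streaming settings
```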

## Fuse Mocap Data into PX4

For aerial robotics, mocap data can be fused into PX4 for precise positioning. This is particularly useful in environments where GPS is unavailable. Detailed documentation is available [here](https://docs.px4.io/v1.12/en/ros/external_position_estimation.html).

By following this guide, you can:
- Treat mocap data as a fake GPS signal.
- Enable PX4's position hold mode without GPS.

### Tips for Successful Integration
1. **Remap Data**: Publish the mocap data to `/mavros/vision_pose/pose` at 30–50 Hz in the ENU frame. The `mocap_optitrack` ROS driver handles this conversion; if your pipeline does not, write a relay script to ensure consistency (a sketch follows this list).
2. **Frame Alignment**: ROS uses the ENU frame, while PX4 internally uses the NED frame. Mavros handles this conversion, so publish in ENU.
3. **EKF2_EV_DELAY Parameter**: Tune this PX4 parameter to synchronize the IMU and mocap data. Compare orientation estimates from both sources to determine the correct value.
4. **Bandwidth Management**: Minimize unnecessary network traffic to prevent data-transfer bottlenecks, which can adversely affect motion planning.
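A minimal remapping and tuning sketch (the mocap source topic depends on your `mocap_optitrack` configuration, and the 50 ms delay is only a starting point):

```bash
# Relay the mocap pose onto the topic mavros fuses as external vision
rosrun topic_tools relay /mocap_node/robot1/pose /mavros/vision_pose/pose

# Set the vision delay compensation (milliseconds); tune for your setup
rosrun mavros mavparam set EKF2_EV_DELAY 50.0
```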

## Conclusion
OptiTrack motion capture systems offer highly accurate position and orientation data for robotics applications. By setting up the hardware properly, configuring the Motive software, and integrating mocap data into frameworks like ROS and PX4, roboticists can unlock new possibilities in localization, control, and real-time planning.

Adhering to best practices, such as proper marker placement and frame alignment, ensures accurate and reliable data for any application. Further exploration of advanced techniques and troubleshooting strategies will solidify OptiTrack’s role in cutting-edge robotics research and development.

## See Also
- [Setting up ROS Workspaces](https://wiki.ros.org/ROS/Tutorials)
- [Motion Capture for Robotics](https://roboticsknowledgebase.com/mocap)

## Further Reading
- [OptiTrack Official Documentation](https://optitrack.com/documentation/)
- [PX4 External Position Estimation](https://docs.px4.io/v1.12/en/ros/external_position_estimation.html)
- [Benchmarking Localization Systems](https://roboticsbenchmarking.com)

## References
- OptiTrack Motive Installation Guide: <https://docs.optitrack.com/v3.0/motive/installation-and-activation>
- TUW CPSG Tutorial: <https://tuw-cpsg.github.io/tutorials/optitrack-and-ros/>
- ROS Wiki - mocap_optitrack: <http://wiki.ros.org/mocap_optitrack>
- PX4 External Position Estimation: <https://docs.px4.io/v1.12/en/ros/external_position_estimation.html>