The Duke Robotics Club is a student organization at Duke University that develops Autonomous Underwater Vehicles (AUVs) for the RoboSub competition. Check out our website for more information about our club and projects.
This repository contains all code required for running and testing our AUVs. The code is built on the Robot Operating System (ROS) 2 framework. We use the Jazzy Jalisco distribution of ROS 2.
Our previous ROS 1 codebase can be found in the robosub-ros repository.
The high-level diagrams below show the components of the onboard software that are used during the robot's autonomous operation and the flow of information between them. The arrows indicate the direction of data flow, and the labels on each arrow indicate what data is transmitted. The components are color-coded to indicate their type:
- Sensors that feed data into the software are orange.
- Software packages in `onboard` are blue.
- Hardware whose actions are controlled by the software is green.
- Hardware that serves as an intermediary between the software and other hardware is yellow.
```mermaid
flowchart TD
IMU:::sensor --> VectorNav:::package
VectorNav --> |Orientation| SensorFusion[Sensor Fusion]:::package
VectorNav --> |Angular Velocity| SensorFusion
PressureSensor[Pressure Sensor]:::sensor --> PeripheralArduinoIn[Peripheral Arduino]:::intermediateHardware
Voltage[Voltage Sensor]:::sensor --> PeripheralArduinoIn
DVL:::sensor --> OffboardCommsIn[Offboard Comms]:::package
Gyro:::sensor --> OffboardCommsIn[Offboard Comms]:::package
IVC:::sensor --> OffboardCommsIn[Offboard Comms]:::package
PeripheralArduinoIn --> |Depth| OffboardCommsIn
PeripheralArduinoIn --> |Voltage| OffboardCommsIn
OffboardCommsIn --> |Linear Velocity| SensorFusion
OffboardCommsIn --> |Depth| SensorFusion
FrontCamera[Front Camera]:::sensor --> CV[Computer Vision]:::package
BottomCamera[Bottom Camera]:::sensor --> CV
SensorFusion --> |State| Controls:::package
SensorFusion --> |State| TaskPlanning[Task Planning]:::package
CV --> |Object Detections| TaskPlanning
Ping360:::sensor --> Sonar:::package
Sonar --> |Object Poses| TaskPlanning
Hydrophones:::sensor --> Acoustics:::package
Acoustics --> |Pinger Positions| TaskPlanning
TaskPlanning --> |Desired State| Controls
TaskPlanning --> |Servo Commands| OffboardCommsOut[Offboard Comms]:::package
Controls --> |Thruster Allocations| OffboardCommsOut
OffboardCommsOut --> |Pulse Widths| ThrusterArduino[Thruster Arduino]:::intermediateHardware
OffboardCommsOut --> |Servo Angles| PeripheralArduinoOut[Peripheral Arduino]:::intermediateHardware
ThrusterArduino --> Thrusters:::outputs
PeripheralArduinoOut --> MarkerDropperServo[Marker Dropper Servo]:::outputs
PeripheralArduinoOut --> TorpedoServo[Torpedo Servo]:::outputs
classDef sensor fill:#d94, color:#fff
classDef package fill:#00c, color:#fff
classDef outputs fill:#080, color:#fff
classDef intermediateHardware fill:#990, color:#fff
```
```mermaid
flowchart TD
IMU:::sensor --> VectorNav:::package
VectorNav --> |Orientation| SensorFusion[Sensor Fusion]:::package
VectorNav --> |Angular Velocity| SensorFusion
PressureSensor[Pressure Sensor]:::sensor --> PeripheralArduinoIn[Peripheral Arduino]:::intermediateHardware
Voltage[Voltage Sensor]:::sensor --> PeripheralArduinoIn
DVL:::sensor --> OffboardCommsIn[Offboard Comms]:::package
Gyro:::sensor --> OffboardCommsIn[Offboard Comms]:::package
IVC:::sensor --> OffboardCommsIn[Offboard Comms]:::package
PeripheralArduinoIn --> |Depth| OffboardCommsIn
PeripheralArduinoIn --> |Voltage| OffboardCommsIn
OffboardCommsIn --> |Linear Velocity| SensorFusion
OffboardCommsIn --> |Depth| SensorFusion
FrontCamera[Front Camera]:::sensor --> CV[Computer Vision]:::package
BottomCamera[Bottom Camera]:::sensor --> CV
SensorFusion --> |State| Controls:::package
SensorFusion --> |State| TaskPlanning[Task Planning]:::package
CV --> |Object Detections| TaskPlanning
Ping360:::sensor --> Sonar:::package
Sonar --> |Object Poses| TaskPlanning
TaskPlanning --> |Desired State| Controls
TaskPlanning --> |Servo Commands| OffboardCommsOut[Offboard Comms]:::package
Controls --> |Thruster Allocations| OffboardCommsOut
OffboardCommsOut --> |Pulse Widths| ThrusterArduino[Thruster Arduino]:::intermediateHardware
ThrusterArduino --> Thrusters:::outputs
classDef sensor fill:#d94, color:#fff
classDef package fill:#00c, color:#fff
classDef outputs fill:#080, color:#fff
classDef intermediateHardware fill:#990, color:#fff
```
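Each arrow between two software packages in these diagrams corresponds to data exchanged over ROS 2 interfaces (typically topics), so once the code is running you can inspect the flow with the standard `ros2` CLI. The topic name below is a placeholder; the actual names are defined by the individual packages in `onboard`:

```bash
# List the running nodes and the topics they publish or subscribe to.
ros2 node list
ros2 topic list

# Inspect a single data stream, e.g. the fused state (placeholder topic name).
ros2 topic echo /state
ros2 topic hz /state
```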
Setting up the repository and development environment is an involved process. The full process is documented in the SETUP.md file.
To build all packages:

- Open a terminal in the Docker container.
- Navigate to the root of the repository: `/home/ubuntu/robosub-ros2`.
- Run the following command (see the combined example below): `source build.sh`
  - This command builds all packages in the `core` and `onboard` workspaces.
- You are now ready to run the code!
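Put together, a typical build from a terminal inside the Docker container looks like this (the path below is the default repository location inside the container, as noted above):

```bash
# Move to the repository root inside the container.
cd /home/ubuntu/robosub-ros2

# Build every package in the core and onboard workspaces.
source build.sh
```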
See SCRIPTS.md for more information about how to use `build.sh` and other scripts at the root of the repository.
To run the robot:

- Open a terminal in the Docker container.
- Make sure you have built all packages by following the instructions in the Build Packages section.
- Run the following command (see the example below): `ros2 launch execute robot.xml`
  - This command launches all nodes required to run the robot.
  - It does not launch task planning, which must be run separately for the robot to complete tasks autonomously. See the task planning README for instructions on launching the task planning node.
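For example, launching the robot from a container terminal might look like this, assuming the packages have already been built via the steps above:

```bash
# Launch all nodes required to run the robot.
# Task planning is not included and must be launched separately
# (see the task planning README).
ros2 launch execute robot.xml
```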
We use Foxglove Studio for visualizing and debugging our code. Foxglove Studio is a web-based tool that allows us to visualize data received from ROS 2 topics and to send data to topics and services in real time.

To use Foxglove Studio:
- Open a terminal in the Docker container.
- Run the following command to start the Foxglove bridge: `fg-ws`
  - This is an alias that starts the Foxglove bridge, which enables Foxglove Studio to connect to the ROS 2 network.
  - The bridge opens a WebSocket on port `28765`. This port is mapped to port `28765` on the host machine, so you can connect to the WebSocket from your host machine.
- Open Foxglove Studio and connect to the WebSocket at `ws://IP_ADDRESS:28765` (see the example below).
  - Replace `IP_ADDRESS` with the IP address of the host machine. If you are running the Docker container locally, you can use `localhost` as the IP address.
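For example, when the Docker container is running on your local machine, the connection flow looks roughly like this (the `fg-ws` alias and port come from the steps above):

```bash
# Inside the Docker container: start the Foxglove bridge.
# It opens a WebSocket on port 28765, which is mapped to the host.
fg-ws

# Then, in Foxglove Studio on the host, open a Foxglove WebSocket
# connection to:
#   ws://localhost:28765
# Use the host machine's IP address instead of localhost when
# connecting from a different machine.
```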
See foxglove/README.md for more information about developing for Foxglove.
To ensure code quality and consistent formatting, we use the following linters:
- Python: Ruff
- C++: ClangFormat
- Bash: ShellCheck
- TypeScript/JavaScript: ESLint
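`lint.py` and `foxglove.py` (described below) are the supported entry points for these linters, but for reference, the underlying tools can also be run directly. Typical standalone invocations look roughly like the following; the file and directory arguments are placeholders, and the configuration used in this repository may differ:

```bash
ruff check onboard/                        # Python linting with Ruff
clang-format --dry-run --Werror node.cpp   # C++ formatting check with ClangFormat
shellcheck build.sh                        # Bash linting with ShellCheck
npx eslint .                               # TypeScript/JavaScript linting with ESLint
                                           # (run inside the Foxglove directory)
```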
ESLint can be accessed via the CLI provided by `foxglove.py`; see the Foxglove README for more information about running ESLint.

All other linters can be accessed via the CLI provided by `lint.py`. This CLI is also used by the GitHub Actions `build-and-lint` workflow.

To lint all code in the repository with `lint.py`:
- Open a terminal in the Docker container.
- Navigate to the root of the repository: `/home/ubuntu/robosub-ros2`.
- Run the following command (see the example below): `./lint.py`
  - This command lints all Python, C++, and Bash code in the repository.
  - Any linting errors or warnings will be displayed in the terminal.
See `lint.py` in SCRIPTS.md for more information about the CLI provided by `lint.py`.
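For example, a full linting pass from the repository root looks like this (only the no-argument invocation is documented above; see SCRIPTS.md for the CLI's other options):

```bash
cd /home/ubuntu/robosub-ros2

# Lint all Python, C++, and Bash code; errors and warnings are
# printed to the terminal.
./lint.py
```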