A blog by Damien LaRocque

ROS 2 Tips & Tricks

Small configurations for your ROS 2 workflow

2 min · Damien LaRocque

Here are a few tips, tricks, and configurations to add to your system when working with ROS 2.

Timeline

Here is a timeline of recent ROS distributions. When searching for ROS packages on GitHub, the distribution names are often used as repository branches. For example, at the time of writing this article, the turtlebot4_robot package had jazzy as its main branch. If you're working with a ROS 2 distribution that a repository does not support yet, prioritize repositories with ROS 2 compatibility, as it is easier to migrate code between ROS 2 distributions (foxy ➡️ jazzy, for example) than to migrate from ROS (1) to ROS 2 (noetic ➡️ humble, for example).

Figure: Ubuntu, ROS and ROS 2 timeline
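If you need a repository on the branch matching your distro, you can check it out directly. A minimal sketch, assuming a ~/colcon_ws workspace and the usual GitHub location of the turtlebot4_robot repository:

# Hypothetical example: clone the branch matching your ROS 2 distribution
# into the src/ folder of your colcon workspace.
cd ~/colcon_ws/src
git clone --branch "${ROS_DISTRO}" https://github.com/turtlebot/turtlebot4_robot.git
# ${ROS_DISTRO} (e.g. jazzy) is set when you source your ROS 2 installation.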

Terminal settings

Add the following lines to your terminal rc file (e.g., ~/.bashrc):

# Console output colorizing
export RCUTILS_COLORIZED_OUTPUT=1
# Console output formatting
export RCUTILS_CONSOLE_OUTPUT_FORMAT="[{severity} {time}] [{name}]: {message} ({function_name}() at {file_name}:{line_number})"
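These variables only affect new shells, so reload your configuration for the current terminal to pick them up (adjust the file name if you use another shell, e.g. ~/.zshrc):

source ~/.bashrc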

Colcon

You can easily cd to the source directory of a specific ROS 2 package with colcon_cd. To enable it, add the following lines to your terminal configuration file:

source /usr/share/colcon_cd/function/colcon_cd.sh
export _colcon_cd_root=/opt/ros/${ROS_DISTRO}/
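Once these lines are sourced, colcon_cd takes a package name and jumps to its directory. A small usage sketch, using the demo_nodes_cpp package from the ROS 2 demos (assuming it is installed or present in your workspace):

# Jump to the directory of the demo_nodes_cpp package
colcon_cd demo_nodes_cpp
# Print where we ended up, e.g. /opt/ros/jazzy/share/demo_nodes_cpp
pwd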

You can also enable argcomplete for colcon, which speeds up typing colcon commands by enabling tab completion.

source /usr/share/colcon_argcomplete/hook/colcon-argcomplete.bash

colcon mixins are also useful for building with specific C++ build types without having to remember long, tedious commands. In your ~/colcon_ws, run these commands:

colcon mixin add default https://raw.githubusercontent.com/colcon/colcon-mixin-repository/master/index.yaml
colcon mixin update default
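With the default mixins registered, you can select a build type by name instead of spelling out the CMake arguments. For example, assuming the standard debug and release mixins from the repository above:

# Build with debug symbols; roughly equivalent to
# colcon build --cmake-args -DCMAKE_BUILD_TYPE=Debug
colcon build --mixin debug
# Build an optimized release
colcon build --mixin release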

rosdep

rosdep is the dependency resolver used to install dependencies for the ROS 2 packages in a colcon workspace. To initialize rosdep on your system, use these commands:

# Initialize and update rosdep
sudo rosdep init
rosdep update

And then, in the root directory of your colcon_ws, use this command to install the dependencies of the packages available in the workspace:

rosdep install --from-paths src -y --ignore-src

⚠️ and for ROS 1 ⚠️

rosdep install --from-paths src --ignore-src -r -y
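For reference, here is the ROS 2 command again with each flag annotated (the extra -r in the ROS 1 variant tells rosdep to continue despite installation errors):

# --from-paths src : resolve the dependencies of every package found under src/
# --ignore-src     : skip dependencies already provided by packages in the workspace
# -y               : answer yes to every prompt (non-interactive)
rosdep install --from-paths src -y --ignore-src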

ROS 2 package creation

You can create a ROS 2 colcon package in the src/ directory of your colcon workspace with this command:

ros2 pkg create --build-type ament_cmake --license <license> <package_name>

Here are the license options supported by ROS 2 when creating a package:

$ ros2 pkg create --license ? package
Supported licenses:
Apache-2.0
BSL-1.0
BSD-2.0
BSD-2-Clause
BSD-3-Clause
GPL-3.0-only
LGPL-3.0-only
MIT
MIT-0
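
As a concrete (hypothetical) example, here is how you could create a Python package named my_robot_bringup with an Apache-2.0 license and build only that package:

cd ~/colcon_ws/src
ros2 pkg create --build-type ament_python --license Apache-2.0 my_robot_bringup
cd ~/colcon_ws
colcon build --packages-select my_robot_bringup
source install/setup.bash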

Using Citizen Science Data as Pre-Training for Semantic Segmentation of High-Resolution UAV Images for Natural Forests Post-Disturbance Assessment

Published in the MDPI Forests journal!

3 min · Damien LaRocque

Over the last few months, I contributed to the paper Using Citizen Science Data as Pre-Training for Semantic Segmentation of High-Resolution UAV Images for Natural Forests Post-Disturbance Assessment, published in the Classification of Forest Tree Species Using Remote Sensing Technologies: Latest Advances and Improvements special issue of the MDPI Forests journal. The paper proposes a novel pre-training approach for semantic segmentation of UAV imagery, in which a classifier trained on citizen science data generates over 140,000 auto-labeled images, improving model performance and achieving a higher F1 score (43.74%) than training solely on manually labeled data (41.58%). With this paper, we highlight the importance of AI for large-scale environmental monitoring of dense and vast forested areas, such as those in the province of Quebec.

Here is the abstract:

The ability to monitor forest areas after disturbances is key to ensure their regrowth. Problematic situations that are detected can then be addressed with targeted regeneration efforts. However, achieving this with automated photo interpretation is problematic, as training such systems requires large amounts of labeled data. To this effect, we leverage citizen science data (iNaturalist) to alleviate this issue. More precisely, we seek to generate pre-training data from a classifier trained on selected exemplars. This is accomplished by using a moving-window approach on carefully gathered low-altitude images with an Unmanned Aerial Vehicle (UAV), WilDReF-Q (Wild Drone Regrowth Forest—Quebec) dataset, to generate high-quality pseudo-labels. To generate accurate pseudo-labels, the predictions of our classifier for each window are integrated using a majority voting approach. Our results indicate that pre-training a semantic segmentation network on over 140,000 auto-labeled images yields an F1 score of 43.74% over 24 different classes, on a separate ground truth dataset. In comparison, using only labeled images yields a score of 32.45%, while fine-tuning the pre-trained network only yields marginal improvements (46.76%). Importantly, we demonstrate that our approach is able to benefit from more unlabeled images, opening the door for learning at scale. We also optimized the hyperparameters for pseudo-labeling, including the number of predictions assigned to each pixel in the majority voting process. Overall, this demonstrates that an auto-labeling approach can greatly reduce the development cost of plant identification in regeneration regions, based on UAV imagery.

{ "images": [ { "src": "diagram-forest.png", "title": "An overview of our developed semantic segmentation approach."}, { "src": "overview.png", "title": "Overview of our approach."}, { "src": "study-areas.png", "title": "Map of the seven study sites where UAV surveys were conducted."}, { "src": "qualitative-results.png", "title": "Qualitative results of our semantic segmentation approach."}, { "src": "voting-strategy.png", "title": "Voting strategy used in the pseudo-labeling process to generate pre-training data for S_M2F."} ] }

For more info, see the paper: Using Citizen Science Data as Pre-Training for Semantic Segmentation of High-Resolution UAV Images for Natural Forests Post-Disturbance Assessment

Artificial Intelligence Resources

Learning resources for anyone interested in the vast domain of AI

3 min · Damien LaRocque

Here is a slightly curated list of learning resources and useful links in Computer Vision, Artificial Intelligence and related topics. For resources on robotics, visit the robotics resources post. If you have other great resources to suggest, feel free to contact me.

Deep Learning

Models

Convolutional Neural Networks

Transformers

Diffusion models

Applications

Computer Vision

Geospatial learning

Reinforcement Learning

Applications

Legged Robotics

Datasets

Toy Datasets

  • Iris
  • Wisconsin Breast Cancer Dataset
  • Wine Dataset
  • Ames Housing Dataset
  • MNIST
  • FashionMNIST
  • AutoMPG
  • ImageNet
  • CIFAR datasets

Object Detection

Important papers


Robotics Resources

Learning resources for anyone starting in robotics

2 min · Damien LaRocque
The robot of Team Chat Robotique in the 2023 French Robotics Cup

Here is a slightly curated list of learning resources and useful links in robotics. For resources on AI, visit the AI resources post. If you have other great resources to suggest, feel free to contact me.

General

Newsletters

Robot Operating System

Robot Operating System (ROS) is a commonly used open-source robotics middleware. Please note that ROS 2 is written with a space: ROS 2, not ROS2.

A new ROS 2 distribution is released every year on May 23rd, World Turtle Day. Each distro is tied to a specific release of Ubuntu. Here are the names of the latest distributions, with their corresponding Ubuntu versions:

Figure: Ubuntu, ROS and ROS 2 timeline

Official resources:

Robots

Legged Robots

  • RSL RL, a library with RL algorithms for legged robotics, from the Robotic Systems Lab (RSL), Prof. Dr. Marco Hutter, ETH Zurich

Robot Learning

  • LeRobot, a low-cost robotics project by HuggingFace, for accessible end-to-end robot learning
  • Robot Learning Course, course material for Marc Toussaint's and Wolfgang Hönig's Robot Learning course at TU Berlin.

Simulation

  • Genesis, a generative simulation tool for robotics

Datasets

Autonomous Driving

Robot Base Projects

ESP32-based


Christmas Tree PCB

2 min · Damien LaRocque
A rendering of the PCB, in KiCad.

Towards the end of 2023, in preparation for the holidays, I designed and soldered about twenty small Christmas tree 🎄 PCBs to give to my relatives. During this year's holidays, I took advantage of some free time to clean up the project and publish it on GitHub as an open-source hardware project.

Assembled PCB

About

This project taught me the basics of ATtiny microcontroller programming (via UPDI), PCB art, and SMD component soldering. I hope this project inspires anyone interested in #hardware, #embedded #programming, and #electronics. If you're interested in making one for next year, I wrote down all the tips and tricks for making your own Christmas tree PCB in an instruction guide.

This project was made with the following free and open-source software:

  • FreeCAD for the outline
  • Inkscape for the PCB art
  • KiCad for the design of the PCB
  • PlatformIO for embedded programming

Images

{ "images": [ { "src": "images/rendering.png", "title": "Rendering"}, { "src": "images/front.png", "title": "Front layer"}, { "src": "images/back.png", "title": "Back layer"}, { "src": "images/sketch.png", "title": "Tree Outline in FreeCAD"}, { "src": "images/pcb-layout.png", "title": "PCB layout in KiCAD"} ] }

Links

For more info, see: Christmas Tree PCB