The GRIFFIN Perception Dataset
Bridging the Gap Between Flapping-Wing Flight and Robotic Perception
This webpage is part of the supplementary material of the paper The GRIFFIN Perception Dataset: Bridging the Gap Between Flapping-Wing Flight and Robotic Perception. The paper presents a perception dataset for bird-scale flapping-wing robots. The data include measurements from onboard sensors widely used in aerial robotics and suitable for dealing with the perception challenges of flapping-wing robots, such as conventional cameras, event cameras, and Inertial Measurement Units (IMUs), as well as ground truth measurements from a laser tracker or a motion capture system. Three types of datasets were recorded:
Base
Datasets where the ornithopter described agile trajectories until it reached a safe landing altitude. There were no artificial landmarks in the scenario, and the maneuver complexity depended on the wind conditions.

ArUco
Datasets where the ornithopter described smooth trajectories while trying to fly over ArUco markers on the ground, which add ground truth references and additional features to the scene.

People
Datasets where the ornithopter flies over people and objects, providing samples for object detectors that must deal with the ornithopter vibration effects and motion blur. Bounding box annotations are provided to allow straightforward training of people detection algorithms (see the sketch after this list).
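The exact annotation format is documented with the dataset files; as a minimal sketch only, the snippet below assumes a hypothetical plain-text layout with one "frame_id x_min y_min x_max y_max" row per box. The file names and the column layout are placeholders, not the dataset's actual conventions.

```python
# Sketch: overlay people bounding boxes on a dataset image.
# ASSUMPTIONS: the annotation layout ("frame_id x_min y_min x_max y_max"
# per line) and the file names are hypothetical placeholders.
import cv2

def load_boxes(annotation_file, frame_id):
    """Return [x_min, y_min, x_max, y_max] boxes for one frame."""
    boxes = []
    with open(annotation_file) as f:
        for line in f:
            fields = line.split()
            if fields and fields[0] == frame_id:
                boxes.append([int(v) for v in fields[1:5]])
    return boxes

image = cv2.imread("frame_000123.png")  # placeholder file name
for x0, y0, x1, y1 in load_boxes("annotations.txt", "frame_000123"):
    cv2.rectangle(image, (x0, y0), (x1, y1), (0, 255, 0), 2)
cv2.imwrite("frame_000123_boxes.png", image)
```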
We provide a YAML file with the specifications of the APRIL grid used in the calibration datasets; the grid shown at the beginning of each experiment is described in this file.
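As a sketch of how the file can be consumed, the snippet below assumes it follows the Kalibr aprilgrid convention (fields tagRows, tagCols, tagSize, tagSpacing). Both the field names and the file name are assumptions; check them against the actual YAML shipped with the dataset.

```python
# Sketch: load the APRIL grid specification used for calibration.
# ASSUMPTION: the YAML follows the Kalibr aprilgrid convention; the
# field names below may differ in the dataset's actual file.
import yaml

with open("aprilgrid.yaml") as f:  # placeholder file name
    grid = yaml.safe_load(f)

rows, cols = grid["tagRows"], grid["tagCols"]
tag_size = grid["tagSize"]      # tag edge length [m]
spacing = grid["tagSpacing"]    # gap as a fraction of the tag size
print(f"{rows}x{cols} grid, tag size {tag_size} m, spacing {spacing}")
```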
The ArUco markers in the dataset measure 0.58 × 0.58 m.
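Since the marker size is known, a marker's pose relative to the camera can be recovered from its detected corners. The sketch below uses OpenCV's aruco module (assuming OpenCV >= 4.7 with contrib); the marker dictionary, the camera intrinsics, and the image name are placeholders, not values from this dataset, so substitute the calibration shipped with the data.

```python
# Sketch: detect the 0.58 x 0.58 m ArUco markers and estimate their pose.
# ASSUMPTIONS: OpenCV >= 4.7 with the contrib aruco module; the marker
# dictionary (DICT_4X4_50) and the intrinsics K/dist are placeholders.
import cv2
import numpy as np

MARKER_SIZE = 0.58  # marker edge length [m], from the dataset description

# Placeholder intrinsics; use the calibration provided with the dataset.
K = np.array([[600.0, 0.0, 320.0],
              [0.0, 600.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)

# 3D marker corners in the marker frame (z = 0), matching the
# top-left, top-right, bottom-right, bottom-left detection order.
half = MARKER_SIZE / 2.0
object_points = np.array([[-half,  half, 0.0],
                          [ half,  half, 0.0],
                          [ half, -half, 0.0],
                          [-half, -half, 0.0]])

detector = cv2.aruco.ArucoDetector(
    cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50),
    cv2.aruco.DetectorParameters())

gray = cv2.imread("frame_000123.png", cv2.IMREAD_GRAYSCALE)  # placeholder
corners, ids, _ = detector.detectMarkers(gray)
if ids is not None:
    for marker_corners, marker_id in zip(corners, ids.ravel()):
        ok, rvec, tvec = cv2.solvePnP(object_points,
                                      marker_corners.reshape(4, 2), K, dist)
        if ok:
            print(f"marker {marker_id}: t = {tvec.ravel()} m")
```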
A 2-hour recording of IMU measurements with the robot at rest is provided to obtain IMU characterizations.
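A common way to use such a static recording is Allan variance analysis, which exposes the sensor's noise density and bias instability across averaging times. The sketch below computes the overlapping Allan deviation of one gyroscope axis; the sample-loading step and the 200 Hz rate are placeholders, so extract the actual samples and rate from the dataset bags.

```python
# Sketch: Allan deviation of one IMU axis from the 2-hour static recording.
# ASSUMPTIONS: `samples` holds one gyro axis [rad/s] at rate FS; the loading
# step and the 200 Hz rate are placeholders, not dataset specifics.
import numpy as np

FS = 200.0                       # placeholder sampling rate [Hz]
samples = np.load("gyro_x.npy")  # placeholder; extract from the bags

def allan_deviation(omega, fs, num_taus=100):
    """Overlapping Allan deviation over log-spaced cluster times."""
    theta = np.cumsum(omega) / fs   # integrated angle [rad]
    n = len(theta)
    max_m = n // 3                  # keep at least 3 clusters per tau
    ms = np.unique(np.logspace(0, np.log10(max_m), num_taus).astype(int))
    taus, adevs = [], []
    for m in ms:
        # Overlapping second difference of the integrated signal.
        d = theta[2 * m:] - 2.0 * theta[m:-m] + theta[:-2 * m]
        avar = np.mean(d ** 2) / (2.0 * (m / fs) ** 2)
        taus.append(m / fs)
        adevs.append(np.sqrt(avar))
    return np.array(taus), np.array(adevs)

taus, adevs = allan_deviation(samples, FS)
# Angle random walk can be read off the curve at tau = 1 s on a log-log plot.
```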
Finally, we provide a repository with useful scripts and ROS packages for processing the dataset.
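The repository's scripts are the reference tools. Independently of them, the bags can also be inspected directly with the standard ROS 1 rosbag Python API, as in the sketch below; the bag file name and the /imu/data topic are placeholders, so list the bag's topics first to find the actual names.

```python
# Sketch: iterate over IMU messages in a dataset bag with the rosbag API.
# ASSUMPTIONS: the bag file name and the "/imu/data" topic are placeholders.
import rosbag

with rosbag.Bag("dataset.bag") as bag:
    # Print the available topics to locate the actual sensor streams.
    print(bag.get_type_and_topic_info().topics.keys())
    for topic, msg, t in bag.read_messages(topics=["/imu/data"]):
        print(t.to_sec(), msg.angular_velocity.x)
```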