Competition Description

AG-ReID 2023 marks the inaugural event of the AG-ReID series. It presents an exceptional opportunity for researchers to independently evaluate the state of the art in aerial-ground person re-identification (reID) algorithms. The competition also provides an evaluation protocol that includes datasets of person images captured from aerial and ground views; researchers can use this protocol after the competition to benchmark their own solutions against the AG-ReID winners and baselines. AG-ReID 2023 has officially been added to the competition list of IJCB 2023.

***

This competition will have two parts:

Part 1: "Algorithms-Self-Tested" will require competitors to perform self-evaluation on a test dataset, which has never been published before and incorporates in-the-wild person samples. The dataset contains a total of 100,502 frames captured on a university campus using a variety of cameras: a UAV flying at altitudes between 15 and 45 meters, a ground-based CCTV camera, and a wearable camera on smart glasses. Additionally, 15 soft-biometric traits are provided for each of the 1,615 identities in the dataset, making the task more challenging and comprehensive.

Part 2: "Algorithms-Independently-Tested" will focus on evaluating the software solutions submitted by competitors. The evaluation process will be conducted by the organizers on a dataset similar to the one used in Part 1.

***

Competitors can participate in either part or both. For each part, a separate winner demonstrating the best performance will be announced. Mean Average Precision (mAP) @k=1 will be used to assess the submissions.
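As an illustrative sketch (not the official scorer): with k=1, mean average precision reduces to checking whether the top-ranked gallery identity matches each query identity, averaged over all queries.

```python
def map_at_1(query_ids, ranked_gallery_ids):
    """Rank-1 matching rate, i.e. mAP @k=1.

    query_ids: true person ID for each query.
    ranked_gallery_ids: one ranked list of predicted gallery person IDs
    per query, best match first.
    """
    hits = sum(
        1
        for q, ranking in zip(query_ids, ranked_gallery_ids)
        if ranking and ranking[0] == q
    )
    return hits / len(query_ids)

# Two of three queries have a correct top-1 match.
score = map_at_1(
    ["P0001T03220A1", "P0002T01110A2", "P0003T02220A1"],
    [["P0001T03220A1"], ["P0009T00000A1"], ["P0003T02220A1"]],
)
```

The official evaluation is performed by Kaggle; this snippet only shows how the metric behaves at k=1.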

Important Dates

Feb 27, 2023 (9:00 AM AEST) : Data is released to participants.
April 15, 2023 (9:00 AM AEST): Deadline for participant submissions: self-evaluation results for Part 1, software submissions for Part 2.
May 8, 2023 (9:00 AM AEST): Announcement of the top 25% of participants.
May 15, 2023 (9:00 AM AEST): Human examination and baseline results ready; paper summarizing the competition submitted to IJCB.

AG-ReID Dataset

Train

There will be 51,530 training images in JPG format depicting 807 identities.

Test

The test set comprises two subsets:
1. Aerial to Ground: 808 ids | 4,348 query images | 19,259 gallery images.
2. Ground to Aerial: 808 ids | 4,151 query images | 21,214 gallery images.

Annotations

Each file name encodes several fields, e.g. `P0001T03220A1C3F01111`:
P0001T03220A1C3F01111 (image_id): the file name, in JPG format.
P0001T03220A1 (person_id): the unique person ID (the first 13 characters of the image_id).
C3: the camera that captured the image (here, CCTV camera C3).
F01111: the video frame the image was extracted from.
Two other camera types appear in the dataset: an aerial camera (C0) and a wearable camera (C2).
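The fields above can be recovered by fixed-position slicing. A minimal sketch (the dictionary keys are my own naming, inferred from the annotation scheme above):

```python
def parse_image_id(image_id):
    """Split an AG-ReID image_id such as 'P0001T03220A1C3F01111'
    into its person, camera, and frame components."""
    return {
        "person_id": image_id[:13],   # e.g. "P0001T03220A1"
        "camera": image_id[13:15],    # "C0" aerial, "C2" wearable, "C3" CCTV
        "frame": image_id[15:],       # e.g. "F01111"
    }

info = parse_image_id("P0001T03220A1C3F01111")
```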

***

To download the dataset, first register and log in to our Kaggle competition website, then select the Data tab from the top menu.
Note: Please read our privacy and usage policy before downloading the dataset.

How To Participate

To participate in Part 1 -- "Algorithms-Self-Tested":
Sign up and join the AG-ReID competition on the Kaggle website. The dataset includes: (a) the "bounding_box_train" dataset with correct labels, presenting the format and nature of the test data; and (b) the actual "test" data, comprising two settings, "exp3_aerial_to_ground" and "exp6_ground_to_aerial", without ground-truth labels. You will be asked to provide your rank-1 person_id for every test sample in the query set; the person_id can be extracted from the first 13 characters of the image_id. The "test.csv" file with these identities must be submitted to the Kaggle site by April 15 to be considered in the competition. Important: the "test" data CANNOT be used in any way for training your algorithms. We trust in your fair evaluation. Only models pre-trained on ImageNet are permitted for fine-tuning. The use of external data is prohibited, except for the provided "qut_attribute_v8.mat" file, which contains soft-biometric attributes of all identities in the dataset.
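A sketch of writing the Part 1 submission file. The column headers used here ("image_id", "person_id") are an assumption; check the official sample submission on Kaggle for the required format.

```python
import csv

def write_submission(predictions, path="test.csv"):
    """predictions: list of (query_image_id, best_matching_gallery_image_id)
    pairs, one per query, as produced by your rank-1 retrieval."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        # Assumed header names; verify against the official sample submission.
        writer.writerow(["image_id", "person_id"])
        for query_id, match_id in predictions:
            # person_id is the first 13 characters of the matched image_id.
            writer.writerow([query_id, match_id[:13]])
```

For example, `write_submission([("Q0001...", "P0001T03220A1C3F01111")])` would record `P0001T03220A1` as the predicted identity for that query.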

To participate in Part 2 -- "Algorithms-Independently-Tested":
Share your software with agreid2023@gmail.com by April 15. Make sure it is compatible with the Ubuntu operating system, and provide clear instructions for installing and testing your algorithm. Important: the "test" data offered in Part 1 CANNOT be used in any way for training your algorithms.

Baseline code guidance for AG-ReID can be found HERE.

Rules

General:
1. Your name on the leaderboard should be in the format "Surname Name (place of study/organization)."
2. Participants must work on the problems individually.
3. There is a limit of 10 attempts per day.
4. Participants can select only two final submissions at the end of the competition.
5. Participants must exceed the medium baseline by April 15, 2023 (9:00 AM AEST). If the threshold is not exceeded by the specified deadline, the participant will be excluded from the rest of the competition.
6. A one-page method description must be submitted to agreid2023@gmail.com within a week of the announcement of the top 25% of participants on May 8, 2023 (9:00 AM AEST) if you want to be included in the summary paper submitted to IJCB 2023 on May 15, 2023 (9:00 AM AEST).
Specific to this competition:
1. Only models pre-trained on ImageNet are permitted for fine-tuning.
2. The usage of external data is prohibited except for the provided `qut_attribute_v8.mat` file.

Organizers

Queensland University of Technology, Queensland, Australia:
Dr. Kien Nguyen Thanh
Prof. Clinton Fookes
Prof. Sridha Sridharan
Thanh Nhat Huy Nguyen
Defence Science and Technology, Adelaide, Australia:
Dr. Dana Michalski
Michigan State University, Michigan, USA:
Dr. Feng Liu
Prof. Xiaoming Liu
Prof. Arun Ross

Contact us if you have questions!


Acknowledgment

This webpage is based on the LiveDet-Iris 2023 template. We would like to thank Adam Czajka and their team.