ImageCLEF Coral Annotation Challenge 2021 Call for Papers



The 3rd Edition of the ImageCLEF Coral Annotation Challenge 2021

The increasing use of structure-from-motion photogrammetry for
modelling large-scale environments from action cameras attached to
drones has driven the next-generation of visualisation techniques that
can be used in augmented and virtual reality headsets. It has also
created a need to have such models labelled: objects such as
people, buildings, vehicles and terrain are all essential for machine
learning techniques to identify automatically as areas of interest and
to label appropriately. However, the complexity of the images
makes it impossible for human annotators to assess the contents of
images on a large scale.

Advances in automatically annotating images for complexity and benthic
composition have been promising, and we are interested in
automatically identifying areas of interest and labelling them
appropriately for monitoring coral reefs. Coral reefs are in danger of
being lost within the next 30 years, and with them the ecosystems they
support. This catastrophe will not only see the extinction of many
marine species, but also create a humanitarian crisis on a global
scale for the billions of humans who rely on reef services. By
monitoring the changes and composition of coral reefs we can help
prioritise conservation efforts.

New for 2021:

In its 3rd edition, the training and test data will form the complete
set of images required to form a 3D reconstruction of the
environment. This allows the participants to explore novel
probabilistic computer vision techniques based around image overlap
and transposition of data points. Participants will be given
instruction on the preparation of the 3D reconstruction, the output
files (.obj) and a visualisation of each model without labels.

In addition, participants are encouraged to use the publicly available
NOAA NCEI data to train their approaches.
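Since the reconstruction outputs are provided as Wavefront .obj files, participants may want to load the model geometry directly. A minimal sketch of an .obj reader that extracts vertex positions and face indices (the filename is hypothetical; a full loader would also handle texture coordinates, normals, and materials):

```python
# Minimal Wavefront .obj parser: collects vertex positions ("v" lines)
# and face vertex indices ("f" lines, converted to 0-based).
# Filename used here is hypothetical.

def load_obj(path):
    vertices, faces = [], []
    with open(path) as fh:
        for line in fh:
            parts = line.split()
            if not parts:
                continue
            if parts[0] == "v":
                # Vertex line: "v x y z"
                vertices.append(tuple(float(c) for c in parts[1:4]))
            elif parts[0] == "f":
                # Face line: "f v1 v2 v3 ..."; entries may be
                # "v/vt/vn" triples, so keep only the vertex index.
                faces.append([int(p.split("/")[0]) - 1 for p in parts[1:]])
    return vertices, faces
```

This is enough to recover the mesh for visualisation or for projecting image annotations onto the 3D surface.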

Challenge description

Participants will be required to annotate and localise coral reef
images by labelling the images with the types of benthic substrate
present. Each image is provided with possible class types.


The data for this task originates from a growing, large-scale
collection of images taken from coral reefs around the world as part
of a coral reef monitoring project with the Marine Technology Research
Unit at the University of Essex.  Substrates of the same type can have
very different morphologies, color variation and patterns. Some of the
images contain a white line (scientific measurement tape) that may
occlude part of the entity. The quality of the images is variable,
some are blurry, and some have poor color balance. This is
representative of the Marine Technology Research Unit dataset and all
images are useful for data analysis. The images contain annotations of
the following 13 types of substrates: 
* Hard Coral - Branching
* Hard Coral - Submassive
* Hard Coral - Boulder
* Hard Coral - Encrusting
* Hard Coral - Table
* Hard Coral - Foliose
* Hard Coral - Mushroom
* Soft Coral
* Soft Coral - Gorgonian
* Sponge
* Sponge - Barrel
* Fire Coral - Millepora
* Algae - Macro or Leaves
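For training or evaluation code, the 13 substrate classes above can be collected into a label-to-index mapping. A minimal sketch, assuming plain-text class names (the official task identifiers may differ):

```python
# The 13 benthic substrate classes from the task description,
# encoded as an ordered list plus a label-to-index mapping.
# Class name strings are illustrative, not official identifiers.
SUBSTRATE_CLASSES = [
    "Hard Coral - Branching",
    "Hard Coral - Submassive",
    "Hard Coral - Boulder",
    "Hard Coral - Encrusting",
    "Hard Coral - Table",
    "Hard Coral - Foliose",
    "Hard Coral - Mushroom",
    "Soft Coral",
    "Soft Coral - Gorgonian",
    "Sponge",
    "Sponge - Barrel",
    "Fire Coral - Millepora",
    "Algae - Macro or Leaves",
]

CLASS_TO_INDEX = {name: i for i, name in enumerate(SUBSTRATE_CLASSES)}
```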

The test data contains images from four different locations:

    same location as training set
    similar location to training set
    geographically similar to training set
    geographically distinct from training set

Important dates

    16.11.2020: registration opens for all ImageCLEF tasks
    04.02.2021: development data released
    15.03.2021: test data release starts
    07.05.2021: deadline for submitting the participants' runs
    28.05.2021: deadline for submission of working notes papers by the participants
    21-24.09.2021: CLEF 2021, Bucharest, Romania


Participant Registration


Organizing Committee

    Jon Chamberlain, University of Essex, UK
    Thomas A. Oliver, NOAA/US IOOS, USA
    Hassan Moustahfid, NOAA/US IOOS, USA
    Antonio Campello, Wellcome Trust, UK
    Adrian Clark, University of Essex, UK
    Alba García Seco de Herrera, University of Essex, UK


For more details and updates, please visit the task website at: 

And join our mailing list: