SHARP Workshop in conjunction with CVPR 2021

Call for Participation (Challenges) 

We propose three challenges, each with the task of reconstructing a
full 3D textured mesh from a partial 3D scan. The first challenge
targets human bodies, while the second and third challenges target a
variety of generic objects. The third challenge introduces a new,
unique dataset.

A total of 9k in cash prizes will be awarded to the winners.

CHALLENGE 1: Recovery of Human Body Scans

The task of this challenge is to reconstruct a full 3D textured mesh
from a partial 3D human scan. The 3DBodyTex.v2 dataset is used,
consisting of about 2,500 scans of clothed subjects with a large
diversity in clothing and poses.

CHALLENGE 2: Recovery of Generic Object Scans

This challenge focuses on textured 3D scans of generic objects. It
uses the 3DObjectTex.v1 dataset, a subset of the ViewShape repository,
which contains 2,000 textured 3D scans of very diverse objects.

CHALLENGE 3: Recovery of Feature Edges in 3D Object Scans

This challenge focuses on recovering the feature edges of 3D
scans. It uses the recently introduced CC3D dataset, which contains
50k+ pairs of CAD models and their corresponding 3D scans.

To participate in the SHARP challenges, two submission options are available:

* Option 1: Paper (required: a paper describing the method; optional:
working source code)

* Option 2: Code (required: a working implementation of the method as
source code; optional: an accompanying paper)

All challenges share the same registration deadline. Under Option 1,
participants are required to submit an accompanying paper. Under
Option 2, participants submit working code instead of an accompanying
paper; submitting a paper, to be included in the proceedings of CVPR,
is nevertheless highly encouraged.


Call for Papers (Paper Submission Track) 

The main focus of SHARP is to encourage paper submissions on
high-resolution 3D shape and texture recovery from partial data,
especially as accompanying papers to the challenge submissions. In
addition, all topics that relate to and serve the goal of data-driven
shape and texture processing are of interest. This includes original
contributions at different levels of data processing and for different
industrial applications, as well as proposals for new evaluation
metrics and relevant original datasets. Topics of interest include,
but are not limited to:

    Textured 3D data representation and evaluation 

    Textured 3D scan feature extraction 

    Generative modelling of textured 3D scans 

    Learning-based 3D reconstruction 

    Joint texture and shape matching 

    Joint texture and shape completion 

    Semantic 3D data reconstruction 

    Effective 3D and 2D data fusion 

    Textured 3D data refinement 

    3D feature edge detection and refinement 

    High-level representations of 3D data 

    CAD modeling from unstructured 3D data 

Authors are encouraged to submit their contributions to the SHARP 2021
submission site. All accepted papers will be included in the CVPR 2021
conference proceedings. Papers will be peer-reviewed and must comply
with the CVPR 2021 proceedings style and format.

Important dates 

Paper submission track: 

Paper submission deadline: 7th of March 2021 

Final decisions to authors: 1st of April 2021 

Camera-Ready submission deadline: 10th of April 2021 



Challenge track:

Registration deadline: 22nd of February 2021

Release of training datasets: 15th of March 2021 

Submission of results: 18th of May 2021 

Announcement of results: 20th of June 2021 


Organizers

Djamila Aouada, SnT, University of Luxembourg

Kseniya Cherenkova, SnT, Artec3D

Alexandre Saint, SnT, University of Luxembourg 

David Fofi, University of Burgundy

Gleb Gusev, Artec3D

Bjorn Ottersten, SnT, University of Luxembourg 

Anis Kacem, SnT, University of Luxembourg 

Konstantinos Papadopoulos, SnT, University of Luxembourg 

More details about the workshop can be found at:


Contact: For any enquiries, please feel free to contact us at