Uncertainty Quantification for Computer Vision Call for Papers


********************************

Call for Papers

Uncertainty Quantification for Computer Vision

International Workshop at ECCV 2022

(1st Edition)
https://uncv2022.github.io/

********************************

 

Submission Deadline: July 10th, 2022 (AOE)

Two types of papers are welcome:

----------------------------

- Regular Papers -

(novel contributions not published previously)

- Extended Abstracts -

(novel contributions or papers that have already been accepted for
publication elsewhere)

----------------------------

In the last decade, substantial progress has been made in the
performance of computer vision systems, in significant part thanks to
deep learning. These advancements have prompted sharp community growth
and a rise in industrial investment. However, most current models lack
the ability to reason about the confidence of their predictions;
integrating uncertainty quantification into vision systems will help
recognize failure scenarios and enable robust applications.

 

The ECCV 2022 workshop on Uncertainty Quantification for Computer
Vision will consider recent advances in methodology and applications
of uncertainty quantification in computer vision. Prospective authors
are invited to submit papers on relevant algorithms and applications
including, but not limited to:

 
    Applications of uncertainty quantification
    Failure prediction (e.g., out-of-distribution (OOD) detection)
    Robustness in CV
    Safety critical applications in CV
    Domain-shift in CV
    Probabilistic deep models
    Deep probabilistic models
    Deep ensemble uncertainty
    Connections between neural networks (NNs) and Gaussian processes (GPs)
    Incorporating explicit prior knowledge in deep learning
    Computational aspects and real-time probabilistic inference
    Output ambiguity, multi-modality and diversity

All papers will be peer-reviewed, and accepted Regular papers will be
presented at the workshop and included in the ECCV Workshop
Proceedings.

 

Prize

The workshop has a prize sponsored by the Future Fund regranting
program. The funding covers an ImageNet OOD Detection Best Paper Award
of $10,000. The awarded paper should study OOD detection. More
specifically, the paper should study OOD detection performance on
Species, OpenImage-O, or ImageNet-O. The paper should include at least
one model trained exclusively on ImageNet-1K with an accuracy of less
than 82%. Subject to these constraints, the winning paper will not
necessarily be the one with the highest performance on these datasets,
as novelty and interest will also be assessed.

 

Submission Instructions

At the time of submission, authors must indicate the desired paper track:

    Regular papers will be peer-reviewed following the same policy as
    the main conference and will be published in the proceedings (call
    for papers with guidelines and template here; max 14 pages, with
    additional pages allowed for references only). These are meant
    to present novel contributions not published previously (submitted
    papers should not have been published, accepted, or be under review
    elsewhere).

    Extended abstracts are meant for preliminary work and short
    versions of papers that have already been accepted, or are under
    review, preferably within the last year at major conferences or
    journals. These papers will undergo a separate reviewing process
    to assess their suitability for the workshop. They will *not
    appear* in the workshop proceedings. Template and guidelines (max
    4 content pages, additional pages allowed for references) here.

 

Submission site: 
https://openreview.net/group?id=thecvf.com/ECCV/2022/Workshop/UNCV
 

Important Dates (All times are end of day AOE)

Submission deadline: July 10th, 2022
Notification of acceptance: August 10th, 2022
Camera-ready deadline: August 22nd, 2022

 

Organizing Committee

    Andrea Pilzer, NVIDIA AI Technology Centre, Italy
    Martin Trapp, Aalto University, Finland
    Arno Solin, Aalto University, Finland
    Yingzhen Li, Imperial College London, UK
    Neill Campbell, University of Bath, UK