Call for CVPR 2021 Workshop Papers

Usually, super-resolution (SR) models are trained using pairs of high- and low-resolution images. Infinitely many high-resolution images can be downsampled to the same low-resolution image, which means the problem is ill-posed and cannot be inverted with a deterministic mapping. Instead, this CVPR 2021 NTIRE challenge frames SR as learning a stochastic mapping, capable of sampling from the space of plausible high-resolution images given a low-resolution input.
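To make the ill-posedness concrete, here is a minimal sketch (not part of the challenge materials): two different "high-resolution" patches that average-pool to exactly the same "low-resolution" pixel, so no deterministic mapping from LR back to HR can recover both.

```python
import numpy as np

# Two different 2x2 HR patches...
hr_a = np.array([[0.2, 0.8],
                 [0.6, 0.4]])
hr_b = np.array([[0.5, 0.5],
                 [0.5, 0.5]])

def downsample(hr):
    """2x average pooling: the forward (well-posed) direction."""
    return hr.mean(keepdims=True)

# ...yield the exact same 1x1 LR pixel.
print(downsample(hr_a))  # [[0.5]]
print(downsample(hr_b))  # [[0.5]]
```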

Super-Resolution is ill-posed


CVPR 2021 is held online from the 19th to the 25th of June 2021 | Joshua Ness, Unsplash

Official CVPR 2021 Website

LaTeX Template [Download]
Submission [CMT]
Guidelines [TLDR]

What is the format for the submission?

Main Paper: PDF | 8 pages + references | 30MB | [Source]
Supplementary: PDF or ZIP | 100MB

Place captions below figures and tables and end them with a period.

When are the submission deadlines?

The deadline for the main paper is the 16th of November.

However, you must already register your paper, with its title, abstract, authors, and subject areas, by the 9th of November.


GAN — vs — Normalizing Flow

The benefits of Normalizing Flow. In this article, we show how we outperformed GANs with Normalizing Flow, using super-resolution as the application. We describe SRFlow, a super-resolution method that outperforms state-of-the-art GAN approaches and is explained in detail in our ECCV 2020 paper.

Intuition for Conditional Normalizing Flow. We train a Normalizing Flow model to transform an image into a Gaussian latent space. During inference, we sample a random Gaussian vector and map it back to generate an image. This works because the mapping is bijective, so every Gaussian vector corresponds to an image. …
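To make this intuition concrete, below is a minimal PyTorch sketch of a single conditional affine coupling layer. It is not the SRFlow architecture; the layer sizes, the `cond` tensor standing in for encoded low-resolution features, and the usage at the bottom are all illustrative assumptions. It only shows the core idea: a bijective, LR-conditioned mapping between image space and a Gaussian latent space, so that sampling a Gaussian vector and running the inverse direction produces an image.

```python
import torch
import torch.nn as nn

class ConditionalAffineCoupling(nn.Module):
    """One affine coupling layer conditioned on LR features (illustrative only)."""

    def __init__(self, channels, cond_channels):
        super().__init__()
        # Predicts per-pixel scale and shift for one half of the channels,
        # from the other half plus the LR conditioning features.
        self.net = nn.Sequential(
            nn.Conv2d(channels // 2 + cond_channels, 64, 3, padding=1),
            nn.ReLU(),
            nn.Conv2d(64, channels, 3, padding=1),
        )

    def forward(self, x, cond):
        """HR image -> latent (training direction)."""
        x_a, x_b = x.chunk(2, dim=1)
        scale, shift = self.net(torch.cat([x_a, cond], dim=1)).chunk(2, dim=1)
        scale = torch.tanh(scale)  # keep scales bounded for stability
        z_b = x_b * torch.exp(scale) + shift
        log_det = scale.flatten(1).sum(dim=1)
        return torch.cat([x_a, z_b], dim=1), log_det

    def inverse(self, z, cond):
        """Latent -> HR image (sampling direction)."""
        z_a, z_b = z.chunk(2, dim=1)
        scale, shift = self.net(torch.cat([z_a, cond], dim=1)).chunk(2, dim=1)
        scale = torch.tanh(scale)
        x_b = (z_b - shift) * torch.exp(-scale)
        return torch.cat([z_a, x_b], dim=1)


# Hypothetical usage: a 4-channel "HR" tensor and 8-channel LR conditioning features.
flow = ConditionalAffineCoupling(channels=4, cond_channels=8)
hr = torch.randn(1, 4, 32, 32)
lr_features = torch.randn(1, 8, 32, 32)

z, log_det = flow(hr, lr_features)                           # encode: image -> Gaussian latent
hr_sample = flow.inverse(torch.randn_like(z), lr_features)   # sample: Gaussian vector -> image
hr_recon = flow.inverse(z, lr_features)                      # bijectivity: recover the input
print(torch.allclose(hr, hr_recon, atol=1e-5))               # True
```

Training would maximize the Gaussian log-likelihood of the latent plus the summed log-determinants; at inference, sampling different Gaussian vectors for the same LR input yields different plausible HR outputs.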

Computer Vision Zurich
