A two-stage density-aware single image deraining method

Min Cao, Zhi Gao, Bharath Ramesh, Tiancan Mei, Jinqiang Cui

Research output: Contribution to journal › Article › peer-review

Abstract

Although advanced single image deraining methods have been proposed, one main challenge remains: existing methods usually perform well on specific rain patterns but struggle with scenes of dramatically different rain densities, especially when the impacts of rain streaks and the veiling effect caused by rain accumulation are heavily coupled. To tackle this challenge, we propose a two-stage density-aware single image deraining method with gated multi-scale feature fusion. In the first stage, a realistic physics model closer to real rain scenes is leveraged for initial deraining, and a network branch is also trained for rain density estimation to guide the subsequent refinement. The second stage of model-independent refinement is realized using a conditional Generative Adversarial Network (cGAN), aiming to eliminate artifacts and improve restoration quality. In particular, dilated convolutions are applied to extract rain features at multiple scales, and gated feature fusion is exploited to better aggregate multi-level contextual information in both stages. Extensive experiments have been conducted on representative synthetic rain datasets and real rain scenes. Quantitative and qualitative results demonstrate that our method outperforms state-of-the-art approaches in both effectiveness and generalization ability.
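The gated multi-scale feature fusion described in the abstract can be illustrated with a minimal 1-D sketch: features are extracted with dilated convolutions at several dilation rates, and a per-position sigmoid gate decides how much each scale contributes to the fused output. The kernels, dilation rates, and gating scheme below are illustrative assumptions for a toy signal, not the paper's actual network.

```python
import math

def dilated_conv1d(x, kernel, dilation):
    """1-D dilated convolution with zero padding, so the output
    has the same length as the input (toy, single-channel)."""
    k = len(kernel)
    out = []
    for i in range(len(x)):
        s = 0.0
        for j, w in enumerate(kernel):
            # Taps are spaced `dilation` apart, centered on position i.
            idx = i + dilation * (j - (k - 1) // 2)
            if 0 <= idx < len(x):
                s += w * x[idx]
        out.append(s)
    return out

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

def gated_multiscale_fusion(x, kernel, gate_kernel, dilations=(1, 2, 4)):
    """Extract features at several dilation rates, then blend them with
    per-position sigmoid gates (a stand-in for gated feature fusion;
    `kernel` and `gate_kernel` are hypothetical learned weights)."""
    fused = [0.0] * len(x)
    for d in dilations:
        feat = dilated_conv1d(x, kernel, d)
        gate = [sigmoid(g) for g in dilated_conv1d(x, gate_kernel, d)]
        fused = [f + g_i * f_i for f, g_i, f_i in zip(fused, gate, feat)]
    return fused

# Toy 1-D "image row" with a spike playing the role of a rain streak.
signal = [0.0, 1.0, 2.0, 1.0, 0.0, 3.0, 0.0]
out = gated_multiscale_fusion(signal,
                              kernel=[0.25, 0.5, 0.25],
                              gate_kernel=[0.1, 0.2, 0.1])
```

In the actual method, both the dilated feature branches and the gates are learned 2-D convolutions over feature maps; the sketch only shows the fusion pattern, where gates in (0, 1) weight each scale's contribution before summation.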
Original language: English
Article number: 9499966
Pages (from-to): 6843-6854
Number of pages: 12
Journal: IEEE Transactions on Image Processing
Volume: 30
DOIs
Publication status: Published - 2021
