TY - JOUR
T1 - CDLNet: Noise-Adaptive Convolutional Dictionary Learning Network for Blind Denoising and Demosaicing
AU - Janjusevic, Nikola
AU - Khalilian-Gourtani, Amirhossein
AU - Wang, Yao
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
N2 - Deep learning-based methods hold state-of-the-art results in low-level image processing tasks, but remain difficult to interpret due to their black-box construction. Unrolled optimization networks present an interpretable alternative to constructing deep neural networks by deriving their architecture from classical iterative optimization methods, without use of tricks from the standard deep learning toolbox. So far, such methods have demonstrated performance close to that of state-of-the-art models while using their interpretable construction to achieve a comparably low learned parameter count. In this work, we propose an unrolled convolutional dictionary learning network (CDLNet) and demonstrate its competitive denoising and joint denoising and demosaicing (JDD) performance in both low and high parameter count regimes. Specifically, we show that the proposed model outperforms state-of-the-art fully convolutional denoising and JDD models when scaled to a similar parameter count. In addition, we leverage the model's interpretable construction to propose a noise-adaptive parameterization of the thresholds in the network, which enables state-of-the-art blind denoising performance and near-perfect generalization to noise levels unseen during training. Furthermore, we show that this performance extends to the JDD task and to unsupervised learning.
AB - Deep learning-based methods hold state-of-the-art results in low-level image processing tasks, but remain difficult to interpret due to their black-box construction. Unrolled optimization networks present an interpretable alternative to constructing deep neural networks by deriving their architecture from classical iterative optimization methods, without use of tricks from the standard deep learning toolbox. So far, such methods have demonstrated performance close to that of state-of-the-art models while using their interpretable construction to achieve a comparably low learned parameter count. In this work, we propose an unrolled convolutional dictionary learning network (CDLNet) and demonstrate its competitive denoising and joint denoising and demosaicing (JDD) performance in both low and high parameter count regimes. Specifically, we show that the proposed model outperforms state-of-the-art fully convolutional denoising and JDD models when scaled to a similar parameter count. In addition, we leverage the model's interpretable construction to propose a noise-adaptive parameterization of the thresholds in the network, which enables state-of-the-art blind denoising performance and near-perfect generalization to noise levels unseen during training. Furthermore, we show that this performance extends to the JDD task and to unsupervised learning.
KW - Interpretable deep learning
KW - blind denoising
KW - dictionary learning
KW - joint demosaicing and denoising
KW - sparse coding
KW - unrolled networks
UR - http://www.scopus.com/inward/record.url?scp=85131344137&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85131344137&partnerID=8YFLogxK
U2 - 10.1109/OJSP.2022.3172842
DO - 10.1109/OJSP.2022.3172842
M3 - Article
AN - SCOPUS:85131344137
SN - 2644-1322
VL - 3
SP - 196
EP - 211
JO - IEEE Open Journal of Signal Processing
JF - IEEE Open Journal of Signal Processing
ER -
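
The abstract above describes noise-adaptive thresholds inside an unrolled sparse-coding network. Below is a minimal Python sketch of one such noise-adaptive soft-thresholding update; it is not the authors' implementation, and the affine threshold parameterization (tau0 + tau1 * sigma_hat) along with all function and parameter names are assumptions made purely for illustration.

# Illustrative sketch only (not the CDLNet release): one unrolled ISTA-style
# update with a noise-adaptive soft-threshold. The affine dependence of the
# threshold on the estimated noise level and all names here are assumptions.
import numpy as np

def soft_threshold(z, tau):
    """Elementwise soft-thresholding, the proximal operator of the l1 penalty."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def noise_adaptive_step(z, grad, step_size, tau0, tau1, sigma_hat):
    """One proximal-gradient update on the sparse codes z.

    grad is the gradient of the data-fidelity term with respect to z;
    the shrinkage threshold scales with the estimated noise level sigma_hat.
    """
    tau = tau0 + tau1 * sigma_hat
    return soft_threshold(z - step_size * grad, tau)

# Toy usage with random arrays standing in for subband coefficients.
rng = np.random.default_rng(0)
z = rng.normal(size=(16, 32, 32))       # sparse codes (channels x H x W)
grad = rng.normal(size=z.shape)         # placeholder data-fidelity gradient
z_next = noise_adaptive_step(z, grad, step_size=0.1,
                             tau0=0.05, tau1=0.02, sigma_hat=25 / 255)
print(z_next.shape, float(np.mean(z_next == 0)))  # sparsity induced by shrinkage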