Title


The Best of Both Worlds: a Framework for Combining Degradation Prediction with High Performance Super-Resolution Networks

Authors

Matthew Aquilina, Keith George Ciantar, Christian Galea, Kenneth P. Camilleri, Reuben A. Farrugia, John Abela

Abstract


To date, the best-performing blind super-resolution (SR) techniques follow one of two paradigms: a) generate and train a standard SR network on synthetic low-resolution to high-resolution (LR-HR) pairs, or b) attempt to predict the degradations an LR image has suffered and use these to inform a customised SR network. Despite significant progress, subscribers to the former miss out on useful degradation information that could be used to improve the SR process. On the other hand, followers of the latter rely on weaker SR networks, which are significantly outperformed by the latest architectural advancements. In this work, we present a framework for combining any blind SR prediction mechanism with any deep SR network, using a metadata insertion block to insert prediction vectors into SR network feature maps. Through comprehensive testing, we prove that state-of-the-art contrastive and iterative prediction schemes can be successfully combined with high-performance SR networks such as RCAN and HAN within our framework. We show that our hybrid models consistently achieve stronger SR performance than both their non-blind and blind counterparts. Furthermore, we demonstrate our framework's robustness by predicting degradations and super-resolving images from a complex pipeline of blurring, noise and compression.
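The core mechanism the abstract describes can be sketched numerically: a predicted degradation vector is projected to per-channel scale and shift parameters that modulate the SR network's feature maps. The sketch below is a minimal NumPy illustration with made-up shapes, random weights, and a hypothetical `metadata_insertion` function; the paper's actual block design (and its integration into RCAN/HAN) may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

def metadata_insertion(feats, meta, W_scale, W_shift):
    """Modulate SR feature maps with a degradation-prediction vector.

    feats:   (C, H, W) feature maps from an SR network stage
    meta:    (M,) predicted degradation vector (e.g. blur/noise/compression parameters)
    W_scale, W_shift: (C, M) learned projections (random here, for illustration only)
    """
    scale = (W_scale @ meta)[:, None, None]  # per-channel scale, broadcast over H x W
    shift = (W_shift @ meta)[:, None, None]  # per-channel shift
    return feats * (1.0 + scale) + shift

# Illustrative dimensions: 64 feature channels, 32x32 maps, 10-dim degradation vector.
C, H, W, M = 64, 32, 32, 10
feats = rng.standard_normal((C, H, W))
meta = rng.standard_normal(M)
W_scale = 0.01 * rng.standard_normal((C, M))
W_shift = 0.01 * rng.standard_normal((C, M))

out = metadata_insertion(feats, meta, W_scale, W_shift)
print(out.shape)  # (64, 32, 32) -- same shape as the input feature maps
```

Because the block preserves the feature-map shape, it can in principle be dropped between existing layers of any deep SR backbone, which is what makes the framework architecture-agnostic.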
