Paper Title

Fully trainable Gaussian derivative convolutional layer

Authors

Valentin Penaud-Polge, Santiago Velasco-Forero, Jesus Angulo

Abstract

The Gaussian kernel and its derivatives have already been employed for Convolutional Neural Networks in several previous works. Most of these papers proposed to compute filters by linearly combining one or several bases of fixed or slightly trainable Gaussian kernels with or without their derivatives. In this article, we propose a high-level configurable layer based on anisotropic, oriented and shifted Gaussian derivative kernels which generalize notions encountered in previous related works while keeping their main advantage. The results show that the proposed layer has competitive performance compared to previous works and that it can be successfully included in common deep architectures such as VGG16 for image classification and U-net for image segmentation.
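To make the idea concrete, below is a minimal sketch of how convolution filters can be built as linear combinations of a bank of anisotropic, oriented and shifted Gaussian derivative kernels. This is not the authors' implementation: the class name GaussianDerivativeConv2d, the choice of a fixed basis (scales, orientations and shifts frozen, only the mixing weights trained) and the restriction to first-order derivatives are my own assumptions for illustration; the layer described in the abstract additionally makes the Gaussian parameters themselves trainable.

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F


def gaussian_derivative_kernel(size, sigma_u, sigma_v, theta,
                               mu_u=0.0, mu_v=0.0, order_u=0, order_v=0):
    """Anisotropic, oriented, shifted Gaussian (derivative) kernel on a size x size grid.
    (Hypothetical helper, not the paper's code.)"""
    half = (size - 1) / 2.0
    ys, xs = torch.meshgrid(
        torch.linspace(-half, half, size),
        torch.linspace(-half, half, size),
        indexing="ij",
    )
    # Rotate coordinates into the kernel's principal axes (u, v), then shift.
    u = math.cos(theta) * xs + math.sin(theta) * ys - mu_u
    v = -math.sin(theta) * xs + math.cos(theta) * ys - mu_v
    g = torch.exp(-0.5 * ((u / sigma_u) ** 2 + (v / sigma_v) ** 2))
    # First-order derivatives along u and/or v via the analytic prefactors.
    if order_u == 1:
        g = g * (-u / sigma_u ** 2)
    if order_v == 1:
        g = g * (-v / sigma_v ** 2)
    return g / g.abs().sum()


class GaussianDerivativeConv2d(nn.Module):
    """Conv layer whose filters are trainable linear combinations of a fixed
    Gaussian-derivative basis (a simplified stand-in for the proposed layer)."""

    def __init__(self, in_channels, out_channels, kernel_size=7, n_orientations=4):
        super().__init__()
        basis = []
        for k in range(n_orientations):
            theta = k * math.pi / n_orientations
            for (du, dv) in [(0, 0), (1, 0), (0, 1)]:
                basis.append(
                    gaussian_derivative_kernel(kernel_size, 2.0, 1.0, theta,
                                               order_u=du, order_v=dv)
                )
        # Fixed basis kernels; only the mixing weights are learned in this sketch.
        self.register_buffer("basis", torch.stack(basis))          # (B, k, k)
        self.weight = nn.Parameter(
            torch.randn(out_channels, in_channels, len(basis)) * 0.1
        )

    def forward(self, x):
        # Filters = linear combination of the basis kernels, then a standard conv.
        filters = torch.einsum("oib,bkl->oikl", self.weight, self.basis)
        return F.conv2d(x, filters, padding=filters.shape[-1] // 2)


# Usage: drop-in replacement for a spatial conv in, e.g., a VGG-style block.
layer = GaussianDerivativeConv2d(3, 16)
out = layer(torch.randn(1, 3, 32, 32))
print(out.shape)  # torch.Size([1, 16, 32, 32])
```

In a fully trainable variant along the lines of the abstract, sigma_u, sigma_v, theta and the shifts mu_u, mu_v would themselves be nn.Parameter tensors and the kernel bank would be rebuilt differentiably at each forward pass, rather than frozen at construction time as in this sketch.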
