Paper Title

Flexible Transmitter Network

Authors

Shao-Qun Zhang, Zhi-Hua Zhou

Abstract

Current neural networks are mostly built upon the MP model, which usually formulates the neuron as executing an activation function on the real-valued weighted aggregation of signals received from other neurons. In this paper, we propose the Flexible Transmitter (FT) model, a novel bio-plausible neuron model with flexible synaptic plasticity. The FT model employs a pair of parameters to model the transmitters between neurons and puts up a neuron-exclusive variable to record the regulated neurotrophin density, which leads to the formulation of the FT model as a two-variable two-valued function, taking the commonly-used MP neuron model as its special case. This modeling manner makes the FT model not only biologically more realistic, but also capable of handling complicated data, even time series. To exhibit its power and potential, we present the Flexible Transmitter Network (FTNet), which is built on the most common fully-connected feed-forward architecture, taking the FT model as the basic building block. FTNet allows gradient calculation and can be implemented by an improved back-propagation algorithm in the complex-valued domain. Experiments on a broad range of tasks show the superiority of the proposed FTNet. This study provides an alternative basic building block in neural networks and exhibits the feasibility of developing artificial neural networks with neuronal plasticity.
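The abstract describes the FT neuron as a two-variable two-valued function: it maps an input signal together with a neuron-exclusive neurotrophin-density variable to an output signal and an updated density, via computation in the complex-valued domain. The following is a minimal illustrative sketch of that idea, not the paper's exact formulation: the function name `ft_neuron`, the parameter pair `(w, v)`, and the choice of a complex `tanh` activation are all assumptions made here for illustration.

```python
import numpy as np

def ft_neuron(x, m_prev, w, v):
    """Illustrative sketch of a Flexible Transmitter (FT) neuron.

    x      : real-valued input signals from other neurons
    m_prev : neuron-exclusive neurotrophin density from the previous step
    (w, v) : a pair of parameters modeling the transmitter

    The two-variable two-valued function maps (x, m_prev) to
    (output, updated density) through a complex-valued activation.
    """
    # Aggregate the input in the real part and the recorded density
    # in the imaginary part, then apply a complex activation (tanh
    # is an assumption; the paper may use a different function).
    z = np.tanh(np.dot(w, x) + 1j * v * m_prev)
    s = z.real  # neuron output
    m = z.imag  # updated neurotrophin density
    return s, m
```

Note that with `v = 0` the imaginary part vanishes and the neuron reduces to a plain MP neuron, `tanh(w^T x)`, which is consistent with the abstract's claim that the MP model is a special case of the FT model.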
