Title
A generalized Hopfield model to store and retrieve mismatched memory patterns
Authors
Abstract
We study a class of Hopfield models in which the memories are represented by a mixture of Gaussian and binary variables and the neurons are Ising spins. We study the properties of this family of models as the relative weight of the two kinds of variables in the patterns varies. We quantitatively determine how the retrieval phase squeezes towards zero as the memory patterns contain a larger fraction of mismatched variables. When the memories are purely Gaussian, retrieval is lost for any positive storage capacity. We show that this comes about because of the spherical symmetry of the free energy in the Gaussian case. By introducing two distinct overlaps between spin configurations and the two kinds of contributions to each memory pattern, one can observe that the Gaussian parts of the patterns act as a noise, making retrieval more difficult. The basins of attraction of the states, the accuracy of the retrieval and the storage capacity are studied by means of Monte Carlo numerical simulations. We find that even in the limit where the network capacity shrinks to zero, the (few) retrieval states maintain a large basin of attraction and large overlaps with the mismatched patterns. The network can therefore still be used for retrieval, but with a very small capacity.
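The setup described in the abstract can be illustrated with a minimal numerical sketch. The code below is an assumption-laden toy version, not the authors' simulation code: patterns mix a fraction `w` of Gaussian entries (the "mismatched" variables) with binary ±1 entries, couplings are standard Hebbian, and retrieval is run as zero-temperature asynchronous Monte Carlo dynamics on Ising spins, monitoring the overlap between the spin state and a stored pattern. The parameter names (`w`, `N`, `P`) and the specific values are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200   # number of Ising neurons
P = 5     # number of stored patterns (small load, inside the retrieval phase)
w = 0.3   # illustrative fraction of Gaussian (mismatched) variables per pattern

# Each pattern mixes Gaussian and binary (+/-1) entries; w sets their relative weight.
n_gauss = int(w * N)
patterns = np.empty((P, N))
patterns[:, :n_gauss] = rng.standard_normal((P, n_gauss))            # Gaussian part
patterns[:, n_gauss:] = rng.choice([-1.0, 1.0], (P, N - n_gauss))    # binary part

# Hebbian couplings J_ij = (1/N) sum_mu xi_i^mu xi_j^mu, no self-coupling.
J = patterns.T @ patterns / N
np.fill_diagonal(J, 0.0)

def overlap(s, xi):
    """Overlap m = (1/N) sum_i xi_i s_i between a spin state and a pattern."""
    return s @ xi / len(s)

def retrieve(s, J, sweeps=20):
    """Zero-temperature asynchronous dynamics: align each spin with its local field."""
    s = s.copy()
    for _ in range(sweeps):
        for i in rng.permutation(len(s)):
            s[i] = 1.0 if J[i] @ s >= 0 else -1.0
    return s

# Start from a noisy version of the sign of pattern 0 and let the dynamics run.
target = np.sign(patterns[0])
target[target == 0] = 1.0
noisy = target * rng.choice([1.0, -1.0], N, p=[0.8, 0.2])   # flip ~20% of the spins
final = retrieve(noisy, J)
print(f"initial overlap: {overlap(noisy, patterns[0]):.2f}")
print(f"final overlap:   {overlap(final, patterns[0]):.2f}")
```

Because the Gaussian entries have unit variance while the spins are ±1, the overlap with a mixed pattern stays below 1 even at perfect alignment with its sign; this is one concrete sense in which the Gaussian part acts as noise on retrieval.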