Paper Title
Novel techniques for improving NNetEn entropy calculation for short and noisy time series
Paper Authors
Paper Abstract
Entropy is a fundamental concept in the field of information theory. During measurement, conventional entropy measures are susceptible to length and amplitude changes in time series. A new entropy metric, neural network entropy (NNetEn), has been developed to overcome these limitations. NNetEn entropy is computed using a modified LogNNet neural network classification model. The algorithm contains a reservoir matrix of N = 19625 elements that must be filled with the given data. The contribution of this paper is threefold. First, this work investigates different methods of filling the reservoir with time series (signal) elements. The reservoir filling method determines the accuracy of the entropy estimation through the convolution of the studied time series with the LogNNet test data. The present study proposes six methods for filling the reservoir with time series data. Two of them (Method 3 and Method 6) employ the novel approach of stretching the time series to create intermediate elements that complement it without changing its dynamics. The most reliable methods for short time series are Method 3 and Method 5. The second part of the study examines the influence of noise and constant bias on entropy values. It considers three time series data types (chaotic, periodic, and binary) with different dynamic properties, signal-to-noise ratios (SNR), and offsets. The NNetEn entropy calculation error is less than 10% when the SNR is greater than 30 dB, and the entropy decreases as the bias component increases. The third part of the article analyzes real-time biosignal EEG data collected in emotion recognition experiments. The NNetEn measure shows robustness under low-amplitude noise when various filters are applied. Thus, NNetEn measures entropy effectively when applied to real-world environments with ambient noise, white noise, and 1/f noise.
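To make two of the ideas summarized above more concrete, the sketch below (Python with NumPy) illustrates how a short time series might be stretched by linear interpolation to fill a fixed-size reservoir without altering its dynamics, and how white noise can be added at a prescribed SNR in dB for the kind of robustness test reported above. The helper names, the 25 x 785 reservoir shape, and the logistic-map test signal are illustrative assumptions, not the authors' reference implementation or the NNetEn package API.

    # Illustrative sketch only; hypothetical helpers, not the NNetEn package API.
    import numpy as np

    def stretch_series(x, target_len):
        """Stretch a short time series to target_len samples by linear
        interpolation, inserting intermediate elements between the original
        ones without changing the overall dynamics (the idea behind the
        stretching-based filling methods)."""
        x = np.asarray(x, dtype=float)
        old_grid = np.linspace(0.0, 1.0, num=len(x))
        new_grid = np.linspace(0.0, 1.0, num=target_len)
        return np.interp(new_grid, old_grid, x)

    def fill_reservoir(x, rows=25, cols=785):
        """Fill a rows x cols reservoir matrix (25 x 785 = 19625 entries,
        matching the N = 19625 stated in the abstract) row by row with the
        stretched series; this is one of several possible filling strategies."""
        stretched = stretch_series(x, rows * cols)
        return stretched.reshape(rows, cols)

    def add_noise_at_snr(x, snr_db, rng=None):
        """Add white Gaussian noise to x so that the resulting signal-to-noise
        ratio equals snr_db, with SNR_dB = 10*log10(P_signal / P_noise)."""
        rng = np.random.default_rng() if rng is None else rng
        x = np.asarray(x, dtype=float)
        p_signal = np.mean(x ** 2)
        p_noise = p_signal / (10.0 ** (snr_db / 10.0))
        return x + rng.normal(0.0, np.sqrt(p_noise), size=x.shape)

    # Example: a short logistic-map series (a common chaotic test signal),
    # stretched to fill the reservoir, then corrupted at 30 dB SNR -- the
    # threshold above which the abstract reports entropy errors below 10%.
    x = np.empty(300)
    x[0] = 0.4
    for n in range(299):
        x[n + 1] = 3.9 * x[n] * (1.0 - x[n])   # logistic map, r = 3.9
    reservoir = fill_reservoir(x)
    noisy = add_noise_at_snr(x, snr_db=30)

Linear interpolation is only one way to generate the intermediate elements; the paper compares several filling strategies, of which Method 3 and Method 5 are reported as the most reliable for short series.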