Title
Finite Littlestone Dimension Implies Finite Information Complexity
Authors
Abstract
We prove that every online learnable class of functions of Littlestone dimension $d$ admits a learning algorithm with finite information complexity. Towards this end, we use the notion of a globally stable algorithm. Generally, the information complexity of such a globally stable algorithm is large yet finite, roughly exponential in $d$. We also show there is room for improvement: for a canonical online learnable class, the indicator functions of affine subspaces of dimension $d$, the information complexity can be upper bounded logarithmically in $d$.