Paper Title

On the Sparse DAG Structure Learning Based on Adaptive Lasso

Paper Authors

Danru Xu, Erdun Gao, Wei Huang, Menghan Wang, Andy Song, Mingming Gong

Paper Abstract

Learning the underlying Bayesian Networks (BNs), represented by directed acyclic graphs (DAGs), of the concerned events from purely observational data is a crucial part of evidential reasoning. This task remains challenging due to the large and discrete search space. A recent flurry of developments following NOTEARS [1] recasts this combinatorial problem as a continuous optimization problem by leveraging an algebraic equality characterization of acyclicity. However, continuous optimization methods suffer from obtaining non-sparse graphs after numerical optimization, which makes them unable to flexibly rule out potentially cycle-inducing edges or falsely discovered edges with small weights. To address this issue, in this paper we develop a completely data-driven DAG structure learning method that requires no predefined threshold for pruning small-valued edges. We name our method NOTEARS with adaptive Lasso (NOTEARS-AL); it applies an adaptive penalty to ensure the sparsity of the estimated DAG. Moreover, we show that NOTEARS-AL inherits the oracle properties under certain conditions. Extensive experiments on both synthetic datasets and a real-world dataset demonstrate that our method consistently outperforms NOTEARS.
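
The abstract hinges on two technical ingredients: the algebraic acyclicity characterization from NOTEARS, h(W) = tr(e^{W∘W}) − d = 0, and an adaptive-Lasso-style weighted L1 penalty whose weights are derived from an initial estimate, so spurious small edges are shrunk exactly to zero instead of being pruned by a hand-picked threshold. Below is a minimal sketch of those two pieces, not the authors' implementation; the helper names and the parameters lam, gamma, and W_init are illustrative assumptions.

```python
# Minimal sketch (assumed names/parameters; not the paper's official code) of
# the two ingredients NOTEARS-AL combines: the NOTEARS acyclicity function
# h(W) and an adaptive-Lasso weighted L1 penalty.
import numpy as np
from scipy.linalg import expm


def acyclicity(W: np.ndarray) -> float:
    """NOTEARS acyclicity measure h(W) = tr(e^{W∘W}) - d; zero iff W is a DAG."""
    d = W.shape[0]
    return float(np.trace(expm(W * W)) - d)


def adaptive_lasso_penalty(W: np.ndarray, W_init: np.ndarray,
                           lam: float = 0.1, gamma: float = 1.0) -> float:
    """Weighted L1 penalty: entries that were small in an initial (consistent)
    estimate W_init get large weights, driving them exactly to zero."""
    eps = 1e-8  # guards against division by zero for zero-valued initial entries
    weights = 1.0 / (np.abs(W_init) + eps) ** gamma
    return float(lam * np.sum(weights * np.abs(W)))


if __name__ == "__main__":
    # Tiny 3-node example: a weighted DAG (strictly upper-triangular).
    W = np.array([[0.0, 1.2, 0.0],
                  [0.0, 0.0, 0.8],
                  [0.0, 0.0, 0.0]])
    print(acyclicity(W))                 # ~0.0, since W encodes a DAG
    print(adaptive_lasso_penalty(W, W))  # penalty evaluated at the initial estimate
```

In a full pipeline these pieces would enter a single objective, e.g. a least-squares fit plus the weighted L1 penalty, minimized subject to h(W) = 0 via the augmented-Lagrangian scheme used by NOTEARS; the sketch above only isolates the penalty and constraint terms.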
