Paper Title
On the Probability of Necessity and Sufficiency of Explaining Graph Neural Networks: A Lower Bound Optimization Approach
Paper Authors
Paper Abstract
The explainability of Graph Neural Networks (GNNs) is critical to various GNN applications, yet it remains a significant challenge. A convincing explanation should be necessary and sufficient simultaneously. However, existing GNN explanation approaches focus on only one of the two aspects, necessity or sufficiency, or on a heuristic trade-off between the two. Theoretically, the Probability of Necessity and Sufficiency (PNS) holds the potential to identify the most necessary and sufficient explanation, since it mathematically quantifies both the necessity and the sufficiency of an explanation. Nevertheless, the difficulty of obtaining PNS, due to non-monotonicity and the challenge of counterfactual estimation, limits its wide use. To address the non-identifiability of PNS, we resort to a lower bound of PNS that can be optimized via counterfactual estimation, and propose the framework of Necessary and Sufficient Explanation for GNNs (NSEG), which optimizes that lower bound. Specifically, we depict the GNN as a structural causal model (SCM) and estimate the counterfactual probabilities via interventions under the SCM. Additionally, we leverage continuous masks with a sampling strategy to optimize the lower bound, enhancing scalability. Empirical results demonstrate that NSEG outperforms state-of-the-art methods, consistently generating the most necessary and sufficient explanations.
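The lower-bound optimization sketched in the abstract can be illustrated with a toy example. In the causal-inference literature, PNS admits the lower bound max{0, P(y_x) − P(y_{x'})}; read through an edge mask, this suggests maximizing the model's prediction on the masked-in subgraph while minimizing it on the complement (keeping the explanation is sufficient; removing it is necessary). The code below is a minimal sketch under assumed details, not the paper's NSEG implementation: the `predict` function, its weights, and the finite-difference ascent are hypothetical stand-ins for a trained GNN and its gradient-based optimizer.

```python
import numpy as np

# Hypothetical stand-in for a trained GNN: predicts the target-class
# probability from a weighted sum of edge features (illustration only).
w = np.array([2.0, -0.1, 1.5, -0.2, 0.05])   # per-edge contribution to the prediction
x = np.ones(5)                                # edge "features" of the input graph

def predict(mask):
    """Target-class probability given a soft edge mask in [0, 1]^E."""
    return 1.0 / (1.0 + np.exp(-((w * x * mask).sum() - 1.0)))

def lower_bound(mask):
    """Surrogate of the PNS lower bound: the masked-in subgraph should be
    sufficient (high prediction) and its removal necessary (low prediction)."""
    return predict(mask) - predict(1.0 - mask)

# Optimize a continuous mask via finite-difference gradient ascent.
mask = np.full(5, 0.5)
lr, eps = 0.5, 1e-5
for _ in range(200):
    grad = np.zeros_like(mask)
    for i in range(len(mask)):
        m_hi, m_lo = mask.copy(), mask.copy()
        m_hi[i] += eps
        m_lo[i] -= eps
        grad[i] = (lower_bound(m_hi) - lower_bound(m_lo)) / (2 * eps)
    mask = np.clip(mask + lr * grad, 0.0, 1.0)

print(np.round(mask, 2))  # edges pushed toward 1 form the explanation subgraph
```

In NSEG itself, the counterfactual probabilities are estimated through interventions under the SCM and the relaxed mask is handled with a sampling strategy; the plain ascent here is only meant to show how optimizing the bound separates explanation edges from the rest.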