Paper Title
Tackling Multiple Tasks with One Single Learning Framework
Paper Authors
Paper Abstract
Deep Multi-Task Learning (DMTL) has been widely studied in the machine learning community and applied to a broad range of real-world applications. Searching for the optimal knowledge sharing in DMTL is more challenging for sequential learning problems, because the task relationships change along the temporal dimension. In this paper, we propose a flexible and efficient framework called Hierarchical Temporal Activation Network (HTAN) to simultaneously explore the optimal sharing across the neural network hierarchy (hierarchical axis) and the time-variant task relationships (temporal axis). HTAN learns a set of time-variant activation functions to encode the task relationships. A functional regularization, implemented by a modulated SPDNet and adversarial learning, is further proposed to enhance DMTL performance. Comprehensive experiments on several challenging applications demonstrate that our HTAN-SPD framework significantly outperforms SOTA methods in sequential DMTL.
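
To make the core idea more concrete, below is a minimal sketch, not the paper's actual architecture, of one way a "time-variant activation function" could encode task relationships on top of a shared layer. The class name TimeVariantActivationLayer, the per-(task, time-step) slope/shift parameters, and the leaky-ReLU-style parameterization are all illustrative assumptions, not details taken from HTAN.

    # Minimal sketch (assumed design, not the authors' implementation):
    # a shared linear transform followed by an activation whose parameters
    # are indexed by task and time step, so task relationships and their
    # temporal drift can be expressed in the activation parameters.
    import torch
    import torch.nn as nn


    class TimeVariantActivationLayer(nn.Module):
        def __init__(self, in_dim, out_dim, num_tasks, num_steps):
            super().__init__()
            self.shared = nn.Linear(in_dim, out_dim)  # knowledge shared by all tasks
            # One learnable slope and shift per (task, time step, feature).
            self.slope = nn.Parameter(torch.ones(num_tasks, num_steps, out_dim))
            self.shift = nn.Parameter(torch.zeros(num_tasks, num_steps, out_dim))

        def forward(self, x, task_id, step):
            h = self.shared(x)
            a = self.slope[task_id, step]
            b = self.shift[task_id, step]
            # Leaky-ReLU-like activation whose negative slope and offset
            # vary with the task index and the time step.
            return torch.where(h > 0, h, a * h) + b


    # Usage: two tasks observed over five time steps, 16-d inputs, 32-d hidden units.
    layer = TimeVariantActivationLayer(in_dim=16, out_dim=32, num_tasks=2, num_steps=5)
    x = torch.randn(8, 16)                # a batch of 8 samples
    out = layer(x, task_id=1, step=3)     # activation parameters for task 1 at step 3
    print(out.shape)                      # torch.Size([8, 32])

Under this reading, the shared linear weights carry the knowledge common to all tasks, while the per-(task, time-step) activation parameters are where time-variant task relationships would be encoded; the modulated SPDNet and adversarial functional regularization described in the abstract are not sketched here.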