Abstract
Multi-domain recommendation and multi-task recommendation have demonstrated their effectiveness in leveraging common information across different domains and objectives for comprehensive user modeling. However, practical recommender systems usually face multiple domains and tasks simultaneously, which current methods cannot address well. To this end, we introduce M3oE, an adaptive multi-domain multi-task mixture-of-experts recommendation framework. M3oE integrates multi-domain information, maps knowledge across domains and tasks, and optimizes multiple objectives. We leverage three mixture-of-experts modules to learn common, domain-aspect, and task-aspect user preferences, respectively, addressing the complex dependencies among multiple domains and tasks in a disentangled manner. Additionally, we design a two-level fusion mechanism for precise control over feature extraction and fusion across diverse domains and tasks. The framework's adaptability is further enhanced by an AutoML technique that allows dynamic structure optimization. To the best of our knowledge, M3oE is the first effort to solve multi-domain multi-task recommendation self-adaptively. Extensive experiments on two benchmark datasets against diverse baselines demonstrate M3oE's superior performance. The implementation code is available to ensure reproducibility.
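The abstract's architecture can be sketched in miniature: three mixture-of-experts modules (common, domain-aspect, task-aspect) whose outputs are fused before a prediction head. This is a minimal NumPy illustration of the general idea, not the authors' implementation; all class names, dimensions, and the fixed fusion weights (which the paper instead optimizes, e.g. via AutoML) are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

class MoE:
    """One mixture-of-experts module: gated sum of linear expert outputs."""
    def __init__(self, d_in, d_out, n_experts):
        self.W = rng.normal(0, 0.1, (n_experts, d_in, d_out))  # expert weights
        self.G = rng.normal(0, 0.1, (d_in, n_experts))         # gating weights

    def __call__(self, x):                          # x: (batch, d_in)
        gates = softmax(x @ self.G)                 # (batch, n_experts)
        outs = np.einsum('bi,eio->beo', x, self.W)  # per-expert outputs
        return np.einsum('be,beo->bo', gates, outs) # gate-weighted mixture

class M3oESketch:
    """Illustrative only: a shared MoE plus per-domain and per-task MoEs,
    combined by fusion weights (fixed here; searched/learned in the paper)."""
    def __init__(self, d_in, d_hid, n_domains, n_tasks, n_experts=4):
        self.shared = MoE(d_in, d_hid, n_experts)
        self.domain = [MoE(d_in, d_hid, n_experts) for _ in range(n_domains)]
        self.task = [MoE(d_in, d_hid, n_experts) for _ in range(n_tasks)]
        self.alpha = softmax(rng.normal(size=3))    # common/domain/task mix
        self.head = rng.normal(0, 0.1, (d_hid, 1))  # prediction head

    def predict(self, x, domain_id, task_id):
        h = (self.alpha[0] * self.shared(x)
             + self.alpha[1] * self.domain[domain_id](x)
             + self.alpha[2] * self.task[task_id](x))
        return 1 / (1 + np.exp(-(h @ self.head)))   # CTR-style score in (0, 1)

model = M3oESketch(d_in=16, d_hid=8, n_domains=2, n_tasks=2)
x = rng.normal(size=(4, 16))
scores = model.predict(x, domain_id=0, task_id=1)
print(scores.shape)  # (4, 1)
```

The disentanglement in the paper comes from routing the same input through separate expert groups, so domain-specific and task-specific signals are not forced through one shared bottleneck.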
URL
https://arxiv.org/abs/2404.18465