Abstract
Recent years have witnessed rapid advances in graph representation learning, with the continuous embedding approach emerging as the dominant paradigm. However, such methods face challenges in parameter efficiency, interpretability, and robustness. Quantized Graph Representation (QGR) learning, which represents graph structure with discrete codes rather than conventional continuous embeddings, has therefore attracted growing interest. Because its representation form is analogous to natural language, QGR can also integrate graph structures seamlessly with large language models (LLMs). As this emerging paradigm is still in its infancy yet holds significant promise, we undertake this survey to foster its rapid development. We first present the background of general quantization methods and their merits. We then give an in-depth account of current QGR studies from the perspectives of quantization strategies, training objectives, distinctive designs, knowledge graph quantization, and applications. We further explore strategies for code dependence learning and integration with LLMs. Finally, we discuss open issues and outline future directions, aiming to provide a comprehensive picture of QGR and to inspire future research.
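To make the core idea concrete, below is a minimal sketch of how discrete codes can replace continuous node embeddings, assuming a VQ-VAE-style nearest-neighbor codebook lookup with a straight-through gradient estimator. The class name `NodeQuantizer`, the codebook size, and the loss weighting are illustrative assumptions, not the method of any specific QGR paper.

```python
import torch
import torch.nn.functional as F

class NodeQuantizer(torch.nn.Module):
    """Hypothetical sketch: quantize continuous node embeddings into
    discrete codes via a learnable codebook (VQ-VAE style)."""

    def __init__(self, num_codes: int = 512, dim: int = 64):
        super().__init__()
        # Learnable codebook: each row is one discrete "code".
        self.codebook = torch.nn.Embedding(num_codes, dim)

    def forward(self, z: torch.Tensor):
        # z: continuous node embeddings of shape (num_nodes, dim),
        # e.g. produced by a GNN encoder.
        d = torch.cdist(z, self.codebook.weight)  # (num_nodes, num_codes) L2 distances
        codes = d.argmin(dim=-1)                  # one discrete code index per node
        z_q = self.codebook(codes)                # quantized embeddings
        # Straight-through estimator: the forward pass uses z_q, while
        # gradients flow back to z unchanged.
        z_q = z + (z_q - z).detach()
        # Commitment loss pulls the encoder output toward its assigned code.
        commit = F.mse_loss(z, z_q.detach())
        return z_q, codes, commit

if __name__ == "__main__":
    quantizer = NodeQuantizer()
    z = torch.randn(10, 64)            # 10 nodes with 64-d embeddings
    z_q, codes, commit = quantizer(z)
    print(codes.shape, z_q.shape, commit.item())
```

In this sketch, each node is ultimately described by a small integer index (its code), which is what makes the representation compact, inspectable, and token-like enough to feed into an LLM vocabulary.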
URL
https://arxiv.org/abs/2502.00681