Large Language Model with Graph Convolution for Recommendation
In recent years, efforts have been made to use text information for better user profiling and item characterization in recommendations. However, text information can sometimes be of low quality, hindering its effectiveness for real-world applications. With the knowledge and reasoning capabilities encapsulated in Large Language Models (LLMs), utilizing LLMs emerges as a promising way to improve such descriptions. However, existing ways of prompting LLMs with raw texts ignore the structured knowledge of user-item interactions, which may lead to hallucination problems such as inconsistent description generation. To this end, we propose a Graph-aware Convolutional LLM method to elicit LLMs to capture high-order relations in the user-item graph. To adapt text-based LLMs to structured graphs, we use the LLM as an aggregator in graph processing, allowing it to understand graph-based information step by step. Specifically, the LLM enhances descriptions by exploring multi-hop neighbors layer by layer, thereby propagating information progressively through the graph. To enable LLMs to capture large-scale graph information, we break the description task down into smaller parts, which drastically reduces the context length of the token input at each step. Extensive experiments on three real-world datasets show that our method consistently outperforms state-of-the-art methods.
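The abstract describes using the LLM as a layer-wise aggregator over the user-item graph, with the description task split into small per-node prompts. The sketch below illustrates that general idea; it is not the paper's implementation, and the `call_llm` interface, prompt wording, and parameter names (`num_layers`, `max_neighbors`) are assumptions made for illustration.

```python
# A minimal sketch of layer-by-layer description propagation over a user-item
# graph, with an LLM acting as the aggregator at each hop. `call_llm` is a
# hypothetical stand-in for any text-completion API.

from typing import Callable, Dict, List


def propagate_descriptions(
    descriptions: Dict[str, str],     # node id -> current text description
    neighbors: Dict[str, List[str]],  # node id -> adjacent node ids (user-item edges)
    call_llm: Callable[[str], str],   # hypothetical LLM interface: prompt -> completion
    num_layers: int = 2,              # number of hops to propagate
    max_neighbors: int = 5,           # cap neighbors per prompt to keep context short
) -> Dict[str, str]:
    """Iteratively refine each node's description using its neighbors' texts."""
    current = dict(descriptions)
    for _ in range(num_layers):
        updated = {}
        for node, desc in current.items():
            # Keep only a few neighbor descriptions per call so each prompt stays
            # small, mirroring the idea of breaking the task into smaller parts.
            neighbor_texts = [current[n] for n in neighbors.get(node, [])[:max_neighbors]]
            prompt = (
                "Current description:\n" + desc + "\n\n"
                "Descriptions of directly connected users/items:\n"
                + "\n".join(f"- {t}" for t in neighbor_texts)
                + "\n\nRewrite the current description so it stays consistent with "
                  "these interactions. Reply with the revised description only."
            )
            updated[node] = call_llm(prompt)
        # After k layers, each description reflects information from k-hop neighbors.
        current = updated
    return current
```

Because each call only sees a node's direct neighbors, multi-hop information arrives progressively across layers while the token count per prompt stays bounded, which is the intuition behind the context-length reduction claimed in the abstract.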
Further reading
- Access the paper on arXiv.org