About This Document
- sl:arxiv_firstAuthor : Yihong Chen
- sl:arxiv_num : 2207.09980
- sl:arxiv_published : 2022-07-20T15:39:30Z
- sl:arxiv_summary : Factorisation-based Models (FMs), such as DistMult, have enjoyed enduring
success for Knowledge Graph Completion (KGC) tasks, often outperforming Graph
Neural Networks (GNNs). However, unlike GNNs, FMs struggle to incorporate node
features and to generalise to unseen nodes in inductive settings. Our work
bridges the gap between FMs and GNNs by proposing ReFactorGNNs. This new
architecture draws upon both modelling paradigms, which previously were largely
thought of as disjoint. Concretely, using a message-passing formalism, we show
how FMs can be cast as GNNs by reformulating the gradient descent procedure as
message-passing operations, which forms the basis of our ReFactorGNNs. Across a
multitude of well-established KGC benchmarks, our ReFactorGNNs achieve
comparable transductive performance to FMs, and state-of-the-art inductive
performance while using an order of magnitude fewer parameters.@en
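The abstract's core observation, that a gradient-descent step on a factorisation-based score behaves like a message-passing update on node embeddings, can be illustrated with a minimal sketch. This is not the authors' ReFactorGNN implementation; the single-triple DistMult objective, the tensor names `E` and `R`, and the learning rate are illustrative assumptions.

```python
# Minimal sketch (not the paper's code): a gradient step on a DistMult
# score read as a message-passing update on entity ("node") embeddings.
import torch

num_entities, num_relations, dim = 5, 2, 8
E = torch.randn(num_entities, dim, requires_grad=True)   # entity embeddings
R = torch.randn(num_relations, dim, requires_grad=True)  # relation embeddings

# One observed training triple (subject, relation, object).
s, r, o = 0, 1, 3

# DistMult score: <e_s, w_r, e_o> = sum_k e_s[k] * w_r[k] * e_o[k]
score = (E[s] * R[r] * E[o]).sum()
loss = -score  # maximise the score of the observed triple
loss.backward()

# The gradient w.r.t. e_s is -(w_r * e_o): a "message" arriving from the
# neighbour o along relation r. A gradient step is therefore an
# aggregate-and-update step in the GNN sense.
lr = 0.1
with torch.no_grad():
    message_to_s = R[r] * E[o]      # message o -> s through relation r
    E[s] += lr * message_to_s       # gradient descent == message passing
    assert torch.allclose(E.grad[s], -message_to_s)
```

Under this reading, each training triple sends a relation-modulated message between its endpoint nodes, which is the correspondence the paper formalises to cast FMs as GNNs.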
- sl:arxiv_title : ReFactorGNNs: Revisiting Factorisation-based Models from a Message-Passing Perspective@en
- sl:arxiv_updated : 2022-07-21T13:33:26Z
- sl:bookmarkOf : https://arxiv.org/abs/2207.09980
- sl:creationDate : 2022-07-23
- sl:creationTime : 2022-07-23T12:57:37Z