About This Document
- sl:arxiv_author : Petar Veličković
- sl:arxiv_firstAuthor : Petar Veličković
- sl:arxiv_num : 2301.08210
- sl:arxiv_published : 2023-01-19T18:09:43Z
- sl:arxiv_summary : In many ways, graphs are the main modality of data we receive from nature.
This is because most of the patterns we see, in both natural and artificial
systems, are elegantly representable using the language of graph structures.
Prominent examples include molecules (represented as graphs of atoms and
bonds), social networks and transportation networks. This potential has
already been recognised by key scientific and industrial groups, with
application areas already impacted including traffic forecasting, drug
discovery, social network analysis and recommender systems. Further, some of
the most successful domains of application for machine learning in previous
years -- images, text and speech processing -- can be seen as special cases of
graph representation learning, and consequently there has been significant
exchange of information between these areas. The main aim of this short survey
is to enable the reader to assimilate the key concepts in the area, and to
position graph representation learning in a proper context with related fields.
- sl:arxiv_title : Everything is Connected: Graph Neural Networks
- sl:arxiv_updated : 2023-01-19T18:09:43Z
- sl:bookmarkOf : https://arxiv.org/abs/2301.08210
- sl:creationDate : 2023-01-21
- sl:creationTime : 2023-01-21T14:01:42Z