Network embeddings
Traditionally, networks are represented as adjacency matrices, which suffer from data sparsity and high dimensionality. Recently there has been fast-growing interest in learning continuous, low-dimensional representations of networks. Edges can be directed, undirected, or weighted, and both nodes and edges may carry different semantics. Node embeddings (intuition: similar nodes should have similar vectors):
- Laplacian Eigenmaps: an eigenvector-based computation, workable when the matrix is not too large
- LINE (Large-scale Information Network Embedding): most-cited paper at WWW 2015; breadth-first-style neighborhood sampling
- DeepWalk (Perozzi et al. 2014): adapts the technique for learning word embeddings to nodes, treating random walks on the network as sentences
- Node2Vec (2016): a mixed strategy interpolating between breadth-first and depth-first sampling
etc.
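The DeepWalk idea above (random walks as sentences) can be sketched as follows. This is a minimal illustration, not the original implementation: the toy graph, the function names, and the walk parameters are all made up for the example; in practice the resulting walks would be fed to a word2vec-style trainer (e.g. skip-gram).

```python
import random

# Toy undirected graph as an adjacency list (hypothetical example data).
graph = {
    0: [1, 2],
    1: [0, 2, 3],
    2: [0, 1],
    3: [1],
}

def random_walk(graph, start, length, rng):
    """One truncated random walk: the 'sentence' DeepWalk feeds to word2vec."""
    walk = [start]
    for _ in range(length - 1):
        neighbors = graph[walk[-1]]
        if not neighbors:
            break  # dead end: stop the walk early
        walk.append(rng.choice(neighbors))
    return walk

def build_corpus(graph, walks_per_node=10, walk_length=5, seed=42):
    """Start several walks from every node; nodes play the role of words."""
    rng = random.Random(seed)
    corpus = []
    for node in graph:
        for _ in range(walks_per_node):
            corpus.append(random_walk(graph, node, walk_length, rng))
    return corpus

corpus = build_corpus(graph)
# Each 'sentence' is a list of node ids; nearby nodes in a walk get
# similar vectors, matching the intuition stated above.
```

Node2Vec differs only in the walk: instead of choosing the next node uniformly among neighbors, it biases the choice with two parameters (return parameter p, in-out parameter q) to interpolate between BFS-like and DFS-like exploration.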