
English Journal


JIPS (Korea Information Processing Society)

Current Result Document:

Korean Title: A Graph Embedding Technique for Weighted Graphs Based on LSTM Autoencoders
English Title: A Graph Embedding Technique for Weighted Graphs Based on LSTM Autoencoders
Authors: Minji Seo, Ki Yong Lee
Citation: Vol. 16, No. 6, pp. 1407-1423 (Dec. 2020)
Korean Abstract: (not provided)
English Abstract:
A graph is a data structure consisting of nodes and edges between these nodes. Graph embedding generates a low-dimensional vector for a given graph that best represents the characteristics of the graph. Recently, there have been studies on graph embedding, especially using deep learning techniques. However, until now, most deep learning-based graph embedding techniques have focused on unweighted graphs. Therefore, in this paper, we propose a graph embedding technique for weighted graphs based on long short-term memory (LSTM) autoencoders. Given weighted graphs, we traverse each graph to extract node-weight sequences from the graph. Each node-weight sequence represents a path in the graph consisting of nodes and the weights between these nodes. We then train an LSTM autoencoder on the extracted node-weight sequences and encode each node-weight sequence into a fixed-length vector using the trained LSTM autoencoder. Finally, for each graph, we collect the encoding vectors obtained from the graph and combine them to generate the final embedding vector for the graph. These embedding vectors can be used to classify weighted graphs or to search for similar weighted graphs. Experiments on synthetic and real datasets show that the proposed method is effective in measuring the similarity between weighted graphs.
Keywords: Graph Embedding, Graph Similarity, LSTM Autoencoder, Weighted Graph Embedding, Weighted Graph
Attachment: PDF download
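
The abstract above outlines a three-step pipeline: traverse each weighted graph to extract node-weight sequences, train an LSTM autoencoder on those sequences to obtain fixed-length encodings, and combine the encodings of each graph into a final embedding vector. The following is a minimal, hypothetical sketch of that pipeline, not the authors' implementation; it assumes PyTorch and NetworkX, random walks as the traversal strategy, mean pooling as the combination step, and arbitrary walk lengths and hidden sizes, none of which are specified in the abstract.

    # Hypothetical sketch of weighted-graph embedding with an LSTM autoencoder.
    # Assumptions (not from the paper): random-walk traversal, mean pooling,
    # walk length 6, hidden size 32.
    import random
    import networkx as nx
    import torch
    import torch.nn as nn

    def node_weight_sequences(graph, walks_per_node=5, walk_length=6):
        """Extract node-weight sequences (node id, edge weight per step) via random walks."""
        sequences = []
        for start in graph.nodes:
            for _ in range(walks_per_node):
                seq, node = [], start
                for _ in range(walk_length):
                    neighbors = list(graph.neighbors(node))
                    if not neighbors:
                        break
                    nxt = random.choice(neighbors)
                    w = graph[node][nxt].get("weight", 1.0)
                    seq.append([float(node), w])      # one (node, weight) pair per step
                    node = nxt
                if seq:
                    sequences.append(torch.tensor(seq))
        return sequences

    class LSTMAutoencoder(nn.Module):
        """Encode a variable-length (node, weight) sequence into a fixed-length vector."""
        def __init__(self, input_dim=2, hidden_dim=32):
            super().__init__()
            self.encoder = nn.LSTM(input_dim, hidden_dim, batch_first=True)
            self.decoder = nn.LSTM(hidden_dim, hidden_dim, batch_first=True)
            self.out = nn.Linear(hidden_dim, input_dim)

        def forward(self, x):                          # x: (1, seq_len, input_dim)
            _, (h, _) = self.encoder(x)                # final hidden state = fixed-length code
            code = h.transpose(0, 1)                   # (1, 1, hidden_dim)
            repeated = code.repeat(1, x.size(1), 1)    # feed the code at every decoding step
            decoded, _ = self.decoder(repeated)
            return self.out(decoded), code.squeeze()   # reconstruction and encoding vector

    def embed_graph(graph, model, **walk_kwargs):
        """Combine the encodings of all sequences of one graph (here: mean pooling)."""
        with torch.no_grad():
            codes = [model(seq.unsqueeze(0))[1]
                     for seq in node_weight_sequences(graph, **walk_kwargs)]
        return torch.stack(codes).mean(dim=0)

    if __name__ == "__main__":
        # Toy weighted graph and a short training loop on its node-weight sequences.
        g = nx.Graph()
        g.add_weighted_edges_from([(0, 1, 0.5), (1, 2, 2.0), (2, 3, 1.5), (3, 0, 0.7)])
        model = LSTMAutoencoder()
        optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
        loss_fn = nn.MSELoss()
        sequences = node_weight_sequences(g)
        for epoch in range(50):
            for seq in sequences:
                x = seq.unsqueeze(0)                   # batch of one sequence
                recon, _ = model(x)
                loss = loss_fn(recon, x)               # autoencoder reconstruction loss
                optimizer.zero_grad()
                loss.backward()
                optimizer.step()
        print(embed_graph(g, model))                   # fixed-length embedding of the whole graph

Embeddings produced this way can be compared with cosine or Euclidean distance to search for similar weighted graphs, or fed to a downstream classifier, as the abstract suggests.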