


정보처리학회 논문지 컴퓨터 및 통신시스템 (KIPS Transactions on Computer and Communication Systems)


한글제목(Korean Title) LSTM 오토인코더를 이용한 가중 그래프 임베딩 기법
영문제목(English Title) An Embedding Technique for Weighted Graphs using LSTM Autoencoders
저자(Author) 서민지(Minji Seo), 이기용(Ki Yong Lee)
원문수록처(Citation) Vol. 48, No. 01, pp. 13–26 (Jan. 2021)
한글내용
(Korean Abstract)
그래프 임베딩이란 그래프를 저차원 공간의 벡터로 표현하는 것이다. 최근 딥러닝을 사용해 그래프를 임베딩하는 연구가 진행되고 있지만, 대부분의 연구는 그래프의 노드 간 연결 구조에 집중하고 있으며, 노드 간 간선에 임의의 가중치를 갖는 가중 그래프에 대한 임베딩 기법에 대해서는 많은 연구가 진행되지 않았다. 따라서 본 논문에서는 가중 그래프를 위한 새로운 임베딩 기법을 제안한다. 제안 기법은 가중 그래프가 주어지면 먼저 해당 그래프의 내부에 존재하는 노드-가중치 시퀀스들을 추출한 다음, LSTM 오토인코더를 사용해 각 시퀀스를 고정된 길이의 벡터로 인코딩한다. 마지막으로 각 그래프의 인코딩 벡터들을 모아 하나의 최종 임베딩 벡터를 생성한다. 이렇게 얻어진 임베딩 벡터는 가중 그래프 간 유사도 측정이나 분류 등에 활용될 수 있다. 여러 유사 가중 그래프 그룹들로 구성된 합성 데이터와 실제 데이터를 이용한 실험을 통해, 제안 기법이 유사 가중 그래프를 탐색하는 데 94% 이상의 정확도를 보임을 확인하였다.
영문내용
(English Abstract)
Graph embedding is the representation of graphs as vectors in a low-dimensional space. Recently, research on graph embedding using deep learning technology has been conducted. However, most research to date has focused mainly on the topology of nodes, and there are few studies on graph embedding for weighted graphs, which have arbitrary weights on the edges between nodes. Therefore, in this paper, we propose a new graph embedding technique for weighted graphs. Given weighted graphs to be embedded, the proposed technique first extracts node-weight sequences that exist inside the graphs, and then encodes each node-weight sequence into a fixed-length vector using an LSTM (Long Short-Term Memory) autoencoder. Finally, for each graph, the proposed technique combines the encoding vectors of the node-weight sequences extracted from the graph to generate one final embedding vector. The embedding vectors of the weighted graphs obtained by the proposed technique can be used for measuring the similarity between weighted graphs or for classifying weighted graphs. Experiments on synthetic and real datasets consisting of groups of similar weighted graphs showed that the proposed technique achieved more than 94% accuracy in finding similar weighted graphs.
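The pipeline described in the abstract has three steps: extract node-weight sequences from a weighted graph, encode each sequence into a fixed-length vector, and pool the per-sequence encodings into one embedding. The abstract does not specify how sequences are extracted or how encodings are combined, so the sketch below is a minimal illustration under assumptions: sequences come from greedy heaviest-edge walks, pooling is a simple mean, and the LSTM autoencoder itself is replaced by a placeholder `encode` function so the skeleton runs end to end. All names and strategies here are hypothetical, not the paper's actual method.

```python
# Hypothetical skeleton of the described pipeline. The walk strategy,
# sequence length, and mean-pooling step are assumptions; encode() is a
# stand-in for the paper's LSTM autoencoder.
from typing import Dict, List, Tuple

Graph = Dict[str, List[Tuple[str, float]]]  # node -> [(neighbor, weight), ...]

def node_weight_sequences(g: Graph, length: int) -> List[list]:
    """Extract alternating node-weight sequences, one greedy
    heaviest-edge walk per starting node (an assumed strategy)."""
    sequences = []
    for start in g:
        seq: list = [start]
        current = start
        for _ in range(length - 1):
            if not g.get(current):
                break  # dead end: stop the walk early
            nbr, w = max(g[current], key=lambda e: e[1])  # heaviest edge
            seq.extend([w, nbr])
            current = nbr
        sequences.append(seq)
    return sequences

def encode(seq: list) -> List[float]:
    """Placeholder for the LSTM autoencoder: summary statistics of the
    weights, just so the pipeline produces fixed-length vectors."""
    weights = [x for x in seq if isinstance(x, float)]
    n = max(len(weights), 1)
    return [sum(weights) / n, float(len(seq))]

def graph_embedding(g: Graph, length: int = 3) -> List[float]:
    """Mean-pool per-sequence encodings into one embedding vector."""
    encs = [encode(s) for s in node_weight_sequences(g, length)]
    dim = len(encs[0])
    return [sum(e[i] for e in encs) / len(encs) for i in range(dim)]

# Toy weighted graph (made up for illustration)
g: Graph = {
    "a": [("b", 0.9), ("c", 0.1)],
    "b": [("c", 0.5)],
    "c": [],
}
emb = graph_embedding(g)  # one fixed-length vector per graph
```

In the paper, `encode` would be the encoder half of an LSTM autoencoder trained to reconstruct the node-weight sequences; embeddings produced this way could then be compared with, e.g., cosine similarity to find similar weighted graphs.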
키워드(Keyword) 그래프 임베딩, 가중 그래프, LSTM 오토인코더, 딥러닝, 그래프 유사도, graph embedding, weighted graph, LSTM autoencoder, deep learning, graph similarity