

Journal of KIISE (정보과학회논문지)


Korean Title: Bidirectional LSTM CRF 기반의 개체명 인식을 위한 단어 표상의 확장
English Title: Expansion of Word Representation for Named Entity Recognition Based on Bidirectional LSTM CRFs
Authors: Hongyeon Yu (유홍연), Youngjoong Ko (고영중)
Citation: Journal of KIISE, Vol. 44, No. 3, pp. 306-313 (March 2017)
Korean Abstract
Named entity recognition refers to extracting named entities that carry specific meanings, such as person names, organization names, place names, times, and dates, from a document and determining their types. The model that has recently shown the best performance in named entity recognition research is the Bidirectional LSTM CRFs model. Such LSTM-based deep learning models depend on the word representations given as input, so much research has been devoted to expanding word representations to better represent the input words. In this paper, we use a Bidirectional LSTM CRFs model for Korean named entity recognition and expand the word representation used as its input with pre-trained word embedding vectors, part-of-speech embedding vectors, word embedding vectors expanded from syllables, and named entity dictionary feature vectors. The final expanded word representation yields a performance improvement of 8.05%p over using only the pre-trained word embedding vectors.
English Abstract
Named entity recognition (NER) seeks to locate and classify named entities in text into pre-defined categories such as names of persons, organizations, and locations, expressions of time, etc. Recently, many state-of-the-art NER systems have been implemented with bidirectional LSTM CRFs. Deep learning models based on long short-term memory (LSTM) generally depend on word representations as input. In this paper, we propose an approach that expands the word representation by using pre-trained word embeddings, part-of-speech (POS) tag embeddings, syllable embeddings, and named entity dictionary feature vectors. Our experiments show that the proposed approach produces useful word representations as input to bidirectional LSTM CRFs, and the final model performs 8.05%p better than a baseline NER system that uses only the pre-trained word embedding vectors.
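The abstracts describe the core idea: the input to the bidirectional LSTM CRF is a word representation built by concatenating pre-trained word embeddings, POS-tag embeddings, a syllable-based embedding, and named entity dictionary feature vectors. The PyTorch sketch below is not the authors' code; the class names, dimensions, and the syllable-level BiLSTM composition are illustrative assumptions, and the CRF output layer is omitted. It only shows one plausible way to assemble such an expanded representation and feed it to a sentence-level BiLSTM.

```python
import torch
import torch.nn as nn


class ExpandedWordRepresentation(nn.Module):
    """Concatenates several views of each input word into one vector (sketch)."""

    def __init__(self, word_vocab, pos_vocab, syl_vocab, dict_feat_dim,
                 word_dim=100, pos_dim=25, syl_dim=30, syl_hidden=25):
        super().__init__()
        # In practice the word table would be loaded from pre-trained vectors,
        # e.g. nn.Embedding.from_pretrained(weights); random init stands in here.
        self.word_emb = nn.Embedding(word_vocab, word_dim)
        self.pos_emb = nn.Embedding(pos_vocab, pos_dim)
        self.syl_emb = nn.Embedding(syl_vocab, syl_dim)
        # One plausible syllable-level composition: a small BiLSTM over the
        # syllables of each word, keeping its final hidden states.
        self.syl_lstm = nn.LSTM(syl_dim, syl_hidden,
                                bidirectional=True, batch_first=True)
        self.out_dim = word_dim + pos_dim + 2 * syl_hidden + dict_feat_dim

    def forward(self, word_ids, pos_ids, syl_ids, dict_feats):
        # word_ids, pos_ids : (batch, seq_len)
        # syl_ids           : (batch, seq_len, max_syllables_per_word)
        # dict_feats        : (batch, seq_len, dict_feat_dim) NE-dictionary flags
        b, t, s = syl_ids.shape
        syl = self.syl_emb(syl_ids).view(b * t, s, -1)
        _, (h, _) = self.syl_lstm(syl)                    # h: (2, b*t, hidden)
        syl_vec = torch.cat([h[0], h[1]], dim=-1).view(b, t, -1)
        return torch.cat([self.word_emb(word_ids),
                          self.pos_emb(pos_ids),
                          syl_vec,
                          dict_feats.float()], dim=-1)


class SentenceBiLSTM(nn.Module):
    """Sentence-level BiLSTM producing per-token emission scores.
    A CRF layer (omitted here) would decode the final tag sequence."""

    def __init__(self, in_dim, hidden=100, num_tags=13):
        super().__init__()
        self.lstm = nn.LSTM(in_dim, hidden, bidirectional=True, batch_first=True)
        self.emit = nn.Linear(2 * hidden, num_tags)

    def forward(self, x):                  # x: (batch, seq_len, in_dim)
        out, _ = self.lstm(x)
        return self.emit(out)              # (batch, seq_len, num_tags)


# Hypothetical usage: vocabulary sizes and feature width are placeholders.
rep = ExpandedWordRepresentation(word_vocab=10000, pos_vocab=45,
                                 syl_vocab=2000, dict_feat_dim=4)
tagger = SentenceBiLSTM(rep.out_dim)
```

In the paper's setting, a CRF layer on top of these per-token scores would select the globally best tag sequence; the reported 8.05%p gain compares this fully expanded representation against using the pre-trained word embeddings alone.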
Keywords: Named Entity Recognition, Word Representation, Syllable, bi-LSTM-CRFs