KCC 2021
Korean Title |
Shared Attention Network와 Multi-Query Attention을 이용한 Transformer 모델의 디코더 속도 개선 |
English Title |
Speed Improvement of Transformer Model using Shared Attention Network and Multi-Query Attention |
Author(s) |
최민주 (Minjoo Choi)
김도경 (Dokyoung Kim)
황현선 (Hyunsun Hwang)
이창기 (Changki Lee)
이성민 (Seongmin Lee)
류우종 (Woojong Ryu)
염홍선 (Hongseon Yeom)
김준석 (Junseok Kim)
|
Citation |
Vol. 48, No. 1, pp. 373-375 (June 2021) |
Korean Abstract (translated) |
Among neural machine translation (NMT) models, Transformer-based models show excellent translation quality but have the drawbacks of high memory usage and slow translation speed. To address these drawbacks, this paper improves the decoder structure of the Transformer to increase decoding speed. Building on prior work, the total number of layers is reduced and the elements required by the attention mechanism are shared within and across the decoder layers; by reducing the number of operations and the size of the tensors those operations require, translation speed is improved. |
English Abstract |
|
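The abstract names two sharing schemes: Shared Attention Networks (reusing attention weights across layers) and Multi-Query Attention (a single key/value head shared by all query heads). As a rough illustration of the latter, below is a minimal PyTorch sketch of Multi-Query Attention in the style of Shazeer (2019); the class name, hyperparameters, and self-attention-only setup are illustrative assumptions, not the paper's actual implementation.

    import torch
    import torch.nn as nn

    class MultiQueryAttention(nn.Module):
        # Sketch of Multi-Query Attention: every query head attends with the
        # same single key/value head, so the K/V tensors (and a decoder's
        # incremental K/V cache) shrink by a factor of num_heads.
        # Illustrative only; not the configuration used in the paper.
        def __init__(self, d_model: int, num_heads: int):
            super().__init__()
            assert d_model % num_heads == 0
            self.h = num_heads
            self.d_head = d_model // num_heads
            self.q_proj = nn.Linear(d_model, d_model)      # h query heads
            self.k_proj = nn.Linear(d_model, self.d_head)  # one shared key head
            self.v_proj = nn.Linear(d_model, self.d_head)  # one shared value head
            self.out_proj = nn.Linear(d_model, d_model)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # Causal masking and incremental caching are omitted for brevity.
            b, t, _ = x.shape
            q = self.q_proj(x).view(b, t, self.h, self.d_head).transpose(1, 2)  # (b, h, t, d)
            k = self.k_proj(x)  # (b, t, d): one head, broadcast over all h query heads
            v = self.v_proj(x)  # (b, t, d)
            scores = torch.einsum("bhqd,bkd->bhqk", q, k) / self.d_head ** 0.5
            out = torch.einsum("bhqk,bkd->bhqd", scores.softmax(dim=-1), v)
            return self.out_proj(out.transpose(1, 2).reshape(b, t, self.h * self.d_head))

    # Example: 8 query heads share one 64-dimensional K/V head.
    mqa = MultiQueryAttention(d_model=512, num_heads=8)
    y = mqa(torch.randn(2, 10, 512))  # -> shape (2, 10, 512)

Shrinking the per-layer K/V state is what makes incremental decoding cheaper, which matches the abstract's claim of reducing the size of the tensors the attention computation needs.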
Keywords |
Machine Translation
NMT
Transformer
Shared Attention Networks
Multi-Query Attention
|