
정보과학회 컴퓨팅의 실제 논문지 (KIISE Transactions on Computing Practices)


Korean Title: GPGPU를 활용한 인공신경망 예측기반 텍스트 압축기법
English Title: Neural Predictive Coding for Text Compression Using GPGPU
Authors: 김재주 (Jaeju Kim), 한환수 (Hwansoo Han)
Citation: Vol. 22, No. 3, pp. 127-132 (Mar. 2016)
Korean Abstract (translated): Several algorithms have been studied that apply artificial neural networks to compression in order to achieve higher compression performance. However, because the available hardware had limited computational capability, these algorithms could only use small neural networks, and the files they were applied to were also too small for practical use. This paper presents a text-context-based character occurrence probability predictor, trained by exploiting the computational power of GPGPUs, together with a transformation method that improves the performance of Huffman coding. Experiments were carried out on a feedforward neural network and a GRU recurrent neural network; the recurrent model showed better prediction accuracy and compression ratio than the feedforward network.
English Abstract: Several methods have previously been proposed to apply artificial neural networks to text compression. However, both the networks and the target files were limited to small sizes by the hardware capability of the time. Modern GPUs now offer calculation capability an order of magnitude beyond that of CPUs, even though CPUs have also become faster, so it is possible to train larger and more complex neural networks in a shorter time. This paper proposes a method to transform the distribution of the original data with a probabilistic neural predictor. Experiments were performed on a feedforward neural network and a recurrent neural network with gated recurrent units. The recurrent neural network model outperformed the feedforward network in compression ratio and prediction accuracy.
Keywords: artificial neural network, feedforward neural network, recurrent neural network, text compression, natural language compression, Huffman coding, entropy coding, batch normalization