
Conference Proceedings


KSC 2019


Korean Title: LSTM (Long Short Term Memory) Neural Nodes for Spatio-Temporal Analysis
English Title: Long Short Term Memory (LSTM) Neural Nodes for Spatio-Temporal Analysis
Authors: Muhammad Ishfaq Hussain, Jiwon Jun, Jongmin Yu, Ahmad Muqeem Sheri, Moongu Jeon
Citation: Vol. 46, No. 2, pp. 730-732 (Dec. 2019)
Korean Abstract:

English Abstract:
Many learning tasks require both temporal and spatial analysis of data. Neural networks are powerful learning algorithms that produce strong results across a wide range of supervised and unsupervised machine learning tasks, and the LSTM architecture in particular was developed to process long temporal sequences of data and predict their outputs. In this paper, the objective is to demonstrate significant results on the "Distractor Sequence Recall" task by using LSTM nodes both internally and as network-level control nodes. This approach is inspired by the Hierarchical Temporal Memory (HTM) structure. By decomposing the LSTM neural node, the network is trained on the "Distractor Sequence Recall" task, in which a temporal sequence of randomly chosen target and distractor symbols is presented in random order, followed at the end by prompt symbols that direct the network to produce the target symbols as output, regardless of their order in the sequence.
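The abstract's description of the task can be made concrete with a small data generator. The sketch below builds one "Distractor Sequence Recall" trial as described: target and distractor symbols in random order, followed by prompt symbols that ask the network to reproduce the targets. The symbol names (`T*`, `D*`, `P`), counts, and alphabet size are illustrative assumptions, not details taken from the paper.

```python
import random

def make_distractor_recall_trial(n_targets=2, n_distractors=4, n_symbols=4, seed=None):
    """Build one Distractor Sequence Recall trial (illustrative sketch).

    Returns (inputs, expected): the input symbol sequence fed to the
    network, and the target symbols it must emit after the prompts.
    """
    rng = random.Random(seed)
    # Randomly chosen target and distractor symbols.
    targets = [f"T{rng.randrange(n_symbols)}" for _ in range(n_targets)]
    distractors = [f"D{rng.randrange(n_symbols)}" for _ in range(n_distractors)]
    # Targets and distractors appear in random order within the sequence.
    body = targets + distractors
    rng.shuffle(body)
    # Prompt symbols at the end direct the network to produce its output.
    prompts = ["P"] * n_targets
    inputs = body + prompts
    # The network must recall the targets regardless of sequence order.
    expected = targets
    return inputs, expected
```

A trial with the defaults yields an 8-symbol input (2 targets and 4 distractors shuffled together, then 2 prompts) and a 2-symbol expected output; an LSTM-based model would be trained to map `inputs` to `expected`.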
Keywords: