

Title: BERT-PRF: An Efficient Approach for Intent Detection from Users Search Query
Author(s): Tangina Sultana, Young-Koo Lee
Citation: Vol. 48, No. 2, pp. 120-122 (Dec. 2021)
Abstract:
The exponentially growing number of documents on the web demands an efficient approach to retrieving the documents a user needs through intelligent analysis of the user's search query. However, users' queries are ambiguous because of the characteristics of natural language (NL). Therefore, understanding the intent behind a user's query helps a modern search engine retrieve more relevant documents. Intent detection suffers from the lack of sufficient labeled data. Currently, BERT (Bidirectional Encoder Representations from Transformers) has received great attention for processing a wide variety of NL tasks after fine-tuning, yet little research work exists that explores BERT for intent detection. In this study, we propose a novel approach named BERT-PRF for detecting the intent of a user's search query by using BERT and Pseudo-Relevance Feedback (PRF). We semantically analyze the query using BERT and improve the performance of the system using PRF. Experimental results affirm that our proposed approach outperforms the existing scheme in terms of accuracy for intent detection.
Keywords: BERT, fine-tuning, intent detection, natural language, PRF
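The abstract describes combining a fine-tuned BERT classifier with pseudo-relevance feedback (PRF) for detecting the intent of a search query. The Python sketch below only illustrates that general idea and is not the authors' implementation: the intent label set, the toy document collection, the BM25-based PRF expansion step, and the use of an off-the-shelf bert-base-uncased checkpoint (which would need fine-tuning on labeled queries in practice) are all assumptions made for the example.

# Hypothetical sketch (not the paper's code): classify the intent of a query
# after expanding it with a simple pseudo-relevance feedback (PRF) step.
from collections import Counter

import torch
from rank_bm25 import BM25Okapi
from transformers import BertForSequenceClassification, BertTokenizerFast

# Toy corpus standing in for the retrieval collection that PRF draws from.
DOCS = [
    "buy cheap laptop online store discount",
    "laptop repair service near me",
    "python tutorial for beginners programming",
    "weather forecast tomorrow rain",
]
# Assumed intent label set; the paper's label inventory may differ.
INTENT_LABELS = ["transactional", "informational", "navigational"]

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=len(INTENT_LABELS)
)  # in practice this checkpoint would first be fine-tuned on labeled queries
model.eval()

bm25 = BM25Okapi([d.split() for d in DOCS])


def prf_expand(query: str, top_k: int = 2, n_terms: int = 3) -> str:
    """Expand the query with frequent unseen terms from the top-k retrieved docs."""
    scores = bm25.get_scores(query.split())
    ranked = sorted(range(len(DOCS)), key=lambda i: -scores[i])[:top_k]
    counts = Counter(
        t for i in ranked for t in DOCS[i].split() if t not in query.split()
    )
    expansion = [t for t, _ in counts.most_common(n_terms)]
    return query + " " + " ".join(expansion)


def detect_intent(query: str) -> str:
    """Classify the PRF-expanded query with the BERT sequence classifier."""
    expanded = prf_expand(query)
    inputs = tokenizer(expanded, return_tensors="pt", truncation=True, max_length=64)
    with torch.no_grad():
        logits = model(**inputs).logits
    return INTENT_LABELS[logits.argmax(dim=-1).item()]


print(detect_intent("cheap laptop"))

Here PRF simply appends the most frequent unseen terms from the top-ranked documents to the query before classification; the feedback mechanism actually used in the paper may differ.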