Please use this identifier to cite or link to this item:
https://hdl.handle.net/11499/47400
Title: Intent Detection Using Contextualized Deep SemSpace
Authors: Orhan, U.; Tosun, E.G.; Ozkaya, O.
Keywords: Bidirectional long short-term memory; Generalized SemSpace; Intent detection; Natural language understanding; Synset vectors; WordNet
Publisher: Springer Science and Business Media Deutschland GmbH
Abstract: In this study, a new approach called Contextualized Deep SemSpace is proposed for intent detection. First, synset vectors are determined by training the generalized SemSpace method on the WordNet 3.1 data. Then, each word in an intent dataset is transformed into a synset vector by a contextualized approach, and finally, the synset vectors are trained with a deep learning model using a BLSTM. Since the proposed approach adapts the contextualized semantic vectors to the dataset with a deep learning model, it behaves like contextualized deep embedding methods such as BERT, ELMo, and GPT-3. To measure the success of the proposed approach, experiments were carried out on six well-known intent detection benchmark datasets (ATIS, Snips, Facebook, Ask Ubuntu, WebApp, and Chatbot). Although the dependence of its vocabulary on WordNet causes a considerable number of out-of-vocabulary problems, the results showed that the proposed approach is the most successful intent classifier in the literature. According to these results, deep learning-based contextualized synset vectors can be used successfully in many problems. © 2022, King Fahd University of Petroleum & Minerals.
URI: https://doi.org/10.1007/s13369-022-07016-9
     https://hdl.handle.net/11499/47400
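Note: the abstract's pipeline (precomputed synset vectors fed to a bidirectional LSTM for intent classification) can be illustrated with a minimal sketch. The sketch below is not the authors' implementation; the synset-vector dimensionality, hidden size, intent count, and all identifiers are illustrative assumptions, and the SemSpace training and contextualized synset disambiguation steps are not reproduced.

import torch
import torch.nn as nn

SYNSET_DIM = 300   # assumed dimensionality of the SemSpace synset vectors
HIDDEN_DIM = 128   # assumed BLSTM hidden size
NUM_INTENTS = 7    # illustrative number of intent classes

class BLSTMIntentClassifier(nn.Module):
    """Bidirectional LSTM over a sequence of precomputed synset vectors."""
    def __init__(self, input_dim=SYNSET_DIM, hidden_dim=HIDDEN_DIM, num_classes=NUM_INTENTS):
        super().__init__()
        self.blstm = nn.LSTM(input_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, synset_vectors):
        # synset_vectors: (batch, seq_len, input_dim), one vector per word's
        # contextually selected WordNet synset (OOV words need a fallback).
        _, (h_n, _) = self.blstm(synset_vectors)
        # Concatenate the final forward and backward hidden states.
        sentence_repr = torch.cat([h_n[0], h_n[1]], dim=-1)
        return self.classifier(sentence_repr)

if __name__ == "__main__":
    model = BLSTMIntentClassifier()
    dummy_batch = torch.randn(4, 12, SYNSET_DIM)  # 4 utterances, 12 tokens each
    logits = model(dummy_batch)
    print(logits.shape)  # torch.Size([4, 7])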
ISSN: 2193-567X
Appears in Collections:
  Bilgi İşlem Daire Başkanlığı Koleksiyonu
  Scopus İndeksli Yayınlar Koleksiyonu / Scopus Indexed Publications Collection
  WoS İndeksli Yayınlar Koleksiyonu / WoS Indexed Publications Collection