Paper Title : Survey on Different NLP models for Semantic Similarity
ISSN : 2394-2231
Year of Publication : 2021
DOI : 10.29126/23942231/IJCT-v8i3p7
MLA Style: Awate, Raj, Keshav Bajaj, and Anilkumar Gupta. "Survey on Different NLP models for Semantic Similarity." International Journal of Computer Techniques (IJCT), vol. 8, no. 3, May-June 2021, ISSN: 2394-2231, www.ijctjournal.org
APA Style: Awate, R., Bajaj, K., & Gupta, A. (2021). Survey on Different NLP models for Semantic Similarity. International Journal of Computer Techniques (IJCT), 8(3), May-June. ISSN: 2394-2231, www.ijctjournal.org
Abstract
Calculating the semantic similarity of sentences helps in many real-life applications, including automatic grading systems and detecting repeatedly asked questions on Quora, Stack Overflow, and similar platforms. Various machine learning and deep learning techniques are used to measure the semantic similarity of sentences, and this article presents the accuracies, advantages, and disadvantages of models such as the Siamese network, BERT, and T5, applied to different datasets for sentence similarity. This study will help machine learning engineers when fine-tuning and using pretrained models.
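As an illustration of the kind of sentence-similarity computation the surveyed models perform, the following is a minimal sketch using the open-source sentence-transformers library of Reimers and Gurevych [6]; the specific checkpoint name ("all-MiniLM-L6-v2") is an assumption made for demonstration and is not prescribed by the paper.

from sentence_transformers import SentenceTransformer, util

# Load a pretrained Siamese-style bi-encoder
# (the checkpoint choice here is illustrative, not from the paper)
model = SentenceTransformer("all-MiniLM-L6-v2")

sentences = [
    "How do I reset my password?",
    "What is the procedure for changing my password?",
]

# Encode each sentence into a fixed-size embedding vector
embeddings = model.encode(sentences, convert_to_tensor=True)

# Cosine similarity lies in [-1, 1]; values near 1 suggest duplicate questions
score = util.cos_sim(embeddings[0], embeddings[1]).item()
print(f"semantic similarity: {score:.3f}")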
References
[1] Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Łukasz Kaiser, and Illia Polosukhin. 2017. Attention is All You Need. In I. Guyon, U. V. Luxburg, S. Bengio, H. Wallach, R. Fergus, S. Vishwanathan, and R. Garnett, editors, Advances in Neural Information Processing Systems 30, pages 5998–6008.
[2] Alexis Conneau, Douwe Kiela, Holger Schwenk, Loïc Barrault, and Antoine Bordes. 2017. Supervised Learning of Universal Sentence Representations from Natural Language Inference Data. In Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, pages 670–680, Copenhagen, Denmark. Association for Computational Linguistics.
[3] Devlin, Jacob, et al. "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding." arXiv preprint arXiv:1810.04805 (2018).
[4] Zaheer, Manzil, et al. "Big Bird: Transformers for Longer Sequences." arXiv preprint arXiv:2007.14062 (2020).
[5] Raffel, Colin, et al. "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer." arXiv preprint arXiv:1910.10683 (2019).
[6] Reimers, Nils, and Iryna Gurevych. "Sentence-BERT: Sentence Embeddings Using Siamese BERT-Networks." arXiv preprint arXiv:1908.10084 (2019).
Keywords
transformers, semantic similarity