Work place: Department of Computer Science, Faculty of Science and Technology, American International University-Bangladesh, Dhaka, Bangladesh
E-mail: 18-38671-3@student.aiub.edu
Research Interests: World Wide Web, Data Structures and Algorithms
Biography
Md. Ibrahim was born in Dhaka, Bangladesh in 2000. He received a BSc in Computer Science & Engineering, with a major in Information Systems, from the American International University-Bangladesh (AIUB) in 2022.
He is currently serving as an intern at AIUB. His fields of interest are Data Science and Web Development.
By Sabbir Hossain Rahman Sharar Md. Ibrahim Bahadur Abu Sufian Rashidul Hasan Nabil
DOI: https://doi.org/10.5815/ijisa.2023.04.05, Pub. Date: 8 Aug. 2023
The emergence of chatbots over the last 50 years has been driven primarily by the need for a virtual aide. Unlike their human counterparts, chatbots can present themselves instantly, at the user's need and convenience. Whether for something as benign as the need for a friend to talk to or for a case as dire as medical assistance, their utility is ubiquitous. This paper aims to develop one such chatbot, capable not only of analyzing human text (and, in the near future, speech) but also of refining its ability to assist users medically by accumulating data from relevant datasets. Although Recurrent Neural Networks (RNNs) are often used to build chatbots, the vanishing gradient problem that arises during backpropagation, coupled with the cumbersome process of parsing each word sequentially, has led to the increased use of Transformer Neural Networks (TNNs) instead; these parse entire sentences at once while providing context through embeddings, allowing far greater parallelization. Two variants of the Transformer-based BERT (Bidirectional Encoder Representations from Transformers), namely KeyBERT and BioBERT, are used for tagging the keywords in each sentence and for contextual vectorization of the Q/A pairs used in matrix multiplication, respectively. A final GPT-2 (Generative Pre-trained Transformer) layer is applied to refine the BioBERT output into a human-readable form. Such a system could lessen the need for trips to the nearest physician, along with the time and financial resources those trips require.
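As a rough illustration of the pipeline described above (a sketch, not the authors' implementation), the Python snippet below wires together the three named components: KeyBERT tags keywords, BioBERT embeddings of the query are matched against a small Q/A corpus via matrix multiplication, and GPT-2 rephrases the retrieved answer. The model identifiers (dmis-lab/biobert-base-cased-v1.1, gpt2), the toy qa_pairs list, and the mean-pooling and normalization steps are assumptions made only for the sake of a self-contained example.

import torch
from keybert import KeyBERT
from transformers import AutoTokenizer, AutoModel, pipeline

# Illustrative Q/A corpus; the paper's dataset is not reproduced here.
qa_pairs = [
    ("What are common symptoms of the flu?", "Fever, cough, sore throat, and fatigue."),
    ("How can mild dehydration be treated?", "Drink fluids with electrolytes and rest."),
]

kw_model = KeyBERT()                                   # keyword tagging
tokenizer = AutoTokenizer.from_pretrained("dmis-lab/biobert-base-cased-v1.1")
biobert = AutoModel.from_pretrained("dmis-lab/biobert-base-cased-v1.1")
generator = pipeline("text-generation", model="gpt2")  # readable phrasing

def embed(texts):
    # Mean-pooled BioBERT sentence embeddings, L2-normalized so that the
    # dot product below equals cosine similarity.
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = biobert(**batch).last_hidden_state
    mask = batch["attention_mask"].unsqueeze(-1)
    emb = (hidden * mask).sum(1) / mask.sum(1)
    return torch.nn.functional.normalize(emb, dim=1)

def answer(query):
    keywords = kw_model.extract_keywords(query)        # tag keywords in the query
    q_emb = embed([query])                             # 1 x d
    corpus_emb = embed([q for q, _ in qa_pairs])       # n x d
    scores = q_emb @ corpus_emb.T                      # matrix multiplication
    raw_answer = qa_pairs[int(scores.argmax())][1]
    prompt = f"Question: {query}\nAnswer: {raw_answer}\nIn plain terms:"
    return keywords, generator(prompt, max_new_tokens=40)[0]["generated_text"]

print(answer("I have a fever and a cough, what could it be?"))

Normalizing the embeddings means the single matrix multiplication doubles as a cosine-similarity search, mirroring the abstract's description of vectorized Q/A matching; the paper applies GPT-2 specifically to refine BioBERT's output, whereas here it is used off the shelf purely to illustrate that final step.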