Analysis of Automated Text Generation Using Deep Learning
DOI: https://doi.org/10.53555/cse.v7i4.1592
Keywords: Natural Language Processing, Deep Learning, Chatbots, Gated Recurrent Units, Long Short-Term Memory
Abstract
A chatbot is a computer program that uses artificial intelligence to converse with humans on messaging platforms. The goal of this project is to apply and optimize deep learning techniques to build an efficient chatbot. Many existing chatbots are developed with rule-based techniques, simple machine learning algorithms, or retrieval-based methods, which do not generate good results. In this paper, we compare the performance of three chatbots built using RNN, GRU, and LSTM networks. Such conversational chatbots are widely used by businesses, government organizations, and non-profit organizations.
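As an illustrative sketch of the comparison described in the abstract (not the authors' code), the snippet below builds three otherwise identical next-token models in TensorFlow/Keras that differ only in the recurrent cell: a plain RNN, a GRU, and an LSTM. The vocabulary size, embedding dimension, and hidden-state size are placeholder assumptions.

import tensorflow as tf

VOCAB_SIZE = 10_000   # assumed vocabulary size
EMBED_DIM = 128       # assumed embedding dimension
HIDDEN_DIM = 256      # assumed recurrent state size

def build_model(cell_name: str) -> tf.keras.Model:
    """Build a small next-token language model around the given recurrent cell."""
    cells = {
        "rnn": tf.keras.layers.SimpleRNN,
        "gru": tf.keras.layers.GRU,
        "lstm": tf.keras.layers.LSTM,
    }
    model = tf.keras.Sequential([
        tf.keras.layers.Embedding(VOCAB_SIZE, EMBED_DIM),
        cells[cell_name](HIDDEN_DIM, return_sequences=True),
        tf.keras.layers.Dense(VOCAB_SIZE),  # logits over the vocabulary
    ])
    model.compile(
        optimizer="adam",  # the Adam optimizer (Kingma & Ba, 2014, in the reference list)
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    )
    return model

# Train each variant on the same conversation corpus and compare held-out loss.
models = {name: build_model(name) for name in ("rnn", "gru", "lstm")}

Training all three variants on the same data and comparing held-out loss is one plausible way to carry out the three-way comparison the abstract describes.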
References
Peng H., Parikh A. P., Faruqui M., Dhingra B., and Das D. (2019). “Text generation with exemplar-based adaptive decoding,” arXiv preprint arXiv:1904.04428.
Luong M.-T., Pham H., and Manning C. D. (2015). “Effective approaches to attention-based neural machine translation,” arXiv preprint arXiv:1508.04025.
Sherstinsky A. (2020). “Fundamentals of recurrent neural network (RNN) and long short-term memory (LSTM) network,” Physica D: Nonlinear Phenomena, vol. 404, p. 132306.
Hochreiter S. and Schmidhuber J. (1997). “Long short-term memory,” Neural Computation, vol. 9, no. 8, pp. 1735–1780.
Chung J., Gulcehre C., Cho K., and Bengio Y. (2014). “Empirical evaluation of gated recurrent neural networks on sequence modeling,” arXiv preprint arXiv:1412.3555.
Sutskever I., Vinyals O., and Le Q. V. (2014). “Sequence to sequence learning with neural networks,” arXiv preprint arXiv:1409.3215.
Cho K., van Merriënboer B., Gulcehre C., Bahdanau D., Bougares F., Schwenk H., and Bengio Y. (2014). “Learning phrase representations using RNN encoder-decoder for statistical machine translation,” arXiv preprint arXiv:1406.1078.
Kingma D. P. and Ba J. (2014). “Adam: A method for stochastic optimization,” arXiv preprint arXiv:1412.6980.
Sojasingarayar A. (2020). “Seq2Seq AI chatbot with attention mechanism,” arXiv preprint arXiv:2006.02767.
Graves A., Fernández S., and Schmidhuber J. (2005). “Bidirectional LSTM networks for improved phoneme classification and recognition,” in International Conference on Artificial Neural Networks. Springer, pp. 799–804.
Sundermeyer M., Schlüter R., and Ney H. (2012). “LSTM neural networks for language modeling,” in Interspeech, pp. 194–197.
Bengio Y., Ducharme R., Vincent P., and Janvin C. (2000). “A neural probabilistic language model,” in 13th International Conference on Neural Information Processing Systems, pp. 893–899.
Bengio Y., Boulanger-Lewandowski N., and Pascanu R. (2013). “Advances in optimizing recurrent networks,” in IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). DOI: 10.1109/ICASSP.2013.6639349.
Csaky R. (2019). “Deep learning based chatbot models,” arXiv preprint arXiv:1908.08835.