Analysis of Automated Text Generation Using Deep Learning

Authors

  • Manoj Kumar, Department of Computer Science and Engineering, Delhi Technological University, New Delhi, India
  • Arnav Kumar, Department of Computer Science and Engineering, Delhi Technological University, New Delhi, India
  • Abhishek Singh, Department of Computer Science and Engineering, Delhi Technological University, New Delhi, India
  • Ankit Kumar, Department of Computer Science and Engineering, Delhi Technological University, New Delhi, India

DOI:

https://doi.org/10.53555/cse.v7i4.1592

Keywords:

Natural Language Processing, Deep Learning, Chatbots, Gated Recurrent Units, Long Short-Term Memory

Abstract

A chatbot is a computer program that converses with humans on messaging platforms using artificial intelligence. The goal of this project is to apply and optimize deep learning techniques to build an efficient chatbot. Many current chatbots are developed using rule-based techniques, simple machine learning algorithms, or retrieval-based techniques, which do not generate good results. In this paper, we compare the performance of three chatbots built using recurrent neural networks (RNNs), gated recurrent units (GRUs), and long short-term memory (LSTM) networks. Such conversational chatbots are widely used by businesses, government organizations, and non-profit organizations.
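The three-way comparison described in the abstract can be illustrated with a minimal sketch, assuming PyTorch as the framework (the paper does not specify its implementation, and the vocabulary size, embedding dimension, and hidden dimension below are hypothetical placeholders). All three recurrent cells expose the same interface, so a single encoder class can switch between them:

    # A minimal sketch (not the authors' published code) comparing the three
    # architectures from the paper as seq2seq encoders in PyTorch. The
    # vocabulary size and layer dimensions are hypothetical placeholders.
    import torch
    import torch.nn as nn

    VOCAB_SIZE, EMBED_DIM, HIDDEN_DIM = 8000, 128, 256  # assumed hyperparameters

    class Encoder(nn.Module):
        def __init__(self, cell: str):
            super().__init__()
            self.embed = nn.Embedding(VOCAB_SIZE, EMBED_DIM)
            # The three cell types under comparison share one call signature.
            rnn_cls = {"rnn": nn.RNN, "gru": nn.GRU, "lstm": nn.LSTM}[cell]
            self.rnn = rnn_cls(EMBED_DIM, HIDDEN_DIM, batch_first=True)

        def forward(self, token_ids):
            # token_ids: (batch, seq_len); the final hidden state summarizes
            # the input utterance for a downstream decoder.
            _, hidden = self.rnn(self.embed(token_ids))
            return hidden  # LSTM returns (h_n, c_n); RNN/GRU return h_n

    # Run a dummy batch of token ids through each architecture.
    batch = torch.randint(0, VOCAB_SIZE, (4, 12))
    for cell in ("rnn", "gru", "lstm"):
        hidden = Encoder(cell)(batch)
        h_n = hidden[0] if isinstance(hidden, tuple) else hidden
        print(cell, h_n.shape)  # (num_layers=1, batch=4, hidden=256)

In a full seq2seq chatbot, each encoder would be paired with a matching decoder and trained end to end; the point of the sketch is that swapping the cell type is the only change needed to run the comparison.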

Published

2021-04-30

How to Cite

Kumar, M., Kumar, A., Singh, A., & Kumar, A. (2021). Analysis of Automated Text Generation Using Deep Learning. International Journal For Research In Advanced Computer Science And Engineering, 7(4), 01–08. https://doi.org/10.53555/cse.v7i4.1592