Systematic Literature Review and Bibliometric Analysis on Addressing the Vanishing Gradient Issue in Deep Neural Networks for Text Data
University of Ilorin, Nigeria.
Linnaeus University, Faculty of Arts and Humanities, Department of Cultural Sciences. ORCID iD: 0000-0002-0025-118X
Universiti Utara Malaysia, Malaysia.
Universiti Utara Malaysia, Malaysia.
2024 (English). In: Computing and Informatics: 9th International Conference, ICOCI 2023, Kuala Lumpur, Malaysia, September 13–14, 2023, Revised Selected Papers, Part I / [ed] Zakaria N.H.; Mansor N.S.; Husni H.; Mohammed F., Springer Nature, 2024, p. 168-181. Conference paper, Published paper (Refereed).
Abstract [en]

The ability of Deep Neural Networks (DNNs) to learn complex text representations has revolutionized Natural Language Processing (NLP) and several other fields. DNNs are not without challenges, however: the vanishing gradient problem remains a major obstacle. It hinders a network's ability to capture long-term dependencies in text data, limiting how well it can understand context, implied meanings, and semantics, and represent intricate patterns in text. This study aims to address the vanishing gradient problem encountered in DNNs when dealing with text data, whose inherent sparsity and heterogeneity exacerbate the issue and increase computational complexity and processing time. To tackle the problem comprehensively, we will survey the existing literature and conduct a bibliometric analysis to identify potential solutions. The findings will contribute a comprehensive review of the existing literature and suggest effective strategies for mitigating the vanishing gradient problem in the context of NLP tasks. Ultimately, our study will pave the way for further advancements in this area of research.
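
For context on the phenomenon the paper targets: in a vanilla recurrent network, the gradient that reaches early tokens is a product of per-step Jacobians, so when the recurrent weight matrix has spectral radius below one that product shrinks geometrically with sequence length. The NumPy sketch below is not from the paper; it is a minimal toy with arbitrary sizes and scales, included only to make the decay concrete.

import numpy as np

# Toy illustration of the vanishing gradient problem in a vanilla RNN.
# All sizes and scales are arbitrary assumptions for demonstration,
# not taken from the paper.
rng = np.random.default_rng(0)
hidden, seq_len = 32, 100

W = rng.normal(size=(hidden, hidden))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # set spectral radius to 0.9 < 1

h = np.zeros(hidden)
grad = np.eye(hidden)  # accumulated backprop-through-time Jacobian
norms = []

for _ in range(seq_len):
    pre = W @ h + rng.normal(scale=0.1, size=hidden)  # dummy token input
    h = np.tanh(pre)
    # Per-step Jacobian factor: diag(1 - tanh(pre)^2) @ W
    grad = grad @ ((1.0 - h**2)[:, None] * W)
    norms.append(np.linalg.norm(grad))

print(f"gradient norm after 1 step:    {norms[0]:.3e}")
print(f"gradient norm after {seq_len} steps: {norms[-1]:.3e}")
# The norm decays roughly like 0.9**t, so the learning signal from
# tokens far back in the sequence is effectively lost.

Running the sketch shows the gradient norm collapsing by several orders of magnitude over 100 steps, which is exactly why long-range context in text is hard for plain recurrent architectures to learn.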

Place, publisher, year, edition, pages
Springer Nature, 2024. p. 168-181
Series
Communications in Computer and Information Science (CCIS), ISSN 1865-0929, E-ISSN 1865-0937; vol. 2001
Keywords [en]
Deep Neural Network, Natural Language Processing, Text data, Vanishing gradient
National Category
Natural Language Processing
Research subject
Computer and Information Sciences, Computer Science
Identifiers
URN: urn:nbn:se:lnu:diva-129898
DOI: 10.1007/978-981-99-9589-9_13
Scopus ID: 2-s2.0-85185709676
ISBN: 9789819995882 (print)
ISBN: 9789819995899 (print)
OAI: oai:DiVA.org:lnu-129898
DiVA, id: diva2:1864841
Conference
9th International Conference on Computing and Informatics, ICOCI 2023, Kuala Lumpur, 13-14 September 2023
Available from: 2024-06-04. Created: 2024-06-04. Last updated: 2025-02-07. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

Mohammed, Ahmed Taiye
