Explore open access research and scholarly works from STORE - University of Staffordshire Online Repository


Transforming Language Translation: A Deep Learning Approach to Urdu–English Translation

Safder, Iqra, Abu Bakar, Muhammad, Zaman, Farooq, Waheed, Hajra, Aljohani, Naif Radi, Nawaz, Raheel and Hassan, Saeed Ul (2024) Transforming Language Translation: A Deep Learning Approach to Urdu–English Translation. Journal of Ambient Intelligence and Humanized Computing. ISSN 1868-5137

Full text not available from this repository.
Official URL: http://dx.doi.org/10.1007/s12652-024-04839-2

Abstract or description

Machine translation has revolutionized the field of language translation in the last decade. Initially dominated by statistical models, the field has shifted with the rise of deep learning, and neural networks, particularly Transformer models, have taken the lead. These models have demonstrated exceptional performance in natural language processing tasks, surpassing traditional sequence-to-sequence models such as RNNs, GRUs, and LSTMs. With advantages such as better handling of long-range dependencies and shorter training times, the NLP community has shifted towards Transformers for sequence-to-sequence tasks. In this work, we leverage a sequence-to-sequence Transformer model to translate Urdu (a low-resource language) to English. Our model is based on a Transformer variant with modifications including activation dropout, attention dropout, and final-layer normalization. We used four different datasets (UMC005, Tanzil, The Wire, and PIB) from two categories (religious and news) to train our model. The results demonstrate that the model's performance and translation quality varied depending on the dataset used for fine-tuning. Our model outperformed the baseline models with scores of 23.9 BLEU, 0.46 chrF, 0.44 METEOR, and 60.75 TER. The enhanced performance is attributable to meticulous parameter tuning, encompassing modifications in architecture and optimization techniques. Comprehensive parametric details regarding model configurations and optimizations are provided to elucidate the distinctiveness of our approach and how it surpasses prior works. We provide source code via GitHub for future studies. © The Author(s), under exclusive licence to Springer-Verlag GmbH Germany, part of Springer Nature 2024.
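The abstract reports translation quality with standard MT metrics (BLEU, chrF, METEOR, TER). As a rough illustration of how the headline BLEU figure is computed, the sketch below implements sentence-level BLEU with modified n-gram precision and a brevity penalty in plain Python. It is a simplified, hypothetical illustration, not the paper's actual evaluation pipeline (which would typically use a standard toolkit such as sacreBLEU over a full test corpus).

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """Count n-grams of length n in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def sentence_bleu(reference, candidate, max_n=4):
    """Simplified sentence-level BLEU (uniform weights, single reference).

    `reference` and `candidate` are lists of tokens; returns a float in [0, 1].
    Not the paper's evaluation code -- an illustrative sketch only.
    """
    precisions = []
    for n in range(1, max_n + 1):
        cand_counts = ngrams(candidate, n)
        ref_counts = ngrams(reference, n)
        # Modified precision: clip candidate counts by reference counts.
        overlap = sum((cand_counts & ref_counts).values())
        total = max(sum(cand_counts.values()), 1)
        precisions.append(overlap / total)
    if min(precisions) == 0:
        return 0.0  # geometric mean is zero if any precision is zero
    log_mean = sum(math.log(p) for p in precisions) / max_n
    # Brevity penalty discourages overly short candidates.
    if len(candidate) >= len(reference):
        bp = 1.0
    else:
        bp = math.exp(1 - len(reference) / max(len(candidate), 1))
    return bp * math.exp(log_mean)

# A perfect match scores 1.0; any divergence lowers the score.
ref = "the model translates urdu text into english".split()
print(sentence_bleu(ref, ref))  # → 1.0
```

Corpus-level BLEU (as reported in the paper, e.g. 23.9 on a 0–100 scale) aggregates n-gram counts over all test sentences before computing precisions, rather than averaging per-sentence scores.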

Item Type: Article
Faculty: Executive
Depositing User: Raheel Nawaz
Date Deposited: 13 Sep 2024 11:35
Last Modified: 13 Sep 2024 11:35
URI: https://eprints.staffs.ac.uk/id/eprint/8439
