Unconstrained Urdu Handwriting Recognition Using Transformers

Welcome to DSpace BU Repository


dc.contributor.author Anita Ishaq, 01-243212-002.
dc.date.accessioned 2023-12-19T04:31:35Z
dc.date.available 2023-12-19T04:31:35Z
dc.date.issued 2023
dc.identifier.uri http://hdl.handle.net/123456789/16843
dc.description Supervised by Dr. Arif Ur Rahman en_US
dc.description.abstract The technique addresses the difficulty of recognising Arabic-like scripts, particularly Urdu, a challenging and understudied script for offline handwriting recognition. The researchers propose an attention-based encoder-decoder approach designed to interpret Urdu contextually. The model uses the transformer architecture, which has been successful in many natural language processing tasks, and relies on attention mechanisms to represent the context-dependent character forms, two-dimensional structure, gaps, overlaps, and diacritics found in Urdu script. Attention allows the model to concentrate on the relevant parts of the input sequence in order to produce precise predictions. A novel localisation penalty is introduced to encourage the model to focus on one location at a time while identifying the next character. This penalty acts as a regularisation term, directing the model to concentrate its attention in one place rather than dispersing it across several, which enables the model to more accurately capture the distinctive traits and structure of the Urdu script. In addition to building the attention-based model, the researchers improve the ground-truth annotations of the only complete and publicly accessible handwritten Urdu dataset. This ensures the dataset is properly labelled and provides the model with reliable training and evaluation data. The model is evaluated on Urdu datasets to gauge the effectiveness of the proposed technique. The results show that the pre-trained attention-based visual-encoder and textual-decoder transformer recognises Urdu script better than plain attention-based transformer models. This indicates that the localisation penalty and the attention-based encoder-decoder model, used together, successfully address the issues unique to Arabic-like scripts, notably Urdu. en_US
dc.language.iso en en_US
dc.publisher Computer Sciences en_US
dc.relation.ispartofseries MS(CS);T-02078
dc.subject Unconstrained en_US
dc.subject Urdu Handwriting en_US
dc.subject Recognition Using Transformers en_US
dc.title Unconstrained Urdu Handwriting Recognition Using Transformers en_US
dc.type Thesis en_US
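The abstract does not give the exact form of the localisation penalty. One common way to realise such a regulariser is an entropy term on the decoder's attention weights: a sharply peaked attention distribution (focused on one location) yields low entropy and a small penalty, while a diffuse one is penalised. The sketch below is illustrative only; the function name and the entropy formulation are assumptions, not the thesis's actual definition.

```python
import numpy as np

def localisation_penalty(attn, eps=1e-12):
    """Mean entropy of the decoder's attention distributions.

    attn: array of shape (steps, positions); each row sums to 1.
    Lower values mean attention is concentrated in one place, which
    is the behaviour the localisation penalty rewards when the term
    is added to the training loss.
    """
    attn = np.clip(attn, eps, 1.0)          # avoid log(0)
    entropy = -(attn * np.log(attn)).sum(axis=1)
    return entropy.mean()

# A focused attention map incurs a smaller penalty than a diffuse one.
focused = np.array([[0.97, 0.01, 0.01, 0.01]])
diffuse = np.full((1, 4), 0.25)
assert localisation_penalty(focused) < localisation_penalty(diffuse)
```

In training, this term would typically be added to the cross-entropy loss with a small weight, so the decoder is nudged, but not forced, to attend to a single region per predicted character.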

