Improving news headline text generation quality through frequent POS-Tag patterns analysis
2023 (English). In: Engineering Applications of Artificial Intelligence, ISSN 0952-1976, E-ISSN 1873-6769, Vol. 125, article id 106718. Article in journal (Refereed). Published.
Abstract [en]
Writing original synthetic content is one of the human abilities that algorithms aspire to emulate. The advent of sophisticated algorithms, especially those based on neural networks, has shown promising results in recent times. A watershed moment came with the introduction of the attention mechanism, which paved the way for transformers, an exciting new architecture in natural language processing. Recent successes in synthetic text generation, such as GPT and BERT, rely on these transformers. Although GPT- and BERT-based models can generate creative text when properly trained on abundant data, the quality of the generated text suffers when only limited data is available. This is especially an issue for low-resource languages, where labeled data remains scarce. In such cases, the generated text often lacks proper sentence structure and is therefore unreadable. This study proposes a post-processing step that improves the quality of text generated by the GPT model. The proposed step is based on the analysis of POS-tagging patterns in the original text and accepts only those GPT-generated sentences whose POS patterns match patterns learned from the data. We use the GPT model to generate English headlines from the Australian Broadcasting Corporation (ABC) news dataset. Furthermore, to assess the applicability of the model to low-resource languages, we also train it on an Urdu news dataset for Urdu headline generation. The experiments presented in this paper on these datasets from high- and low-resource languages show that the quality of generated headlines improves significantly with the proposed headline POS pattern extraction. We evaluate performance through subjective evaluation as well as text generation quality metrics such as BLEU and ROUGE.
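The abstract does not give implementation details, but the core filtering idea can be illustrated with a minimal sketch: extract POS-tag sequences from reference headlines, keep the frequent ones, and accept a generated headline only if its POS sequence matches a learned pattern. The use of NLTK's perceptron tagger and the `min_count` frequency threshold below are assumptions for illustration, not the authors' actual pipeline.

```python
# Minimal sketch of POS-pattern filtering for generated headlines (illustrative only).
from collections import Counter
import nltk

nltk.download("punkt", quiet=True)
nltk.download("averaged_perceptron_tagger", quiet=True)


def pos_pattern(sentence: str) -> tuple:
    """Return the POS-tag sequence of a sentence, e.g. ('NNP', 'VBZ', 'JJ', ...)."""
    tokens = nltk.word_tokenize(sentence)
    return tuple(tag for _, tag in nltk.pos_tag(tokens))


def frequent_patterns(headlines, min_count: int = 5) -> set:
    """Collect POS-tag patterns seen at least `min_count` times in the reference corpus."""
    counts = Counter(pos_pattern(h) for h in headlines)
    return {pattern for pattern, count in counts.items() if count >= min_count}


def accept(generated: str, patterns: set) -> bool:
    """Accept a generated headline only if its POS pattern was learned from the data."""
    return pos_pattern(generated) in patterns


# Toy usage: in practice the reference corpus would be the ABC or Urdu headline dataset.
reference = ["Government announces new tax plan", "Police arrest suspect after chase"]
patterns = frequent_patterns(reference, min_count=1)
print(accept("Council approves new housing policy", patterns))
```

In this sketch, a headline produced by the generator is discarded and regenerated (or simply dropped) whenever its POS sequence falls outside the set of patterns observed in the training headlines, which mirrors the post-processing acceptance criterion described in the abstract.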
Place, publisher, year, edition, pages
Elsevier, 2023. Vol. 125, article id 106718
Keywords [en]
POS tagging, Text generation, Low resource language, Generative pre-trained transformer, Attention mechanism
National Category
Information Systems
Research subject
Computer and Information Sciences, Computer Science, Information Systems
Identifiers
URN: urn:nbn:se:lnu:diva-123315
DOI: 10.1016/j.engappai.2023.106718
ISI: 001043855300001
Scopus ID: 2-s2.0-85165234163
OAI: oai:DiVA.org:lnu-123315
DiVA, id: diva2:1783192
Available from: 2023-07-19 Created: 2023-07-19 Last updated: 2023-08-25 Bibliographically approved