A multi-task learning framework for sentiment analysis and news classification for a low-resource language
DOI: https://doi.org/10.4314/jobasr.v4i2.24

Keywords: Multi-Task Learning, AfriBERTa, Hausa, Sentiment Analysis, Low-Resource NLP

Abstract
Despite the growing progress in Natural Language Processing (NLP), low-resource languages such as Hausa remain underrepresented in model development and evaluation. This paper presents a shared-encoder multi-task learning (MTL) framework based on AfriBERTa for joint Hausa sentiment analysis and news topic classification. The model learns both tasks simultaneously using Hausa subsets of the NaijaSenti and MasakhaNEWS datasets. Seven structured experiments were conducted, systematically evaluating optimization strategies including class weighting, encoder freezing, gradual unfreezing, and dynamic loss weighting. The best sentiment result was achieved in Experiment 5 (F1 = 80.7%) through encoder freezing and class rebalancing, while the best news classification result was achieved in Experiment 7 (F1 = 89.0%) through combined optimization with seed variation. The final shared-encoder model achieves a Macro-F1 of 74.0% and a Harmonic-F1 of 72.3%, substantially outperforming the FonMTL dual-encoder baseline in task balance (+27.2 Macro-F1 points; +64.1 Harmonic-F1 points). Although task interference was observed, the results confirm the viability of shared-encoder MTL for heterogeneous tasks in low-resource African language NLP and provide practical guidance for adapting multilingual models in such settings.
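The shared-encoder design described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: a random linear map stands in for the AfriBERTa encoder, and the hidden size, class counts, and the loss-weighting parameter `alpha` are all assumptions. It shows the core idea — one shared representation feeding two task-specific heads, trained with a weighted sum of per-task losses (the quantity the class-weighting and dynamic loss-weighting experiments tune).

```python
import numpy as np

rng = np.random.default_rng(0)

HIDDEN = 16          # stand-in for the encoder's hidden size (assumed)
N_SENTIMENT = 3      # e.g. positive / neutral / negative (NaijaSenti)
N_TOPICS = 7         # MasakhaNEWS topic count (assumed for illustration)

# Shared "encoder": a single projection used by BOTH tasks
# (in the paper this role is played by AfriBERTa).
W_shared = rng.normal(size=(HIDDEN, HIDDEN))

# Task-specific classification heads on top of the shared representation
W_sent = rng.normal(size=(HIDDEN, N_SENTIMENT))
W_news = rng.normal(size=(HIDDEN, N_TOPICS))

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def forward(x):
    """One shared encoding feeds two task heads."""
    h = np.tanh(x @ W_shared)                    # shared representation
    return softmax(h @ W_sent), softmax(h @ W_news)

def joint_loss(p_sent, y_sent, p_news, y_news, alpha=0.5):
    """Weighted sum of per-task cross-entropies; alpha is the kind of
    loss weight that dynamic weighting schemes adjust during training."""
    ce_s = -np.log(p_sent[np.arange(len(y_sent)), y_sent]).mean()
    ce_n = -np.log(p_news[np.arange(len(y_news)), y_news]).mean()
    return alpha * ce_s + (1 - alpha) * ce_n

# A batch of 4 pooled sentence embeddings (random stand-ins)
x = rng.normal(size=(4, HIDDEN))
p_sent, p_news = forward(x)
loss = joint_loss(p_sent, np.array([0, 1, 2, 1]),
                  p_news, np.array([0, 3, 6, 2]))
```

Encoder freezing, in this picture, corresponds to holding `W_shared` fixed while updating only the heads; gradual unfreezing releases it later in training.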
License

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.