Enhancing aspect-based financial sentiment analysis through contrastive learning

Authors

Viacheslav Ivanenko, Taras Shevchenko National University of Kyiv

DOI:

https://doi.org/10.30837/ITSSI.2023.25.138

Keywords:

Aspect-based Financial Sentiment Analysis; Contrastive Learning; Text Classification.

Abstract

The subject of research in the article is the specialized application of Aspect-Based Financial Sentiment Analysis (ABFSA), with a focus on the intricate and multifaceted emotional landscape of financial textual data. The study extends the current understanding of sentiment analysis by addressing its limitations and opportunities within a financial context. The purpose of the work is to advance the field of Aspect-Based Financial Sentiment Analysis by developing a more nuanced and effective methodology for analyzing sentiments in financial news. Additionally, the study aims to assess the efficacy of recent advancements in Natural Language Processing (NLP) and machine learning for enhancing ABFSA models. The article deals with the following tasks: Firstly, the study focuses on the rigorous pre-processing of the SEntFiN dataset to make it more amenable to advanced machine learning techniques, specifically contrastive learning methodologies. Secondly, it aims to architect a unified model that integrates state-of-the-art machine learning techniques, including DeBERTa v3, contrastive learning, and LoRA fine-tuning. Lastly, the research critically evaluates the proposed model's performance metrics on the test dataset and compares them with existing methodologies. The following methods are used: Firstly, the study employs pre-processing techniques tailored for the SEntFiN dataset, which is explicitly designed for entity-sensitive sentiment analysis in financial news. Secondly, it utilizes advanced machine learning techniques such as DeBERTa v3 for language model pre-training, contrastive learning for focusing on causal relationships, and LoRA for fine-tuning large language models. Lastly, performance evaluation methods are used to assess the efficacy of the proposed model, including comparisons with existing methodologies in the field. The following results were obtained: The study reveals that the proposed pre-processing framework successfully accommodates the variable number of entities present in financial news, thereby improving the granularity of sentiment classification. Furthermore, the integration of advanced NLP and machine learning techniques significantly enhances the accuracy and efficiency of ABFSA models. Conclusions: The paper concludes that specialized ABFSA methodologies, when augmented with advanced NLP techniques and a robust pre-processing framework, can offer a more nuanced and accurate representation of sentiment in financial narratives. The study lays the groundwork for future research in this nascent yet crucial interdisciplinary field, providing actionable insights for stakeholders ranging from investors to financial analysts.
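The abstract outlines a pipeline that pairs each financial headline with every entity it mentions, then fine-tunes DeBERTa v3 with LoRA adapters under a contrastive objective. The exact architecture, loss formulation, and hyperparameters are given only in the full paper; the Python sketch below is a minimal illustration of how such a combination can be assembled, assuming the Hugging Face transformers and peft libraries, a "headline [SEP] entity" input format, a generic supervised contrastive loss, and an arbitrary checkpoint and toy batch. None of these choices should be read as the authors' implementation.

# Minimal Python sketch (not the authors' code): DeBERTa v3 + LoRA adapters
# plus a supervised contrastive loss for entity-level financial sentiment.
# Assumed dependencies: torch, transformers, peft, sentencepiece.
import torch
import torch.nn as nn
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel
from peft import LoraConfig, get_peft_model

tokenizer = AutoTokenizer.from_pretrained("microsoft/deberta-v3-base")
backbone = AutoModel.from_pretrained("microsoft/deberta-v3-base")
hidden_size = backbone.config.hidden_size

# LoRA: train small low-rank adapters on the attention projections
# instead of updating all backbone weights.
lora_cfg = LoraConfig(r=8, lora_alpha=16, lora_dropout=0.1,
                      target_modules=["query_proj", "value_proj"])
backbone = get_peft_model(backbone, lora_cfg)

classifier = nn.Linear(hidden_size, 3)  # negative / neutral / positive

def encode(headlines, entities):
    # One training example per (headline, entity) pair, so a headline that
    # mentions several entities contributes several examples.
    batch = tokenizer(headlines, entities, padding=True, truncation=True,
                      return_tensors="pt")
    hidden = backbone(**batch).last_hidden_state
    return hidden[:, 0]  # first-token pooled representation

def supervised_contrastive_loss(embeddings, labels, temperature=0.07):
    # Pull together examples sharing a sentiment label, push apart the rest.
    z = F.normalize(embeddings, dim=-1)
    sim = z @ z.T / temperature
    eye = torch.eye(len(labels), dtype=torch.bool)
    sim = sim.masked_fill(eye, float("-inf"))  # drop self-similarity
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~eye
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    per_anchor = -(log_prob * pos_mask).sum(1) / pos_mask.sum(1).clamp(min=1)
    return per_anchor[pos_mask.any(1)].mean()  # anchors with positives only

# One illustrative training step on a toy batch (0 = negative, 2 = positive).
headlines = ["Infosys beats estimates, TCS guidance disappoints",
             "Infosys beats estimates, TCS guidance disappoints",
             "HDFC Bank posts record quarterly profit",
             "Vedanta shares slide on debt worries"]
entities = ["Infosys", "TCS", "HDFC Bank", "Vedanta"]
labels = torch.tensor([2, 0, 2, 0])

emb = encode(headlines, entities)
loss = F.cross_entropy(classifier(emb), labels) + supervised_contrastive_loss(emb, labels)
loss.backward()

In the paper itself the contrastive component follows the causally contrastive (C2L) formulation and the evaluation runs on the pre-processed SEntFiN splits; the generic loss above is only a stand-in to show how the backbone, adapters, and contrastive objective compose.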

Author Biography

Viacheslav Ivanenko, Taras Shevchenko National University of Kyiv

Postgraduate student, Department of Mathematical Informatics

References

Hu, E.J., Shen, Y., Wallis, P., Allen-Zhu, Z., Li, Y., Wang, S., Wang, L. and Chen, W. (2021), "LoRA: Low-Rank Adaptation of Large Language Models", arXiv preprint arXiv:2106.09685. P. 1–26. DOI: https://doi.org/10.48550/arXiv.2106.09685

Choi, S., Jeong, M., Han, H. and Hwang, S. W. (2022), "C2L: Causally Contrastive Learning for Robust Text Classification", Proceedings of the AAAI Conference on Artificial Intelligence. Vol. 36(10). P. 10526–10534. DOI: https://doi.org/10.1609/aaai.v36i10.21296

He, P., Gao, J. and Chen, W. (2021), "DeBERTaV3: Improving DeBERTa Using ELECTRA-Style Pre-Training with Gradient-Disentangled Embedding Sharing", arXiv preprint arXiv:2111.09543. 16 p. DOI: https://doi.org/10.48550/arXiv.2111.09543

Sinha, A., Kedas, S., Kumar, R. and Malo, P. (2022), "SEntFiN 1.0: Entity-Aware Sentiment Analysis for Financial News", Journal of the Association for Information Science and Technology. Vol. 73(9). P. 1314–1335. DOI: https://doi.org/10.1002/asi.24634

Malo, P. et al. (2014), "Good Debt or Bad Debt: Detecting Semantic Orientations in Economic Texts", Journal of the Association for Information Science and Technology. Vol. 65(4). P. 782–796. DOI: https://doi.org/10.48550/arXiv.1307.5336

Maia, M. et al. (2018), "WWW'18 Open Challenge: Financial Opinion Mining and Question Answering", Companion Proceedings of the Web Conference 2018. P. 1941–1942. DOI: https://doi.org/10.1145/3184558.3192301

Loukas, L. et al. (2022), "FiNER: Financial Numeric Entity Recognition for XBRL Tagging", Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). P. 4419–4431. DOI: https://doi.org/10.18653/v1/2022.acl-long.303

Huang, A. H., Wang, H. and Yang, Y. (2022), "FinBERT: A Large Language Model for Extracting Information from Financial Text", Contemporary Accounting Research. P. 806–841. DOI: https://doi.org/10.1111/1911-3846.12832

Zhang, Y. and Zhang, H. (2022), "FinBERT-MRC: Financial Named Entity Recognition Using BERT Under the Machine Reading Comprehension Paradigm", arXiv preprint arXiv:2205.15485. P. 1–19. DOI: https://doi.org/10.48550/arXiv.2205.15485

Ushio, A. and Camacho-Collados, J. (2021), "T-NER: An All-Round Python Library for Transformer-based Named Entity Recognition", Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: System Demonstrations. Association for Computational Linguistics. P. 53–62. DOI: https://doi.org/10.18653/v1/2021.eacl-demos.7

Nguyen, D. N., Cao, S., Nguyen, S. and Dinh, C. (2022), "Multilingual Pretrained Language Model for Financial Domain", 14th International Conference on Knowledge and Systems Engineering (KSE), Nha Trang, Vietnam. P. 1–6. DOI: https://doi.org/10.1109/KSE56063.2022.9953749

Beltagy, I., Lo, K. and Cohan, A. (2019), "SciBERT: A Pretrained Language Model for Scientific Text", Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), Hong Kong, China. Association for Computational Linguistics. P. 3615–3620. DOI: https://doi.org/10.18653/v1/D19-1371

Lee, J. et al. (2020), "BioBERT: A Pre-trained Biomedical Language Representation Model for Biomedical Text Mining", Bioinformatics. Vol. 36(4). P. 1234–1240. DOI: https://doi.org/10.48550/arXiv.1901.08746

Taylor, R. et al. (2022), "Galactica: A Large Language Model for Science", arXiv preprint arXiv:2211.09085. 58 p. DOI: https://doi.org/10.48550/arXiv.2211.09085

Vaswani, A. et al. (2017), "Attention Is All You Need", Advances in Neural Information Processing Systems. Vol. 30. Available at: https://papers.nips.cc/paper_files/paper/2017/hash/3f5ee243547dee91fbd053c1c4a845aa-Abstract.html

Xie, Q., Han, W., Zhang, X., Lai, Y., Peng, M., Lopez-Lira, A. and Huang, J. (2023), "PIXIU: A Large Language Model, Instruction Data and Evaluation Benchmark for Finance", arXiv preprint arXiv:2306.05443. 12 p. DOI: https://doi.org/10.48550/arXiv.2306.05443

Liu, X.Y., Wang, G. and Zha, D. (2023), "FinGPT: Democratizing Internet-scale Data for Financial Large Language Models", arXiv preprint arXiv:2307.10485. 43 p. DOI: https://doi.org/10.48550/arXiv.2307.10485

Zhang, B., Yang, H. and Liu, X.Y. (2023), "Instruct-FinGPT: Financial Sentiment Analysis by Instruction Tuning of General-Purpose Large Language Models", FinLLM Symposium at IJCAI 2023. P. 1–7. DOI: https://doi.org/10.48550/arXiv.2306.12659

Wu, S., Irsoy, O., Lu, S., Dabravolski, V., Dredze, M., Gehrmann, S. and Mann, G. (2023), "BloombergGPT: A Large Language Model for Finance", arXiv preprint arXiv:2303.17564. 76 p. DOI: https://doi.org/10.48550/arXiv.2303.17564

Chi, T. C. and Chen, Y. N. (2018), "CLUSE: Cross-Lingual Unsupervised Sense Embeddings", Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. Association for Computational Linguistics. P. 271–281. DOI: https://doi.org/10.18653/v1/D18-1025

Chen, T., Kornblith, S., Norouzi, M. and Hinton, G. (2020), "A Simple Framework for Contrastive Learning of Visual Representations", Proceedings of the 37th International Conference on Machine Learning (ICML), PMLR. Vol. 119. P. 1597–1607. DOI: https://doi.org/10.48550/arXiv.2002.05709

Gao, T., Yao, X. and Chen, D. (2021), "SimCSE: Simple Contrastive Learning of Sentence Embeddings", Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP). P. 6894–6910. DOI: https://doi.org/10.48550/arXiv.2104.08821

Published

2023-09-30

How to Cite

Ivanenko, V. (2023). Enhancing aspect-based financial sentiment analysis through contrastive learning. INNOVATIVE TECHNOLOGIES AND SCIENTIFIC SOLUTIONS FOR INDUSTRIES, (3(25)), 138–147. https://doi.org/10.30837/ITSSI.2023.25.138

Issue

No. 3 (25) (2023)

Section

MODERN ENTERPRISE MANAGEMENT TECHNOLOGIES