WANG He, LU Yiwang, LIU Xiapu, HUANG Hailiang. Large language model-driven multi-topic and multi-level sentiment analysis framework for futures market news[J]. Journal of Beijing Normal University (Natural Science), 2025, 61(6): 751-757. DOI: 10.12202/j.0476-0301.2025142

Large language model-driven multi-topic and multi-level sentiment analysis framework for futures market news

To meet the need for systematic, in-depth mining of the complex sentiment signals in futures market news, this paper proposes a large language model-driven multi-topic and multi-level sentiment analysis framework. A hierarchical "macro topic to specific aspect/event" strategy is applied to construct a topic system that covers multi-dimensional market elements and to discriminate topic-level sentiment accurately. Aspect-sentiment-opinion triplet extraction and event-level sentiment analysis are incorporated to identify the sentiment impact of key market elements and unexpected events. Low-rank adaptation is employed for efficient domain adaptation of a locally deployed large language model, validating the applicability of large language models to financial text analysis. The proposed framework is found to perform well across multiple sentiment analysis tasks, effectively distinguishing the sentiment tendencies of different topics and mining the complex sentiment signals embedded in news texts. This work provides a systematic solution for in-depth sentiment analysis of futures market news and offers reliable technical support for quantitative analysis of the impact of news sentiment on the futures market.
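The abstract does not give implementation details for the aspect-sentiment-opinion triplet extraction step. The sketch below shows one plausible way to prompt a locally deployed LLM for such triplets from a futures-news sentence; the prompt wording, the JSON output schema, and the `llm` callable are illustrative assumptions, not the authors' code.

```python
# Sketch: prompt-based aspect-sentiment-opinion triplet extraction.
# Prompt text and output schema are assumptions for illustration only.
import json
from typing import Callable, Dict, List

TRIPLET_PROMPT = (
    "You are a futures-market news analyst. Extract every "
    "(aspect, sentiment, opinion) triplet from the news text below.\n"
    "Sentiment must be one of: positive, negative, neutral.\n"
    "Return a JSON list of objects with keys: aspect, sentiment, opinion.\n\n"
    "News: {news}\nJSON:"
)

def extract_triplets(news: str, llm: Callable[[str], str]) -> List[Dict[str, str]]:
    """Query an LLM (any callable mapping prompt -> completion) and parse triplets."""
    raw = llm(TRIPLET_PROMPT.format(news=news))
    try:
        return json.loads(raw)
    except json.JSONDecodeError:
        return []  # fall back to an empty result on malformed model output

if __name__ == "__main__":
    # Stub LLM for demonstration; in practice `llm` would wrap a local model endpoint.
    stub = lambda _: ('[{"aspect": "iron ore supply", "sentiment": "negative", '
                      '"opinion": "port inventories surged"}]')
    print(extract_triplets("Iron ore futures slid as port inventories surged.", stub))
```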
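For the low-rank adaptation step, the paper does not disclose its base checkpoint or fine-tuning configuration. A minimal sketch using the PEFT library follows; the model name, rank, and target modules are assumed values chosen only to illustrate how LoRA adapters are attached to a local LLM before supervised fine-tuning on domain-labeled futures-news data.

```python
# Sketch: low-rank adaptation (LoRA) of a locally hosted LLM via PEFT.
# BASE_MODEL, r, lora_alpha, and target_modules are illustrative assumptions.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, TaskType, get_peft_model

BASE_MODEL = "Qwen/Qwen2-7B-Instruct"  # placeholder local checkpoint

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
model = AutoModelForCausalLM.from_pretrained(BASE_MODEL)

lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=16,                      # low-rank dimension; keeps added parameters small
    lora_alpha=32,             # scaling factor applied to the LoRA update
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # attention projections commonly adapted
)

model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only the low-rank adapter weights are trainable
# Fine-tuning on prompt/response pairs built from labeled futures-news examples
# would then proceed with a standard supervised training loop (e.g., transformers.Trainer).
```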
