
Original scientific article

https://doi.org/10.17559/TV-20230628001154

Emotion Intensity Detection in Online Media: An Attention Mechanism Based Multimodal Deep Learning Approach

Yuanchen Chai ; Sunshine Ruizhi Securities Consulting (Beijing) Co., LTD 100081


Full text: English PDF, 495 KB

pp. 587-595

Downloads: 78



Abstract

With the growing influence of online public opinion, mining opinions and analyzing trends from the massive data produced by online media is important for understanding user sentiment, managing brand reputation, analyzing public opinion, and optimizing marketing strategies. Combining data from multiple perceptual modalities yields more comprehensive and accurate sentiment analysis results. However, multimodal sentiment analysis faces challenges such as data fusion, modal imbalance, and inter-modal correlation. To overcome these challenges, this paper introduces an attention mechanism into multimodal sentiment analysis: it constructs text, image, and audio feature extractors, uses a custom cross-modal attention layer to compute attention weights between the modalities, and finally fuses the attention-weighted features for sentiment classification. Through the cross-modal attention mechanism, the model automatically learns correlations between modalities, dynamically adjusts modal weights, and selectively fuses features from different modalities, thereby improving the accuracy and expressiveness of sentiment analysis.
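The sketch below illustrates the kind of cross-modal attention fusion the abstract describes: per-modality features are attended against the other modalities, and the attention-weighted features are concatenated for classification. All module names, dimensions, and the classifier head are illustrative assumptions made for this example, not the authors' exact architecture.

```python
# Minimal PyTorch sketch of cross-modal attention fusion (illustrative only;
# the paper's actual layer sizes and extractors are not specified here).
import torch
import torch.nn as nn


class CrossModalAttentionFusion(nn.Module):
    """Fuses text, image, and audio features with learned attention weights."""

    def __init__(self, dim: int = 256, num_classes: int = 3):
        super().__init__()
        # Each modality attends to the concatenation of the other two,
        # so the model can learn inter-modal correlations dynamically.
        self.attn = nn.ModuleDict({
            m: nn.MultiheadAttention(dim, num_heads=4, batch_first=True)
            for m in ("text", "image", "audio")
        })
        self.classifier = nn.Linear(3 * dim, num_classes)

    def forward(self, text, image, audio):
        # Inputs: (batch, seq_len, dim) features from per-modality extractors
        # (e.g. a text encoder, a CNN over image regions, an audio encoder).
        feats = {"text": text, "image": image, "audio": audio}
        fused = []
        for name, query in feats.items():
            # Keys/values come from the other modalities; the attention
            # weights decide how much each one contributes.
            others = torch.cat(
                [v for k, v in feats.items() if k != name], dim=1)
            attended, _ = self.attn[name](query, others, others)
            fused.append(attended.mean(dim=1))  # pool over sequence
        return self.classifier(torch.cat(fused, dim=-1))


# Usage with random tensors standing in for real extractor outputs.
model = CrossModalAttentionFusion()
t = torch.randn(8, 20, 256)   # text tokens
v = torch.randn(8, 10, 256)   # image regions
a = torch.randn(8, 30, 256)   # audio frames
logits = model(t, v, a)       # (8, num_classes)
```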

Keywords

attention mechanism; emotion detection; multimodal; online media

Hrčak ID:

314851

URI

https://hrcak.srce.hr/314851

Publication date:

29.2.2024.

Visits: 157 *