Preprint Article, Version 1 (this version is not peer-reviewed)

Exploring Multimodal Sentiment Analysis Models: A Comprehensive Survey

Version 1: Received: 1 August 2024 / Approved: 2 August 2024 / Online: 2 August 2024 (03:52:26 CEST)

How to cite: Q. Dao, P.; Roantree, M.; B. Nguyen-Tat, T.; M. Ngo, V. Exploring Multimodal Sentiment Analysis Models: A Comprehensive Survey. Preprints 2024, 2024080127. https://doi.org/10.20944/preprints202408.0127.v1

Abstract

The exponential growth of multimodal content across social media platforms, comprising text, images, audio, and video, has catalyzed substantial interest in artificial intelligence, particularly in multimodal sentiment analysis (MSA). This study presents a comprehensive survey of 30 research papers published between 2020 and 2024 by major publishers such as Elsevier, ACM, IEEE, and Springer, as well as other venues indexed in Google Scholar. Our analysis focuses primarily on multimodal fusion techniques and features, with specific emphasis on the integration of text and image data. The article also provides an overview of the evolution, definition, and historical context of MSA, examines its current challenges and potential advantages, and reviews recent datasets and sophisticated models. Furthermore, the study offers insights into prospective research directions and practical recommendations for advancing research and developing more robust MSA models, serving as a resource for both academic and industry researchers engaged in this burgeoning field.
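As a rough illustration of the feature-level (early) fusion the survey refers to, the sketch below concatenates pre-computed text and image embeddings before a small sentiment classifier. It is a generic baseline, not the method of any surveyed paper; the module names, feature dimensions (a 768-d text vector and a 2048-d image vector), and three-class output are all assumptions made for the example.

```python
# Illustrative sketch only: a generic early-fusion baseline for text + image
# sentiment classification. All names and dimensions below are hypothetical.
import torch
import torch.nn as nn

class EarlyFusionSentiment(nn.Module):
    def __init__(self, text_dim=768, image_dim=2048, hidden_dim=256, num_classes=3):
        super().__init__()
        # Project each modality into a shared space, then fuse by concatenation.
        self.text_proj = nn.Linear(text_dim, hidden_dim)
        self.image_proj = nn.Linear(image_dim, hidden_dim)
        self.classifier = nn.Sequential(
            nn.ReLU(),
            nn.Linear(2 * hidden_dim, num_classes),  # fused vector -> sentiment logits
        )

    def forward(self, text_feat, image_feat):
        fused = torch.cat([self.text_proj(text_feat), self.image_proj(image_feat)], dim=-1)
        return self.classifier(fused)

# Example usage with random tensors standing in for encoder outputs
# (e.g., a BERT [CLS] vector and a pooled CNN image feature).
model = EarlyFusionSentiment()
logits = model(torch.randn(4, 768), torch.randn(4, 2048))
print(logits.shape)  # torch.Size([4, 3]) -> e.g., negative / neutral / positive
```

Other fusion strategies discussed in the MSA literature, such as late (decision-level) fusion or attention-based cross-modal fusion, would replace the concatenation step with per-modality classifiers or learned interaction layers.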

Keywords

Multimodal sentiment analysis; fusion techniques; machine learning; deep learning

Subject

Computer Science and Mathematics, Computer Science
