Preprint Article Version 1 This version is not peer-reviewed

DataLDA: Analyzing Data Quality in Big Data Using Traditional Approaches vs. Latent Dirichlet Allocation - Systematic Review

Version 1 : Received: 30 October 2024 / Approved: 31 October 2024 / Online: 31 October 2024 (15:21:51 CET)

How to cite: Sayeed, S. A.; Kshetri, N.; Rahman, M. M.; Alam, S.; Rahman, A. DataLDA: Analyzing Data Quality in Big Data Using Traditional Approaches vs. Latent Dirichlet Allocation - Systematic Review. Preprints 2024, 2024102596. https://doi.org/10.20944/preprints202410.2596.v1

Abstract

Big data quality concerns ensuring that very large volumes of data are accurate, reliable, and useful. It matters because decisions based on such data shape business strategies, customer experiences, and even public policy. Many researchers study data quality in the big data context. This study analyzes research trends and patterns in the domain of big data quality through a systematic literature review, applying Latent Dirichlet Allocation (LDA) for topic modeling. LDA uncovers hidden themes in large document collections, supporting a clearer understanding of data quality research. The findings show that LDA is effective for topic modeling in big data, highlighting data quality challenges and opportunities, and the review identifies key trends and effective assessment methods. For researchers, applying LDA within a systematic review helps summarize existing studies and expose gaps, guiding future work. Practitioners and future researchers gain practical direction for managing data quality and improving efficiency and decision-making across industries.

Keywords

big data; data quality; topic modeling; latent dirichlet allocation (lda); systematic literature review

Subject

Computer Science and Mathematics, Computer Science
