Preprint Article, Version 1 (not peer-reviewed)

QA-RAG: Leveraging Question and Answer-based Retrieved Chunk Re-Formatting for Improving Response Quality During Retrieval-augmented Generation

Version 1: Received: 3 July 2024 / Approved: 3 July 2024 / Online: 4 July 2024 (14:19:42 CEST)

How to cite: Roy, K.; Zi, Y.; Shyalika, C.; Prasad, R.; Murali, S.; Palit, V.; Sheth, A. QA-RAG: Leveraging Question and Answer-based Retrieved Chunk Re-Formatting for Improving Response Quality During Retrieval-augmented Generation. Preprints 2024, 2024070376. https://doi.org/10.20944/preprints202407.0376.v1

Abstract

Retrieval-augmented generation (RAG) with large language models (LLMs) has shown potential for addressing issues such as hallucinations and inadequately contextualized responses. A pivotal stage in the RAG process is the retriever, which retrieves chunks based on their semantic similarity to the query. In this study, we advocate for, and provide experimental evidence supporting, integrating and maintaining question-and-answer (QA) formatted databases to improve retrieved-context representations and response quality. Our experiments evaluate the approach on benchmark RAG datasets using standard evaluation metrics and provide comparative analyses against state-of-the-art retrieval methods, showing the potential of our approach.
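
To make the retrieval stage described above concrete, the sketch below contrasts similarity-based retrieval over raw chunks with retrieval over QA-formatted versions of the same content. It is a minimal illustration, not the authors' implementation: the bag-of-words embedding, the example chunks, and the `retrieve` helper are all placeholder assumptions standing in for a dense sentence encoder and a real vector store.

```python
# Minimal illustrative sketch (assumed, not from the paper): retrieval by
# semantic similarity over raw chunks vs. QA-formatted chunks.
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Placeholder embedding: bag-of-words counts (a dense encoder would be used in practice).
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse bag-of-words vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, store: list[str], k: int = 1) -> list[str]:
    # Rank stored entries by similarity to the query and return the top-k.
    q = embed(query)
    return sorted(store, key=lambda c: cosine(q, embed(c)), reverse=True)[:k]

# Hypothetical example data: a raw chunk and its QA-formatted counterpart.
raw_chunks = [
    "RAG pipelines retrieve document chunks and condition the LLM on them.",
]
qa_chunks = [
    "Q: How do RAG pipelines ground LLM responses? "
    "A: They retrieve document chunks and condition the LLM on them.",
]

query = "How does retrieval-augmented generation ground LLM responses?"
print(retrieve(query, raw_chunks))
print(retrieve(query, qa_chunks))  # QA phrasing tends to align more closely with the query form.
```

The intuition this toy example captures is that a query is itself a question, so a chunk reformatted as a question-answer pair sits closer to the query in the similarity space than the same content stated declaratively.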

Keywords

Retrieval-augmented Generation; Information Retrieval; Large Language Models

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning
