Preprint Article, Version 1. This version is not peer-reviewed.

Context-Aware and Task-Specific Prompting with Iterative Refinement for Historical Texts

Version 1 : Received: 6 October 2024 / Approved: 7 October 2024 / Online: 8 October 2024 (08:38:17 CEST)

How to cite: Zhang, J. Context-Aware and Task-Specific Prompting with Iterative Refinement for Historical Texts. Preprints 2024, 2024100470. https://doi.org/10.20944/preprints202410.0470.v1

Abstract

The advent of Large Language Models (LLMs) has significantly advanced natural language processing (NLP), yet their application to historical texts remains challenging due to archaic language, distinct terminologies, and varied contextual backgrounds. This study introduces Historical Domain Large Language Models, designed to bridge this gap by adapting LLMs for better comprehension and processing of historical data. Our approach leverages context-aware and task-specific prompts to enhance model performance on tasks such as named entity recognition (NER), sentiment analysis, and information extraction within historical contexts. We propose an iterative refinement process to continuously improve prompt quality and model outputs. Instruction tuning on newly collected evaluation data confirms the efficacy of our methods while avoiding biases inherited from previously used datasets. Evaluations using GPT-4 demonstrate significant improvements in handling historical texts, underscoring the potential of our approach to unlock deeper insights from historical data. This work highlights the importance of tailored LLM adaptations for specialized domains, offering a robust framework for future research in historical NLP.
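The abstract describes the core loop at a high level, but the paper's prompt templates are not reproduced on this page. As a rough illustration, the sketch below shows one way a context-aware, task-specific NER prompt with an iterative critique-and-refine loop could be wired to a GPT-4-style chat API. The prompt wording, the function names (build_ner_prompt, iterative_ner), and the stopping rule are assumptions made for illustration, not the authors' implementation.

```python
# Minimal sketch (not the authors' released code): context-aware, task-specific
# prompting with iterative refinement for NER on a historical passage.
# Prompt text and loop structure are illustrative assumptions.

from typing import List
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def call_llm(prompt: str) -> str:
    """Send a single-turn prompt to a chat model and return its text reply."""
    resp = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content

def build_ner_prompt(passage: str, era: str, feedback: List[str]) -> str:
    # Context-aware: name the historical period and warn about archaic usage.
    # Task-specific: constrain the output to a fixed NER label set and format.
    prompt = (
        f"You are reading a {era} document. Spelling, titles, and place names "
        "may be archaic.\n"
        "Task: extract named entities and label each as PERSON, PLACE, "
        "ORGANIZATION, or DATE. Return one 'entity -> label' pair per line.\n"
        f"Passage:\n{passage}\n"
    )
    if feedback:  # fold critiques from earlier rounds back into the prompt
        prompt += "Corrections from previous attempts:\n" + "\n".join(feedback)
    return prompt

def iterative_ner(passage: str, era: str, rounds: int = 3) -> str:
    """Refine the extraction over several rounds using model self-critique."""
    feedback: List[str] = []
    answer = ""
    for _ in range(rounds):
        answer = call_llm(build_ner_prompt(passage, era, feedback))
        critique = call_llm(
            "List any entities that are missing, mislabeled, or anachronistic "
            "in this extraction. Reply 'OK' if there are none.\n"
            f"Passage:\n{passage}\nExtraction:\n{answer}"
        )
        if critique.strip().upper() == "OK":
            break  # the critique pass found nothing to fix
        feedback.append(critique)
    return answer
```

In this sketch the same model both extracts and critiques; a second model or a human reviewer could equally supply the feedback that is folded into the next prompt, which is the general shape of the refinement process the abstract describes.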

Keywords

Iterative Refinement; Natural Language Processing

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning
