DemoCraft: Using In-Context Learning to Improve Code Generation in Large Language Models
Version 2: Received: 30 October 2024 / Approved: 31 October 2024 / Online: 31 October 2024 (08:38:19 CET)
How to cite: Nirmal Joshua, K.; Sreejith, M. DemoCraft: Using In-Context Learning to Improve Code Generation in Large Language Models. Preprints 2024, 2024061105. https://doi.org/10.20944/preprints202406.1105.v2
Abstract
Generating executable code from natural language instructions with Large Language Models (LLMs) poses challenges such as semantic ambiguity and understanding task-specific contexts. To address these issues, we propose DemoCraft, a system that enhances code generation by combining in-context learning and demonstration selection with latent concept learning. Latent concept learning introduces additional concept tokens: trainable embeddings that capture task-specific knowledge. We evaluate our system on two major datasets, MBPP and HumanEval. Our experimental results show that the proposed system achieves an approximately 2x increase in the pass@k metric compared to baseline models. Furthermore, we introduce two novel evaluation metrics, correctness@k and similarity@k, and our empirical studies indicate that our system attains nearly a 3x improvement on these metrics as well.
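For readers unfamiliar with the pass@k metric cited above, it is commonly computed with the unbiased estimator introduced alongside HumanEval (Chen et al., 2021); the abstract does not specify the paper's exact evaluation code, so the sketch below shows only that standard estimator, not DemoCraft's own correctness@k or similarity@k metrics:

```python
import math

def pass_at_k(n: int, c: int, k: int) -> float:
    """Unbiased pass@k estimator: given n generated samples per problem,
    of which c pass the unit tests, estimate the probability that at
    least one of k randomly drawn samples passes."""
    if n - c < k:
        # Fewer failures than the draw budget: some sample must pass.
        return 1.0
    # 1 - P(all k drawn samples fail), computed stably as a product.
    return 1.0 - math.prod(1.0 - k / i for i in range(n - c + 1, n + 1))
```

For example, with 10 samples of which 3 are correct, `pass_at_k(10, 3, 1)` estimates the single-attempt success rate at 0.3.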
Keywords
in-context learning; code generation; latent concept learning; demonstration selection; large language models
Subject
Copyright: This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.