Preprint · Brief Report · Version 1 · Preserved in Portico · This version is not peer-reviewed

Enhancing Natural Language to Code Generation in the SantaCoder Model through In-Context Learning

Version 1 : Received: 15 June 2024 / Approved: 17 June 2024 / Online: 17 June 2024 (07:57:08 CEST)

How to cite: Nirmal Joshua, K.; Sreejith, M. Enhancing Natural Language to Code Generation in the SantaCoder Model through In-Context Learning. Preprints 2024, 2024061105. https://doi.org/10.20944/preprints202406.1105.v1

Abstract

Generating executable code from natural language instructions using Large Language Models (LLMs) presents challenges such as semantic understanding and handling ambiguous input. This study focuses on the SantaCoder model and explores the impact of in-context learning on code generation, using the MBPP and HumanEval datasets for evaluation. Our results demonstrate significant improvements in three key metrics (defined in the paper): correctness@k, similarity@k, and pass@k. To address the problem of selecting optimal demonstrations to maximize correctness and pass rates, we investigate two methods in this paper: latent concept selection and random selection. These findings highlight the effectiveness of in-context learning and the critical role of demonstration selection in enhancing the accuracy, efficiency, and versatility of the SantaCoder model in code generation.
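Of the three metrics named above, pass@k is the one with a widely used standard definition; the paper's own correctness@k and similarity@k definitions are not reproduced here. As a point of reference, a minimal sketch of the standard unbiased pass@k estimator from the HumanEval benchmark literature (assuming n generations are sampled per problem, of which c pass the unit tests):

```python
from math import comb


def pass_at_k(n: int, c: int, k: int) -> float:
    """Unbiased pass@k estimator.

    Probability that at least one of k samples drawn (without
    replacement) from n generations is correct, given that c of
    the n generations pass the unit tests.
    """
    if n - c < k:
        # Fewer than k failing samples exist, so any draw of k
        # samples must contain at least one passing generation.
        return 1.0
    # 1 - P(all k drawn samples fail) = 1 - C(n-c, k) / C(n, k)
    return 1.0 - comb(n - c, k) / comb(n, k)
```

For example, with n = 10 generations of which c = 3 pass, pass@1 is simply c/n = 0.3, while pass@5 rises because any one of the five drawn samples passing counts as success.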

Keywords

In Context Learning; Large Language Models; NL2Code; Machine Learning; Latent Concept Learning

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning
