Preprint Article, Version 1. This version is not peer-reviewed.

Generative AI for Culturally Responsive Assessment in Science: A Conceptual Framework

Version 1: Received: 14 September 2024 / Approved: 17 September 2024 / Online: 17 September 2024 (08:43:20 CEST)

How to cite: Nyaaba, M.; Zhai, X.; Faison, M. Z. Generative AI for Culturally Responsive Assessment in Science: A Conceptual Framework. Preprints 2024, 2024091276. https://doi.org/10.20944/preprints202409.1276.v1

Abstract

This study identifies the core and critical tenets that constitute culturally responsive science assessments (CRSciA) for K-12 education and aligns them with the capabilities of generative AI (GenAI). The tenets were selected based on their prominence in the literature: Indigenous Language, Race and Ethnicity, Religious Beliefs, Family and Community, and Indigenous Knowledge. These tenets informed the development of the GenAI-CRSciA framework and the prompts used to generate CRSciAs aligned with the Next Generation Science Standards (NGSS), which were then compared against standard prompting strategies. The results showed that the GenAI-CRSciA prompting approach outperformed standard prompting strategies in generating CRSciAs, demonstrating its effectiveness in addressing ongoing challenges in science assessment. Although the GenAI-CRSciA framework produced assessments covering a range of cultural backgrounds, it also presents potential challenges, such as compromising core curricular competencies and overgeneralizing students' backgrounds by overlooking subcultural variations within a country or region (for example, differences between urban and rural students, or between ethnic or linguistic groups within the same nation), which can lead to bias and less effective CRSciA outputs. To address this, we recommend that teachers provide comprehensive and specific background information about their students when using GenAI-CRSciA prompting, so that the generated content genuinely reflects students' unique cultural contexts. In larger class settings, teachers could implement comprehensive local data indexing, in which detailed student background information is uploaded and made retrievable for crafting more personalized CRSciAs. We strongly recommend that educators make conscious efforts to validate the generated assessments with input from human experts, including community or family members, educators, and students who are knowledgeable about the relevant cultural nuances. We also suggest that continuous professional development will be necessary to enhance teachers' understanding of these cultural tenets.
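As a rough illustration of the prompting approach described above, the following minimal Python sketch assembles a CRSciA-style prompt from an NGSS performance expectation and teacher-supplied background information organized around the five tenets. The class, function, and field names here are illustrative assumptions, not the authors' implementation; the tenet labels follow the framework named in the abstract.

```python
# Hypothetical sketch: building a GenAI-CRSciA-style prompt from an NGSS
# performance expectation plus teacher-supplied student background details.
# Names and fields are assumptions for illustration only.

from dataclasses import dataclass


@dataclass
class StudentBackground:
    """Background information organized by the five CRSciA tenets."""
    indigenous_language: str = ""
    race_and_ethnicity: str = ""
    religious_beliefs: str = ""
    family_and_community: str = ""
    indigenous_knowledge: str = ""


def build_crscia_prompt(ngss_standard: str, background: StudentBackground) -> str:
    """Combine an NGSS standard with the non-empty tenet entries into one prompt."""
    tenets = {
        "Indigenous Language": background.indigenous_language,
        "Race and Ethnicity": background.race_and_ethnicity,
        "Religious Beliefs": background.religious_beliefs,
        "Family and Community": background.family_and_community,
        "Indigenous Knowledge": background.indigenous_knowledge,
    }
    context_lines = [f"- {name}: {value}" for name, value in tenets.items() if value]
    return (
        f"Design a science assessment item aligned with NGSS standard {ngss_standard}. "
        "Make the item culturally responsive by drawing on the following student "
        "context, without compromising the core competencies of the standard:\n"
        + "\n".join(context_lines)
    )


if __name__ == "__main__":
    # Example profile; the specifics here are invented for demonstration.
    profile = StudentBackground(
        indigenous_language="Dagbani",
        family_and_community="Farming community in northern Ghana",
        indigenous_knowledge="Seasonal rainfall patterns used to plan planting",
    )
    print(build_crscia_prompt("MS-ESS2-5", profile))
```

In practice, the assembled prompt would be sent to a GenAI model, and the recommended local data indexing could supply the `StudentBackground` fields automatically from stored class records rather than manual entry.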

Keywords

Culturally Responsive Assessment; Assessment; Prompt Engineering; Generative AI (GenAI); Artificial Intelligence (AI); Science Education

Subject

Social Sciences, Education
