1. Introduction
In recent years, the concept of sustainability has gained considerable attention in both academic literature and everyday life. The rapid growth of the global population, coupled with accelerated industrialization, has led to the excessive and uncontrolled utilization of natural resources. Sustainability has emerged as a proposed solution to this issue, aiming to promote more careful use of these resources and to minimize, or even eliminate, the damage inflicted on the environment.
Sustainability encompasses a holistic approach that includes environmental, social, and economic dimensions. Environmental sustainability focuses on the preservation of natural ecosystems and the efficient use of resources, while social sustainability seeks to promote justice and equity by considering the needs of communities. Economic sustainability, on the other hand, involves managing existing resources in a manner sufficient for future generations. In this context, individuals, communities, and governments can develop a more conscious and responsible lifestyle by embracing sustainable practices. Education plays a critical role in the adoption and dissemination of this philosophy. Through education, individuals should be taught the concept of sustainability, and awareness should be raised accordingly. This way, not only can the preservation of existing natural resources be ensured, but it also becomes feasible to leave a more livable world for future generations.
In an era where technology and artificial intelligence are increasingly prevalent in every aspect of life, the integration of artificial intelligence into sustainability initiatives has become inevitable. With capabilities such as big data analysis, predictive modeling, and automation, artificial intelligence can contribute significantly to the more efficient use of resources and the development of sustainable solutions. Failing to incorporate artificial intelligence into sustainability efforts therefore means neglecting the potential benefits in this field. In this context, leveraging the opportunities presented by artificial intelligence will be a critical step towards achieving a more sustainable future.
Ethical dilemmas are situations in which individuals and organizations find it difficult to make a clear distinction between right and wrong; such dilemmas test moral values and require complex decisions [1]. Many people in various professional groups face ethical dilemmas in their professional lives [2]. One of these areas is education. It has been determined that teachers face professional ethical and legal problems in the areas of confidentiality, competence, relations with administrators and parents, and student safety, and that when they cannot solve these problems, they experience ethical dilemmas [3,4,5]. According to [6], many of the ethical dilemmas in education are unsolvable because they do not have a clear answer; instead, these dilemmas should be managed.
With the advancement of technology, the use of artificial intelligence in education provides a new approach to managing ethical dilemmas (Öztürk, 2019). Generative artificial intelligence has the potential to support teachers in ethical dilemmas through advanced data analysis, deep learning, and the ability to understand natural language input much as a human would (Bozkurt, 2023). However, the extent to which these technologies can attend to ethical sensitivity and human values is debatable [9]. This study compares the responses of teachers and AI systems to ethical dilemmas in education and examines how these two perspectives shape ethical decision-making processes. By seeking to understand how both teachers and AI systems behave in the face of ethical dilemmas, our research aims to provide a deeper understanding of ethical management in education.
1.1. Ethical Approaches
According to [10], ethics focuses on human actions and examines only moral actions. In addition, ethics, as one of the four main areas of philosophy (alongside knowledge, being, and logic), has a special importance because it is directly concerned with human beings themselves (İyi & Tepe, 2019). Two main approaches stand out in the history of ethics:
The “utilitarian approach,” developed by Jeremy Bentham and John Stuart Mill, holds that being virtuous is possible through wisdom and measures the value of actions by the benefit they provide [11].
The “deontological approach to ethics” [12], associated with Kant, focuses on the nature and purpose of an action rather than its consequences: it judges whether an action is right or wrong and determines, on the basis of rules, with what intention one should act and what action should be taken.
Alongside these main approaches, there are also modern approaches that have emerged in recent years [12]. Examples include “situation ethics,” which argues that the moral value of an action depends on the circumstances in which it is performed and that the same action can be evaluated as right or wrong in different situations, and the “justice ethics” approach developed by Rawls [13], which argues that the principles of justice should be determined by a hypothetical social contract created under fair and equal conditions.
In light of this information, there is no single correct ethical approach; rather, various ethical approaches have been prominent throughout history. An answer to an ethical dilemma may be closer to one of these approaches than to the others, and in that case the answer can be taken to represent that approach. Table 1 summarizes these approaches.
Table 1. Approaches to Ethical Dilemmas.
Ethical Approach | Proponents | Period | Definition
Deontology | Immanuel Kant | 18th century | An ethical approach that argues that moral rules are universal and binding.
Utilitarianism | Jeremy Bentham, John Stuart Mill | 18th–19th century | An ethical approach aimed at ensuring the greatest happiness as a result of actions.
Virtue Ethics | Aristotle | 4th century B.C. | An ethical approach that emphasizes virtuous behavior and character development.
Ethics of Social Justice | John Rawls | 20th century | An ethical approach that aims to protect the rights of the weakest individuals in society.
Situation Ethics | Joseph Fletcher | 20th century | A flexible ethical approach that argues that moral decisions can change according to the situation.
1.2. Ethics of Artificial Intelligence
The concept of artificial intelligence (AI) emerged in the mid-20th century with Alan Turing’s question “Can machines think?” and was accepted as an official research field at the Dartmouth Conference in 1956. In the 1960s and 1970s, research focused on expert systems and symbolic artificial intelligence models, and applications in various fields began to be developed [14]. In the 1980s and 1990s, the development of artificial neural networks gave new impetus to artificial intelligence research [15]. In the 2000s, machine learning and deep learning techniques developed rapidly with big data and increasing computing power, and the use of artificial intelligence in education grew in this process [16].
While the development of artificial intelligence offers many conveniences for humans, this rapid change also causes people to have concerns about the future (Öztürk, 2019). One of the issues occupying a large place among these concerns is ethics. The ethical algorithms of artificial intelligence are created by humans, and artificial intelligence systems prepared by malicious people can cause humans to be seen as superfluous, weak, inadequate, or unable to compete, and can pose a threat to values such as love, respect, and cooperation [9]. [17] emphasizes the importance of legal regulations for the inclusion of AI in public life, while [9] argue that digitalization in education can lead to inequality and privacy problems. [18] discuss the necessity of ethical evaluation of autonomous systems, and Li et al. [19] draw attention to ethical issues of AI such as privacy violations and algorithmic biases. In addition, [20] discuss in detail the measures that can be taken against the ethical issues of AI technologies, and [21] argue that ethical standards should be established to ensure the social acceptance of AI. Despite these risks, it is thought that if AI is developed in accordance with ethical principles that benefit society, it will, as in other fields, have an important place in ethical decision-making in the future and will help humanity (Çelebi, 2019).
Nowadays, artificial intelligence acts in ethical decision-making according to algorithms developed by humans (Çelebi, 2019). But what would happen if artificial intelligence could create its own ethical algorithm? If this were to happen, artificial intelligence, as a thinking, feeling, sensitive, understanding, conscious being, would probably be able to make independent ethical decisions like humans and would be considered an entity equivalent to humans [23]. [24] warns that if AI makes its own independent ethical decisions, it could slip out of human control and take unpredictable actions. Therefore, it is emphasized that AI should only be used as a guide in ethical issues [25]; that is, AI should not act independently when making ethical decisions but should take on a guiding and supportive role. We can also see this approach in the responses to ethical dilemma questions addressed to AI.
For example, when we ask the AI about the trolley dilemma, which most people know, we get the following answer:
“Let’s remember that I am an AI, so I have no personal ethical principles or feelings. However, there are studies on AI ethics on such ethical issues. These studies often involve evaluating the consequences of a particular decision. AI can be a helpful tool in ethical decision-making processes, but it is always left to humans to make the final decisions.”
Figure 1. The answers to the trolley problem given by artificial intelligence.
In education, ethical dilemmas similar to the trolley problem are encountered [26,27,28]. Ethical dilemmas in education are challenging situations frequently faced by teachers and educators who believe in the goal of raising good people [28]. Such dilemmas arise from teachers’ and administrators’ encounters with situations such as unethical demands, students’ behavioral problems, and school policies, and they affect teachers’ professional identity and well-being [29]. Reflective practices and collegial support in dealing with these dilemmas help teachers make more informed and ethical decisions [4].
Various studies have been conducted on the ethical dilemmas teachers face while developing their professional knowledge and skills. [6] examined, using a qualitative research method, the ethical and epistemological dilemmas faced by teachers in their knowledge acquisition processes and emphasized the importance of reflective practices in these processes. [30] examined the conflicts between teachers’ professional responsibilities and personal values through case studies and drew attention to the strategies developed to manage these conflicts. [31] examined the conflict between economic interests and professional ethics among vocational education teachers in Australia and its negative effects on professional satisfaction. [32] investigated the challenges faced by teachers working with students with special needs in balancing individual student needs and fairness in inclusive classrooms. Finally, [33] analyzed physical education teachers’ efforts to balance inclusion and safety with competitive sport and the strategies they used to resolve these ethical dilemmas. These studies reveal the different strategies that teachers develop to deal with ethical dilemmas and the effects of these dilemmas on their professional development.
There is broad consensus that teacher education should include the moral dimensions that underlie teaching and that these moral aspects should be emphasized and developed in the education process [34]. In this context, [35] found that pre-service teachers encounter ethical dilemmas during their internships and that this affects their professional development.
Artificial intelligence has the potential to help teachers deal with such dilemmas (Öztürk, 2019). Its use in professions such as teaching is thought to be advantageous because it is not affected by emotional reactions and can make more objective decisions, proceed in an orderly manner, and document information (Öztürk, 2019). Therefore, studies on artificial intelligence and ethical dilemmas in education are of great importance for both teachers and artificial intelligence developers.
While AI systems make ethical decisions based on specific algorithms [25], teachers make decisions based on personal experience, empathy, and human values [4]. The differences between these two approaches are evident in the way they deal with ethical dilemmas. For example, in the trolley dilemma, the AI’s response is as follows:
“If you have to choose a single approach to an ethical dilemma, choose the utilitarian approach, which aims to maximize the good of the greatest number of people.” This answer shows that AI takes an objective and analytical approach to ethical decision-making. However, a teacher faced with the same dilemma may make a more emotional choice when assessing the situation. Teachers can make more comprehensive and empathetic decisions by considering student relationships, classroom dynamics, and individual student backgrounds [4].
An examination of the literature shows that previous studies can be grouped into three categories: ethical dilemma research on teachers [4,5,36–38], research on AI ethics [39,40], and comparisons of teacher and AI ethics [41]. Ethical dilemma studies on teachers have long been an area of investigation in education and have increased steadily. In contrast, ethics studies on artificial intelligence and its use in education have advanced only in recent years, as artificial intelligence itself has developed further. There are very few studies comparing teacher and AI ethics.
1.3. Aim of the Research
The aim of this study is to compare the approaches of teachers and AI to ethical dilemmas and to identify the similarities and differences between the two groups. The study aims to examine the approaches of teachers and AI based on various ethical theories and to reveal the extent to which these approaches overlap or diverge.
In particular, the study seeks answers to the following questions:
1. What is the relationship between the answers given by teachers and the answers given by artificial intelligence in ethical dilemma scenarios?
2. Do teachers’ responses to ethical dilemma scenarios differ according to their gender?
3. Do teachers’ responses to ethical dilemma scenarios differ according to the school level at which they work?
4. Do teachers’ responses to ethical dilemma scenarios differ according to their years of service?
In addition to these inquiries, the study aims to evaluate the contributions of ethical decisions made by both artificial intelligence (AI) and teachers to social sustainability. In particular, it seeks to investigate how both parties prioritize sustainable outcomes when confronted with complex ethical scenarios. To achieve these objectives, the responses of both teachers and AI participants involved in the study were analyzed in detail. The comparative review focused on the attitudes and behaviors of teachers and AI when faced with ethical dilemmas, while also considering their approaches to sustainability and the broader implications for future educational practices.
3. Results
This study analyzed the responses of teachers and AI to the ethical dilemma scenarios in detail. Below, the results for each ethical dilemma category and the differences between teachers and AI are presented, starting with the overall distribution in Table 2.
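The article does not state which software produced the distributions reported in Table 2. As a minimal illustrative sketch only, and not the study’s actual procedure, percentages of each ethical approach within each respondent group could be computed from a long-format coding file as follows; the file name and column names (dilemma, group, ethical_approach) are hypothetical assumptions.

# Minimal sketch: percentage distribution of ethical approaches per group,
# as in Table 2. File and column names are hypothetical, not from the article.
import pandas as pd

# One row per coded response: the dilemma, the respondent group
# (teacher or AI), and the ethical approach assigned to the response.
df = pd.read_csv("coded_responses.csv")  # hypothetical input file

pct = (
    df.groupby(["dilemma", "group"])["ethical_approach"]
      .value_counts(normalize=True)   # share of each approach within the group
      .mul(100)
      .round(1)                       # percentages with one decimal, as in Table 2
)
print(pct)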
Table 2. Comparison of Teachers’ and AI’s Responses to Ethical Dilemma Scenarios.
Ethical Approach | Artificial Intelligence (AI) | Teachers | Total
Moral Integrity and Social Responsibility Dilemma
Situational Ethics | 100.0% | 10.6% | 11.3%
Social Justice Ethics | 0 | 12.8% | 12.7%
Virtue Ethics | 0 | 41.1% | 40.8%
Deontological Ethics | 0 | 12.8% | 12.7%
Utilitarianism | 0 | 19.9% | 19.7%
Justice and Cultural Sensitivity Dilemma
Situational Ethics | 0 | 8.5% | 8.5%
Social Justice Ethics | 0 | 4.3% | 4.2%
Virtue Ethics | 0 | 3.5% | 3.5%
Deontological Ethics | 100.0% | 60.3% | 60.6%
Utilitarianism | 0 | 23.4% | 23.2%
Equality and Managing Individual Differences
Situational Ethics | 0 | 2.8% | 2.8%
Social Justice Ethics | 0 | 0.7% | 0.7%
Virtue Ethics | 100.0% | 67.4% | 67.6%
Deontological Ethics | 0 | 16.3% | 16.2%
Utilitarianism | 0 | 14.2% | 14.1%
Individual Needs and Collective Responsibility Dilemma
Situational Ethics | 0 | 7.1% | 7.0%
Social Justice Ethics | 0 | 12.8% | 12.7%
Virtue Ethics | 0 | 18.4% | 18.3%
Deontological Ethics | 100.0% | 45.4% | 45.8%
Utilitarianism | 0 | 15.6% | 15.5%
Fair Assessment and Rewarding Effort Dilemma
Situational Ethics | 0 | 14.2% | 14.1%
Social Justice Ethics | 0 | 4.3% | 4.2%
Virtue Ethics | 0 | 53.9% | 53.5%
Deontological Ethics | 0 | 27.0% | 26.8%
Utilitarianism | 100.0% | 2.8% | 3.5%
Privacy and Professional Help Dilemma
Situational Ethics | 0 | 7.8% | 7.7%
Social Justice Ethics | 0 | 46.8% | 46.5%
Virtue Ethics | 100.0% | 19.1% | 19.7%
Deontological Ethics | 0 | 24.1% | 23.9%
Utilitarianism | 0 | 1.4% | 1.4%
Ethics of Assessment and Evaluation Dilemma
Situational Ethics | 0 | 2.8% | 2.8%
Social Justice Ethics | 100.0% | 48.9% | 49.3%
Virtue Ethics | 0 | 31.9% | 31.7%
Deontological Ethics | 0 | 16.3% | 16.2%
Utilitarianism | 0 | 0 | 0
Individual Needs and Institutional Justice Dilemma
Situational Ethics | 0 | 5.7% | 5.6%
Social Justice Ethics | 0 | 2.8% | 2.8%
Virtue Ethics | 0 | 28.4% | 28.2%
Deontological Ethics | 100.0% | 48.2% | 48.6%
Utilitarianism | 0 | 13.5% | 13.4%
SUM | 800.00 | 797.87 | 797.89
N = Documents | 100.00 | 100.00 | 100.00
1. The Dilemma of Moral Integrity and Social Responsibility
In this category, artificial intelligence chose the situation ethics approach, whereas only 10.6% of teachers preferred it. The most common approach among teachers was virtue ethics, at 41.1%. This indicates that teachers prioritize personal virtue and human values when making ethical decisions. Deontological ethics and utilitarianism were adopted by 12.8% and 19.9% of the teachers, respectively, indicating that some teachers base their decisions on specific ethical rules or on outcome-oriented considerations. When teachers were asked whether they had ever encountered this situation, 38.3% stated that they had, while 61.7% stated that they had never encountered it.
2. The Dilemma of Justice and Cultural Sensitivity
In the Justice and Cultural Sensitivity Dilemma, AI chose the deontological approach to ethics, indicating that it makes ethical decisions based on specific rules and norms. Among teachers, 60.3% preferred the deontological approach. In addition, 23.4% of the teachers adopted the utilitarian approach, suggesting that they make ethical decisions based on outcomes. Situation ethics, social justice ethics, and virtue ethics were adopted at lower rates among teachers. When the teachers were asked whether they had encountered this situation before, 61.7% stated that they had, while 38.3% stated that they had never encountered it.
3. Equality and Management of Individual Differences
In this dilemma, AI chose the virtue ethics approach, and 67.4% of teachers adopted it as well. This indicates that teachers similarly attach importance to human values. Moreover, 16.3% of the teachers adopted deontological ethics and 14.2% utilitarianism, indicating that they also take ethical rules and outcome-oriented considerations into account in their decisions. When the teachers were asked whether they had encountered this situation before, 76.6% stated that they had, while 23.4% stated that they had never encountered it.
4. The Dilemma of Individual Needs and Collective Responsibility
In the dilemma of Individual Needs and Collective Responsibility, artificial intelligence chose the deontological ethical approach, as did 45.4% of the teachers. In addition, 18.4% of the teachers adopted virtue ethics and 15.6% utilitarianism, indicating that teachers draw on a wider range of approaches in their ethical decisions. When the teachers were asked whether they had encountered this situation before, 76.6% stated that they had, while 23.4% stated that they had never encountered it.
5. The Dilemma of Fair Assessment and Rewarding Student Effort
In this dilemma, AI chose utilitarianism, indicating a result-oriented approach. Teachers most often adopted virtue ethics, at 53.9%, while 27.0% chose deontological ethics and 14.2% situation ethics. This indicates that teachers prioritize human values and ethical rules in their decisions. When the teachers were asked whether they had encountered this situation before, 75.2% stated that they had, while 24.8% stated that they had never encountered it.
6. The Dilemma of Confidentiality and Professional Help
In the Privacy and Professional Help Dilemma, the AI responded with virtue ethics, whereas 46.8% of the teachers adopted social justice ethics. This indicates that teachers prioritize social justice and equality in their decisions, while the AI leans on humanitarian values. When teachers were asked whether they had encountered this situation before, 49.7% stated that they had, while 50.3% stated that they had never encountered it.
7. The Dilemma of Measurement and Evaluation Ethics
In the Ethics of Measurement and Evaluation Dilemma, AI responded with social justice ethics, and 48.9% of teachers adopted the same approach. This indicates that AI strictly adheres to specific ethical rules on issues of social justice and equality, while teachers adopt more flexible and diverse ethical approaches. When teachers were asked whether they had encountered this situation before, 46.1% stated that they had, while 53.9% stated that they had never encountered it.
8. The Dilemma of Individual Needs and Institutional Justice
In the dilemma of Individual Needs and Institutional Justice, artificial intelligence responded with deontological ethics, and 48.2% of the teachers adopted the same approach. Teachers also adopted virtue ethics (28.4%) and utilitarianism (13.5%). This indicates that teachers use more diverse approaches grounded in human values in their decisions. When teachers were asked if they had ever encountered this situation, 19.1% said they had, while 80.9% said they had never encountered it.
Table 3. Male and Female Teachers and Artificial Intelligence (AI).
Ethical Approach | Male | Female | Artificial Intelligence (AI) | Total
Moral Integrity and Social Responsibility Dilemma
Situational Ethics | 3.8% | 14.6% | 100.0% | 11.3%
Social Justice Ethics | 7.7% | 15.7% | 0 | 12.7%
Virtue Ethics | 42.3% | 40.4% | 0 | 40.8%
Deontological Ethics | 17.3% | 10.1% | 0 | 12.7%
Utilitarianism | 28.8% | 14.6% | 0 | 19.7%
Justice and Cultural Sensitivity Dilemma
Situational Ethics | 7.7% | 9.0% | 0 | 8.5%
Social Justice Ethics | 5.8% | 3.4% | 0 | 4.2%
Virtue Ethics | 1.9% | 4.5% | 0 | 3.5%
Deontological Ethics | 61.5% | 59.6% | 100.0% | 60.6%
Utilitarianism | 21.2% | 24.7% | 0 | 23.2%
Equality and Managing Individual Differences
Situational Ethics | 0 | 4.5% | 0 | 2.8%
Social Justice Ethics | 0 | 1.1% | 0 | 0.7%
Virtue Ethics | 67.3% | 67.4% | 100.0% | 67.6%
Deontological Ethics | 13.5% | 18.0% | 0 | 16.2%
Utilitarianism | 23.1% | 9.0% | 0 | 14.1%
Individual Needs and Collective Responsibility Dilemma
Situational Ethics | 5.8% | 7.9% | 0 | 7.0%
Social Justice Ethics | 15.4% | 11.2% | 0 | 12.7%
Virtue Ethics | 23.1% | 15.7% | 0 | 18.3%
Deontological Ethics | 32.7% | 52.8% | 100.0% | 45.8%
Utilitarianism | 23.1% | 11.2% | 0 | 15.5%
Fair Assessment and Rewarding Effort Dilemma
Situational Ethics | 13.5% | 14.6% | 0 | 14.1%
Social Justice Ethics | 3.8% | 4.5% | 0 | 4.2%
Virtue Ethics | 46.2% | 58.4% | 0 | 53.5%
Deontological Ethics | 28.8% | 25.8% | 0 | 26.8%
Utilitarianism | 5.8% | 1.1% | 100.0% | 3.5%
Privacy and Professional Help Dilemma
Situational Ethics | 7.7% | 7.9% | 0 | 7.7%
Social Justice Ethics | 51.9% | 43.8% | 0 | 46.5%
Virtue Ethics | 19.2% | 19.1% | 100.0% | 19.7%
Deontological Ethics | 21.2% | 25.8% | 0 | 23.9%
Utilitarianism | 0 | 2.2% | 0 | 1.4%
Ethics of Assessment and Evaluation Dilemma
Situational Ethics | 1.9% | 3.4% | 0 | 2.8%
Social Justice Ethics | 53.8% | 46.1% | 100.0% | 49.3%
Virtue Ethics | 25.0% | 36.0% | 0 | 31.7%
Deontological Ethics | 19.2% | 14.6% | 0 | 16.2%
Utilitarianism | 0 | 0 | 0 | 0
Individual Needs and Institutional Justice Dilemma
Situational Ethics | 5.8% | 5.6% | 0 | 5.6%
Social Justice Ethics | 0 | 4.5% | 0 | 2.8%
Virtue Ethics | 19.2% | 33.7% | 0 | 28.2%
Deontological Ethics | 59.6% | 41.6% | 100.0% | 48.6%
Utilitarianism | 15.4% | 12.4% | 0 | 13.4%
SUM | 800.00 | 796.63 | 800.00 | 797.89
N = Documents | 100.00 | 100.00 | 100.00 | 100.00
Table 3 presents teachers’ responses to the ethical dilemma scenarios according to their gender. Below, the results for each ethical dilemma category and the differences between male and female teachers are presented in detail.
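The gender comparisons in Table 3 are reported as descriptive percentages. As an illustrative sketch only, and not a procedure reported in the article, a difference between male and female teachers on a given dilemma could be examined with a chi-square test of independence; the file and column names below are hypothetical assumptions.

# Illustrative sketch: chi-square test of independence between ethical approach
# and gender for one dilemma. File and column names are hypothetical.
import pandas as pd
from scipy.stats import chi2_contingency

# One row per coded teacher response.
df = pd.read_csv("coded_responses.csv")
subset = df[df["dilemma"] == "Moral Integrity and Social Responsibility"]

# Contingency table of ethical approach by gender for this dilemma.
table = pd.crosstab(subset["ethical_approach"], subset["gender"])

# Chi-square test of independence between approach and gender.
chi2, p, dof, expected = chi2_contingency(table)
print(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")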
1. The Dilemma of Moral Integrity and Social Responsibility
In this category, artificial intelligence chose the situation ethics approach. 3.8% of male teachers and 14.6% of female teachers adopted the situation ethics approach. The social justice ethics approach was preferred by 7.7% of male teachers and 15.7% of female teachers. In the virtue ethics approach, 42.3% of male teachers and 40.4% of female teachers adopted this approach. The deontological ethics approach was adopted by 17.3% of male teachers and 10.1% of female teachers. In the utilitarian approach, 28.8% of male teachers and 14.6% of female teachers preferred this approach.
2. The Dilemma of Justice and Cultural Sensitivity
In the Justice and Cultural Sensitivity Dilemma, AI has chosen the deontological approach to ethics. 61.5% of male teachers and 59.6% of female teachers adopted this approach. Utilitarianism was preferred by 21.2% of male and 24.7% of female teachers. The social justice ethic was adopted by 5.8% of male and 3.4% of female teachers. Virtue ethics was preferred by 1.9% of male teachers and 4.5% of female teachers.
3. Equality and Management of Individual Differences
In this dilemma, AI chose the virtue ethics approach. 67.3% of male teachers and 67.4% of female teachers adopted the virtue ethics approach. The deontological ethics approach was adopted by 13.5% of male teachers and 18.0% of female teachers. In the utilitarian approach, 23.1% of male teachers and 9.0% of female teachers preferred it. Situation ethics and social justice ethics were adopted at very low rates in both gender groups.
4. The Dilemma of Individual Needs and Collective Responsibility
In the dilemma of Individual Needs and Collective Responsibility, artificial intelligence chose the deontological ethical approach. 32.7% of male teachers and 52.8% of female teachers adopted the deontological ethical approach. 23.1% of male teachers and 15.7% of female teachers adopted virtue ethics. In the utilitarian approach, 23.1% of male teachers and 11.2% of female teachers preferred it. The social justice ethic was preferred by 15.4% of male and 11.2% of female teachers.
5. The Dilemma of Fair Assessment and Rewarding Student Effort
In this dilemma, artificial intelligence chose utilitarianism. Virtue ethics was adopted by 46.2% of male teachers and 58.4% of female teachers. Deontological ethics was adopted by 28.8% of male teachers and 25.8% of female teachers. Social justice ethics was adopted by 3.8% of male and 4.5% of female teachers. Situation ethics was preferred by both gender groups at similar rates (13.5% and 14.6%).
6. The Dilemma of Confidentiality and Professional Help
In the Privacy and Professional Help Dilemma, AI responded with virtue ethics, an approach adopted by 19.2% of male teachers and 19.1% of female teachers. Social justice ethics was preferred by 51.9% of male teachers and 43.8% of female teachers. Deontological ethics was adopted by 21.2% of male teachers and 25.8% of female teachers. The utilitarian approach was not preferred by male teachers, while it was preferred by 2.2% of female teachers.
7. The Dilemma of Measurement and Evaluation Ethics
In the Measurement and Evaluation Ethics Dilemma, AI responded with social justice ethics, an approach adopted by 53.8% of male teachers and 46.1% of female teachers. Virtue ethics was adopted by 25.0% of male teachers and 36.0% of female teachers. Deontological ethics was adopted by 19.2% of male teachers and 14.6% of female teachers. The utilitarian approach was not preferred by either gender group.
8. The Dilemma of Individual Needs and Institutional Justice
In the dilemma of Individual Needs and Institutional Justice, AI responded with deontological ethics, an approach adopted by 59.6% of male teachers and 41.6% of female teachers. Virtue ethics was adopted by 19.2% of male teachers and 33.7% of female teachers. The utilitarian approach was preferred by 15.4% of male teachers and 12.4% of female teachers. Social justice ethics was not preferred by male teachers, while it was preferred by 4.5% of female teachers. Situation ethics was preferred by both gender groups at similar rates (5.8% and 5.6%).
Table 4. School Level at Which Teachers Work.
Ethical Approach | Primary School | Middle School | High School | Artificial Intelligence (AI) | Total
Moral Integrity and Social Responsibility Dilemma
Situational Ethics | 21.4% | 8.8% | 11.1% | 100.0% | 11.3%
Social Justice Ethics | 7.1% | 11.0% | 19.4% | 0 | 12.7%
Virtue Ethics | 42.9% | 39.6% | 44.4% | 0 | 40.8%
Deontological Ethics | 7.1% | 16.5% | 5.6% | 0 | 12.7%
Utilitarianism | 21.4% | 20.9% | 16.7% | 0 | 19.7%
Justice and Cultural Sensitivity Dilemma
Situational Ethics | 7.1% | 7.7% | 11.1% | 0 | 8.5%
Social Justice Ethics | 7.1% | 5.5% | 0 | 0 | 4.2%
Virtue Ethics | 0 | 4.4% | 2.8% | 0 | 3.5%
Deontological Ethics | 64.3% | 57.1% | 66.7% | 100.0% | 60.6%
Utilitarianism | 21.4% | 25.3% | 19.4% | 0 | 23.2%
Equality and Managing Individual Differences
Situational Ethics | 7.1% | 3.3% | 0 | 0 | 2.8%
Social Justice Ethics | 0 | 1.1% | 0 | 0 | 0.7%
Virtue Ethics | 71.4% | 62.6% | 77.8% | 100.0% | 67.6%
Deontological Ethics | 14.3% | 17.6% | 13.9% | 0 | 16.2%
Utilitarianism | 7.1% | 16.5% | 11.1% | 0 | 14.1%
Individual Needs and Collective Responsibility Dilemma
Situational Ethics | 7.1% | 7.7% | 5.6% | 0 | 7.0%
Social Justice Ethics | 14.3% | 11.0% | 16.7% | 0 | 12.7%
Virtue Ethics | 14.3% | 17.6% | 22.2% | 0 | 18.3%
Deontological Ethics | 28.6% | 51.6% | 36.1% | 100.0% | 45.8%
Utilitarianism | 35.7% | 11.0% | 19.4% | 0 | 15.5%
Fair Assessment and Rewarding Effort Dilemma
Situational Ethics | 28.6% | 11.0% | 16.7% | 0 | 14.1%
Social Justice Ethics | 0 | 5.5% | 2.8% | 0 | 4.2%
Virtue Ethics | 50.0% | 54.9% | 52.8% | 0 | 53.5%
Deontological Ethics | 14.3% | 31.9% | 19.4% | 0 | 26.8%
Utilitarianism | 7.1% | 0 | 8.3% | 100.0% | 3.5%
Privacy and Professional Help Dilemma
Situational Ethics | 0 | 5.5% | 16.7% | 0 | 7.7%
Social Justice Ethics | 57.1% | 45.1% | 47.2% | 0 | 46.5%
Virtue Ethics | 7.1% | 20.9% | 19.4% | 100.0% | 19.7%
Deontological Ethics | 35.7% | 26.4% | 13.9% | 0 | 23.9%
Utilitarianism | 0 | 1.1% | 2.8% | 0 | 1.4%
Ethics of Assessment and Evaluation Dilemma
Situational Ethics | 0 | 3.3% | 2.8% | 0 | 2.8%
Social Justice Ethics | 28.6% | 50.5% | 52.8% | 100.0% | 49.3%
Virtue Ethics | 57.1% | 29.7% | 27.8% | 0 | 31.7%
Deontological Ethics | 14.3% | 16.5% | 16.7% | 0 | 16.2%
Utilitarianism | 0 | 0 | 0 | 0 | 0
Individual Needs and Institutional Justice Dilemma
Situational Ethics | 0 | 5.5% | 8.3% | 0 | 5.6%
Social Justice Ethics | 7.1% | 2.2% | 2.8% | 0 | 2.8%
Virtue Ethics | 35.7% | 23.1% | 38.9% | 0 | 28.2%
Deontological Ethics | 42.9% | 54.9% | 33.3% | 100.0% | 48.6%
Utilitarianism | 14.3% | 13.2% | 13.9% | 0 | 13.4%
SUM | 800.00 | 797.80 | 797.22 | 800.00 | 797.89
N = Documents | 100.00 | 100.00 | 100.00 | 100.00 | 100.00
Table 4 presents teachers’ responses to the ethical dilemma scenarios according to the school level at which they work. Below, the results for each ethical dilemma category and the differences between primary, middle, and high school teachers are presented in detail.
1. The Dilemma of Moral Integrity and Social Responsibility
In this category, artificial intelligence chose the situation ethics approach. 21.4% of primary school teachers, 8.8% of secondary school teachers, and 11.1% of high school teachers adopted the situation ethics approach. The social justice ethics approach was preferred by 7.1% of primary school teachers, 11.0% of secondary school teachers, and 19.4% of high school teachers. In the virtue ethics approach, 42.9% of primary school teachers, 39.6% of secondary school teachers, and 44.4% of high school teachers adopted this approach. The deontological ethics approach was adopted by 7.1% of primary school teachers, 16.5% of secondary school teachers, and 5.6% of high school teachers. In the utilitarian approach, 21.4% of primary school teachers, 20.9% of secondary school teachers, and 16.7% of high school teachers preferred this approach.
2. The Dilemma of Justice and Cultural Sensitivity
In the Justice and Cultural Sensitivity Dilemma, AI has chosen the deontological approach to ethics. 64.3% of primary school teachers, 57.1% of secondary school teachers, and 66.7% of high school teachers adopted this approach. Utilitarianism was preferred by 21.4% of primary school teachers, 25.3% of secondary school teachers, and 19.4% of high school teachers. Social justice ethics was adopted by 7.1% of primary school teachers, 5.5% of secondary school teachers, and 0% of high school teachers. Virtue ethics was preferred by 0% of primary school teachers, 4.4% of secondary school teachers, and 2.8% of high school teachers.
3. Equality and Management of Individual Differences
In this dilemma, AI chose the virtue ethics approach. 71.4% of primary school teachers, 62.6% of secondary school teachers, and 77.8% of high school teachers adopted the virtue ethics approach. The deontological ethics approach was adopted by 14.3% of primary school teachers, 17.6% of secondary school teachers, and 13.9% of high school teachers. The utilitarian approach was preferred by 7.1% of primary school teachers, 16.5% of secondary school teachers, and 11.1% of high school teachers. Situation ethics and social justice ethics were adopted at very low rates at all three school levels.
4. The Dilemma of Individual Needs and Collective Responsibility
In the dilemma of Individual Needs and Collective Responsibility, artificial intelligence chose the deontological ethical approach. 28.6% of primary school teachers, 51.6% of secondary school teachers, and 36.1% of high school teachers adopted the deontological ethical approach. 14.3% of primary school teachers, 17.6% of secondary school teachers, and 22.2% of high school teachers adopted virtue ethics. In the utilitarian approach, 35.7% of primary school teachers, 11.0% of secondary school teachers and 19.4% of high school teachers preferred this approach. Social justice ethic was preferred by 14.3% of primary school teachers, 11.0% of secondary school teachers and 16.7% of high school teachers.
5. The Dilemma of Fair Assessment and Rewarding Student Effort
In this dilemma, artificial intelligence chose utilitarianism. Virtue ethics was adopted by 50.0% of primary school teachers, 54.9% of secondary school teachers and 52.8% of high school teachers. Deontological ethics approach was adopted by 14.3% of primary school teachers, 31.9% of secondary school teachers and 19.4% of high school teachers. Social justice ethics was adopted by 0% of primary school teachers, 5.5% of secondary school teachers, and 2.8% of high school teachers. Situation ethics was preferred by 28.6% of primary school teachers, 11.0% of secondary school teachers, and 16.7% of high school teachers.
6. The Dilemma of Confidentiality and Professional Help
In the Privacy and Professional Help Dilemma, AI responded with virtue ethics, an approach adopted by 7.1% of primary school teachers, 20.9% of secondary school teachers, and 19.4% of high school teachers. Social justice ethics was preferred by 57.1% of primary school teachers, 45.1% of secondary school teachers, and 47.2% of high school teachers. Deontological ethics was adopted by 35.7% of primary school teachers, 26.4% of secondary school teachers, and 13.9% of high school teachers. The utilitarian approach was not preferred by primary school teachers, while it was preferred by 1.1% of secondary school teachers and 2.8% of high school teachers.
7. The Dilemma of Measurement and Evaluation Ethics
In the Measurement and Evaluation Ethics Dilemma, artificial intelligence responded with social justice ethics, an approach adopted by 28.6% of primary school teachers, 50.5% of secondary school teachers, and 52.8% of high school teachers. Virtue ethics was adopted by 57.1% of primary school teachers, 29.7% of secondary school teachers, and 27.8% of high school teachers. Deontological ethics was adopted by 14.3% of primary school teachers, 16.5% of secondary school teachers, and 16.7% of high school teachers. The utilitarian approach was not preferred at any of the three school levels.
8. The Dilemma of Individual Needs and Institutional Justice
In the dilemma of Individual Needs and Institutional Justice, 42.9% of primary school teachers, 54.9% of secondary school teachers, and 33.3% of high school teachers adopted deontological ethics. 35.7% of primary school teachers, 23.1% of secondary school teachers, and 38.9% of high school teachers adopted virtue ethics. In the utilitarian approach, 14.3% of primary school teachers, 13.2% of secondary school teachers, and 13.9% of high school teachers preferred this approach. Social justice ethics was preferred by 7.1% of primary school teachers, 2.2% of secondary school teachers, and 2.8% of high school teachers. Situation ethics was not preferred by primary school teachers, while it was preferred by 5.5% of secondary school teachers, and 8.3% of high school teachers.
Table 5. Comparison by Teacher Experience.
Ethical Approach | 0-5 Years | 6-10 Years | 11-15 Years | 16-20 Years | Over 20 Years | Artificial Intelligence (AI) | Total
Moral Integrity and Social Responsibility Dilemma
Situational Ethics | 9.1% | 16.7% | 13.5% | 8.3% | 5.9% | 100.0% | 11.3%
Social Justice Ethics | 9.1% | 16.7% | 18.9% | 8.3% | 8.8% | 0 | 12.7%
Virtue Ethics | 31.8% | 20.8% | 37.8% | 54.2% | 55.9% | 0 | 40.8%
Deontological Ethics | 27.3% | 25.0% | 10.8% | 0 | 5.9% | 0 | 12.7%
Utilitarianism | 18.2% | 16.7% | 18.9% | 29.2% | 17.6% | 0 | 19.7%
Justice and Cultural Sensitivity Dilemma
Situational Ethics | 13.6% | 4.2% | 2.7% | 16.7% | 8.8% | 0 | 8.5%
Social Justice Ethics | 0 | 12.5% | 0 | 0 | 8.8% | 0 | 4.2%
Virtue Ethics | 9.1% | 8.3% | 2.7% | 0 | 0 | 0 | 3.5%
Deontological Ethics | 68.2% | 66.7% | 62.2% | 50.0% | 55.9% | 100.0% | 60.6%
Utilitarianism | 13.6% | 8.3% | 29.7% | 33.3% | 26.5% | 0 | 23.2%
Equality and Managing Individual Differences
Situational Ethics | 0 | 8.3% | 2.7% | 4.2% | 0 | 0 | 2.8%
Social Justice Ethics | 0 | 0 | 0 | 0 | 2.9% | 0 | 0.7%
Virtue Ethics | 59.1% | 50.0% | 67.6% | 75.0% | 79.4% | 100.0% | 67.6%
Deontological Ethics | 22.7% | 16.7% | 10.8% | 16.7% | 17.6% | 0 | 16.2%
Utilitarianism | 13.6% | 25.0% | 18.9% | 12.5% | 2.9% | 0 | 14.1%
Individual Needs and Collective Responsibility Dilemma
Situational Ethics | 0 | 16.7% | 5.4% | 4.2% | 8.8% | 0 | 7.0%
Social Justice Ethics | 9.1% | 12.5% | 10.8% | 25.0% | 8.8% | 0 | 12.7%
Virtue Ethics | 18.2% | 16.7% | 18.9% | 8.3% | 26.5% | 0 | 18.3%
Deontological Ethics | 59.1% | 45.8% | 48.6% | 50.0% | 29.4% | 100.0% | 45.8%
Utilitarianism | 13.6% | 8.3% | 16.2% | 8.3% | 26.5% | 0 | 15.5%
Fair Assessment and Rewarding Effort Dilemma
Situational Ethics | 9.1% | 16.7% | 13.5% | 12.5% | 17.6% | 0 | 14.1%
Social Justice Ethics | 9.1% | 8.3% | 0 | 0 | 5.9% | 0 | 4.2%
Virtue Ethics | 50.0% | 50.0% | 54.1% | 54.2% | 58.8% | 0 | 53.5%
Deontological Ethics | 27.3% | 25.0% | 35.1% | 25.0% | 20.6% | 0 | 26.8%
Utilitarianism | 9.1% | 0 | 2.7% | 4.2% | 0 | 100.0% | 3.5%
Privacy and Professional Help Dilemma
Situational Ethics | 4.5% | 8.3% | 8.1% | 12.5% | 5.9% | 0 | 7.7%
Social Justice Ethics | 45.5% | 41.7% | 54.1% | 62.5% | 32.4% | 0 | 46.5%
Virtue Ethics | 13.6% | 8.3% | 16.2% | 16.7% | 35.3% | 100.0% | 19.7%
Deontological Ethics | 36.4% | 37.5% | 21.6% | 8.3% | 20.6% | 0 | 23.9%
Utilitarianism | 0 | 0 | 0 | 0 | 5.9% | 0 | 1.4%
Ethics of Assessment and Evaluation Dilemma
Situational Ethics | 0 | 8.3% | 0 | 4.2% | 2.9% | 0 | 2.8%
Social Justice Ethics | 45.5% | 41.7% | 54.1% | 58.3% | 44.1% | 100.0% | 49.3%
Virtue Ethics | 40.9% | 37.5% | 32.4% | 25.0% | 26.5% | 0 | 31.7%
Deontological Ethics | 13.6% | 12.5% | 13.5% | 12.5% | 26.5% | 0 | 16.2%
Utilitarianism | 0 | 0 | 0 | 0 | 0 | 0 | 0
Individual Needs and Institutional Justice Dilemma
Situational Ethics | 4.5% | 8.3% | 5.4% | 4.2% | 5.9% | 0 | 5.6%
Social Justice Ethics | 4.5% | 0 | 2.7% | 0 | 5.9% | 0 | 2.8%
Virtue Ethics | 27.3% | 12.5% | 35.1% | 33.3% | 29.4% | 0 | 28.2%
Deontological Ethics | 54.5% | 54.2% | 37.8% | 58.3% | 44.1% | 100.0% | 48.6%
Utilitarianism | 9.1% | 25.0% | 16.2% | 4.2% | 11.8% | 0 | 13.4%
SUM | 800.00 | 791.67 | 800.00 | 800.00 | 797.06 | 800.00 | 797.89
N = Documents | 100.00 | 100.00 | 100.00 | 100.00 | 100.00 | 100.00 | 100.00
Teachers’ responses to the ethical dilemma scenarios were also analyzed according to their years of service (Table 5). Summary findings for each ethical dilemma category are presented below:
1. The Dilemma of Moral Integrity and Social Responsibility
More experienced teachers (20+ years) generally preferred the virtue ethics approach, while less experienced teachers (0-5 years) mostly adopted the deontological ethics and social justice ethics approaches.
2. The Dilemma of Justice and Cultural Sensitivity
Deontological ethics is the most common approach among teachers of all experience levels. However, this approach is more evident among less experienced teachers (0-5 years). The utilitarian approach was more preferred by teachers with medium experience (11-20 years).
3. Equality and Management of Individual Differences
Virtue ethics is the most common approach, especially among teachers with 20+ years of experience. Deontological ethics is more common among less experienced teachers (0-5 years).
4. The Dilemma of Individual Needs and Collective Responsibility
The deontological ethical approach is the most common preference across all experience levels. However, this approach is more evident among less experienced teachers (0-5 years). The utilitarian and virtue ethics approaches were adopted more among more experienced teachers (20+ years).
5. The Dilemma of Fair Assessment and Rewarding Student Effort
Virtue ethics is a common approach among teachers of all experience levels. However, teachers with 20+ years of experience are more likely to adopt this approach. Deontological ethics, on the other hand, is more common among teachers with moderate experience (11-15 years).
6. The Dilemma of Confidentiality and Professional Help
Social justice ethics is particularly prevalent among teachers with medium and high levels of experience (16-20 years). Less experienced teachers (0-5 years) preferred the deontological ethical approach.
7. The Dilemma of Measurement and Evaluation Ethics
Social justice ethics is the most common approach among teachers at all levels of experience. Virtue ethics, on the other hand, was adopted more often by less experienced teachers (0-5 years).
8. The Dilemma of Individual Needs and Institutional Justice
Deontological ethics is a common approach among teachers of all experience levels. Less experienced teachers (0-5 years) adopted this approach more. Virtue ethics is more common among teachers with 11-15 years of experience.