Preprint
Article

Analysis of Orchestration Load and Teacher Agency in Smart Synchronous Hybrid Learning Environments


A peer-reviewed article of this preprint also exists.

Submitted: 04 July 2023
Posted: 06 July 2023

Abstract
The COVID-19 pandemic has led to the growth of hybrid and online learning environments and to the trend of introducing more technology into the classroom. One such change is the use of smart synchronous hybrid learning environments (SSHLEs), which are settings with both in-person and online students concurrently, and in which technology plays a key role in sensing, analysing, and reacting throughout the teaching and learning process. These changing environments and the incorporation of new technologies can increase the orchestration load on participants and reduce teacher agency. In this context, the aim of this paper is to analyse the orchestration load and teacher agency in different SSHLEs. The NASA-TLX model was used to measure the orchestration load in several scenarios, and questionnaires and interviews were used to measure teacher agency. The results obtained indicate that, in the SSHLEs analysed, the orchestration load of the teacher tends to be high (between 60 and 70 points out of 100 on the NASA-TLX workload scale), especially when the teacher lacks experience in synchronous hybrid learning environments, and the orchestration load of the students tends to take mid-range values (between 50 and 60). Meanwhile, teacher agency does not appear to be altered but shows potential for improvement.
Subject: Computer Science and Mathematics - Other

1. Introduction

The COVID-19 pandemic had a significant impact on educational environments worldwide, especially in higher education [1]. In response to health regulations and social distancing measures, the deployment of hybrid learning environments (HLEs) and online learning environments increased [2]. HLEs have different implementations; in particular, synchronous hybrid learning environments (SHLEs) emerged as a popular solution during the COVID-19 pandemic as a trade-off to meet health regulations [3]. These environments allow students to attend classes both online and in person in real time, providing greater flexibility in learning and better access to educational resources regardless of their physical location [4]. Therefore, the use of SHLEs is not restricted to situations where social distancing restrictions apply but can also make access to formal learning more flexible than in traditional educational settings. However, while SHLEs have the potential to support education, they also pose important challenges. For example, the implementation of these environments requires significant investment in technology, infrastructure, and teacher training to ensure an effective learning experience [3]. Despite the advantages of SHLEs, more research is needed to better understand their impact on student learning and performance, as well as to identify best practices in their implementation [5]. This will enable educational institutions to make more informed and effective decisions about how to adapt to the educational challenges posed by the pandemic and how to implement more effective long-term learning environments.
SHLEs can be combined with additional technology to collect, process, and provide supplementary information to the teacher, with the aim of enhancing learning and making it more flexible. Environments that employ such technology are referred to as Smart Learning Environments (SLEs) [6]. In these environments, technology plays three key roles: sensing, obtaining data such as audio or positioning; analysing, processing those data; and reacting, using those data to support teachers and students in their pedagogical activities [7]. Key features of SLEs include adaptability, which enables the personalisation of learning to meet the individual needs of students; traceability, which allows educators to make informed decisions by monitoring and analysing data on student performance; and real-time interaction, which enables real-time completion of tasks and access to educational resources from anywhere at any time [7]. However, the application of SLEs also has disadvantages. For example, their costs can be high due to the need for additional technology and resources. In addition, technical glitches can disrupt learning and create frustration for students and teachers. There are also issues related to the privacy and security of the personal data collected and used by SLEs [5]. Overall, the implementation of SLEs can provide significant benefits in the personalisation of learning, informed decision-making, and access to educational resources. However, these benefits must be balanced against the constraints and considerations of security, privacy, and teacher agency to ensure an effective and sustainable implementation of SLEs.
This study proposes the concept of Smart Synchronous Hybrid Learning Environments (SSHLEs), bringing together the advantages of Synchronous Hybrid Learning Environments (SHLEs) and Smart Learning Environments (SLEs). SSHLEs enable students to interact synchronously from different locations and can therefore offer greater adaptability and support more complex learning experiences [8]. However, the implementation of SSHLEs also presents challenges inherited from SLEs and SHLEs, including the high cost of the additional technology and resources required, possible technical issues, and privacy and security concerns [9]. In conclusion, SSHLEs offer a promising approach to enhancing the effectiveness of SHLEs, although they are not free from problems depending on the methodology used by the teacher, especially when implementing complex strategies such as active learning or collaborative learning [3].
In the context of SSHLEs, enacting collaborative learning situations is particularly challenging because of the complexity involved in coordinating students and ensuring that activities are carried out effectively. Collaborative learning involves a joint intellectual effort by teachers and/or students to carry out activities in groups of two or more [10]. Collaborative learning can be a valuable approach for fostering teamwork and enhancing students' learning. However, this type of learning requires careful planning and organisation on the part of the teacher to ensure its effectiveness [11], and adding technology into the mix may lead to an increased orchestration load. The orchestration load is the effort required by the teacher and the students to carry out the desired activities [12]. A high orchestration load can affect the success of collaborative learning and is influenced by various factors, such as the teacher's level of experience, the type of activity, and the group size [13]. Therefore, teachers and students must receive appropriate training and support to plan and effectively manage collaborative learning in SSHLEs [14]. Additionally, technology can play a significant role in facilitating this type of learning, providing tools and resources for collaboration and communication between students and teachers [15].
Moreover, it is important to consider that the addition of new elements into the educational environment, particularly the different types of technology needed to implement SSHLEs, can have an impact on teacher agency [16]. Teacher agency refers to the experiences, professional training, resources, culture, social structure, and environment that influence the teacher's decision-making process [17]. Any limitation of teacher agency may not only reduce the teacher's ability to make effective decisions but may also negatively affect students' performance [18]. To mitigate these problems, it is important to implement SSHLEs carefully and strategically, considering not only the technological benefits but also the impacts on the educational process and on teacher agency.
This study aims to analyse the factors that influence the orchestration load of the teacher and students, as well as teacher agency, in the particular context of the implementation of collaborative learning situations in SSHLEs. To this end, two research questions are posed:
  • RQ1: What factors influence the orchestration load of the teacher and students in SSHLEs that support collaborative learning situations?
  • RQ2: What factors influence teacher agency in SSHLEs that support collaborative learning situations?

2. Materials and Methods

2.1. Design

Three experiments were designed and conducted in SSHLEs that include collaborative learning situations, with the objective of measuring the orchestration load and teacher agency in these environments. Specifically, a collaborative learning flow pattern (CLFP) called jigsaw [19] is used in two of these experiments, which are adapted from [20]. The jigsaw pattern involves dividing a topic into subtopics, assigning each student a subtopic to become an expert on, and then forming groups with one expert on each subtopic so that members teach each other the various subtopics. To this end, the jigsaw pattern is divided into three Jigsaw Phases (JP), as shown in Figure 1.
The first phase of the jigsaw (JP1) is the individual phase. In this phase, the teacher chooses a topic to be addressed and divides it into several subtopics (3 or 4, for example). Each student is then assigned one of these subtopics, ensuring that approximately the same number of students cover each subtopic. Once students receive their subtopic, they are given documentation to learn about it. This task can be assigned as homework, as it is an individual task.
The second phase of the jigsaw (JP2) is the expert phase. In this phase, students are grouped according to their subtopic, possibly with more than one group covering the same subtopic. Additionally, all members of a group are in the same environment, either all online or all onsite, which is one of the major differences from a standard jigsaw CLFP. Each group must tackle problems related to its subtopic.
The third and final phase of the jigsaw (JP3) is the jigsaw phase. In this phase, groups are formed so that each includes at least one expert on each subtopic. On this occasion, there is a mix of online and onsite students within the same group. The groups must address problems that require knowledge of all the subtopics.
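To make the adapted pattern concrete, the following sketch illustrates the three grouping phases under simplifying assumptions (round-robin subtopic assignment, leftover students ignored). It is illustrative only and is not the software used in the experiments.

```python
import itertools

def jigsaw_phases(online, onsite, subtopics):
    """Illustrative grouping for the adapted jigsaw pattern.

    JP1: each student gets one subtopic (round-robin per environment).
    JP2: expert groups are homogeneous (all online or all onsite).
    JP3: jigsaw groups contain one expert per subtopic.
    """
    # JP1: assign subtopics so each is covered by a similar number of students.
    topic_of = {}
    for students in (online, onsite):
        for student, topic in zip(students, itertools.cycle(subtopics)):
            topic_of[student] = topic

    # JP2: expert groups, homogeneous by environment (the difference
    # from a standard jigsaw CLFP).
    expert_groups = [
        (topic, members)
        for students in (online, onsite)
        for topic in subtopics
        if (members := [s for s in students if topic_of[s] == topic])
    ]

    # JP3: one expert per subtopic in each group. A real implementation
    # would additionally balance online and onsite members within each
    # group, which is the hybrid aspect of this phase.
    pool = {t: [s for s in topic_of if topic_of[s] == t] for t in subtopics}
    n = min(len(members) for members in pool.values())
    jigsaw_groups = [[pool[t][i] for t in subtopics] for i in range(n)]
    return topic_of, expert_groups, jigsaw_groups
```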

2.2. Data collection

Several sources are used for data collection. Logs of the various applications used in each experiment, along with the recording and transcription of classes, are the first sources of data. These resources show the number and timing of the teacher's interactions with both online and onsite students. Observations of the teachers' actions also help triangulate the information on the orchestration load collected through the questionnaires. The flow between teachers, students, and technology is modelled using Epistemic Network Analysis (ENA) [21]. ENA aids in visualising the structure of connections between codes in the flow data via dynamic network models. The work of Amarasinghe et al. [22] has been used as a reference to define these codes due to the great similarity between the design of our experiments and theirs, which in turn allows a better comparison with other similar works. The activities linked to each code can be seen in Table 1.
Another source is a questionnaire to measure orchestration load. This questionnaire follows the NASA-TLX model [23], with 6 questions on a scale of 0 to 100, 15 pairwise comparisons among factors to derive each factor's weight, a set of demographic questions, and other questions about the activity to facilitate correlation.
Another data collection source is the teacher agency questionnaire, based on the work of Hull et al. [24], one of the few main articles that have studied teacher agency. Its main objective is to compare the perception that teachers have of their agency before and after implementing the SSHLE. The questionnaire comprises 17 questions concerning certain factors of teacher agency, and the teacher responds on a scale of 1 to 5, indicating how much they agree with each statement. Questions may be framed positively or negatively: a score of 5 on a positive question implies a higher level of agency, whereas a score of 5 on a negative question indicates a lower level of agency.
Interviews with teachers are the final source of data collection. The interviews are designed to obtain the data that could not be obtained through the questionnaires and to provide a deeper insight into the teacher's perception of orchestration load and agency. The interview is based on the evaluation concepts proposed by Stake & Jorrín-Abellán [25].
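Regarding the ENA modelling mentioned above, the figure captions later in the paper describe node size as action frequency and edge thickness as the number of transitions between actions. The following minimal sketch shows only that underlying counting over a coded action log; full ENA additionally normalises and projects these networks [21], and the example sequence is hypothetical.

```python
from collections import Counter
from itertools import pairwise  # Python 3.10+

def action_network(coded_log):
    """Count action frequencies (node sizes) and transitions between
    consecutive actions (edge thicknesses) in a coded session log."""
    nodes = Counter(coded_log)
    edges = Counter(pairwise(coded_log))
    return nodes, edges

# Hypothetical excerpt using the codes of Table 1.
log = ["Announcements.class", "Use.tool", "Teacher.perception",
       "Use.tool", "Announcements.class", "Use.tool"]
nodes, edges = action_network(log)
# nodes["Use.tool"] == 3; edges[("Announcements.class", "Use.tool")] == 2
```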
These data sources were used in each experiment. The teacher agency questionnaire was completed before the design of the jigsaw activity began and again after the activity was carried out. The interview was conducted either before the jigsaw activity or after it concluded. The class recording, transcription, and log collection took place during the jigsaw activity. Finally, the orchestration load questionnaire was completed at the end of the jigsaw activity by both the students and the teacher (except in one experiment). The organisation and usage of these sources during the experiments can be observed in Figure 2.
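As a reference for how the orchestration load questionnaire is scored, the overall NASA-TLX workload is the weighted average of the six subscales, with each factor weighted by the number of times it was selected in the 15 pairwise comparisons [23]. A minimal sketch of this standard scoring step follows; the numeric values are illustrative only (the measured data are in Tables 3 to 7).

```python
def nasa_tlx_workload(subscales, weights):
    """Weighted NASA-TLX workload on a 0-100 scale.

    subscales: factor -> raw rating (0-100).
    weights:   factor -> times the factor was selected in the
               pairwise comparisons (0-5 each, summing to 15).
    """
    assert set(subscales) == set(weights) and sum(weights.values()) == 15
    return sum(subscales[f] * weights[f] for f in subscales) / 15

# Illustrative (made-up) responses for one participant.
workload = nasa_tlx_workload(
    subscales={"mental": 60, "physical": 20, "temporal": 70,
               "performance": 30, "effort": 50, "frustration": 40},
    weights={"mental": 4, "physical": 0, "temporal": 5,
             "performance": 2, "effort": 3, "frustration": 1},
)  # -> 56.0
```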

2.3. Experiments

The three experiments conducted are summarised in Table 2. The first experiment was conducted at the Catholic University of Leuven (KU Leuven, Belgium), chosen for its experience with SSHLEs and its classrooms equipped with the appropriate technology for these environments [3]. This experiment focused on studying a setting where both students and the teacher had experience in SSHLEs. The experiment was carried out in a session of a university course in which a collaborative learning situation was to be implemented. This session lasted 2 hours and was attended by 22 online students and 24 onsite students. The software WeConnect was used to support this SSHLE [26]. WeConnect includes participation measures, user profile identification, and tools for group management. In this experiment, the teacher completed the teacher agency questionnaire before beginning the design of the experiment. The session itself consisted of three collaborative activities, which could not follow the jigsaw pattern because it did not fit the teacher's design of the session. Instead, students solved three problems in groups of four (homogeneous groups: all members were either online or onsite). The activity was recorded. In the end, only the teacher filled in the orchestration load questionnaire, as the university regulations did not allow the collection of student information for an external experiment. After this, the teacher completed the teacher agency questionnaire and the interview.
The second experiment was conducted at Universidad Carlos III de Madrid (UC3M, Spain) and involved participants from Universidad de Valladolid (UVa, Spain), Universitat Pompeu Fabra (UPF, Spain), and UC3M. This experiment aimed to convert a classroom with the usually available technologies (blackboard, projector, speakers, and computers) into an SSHLE. To do this, a one-hour workshop was carried out in this classroom with online (6 students) and onsite (11 students) participants. The software Engageli was used to support this SSHLE [27]. Engageli supports the communication between teachers and students and provides the teacher with measures such as student participation (based on speaking time, resource usage, etc.). In addition, Engageli supports collaboration with virtual tables, collaborative work environments, and group resource management. The teacher completed the teacher agency questionnaire before starting the design of the experiment. An interview was conducted with her in the first part of the experiment. The second part consisted of the implementation of a jigsaw on the theme of user-centred design. The information about the activity was collected from the recording of the Engageli session and the transcription of an observer in the classroom. In the end, both the students and the teacher completed the orchestration load questionnaire. After this, the teacher completed the teacher agency questionnaire again.
The last experiment was conducted at UC3M and involved participants from UVa and UC3M. This experiment repeated the approach of the second experiment, a simple classroom transformed into an SSHLE with minimal technology, but with participants and a teacher more familiar with SSHLEs. The aim was to collect information from participants with greater experience in these environments for a more effective comparison. A one-hour workshop was planned with 3 online students and 9 onsite students. The software Engageli was again used to support this SSHLE, and the teacher had gained more experience, as she was the same teacher as in Experiment 2. The teacher completed the teacher agency questionnaire before starting the design. The second part consisted of a jigsaw focused on the study of research paradigms. The activity information was collected from the recording of the Engageli session and the recording in the classroom. In the end, both the students and the teacher completed the orchestration load questionnaire. After this, the teacher completed the teacher agency questionnaire and carried out the interview.

3. Results

3.1. Experiment 1

The results of the NASA-TLX questionnaire completed by the teacher indicate that the factors that most affected the workload were mental demand and temporal demand (with subscale scores of 70 and 60, respectively). In addition, temporal demand was the factor with the greatest weight in the pairwise comparisons, being selected in all 5 of its comparisons. The rest of the weights and subscale scores can be seen in Table 3. The teacher's final workload was 50 on a scale of 0 to 100. This value falls within the mid range of orchestration load (40-60) [23].
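This overall value follows from the standard NASA-TLX weighting applied to Table 3, where the "Greatest variation" row acts as the pairwise-comparison weights: (70×4 + 1×0 + 60×5 + 10×2 + 50×2 + 25×2)/15 = 750/15 = 50.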
The ENA model can be seen in Figure 3. It can be observed that announcements to the class and the use of the tool were among the actions the teacher had to perform most frequently. Moreover, most of the time the teacher used the tool right after making an announcement, because she was checking the impact the announcement had on the students. In contrast, the lines connecting individual or group interaction actions, regardless of the environment in which the students were located (online or onsite), are rather thin, indicating that there were barely any such interactions.
The results of the teacher agency questionnaire indicated that 4 (23.53%) out of 17 factors increased and only 1 (5.88%) decreased after conducting the experiment, with the rest (70.59%) remaining the same. Each change, whether an increase or a decrease, was by one point on a scale of 1 to 5. The factors that increased were those dealing with the possibility of using applications for the design and development of classes, as well as the possibility of choosing the content taught. In addition to these results, the teacher indicated in the interview that hybrid classes required the same effort from her as in-person classes. All of this suggests that there was minimal impact on teacher agency and, if any, only a slight increase.

3.2. Experiment 2

The results of the teacher's NASA-TLX questionnaire indicate that the factor causing the greatest workload was temporal demand (with a subscale score of 70). Additionally, temporal demand was the most heavily weighted factor, being chosen in all 5 of its pairwise comparisons. Another detail to highlight is that mental demand was the second factor that most affected the orchestration load, just below temporal demand: in addition to having a subscale score of 50, it was selected in 4 out of 5 of its pairwise comparisons. All values from the NASA-TLX can be seen in Table 4. The final workload for the teacher was 60.67 on a scale of 0 to 100. This value is within the high orchestration load band (60-80) [23].
Table 4. Experiment 2 - NASA-TLX Teacher results (Subscales in a range between 0 and 100, and pairwise comparisons in a range between 0 and 5).
Mental Demand Physical Demand Temporal Demand Performance Effort Frustration Level
Subscales 50 20 70 60 60 60
Greatest variation 4 0 5 1 2 3
The workload values for the students were also obtained (see Table 5). The students reported a high mental demand due to the difficulty of coordinating with classmates who were in a different environment. Physical demand was high due to the noise generated during the activity as a result of communication between students in JP3, with conversations from other groups filtering through the microphones and making communication within each group more difficult. Temporal demand was high due to technical issues causing delays. Despite these challenges, the overall performance was good, although some students reported lower performance due to a lack of time to complete the tasks. The reported effort corresponded to the levels of mental demand, and the level of frustration was generally low, with only a few students reporting higher levels due to stress from the lack of time.
According to the ENA model (see Figure 4), the actions most frequently undertaken by the teacher were observing the state of the class, using the tool, and interacting with the class. Moreover, a strong connection could be noted between class interaction, class announcements, and interaction with the hybrid groups. Meanwhile, the use of the tool was related to the rest of the actions, being almost equally connected to all of the others. Furthermore, there were very few individual interactions, whether online or onsite.
The results of the teacher agency questionnaire showed no change before and after the experiment. This could have been due to the fact that the design and implementation of the activity were coordinated jointly with the teacher. The teacher supported this idea during the interview.

3.3. Experiment 3

The results of the teacher's NASA-TLX questionnaire indicate that the greatest workload was caused by mental demand and effort. Additionally, mental demand had the greatest weight in the pairwise comparisons, being selected in all 5 of its comparisons. The high values in mental demand, effort, and temporal demand stem from the teacher's difficulty in coordinating the students in different environments within the time planned for each phase of the jigsaw. The teacher's final workload was 76 on a scale of 0 to 100. This value falls within the high orchestration load range (60-80) [23].
Table 6. Experiment 3 - NASA-TLX Teacher results (Subscales in a range between 0 and 100, and pairwise comparisons in a range between 0 and 5).
Mental Demand Physical Demand Temporal Demand Performance Effort Frustration Level
Subscales 90 40 80 40 90 70
Greatest variation 5 0 2 3 3 2
The values for the students' orchestration load can be seen in Table 7. The students reported a higher mental demand than in onsite classes due to the difficulty of coordinating with peers who were in a different environment. Some students indicated a high physical demand due to the additional noise generated in the classroom by multiple conversations between groups. Temporal demand was high because the activities in phases JP2 and JP3 involved debate, and the students would have preferred more time to develop them further. Despite these challenges, the overall performance was good, although some students reported low performance due to a technical problem. The reported effort corresponded to the levels of mental demand, and the level of frustration was generally low, with only a few students reporting higher levels due to stress from the lack of time.
As can be observed in Figure 5, the actions most frequently undertaken by the teacher were observing the state of the class and using the tool. Alongside class observation, there was a strong connection with interaction with the different groups. The use of the tool was closely related to interaction with the hybrid groups, markedly more than with the other types of groups. Moreover, there was a relationship between the use of the tool and class announcements. Furthermore, it can be observed that when class announcements were made, there was often also a class interaction action.
The results from the teacher agency questionnaire indicated that 3 (17.65%) out of 17 factors increased, 5 (29.41%) decreased, and 9 (52.94%) maintained the same value after carrying out the experiment. The factors that increased were those dealing with the possibility of using applications, as well as the efficiency of her teaching. The factors that decreased concerned teacher actions and the effect of time on effective teaching. These data suggest that there was a slight change in teacher agency, as almost half of the factors changed. However, it is not possible to conclude that there was an overall increase or decrease in teacher agency.

4. Discussion

This paper posed two research questions, and to address them, three experiments using SSHLEs to support collaborative learning situations were carried out. The first research question, "What factors influence the orchestration load of the teacher and students in SSHLEs that support collaborative learning situations?", is answered with the NASA-TLX questionnaire and the ENA models, complemented with the teacher interviews.
The NASA-TLX questionnaire served to obtain values for the different factors that affect orchestration load and a general value called workload. The workload of the teachers in the three experiments was 50, 60.67, and 76, respectively. The teacher from Experiment 1 was asked in an interview about possible factors that could affect her orchestration load. She indicated that she had extensive experience in this type of class and did not find it difficult to conduct hybrid classes as long as she had the appropriate technologies. The other two experiments presented a higher degree of orchestration load than the first. From the interviews conducted with the teacher and the comments made in the NASA-TLX questionnaire, it was deduced that the main problems encountered were the noise generated during JP3 with the hybrid groups, technical problems, the need for more time to carry out the activities, and the teacher's lack of experience with SSHLEs. No studies were found that use the NASA-TLX questionnaire to measure teacher orchestration load in SHLEs or collaborative learning. The most similar study is that of Prieto et al. [12], which measures the orchestration load of teachers in Technology-Enhanced Classrooms. In that study, the teachers obtained 53.3 (out of 100) in one session and 56.3 (out of 100) in another, which can serve as a reference for orchestration load in an environment with a strong presence of technology, as in SSHLEs. It was also observed that, in these experiments, incorporating collaborative learning and conducting it within an SSHLE increased the orchestration load by between 5 and 20 points, but further research is needed for a broader perspective.
The orchestration load of the students had average workload scores of 50.66 and 50.94 in Experiments 2 and 3, respectively. These values are within the medium range of orchestration load (40-60) [23]. No studies were found that use the NASA-TLX questionnaire to measure the orchestration load of students carrying out collaborative activities in hybrid environments. The closest study is that of Zhang et al. [28], who measured the workload of students in onsite classes in which different collaboration strategies were used. The result of the study by Zhang et al. (38.94) is lower than the values obtained in our experiments. It was observed in this case that conducting collaborative activities within an SSHLE increased the orchestration load by approximately 12 points, but further research is needed for a more comprehensive view.
Regarding the ENA models, pairwise comparisons between experiments were carried out for easier comprehension of the differences. A stronger relationship between announcements and tool usage can be observed in Experiment 1 when comparing Experiments 1 and 2 (Figure 6). This is because in Experiment 1 the students hardly initiated any interaction with the teacher, and she had to monitor the class progress through the tool. In contrast, Experiment 2 shows a strong relationship between class interaction and the hybrid groups. This is due to the teacher requesting general information and, if there was a problem, assisting the indicated group. In both experiments, the teacher made extensive use of the tool. This action often became the pivot among the other actions; that is, after performing one action, the use of the tool was typically involved. This makes the use of the tool a key point from the orchestration perspective.
In the comparison between Experiment 1 and Experiment 3 (Figure 7), the same difference can be observed as in the previous comparison: Experiment 1 had a stronger relationship between class announcements and tool usage. Experiment 3 showed a greater relationship between the perception of the class and interaction with the different groups. This may be because the teacher already had more experience and, by maintaining a general perception of the class, was able to see where her presence was required. In this case, the most frequently used action was perceiving the state of the class; therefore, if this action can be performed easily and quickly, it would decrease the orchestration load.
The two experiments conducted by the same teacher, Experiments 2 and 3, were also compared (Figure 8). In this comparison, as in the comparison of Figure 6, the relationship between class interaction and hybrid group interaction in Experiment 2 stood out. In contrast, Experiment 3 was distinguished by individual onsite interactions and the perception of online groups. The individual onsite interactions occurred because, when the teacher asked the class for information and a student responded with a problem, the teacher assisted them individually. The connection between perception and interaction with online groups arose because the teacher assisted a group when she noticed in the tool that it had a problem. In this comparison, the most performed action was the interaction with the class. It is crucial that this action can be carried out effectively for all students, whether they are online or onsite. Providing the appropriate tools for this action is essential for conducting activities in SSHLEs; moreover, ensuring that these tools do not impose a greater orchestration load is a challenge.
An analysis of the NASA-TLX questionnaire, the ENA models, and the interviews revealed several key factors impacting the orchestration load in these experiments.
One of these factors was task complexity. This factor has been identified in the literature on technology implementation [12] and gains greater importance in SSHLEs, due to the requirement of using new technologies together with the need to work with people in different environments (online and onsite). From the point of view of collaborative learning, this factor becomes even more important, as collaborative activities usually require extensive communication and the use of resources for collaboration. Both the NASA-TLX questionnaires from the teachers and the students and the interviews with the teachers pointed out this factor. The characteristics that helped to reduce task complexity, indicated by the teachers in the interviews, were the centralisation of resources, adaptability to the various changes that arose during the activity, and support for group management.
Another factor was time limitations, which, like the previous factor, is also found in learning environments where technology is added [12]. Time limitations become more important in SSHLEs because, unlike in other environments, if there is any problem with the technology, especially with communication technology, it is very challenging (at least in a short period of time) to find a solution or alternative. From the perspective of collaboration, estimating the time of activities is already a challenge in itself [29]. If this is compounded by the need to take more steps to complete an activity due to technology, and by the lack of alternatives when an error or complication arises (for example, problems with a student's internet connection or microphone failures), this factor becomes even more significant. Both the NASA-TLX questionnaires from the teachers and the students and the interviews with the teachers pointed out this factor. The characteristics that helped to reduce the activity time, indicated by the teachers in the interviews, were adaptability to the different changes that arose and support for group management.
Another factor that affected the orchestration load was the tools used in SSHLEs. This factor is inherited from both SLEs and SHLEs [3,7]. From the perspective of collaborative learning, more specifically Computer-Supported Collaborative Learning (CSCL), tools are also a key factor in enhancing its development [30]. In addition to being a factor identified individually in SLEs, SHLEs, and CSCL, the ENA models showed a significant weight for tool use, pointing it out as a key factor for the orchestration load. The tool features highlighted in the teacher interviews were video/chat, real-time interaction, a group manager, a file manager, and the ability to incorporate external resources.
The last identified factor was knowledge about the state of the class and the students, which is present as a feature in some SLEs [7] and is also identified in other studies in the collaborative learning literature [22]. In these experiments, this factor was detected in the ENA models and the teacher interviews. The features that contributed to this factor, as indicated by the teachers in the interviews, were data on student participation, a notice that a student had a question, and a view of student progress.
All these factors can be seen in Table 8.
The second research question, "What factors influence teacher agency in SSHLEs that support collaborative learning situations?", is answered with the teacher agency questionnaire, complemented with the teacher interviews.
The results obtained from the teacher agency questionnaire seem to indicate that the SSHLEs had minimal impact on teacher agency. However, in two out of three cases, they increased teacher agency in factors related to the use of tools. Factors related to the material teachers have at their disposal carry significant weight in teacher agency [31]. For this reason, and based on the results obtained, it is possible that an approach to SSHLEs that specifically supports these factors could have a positive impact on teacher agency. It should also be noted that no other studies were found that assess teacher agency with a questionnaire, an issue also encountered by the creators of the model upon which this paper's teacher agency questionnaire is based [32].
Although the questionnaire results did not indicate a significant impact on teacher agency, some factors were affected and were also identified in the interviews.
One of these factors was the control to create and manage the activity. This factor was identified because the teachers experienced a slight increase, between the beginning and end of Experiments 1 and 3, in the factors regarding the creation and implementation of activities with tools. Meanwhile, this factor did not change in Experiment 2, where the activity was designed in collaboration with the teacher. When asked in the interviews, the teacher in Experiment 1 indicated that having designed the activity entirely (she was not forced to follow the jigsaw pattern) gave her more security, control, and freedom when acting. In Experiment 2, the teacher indicated that there had been no changes in teacher agency due to the collaboration in creating the activity. In Experiment 3, the teacher indicated that she felt more comfortable having more control over the activity. The features pointed out in the interviews as potentially improving teacher agency were support in the design and management of collaborative activities and adaptability to possible changes that might arise during implementation.
Another factor that was affected was the use of the available tools (e.g., WeConnect in Experiment 1 and Engageli in Experiments 2 and 3). Although this factor is identified in the literature as the available resources [31], in the teacher agency questionnaire it appears not as a general resource but as the available tools. In Experiments 1 and 3, the scores on questions related to the use of tools increased slightly (by 1 point on a scale of 1 to 5). In addition, in all interviews, the teachers indicated how necessary the tools were to conduct the class and make any modifications. Features that could increase this factor, pointed out by the teachers in the interviews, were ease of use, adaptability to possible changes that might arise during the activity, and ease of access to all resources (teaching material, exercises, shared documents, etc.).
The last identified factor was the perception of teaching efficacy. The only questions that decreased were related to the teacher's perception of her teaching efficacy in Experiment 3. In the interview, the teacher indicated that the lack of time due to technical failures and the difficulty of checking students' progress in real time complicated her evaluation of the students. The features indicated by the teachers that could improve this factor were more information about the state of the class and easily accessible values to check student progress.
All these factors are summarised in Table 9.

5. Conclusions

This paper identified and analysed the factors affecting the orchestration load of teachers and students, as well as teacher agency, in SSHLEs supporting collaborative learning situations. To this end, the three experiments conducted in this paper presented different collaborative activities in SSHLEs. From these experiences, several factors affecting the orchestration load of the students and the teacher were extracted, along with factors influencing teacher agency. The factors found to influence orchestration load in these experiments were task complexity, time constraints, the tools used, and knowledge about the class and student status. These factors were extracted from the NASA-TLX questionnaire, the ENA models, and the interviews with the teachers. The factors found to influence teacher agency were control over the creation and management of the activity, the use of the available tools, and the perception of teaching effectiveness. These factors were extracted from the teacher agency questionnaire and the teacher interviews. Furthermore, from the teacher interviews, some characteristics of the experiments were extracted that helped to lower the orchestration load or could be improved to do so: centralising resources, adaptability to errors, group management, the ability to incorporate external resources, and information on student participation and student progression. Likewise, some characteristics were identified that helped or could be improved for greater teacher agency: support in the design and management of collaborative activities, adaptability to errors, ease of access to resources, information about the state of the class, and information about student progress.
The main limitation found in this study was finding real scenarios in which to conduct the experiments. SHLEs are present in some institutions, but the difficulty of transforming them into SSHLEs and of incorporating the jigsaw pattern to implement a complex collaborative learning activity was a significant barrier to conducting more experiments. Experiments 2 and 3 had to be implemented as workshops with a limited duration. Another limitation was the regulations of the different institutions. In the case of Experiment 1, the collection of the students' orchestration load data was not allowed, and in Experiment 2, the need to go through the necessary steps for consent caused delays in carrying out the activity. A further limitation was the emergence of technical issues. In Experiment 2, due to a lack of experience, technical problems arose, causing delays in the activity. In Experiments 1 and 3, there were some issues related to student disconnections, which could not be resolved but, thanks to the teacher's experience, hardly posed a problem. The last limitation was the noise during the last phase of the jigsaw (JP3). In Experiment 2, it was a significant problem, indicated by both the students and the teacher. In Experiment 3, although the use of headphones was recommended, the small classroom space caused occasional issues, although far fewer than in Experiment 2.
For future work, the plan is to conduct more experiments in other SSHLEs with a different distribution of the hybrid environment, for example, in telepresence classrooms where the teacher and a group of students are in one classroom and another classroom, where the rest of the students are, is projected onto one of the walls. These environments pose new challenges, but at the same time, we aim to find similarities with the SSHLEs studied in this research. Another line of future work is the incorporation of the features recommended by the teachers into the SSHLEs and the evaluation of their impact.

Author Contributions

Conceptualization, A.C.M. and C.A.-H.; methodology, A.C.M.; software, A.C.M. and A.M.-M.; validation, C.A.-H. and A.M.-M.; formal analysis, A.C.M.; investigation, A.C.M.; resources, C.D.K., C.A.-H. and A.M.-M.; data curation, A.C.M.; writing—original draft preparation, A.C.M.; writing—review and editing, C.A.-H. and A.M.-M.; visualization, A.C.M.; supervision, C.A.-H. and A.M.-M.; project administration, C.D.K.; funding acquisition, C.A.-H. and C.D.K. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported in part by grant PID2020-112584RB-C31 (H2O Learn project) funded by MCIN/AEI/10.13039/501100011033, and in part by the Madrid Regional Government through the Multiannual Agreement with UC3M in the line of Excellence of University Professors (EPUC3M21), and in the context of the V PRICIT (Regional Programme of Research and Technological Innovation), a project which is co-funded by the European Structural Funds (FSE and FEDER). Partial support has also been received from the European Commission through the Erasmus+ Capacity Building in the Field of Higher Education projects PROF-XXI (609767-EPP-1-2019-1-ESEPPKA2-CBHE-JP), MICROCASA (101081924 ERASMUS-EDU-2022-CBHE-STRAND-2), EUCare4.0 (2021-1-FR01-KA220-VET-000024860), and POEM-SET (2021-FR01-KA220-HED-000032171). This publication reflects the views only of the authors, and the funders cannot be held responsible for any use which may be made of the information contained therein.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Due to participants' privacy concerns, the data are provided only in summary form.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Cahapay, M.B. Rethinking education in the new normal post-COVID-19 era: A curriculum studies perspective. Aquademia 2020, 4.
  2. Bonfield, C.A.; Salter, M.; Longmuir, A.; Benson, M.; Adachi, C. Transformation or evolution?: Education 4.0, teaching and learning in the digital age. Higher Education Pedagogies 2020, 5, 223–246.
  3. Raes, A.; Detienne, L.; Windey, I.; Depaepe, F. A systematic literature review on synchronous hybrid learning: gaps identified. Learning Environments Research 2020, 23, 269–290.
  4. Bülow, M.W. Designing Synchronous Hybrid Learning Spaces: Challenges and Opportunities. In Hybrid Learning Spaces; Gil, E., Mor, Y., Dimitriadis, Y., Köppe, C., Eds.; Springer International Publishing: Cham, 2022; pp. 135–163.
  5. Carruana Martín, A.; Alario-Hoyos, C.; Delgado Kloos, C. Smart Groups: A system to orchestrate collaboration in hybrid learning environments. A simulation study. Australasian Journal of Educational Technology 2022, 38, 150–168.
  6. Carruana Martín, A.; Alario-Hoyos, C.; Delgado Kloos, C. Smart Education: A Review and Future Research Directions. Proceedings 2019, 31.
  7. Tabuenca, B.; Serrano-Iglesias, S.; Carruana Martín, A.; Villa-Torrano, C.; Dimitriadis, Y.; Asensio-Pérez, J.I.; Alario-Hoyos, C.; Gómez-Sánchez, E.; Bote-Lorenzo, M.L.; Martínez-Monés, A.; Delgado Kloos, C. Affordances and Core Functions of Smart Learning Environments: A Systematic Literature Review. IEEE Transactions on Learning Technologies 2021, 14, 129–145.
  8. Hwang, G.J. Definition, framework and research issues of smart learning environments - a context-aware ubiquitous learning perspective. Smart Learning Environments 2014, 1, 4.
  9. Gambo, Y.; Shakir, M.Z. Evaluating students’ experiences in self-regulated smart learning environment. Education and Information Technologies 2023, 28, 547–580.
  10. Smith, B.L.; MacGregor, J.T. What is collaborative learning? In Collaborative Learning: A Sourcebook for Higher Education; Goodsell, A., Maher, M., Tinto, V., Leigh Smith, B., MacGregor, J., Eds.; The National Center on Postsecondary Teaching, Learning, and Assessment at Pennsylvania State University, 1992; pp. 69–81.
  11. Herrera-Pavo, M.Á. Collaborative learning for virtual higher education. Learning, Culture and Social Interaction 2021, 28, 100437.
  12. Prieto, L.P.; Sharma, K.; Dillenbourg, P. Studying Teacher Orchestration Load in Technology-Enhanced Classrooms. In Design for Teaching and Learning in a Networked World; Conole, G., Klobučar, T., Rensing, C., Konert, J., Lavoué, E., Eds.; Springer International Publishing: Cham, 2015; pp. 268–281.
  13. Al-Samarraie, H.; Saeed, N. A systematic review of cloud computing tools for collaborative learning: Opportunities and challenges to the blended-learning environment. Computers & Education 2018, 124, 77–91.
  14. Hämäläinen, R.; Oksanen, K. Challenge of supporting vocational learning: Empowering collaboration in a scripted 3D game – How does teachers’ real-time orchestration make a difference? Computers & Education 2012, 59, 281–293.
  15. Shen, C.-w.; Ho, J.-t. Technology-enhanced learning in higher education: A bibliometric analysis with latent semantic approach. Computers in Human Behavior 2020, 104, 106177.
  16. Kayi-Aydar, H. Teacher agency, positioning, and English language learners: Voices of pre-service classroom teachers. Teaching and Teacher Education 2015, 45, 94–103.
  17. Biesta, G.; Priestley, M.; Robinson, S. The role of beliefs in teacher agency. Teachers and Teaching 2015, 21, 624–640.
  18. Sammons, P.; Day, C.; Kington, A.; Gu, Q.; Stobart, G.; Smees, R. Exploring variations in teachers’ work, lives and their effects on pupils: key findings and implications from a longitudinal mixed-method study. British Educational Research Journal 2007, 33, 681–701.
  19. Hernández-Leo, D.; Villasclaras-Fernández, E.D.; Asensio-Pérez, J.I.; Dimitriadis, Y.; Jorrín-Abellán, I.M.; Ruiz-Requies, I.; Rubia-Avi, B. COLLAGE: A collaborative Learning Design editor based on patterns. Journal of Educational Technology & Society 2006, 9, 58–71.
  20. Carruana Martín, A.; Ortega-Arranz, A.; Alario-Hoyos, C.; Amarasinghe, I.; Hernández-Leo, D.; Delgado Kloos, C. Scenario for Analysing Student Interactions and Orchestration Load in Collaborative and Hybrid Learning Environments. In Collaboration Technologies and Social Computing; Wong, L.H., Hayashi, Y., Collazos, C.A., Alvarez, C., Zurita, G., Baloian, N., Eds.; Springer International Publishing: Cham, 2022; pp. 295–303.
  21. Csanadi, A.; Eagan, B.; Kollar, I.; Shaffer, D.W.; Fischer, F. When coding-and-counting is not enough: using epistemic network analysis (ENA) to analyze verbal data in CSCL research. International Journal of Computer-Supported Collaborative Learning 2018, 13, 419–438.
  22. Amarasinghe, I.; Hernández-Leo, D.; Ulrich Hoppe, H. Deconstructing orchestration load: comparing teacher support through mirroring and guiding. International Journal of Computer-Supported Collaborative Learning 2021, 16, 307–338.
  23. Hart, S.G.; Staveland, L.E. Development of NASA-TLX (Task Load Index): Results of Empirical and Theoretical Research. In Human Mental Workload; Hancock, P.A., Meshkati, N., Eds.; North-Holland, 1988; Advances in Psychology, Vol. 52, pp. 139–183.
  24. Hull, M.M.; Vormayr, K.; Uematsu, H. Validation of a survey to measure pre-service teachers’ sense of agency. Journal of Physics: Conference Series 2021, 1929, 012085.
  25. Stake, R.E.; Jorrín-Abellán, I.M. Does Ubiquitous Learning Call for Ubiquitous Forms of Formal Evaluation?: An Evaluand Oriented Responsive Evaluation Model. Ubiquitous Learning 2009, 1, 71–82.
  26. WeConnect Software. https://kulak.kuleuven.be/nl/over_kulak/diensten/dienst-informatica/platformen/tecol/weconnect (accessed on 14 June 2023).
  27. Engageli Software. https://www.engageli.com (accessed on 14 June 2023).
  28. Zhang, L.; Ayres, P.; Chan, K. Examining different types of collaborative learning in a complex computer-based environment: A cognitive load approach. Computers in Human Behavior 2011, 27, 94–98.
  29. Saputra, M.D.; Joyoatmojo, S.; Wardani, D.K.; Sangka, K.B. Developing critical-thinking skills through the collaboration of jigsaw model with problem-based learning model. International Journal of Instruction 2019, 12, 1077–1094.
  30. Ludvigsen, S.; Steier, R. Reflections and looking ahead for CSCL: digital infrastructures, digital tools, and collaborative learning. International Journal of Computer-Supported Collaborative Learning 2019, 14, 415–423.
  31. Li, L.; Ruppar, A. Conceptualizing Teacher Agency for Inclusive Education: A Systematic and International Review. Teacher Education and Special Education 2021, 44, 42–59.
  32. Hull, M.M.; Uematsu, H. Perceived Agency of In-Service Physics Teachers in Japan and Austria. PhyDid B - Didaktik der Physik - Beiträge zur DPG-Frühjahrstagung 2022, 1.
Figure 1. Phases of Jigsaw CLFP adapted for a hybrid scenario as part of a SSHLE.
Figure 2. The organisation of the data sources from teachers and students.
Figure 3. Experiment 1 - ENA Model (The size of the points corresponds to the number of times an action was performed, and the thickness of the lines corresponds to the number of times there was a transition from one action to another).
Figure 4. Experiment 2 - ENA Model (The size of the points corresponds to the number of times an action was performed, and the thickness of the lines corresponds to the number of times there was a transition from one action to another).
Figure 5. Experiment 3 - ENA Model (The size of the points corresponds to the number of times an action was performed, and the thickness of the lines corresponds to the number of times there was a transition from one action to another).
Figure 6. ENA - Comparison Experiment 1 (Blue) and Experiment 2 (Red).
Figure 7. ENA - Comparison Experiment 1 (Blue) and Experiment 3 (Green).
Figure 8. ENA - Comparison Experiment 2 (Red) and Experiment 3 (Green).
Table 1. Codes of teachers' actions for the ENA model.
Code Definition
Teacher.individual.interaction.online The teacher answers a question posed by an online student.
Teacher.individual.interaction.onsite The teacher answers a question posed by an onsite student.
Teacher.group.interaction.online The teacher answers a question posed by an online group.
Teacher.group.interaction.onsite The teacher answers a question posed by an onsite group.
Teacher.group.interaction.hybrid The teacher answers a question posed by a hybrid group (some members online and others onsite).
Teacher.class.interaction The teacher addresses all students expecting a response/reaction from them.
Examples:
- Teacher requests information from the class
- Teacher gives instructions to the class about the jigsaw phase or about a task that the students have to carry out (switching groups or submitting tasks)
Announcements.class The teacher announces information to the students.
Examples:
- Remaining time of the activity
- Information about an assignment
- Information needed to complete the task
Teacher.perception The teacher checks or monitors the status of the class (both online and onsite).
Use.tool The teacher uses some of the features of the tool, such as checking the level of participation, group management, etc.
Table 2. Details of the three experiments carried out in the three SSHLEs.
No. Place No. of participants Time Motivation Data sources Technologies
1 Belgium 46 (24 on-site and 22 online) 2 h Study a setting prepared for SSHLEs, a classroom with greater incorporation of specific technology to cover hybrid learning, and where the teacher and students had more experience in these environments. - Teacher Agency questionnaires
- Teacher orchestration load questionnaire
- Teacher Interview
- Recording activity
- Televisions
- Cameras
- Speakers and microphone systems
- WeConnect software
- Participants’ laptops
- Teacher’s laptop
2 Spain 17 (9 on-site and 8 online) 1 h Study the same questions in a classroom with the usual technologies (whiteboard, projector, speakers, and computer) converted into an SSHLE. - Teacher Agency questionnaires
- Teacher orchestration load questionnaire
- Students orchestration load questionnaire
- Teacher Interview
- Recording activity
- Whiteboard
- Projector
- Speakers
- Engageli software
- Participants’ laptops
- Teacher’s laptop
3 Spain 12 (9 on-site and 3 online) 1 h Study a scenario with participants with experience in these environments for a better comparison, given the lack of data on SSHLEs. - Teacher Agency questionnaires
- Teacher orchestration load questionnaire
- Students orchestration load questionnaire
- Teacher Interview
- Recording activity
Table 3. Experiment 1 - NASA-TLX Teacher results (Subscales in a range between 0 and 100, and pairwise comparisons in a range between 0 and 5).
Mental Demand Physical Demand Temporal Demand Performance Effort Frustration Level
Subscales 70 1 60 10 50 25
Greatest variation 4 0 5 2 2 2
Table 5. Experiment 2 - NASA-TLX Students' results. The first 6 students were online and are marked in italics.
A B C D E F G H I J K L M N O P Q Mean SD
Mental Demand 50 60 10 75 60 70 60 80 67 60 50 35 70 80 30 70 60 58.06 18.6
Physical Demand 40 40 1 60 5 1 90 70 8 20 20 10 30 10 60 33 20 30.47 26.33
Temporal Demand 85 80 10 40 65 75 10 80 79 80 40 45 60 80 50 70 30 57.59 24.83
Performance 1 10 70 35 30 15 20 30 27 10 10 45 20 30 5 40 20 24.53 17.14
Effort 60 70 10 65 70 70 80 90 58 60 70 55 50 70 55 70 50 61.94 16.98
Frustration Level 35 20 1 40 40 75 1 60 7 20 1 60 10 30 25 65 30 30.53 23.73
Workload 50 51 21 52 49 66 61 65 58 53 39 49 48 68 32 59 43 50.66 12.21
Table 7. Experiment 3 - NASA-TLX Students' results. The first 3 students were online and are marked in italics.
A B C D E F G H I J K L Mean SD
Mental Demand 70 40 35 70 40 50 60 80 70 70 60 70 59.58 14.84
Physical Demand 10 40 1 0 2 5 20 20 10 10 0 60 14.83 18.31
Temporal Demand 60 40 70 70 50 40 30 90 20 90 80 50 57.50 23.01
Performance 20 30 20 20 10 20 30 20 20 60 20 20 24.17 12.40
Effort 70 50 30 70 50 40 40 80 60 70 60 60 56.67 14.97
Frustration Level 30 70 20 40 10 50 20 60 10 70 30 80 40.83 24.66
Workload 58.67 46.67 40 56 35.33 36 37.33 74 41.33 70 60 56 50.94 13.38
Table 8. RQ1: Factors influencing orchestration load of the teacher and students in SSHLEs that support collaborative learning situations.
Factor Data sources Reason Potential improvements
Complexity to perform the task - NASA-TLX questionnaire of teachers and students
- Teacher interviews
- Problems inherited from the incorporation of technology
- Need to work with people in different environments
- Great importance for collaborative learning
- Centralising resources
- Adaptability
- Group management
Time limitations - NASA-TLX questionnaire of teachers and students
- Teacher interviews
- Problems inherited from the incorporation of technology
- Difficulty in finding an alternative when an error occurs
- Difficulty of timing in collaborative learning
- Adaptability
- Group management
Used tools - ENA models
- Teacher interviews
- Problems inherited from the incorporation of technology
- Important factor in SLEs and SHLEs
- Important factor in CSCL
- Video/chat
- Real-time interaction
- Group manager
- File manager
- Incorporate external resources
Knowledge about the status of the class and the students - ENA models
- Teacher interviews
- Present as a feature in some SLEs
- Identified in collaborative learning
- Data on student participation
- Notice that a student has a question
- Student progress
Table 9. RQ2: Factors influencing teacher agency in SSHLEs that support collaborative learning situations.
Factor Data sources Reason Potential improvements
Control to create and manage the activity - Teacher Agency questionnaire
- Teacher interviews
- Feeling of greater freedom in Experiment 1
- Co-design in Experiment 2
- Greater comfort by having more control over the activity in Experiment 3
- Support in the design and management of collaborative activities
- Adaptability
Use of tools - Teacher Agency questionnaire
- Teacher interviews
- Identified in the teacher agency literature
- Indicated as necessary by teachers
- User-friendliness
- Adaptability
- Ease of access to resources
Teaching effectiveness - Teacher Agency questionnaire
- Teacher interviews
- Lack of time
- Difficulty in checking the progress of the students in real time
- Class status information
- Student progress values
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.