1. Introduction
The COVID-19 pandemic had a significant impact on global educational environments, especially in higher education [1]. In response to health regulations and social distancing measures, the deployment of hybrid learning environments (HLEs) and online learning environments increased [2]. HLEs have different implementations, but synchronous hybrid learning environments (SHLEs) in particular emerged as a popular solution during the COVID-19 pandemic as a trade-off to meet health regulations [3]. These environments allow students to attend classes both online and in person in real time, providing greater flexibility in learning and better access to educational resources regardless of their physical location [4]. Therefore, the use of SHLEs is not restricted to situations where social distancing restrictions apply but can also make access to formal learning more flexible than in traditional educational settings. However, while SHLEs have the potential to support education, they also pose important challenges. For example, the implementation of these environments requires significant investment in technology, infrastructure, and teacher training to ensure an effective learning experience [3]. Despite the advantages of SHLEs, more research is needed to better understand their impact on student learning and performance, as well as to identify best practices in their implementation [5]. This will enable educational institutions to make more informed and effective decisions about how to adapt to the educational challenges posed by the pandemic and to implement more effective long-term learning environments.
SHLEs can be combined with additional technology to collect, process, and provide supplementary information to the teacher, with the aim of enhancing learning and making it more flexible. The environments that employ this technology are referred to as Smart Learning Environments (SLEs) [6]. In these environments, technology plays three key roles: sensing, obtaining data such as audio or positioning; analysing, processing that data; and reacting, using that data to support teachers and students in their pedagogical activities [7]. Key features of SLEs include adaptability, which enables the personalisation of learning to meet the individual needs of students; traceability, which allows educators to make informed decisions by monitoring and analysing data on student performance; and real-time interaction, which enables real-time completion of tasks and access to educational resources from anywhere at any time [7]. However, the application of SLEs also has disadvantages. For example, their costs can be high due to the need for additional technology and resources. In addition, technical glitches can disrupt learning and create frustration for students and teachers. There are also issues related to the privacy and security of the personal data collected and used by SLEs [5]. Overall, the implementation of SLEs can provide significant benefits in the personalisation of learning, informed decision-making, and access to educational resources. However, these benefits must be balanced with the constraints and considerations of security, privacy, and teacher agency to ensure an effective and sustainable implementation of SLEs.
This study proposes the concept of Smart Synchronous Hybrid Learning Environments (SSHLEs) by bringing together the advantages of Synchronous Hybrid Learning Environments (SHLEs) and Smart Learning Environments (SLEs). SSHLEs enable students to interact synchronously from different locations. Therefore, SSHLEs can offer greater adaptability and support more complex learning experiences [8]. However, the implementation of SSHLEs also presents challenges inherited from SLEs and SHLEs, including the high cost of the additional technology and resources required, possible technical issues, and privacy and security concerns [9]. In conclusion, SSHLEs offer a promising approach to enhance the effectiveness of SHLEs, although they are not free from problems depending on the methodology used by the teacher, especially when implementing complex strategies such as active learning or collaborative learning [3].
In the context of SSHLEs, enacting collaborative learning situations is particularly challenging because of the complexity involved in coordinating students and ensuring that activities are carried out effectively. Collaborative learning involves a joint intellectual effort by teachers and/or students to carry out activities in groups of two or more [10]. Collaborative learning can be a valuable approach for fostering teamwork and enhancing students' learning. However, this type of learning requires careful planning and organisation on the part of the teacher to ensure its effectiveness [11], and adding technology into the mix may lead to an increased orchestration load. The orchestration load is the effort required by the teacher and the students to carry out the desired activities [12]. A high orchestration load can impact the success of collaborative learning and can be affected by various factors, such as the teacher's level of experience, the type of activity, and the group size [13]. Therefore, teachers and students must receive appropriate training and support to plan and effectively manage collaborative learning in SSHLEs [14]. Additionally, technology can play a significant role in facilitating this type of learning, providing tools and resources for collaboration and communication between students and teachers [15].
Moreover, it is important to consider that the addition of new elements into the educational environment, particularly the different types of technology used to implement SSHLEs, can have an impact on teacher agency [16]. Teacher agency refers to the experiences, professional training, resources, culture, social structure, and environment that influence the teacher's decision-making process [17]. Therefore, any limitations in teacher agency may not only reduce the teacher's ability to make effective decisions but may also negatively affect students' performance [18]. To mitigate these problems, it is important to implement SSHLEs carefully and strategically, considering not only the technological benefits but also the impacts on the educational process and teacher agency.
This study aims to analyse the factors that influence the orchestration load of the teacher and students, as well as teacher agency, in the particular context of the implementation of collaborative learning situations in SSHLEs. To this end, two research questions are posed:
RQ1: What factors influence the orchestration load of the teacher and students in SSHLEs that support collaborative learning situations?
RQ2: What factors influence teacher agency in SSHLEs that support collaborative learning situations?
4. Discussion
This paper proposed two research questions, and to address them, three experiments using SSHLEs to support collaborative learning situations were carried out. The first research question, "What factors influence the orchestration load of the teacher and students in SSHLEs that support collaborative learning situations?", is answered with the NASA-TLX questionnaire and the ENA model, complemented with teachers' interviews.
The NASA-TLX questionnaire served to obtain values for the different factors that affect orchestration load, as well as a general value called workload. The workload of the teachers in each experiment was 50, 60.67, and 76, respectively. The teacher from Experiment 1 was asked in an interview about possible factors that could affect her orchestration load. She indicated that she had extensive experience in this type of class and did not find it difficult to conduct hybrid classes as long as she had the appropriate technologies. The other two experiments presented a higher degree of orchestration load than the former. From the interviews conducted with the teacher and the comments made in the NASA-TLX questionnaire, it was deduced that the main problems encountered were the noise generated during JP3 with the hybrid groups, technical problems, the need for more time to carry out the activities, and the teacher's lack of experience with SSHLEs. No studies were found that use the NASA-TLX questionnaire to measure teacher orchestration load in SHLEs or collaborative learning. The most similar study is that of Prieto et al. [12], which measures the orchestration load of teachers in Technology-Enhanced Classrooms. In that study, the teachers obtained 53.3 (out of 100) in one session and 56.3 (out of 100) in another, which could serve as a reference for measuring orchestration load in an environment with a strong presence of technology, such as SSHLEs. It was also observed that in these experiments, incorporating collaborative learning and conducting it within an SSHLE increased the orchestration load by between 5 and 20 points, but further research is needed for a broader perspective.
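To illustrate how the reported workload values relate to the subscale ratings, the following sketch (an assumption about the scoring procedure, not the authors' code) computes the standard weighted NASA-TLX score; with the Experiment 1 teacher's ratings and the pairwise-comparison counts reported in Table 3, it reproduces the reported workload of 50.

```python
# Minimal sketch of the standard weighted NASA-TLX score (an assumption, not the
# authors' code). Ratings are on a 0-100 scale; weights come from the 15 pairwise
# comparisons between the six subscales.
SUBSCALES = ["Mental Demand", "Physical Demand", "Temporal Demand",
             "Performance", "Effort", "Frustration Level"]

def nasa_tlx_workload(ratings, weights):
    """Weighted workload: each rating counts as many times as its subscale won a pairwise comparison."""
    assert len(ratings) == len(weights) == len(SUBSCALES)
    assert sum(weights) == 15, "pairwise-comparison weights must sum to 15"
    return sum(r * w for r, w in zip(ratings, weights)) / 15

# Experiment 1 teacher (Table 3): subscale ratings and "greatest variation" counts.
ratings = [70, 1, 60, 10, 50, 25]
weights = [4, 0, 5, 2, 2, 2]
print(nasa_tlx_workload(ratings, weights))  # -> 50.0, matching the reported workload
```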
The orchestration load of the students had average workload scores of 50.66 and 50.94 in Experiments 2 and 3, respectively. These values are within the medium range of orchestration load (40-60) [23]. No studies have been found that use the NASA-TLX questionnaire to measure the orchestration load of students in hybrid environments carrying out collaborative activities. The closest study is that of Zhang et al. [28], who measured orchestration load in an onsite class where different collaboration strategies were used. The results of the study by Zhang et al. (38.94) show lower values than those obtained in our experiments. In this case, it was observed that conducting collaborative activities within an SSHLE increased the orchestration load by approximately 12 points, but further research is needed for a more comprehensive view.
Regarding the ENAs, pairwise comparisons were carried out for easier comprehension of the differences. A stronger relationship between announcements and tool usage can be observed in Experiment 1 after comparing Experiments 1 and 2 (Figure 6). This is due to the fact that in Experiment 1 the students hardly initiated any interaction with the teacher, and she had to monitor the class progress through the tool. In contrast, Experiment 2 shows a strong relationship between class interaction and hybrid groups. This is due to the teacher requesting general information and, if there was a problem, assisting the indicated group. In both experiments, the teacher made extensive use of the tool. This action often became the pivot among the other actions; that is, after performing one action, the use of the tool was typically involved. This made the use of the tool a key point from the orchestration perspective.
In the comparison between Experiment 1 and Experiment 3 (Figure 7), the same difference can be observed as in the previous comparison. Experiment 1 had a stronger relationship between class announcements and tool usage. Experiment 3 had a greater relationship between the perception of the class and interaction with the different groups. This may be due to the fact that the teacher already had more experience, and with a general perception, she was able to see where her presence was required. In this case, the most frequently used action was perceiving the state of the class; therefore, if this action is performed easily and quickly, it would decrease the orchestration load.
The two experiments conducted by the same teacher, Experiment 2 and Experiment 3, were compared (Figure 8). In this comparison, as in the first comparison (Figure 6), the relationship between class interaction and hybrid group interaction in Experiment 2 stood out. In contrast, Experiment 3 was distinguished by its individual onsite interactions with the class and its perception of online groups. The individual onsite interactions occurred because, when the teacher asked the class for information and a student responded with a problem, the teacher assisted that student individually. The interaction with online groups related to perception is due to the teacher assisting a group when she noticed a problem with it in the tool. In this comparison, the most performed action was the interaction with the class. This action is crucial, especially if it is to be carried out effectively for all students, whether they are online or onsite. Providing the appropriate tools to carry out this action is essential for conducting activities in SSHLEs. Moreover, ensuring that these tools do not pose a greater orchestration load is a challenge.
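To make the ENA-related observations more concrete, the following sketch (a simplification, not the authors' ENA pipeline, which additionally normalises the network and projects it into a reduced space) shows how a coded sequence of teacher actions can be turned into the occurrence and transition counts that underlie the point sizes and line thicknesses described for Figures 3-8. The codes follow Table 1; the example sequence is hypothetical.

```python
# Illustrative only: count how often each coded action occurs and how often one
# action is followed by another. This is the raw adjacency information behind an
# ENA-style network; the actual ENA method goes further (normalisation, projection).
from collections import Counter
from itertools import pairwise  # Python 3.10+

# Hypothetical excerpt of a coded session (codes from Table 1).
coded_actions = [
    "Teacher.class.interaction", "Use.tool", "Teacher.perception",
    "Teacher.group.interaction.hybrid", "Use.tool", "Announcements.class",
    "Use.tool", "Teacher.perception", "Teacher.group.interaction.online",
]

node_counts = Counter(coded_actions)             # drives the size of each point
edge_counts = Counter(pairwise(coded_actions))   # drives the thickness of each line

for (src, dst), n in edge_counts.most_common():
    print(f"{src} -> {dst}: {n}")
```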
An analysis of the NASA-TLX questionnaire, the ENA model, and the interviews revealed several key factors impacting the orchestration load in these experiments. One of these factors was task complexity. This factor was identified in the literature on technology implementation [12] and gains greater importance in SSHLEs. This is due to the requirement of using new technologies together with the need to work with people in different environments (online and onsite). From the point of view of collaborative learning, this factor becomes more important, as collaborative activities usually require extensive communication and the use of resources for collaboration. Both the NASA-TLX questionnaires from the teachers and the students, as well as the interviews with the teachers, pointed out this factor. The characteristics that helped to reduce task complexity, indicated by the teachers in the interviews, were the centralisation of resources, the adaptability to the various changes that arose during the activity, and the support for group management. Another factor was time limitations, which, like the previous factor, are also found in learning environments where technology is added [12]. Time limitations become more important in SSHLEs because, unlike in other environments, if there is any problem with the technology, especially with communication technology, it is very challenging (at least in a short period of time) to find a solution or alternative. From the perspective of collaboration, calculating the time of activities is already a challenge in itself [29]. But if this factor is compounded by the need to take more steps to complete an activity due to the technology, or by the lack of alternatives when errors or complications arise (for example, problems with a student's internet connection or microphone failures), it becomes even more significant. Both the NASA-TLX questionnaires from the teachers and the students, as well as the interviews with the teachers, pointed out this factor. The characteristics that facilitated reducing the activity time, indicated by the teachers in the interviews, were adaptability to the different changes that arose and support for group management. Another factor that affected the orchestration load was the tools used in SSHLEs. This factor is inherited from both SLEs and SHLEs [3,7]. From the perspective of collaborative learning, more specifically Computer-Supported Collaborative Learning (CSCL), tools are also a key factor in enhancing development [30]. In addition to being a factor individually identified in SLEs, SHLEs, and CSCL, the ENA models indicated a significant weight for tool use, pointing it out as a key factor for the orchestration load. The prominent features of the tools in the teacher interviews were video/chat, real-time interaction, a group manager, a file manager, and the ability to incorporate external resources. The last identified factor was knowledge about the state of the class and the students, which is present as a feature in some SLEs [7] and is also a factor identified in other studies in the literature on collaborative learning [22]. In the case of these experiments, this factor was detected in the ENA models and teacher interviews. The features that contributed to this factor, as indicated by the teachers in the interviews, were student participation data, a notice that a student had a question, and viewing student progress. All these factors can be seen in Table 8.
The second research question, "What factors influence teacher agency in SSHLEs that support collaborative learning situations?", is answered with the teacher agency questionnaire, complemented with teachers' interviews.
The different results obtained from the teacher agency questionnaire seem to indicate that SSHLEs had minimal impact on teacher agency. However, in two out of three cases, they increased teacher agency in the factors related to the use of tools. Factors related to the material teachers had at their disposal carry significant weight in teacher agency [31]. For this reason, and based on the results obtained, it is possible that a specific approach to SSHLEs that supports these factors could have a positive impact on teacher agency. In contrast, it should be noted that no studies have been found that assess teacher agency with a questionnaire, an issue also encountered by the creators of the model upon which the teacher agency questionnaire used in this paper is based [32].
Although the questionnaire results did not indicate a significant impact on teacher agency, some factors were affected and were also identified in the interviews. One of these factors was the control to create and manage the activity. This factor was identified because the teachers experienced a slight increase between the beginning and the end of Experiments 1 and 3 in the factors regarding the creation and implementation of activities with tools. Meanwhile, this factor did not change in Experiment 2, where the activity was designed in collaboration with the teacher. The teachers were asked about this in the interviews. In Experiment 1, the teacher indicated that having designed the activity entirely (she was not forced to follow the Jigsaw pattern) gave her more security, control, and freedom when acting. In Experiment 2, the teacher indicated that there had been no changes in teacher agency due to the collaboration in creating the activity. In Experiment 3, the teacher indicated that she felt more comfortable having more control over the activity. The features pointed out in the interviews as potentially improving teacher agency were support in the design and management of collaborative activities and adaptability to possible changes that might arise during implementation. Another factor that was affected was the use of the available tools (e.g., the WeConnect software in Experiment 1 and Engageli in Experiments 2 and 3). Although this factor is identified in the literature as the available resources [31], in the teacher agency questionnaire it appears not as general resources but as the available tools. In Experiments 1 and 3, the scores on questions related to the use of tools slightly increased (1 point more, on a scale of 1 to 5). In addition, in all interviews, the teachers indicated how necessary the tools were to conduct the class and make any modifications. Features that could increase this factor, pointed out by the teachers in the interviews, were ease of use, adaptability to possible changes that might arise during the activity, and ease of access to all resources (teaching material, exercises, shared documents, etc.). The last identified factor was the perception of teaching efficacy. The only questions whose scores decreased were related to the teacher's perception of her teaching efficacy in Experiment 3. In the interview, the teacher indicated that the lack of time due to technical failures and the difficulty of checking the students' progress in real time complicated their evaluation. The features indicated by the teachers that could improve this factor were more information about the state of the class and easily accessible values to check student progress. All these factors are summarised in Table 9.
5. Conclusions
This paper identified and analysed the factors affecting the orchestration load of teachers and students, as well as teacher agency, in SSHLEs supporting collaborative learning situations. To this end, the three experiments conducted in this paper present different collaborative activities in SSHLEs. Several factors that affected the orchestration load of both the students and the teacher, as well as factors that influenced teacher agency, were extracted from these experiences. The factors found to influence orchestration load in these experiments were: task complexity, time constraints, the tools used, and knowledge about the class and student status. These factors were extracted from the NASA-TLX questionnaire, the ENA model, and interviews with the teachers. The factors found to influence teacher agency were: control over the creation and management of the activity, the use of the available tools, and the perception of teaching effectiveness. These factors were extracted from the teacher agency questionnaire and from the teachers' interviews. Furthermore, some characteristics observed in the experiments that helped to reduce, or could be improved to reduce, the orchestration load were extracted from the teachers' interviews. Some of these characteristics were: centralising resources, adaptability to errors, group management, the ability to incorporate external resources, information on student participation, and information on student progression. Similarly, some characteristics observed in the experiments that supported, or could be improved to support, greater teacher agency were discussed. Some of these characteristics were: support in the design and management of collaborative activities, adaptability to errors, ease of access to resources, information about the state of the class, and information about student progress.
The main limitation found in this study was finding a real scenario in which to conduct the experiments. SHLEs are present in some institutions, but the difficulty of transforming them into SSHLEs and of incorporating the Jigsaw pattern to implement a complex collaborative learning activity was a significant barrier to conducting more experiments. Experiments 2 and 3 had to be implemented as workshops with a limited duration. Another limitation was the regulations of the different institutions. In the case of Experiment 1, the collection of the students' orchestration load data was not allowed, and in Experiment 2, the need to go through the necessary steps for consent caused delays in carrying out the activity. Another limitation found in this study was the emergence of technical issues. In Experiment 2, due to a lack of experience, technical problems arose, causing delays in the activity. In Experiments 1 and 3, there were some issues related to student disconnections, which could not be resolved, but due to the teacher's experience, they hardly posed a problem. The last limitation found was the noise when carrying out the last phase of the Jigsaw (JP3). In Experiment 2, it was a significant problem indicated by both the students and the teacher. In Experiment 3, although the use of headphones was recommended, there were occasional issues due to the small classroom space, although far fewer than in Experiment 2.
For future work, the plan is to conduct more experiments in other SSHLEs with a different distribution of the hybrid environment, for example, in telepresence classrooms where the teacher and a group of students are in one classroom and another classroom, where the rest of the students are, is projected onto one of the walls. These environments pose new challenges, but at the same time, we aim to find similarities with the SSHLEs studied in this research. Another line of future work is to incorporate the features recommended by the teachers into the SSHLEs and evaluate the improvement they bring.
Figure 1. Phases of Jigsaw CLFP adapted for a hybrid scenario as part of an SSHLE.
Figure 2. The organisation of the data sources from teachers and students.
Figure 3. Experiment 1 - ENA Model (the size of the points corresponds to the number of times an action was performed, and the thickness of the lines corresponds to the number of times there was a transition from one action to another).
Figure 4. Experiment 2 - ENA Model (the size of the points corresponds to the number of times an action was performed, and the thickness of the lines corresponds to the number of times there was a transition from one action to another).
Figure 5. Experiment 3 - ENA Model (the size of the points corresponds to the number of times an action was performed, and the thickness of the lines corresponds to the number of times there was a transition from one action to another).
Figure 6. ENA - Comparison of Experiment 1 (blue) and Experiment 2 (red).
Figure 7. ENA - Comparison of Experiment 1 (blue) and Experiment 3 (green).
Figure 8. ENA - Comparison of Experiment 2 (red) and Experiment 3 (green).
Table 1. Codes of teachers' actions for the ENA model.

| Code | Definition |
| --- | --- |
| Teacher.individual.interaction.online | The teacher answers a question posed by an online student. |
| Teacher.individual.interaction.onsite | The teacher answers a question posed by an onsite student. |
| Teacher.group.interaction.online | The teacher answers a question posed by an online group. |
| Teacher.group.interaction.onsite | The teacher answers a question posed by an onsite group. |
| Teacher.group.interaction.hybrid | The teacher answers a question posed by a hybrid group (some members online and others onsite). |
| Teacher.class.interaction | The teacher addresses all students expecting a response/reaction from them. Examples: the teacher requests information from the class; the teacher gives instructions to the class about the Jigsaw phase or about a task that the students have to carry out (switching groups or submitting tasks). |
| Announcements.class | The teacher announces information to the students. Examples: remaining time of the activity; information about an assignment; information needed to complete the task. |
| Teacher.perception | The teacher checks or monitors the status of the class (both online and onsite). |
| Use.tool | The teacher uses some of the features of the tool, such as checking the level of participation, group management, etc. |
Table 2. Details of the three experiments carried out in the three SSHLEs.

| No. | Place | No. of participants | Time | Motivation | Data sources | Technologies |
| --- | --- | --- | --- | --- | --- | --- |
| 1 | Belgium | 46 (24 on-site and 22 online) | 2 h | Study a setting prepared for SSHLEs: a classroom with greater incorporation of specific technology to cover hybrid learning, and where the teacher and students had more experience in these environments. | Teacher agency questionnaires; teacher orchestration load questionnaire; teacher interview; recording of the activity | Televisions; cameras; speaker and microphone systems; WeConnect software; participants' laptops; teacher's laptop |
| 2 | Spain | 17 (9 on-site and 8 online) | 1 h | Study the topics in a classroom with the usual technologies (whiteboard, projector, speakers and computer) converted into an SSHLE. | Teacher agency questionnaires; teacher orchestration load questionnaire; students' orchestration load questionnaire; teacher interview; recording of the activity | Whiteboard; projector; speakers; Engageli software; participants' laptops; teacher's laptop |
| 3 | Spain | 12 (9 on-site and 3 online) | 1 h | Study a scenario with participants with experience in these environments, for a better comparison, given the lack of data on SSHLEs. | Teacher agency questionnaires; teacher orchestration load questionnaire; students' orchestration load questionnaire; teacher interview; recording of the activity | |
Table 3. Experiment 1 - NASA-TLX teacher results (subscales in a range between 0 and 100, and pairwise comparisons in a range between 0 and 5).

| | Mental Demand | Physical Demand | Temporal Demand | Performance | Effort | Frustration Level |
| --- | --- | --- | --- | --- | --- | --- |
| Subscales | 70 | 1 | 60 | 10 | 50 | 25 |
| Greatest variation | 4 | 0 | 5 | 2 | 2 | 2 |
Table 5. Experiment 2 - NASA-TLX students' results. The first six students (columns A-F) were online.

| | A | B | C | D | E | F | G | H | I | J | K | L | M | N | O | P | Q | Mean | SD |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Mental Demand | 50 | 60 | 10 | 75 | 60 | 70 | 60 | 80 | 67 | 60 | 50 | 35 | 70 | 80 | 30 | 70 | 60 | 58.06 | 18.6 |
| Physical Demand | 40 | 40 | 1 | 60 | 5 | 1 | 90 | 70 | 8 | 20 | 20 | 10 | 30 | 10 | 60 | 33 | 20 | 30.47 | 26.33 |
| Temporal Demand | 85 | 80 | 10 | 40 | 65 | 75 | 10 | 80 | 79 | 80 | 40 | 45 | 60 | 80 | 50 | 70 | 30 | 57.59 | 24.83 |
| Performance | 1 | 10 | 70 | 35 | 30 | 15 | 20 | 30 | 27 | 10 | 10 | 45 | 20 | 30 | 5 | 40 | 20 | 24.53 | 17.14 |
| Effort | 60 | 70 | 10 | 65 | 70 | 70 | 80 | 90 | 58 | 60 | 70 | 55 | 50 | 70 | 55 | 70 | 50 | 61.94 | 16.98 |
| Frustration Level | 35 | 20 | 1 | 40 | 40 | 75 | 1 | 60 | 7 | 20 | 1 | 60 | 10 | 30 | 25 | 65 | 30 | 30.53 | 23.73 |
| Workload | 50 | 51 | 21 | 52 | 49 | 66 | 61 | 65 | 58 | 53 | 39 | 49 | 48 | 68 | 32 | 59 | 43 | 50.66 | 12.21 |
Table 7. Experiment 3 - NASA-TLX students' results. The first three students (columns A-C) were online.

| | A | B | C | D | E | F | G | H | I | J | K | L | Mean | SD |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Mental Demand | 70 | 40 | 35 | 70 | 40 | 50 | 60 | 80 | 70 | 70 | 60 | 70 | 59.58 | 14.84 |
| Physical Demand | 10 | 40 | 1 | 0 | 2 | 5 | 20 | 20 | 10 | 10 | 0 | 60 | 14.83 | 18.31 |
| Temporal Demand | 60 | 40 | 70 | 70 | 50 | 40 | 30 | 90 | 20 | 90 | 80 | 50 | 57.50 | 23.01 |
| Performance | 20 | 30 | 20 | 20 | 10 | 20 | 30 | 20 | 20 | 60 | 20 | 20 | 24.17 | 12.40 |
| Effort | 70 | 50 | 30 | 70 | 50 | 40 | 40 | 80 | 60 | 70 | 60 | 60 | 56.67 | 14.97 |
| Frustration Level | 30 | 70 | 20 | 40 | 10 | 50 | 20 | 60 | 10 | 70 | 30 | 80 | 40.83 | 24.66 |
| Workload | 58.67 | 46.67 | 40 | 56 | 35.33 | 36 | 37.33 | 74 | 41.33 | 70 | 60 | 56 | 50.94 | 13.38 |
Table 8. RQ1: Factors influencing the orchestration load of the teacher and students in SSHLEs that support collaborative learning situations.

| Factor | Data sources | Reason | Potential improvements |
| --- | --- | --- | --- |
| Complexity of performing the task | NASA-TLX questionnaires of teachers and students; teacher interviews | Problems inherited from the incorporation of technology; need to work with people in different environments; great importance for collaborative learning | Centralising resources; adaptability; group management |
| Time limitations | NASA-TLX questionnaires of teachers and students; teacher interviews | Problems inherited from the incorporation of technology; difficulty in finding an alternative when an error occurs; difficulty of timing in collaborative learning | Adaptability; group management |
| Tools used | ENA models; teacher interviews | Problems inherited from the incorporation of technology; important factor in SLEs and SHLEs; important factor in CSCL | Video/chat; real-time interaction; group manager; file manager; incorporation of external resources |
| Knowledge about the status of the class and the students | ENA models; teacher interviews | Present as a feature in some SLEs; identified in the collaborative learning literature | Data on student participation; notice that a student has a question; student progress |
Table 9. RQ2: Factors influencing teacher agency in SSHLEs that support collaborative learning situations.

| Factor | Data sources | Reason | Potential improvements |
| --- | --- | --- | --- |
| Control to create and manage the activity | Teacher agency questionnaire; teacher interviews | Feeling of greater freedom in Experiment 1; co-design in Experiment 2; greater comfort by having more control over the activity in Experiment 3 | Support in the design and management of collaborative activities; adaptability |
| Use of tools | Teacher agency questionnaire; teacher interviews | Identified in the teacher agency literature; indicated as necessary by teachers | User-friendliness; adaptability; ease of access to resources |
| Teaching effectiveness | Teacher agency questionnaire; teacher interviews | Lack of time; difficulty in checking the progress of the students in real time | Class status information; student progress values |