Preprint
Review

Augmented Reality: Current and New Trends in Education

A peer-reviewed article of this preprint also exists.

Submitted:

22 June 2023

Posted:

23 June 2023

Abstract
The education landscape is an environment prone to change due to the volatile and ever-changing nature of the digital society in which we all live. Although the world moves at a different pace and any generalization is bound to have several exceptions, there is evidence from research conducted in different places and contexts that educational methods are becoming increasingly digitized and driven by technological innovation. Among technological trends fuelled in many cases by the COVID-19 pandemic and the need to stay at home but online, augmented reality solutions received an additional push as a valid and versatile educational technology worth exploring and eventually integrating into several teaching methods already in use. Although the technology still faces problems related to its affordability, accessibility, and the technical skills required of users, some ongoing projects have already provided evidence that using augmented reality solutions as teaching and learning tools can improve teachers’ and students’ learning outcomes by increasing their engagement and interactivity. The same issues arose when personal computers, tablets, and smartphones were first discussed as valuable tools for education and have now found their way into most classrooms. This paper reviews some of the key concepts related to augmented reality, as well as some current trends, benefits, and concerns related to its inclusion in educational contexts in areas such as life sciences, engineering, and health. The work carried out and presented in this paper provides an interesting insight into a technology that has led to global phenomena, such as Pokémon Go, and is constantly improving in terms of portability, usability, and overall user experience. Throughout the paper and in the conclusions section, we discuss the relevance of using the best features of augmented reality and how they can contribute to positive educational outcomes.
Keywords: 
Subject: Computer Science and Mathematics  -   Software

1. Introduction

One of the fundamental challenges education faces in the coming decade is the evolution of educational methods. Classical teaching approaches and strategies struggle increasingly to adapt to the digital society. The model of school in which a professor opens a coursebook and reads out a lecture, which students must learn by heart, reproduce in an exam, and forget, is becoming inadequate [1,2,3,4,5]. This approach is inefficient, ineffective, and ill-suited to 21st-century challenges, especially when knowledge is widely and easily available. One of the most important skills today is not memorization but the ability to quickly find relevant information, analyze it correctly, and apply it in practice.
Nowadays, the educational frameworks of most developed countries rely on active pedagogical methods following new teaching and learning strategies. These strategies are based on methods and methodologies developed essentially in the 21st century, such as problem-solving, innovative and creative teaching approaches, online modes, and others [6]. They strengthen the skills needed to deal with the new challenges of the 21st century, facilitate knowledge transfer in an engaging way, stir students’ and pupils’ interest, and provide new experiences, including interpersonal ones. Students’ emotional engagement cannot be omitted, as young people absorb knowledge more easily when they are interested in a given subject and understand its applicability [7,8]. Thus, a strong emphasis should be put on activating methods in classrooms. One way to realize this concept is by introducing educational technology (EdTech) tools into the curriculum [7,9].
The use of information and communication technologies improves students’ attitude towards learning [10], as it motivates learners and develops several individual and group skills [11]. Therefore, EdTech is a rapidly growing and continually developing field of research, as the needs and expectations of students, labor markets, and increasing global competition force changes in the education system. Universities and schools face many challenges in the ways they teach. According to the Digital Education Survey [?], 75% of US teachers believe that digital learning content will totally replace printed textbooks within the next ten years, and 42% of US classrooms use a digital device every day. Laptops, desktops, and tablets are the most common devices used in the classroom, with more than half of teachers saying each is used at least weekly.
More and more schools and universities include technology-facilitated content delivery in their programs. Usually, it is used in the form of blended learning, i.e., videos, apps, websites, games, and massive open online courses (MOOCs, e.g., Coursera) [12]. The most popular tools are computer-based simulations, online quizzes and exams, recorded video lectures, video conferences, and webinars. However, mobile applications are still rarely used in education and mostly repeat the functionalities of other platforms [13].
Augmented reality (AR) is one of the most promising and fastest-growing technologies. In 2022, this market was valued at $38.56 billion, and it is estimated to reach $597.54 billion by 2030 [14]. Furthermore, analysts from Business Insider Intelligence estimate that by 2024 the number of users of this technology will grow above 1.7 billion. Moreover, EdTech was valued at $254.80 billion in 2021 and is expected to reach $605.40 billion by 2027 [15].
AR is an interactive experience which enriches the real world with digitally-generated images and sounds. The possibilities to use that functionality are numerous, for example it can provide a novel and engaging method of acquiring knowledge and information during a teaching or learning process. Surveys and reports indicate that most students remembered AR-supported lessons better and concluded that AR is a more memorable environment than laboratory-based demonstrations [16,17].
The most common use of AR can be seen in mobile apps. In contrast to Virtual Reality (VR), AR does not require expensive hardware. According to the Pew Research Center, 73% of teens have access to a smartphone, so AR is available to the majority of the target group [18]. Therefore, this technology has great potential to be used with printed materials, for example with AR-ready illustrations in a coursebook that come to life on a user’s phone and allow for interaction and in-depth analysis. The ability to move from 2D non-interactive educational illustrations to 3D interactive ones makes education more accessible and engaging.
Among the most significant trends in EdTech, AR has the leading position [19]. The use of AR can grant students additional information and facilitate the understanding of complex concepts. AR applications are usually applied to catch students’ attention [20] and explain abstract and difficult concepts [21].
Despite the increasing interest in this topic, we are aware of just a few relevant survey papers. One of the most comprehensive works is [22], where the authors reviewed AR applications intended to complement traditional curriculum materials for K-12. However, their literature search was conducted in 2012; to our knowledge, no such extensive review has been published since. Additionally, other factors motivated us to conduct this research. The first is the arrival of fifth-generation mobile technology, whose implementation heralds massive changes in the field of virtual and augmented technologies [23]. The second is the COVID-19 pandemic, which forced the development and implementation of such solutions in education. For the purposes of this research, two complementary sources were used. The review is based on the 2018-2022 literature. For Section 8, however, the Cordis database of projects [?] funded by European Union (EU) Framework Programs for Research and Innovation (FP1 to Horizon) was analyzed. It comprises the newest projects, launched recently and not yet the subject of a proper scientific review. This second source was chosen to present state-of-the-art trends and potential directions of the development of AR in education, with the most recent data available.
In line with this scope, the main research questions addressed by this paper are:
  • RQ1: What types of applications (in the context of technology) are the most popular / used? What is the core AR technology? What external devices are used? What senses are stimulated?
  • RQ2: In which areas of education does AR technology have the highest demand / is the most popular? Why? Which of them are at the most advanced stage of development / are already in use?
  • RQ3: What kind of validation methods are used in testing AR educational applications?
  • RQ4: Has the current world situation (COVID-19) influenced in any way AR in education?
What follows is an overview of the major trends, opportunities, and concerns associated with AR technology in education. In Section 2, we briefly introduce key aspects of AR technology. Section 3 presents the literature-search-based methodology, followed by a general qualitative data analysis in Section 4. Section 5 answers RQ1, discussing technologies used in AR educational applications. Then, we put together the most interesting educational AR solutions in Section 6, answering RQ2. RQ3 is discussed in Section 7, where we provide a comprehensive review of methods for evaluating the effectiveness of such applications. Furthermore, we discuss the educational advantages of AR applications in Section 8 and conclude the paper with a discussion and potential future research (taking into account RQ4) in Section 9.

2. Background

2.1. Augmented Reality

AR consists of the overlapping of virtual elements in the user’s perceptual space, creating the illusion that these synthesized elements are also real. In contrast to VR, AR does not replace reality with an immersive and synthetic environment, but rather combines them with the real elements that surround the user, increasing or conditioning his perception of the real scene. In VR, the user does not directly observe the real scene that surrounds him − he is immersed in a fully synthesized environment − although it is intended that this virtual environment (usually 3D and photo-realistic) is perceived as being real [16,24].

2.2. AR definitions and concepts

The term AR was coined by Caudell and Mizell in 1992 [25], when they proposed a head-mounted display that would help Boeing 747 assembly-line workers in construction and assembly tasks (see Figure 1).
However, the systematization of the concept of AR, as it is currently known and accepted by the academic community, was presented by [16], based on the contribution of [26], when the author argued that an AR visualization system must present three properties:
  • Combines real and virtual.
  • Interactive in real-time.
  • Registered in 3-D.
Various definitions have been proposed [16,24,27,28,29], all of which remain very close to the original reference by Milgram and Kishino [26], who developed a conceptual framework for the theme of Virtual Environments (VE), resulting in a taxonomy that still enjoys acceptance.
Figure 2 represents the classification of real-virtual worlds, ordering them by their degree of mixing the real and the virtual. At the extreme left, we have the experience of the real as it is lived in a "natural" way, without the need for any visualization system. Thus, according to [26], the visualization device that comes closest to the Real Environment is the one that allows the user to look directly at the real world without any mediation process.
On the opposite side, we have total immersion in a virtual environment, which typically corresponds to VR systems, in which the entire environment is synthesized and virtual. In between, we have a whole set of applications called Mixed Reality, which treat the real and virtual scenarios in a hybrid way. AR superimposes virtual information on the real scenario; Augmented Virtuality overlays real elements on the virtual stage. There is no absolute criterion in the virtuality continuum that allows fixed boundaries to be drawn between concepts, only qualities that can be relativized and compared.

2.3. The precursors of AR

We can go back at least to the nineteenth century to find inventions that conceptually share the main characteristics of AR, even though they were not given that name or systematized in the same way. One of the most notable is the ghost effect used in the theater in the middle of the nineteenth century to superimpose, in the spectator’s field of view, images coming from two physically separated scenes by means of a glass pane. This expressive technique, which aimed to enrich the narrative with a dramatic visual effect, was popularized under the name of Pepper’s Ghost [30] (see Figure 3 (a)).
During World War II, the British military’s Mark VIII Airborne Interception Radar gunsighting windscreen project developed a system very close to the concept of AR, superimposing in the pilot’s field of view information coming from the radar and showing whether the target was a friendly or enemy plane [31]. Between the post-war period and the development of the see-through head-mounted display by Ivan Sutherland, which provided the first true AR experience, several devices saw the light of day in a timid and scattered way. Better known as one of the first Virtual Reality devices, Sensorama (see Figure 3 (b)), developed in 1962 by Morton Heilig, was a system that stood out for trying to create a totally immersive experience for the user, involving senses such as touch and smell in addition to sight and hearing [32].
The development of the Sketchpad system during his PhD project in 1963 made Ivan Sutherland famous as one of the pioneers of computer graphics and Graphical User Interfaces (GUI). Its input system, based on the light pen, favored the concept of direct manipulation later systematized by [33] within Human-Computer Interaction. However, it is because of his work on the design and creation of the see-through head-mounted display (see Figure 3 (c)) that Sutherland is today, without any doubt, considered a crucial milestone in the history of AR.

3. Methodology

This study was conducted as a scoping literature review (SLR). The implemented methodology was based on good practices and guidelines proposed by other researchers [34,35,36]. The goal of this SLR was to assess the current state of AR across a wide area of education. The literature search was performed according to the protocol presented in Figure 4 on November 20, 2022, in the Scopus database. The search string was ("augmented reality") AND ("education") OR ("learning") OR ("teaching"). The results were limited to open-access articles written in English in the past five years, as these are viewed and cited more often than articles with limited access [37,38]. Moreover, openly available content leads to greater public engagement, faster impact, and increased interdisciplinary conversation between researchers.
The initial search indicated 2546 papers: 1482 journal and magazine articles, 713 conference proceedings, 269 reviews, 25 editorials, and 17 book chapters. In the further analysis, only open-access papers (933 in total) were taken into consideration.
Next, we analyzed the content of all articles which met the following inclusion criteria:
  • the content of the paper should be relevant to preschool, primary, secondary, or higher education;
  • the paper must present at least the preliminary version of an AR tool (no sketches, drafts, and paradigms);
  • the tool described in the paper should be applied to learning or at least tested among students and/or educators;
  • the paper should report the effect of the provided tool;
  • the full paper should be freely accessible.
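For illustration, the screening step above can be expressed as a simple filter over bibliographic records; the field names in this sketch are hypothetical, chosen only to mirror the inclusion criteria:

```python
# Screening sketch: apply the inclusion criteria to bibliographic records.
# The record fields (education_level, presents_ar_tool, ...) are hypothetical
# names used purely to illustrate the criteria listed above.

EDUCATION_LEVELS = {"preschool", "primary", "secondary", "higher"}

def meets_inclusion_criteria(paper: dict) -> bool:
    return (
        paper.get("education_level") in EDUCATION_LEVELS  # relevant level
        and paper.get("presents_ar_tool", False)          # at least a preliminary tool
        and paper.get("tested_with_users", False)         # applied or tested with users
        and paper.get("reports_effect", False)            # reports the tool's effect
        and paper.get("open_access", False)               # freely accessible
    )

def screen(papers: list[dict]) -> list[dict]:
    """Keep only the records that satisfy every inclusion criterion."""
    return [p for p in papers if meets_inclusion_criteria(p)]
```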
The above inclusion criteria limited the results to 598 research papers. The data gathered from the included articles covers the following:
  • Simple publication details such as title, authors, year, etc.
  • What are the most frequently used keywords in the AR articles?
  • Which sub-branch of education are the AR studies concentrated on?
  • What are the types of AR applications?
  • What are the technologies used for AR applications?
  • Which senses do the AR applications engage?
  • What is the type of AR application validation?
All the data mentioned above is analyzed further in this paper.

4. General qualitative data

Figure 5a presents the distribution of publication years of papers from the Scopus database according to our criteria. In general, the graph shows an increasing tendency, which could be expected for a developing modern technology. One can also observe a slowdown in the increase between 2019 and 2021. Since 2016, the number of open-access publications has increased significantly due to the worldwide scientific trend of increasing the accessibility and availability of academic papers. In addition, there is a significant decline in post-conference publications between 2019 and 2021. The changes noticeable in the chart were most likely driven by the international situation caused by the COVID-19 pandemic.
Figure 5b illustrates the distribution of keywords in the analyzed articles. AR is clearly the most commonly used one; however, it was not indicated as a keyword in some articles that actually discuss its application. This may have several causes, the first being that some researchers decided to use the broader or related concepts of Mixed Reality or Virtual Reality. The second is that in some papers AR is not treated as the main subject but rather as a tool to reach an assumed goal, e.g., in education or e-learning. Finally, it is worth noticing, as it is rather unexpected, that only a few articles discussing AR in education use keywords such as teaching, learning, education, or students.
Figure 6a demonstrates the distribution of papers by continent. Works from Asia constitute almost 50% of all works, owing to the abundance of papers from Indonesia, an issue discussed later in this analysis. The European contribution to the field of AR comprises about a quarter of all papers, with North America following at approximately 15%. The aggregated contribution of the other continents amounts to 6.2% of all articles.
As can be noticed in Figure 6b, which presents the distribution of papers according to the country of origin, Indonesia is an uncontested leader. However, most of these papers were published as post-conference proceedings (Journal of Physics: Conference Series), which are indexed in the Scopus database. The scope, level of detail, and scientific value of these publications do not match those of scientific journals, because of completely different requirements and goals. Thus, the leading position of Indonesia in AR publications should not be taken at face value, as it stems from a singular conference event. The second position is occupied by the US, the factual leader in AR studies. The next positions are distributed quite evenly among Asian and European countries, with some contributions from Mexico and Australia.
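The percentage breakdowns discussed in this section are plain frequency aggregations over the screened records; a minimal sketch with made-up numbers (not our actual data):

```python
from collections import Counter

def distribution(values):
    """Return the percentage share of each distinct value, rounded to one decimal."""
    counts = Counter(values)
    total = sum(counts.values())
    return {k: round(100.0 * v / total, 1) for k, v in counts.items()}

# Illustrative input only; the real analysis runs over the 598 screened records.
continents = ["Asia"] * 5 + ["Europe"] * 3 + ["North America"] * 2
shares = distribution(continents)
# shares == {'Asia': 50.0, 'Europe': 30.0, 'North America': 20.0}
```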

5. Technologies used in AR applications

Most AR-based technologies combine hardware accelerometer and gyroscope information with SLAM [40,41,42] and other feature-matching techniques. The captured video is used to localize the device and accurately map a virtual overlay onto the real world [43].
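As a minimal illustration of the sensor-fusion principle behind such tracking, the sketch below implements a one-axis complementary filter that combines a drifting gyroscope rate with a noisy but drift-free accelerometer angle; production SLAM pipelines are of course far more sophisticated (Kalman filtering, visual features, loop closure):

```python
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Fuse a gyroscope rate (fast but drifting) with an accelerometer
    angle estimate (noisy but drift-free) into one orientation angle.
    `alpha` weights the integrated gyro path against the accel reference."""
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

# Toy scenario: a stationary device whose true pitch is 10 degrees.
# The gyro reports ~0 deg/s; the accelerometer reads ~10 degrees.
angle = 0.0
for _ in range(200):
    angle = complementary_filter(angle, gyro_rate=0.0, accel_angle=10.0, dt=0.01)
# The estimate converges toward the accelerometer reference of 10 degrees.
```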

5.1. Software

For AR development, many development tools are available, all with varying benefits and drawbacks. Some of the most well-known are Apple ARKit [44] for iOS devices, ARCore [44] for Android devices, and Vuforia Engine [45] for both, albeit with a paid business model. There are also AR development platforms and SDKs for more specialized use cases, such as the Bosch Common AR Platform [46] for the automotive industry, SmartReality [47] for construction, Inspace (River Fox) [48] for CAD visualization, HyperIndustry [49] for technical operators, and DAQRI Worksense for more industrial use cases.
Some toolkits, such as holo|one sphere [50], are meant for easy setup and prototyping. Others are developed by larger companies specifically for their own platforms, like Amazon Sumerian [51] for Amazon, ARCore for Google, and Spark AR Studio [52] for Facebook. Even though most AR toolkits have their own dedicated development platforms, a large portion of them either support or are supported by well-known game engines like Unity [53,54] or Unreal Engine [55]. As new toolkits and development engines are created frequently while others become obsolete, there are many more that are not mentioned here [56,57,58,59,60,61,62,63,64,65,66,67].

5.1.1. Hardware

Hardware-wise, AR applications can be run on heads-up displays, holographic displays, smart glasses, handheld devices or fully immersive goggles with a camera pass-through.
AR devices use a wide selection of methods to project or display an image. Immersive goggles and handheld devices almost exclusively use a single screen or binocular VR screens to display an image. HUDs, holographic displays, and smart glasses use more varied methods, ranging from holographic images projected onto intermediate screens, to light-field technology using multiple LCD screens with a combined backlight, to waveguides that project the light straight onto the retina. A sample of the differences can be seen in Figure 7.
Devices meant mainly for AR include Microsoft HoloLens 2 [69], Magic Leap One [70], Google Glass [71], and Meta 2 [72], to name a few. There also exist more specialized AR hardware options, such as the Daqri Smart Helmet [73]. The display AR devices most often used in education are juxtaposed in Table 1. In addition, Figure 8 shows the variation in the equipment used in education throughout recent years. As one can see, mobile tools are the most commonly used (~80%). As far as AR goggles are concerned, only the HoloLens comes into play at this point (~20%).
Only a few works mention the application of external devices used to deepen the users’ immersion and enhance their experience. This is often associated with an additional analysis of the user’s movement, e.g., using a Kinect [74] sensor or an external device necessary to perform the exercise [75]. The most commonly used external devices in educational AR apps are presented in Table 2.

5.2. Assets used in AR applications

AR applications follow slightly different rules when it comes to assets and what can be shown to the user compared to other high-end developments. Objects placed in an AR space need to look and feel as if they are actually present in that space, not appear as simple overlays. This is usually achieved with high-resolution models and textures, reflection probes sampled from the camera [92,93], lighting color and direction estimation [94,95], surface and people detection [96], surface occlusion [97], virtual shadows [98], and subtle post-processing effects [99]. All of these bring an AR object closer to looking like a real-life object. However, all of these systems are very costly in terms of computational power, which is especially important considering that AR devices tend to fall into the lower end of computationally powerful devices. Nevertheless, there are workarounds: decreasing the number of polygons and replacing detail with baked normal maps, using simpler tracking algorithms such as ArUco markers [100], or intentionally forgoing hyper-realism.
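Marker-based tracking owes its low computational cost to the simplicity of fiducial decoding. The sketch below decodes a toy 4x4 binary marker into a rotation-invariant ID; real ArUco dictionaries are carefully designed for error detection and inter-marker distance, so treat this purely as an illustration of the principle:

```python
def rotate(grid):
    """Rotate a square bit matrix 90 degrees clockwise."""
    return [list(row) for row in zip(*grid[::-1])]

def grid_to_id(grid):
    """Read the marker bits row by row as one integer."""
    bits = [b for row in grid for b in row]
    return int("".join(map(str, bits)), 2)

def decode_marker(grid):
    """Return the smallest ID over all four rotations, so the decoded
    ID is independent of the marker's orientation in the image."""
    ids = []
    for _ in range(4):
        ids.append(grid_to_id(grid))
        grid = rotate(grid)
    return min(ids)

# Toy marker: a single set bit in the top-left corner.
marker = [
    [1, 0, 0, 0],
    [0, 0, 0, 0],
    [0, 0, 0, 0],
    [0, 0, 0, 0],
]
# All four rotations of `marker` decode to the same ID.
```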

5.3. AR environments, scenarios and limitations

AR scenarios tend to be heavily influenced by the hardware they run on and by the toolkits used to create them. Heads-up displays and holographic displays generally work through optical guides that project an image into the user’s eye or onto a transparent screen. Because of this, holographic displays add color on top of real-world colors instead of replacing them. This limits color variation in AR quite heavily and often gives a bloom effect to any displayed model or environment. True blacks and darker colors can only be achieved on fully immersive or camera pass-through systems, as those do not possess such limitations [68]. AR has far fewer constraints than its fully immersive VR counterpart. AR systems do not require open spaces with no physical obstacles, as the user is still able to see the real world. Limitations on lighting are less strict, as the localization methods are more robust, although inherently less accurate. All of this means that AR is a tool that can be used in almost any given setting, whether a controlled office environment, a classroom with multiple people, an industrial site, or even outdoors [101,102]. Some limitations that might arise relate to low-light conditions and non-stationary environments. AR headsets do require moderately good lighting, as the device’s camera cannot perform feature detection on dark or blurry images. Similarly, fully in-motion environments, such as a moving car, produce additional accelerometer readings that current AR algorithms do not account for [103]. Another benefit of AR compared to traditional VR is that, as the real-world environment can be seen through the device, it reduces the likelihood of nausea due to contradictory senses [2]. This means that AR can be more easily used in educational environments and by less experienced users.

6. Taxonomy of scientific papers

To present the trend and development directions in relation to a specific field, we have gone through the latest publications using the methodology presented in Figure 4. In the analyzed scientific papers, just a few types of scenarios can be found and easily distinguished. A great part of them introduces simple visual solutions. Users can get familiarized with educational content by observing prepared augmented items. There are also more advanced scenarios where learners are more engaged in performed exercises. Apart from watching they have to interact with non-real objects, which may be called "learning by doing" – a teaching method that provides very good educational results [104]. In another type of scenario, AR is implemented to evaluate the user’s performance in a given task. Such a user gets virtual feedback and can make their own reflections on their work. The type of AR scenario needs to be adjusted to the learner’s age and skills. Moreover, the field of study or taught subject determines more preferable tasks and level of user involvement. A summary of the scenarios used in AR educational tools is presented in Table 3.
Since AR allows adding supplementary information next to real objects, it is usually used as a tool to visualize elements that cannot be easily observed in a safe environment or cannot be observed at all. This feature indicates possible applications in domains such as engineering (machine and robot simulation, architecture) [108], life science (physics, chemistry, biology, astronomy) [109], general education, medicine, arts/humanities, and special needs. The percentage distribution of those domains, based on the above-mentioned search criteria, is presented in Figure 9. In this section, we present the most interesting and recent applications related to those educational domains, as well as their content analysis.

6.1. AR in engineering

It is believed that the pioneering AR technology was the application implemented in 1992 for the needs of the aviation giant Boeing [110]. The application was used to improve the assembly of electrical circuits by presenting assembly manuals via a HUD (Head-Up Display) system. Over its nearly 30-year history, the use of AR in engineering has evolved considerably, and new technological solutions have provided increasingly better tools for its successful implementation: from simple mixing of live camera images with computer-generated ones [111], through transparent displays mounted in goggles [112], to the latest solutions such as displays in contact lenses [113,114]. The most popular engineering domains using AR applications for educational purposes are presented in Figure 10.
A very interesting but underestimated form of AR presentation is video mapping [115], where real objects are enriched with visual information generated by laser projectors. The modified environment can be used to draw students’ attention, support rapid prototyping [116], signal danger, or imitate specified properties such as hardness, humidity, etc. This approach eliminates the need for additional equipment such as goggles or a mobile display. Engineering AR applications can be divided into three groups based on their purpose: enriching the real environment with additional information [117], explaining or instructing the user [118], and drawing the user’s attention to important elements. A good example of such a solution is a driver-assistance system that supports drivers through road-sign recognition and interpretation [119]. Another interesting AR solution is a help-desk video communication application, which, apart from voice consultations, provides additional information displayed on real images [120]. The possibilities of AR technology are widely used to illustrate or explain the operation of complex spatial constructions, which is a very effective alternative to two-dimensional forms of communication such as illustrations and printed technical documentation. An example of such an approach is the educational application used to illustrate the principle of electro-mechanical mechanisms [121]. Learners equipped with HoloLens glasses or a tablet can observe the internal elements of a particular mechanism tested on the laboratory stand (see Figure 11). It is possible to identify components and their locations, explore the mechanism, and thus more easily identify the kinematic chain or the transmission power flow. The application was tested among students in engineering training, as well as bachelor-of-technology students, divided into two groups: one using AR and one using only paper documentation and CAD. The assessment indicated improvements for AR users.
Modern technological solutions, the assumptions of Industry 4.0, the omnipresent automation of processes, and contemporary ways of managing production result in evolving expectations toward future engineering staff. This leads to a necessary change in technical education to meet the needs of the market. Study curricula are being enriched with subjects run with AR and VR support, which makes it relatively easy to organize simulated teamwork on projects [122]. This kind of solution closely imitates the specifics of future work under the supervision of industry experts. The authors of the implementation obtained very good educational results, while the costs incurred for virtualizing education remained relatively low. Similar systemic solutions are currently being implemented in German tertiary education [122] in projects such as ELLI ("Excellent Teaching and Learning in Engineering Science") or the MOOC program. The creation of virtual laboratories with various industrial machines and devices allows every technical school to use the same advanced training opportunities. The second stage of this project involves implementing AR-supported interfaces as a basic form of interaction with virtual laboratories. In [123], two mobile AR solutions were analyzed. The applications were designed to teach students about Karnaugh maps; the first was keypad-based, while the second was marker-based. The researchers posed a question about the usability (effectiveness, efficiency, and satisfaction) of each application and employed the System Usability Scale and the Handheld Augmented Reality Usability Scale for that purpose. The keypad-based application proved to score higher on the usability scale. Education is not confined to school or university desks; it continues well into professional life, regardless of the chosen career. AR can also be useful in optimizing manual activities at a workstation.
Modern workplaces are equipped with systems of sensors, cameras, and AI able to analyze the manual labor of a worker and, based on that, suggest the optimal workflow in AR [124]. The advantage of this solution is the constant adaptation of the system to individual psychophysical abilities, taking into consideration their development and other factors which might escape analytical definition.

6.2. AR in life science

The second most popular domain in which AR may be applied as an educational tool is life science. The percentage distribution of the most popular life science domains using AR for educational purposes is presented in Figure 13. More than half of these applications concern subjects such as physics and chemistry. AR is a great solution for simulating phenomena that are impossible to replicate in the real world, facilitating the understanding of complicated physical and chemical reactions.
An important case is presented in [125], where the authors assessed the impact of learning enhanced by a marker-based augmented reality environment. They created an application to clarify light absorption and refraction, mirrors, and lenses. Forty-five 7th-grade pupils of a state secondary school, divided into an experimental group and a control group, participated in the study. Based on pre-test and post-test findings, the authors demonstrated that the augmented reality application is successful in terms of students’ academic achievement and the longevity of their learning. Similarly, in [126] the researchers used the HoloLens headset to support the education of faculty members in the area of biological molecular structures. The study displayed high objective and subjective effectiveness of the tested AR application, with a high level of engagement. The participants of the AR training noted the ease of understanding of the presented animated structures, which explain complex biological processes in a clear way. In [127] the authors created AR note-cards rich in content, utilizing the free HP Reveal app and a smartphone. The information on the note-cards is based on Organic Chemistry I reactions, and the cards present solely a reagent and a substrate. When students aim their smartphone camera at the cards while using the HP Reveal app, an AR video is played presenting the product of the reaction as well as a real-time explanation of the mechanism of its formation. The HP Reveal app was also used to generate AR videos on laboratory equipment, presenting a virtual expert guiding the user in setting up and operating the device. A very interesting example of the use of AR in life science education is presented in [128]. The authors introduce the Real-World Oriented Smartphone AR Learning System (R-WOSARLS), an AR smartphone tool for seasonal constellation observation. The data is based on information from the planetarium of the Nagoya City Science Museum.
The creators of the tool conducted two experiments to evaluate the effectiveness, educational value, and learner satisfaction of the system in tertiary and secondary educational facilities. The results show that R-WOSARLS facilitates knowledge acquisition for constellation observation and learning, and it boosts learners’ motivation to gain additional knowledge about astronomy. AR applications in biology are also ubiquitous. For example, in [129] the authors presented an AR pervasive game that aims to promote learning about the plants in the local environment. Players collect plants in their area and grow them at home to compose a garden, all done in AR. The game links physical elements, like plant pots with sensors and RFID tags holding information on the plant, with virtual elements, like AR representations of the plant. The aim of the research is to study the change in experience when the whole game is transferred back to the real world and thus explore the educational benefits that can be achieved when the game remains in AR. A summary and discussion of AR/VR use in the field of biology can be found in [130], where the authors focus on answering questions that are vital for productive AR/VR implementation. Firstly, they discuss the production and dissemination of AR/VR-ready materials that are user-friendly and widely available. Secondly, they analyze positive and negative experiences reported by test subjects when performing identical tasks in AR and VR environments. Finally, they study the subjects’ perception of pre-recorded narration during AR/VR immersion. As far as biology is concerned, most papers focus on human anatomy and biological functions; these are presented in the next section.
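Pre-test/post-test comparisons such as those in [125] are often summarized with Hake's normalized gain, ⟨g⟩ = (post − pre) / (max − pre), which expresses how much of the possible improvement a group actually achieved. A minimal sketch with purely hypothetical scores:

```python
def normalized_gain(pre, post, max_score=100.0):
    """Hake's normalized gain: fraction of the possible improvement achieved."""
    if not (0 <= pre < max_score) or not (0 <= post <= max_score):
        raise ValueError("scores out of range")
    return (post - pre) / (max_score - pre)

# Hypothetical group means on a 0-100 test.
print(normalized_gain(40.0, 70.0))  # 0.5  -> e.g. an AR group closes half the gap
print(normalized_gain(40.0, 55.0))  # 0.25 -> e.g. a control group
```

Dividing by the remaining headroom (max − pre) rather than by max makes gains comparable between groups that started from different pre-test levels.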

6.3. AR in medicine

AR can also be implemented with great success in the field of medicine. Virtual simulations allow the creation of realistic, immersive environments for the effective training of professionals in medicine and dentistry. AR and VR offer the opportunity to gain experience in emergency situations and to practise decision-making without real consequences for patients.

6.3.1. AR in medical education

Using AR and VR in medical education has many advantages, such as higher learner engagement and greater motivation, development of spatial knowledge representation, enhancement of technical abilities, improvement in the contextualization of learning, the possibility of adjusting the level to the learner’s needs, and also skills development individually or in a team [131,132]. The literature offers several good examples of the implementation of AR in medical training and education. Anatomy is one of the most important courses in all kinds of medical studies, from physiotherapy and nursing to medicine. Even simple AR and VR apps allow users to visualize all body parts, such as bones, joints, muscles, and organs. Apart from in-depth visualization, students may see how a selected body part works, which gives an impression of reality. Such apps increase students’ anatomical knowledge and improve the 3D understanding of anatomical structures [90]. Moreover, users are able to use them during the course but also individually after classes for revision. In [133], the authors test a mobile application dedicated to the teaching–learning process of the spinal cord. Using AR technology, it was possible to capture the abstract nature of this body part. The learning process was evaluated as very fruitful, and students were able to explore interactive 3D rotating models on the macroscopic scale and to familiarize themselves with theoretical content, animations, and simulations regarding the physiology of the spinal cord. More complex systems are investigated as well. The combination of two medical fields can be even more interesting, for example, the combination of anatomy and radiology, as in [90], where the AR Magic Mirror is tested. The Magic Mirror is a system enabling users to explore anatomical structures in correlation with medical images using a picture of their own body.
In the Magic Mirror, users see a reflection of themselves with virtual information superimposed on a display that acts as a digital mirror. The system is presented in Figure 14. Such an AR Magic Mirror system can be incorporated successfully into medical curricula and shows great potential as a teaching device for anatomy courses. It provides a unique learning experience, and students’ results are comparable with those achieved with traditional learning methods such as atlases and textbooks. Moreover, students with poor spatial ability and three-dimensional imagination can take advantage of learning with the app. Traditional methods combined with augmented and virtual reality provide extraordinary learning results, as proven in [90,134,135].
Mixed reality involves a "learning-by-doing" approach and may help students practice crucial skills such as motor skills. In [134] the researchers created an AR app to develop the skills needed for an ultrasound examination. A considerable asset to learners is the immediate feedback they receive and the possibility to follow various scenarios and virtually examine different patients. Not without significance for users is the freedom to train in terms of time and place at a reasonable cost. Both AR- and VR-focused medical learning tools are in constant development as the requirements set by medical professionals are defined more clearly. An example of this is the VR Anatomy project (VRAna), which aims to improve the level of detail in skeletal anatomy learning tools [136]. The medical training field also employs digital twins for anatomy training and patient-based training [137]. AR has its uses both in training and outside of it, as an assistance tool to improve success rates. One such tool is xVision by Augmedics, which overlays the 3D virtual anatomy of a patient over the real-life patient [138]. AR is also being used to teach students central venous catheterization at their own pace [139]. In terms of VR medical training and procedure planning, there are many examples. One such VR procedure-planning software is Surgical Theater, which allows neurosurgeons to rehearse neurosurgical procedures and even demonstrate them to patients before the actual procedure takes place [140]. Similar to the tools mentioned above are programs like FundamentalVR, which allow surgeons to rehearse, practice, and improve their surgical techniques in a haptic-controlled environment. This is provided to users through both VR systems and AR systems like Microsoft’s HoloLens [141].
A very similar tool that promotes very similar training outcomes is OSSO VR, also a surgical training tool with haptic feedback, which additionally includes multiplayer support, as seen in Figure 15 [142]. For more general training and more realistic scenarios, there is Health Scholars, an AI-enabled performance assessment tool designed specifically for first responders and clinicians. The tool addresses emergency care training for adult and pediatric scenarios in pre-hospital, general care, perioperative, and obstetrical settings [143].
Most of the applications mentioned in this section cover specific topics in medical teaching and are primarily meant as additional teaching aids. There are also AR applications, like HoloAnatomy, that are meant to cover an entire curriculum in digitized form, as represented in Figure 16.
Both VR and AR are prominent in medical education and in assisting medical professionals during procedures. VR appears to be more prominent in the training phase, while AR is more common when actual patients are involved. This is the case for training on real patients, for informing patients about upcoming procedures, and even for assisting medical professionals during the procedures themselves.

6.3.2. AR in medical treatment

According to [145], the scope of AR application in medicine includes not only training (education) but also treatment, which concerns operating rooms, therapy, and rehabilitation. There is plenty of room to act in surgery planning. Recently, a few AR solutions have become available on the market. In 2018, the FDA approved the first AR medical application, Novarad’s OpenSight AR System [?]. The OpenSight AR System works with the Microsoft HoloLens headset and allows surgeons to adjust the surgery plan to the actual case: 2D, 3D, and 4D pre-operative images (from MRI, CT, or PET) are overlaid onto the patient’s body in real time [146]. Another FDA-approved AR solution is GLOW800 AR surgical fluorescence for vascular neurosurgery, delivered by Leica Microsystems [?]. Thanks to GLOW800, surgeons have a complete view of anatomy and physiology; they can observe cerebral anatomy and real-time vascular flow in a single image, which helps them to make key decisions and take key actions during vascular neurosurgery.

6.4. AR for special needs education

Students with special educational needs tend not to benefit very much from traditional classroom-based teaching methods [147]. They tend to struggle with the rapid introduction of new topics and with the lack of examples, explanations, practice, and reviews needed to cover multiple topics [148]. Even though this is an age-old problem, recent advancements in AR technology have brought new attempts at a solution. Studies that cover AR for the disabled, or that develop such applications, are becoming more frequent as the technology advances. As it is a relatively new field, there are some contradictory results. For example, it is currently unclear whether AR increases or decreases the cognitive load on students, as multiple studies contradict each other. What is known is that it is a new, emerging technology, and both students and teachers require time to get used to it [149]. Regardless of the cognitive load, most works relating to AR with disabled students have shown that it is an effective means of independently teaching students all sorts of topics, ranging from way-finding, numeracy, shopping, emotion recognition, and literacy all the way to physical skills. More importantly, this appears to be true for students with a very wide range of disabilities, including intellectual disabilities, autism spectrum disorder, attention deficit hyperactivity disorder, and Down syndrome, and even students who are visually or hearing impaired [150]. In [151], the authors proved that immersive technologies such as AR and VR are an effective alternative for teaching and learning classical Orientation and Mobility tasks among students with visual impairment. Moreover, both students and instructors were enthusiastic about this technology.
A core issue and a crucial question about the technology is whether it will stay engaging after it has lost its novelty factor. For students with attention disorders, it seems to be one of the things, if not the main one, keeping their attention focused on the learning aspect [152]. AR applications developed for mentally impaired children include an AR tabletop game that teaches children mathematics and how to handle real-life currencies [153]. In the field of geometry, there is an application that keeps students motivated through the use of AR geometric puzzles. It is especially promising as it proves that disabled students can finish tasks on their own, without help or constant supervision from mentors [154]. There are AR apps for those who either have difficulties learning speech or have to re-learn it due to trauma. The idea proposed by K. T. Martono and his team was not evaluated on real patients but has been deemed useful by therapists who work directly with the intended audience [155]. In some cases, AR is even used to teach mentally impaired children about their regional fruits and vegetables; in the case of Fancy Fruits, however, the studies were only short-term and the long-term outcomes are uncertain [156].
As there are so many solutions and they have been proven to work in most cases, some authors have decided to delve deeper into the theory behind why and how these systems work as an effective means of teaching. Colpani and Homem, for example, have not only explained the importance of gamification for the field but they have also created a framework for future researchers of this domain [157].
It is observable that students taught with AR over the long term tend to outperform their peers taught with traditional methods [158]. If the main issues relating to cost and the steep learning curve that AR presents for teachers are resolved, its usefulness in our education systems should become apparent [149,154].

7. Application evaluation methods

Although application design and development are a core part of any AR development process, evaluation is, without a doubt, an essential activity. It enables the team to validate whether what is being developed fulfills the users’ needs and expectations, and it provides valuable feedback on issues like performance, usability, and accessibility. Evaluation, at its core, refers to the action of assessing something according to previously chosen metrics used to measure, calculate, judge, or estimate its value in a concrete context. An initial review of the existing literature outlined that establishing a standard or generally used framework for evaluating AR solutions, although worked on for more than a decade, has not reached as much consensus as the evaluation of mobile apps or of the more common websites accessed on laptop or desktop computers. For example, Bach and Scapin [159] discuss how evaluation methods used in other domains could be adopted to evaluate AR systems. When analyzing some of the work done, and the merging and bridging of methodologies and tools from other scientific fields, there are some core aspects worth considering, such as the focus on the technology or on the user experience, the nature of the data collected during the evaluation, and potential collaboration activities when using the AR solution. These core aspects are mentioned in works by Swan, Gabbard, and Dünser [160,161,162]. However, user experience is still not widely treated as a relevant issue within AR studies. A literature review by Anastassova et al. [163] points out that a considerable amount of evaluation activity in AR has focused on the elaboration of ad-hoc systems and their evaluation in artificial or informal settings. The analysis of specific user needs was mainly omitted.
If considered, it was carried out by a small group of experts through a set of quick field studies based on the activities of future users, or simply by the application of questionnaires. Unfortunately, the situation in this field is stagnant, in both general and educational AR applications. One of the most important steps in evaluating an AR application as an educational tool is testing its performance. As shown in Table 4, approximately 63% of the papers use a UX questionnaire as the main evaluation tool. While it provides necessary feedback on the user experience and the technical side of the application, it does not supply researchers with information about the actual effectiveness or performance of the app. For that goal, which is an essential step, performance verification of any kind is necessary, and it was described in only 204 articles. An ideal evaluation process, which includes both methods, was implemented in as few as 87 articles. Table 5 illustrates the most commonly used validation methods and provides their descriptions, content, and procedures in detail.
In conclusion, regarding the evaluation of AR solutions, there is no universal methodology or set of tools, but rather a balance between user experience and performance validation tasks [175]. The framework of each solution’s evaluation fundamentally depends on the attributes linked to the user experience, the developed technology, and the issue covered [176].

8. Educational advantages of AR applications

For a better insight into current AR trends in education, the CORDIS database of projects [?] funded by the European Union (EU) Framework Programmes for Research and Innovation (FP1 to Horizon 2020) and ending between 2019 and 2024 was filtered. From the obtained results, 133 projects whose outcomes are based on AR technology were selected. The published summaries of these projects were additionally analyzed; those whose results are applicable in the field of learning and training were singled out and divided according to application area, with the result shown in Table 6.
Based on the previous selection methodology, it is first possible to single out projects that deliver general-purpose educational platforms usable for different educational purposes. An example is the AR Interactive Educational (ARETE) project [?], which connects businesses and higher education to create a unique AR educational platform for the education of children in Science, Technology, Engineering and Mathematics (STEM) and literacy subjects, and for positive behavioural intervention. The project is a continuation of a previous pilot study called AdHd Augmented (AHA) [?], which developed the WordsWorthLearning AR platform [177] for enhancing learning in children diagnosed with Attention Deficit Hyperactivity Disorder (ADHD). The ARETE project is developing various educational scenarios available in mobile applications that show students additional content after pointing a tablet or smartphone at content from a book. An example of using the WordsWorthLearning platform is shown in Figure 18a. It can be noticed, however, that the largest number of AR training-related projects are in medicine, especially in surgical training, for the purpose of better and cheaper training of doctors and a reduction in the number of bad practices that can have lasting effects on the patient’s health. One example is the UpSurgeOn Academy project [?], in which AR applications are being developed in combination with the so-called UpSim physical simulator. The mobile device’s screen shows the exact position of the patient and the course of the surgery, while the neurosurgeon refines their mental and manual skills on the UpSim physical simulator. The cost of purchasing the above system is EUR 800.00 per neurosurgeon, which is negligible compared to the average cost of training a neurosurgeon, which in EU countries is approximately one million EUR.
An example of using the UpSurgeOn system is shown in Figure 18b.
The importance of cultural heritage is emphasized in all the basic documents of the EU, in particular in Article 3 of the Treaty on European Union, which states that the Union respects its rich cultural and linguistic diversity and ensures the preservation and promotion of Europe’s cultural heritage. It is particularly important to make information on cultural heritage available to all European citizens, especially when the heritage is specially protected or the aim is to display the former appearance of cultural property (e.g., underwater archaeological sites). This is the main objective of the iMARECULTURE project [?], within which special UWAR software was developed using the Unity 3D framework libraries for the purpose of 3D imaging and the presentation of underwater archaeological sites in AR. The models created serve the purpose of educating visitors by displaying protected archaeological sites in AR technology on large screens in a museum setting (the so-called Dry Visit) or on the screen of a mobile device (the so-called Underwater Visit). An example of use is shown in Figure 18c. AR has also been used to communicate artefacts and knowledge in the context of museums. One example of its potential is the AR exhibit at the Banco de Portugal’s Money Museum, in which a collection of precious coins from the 16th century can be explored interactively in a mirror setup. The visitor uses the entrance ticket to virtually manipulate replicas of the coins. AR, like other digital technologies, plays an important role in the construction of the contemporary museography experience, since it allows users to relate to the exposed contents in the first person [178]. In modern manufacturing, there is a strong connection between humans and automation, which requires new methods and tools to design and operate optimized jobs in terms of ergonomics, safety, efficiency, complexity management, and job satisfaction.
AR technology can provide appropriate interfaces between humans and machines in a smart manufacturing environment supported by a so-called industrial cyber-physical system. Such a system is planned to be implemented within the HyperCOG project [?], in which AR applications will be linked to the lifelong learning of production employees. AR technology is nowadays commercially used in automotive traffic: drivers are shown information about the movement of vehicles on a head-up display, which significantly increases their response speed and improves traffic safety. In the context of traffic learning and training applications, various flight or driving simulators are being developed to reduce training costs. One such system is being implemented in the WrightBroS project [?] as one of the first flight simulators based entirely on AR technology, as opposed to current architectures based solely on VR. An example of use is shown in Figure 18d. The use of AR technology in the field of sports and recreation is also interesting, because it helps professional athletes refine their technique while improving their performance. There are also projects, such as Atomic Bands [?], that show recreational athletes the correct movements when exercising. The key components of the Atomic Bands AR system are wearables, shown in Figure 18e, worn on the wrists or ankles, allowing athletes to learn correct movements without having to stand within the range of a camera.
The previously mentioned cases clearly show the possibility of applying AR technology to simulate business environments in which the student faces real problems that must be solved. However, technology alone cannot improve the educational process, and it is necessary to think about suitable teaching methodologies that enable the acquisition of powerful knowledge from learning about real problems implemented on AR platforms. In this sense, reference research indicates that it is necessary to apply modern educational methods in which the student is at the center, critical thinking is encouraged, and the teacher is a leader who directs the student in the learning process. Dewey promoted the philosophy of "learning by doing" and introduced the term Project-Based Learning (PBL) [179], a teaching method in which problems from real situations in practice are used as a starting point and incentive for acquiring and implementing new knowledge [180]. The potential benefits of PBL are reflected in the development of skills such as teamwork and independent work, professional and interpersonal skills, practising empathy, analysis and critical thinking, explaining concepts and communicating, independent and collaborative learning, multidisciplinary problem solving, and applying learning content to the real world. A somewhat newer approach is enquiry-based learning (EBL), which should result in strong specialized knowledge intended for a specific purpose [181], or knowledge at a higher level. Here, enquiry is a way of teaching and learning that consists of questioning, hypothesizing, and discovery. Furthermore, integrating AR into such a teaching environment can be a motivational factor due to sensory engagement, as activating multiple senses improves knowledge retention [182].
Numerous authors, based on conducted research and projects, point out the advantages of using AR in education: increased creativity, more enjoyable learning, and increased motivation to learn [183]; increased autonomy and motivation of students to use technological devices [184]; easier understanding and explanation of things that students cannot otherwise apply and observe [185]; the possibility of retaining knowledge for a longer period compared to other pedagogical methods [186]; and contextual visualization that favours long-term memory [22]. AR technology therefore has excellent possibilities for the development of new systems and covers a wide range of topics and academic levels, which is why it has already partially taken root in education and other spheres of life.

9. Conclusion and future directions

Based on the review of research and scientific works on the broadly understood application of AR in education, one cannot help but notice that the level of interest in the technology is rapidly increasing. However, the growth dynamics of AR publications slowed down in 2020, probably due to the COVID-19 pandemic. On the one hand, this is because of a significant decline in post-conference publications caused by mass cancellations of such events. On the other hand, research work on AR often requires contact with a large number of users of the implemented systems, which, given the restrictive sanitary regulations, was significantly hindered. Despite such negative factors, an increase in publications on the subject was recorded. Interestingly, significant growth in the share of Open Access publications has been noticeable for approximately three years, a consequence of changes in the business models of publishing houses and scientific research institutions. Worldwide, publications on AR applications are geographically dominated by researchers in Asia, Europe, and North America, reflecting the technological leadership of these continents. While a large portion of them comprises very worthwhile scientific publications, a disturbing phenomenon was observed: among the reviewed papers there was an unnatural overrepresentation of publications originating from Indonesia, which unfortunately differed significantly at the substantive level from the rest of the discussed works. The verification system was probably at fault here. Mobile devices are still the most widely used technological solution for AR systems. Compact size, a built-in rear video camera, autonomous use, high system standardization, and a relatively low purchase cost are the attributes favouring this solution. However, users of mobile AR systems notice some disadvantages, the most important being the low level of immersion and the discomfort of use.
Mobile devices virtually always require hand-holding, and the use of static grips severely limits an application’s functionality. The exceptions are projector-based AR systems created for users such as vehicle drivers, whose position and field of view are rather fixed. AR systems created as glasses with built-in transparent displays do not have these disadvantages. Based on the review, there is an observable increase in interest in products such as HoloLens. The upward trend stems from a gradual decrease in the cost of such devices and a more competitive choice of products (e.g., from Apple). What is more, they are by far the most comfortable form of experiencing AR. Solutions based on static workstations equipped with a camera, computer, and monitor or projector are now present only in trace quantities, attracting the interest of researchers from developing countries as the cheapest alternative to commercial solutions. AR systems are also increasingly being used as part of MR solutions. Visual markers, now past their prime, are being displaced by multi-camera object recognition systems. Real surroundings enriched with interactive 2D/3D visual forms provide numerous educational, cognitive, and assistance opportunities. They allow the user to fully concentrate on the task at hand, with the AR system acting as a teacher, assistant, or consultant. The review shows that the main form of interaction with an AR system is the image. Through the use of multiple sensors, active interaction performed using gaze has become possible. AR systems are beginning to use gaze as a complement to controllers and the indicators they control, making interactions more comfortable and intuitive for the user. Additionally, sound remains an important form of communication, including both audio and voice/speech messages.
AR systems equipped with sets of controllers often use haptic communication, which works especially well when visual attention is focused on other elements of the scene. However, there is a noticeable trend toward the complete elimination of controllers in favour of the user’s hands. The user can wear gloves with tracking or haptic systems, or bare hands can be used, with detection performed by optical tracking. In the reviewed works, other external devices for multisensory interaction with the system are occasionally found, but these should be considered niche and often highly specialized solutions. Considering the constant development of technologies such as motion capture and speech, gaze, and brain activity recognition, the future of AR systems will be based on multi-channel human–computer interaction. Special attention should be directed to haptic interfaces and muscle-electrical stimulation solutions, which can significantly improve human–computer feedback. AR technology is increasingly utilized in general education and the life sciences, especially in physics and chemistry. This can be explained by the fact that some concepts and phenomena are easier to understand when visualized. Additionally, AR allows the presentation of information in an attractive and engaging manner. Education supported by AR and VR becomes much more tailored to the way students who are tech natives experience the world [187]. Gamification elements and audiovisual content may significantly increase students’ involvement, and it has been shown that learners improve their performance using AR. What is more, AR has been implemented with great success in engineering, mainly for machine process simulations and architectural visualizations, as future engineers need to develop three-dimensional skills as well as spatial imagination.
AR supports them in gaining these essential engineering abilities by illustrating and explaining complex spatial constructions, mechanical or electrical mechanisms, and highly abstract concepts such as electromagnetic fields, and by drawing users' attention to crucial elements. Learners can practise critical decision-making and safely explore hazardous and emergency situations. Another field of education that benefits from AR is medicine. The greatest advantage of AR-based medical training is the possibility of interacting with simulated body parts and gaining experience that is closer to reality than textbook knowledge or video footage. It is noteworthy that medical training appears to be dominated by VR-based applications, whereas applied medicine is more AR-focused. Surprisingly, there is relatively little interest in solutions aimed at broadly understood support for people with disabilities, which results from low awareness of universal design in the research and scientific communities. The review also showed that, although work on a standard or generally accepted framework for evaluating AR solutions has been going on for over a decade, it has not reached the level of consensus achieved for evaluating mobile apps or websites. Most authors use standard usability-testing methods based on HCI guidelines, tailoring tools and techniques to the type of project. Choosing tools that fully suit the purpose of the evaluation is always a challenge. Available tools include questionnaires, interviews, observation grids, and monitoring-technology support; they are built according to the chosen research methodology (or a combination of more than one) and may be combined in different techniques such as focus groups, walk-throughs, real-case-scenario simulations, and think-aloud testing. The type of data and the method of its collection must be congruent with the purpose of the evaluation and the factors chosen to be evaluated.
A systematic literature review by Martins [176] compiled several commonly used features for measuring the usability of AR applications, such as ease of use and learning, user satisfaction, and application attractiveness. However, the evaluation may include additional concerns, less oriented towards user experience and more linked to the application's efficacy in enhancing the learning process. Both UX issues and the issues related to the application's aims must be given similar priority. An iterative approach during the development process is recommended to reduce the number of problems that appear during the evaluation stage. However, based on the literature review, we have noticed a worryingly low level of validation. Scientists often seem to focus on achieving an intended goal without verifying effectiveness. Unvalidated solutions should not be introduced hastily, especially in medical and engineering education, as they can encourage the acquisition of bad habits or incorrect actions. By combining an iterative approach during the development stage [188] with a rigorous attitude towards the evaluation phase, it is possible to collect objective data (e.g., task completion times, accuracy/error rates, scores, position, movement, number of actions) and subjective data (e.g., user ratings and opinions) linked to user preferences, interaction problems, system errors, and even missing functionality. Unfortunately, we have noticed that validation methods providing objective data (e.g., based on biomedical measurements) are used relatively rarely, even though their substantive value is much more reliable than users' subjective opinions or feelings. Objective validation methods should therefore be considered for measuring user involvement, creativity, focus, and emotional state. This type of validation lets the developers know exactly where and under what circumstances the user encountered difficulties.
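As a concrete illustration of the objective measures listed above, the following minimal sketch (the class and its fields are hypothetical, not a standard instrument) shows how task completion times and per-task error counts could be logged during an AR evaluation session and summarized afterwards.

```python
# Sketch of objective usability logging for an AR evaluation session:
# per-task completion time and error count. Illustrative only.
import time


class TaskLogger:
    def __init__(self):
        self.records = []

    def start_task(self, name):
        """Begin timing a named task and reset its error counter."""
        self._name, self._start, self._errors = name, time.monotonic(), 0

    def log_error(self):
        """Record one user error (e.g., wrong selection) in the current task."""
        self._errors += 1

    def end_task(self):
        """Close the current task and store its objective measures."""
        elapsed = time.monotonic() - self._start
        self.records.append(
            {"task": self._name, "seconds": elapsed, "errors": self._errors})
        return self.records[-1]

    def error_rate(self):
        """Mean number of errors per task across the session."""
        return sum(r["errors"] for r in self.records) / len(self.records)
```

Aggregating such records across participants gives the evaluator the kind of objective evidence (times, error rates) that subjective questionnaires cannot provide on their own.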
This is an incomparably better approach than a questionnaire or interview in which the user provides a subjective overall assessment. The continuous development of external devices such as wireless EEG headsets [189], eye/hand-tracking optical sensors, motion capture, and various biomarker-monitoring devices [190] should encourage researchers to use objective validation methods and thus prevent the creation of useless or even counterproductive applications. In summary, we should expect continued vigorous growth of interest in the implementation of AR systems in education, especially in teaching and learning at Higher Education Institutions, all the more so because many of these systems can be successfully used to develop the skills students will need in their future jobs and to support remote communication, control, and management systems.

Author Contributions

Conceptualization, D.K. and G.Z.; methodology, D.K. and G.Z.; investigation, D.K., G.Z., A.L.L., R.R., M.V., E.P., F.U., M.L.H., R.E.H. and G.A.; resources, D.K., G.Z., A.L.L.; writing—original draft preparation, D.K., G.Z., A.L.L., R.R., M.V., E.P., F.U., M.L.H., R.E.H. and G.A.; writing—review and editing, D.K., G.Z. and A.L.L.; supervision, D.K.; project administration, D.K.; funding acquisition, D.K. All authors have read and agreed to the published version of the manuscript.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Kamińska, D.; Zwoliński, G.; Wiak, S.; Petkovska, L.; Cvetkovski, G.; Di Barba, P.; Mognaschi, M.E.; Haamer, R.E.; Anbarjafari, G. Virtual Reality-Based Training: Case Study in Mechatronics. Technology, Knowledge and Learning, 2020; 1–17. [Google Scholar]
  2. Kamińska, D.; Sapiński, T.; Wiak, S.; Tikk, T.; Haamer, R.E.; Avots, E.; Helmi, A.; Ozcinar, C.; Anbarjafari, G. Virtual Reality and Its Applications in Education: Survey. Information 2019, 10, 318. [Google Scholar] [CrossRef]
  3. Cvetkovski, G.; Petkovska, L.; Di Barba, P.; Mognaschi, M.E.; Kamińska, D.; Firych-Nowacka, A.; Wiak, S.; Digalovski, M.; Celeska, M.; Rezaei, N.; Lefik, M.; Zwolinski, G.; Spanski, T.; Tikk, T.; Hammer, R.E.; Anbarjafari, G. ViMeLa Project: An innovative concept for teaching mechatronics using virtual reality. Przeglad Elektrotechniczny 2019, 95. [Google Scholar] [CrossRef]
  4. Kamińska, D.; Sapiński, T.; Aitken, N.; Della Rocca, A.; Barańska, M.; Wietsma, R. Virtual reality as a new trend in mechanical and electrical engineering education. Open Physics 2017, 15, 936–941. [Google Scholar] [CrossRef]
  5. Tikk, T.; Haamer, R.E.; Kamińska, D.; Firych-Nowacka, A. An interactive educational environment for the mechatronics lab in virtual reality. New Perspectives on Virtual and Augmented Reality: Finding New Ways to Teach in a Transformed Learning Environment, 2020. [Google Scholar]
  6. Pereira, E.T.; Vilas-Boas, M.; Rebelo, C.F. University curricula and employability: The stakeholders’ views for a future agenda. Industry and Higher Education 2020, 34, 321–329. [Google Scholar] [CrossRef]
  7. Weller, M. Twenty years of EdTech. Educause Review Online 2018, 53, 34–48. [Google Scholar]
  8. Anbarjafari, G.; Haamer, R.E.; Lusi, I.; Tikk, T.; Valgma, L. 3D face reconstruction with region based best fit blending using mobile phone for virtual reality based social media. arXiv preprint arXiv:1801.01089, 2017. [Google Scholar]
  9. Lee, E.A.L.; Wong, K.W.; Fung, C.C. How does desktop virtual reality enhance learning outcomes? A structural equation modeling approach. Computers & Education 2010, 55, 1424–1442. [Google Scholar]
  10. Faustmann, G.; Lemke, C.; Kirchner, K.; Monett, D. Which factors make digital learning platforms successful? 13th annual International Technology, Education and Development Conference, 2019, pp. 6777–6786.
  11. Pereira, E.T.; Vilas-Boas, M.; Rebelo, C.C. Graduates’ skills and employability: the view of students from different European countries. Higher Education, Skills and Work-Based Learning 2019, 9, 758–774. [Google Scholar] [CrossRef]
  12. O’Brien, K.; Forte, M.; Mackey, T.; Jacobson, T. Metaliteracy as pedagogical framework for learner-centered design in three MOOC platforms: Connectivist, Coursera and Canvas. Open Praxis 2017, 9, 267–286. [Google Scholar] [CrossRef]
  13. Thomas, D.A.; Nedeva, M. Broad online learning EdTech and USA universities: symbiotic relationships in a post-MOOC world. Studies in Higher Education 2018, 43, 1730–1749. [Google Scholar] [CrossRef]
  14. Grand View Research. Augmented Reality Market Size, Share & Trends Analysis Report By Component, By Display (HMD & Smart Glass, HUD, Handheld Devices), By Application, By Region, And Segment Forecasts, 2021–2028; 2021.
  15. Staff, E. EdTech Market - Global Outlook and Forecast 2022-2027. Technical report, EdTechXGlobal, 2022.
  16. Azuma, R.T. A survey of augmented reality. Presence: Teleoperators & Virtual Environments 1997, 6, 355–385. [Google Scholar]
  17. Bradski, G.R.; Miller, S.A.; Abovitz, R. Methods and systems for creating virtual and augmented reality, 2019. US Patent 10,203,762.
  18. Lenhart, A.; Duggan, M.; Perrin, A.; Stepler, R.; Rainie, H.; Parker, K. ; others. Teens, social media & technology overview 2015, 2015.
  19. Aleksandrova, M. Augmented Reality in Education: The Hottest EdTech Trend 2018 and How to Apply It to Your Business. Retrieved May 2018, 10, 2019. [Google Scholar]
  20. Huang, T.C.; Chen, C.C.; Chou, Y.W. Animating eco-education: To see, feel, and discover in an augmented reality-based experiential learning environment. Computers & Education 2016, 96, 72–82. [Google Scholar]
  21. Sırakaya, M.; Alsancak Sırakaya, D. Augmented reality in STEM education: A systematic review. Interactive Learning Environments, 2020; 1–14. [Google Scholar]
  22. Santos, M.E.C.; Chen, A.; Taketomi, T.; Yamamoto, G.; Miyazaki, J.; Kato, H. Augmented reality learning experiences: Survey of prototype design and evaluation. IEEE Transactions on learning technologies 2013, 7, 38–56. [Google Scholar] [CrossRef]
  23. Driscoll, T.; Farhoud, S.; Nowling, S.; others. Enabling mobile augmented and virtual reality with 5G networks. Technical report, 2017.
  24. McDonald, C.; Malik, S.; Roth, G. Hand-based interaction in augmented reality. IEEE International Workshop HAVE Haptic Virtual Environments and Their. IEEE, 2002, pp. 55–59.
  25. Caudell, T.; Mizell, D. Augmented reality: an application of heads-up display technology to manual manufacturing processes (pp. 659-669 vol. 2). IEEE, 1992. [Google Scholar]
  26. Milgram, P.; Kishino, F. A taxonomy of mixed reality visual displays. IEICE TRANSACTIONS on Information and Systems 1994, 77, 1321–1329. [Google Scholar]
  27. Dubois, E.; Nigay, L. Augmented reality: which augmentation for which reality? Proceedings of DARE 2000 on Designing augmented reality environments, 2000, pp. 165–166.
  28. Vallino, J.R.; Brown, C.M. Interactive augmented reality. PhD thesis, University of Rochester. Dept. of Computer Science, 1998.
  29. Cipresso, P.; Giglioli, I.A.C.; Raya, M.A.; Riva, G. The past, present, and future of virtual and augmented reality research: a network and cluster analysis of the literature. Frontiers in psychology 2018, 9, 2086. [Google Scholar] [CrossRef] [PubMed]
  30. Greenslade Jr, T.B. Pepper’s Ghost. The Physics Teacher 2011, 49, 338–339. [Google Scholar] [CrossRef]
  31. Vaughan-Nichols, S.J. Augmented Reality: No Longer a Novelty? Computer 2009, 42, 19–22.
  32. Grau, O. Virtual Art: from illusion to immersion; MIT press, 2003.
  33. Norman, D.A. The psychology of everyday things.; Basic books, 1988.
  34. Kitchenham, B.; Pearl Brereton, O.; Budgen, D.; Turner, M.; Bailey, J.; Linkman, S. Systematic literature reviews in software engineering – A systematic literature review. Information and Software Technology 2009, 51, 7–15, Special Section - Most Cited Articles in 2002 and Regular Research Papers. [Google Scholar] [CrossRef]
  35. NHS Centre for Reviews and Dissemination. Undertaking systematic reviews of research on effectiveness: CRD’s guidance for those carrying out or commissioning reviews. CRD Report Number 4, 2001. [Google Scholar]
  36. Kitchenham, B. Procedures for Performing Systematic Reviews. 2004.
  37. Holmberg, K.; Hedman, J.; Bowman, T.D.; Didegah, F.; Laakso, M. Do articles in open access journals have more frequent altmetric activity than articles in subscription-based journals? An investigation of the research output of Finnish universities. Scientometrics 2020, 122, 645–659. [Google Scholar] [CrossRef]
  38. Norris, M.; Oppenheim, C.; Rowland, F. The citation advantage of open-access articles. Journal of the American Society for Information Science and Technology 2008, 59, 1963–1972. [Google Scholar] [CrossRef]
  39. Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E.; others. The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. Systematic reviews 2021, 10, 1–11. [Google Scholar]
  40. Engel, J.; Schöps, T.; Cremers, D. LSD-SLAM: Large-scale direct monocular SLAM. European conference on computer vision. Springer, 2014, pp. 834–849.
  41. Chekhlov, D.; Gee, A.P.; Calway, A.; Mayol-Cuevas, W. Ninja on a plane: Automatic discovery of physical planes for augmented reality using visual slam. 2007 6th IEEE and ACM International Symposium on Mixed and Augmented Reality. IEEE, 2007, pp. 153–156.
  42. Bang, J.; Lee, D.; Kim, Y.; Lee, H. Camera pose estimation using optical flow and ORB descriptor in SLAM-based mobile AR game. 2017 International Conference on Platform Technology and Service (PlatCon). IEEE, 2017, pp. 1–4.
  43. Pence, H.E. Smartphones, smart objects, and augmented reality. The Reference Librarian 2010, 52, 136–145. [Google Scholar] [CrossRef]
  44. Lanham, M. Learn ARCore-Fundamentals of Google ARCore: Learn to build augmented reality apps for Android, Unity, and the web with Google ARCore 1.0; Packt Publishing Ltd, 2018.
  45. Simonetti Ibañez, A.; Paredes Figueras, J. Vuforia v1. 5 SDK: Analysis and evaluation of capabilities. Master’s thesis, Universitat Politècnica de Catalunya, 2013.
  46. Common Augmented Reality Platform (CAP) From Bosch.
  47. Introducing our New Application: SmartReality.
  48. Inspace.
  49. The first Industrial Augmented Reality Platform.
  50. Sphere, The mixed reality standard for enterprises.
  51. Amazon Sumerian.
  52. Spark, AR.
  53. Unity.
  54. Cakmak, Y.O.; Daniel, B.K.; Hammer, N.; Yilmaz, O.; Irmak, E.C.; Khwaounjoo, P. The Human Muscular Arm Avatar as an Interactive Visualization Tool in Learning Anatomy: Medical Students’ Perspectives. IEEE Transactions on Learning Technologies 2020, 13, 593–603. [Google Scholar]
  55. Unreal Engine.
  56. AUGMENTED REALITY.
  57. ARToolKit.
  58. HP REVEAL.
  59. Blippbuilder.
  60. Virtual Support for your Frontline Workforce.
  61. Create your own augmented reality experiences.
  62. EasyAR Sense SDK.
  63. A Cloud-based Augmented Reality platform for everyone!
  64. MaxST.
  65. AR (Augmented Reality) for Mobile Applications.
  66. Augmented Reality SDK.
  67. Peace of Mind.
  68. He, Z.; Sui, X.; Jin, G.; Cao, L. Progress in virtual reality and augmented reality based on holographic display. Applied optics 2019, 58, A74–A81. [Google Scholar] [CrossRef]
  69. Hololens 2.
  70. Magic Leap.
  71. DISCOVER GLASS ENTERPRISE EDITION.
  72. Lampreave, P.; Jimenez-Perez, G.; Sanz, I.; Gomez, A.; Camara, O. Towards assisted electrocardiogram interpretation using an AI-enabled Augmented Reality headset. Computer Methods in Biomechanics and Biomedical Engineering: Imaging & Visualization, 2020; 1–8. [Google Scholar]
  73. Hobby, K.C.; Gowing, B.; Matt, D.P. Smart helmet, 2016. US Patent 9,389,677.
  74. Amantini, S.N.S.R.; Montilha, A.A.P.; Antonelli, B.C.; Leite, K.T.M.; Rios, D.; Cruvinel, T.; Neto, N.L.; Oliveira, T.M.; Machado, M.A.A.M. Using Augmented Reality to Motivate Oral Hygiene Practice in Children: Protocol for the Development of a Serious Game. JMIR research protocols 2020, 9, e10987. [Google Scholar]
  75. Muangpoon, T.; Osgouei, R.H.; Escobar-Castillejos, D.; Kontovounisios, C.; Bello, F. Augmented Reality System for Digital Rectal Examination Training and Assessment: System Validation. Journal of Medical Internet Research 2020, 22, e18637. [Google Scholar] [CrossRef]
  76. Lin, H.C.K.; Lin, Y.H.; Wang, T.H.; Su, L.K.; Huang, Y.M. Effects of Incorporating AR into a Board Game on Learning Outcomes and Emotions in Health Education. Electronics 2020, 9, 1752. [Google Scholar] [CrossRef]
  77. Chang, Y.S.; Chen, C.N.; Liao, C.L. Enhancing English-Learning Performance through a Simulation Classroom for EFL Students Using Augmented Reality—A Junior High School Case Study. Applied Sciences 2020, 10, 7854. [Google Scholar] [CrossRef]
  78. Tuli, N.; Singh, G.; Mantri, A.; Sharma, S. Augmented reality learning environment to aid engineering students in performing practical laboratory experiments in electronics engineering. Smart Learning Environments 2022, 9, 1–20. [Google Scholar] [CrossRef]
  79. Harris, T.E.; DeLellis, S.F.; Heneghan, J.S.; Buckman, R.F.; Miller, G.T.; Magee, J.H.; Vasios III, W.N.; Nelson, K.J.; Kane, S.F.; Choi, Y.S. Augmented reality forward damage control procedures for nonsurgeons: A feasibility demonstration. Military medicine 2020, 185, 521–525. [Google Scholar] [CrossRef]
  80. Flores-Bascuñana, M.; Diago, P.D.; Villena-Taranilla, R.; Yáñez, D.F. On Augmented Reality for the learning of 3D-geometric contents: A preliminary exploratory study with 6-Grade primary students. Education Sciences 2020, 10, 4. [Google Scholar] [CrossRef]
  81. Dolega-Dolegowski, D.; Proniewska, K.; Dolega-Dolegowska, M.; Pregowska, A.; Hajto-Bryk, J.; Trojak, M.; Chmiel, J.; Walecki, P.; Fudalej, P.S. Application of holography and augmented reality based technology to visualize the internal structure of the dental root–a proof of concept. Head & face medicine 2022, 18, 1–6. [Google Scholar]
  82. Wunder, L.; Gomez, N.A.G.; Gonzalez, J.E.; Mitzova-Vladinov, G.; Cacchione, M.; Mato, J.; Foronda, C.L.; Groom, J.A. Fire in the Operating Room: Use of Mixed Reality Simulation with Nurse Anesthesia Students. Informatics. Multidisciplinary Digital Publishing Institute, 2020, Vol. 7, p. 40.
  83. Ille, S.; Ohlerth, A.K.; Colle, D.; Colle, H.; Dragoy, O.; Goodden, J.; Robe, P.; Rofes, A.; Mandonnet, E.; Robert, E. ; others. Augmented reality for the virtual dissection of white matter pathways. Acta Neurochirurgica, 2020; 1–9. [Google Scholar]
  84. Hess, O.; Qian, J.; Bruce, J.; Wang, E.; Rodriguez, S.; Haber, N.; Caruso, T.J. Communication Skills Training Using Remote Augmented Reality Medical Simulation: a Feasibility and Acceptability Qualitative Study. Medical science educator 2022, 32, 1005–1014. [Google Scholar] [CrossRef]
  85. Rath, D.; Ipseeta, A.; Patnaik, B. Augmented Reality (Ar) & Virtual Reality (Vr)-A Channel for Digital Transformation in Industrialization Fostering Innovation & Entrepreneurship. Int. J. Innov. Technol. Explor. Eng, 2019; 3228–3236. [Google Scholar]
  86. Boonbrahm, P.; Kaewrat, C.; Pengkaew, P.; Boonbrahm, S.; Meni, V. Study of the hand anatomy using real hand and augmented reality. International Journal of Interactive Mobile Technologies (iJIM) 2018, 12, 181–190. [Google Scholar] [CrossRef]
  87. Rossi, M.; D’Avenio, G.; Morelli, S.; Grigioni, M. CogAR: an augmented reality App to improve quality of life of the people with cognitive impairment. 2020 IEEE 20th Mediterranean Electrotechnical Conference (MELECON). IEEE, 2020, pp. 339–343.
  88. Büth, L.; Juraschek, M.; Sangwan, K.S.; Herrmann, C.; Thiede, S. Integrating virtual and physical production processes in learning factories. Procedia Manufacturing 2020, 45, 121–127. [Google Scholar] [CrossRef]
  89. Pilati, F.; Faccio, M.; Gamberi, M.; Regattieri, A. Learning manual assembly through real-time motion capture for operator training with augmented reality. Procedia Manufacturing 2020, 45, 189–195. [Google Scholar] [CrossRef]
  90. Bork, F.; Stratmann, L.; Enssle, S.; Eck, U.; Navab, N.; Waschke, J.; Kugelmann, D. The Benefits of an Augmented Reality Magic Mirror System for Integrated Radiology Teaching in Gross Anatomy. Anatomical Sciences Education 2019, 12, 585–598. [Google Scholar] [CrossRef]
  91. Carl, B.; Bopp, M.; Saß, B.; Pojskic, M.; Voellger, B.; Nimsky, C. Spine surgery supported by augmented reality. Global Spine Journal 2020, 10, 41S–55S. [Google Scholar] [CrossRef] [PubMed]
  92. Croubois, H.; Farrugia, J.P.; Iehl, J.C. Fast Image Based Lighting for Mobile Realistic AR. PhD thesis, LIRIS UMR CNRS 5205; ENS Lyon, 2014.
  93. Avots, E.; Madadi, M.; Escalera, S.; Gonzàlez, J.; Baro, X.; Pällin, P.; Anbarjafari, G. From 2D to 3D geodesic-based garment matching. Multimedia Tools and Applications 2019, 78, 25829–25853. [Google Scholar] [CrossRef]
  94. Kuo, E.Y.; Pene, B.A. Estimation of light color and direction for augmented reality applications, 2013. US Patent 8,405,658.
  95. Traumann, A.; Anbarjafari, G.; Escalera, S. A new retexturing method for virtual fitting room using kinect 2 camera. Proceedings of the IEEE conference on computer vision and pattern recognition workshops, 2015, pp. 75–79.
  96. Andriluka, M.; Roth, S.; Schiele, B. People-tracking-by-detection and people-detection-by-tracking. 2008 IEEE Conference on computer vision and pattern recognition. IEEE, 2008, pp. 1–8.
  97. Fischer, J.; Regenbrecht, H.; Baratoff, G. Detecting dynamic occlusion in front of static backgrounds for AR scenes. Proceedings of the workshop on Virtual environments 2003, 2003, pp. 153–161. [Google Scholar]
  98. Wacker, P.; Wagner, A.; Voelker, S.; Borchers, J. Heatmaps, Shadows, Bubbles, Rays: Comparing Mid-Air Pen Position Visualizations in Handheld AR. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, 2020, pp. 1–11.
  99. Aittala, M. Inverse lighting and photorealistic rendering for augmented reality. The visual computer 2010, 26, 669–678. [Google Scholar] [CrossRef]
  100. Avola, D.; Cinque, L.; Foresti, G.L.; Mercuri, C.; Pannone, D. A practical framework for the development of augmented reality applications by using ArUco markers. International Conference on Pattern Recognition Applications and Methods. SCITEPRESS, 2016, Vol. 2, pp. 645–654.
  101. Yu, D.; Jin, J.S.; Luo, S.; Lai, W.; Huang, Q. A useful visualization technique: a literature review for augmented reality and its application, limitation & future direction. Visual information communication, 2009; 311–337. [Google Scholar]
  102. Plewan, T.; Mättig, B.; Kretschmer, V.; Rinkenauer, G. Exploring the benefits and limitations of augmented reality for palletization. Applied Ergonomics 2021, 90, 103250. [Google Scholar] [CrossRef] [PubMed]
  103. Van Krevelen, D.; Poelman, R. A survey of augmented reality technologies, applications and limitations. International journal of virtual reality 2010, 9, 1–20. [Google Scholar] [CrossRef]
  104. Thompson, P. Learning by doing. Handbook of the Economics of Innovation 2010, 1, 429–476. [Google Scholar]
  105. Blankemeyer, S.; Wiemann, R.; Posniak, L.; Pregizer, C.; Raatz, A. Intuitive Robot Programming Using Augmented Reality. Procedia CIRP 2018, 76, 155–160, 7th CIRP Conference on Assembly Technologies and Systems (CATS 2018). [Google Scholar] [CrossRef]
  106. Mei, B.; Yang, S. Nurturing Environmental Education at the Tertiary Education Level in China: Can Mobile Augmented Reality and Gamification Help? Sustainability 2019, 11, 4292. [Google Scholar] [CrossRef]
  107. Gang, Z.; Jingyu, H.; Wenlei, X.; Jie, Z. A mask R-CNN based method for inspecting cable brackets in aircraft. Chinese Journal of Aeronautics 2020. [Google Scholar]
  108. Johal, W.; Robu, O.; Dame, A.; Magnenat, S.; Mondada, F. Augmented Robotics for Learners: A Case Study on Optics. 2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN). IEEE, 2019, pp. 1–6.
  109. Setiawan, A.R. How to Teach and Do Research in Physics Education. Open Science Framework (OSF) Preprints, 2019. [Google Scholar]
  110. Thomas, P.; David, W. Augmented reality: An application of heads-up display technology to manual manufacturing processes. Hawaii International Conference on System Sciences, 1992, pp. 659–669.
  111. Ko, S.M.; Chang, W.S.; Ji, Y.G. Usability principles for augmented reality applications in a smartphone environment. International journal of human-computer interaction 2013, 29, 501–515. [Google Scholar] [CrossRef]
  112. Evans, G.; Miller, J.; Pena, M.I.; MacAllister, A.; Winer, E. Evaluating the Microsoft HoloLens through an augmented reality assembly application. Degraded Environments: Sensing, Processing, and Display 2017. International Society for Optics and Photonics, 2017, Vol. 10197, p. 101970V.
  113. Kim, T.; Hwang, S.; Kim, S.; Ahn, H.; Chung, D. Smart contact lenses for augmented reality and methods of manufacturing and operating the same, 2019. US Patent 10,359,648.
  114. Lan, S.; Zhang, X.; Taghinejad, M.; Rodrigues, S.; Lee, K.T.; Liu, Z.; Cai, W. Metasurfaces for near-eye augmented reality. ACS Photonics 2019, 6, 864–870. [Google Scholar] [CrossRef]
  115. Factura, B.; LaPerche, L.; Reyneri, P.; Jones, B.; Karsch, K. Lightform: procedural effects for projected AR. In ACM SIGGRAPH 2018 Studio; 2018; pp. 1–2.
  116. Liese, A.D.; Colabianchi, N.; Lamichhane, A.P.; Barnes, T.L.; Hibbert, J.D.; Porter, D.E.; Nichols, M.D.; Lawson, A.B. Validation of 3 food outlet databases: completeness and geospatial accuracy in rural and urban food environments. American journal of epidemiology 2010, 172, 1324–1333. [Google Scholar] [CrossRef]
  117. Macena Silveira, L. Evaluation of Augmented Reality Glasses for Use in Flight Test Courses. PhD thesis, Florida Institute of Technology, 2018.
  118. Friedrich, W.; Wohlgemuth, W. Augmented-reality system for situation-related support of the interaction between a user and an engineering apparatus, 2008. US Patent 7,324,081.
  119. Rusch, M.L.; Schall Jr, M.C.; Lee, J.D.; Dawson, J.D.; Rizzo, M. Augmented reality cues to assist older drivers with gap estimation for left-turns. Accident Analysis & Prevention 2014, 71, 210–221. [Google Scholar]
  120. Gurevich, P.; Lanir, J.; Cohen, B.; Stone, R. TeleAdvisor: a versatile augmented reality tool for remote assistance. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 2012, pp. 619–622.
  121. Scaravetti, D.; Doroszewski, D. Augmented Reality experiment in higher education, for complex system appropriation in mechanical design. Procedia CIRP 2019, 84, 197–202. [Google Scholar] [CrossRef]
  122. Gordon, S.; Ryan, A.; Loughlin, S. Meeting The Needs of Industry in Smart Manufacture–The Definition of a New Profession and a Case Study in Providing the Required Skillset. Procedia Manufacturing 2018, 17, 262–269. [Google Scholar] [CrossRef]
  123. Dutta, R.; Mantri, A.; Singh, G. Evaluating system usability of mobile augmented reality application for teaching Karnaugh-Maps. Smart Learning Environments 2022, 9, 1–27. [Google Scholar] [CrossRef]
  124. Tao, W.; Lai, Z.H.; Leu, M.C.; Yin, Z. Worker activity recognition in smart manufacturing using IMU and sEMG signals with convolutional neural networks. Procedia Manufacturing 2018, 26, 1159–1166. [Google Scholar] [CrossRef]
  125. Özerbaş, D.S. The Effect of Marker-Based Augmented Reality (MBAR) Applications on Academic Achievement and Permanence. Universal Journal of Educational Research 2019, 7, 1926–1932. [Google Scholar] [CrossRef]
  126. Wyss, C.; Bührer, W.; Furrer, F.; Degonda, A.; Hiss, J.A. Innovative teacher education with the augmented reality device Microsoft Hololens—results of an exploratory study and pedagogical considerations. Multimodal Technologies and Interaction 2021, 5, 45. [Google Scholar] [CrossRef]
  127. Plunkett, K.N. A Simple and Practical Method for Incorporating Augmented Reality into the Classroom and Laboratory, 2019.
  128. Tian, K.; Urata, M.; Endo, M.; Mouri, K.; Yasuda, T.; Kato, J. Real-World Oriented Smartphone AR Supported Learning System Based on Planetarium Contents for Seasonal Constellation Observation. Applied Sciences 2019, 9, 3508. [Google Scholar] [CrossRef]
  129. Zarraonandia, T.; Montero, A.; Diaz, P.; Aedo, I. " Magic Flowerpot": An AR Game for Learning about Plants. Extended Abstracts of the Annual Symposium on Computer-Human Interaction in Play Companion Extended Abstracts, 2019, pp. 813–819.
  130. Garcia-Bonete, M.J.; Jensen, M.; Katona, G. A practical guide to developing virtual and augmented reality exercises for teaching structural biology. Biochemistry and Molecular Biology Education 2019, 47, 16–24. [Google Scholar] [CrossRef]
  131. McGrath, J.; Taekman, J.; Dev, P.; Danforth, D.; Mohan, D.; Kman, N.; Crichlow, A.; Bond, W. Using Virtual Reality Simulation Environments to Assess Competence for Emergency Medicine Learners. Academic Emergency Medicine 2018, 25, 186–195. [Google Scholar] [CrossRef]
  132. Xu, X.; Mangina, E.; Campbell, A.G. HMD-based virtual and augmented reality in medical education: A systematic review. Frontiers in Virtual Reality, 2021; 82. [Google Scholar]
  133. Fernandes, J.; Teles, A.; Teixeira, S. An augmented reality-based mobile application facilitates the learning about the spinal cord. Education Sciences 2020, 10, 1–18. [Google Scholar] [CrossRef]
  134. Ebner, F.; De Gregorio, A.; Fabienne, S.; Bekes, I.; Janni, W.; Lato, K. Effect of an Augmented Reality Ultrasound Trainer App on the Motor Skills Needed for a Kidney Ultrasound: Prospective Trial. JMIR Serious Games 2019, 7, 1–8. [Google Scholar] [CrossRef]
  135. Cowling, M.; Birt, J. Pedagogy before Technology: A Design-Based Research Approach to Enhancing Skills Development in Paramedic Science Using Mixed Reality. Information 2018, 9, 29. [Google Scholar] [CrossRef]
  136. .
  137. Medical Augmented Intelligence.
  138. Molina, C.A.; Theodore, N.; Ahmed, A.K.; Westbroek, E.M.; Mirovsky, Y.; Harel, R.; Khan, M.; Witham, T.; Sciubba, D.M.; others. Augmented reality–assisted pedicle screw insertion: a cadaveric proof-of-concept study. Journal of Neurosurgery: Spine 2019, 31, 139–146. [Google Scholar] [CrossRef] [PubMed]
  139. Wijdenes, P.; Borkenhagen, D.; Babione, J.; Ma, I.; Hallihan, G. Leveraging Augmented Reality Training Tool for Medical Education: A Case Study in Central Venous Catheterization. Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems; Association for Computing Machinery: New York, NY, USA, 2018; CHI EA ’18, p. 1–7. [Google Scholar]
  140. Ehlers, J.P.; Uchida, A.; Srivastava, S.K. The integrative surgical theater: combining intraoperative optical coherence tomography and 3D digital visualization for vitreoretinal surgery in the DISCOVER study. Retina 2018, 38, S88–S96. [Google Scholar] [CrossRef] [PubMed]
  141. FundamentalVR: Working at the Intersection of Immersive Technology, haptics and machine learning.
  142. Virtual Reality Surgical Training Platform, 2020.
  143. O’Brien, M.J.; Garland, J.M.; Murphy, K.M.; Shuman, S.J.; Whitaker, R.C.; Larson, S.C. Training medical students in the social determinants of health: the Health Scholars Program at Puentes de Salud. Advances in medical education and practice 2014, 5, 307. [Google Scholar] [CrossRef]
  144. Wish-Baratz, S.; Gubatina, A.P.; Enterline, R.; Griswold, M.A. A new supplement to gross anatomy dissection: HoloAnatomy. Medical education 2019, 53, 522–523. [Google Scholar] [CrossRef]
  145. Eckert, M.; Volmerg, J.; Friedrich, C. Augmented Reality in Medicine: Systematic and Bibliographic Review. JMIR Mhealth Uhealth 2019, 7, 1–17. [Google Scholar] [CrossRef] [PubMed]
  146. Monsky, W.; James, R.; Seslar, S. Virtual and Augmented Reality Applications in Medicine and Surgery-The Fantastic Voyage is here. Anatomy Physiology 2019, 9, 1–6. [Google Scholar]
  147. Cawley, J.F.; Miller, J.H. Cross-sectional comparisons of the mathematical performance of children with learning disabilities: Are we on the right track toward comprehensive programming? Journal of Learning Disabilities 1989, 22, 250–254. [Google Scholar] [CrossRef]
  148. Salend, S.J. Effective mainstreaming: Creating inclusive classrooms; Macmillan College, 1994.
  149. Akçayır, M.; Akçayır, G. Advantages and challenges associated with augmented reality for education: A systematic review of the literature. Educational Research Review 2017, 20, 1–11. [Google Scholar] [CrossRef]
  150. Baragash, R.S.; Al-Samarraie, H.; Moody, L.; Zaqout, F. Augmented Reality and Functional Skills Acquisition Among Individuals With Special Needs: A Meta-Analysis of Group Design Studies. Journal of Special Education Technology 2020, 0162643420910413. [Google Scholar]
  151. Thevin, L.; Briant, C.; Brock, A.M. X-Road: Virtual Reality Glasses for Orientation and Mobility Training of People with Visual Impairments. ACM Transactions on Accessible Computing 2020, 13. [Google Scholar] [CrossRef]
  152. Syahputra, M.; Sari, P.; Arisandi, D.; Abdullah, D.; Napitupulu, D.; Setiawan, M.; Albra, W.; Andayani, U.; et al. Implementation of augmented reality to train focus on children with special needs. Journal of Physics: Conference Series. IOP Publishing, 2018, Vol. 978, p. 012109.
  153. Cascales-Martínez, A.; Martínez-Segura, M.J.; Pérez-López, D.; Contero, M. Using an augmented reality enhanced tabletop system to promote learning of mathematics: A case study with students with special educational needs. Eurasia Journal of Mathematics, Science and Technology Education 2016, 13, 355–380. [Google Scholar]
  154. Lin, C.Y.; Chai, H.C.; Wang, J.y.; Chen, C.J.; Liu, Y.H.; Chen, C.W.; Lin, C.W.; Huang, Y.M. Augmented reality in educational activities for children with disabilities. Displays 2016, 42, 51–54. [Google Scholar] [CrossRef]
  155. Martono, K.T.; Eridani, D.; Fauzi, A.; Purwanti, A.; Ulil, M. Augmented Reality Technology As One Of The Media In Therapy For Children With Special Needs. 2019 6th International Conference on Information Technology, Computer and Electrical Engineering (ICITACEE). IEEE, 2019, pp. 1–5.
  156. Steinhaeusser, S.C.; Riedmann, A.; Haller, M.; Oberdörfer, S.; Bucher, K.; Latoschik, M.E. Fancy Fruits-An Augmented Reality Application for Special Needs Education. 2019 11th International Conference on Virtual Worlds and Games for Serious Applications (VS-Games). IEEE, 2019, pp. 1–4.
  157. Colpani, R.; Homem, M.R.P. An innovative augmented reality educational framework with gamification to assist the learning process of children with intellectual disabilities. 2015 6th International Conference on Information, Intelligence, Systems and Applications (IISA). IEEE, 2015, pp. 1–6.
  158. Redondo, B.; Cózar-Gutiérrez, R.; González-Calero, J.A.; Ruiz, R.S. Integration of augmented reality in the teaching of English as a foreign language in early childhood education. Early Childhood Education Journal 2020, 48, 147–155. [Google Scholar] [CrossRef]
  159. Bach, C.; Scapin, D.L. Obstacles and perspectives for evaluating mixed reality systems usability. Acte du Workshop MIXER, IUI-CADUI. Citeseer, 2004, Vol. 4.
  160. Swan, J.E.; Gabbard, J.L. Survey of user-based experimentation in augmented reality. Proceedings of 1st International Conference on Virtual Reality, 2005, Vol. 22, pp. 1–9.
  161. Gabbard, J.L.; Swan II, J.E. Usability engineering for augmented reality: Employing user-based studies to inform design. IEEE Transactions on visualization and computer graphics 2008, 14, 513–525. [Google Scholar] [CrossRef]
  162. Dünser, A.; Grasset, R.; Billinghurst, M. A survey of evaluation techniques used in augmented reality studies; Human Interface Technology Laboratory New Zealand, 2008.
  163. Anastassova, M.; Mégard, C.; Burkhardt, J.M. Prototype evaluation and user-needs analysis in the early design of emerging technologies. International Conference on Human-Computer Interaction. Springer, 2007, pp. 383–392.
  164. Schnürer, R.; Dind, C.; Schalcher, S.; Tschudi, P.; Hurni, L. Augmenting Printed School Atlases with Thematic 3D Maps. Multimodal Technologies and Interaction 2020, 4, 23. [Google Scholar] [CrossRef]
  165. Parras-Burgos, D.; Fernández-Pacheco, D.G.; Polhmann Barbosa, T.; Soler-Méndez, M.; Molina-Martínez, J.M. An Augmented Reality Tool for Teaching Application in the Agronomy Domain. Applied Sciences 2020, 10, 3632. [Google Scholar] [CrossRef]
  166. Nadeem, M.; Chandra, A.; Livirya, A.; Beryozkina, S. AR-LabOr: Design and Assessment of an Augmented Reality Application for Lab Orientation. Education Sciences 2020, 10, 316. [Google Scholar] [CrossRef]
  167. Volioti, C.; Keramopoulos, E.; Sapounidis, T.; Melisidis, K.; Zafeiropoulou, M.; Sotiriou, C.; Spiridis, V. Using Augmented Reality in K-12 Education: An Indicative Platform for Teaching Physics. Information 2022, 13, 336. [Google Scholar] [CrossRef]
  168. Marini, A.; Nafisah, S.; Sekaringtyas, T.; Safitri, D.; Lestari, I.; Suntari, Y.; Sudrajat, A.; Iskandar, R.; et al. Mobile Augmented Reality Learning Media with Metaverse to Improve Student Learning Outcomes in Science Class. International Journal of Interactive Mobile Technologies 2022, 16. [Google Scholar] [CrossRef]
  169. Pipattanasuk, T.; Songsriwittaya, A. Development of an Instructional Model with Augmented Reality Technology for Vocational Certificate Students. International Journal of Instruction 2020, 13, 539–554. [Google Scholar] [CrossRef]
  170. Thees, M.; Kapp, S.; Strzys, M.P.; Beil, F.; Lukowicz, P.; Kuhn, J. Effects of augmented reality on learning and cognitive load in university physics laboratory courses. Computers in Human Behavior 2020, 108, 106316. [Google Scholar] [CrossRef]
  171. Schmiedinger, T.; Petke, M.; von Czettritz, L.; Wohlschläger, B.; Adam, M. Augmented Reality as a tool for providing informational content in different production domains. Procedia Manufacturing 2020, 45, 423–428. [Google Scholar] [CrossRef]
  172. Nagayo, Y.; Saito, T.; Oyama, H. Augmented reality self-training system for suturing in open surgery: A randomized controlled trial. International Journal of Surgery 2022, 102, 106650. [Google Scholar] [CrossRef]
  173. Celik, C.; Guven, G.; Cakir, N.K. Integration of mobile augmented reality (MAR) applications into biology laboratory: Anatomic structure of the heart. Research in Learning Technology 2020, 28. [Google Scholar] [CrossRef]
  174. Wells, T.; Houben, S. CollabAR–Investigating the Mediating Role of Mobile AR Interfaces on Co-Located Group Collaboration. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, 2020, pp. 1–13.
  175. Dede, C.J.; Richards, J. Conclusion—strategic planning for R&D on immersive learning. In Virtual, augmented, and mixed realities in education; Springer, 2017; pp. 237–243.
  176. Martins, V.F.; Kirner, T.G.; Kirner, C. Subjective usability evaluation criteria of augmented reality applications. International Conference on Virtual, Augmented and Mixed Reality. Springer, 2015, pp. 39–48.
  177. Luna, J.; Treacy, R.; Hasegawa, T.; Campbell, A.; Mangina, E. Words Worth Learning-Augmented Literacy Content for ADHD Students. 2018 IEEE Games, Entertainment, Media Conference (GEM). IEEE, 2018, pp. 1–9.
  178. M., V. The Engaging Museum. In Possibilidades: MUX – Museus em Experiência; UA Editora: Aveiro, PT, 2015; pp. 109–119. [Google Scholar]
  179. Wahbeh, G.; Najjar, E.A.; Sartawi, A.F.; Abuzant, M.; Daher, W.; et al. The role of project-based language learning in developing students’ life skills. Sustainability 2021, 13, 6518. [Google Scholar] [CrossRef]
  180. Ansari, M.T.; Rahman, S.; Badgujar, V.B.; Sami, F.; Abdullah, M.S. Problem based learning (PBL): A novel and effective tool of teaching and learning. Indian Journal of Pharmaceutical Education and Research 2015, 49, 258–265. [Google Scholar] [CrossRef]
  181. Young, M.; Muller, J. On the powers of powerful knowledge. Review of education 2013, 1, 229–250. [Google Scholar] [CrossRef]
  182. Roberto, R.; Freitas, D.; Lima, J.P.; Teichrieb, V.; Kelner, J. Arblocks: A concept for a dynamic blocks platform for educational activities. 2011 XIII Symposium on Virtual Reality. IEEE, 2011, pp. 28–37.
  183. Buchner, J.; Zumbach, J. Promoting Intrinsic Motivation with a Mobile Augmented Reality Learning Environment. International Association for Development of the Information Society 2018. [Google Scholar]
  184. Ferrer-Torregrosa, J.; Torralba, J.; Jimenez, M.; García, S.; Barcia, J. ARBOOK: Development and assessment of a tool based on augmented reality for anatomy. Journal of Science Education and Technology 2015, 24, 119–124. [Google Scholar] [CrossRef]
  185. Akçayır, M.; Akçayır, G.; Pektaş, H.M.; Ocak, M.A. Augmented reality in science laboratories: The effects of augmented reality on university students’ laboratory skills and attitudes toward science laboratories. Computers in Human Behavior 2016, 57, 334–342. [Google Scholar] [CrossRef]
  186. Sommerauer, P.; Müller, O. Augmented reality in informal learning environments: A field experiment in a mathematics exhibition. Computers & Education 2014, 79, 59–68. [Google Scholar]
  187. Bayne, S.; Ross, J., ‘Digital Native’ and ‘Digital Immigrant’ Discourses. In Digital Difference: Perspectives on Online Learning; Land, R.; Bayne, S., Eds.; SensePublishers: Rotterdam, 2011; pp. 159–169.
  188. Gutwin, C.; Greenberg, S. The mechanics of collaboration: Developing low cost usability evaluation methods for shared workspaces. Proceedings IEEE 9th International Workshops on Enabling Technologies: Infrastructure for Collaborative Enterprises (WET ICE 2000). IEEE, 2000, pp. 98–103.
  189. Kosiński, J.; Szklanny, K.; Wieczorkowska, A.; Wichrowski, M. An analysis of game-related emotions using Emotiv EPOC. 2018 Federated Conference on Computer Science and Information Systems (FedCSIS). IEEE, 2018, pp. 913–917.
  190. Kubota, T. Wearable device from Stanford measures cortisol in sweat. Obtenido de Stanford News, 2018. Available online: https://news.stanford.edu/2018/07/20/wearable-device-measures-cortisolsweat.
Figure 1. Head-Mounted Display proposed by Caudell and Mizell in 1992.
Figure 2. Reality-Virtuality Continuum by Milgram and Kishino [26].
Figure 3. (a) Pepper’s Ghost effect, 1862; (b) Sensorama, 1962; (c) head-mounted display by Ivan Sutherland, 1968.
Figure 4. Scopus keywords search based on PRISMA flow diagram [39].
Figure 5. AR in education: (a) publications per year until November 20, 2022; (b) most commonly used keywords.
Figure 6. The distribution of papers according to (a) continent and (b) country of origin.
Figure 7. Sample of AR/VR technologies: (a) full VR or camera-passthrough AR; (b) waveguide-based binocular vision technology; (c) light-field display technology; (d) computer-generated holographic display technology [68].
Figure 8. Percentage distribution of the most commonly used AR technology in education throughout the years: (a) tablet/mobile phone; (b) HoloLens; (c) Magic Leap; (d) Google Glass.
Figure 9. Percentage distribution of the most popular education domains based on the Scopus keywords search throughout the years: (a) medicine; (b) engineering; (c) life science; (d) special needs; (e) arts/humanities; (f) general education.
Figure 10. Percentage distribution of the engineering domains most commonly using AR for educational purposes.
Figure 11. Tablet and HoloLens scenario presenting the analysis of an electric actuator [121].
Figure 12. Physical manufacturing cell (a) and its virtual equivalent (b) [122].
Figure 13. Percentage distribution of the most popular life science domains using AR for educational purposes.
Figure 14. AR Magic Mirror system: (A) AR view with virtual anatomy models superimposed on the user; (B) CT section image corresponding to the slice at the height of the virtual red circle in the AR view [134].
Figure 15. A still taken from a video demo of OSSO VR in a multi-student scenario [142].
Figure 16. Remote learning conducted through HoloAnatomy. The images show the teacher, the student’s view, and the student, respectively [144].
Figure 17. AR for special needs education: (a) currency learning system [153]; (b) an app for independently learning geometry [154]; (c) sample AR models on flash cards used for speech therapy [155]; (d) the Fancy Fruits application, used to teach children with disabilities about regional fruits and vegetables [156].
Figure 18. Projects related to AR in education and training, for the period 2019-2024, co-financed by EU Framework Programmes for Research and Innovation: (a) WordsWorthLearning AR literacy-teaching platform; (b) UpSurgeOn AR platform for neurosurgeon training; (c) UWAR AR platform developed in the iMARECULTURE project; (d) WrightBroS AR platform for pilot training; (e) Atomic Bands wearables.
Table 1. Display AR devices most often used in education.

| Device | Characteristics (RES, FOV) | Usage count | E.g. ref. |
|---|---|---|---|
| Tablet / mobile phone | depends on the device used | 478 | [76,77,78] |
| HoloLens | ver. 1: 1268 x 720, 34°; ver. 2: 2048 x 1080, 52° | 82 | [79,80,81] |
| Magic Leap | 1280 x 960, 50° | 13 | [82,83,84] |
| Google Glass | 640 x 360, 83° | 12 | [85,86,87] |
Table 2. External devices supporting AR technology in education.

| Device | Context of use | E.g. ref. |
|---|---|---|
| Kinect | hand-movement recognition for learning the tooth-brushing technique | [74] |
| Leap Motion | hand and finger tracking device used for visualisation of hand anatomy | [86] |
| Projector | projecting a process that mimics a real production process and displaying additional data | [88] |
| Motion capture | tracking the operator’s hands for manual assembly task verification | [89] |
| Magic Mirror | a teaching device for anatomy courses | [90] |
| Microscope | microscope-based AR environment successfully implemented for spinal surgery | [91] |
Table 3. Types of scenarios used in AR educational tools.

| Scenario type | Usage [%] | E.g. ref. |
|---|---|---|
| Simple visual solutions | 54% | [83,90,105] |
| Learning by doing | 35% | [76,77,106] |
| Evaluation of user’s performance | 9% | [107] |
Table 4. Methods of evaluating AR applications as educational tools.

| Method type | Usage count | E.g. ref. |
|---|---|---|
| UX questionnaire | 376 | [164,165,166,167] |
| Performance verification (PV) | 204 | [76,77,106,168] |
| Hybrid method (UX and PV) | 87 | [169,170,171,172] |
Table 5. Studies evaluating students’ performance.

| Method | Context of use | Experimental protocol | Participants | Results | Ref. |
|---|---|---|---|---|---|
| Expert validation | AR application addressing the heart’s anatomic structure for pre-service science teachers’ laboratory learning | introduction of the app and marker; use of the app in a laboratory environment; dissection; juxtaposing the app with dissection and general evaluation | 30 pre-service teachers taking the biology laboratory course | the app informs users about heart anatomy before they engage in dissection; it provides rich subject-related content; by making small structures visible, it creates an opportunity to learn by doing | [173] |
| Automatic validation | AR application to assist and guide operators involved in manual assembly processes in real time | a depth camera tracks human motion in the workstation environment, focusing on the operator’s upper body and hands in particular; visual feedback indicates whether the worker performed an incorrect action | 12 inexperienced operators, female and male, with various anthropometric parameters | in a real industrial case, compared with the traditional approach, the learning rate increased by 22% and manual process duration dropped by up to 51% during the first assembly cycles | [89] |
| Pre- and post-test | Magic Mirror as a system for combined anatomy and radiology teaching | an unannounced examination with 20 multiple-choice questions similar to the anatomy part of the first main German medical examination | 749 first-year medical students and 72 students from a follow-up elective course | pre- and post-tests revealed significant improvements in test scores for the Magic Mirror; students with low mental rotation test (MRT) scores benefited and achieved significantly higher post-test scores than low-MRT students in the theory group | [90] |
| Observation | insights into how current mobile AR interfaces affect co-located group collaboration | participants performed collaborative tasks with virtual models at three levels of complexity; sessions were video-recorded for further analysis | 20 participants (11 female, 9 male) in groups of 4 (snowball sampling) | AR allows collaborators to dynamically switch focus between a workspace and a communication space; AR apps induce high mental load and frustration and reduce group interaction | [174] |
Table 6. Projects related to AR in education and training, ending in 2019-2024, co-financed by EU Framework Programmes for Research and Innovation.

| Field of application | Number of projects |
|---|---|
| Medicine | 10 |
| General | 7 |
| Cultural heritage | 7 |
| Manufacturing | 6 |
| Sport and recreation | 2 |
| Transport | 2 |
| Total | 34 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.