| Title: | Integrating AI-driven wearable metaverse technologies into ubiquitous blended learning: a framework based on embodied interaction and multi-agent collaboration |
|---|---|
| Authors: | Xu, Jiaqi; Zhai, Xuesong; Chen, Nian-Shing; Ghani, Usman; Istenič, Andreja; Xin, Junyi |
| Files: | https://www.mdpi.com/2227-7102/15/7/900 — RAZ_Xu_Jiaqi_2025.pdf (1.60 MB, MD5: 66C68811109662A3BE2631A01791EADC) |
| Language: | English |
| Work type: | Unknown |
| Typology: | 1.01 - Original Scientific Article |
| Organization: | PEF - Faculty of Education |
| Abstract: | Ubiquitous blended learning, leveraging mobile devices, has democratized education by enabling autonomous and readily accessible knowledge acquisition. However, its reliance on traditional interfaces often limits learner immersion and meaningful interaction. The emergence of the wearable metaverse offers a compelling solution, promising enhanced multisensory experiences and adaptable learning environments that transcend the constraints of conventional ubiquitous learning. This research proposes a novel framework for ubiquitous blended learning in the wearable metaverse, aiming to address critical challenges, such as multi-source data fusion, effective human–computer collaboration, and efficient rendering on resource-constrained wearable devices, through the integration of embodied interaction and multi-agent collaboration. The framework leverages a real-time multi-modal data analysis architecture, powered by the MobileNetV4 and xLSTM neural networks, to facilitate dynamic understanding of the learner's context and environment. Furthermore, we introduce a multi-agent interaction model, utilizing CrewAI and spatio-temporal graph neural networks, to orchestrate collaborative learning experiences and provide personalized guidance. Finally, we incorporate lightweight SLAM algorithms, augmented with visual perception techniques, to enable accurate spatial awareness and seamless navigation within the metaverse environment. This innovative framework aims to create immersive, scalable, and cost-effective learning spaces within the wearable metaverse. |
| Keywords: | metaverse, embodied interaction, wearable, multi-agent, artificial intelligence, ubiquitous blended learning |
| Publication version: | Version of Record |
| Publication date: | 15.07.2025 |
| Year of publishing: | 2025 |
| Pages: | 1–19 |
| Numbering: | Vol. 15, issue 7, article no. 900 |
| PID: | 20.500.12556/RUP-21480 |
| UDC: | 004.8:37 |
| ISSN of journal: | 2227-7102 |
| DOI: | 10.3390/educsci15070900 |
| COBISS.SI-ID: | 242838275 |
| Publication date in RUP: | 17.07.2025 |
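The abstract describes a pipeline in which a multi-modal perception stage (MobileNetV4 + xLSTM) feeds a CrewAI-orchestrated agent layer that personalizes guidance. As a rough illustration only, the sketch below mimics that division of labor with plain Python stand-ins: every name here (`ContextAgent`, `TutorAgent`, the fused `modalities` dict, the 0.5 attention threshold) is a hypothetical invention for exposition, not the authors' implementation or the CrewAI API.

```python
# Hedged sketch of the abstract's two-stage idea: a perception agent fuses
# multi-modal signals into structured learner context, and a tutoring agent
# acts on that context. All classes and thresholds are illustrative stand-ins
# for the paper's MobileNetV4/xLSTM perception and CrewAI orchestration.
from dataclasses import dataclass


@dataclass
class LearnerContext:
    attention: float  # fused estimate from visual/motion signals, in [0, 1]
    location: str     # coarse spatial label, e.g. from SLAM localization


class ContextAgent:
    """Stand-in for the multi-modal perception stage (MobileNetV4 + xLSTM)."""

    def infer(self, modalities: dict) -> LearnerContext:
        # Naive fusion: average normalized gaze and motion signals
        # as a proxy for learner attention.
        signals = [modalities["gaze"], modalities["motion"]]
        return LearnerContext(
            attention=sum(signals) / len(signals),
            location=modalities["place"],
        )


class TutorAgent:
    """Stand-in for a CrewAI-orchestrated guidance agent."""

    def advise(self, ctx: LearnerContext) -> str:
        # Hypothetical policy: low attention triggers an embodied activity.
        if ctx.attention < 0.5:
            return f"Suggest a short embodied activity at {ctx.location}"
        return f"Continue the current lesson at {ctx.location}"


def collaborate(modalities: dict) -> str:
    # Agents exchange structured context rather than raw sensor streams.
    ctx = ContextAgent().infer(modalities)
    return TutorAgent().advise(ctx)


print(collaborate({"gaze": 0.3, "motion": 0.4, "place": "lab bench"}))
# → Suggest a short embodied activity at lab bench
```

The point of the sketch is the hand-off: perception produces a compact `LearnerContext`, so the guidance agent never touches raw sensor data, which matches the abstract's emphasis on running on resource-constrained wearables.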