Intelligent gesture interfaces in immersive education
DOI: https://doi.org/10.56294/saludcyt20251810
Keywords: gesture interfaces, immersive education, virtual reality, experiential learning, artificial intelligence
Abstract
Introduction: intelligent gesture interfaces are transforming immersive education by enabling more intuitive and efficient interactions between students and digital content through bodily movements, especially facial and manual gestures. When integrated with technologies such as augmented reality and virtual reality, these interfaces enhance body perception, information retention, and the understanding of complex concepts, promoting a more active and personalized learning experience.
Methods: this study employed a Systematic Literature Review (SLR) based on Kitchenham’s methodology, structured in planning, execution, and presentation phases. Academic and empirical studies published since 2019 were selected, focusing on the integration of gesture interfaces with artificial intelligence in educational contexts and assessing their effectiveness, applicability, and associated challenges.
Results: the findings revealed that these interfaces support student engagement, adapt to individual needs, and strengthen multimodal learning. Technologies such as depth sensors, neural networks, and multimodal systems were identified as enabling more fluid and natural interaction. Despite their potential, technical challenges were noted, including gesture variability, real-time processing demands, and lack of standardization, as well as pedagogical barriers such as curricular integration and learning assessment.
Conclusions: it is concluded that intelligent gesture interfaces, when complemented by artificial intelligence, hold strong potential to enrich educational experiences, support personalized and student-centered learning environments, and align pedagogical practice with participatory and constructivist models of education.
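As a purely illustrative aside (not drawn from any of the reviewed studies), the sketch below shows in miniature the recognition step such interfaces depend on: mapping a continuous, size-varying landmark feature vector to a discrete gesture label. It uses a toy nearest-centroid classifier over normalized two-feature vectors; all names, gestures, and values are hypothetical, and a production system would instead process depth-sensor streams with the neural networks the results describe.

```python
# Minimal sketch: nearest-centroid gesture classification over
# normalized feature vectors (hypothetical toy example).
import math

def normalize(vec):
    """Scale a feature vector to unit length so gesture size is ignored."""
    norm = math.sqrt(sum(x * x for x in vec)) or 1.0
    return [x / norm for x in vec]

def train(labelled_samples):
    """labelled_samples: dict gesture_name -> list of raw feature vectors.
    Averages the normalized samples of each gesture into one template,
    which absorbs some per-user variability."""
    model = {}
    for name, samples in labelled_samples.items():
        normed = [normalize(s) for s in samples]
        n = len(normed)
        model[name] = [sum(col) / n for col in zip(*normed)]
    return model

def classify(model, vec):
    """Return the gesture whose template is closest to the input vector."""
    v = normalize(vec)
    def dist(name):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(v, model[name])))
    return min(model, key=dist)

# Hypothetical usage with two toy gestures:
model = train({"swipe": [[1.0, 0.0], [2.0, 0.1]],
               "pinch": [[0.0, 1.0], [0.1, 2.0]]})
```

Normalizing before matching is what lets the same template cover small and large performances of a gesture; real systems add temporal modeling on top of this per-frame step.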
References
[1] M. C. Johnson-Glenberg, “Immersive VR and education: Embodied design principles that include gesture and hand controls,” Frontiers in Robotics and AI, vol. 5, article 81, Jul. 2018. doi: 10.3389/frobt.2018.00081. Available at: https://www.frontiersin.org/articles/10.3389/frobt.2018.00081/full
[2] Y. J. H. Pérez and P. A. E. Cevallos, “Impacto de la enseñanza basada en proyectos apoyada por tecnología en el desarrollo de habilidades del siglo XXI en estudiantes de secundaria,” Bastcorp International Journal, vol. 3, no. 1, pp. 4–18, Apr. 2024. doi: 10.62943/bij.v3n1.2024.33. Available at: https://editorialinnova.com/index.php/bij/article/view/33
[3] G. C. Trávez, “El uso de la realidad aumentada en la enseñanza de ciencias: Un enfoque integrador en educación secundaria,” Revista Científica Kosmos, vol. 2, no. 1, pp. 39–50, May 2023. doi: 10.69583/inndev.v3n2.2024.133. Available at: https://editorialinnova.com/index.php/rck/article/view/43/87
[4] Y. Lee and B.-S. Sung, “Immersive Gesture Interfaces for Navigation of 3D Maps in HMD-Based Mobile Virtual Environments,” Mobile Information Systems, 2018. doi: 10.1155/2018/2585797. Available at: https://onlinelibrary.wiley.com/doi/10.1155/2018/2585797
[5] J. J. M. Cusme, “Análisis del mundo virtual con relación a la Educación 4.0,” Revista Ingenio Global, vol. 3, no. 1, pp. 29–45, 2024. doi: 10.62943/rig.v3n1.2024.73. Available at: https://editorialinnova.com/index.php/rig/article/view/73
[6] E. M. Mokhtar, Investigating User Experience Using Gesture-Based and Immersive-Based Interfaces on Animation Learners [Internet], 2023 [accessed Jun. 1, 2025]. Available at: https://research.gold.ac.uk/id/eprint/33974
[7] J. Agurto Cabrera and C. Guevara Vizcaíno, “Realidad Virtual para la mejora del rendimiento académico en estudiantes de Educación Superior,” Revista Metropolitana de Ciencias Aplicadas, vol. 6, no. 2, pp. 233–243, 2023. doi: 10.56048/MQR20225.8.4.2024.6856-6876. Available at: https://www.investigarmqr.com/ojs/index.php/mqr/article/view/2111?articlesBySimilarityPage=10
[8] D. Carrizo and C. Moller, “Estructuras metodológicas de revisiones sistemáticas de literatura en Ingeniería de Software: un estudio de mapeo sistemático,” Revista Chilena de Ingeniería, vol. 26, no. 1, pp. 45–54, 2018. doi: 10.4067/S0718-33052018000500045. Available at: https://www.scielo.cl/scielo.php?script=sci_arttext&pid=S0718-33052018000500045
[9] B. Kitchenham, Procedures for Performing Systematic Reviews [Internet], Jul. 2004. Available at: https://www.inf.ufsc.br/~aldo.vw/kitchenham.pdf
[10] M. A. M. De la Torre, “Diseño de un entorno de aprendizaje basado en juegos serios para la enseñanza de habilidades de programación en niños de primaria,” Nexus Research Journal, vol. 1, no. 1, pp. 24–33, 2022. doi: 10.62943/rig.v3n1.2024.74. Available at: https://editorialinnova.com/index.php/rig/article/view/74
[11] E. Henríquez and M. Zepeda, “Elaboración de un artículo científico de investigación,” Ciencia y Enfermería, vol. 10, no. 1, pp. 17–21, 2004. Available at: https://scielo.conicyt.cl/pdf/cienf/v10n1/art03.pdf
[12] F. Winckler Simor et al., “Usability evaluation methods for gesture-based games: A systematic review,” JMIR Serious Games, vol. 4, no. 2, 2016. doi: 10.2196/games.5860
[13] M. Wu, “Gesture Recognition in Virtual Reality,” Psychomachina, vol. 1, no. 1, pp. 1–11, 2023. doi: 10.59388/pm00336. Available at: https://journal.scidacplus.com/index.php/psychomachina/article/view/336
[14] V. H. Contreras, Interfaces gestuales como complemento educativo, cognitivo y social en niños con TEA [Internet], 2019 [accessed Jun. 1, 2025]. Available at: https://imgbiblio.vaneduc.edu.ar/fulltext/files/TC130444.pdf
[15] C. Y. H. Hongjun and X. B. Yalan, “GestureTeach: A gesture-guided online teaching interactive model,” Computer Animation and Virtual Worlds, vol. 35, no. 1, 2023. doi: 10.1002/cav.2218. Available at: https://www.researchgate.net/publication/375028482_GestureTeach_A_gesture_guided_online_teaching_interactive_model
[16] M. J. Abásolo et al., “Realidad aumentada, realidad virtual, interfaces avanzadas, juegos educativos,” in XVIII Workshop de Investigadores en Ciencias de la Computación, Entre Ríos, Argentina, 2016. Available at: https://sedici.unlp.edu.ar/handle/10915/52968
[17] G. A. G. Sarabia, “Integración de la inteligencia artificial en la educación secundaria,” Horizon International Journal, vol. 3, no. 1, pp. 4–12, 2025. doi: 10.63380/hij.v3n1.2025.59. Available at: https://www.editorialsphaera.com/index.php/hor/article/view/59
[18] P. M. Vera, R. A. Rodríguez and K. Carrau Mariano, “Experiencias en el desarrollo de aplicaciones móviles con interfaces basadas en la interacción física,” ReCIBE Rev. Electron. Comput., vol. 9, no. 1, pp. C2–16, 2020. Available at: https://www.redalyc.org/journal/5122/512267930002/html/
[19] H. Liu et al., “A Reconfigurable Data Glove for Reconstructing Physical and Virtual Grasps,” Engineering, vol. 32, pp. 202–216, 2024. Available at: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=5208853
[20] S. Kang, G. L. Hallman, L. Son and J. B. Black, “The Different Benefits from Different Gestures in Understanding a Concept,” Journal of Science Education and Technology, vol. 22, pp. 825–837, 2013. doi: 10.1007/s10956-012-9433-5. Available at: https://www.proquest.com/docview/2259586292?sourcetype=Scholarly%20Journals
[21] L. F. B. Domínguez, “Impacto de programas de inmersión cultural virtual en el desarrollo de competencias interculturales en estudiantes universitarios,” Alpha International Journal, vol. 2, no. 1, pp. 51–66, 2024. doi: 10.63380/aij.v2n1.2024.48. Available at: https://editorialsphaera.com/index.php/alp/article/view/48
[22] N. J. Ulbricht, “The Embodied Teaching of Spatial Terms: Gestures Mapped to Morphemes Improve Learning,” Frontiers in Education, vol. 5, article 109, 2020. doi: 10.1371/journal.pone.0280543. Available at: https://www.researchgate.net/publication/367963295_Can_grammatical_morphemes_be_taught_Evidence_of_gestures_influencing_second_language_procedural_learning_in_middle_childhood
[23] C. D. Carreño and E. M. López, Asistente virtual para el aula basado en estilos de aprendizaje utilizando herramientas de reconocimiento [Internet], 2024 [accessed Jun. 1, 2025]. Available at: http://repository.pedagogica.edu.co/bitstream/handle/20.500.12209/20547/asistente%20virtual.pdf
[24] J. A. Bejarano, Interfaz de usuario basado en seguimiento de manos para realidad extendida [Internet], 2024 [accessed Jun. 1, 2025]. Available at: https://burjcdigital.urjc.es/server/api/core/bitstreams/7ad12403-2c2e-4a6d-9629-6c4c05575e35/content
[25] N. S. Chen and W. Fang, “Gesture-based technologies for enhancing learning,” in How Gesture-Based Technology Is Used in Education to Support Teaching and Learning: A Content Analysis, National Sun Yat-sen University, pp. 95–112, 2013. doi: 10.1007/978-3-642-38291-8_6. Available at: https://www.researchgate.net/publication/278701781_Gesture-Based_Technologies_for_Enhancing_Learning
[26] O. Restrepo and D. E. Casas, Prototipo de un ambiente inmersivo basado en realidad aumentada haciendo uso de dispositivos de control gestual e interfaces cerebro-computadora para la terapia fino motora en niños [Internet], 2019 [accessed Jun. 1, 2025]. Available at: https://repository.udistrital.edu.co/server/api/core/bitstreams/f956edef-0b06-40dd-bc6b-2467aa7a2f78/content
[27] M. Bernardos et al., “Assessing the acceptance of a mid-air gesture syntax for smart space interaction: An empirical study,” Journal of Sensor and Actuator Networks, vol. 13, no. 2, 2024. doi: 10.3390/jsan13020025. Available at: https://www.mdpi.com/2224-2708/13/2/25
License
Copyright (c) 2025 Susana Masapanta-Carrión, Javier Guaña-Moya, Yamileth Arteaga-Alcívar (Author)

This work is licensed under a Creative Commons Attribution 4.0 International License. Unless otherwise stated, associated published material is distributed under the same license.