Metaverse Interaction Technology Trends and Development Prospects

  • S.M. Baek (Immersive Interaction Research Section) ;
  • Y.H. Lee (Immersive Interaction Research Section) ;
  • J.Y. Kim (Content Convergence Research Section) ;
  • S.H. Park (Content Convergence Research Section) ;
  • Y.-H. Gil (Immersive Interaction Research Section)
  • Published: 2024.04.01

Abstract

The Metaverse industry is developing rapidly, and related technologies are being actively improved. Tools such as controllers, keyboards, and mice are used to interact in the Metaverse, but they are not natural, intuitive interfaces that resemble real-world interaction. Immersive interaction in a Metaverse space requires the engagement of various senses, such as vision, touch, and proprioception. Moreover, in terms of bodily sensation, it requires a sense of body ownership and agency. In addition, eliciting cognitive and emotional empathy from nonverbal expressions, which are not easily conveyed to the digital world, requires technologies beyond existing emotion-measurement solutions. These diverse technologies can converge to build an immersive, realistic Metaverse environment. We review the latest research trends in technologies related to immersive interaction and analyze their future development prospects.

Acknowledgments

This work was supported by the Internal Research and Development Program (fundamental project) of the Electronics and Telecommunications Research Institute (ETRI) [24ZC1200, Development of Core Source Technologies for Egocentric (User-Centric) Remote Empathetic Interaction].
