
Meeting You

Capstone project for Interactive Media Art

This project is a mixed reality interactive experience that explores immersiveness. It lets people meet creatures with healing power (dogs, for example) and expands the boundary of the virtual world.

Introduction

This project explores the impact of haptic feedback on human emotion perception in virtual environments through offline experiments. The project began by searching and integrating data from online databases. Since there is a large body of research on VR, haptic feedback, and emotion, and the three are strongly correlated to a certain extent, the experimental design and procedure refer to several previous papers, drawn mainly from the APA (American Psychological Association) database and Google Scholar.

After the experiments, I decided to go a step further and build a VR project that lets people play with their pets (whether deceased or away from home) and gain an emotional experience from it. Due to Covid-19, people have lost many opportunities for real-life contact, and VR brings people out of lockdown. However, the interaction settings of many VR experiences are not coherent in their operation design; for example, player locomotion in The Lab requires casting a ray to select a point and then instantly jumping to it. In addition, there are currently very few pet-related VR experiences, and those that exist rarely include haptic elements.

For the virtual part, I first built a lifelike VR scene in Unity, then imported the animated animal models and added AI and interaction scripts to the characters. For the haptic experience, I used fabric materials and rubber toys to simulate the feeling of touch. In this project, users can interact with virtual characters with high fidelity and a high degree of freedom.

Preliminary Research

How does haptic feedback influence the emotional communication of adults in a highly immersive environment?

The purpose of this research is to investigate whether adding specific haptic simulation activities in a highly immersive virtual environment will affect the emotional communication between participants and virtual characters.
 

Inspiration

A documentary showed how VR helped a Korean mother reunite with her late daughter.

The Lab is a VR game in which players can pet a robot dog.

RoomShift is a mixed reality system that provides room-scale dynamic haptics for VR with furniture-moving swarm robots.

Process

Interactions

The scene starts in a room in the early afternoon. The viewer sees the dog wandering in front of them. There are two interactive objects on the ground: a duck and a ball. By picking up the objects, the viewer sees different reactions from the dog. The viewer can also move freely in the room if the physical space is big enough. At the same time, there are physical objects with matching textures.

Platform: Oculus Quest 2, Unity Editor.

Game Scene

I referenced some pictures on the web, then built a simple but comfortable room in the Unity editor, and used a skybox and lighting effects to create a sunny afternoon atmosphere.

For the character and animations, I chose a Shiba Inu as the main character for this project. The Shiba Inu animations come from a public sharing channel on the Internet; at the request of the copyright owner, they will not be redistributed or used for commercial purposes. A total of fourteen animations were used in the project. I first followed a video tutorial to add a script for the character's random movement, then added animations such as walking, turning left, turning right, and lying down to match the actions driven by the script. Because the character's position is very random, I used NavMesh to position the character and its target.
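A minimal sketch of this random-wander behaviour is shown below. It assumes the dog has a NavMeshAgent and an Animator with a "Walking" bool parameter; the field and parameter names are illustrative rather than the exact ones used in the project.

using UnityEngine;
using UnityEngine.AI;

// Sketch: pick random points on the NavMesh, walk to them, and idle in between.
public class DogWander : MonoBehaviour
{
    public float wanderRadius = 3f;   // how far a new destination may be
    public float idleTime = 2f;       // pause between walks

    private NavMeshAgent agent;
    private Animator animator;
    private float timer;

    void Start()
    {
        agent = GetComponent<NavMeshAgent>();
        animator = GetComponent<Animator>();
        timer = idleTime;
    }

    void Update()
    {
        // When the dog has reached its destination, wait, then pick a new random point.
        if (!agent.pathPending && agent.remainingDistance <= agent.stoppingDistance)
        {
            animator.SetBool("Walking", false);
            timer -= Time.deltaTime;
            if (timer <= 0f)
            {
                Vector3 random = transform.position + Random.insideUnitSphere * wanderRadius;
                if (NavMesh.SamplePosition(random, out NavMeshHit hit, wanderRadius, NavMesh.AllAreas))
                {
                    agent.SetDestination(hit.position);
                    animator.SetBool("Walking", true);
                    timer = idleTime;
                }
            }
        }
    }
}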


In order for the character to match the animation when moving, I set up nine bool conditions and made each animation state establish a transition with the other states. I set a trigger collider on the player's controller, that is, at the position of the hand, so that when the player touches an interactive object, the dog responds accordingly. I also added collision recognition between the dog and the interactable objects.
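A rough sketch of this trigger logic is below. It assumes the interactable toys are tagged "Toy" and the dog's Animator exposes a bool such as "Excited"; the tag and parameter names are placeholders, not necessarily those used in the project.

using UnityEngine;

// Sketch: a trigger collider on the player's hand flips the dog's animation state
// whenever it overlaps an interactable toy.
public class HandTrigger : MonoBehaviour
{
    public Animator dogAnimator;   // assigned to the Shiba Inu's Animator in the Inspector

    private void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Toy"))
        {
            dogAnimator.SetBool("Excited", true);
        }
    }

    private void OnTriggerExit(Collider other)
    {
        if (other.CompareTag("Toy"))
        {
            dogAnimator.SetBool("Excited", false);
        }
    }
}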


For interaction I use both the Oculus controllers and hand recognition. Players can either interact with items through the controllers or use hand recognition to interact with real-world props. The advantage of the controllers is that they are more stable than hand recognition. In terms of player movement, I limited the area of the room in which the player can move freely.
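One simple way to restrict free movement to the room is sketched below, assuming the playable area is an axis-aligned box around the scene origin. The bounds values are illustrative; the actual limits depend on the physical play space.

using UnityEngine;

// Sketch: clamp the rig's horizontal position so the player stays inside the room.
public class RoomBounds : MonoBehaviour
{
    public Transform playerRig;                    // the XR / OVR camera rig root
    public Vector3 min = new Vector3(-2f, 0f, -2f);
    public Vector3 max = new Vector3( 2f, 0f,  2f);

    void LateUpdate()
    {
        Vector3 p = playerRig.position;
        p.x = Mathf.Clamp(p.x, min.x, max.x);
        p.z = Mathf.Clamp(p.z, min.z, max.z);
        playerRig.position = p;
    }
}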


DEMO

Reflection and Future Plan

The main function works smoothly during the user test, but there are still some problems with implementing this idea. One issue is that collision detection in VR behaves somewhat differently from the effect in the Unity editor. I originally set the animation to be triggered after the player throws the ball (OnTriggerExit), but this happens very quickly in VR, so as soon as the player touches the ball, the dog is attracted to it.
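One possible workaround, sketched below rather than the shipped fix, is to react to the ball's velocity after release instead of OnTriggerExit, so the dog is only attracted once the ball has actually been thrown. The "isHeld" flag and "Fetching" parameter are placeholder names.

using UnityEngine;

// Sketch: start the fetch reaction only when the released ball is moving fast enough.
public class BallThrowDetector : MonoBehaviour
{
    public Animator dogAnimator;          // the dog's Animator
    public float throwSpeed = 1.5f;       // minimum speed that counts as a throw
    public bool isHeld;                   // set by the grab/release code

    private Rigidbody rb;

    void Start()
    {
        rb = GetComponent<Rigidbody>();
    }

    void Update()
    {
        if (!isHeld && rb.velocity.magnitude > throwSpeed)
        {
            dogAnimator.SetBool("Fetching", true);
        }
    }
}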

 

Future plan

It requires transmitting data over the network directly from the sensor (the Arduino part) to the VR headset rather than through the Unity editor, which is hard to achieve without facilitation such as Oculus Link. I want to finish applying sensors in a physical environment similar to the RoomShift project; in that case, the physical props could move to a certain destination according to the updating data generated by the user.
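A minimal sketch of the kind of networking this would need is below: the Quest build listens for UDP packets that an Arduino with a WiFi module could send over the local network, so sensor data reaches the headset without Oculus Link. The port number and message format are assumptions for illustration only.

using System.Net;
using System.Net.Sockets;
using System.Text;
using UnityEngine;

// Sketch: poll a UDP socket each frame and log any sensor readings that arrive.
public class SensorReceiver : MonoBehaviour
{
    public int port = 9000;
    private UdpClient client;

    void Start()
    {
        client = new UdpClient(port);
    }

    void Update()
    {
        while (client.Available > 0)
        {
            IPEndPoint remote = new IPEndPoint(IPAddress.Any, 0);
            byte[] data = client.Receive(ref remote);
            string message = Encoding.ASCII.GetString(data);
            Debug.Log("Sensor reading: " + message);
        }
    }

    void OnDestroy()
    {
        client.Close();
    }
}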
