Locomotion is the medium that allows users to navigate through a virtual environment. When physical space is limited, games often rely on buttons or directional pads to move the user from their original position within the 3D space.
Many games work around locomotion by not giving movement control to the player at all. Titles such as “Beat Saber” and “VR the Diner” require no user movement. Even in 3D painting, movement can be avoided by rotating the sculpture rather than walking around it. Some shooting games keep the action inside a car or a building and automatically advance players when they clear a stage, reducing the need for the player to move.
There are definitely times when movement is necessary for an immersive experience. The main workaround we see nowadays is the 360-degree treadmill that keeps players in place. Just be careful to let players play safely in case things go south, like this: https://www.reddit.com/r/funny/comments/pkuzli/vr_workout_with_added_heart_attack/
Locomotion based on In-Place Motion Sensing
When users want to move in the VR landscape, they execute a specific physical action on the spot to do so. This could be something simple like a small forward movement of the right arm to move the right leg forward. This relates to the Sensory Conflict Theory: the aim is to minimize the conflict, since the user's mental model, which registers that the arm moved, is now matched by the corresponding movement in the VR landscape.
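The arm-to-leg mapping above can be sketched as a simple threshold check. This is a minimal illustration, not a real tracking API: the function name, the per-frame displacement input, and the constants are all assumptions.

```python
# Sketch of in-place arm-swing locomotion. `arm_delta` stands in for
# the controller's forward displacement (in metres) since the last
# frame, as a hypothetical tracker might report it.

SWING_THRESHOLD = 0.05  # minimum forward arm movement to count as a step
STEP_LENGTH = 0.4       # how far the avatar advances per detected swing

def step_from_arm_swing(arm_delta: float) -> float:
    """Map a forward arm swing to forward avatar displacement.

    Returns 0.0 when the swing is too small to register, so idle hand
    jitter does not move the player and reintroduce sensory conflict.
    """
    if arm_delta >= SWING_THRESHOLD:
        return STEP_LENGTH
    return 0.0
```

The threshold is the key design choice here: it separates deliberate swings (which should move the avatar, matching the mental model) from incidental motion (which should not).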
Reduce Motion Sickness using Sensory outputs
Since, under the Sensory Conflict Theory, the user experiences motion sickness due to a mismatch between their senses and/or mental model, we could output some form of sensory feedback to the user when they walk, to try and fill in the gap.
For example, when the user moves a joystick to move the player in the VR landscape, a small vibration could be sent through the controller each time the player's foot lands on the ground in the VR landscape, to simulate the feeling of actually stepping on the ground. This attempts to bridge the gap between the sensory cues.
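The footstep-vibration idea could look like the sketch below. The `Haptics` class is a stand-in for a real controller haptics interface (real runtimes such as OpenXR expose haptic output actions); its method name and parameters are assumptions for illustration.

```python
# Sketch: fire a short haptic pulse whenever the avatar's foot
# contacts the ground, to pair a physical cue with the visual step.

class Haptics:
    """Stand-in recorder for a controller haptics API."""
    def __init__(self):
        self.pulses = []

    def pulse(self, amplitude: float, duration_ms: int):
        # A real implementation would drive the controller motor here.
        self.pulses.append((amplitude, duration_ms))

def on_footstep(haptics: Haptics, surface: str = "ground"):
    # A soft, short pulse approximates the jolt of a footfall; a real
    # implementation might vary amplitude and duration by surface type.
    amplitude = 0.3 if surface == "ground" else 0.5
    haptics.pulse(amplitude, 40)
```

The point of the sketch is the event pairing: the pulse fires on the same frame as the virtual foot contact, so the tactile cue and the visual cue arrive together.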
Reduce Motion Sickness using the User Interface as a Rest Frame
In typical 2D/3D games, the User Interface is fixed to the game screen.
In VR, we can try to do the same thing by fixing the User Interface onto the user's vision, just as it would be in a regular 2D/3D game. The User Interface can then act as a rest frame for the user, since its elements serve as fixation points during actual locomotion.
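Keeping the UI fixed in the user's vision amounts to recomputing the panel's world position from the head pose every frame. A minimal sketch, assuming vectors are plain (x, y, z) tuples and a normalized forward vector; a real engine would use its own transform and math types.

```python
# Sketch: lock a UI panel a fixed distance along the user's gaze so
# it can act as a rest frame. Called once per frame with the current
# head pose.

def head_locked_ui_position(head_pos, head_forward, distance=1.5):
    """Place the UI `distance` metres along the gaze direction."""
    return tuple(p + distance * f for p, f in zip(head_pos, head_forward))
```

Because the panel's position is derived from the head pose itself, it stays stationary relative to the user's view no matter how the avatar moves, which is exactly what makes it usable as a fixation point.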
Assume that we have two handles attached to the ceiling in front of the user. Each hand can reach a handle and pull it toward you. When your left hand pulls its handle, your right foot moves forward one grid cell. When you release a handle, it returns to its original position, awaiting the next pull. So by pulling one handle and releasing it while pulling the other, you can move forward. And if you pull both handles together, you jump forward one grid cell.
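The handle scheme above reduces to a small per-frame decision: both handles pulled means jump, one pulled means step, none pulled means idle. A minimal sketch with illustrative names; real input would come from the tracked controllers.

```python
# Sketch of the ceiling-handle locomotion scheme: handle states are
# sampled each frame and mapped to a grid-based locomotion action.

def resolve_pull(left_pulled: bool, right_pulled: bool) -> str:
    """Map the two handle states to a locomotion action."""
    if left_pulled and right_pulled:
        return "jump"   # both handles pulled together: jump one grid cell
    if left_pulled or right_pulled:
        return "step"   # a single pull advances the opposite foot one cell
    return "idle"       # handles released, springing back to origin
```

Alternating calls with `(True, False)` and `(False, True)` yields a stream of `"step"` actions, reproducing the pull-release-pull walking rhythm described above.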