FPV Drone and DJI Virtual Flight

What is an FPV Drone?

An FPV drone is very different from a traditional drone. The pilot flies while wearing VR-style goggles: the live view from the drone's onboard camera is transmitted to the goggles with low latency, allowing the pilot to control the drone more precisely. With this immersive, real-time view, FPV drones let pilots shoot stunning cinematic aerial images and videos and perform acrobatic moves like backflips and rolls. The goggle integration takes drone photography, cinematography, and entertainment to a whole new level.

FPV is hard! 

Though FPV drones are much more powerful than traditional drones in many respects, they are also notoriously hard to maneuver. Flying from the first-person onboard camera view means pilots need to get accustomed to the tilted view caused by the drone's movement. Most drones can move in eight basic directions (imagine an invisible xyz coordinate system centered on the drone: the positive and negative x, y, and z directions, plus clockwise and counterclockwise rotation), so pilots need to learn to fly with eight different kinds of camera-motion feedback, not to mention changes in speed. Even as a drone pilot of three years, learning to fly an FPV drone still took me weeks of training in the simulator before I felt confident. DJI Virtual Flight is the VR simulator application that really helps here.
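Those eight movement directions correspond to the four stick channels of a typical FPV radio, each covering one axis in both directions. A minimal Python sketch (the channel names, axis labels, and dead-zone value are illustrative, not DJI's actual mapping):

```python
# Each of the four stick channels covers one axis in both directions,
# yielding the eight basic movement directions described above.
CHANNELS = {
    "throttle": ("climb (+z)", "descend (-z)"),
    "pitch": ("forward (+x)", "backward (-x)"),
    "roll": ("right (+y)", "left (-y)"),
    "yaw": ("rotate clockwise", "rotate counterclockwise"),
}

def movement(channel, stick_value):
    """Return the movement direction for a stick deflection in [-1, 1]."""
    positive, negative = CHANNELS[channel]
    if abs(stick_value) < 0.05:  # small dead zone around center
        return "hold"
    return positive if stick_value > 0 else negative
```

In flight, all four channels are deflected at once, which is exactly why the pilot has to learn to read eight overlapping kinds of camera feedback simultaneously.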

VR FPV Simulator App – DJI Virtual Flight

DJI Virtual Flight provides a series of flight training sessions, from taking off, flying straight (you may be surprised that flying straight in an FPV drone is in fact not as easy as you think), turning, and landing, to more advanced tricks. It also offers a few good maps for users to practice shooting cinematic clips or trying out acrobatic moves. Everything that happens on the drone controller is reflected in the view in the VR goggles.

Possible Improvements for DJI Virtual Flight

It is true that DJI Virtual Flight provides a good amount of training for a new FPV drone pilot, but its accuracy still has room for improvement. From my personal experience, DJI Virtual Flight is only about 60–70 percent accurate compared to actual flight. Also, only three free-flight maps are provided in the application, which is very limiting. More importantly, those maps are all virtual environments, so they cannot serve as a rehearsal for a real shoot. Imagine if real-world places were provided: drone pilots could fly across the Swiss Alps or the Blue Lagoon in Iceland in the simulator, taking their time to work out how to frame the video before their actual visit.

Survival Horror Puzzles – Five Nights at Freddy’s: Help Wanted

Five Nights at Freddy’s: Help Wanted is a VR game that I enjoy recommending to first-time VR players. It consists of a collection of mini-games built around a novel theme that could be described as survival horror puzzle.

On PC, these mini-games were interesting at best; the animatronics were stuck behind a flat screen, and I felt safe controlling the mouse to interact with the various UI elements. In VR, however, the experience was astonishingly elevated. I was right there in that enclosed, dark room. Alone. I couldn’t see the people around me. No matter where I looked, it was just dark, creepy aesthetics. There was no way out. It was truly immersive. I couldn’t help but be engaged with the game.

In the PC version, there was a UI button that brought the user to a screen showing CCTV footage of the animatronics (who are out to kill you) at various locations.

In the VR version, the developers made a fantastic choice of embedding all UI components within the virtual environment.

This time, the buttons were right in front of you, as if you were standing at a working CCTV system. You have to physically “press” the buttons within the game with your virtual hands, as you would intuitively do in real life. There were no distracting UI elements or HUD to break the immersion in this 3D environment. This particular design choice deepened the immersion I felt in the game.

Another interesting thing to note is that the jumpscares here were noticeably more terrifying. Take the moment when the player runs out of battery. The doors are now left wide open for the animatronics to attack, but the game doesn’t just end there. The audio-visual experience of this moment is simply beautiful. The lights turn off. You hear the distinct sound of the power running out. A pair of eyes blinks open to stare at you. Then an oddly friendly yet creepy rendition of the “Toreador March” plays in the background. You are left to dread your impending doom while still holding a slight chance of surviving the night; you never know for sure. And most importantly, your senses have no way out of this virtual environment. You can close your eyes, but you won’t fully block out the light. This experience can be truly terrifying for the player (and also fun to watch as a bystander). It is just one of the many ways the player gets jump-scared in the game, but it is my favorite, as it encapsulates all the techniques needed to engineer a scare.

Note: The environment was desaturated in this particular level

There were some features in particular that I believe could be improved.

Firstly, the replayability value is quite low in multiple game modes. You pick up certain techniques to deal with the game’s various puzzles, and for several game modes, once you master a few simple ones, the game becomes repetitive and slightly boring. This could be due to the fact that the game is a collection of mini-games: there is great breadth but little depth in some places.

Secondly, in certain modes, as seen in the video clip above, the environment’s color scheme is altered. In the above example, the environment becomes desaturated. In my opinion, this particular mode drains both the game’s life and its creepiness factor. As a player, I felt less attracted to and less engaged with the environment.

Thirdly, while the build-up to the jumpscare was well done, as discussed above, I believe more work could have been done by the developer to make the actual jumpscare more terrifying. The environment blacks out immediately prior to the scare, then cuts to an animatronic screaming in your face. This seems to lack a little polish. I feel the scares could integrate a little more with the user’s surrounding 3D environment to completely seal the immersion of the scare.

Finally, there were limited interactions with prizes in the game. The player can only carry the prizes around or put them near their face to “consume” them. I understand this is merely a fun gimmick for the player to fool around with in a VR environment, but it is too repetitive and predictable. I don’t actually look forward to winning these “prizes” upon completing a level. One simple improvement would be to give the prizes more interactive components, such as buttons that perform actions or generate sounds, or movable limbs, instead of leaving them as static props.


Overall I enjoyed this game immensely, not only as a player but also as a bystander. 🙂

BMW iX: navigation with Augmented Reality Video

Introduction:
The new Augmented Reality Video function supplements the BMW iX navigation system’s map view, enabling the driver to find their way on the road with great accuracy. It engages drivers with a live video stream from the driver’s perspective, shown on the control display and augmented with supplementary information that matches the context.

Interesting Features:
When dealing with confusing junctions, for instance, an animated directional arrow is integrated into the video image to help the driver take the correct turn-off for the planned route. Depending on the situation, the Augmented Reality Video view is activated just before the maneuver and disappears again afterwards. Providing drivers with clear guidance through complicated junctions, such as three-way or four-way intersections, can help reduce traffic accidents.
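At its core, overlaying the arrow on the video stream means projecting a 3D point on the planned route into the 2D camera image. A minimal pinhole-camera sketch in Python (the intrinsics and the turn point are made-up numbers for illustration; BMW's actual pipeline is not public):

```python
def project_to_image(x, y, z, fx, fy, cx, cy):
    """Project a 3D point in camera coordinates (x right, y down, z forward,
    in meters) onto the image plane of an ideal pinhole camera (pixels)."""
    if z <= 0:
        return None  # point is behind the camera: don't draw the arrow
    u = fx * x / z + cx
    v = fy * y / z + cy
    return u, v

# Hypothetical turn point 20 m ahead, 2 m to the right, 1.5 m below the
# camera, with made-up intrinsics for a 1280x720 video stream.
arrow_px = project_to_image(2.0, 1.5, 20.0, 1000.0, 1000.0, 640.0, 360.0)
```

Once the turn point's pixel coordinates are known, the animated arrow can be drawn there and will appear anchored to the road as the car moves.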

Possible Improvements:
However, the driver has to constantly divert their eyes from the road to the center screen. It would be better if the augmented reality were projected onto the windscreen as a HUD, so that the driver could keep watching traffic while following the navigation cues.

Mozilla Hubs

Online collaboration has become more popular in recent years. New XR technology enables interesting and exciting interactions that were not possible before. Ever since Covid-19 came around, being able to collaborate online has more or less become a necessity, for safety reasons.

One application that can be used for this is Mozilla Hubs, an open-source social virtual reality platform. It is accessible on desktop, mobile, or with a VR headset, and it makes collaboration fun. To start a meeting, simply create a room and share the link with your friends or colleagues. You can either select a premade room with just a click or create your own room from scratch using the 3D modeling tool Spoke. Inside the room, you can move around using the keyboard or a joystick, depending on which device you are using. Hubs has the basic video conference features such as screen sharing, drawing, muting/unmuting the mic, and reactions, but you can also place 3D objects in the virtual space. Before I go into more detail about Hubs, it is worth mentioning that I have only tried the application on a computer.

So why would you use this application? One reason is that it feels like you are actually interacting with people during the meeting, and they have to participate actively. For me at least, there is nothing more boring than talking to a screen while everyone has their camera turned off and no one is paying attention. But when someone is sharing a screen in Hubs, for example, you have to walk up to the screen to see what is being shared. Everyone also gets to choose their own avatar, which makes it feel more personal.

The spatialized audio also means that you have to be close to the person you are conversing with to hear them. This is clever because you can easily break out into smaller groups just by distancing yourself from the main group. What I especially like is that Hubs is widely accessible: it works on all devices, as mentioned before, and there is no installation required, which allows for an easy setup. However, if you want to spend more time setting a room up, there are many opportunities for customization.
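The proximity effect boils down to scaling each speaker's volume by the distance between avatars. A toy sketch of distance-based attenuation (the parameter names and values are my own; they are similar in spirit to the inverse-distance model in the Web Audio API that browser-based platforms build on, not Hubs' exact implementation):

```python
def spatial_gain(distance_m, ref_distance=1.0, rolloff=1.0, min_gain=0.0):
    """Distance-based volume attenuation: full volume inside ref_distance,
    then inverse-distance falloff toward an optional floor."""
    if distance_m <= ref_distance:
        return 1.0
    gain = ref_distance / (ref_distance + rolloff * (distance_m - ref_distance))
    return max(gain, min_gain)
```

With a curve like this, someone three meters away is heard at roughly a third of full volume, which is why stepping away from the group naturally creates a breakout conversation.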

Another feature that I like is the object panel, where you can find and edit all of your objects. In a 3D world it can be hard to keep track of all your placed objects, so it is nice to have an overview of them. And of course, so as not to take away from the immersive experience, you can also interact with and edit them with your avatar.

Although I think it is a good application, there are some features that could be further developed. One example is that it is hard to position the objects you place in your room, like a shared screen or a 3D object. When placed, an object appears in front of your avatar, and you then have to reposition it manually. Since objects cast no shadows, it is hard to perceive where you are placing them. One suggestion to make placement easier is to add a shadow under the object, or just a small marking on the floor that follows the object, so you have a reference point.

To sum this up, I think that Mozilla Hubs is a good VR application that is easy to use, and it is for everyone. 

Shooting Zombies in Hyperreality VR – Deadwood Mansion

Zombie-slaying gameplay isn’t something new; we’ve seen it in PC games and even in arcade stations. Deadwood Mansion, however, takes the experience to a whole new level. Players don haptic suits and body trackers, giving them a fully immersive experience.

Deadwood Mansion – Official Trailer

Deadwood Mansion is the first hyperreality escape experience created by Sandbox VR. It utilises a wide range of technology, including Hollywood motion capture cameras, 3D body precision trackers, and haptic suits, built by engineers from EA, Sony, and Ubisoft. Sandbox VR has franchises across the USA and Asia (including one in Singapore!) that we can visit. The price of a ticket is surprisingly affordable, ranging from SGD 32–42 per pax.

What drew me to Deadwood Mansion was the idea of really bringing us into a different world and letting us feel like we are living in it. As exciting as VR is, a part of me would always remember that what I saw wasn’t reality. I also loved the multiplayer aspect: up to 6 players are allowed in the room, and you can physically interact with everyone inside. You can even choose your weapons, from dual pistols to rifles.

Despite being a heavy shooter game, Deadwood Mansion does a great job of incorporating a narrative into the experience. This is achieved with voiceovers under the pretense of calls from the team’s intelligence. Other than that, the game uses HUDs sparingly. Like standard shooter games, hand reference frames are used to show statistics on the weapons, and some UI is placed in the environment when the group receives “calls” from the intelligence or the villain.

Sandbox VR also records several videos for each session: each player’s POV and a room-wide view of all players, with and without the virtual environment. Each group can sit down together to watch the footage after their session. Some of the videos really look as if the players’ real bodies were transformed into virtual bodies.

All Time High Scoreboard

There is also a world wide leaderboard for game scores. I find that it gives some level of competition outside the game.

Besides Deadwood Mansion, Sandbox VR also provides many other worlds; some notable ones are Star Trek and UFL. Star Trek: Discovery lets players handle a Starfleet phaser and find a lost Starfleet ship. UFL, on the other hand, lets people split into teams and battle in a gladiator fighting ring.

While I haven’t had the chance to experience it myself, there seems to be a lot of gear for players to put on before the game. That could be a hassle before and during the game, depending on how heavy the equipment is. While it would be great to reduce the size and number of the gear, that would probably require a lot more research and development. After all, a hyperreality VR experience isn’t easy to build.

Another idea that I felt might make the game more immersive is having more than one room. Perhaps in between some zombie waves, the group could proceed to another room as part of the narrative. It could be something like the villain planting a bomb in the present room and the team having to escape to avoid the explosion. I think this would give a better sense of story and game progression.

Overall, I think Sandbox VR did a great job making hyperreality VR well, a reality. Personally, I see this as a one-time novel experience and not something that I would want to try again and again. Despite that, I think it offers a really fun and interesting activity that many people would be excited to try out.

Home Decoration With Ikea XR app

IKEA, one of the largest home furnishing businesses in the world, has extended its service into the realm of mixed reality to provide users with a better shopping experience when purchasing home furniture.

In 2017, IKEA launched a mixed reality mobile app that lets customers drop virtual furniture into their own homes and view it through their smartphone camera. It makes use of ARKit, developed by Apple, and the quality of the imagery and the realism is exceptional.

I particularly like this application because it breaks the norm of online shopping, where users purchase items based only on media files and descriptions and have no idea how a product will actually fit before receiving it. Mixed reality brings more possibilities to this market and puts greater freedom in users’ hands. They can now perform tasks that used to be impossible, such as trying out all kinds of furniture in their rooms during selection and easily moving and rotating pieces using only their fingers. This greatly improves the shopping experience and points toward a future trend of mixed reality applications in the online shopping market.

In particular, this application has a few new features that make it very useful. 

1. It is able to scan the home environment so that the AR furniture interacts with the real world and fits properly into the room. This avoids the trouble of measuring the space with a ruler and comparing it against the furniture’s dimensions. With a phone, scanning is as easy as pointing the camera at the space.

2. It is able to adjust the orientation and position of virtual furniture before placing it on the floor, using finger gestures just like the pinch-to-zoom users already know from their phone cameras. By speaking the user’s language, the app feels familiar and easy to use without much guidance or assistance, which makes for a great user experience. Moreover, with a clean UI design, the user can focus on how the object interacts with the real-world environment without being distracted by the app’s interface.

3. It is easy to follow, with just four steps: browse – scan – adjust – drop. Users can perform all the tasks with just a few taps. The simplicity and the optimized performance ensure the fluidity of the application and a good user experience. The final result is also of high quality and resembles reality, making it a useful reference before users make their purchase.
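The pinch gesture mentioned above typically maps the ratio of the current two-finger distance to the starting distance onto a scale factor. A minimal Python sketch (function name, clamp range, and coordinates are my own illustration, not IKEA's code):

```python
import math

def pinch_scale(p0_start, p1_start, p0_now, p1_now, min_s=0.5, max_s=2.0):
    """Map a two-finger pinch gesture to a scale factor: the ratio of the
    current finger distance to the distance when the gesture began,
    clamped to a sensible range. Points are (x, y) screen coordinates."""
    d_start = math.dist(p0_start, p1_start)
    d_now = math.dist(p0_now, p1_now)
    return max(min_s, min(max_s, d_now / d_start))
```

Because this is the same mapping used by camera zoom, users already have the muscle memory for it, which is exactly what makes the interaction feel familiar.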

However, there is still room for improvement to further satisfy users’ needs. For example, a user might want to purchase multiple pieces of furniture and find out whether they fit together, but the current app only allows a single selection at a time, which makes this task impossible. To meet this requirement, the app could introduce a “shopping cart” feature that lets the user select multiple items, place them in the scene, and see how they interact with each other. The application could even auto-place multiple items based on their relative orientations and sizes.

Nevertheless, the IKEA XR application is a breathtaking innovation in a market that is very well suited to XR technology. While there is still room for improvement, it could lead to more extensive and in-depth applications of XR across online shopping platforms.

Engaging emotions in VR: Resident Evil

What do I like about it?

Many VR applications use the human senses, e.g. sight and touch, to provide immersion. However, there are other ways to make users feel present in a VR experience. One of them is the use of emotion, and more specifically, fear. Fear is especially effective because it commonly overwhelms the user and their perception of reality. This holds even in real-life situations like haunted houses at theme parks: even though we know everything is staged, we can’t help but experience excitement and fear.

What features can be improved and how?

There are a few improvements that could be made based on the users’ feedback in the video. One user noted, “I have the same setup in the room I am in now.” This can be very effective in providing presence. It would be good if the VR game could capture the surroundings before the game starts and let the player walk through somewhere familiar in-game. The game could also include popular real-world locations, so that users associate the location with the real world and become truly immersed.

Another thing that could be improved is the controls. As it is a horror game, users are likely to try to run away in real life. Below is a video of an average consumer playing the game: even though she was holding a controller, she constantly moved from her position, and her family had to hold her so she wouldn’t knock into the furniture. This could be solved with an omnidirectional treadmill, which is probably too expensive currently. Apart from the VR headset itself, there should also be improvements in the other equipment supporting the VR experience.

Personally, I also feel that the VR headset is bulky, which affects the user experience. For better immersion, the user should feel comfortable, and the weight of the headset should not strain the head. Like Elon Musk said, for now I do not see users strapping on VR headsets for long periods of time: it looks uncomfortable, and the weight of the headset would probably cause some level of neck ache. Luckily, lighter headsets are in active development, including the HTC Vive Flow.

For now, I feel that the horror genre provides the most presence for the user, and I would like to research more into using emotions other than fear to give users the same level of presence in VR. Other emotions like sadness, anger, and joy are probably harder to invoke to the same degree, but they are worth looking into to offer users other kinds of VR experiences.

Star Walk 2

Star Walk 2 is a mobile application for iOS and Android that enables you to explore the sky. Point your device at the sky and you will see a real-time interactive sky map containing stars, planets, constellations, and more. The app is perfect for anyone who wants to learn more about the night sky by locating and identifying celestial objects.

Star Walk 2 uses your location and the device’s built-in compass to show you the stars visible above you in real life and to display more information about them. As you move the device across the night sky, the app follows your movements. Star Walk 2 has an AR mode and a non-AR mode that can be switched on and off. In the non-AR mode, the app shows a virtual night sky that aligns with the real one. In the AR mode, the application uses the camera to show the real night sky and adds a layer on top with the different objects in the sky and information about them. The user can turn the visibility of this AR layer up and down to find the perfect mix of real life and AR.
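Under the hood, aligning the virtual sky with the real one comes down to converting each star's catalog coordinates (right ascension/declination) into the direction the observer actually sees it (altitude/azimuth), given their location and the current time. A rough, self-contained Python sketch using the standard sidereal-time approximation (the app's real pipeline is not public):

```python
import math

def alt_az(ra_deg, dec_deg, lat_deg, lon_deg, days_since_j2000):
    """Rough altitude/azimuth of a star for an observer at lat/lon
    (east-positive longitude) at a given time."""
    # Greenwich mean sidereal time, then local sidereal time (degrees)
    gmst = (280.46061837 + 360.98564736629 * days_since_j2000) % 360.0
    lst = (gmst + lon_deg) % 360.0
    ha = math.radians(lst - ra_deg)  # hour angle of the star
    dec, lat = math.radians(dec_deg), math.radians(lat_deg)
    sin_alt = (math.sin(dec) * math.sin(lat)
               + math.cos(dec) * math.cos(lat) * math.cos(ha))
    alt = math.asin(sin_alt)
    # Azimuth measured from north, increasing toward east
    cos_az = ((math.sin(dec) - sin_alt * math.sin(lat))
              / (math.cos(alt) * math.cos(lat)))
    az = math.acos(max(-1.0, min(1.0, cos_az)))
    if math.sin(ha) > 0:
        az = 2.0 * math.pi - az
    return math.degrees(alt), math.degrees(az)
```

The compass and accelerometer then tell the app which altitude/azimuth the camera is pointing at, and the overlay is drawn wherever the two directions match.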

Why is it engaging?

Most people have probably sat outside and looked up at the night sky at some point in their life. With Star Walk 2, it is now possible to know exactly what you are looking at. In AR mode, you can point your camera at any point in the sky, and the app will tell you which celestial objects are visible from your location. Star Walk 2 can also help you identify the best time for observation and notify you when special astronomy events are happening.

What features are well done? 

The graphical overlay is very well designed, as are the animations. When you locate a star constellation, an animation is displayed connecting the stars in that constellation. At the same time, a graphical representation of the constellation’s symbol is shown.

What features can be improved and how?

Sometimes the AR overlay doesn’t align perfectly with the real-life stars. If something were to be improved, it would be the automatic calibration of the AR overlay against the real sky. Luckily, you are able to align the overlay manually.

To summarize, Star Walk 2 is a cool and easy-to-use app for whenever you find yourself looking up at the sky and wondering what you are actually looking at.

Sound Design in VR: Phasmophobia

Adi Kamaraj

Sound design is a staple of what makes a VR environment deep and immersive, capable of captivating an audience and of expressing the emotion and spatialization relevant to the scene it depicts. In VR entertainment, the senses that can be engaged are often limited to sight and hearing, so audio design is essential to giving the end user a great experience.

There are many forms of audio design that can be used effectively to convey genre and emotion. One VR game that does this well is “Phasmophobia”. It is a first-person investigative horror game in which the player works alone or in a group to complete a contract: identifying the type of ghost haunting a specified site. Players can use various basic equipment, such as spirit boxes, video cameras, and UV flashlights, to achieve this goal.

Phasmophobia, although not next-generation in its visuals, graphics, or narrative, is elite at conveying fear and horror through its well-designed audio soundscape; the developers have not overlooked audio as an element essential to delivering that heightened sense of immersion.

One technique they use well is a broad spectrum of sounds from wide-spectrum sources, including wind and high frequencies, which spatializes effectively and gives the HRTF (Head-Related Transfer Function) plenty of frequencies to work with, while also helping mask audible glitches from panning and attenuation. Humans rely heavily on high frequencies for sound localization.

The developers have also leveraged 3D audio spatialization, which provides much more accurate spatial cues, including height. With this improved accuracy, in addition to volumetric sources, these audio techniques create a truly terrifying space when you are in the game. While the environment emanates eerie noises that lock you into the immersion of your surroundings and the genre and emotions the game is trying to express, the developers also give the ghosts their own immersive sound design. The use of the Doppler effect, which changes a sound’s pitch as the source approaches or recedes, makes the movement of the ghost (when heard) keep you constantly attentive.
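The Doppler shift itself follows the classic formula for a moving source and a stationary listener. A minimal sketch (the function name and thresholds are my own; game engines typically apply a scaled version of this physical formula):

```python
def doppler_pitch(freq_hz, source_speed_ms, approaching, c=343.0):
    """Doppler-shifted frequency heard by a stationary listener when the
    source moves at source_speed_ms (m/s): f' = f * c / (c -/+ v).
    An approaching source sounds higher-pitched, a receding one lower."""
    v = -source_speed_ms if approaching else source_speed_ms
    return freq_hz * c / (c + v)
```

Even a modest pitch rise on a ghost's growl is enough to signal "it is coming toward you," which is exactly what keeps the player attentive.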

An ambient nature soundscape and movement-triggered footstep sounds are also used to further drive the feeling of presence. Ambience as an auditory stimulus is great at inducing presence and creates a virtual environment that feels alive and immersive. The footsteps interact with the environment in their timbre and tempo, giving the user a feeling of direct interaction with the surroundings. As these audio cues and implementations get better, we move toward even greater presence, where the user has a more believable impression of being in the virtual environment and less awareness of the mediating technology, such as the game itself.

As mentioned before, interactive sound is one of the most important parts of audio design when trying to deliver a feeling of presence. The game does this really well: it constantly reminds you that you are in the environment, as all interactions with the surroundings produce responsive, dynamic audio, such as footsteps and opening doors.

In VR, hearing is the only sense able to provide full spatial information beyond our field of view, including elevation, all 360 degrees, and depth, allowing us to guide our decisions and behaviors and to understand our virtual surroundings. Audio is unique not only in providing spatial information for the brain to analyze but also in telling a narrative.

For me, playing this game really highlighted the importance of audio for immersion in the virtual reality space. However much a visual experience can paint a narrative, the sound design is what adds the dimension that makes you believe you are living that experience. It can create tension and emotion, and its sense of spatialization is an important channel of information about the environment that creates presence. This is not going unnoticed: realistic, responsive spatial audio is quickly becoming a cornerstone of investment and key to the development of the so-called metaverse by big tech companies. I, for one, am curious about the future of audio in VR and hope many new immersive audio technologies will be used to create a deeper, more believable virtual reality.

Overcome public speaking anxiety with training in VR

Have you ever felt discomfort when it is your turn to speak in a group discussion, or when you have to give a speech in a school elocution competition? Before giving a speech in front of a large audience, a few people practice in front of a mirror or find a friend to give feedback in a practice session. Training in virtual reality is likely to be a solution to public speaking anxiety: studies have been done on virtual reality exposure therapy, and companies are using VR applications for employee training on public speaking.

Cross reality (XR), which includes virtual reality (VR), augmented reality (AR), and mixed reality (MR), provides immersive digital experiences. VR actively uses human sensory capabilities (like sight and sound) to build understanding of an experience. This can be used to improve public speaking through immersive and realistic simulations. A study published in Cognitive Behaviour Therapy suggests that VR can be used as a therapeutic tool for public speaking anxiety. [1]

There are multiple VR applications that help with overcoming the fear of public speaking. VIRTUALSPEECH [2] is a company that specialises in professional development training with courses for mastering public speaking. The training contains a wide range of self-paced VR scenarios.

TEDx theatre as VirtualSpeech training scenario

The VirtualSpeech application is available on VR headsets like the Oculus Quest, VIVE Focus 3, and Pico Neo3. Besides the variety of VR scenarios, the VirtualSpeech features that stand out are real-time feedback, notes displayed in the room on an autocue, and audio and visual distractions from the avatars to simulate a real-world experience. The real-time feedback is given while the speech is being delivered, covering eye contact, pace, and volume.
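Pace and volume feedback of this kind can be computed from very simple signals: words per minute from the speech recognizer's transcript, and the RMS loudness of each audio frame. A toy Python sketch (the function, thresholds, and labels are made up for illustration; VirtualSpeech's actual metrics are not public):

```python
import math

def speech_feedback(words_spoken, elapsed_s, frame):
    """Toy real-time feedback: words-per-minute pacing plus RMS loudness
    of the latest audio frame (samples in [-1, 1])."""
    wpm = words_spoken / (elapsed_s / 60.0)
    rms = math.sqrt(sum(s * s for s in frame) / len(frame))
    pace = "too fast" if wpm > 160 else "too slow" if wpm < 110 else "good"
    return wpm, rms, pace
```

Running such a check on every recognizer update is cheap enough to drive a live on-screen indicator while the user is still speaking.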

One feature allows the speaker to record and upload questions in advance; these questions are then asked by the virtual audience during the speech. Conversational AI could further enhance this feature to allow real-time communication between the speaker and the audience.

Real-time feedback in VirtualSpeech

Ovation [3] is another VR application that helps overcome public speaking anxiety. It also provides real-time training tools and feedback as one speaks to a realistic, simulated audience. Training is provided for Gaze, Voice, and Hands. The Gaze category refers to where the speaker is looking while delivering a speech, detected from the movement of the VR headset; if the headset includes eye tracking, the exact spot in the virtual scenario where the speaker is looking can be determined. The VR motion controllers can be used for training on mic distance: if the mic is too far from the speaker’s mouth, a red pulse is displayed and the speaker feels a vibration. The motion controllers also track the movement of the speaker’s hands.

Ovation Public Speaking training in VR

The VR headsets recommended by Ovation are the HP Reverb G2 Omnicept and the HTC Vive Pro Eye. Sensors in the Omnicept can detect cognitive load in real time, capturing the mental effort needed to remember and properly deliver the speech. Both the Vive Pro Eye and the Omnicept detect exactly where the speaker’s eyes are looking, which Ovation uses to provide more accurate analytics and better insights.

The study published in Cognitive Behaviour Therapy [1] concluded that one-session virtual reality therapy can be an effective treatment for public speaking anxiety. In the future, we can expect to see more VR applications for public speaking training, with enhanced features such as automatically generated questions and emotional responses from the virtual audience based on the sentiment of the speaker’s speech.

References:
[1] Philip Lindner, Jesper Dagöö, William Hamilton, Alexander Miloff, Gerhard Andersson, Andreas Schill, and Per Carlbring (Sep 2020). Virtual Reality exposure therapy for public speaking anxiety in routine care: a single-subject effectiveness trial. Cognitive Behaviour Therapy.
[2] VirtualSpeech.com
[3] OvationVR.com