Google Earth VR | Breathtaking VR Experience

Google Earth VR

Do you still remember the wonder of exploring Google Earth for the first time? The delight of seeing the place you live in 3D, or of dropping into a first-person Street View in a country you've never visited? Now imagine that in virtual reality. That's exactly what Google Earth VR offers: you put on a VR headset and travel the world from the comfort of your own home.

Google Earth is a program that uses satellite imagery to render accurate 3D representations of our planet. It allows users to traverse the world by rotating the globe and zooming in and out for a closer view. The program also includes the fantastic Street View functionality, allowing users to stand on a street and soak in the view from a first-person perspective [1]. The VR version of the software was introduced back in 2016 [2], and it is currently free on Steam [3], which means any VR headset owner can jump into this amazing experience at no additional cost.

Given how the original Google Earth works, it makes sense that using it in VR would be nothing less than incredible. I have tried it on the HTC Vive, and the experience was simply breathtaking. The immersion of catching amazing views around the Earth is greatly complemented by the intuitive controls the program provides for traveling the world.

Controlling the Earth [4]

Figure 1: View of Google Earth VR Tutorial

When booting up the program for the first time, users are greeted by an outer-space view of Earth and a quick guide on how to control the program. The software displays a model of both controllers, making it easy to see what each button does. Let's run through some of the basic controls to gain a better understanding of them.

Rotate the earth by holding the button on the controller and dragging the globe around.

Fly towards the pointed direction by pointing the controller in the desired direction and moving the joystick/pressing the touchpad.

Change your orientation to place the Earth below or in front of you with the press of a button. This lets you change your perspective of the Earth: having the Earth below you gives a natural view of a place, while having it in front of you makes searching for a desired location easier.

See featured places, saved places, and search for places by opening the menu and selecting from the user interface.

Change the time of day by dragging the sky while holding the trigger button.

Street View is available when a tiny globe appears on the controller; users can hold the globe closer to their eyes for a quick glance, or press a button to enter Street View fully.

Teleport to a new place by opening a tiny globe on one controller and using the other controller to select the new location to teleport to.

With these controls, this VR experience enables users to travel to any part of the world and view the place from any angle as if they are there in person.

Why I absolutely loved Google Earth VR

Google Earth VR is a one-of-a-kind experience that makes you realize the potential VR has for the future. It is the perfect demonstration of integrating VR into an existing piece of software to increase immersion. The original software gives the user a bird's-eye view of different places on Earth, but it is confined within a two-dimensional computer monitor. With VR, you can simply rotate your head to take in the environment, which greatly enhances immersion. The way you control your position in Google Earth feels incredible too: you can fly around to different places like an eagle or plant yourself on a street to experience being there as if in real life. I fell in love with this piece of software when I traveled to places I have never visited, such as the peak of Mount Everest and the Eiffel Tower, and caught the breathtaking views from those places.

The other reason Google Earth VR excited me so much on first use is the potential it holds. If Google can model the entire Earth in 3D and let people access it in VR, imagine future applications where we place our own avatars in a metaverse that is a faithful replica of the Earth, traverse anywhere in an instant, and experience other countries' cultures from the comfort of our own homes. Even now, people can use it to plan travel to a different country by checking those locations and building a mental map of the places they will visit. Urban planners could also use this technology to experience first-hand what their future buildings and roads will be like.

Why is Google Earth VR engaging?

I believe that you as a reader would also find the prospect of seeing different places on Earth in VR exciting. Part of what makes it exciting is its vastness and how unbounded it is. With software of this scale, users are at first in disbelief that they can visit every part of the Earth, even their own home. Once they realize it is entirely possible, their imaginations run wild thinking about the next location to visit. Google Earth VR doesn't bind its users to certain places but gives them limitless freedom to decide where to go next; one could say the only limit is the user's imagination. Furthermore, people are compelled to explore for a very simple reason: Earth is beautiful. There are so many good sights that most people won't be able to experience in real life within their lifetime, but this VR experience lets them catch a close representation of them.

We can learn a few things about creating an engaging VR experience from this. One is to make users feel that the VR software contains limitless possibilities, which pushes them to use their creativity and imagination when interacting with it and makes them feel more involved in the VR world they are in. A perfect complement to that is a beautiful VR environment that captivates users and keeps them there.

Great features in Google Earth VR

The controls in Google Earth VR deserve high commendation, as they not only let users control their position easily but also make traversing the world feel extraordinary. It starts with the software's choice to render the actual controllers instead of some kind of virtual hands, which makes it easy to indicate what each button does. The experience is further enhanced by attaching elements like a tiny globe to the controller, which the user can hold closer to choose where to teleport or to access Street View. This approach of adding flair to the controller has a big impact because the controllers are the closest interactable objects, the ones users constantly look at.

The way the controllers are used is also very intuitive, especially the feature that lets the user fly in the direction the controller is pointed: it is straightforward, yet it creates an amazing sensation of soaring through the skies. I also especially enjoyed adjusting the time of day by dragging the sky to control the position of the sun and moon; lighting has a huge impact on a location's visuals, and users get to see how a place looks at different times of the day.

Lastly, I was intrigued by a feature in Google Earth VR that limits your field of view to a small circle while you are moving. I later found out this is "comfort mode", designed to reduce the potential for VR motion sickness. The feature can be toggled, since it trades some immersion for a more comfortable experience for people prone to motion sickness [4].
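Google Earth VR's actual implementation of comfort mode isn't public, but the general idea behind a speed-based vignette is simple: the faster you fly, the smaller the visible circle becomes. The sketch below is a toy illustration of that idea only; the function name, parameters, and numbers are assumptions, not Google's code.

```typescript
// Toy sketch (not Google Earth VR's actual code): shrink the visible
// field of view as flight speed increases, the way a comfort vignette does.

/** Returns the fraction of the view left visible (1 = full view, 0.4 = narrow circle). */
function comfortVignette(
  speedMetersPerSec: number,
  maxSpeed: number,
  minVisible = 0.4,    // assumed tightest vignette
  comfortMode = true,
): number {
  if (!comfortMode) return 1;                 // toggled off: full immersion
  const t = Math.min(Math.max(speedMetersPerSec / maxSpeed, 0), 1);
  return 1 - t * (1 - minVisible);            // faster flight -> smaller visible circle
}

// Example: standing still keeps the full view, full-speed flight narrows it.
console.log(comfortVignette(0, 50));   // 1
console.log(comfortVignette(50, 50));  // 0.4
```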

What features can be improved and how?

Despite the amazing feats Google Earth VR has achieved, it still struggles to fully represent our Earth: users will sometimes see blocky textures, texture pop-in, and less detailed areas. Understandably, this is largely a limitation of VR headset hardware and of satellite scanning capability, but for a 3D rendering of the real world, immersion breaks when users notice these artifacts. Hopefully, as VR hardware and software progress, we will achieve a closer and more detailed representation of our Earth in the VR world.

The other lackluster part of Google Earth VR is the difficulty of finding specific locations or points of interest. The software has a search feature, but typing in VR is difficult: you must point at the letters on a virtual keyboard and select them one by one, which can be quite time-consuming. One way to improve the search experience is to add the voice-to-text functionality commonly found in smartphones so that users can fill in search fields more quickly. Besides that, the software doesn't offer many recommendations for points of interest or place labels such as those found in Google Maps. It does show road names, but they disappear when you get too close to the surface, which makes it hard to find interesting locations to look at. Good geographic knowledge helps users look for their desired locations, but adding more indicators and labels to this 3D rendering of the Earth would improve the experience of searching for places. This feature should be toggleable, since visual indicators are sometimes undesirable and might break immersion.

Conclusion

Google Earth VR is a breathtaking experience every VR headset owner should absolutely try. It’s simply amazing to visit every corner of the world to see what this beautiful Earth has to offer, and this experience is perfectly complemented by well-thought-out VR interaction designs. It is exciting to live in times where VR technology is greatly advancing, and Google Earth VR is undoubtedly one of the cornerstones that shows the vast possibilities VR technology has to offer.

[1] R. Carter, “Google Earth VR Review: Explore the world,” 04 October 2021. [Online]. Available: https://www.xrtoday.com/reviews/google-earth-vr-review-explore-the-world/. [Accessed 20 January 2023].

[2] M. Podwal, “Google Earth VR – bringing the whole wide world to virtual reality,” Google, 16 November 2016. [Online]. Available: https://blog.google/products/google-ar-vr/google-earth-vr-bringing-whole-wide-world-virtual-reality/. [Accessed 20 January 2023].

[3] “Google Earth VR on steam,” [Online]. Available: https://store.steampowered.com/app/348250/Google_Earth_VR/. [Accessed 20 January 2023].

[4] A. Courtney, “Google Earth VR controls – movement, Street View & Settings,” 13 March 2022. [Online]. Available: https://vrlowdown.com/google-earth-vr-controls/. [Accessed 20 January 2023].

An Immersive Shopping Experience with Shopify AR

Introduction

Due to the Covid-19 pandemic over the past few years, the use of AR in the retail sector has grown rapidly. It is even estimated that by 2025, up to 4.3 billion people will be using AR on a frequent basis. Among the top reasons more than 40% of global consumers cite for not shopping online are not being able to see products in person and not being able to try things out before making a purchase. With AR and VR, fit uncertainty can be reduced, increasing purchase confidence by providing a more immersive shopping experience.

Shopify AR

An example of how consumers can use Shopify AR

Shopify is an e-commerce platform for online stores and retail point-of-sales systems. It allows retailers to set up online stores and provides many different services and solutions for the convenience of both retailers and consumers. One of the services Shopify provides is Shopify AR, bringing a new dimension to customer service with the Augmented Reality (AR) experience. 

Introduced in 2018, Shopify AR allows retailers to create interactive and personal AR experiences for their consumers on iOS devices, where products can be viewed from all angles and at true scale via the Safari web browser.
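Shopify's actual integration code is not shown in this post. As a minimal sketch only, web AR of this kind on iOS Safari is typically delivered through AR Quick Look: a link marked with rel="ar" pointing at a hosted USDZ model. The model and image paths below are placeholders, and the function is a hypothetical helper, not part of Shopify's API.

```typescript
// Sketch of how a product page might hand a 3D model to iOS Safari via AR Quick Look.
// Assumptions: a hosted .usdz model and a preview image exist at the given URLs.

function addArQuickLookLink(modelUrl: string, previewImageUrl: string): void {
  const anchor = document.createElement("a");

  // Feature-detect AR Quick Look support (Safari on ARKit-capable devices).
  if (!anchor.relList.supports("ar")) {
    console.log("AR Quick Look not supported; fall back to a plain 3D viewer.");
    return;
  }

  anchor.rel = "ar";
  anchor.href = modelUrl;              // e.g. a hosted .usdz file

  // AR Quick Look expects the anchor to wrap an image element.
  const preview = document.createElement("img");
  preview.src = previewImageUrl;
  anchor.appendChild(preview);

  document.body.appendChild(anchor);
}

addArQuickLookLink("/models/armchair.usdz", "/images/armchair.jpg"); // placeholder paths
```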

Immersive Shopping Experience

According to research, there are four broad uses of AR technologies in retail settings: to entertain customers, to educate customers, to help customers evaluate product fit, and to enhance their post-purchase consumption experience. Shopify AR has use cases at every stage of the customer journey that retailers can leverage.

Rebecca Minkoff shoppers can virtually 'place' 3D models in their environment with AR

One of the features is showcasing the product from different angles and at scale via any iOS device with a camera and Safari. The consumer can also place the object in their environment using AR to visualise how the product would look and feel if it were there. This simple feature entertains customers by transforming inanimate 2-dimensional images into 3-dimensional, animated and interactive objects, creating fresh experiences that entertain and captivate. Additionally, by having the physical environment as the background for virtual products, consumers can visualise how products would appear in their actual environment and context, helping them make a more accurate evaluation of the product. The brand Rebecca Minkoff also reported that customers were 65% more likely to make a purchase after using this feature on its website.

A traditional retail display reimagined in AR.

Shopify AR allows retailers to go beyond showcasing their products and having consumers place them in their environment. Firstly, retailers can choose to recreate a traditional retail display in a virtual setting. There is no need to consider the physical constraints of space and inventory, and consumers can explore the virtual space and products from the comfort of their home. Being a rather new technology, engaging AR displays are also an avenue to enhance marketing campaigns and product launches.

Secondly, retailers can also create their own immersive local AR experiences to enhance customers' shopping experiences. For example, in a collaboration between the Jordan Brand, Snapchat, Shopify, and Darkstore for a pre-release sneaker during the NBA All-Star game in February 2018, customers invited to the exclusive event were prompted to scan a mobile Snap code that revealed the new sneakers. The sneakers could then be purchased and delivered to them on the same day.

Demonstration of how a Gramovox floating record player is installed using AR animation.

Lastly, the platform can also be utilised to enhance consumers' post-purchase experience. It can put products into context and provide more valuable, personalised information with clear and simple graphics. For instance, after a customer receives a product, an interactive tour can show them how to assemble and install it. In the video above, each part of the installation process is slowly animated, adding a new 3D visual that helps customers better understand the included instruction manual.

If brands and retailers can offer this post-purchase experience, customers could get more immediate and long-term value from the product, which improves their purchase experience and general satisfaction.

Why do you like this XR application?

As an avid online shopper, I find that the added AR experience would greatly enhance my online shopping. On most e-commerce platforms, only 2D pictures are shown, often in much better lighting and in environments different from where I would actually use the product, so the product images may not be an accurate depiction of the product itself. Having previously received products that were not the same as what was depicted, I would find even the simple ability to look at a product from all angles and zoom in on its texture and details a great help. I would also be more likely to make a purchase if I could see how the product would look where I intend to use it, especially for furniture and larger items.

Object View
AR view

The feature is also easy to use, as I could easily switch between the AR and object modes with the click of a button. If I needed to move the object, all I had to do was pinch it with two fingers and move it as desired. The AR mode shown in the image allows me to imagine the product in my own space and interact with it. If I find that it is the right fit, I am more likely to make a purchase than I would be without the AR experience. If I like how the product fits my space, I can also take a picture using the button on the right to save the image.

What features are well done?

In general, Shopify AR has a simple interface and is easy to use. As mentioned earlier, the control for switching between AR and object view is simple. Both the AR and object modes fit the mental model of how these functions should behave and do not include extra functions that might confuse the user. This makes it intuitive to use, with little to no learning required.

The barrier to entry for retailers to use the simple AR functions is low, as they can easily embed the feature in their own websites. This makes it much easier for smaller-scale retailers to integrate simple AR features without having to invest the time and effort to create their own applications (e.g. IKEA Place).

It is also easy for consumers to use, since Safari on an iOS device is all that is needed; no additional downloads or installs are required.

Potential Improvements

While Shopify AR greatly enhances the customer experience, there is still room for improvement.

Firstly, Shopify could try to include beauty categories, giving users the option to see how they would look with different hair colours, hairstyles, makeup and nails.

Secondly, Shopify could provide templates to lower the barrier to entry for smaller-scale retailers to create their own virtual shops and spaces. I have observed that most retailers that have collaborated with Shopify on virtual or virtual-local experiences are larger brands, as they have the resources to create these spaces for potential consumers. Smaller retailers tend not to offer AR experiences, or simply opt for the simplest features, depriving potential customers of a fully immersive shopping experience.

Thirdly, if Shopify AR could be integrated with e-commerce platforms with large user bases like Shopee and Lazada, it would benefit both sellers and buyers. Even the simplest Shopify AR features would give consumers a much better shopping experience, especially for larger items like furniture. This could also reduce the need for customers to return furniture of the wrong size, increasing customer satisfaction.

Lastly, Shopify could extend this feature to non-iOS devices and other web browsers. Around 71.8% of mobile phone users are on the Android operating system, while only about 27% are on iOS. By accommodating Android, Shopify AR's user base would grow rapidly, allowing retailers to reach more potential consumers and giving more consumers a better shopping experience.

Conclusion

The rise of augmented reality technology, accelerated by the pandemic over the past few years, presents great potential for AR in the retail industry. While there is room for improvement, Shopify AR has done well in enhancing users' shopping experience. More research can be conducted on the impact of AR technology on various aspects of consumer behaviour, to further improve its usage in the retail sector.

References

Augmented reality brings a new dimension of engagement to the customer experience. (2018, November 15). Retrieved January 18, 2023, from https://www.shopify.com/sg/blog/augmented-reality-commerce

Augmented Reality in Retail A Business Perspective. (n.d.). Retrieved January 16, 2023, from https://www.byteplus.com/en/blog/detail/Augmented-Reality-in-Retail-A-Business-Perspective-

K, S. (2020, May 27). Retail in a new dimension. Retrieved January 16, 2023, from https://medium.com/scapic/retail-in-a-new-dimension-503249c4e46e

Laricchia, F. (2023, January 17). Global Mobile OS Market Share 2022. Retrieved January 20, 2023, from https://www.statista.com/statistics/272698/global-market-share-held-by-mobile-operating-systems-since-2009/#:~:text=Android%20maintained%20its%20position%20as,the%20mobile%20operating%20system%20market.

Tan, Y. C., Chandukala, S. R., & Reddy, S. K. (2022). Augmented reality in retail and its impact on sales. Journal of Marketing, 86(1), 48-66. Research Collection Lee Kong Chian School Of Business.

XR for Construction and Interior Design | Magicplan

Introduction

Magicplan is an AR mobile application that can be used for interior design. In an industry that typically relies on pen and paper or a computer to draw floor plans and design rooms, Magicplan provides a convenient alternative that can be used on the go. Users can "add a room" to the floor plan and, by selecting the "Scan with camera" option, are brought to an interactive AR screen where they can scan the corners of their room. Magicplan then calculates the room's length, breadth and height and produces a floor plan with up to 95% accuracy.
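Magicplan's internals are not public, but once the corners of a room have been marked in AR, deriving the plan's basic measurements is straightforward geometry. The sketch below is an illustrative simplification (the types and function names are my own, not Magicplan's): it computes the perimeter of the room outline and the floor area via the shoelace formula from corner coordinates on the floor plane.

```typescript
// Simplified sketch (not Magicplan's actual code) of how a floor plan's
// measurements could be derived once the corners of a room have been marked.

interface Point { x: number; y: number } // corner positions in metres, on the floor plane

/** Perimeter of the room outline, walking corner to corner. */
function perimeter(corners: Point[]): number {
  let total = 0;
  for (let i = 0; i < corners.length; i++) {
    const a = corners[i];
    const b = corners[(i + 1) % corners.length];
    total += Math.hypot(b.x - a.x, b.y - a.y);
  }
  return total;
}

/** Floor area via the shoelace formula. */
function floorArea(corners: Point[]): number {
  let twiceArea = 0;
  for (let i = 0; i < corners.length; i++) {
    const a = corners[i];
    const b = corners[(i + 1) % corners.length];
    twiceArea += a.x * b.y - b.x * a.y;
  }
  return Math.abs(twiceArea) / 2;
}

// A 4 m x 3 m rectangular room: perimeter 14 m, area 12 m².
const room: Point[] = [{ x: 0, y: 0 }, { x: 4, y: 0 }, { x: 4, y: 3 }, { x: 0, y: 3 }];
console.log(perimeter(room), floorArea(room));
```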

Why is Magicplan engaging?

Magicplan has an intuitive and interactive design that makes it beginner-friendly. Instructions are readily available on the AR screen for first-time users. It gives people interested in interior design a unique platform to try designing their rooms to their liking without the hassle of measuring tapes and other physical equipment. Users also get a chance to plan out the rough design of their house before engaging a professional interior designer. This benefits both parties: users can better articulate their vision for the design, and designers have a platform to share their input and opinions. Doing so reduces the chance of misunderstandings, which would be costly later if the house were furnished wrongly.

What is Well Done about Magicplan?

Clear markings and measurements placed on objects in the environment

As users scan the room, Magicplan overlays the measurements on the objects in the AR environment. This makes it very intuitive for users to know which measurement belongs to which part of the room, rather than listing the measurements in a side menu on the AR screen.

The application also marks out exactly where the start and end points are, allowing users to double-check and ensure that the marked points are correct. If users are unsatisfied, they can easily delete that point and mark it again.

Measurements between some marked points

Realistic visualisation of the room (3D View Mode)

After the user has finished scanning the room, they can design the floor plan accordingly. They can add furniture to the floor plan and adjust its size to their liking. After finishing the design in 2D, users can switch to the 3D mode and see how the room will look. This was a game-changer in an industry that mainly used pen and paper during the initial design stage. Interior designers can now show how their clients' rooms will look before any renovation even begins, giving clients a concrete visual of the end product rather than leaving it to their imagination.

Connect to Bluetooth sensors

While Magicplan can measure the length of the room via AR, it is only 95% accurate. That may not be good enough considering that the next step after designing the room is to go through an expensive renovation. Mistakes in measurements can result in the wrong number of wall tiles used or buying a wardrobe that is too big for the room. All these are expensive mistakes that users would like to avoid. To improve the accuracy of the measurements to 100%, Magicplan allows users to connect a wide variety of Bluetooth lasers to aid in the measurements. This helps to double-check all measurements and ensure a mistake-free design phase.

Possible improvements?

Corner detection

One of the major flaws I came across while trying Magicplan was the application's inability to detect a corner. As I was scanning my room, I could not accurately mark the first corner; it was always slightly above or below the actual corner. I am not sure whether this was due to the poor lighting in my room, but it affects the user experience. The clear and distinct marking of corner points that I praised as a well-done feature ends up backfiring: if the marking isn't exactly at the corner, users will keep rescanning that corner until it is exactly where they want it. The problem is exacerbated because all other points use the first corner point as a reference, so if the first point is slightly above the corner, the lines drawn between corners look elevated on the AR screen. This may affect the accuracy of the measurements, especially the height. A possible fix would be for the application to detect corners automatically.

Conclusion

Magicplan is an interesting and ground-breaking application that aids interior design and construction. It uses the potential of AR to improve an industry that mainly relies on pen and paper. Although there are some kinks to be ironed out, it is a useful application for anyone interested in interior design or working in that field.

References

Augmented reality apps archives. Indovance Blog. (2022, August 11). Retrieved January 19, 2023, from https://www.indovance.com/knowledge-center/tag/augmented-reality-apps/

Extended reality in construction – a new frontier for the AEC Industry. Indovance Blog. (2022, September 7). Retrieved January 19, 2023, from https://www.indovance.com/knowledge-center/extended-reality-in-construction/

Magicplan Help center. Magicplan Help Center. (n.d.). Retrieved January 19, 2023, from https://help.magicplan.app/

CS4240 Blog Assignment

For this post, I have chosen to discuss the VR application Half-Life: Alyx. It is a first-person shooter developed and published by Valve Corporation, set in the Half-Life universe, designed to be played in virtual reality (VR), and released in March 2020.

One of the reasons I like Half-Life: Alyx is that it is a great example of how VR can enhance the immersive experience of playing a first-person shooter. The game's use of VR technology allows players to physically interact with the game world in a way that is not possible with traditional first-person shooters. For example, players can reach out and grab weapons, ammunition, and other carryable objects (boxes, crates, and so on) and use them to fight enemies. This physical interaction adds a level of realism and immersion that flat-screen shooters cannot match.

Another reason why I like Half-Life: Alyx is the game’s story and characters. The game’s story is engaging and well-written, with a compelling narrative that kept me invested throughout the game. The characters in the game are also well-developed and likable, with unique personalities and motivations that make them feel like real people. The game’s voice acting and motion capture also add to the immersion and realism of the game’s characters.

The game’s level design is also well-done, with a variety of environments that are visually stunning and well-crafted. Each level is full of details and secrets that make it feel like a living and breathing world. The game’s use of physics-based puzzles also adds to the game’s immersion and makes the gameplay feel more realistic, even more so thanks to its high quality physics system.

One feature that could be improved in Half-Life: Alyx is the enemy AI. While the game's enemies provide a good challenge, their behavior can sometimes feel repetitive and predictable. Additionally, the combat can feel a bit clunky at times, with some weapons feeling less responsive than others.

Another feature that could be improved in Half-Life: Alyx is the game’s replayability. The game is a linear experience and once the player finishes the game, there is not much reason to go back. Some additional modes or game mechanics that would allow players to explore the game’s world and story in new ways would be a nice addition.

Overall, Half-Life: Alyx is a fantastic example of how VR can enhance the immersive experience of playing a first-person shooter game. The game’s use of VR technology, engaging story and characters, well-done level design, and physics-based puzzles make it one of the best VR games available even with its flaws.

XR Application – Richie’s Plank Experience

A screenshot of the street view of the virtual city in Richie’s Plank

Introduction

Richie’s Plank Experience is a VR experience that recreates the real-life scenario of walking along a single plank jutting out from an elevator on the 80th floor of a skyscraper. First released in 2016, it became widely popular for its realistic graphics and heart-stopping experience. Beyond the psychological experience, it also offers 4 bonus modes:

  1. Fire Deck – Users fly around the virtual city like a superhero using the controllers and extinguish fires with a fire hose and rocket hands.
  2. Sky Brush – Users can write and draw in the sky in different colors, using rocket hands as a paintbrush.
  3. Nightmare Mode – By pressing 666 on the panel on the left, users will experience the plank with spooky additions.
  4. Santa Simulator – Reindeer will pick the user up in Santa’s sleigh such that users can deliver presents into chimneys.

Video gameplay of all modes of Richie’s Plank Experience

Why is it engaging?

From many gameplay videos of Richie’s Plank, users seem very immersed and engaged in the experience, reacting as if it were real. As Slater (2009) described, there are two orthogonal components that contribute to users responding realistically in VR: place illusion (PI), or presence, and plausibility illusion (Psi). I believe Richie’s Plank fulfils both criteria very effectively, and therefore engages audiences very well.

Richie’s Plank Experience Trailer

Firstly, it provides the illusion of presence, specifically the illusion of being in a stable spatial place with physical interaction. This is done through the hyperrealistic portrayal of the urban city landscape with moving vehicles and tall buildings. The dynamic lighting and effective use of perspective to depict spatial distances make the environment look realistic (see 0:02 to 0:10 of the video). This is further amplified by how players can manipulate parts of the virtual environment in a way that mirrors the physical environment.

Secondly, it tricks users into believing that what is in front of them is really happening even though they know it is not real, which is the plausibility illusion (Psi). To achieve this, there have to be “correlations between external events not directly caused by the participant and his/her own sensations (both exteroceptive and interoceptive)” (Slater, 2009). Richie’s Plank does this by engaging multiple senses, such as playing elevator music in the lift and ambient wind and bird sounds on the 80th storey.

Why do you like this XR application?

I like this XR application because, despite having relatively simple mechanics and functions, it becomes very engaging: users show reactions similar to what they would have if the things they were seeing were happening for real. I am also amazed that it was initially developed by a husband-and-wife team before expanding into a full studio.

I also like that there are different modes, all built on creative ideas that use the landscape to let users do things humans cannot normally do. For instance, with Fire Deck mode, users can fly around with rocket hands and put out fires, which is definitely not possible under the laws of physics. Sky Brush lets artistic individuals enjoy the VR experience and fulfils many children’s dream of painting the sky. These fun touches make the application more interesting. Notably, I did not see anyone complain about vertigo or other ‘VR sickness’, despite being allowed to fly around.

What features are well done?

The modeling of the virtual city environment is definitely a very well-done feature of the application. A lot of work went into making the look and feel of the city landscape as realistic and beautiful as possible so that users get the best experience. This is also why the experience is so well known for engaging its audience and evoking realistic reactions. The picture below shows how the view from the plank changed as the application was developed, from the early access version to the final version on the market today.

View from the plank changed from the early access version to the final version

It is also very interesting how the developers encouraged the audience to use a real plank, adding the sense of touch to make the experience more ‘authentic’. It brings this VR experience somewhat closer to hyper-VR, where the virtual plank now feels like an actual plank below one’s feet.

What features can be improved and how?

I think Richie’s Plank could further enhance the illusion of presence by giving users a stronger sense of embodiment. It is a pity that there is no well-functioning foot tracking for this experience: when users can see their feet while walking on the plank, they have a stronger sense that they are physically there in VR. Although there have been attempts to attach sensors to the feet so that players see them when looking down, this is not fully integrated. Though expensive, there are now shoe accessories made specifically for VR so that foot movement can be tracked naturally. One such example is Cybershoes, as seen in the picture below.

Old sensor-attached shoes

Cybershoes

Furthermore, when interacting, the player sees the controllers instead of some form of hands that would indicate they are a character rather than a pair of floating controllers. There was feasibility testing of hand tracking on Oculus, but unfortunately there has been no update from the developers in the three years since. The developers don’t need to achieve extremely accurate hand movement: only a few actions are performed by the hand, so simply having an iconic representation of a hand would be a great start toward better representational fidelity, in my opinion.

Picture of Feasibility test for hand-tracking in Richie’s Plank

Last but not least, I think this application lacks replayability. While the developers have highlighted that this is not a game but an experience, I feel there is a lot one could do with the premise and such nicely done models. For instance, players could be given the opportunity to explore the city and perhaps even the power to customize it. In the future, it could even grow into something like Minecraft, but in VR.

Conclusion

In conclusion, Richie’s Plank Experience is a good example of a VR experience and a suitable entry point into VR with its simple mechanics and realistic modeling. While it is not perfect, it is certainly very engaging to its audience.

While researching the XR application, I found it noteworthy that Richie’s Plank Experience is not only played for fun but also used in research, and not only VR research: it has been used to investigate acrophobia (fear of heights) (Hu, 2018) and mental dissociation (Caulfield, 2022). It is therefore an interesting angle to consider how VR can further research into behavioral science and mental health, as it can offer experiences that trigger realistic physical and bodily reactions with less harm (the events are not actually happening to the participants) and less cost (physical venue and activity constraints are no longer a problem).

References

  1. Caulfield, N. M., Karnick, A. T., & Capron, D. W. (2022). Exploring dissociation as a facilitator of suicide risk: A translational investigation using virtual reality. Journal of affective disorders, 297, 517-524.
  2. Cybershoes US – Shoes made for walking in VR. (n.d.). Retrieved January 19, 2023, from https://www.cybershoes.com/ 
  3. Hu, F., Wang, H., Chen, J., & Gong, J. (2018, August). Research on the characteristics of acrophobia in virtual altitude environment. In 2018 IEEE International Conference on Intelligence and Safety for Robotics (ISR) (pp. 238-243). IEEE.
  4. Reddit – Dive into anything. (n.d.). Retrieved January 19, 2023, from https://www.reddit.com/r/SteamVR/comments/7jbp3a/special_announcement_richies_plank_experience/ 
  5. Richie’s Plank Experience on Steam. (n.d.). Retrieved January 19, 2023, from https://store.steampowered.com/app/517160/Richies_Plank_Experience/
  6. Slater, M. (2009). Place illusion and plausibility can lead to realistic behaviour in immersive virtual environments. Philosophical Transactions of the Royal Society B: Biological Sciences, 364(1535), 3549-3557.
  7. Todorov, D. (n.d.). Richie’s Plank Experience. Retrieved January 19, 2023, from https://dannytodo.artstation.com/projects/xzxqD2

IKEA Place: Try Before Buying

Interior design and buying furniture have long been a struggle for many people because of measurement issues and uncertainty over whether a product will actually fit in their space. Having to return a large piece of furniture just because it does not fit the room is a real hassle. With the advance of technology, especially in the XR world, some companies, IKEA among them, started to see how this technology could solve these problems and benefit their business.

What is IKEA Place?

In 2017, the furniture giant IKEA released IKEA Place, an AR application that lets users try out their desired furniture virtually in their own rooms. The application allows users to browse IKEA’s catalogue remotely and immediately try products out by pointing their camera at the room. Within the first year of its launch, IKEA Place gained great popularity and became one of the most popular non-gaming AR apps.

What makes it so successful?

Simple and Straightforward

When using it for the first time, users are presented with an onboarding process covering the available features and how to use the technology. This is extremely useful for those with no prior experience using XR applications. Moreover, with its clean and simple UI, it is relatively easy for users to navigate the app’s features.

High Accuracy

Although some AR applications still seem ‘unreal’ in the sense that they do not blend well with the physical world, IKEA Place is not one of them. The app claims 98% scaling accuracy, which is truly reflected when using it. After you select an item to place, it automatically adjusts the item’s size and dimensions to the room and surroundings, even as you move the camera and the object around. Therefore, you don’t have to worry about the product not fitting your room!
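IKEA Place's own scaling logic is not public. As a toy illustration only of the kind of check that true-to-scale placement makes possible, the sketch below tests whether a product's real-world footprint fits a measured gap in the room; the types, function name, and dimensions are assumptions for illustration.

```typescript
// Toy sketch (not IKEA Place's actual code): once the room has been measured,
// checking whether a product's real-world footprint fits the free space is
// simple arithmetic. Dimensions are assumed to be in centimetres.

interface Footprint { width: number; depth: number }

function fitsInSpace(product: Footprint, freeSpace: Footprint): boolean {
  const fitsAsIs = product.width <= freeSpace.width && product.depth <= freeSpace.depth;
  const fitsRotated = product.depth <= freeSpace.width && product.width <= freeSpace.depth;
  return fitsAsIs || fitsRotated; // allow a 90-degree rotation on the floor
}

// A 220 cm sofa in a 230 cm x 100 cm gap between walls.
console.log(fitsInSpace({ width: 220, depth: 95 }, { width: 230, depth: 100 })); // true
```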

Save Place Feature

Save Place Feature in IKEA Place
(source: https://miro.medium.com/max/828/1*FVjxiIJY3fqq2qFCeBNlWg.gif)

One of the design principles in UX is recognition over recall: it is better to let users recognize information than force them to remember it. With the “Save Place” feature, users can save their room design after placing all their desired products. When they want to make a purchase, they do not need to remember every product they placed; instead, they can open the “Saved Place” and see which products were used.

Find Similar Product Feature

IKEA’s catalogue contains more than 10,000 products, which can be overwhelming for users to scroll through in search of what they want. With this feature, users can take a picture of a product they like, and the app will search for similar products and present them. This enhances the user experience, since they do not need to spend so much time browsing the catalogue.
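IKEA's actual visual search is not public; a common way to build this kind of "find similar" feature is to compare image-embedding vectors. The sketch below is illustrative only: it assumes embeddings have already been computed by some image model, and the catalogue, names, and vectors are made up.

```typescript
// Illustrative sketch only (not IKEA Place's implementation): rank products by
// cosine similarity between a photo's embedding and precomputed catalogue embeddings.

function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

interface Product { name: string; embedding: number[] }

function findSimilar(photoEmbedding: number[], catalogue: Product[], topK = 5): Product[] {
  return catalogue
    .map(p => ({ product: p, score: cosineSimilarity(p.embedding, photoEmbedding) }))
    .sort((a, b) => b.score - a.score)    // highest similarity first
    .slice(0, topK)
    .map(entry => entry.product);
}

// Tiny made-up catalogue with 3-dimensional embeddings, purely for illustration.
const catalogue: Product[] = [
  { name: "Armchair A", embedding: [0.9, 0.1, 0.0] },
  { name: "Bookshelf B", embedding: [0.1, 0.9, 0.2] },
  { name: "Armchair C", embedding: [0.8, 0.2, 0.1] },
];
console.log(findSimilar([0.85, 0.15, 0.05], catalogue, 2).map(p => p.name));
```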

Possible Future Developments

Despite all the benefits of the application mentioned above, there is certainly still room for improvement to further enhance the user experience.

In-App Purchase/Checkout

Currently, after users plan out their desired room setting using IKEA’s product catalogue, they still need to go to the website or a physical store to purchase the items. This can be quite a hassle, since users then have to search for the products again on another platform. It would therefore be helpful if users could directly purchase all the products after saving a design to a saved place.

Product Recommendation

Another possible future feature is a product recommendation system that suggests additional furniture to enhance the room, for example by recognizing the type of existing furniture, the color scheme, and the dimensions of the room.

Room Design Template/Recommendation

For some people, it might be difficult to design a room from scratch, and they may need inspiration. It would therefore be useful to have collections of room design templates that users can refer to based on the dimensions of their room. This feature could also involve IKEA interior designers or consultants assisting users with their design, whether by reviewing a scanned room or via a live call.

Conclusion

IKEA Place is one of the technological breakthroughs in the shopping world, with IKEA being one of the first companies to use AR technology to let users ‘try before buying’. Overall, the app’s user experience is relatively simple and easy to grasp, even for beginners, and there are some interesting features that are very useful in helping users choose their desired products. Although there is still room for improvement to better streamline the user experience, the app already provides a pleasant experience for its users.

References

https://medium.com/@linderaats/ikea-place-app-review-2a5fbd223b8e

https://www.xrtoday.com/augmented-reality/ikea-place-review-ikeas-arcore-app/

https://www.ikea.com/au/en/customer-service/mobile-apps/say-hej-to-ikea-place-pub1f8af050

Enhancing Dining Experience with Augmented Reality

Augmented reality (AR) is defined by Microsoft as an enhanced, interactive version of a real-world environment achieved through digital visual elements, sounds, and other sensory stimuli via holographic technology. It has been used extensively in the entertainment industry in the form of games (e.g. Pokemon GO), music and film. Its usage has also spread to other sectors such as commerce (e.g. IKEA Place), production lines (e.g. Glass Enterprise by Google) and restaurants.

AR is an area of technology that is gaining attention among restaurant owners as a tool to enhance customers’ dining experience. There are many ways AR can be incorporated into the food and beverage (F&B) sector. A number of restaurants have worked with visual artists to construct scenes and animations that are projected onto the table. One of them is Le Petit Chef.

Le Petit Chef

Le Petit Chef is a restaurant that worked with the artistic collective Skullmapping to curate an immersive culinary journey, adding theatre to dining. It takes customers through different storylines, depending on the menu they pick upon entry, using realistic 3D animations projected onto the table.

Why it is engaging

The restaurant captivates diners by combining AR projections with music to provide sensory immersion. The projected visuals reportedly had a positive effect on customers’ dining experience: they enhanced diners’ perception of the dish about to be served, heightening their desire to taste the food being featured.

Features

Although table mapping is not the newest AR technology and has been around for a while, Le Petit Chef did well in a few areas.

High-Definition Projections

They installed their own projectors within the event space to streamline diners’ experience and ensure that the visuals and their effects are of the highest possible quality. This is extremely important, as AR technology is fairly new and highly dependent on the tool used to view it. By using the restaurant’s own HD projectors, customers can be assured that the visuals will not be pixelated or misplaced (out of position on the plate or table), and that their experience will not be diminished by lower-quality equipment (with headache-inducing visual effects).

Curating Different Storylines

It is interesting to see that the restaurant has put in the hard work to craft different storylines for the different menus served each day. What appealed to me most is that they have done more than simply transport customers, via AR, to the part of the world the served dish originates from. Some of the visuals also feature a miniature chef as the protagonist of the story diners are taken on. The miniature chef on the table talks to diners through the speakers around the room. This helps capture the audience’s attention as they go down the items on the menu. By the end of the meal, customers reported that it almost felt like going to Disneyland and that they felt good about the entire experience.

Areas to Improve on

Visuals for each dish

It is indeed a novel experience; however, it can get boring quite fast, as there is a limited number of shows to choose from. After a few repeat visits, it would start to lose its lustre from the diners’ perspective. It would be unrealistic to demand that the creators and owners of the restaurant put out new shows frequently, given the vast amount of time and effort it takes to craft an entirely new menu and create a story around it. Hence, I would suggest that the restaurant keep this elaborate and expensive setup for those who want the whole experience (perhaps for a special occasion), but also use the same technology for individual dishes. By this I mean that visuals and stories are designed per dish: when customers look at the menu, a 3D visual of the dish can be projected onto the table, and after they order, the projection continues to tell a story about the dish until it is ready to be served. In this way, customers can experience different stories each time they visit, as they select different dishes for their meal.

More interactions

Currently, the augmentation on the table tells a story, which makes the entire experience, from a diner’s perspective, equivalent to watching a film while having a meal, except that the film is closely linked to the meal. With the rise of motion detection and tracking, Le Petit Chef may be able to involve diners in the storytelling process. Using deep learning to recognise and track hand gestures could allow customers not only to admire the 3D projections on the table but also to participate in them if they want to. I believe this would further engage them, which would in turn enhance the entire dining experience.

Conclusion

While using augmented reality technology in restaurants is still a novel concept, Le Petit Chef certainly did a great job and showed the world the possibilities in which the technology holds. 


Virtual Reality in Medical Training

There are multiple areas where Virtual Reality (VR) can contribute to healthcare. These include mental well-being, physiotherapy, pharmaceutical development, and education for professionals and patients. In particular, we will take a look at education for professionals.

The need for accurate medical illustrations for education is undeniable. Since the nineteenth century, textbooks such as Gray’s Anatomy (1858), a reference book of human anatomy, have given medical professionals insights into the complex human body. 3D technology was then incorporated to allow for better visualization. Now, VR is the next iteration of such medical educational tools, improving how we learn about and understand our bodies. VR provides users with real-world simulations through dynamic visuals and, more importantly, a risk-free yet realistic training ground.

VR in surgical training
VR is especially useful in surgical training. It comes in handy since taking away experienced surgeons from treating patients to train students is expensive and may have adverse impacts on their patients. VR training beats learning from textbooks and watching videos which do not provide the required hands-on experience. In addition, students can use virtually created models of medical equipment that are not easily accessible and perform procedures without risking any lives.

Why VR in medical training is engaging
There is always demand for competent medical professionals. The application of VR in medicine intrigues me because technology has long been used for the practice of medicine rather than for the education of medical professionals. VR can serve as a means to reduce human error through simulation training: users can build muscle memory for certain procedures and hone their psychomotor skills. A meta-analysis of the relationship between medical error and simulation training revealed that simulation can reduce medical error and prevent some risks related to medical treatment (Sarfati 2019).

What is next for VR in medical training
Indisputably, VR aids medical training through detailed imagery that shows the inner workings of the complex human body, and the hands-on experience it provides is indispensable to the education of medical professionals. Future development may focus on collaboration within the simulations: much of medical work is team-based, so such developments could help trainees become familiar with working alongside their counterparts.

References:
Sarfati, Laura, et al. “Human‐simulation‐based learning to prevent medication error: A systematic review.” Journal of evaluation in clinical practice 25.1 (2019): 11-20.

Terracotta Warriors and XR

Introduction

Qin Shi Huang’s tomb is China’s first grand, well-planned, well-preserved imperial mausoleum. The terracotta warriors preserved there were buried with Qin Shi Huang, China’s first emperor. In 1987, the mausoleum of Qin Shi Huang and the pit of the Terracotta Warriors were approved by UNESCO for inscription on the World Heritage List, and the site is known as the “eighth wonder of the world”.

The application of XR technology enables people to get closer to the terracotta warriors and even restores their original colors, making them seem to come to life. At the same time, XR technology recreates the scenes of ancient craftsmen building the terracotta warriors.

Why the XR application is engaging

Before the application of XR technology, visitors could only look at the terracotta warriors in the pit from a distance, standing on a high platform. They could see only the general appearance of the warriors and could not examine the details of the artworks, such as the decoration on the armor or the warriors’ faces.

With the application of XR technology, visitors can scan a QR code with their mobile phones and view different types of terracotta warriors on their screens. By rotating a warrior on the screen, visitors can examine its front and back, its face, its weapons, its horse, and so on. This gives people insight into the history of the First Emperor’s reign and into how these terracotta warriors were crafted.

Additionally, people can wear VR devices to see the artworks in a more immersive way. Although mobile phones let people see the terracotta warriors in more detail, the images are flat. VR devices make the terracotta warriors three-dimensional, and the warriors appear at the same size as the real ones. Beyond taking a closer look, visitors can also watch how the terracotta warriors were created step by step in ancient times.

Features that can be improved

Interact with users

Although people can now see these terracotta warriors more clearly, I think that animating them so they can perform simple actions such as walking, squatting, shooting, or riding would significantly improve interaction with tourists. In addition, technicians could add a voice interaction function so that the warriors can talk with visitors.

Internal structure

I think VR technology can help tourists gain a deeper understanding of the internal structure of the terracotta warriors. For example, a terracotta warrior could be broken down into small parts for tourists to assemble themselves, or the process of making the warriors could be turned into a level-based game in which tourists fire their own terracotta warriors.

Boneworks: A highly free physical interaction experience

Imagine a game that allows you to move freely, where every item has realistic physics, NPCs have ragdoll bodies, and you are given a pair of flexible hands. What kind of experience would that be?

What is it?

Boneworks is a 2019 first-person shooter VR game developed and published by Stress Level Zero. The game’s design is entirely physics-based, with the player controlling an entire virtual body that responds not only to the player’s real-world input but also to obstacles in the game world. In Boneworks, players take on the role of Arthur Ford, a rogue cybersecurity director who escapes into an unfinished simulated universe, battling through surreal buildings and occult settings with a variety of experimental physics-based weapons.

The game uses a physics engine that comes very close to real-world behavior. Almost all items are interactive, such as water cups, trash cans, hammers, balls, and guns. You can use any item you pick up in the game world to push forward, fight enemies, and explore what lies around the next corner.

Why is it engaging?

The main gameplay is about experiencing realistic physics interactions in semi-open levels and combining these physics effects to solve the puzzles in each level. There are no traditional game guidelines or clear tasks; players choose their own route through a level and find their own way to play.

Boneworks has three modes: story, arena, and sandbox. Story mode is more like a tutorial for newcomers, but only by playing it can players unlock the props needed to enter the other modes; skipping the story mode is not possible. Putting on the VR equipment, players start in an open museum they can explore, trying out weapons and props at each exhibit and using those props to interact with the environment.

Once the player becomes familiar with these mechanics, the experience feels refreshing and realistic. Players can move freely to pick up weapons and props, and when the bullets run out they can improvise, even clubbing enemies directly. This high degree of freedom, combined with a strong simulation of realistic physical actions, greatly enhances the sense of immersion in VR.

What features are well done?

  • Great physics

Boneworks is the first VR game built around the idea of giving players a variety of objects, each with its own unique weight and physics model, and then using them as components for solving single-player puzzles and fighting battles.

Before Boneworks, most VR games emphasized interaction, but that interaction was "shallow": players could only interact with specific, scripted objects, and clipping through models was very common. It wasn't until late 2018 that a new batch of VR shooters added collision between the gun itself and the scenery to their code. Boneworks goes a step further, arguably a giant leap, in physics. Almost all objects in the game can be interacted with, and players can also interact with other objects through the object in their hands; a vivid example is hitting enemies with a frying pan. In these interactions, an object's own weight and its various accelerations are also written into the game's code as parameters, which means there are objects you cannot lift with one hand but can lift with both, and glass that won't break when tapped lightly but shatters when struck hard.
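
As a rough illustration of how weight- and impact-dependent rules like these might be expressed (Boneworks' actual code is not public, so the thresholds, function names, and formulas below are purely hypothetical), here is a minimal sketch:

```python
# Illustrative sketch only: Boneworks' real physics code is not public.
# It shows how mass- and impact-dependent rules such as "too heavy for one
# hand" or "glass only breaks when hit hard" can be expressed.

ONE_HAND_LIFT_LIMIT_KG = 8.0   # hypothetical strength of a single hand
GLASS_BREAK_ENERGY_J = 15.0    # hypothetical shatter threshold

def can_lift(object_mass_kg: float, hands: int) -> bool:
    """An object too heavy for one hand may still be liftable with two."""
    return object_mass_kg <= ONE_HAND_LIFT_LIMIT_KG * hands

def shatters(striker_mass_kg: float, impact_speed_ms: float) -> bool:
    """The kinetic energy of the striking object decides whether glass breaks."""
    kinetic_energy = 0.5 * striker_mass_kg * impact_speed_ms ** 2
    return kinetic_energy >= GLASS_BREAK_ENERGY_J

if __name__ == "__main__":
    print(can_lift(12.0, hands=1))   # False: too heavy for one hand
    print(can_lift(12.0, hands=2))   # True: both hands together manage it
    print(shatters(1.0, 2.0))        # False: a light tap (2 J)
    print(shatters(1.0, 6.0))        # True: a hard swing (18 J)
```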

  • Hidden Sandbox mode and Arena mode

One of the coolest unlocks is the sandbox mode, which is neatly hidden behind a fake wall on the second floor of the campaign. It makes good use of whatever you collect during the campaign and drop into the giant blue recycling bins, letting you build any crazy death trap you can imagine. It feels a little sparse at first, but once you've filled your arsenal with toys, you can spend hours tinkering with it.

The infinitely replayable arena mode is just as fun, offering tons of clever, customizable new challenges to beat in all sorts of ways. Its three modes (Judgment, Challenge, and Survival) are each enjoyable in their own way, giving you reasons to return to Boneworks time and time again. They also introduce some interesting mechanics that weren't explored in the main campaign, like the balloon gun, which lets you lift enemies, objects, and even yourself into the air. Some modes are particularly tough, including a trial that forces you to fight wave after wave of minotaurs without any health regeneration. If you beat it, there's even an arcade-style scoring system you can use to prove to your friends and family that you're John Wick.

  • Great gameplay

The game provides a very good physics engine, which gives players a strong sense of immersion. That immersion comes not only from the plot but also from the physical world of the game. The game does not introduce much background story; players must follow the limited prompts and their own instincts to keep advancing and exploring. But none of that matters much: the game's unique style, the feel of its VR controls, and physical interaction close to the real world are what players notice most. From the DAU data for VR content on Steam discussed earlier, it is clear that the genres most widely accepted and loved by VR gamers are action, combat, shooting, and adventure. All of these elements appear in Boneworks: climbing, jumping, attacking enemies bare-handed or with guns, and groping forward through unknown environments make it easy for players to embrace.

  • User-friendly

The game opens with roughly two hours of tutorial levels. In this teaching stage, players can learn and practice grabbing and dropping objects; running, jumping, climbing, and other character movements; and the use and selection of various weapons. Each part provides a demonstration of the action, and all the items and scenes needed for practice are included, so even players new to VR games can quickly become familiar with the basic operations.

Between the teaching scenes, text and directional signs tell the player what to do, so this stage feels very relaxed. The excellent physics engine and creative scenes bring players one surprise after another.

What features can be improved and how?

  • Motion sickness

In Boneworks, players can not only sprint and jump but even crouch or stand on tiptoe. However, this much freedom of movement also becomes a barrier for many people: the mismatch between what the brain expects and what the eyes see causes dizziness, so many players struggle to adapt to this style of play at first.

  • Immersive experience

The physics-oriented gameplay of Boneworks is a double-edged sword. When everything works properly, the game can be quite a magical experience, providing a richness of immersion that few other games achieve. When things start to go wrong, however, it can lead to nightmarish moments of frustration.

In Boneworks, an object's acceleration and weight are taken into account, which means the damage from swinging a brick vigorously and from swinging it gently should not be the same. In practice, however, it often happens that I swing the brick hard and the target still takes no damage. The reasons are complicated: partly the cognitive mechanisms of the human brain, partly the game's acceleration calculations. Because there is an object in the VR image you see, your subconscious assumes a real object is there, so the force of your swing actually begins to weaken the moment your brain believes it has "touched" the target. What's more, there is no brick in your hand at all; even though the controller has a little weight of its own, it cannot compare with most objects in the game. In other words, even when I genuinely want to swing the virtual object with force, the near-negligible weight of the controller still keeps my brain from fully committing to the swing.
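
To make the mismatch concrete, here is a small hypothetical sketch (not the game's real damage formula): if damage scales with the object's speed at the moment of contact and only registers above a threshold, a swing that the brain reflexively slows just before impact can fall below that threshold and do nothing.

```python
# Hypothetical sketch, not Boneworks' actual formula: damage scales with the
# striking object's speed at the moment of contact and has a minimum threshold.

DAMAGE_THRESHOLD = 5.0  # hypothetical: impacts below this register no damage

def impact_damage(object_mass_kg: float, speed_at_contact_ms: float) -> float:
    """Damage grows with mass and with the square of the contact speed."""
    raw = 0.5 * object_mass_kg * speed_at_contact_ms ** 2
    return raw if raw >= DAMAGE_THRESHOLD else 0.0

# The same swing, with and without the subconscious braking described above:
print(impact_damage(2.0, 4.0))   # 16.0 -> the intended "vigorous" hit
print(impact_damage(2.0, 1.5))   # 0.0  -> reflexively slowed just before contact
```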

Generally speaking, although the game's physical effects and interaction are very good, no player can completely deceive themselves. Players will be surprised by how well the physics performs, but at the same time the nagging awareness of being separate from the game world may not be resolved until the era of full-dive neural interfaces. Boneworks exposes this problem to many players who had never thought about it before.

Conclusion

Overall, Boneworks is a VR game well worth playing. With a physics engine close to the real world, a futuristic visual style, and classic adventure and combat gameplay, it offers plenty of surprises whether or not you have VR gaming experience.

References

https://en.wikipedia.org/wiki/Boneworks

https://www.oculus.com/experiences/rift/2385436581584047/

https://www.ign.com/articles/2019/12/15/boneworks-review

https://arstechnica.com/gaming/2019/12/boneworks-review-an-absolute-vr-mess-yet-somehow-momentous/