Ambiens & BMW M2 Mixed Reality – XR Products

What is XR (Extended Reality)?

Imagine being fully immersed in a design, or experiencing a new product before it is built. Or meeting a friend face to face when they are really across the world. Extended reality, or XR, encompasses virtual reality, augmented reality, and mixed reality.

VR is the experience of fully entering an environment and interacting with virtual objects, instead of viewing an interface and only imagining it.

AR, alternatively, is an experience in which virtual elements are added to the real world rather than the person being immersed in a virtual world. These elements are integrated to fit naturally into real spaces.
MR is a blend of these experiences, with virtual and real-world elements interacting within one environment. These forms of XR are made possible by an array of devices, including mobile phones and VR headsets.

Extended Reality (XR) is simply an umbrella term comprising Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR).

Extended Reality XR

XR has endless potential applications; the following are a few of the major domains:

  • Entertainment & Gaming
  • Healthcare
  • Engineering and Manufacturing
  • Food
  • eCommerce and Retail
  • Education
  • Real Estate
  • Workspace
  • Defense
  • Travel and Tourism

Among them, two applications excite me the most, as their practical implementations are productive and have huge scope for further development for the betterment of the world. These two products are:

  1. Ambiens
  2. BMW M2 Mixed Reality

1. Ambiens XR Viewer

These days, architects and real estate professionals face difficulties in explaining their projects to clients or potential investors; convincing someone who cannot picture the result is quite difficult. To overcome this, Ambiens (a tech creative company) came up with an XR visualization and simulation solution in which the entire house is simulated using AR and presented to the customer as if they were in the actual house. Their architects and designers visit the site, create a plan of how the place will look after construction, and hand it to the XR developers. With the help of augmented reality, the developers create a virtual design, encode its link into a QR code, and place that code at the location. When a customer arrives, they simply scan the code with their phone and can observe the final product standing in the empty space.
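Ambiens has not published its internals, but the scan-to-model workflow above can be sketched roughly: a code placed on site encodes a viewer link, and scanning it resolves to the hosted 3D model for that location. The URLs and registry below are purely illustrative assumptions.

```python
from urllib.parse import urlparse, parse_qs, urlencode

# Hypothetical registry mapping site IDs to hosted 3D model files.
# (Illustrative only -- Ambiens has not published its actual scheme.)
MODEL_REGISTRY = {
    "site-042": "https://models.example.com/site-042/house.glb",
}

def make_deep_link(site_id: str) -> str:
    """Build the URL that would be encoded into the on-site QR code."""
    return "https://viewer.example.com/ar?" + urlencode({"site": site_id})

def resolve_scanned_link(url: str) -> str:
    """Given the URL decoded from a scanned code, look up the AR model."""
    site_id = parse_qs(urlparse(url).query)["site"][0]
    return MODEL_REGISTRY[site_id]

link = make_deep_link("site-042")
print(resolve_scanned_link(link))  # the model asset the viewer would load
```

The viewer app would then download that model and anchor it in the empty space the customer is standing in.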

Ambiens XR Viewer Application

Features of Ambiens

In the field of real estate, Ambiens provides the following realistic-visualization features:

  • High-definition Rendering
  • Video Tour
  • 3-D Interactive Model and Maquette
  • Augmented Reality
  • 360° Interactive and Immersive Navigations

This product helps customers connect with the architect's and designer's vision, request changes as needed, and preview those changes without any confusion.

Ambiens XR Viewer demonstration

Scope of Improvement

It is a well-developed product, but a few key features are missing that make a huge difference to the value of a real-estate property, such as the view from the balcony and the sunlight entering through the windows. These could be added if a responsive VR simulation were created.

For example:

  1. While viewing the house, you could step onto the balcony and take in a beautiful view of a beach or a forest. Customers would feel as if they were physically observing the property, and that mesmerizing sight could persuade them to buy on the spot.
  2. Some people like natural light and a breeze entering the house through the windows. This cannot be conveyed in an AR demonstration, but in a metaverse-style VR simulation it can, and based on the customer's requirements, the builder can make the necessary changes.

2. BMW M2 Mixed Reality

BMW proposed a game-like experience for customers who want to be part of a car race, under the motto "drive the change – change to drive". Though this concept is not yet available to the public, it has the potential to change traditional car racing. Here BMW uses mixed reality: the driver drives a real car on a real track or road, but what they see is not the real world. They are taken into the virtual environment of a car racing game, appearing as a racer on a virtual track complete with exciting music. The system builds the racing track from the roads and obstacles of the real world and maps them into a virtual world with additional game graphics and sound effects. The driver of the BMW M2 wears a virtual reality headset, inside which is a virtual world programmed by BMW M to represent a place called 'M Town'.

BMW M2 Mixed Reality Complete Setup

They have also installed "SmartTrack" by ART, an integrated tracking system inside the car that tracks the driver's head movement. It combines two wide-angle cameras and a controller in a single housing and can sync with other devices, which overall increases the tracking frequency and decreases latency for the VR application, resulting in higher accuracy.
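The essential job of such a head tracker is to turn the driver's measured head orientation into the direction the virtual camera should face. A minimal sketch of that conversion, assuming a simple yaw/pitch convention (the real system uses full 6-DoF poses):

```python
import math

def view_direction(yaw_deg: float, pitch_deg: float) -> tuple:
    """Convert tracked head yaw/pitch (degrees) into a unit view vector.

    Convention (an assumption for illustration): yaw 0 means looking
    straight ahead along +x, positive yaw turns toward +y, and positive
    pitch looks up (+z)."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    x = math.cos(pitch) * math.cos(yaw)
    y = math.cos(pitch) * math.sin(yaw)
    z = math.sin(pitch)
    return (x, y, z)

print(view_direction(0, 0))   # looking straight ahead along +x
print(view_direction(90, 0))  # head turned 90 degrees, now along +y
```

The renderer would update this every frame; the higher the tracking frequency and the lower the latency, the less the virtual view lags behind the driver's real head motion.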

BMW M2 Mixed Reality experience

Features of BMW M2 Mixed Reality

Team BMW tried to provide:

  • A game-like, sporty experience with real motion, which results in negligible motion sickness
  • SmartTrack integration for higher tracking accuracy
  • 360° navigation
  • High-definition rendering

Scope of Improvement

It could be extended to real-world traffic scenarios, keeping all traffic rules and regulations in mind. Moreover, a smart, intelligent system could detect roadside signposts and adapt the game scenario accordingly, making driving more fun and relaxing. After a long, hectic working day, one could then enjoy the ride home as a gaming experience, while the smart system works to keep the drive comfortable, safe, and enjoyable.


The possibilities of XR are endless, so what does the future hold for extended reality? It is expected to improve with more sophisticated and more affordable hardware, faster rendering, and more detailed modeling. More industries will implement XR in more areas, including education, hobbies, health, sports, and work. Eventually, people may move into a mixed reality environment in an interconnected metaverse. This might include working together with innovators, production entities, and offices everywhere to increase knowledge, invent intelligently, use resources wisely, and solve more intricate problems. XR extends reality in many ways, increasing understanding of the real world and inviting people to move beyond its edges into something previously unimaginable.


  1. SmartTracker
  2. Ambiens XR Viewer
  3. BMW M2 Mixed Reality

Atlas Bay VR – Leveraging VR for Real Estate

The COVID-19 pandemic posed unprecedented challenges to the real estate industry, which is primarily reliant on agents to organise property roadshows for new developments and facilitate physical visits, communication and transaction terms between a homeowner and a potential buyer for resale flats. During the early days of the pandemic, governments of countries worldwide imposed lockdowns, restrictions and safe distancing measures to stem the tide of infections, causing inconvenience for agents to organise property roadshows for new developments or schedule in-person viewings for resale flats. In a bid to stimulate the economy and prevent an economic recession during a drawn-out pandemic, central banks have reduced interest rates. The low-interest rate environment makes borrowing money cheaper as a potential homebuyer can get a cheaper loan from the bank to buy a property. This creates a huge demand since the borrowing cost is lower, thus driving the real estate market and boosting asset prices.

Despite the real-world inconvenience caused by the COVID-19 pandemic, the Extended Reality (XR) industry is booming. The global XR market size reached 28 billion U.S. dollars in 2021 and is estimated to rise to over 250 billion U.S. dollars by 2028 (Alsop, 2022). Given the projected increase in the market size and improvements to XR and its accessibility, agents can leverage VR to show homes to buyers and gain more exposure for their listings. One company that leverages the benefits of VR technologies in real estate is Atlas Bay VR.

Atlas Bay VR: Virtual Reality (VR) for Real Estate

Atlas Bay VR uses VR technologies to deliver solutions and services for real estate developers, marketers, and property managers. They custom-make high-end VR experiences that can be delivered to clients or customers on HTC Vive, which comprises a room-scale headset and tracked controllers. There are various attractive features in this idea that make it engaging; however, there is also some scope for improvement.

Why do I like it?

I like this idea because it enables us to experience a computer-generated simulation of a real estate property without being physically present there. We can experience how the property would look and feel even before it has been completely constructed. This way, we can try different property layouts for an apartment and make an informed decision about which of these would be best even before these layouts are implemented. This would save both time and money.

Why is it engaging?

This would be quite an engaging experience because, although it is a simulation of reality, we feel as if we are gaining real experiences of performing various activities. These include opening doors and entering different rooms, customising things such as changing the colours of walls, sitting on the couch, and getting outdoor views of an apartment from the different storeys of the building.

What features are well done and what can be improved?

The application is well-implemented because it provides the users with the ability to perceive how the property would appear if they change the layouts according to their choice. However, further improvements can be made by adding features to make the experience of customisation more flexible. Currently, only simple customisations, such as changing a wallpaper are allowed. The customer may want to experience more customisations, such as putting some furnishings. Since an apartment is usually handed over to the customer without any furnishings, it would be beneficial for customers to have first-hand experience in furnishing the apartment virtually, as this adds a personal touch to their apartment viewing before deciding whether the floor plan of the apartment is suitable for their design aspirations.


XR technologies allow users to perceive various scenarios in digital simulations in such a way that the perception is quite close to real-world experiences. These technologies are successfully being used by some real estate companies, such as Atlas Bay. The users can experience how a property would look and feel even before it has been completely constructed and can also perceive how some changes in the layouts would appear. A further improvement can be made by adding features to make the experience of customization more flexible. With advancements in science and technology, it is expected that there would be more applications of XR technologies.


Alsop, T. (2022, July 27). AR/VR market size worldwide 2021-2028. Statista. Retrieved January 20, 2023, from

Atlas Bay VR. (n.d.). Virtual Reality. Retrieved January 20, 2023, from

Explore the Mona Lisa in Virtual Reality (VR)

Introduction of Mona Lisa: Beyond the Glass

In 2019, the Louvre launched its first virtual reality experience, Mona Lisa: Beyond the Glass, during the Leonardo da Vinci exhibition. Through the immersive experience, the audience would be able to appreciate the famous art piece at a close distance and learn the details behind it. This is an application of VR in the fields of both education and tourism.

Why do I like it?

I physically visited the Louvre in 2018 and saw the Mona Lisa with my own eyes. However, about 30,000 tourists come to see the Mona Lisa every day from around the world, and undoubtedly I was only able to take a quick look at the famous art piece from meters away in the crowd. Mona Lisa: Beyond the Glass gives me a brand-new experience of visiting the painting virtually, allowing me to study it closely and explore its mysteries at my own pace. As an audience member, I love the interactive design of the application and appreciate its educational value. With this application, more people will be able to "visit" the Mona Lisa and learn more about this great masterpiece.

Why is it engaging?

Art should be understood not only through explanations, but through experience as well.

The design of this application is interactive, with sound, images, and animations. The virtual experience follows a storytelling style: the story of the Mona Lisa is told during the virtual tour, and the details of the painting are presented to the audience with clear explanations. Different aspects of the Mona Lisa are shown so that the audience can understand the painting more fully.

What features are well done?             

One amazing feature of the application is that the audience can see how the painting's appearance has changed over the years in the virtual space. This is something the audience cannot see in the real world, but the application gives them the chance to view history. It is also amazing that a 3D model of the Mona Lisa is built into the application, so the audience can see the woman from 360 degrees and watch her elegant movements.

What features can be improved?

From my own perspective, this is a nearly perfect application, and an art piece in itself. If I had to give advice for improvement, I would suggest that the view of the painting's landscape at the end of the virtual tour be presented in a more realistic, real-world style instead of a painting style, so the experience feels more genuine to the audience.


Remarkable Beat Saber, from an 8-year music game player's view

I've been a music game player for more than 8 years, so I consider myself picky about them. Yet I still cannot forget the shock of my first sight of Beat Saber; whenever VR comes to mind, Beat Saber comes up. The first thing I did after upgrading to an RTX 2060 PC was to buy a VR device to play it.

Dance Monkey • Expert • Beat Saber • Mixed Reality [1]

Released in 2018, Beat Saber remains a widely praised VR game on Steam. Its content is good, but it could not have achieved such huge success without VR. At my first sight of it, the player looked like a brave warrior wielding two fantastic red and blue swords, sword-dancing, nimbly avoiding the oncoming light walls, amid gorgeous light effects and an epic soundtrack. That sums up why Beat Saber is so engaging: the strong visual impact brought by VR, the wonderful interactive UI, and the sense of engagement rooted in a music game.

An immersive application of VR

I have tried so many wonderful music games that I barely glance at average ones. Beat Saber, however, is different. Vivid music blocks fly at your face while you wave your awesome light sabers, as if dancing to the beat in an unfamiliar wonderland. What an impressive experience! And that is what VR can offer: a fictional world full of fancy.

Less is more: a simple UI, but multisensory interaction.

[Beat Saber] YOASOBI – Racing into the Night (Yoru ni Kakeru) Collaboration with MASO [2]

At first sight of Beat Saber, you will notice there are few words in it. Instead, you are embraced by a new musical world where instructions hide in the properties of objects, spanning sight, hearing, and touch. For example, the red/blue saber splits red/blue blocks; arrows on blocks tell you in which direction to slice them; feedback like an electric shock tells you that bombs and walls are to be eluded; and so on. All of this makes the game easy to play, as if you were born knowing how.
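The core rule those visual cues encode can be written down in a few lines. This is a toy model of the mechanic as described above, not Beat Saber's actual code: a cut counts only when the saber colour matches the block and the swing direction matches the block's arrow (directionless "dot" blocks accept any swing).

```python
def judge_slice(saber: str, block: str, slice_dir: str, arrow_dir: str) -> bool:
    """Toy model of Beat Saber's core cut rule (illustrative only):
    the saber colour must match the block, and the swing direction must
    match the block's arrow, where 'any' marks a directionless block."""
    colour_ok = saber == block
    direction_ok = arrow_dir == "any" or slice_dir == arrow_dir
    return colour_ok and direction_ok

print(judge_slice("red", "red", "down", "down"))   # correct saber and swing
print(judge_slice("red", "blue", "down", "down"))  # wrong saber colour
print(judge_slice("blue", "blue", "left", "any"))  # dot block, any swing
```

The point is that the player never reads this rule anywhere; colour, arrows, and haptics teach it implicitly.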

A sense of engagement rooted in a music game

Hearing plays an important role in our lives. Even without sight, a simple piece of music can create a sense of engagement. To some extent, "VR in sound" is the earliest and most mature technology in VR development. As a music game, Beat Saber's sound design is without doubt excellent. In my opinion, when a VR application offers me a wonderful experience in both sight and hearing, that is praiseworthy enough.

Limitations of today's VR interaction

Not just in Beat Saber: many of today's VR applications limit interaction to 6-DoF VR controllers. Due to the limits of current spatial-positioning equipment, tracking between the base station and the headset/controllers can drift after the user moves their body significantly. In games involving whole-body movement, users may find themselves gradually drifting toward a spot where the VR controllers no longer track well.

What's more, space is much more expensive than VR equipment. These kinds of VR applications require a lot of space; in small rooms such as bedrooms they are difficult to play, as you are likely to hit the furniture. As a result, the highest cost of this game is having a wide room rather than a VR device, which can make such VR applications impractical.


In conclusion, a successful VR application nowadays should, in my opinion, be a VR+X application. VR alone doesn't work. When it is combined with good content, in this case a wonderful music game, it has the potential to succeed.


[1] Dance Monkey • Expert • Beat Saber • Mixed Reality, xoxobluff, Youtube

[2] [Beat Saber] YOASOBI – Racing into the Night (Yoru ni Kakeru) Collaboration with MASO, Artemisblue, Youtube

Google Lens


The concept of augmented reality is highly intriguing, offering a wealth of possibilities and the potential to transform our daily lives. It enables us to augment our perception of the physical world by providing an additional layer of information, without the need for active searching.

The application that I will be talking about in this post is one that is readily accessible to most people and might have already been used by most readers. That application is Google Lens.

About Google Lens

Google Lens was announced during Google I/O 2017 as a visual add-on to Google Assistant. It was originally pre-installed only on the Pixel 2; over the following two years it was rolled out to non-Pixel devices, and from June 2018 it was separated into its own standalone app.

How Google Lens enhances my life (the features I like and that are well done)

For the few people unfamiliar with Google Lens, here is a short demo of the features it possesses:

The translate feature has been available on Google since 2006, but it had limited use in day-to-day life. Sure, it could let you read text on your computer written in a language you don't comprehend, but it couldn't help in situations away from a computer. Google Lens bridges that disconnect between the virtual and physical worlds. With Google Lens, travelling to a different country and understanding signage isn't a hassle; it's intuitive and it just works. This real-time processing and translation is far from the flashy visions of XR we imagine, but in my opinion it impacts our lives so greatly that we already take it for granted.

As a vegetarian, I have dietary restrictions. I look at ingredients for products that I buy to ensure that they don’t contain any animal product. This ends up being hard for products which are in a language other than in English. Google Lens has simplified that entire process for me and others who have similar dietary restrictions or allergies.

While travelling, gone are the days of carrying a pocket translator book or typing foreign words into Google Translate. With the Live Translate feature, all you need to do is open the camera app and have the translated terms superimposed on whatever you would like to translate. And on devices from the Pixel 6 onward, all of this translation happens on-device thanks to the Google Tensor chip.

The seamlessness, and the fact that Lens overlays the translation directly on top of the original text rather than in a separate window, with extra functionality (such as copying the translation or sending it to linked devices) tucked off to the side, make it a truly unique product capable of things no other product can do.
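The overlay step of such an OCR-and-translate pipeline can be sketched simply: keep each detected text region's bounding box and draw the translation inside it. The OCR results and the tiny dictionary here are stand-ins; the real system uses ML models for both recognition and translation.

```python
# Stand-in OCR output: (bounding box, recognised text) pairs, where a
# box is (x1, y1, x2, y2) in image pixels. Illustrative values only.
OCR_RESULTS = [((10, 20, 120, 40), "sortie"), ((10, 60, 120, 80), "entrée")]
DICTIONARY = {"sortie": "exit", "entrée": "entrance"}

def build_overlay(ocr_results, dictionary):
    """Replace each detected text with its translation, keeping the
    original bounding box so the result is drawn over the source text.
    Unknown words are passed through unchanged."""
    return [(box, dictionary.get(text, text)) for box, text in ocr_results]

for box, text in build_overlay(OCR_RESULTS, DICTIONARY):
    print(box, text)  # draw `text` inside `box` on the camera frame
```

Keeping the original boxes is what makes the result feel like an in-place edit of the world rather than a separate translation window.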

Future Possibilities and Upcoming Updates

And this is just the beginning of the interactions possible with augmented reality and computer vision in Google Lens. During the 2022 Google I/O, Google announced an expansion to their multisearch feature that lets you add search queries on top of a picture. Building on this, Google announced scene exploration, a feature that works much like "Ctrl+F" for the real world: users move their camera around their surroundings, and Lens automatically recognizes multiple products or objects and provides insights about them.
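The "Ctrl+F for the real world" idea reduces to filtering recognised objects by a query plus constraints. A minimal sketch, where the detected objects and their attributes are stubbed-in assumptions (the real feature gets them from vision models and product data):

```python
# Stand-in detections from the camera feed. Attribute names like
# `nut_free` are hypothetical, for illustration only.
DETECTED = [
    {"label": "dark chocolate", "rating": 4.6, "nut_free": True},
    {"label": "milk chocolate", "rating": 4.1, "nut_free": False},
]

def scene_search(objects, query, **constraints):
    """Return detected objects whose label contains the query and that
    satisfy every keyword constraint (e.g. nut_free=True)."""
    return [
        o for o in objects
        if query in o["label"]
        and all(o.get(k) == v for k, v in constraints.items())
    ]

print(scene_search(DETECTED, "chocolate", nut_free=True))
```

An AR layer would then highlight the matching items in the live camera view, as in the grocery-aisle demo described below.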

The demo they gave during the presentation involved identifying the chocolates in a grocery aisle and, based on the user's nutritional needs and reviews, picking out a chocolate and pointing it out to them using AR. The demo can be found here:

What makes Google Lens engaging

During the Made by Google event in 2019, Google's SVP of devices and services Rick Osterloh discussed Google's vision for the future of computing. He described a world beyond smartphones, where computing is not confined to a device in your pocket but is all around you and integrated into everything. He called this concept "ambient computing": devices and services work together with artificial intelligence to provide assistance wherever you want it, and the technology fades seamlessly into the background when not needed, with the focus on the user, not the device.

Google Lens is a step toward seamless ambient computing, where the interactions needed to get information are so natural that they don't stick out. What makes Google Lens engaging is the fact that it is so seamless and intuitive that anyone can pick it up and explore an environment they might not otherwise be able to. Its clean UI and unobtrusive interface let it blend right into the scene it is analyzing. Engagement is the quality of being engrossed and connected: when looking out of a window, one doesn't look at the glass but at the view outside. Google Lens is a window to an augmented world, and the fact that we forget its existence is a testament to how engaging it is.

What can be made better?

For all their talk of ambient computing that just exists, on most phones other than Google's own, Google Lens is still a standalone app that has to be launched separately. This needs to improve: the extra steps break the immersion, in which time the user could have searched for the information in a traditional browser, and prevent them from fully connecting with what is in front of them. Google should integrate Lens natively with third-party manufacturers' camera applications to give the average consumer access to this technology. Google Lens is still a relatively niche product, and this move would let it reach many more consumers.

Google Lens is unfortunately also only able to translate these languages:

  • Chinese
  • French
  • German
  • Hebrew
  • Hindi
  • Italian
  • Japanese
  • Korean
  • Portuguese
  • Spanish

They can improve their product by branching out to include other languages, helping create a more interconnected, multilingual world.


Microsoft Flight Simulator 2020 – ready to be in the very front seat?


Microsoft Flight Simulator (abbreviated as MSFS) is a series of flight simulation applications that was first released in 1982. Starting with the 2020 version, the application now runs in Virtual Reality (VR) mode, allowing users to experience a highly interactive and realistic flight simulation.

What makes it engaging?

Porting the simulator to VR makes this application especially engaging and entrancing to the users. The users feel like they are inside the cockpit and maneuvering the actual aircraft. This sense of immersion is strengthened by the detailed depiction of surroundings, including the airports, cities, and skyscrapers, as well as the natural landscape, providing the users with a life-like experience. While flying over the Grand Canyon, users can see the intricate details of the canyon’s rock formations and the winding Colorado River. Additionally, while flying over a city like New York, users can see detailed 3D models of famous buildings such as the Empire State Building and the Statue of Liberty.

Features that are well-done

High-fidelity representation of the surroundings

Being a simulator, MSFS strives to sit at the realistic corner of the representational-fidelity triangle. It employs a number of methods to ensure the surroundings users interact with are as realistic as possible. First, it uses high-quality 3D photogrammetry data from Microsoft's Azure 3D maps library. Where an area is not captured well by that data, it applies a deep-learning algorithm to 2D satellite imagery to reconstruct the scene in 3D. Lastly, some areas worthy of special attention are modeled manually by the designers[1].
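The three data sources form a fallback chain that can be sketched as a simple selection rule. This is a simplified illustration of the approach described above; the actual pipeline is internal to Asobo/Microsoft, and the tile attributes here are assumptions.

```python
def pick_scenery_source(tile: dict) -> str:
    """Simplified sketch of the scenery fallback chain: hand-authored
    landmarks win, then photogrammetry coverage, and finally ML
    reconstruction from 2D satellite imagery as the default."""
    if tile.get("hand_modeled"):    # notable landmarks, authored manually
        return "manual model"
    if tile.get("photogrammetry"):  # covered by Azure 3D maps data
        return "photogrammetry"
    return "ml reconstruction"      # deep learning on satellite imagery

print(pick_scenery_source({"photogrammetry": True}))
print(pick_scenery_source({}))
```

The practical effect is that every tile on Earth renders in 3D, with quality degrading gracefully where high-fidelity capture is unavailable.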

User-friendly interactions inside the cockpit

When run in VR mode, MSFS does not use a HUD. Instead, it relies on the virtual-world reference frame: users read the status of the current flight from the dashboard inside the cockpit. Previously, VR controllers were not supported, meaning users had to purchase simulator-compatible hardware to fully maneuver the aircraft[2]. With the update in November 2021, users can now use tracked VR controllers to interact with the planes' cockpits, greatly increasing the immersion of the simulation.

Features that need improvement


Performance and hardware demands

As MSFS tries to make the flying experience as realistic as possible, performance and hardware requirements are a natural concern. Users have complained about FPS drops and performance issues, especially in VR. While Asobo Studio, the simulator's developer, regularly releases updates and hotfixes to improve frame rates, more can be done to optimize hardware usage and graphics rendering[3].

Control complexity

While it is fantastic that users can directly interact with the dashboards and controls, the complexity of flying an aircraft poses a significant learning curve for most users. Since MSFS positions itself as an amateur flight simulator, it should preserve the realism of flight simulation while also considering newcomers, providing more intuitive ways to maneuver the aircraft, such as voice-recognized commands or a virtual co-pilot who does the heavy lifting.


If you are into flight simulation and want the surreal experience of becoming a pilot yourself, MSFS is a phenomenal application that can fulfill that dream. Nevertheless, beware of the influx of complex instructions and settings, on both the hardware and software sides, that you should learn before embarking on your journey.


[1] “Exploring the Whole World in VR with Bing 3D Maps and MRTK.” TECHCOMMUNITY.MICROSOFT.COM, 1 May 2022,

[2] Feltham, Jamie, et al. “Microsoft Flight Simulator Finally Has VR Controller Support.” UploadVR, 22 Nov. 2021,

[3] Chawake, Anurag. “Microsoft Flight Simulator ‘Fps Drops’ & ‘Performance Issues’ in V1.25.9.0.” PiunikaWeb, 12 May 2022,

Hololy – The AR app that brings your idol to the real world.

Hololy is an Augmented Reality (AR) application that lets you project 3D anime characters into the real world through your phone. The application offers a selection of models, poses, and dances that make your AR model appear as if they are right there with you.

How it works.

The main attraction of the application is using your creativity to bring the 2D world into the 3D world and create interesting pictures and videos. First, the application detects a flat surface where the model can be placed, which prevents the model from floating in mid-air; this surface becomes the model's reference point. After confirming the location, you can place the model and rotate it. Moving the camera around shows different perspectives of the model, as if the model were really in the space. From there, you can choose poses, expressions, and dance moves to make the model look alive in the scene.
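The plane-anchoring step above can be sketched as snapping a chosen position onto the detected surface, so the character stands on it rather than floating. This is a toy model of the behaviour, not Hololy's actual code; coordinates are illustrative.

```python
def anchor_to_plane(model_xy: tuple, plane_height: float) -> tuple:
    """Place the model on the detected flat surface: keep the chosen
    horizontal position (x, y) but snap the vertical coordinate to the
    plane's height, so the model stands on the surface."""
    x, y = model_xy
    return (x, y, plane_height)

# Plane detected at height 0.0 (the floor); model placed at (1.5, 2.0).
print(anchor_to_plane((1.5, 2.0), 0.0))  # (1.5, 2.0, 0.0)
```

Real AR frameworks track such anchors continuously so the model stays fixed in place as the camera moves.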

What makes it fun?

The idea that you can see something usually confined to the 2D world, whether from a cartoon or a game, in 3D is pretty amazing. Usually we view 3D models and animation through a 2D screen, so our perspective is locked to that screen: the view may change, but the user is glued to the same spot. With this AR app, the user can see and feel how a model looks from different angles and how it fits into the environment. Being able to position your model to sit on a chair, or have them dance in your room, allows a lot of wacky photos to be taken.

Things to improve on

The first glaring issue upon using the app is that the 3D models are lit in a fixed, uniform way, which conflicts with the actual lighting of the environment: shadows appear on sides that are clearly lit in the scene. If the app could estimate the scene's lighting, for example through machine learning, it would enhance immersion by making the model fit the picture better.

The second issue is that detecting a surface for the model to stand on is not very accurate; at times the model seems to float in the air. Also, restricting the reference point to standing poses limits creativity. For example, sitting or leaning poses could use furniture as the reference point. This would enhance the experience, as standing bolt upright in someone's house isn't the most natural pose.


Hololy is a decent app for AR immersion, especially for fans of the anime characters. It lets you combine your creativity with the AR powers of the application to create memorable photos and videos. However, there is plenty of room for improvement, as the model does not yet truly fit into the 3D world.


[2022/23S2 CS4240 VAR] XR – Ikea Kreativ


Ikea Kreativ is a Virtual Reality (VR) design tool that helps customers visualise how furniture will look in their room. Powered by VR and AI technology, it makes home redesign and renovation easy, helping customers bring their ideas into their home. Features include:

  1. Scanning of the user's room using an iPhone camera to accurately capture room dimensions.
  2. Removal of existing furniture and placement of virtual furniture in its place to check the fit.
  3. More than 50 virtual showrooms for users to place furniture in.
  4. Thousands of furniture items and decorations for users to interact with.
Figure- Ikea Kreativ
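As a rough sketch of how scanned dimensions can drive a fit check, the following hypothetical snippet (the function names and figures are my own, not Ikea's API) derives room dimensions from a scan's bounding box and tests whether a furniture item fits:

```python
def room_dimensions(scan_points):
    """Axis-aligned bounding box of scanned points -> (width, height, depth)."""
    xs = [p[0] for p in scan_points]
    ys = [p[1] for p in scan_points]
    zs = [p[2] for p in scan_points]
    return (max(xs) - min(xs), max(ys) - min(ys), max(zs) - min(zs))

def furniture_fits(furniture_size, room_size):
    """A furniture item fits if every one of its dimensions is within
    the corresponding room dimension (ignoring placement and clearance)."""
    return all(f <= r for f, r in zip(furniture_size, room_size))

# Assumed scan of a 4 m x 5 m room with a 2.6 m ceiling.
corners = [(0, 0, 0), (4.0, 0, 0), (4.0, 2.6, 5.0), (0, 2.6, 5.0)]
room = room_dimensions(corners)
print(room, furniture_fits((2.0, 0.9, 0.9), room))  # a 2 m sofa
```

The real app would of course also account for walls, doors, and the positions of remaining furniture, but the bounding-box check captures why scanning removes the need for a tape measure.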

What do I like about it?

Ikea Kreativ makes redesigning a space convenient and hassle-free, skipping the process of measuring furniture and space, and lets users see their ideas not just in their mind's eye but right in front of them.

It also provides a good degree of customisability, with thousands of furniture items, decorations, and accessories with which users can customise their space. The ability to select and remove items in the room gives even more flexibility to users who wish to integrate new items into a space containing existing ones.

Why is it engaging?

I found the application engaging because of the good degree of freedom it provides. Ikea Kreativ lets users adjust Ikea items with ease, rotating and moving them around the space just as they would move items in real life.

What features are done well?

Placing Ikea's furniture into the virtual room does a good job of letting users gauge the furniture's proportions relative to the room. Ikea Kreativ's VR element helps visualise how items would look in the user's space, skipping the hassle of measuring the dimensions of both furniture and room.

The wide selection of items and showrooms also does a good job of replicating Ikea's real-life stores, allowing users to browse through the item selection without compromising the experience they get in a physical store.

What features could be improved and how?

Though marketed as a selling point, the erasing tool is not perfect: when removing existing items, the software does not always know how to fill in the empty space. This could be improved by training the underlying model on more data. Nonetheless, it does not stop users from adding Ikea's furniture into the space. After all, the eraser tool is a new feature, and I believe Ikea will continue to improve it for a better user experience.
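Conceptually, filling the erased region is an image-inpainting problem. The toy sketch below (a single naive diffusion pass, far simpler than whatever Ikea actually uses) fills each masked pixel with the average of its known neighbours:

```python
def inpaint_once(image, mask):
    """One naive diffusion pass: each masked pixel takes the average of its
    known 4-neighbours. Real inpainting models are far more sophisticated,
    hallucinating texture and structure rather than just blurring."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(h):
        for x in range(w):
            if mask[y][x]:
                vals = [image[ny][nx]
                        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                        if 0 <= ny < h and 0 <= nx < w and not mask[ny][nx]]
                if vals:
                    out[y][x] = sum(vals) / len(vals)
    return out

# A 3x3 grey wall (value 100) with the erased pixel in the centre.
image = [[100, 100, 100], [100, 0, 100], [100, 100, 100]]
mask = [[False] * 3, [False, True, False], [False] * 3]
result = inpaint_once(image, mask)
print(result[1][1])
```

Averaging works for a plain wall but smears patterned floors and shadows, which is exactly the kind of artifact more training data and stronger models are meant to fix.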

Another feature that could be improved is the degree of freedom to navigate around the room. This would require more devices, such as a head-mounted display and trackers, letting users "feel" the surrounding space and the ambience of the newly furnished environment.



Extended Reality (XR) in Arts and Entertainment

Van Gogh experience Virtual Reality (VR)

I am not a big art fan. Art is so open to interpretation and abstract that it usually bores me. However, I recently got the chance to visit the Van Gogh exhibition, which included a 360º immersive experience of his art and of his journey through life. It was held in a cathedral where all the walls, including the floor, had projections on them. It was quite a spectacle, bringing the audience through his thought process and showcasing his artwork, and this was just the beginning.

Image of 360º projection display in York, St Mary’s Cathedral [1]

After the 35-minute showcase, there was a VR experience with a headset and speakers; we were seated in a chair that let us turn 360º. It brought us through the time periods and locations that inspired Van Gogh and explained the motivation behind each painting.

A snippet of Van Gogh Experience VR section [2]

One of the more famous pieces showcased was Starry Night. We were brought to the exact location of the painting, with Van Gogh "narrating" his thought process, such as the colors he saw and why he decided to use a particular color in the piece. I liked that the information was easy to digest, on top of the fact that we could see what Van Gogh saw and thought as he painted the various art pieces.

However, one thing I think could be improved is the mobility of the experience. It would have been better if we could walk through the whole "exhibit" as if we were in that time period and explore the area and paintings at our own pace. This could be done by providing users with controllers and adding demarcations in VR so that users can move around the area without needing a large physical venue.

VR brings art to life, making it easier for people to understand the artist’s point of view. This reinvents museums and strays away from traditional art galleries, which is likely to attract more youths to the art scene.

Zero Latency (Sol Raiders)

VR in gaming is nothing new anymore. With the rise of games like Beat Saber, it pushes innovation and the possibility of multiplayer games. Zero Latency is a free-roaming, multiplayer VR experience offering games such as Survivor, Outbreak Origin, and Sol Raiders. I, alongside seven other players, participated in Sol Raiders, a 4v4 game where the objective is to complete as many tasks as possible while minimizing the number of deaths on each team.

Sol Raiders trailer by Zero Latency [3]

I do not usually play first-person shooter-type games, but this was so much fun! After putting on the vest, headset, headphones, and gun, we were transported into the game dimension, and all the players (teammates and opponents) were dressed up like robots! Everything was so lifelike; it really felt like I was a robot in that reality, especially with the sound effects.

Players in real world [4]
Players in game world [5]

At some point during the game, I felt very lightheaded because of the mismatch between the real world and the game world. The game world was multi-dimensional with slopes and lifts while the real world was a flat ground room. So walking up the slopes and taking the lifts were disorientating.

I think one thing that could be improved is the UI of the headset. The existing display did not show any information about the game status, which only appeared on a screen at the end of each round. Since this was a team game with a common objective, the display could have included more information such as the number of kills/deaths and objectives fulfilled. This would let us plan our game better instead of constantly trying to keep track of this data ourselves.
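A minimal sketch of the kind of in-headset status line I have in mind (the format and fields are my own suggestion, not anything Zero Latency provides):

```python
def hud_line(team_kills, team_deaths, objectives_done, objectives_total):
    """Compact one-line HUD summary to overlay in the headset during a round."""
    return (f"K/D {team_kills}/{team_deaths}  "
            f"OBJ {objectives_done}/{objectives_total}")

print(hud_line(12, 7, 2, 5))
```

Keeping the line short matters: a dense overlay would clutter the limited field of view of a VR headset.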

Overall, I think the game was very well done, especially since it had to sync the 8 players throughout the game.


[1] Google maps. [Online]. Available:,-1.0809633,3a,83.9y,90t/data=!3m8!1e2!3m6!1sAF1QipO3uPo03Wb4cJR9A05JAQ7rzK_QfEesIPJ3zPY8!2e10!3e12!!7i1024!8i682!4m7!3m6!1s0x48793130726fa6ed:0x651d1a44837b68c1!8m2!3d53.9572689!4d-1.0808674!14m1!1BCgIgARICGAI. [Accessed: 20-Jan-2023].

[2] “Van Gogh: The immersive experience,” YouTube, 24-Feb-2021. [Online]. Available: [Accessed: 20-Jan-2023].

[3] “Sol Raiders – trailer – zero latency VR,” YouTube, 07-Feb-2019. [Online]. Available: [Accessed: 20-Jan-2023].

[4] “Zero Latency’s Latest Free-Roam Experience Made Me A Believer In VR Esports” VRScout, 10-Aug-2018. [Online]. Available: [Accessed: 20-Jan-2023].

[5] “Sol Raiders,” Zero Latency Luxembourg, 07-Aug-2022. [Online]. Available: [Accessed: 20-Jan-2023].

MEDIVIS – AR Surgery Platform


MEDIVIS is an XR application for surgery. It creates 3D models of the human body (organs) from Magnetic Resonance Imaging (MRI) scans, and these models can then be placed in the real environment through AR. The application provides realistic 3D visualization of anatomical structures, and the user can zoom the visualization (model).
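As a toy illustration of the first step from an MRI volume to a 3D model, the hypothetical sketch below (not MEDIVIS's pipeline) thresholds voxel intensities to pick out tissue voxels, which a meshing step such as marching cubes could then turn into a surface:

```python
def segment_voxels(volume, threshold):
    """Pick out voxels whose MRI intensity meets a threshold -- a crude
    segmentation, the precursor to meshing them into a 3D surface."""
    coords = []
    for z, plane in enumerate(volume):
        for y, row in enumerate(plane):
            for x, intensity in enumerate(row):
                if intensity >= threshold:
                    coords.append((x, y, z))
    return coords

# 2x2x2 toy scan: one bright "tissue" voxel among dark background.
volume = [[[10, 10], [10, 200]], [[10, 10], [10, 10]]]
tissue = segment_voxels(volume, 128)
print(tissue)
```

Clinical segmentation is far harder (noise, partial volumes, multiple tissue types), which is why such pipelines combine thresholding with atlas-based and learned methods.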

Attractive Reasons

MEDIVIS provides two primary services: an AR-based surgical support platform and a VR/AR-based health education platform. The application combines the two stages (training and working) in the same interface, which smooths the transition from student to surgeon. This integration of teaching and real-world tools is the highlight that attracts me.

Another advantage is that it lets medical students gain realistic surgical experience before they are qualified. Every surgeon has a first time conducting surgery on a real patient, but using the AR/VR platform beforehand can reduce the nervousness or lack of skill that might otherwise show in real surgery. This would increase the overall success rate of surgeries, saving more lives.

Engaging Reasons

Compared to other XR applications, the highlights of MEDIVIS listed below make it stand out:

  1. The application gives more medical students realistic chances to practice. Traditionally, there are far more medical students than practice materials on which they can train.
  2. Compared to a monitor screen or textbook, which only provide a 2D view, the application provides a 3D visualization of the human body (organs). Students benefit from AR visualization because the real world is 3D, and they gain a more intuitive interaction without relying on imagination.
  3. The application needs only a single headset to deliver its features. The original devices used to support surgery, teaching, and training are sets of multiple large instruments. The application achieves better performance overall and is more convenient to use.

Highlight Features

Two main features are well done:

  1. Reusability is a critical feature of the application. In the healthcare industry every case is unique, and one limitation for practitioners is that experience (illness cases) cannot be saved for further review in the traditional way. The AR application builds a model once and keeps it in a database to share for future reference.
  2. Sharing the same view synchronously with other students makes abstract concepts in medical subjects easier to understand. In the 3D environment, students gain a deeper understanding of concepts, and communicating with others takes the learning experience to a new level. For example, the application could let students stand inside a blood vessel, look around, and examine the same structures together for discussion.

Features to Improve and How to Implement Them

  1. Delay during real surgery may cause serious consequences. The devices supporting surgery are connected wirelessly; a cabled connection can reduce latency by roughly an order of magnitude and make the platform more stable. Using high-bandwidth cables instead of a wireless connection during real surgery is therefore necessary to improve delay and stability.
  2. Real-time scan modeling, giving patients a painless examination, would be a further improvement. It could be used right before surgery to give the surgeon feedback on the patient and to evaluate the risk of the operation, reducing its danger. At the same time, a customized profile for each patient would benefit further decision-making. Real-time reconstruction needs more computing power and an efficient algorithm; adding high-performance computing units and implementing a customized algorithm would be the next step to improve the application.
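To put the latency point in perspective, here is a quick back-of-the-envelope calculation (the 20 ms wireless and 2 ms wired figures are illustrative assumptions, not MEDIVIS measurements) converting end-to-end latency into frames of lag on a 90 Hz headset display:

```python
def frames_of_lag(latency_ms, refresh_hz):
    """How many rendered frames a given end-to-end latency spans."""
    frame_time_ms = 1000.0 / refresh_hz
    return latency_ms / frame_time_ms

# Assumed illustrative figures: ~20 ms wireless vs ~2 ms wired link.
wireless_lag = frames_of_lag(20.0, 90)
wired_lag = frames_of_lag(2.0, 90)
print(wireless_lag, wired_lag)
```

Under these assumptions the wireless link costs almost two full frames of lag while the cable stays well under one, which is the "order of magnitude" difference the point above refers to.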


In healthcare, XR applications bring a more intuitive way to help both patients and doctors through 3D visual interaction. Rendering anatomy in 3D in the real world makes up for the information lost when a dimension is flattened onto a 2D screen.