How Will AR Merge with the Real World?


“Reality Channel” was unveiled for the first time at Niantic’s conference as an installation exploring the possibilities of a “Real World Metaverse” built with augmented reality (AR). We asked Keiichi Matsuda, the filmmaker and designer known for his video work HYPER-REALITY, how such a metaverse merges with the real world.

Look up, and rainbow-colored eels glide across the sky; turn around, and a giant plant unlike anything you have ever seen towers overhead. This is a scene from “Reality Channel,” an installation presented at “Lightship Summit 2022,” the conference Niantic hosted in May 2022.

This installation, built with augmented reality (AR) technology, was designed so that visitors move back and forth between multiple “reality channels” through their smartphones. One channel, for example, overlays a venue guide and session introductions on the real world, while another shows fictional creatures roaming around the venue.
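To make the idea of switchable “reality channels” more concrete, here is a minimal sketch of how such layers might be modeled in an AR client. The type names, the `ChannelViewer` class, and the switching logic are illustrative assumptions only; they are not part of Lightship ARDK or any actual Niantic API.

```typescript
// Hypothetical model of switchable "reality channels" layered over the real world.
// These types are illustrative assumptions, not Niantic's actual API.

interface VirtualObject {
  id: string;
  label: string;                                        // e.g. "venue guide panel", "rainbow eel"
  position: { lat: number; lng: number; altitude: number };
}

interface RealityChannel {
  id: string;
  name: string;                                         // e.g. "Venue Guide", "Creatures"
  objects: VirtualObject[];
}

class ChannelViewer {
  private activeChannelId: string | null = null;

  constructor(private channels: RealityChannel[]) {}

  // Switch which layer of content is rendered over the camera feed.
  switchTo(channelId: string): void {
    if (!this.channels.some(c => c.id === channelId)) {
      throw new Error(`Unknown channel: ${channelId}`);
    }
    this.activeChannelId = channelId;
  }

  // Only the active channel's objects are shown; the others stay hidden,
  // so multiple realities can occupy the same physical space.
  visibleObjects(): VirtualObject[] {
    const active = this.channels.find(c => c.id === this.activeChannelId);
    return active ? active.objects : [];
  }
}

// Usage: two channels sharing the same venue, viewed one at a time.
const viewer = new ChannelViewer([
  { id: "guide", name: "Venue Guide", objects: [{ id: "g1", label: "Keynote hall sign", position: { lat: 37.78, lng: -122.41, altitude: 2 } }] },
  { id: "creatures", name: "Creatures", objects: [{ id: "c1", label: "Rainbow eel", position: { lat: 37.78, lng: -122.41, altitude: 15 } }] },
]);
viewer.switchTo("creatures");
console.log(viewer.visibleObjects().map(o => o.label)); // ["Rainbow eel"]
```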

A recently released video shows visitors chasing imaginary creatures and placing virtual stickers around the venue.

Reality Channel offers a glimpse of the future of the “Real World Metaverse” that Niantic CEO John Hanke announced in a 2021 blog post. There he proposed a metaverse that layers the virtual world over the real one, rather than a shared, immersive three-dimensional world built with virtual reality (VR) technology.

“Don’t try to change, create”

──Mr. Matsuda, you have presented possible futures of AR in video works such as “HYPER-REALITY”, “Augmented City 3D”, and “Merger”, which depicts a future workplace obsessed with productivity. Against that background, what made you decide to start a design studio?

As a designer, I see everything in the world as the product of people’s daily choices. One option is to create a world of surveillance and advertising, where business models spread the loudest, most divisive voices. Another option is to create a future in which we connect with our environment and with others, and use powerful technology to imagine and create things that give us new perspectives.

This is the vision I had been trying to build inside tech companies for a while. But it was hard to come to terms with the internal politics of a large company, and I became depressed. It was then that my friend God Scorpion of the Psychic VR Lab in Tokyo introduced me to a Zen monk. The monk told me, “Don’t try to change, create.” Those simple words completely changed my approach.

Liquid City is a gathering of adventurers who understand the possibilities of the present and explore paths toward a better future. If I can find such a path, I believe others will follow.


──How did this collaboration with Niantic begin?

I’ve always been a fan of Niantic. Pokemon GO, released in 2016, is still the best-known example of AR in pop culture. Niantic’s games invented a whole new kind of experience, one in which you imagine a parallel reality around you while living in the real world. I find the idea of superimposing multiple realities onto the same space very profound.

Then I went to hear Niantic CEO John Hanke give a talk, and he screened “HYPER-REALITY” without knowing I was in the audience. After learning that he was also a fan of my work, we met when he came to London and again when I visited San Francisco, always looking for an opportunity to work together. That opportunity came when I founded my own studio, Liquid City, and Niantic announced Lightship ARDK, its AR developer platform.


Interacting with virtual creatures

──Did you make any new discoveries in actually building an interactive AR installation as Liquid City?

When we actually started working on Reality Channel, we ran into problems with the narrow field of view of smartphones. Holding a phone up and looking around, it was easy to overlook virtual objects. So we used sound, movement, and composition to draw attention to what was important.

For example, we gave the virtual creatures vocalizations, from the small creatures in the trees and rocks to the giant eels flying through the air. You hear a low growl behind you, turn around, and see an eel weaving through the trees and over the terrace.
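As a rough illustration of that technique, the browser-based sketch below uses the Web Audio API to attach a spatialized growl to a creature positioned behind the listener, so the sound itself points the visitor toward content outside the phone’s narrow field of view. The audio file name and the creature position are made-up placeholders; this is not the installation’s actual code.

```typescript
// Minimal sketch: use spatial audio to draw attention to an off-screen AR creature.
// Assumes a browser environment with the Web Audio API; "eel-growl.mp3" and the
// creature position are hypothetical placeholders, not assets from Reality Channel.

async function playPositionalGrowl(ctx: AudioContext, position: { x: number; y: number; z: number }) {
  // Fetch and decode the growl sample.
  const response = await fetch("eel-growl.mp3");
  const buffer = await ctx.decodeAudioData(await response.arrayBuffer());

  // A PannerNode with HRTF panning makes the sound appear to come from a
  // specific direction, so the visitor turns toward the creature.
  const panner = ctx.createPanner();
  panner.panningModel = "HRTF";
  panner.distanceModel = "inverse";
  panner.positionX.value = position.x;
  panner.positionY.value = position.y;
  panner.positionZ.value = position.z;

  const source = ctx.createBufferSource();
  source.buffer = buffer;
  source.connect(panner).connect(ctx.destination);
  source.start();
}

// Usage: the Web Audio listener faces -z by default, so a positive z places the
// eel behind the visitor; the growl is heard from behind, prompting a turn.
const audioCtx = new AudioContext();
playPositionalGrowl(audioCtx, { x: 0, y: 3, z: 5 });
```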

It was our first time building a full-fledged real-world metaverse, so we tried a lot of other things as well. Elements that are visually simple, carefully placed in exactly the right spots, and that provide real-world value seem to work best.

Video production can be a wonderful tool for thinking, because it lets us think further and more broadly without being bound by current technical constraints. But prototyping is also very important for testing hypotheses and finding new problems and approaches.

──How was the reaction of the visitors?

It was interesting to watch people’s reactions to the installation. The venue was big, so visitors walked around a lot and explored different parts of the building. I often saw groups walking together and pointing things out to one another.

Visitors were also fascinated by the creatures in the virtual world. There were several types, and when you look at them they respond in some way. Everyone enjoyed chasing the creatures, taking pictures of them, and interacting with them. People tend to be drawn to things that look alive, and I’m personally interested in whether such creatures could take on useful functions, like “Siri” or “Alexa.”

I’m inspired by the idea of animism, as in Shinto, where spirits inhabit the world around us. The metaverse of the future may be based on interaction with virtual creatures, rather than just looking at floating information panels.

──Did you see Reality Channel change the way visitors interacted with one another?

Some people come to conferences alone, and some may be too shy even to introduce themselves. In moments like that, I feel an installation like this one, which tickles your sense of play and adventure, can be a catalyst for new encounters.

In the future, information such as mutual friends and shared hobbies could be displayed above people’s heads. You could even attach a profile saying you are recruiting or looking for collaborators. I think that would be very effective at conferences, where many people want to expand their network but don’t know where to start.


──What do you have in mind for the next step?

Reality Channel was the first in a series of prototypes to come. Going forward, I plan to create a number of “fragments” of the Real World Metaverse; in the short term, places such as museums, parks, tourist sites, and sports arenas are good candidates. That will let us develop high-value use cases, refine our design approach, and streamline our development process.

Of course, I would like to prototype with AR glasses, but the hardware is not the important thing; hardware is just a means of accessing the world. What matters to me is the experience itself.

We are currently looking for partners to work with us in demonstrating the incredible potential beyond Reality Channel. Eventually, all the “fragments” will come together, and the “Real World Metaverse” will be born.
