

'The Metaverse Isn't Just Sci-Fi': Jack Shaw On How Extended Reality Will Transform Business Operations


Jack Shaw is a visionary AI speaker working with Champions Speakers Agency. With nearly 40 years of expertise in Artificial Intelligence (AI), change management, and transformational leadership, Jack has become one of the most trusted voices in digital innovation today.

In this exclusive interview with The Champions Speakers Agency, Jack breaks down the impact of the metaverse on modern business – from immersive training and virtual collaboration to the infrastructure required to make shared digital realities a practical tool for companies worldwide.

With his trademark clarity and insight, Jack explores what businesses can expect as extended reality technologies mature, and how leaders can prepare for a new era of interconnected operations.

How do you envision the evolution of the metaverse and extended reality technologies impacting enterprise innovation, operational efficiency, and cross-border collaboration over the next decade?

Jack Shaw: Now, the metaverse ties into those concepts I mentioned briefly: extended reality, which covers both augmented reality and virtual reality.

So let’s clarify what those terms mean:

Virtual reality is when you put on a headset and see images and hear sounds that, in many cases, are not actually present in the real world, at least not immediately in front of you. It could be that you are communicating with somebody at a great distance, but through virtual reality it looks to you (and perhaps to them, if they're doing the same thing) like you're sitting in the same room with each other.

So it's not that virtual reality is always conveying something non-existent or untruthful. It can also be used, as we're seeing, for things like game playing and role-playing games, which a lot of people enjoy, and for making those experiences seem much more realistic.

Augmented reality is when your glasses overlay information on the actual outside world that you're experiencing. And we're going to see these glasses very quickly moving from giant goggles that prevent you from seeing the outside world to something like more traditional-looking eyeglasses.

And it could be anything from, as you're walking down the street looking for, say, a clothing shop, having something pop up in your field of vision, via your glasses, that points out to you: "on the next block on the left there's a clothing shop of the type you're looking for."

To even more sophisticated forms, which we'll see in the future; it may take 10, 15, or 20 years before this kind of technology reaches that level of maturity. But imagine walking down the street in a historic environment and having all of the buildings and locations you pass appear as they would have appeared 500 years ago, or even a thousand years ago.

Or maybe even the people walking past you in the street appear attired in the clothing styles they might have worn a hundred years ago, or 500 years ago, or what have you. Which, as you can imagine, would have tremendous potential both for education, for people trying to learn about history, and for general interest, for people who want to experience that history.

But these tools can also be used for very practical business applications. For instance, one of the applications I've seen virtual or augmented reality used for most frequently is training people to do things such as maintain a particular product.

Let me tell you a story that I often use in some of my presentations when it’s appropriate.

So we have a part that needs to be installed on an airplane — and it’s a new fuel injector that needs to be installed in one of the engines of the airplane. And we have a maintenance engineer who is familiar with installing fuel injectors, and as it happens, she is the only person available to install this fuel injector while this airplane is parked at the gate, getting ready to leave within the next hour.

And while she's very familiar with the general process, she hasn't happened to install this particular type of fuel injector on this particular engine in this airplane.

But fortunately, the fuel injector comes with digital instructions. So she puts on her augmented reality goggles, and the instructions for exactly what she needs to do to install this fuel injector are projected onto the engine and walk her through it step by step:

  • First, remove this cover.
  • Then, loosen this part.

The goggles actually project visual images of what she needs to do next. Because she is a knowledgeable and trained aircraft maintenance engineer, even though she hasn't worked with this particular injector before, she's able to install it properly and get it done on a timely basis.

And the end result is that the new fuel injector goes into place, and the aircraft can take off on time and safely. We don't have any delays, costly for both the airline and the passengers, while we wait for somebody who already has personal experience installing this particular type of fuel injector on this particular type of engine.

Training people on facilities and equipment can also be done very well this way.

The city of Austin, Texas, for example, uses virtual reality. They have a vehicle called an AMBUS, which is short for ambulance bus. Here's how it differs from a regular ambulance: a typical ambulance, which is about the size of what we in the United States call an SUV (or used to call a station wagon) or a van, will have maybe two beds on which people who need emergency medical treatment can be treated while they're en route to, say, a hospital.

An AMBUS is designed to respond to potential larger-scale disasters and will have up to a dozen beds on it. But they only have one of these AMBUSes, and they can’t afford to take the AMBUS itself out of service to train the dozens and dozens of emergency medical personnel that would need to operate or work on this AMBUS.

So how do we get these people trained?

They developed a virtual reality training process where these people can put on virtual reality goggles and walk through a room that is safe and of the right size and space, but that looks to them like they're actually inside the AMBUS. People can open and close doors and drawers. They can see where various supplies are located and where the different beds will be: where they would find bandages, medications of various types, or what have you, that they might need to use.

So if and when they're assigned to the AMBUS, even if it's the first time they've ever set foot on it and they have to help respond, they're already familiar with the layout: how it works, where everything is. That's an excellent example of using extended reality technology to train people.

Now, what the metaverse concept is about is making that much broader: allowing multiple people in multiple locations around the world to have a sense of shared space or shared reality.

And the challenge to doing that well is that it’s going to require:

  • Massive amounts of computing power
  • Massive amounts of communications bandwidth

…so that as things change in real time, people can interact with one another or with the internal systems, whether it's a group of people playing a multiplayer role-playing game of some kind, or a group of people coordinating the development and implementation of some change to the real world that requires people in multiple times and places to respond in a coordinated way.

So metaverse technology — where your interface with what people call the metaverse, which could be a representation of the real world or it can be a representation of an imaginary world — is mediated through:

  • Visual
  • Audio
  • Eventually, even touch-based (haptic) technologies

…which let your body feel things like grasping an object or wind blowing, and that sort of thing. That's going to become more and more powerful and effective over time.

I personally think it's going to take 10, 15, or 20 years for this technology to fully emerge.

But the massive increases we're seeing in computing power and communications bandwidth, brought about to facilitate artificial intelligence among other things, are also going to have other beneficial effects going forward, enabling tools like extended reality and the metaverse, blockchain technologies, and Internet of Things technologies.

The end result is: I am absolutely convinced that we’re going to see more changes in the next 10 to 15 years in our world, in our lives, than we have seen in hundreds — perhaps even thousands — of years.

This exclusive interview with Jack Shaw was conducted by Mark Matthews of The Motivational Speakers Agency.

Benzinga Disclaimer: This article is from an unpaid external contributor. It does not represent Benzinga’s reporting and has not been edited for content or accuracy.

 
