The edge on speed: Infrastructure for the immersive Internet

14 Feb 2024

Category: Tech matters


How do we envision the Internet of the future? Popular culture — from the Holodeck in Star Trek to the Matrix — offers us a glimpse of the immersive Internet, a digitally engineered three-dimensional virtual space, where digital twins of ourselves will interact in an environment so authentic that it’s hard to distinguish what’s ‘real’ from what’s programmed. There have already been many advances on the path to the Internet of sci-fi blockbusters, but realizing a true Holodeck will still require further evolution of the technology.

To enable a truly immersive tactile Internet, end-user technology needs to be developed and gain widespread acceptance, and the digital infrastructure needs to be put in place to transport immense quantities of data at the lowest possible latencies.

For science fiction to gradually resolve into science fact over the next couple of decades, each of us will need to be well-equipped with sophisticated gadgetry, which will be interconnected in a fine mesh of high-performance networks to ensure the synchronized and seamless delivery of video, audio, sensory, and cyber-locational data to and from our physical locations.

Our experience of the tactile Internet will revolve around our own digital twin interacting with the digital twins of both other people and real-world objects, as well as with completely virtual objects and environments. Our experience will be powered by Virtual Reality (VR), but not only via headsets — the tactile Internet will require, at the very least, wearables conveying sophisticated and subtle variations in sensory information, including but not limited to pressure, heat, texture, and aroma. The data streams providing this information will need to be synchronized with video streams representing three-dimensional space, as well as with audio streams. Here, even the slightest delay on one of the data streams would lead to a disjunction, taking away from the authenticity and the pleasure of the experience.

Programming the real world

The high-definition virtual 3-D representation of real-world spaces will be based on multiple video feeds, providing a 360° perspective, including multiple altitudes. Sensors in wearables and brain-computer interfaces (BCIs) — be they embedded chips or non-invasive sensors in headsets — will track our intentions and actions and represent these as movement. This information will be used to control the video feed sent to each user. Tactile information relating to physically handling or moving objects (shaking the hand of a business partner or remote-controlling a vehicle) will also need to be seamlessly incorporated into the data being fed to each user or device.

These data streams will all need to be synchronized to the millisecond with one another, with the help of AI, to avoid the stomach-churning sense of vertigo that comes from turning your head and the world spinning dizzily in its attempt to catch up. Now, remember, all of these data streams need to be delivered live — meaning they will require low latency and significant bandwidth.

Current cloud gaming and cloud VR give us a glimpse of what this demand will be. Even today, it’s generally accepted that a latency of 20 milliseconds (ms) is the maximum for a good gaming experience. The more interactive the scenario, the lower the latency needs to be. For 360° 8K cloud VR, this will need to drop below 10ms.

From the bandwidth perspective, cloud gaming requires 35-50Mbps as an absolute baseline. For VR, with increasing video resolution and a frame rate of 90 frames per second (FPS) — recommended as the minimum for avoiding cybersickness — it is now foreseeable that an excellent VR experience in 8K resolution will demand bandwidth upwards of 1 Gigabit per second (Gbps) for interactive exchanges. Remember, this is for one user. If the kids are also online downstairs, you can do the sums yourself on the kind of connectivity you’ll need in a few years.
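As a rough sanity check on that 1Gbps figure, here is a back-of-the-envelope sketch in Python. The resolution, frame rate, bit depth, and compression ratio are illustrative assumptions rather than measured values, but they show how an 8K, 90 FPS stream lands in the region of 1Gbps once compressed.

```python
# Back-of-the-envelope estimate of the bandwidth an 8K, 90 FPS VR stream
# might demand. All inputs are illustrative assumptions, not measurements.

WIDTH, HEIGHT = 7680, 4320    # 8K UHD frame
FPS = 90                      # minimum frame rate recommended against cybersickness
BITS_PER_PIXEL = 24           # assumed 8-bit RGB before compression
COMPRESSION_RATIO = 75        # assumed codec efficiency (HEVC/AV1-class)

raw_bps = WIDTH * HEIGHT * FPS * BITS_PER_PIXEL
compressed_bps = raw_bps / COMPRESSION_RATIO

print(f"Raw video:        {raw_bps / 1e9:.1f} Gbps")         # ~71.7 Gbps
print(f"Compressed video: {compressed_bps / 1e9:.2f} Gbps")  # ~0.96 Gbps
```

And that is for the video stream alone, before adding audio, haptic, and positional data, or a second user on the same connection.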

The cyber-physical continuum, a seamless transition and interaction between the physical and digital worlds, is in the process of being created. We can see the beginnings of this continuum in Nintendo’s Wii game console — but this entertaining bridge between the physical and virtual is just the start.

We already have the basis for the experiential side of existing VR and Augmented Reality (AR) technologies, with every new generation offering new opportunities. Exoskeletons will enable us to carry out actions in harsh environments via telemanipulation. Non-Fungible Tokens (NFTs) provide a mechanism to verify ownership of digital assets — including, for example, our digital identity, thus protecting our digital twin from being misused by third parties. Whether we will have a specific room in the home/office that will act as the portal to our digitalized lives, or whether we will be able to carry it with us as a hyped-up form of extended reality (XR) is yet to be seen. Probably both of these scenarios will exist, either at different stages of technological evolution or in parallel for different use cases.

Thus, we will be able to shake hands and socialize with people in cyberspace, experience gaming on a whole new level, and hug our children goodnight, regardless of where we are. What more will we be able to do? We’ll need to wait and see. There is, however, one caveat — even the existing applications are already stretching the capabilities of current-day hardware and infrastructure, while the new ones will be even more complex to manage.

The challenge of distance — getting as close as possible to the user

So much for the big-bucks, luxury consumer technology at the front end; it’s what’s behind the scenes that will make or break this cyber future — the digital infrastructure intelligently storing, analysing, processing, exchanging, multiplexing and synchronizing, and delivering streams of data to end-user devices or straight to your BCI. And doing so within a few milliseconds, at most.

Innovations in haptic VR require extremely low latency — down to 1ms or even 0.5ms for some use cases. The pure physics of the speed of light means that interactions dependent on latency in the region of a millisecond will only be possible with people and objects in the near vicinity — around 80km (calculated for processing time and Round-Trip Time (RTT), assuming 5G and fibre, the most direct route, and the lowest-latency interconnection).

So, the use cases here would be limited to localized tasks such as remotely controlling robots for clean-up work, rescue operations in hazardous environments, or performing highly complex specialized surgery remotely from a centralized hospital within the local region. For less critical use cases, such as interpersonal interactions, entertainment and gaming, and virtual shopping, we can assume a variety of applications in the tactile Internet requiring a more pragmatic latency of between 5 and 15ms, making a distance of approximately 400 to 1,200km to the data centre practicable.
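To see how these distances fall out of the speed of light, here is a small illustrative sketch. The propagation speed (roughly 200,000km/s in fibre) and the fixed processing overhead are assumptions; real networks add access, queuing, and routing delays on top, which is why the practical figures above are somewhat more conservative.

```python
# Convert a latency budget into a rough maximum distance to the far end,
# assuming light in fibre travels at ~200,000 km/s (about two-thirds of c)
# and a fixed processing overhead. Illustrative assumptions only.

SPEED_IN_FIBRE_KM_PER_MS = 200.0   # ~200,000 km/s, expressed per millisecond

def max_distance_km(latency_budget_ms: float, processing_ms: float = 0.2) -> float:
    """Maximum one-way fibre distance for a given round-trip latency budget."""
    propagation_ms = latency_budget_ms - processing_ms   # time left for the round trip
    return (propagation_ms / 2) * SPEED_IN_FIBRE_KM_PER_MS

for budget_ms in (1, 5, 15):
    print(f"{budget_ms:>2} ms budget -> up to ~{max_distance_km(budget_ms):.0f} km")
# 1 ms    -> ~80 km, matching the localized use cases above
# 5-15 ms -> ~480-1,480 km, in the same region as the 400-1,200 km figure
#            once real-world access and routing overheads are factored in
```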

Easy, you say; we can already achieve high bandwidth and low latency in a 5G campus — as low as 1ms RTT. But a standalone 5G island is nothing like the bigger picture of digital infrastructure connecting enterprise networks and end-user networks with clouds, data centres, the Internet backbone, and the rest of the world. Indeed, the tactile Internet is likely to be a killer application for 5G and beyond 5G (B5G) networks, as well as for Wi-Fi 6 and 7, as long as other infrastructure components are further developed to match the performance of these wireless generations.

Digital infrastructure will need to get much, much closer to the user. This will require edge computing on another scale — container data centres in every neighbourhood and pizza-box servers in basements — and a high-bandwidth last mile, be it 5G or FTTH/B, as well as greatly densified access to interconnection infrastructure. It’s basically the edge on speed.

The future Internet will demand technology neutrality

We’ve got a long way to go to achieve this level of digital infrastructure. Using existing and new network technologies, a densely interconnected mesh of infrastructure will be necessary to support the use cases of the immersive Internet. We will need vastly more data centres than we currently have, with fibre and 5G networks serving end users across the board. And not just in the mega-hubs; smaller regional cities around the globe will also need their own highly scalable interconnection infrastructure to achieve such low-latency, high-bandwidth environments.

At the same time, there is a need to preserve the openness and neutrality of the Internet and to ensure flexibility for users. We have already seen what happens when interoperability and portability are not built into the system — just look at the concerns companies have had about vendor lock-in when it comes to cloud services. It was the foundation of openness, interoperability and standardization that enabled the Internet to develop as quickly as it has. So, we will need standardization on multiple levels to achieve interoperability, and we will need a highly diverse and technologically neutral infrastructure landscape to ensure openness.

The power of a truly immersive tactile Internet will lie in allowing for the interoperability of systems, perceptions, and more, and for the fluidity of assets, digital twins, and so on, between different spaces within the first-order infrastructure that makes up the Internet.

There will not be one single ‘metaverse’ as often discussed today, in the same way that there is not one single cloud. Instead, we will see multiple universes that will need to have common protocols and be based on common understandings of data management, to be able to interact with each other. To make this a reality, the spaces between these universes will need to be interconnected in just as fine a mesh of infrastructure as the spaces within these universes.

The connectivity fabric for the tactile Internet

In the early stages of the Internet of the future, all network technologies — even potentially 3G mobile and copper cables in some regions — will be needed, working together to achieve the necessary density of coverage. Over time, there will be no choice but to upgrade the older technologies to much higher bandwidth and low-latency technologies to support increasingly immersive applications. ISPs will need on-net caching of content and environments ready to serve end users, much like streaming providers do today. Data centres will be needed to house AI applications and digital twins and cache larger environments much closer to the user.

High-performance interconnection platforms will be needed, in the long run possibly every 50-80km, keeping local traffic local and offering not only 400 Gigabit Ethernet (GE) connections but also 800GE and (after more than two decades of GE) Terabit Ethernet (TE) connections, to meet the hunger for bandwidth that new immersive applications will generate.

The next-generation Internet Exchange Points (IXPs) that will facilitate the exchange of data at the lowest of latencies will be fully automated, secured with the latest encryption technology, and extremely resilient. These interconnection platforms will need to be data centre and carrier-neutral, to bring a critical mass of digital infrastructure players together to work in concert to interconnect the different universes that develop. Interconnection platforms are the magnets that will draw networks and data centres together, enabling networks to connect directly, and keeping the latencies between users, content, and resources as low as possible.

In this way, we can weave together the connectivity fabric of the future — the intelligent interconnection of devices, data flows, clouds, and data centres at low latency and high bandwidth.

However, if we take 5-15 milliseconds as our standard for general-purpose applications in the immersive Internet, this will also pose limitations on interactions over greater distances. Not including processing time, the best possible RTT for data travelling halfway around the world is over 130 milliseconds, to say nothing of the lag that will be experienced when communicating with people on the Moon or Mars.
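That 130ms figure is straightforward to verify: the sketch below takes half the Earth’s circumference at the vacuum speed of light as a lower bound, and does the same for the average Earth-Moon distance. Real paths over fibre and routers will be considerably slower.

```python
# Lower bounds on round-trip time set by the speed of light in vacuum.
# Real fibre paths are longer and slower, so actual RTTs will be higher.

SPEED_OF_LIGHT_KM_S = 299_792
HALF_EARTH_KM = 20_037      # half of Earth's circumference
EARTH_MOON_KM = 384_400     # average Earth-Moon distance

def best_case_rtt_ms(distance_km: float) -> float:
    """Best-case round-trip time in milliseconds over a given distance."""
    return 2 * distance_km / SPEED_OF_LIGHT_KM_S * 1000

print(f"Halfway around the Earth: {best_case_rtt_ms(HALF_EARTH_KM):.0f} ms RTT")       # ~134 ms
print(f"Earth to the Moon:        {best_case_rtt_ms(EARTH_MOON_KM) / 1000:.2f} s RTT")  # ~2.56 s
```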

With significant delays in transcontinental or interplanetary communication in the coming decades, different applications will be necessary. It may still be possible to shake hands at this distance, but we will continue to experience the lag we know so well from video conferencing today. But perhaps, by then, we will have conquered the challenge of distance — for example, through future generations of quantum network technology. The future is an exciting place — and we’re building the foundations for it today.


