The Jakarta Post


The growing importance of low latency connectivity

For a growing number of applications that demand near-instantaneous response times, even a connectivity hiccup of a few milliseconds can mean the difference between success and failure – or life and death – in both the virtual and physical worlds.

Madhusudan Pandya
Singapore
Wed, January 11, 2023


A staff member of the start-up Actronika shows the use of a virtual vest and headset during the Vivatech startups and innovation event in Paris on June 16, 2022. (AFP/Eric Piermont)

Connectivity is certainly not a new concept, but it is one receiving a lot of attention these days as the world continues its steady march online, with increased adoption of cloud-based services and content. In a post-COVID world, many of us are still working, learning, playing, and even socialising from home to some degree.

It comes as no surprise, then, that more people than ever are connected to the online world. This certainly rings true in the Asia-Pacific (APAC) region, with internet penetration rates at the start of 2022 hitting new highs in emerging markets like Vietnam (73 percent), Thailand (78 percent) and Indonesia (74 percent), and remaining consistently strong in mature ones like Singapore (92 percent), Hong Kong (93 percent), Japan (94 percent) and Australia (91 percent), according to DataReportal.

But these days, it's no longer enough just to be connected. To fully participate in the digital world, you need to be able to engage with more experiences, content, and applications in near real-time. That's where reliable, high-bandwidth, high-speed, low-latency connectivity comes in; together, these qualities deliver an acceptable overall quality of online experience.

Much has already been said about bandwidth and speed. But what exactly do we mean when we talk about latency, and how does it factor into today’s networks?

Compared to its more mainstream counterparts, bandwidth and speed, latency took a backseat in conversations about connectivity until recently, when the guarantee of little to no delay while online proved invaluable. Whether it's an important video conference call or a time-sensitive online game, buffering has no place in today's fast-paced, digital-heavy reality. That's where latency comes into play, and the lower and more consistent, the better.

Latency refers to the time it takes for your device, the internet, and everything in between to respond to an action you take (such as clicking on a link) – that is, the time it takes for data to travel from its origin to its destination and back. Closely linked to latency is jitter, which refers to the consistency – or lack thereof – of latency across the network. While high latency can be incredibly disruptive, unpredictable and erratic jitter can be equally frustrating.
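The distinction can be sketched in a few lines of code. In the hypothetical illustration below, latency is summarised as the average of a set of round-trip-time samples, while jitter captures how much consecutive samples vary (the mean absolute difference between consecutive samples – a deliberate simplification of the interarrival jitter defined in RFC 3550):

```python
import statistics

def summarize_latency(samples_ms):
    """Given round-trip-time samples in milliseconds, return
    (mean latency, jitter). Jitter is taken here as the mean
    absolute difference between consecutive samples -- a simple
    proxy for how consistent the delay is."""
    mean = statistics.mean(samples_ms)
    jitter = statistics.mean(
        abs(b - a) for a, b in zip(samples_ms, samples_ms[1:])
    )
    return mean, jitter

# A mostly steady connection with one 80 ms spike:
mean_ms, jitter_ms = summarize_latency([20, 22, 19, 80, 21])
```

With these samples the mean latency is a modest 32.4 ms, yet the jitter exceeds 31 ms: the single spike is what a user experiences as an erratic connection, even though the typical delay is low.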


A low-latency connection experiences minimal delays and delivers seamless connectivity. In everyday situations like online gaming and content streaming, lag is a nuisance. However, for a growing number of applications that demand near-instantaneous response times – from high-speed financial trading and online role-playing games through to emerging technologies such as driverless vehicles and remote surgery – even a connectivity hiccup of a few milliseconds can mean the difference between success and failure – or life and death – in both the virtual and physical worlds.

While minor hold-ups may not seem like major issues in the moment, they often snowball into huge complications down the line. For example, in the volatile context of high-frequency financial trading, a slight hiccup could cause a trader to fill an order at a much higher share price than initially desired. The latency issue has even birthed a whole trading strategy – latency arbitrage – in which investors on ultra-low-latency networks capitalise on the minor price differences in a stock that arise from the time disparity between them and other market participants. This practice has significant market impact: a 2021 Bank for International Settlements study estimated global profits from latency arbitrage at about US$5 billion annually.

More than just a “nice-to-have”, ultra-low latency has now become a fundamental consideration that informs how we design, implement, and envision networks. So, what does this look like in practice?

Widespread ultra-low latency connectivity is not unrealistic, especially considering recent developments in networking that significantly reduce lag. For example, 5G latency can be ten times lower than that of 4G networks, with some 5G networks recording delays as low as four milliseconds.

Physical location also plays a key role in reducing latency: closer proximity to end-users, both humans and machines, means data travels shorter distances along simpler paths. This is one reason behind the increasing number of data centres sprouting up across APAC. Hosting content physically close to the region being served reduces the number of network hops data must traverse before reaching end-users, thus reducing overall latency.

Evidently, network operators have already started taking steps to make widespread low-latency, high-bandwidth connectivity a reality. A prime example is Southern Cross's NEXT cable, which has become the third route between Australasia and the United States in the Southern Cross network ecosystem. NEXT will boost the capacity of the entire ecosystem almost five-fold, to around 100 terabits per second. To put this into perspective, transferring an estimated 10 billion social media photos would take just over 300 seconds!
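That figure is easy to sanity-check with back-of-the-envelope arithmetic. The sketch below assumes an average photo size of roughly 400 KB and the full ~100 Tb/s in use – both illustrative assumptions, not figures from the cable's specifications:

```python
CAPACITY_BITS_PER_S = 100e12   # ~100 terabits per second across the ecosystem
PHOTO_BYTES = 400_000          # assumed average photo size (~400 KB)
PHOTOS = 10_000_000_000        # 10 billion photos

total_bits = PHOTOS * PHOTO_BYTES * 8
seconds = total_bits / CAPACITY_BITS_PER_S
print(f"{seconds:.0f} seconds")  # prints "320 seconds"
```

Under these assumptions the transfer takes 320 seconds, consistent with "just over 300 seconds"; a larger assumed photo size would scale the time proportionally.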

For communications service providers (CSPs), there are economic benefits in reducing latency in their networks. Consider the economic potential promised by what is expected to be a major beneficiary of ultra-low-latency connectivity: the metaverse. Early estimates of the size of the metaverse opportunity range enormously, from US$750 million to US$13 trillion by 2030, presenting a major new market for CSPs to tap into by leveraging technology like 5G fixed wireless access to deliver the ultra-low-latency experience that virtual and augmented reality require.

In these times of rapid progress, latency has become the latest commodity on the market – and rightly so. Governments, businesses, and communities everywhere are looking to future-proof their networks. This means they’ll need to choose solutions that can keep pace with a swiftly evolving online world that has no patience for delays and lags.

The good news is that there are already solutions that CSPs are constantly developing, improving, and bringing to market. Building ultra-low latency networks gives CSPs the prime opportunity to differentiate their offerings to end-users, ultimately introducing new revenue streams for the business while giving them the chance to exceed customers’ expectations and build loyalty along the way.

 ***

The writer is senior advisor of international market development at Ciena.
