New Network Fallacies


I remember reading articles about how 3G connectivity was going to transform performance and, more generally, the way we used the internet altogether.

I remember how, later on, a common question I would get after giving performance-focused presentations was: “Is any of this going to matter when 4G is available?”

The fallacy that new networks, or new devices for that matter, will fix our performance woes is an old and recurring one.

To be fair, each new generation of network connectivity does bring some level of change and transformation to how we interact with the internet. But it does so slowly, unevenly, and in ways that maybe aren’t what we originally envisioned.

It takes time, money, and other significant resources to roll out support for a new network. We’re not talking months in most places, but years. Inevitably that means it’s going to hit some of the big market areas first and slowly trickle down to everyone else. I don’t even get to pretend that I’m in a particularly remote area, and yet even here, it’s only in the past few months that I’ve been able to connect to a reliable 4G network.

Even if it is 4G (or 5G, or whatever else), that doesn’t exactly guarantee you’ll be getting the amazing, theoretical speeds promised. For one, carriers like to throw around new networks as marketing, and (shockingly) they’re not always 100% honest about it.

AT&T’s “5G E” is a great example. While they technically do state in the description of the technology that it’s not 5G, they still call it 5G E and display it as such on your phone if you connect to it. Embarrassingly, if you use a “5G E-capable” phone on other providers, ones that aren’t pretending to have shipped anything related to 5G yet, those providers’ 4G networks outperform AT&T’s 5G E, according to Open Signal.

This is nothing new. There was all sorts of similar controversy when the first carriers started rolling out supposed 4G networks.

Once a new network does get rolled out, it takes years for carriers to optimize it to try and close in on the promised bandwidth and latency benchmarks.

We’re still nowhere close for 4G. In theory, the maximum downlink speed is 100 Mbps. Compare that, for example, to recent data from Open Signal about actual speeds observed in India. The fastest 4G network clocks in around 10 Mbps, and the slowest around 6.3 Mbps.

And those speeds aren’t constant. A few months earlier, Open Signal reported on the variance of 4G network performance in Indian cities based on the time of day and found that 4G download speeds can be 4.5 times slower during the day than at night.
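
To put those numbers in perspective, here’s a rough back-of-the-envelope sketch (in Python) of how long a single payload would take to transfer at 4G’s theoretical peak versus the real-world speeds Open Signal measured. The 3 MB page weight is an assumption for the sake of illustration, not a figure from any of the reports.

```python
# Rough, illustrative math only: transfer time for one payload at 4G's
# theoretical peak versus the real-world speeds Open Signal measured.
# The 3 MB page weight is an assumed value, not a figure from the reports.

PAGE_WEIGHT_MB = 3  # assumed payload size in megabytes

speeds_mbps = {
    "theoretical 4G peak": 100,
    "fastest Indian 4G network (Open Signal)": 10,
    "slowest Indian 4G network (Open Signal)": 6.3,
}

page_weight_megabits = PAGE_WEIGHT_MB * 8  # convert megabytes to megabits

for label, mbps in speeds_mbps.items():
    seconds = page_weight_megabits / mbps
    print(f"{label}: ~{seconds:.1f}s to transfer {PAGE_WEIGHT_MB} MB")

# And the ~4.5x daytime slowdown Open Signal observed would stretch even
# the "fastest" network's transfer time further during peak hours.
```

Even before accounting for that daytime slowdown, the gap between the theoretical peak and what people actually get is roughly an order of magnitude.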

In other words, new network technologies sound amazing in theory—and certainly do provide substantial benefits—but not for everyone and not at the same pace.

All of this makes me more than a little leery when I read articles like the one the New York Times posted about how they plan to experiment with 5G to push online storytelling further.

Over the past year The Times has honed its ability to tell immersive stories, allowing readers to experience Times journalism in new ways. As 5G devices become more widely adopted, we’ll be able to deliver those experiences in much higher quality — allowing readers to not only view more detailed, lifelike versions of David Bowie’s classic costumes in augmented reality, but also to explore new environments that are captured in 3D.

I’m excited about what this could mean for their readers.

I’m also terrified of what this could mean for their readers.

There’s already a massive, and rapidly growing, divide between the “haves” and the “have nots” online, and I worry about us doing things that will only widen that gap.

Experimentation is great. Moving the web forward has always involved a healthy level of friction between those seeking to push its boundaries and those looking for ways to improve its stability and resilience. There’s a bit of yin and yang involved here for sure.

What worries me is how often that experimentation ends up hurting users. It’s one thing to experiment and test limits, it’s another thing to push those experiments onto people who can’t afford, or don’t have access to, the technology required to use them.

I share Jeremy’s concern:

One disturbing constant in web development is that as network connections and devices improve in speed and quality, we will inevitably eat those gains by shipping more crap in our apps people never asked for.

It echoes one of my favorite quotes from Jeff Veen’s episode of Path to Performance.

…as bandwidth grows, and as processing power grows, and as browsers get better we just keep filling everything up. We often lose track of the discipline of now that bandwidth is faster let’s work on making our sites load faster rather than now we can do more with that available bandwidth.

There’s a scientific name for this: Jevons paradox. Personally, I favor the more approachable—and humorous—Andy and Bill’s Law.

In either case, the meaning is the same: as the efficiency of a resource increases, so does our consumption of that resource. It’s why Uber and Lyft have increased traffic congestion, not reduced it, and it’s why, even with the massive improvements to CPU and network performance over the last few decades, performance is still a business critical issue needing to be addressed.

I’m always happy when we see network technology take a leap forward because I do know that, eventually, billions of people stand to benefit from it. But even as I drool over theoretical promises of those new technologies, I think it’s important to remember that those technologies won’t solve our issues for us. It takes a lot of time, and we still have to do the work ourselves.

Whether we choose, as Jeff Veen said, to focus on how we can use those new technologies to provide a more performant experience or to focus on how we can use them to provide more stuff plays a massive role in determining just how effective those new technologies are ultimately going to be.