Yeah I totally get that. My greater point is that developers can't even really develop for low-latency, high-bandwidth applications because those applications don't exist. I live a weird life. At work, we've got a server. It's got two-thirds of a petabyte of storage. And we fill it up every year. Heinous, heinous stacks of NewsCutter. And we move stuff around on 10GigE. I literally cannot think of an application that a normie would use that can tap into that kind of bandwidth. Speed is power, power is shortened battery life, and if you can watch eight 4K streams at a time but can't get your battery to make it to the end of the day, you're going to cut back to one 4K stream, which will fit on 4G.
I can think of plenty of things, but they're all stupid. It's a ton of "Eight 4K Streams at the SAME TIME" ideas that don't need to exist but fucking will when 5G happens. I think we'll see some ridiculous IoT things with the low latency, but they'll be stupid and useless and shiny and new. I think there are big, snazzy, unimportant strides to be made with location tracking and remote automation/control. Ultimately, it'll all be shit that we do right now but with fewer wires. I might be missing your point here, but that's my prediction.
Right, this is my point: all the applications for "shittons of bandwidth" are "shittons of video," and 4K out of Netflix is 15 Mb/s. Verizon 4G is 5-12 Mb/s all day long with peaks of up to 50 Mb/s. Even those numbers are conservative.

So here's the thing. Let's say you're gonna do "eight 4K streams AT THE SAME TIME." Do you have an 8K cell phone? Really? Can you see the pixels? And do you want to pay for the bandwidth?

Because I've got an app on my phone that will gladly stream every goddamn camera I have at my phone all at once. They're not all 4K, but they're all HD. And there are thirteen of them. And yer damn skippy that the server they're all talking to transcodes that shit down to one composite screen that fits the resolution of my phone, and the bottleneck is, of course, passing the cameras to the phone. I have another app that will show me all the imagery I want on another set of cameras. It politely lets me know what my bitrate is. And here I am, in my house, looking at my four HD cameras at once, and none of 'em are breaking 1 Mb/s. Live streaming video on a brand new Samsung over WiFi. Even dumber: on my phone, I can't tell the difference between an SD stream and an HD stream until I zoom in... and the SD streams are a paltry 60 kb/s.

I edit video for cinema and I can't saturate Cat5e. I work for a place with 80-odd HD cameras recording to 12 HD decks uncompressed, and we're throwing it all around on 10GigE. I've seen 8K/240. It's like looking through a goddamn window. Rec.2020 - the shoot-the-moon video spec - using the dumbest, most uncompressed, jurassic tech, can cross 50 Gb/s.

So yeah - there it is. If you want to throw 8K bitmaps across your channel 240 times per second, yer gonna need 5G. But you're also gonna have to pay for it, and that, I think, is where everyone will reef the fuck back down to 1 Mb/s streams.

First, a bit of context. The current average download speed for 4G phones across the US is about 35 Mbps (megabits per second). As with all averages, however, there is a big range of speeds that makes up that figure. For example, using a modern 4G smartphone, like the Samsung Galaxy Fold, iPhone XS Max, or Google Pixel 4, inside my home at the north end of Silicon Valley on AT&T's network, I regularly see average download speeds of 120-135 Mbps, or 4x faster than the national average. Peak speeds - which aren't sustainable over time - can go even higher.
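If you want to sanity-check the streaming math above, here's a rough back-of-the-envelope sketch in Python. It's purely illustrative: the 15 Mb/s Netflix figure comes from the thread, while the 10-bit 4:2:0 assumption for "uncompressed" Rec.2020 and the helper names are mine.

```python
# Rough bandwidth math for the figures above. Assumptions (mine, not the
# original poster's): Netflix 4K ~= 15 Mb/s, and "uncompressed" Rec.2020
# means 10-bit 4:2:0 video with no compression at all.

def compressed_total_mbps(streams, per_stream_mbps=15.0):
    """Total bitrate for N simultaneous compressed streams, in Mb/s."""
    return streams * per_stream_mbps

def uncompressed_gbps(width, height, fps, bit_depth=10, chroma_factor=1.5):
    """Raw video bitrate in Gb/s (chroma_factor: 1.5 = 4:2:0, 2 = 4:2:2, 3 = 4:4:4)."""
    return width * height * chroma_factor * bit_depth * fps / 1e9

print(compressed_total_mbps(8))                      # 120.0 Mb/s: eight 4K Netflix streams
print(round(uncompressed_gbps(7680, 4320, 120), 1))  # 59.7 Gb/s: 8K/120 already crosses 50 Gb/s
print(round(uncompressed_gbps(7680, 4320, 240), 1))  # 119.4 Gb/s: 8K bitmaps 240 times a second
```

The takeaway matches the argument in the thread: eight simultaneous 4K streams is only about 120 Mb/s, which a good LTE connection can already carry, while genuinely uncompressed 8K is roughly a thousand times that.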