Listen, some of you need to hear this.

When properly configured and in normal situations (i.e., all equipment works as it should), Wi-Fi can't beat a wired network connection.

The very nature of how Wi-Fi handles network connectivity works against it for low-latency applications.
There's a lot more that goes into keeping a connection established and maintained, since it basically requires two devices to broadcast out into the air and for both of those devices to also be listening for each other's broadcasts.
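You don't have to take my word for it, either. Here's a quick Python sketch that pings your router and prints the summary lines; run it once on Wi-Fi and once plugged in. It assumes a Unix-style `ping` and that 192.168.1.1 is your gateway (it might not be; swap in yours):

```python
# Rough latency check: ping your router and compare the results on
# Wi-Fi vs a wired connection. Assumes a Unix-like `ping` binary and
# that 192.168.1.1 is your gateway address (swap in your own).
import subprocess

GATEWAY = "192.168.1.1"  # assumption: your router's LAN address

def ping_stats(host: str, count: int = 20) -> str:
    """Run ping and return its summary (packet loss + round-trip times)."""
    out = subprocess.run(
        ["ping", "-c", str(count), host],
        capture_output=True, text=True, check=True,
    ).stdout
    # The last two lines of ping's output hold the packet-loss and
    # round-trip-time summaries.
    return "\n".join(out.strip().splitlines()[-2:])

if __name__ == "__main__":
    print(ping_stats(GATEWAY))
```

The average and jitter (mdev/stddev) numbers are usually the tell: wired sits low and flat, Wi-Fi bounces around.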

Being open-air also exposes it to more interference.
Certainly, wired connections aren't immune to interference, particularly if you're using a poor, unshielded cable, but that's not a particularly common issue nowadays.

But a lot of things mess with wireless broadcasts: other wireless broadcasts, device saturation. A microwave.
A wireless router basically pushes everything it gets from the internet out into the air.

That's why wireless security matters: anyone in range can see everything on the network, because you can't direct who receives what. Individual devices do the work of decrypting what's meant for them.
In the end, it comes down to the amount of work involved.
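Here's a toy Python sketch of that idea. This is nothing like how WPA2/WPA3 actually work, and it assumes you have the third-party `cryptography` package installed; it just shows the core point, which is that every device hears the same bytes but only key-holders can read them:

```python
# Toy sketch of why wireless encryption matters. NOT real WPA --
# just the idea: everyone receives the broadcast, but only devices
# holding the network key can turn it back into data.
from cryptography.fernet import Fernet, InvalidToken

network_key = Fernet.generate_key()  # shared by the router and your console

def broadcast(plaintext: bytes) -> bytes:
    """The router encrypts before chucking the frame into the air."""
    return Fernet(network_key).encrypt(plaintext)

frame = broadcast(b"match state update")

# Your console has the key, so it can decrypt the frame:
print(Fernet(network_key).decrypt(frame))  # b'match state update'

# Your neighbor's laptop hears the exact same radio waves, but
# without the key the frame is just noise to it:
try:
    Fernet(Fernet.generate_key()).decrypt(frame)
except InvalidToken:
    print("neighbor sees ciphertext only")
```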

So basically, if you use a wire from your router to your console, data comes into the router and gets sent down the line to your console, and vice versa. That's it. Packet loss is minimal by design and interference is low.
With wireless, once data gets to your router, it has to

- get encoded
- get bundled in with other data
- get chucked into the air with some level of redundancy/repetition

Then your device has to

- find all the pieces of your data in the air
- rebuild them
- decode them

This is a lot slower when radio waves are the medium than when you're receiving what is basically a direct electrical signal over a cable, because, again, everything causes interference.
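If it helps to see the shape of all that, here's a toy Python sketch. The framing is made up and the lossy "air" is fake; this is not actual 802.11, just the steps from the lists above:

```python
# Toy model of the wireless round trip described above. NOT real
# 802.11 -- just the shape of the work: chunk the data with
# redundancy, lose some of it to a noisy medium, then find, dedupe,
# and reassemble the pieces on the other side.
import random

CHUNK = 8        # bytes per chunk (made-up number)
COPIES = 3       # redundancy: every chunk gets sent three times
DROP_RATE = 0.3  # fake interference: ~30% of transmissions vanish

def router_side(payload: bytes):
    """Encode: split into numbered chunks and blast each one COPIES times."""
    parts = [payload[i:i + CHUNK] for i in range(0, len(payload), CHUNK)]
    air = [(seq, len(parts), data) for seq, data in enumerate(parts)] * COPIES
    random.shuffle(air)  # the air doesn't care about your ordering
    return [frame for frame in air if random.random() > DROP_RATE]

def device_side(air):
    """Decode: find the pieces, collapse duplicates, rebuild in order."""
    if not air:
        return None
    total = air[0][1]
    pieces = {seq: data for seq, _, data in air}
    if len(pieces) < total:
        return None  # a chunk never made it: time to ask for a resend
    return b"".join(pieces[i] for i in range(total))

msg = b"press A to jump"
result = device_side(router_side(msg))
print(result if result else "lost a whole chunk -- hey wait, can you repeat that?")
```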

Someone turned on a TV? Someone making phone calls? Jimmy wants a Hot Pocket!
The extra steps it takes to encode and decode radio waves, the effort it takes to successfully retrieve them, and the possibility of corruption while they're in the air make Wi-Fi simply worse when latency and stability are critical to your application.
Meanwhile, a cable just fuckin. It sends some shit down. And you get that shit. And then you send shit back.

And you can resend shit super goddamn fast, so even if the router goes "hey wait, can you repeat that?" you've already sent it again and they've already received it.
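For a ballpark feel, here's a little Python back-of-the-envelope. The round-trip times and loss rates are pure assumptions I made up for illustration, not benchmarks:

```python
# Back-of-the-envelope: what a retransmit costs on each medium.
# All numbers below are illustrative assumptions, not measurements.
def expected_delivery_ms(rtt_ms: float, loss: float, tries: int = 5) -> float:
    """Expected time to deliver one packet within `tries` attempts.
    Each failed attempt burns one round trip before the resend.
    (Ignores the vanishingly small chance that every try fails.)"""
    expected = 0.0
    for attempt in range(tries):
        p_here = (loss ** attempt) * (1 - loss)  # fails `attempt` times, then lands
        expected += p_here * (attempt + 1) * rtt_ms
    return expected

# Assumed numbers: wired LAN ~0.3 ms RTT with ~0.1% loss,
# Wi-Fi ~3 ms RTT with ~2% loss on a noisy channel.
print(f"wired: {expected_delivery_ms(0.3, 0.001):.3f} ms")
print(f"wi-fi: {expected_delivery_ms(3.0, 0.02):.3f} ms")
```

Even with identical loss handling, the wire's tiny round trip means a retransmit barely registers; over Wi-Fi, every do-over costs real milliseconds.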
Don't get me wrong, Wi-Fi is a technological miracle, and it rules, and being able to shitpost on Twitter while you're taking a dump is something you never would've convinced 1995 me would ever be a thing.

But that's kind of what Wi-Fi is for: stuff that isn't latency-sensitive.
Let me put it this way: the guy scalping and reselling tickets for a living isn't using Wi-Fi at the main office if he can avoid it. Chances are he's plugged straight into the internet so updates are goddamn instant when he hits F5, and he pays for Ultra Fuck You internet.
Anyways, tl;dr.

Wi-Fi always adds latency to a connection and exposes it to a lot more interference. That's unavoidable; the protocol is designed around mitigating it, not eliminating it.

Wired is much more stable and resistant to interference by nature and doesn't have that overhead.