Crack the 40 (MHz Wide Channel) Open, Homie and Guzzle (the Bandwidth Available Over) It

Everybody likes high Wi-Fi speeds.  Because high Wi-Fi speeds mean that the channel is being used more efficiently (often false).  An efficient channel means that there's more available throughput (only in sterile test environments) and more available throughput means that more users can be supported concurrently (completely wrong).

Unfortunately, high Wi-Fi speeds sometimes (all the time) come at a cost.  To get higher Wi-Fi speeds, wider channels must be used (which makes the Wi-Fi suck).  Using wider channels means that fewer channels will be available (plus it ups minimum RSSI requirements, which just about guarantees a bad design).  It is therefore essential that wireless professionals analyze the environment and carefully choose whether to use 40 MHz or 80 MHz wide channels (or they could stop wasting everyone's time and just stick to 20 MHz channels).

But this blog post isn't about choosing the correct channel bandwidth (although it should be, because tons of Wi-Fi goes bad because wide channels are configured).  This blog post explains how Wi-Fi devices use 40 MHz and 80 MHz channels, and how spectrum analyzers sometimes make channel bonding look worse than it actually is (which it kills me to say, because using 40 MHz or 80 MHz channels is almost always bad).

I post this blog begrudgingly.  I would prefer to just say, "fuggetaboutit, use 20 MHz channels" and leave it at that.  I would prefer to just write a bunch of stuff about how the 802.11 Working Group should've made 802.11ac a standard for residential Wi-Fi and how a separate 802.11 amendment should've been created for enterprise Wi-Fi.  And I would prefer to go on a big ol' rant about how frustrating it is that 40 MHz/80 MHz channels deliver higher throughput numbers, but that channel bonding makes Wi-Fi failures (disconnected users, lack of Internet access, etc.) far more likely.  So, it really, really pains me to have to write this blog post where I defend 40 MHz/80 MHz channels against some of the negative comments I've seen written about them.

Here's the thing: I want this blog to be about LEARNING.  I believe that if the people who work in Wi-Fi really understand Wi-Fi, then better Wi-Fi will happen.  And so I feel compelled to explain why spectrum analyzers give a deceptive picture of how efficiently 40 MHz and 80 MHz channels are used.

Here's a look at my 40 MHz channel during a large file download:


My wireless router is set to channel 48/-1, meaning that I am using a 40 MHz wide channel covering channels 44 and 48, with channel 48 being the primary 20 MHz channel.
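
If you want to check the math on that "48/-1" notation, here is a minimal Python sketch.  It assumes the standard 5 GHz numbering where channel N is centered at 5000 + 5*N MHz, and it assumes "/-1" means the secondary 20 MHz channel sits 20 MHz below the primary:

def bonded_40mhz(primary_channel, secondary_offset):
    # 5 GHz channel numbers step by 4 for each 20 MHz (e.g. 36, 40, 44, 48)
    secondary_channel = primary_channel + 4 * secondary_offset
    primary_center = 5000 + 5 * primary_channel      # MHz
    secondary_center = 5000 + 5 * secondary_channel  # MHz
    bonded_center = (primary_center + secondary_center) / 2
    return secondary_channel, bonded_center

secondary, center = bonded_40mhz(48, -1)
print(f"primary 20 MHz channel: 48, secondary: {secondary}, 40 MHz center: {center} MHz")
# -> primary 20 MHz channel: 48, secondary: 44, 40 MHz center: 5230.0 MHz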

If you look closely, you can see that channel 48 is slightly more dense than channel 44.  If I had been in an environment with a large number of devices, the difference would have been more pronounced.  The 20 MHz of spectrum for channel 48 would have shown much more activity than the 20 MHz for channel 44.

When Spectrum Analysis Goes Wrong

Wi-Fi folks sometimes make the mistake of believing that using 40 MHz channels (or 80 MHz channels, for that matter) causes frequency space to be "underused".  The theory is that since the density/duty cycle is higher on the primary 20 MHz channel, the non-primary 20 MHz channel(s) must be getting wasted to some degree.

People who get channel usage wrong always seem to follow the same reasoning.  They think that some of their users have devices that only support 20 MHz channels.  The idea is that when those 20 MHz-only devices are active, the unused 20 MHz (or 60 MHz, in the case of 80 MHz channels) goes to waste.

In reality, modern Wi-Fi devices support 5 GHz channel widths wider than 20 MHz.  The last year that saw 20 MHz-only support in 5 GHz devices was 2011.  That's a long time ago, and chances are 99%+ that lack of 40 MHz/80 MHz channel support is not the reason why spectrum density & duty cycle look lopsided.

The Protocol Analyzer Shows Why

The real reason that 40 MHz/80 MHz channel usage looks lopsided is control and management traffic.  Modern Wi-Fi devices send & receive all management frames and control frames using a 20 MHz channel (this is probably because 20 MHz channels are more reliable than 40 MHz/80 MHz channels, WHICH IS WHY IT PAINS ME TO DEFEND THEM IN THIS BLOG).

An OmniPeek capture of my primary 20 MHz channel shows it clearly:


Less than 1% of the 20 MHz traffic on my primary channel is data traffic.  

At the same time, my full 40 MHz channel shows plenty of data:


For my test, I had only one device active on channel 48.  In the real world, with dozens of devices on channel 48, the result would be the same: tons of non-data traffic on the primary 20 MHz channel, but almost no data traffic there.
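
If you don't have OmniPeek handy, you can do a rough version of the same tally with Python and the scapy library.  This is just a sketch: it assumes a monitor-mode capture saved as capture.pcap (a hypothetical filename) and only counts frames by their 802.11 type:

from collections import Counter

from scapy.all import Dot11, rdpcap

frames = rdpcap("capture.pcap")   # hypothetical monitor-mode capture file
counts = Counter()
for frame in frames:
    if frame.haslayer(Dot11):
        counts[frame[Dot11].type] += 1   # 0 = management, 1 = control, 2 = data

labels = {0: "management", 1: "control", 2: "data"}
total = sum(counts.values()) or 1
for frame_type in sorted(counts):
    name = labels.get(frame_type, "extension")
    print(f"{name:>10}: {counts[frame_type]:6d} ({100 * counts[frame_type] / total:.1f}%)")

On a capture of the primary 20 MHz channel like mine, you'd expect that breakdown to be dominated by management and control frames.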

The reason a spectrum analyzer gives deceptive information about Wi-Fi is that it only shows raw channel activity.  A spectrum analyzer has no ability to differentiate between data and non-data traffic.  When data frames occupy the channel, the spectrum analyzer sees the entire 40 MHz/80 MHz in use.  When management or control frames occupy the channel, only the primary 20 MHz gets used.  Hence the lopsided density/duty cycle readings when analyzing a 40 MHz/80 MHz channel for raw spectrum activity.
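
To make that concrete, here is a toy model of what the analyzer "sees".  The airtime numbers are made up purely for illustration; the only rule baked in is the one described above (management/control frames touch only the primary 20 MHz, data frames touch the full 40 MHz):

WINDOW_US = 1_000_000  # one second of observation

# (frame category, total airtime in microseconds over the window) -- made-up numbers
traffic = [
    ("management", 60_000),   # beacons, probe responses, etc.
    ("control",    40_000),   # ACKs, RTS/CTS, block acks
    ("data",      300_000),   # A-MPDUs carrying the file download
]

primary_busy = sum(airtime for _, airtime in traffic)                       # everything touches the primary
secondary_busy = sum(airtime for cat, airtime in traffic if cat == "data")  # only data touches the secondary

print(f"primary 20 MHz duty cycle:   {100 * primary_busy / WINDOW_US:.0f}%")    # 40%
print(f"secondary 20 MHz duty cycle: {100 * secondary_busy / WINDOW_US:.0f}%")  # 30%

The channel looks lopsided, but the "extra" activity on the primary is overhead that would exist anyway, not wasted data capacity.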

Using a spectrum analyzer to view 20 MHz wide channels will show deceptive information, too.  The total channel space will show density/duty cycle numbers that make it seem like the spectrum is being used more efficiently, but that's not the case.  The "more efficient" channel usage would just be the additional management and control traffic that comes when two APs use 20 MHz channels in place of a single AP using a 40 MHz channel.  The amount of channel time available for data (which is what really matters) would remain the same.
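
Here is a back-of-the-envelope sketch of that comparison.  The beacon size, rate, and interval below are assumed typical values, not measurements from my capture:

BEACON_BYTES = 300       # assumed beacon frame size
BEACON_RATE_MBPS = 6     # beacons go out at a low basic rate
BEACONS_PER_SEC = 10     # ~102.4 ms beacon interval
PREAMBLE_US = 20         # OFDM preamble + SIGNAL field

beacon_us = PREAMBLE_US + (BEACON_BYTES * 8) / BEACON_RATE_MBPS   # ~420 microseconds
beacon_duty = beacon_us * BEACONS_PER_SEC / 1_000_000             # fraction of each second

# Beacon duty cycle on each 20 MHz slice of the same 40 MHz of spectrum.
layouts = {
    "one 40 MHz AP":  {"lower 20 MHz": 0.0,         "upper 20 MHz (primary)": beacon_duty},
    "two 20 MHz APs": {"lower 20 MHz": beacon_duty, "upper 20 MHz": beacon_duty},
}
for layout, slices in layouts.items():
    detail = ", ".join(f"{name}: {100 * duty:.2f}%" for name, duty in slices.items())
    print(f"{layout}: {detail}")

The two-AP layout "fills in" the quieter slice, but only with a second beacon stream (plus its probe responses, ACKs, and so on); the spectrum available for data frames is 40 MHz either way.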

All That Having Been Said

40 MHz/80 MHz 802.11n/802.11ac Wi-Fi channels are just as efficient as 20 MHz channels (maybe even more efficient), but narrower channels are still best for most enterprises.  20 MHz channels leave more channels available (and therefore fewer devices contending on each channel), and they let data get through at weaker signal levels, because a 20 MHz channel's noise floor is 3 dB lower than a 40 MHz channel's.  Those things matter a lot.  They matter a lot more than hitting a high number on a throughput test.  So, if good Wi-Fi is the goal, ignore this blog's defense of 40 MHz/80 MHz channels and stick to 20 MHz.
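
If you want the arithmetic behind that noise-floor point, here it is (thermal noise floor = -174 dBm/Hz + 10*log10(bandwidth in Hz), ignoring receiver noise figure for simplicity):

import math

for width_mhz in (20, 40, 80):
    noise_floor_dbm = -174 + 10 * math.log10(width_mhz * 1_000_000)
    print(f"{width_mhz} MHz channel: thermal noise floor ~ {noise_floor_dbm:.0f} dBm")
# -> roughly -101, -98, and -95 dBm

Every doubling of channel width raises the noise floor by 3 dB, so a wider channel needs a stronger signal just to hold the same SNR (and the same data rate).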

***
If you like my blog, you can support it by shopping through my Amazon link.  You can also donate Bitcoin to 1N8m1o9phSkFXpa9VUrMVHx4LJWfratseU or to my QR code:

Twitter: @Ben_SniffWiFi
ben at sniffwifi dot com
