Introduction
One of the wireless networking stories at this year’s Las Vegas CES – aside from the scads of networkable DVD players and “media adapters” – was the battle for bragging rights to the highest throughput “starburst” number. (The “starburst” is the number prominently displayed on the front of a product’s box).
As of CES, all the major wireless chipmakers’ entries – Atheros’ Super G, Broadcom’s Afterburner and GlobespanVirata’s Nitro XM – are now public, with the companies touting maximum raw data rate numbers of 108, 125 and 140Mbps respectively.
The Atheros Super-G Need To Know took an in-depth look at the controversy that Broadcom has attempted to stir up over what has now become not just a potential wireless LAN killer (at least if you believe Broadcom), but a first-to-market competitor to Broadcom’s own technology.
In this article, we’ll take a look at GlobespanVirata and Broadcom’s entries into the battle to soup up your 802.11g network’s speed.
GlobespanVirata’s Nitro XM
I spent over an hour with numerous GlobespanVirata folks getting the pitch and poking at a demo they had set up in their hotel suite. But before I get to the demo, let me share what I learned about what makes Nitro XM tick.
Nitro XM builds on Nitro’s packet bursting mechanism, adding Compression, Concatenation and something GlobespanVirata calls Direct Link (Figure 1).
Figure 1: Nitro XM Direct Link explained
Direct Link had me confused when I first read the company’s press release because of this statement:
PRISM Nitro XM reduces costly overhead throughput loss by creating an automatic, secure, high-speed link directly between wireless clients when they are within communication range of each other. This allows clients to communicate directly with each other at speeds of up to 140 Mbps while maintaining a simultaneous network connection with the AP.
After my time with GlobespanVirata, I’m now unconfused. Direct Link is not the same as 802.11 ad hoc mode. Direct Link Stations (STAs) must associate with a Nitro XM-enabled AP and stay associated even when Direct Linking. (STAs listen to 802.11 beacons and receive WEP or WPA key information from the AP – that’s the “simultaneous network connection”). But when it comes time to move data, the transfer does take place directly from STA to STA, not retransmitted through the AP.
Nitro XM delivers enhanced throughput between any two radios, including the radio in a Nitro XM-enabled AP. So wireless STA to STA throughput will be the same as STA to (Ethernet) LAN – assuming equal signal conditions. This “one-hop” method gives Nitro XM an advantage over all other throughput enhancement methods, which require all communication to be retransmitted through an AP or wireless router, costing a “hop” (and throughput loss) in STA-to-STA transfers.
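To see why skipping the “hop” matters, here’s a quick back-of-the-envelope sketch (my own illustration, not GlobespanVirata’s math): when STA-to-STA traffic is relayed through the AP, every frame eats airtime twice on the same channel, so the effective rate works out to roughly half of what a single good link could deliver.

```python
# Rough illustration (my numbers, not GlobespanVirata's) of the "lost hop"
# penalty when STA-to-STA traffic is relayed through the AP: each frame
# crosses the shared channel twice, once per hop.

def relayed_throughput(sta1_to_ap_mbps: float, ap_to_sta2_mbps: float) -> float:
    """Effective throughput when every frame is sent twice on one channel."""
    return 1.0 / (1.0 / sta1_to_ap_mbps + 1.0 / ap_to_sta2_mbps)

def direct_throughput(sta_to_sta_mbps: float) -> float:
    """Direct Link case: each frame goes over the air only once."""
    return sta_to_sta_mbps

# Hypothetical link throughputs, just to show the shape of the math
print(relayed_throughput(20, 20))  # ~10 Mbps - the classic relay penalty
print(direct_throughput(20))       # 20 Mbps
```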
The other not-so-obvious benefit of Direct Link comes from the higher throughput that should be obtained between two closely-spaced STAs, especially when they are both located some distance from their associated AP. Since the main data flow is directly between STAs, they should be able to communicate at a higher speed instead of having to fall back to a lower transmit rate. Although it’s an unusual approach, this allows Direct Link to provide effective range (more accurately throughput vs. range) enhancement.
Nitro XM’s weakness is that a major portion of its throughput enhancement comes from the compressibility of the data being sent. GlobespanVirata made it clear that the demo used a best-case-compressible data file. In real multimedia applications, which use highly compressed data, Nitro XM will provide little to no throughput gain over normal Nitro. You’ll also see no throughput gain when sending encrypted data or statically compressed (zipped, etc.) files over the network. Then why do they call it Xtreme Multimedia? I’m still trying to figure that one out… Note that Atheros’ Super-G technology also relies on data compression and will provide less enhancement with already-compressed files.
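The compressibility issue is easy to check for yourself. The sketch below is a generic demonstration of lossless compression behavior (it uses zlib, not Nitro XM’s actual compressor): a repetitive payload shrinks dramatically, while random-looking data, which is what encrypted, zipped and multimedia files look like on the wire, barely budges.

```python
# Generic demonstration of why compression-based speed boosts depend on the
# data being sent: repetitive payloads compress well, while already-compressed
# or encrypted payloads (statistically random) don't. This uses zlib purely
# for illustration; it is not Nitro XM's actual compression scheme.
import os
import string
import zlib

# Best-case payload, in the spirit of the demo's repeated A-Z, 0-9 test file
compressible = ((string.ascii_uppercase + string.digits) * 5000).encode()

# Stand-in for multimedia, zipped or encrypted data: random bytes
incompressible = os.urandom(len(compressible))

for label, payload in [("repetitive text", compressible),
                       ("random / already-compressed", incompressible)]:
    packed = zlib.compress(payload)
    print(f"{label}: {len(payload)} -> {len(packed)} bytes "
          f"({len(packed) / len(payload):.1%} of original)")
```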
It’s important to note (and I would get a nasty-gram from GlobespanVirata if I didn’t) that Nitro XM doesn’t use any form of channel bonding, and so shouldn’t interfere with neighboring WLANs. Nitro XM STAs and APs also constantly look for non-XM STAs and skip the fancy stuff whenever non-XM STAs are associated with a Nitro XM-enabled AP.
Got all that? On to the demo then…
Speed Limit 70Mbps
Figure 2 shows the test setup that GlobespanVirata had in its hotel room. Please excuse the quality of the picture, as I’m still getting used to using my new mini digital camera, which lacks auto-focus.
Figure 2: Nitro XM test setup
On the left is a two-laptop setup using two NETGEAR WG511T 108Mbps Cardbus cards and a WGT624 108Mbps router, both based on Atheros’ Super-G technology. The right side has two laptops using PRISM reference design cards running through a PRISM reference design router, all of which were Nitro XM-enabled.
Figure 3: Nitro XM Test Result
Figure 3 shows the results of a LAN Mark XT throughput run, using a highly compressible data file (A-Z, 0-9 repeated multiple times). The results show that Nitro XM really can produce the claimed 70Mbps throughput, without the channel bonding that has made Atheros’ Super-G the bad boy of throughput enhancement technologies. (The two lines in the graph are receive and transmit throughput and the Y axis scale is 10Mbps per line.)
To see if the Super-G WLAN was able to play as nicely with Nitro XM as it did with other PRISM-based products, I asked the demo master to run both the Super-G and XM setups at the same time.
Figure 4: Nitro XM with Super-G running
Figure 4 fuzzily shows (the vertical scale is the same as in Figure 3) that although the Nitro XM network showed significant throughput variation, the Super-G network didn’t come anywhere near shutting it down. I also took a screen shot of what happened to the Super-G network, but it really didn’t come out. The upshot, though, was that the Super-G network showed similar throughput variation due to interference from the Nitro XM network, but again, no network shutdown.
These results might seem to contradict GlobespanVirata’s claim that Nitro XM “does not cause interference to other networks in the area”. But as I showed in the Atheros Super-G NeedToKnow, any two wireless networks placed as closely as the two in this setup will interfere with each other, no matter what flavor they are.
The other “factoid” I learned was that XM has UDP optimization built in. This is a good thing, since multimedia streams are UDP-based. I saw a quick animated-block demo that showed throughput over 200Mbps with Nitro XM vs. about 40Mbps when run on the Super-G setup.
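For anyone curious why that matters, the snippet below is a generic illustration (not GlobespanVirata’s demo code) of the UDP streaming pattern: datagrams get fired off with no connection setup and no per-packet acknowledgments, so the sender never stalls waiting for retransmissions. The address, port and chunk size are all hypothetical.

```python
# Generic illustration of why multimedia streaming rides on UDP: datagrams
# are sent without connection setup or per-packet acknowledgments, so the
# sender never stalls waiting for retransmissions. Address, port and chunk
# size below are hypothetical, not from the Nitro XM demo.
import socket

MEDIA_CHUNK = b"\x00" * 1316   # 7 x 188-byte MPEG transport packets, a common payload size
DEST = ("192.168.1.50", 5004)  # hypothetical receiver

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
for _ in range(100):
    sock.sendto(MEDIA_CHUNK, DEST)  # send and move on; no ACK to wait for
sock.close()
```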
That’s all for Nitro XM. Next, I’ll take a look at what Broadcom’s Afterburner has to offer.
Broadcom’s Afterburner
I got the sense during my time in Broadcom’s private room at CES that Afterburner isn’t quite yet ready for prime time. The company made no announcement of the technology at CES, and searches I made just now on both the Broadcom and 54g.org websites turn up empty. But since Broadcom-customer Buffalo Technology introduced its WHR2-G54 125Mbps router at CES, the show had to go on!
I wasn’t able to get many details about Afterburner’s component parts, other than a confirmation that it also doesn’t use channel bonding techniques and won’t be the “bad neighbor” that they accuse Super-G of being. It’s safe to say, though, that most of Broadcom’s “secret sauce” comes from stripping out as much 802.11 overhead as possible, while still looking out for non-Afterburner STAs.
I did get to see a demo – sort of. The noisy RF environment at the show wasn’t being kind to Broadcom, and they were having a hard time running both the Afterburner and Super-G “bad neighbor” demos. Since I didn’t want to go away empty-handed, I begged the Broadcom engineer giving the demo for something to show what sort of throughput Afterburner could deliver.
Fortunately, he had done some test runs at home where the air was clean and free, and the results are shown in Figure 5.
Figure 5: Broadcom Afterburner test run
This shot of a Chariot throughput run (the vertical axis spacing is 3Mbps) shows average Afterburner-enhanced throughput of about 34Mbps. Since two different programs were used – with different test files being transferred – you can’t directly compare these results with the Nitro XM results in Figure 3.
These results could be more directly compared with the Chariot-based Super-G testing that I did as part of the Atheros Super-G NeedToKnow article. If you take that leap of faith, then Afterburner looks like it provides throughput enhancement similar to Super-G, but without Super-G’s potentially disruptive channel bonding.
What are these guys really selling?
Without having run my own tests, and with both Nitro XM and Afterburner not yet in their first release, I’m reluctant to draw any which-product-is-better conclusions from what I saw. Instead, I’ll offer my view on this whole fixation with throughput number inflation.
The race for higher throughput is allegedly being made in the name of every consumer’s desire to have multiple video streams wirelessly beamed to all rooms of their homes. Supposedly the demand is there, just waiting for technology to make it all possible.
In reality, the 2.4GHz band where 802.11g lives is a lousy place to stream a video signal. Heck, as tons of folks who went out and bought wireless networking stuff this past Christmas found out, it can be hell to even get data networking running reliably! There are simply too many gadgets competing for too little spectrum and higher throughput doesn’t do anything to solve this problem.
Another problem that higher throughput doesn’t solve is product returns. Every networking product maker I talk to says their wireless products have the highest return rates. The products are not coming back because they’re too slow, but because they don’t work at all, don’t work reliably, or are too hard to get working. Once again, higher throughput doesn’t solve any of these problems. (And think of this return rate problem the next time you see a press release from a wireless chip manufacturer trumpeting all the chips they’ve sold…)
So why is everyone selling throughput? A better question might be why are you buying? Because you, dear consumer, have shown the industry that when they put a bigger number on the box, you’ll make those boxes fly off the shelves! In an industry where the normal driver – business – has kept its collective wallet clamped shut for the past three years, manufacturers are just trying to stay in business. And since – fueled by easy money from mortgage cash-out refinancings – consumers have kept buying, that’s where networking companies (and chip manufacturers that supply them) are aiming their marketing engines.
It will be interesting to see if manufacturers change their focus given the hopeful signs that businesses are once again starting to buy. But the impression I got from my conversations at the show was that pumped-up 11g was for multimedia streaming and not for expanding the capacity of heavily used office WLANs – at least that’s what the current pitch is. This is probably because business won’t touch non-standard products unless there’s a very compelling story that outweighs the risk. But consumers are less concerned with standards compliance, and so far, folks have been just lapping this stuff up.
So think of those poor little streaming video bits, yearning to deliver the latest Simpsons’ episode to your 42 inch plasma. Isn’t all this wireless speed the way to get them safely to their destination?
What do you really need?
Although video streaming does need higher throughput than audio, it certainly doesn’t need 100Mbps+! As the rule-of-thumb table below shows, standard resolution TV can stream very nicely at 5Mbps, and HDTV resolution streams can run over a wide range of speeds, with the “sweet spot” around 20Mbps.
Standard     | Application                        | Bit rate
MP3, WMA     | Audio                              | 28 – 500kbps
MPEG 2       | Standard TV (480i)                 | 3 – 5Mbps
MPEG 2       | DVD / HD (720p, 1080i)             | 6 – 25Mbps
MPEG 4, DivX | Portable video, “VHS-quality” TV   | 64kbps – 4Mbps

Table 1: Multimedia Formats and Bit rates
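To put the table in perspective, here’s a quick budget check using those rule-of-thumb numbers (the available throughput figure is hypothetical, not a measurement of any particular product): even 25Mbps of real, usable wireless throughput covers an HDTV stream, or a handful of standard-definition streams, with room to spare.

```python
# Quick stream-budget check using the rule-of-thumb bit rates from Table 1.
# The available_mbps figure is a hypothetical "usable throughput" number,
# not a measured result for any particular product.
STREAM_MBPS = {
    "HDTV (720p/1080i, MPEG 2)": 20.0,   # the "sweet spot" cited above
    "Standard TV (480i, MPEG 2)": 5.0,
    "Audio (MP3/WMA)": 0.5,
}

def streams_that_fit(available_mbps: float) -> dict:
    """How many streams of each type fit, considering one type at a time."""
    return {name: int(available_mbps // rate) for name, rate in STREAM_MBPS.items()}

print(streams_that_fit(25.0))
# roughly: 1 HDTV stream, or 5 SD streams, or dozens of audio streams
```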
Any consumer entertainment product manufacturer seriously working on wireless streaming video (and any consumer who has experienced the “freedom” of wireless networking) knows that wireless communications are by nature highly unreliable. These industry giants know they’d better have systems that can deliver acceptable pictures under very adverse signal – and therefore throughput – conditions, or the products will stay on the shelves.
The really smart ones – including Panasonic and Toshiba – are pursuing IP-over-coax and other wired delivery systems in addition to wireless. Coax is likely to be present in any room where serious, i.e. high definition, video watching will take place, and wires will always beat wireless for reliable delivery of high-speed data. Makers of high performance streaming video products also realize that they’ll probably need to move to 802.11a to get the clearer spectrum and larger number of channels needed to deliver high-definition video, or multiple lower definition streams.
The real performance issue is to maintain high (enough) data rate over the desired range. For residential applications, the problem is not so much long range, as it is a relatively “closed” environment that tends to quickly knock down signal strength, or bounce signals around like crazy, creating lots of multipath.
One way to get a high data rate is to start out high and expect to lose speed relatively quickly. This seems to be the approach that all the 802.11g throughput enhancements are taking. Another approach would be to start out at a high enough rate for the intended application (with some margin thrown in for buffer) and maintain that rate with only slight degradation over the target area of coverage. This second approach focuses more on range improvement than boosting maximum data rate.
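As a thought experiment, the toy model below (my own made-up numbers, not anyone’s measured data) shows how the two philosophies play out: the radio that starts near its 70Mbps peak looks great up close, but the one that starts at a more modest rate and holds it wins at the edges of the coverage area.

```python
# Toy model (made-up numbers, not measured data) contrasting the two design
# philosophies: start very fast and fall off quickly, versus start at what
# the application actually needs and hold that rate across the coverage area.

def fast_but_fragile(distance_m: float) -> float:
    """Starts near 70 Mbps but loses speed quickly as the signal drops."""
    return max(1.0, 70.0 - 1.5 * distance_m)

def steady_and_sufficient(distance_m: float) -> float:
    """Starts around 25 Mbps but degrades only slightly over the target area."""
    return max(1.0, 25.0 - 0.2 * distance_m)

for d in (5, 15, 30, 45):
    print(f"{d:>2} m: fast-start {fast_but_fragile(d):5.1f} Mbps | "
          f"steady {steady_and_sufficient(d):5.1f} Mbps")
```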
So why aren’t wireless chip manufacturers working on improving range? Maybe it’s because networking companies got so burned by the range claims they used to prominently feature on their product packages. The average wireless networking product buyer may not be able (or inclined) to check throughput specs, but they sure can take out a tape measure and check distance claims. In many cases, though, a tape measure wasn’t even needed, since some products didn’t even make it from one room to another!
Strangely enough, the only manufacturer I see focusing on range performance improvement is the one probably most burned by its distance claims – Atheros. 802.11a still has a “doesn’t go as far” reputation from the poor distance performance of Atheros’ first generation 11a products. Those products didn’t have the range of 802.11b gear, cost more, and buyers left them on the shelves.
Somewhere in Atheros, though, a hard lesson was learned, because from my testing, Atheros-based products using their current-generation chipsets maintain higher throughput over longer ranges than any of their competitors, and do it in both the 2.4GHz (11b and g) and 5GHz (11a) bands. Atheros still needs to get this message out, however. You can read their whitepaper (PDF) on the subject, or my report if you’d like an independent view.
Although Atheros is catching crap on Super-G for the interference problems that are sometimes caused by its channel bonding feature, they could shut channel bonding off and still have the best 11g radio for video streaming due to the superior radio performance of their current generation chipset. And this is before they enable the XR (eXtended Range) technology which promises to flatten and extend the throughput vs. range curve even further for both 2.4GHz and 5GHz bands.
Conclusions
Afterburner should be out shortly in Buffalo Tech’s 125Mbps product, but other Broadcom customers – most notably Linksys – are mum on their plans to pump up the throughput of their 11g product lines.
GlobespanVirata would say only that the first Nitro XM-enabled products would be out this quarter, but declined to say from which of their customers. My guess is that SMC or D-Link would be first, but both companies have already introduced (Atheros-based) products with the 108Mbps number proudly displayed. NETGEAR also seems content to lead the Super-G charge, having introduced a new Atheros-fueled multimedia router at CES.
All of this raises the question of how much confusion the poor consumer can stand. Although the Wi-Fi mark was supposed to be the guiding light for confused wireless networking product buyers, it seems to be running the risk of becoming somewhat irrelevant. Many companies aren’t bothering to submit products for certification, and when they do, the detailed Wi-Fi approval box (with the check marks) is relegated to the back of the product box.
Instead, all the major consumer companies have developed their own system of wireless product marking to attempt to make it clear to the consumer what it is that they’re buying. But with 11, 22, 44, 54, 108, 125, 140 (and ya gotta love D-Link’s “15x” attempt at consumer guidance) to choose from, will anyone less than an industry expert be able to sort it all out?
My advice?
- It’s too early to call a winner in the “enhanced 802.11g” race. If you must buy the 100Mbps+ hype that the networking vendors are selling, buy stuff from the same manufacturer. Implementations are immature and changing rapidly, and it’s more likely than not that even products with the same “starburst” on the box won’t work properly together.
- Don’t count on wireless as the only distribution strategy for any home. I’m in the process of building a new home and you can be sure that I’ll have regular CAT5, two RG-6 or RG-59 coax and a phone line run to any location where I’ll be doing serious video watching!
- Stop chasing throughput alone and concentrate on range vs. throughput if you really want the best performing products. Unfortunately, manufacturers won’t help you in sorting this out. You’ll need to check newsgroups, product reviews and knowledgeable friends for help.
- If you’re serious about wireless video streaming, give 11a a shot. Better yet, get dual-band gear to give you the most flexibility, including the ability to separate data and multimedia traffic into different bands. Unfortunately, manufacturers still need to add dual-band USB 2.0 adapters and wireless-to-Ethernet adapters to their catalogs to make this possible on the client end.