Real world performance vs. SNB's benchmarks

dhatcher

New Around Here
Would it be considered normal if your wireless performance results don't match what is benchmarked here on SNB by Tim?

For example, in his test/review of the Netgear Nighthawk, Tim shows roughly 240 Mbps download and upload speeds on the 5 GHz band.

I purchased a Netgear Nighthawk and tested this by using two laptops to transfer a 4.5 GB file locally across my Wi-Fi network. I only got approximately 1/3 of the speed shown in the benchmark here on SNB (about 85 Mbps).
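For reference, here is a quick sketch of the arithmetic behind that figure (it assumes decimal units, i.e. 1 GB = 8000 megabits; adjust if your file manager reported GiB, and the transfer times shown are illustrative, not measured values):

Code:
# Sanity check: convert a timed file transfer into Mbps.
# Assumes decimal units (1 GB = 8000 Mb); the transfer times below
# are illustrative, not measured values.

FILE_GB = 4.5
file_megabits = FILE_GB * 8000  # 4.5 GB -> 36,000 megabits

def throughput_mbps(transfer_seconds):
    return file_megabits / transfer_seconds

print(throughput_mbps(424))  # ~85 Mbps -> about 7 minutes for this file
print(throughput_mbps(150))  # 240 Mbps -> the same file in 2.5 minutes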

My environment is a Lenovo X1 Carbon pulling the file from my 2014 13" MacBook Pro (so I believe that's a 2x2 NIC in the X1 and a 3x3 NIC in the MBP). Both are on the 5 GHz band and both are approximately 4 feet from the router with no obstructions between them.

I played with some of the settings on the router (mostly changing the 5 GHz channels), but nothing seemed to change the results much. So I'm curious whether this is expected because of real-world interference (though in theory there should be less of it on 5 GHz, and I used inSSIDer to check who else is on 5 GHz with their router in my area. Nobody :) ).

Any help is greatly appreciated!
 
Hi,
SNB tests things with a controlled methodology and a fixed set of equipment in a consistent environment as the baseline. Of course real-world results will be different, but SNB's results are a reliable reference for comparing products. By analogy: when you buy a new car there is a sticker stating gas mileage, but in real-world driving no one actually achieves the MPG figure on the sticker.
 
Benchmarks may indicate what should be expected in the real world (not as absolute numbers, but as relative differences), but actually achieving them would be extraordinary in my experience.


I would suggest testing with more than 4' between the clients and the router as you're too close. In my experience, 12' to 30' (still line of sight) gives the optimum throughput in a relatively noise-free environment.

I would also try changing the orientation of the laptops minutely (around 65 to 75 degrees from one edge of the screen to the router, vs. the laptop screen being 'flat' toward the router's antennas) and making sure the laptops are both AC powered.


The biggest thing, though, is that you're using two clients (both wireless and of different capabilities - does either of them even have an SSD to test from?), whereas Tim's testing is normally to a single wireless client.


In summary, what you're seeing may be 100% in line with what should be expected given the differences in the testing setup. But I still think it is low.


Try giving the router and the clients a little more room to fully develop the 'donut' the radio waves are creating, and see if minute changes in client and/or router positioning change the outcome significantly.

You may also want to put the router as high as possible (especially if you're in a basement, close to 'ground').


In one customer's office I visit often, putting my laptop's monitor's right-side edge almost pointing directly at the router (about 80 degrees, actually) gives me more than double the throughput (from a wired computer), from 15 MB/s to over 35 MB/s, with an Intel 7260 AC card and their RT-AC56U router with RMerlin 374.41 firmware.



Laptop screen orientation seen from top:

[diagram mangled in posting; see the corrected version below]

Hope my crude diagram helps illustrate the orientation that works for my equipment.
 
Thanks guys! I do understand real-world results would be different; I just thought my results were pretty far off. L&LD, I will give your suggestions a try and see what happens. Both laptops have SSDs.
 
SSDs will give better, or at least more consistent, results, I'm sure.

I see my crude diagram got mangled, let me try again (the spaces were stripped).


Laptop screen orientation seen from top:

.\
...\
....\
.....\
......\


and, leave about 12 feet or more (line of sight)


-----------|....|....| (these indicate the antenna)
-----------===== (and Router orientation).
 
To start, you'd be looking at half the throughput, as you are pushing data from one laptop wirelessly up to the router and then back down to the other laptop. So really, your results are only around 20-30% lower than expected, not 3x slower.
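A quick back-of-the-envelope check of that claim (the 240 Mbps and 85 Mbps figures come from the posts above; the halving is the wireless-to-wireless penalty):

Code:
# Back-of-the-envelope: how far off is 85 Mbps once you account
# for the wireless-to-wireless halving?

benchmark_mbps = 240   # SNB single-client benchmark (from the review)
measured_mbps = 85     # laptop-to-laptop result reported above

expected_mbps = benchmark_mbps / 2   # the radio must receive, then resend
shortfall = 1 - measured_mbps / expected_mbps

print(f"expected ~{expected_mbps:.0f} Mbps, got {measured_mbps} Mbps "
      f"({shortfall:.0%} below expectation)")
# -> expected ~120 Mbps, got 85 Mbps (29% below expectation)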

A lot depends on the clients, router, what you are doing, etc.

I've seen results more than 30-40% better than what SNB shows on some routers... but again, a lot is up to the clients and base stations in question.
 
Sorry I did not see this thread earlier.

As azazel points out, running benchmarks between wireless clients cuts available throughput in half. This is because each packet must be received, then retransmitted using the same radio.

The "240 Mbpish" throughput you cite is the average of the four "location" values that can be viewed in the Performance Table. A more granular view of throughput vs. attenuation (signal level) is available in the "profile plots". The number in the bar charts for these benchmarks is the average of all the values shown in the Performance vs. Attenuation plots.

In either view you can also see the best-case / strong-signal throughput (down / up); those values are even higher than the averages.

However, aside from the wireless-to-wireless loss, the second-biggest factor influencing your throughput is your clients. The Charts measurements are made with an AC1750 class client. Your measurement has one AC1300 class and one AC867 class client.

The IxChariot composite plot from this article (attached below) shows throughput almost cut in half when the same router is tested with AC1300 and AC867 class clients.
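As a rough model of how the two different clients combine over the wireless-to-wireless path (the PHY link rates are the class maximums; the ~55% PHY-to-TCP efficiency and the airtime-sharing formula are simplifying assumptions, not SNB measurements):

Code:
# Rough model of the laptop-to-laptop path. Assumptions, not
# measurements: TCP throughput ~55% of the PHY link rate, and the
# router's single radio splits its airtime between receiving from
# one client and retransmitting to the other.

PHY_EFFICIENCY = 0.55  # assumed PHY-rate-to-TCP-throughput ratio

def tcp_mbps(phy_rate_mbps):
    return phy_rate_mbps * PHY_EFFICIENCY

x1_carbon = tcp_mbps(867)   # 2x2 80 MHz client (AC867 class)
macbook   = tcp_mbps(1300)  # 3x3 80 MHz client (AC1300 class)

# Store-and-forward through one radio: the time per bit is the sum of
# the time on each hop, so the rates combine like resistors in parallel.
relay_mbps = 1 / (1 / x1_carbon + 1 / macbook)

print(f"best-case laptop-to-laptop: ~{relay_mbps:.0f} Mbps")
# Real-world signal conditions will bring this well below the best case.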

To get a better picture of wireless throughput, take L&LD's suggestion of connecting one of the clients via Ethernet. Then you can better see the performance with each client.
 

Attachments

  • b_551_0_16777215_0___images_stories_wireless_max_ac_ac1300_876_433_solo_dn.jpg (60.3 KB)
I appreciate Tim's testing - he discloses exactly how he tests, and he gets consistent and repeatable results in his environment.

Every location is slightly different, even in test labs where things are controlled. In the subjective space of UserLand, e.g. in the home or small office, environmental differences will show slightly different results. But for the most part, if Product X vs. Product Y shows a difference in Tim's lab, that same difference is likely going to hold in the real world.

sfx
 
The REAL DEAL in broadband wireless testing is using a "delay and fading simulator". This is an expensive piece of hardware test equipment; like $50-150K or so.
It connects by coax to an access device like a WiFi access point (AP). The AP's signals, to and fro, are split n ways (e.g., 6). Each split is sent through a programmable delay and a programmable attenuator. Then the 6 are recombined. The recombined signal is then provided to a subscriber/user device, usually via coax, but it could be done with antennas in an RF anechoic chamber where there are no reflections.

So the programmable delays and attenuations are changed rapidly in time. This simulates a particular movement scenario (vehicular, pedestrian, each amidst certain reflections like indoor office, indoor warehouse, outdoor suburban, etc.). The amount of programmed delay and fade, and all the statistics on how these vary in time, is the key. Field measurements of fade/delay with a "channel sounder" build up this historical (empirical) information. This has to be valid, else the whole assessment is invalid.
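In software terms, that split / delay / attenuate / recombine chain is a tapped delay line. A minimal sketch of the idea (the tap delays and gains below are arbitrary illustrative values, not real channel-sounder statistics):

Code:
import numpy as np

# Minimal tapped-delay-line channel model: the transmitted signal is
# split into N paths, each delayed and attenuated, then recombined.
# Delays/gains are arbitrary illustrative values, not channel-sounder data.

rng = np.random.default_rng(0)

tap_delays = np.array([0, 3, 7, 12, 20, 31])            # in samples
tap_gains = np.array([1.0, 0.6, 0.4, 0.25, 0.1, 0.05])  # attenuation per path
tap_phases = np.exp(1j * rng.uniform(0, 2 * np.pi, tap_delays.size))

def apply_channel(tx):
    """Recombine delayed, attenuated copies of the transmitted signal."""
    rx = np.zeros(tx.size + tap_delays.max(), dtype=complex)
    for d, g, p in zip(tap_delays, tap_gains, tap_phases):
        rx[d:d + tx.size] += g * p * tx
    return rx

# A hardware fade simulator re-draws the gains/phases over time to mimic
# movement; a single static draw is enough to show the structure here.
tx = rng.standard_normal(1024) + 1j * rng.standard_normal(1024)
rx = apply_channel(tx)
print(rx.shape)  # (1055,) -> 1024 samples plus the longest path delay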

Now, the simulator can be used with different client or access devices and antennas, and different fade/delay statistics. This pushes MIMO to show its stuff, or not, by truly improving the error rates and thus throughput, under different statistics of fade/delay.

I spent a year doing this in a lab for one of the big national cellular companies. They just have to know what works well. I wonder whether much of this is done for WiFi/MIMO, outside of a scarce few university labs, for lack of funding. Selling WiFi RF features, though, is more like polishing tomatoes.
 
true true...

In development, models exist, or can be created, in tools like Matlab to design things. But to prove things in a more real-world environment is a lot more effort - and one that needs the original inventor to lay out the cost - to wit, look at EEBS, Panda, Koala, and more... (these are club terms; if you know, you know, otherwise you don't).

Then test equipment vendors like Agilent, Spirent, Anritsu, and others - they can build gear around it...

Multiple Path Faders are spendy - as SteveCH indicated, a single unit can be $100K or more, and that is for a single RF path. With BeamForming, this can be much more - one unit for each physical path - plus the script development needed to meet the engineering models. And then there's the post-processing after the fact: we had two PhDs and a trailer park of servers post-processing the data to confirm the models - dedicated full time - for months at a time...

Folks here see a $250 WiFi router as expensive - if you knew the work that went into it, it's pretty much a bargain.

sfx
 
