gportellihale
Guest
Hi
While I was checking the performance of the newly released NASes, I came across the following note on the NAS Charts web pages:
NOTE: The maximum raw data rate for 100Mbps Ethernet is 12.5 MBytes/sec and 125 MBytes/sec for gigabit Ethernet. Throughput above these values is due to memory caching effects in the client OS and NAS under test.
I can't understand how throughput higher than Ethernet's raw data rate can be attributed to cache on the NAS. I can see how caching on the client side could produce a higher measured throughput, but on the NAS side I'm at a loss as to how its cache could boost the data rate above the raw line rate.
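For reference, here's a toy sketch of the client-side effect I do understand (all the numbers here are made up for illustration): if the client's write() calls return as soon as the data lands in the OS page cache, the benchmark's timer can stop before the data has actually crossed the wire, so the reported rate exceeds the link's raw limit.

```python
# Raw limit: link rate in Mbps divided by 8 bits per byte.
LINK_RATE_MBPS = 1000                     # gigabit Ethernet
raw_limit_mb_s = LINK_RATE_MBPS / 8       # 125 MB/s, as in the NAS Charts note

file_mb = 1024                            # hypothetical 1 GB test file
wire_time_s = file_mb / raw_limit_mb_s    # ~8.2 s minimum to cross the wire

# Hypothetical: the benchmark's timer stops at 6 s because the client OS
# buffered the tail of the file in its page cache and is still flushing it.
measured_time_s = 6.0
measured_mb_s = file_mb / measured_time_s

print(f"raw limit:  {raw_limit_mb_s:.1f} MB/s")
print(f"measured:   {measured_mb_s:.1f} MB/s")  # above the raw limit
```

That much makes sense to me for the client side; it's the NAS-side claim in the note that I can't reconcile.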
Can someone please explain?
Thanks
George