Trouble getting more than 90MB/s on 10GB running Windows Server 2008 R2

Roveer

Occasional Visitor
I bought two MNPA19-XTR Mellanox ConnectX-2 PCIe x8 10GbE cards on eBay so I could do high-speed transfers between machines.

Initially I put one in my Windows Server 2008 R2 box and one in a PC and had all kinds of trouble. I then put both cards in my ESXi machine and ran up two Windows 10 VMs. I tuned the drivers (jumbo packets, buffers, and a few other things you should do with 10GbE drivers), added 8GB RAM disks on both Win10 instances, and pushed a 6GB file across at 700MB/s+. Just what I was looking for. Of course, on ESXi the VMs use VMware's own driver (vmxnet3), so I'm not using any Mellanox drivers.
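
For reference, this is roughly the kind of tuning and verification I mean; the interface name and address below are just examples, and on ESXi the vSwitch and its uplinks also have to be set to MTU 9000 or jumbo frames won't make it end to end:

    rem set a 9000-byte MTU on the 10GbE interface (elevated prompt; interface name is an example)
    netsh interface ipv4 set subinterface "Ethernet0" mtu=9000 store=persistent

    rem confirm jumbo frames actually pass (8972 = 9000 minus 28 bytes of IP/ICMP headers)
    ping -f -l 8972 10.0.0.2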

I then put one of the 10GbE cards into my Dell T320 Xeon machine, in an x16 PCIe slot. I installed various Mellanox drivers (very confusing), set up the same RAM drive, and haven't been able to get more than 90MB/s no matter what I do with the Win10 VM.

I have verified that the traffic is indeed going across the 10GbE link. By using the RAM disks on both sides I'm eliminating the disk subsystems.

Tonight I ran up 2008 R2 as a VM, did the same setup, and got the same 700MB/s+ speeds, again using the ESXi vmxnet3 driver.

Does anyone have any insight into how I can fix this problem? There are so many Mellanox drivers that it's very confusing, but I've tried many of them with no fix.

I'm not expecting 700MB/s+ once I re-introduce the disk subsystems, but until I fix my transfer problem I'm using the RAM disks to try to figure out what is going on. If I can get 200-300MB/s to disk I'd be happy; that's double or triple what I'm getting on GbE.
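
A raw TCP test with something like iperf3 is another way to take SMB and the RAM disks out of the picture entirely: if it runs near line rate while the file copy is stuck at 90MB/s, the card and driver are fine and the problem is higher up the stack. Something along these lines (the address is just an example):

    rem on the receiving machine: start an iperf3 server
    iperf3 -s

    rem on the sending machine: 4 parallel streams for 30 seconds against the receiver
    iperf3 -c 10.0.0.2 -P 4 -t 30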

--- update ---

I'm in the process of trying to pass the Mellanox NIC directly through to the guest OS on ESXi. If I'm successful (waiting on some Windows updates to finish), I'll be even closer to the exact configuration I have on my stand-alone 2008 R2 server (which is a Dell T320), and I can see whether I hit a similar slowdown there. I'll then try to fix the problem using the VMs before I take the card back to the Dell.
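
(The passthrough toggle itself is done in the vSphere client and needs a host reboot; before that, a quick check from the ESXi shell confirms the host sees the card and gives its PCI address:)

    # list PCI devices and pick out the Mellanox adapter
    esxcli hardware pci list | grep -i mellanox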

--- update 2 ---

I was able to get the Mellanox card passed through to the VM running Windows Server 2008 R2 and installed the 4.80 version of the driver. After tuning it up I'm still transferring at 700+MB/s. So what is happening on my Dell T320 that, under a very similar configuration, I'm only getting 90MB/s? This is getting frustrating.

Thanks,

Roveer
 
A Breakthrough!!!

What I had not provided in my first post is that the OS on the Dell T320 is SBS 2011, which is built on Windows Server 2008 R2.

Today I rebuilt the VM as an SBS 2011 server, and when trying to copy files to it over the 10GbE link I got the same poor 80MB/s performance. So now I have a test platform to work this out without interrupting my production SBS 2011 server.

Now the question becomes: what is happening in SBS 2011, which is Server 2008 R2 underneath, that is not happening in plain 2008 R2? I'm not exactly sure where to go from here, but I'm going to post over on the Microsoft forums to see if they can provide any help.

Ideas?
 
I will take a stab at it. I assume the NIC is plugged into a Dell slot that does not sit on a shared bus.
From a networking standpoint, you do not want your high-speed NICs on the same network segment as slower NICs, which will cause a slowdown. I would isolate the two high-speed NICs and go from there.
 
I found what I believe is the problem. Last night I did a deep dive with both 2008 R2 and SBS 2011 running as VMs. What I found is that the problem would appear as soon as I installed AD on 2008 R2; without AD it would not exhibit the problem. SBS 2011 has AD installed by default (hence the problem exists there). I then went googling for the cause, and late last night I hit on something that gave a positive result.

If I disabled digital signing and encryption on either of the OSes with AD installed, the problem would go away. I'm still determining exactly which policy settings I need to change to fix it, since I changed three things last night before it started working. I also have to determine whether changing any of those settings will have negative effects in my environment.
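
For anyone following along, the likeliest candidates are the SMB signing policies: promoting a server to a domain controller applies the Default Domain Controllers Policy, which enables "Microsoft network server: Digitally sign communications (always)", and signing every SMB packet is a known throughput killer on this vintage of Windows. These are my notes on the registry values behind those policies, not the confirmed fix; on a DC the group policy will simply re-apply them unless the GPO itself is changed, and turning signing off has security implications:

    rem server side (the machine being copied to)
    reg add "HKLM\SYSTEM\CurrentControlSet\Services\LanmanServer\Parameters" /v RequireSecuritySignature /t REG_DWORD /d 0 /f
    reg add "HKLM\SYSTEM\CurrentControlSet\Services\LanmanServer\Parameters" /v EnableSecuritySignature /t REG_DWORD /d 0 /f

    rem client side (the machine copying from)
    reg add "HKLM\SYSTEM\CurrentControlSet\Services\LanmanWorkstation\Parameters" /v RequireSecuritySignature /t REG_DWORD /d 0 /f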

So for right now, it doesn't seem to be hardware but rather Windows software. I'll make a full report once I've figured out exactly what is happening.

Having my nice little Dell R710 ESXi VM setup has helped me tremendously, not only with the 10GbE testing but with debugging this problem. This would have been impossible to do in the production environment, but with VMs and snapshots I was able to quickly and easily create the environment and test away. Very happy I set that system up. It's been a godsend.

Roveer
 
Thank you for posting your solution. I'm sure someone else will run into this same problem as 10GbE starts coming down in price and getting more popular.
 
