
10GE Testing – Multi-NIC vMotion

I had access to some 10GE NICs, so I decided to do some performance testing and try out Multi-NIC vMotion.

Intel X520-DA2

The 10GE NICs were Intel X520-DA2, and I used SFP-H10GB-CU3M direct-attach cables to connect the two hosts back-to-back, as I do not have access to a 10GE switch at the moment.

Without reading any documentation, I tried adding both 10GE ports to a single vSwitch, and vMotion did not work. After googling and reading the documentation, I found that the 10GE ports actually have to be separated into their own vSwitches, each with its own vMotion VMkernel port, before Multi-NIC vMotion will work.

http://kb.vmware.com/selfservice/microsites/search.do?language=en_US&cmd=displayKC&externalId=2007467
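
For anyone who wants to script this, below is a rough sketch of what the per-host setup looks like with pyVmomi: a second standard vSwitch backed by the second 10GE uplink, a port group, a VMkernel interface with a static IP, and the vMotion tag on that interface. The host name, vSwitch and port group names, uplink (vmnic3) and IP address are placeholders from my lab, not values from the KB article.

# Hedged sketch: add a second vSwitch + vMotion VMkernel port with pyVmomi.
# Names, uplinks and IPs below are lab placeholders, not official values.
from pyVim.connect import SmartConnectNoSSL, Disconnect
from pyVmomi import vim

si = SmartConnectNoSSL(host="esxi-host-1.lab.local", user="root", pwd="password")
host = si.content.searchIndex.FindByDnsName(dnsName="esxi-host-1.lab.local", vmSearch=False)
netsys = host.configManager.networkSystem

# One vSwitch per 10GE uplink (the first uplink already carries its own vMotion vmknic).
vss_spec = vim.host.VirtualSwitch.Specification(
    numPorts=128,
    bridge=vim.host.VirtualSwitch.BondBridge(nicDevice=["vmnic3"]),
)
netsys.AddVirtualSwitch(vswitchName="vSwitch3", spec=vss_spec)

# Port group for the second vMotion VMkernel interface.
pg_spec = vim.host.PortGroup.Specification(
    name="vMotion-2", vlanId=0, vswitchName="vSwitch3",
    policy=vim.host.NetworkPolicy(),
)
netsys.AddPortGroup(portgrp=pg_spec)

# VMkernel interface with a static IP on the second direct-attached link.
vnic_spec = vim.host.VirtualNic.Specification(
    ip=vim.host.IpConfig(dhcp=False, ipAddress="172.16.2.1", subnetMask="255.255.255.0"),
)
vmk = netsys.AddVirtualNic(portgroup="vMotion-2", nic=vnic_spec)

# Tag the new VMkernel interface for vMotion traffic.
host.configManager.virtualNicManager.SelectVnicForNicType(nicType="vmotion", device=vmk)

Disconnect(si)

The same steps are repeated on the second host with its own vMotion IP; the equivalent can of course be done in a few clicks in the vSphere Web Client, which is how I actually configured it here.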

Single-NIC 10GE vMotion Configuration
[Screenshots: Single-NIC vMotion configuration on host 1 and host 2]

Single-NIC 10GE vMotion Tests
Test 1: 14 seconds
Test 2: 14 seconds
Test 3: 10 seconds
Test 4: 11 seconds
[Screenshots: Single-NIC vMotion tests 1 to 4]

Single-NIC 10GE vMotion Network Performance Graphs
[Network performance graphs for host 1 and host 2 (Single-NIC)]

Multi-NIC 2X10GE vMotion Configuration
[Screenshots: Multi-NIC vMotion configuration on host 1 and host 2]

Multi-NIC 2X10GE vMotion Tests
Test 1: 10 seconds
Test 2: 12 seconds
Test 3: 9 seconds
Test 4: 12 seconds
[Screenshots: Multi-NIC vMotion tests 1 to 4]

Multi-NIC 2X10GE vMotion Network Performance Graphs
[Network performance graphs for host 1 and host 2 (Multi-NIC)]


Conclusion:
Although vMotion with Multi-NIC seems to perform faster compared to Single-NIC, the difference is not significant. This is probably due to the size and number of VMs I used for the vMotion tests.

I noticed that vMotion traffic was load-shared between the two 10GE ports, as you can see from the network performance graphs. With Single-NIC, network usage is about 200MBps, while in the Multi-NIC scenario, network usage is about 100MBps per NIC, which is 50% of the Single-NIC figure.

How to recover unmounted disks in VSAN?

With VSAN 6.1, Virtual SAN monitors solid state drive and magnetic disk drive health and proactively isolates unhealthy devices by unmounting them. This happened in my home lab, and I wish to share how I recovered the unmounted disks.

You can read more about this feature on http://cormachogan.com/2015/09/22/vsan-6-1-new-feature-problematic-disk-handling/

[Screenshot: VSAN disk management view showing the unmounted disks]

As you can see, the disks on one of my hosts are unmounted. I tried to find a menu option to remount the disks but could not find one. Putting the host into maintenance mode and restarting it also did not recover the unmounted disks.

I had to erase the partitions (on both the HDD and the flash device) on that host and claim the disks again before the disks were mounted correctly again.

[Screenshot: deleting the partitions on the affected disks]
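
If you prefer to script the re-claim step, a minimal pyVmomi sketch of the idea is below. It assumes the stale partitions have already been erased and that the cluster is in manual disk-claim mode; the host name and credentials are placeholders for my lab, and in my case I actually did both the erase and the claim through the Web Client.

# Hedged sketch: re-claim wiped disks into a VSAN disk group with pyVmomi.
# Host name and credentials are lab placeholders.
from pyVim.connect import SmartConnectNoSSL, Disconnect
from pyVmomi import vim

si = SmartConnectNoSSL(host="esxi-host-3.lab.local", user="root", pwd="password")
host = si.content.searchIndex.FindByDnsName(dnsName="esxi-host-3.lab.local", vmSearch=False)
vsan = host.configManager.vsanSystem

# Once the old partitions are gone, the disks should report as "eligible" again.
eligible = [r.disk for r in vsan.QueryDisksForVsan() if r.state == "eligible"]
flash = [d for d in eligible if d.ssd]
magnetic = [d for d in eligible if not d.ssd]

# Rebuild the disk group: one flash device for cache plus the magnetic disks.
if flash and magnetic:
    mapping = vim.vsan.host.DiskMapping(ssd=flash[0], nonSsd=magnetic)
    vsan.InitializeDisks_Task(mapping=[mapping])

Disconnect(si)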