
I know this is not Veeam related, but I am also part of the vExpert program with VMware and had the chance to get some Intel Optane drives through the program, so I wrote a blog on how I got them set up and some initial testing done. You can find it here – https://justvirtualization.blog/2023/03/31/intel-optane-vexpert-awesome/
 
Intel Optane + vExpert = Awesome!

 


 
Thank you to @Intel and @vExpert for the opportunity to test some unique technology in the Intel Optane drives.


I have been part of the VMware vExpert program for four years now, and in 2023, there was a webinar done by Intel on their Optane drives. After the webinar, there was the chance to be chosen to receive a gift from Intel of some Optane drives for HomeLab testing.

 

I decided to submit for the chance to receive the Optane drives and was lucky enough to be selected. At the time of selection, you had to choose your drive format, either PCIe or U2. My HomeLab consists of four Intel Skull Canyon NUCs, so PCIe was not going to work, but that is the option I accidentally chose. After a discussion with Corey Romero, the vExpert Community Manager, I got in touch with Intel, who sent the U2 format drives to me.

 

Once I received the Intel Optane U2 form factor drives, it was time to find a USB adapter to use them with my Intel NUCs. One of the first things I needed to do to use a USB drive within VMware ESXi 8 was to turn off the USBArbitrator service on all my hosts. After that, ESXi recognizes the USB adapter and drive, at which point the drive can be marked as "Flash" so it can be used with vSAN (the commands are sketched after the link below). So I tried the first adapter, from Sabrent, which can be found here –

 

Sabrent Adapter – https://www.amazon.com/SABRENT-Type-Adapter-Cable-EC-U2SA/dp/B0BRBTKJMD/ref=sr_1_3?crid=3E6MFWXOTRQM3&keywords=sabrent+u2+to+usb&qid=1680287407&sprefix=sabarent+u2+to+usb%2Caps%2C229&sr=8-3
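For anyone repeating that prep on their own hosts, here is roughly what it looks like from the ESXi shell. This is a sketch only: the device identifier below is a placeholder for whatever ESXi assigns to your USB-attached drive, and newer vSphere Client builds can also mark a disk as flash from the UI instead of using the SATP rule.

# stop the USB arbitrator so ESXi can claim USB devices as storage
/etc/init.d/usbarbitrator stop

# keep it disabled across reboots
chkconfig usbarbitrator off

# tag the USB-attached Optane drive as flash so vSAN will accept it as a cache device
# (placeholder device name – find yours with: esxcli storage core device list)
esxcli storage nmp satp rule add --satp=VMW_SATP_LOCAL --device=mpx.vmhba33:C0:T0:L0 --option=enable_ssd
esxcli storage core claiming reclaim --device=mpx.vmhba33:C0:T0:L0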

 


While this adapter did work for my U2 drives, it was very problematic within VMware ESXi 8: the drive would randomly disappear from the storage list, and I would need to unplug it from the adapter and plug it back in to get it working again. That was not going to suffice, since I could not properly test vSAN with it constantly going into error states, so I started to explore other adapters. I found another one, also USB, which is more like a dock that holds the U2 drive vertically, so I decided to order just one to try before committing to four. This adapter is from StorageMall and can be found here –

StorageMall Adapter – https://www.amazon.ca/NVMe-Adapter-SFF-8639-Docking-Station/dp/B0B31LRRGZ/ref=sr_1_1crid=36RVYLRAIITRR&keywords=storagemall+u2+to+usb&qid=1680287840&sprefix=storagemall+u2+t%2Caps%2C406&sr=8-1 

 

So I ordered it, and it was time to connect it to one of my Intel NUCs for testing.

StorageMall – U2 to USB Adapter

This is what it looks like connected to my Intel NUC –

U2 drive connected to Intel NUC via adapter

After testing just that one adapter, the U2 drive never disconnected from ESXi 8, while the other drives, still connected via the Sabrent adapters, did! That sealed the deal, and I ordered three more adapters – one for each Intel NUC. I already had a 1TB NVMe drive in each Intel NUC for the capacity tier, and the Intel Optane drives would be used for the cache tier.

I received my other StorageMall adapters and got all my drives connected to each Intel NUC. I then set up vSAN properly, and it has been rock solid since getting these little docking units. I highly recommend them if you want reasonably priced USB connectivity.

I was able to configure VMware vSAN with just minimal settings, as I did not need anything fancy, and the health status eventually showed as OK.

vSAN – Health Monitor with U2 Optane drives
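For reference, claiming a cache and a capacity device into a vSAN disk group can also be done from the ESXi shell. The following is a sketch only – the device names are placeholders for the Optane cache drive and the 1TB NVMe capacity drive, and the vSAN cluster itself still needs to be enabled on the vSphere side.

# list devices to find the identifiers of the Optane (cache) and NVMe (capacity) drives
esxcli storage core device list

# add the Optane drive as the cache tier and the 1TB NVMe drive as the capacity tier
# (placeholder device names)
esxcli vsan storage add --ssd=mpx.vmhba33:C0:T0:L0 --disks=t10.NVMe____Example_1TB_Device

# confirm the host's vSAN state
esxcli vsan cluster get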

One of my first tests was with a Windows VM using DiskSpd. The configuration is as follows –

  • Windows 10 (64-bit)
  • VMware Paravirtual SCSI controllers (x2)
  • OS drive (C), Data drive (E)

So, I used the following command to run the tests on both the C drive and the E drive, just changing the drive letter each time, with the output piped to a text file for future reference –

diskspd.exe -d60 -W15 -C15 -c128M -t4 -o4 -b8k -L -r -Sh -w50 c:\disk-speed-test.dat
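For the E drive, only the target path changes, and the output can be redirected to a text file for later reference (the output file name here is just an example):

diskspd.exe -d60 -W15 -C15 -c128M -t4 -o4 -b8k -L -r -Sh -w50 e:\disk-speed-test.dat > e-drive-results.txt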

The results were not bad at all, considering the VM was running on vSAN backed by USB-attached Intel Optane U2 drives. The output from the C drive test is below –

Input parameters:

 

        timespan:   1

        -------------

        duration: 60s

        warm up time: 15s

        cool down time: 15s

        measuring latency

        random seed: 0

        path: 'c:\disk-speed-test.dat'

               think time: 0ms

               burst size: 0

               software cache disabled

               hardware write cache disabled, writethrough on

               performing mix test (read/write ratio: 50/50)

               block size: 8KiB

               using random I/O (alignment: 8KiB)

               number of outstanding I/O operations per thread: 4

               threads per file: 4

               using I/O Completion Ports

               IO priority: normal

 

System information:

 

        computer name: TON-7UZ

        start time: 2023/03/21 18:37:39 UTC

 

Results for timespan 1:

*******************************************************************************

 

actual test time:      60.02s

thread count:          4

proc count:            4

 

CPU |  Usage |  User  |  Kernel |  Idle

-------------------------------------------

   0|  17.29%|   3.07%|   14.22%|  82.71%

   1|  18.15%|   3.20%|   14.95%|  81.85%

   2|  17.42%|   2.79%|   14.63%|  82.58%

   3|  16.95%|   3.05%|   13.90%|  83.05%

-------------------------------------------

avg.|  17.45%|   3.03%|   14.42%|  82.55%

 

Total IO

thread |       bytes     |     I/Os     |    MiB/s   |  I/O per s |  AvgLat  | LatStdDev |  file

-----------------------------------------------------------------------------------------------------

     0 |      1355177984 |       165427 |      21.53 |    2756.41 |    1.450 |     3.899 | c:\disk-speed-test.dat (128MiB)

     1 |      1347731456 |       164518 |      21.42 |    2741.26 |    1.458 |     3.943 | c:\disk-speed-test.dat (128MiB)

     2 |      1348632576 |       164628 |      21.43 |    2743.09 |    1.457 |     3.926 | c:\disk-speed-test.dat (128MiB)

     3 |      1348214784 |       164577 |      21.42 |    2742.24 |    1.457 |     3.976 | c:\disk-speed-test.dat (128MiB)

-----------------------------------------------------------------------------------------------------

total:        5399756800 |       659150 |      85.80 |   10983.00 |    1.455 |     3.936

 

Read IO

thread |       bytes     |     I/Os     |    MiB/s   |  I/O per s |  AvgLat  | LatStdDev |  file

-----------------------------------------------------------------------------------------------------

     0 |       678486016 |        82823 |      10.78 |    1380.03 |    0.209 |     0.552 | c:\disk-speed-test.dat (128MiB)

     1 |       673710080 |        82240 |      10.71 |    1370.31 |    0.210 |     0.914 | c:\disk-speed-test.dat (128MiB)

     2 |       672129024 |        82047 |      10.68 |    1367.10 |    0.211 |     1.162 | c:\disk-speed-test.dat (128MiB)

     3 |       674004992 |        82276 |      10.71 |    1370.91 |    0.211 |     0.894 | c:\disk-speed-test.dat (128MiB)

-----------------------------------------------------------------------------------------------------

total:        2698330112 |       329386 |      42.88 |    5488.35 |    0.210 |     0.906

 

Write IO

thread |       bytes     |     I/Os     |    MiB/s   |  I/O per s |  AvgLat  | LatStdDev |  file

-----------------------------------------------------------------------------------------------------

     0 |       676691968 |        82604 |      10.75 |    1376.38 |    2.693 |     5.201 | c:\disk-speed-test.dat (128MiB)

     1 |       674021376 |        82278 |      10.71 |    1370.95 |    2.705 |     5.210 | c:\disk-speed-test.dat (128MiB)

     2 |       676503552 |        82581 |      10.75 |    1376.00 |    2.694 |     5.130 | c:\disk-speed-test.dat (128MiB)

     3 |       674209792 |        82301 |      10.71 |    1371.33 |    2.703 |     5.263 | c:\disk-speed-test.dat (128MiB)

-----------------------------------------------------------------------------------------------------

total:        2701426688 |       329764 |      42.93 |    5494.65 |    2.699 |     5.201

 

 

 

total:

  %-ile |  Read (ms) | Write (ms) | Total (ms)

----------------------------------------------

    min |      0.025 |      0.810 |      0.025

   25th |      0.089 |      1.980 |      0.125

   50th |      0.125 |      2.337 |      1.283

   75th |      0.197 |      2.952 |      2.343

   90th |      0.383 |      3.534 |      3.111

   95th |      0.652 |      3.959 |      3.541

   99th |      1.380 |      6.024 |      4.755

3-nines |      3.349 |     28.245 |     18.948

4-nines |     14.731 |    242.986 |    240.320

5-nines |    213.511 |    244.998 |    244.357

6-nines |    239.836 |    340.090 |    340.090

7-nines |    239.836 |    340.090 |    340.090

8-nines |    239.836 |    340.090 |    340.090

9-nines |    239.836 |    340.090 |    340.090

    max |    239.836 |    340.090 |    340.090

 

 

The results from the E drive –

Input parameters:

 

        timespan:   1

        -------------

        duration: 60s

        warm up time: 15s

        cool down time: 15s

        measuring latency

        random seed: 0

        path: 'e:\disk-speed-test.dat'

               think time: 0ms

               burst size: 0

               software cache disabled

               hardware write cache disabled, writethrough on

               performing mix test (read/write ratio: 50/50)

               block size: 8KiB

               using random I/O (alignment: 8KiB)

               number of outstanding I/O operations per thread: 4

               threads per file: 4

               using I/O Completion Ports

               IO priority: normal

 

System information:

 

        computer name: TON-7UZ

        start time: 2023/03/31 19:33:49 UTC

 

Results for timespan 1:

*******************************************************************************

 

actual test time:      60.00s

thread count:          4

proc count:            4

 

CPU |  Usage |  User  |  Kernel |  Idle

-------------------------------------------

   0|  76.69%|  30.42%|   46.28%|  23.31%

   1|  76.46%|  32.14%|   44.32%|  23.54%

   2|  75.63%|  33.41%|   42.21%|  24.38%

   3|  76.72%|  36.41%|   40.31%|  23.28%

-------------------------------------------

avg.|  76.37%|  33.09%|   43.28%|  23.63%

 

Total IO

thread |       bytes     |     I/Os     |    MiB/s   |  I/O per s |  AvgLat  | LatStdDev |  file

-----------------------------------------------------------------------------------------------------

     0 |      1076240384 |       131377 |      17.11 |    2189.62 |    1.824 |     2.812 | e:\disk-speed-test.dat (128MiB)

     1 |      1060225024 |       129422 |      16.85 |    2157.03 |    1.852 |     2.830 | e:\disk-speed-test.dat (128MiB)

     2 |      1046495232 |       127746 |      16.63 |    2129.10 |    1.877 |     2.965 | e:\disk-speed-test.dat (128MiB)

     3 |      1010950144 |       123407 |      16.07 |    2056.78 |    1.943 |     3.253 | e:\disk-speed-test.dat (128MiB)

-----------------------------------------------------------------------------------------------------

total:        4193910784 |       511952 |      66.66 |    8532.53 |    1.873 |     2.967

 

Read IO

thread |       bytes     |     I/Os     |    MiB/s   |  I/O per s |  AvgLat  | LatStdDev |  file

-----------------------------------------------------------------------------------------------------

     0 |       539795456 |        65893 |       8.58 |    1098.22 |    0.499 |     1.017 | e:\disk-speed-test.dat (128MiB)

     1 |       530014208 |        64699 |       8.42 |    1078.32 |    0.514 |     1.017 | e:\disk-speed-test.dat (128MiB)

     2 |       521920512 |        63711 |       8.30 |    1061.85 |    0.514 |     1.139 | e:\disk-speed-test.dat (128MiB)

     3 |       505266176 |        61678 |       8.03 |    1027.97 |    0.529 |     1.360 | e:\disk-speed-test.dat (128MiB)

-----------------------------------------------------------------------------------------------------

total:        2096996352 |       255981 |      33.33 |    4266.35 |    0.513 |     1.139

 

Write IO

thread |       bytes     |     I/Os     |    MiB/s   |  I/O per s |  AvgLat  | LatStdDev |  file

-----------------------------------------------------------------------------------------------------

     0 |       536444928 |        65484 |       8.53 |    1091.40 |    3.158 |     3.357 | e:\disk-speed-test.dat (128MiB)

     1 |       530210816 |        64723 |       8.43 |    1078.72 |    3.190 |     3.377 | e:\disk-speed-test.dat (128MiB)

     2 |       524574720 |        64035 |       8.34 |    1067.25 |    3.233 |     3.544 | e:\disk-speed-test.dat (128MiB)

     3 |       505683968 |        61729 |       8.04 |    1028.82 |    3.356 |     3.914 | e:\disk-speed-test.dat (128MiB)

-----------------------------------------------------------------------------------------------------

total:        2096914432 |       255971 |      33.33 |    4266.18 |    3.233 |     3.551

 

 

total:

  %-ile |  Read (ms) | Write (ms) | Total (ms)

----------------------------------------------

    min |      0.034 |      0.730 |      0.034

   25th |      0.137 |      2.079 |      0.240

   50th |      0.240 |      2.618 |      1.564

   75th |      0.517 |      3.483 |      2.676

   90th |      1.023 |      4.860 |      3.865

   95th |      1.572 |      6.412 |      5.025

   99th |      4.330 |     13.682 |     10.370

3-nines |     15.588 |     30.139 |     25.808

4-nines |     29.935 |    223.382 |     68.381

5-nines |     58.715 |    238.147 |    236.003

6-nines |     77.962 |    238.896 |    238.896

7-nines |     77.962 |    238.896 |    238.896

8-nines |     77.962 |    238.896 |    238.896

9-nines |     77.962 |    238.896 |    238.896

    max |     77.962 |    238.896 |    238.896

 

So, the overall performance, while not spectacular, is pretty decent for U2 over USB – roughly 11,000 IOPS at about 1.5 ms average latency on the C drive and roughly 8,500 IOPS at about 1.9 ms on the E drive for this 8 KiB 50/50 random workload. I will try to run some further benchmarking tests, and since I did receive the PCIe Intel Optane drives, I will be testing a couple of those in a Synology DS923+ NAS that I am getting from them for testing and more blogs. It will be interesting to see how much they boost performance in the NAS, whether as cache drives or even as volumes, depending on how many I can squeeze in there.

I hope this article was informative, and I will do more on the Intel Optane drives in future posts. Until then, happy testing!

Huge thank you to @safiya for posting this for me via the backend since it was so long. 😂

