DroboPro iSCSI IOmeter Test

I ran the IOMeter test from VMKtree against a DroboPro, a RAID 5 array on a 3Ware Ar2343, and an LSI i8PRO with a RAID 0. I have four 1TB drives, WD and Seagate, in the RAID 5 array, and two 2TB drives, three 1TB drives, and two 750GB drives in the Drobo. The LSI array has three 10K drives in RAID 0.

The Drobo uses 76W at its max and 75W idle. I wasn't able to get the Drobo to spin down after 15 minutes, but when I manually put it in standby it uses 15W and waits for the iSCSI connection to reconnect (I have to reconnect it manually – I think I broke it when I changed the IP a few weeks ago). The server that hosts both of the other arrays uses about 185W at max and around 166W when idle.
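For a rough sense of what those wattages cost over a year, here's a quick sketch; the $0.11/kWh rate is my assumption, not from an actual bill:

```python
# Rough yearly electricity cost from the wattage numbers above.
# The $0.11/kWh rate is an assumption -- plug in your own rate.
RATE_PER_KWH = 0.11
HOURS_PER_YEAR = 24 * 365

def yearly_cost(watts, rate=RATE_PER_KWH):
    """Dollars to run a device at `watts` continuously for a year."""
    return watts / 1000 * HOURS_PER_YEAR * rate

print(f"DroboPro idle (75W):    ${yearly_cost(75):.2f}/yr")
print(f"Server idle (166W):     ${yearly_cost(166):.2f}/yr")
print(f"DroboPro standby (15W): ${yearly_cost(15):.2f}/yr")
```

The gap between idle and standby is why getting the Drobo to spin down properly would actually matter.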

Controller: LSI SAS211-8i
RAID: 0
Size: 835GB


Controller: 3Ware 9650SE-4LPML
RAID: 5
Size: 2.73TB


Controller: DroboPro
RAID: BeyondRAID
Size: 7.34TB (16TB logical)


Updated Test Lab

Since I found the last post lying around I wanted to publish it so we can see how my lab has changed. I'm close to my ideal test environment – more on that in a second – and till then I'll make do with what I can find for cheap :). A few months ago I was able to leave my test server at my work and borrow two static IPs and a UPS. With the two static IPs I can set up one for the host, CHI-HV1, and one for the virtual domain, AIR.Local, and send email out without worrying about the email settings every time the IP changes, or about blocked email since I'm sending from a residential IP. I ended up buying airideas.net so I could use it for the test domain and treat it like a real company, instead of messing around with my airideas.com. Getting another domain for $6 is an awesome deal; too bad .co is $15 a year, or I could have the trifecta of domains.

On to my new(er) testing lab. My main test server, CHI-HV1, is about 20 miles away from me, connected to a 20Mbps Comcast Business connection, and is running Windows Server 2008 R2. I decided to rewrite the specs below because I wanted to see if anything has changed and to get the full scale of my test domain. Soon I will add a "remote" site, Seattle or SEA-HV1, to test out Read-Only Domain Controllers and site-to-site VPN. The box that will be running the VMs is an ASUS build with a 2.4GHz Q8800, 4GB of RAM, and a 500GB WD hard drive. I know it's nothing like the server at the office, but it will work fine for now.

Host – CHI-HV1 (Hyper-V | Comcast)


Case: P180


CPU: Q6600 @ 3.0GHz
Motherboard: ASUS P5K Premium/WiFi

Hard Drives:
2x 300GB VelociRaptors, RAID 0 – host drives
2x 1TB RE3s, RAID 0 – storage drives
1x 500GB scratch / Shadow Copy drive

Child Virtual Servers

CHI-DC1 – Domain Controller

CHI-MAIL1 – Mail Server
CHI-DPM1 – Backup Server

CHI-MOSS1 – Windows SharePoint Server

CHI-SQL1 – SQL server
CHI-SQL2 – SQL Server
CHI-WIN71 – Windows 7 Machine (“client” testing)

CHI-WIN72 – Windows 7 Machine (“client” testing)

–Future VMs–

CHI-ISA1 – VPN Proxy Server
CHI-TS1 – Terminal Server / RemoteApp Gateway
CHI-TS2 – Terminal Server / RemoteApp

Host – SEA-HV1 (Hyper-V | Comcast)


Case: Antec

CPU: Q8800 @ 2.4GHz

Motherboard: Asus

Hard Drives: 1x 500GB WD

Child Virtual Servers
SEA-DC1 – Domain Controller

SEA-SRV1 – QuickBooks / File Server

KIR-RO1 – Read-only Domain Controller



Below is a diagram of CHI-HV1. If I have more free time I’ll map out the network and SEA-HV1.

CHI-HV1 V3 Virtual Build

In-House Test Lab

My test lab, mainly for Hyper-V, consists of an ASUS P5K Premium/WiFi motherboard, a Q6600 quad-core processor overclocked to 3.0GHz, 8GB of RAM, and a 5-in-3 IcyDock hot-swap tray. I have two 300GB VelociRaptors in RAID 0 running Windows Server 2008 R2 and two 1TB Western Digital RE3s in RAID 0 for the virtual machine store. I'm currently getting 230MB/s read and 215MB/s write on the VelociRaptors and 231MB/s read and 214MB/s write on the RE3s. Before I upgraded to the RE3s I was running a single 500GB Western Digital Black hard drive as the host drive and a 250GB Seagate drive for the virtual machine store, with 4GB of RAM.
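To put those throughput numbers in perspective, here's a back-of-the-envelope sketch of what they mean for moving a VM around; the 40GB VHD size is just an assumed example, and the old Seagate's speed is a guess:

```python
def copy_seconds(size_gb, mb_per_s):
    """Seconds to move size_gb gigabytes at a sustained mb_per_s MB/s."""
    return size_gb * 1024 / mb_per_s

# Assumed 40GB VHD; 214MB/s is the measured RE3 RAID 0 write speed,
# and ~60MB/s is a guess at the old single 250GB Seagate's sustained rate.
print(f"RE3 RAID 0:      {copy_seconds(40, 214):.0f}s")
print(f"Old 250GB drive: {copy_seconds(40, 60):.0f}s")
```

A few minutes saved per VHD copy adds up fast when you're shuffling a whole test domain.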

The performance of the 500GB and 250GB drives was OK with two virtual machines (VMs) running at a time, but it was barely usable running three, let alone a whole test domain. I ended up bringing a lot of my VMs down to 512MB of RAM so I could give the Exchange server 2GB and the SharePoint server 1GB. It was so bad I would have to wait minutes for all the servers to come back from being paused, or, even worse, to turn on the Exchange server. I ended up upgrading everything since I spent more time waiting than learning or testing.
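The juggling act is easier to see as a budget. This is just a sketch using the allocations mentioned above; the exact VM mix is illustrative, and the host reserve is an assumed ballpark:

```python
# RAM budget on the old 4GB host. The Exchange and SharePoint numbers
# are the allocations from the post; the 1GB host reserve is an assumed
# ballpark for the parent partition and Hyper-V overhead.
HOST_RAM_MB = 4096
HOST_RESERVE_MB = 1024

vm_ram_mb = {
    "CHI-DC1": 512,
    "CHI-MAIL1": 2048,  # Exchange got 2GB
    "CHI-MOSS1": 1024,  # SharePoint got 1GB
    "CHI-SQL1": 512,
}

committed = sum(vm_ram_mb.values()) + HOST_RESERVE_MB
print(f"Committed: {committed}MB of {HOST_RAM_MB}MB")
```

Even with most VMs squeezed down to 512MB, four guests plus the host already overshoot 4GB, which is exactly why everything crawled.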

I also built a second workstation/server so I could test out iSCSI and gigabit Ethernet. Something about using network storage as your local hard drive seems pretty cool: you can have a fast server hosting the VMs and a medium-fast (translation: way cheaper) machine for storage. The box, an Antec P180, consists of a P5Q Turbo motherboard with an E7300 dual-core processor, 4GB of RAM, and a 74GB VelociRaptor running Windows Storage Server 2008, plus two 1TB first-generation RE3s and a dual-port PCI-e gigabit network card. It also has a 5-in-3 IcyDock tray with a 500GB drive and a 160GB drive with Windows 7 for random OS and software testing. I'll write up another post about the IcyDock; so far they have been great. I moved twice and rearranged the two PCs more times than I can remember and the docks still work.

Most of the VMs I built are member servers: one is a domain controller, two are SQL servers, one is an Exchange server, and one is a Data Protection Manager server for backup. Depending on a few things I might also build one or two Windows 7 machines so I can carry out some client-side testing. Below you can see a physical map of the lab.


Physical Map

CHI-HV1 V3 Virtual Build

I also purchased a few Cat6 cables from Monoprice to test out iSCSI performance. I want to see if a switch slows down the transfer speed and if bridging the ports on the gigabit cards gives any performance increase. All together, I'm glad I upgraded the hard drives along with the RAM. In the end, virtual machine juggling isn't fun.
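Before blaming the switch, it helps to see where gigabit itself tops out relative to the arrays; a quick sketch (the ~10% protocol overhead figure is a rough assumption):

```python
# Theoretical gigabit Ethernet ceiling vs. the measured RAID 0 speeds.
GIGABIT_MB_S = 1000 / 8   # 1000Mbps on the wire = 125MB/s
OVERHEAD = 0.10           # assumed ~10% TCP/iSCSI protocol overhead
usable_mb_s = GIGABIT_MB_S * (1 - OVERHEAD)

re3_read_mb_s = 231  # measured read speed on the RE3 RAID 0
print(f"Usable gigabit:  ~{usable_mb_s:.0f}MB/s")
print(f"RE3 RAID 0 read: {re3_read_mb_s}MB/s")
```

So even a perfect switch can't make iSCSI over one link keep up with the local RE3 array, which is why bridging the ports is the interesting test.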

I found this post in my drafts last week. It looks to be 10–12 months old. I'll work on getting a newer write-up of my lab out this month.