The new test lab is born…

Ok, so last weekend I was happily messing around in my Exchange test lab when, as Elzar from Futurama would say, “Bam!” – the Exchange 2010 virtual server that I was working on dumped me out of RDP and would not let me back in.

I tried to ping said Exchange Server – no reply; in fact, I could not ping anything that had previously been running in my lab. At this point I suspected that something screwy had happened to my home server – darn it, I was in the middle of something!

I popped upstairs to where my trusty friend resided, and noticed immediately that the machine was not powered on.

There was also a strange smell of burning in the air ~ you know the type, like burned-out electronics and melting plastic, with a distinct sense that this was not going to work out well.

I knew that this was a screwdriver job, so I opened up the case to reveal a power supply that can only be described as not looking as good as it used to – with some pretty hefty scorch marks around its air flow vents.

This wasn’t good, in fact I suspected a total disaster.

Fitting a spare power supply that I had stashed away pretty much confirmed my diagnosis (like many of you, I suspect, I have built up a large stock of spare parts over the years for situations just like this one).

While I could get the machine to power on (well, I say “power on” – it beeped at me a whole hell of a lot), the brownout in the PSU had destroyed the motherboard and, after some further (more time-consuming) troubleshooting, the CPU and the RAM as well.

It was at this point that I probably started crying a little bit; being without the test lab was one thing, but knowing that it was pretty much fried really rounded off what had been a pretty crappy week for me. At this stage I had no idea if the data on the hard drives was safe (about 1.7 TB of Exchange labs for various projects that I had been, and still was, working on).

The new specification

I thought that I would push the boat out a little bit and spend some serious cash on my new lab server; after all, I was already in the dog house, so how much worse could it get? (Actually, don’t answer that!)

As many of you know, I am serious about my blogging, and even more serious about working with Exchange Server, so I decided that it was about time I stopped messing around with kit at home that needed constant tweaking just to get 7 virtual machines running at a reduced specification. I needed a home rig that could run up to 10 VMs at a specification as close to production as I can realistically get.

So, the following was what I came up with:



CPU: Intel Core i7-2600 (3.40 GHz, 8 MB cache)
Memory: 16 GB (4 x 4 GB Corsair XMS3 DDR3 1600 MHz DIMMs, CL9)
Motherboard: Asus P8Z68-V Pro
Graphics Card: ATI Radeon HD 6770 DirectCU (850 MHz, 1 GB, PCI-E, HDMI)
Hard Drives: 2 x Samsung 2 TB SATA 3.0
Case: Antec Dark Fleet DF-85 Full Tower
Power Supply: Antec EarthWatts Green EA-430
Optical Drive: Samsung 40-speed SATA DVD writer


What I liked about the above specification is that it hit a good price point, balancing favourably against out-of-the-box performance and future scalability (although I have promised the wife that I will not upgrade it for at least a year ~ well, it might have been two years, but time erodes people’s memory of conversations!).

Putting it all together

This is perhaps my favourite part of getting any new computer – physically putting it together. I am the type that has always had custom machines built from separate parts; I find that the consumer models you can get in the shops just don’t have the right specifications for the tasks I use these boxes for.

Of course, there are a few downsides to building your own rigs: they can be fiddly to get working first time, and in the past I have been known to, how shall we say, make the odd fatal mistake (like fitting too many ATX mounting pins ~ bang!).

This time around, however, I made sure that I double- and triple-checked each and every part’s placement within the case – avoiding that crispy “hair do” and dead motherboard moment (and more sobbing).

Below are a few snaps from the build process (I have a habit of documenting every machine that I build like this) for those who are interested:


Now that the machine is up and running, I can tell you that the performance is incredible.

The i7 processor just does not break into a sweat, and the intelligent overclocking features of the Asus mainboard mean that power is delivered to the virtual machines “on demand”.

I can get about 11 machines running concurrently in the lab (or about 6 if they are Exchange 2010 VMs with a reasonable specification), all without noticeable slowdown on the host or within the VMs.
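As a rough sanity check on those VM counts, here is a back-of-the-envelope memory budget. The per-VM figures and host reserve below are my illustrative assumptions, not the actual allocations from this lab:

```python
# Rough memory-budget sketch for sizing a virtualisation host.
# The per-VM sizes and host reserve are illustrative assumptions only.

HOST_RAM_GB = 16      # total RAM in the rig
HOST_RESERVE_GB = 2   # assumed headroom kept for the host OS / hypervisor

def max_vms(per_vm_gb):
    """How many identically sized VMs fit in the remaining RAM."""
    return int((HOST_RAM_GB - HOST_RESERVE_GB) // per_vm_gb)

print(max_vms(1.25))  # small lab VMs (DCs, clients, etc.) -> 11
print(max_vms(2))     # heavier "reasonable spec" VMs      -> 7
```

With those assumed sizes the arithmetic lands in the same ballpark as the figures above; in practice Exchange 2010 mailbox roles want more RAM each, which is why the concurrent count drops so sharply for a production-like spec.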

The only thing that I would probably change is the stock cooler that comes with the i7, replacing it with an aftermarket alternative. From monitoring the machine, the idle temperature is around 30 – 35 degrees Celsius per core – which is OK – however, I have seen it spike to about 60 – 72 under load.


None of the above will cause the i7 major problems, as the Tj.Max is around 98. For those who don’t know, Tj.Max stands for Temperature Junction Max, and it is the threshold that must be reached before the CPU is throttled down to a slower speed – this is different from the thermal cut-out, which is where the CPU shuts the system down before it is damaged.
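To put those temperatures in context, a trivial sketch of the headroom against Tj.Max, using the figures quoted above (the ~98 degree threshold is the nominal value for this CPU, not a measured one):

```python
# Thermal headroom relative to Tj.Max - the point at which the CPU
# throttles itself down (distinct from the thermal cut-out, where the
# system powers off entirely to avoid damage).

TJ_MAX_C = 98  # approximate Tj.Max for this generation of i7

def headroom(core_temp_c):
    """Degrees Celsius remaining before the CPU starts throttling."""
    return TJ_MAX_C - core_temp_c

print(headroom(35))  # idle (~35 C)      -> 63 degrees of headroom
print(headroom(72))  # load spike (~72 C) -> 26 degrees of headroom
```

Even at the worst observed load spike there is a comfortable margin before throttling, which is why the stock cooler is merely “worth replacing” rather than urgent.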

By changing the stock cooler for a different heat sink, I could probably overclock the entire rig safely to about 3.8 GHz (although I have managed to get one to 4.3 GHz running at around 73 degrees – but I am not interested in that!).

Did I get all my old VMs back?

The good news was that, although the internal hard disk drives from the old server were – like the rest of the old kit – dead, I had stored all of my VMs on USB 3 external hard drives (long story). All of them worked perfectly when connected to the new rig, so I was able to copy the content to the new 2 TB disks with no issues.

The Case (Enclosure)

I know there are other cool parts in this rig that perhaps deserve a mention, but the case is, to say the least, very cool (both from a looks and, well, a cooling perspective).

Normally I would always buy Thermaltake cases and power supplies, but the vendor I went to did not seem to have any in stock, so I opted for the Antec Dark Fleet DF-85 ~ all I can say is wow, I was not disappointed!

It has 7 fans, all adjustable; cleanable fan filters; provision for hot-swap SATA; a 2.5-inch SSD hot-plug bay at the top; and it is really easy to work with.

A really amazing bit of kit, and it runs with the associated parts so well – I can really recommend it.

As you can imagine – I am really pleased with the whole setup – and the best bit of all is that I can continue to write articles for all of you – perhaps just slightly faster!


  1. Nice, thanks for the new configuration guide. I had read your earlier config and was thinking of buying the same, but I was a little bit confused – thanks again.
    I just want to ask one thing: do you always keep your machine running, during office hours and at night?
    I am an Exchange beginner handling Exchange 2010 with 12 users in India.


    1. Hi Wasim, I used to keep the machine running overnight, but now, given the investment that I have made, I shut it down when I am finished (it also saves a little bit on the electricity!).

    1. Not sure what you mean – why didn’t I go with the i5 or what do I think of it?
      I bought the i7 2600 as it is a top-of-the-line CPU at a great price point that was within my range.
      Don’t get me wrong; I think the i5’s are great bits of kit – but having the choice I favoured the i7.

      1. Hi Andy, I checked the same configuration in the market and the price is very high. Can you tell me how much you paid for each part, please?


  2. I would like to know if you measured the noise level (dB) that your beast of a system produces. For me it is one of the big factors in building my own system and saving big money over any commercial solution; in my country there is nowhere you can take measurements of this kind of system. Could you please, if you did that exercise, provide us with the noise pressure level of your system when it’s running your eleven virtual machines?

    1. Juan, I will be honest – that was not a big consideration when I built the machine, mainly because that sort of thing does not bother me, having spent most of my 17 years in IT working on data centre floors. However, that being said – and by luck – it is almost silent, even at peak load.
      The case and CPU cooler have physical controls for the fans which I have set at a level where I have to check to see if the machine is actually turned on.
      The Ninja heat sink keeps the CPU well below the TJ max, and the other components are well within thresholds.


  3. Hi Andy,

    Hope you still monitor this thread.

    I was wondering if you have a more recent document containing an updated hardware specification that you would recommend when creating an Exchange 2013 test lab.

