Not a lot of updates this week mostly because I’ve been pretty busy with work.
Right now I’m front-loading the setup hours for a test that requires five Windows Server 2012 R2 servers. I’ll be using some of the Dell “PowerEdge” 1950 III servers I picked up from Dell for a series of tests I did for the State of Colorado in 2008.
The 1950s are a bit slow compared to the more modern hardware around here (dual Xeon E5405 CPUs and 16 GB of ECC DDR2), but they’re still workhorses and I use them for a lot of testing… Most of the three dozen of them I got for the above testing are still running the original 250 GB hard drives they came with, and some have been running 24/7 since that testing.
The company website runs from one of these 1950s in fact.
The test in question is another “Marketing Test,” where I’ll be doing third-party comparisons between products to generate pretty graphs the client can use to sell their software. And since none of the data of interest actually involves server performance, the 1950s will do just fine. 🙂
I’ve done a lot of these over the years; the first was for “NetZero” back in 2006 or so, where I had several ten-machine banks of Dell desktops running scripted browser instances, clocking the load times of a series of 20 web pages over various dialup providers… That was a crazy number of phone lines, and the modem cacophony on the redial phase was epic.
These comparison tests are easy to do mechanically, but the report process is always a bear… The client wants a report that says THREE MILLION PERCENT BETTER THAN BRAND X! — of course, but that’s rarely the case. And because I’m an engineer and facts don’t care about feelings, my reports are rarely good sales fodder.
Which is why these days I design the test, execute it, and gather the data, while the QA Director crafts the word salad that makes the client happy yet sticks to the findings. That works better for everyone involved.