Over the weekend I realized I have become English. No, I haven’t picked up the accent. Nor do I eat Marmite. Actually, eating Marmite can be relegated to one of those things I’d never do. Never, ever. So why have I become English? I tend to discuss the weather and temperature … a lot. I’m a friendly creature by nature, so I am always talking with people. And because no one is ever fully happy with the weather here (no matter what it is doing), I find that it is one of the safest topics for discussion. Everyone has an opinion and no one is wrong, except when you talk to the experts. Then I take their word as gospel. Case in point: I’ve moved to the other side of our office building, where we have no air-conditioning. For nine months of the year this is not a problem, but when it gets hot here, I boil. I used to complain about the heat until one of our gurus showed me a little trick: put the aircon in the meeting room at the end of the hallway on high and open all the doors. After an hour the cool air finally makes it to my end of the hallway, but only if no one books the meeting room. If they do, it’s back to suffering in silence.
I always find that ironic. After all, we are the developers of FloVENT, one of the industry’s best HVAC simulation tools for the design and optimization of airflow in structures. We have the tools to help us fix this problem, but like most experts, we tend to ignore our own home (just ask anyone in construction how many jobs they’ve got to do in their own house). So imagine my surprise when I came across a recently published whitepaper, “Data Center Design Using Improved CFD Modeling and Cost Reduction Analysis”. The paper was written by Messrs. vanGilder, Mikjaniec, Manning and Small and was presented at Semitherm last month. The fact that a whitepaper was written was not the surprising bit. What caught my eye was the topic: Mentor Graphics used CFD to optimize the design of two new data centers. Well, I’ll be … the experts listen to their own advice!
Mentor Graphics has data centers spread around the world. The company was interested in consolidating 20 small, local data centers in Europe and North America into just two. The objectives behind the consolidation were to increase capacity, efficiency and redundancy, as well as to reduce operational costs. With the current distributed design, server heat loads have been increasing by 33% per year. Simply by improving efficiency, the group thought they could bring that down to 24%, but that wasn’t enough. They were after a much bigger reduction, down to an annual increase of about 14%.
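To get a feel for why that difference matters, here’s a quick back-of-the-envelope sketch of how those annual growth rates compound. The five-year horizon and the 100 kW starting load are my own illustrative assumptions, not figures from the paper:

```python
def projected_load(initial_kw: float, annual_growth: float, years: int) -> float:
    """Compound an initial heat load by a fixed annual growth rate."""
    return initial_kw * (1 + annual_growth) ** years

# Hypothetical 100 kW starting load over five years (illustrative numbers only).
for rate in (0.33, 0.24, 0.14):
    kw = projected_load(100, rate, 5)
    print(f"{rate:.0%} annual growth -> {kw:.0f} kW after 5 years")
```

At 33% per year the load more than quadruples in five years (to roughly 416 kW in this sketch), while at 14% it less than doubles (roughly 193 kW) – which is exactly why trimming the growth rate was worth a serious design effort.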
The team considered several designs, among them a raised-floor design, a dropped-ceiling design, chimneys and hot-aisle containment. Their findings are absolutely fascinating. For example, one option would have increased the cost of construction by as much as 28%. Without the help of simulation, I doubt the team would have figured this out until well into the construction phase, when the construction team would have turned around and asked for more funds – that’s a surprise no one likes to be on the receiving end of.
If you find yourself needing to know more, this whitepaper is a must-read. You can get a copy of the whitepaper from IEEE through this link. I won’t spoil the ending for you, but here’s a glimpse into it: the chosen design will allow the North American data center to achieve 7,000 hours of free cooling per year. The remaining 1,760 hours would require only partial cooling. In anyone’s book that would translate into a tidy sum of money … something that is routinely celebrated here on this blog. Considering our ever-increasing reliance on data, there are lots of companies needing to increase capacity in their data centers and save costs. I just wonder how many other companies are going through this exercise too. However many there are, I hope that they too are using the tools that are out there to find an optimal solution for themselves. As for us, I’m ecstatic that the “physician” has taken his own advice, and by way of this blog I’d like to thank the experts. Well done Jim, Travis, Andy and Derrick!
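One small footnote on those cooling figures, as a sanity check of my own (not from the paper): 7,000 free-cooling hours plus 1,760 partial-cooling hours account for every one of the 8,760 hours in a year, meaning free cooling covers roughly 80% of annual operation.

```python
# Back-of-the-envelope check on the free-cooling figures quoted above.
free_hours = 7000      # hours/year with full free cooling
partial_hours = 1760   # hours/year needing partial mechanical cooling
hours_per_year = 24 * 365

# The two figures together cover a full (non-leap) year.
assert free_hours + partial_hours == hours_per_year

free_fraction = free_hours / hours_per_year
print(f"Free cooling covers {free_fraction:.0%} of the year")
```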
Until next time,