Year of the Green Data Centers, Part 2: Mechanical Ops
Paradoxically, although mechanical operations consume only about a third of the power required to operate a data center, this is where the largest energy savings can be found.
In addition, certain measures can be applied to both existing and new computer rooms.
From this perspective, we consider the basic principles of CRAC (computer room air conditioner) operation and the concept of hot and cold aisles: keeping cold and hot air separated, sealing all openings in the raised-floor tiles (if there is a raised floor) to avoid any loss of cold air, installing blanking panels in rack positions where there are no servers, and so on.
The purpose of these small improvements is to produce cold air efficiently; that is, if the cooling requirement in the room is 30 tons, produce 30 tons and not 40 tons, which is often the case in existing rooms because of their configuration and a lack of knowledge of these basic principles.
These small improvements alone can easily generate savings of 1 to 2 percent on air conditioning in a computer room.
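To put numbers on the 30-versus-40-ton example, here is a minimal sketch of the arithmetic; the conversion factor (1 refrigeration ton ≈ 3.517 kW of heat removal) is standard, and the tonnage figures simply restate the example above.

```python
# Rough arithmetic for the example above: how much cooling is produced
# beyond what the room actually needs when the CRACs are over-provisioned.
TON_TO_KW = 3.517  # 1 refrigeration ton ~ 12,000 BTU/h ~ 3.517 kW of heat removal

required_tons = 30   # what the IT load in the room actually calls for
produced_tons = 40   # what a poorly configured room often ends up producing

excess_kw = (produced_tons - required_tons) * TON_TO_KW
print(f"Excess cooling: {excess_kw:.0f} kW "
      f"(~{produced_tons / required_tons - 1:.0%} more than the load requires)")
# -> Excess cooling: 35 kW (~33% more than the load requires)
```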
The next step up for an existing room is cold-aisle (or hot-aisle) containment, which directs the cold and warm air to their respective destinations: cold air through the cold aisles, and hot air back to the cooling units through the hot aisles.
In short, the goal is to make the best use of the cooling resources already in place, minimizing temperature problems while getting the most out of the air-conditioning units.
This containment approach reduces power consumption for a given load and often allows servers to be added without having to install a new cooling system.
Once the room has been brought up to the standards mentioned above, it is time to move to the final stage: raising the set-point temperature in the cold aisles. Recent trends and ASHRAE guidelines now allow for operating temperatures in the cold aisles above 80°F.
Naturally, this change can only be made if the servers in place are rated to operate at that temperature.
The mindset of the operations team is another big challenge in any company that wants to implement this measure, even though the approach has been proven for more than two years at Google, Yahoo and Facebook.
Free-Cooling
By far the largest savings, however, are found in the cooling method commonly known as “free-cooling”: using outdoor air or water to cool the data center instead of running mechanical refrigeration.
Water-side free-cooling is easier to apply at existing sites, provided the configuration of the air-conditioning system in place allows for certain modifications. A new site can accommodate either method, air-side or water-side, and the size of the facility and of the room will dictate which one is preferable.
If we assume and accept an operating temperature of 80°F in the cold aisles, free-cooling could be used for more than 80% of the hours of the year in Montreal. The energy savings are significant: approximately 75% compared with a standard computer-room design.
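As a rough illustration of where a figure like “80% of the time” comes from, the sketch below simply counts the hours in a typical weather year when the outdoor air is cold enough to carry the load. The weather data and the 5°F approach margin are illustrative assumptions, not design values.

```python
# Sketch: estimate the share of the year an air-side economizer could carry the load,
# given a list of hourly outdoor dry-bulb temperatures (in °F) for a typical year.
def free_cooling_fraction(hourly_temps_f, cold_aisle_set_point_f=80.0, approach_f=5.0):
    # Outdoor air is counted as usable whenever it sits comfortably below the
    # cold-aisle set point; the 5 °F margin is an illustrative assumption.
    usable_hours = sum(1 for t in hourly_temps_f if t <= cold_aisle_set_point_f - approach_f)
    return usable_hours / len(hourly_temps_f)

# Fed with a typical-year weather file for Montreal, this kind of calculation
# returns a value on the order of 0.8, i.e. free-cooling for roughly 80% of the hours.
```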
Depending on the type of free-cooling installed, the return on investment comes in approximately 18 months.
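That figure is a simple-payback calculation; the capital cost and annual savings in the sketch below are placeholder numbers chosen only to show how an 18-month payback would arise, not data from an actual project.

```python
# Simple payback: months needed to recover the free-cooling retrofit cost
# from the resulting energy savings.
def payback_months(capital_cost: float, annual_savings: float) -> float:
    return 12.0 * capital_cost / annual_savings

# Placeholder figures for illustration only (not project data):
print(payback_months(capital_cost=150_000, annual_savings=100_000))  # -> 18.0 months
```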
Power Redundancy: A Word of Caution
Although we stress the importance of redundancy and concurrent maintainability in the data centers we design and in meetings with our clients, these two concepts make the goal of saving energy harder to reach.
Indeed, having redundancy means having equipment (mostly electrical) operating outside its range of optimal performance, since the load is shared across more units than are strictly needed. It is therefore very important to validate the configuration of the system and its impact on energy savings.
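To illustrate the effect, the sketch below compares a single UPS running near full load with a 2N pair sharing the same IT load at roughly half load each. The efficiency values are hypothetical placeholders, not vendor data; the point is the shape of the effect, lower part-load efficiency and therefore higher losses.

```python
# Hypothetical illustration: the same 500 kW IT load carried by one UPS near full load
# versus a 2N pair at ~50% load each. Efficiency values are made up for illustration;
# real part-load curves come from the UPS vendor.
it_load_kw = 500.0

eff_at_90pct_load = 0.95   # hypothetical efficiency of a single unit near full load
eff_at_45pct_load = 0.92   # hypothetical efficiency of each unit in a 2N pair

losses_single_kw = it_load_kw / eff_at_90pct_load - it_load_kw
losses_2n_kw = it_load_kw / eff_at_45pct_load - it_load_kw

print(f"Single UPS losses: {losses_single_kw:.1f} kW, 2N losses: {losses_2n_kw:.1f} kW")
# -> Single UPS losses: 26.3 kW, 2N losses: 43.5 kW
```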
A Few Words on PUE
Although PUE (Power Usage Effectiveness) is very popular and a good measure of energy usage, it is of little help in developing an energy-saving plan.
Indeed, the PUE is the ratio of the total energy (kWh) consumed by the entire data center over 12 months to the energy consumed by the IT loads over the same period. In other words, if your whole data center consumes 2,500,000 kWh annually while the IT load consumes 1,000,000 kWh, you have a PUE of 2.5.
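The calculation itself is trivial, but here it is with the numbers from the example above for reference.

```python
def pue(total_facility_kwh: float, it_load_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy divided by IT energy
    over the same 12-month period."""
    return total_facility_kwh / it_load_kwh

# Numbers from the example above: 2,500,000 kWh total vs 1,000,000 kWh of IT load.
print(pue(2_500_000, 1_000_000))  # -> 2.5
```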
The goal is to get as close to a PUE of 1 as possible. The average PUE for a data center in operation today is between 2 and 3. The best in class (Google, Yahoo or Facebook) report a PUE of about 1.1, while the best in class among more typical operators (Bell, CGI, IBM) have a PUE of 1.2 to 1.5.
By contrast, a facility can be equipped with one of the least efficient UPS units on the market and still post an impressive PUE. As an extreme example, a room with a UPS that is only 80% efficient but that uses free-cooling all year round could still come in at a PUE of around 1.2.
PUE is meant as a tool for tracking and improving the efficiency of a given installation, not as an absolute measure of its energy efficiency.
To conclude, any energy-saving effort begins with an analysis not only of your needs but also of your computer room and of your existing electro-mechanical installations, in order to identify potential sources of savings. It is important, however, that these savings not come at the expense of the reliability and robustness of your facilities or of your customer service.
Michel Chartier is Quebec’s leading consultant for the design of Mission Critical Centres. He was the first engineer in Quebec to receive the Uptime Institute’s Accredited Tier Designer certification. His professionalism and the originality of the solutions he presents have earned him a solid reputation with large-scale clients. UPS representatives consult him when they need expert advice for their special projects. He has visited hundreds of sites, always taking into consideration the technology used by clients to evaluate their current and future needs. Driven by passion and curiosity, he keeps up with technology and new developments in his areas of expertise.
Also, be sure to check out Kelvin Emtech, and visit them on LinkedIn or Google+.