Application service providers periodically replace servers for obvious reasons: enhanced performance (improvements in processors, memory, and the like); reliability (newer equipment is, in theory, less likely to fail); and warranty concerns (older servers drop off maintenance contracts, and spare parts become unavailable).

These are all logical, rational reasons for undertaking a periodic server remediation program. But there is a hidden benefit organizations can realize when upgrading infrastructure: it helps the environment. While doubling key server metrics such as processing power, room for expansion, and available storage, you can also cut a single server's overall power consumption by up to 50 percent. Put another way, doubling performance while halving power draw roughly quadruples performance per watt. Yes, it's a win-win, unlike replacing that grilled chicken salad with a New York strip steak garnished with butter-laden garlic potatoes, where the gain in taste is offset by the cost in calories. Server upgrades can now be put in the class of red wine or dark chocolate: something enjoyable that may actually be good for you!

How is this possible? Several classes of innovation in the hardware industry contribute to these environmental gains. Some come from straightforward improvements to the servers themselves; others come from gains realized through other common projects, such as server virtualization or shifting applications to the cloud.