The need for cooling capacity is increasing globally. An expanding population and an ever-growing dependence on data increase the need for process cooling, centralized space cooling, and data center cooling. Meanwhile, in many places, water scarcity is a massive issue.
In conventional industrial cooling applications, water is critical to heat rejection. Cooling towers and most evaporative fluid coolers depend heavily on water to reject waste heat to the atmosphere.
Taking blowdown and drift losses into account, evaporative cooling systems use approximately 3 GPM of makeup water per 100 tons of cooling capacity. At that rate, a 1,000-ton industrial cooling application running around the clock at a constant year-round load consumes approximately 15.7 million gallons of water annually. As ambient air temperatures rise globally and the climate becomes more severe, the demand for water increases.
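The arithmetic behind that figure can be sketched as follows. This is a rough illustration only, assuming the ~3 GPM per 100 tons rule of thumb stated above (inclusive of evaporation, blowdown, and drift) and continuous 24/7 operation; the function name is illustrative, not from the article.

```python
MINUTES_PER_YEAR = 60 * 24 * 365  # 525,600 minutes


def annual_makeup_water_gal(tons: float, gpm_per_100_tons: float = 3.0) -> float:
    """Estimate annual makeup water (gallons) for an evaporative cooling
    system running continuously at a constant load."""
    flow_gpm = tons / 100.0 * gpm_per_100_tons  # makeup flow in gallons/minute
    return flow_gpm * MINUTES_PER_YEAR


print(f"{annual_makeup_water_gal(1000):,.0f} gallons/year")
# → 15,768,000 gallons/year, i.e. roughly the 15.7 million cited above
```

A 1,000-ton load draws about 30 GPM of makeup water, and 30 GPM over 525,600 minutes works out to just under 15.8 million gallons per year.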
Many facility managers – when they first learn of the volume of water an evaporative cooling system uses – look to dry cooling systems as the answer. This response is often idealistic, made before they're aware of the obstacles dry cooling systems pose for large facilities. Eliminating water from the cooling process entirely dramatically increases one or more of the following: total connected fan horsepower (and energy consumption), initial investment, and mechanical footprint, often to the point of being impractical or impossible. So, is there a happy medium?
Read the full article here.