Google's datacenter in Finland is going to be expanded, and at the same time they'll stop dumping the excess heat into the sea. They're building a heat recapture system which will feed the heat into the central heating loop of the city of Hamina, for free. The excess heat from Google will be enough to keep the houses of Hamina heated through the winter.
How long until Google makes them start paying for the heat? Of course they'll require them to use Google products to monitor them (in more ways than is told to the people). Google ONLY has Google's interest in mind. If you make a deal with a tech company you are making a deal with the devil, and the day will come when the devil changes the contract and demands its pound of flesh. That aside - for now this is good, because the heat isn't just wasted.
Then there is the Hetzner DC, which makes me lose all trust in the green babble they go on about. They literally built it on a property bordering the district heating piping going to a backup plant. The closest and arguably easiest hookup to a district heating system, and they went all air cooling! 😂
meanwhile in america a certain special interest group does everything they can to sabotage green energy efforts because it would cut into their private energy company profits a lil
We have a new Facebook data center here in New Mexico that Facebook has quadrupled the size of. Our state gave them free unlimited water. Between this and the Nestle bottling plant next door, I'm super worried about these 2 businesses lowering the water table in my desert town and maybe even depleting the aquifer.
Mexico naturally faces water scarcity and inequality. Approximately 70% of the country’s territory is close to day zero in terms of water availability. Despite a 15-year drought and low water levels in the Rio Grande, the City of Las Cruces, New Mexico, has a secure water future, having been carefully planned and managed since 1930. Las Cruces Utilities draws water entirely from approximately 72 wells located around the city, tapping into depths of up to 1,290 feet in the Mesilla and Jornada bolsons. They do not rely on surface water from the Rio Grande. The Nestlé bottling plant in Veracruz, Mexico, made significant investments in environmental technologies. It uses wastewater treatment systems to ensure 100% water recirculation, zero wastewater discharges, and zero waste to landfills. Additionally, the factory consumes 100% green electricity and utilizes a biomass boiler that generates energy from biological waste produced during the coffee process. Facebook will sell you a water tank.
The aquifer situation is really bad and almost never talked about, especially out in the American West, where you have unbelievable quantities of water being pulled out with no sign of stopping, and aquifers take centuries/millennia to refill through seepage. Here where I live in NZ, in many areas the aquifer water has been deemed UNSAFE TO DRINK because several decades of intensive fertiliser application for dairy farming have seeped down into the aquifer and raised nitrate levels to an unacceptable level. Very cool.
@@jimster1111 How bad are bottling plants really? The amounts of water involved seem rather minuscule. Just like water for drinking is much less than water for agriculture or industry.
Here's the thing. In NL we pay some 70% extra per energy unit. It is extra tax that is used for subsidies on renewables. So from that tax money the province is littered with wind turbines. So that should benefit the people, right? Wrong. There's a municipality that acts like a banana republic and brought a whole host of datacenters into their area. So our tax money is invested in wind and solar, but the benefits go to the companies that own the datacenters. This municipality is called Hollands Kroon. But that's not all. For over a year now the provincial water supplier has been issuing op-eds in papers and news outlets about a shortage of water due to salinisation of groundwater and use by the people living here. They threaten to cap our allowed water usage because we have the audacity to wash our cars and water the plants, and be antisocial and take showers. Bottom line: when corrupt aldermen squander the resources that we pay for, it is the people that get reamed for using energy and water.
@@FranceGaulGallia Not all of that 70% is tax for renewables, but the point stands. The government decided that it was necessary, and since the government is democratically elected, 'we' decided. Regarding the water usage, that is more nuanced. Water companies (semi-government, meaning they are extremely regulated private companies) are obligated to provide water for households first, as it is a primary living requirement. So capping water usage is unlikely and even impossible to enforce.
In Minnesota, we get free cooling for data centers when the ambient outside temp is below about 32 degrees. It works using an external machine called a "dry cooler" which is basically a large fan cooled radiator that cools the external water loop. It gets rid of enough heat to require no evaporative cooling during colder months, so evaporative cooling works in the summer and free cooling in the winter.
HVAC engineer here. If they would just choose to build these data centers in a cold enough climate, Alaska for example, they could cool the whole facility entirely with outdoor air practically year round. For certain climate regions, the IECC (International Energy Conservation Code) currently requires most HVAC systems to use an economizer. When outside air is sufficiently cool and dry, an enthalpy economizer works by mixing outdoor air with return air to achieve the desired supply air temperature, 55 degrees Fahrenheit. The only energy required is what is needed to power the fan motor. These systems save massive amounts of energy in colder months. Office, industrial, and large buildings require cooling year round and only need heat at the perimeter of the building and during morning start-up. I don't know why they choose to build these global data centers in hot deserts. By the way, an error in this video: outdoor air is generally cleaner than indoor air. That is why it is mandatory to have outdoor air brought in for ventilation in buildings that don't have operable windows. A ventilation schedule is one of the most basic things a code official will require on mechanical drawings. Outdoor air ventilation is so important, it often governs the sizing of equipment. A conference room, for example, requires 5 CFM per person of outdoor air, with an assumed 1 person for every 20 square feet, plus 0.06 CFM per square foot of outdoor air. That works out to 0.31 CFM per square foot, which means a 500 square foot conference room requires 155 CFM of outdoor air. The catch is, a typical HVAC unit can only cool a certain percentage of outdoor air vs return air in hot summer months. A safe rule of thumb is 20% for my climate zone. New units can sometimes do more than 25%, and dedicated outdoor air units can do 100%. That means a 500 square foot conference room would need 775 CFM total and would require two tons, or 24,000 BTU, of cooling.
That is far more cooling than the room will need to satisfy temperature.
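To make the ventilation arithmetic above concrete, here is a small sketch (editor's illustration; the rates are the commenter's figures, not authoritative code values - check ASHRAE 62.1 for the real tables):

```python
# Conference-room outdoor-air sizing, following the comment's numbers:
# 5 CFM/person, 1 person per 20 sq ft, plus 0.06 CFM per sq ft.

def outdoor_air_cfm(area_sqft, cfm_per_person=5.0, sqft_per_person=20.0,
                    cfm_per_sqft=0.06):
    people = area_sqft / sqft_per_person
    return people * cfm_per_person + area_sqft * cfm_per_sqft

def total_supply_cfm(oa_cfm, max_oa_fraction=0.20):
    # A unit that can only handle 20% outdoor air must move 5x that in total.
    return oa_cfm / max_oa_fraction

oa = outdoor_air_cfm(500)          # 155 CFM of outdoor air
supply = total_supply_cfm(oa)      # 775 CFM total supply
tons = supply / 400                # rule of thumb: ~400 CFM per ton of cooling
print(oa, supply, round(tons, 2))  # 155.0 775.0 1.94
```

This reproduces the comment's 155 CFM / 775 CFM / roughly two tons figures.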
For much of the work being done in a data center, locality matters. Latency becomes a huge issue, and latency doesn't just make responses slower, it literally limits the maximum transfer speeds. People are investigating how to use cold areas, but it's going to be very complex and require deep logistics to get around latency issues.
> why they choose to build these global data centers in hot deserts. Latency is a part of it. Using the Alaska example, Fairbanks to San Francisco introduces an additional ~50 ms latency (5,000 km), which is not acceptable for many applications. Building local means that 50 ms latency becomes 1-2 ms. From these companies' perspectives, if a problem can be solved with money (paying more for cooling, etc) it's not a problem.
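The ~50 ms figure above follows from the speed of light in fiber; a rough sketch (editor's illustration, assuming an idealized point-to-point route with no switching overhead):

```python
# Light in fiber travels at roughly 2/3 of c, i.e. ~200,000 km/s,
# or 200 km per millisecond. Real routes are longer and add hops.

C_FIBER_KM_PER_MS = 200.0

def round_trip_ms(distance_km):
    return 2 * distance_km / C_FIBER_KM_PER_MS

print(round_trip_ms(5000))  # Fairbanks -> San Francisco scale: 50.0 ms RTT
print(round_trip_ms(100))   # nearby metro data center: 1.0 ms RTT
```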
@@AlexanderHamilton-df9dj - Light traveling down fiber can be beaten across the Atlantic by Starlink in terms of latency. When the system is at full capacity it'll be able to beat all existing connections when you're hopping outside of your city/town (give or take). This is worth billions alone for stock trading etc. So expect the locality of some kinds of datacenter to become less important as more swarms launch.
Datacenters used to be central. Then the Internet boom happened and all the world pulls data from those centers. That doesn't scale, so data center bandwidth capacity is distributed to population centers that actually use it.
They set up this stuff in dry areas because evaporative cooling is way more efficient in dry air.
I can just about imagine trying to use evaporative cooling in humid Singapore.. I wonder how much of that evaporated water comes back down again in, say, Arizona? You could use salt water for evaporative cooling (where it's available) and essentially get desalination?
As I understand it, in the US a few of the SW desert states offer subsidies to tech companies to site their data centers in their territories, where groundwater is pumped out for cooling. Fossil fuels to generate power, groundwater depleted, to run the newest tech gold rush!
@@LazerDon271 Yes, I've worked with cooling towers; the water is regularly partially discharged and then topped up, especially with salt-water systems. And when discharged, it's high salinity and full of the chemicals you need to add to prevent corrosion/scaling/fouling etc. Plus the biocides. I remember a system that I used to test in Southern Italy. The operator told me that a few years before, a technician from Northern Europe set the hypochlorite injection to be intermittent to reduce waste during discharge. The condenser of the power plant lost efficiency over time so they had to stop everything for maintenance. They found the tubing coated with mussels inside.
I mean, this water problem is really a thermal dissipation problem as far as I can see - and since large-scale computing is limited in density by thermal dissipation, it just means that the "problem" is a foregone conclusion. Optimizing anything to improve thermal dissipation will just result in denser datacenters that pack more computing power, consume more energy, and run into the same thermal limits. They will never stop having this "problem" until a different bottleneck is hit or until water stops being the cheapest/densest thermal management option. At the end of the day, it all boils down to the same old engineering problems - reducing power consumption, using renewable generation, more efficient computation, etc. And again, it will lead to more powerful datacenters, not less thirsty datacenters.
@@ronjon7942 You may be wrong about that. AI is quite rapidly killing the internet by spamming social media with mountains of bad AI-generated content, which is being interacted with and boosted by AI-enabled bots.
Water is used for condensing cooling because it's cheaper than air-cooled radiators or massive ground loops of pipes, which would have almost no water use. I think the real problem when talking water policy with people is that numbers like billions of gallons sound large but don't mean much. Farmers use acre-feet of water (the amount of water needed to cover 1 acre in 1 foot of water); 1 acre-foot converts to 1,233 cubic meters of water, or 325,850 gallons. For context, the Mackenzie River in Northern Canada dumps an average flow of 10,000 cubic meters a second into the Arctic Ocean. Google's 4.3 billion gallons of water converts into 1,627 seconds of the average flow of the Mackenzie River, or about 27 min. All of Google's data center water need is met by half an hour of flow of a river that is almost unused for human activity. The total economic activity of the Big Mac might be expressed in millions of dollars, not tens of millions. The water issue and the fear of misusing water is a peak first-world problem. We can solve it by accepting a return of slightly warmer water back into a municipal system, negating all the issues; are you going to care that your cold tap water is 2 degrees warmer than it was before the data center used it for cooling? We could also force data centers to use closed-loop air cooling, in practice the same as your computer's air cooling, only with huge fans blowing massive amounts of air. We could have buried water loops dumping heat into the earth. Or just bite the bullet, build lots of cheap nuclear power, and run energy-intensive refrigeration systems. Cooling towers are the cheapest solution so long as water is legal to use. In terms of water use they are fantastically efficient and almost benign to the environment. That's why we use condensing steam turbines in power generation.
Cooling the steam to a condensation point pulls a vacuum on the turbine, which provides more monetary value of electrical energy than it costs to reheat the steam back up to its operating temperature or to pay for the water use.
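The unit conversions in the comment above check out; a few lines to verify (editor's illustration; the river flow and usage figures are the commenter's, not independently verified):

```python
# Acre-foot / gallon / cubic-meter conversions, using the comment's constants.
M3_PER_ACRE_FOOT = 1233.0
GALLONS_PER_ACRE_FOOT = 325_850
GALLONS_PER_M3 = GALLONS_PER_ACRE_FOOT / M3_PER_ACRE_FOOT  # ~264 gal per m3

def gallons_to_m3(gallons):
    return gallons / GALLONS_PER_M3

mackenzie_flow_m3_s = 10_000          # average flow, per the comment
google_use_m3 = gallons_to_m3(4.3e9)  # ~16.3 million cubic meters

seconds = google_use_m3 / mackenzie_flow_m3_s
print(round(seconds), round(seconds / 60, 1))  # ~1627 s, ~27.1 min
```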
This is exactly what I want to do as a career, not necessarily inhibit ai progress, but take an objective look at the problem with a realistic view and help make, for example, use of water cooling illegal. If you’re to do it, you’re to do it right
I know it is a minor detail, but the water being cooled in the cooling towers does not produce steam. That water evaporates into water vapor (the evaporation process is what cools the remaining water). Steam, by definition, is water vapor at a temperature above the boiling point of water, which is only possible if the pressure is higher than normal atmospheric pressure.
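As an aside on why evaporation is such an effective heat sink: the latent heat of vaporization sets the water cost of rejecting heat (editor's sketch using standard physical constants, idealized - it ignores drift and blowdown losses in a real tower):

```python
# How much water must evaporate to carry away 1 kWh of heat?
LATENT_HEAT_J_PER_KG = 2.26e6   # latent heat of vaporization of water
J_PER_KWH = 3.6e6

def water_kg_per_kwh_of_heat():
    return J_PER_KWH / LATENT_HEAT_J_PER_KG

print(round(water_kg_per_kwh_of_heat(), 2))  # ~1.59 kg (~1.6 L) per kWh
```

This is why evaporative cooling is simultaneously so water-hungry and so energy-cheap: each liter of evaporated water removes well over half a kWh of heat with almost no electrical input.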
I was a bit unsure about this subject. 'Vapor' is the gas phase of water, so it should technically be able to reach any temperature over 100 C until it phases over into plasma, right? Guess it doesn't help that in my language 'steam' and 'vapor' are the same word: damp.
@@megalonoobiacinc4863 Practically, you would be right. Vapor pressure is a function of temperature. You could run a steam turbine from 50C water; your pressure would be a miserable 0.12 atm. Assuming the cold side is 20C, your Carnot efficiency would be around 10%.
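The Carnot figure quoted above works out as follows (editor's sketch; this is the theoretical ceiling, real turbines do worse):

```python
# Carnot efficiency for a 50 C hot side and 20 C cold side - the reason
# low-grade data center heat can't usefully drive a turbine.

def carnot_efficiency(t_hot_c, t_cold_c):
    t_hot_k = t_hot_c + 273.15
    t_cold_k = t_cold_c + 273.15
    return 1 - t_cold_k / t_hot_k

print(round(carnot_efficiency(50, 20), 3))  # ~0.093, i.e. about 9-10%
```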
@@-yttrium-1187 old steam ships did that with their multistage engines and turbines such that the last stage was below atmospheric pressure. The Titanic started with 215psi @394F to the HP cylinder and the output of the LP cylinders was 9psi @ 188F which fed the turbine and a final output of 1psi @102F.
Evaporation also returns the water to the water cycle, which increases our planet's albedo, so it is not a major climate problem regarding the 'greenhouse' effect (which is not the same as a greenhouse anyway, but you know). The real worry, if there is one, is "crypto", with an energy footprint larger than the IEA's global datacenter usage figure: straight convective heating of the atmosphere with lost thermal mass, absolutely insane stuff compared to regulated datacentres.
Here in Germany (esp. in the Frankfurt area) we're working on using the waste heat for communal heating, as you said. The trick is the combination with a (big) heat pump instead of direct heating, because then the pumped medium can be cooler and transported farther while maintaining the overall energy capacity. Additionally, fluid-to-fluid exchange is much more efficient! Imho we can no longer afford to let the heat from those giant data centers go to waste!
Relying on rivers for cooling or producing electricity is generally stupid. The same year the Russia-Ukraine war started, France had a drought which led to low flows in their rivers. That meant many of their nuclear reactors were shut down. Combined with the new natural gas supply issues because of the war, this led to extremely high electricity prices.
One test which I was a part of as a chemical engineer was simply rejecting waste heat to the atmosphere without the aid of cooling towers. There were two cooling loops in the data center: one internal, and one that transfers heat from the internal loop to the external evaporating-condensing loop. On the external loop, different refrigerants were used; in some cases the refrigerants were special mixtures developed specifically for this task. When passed through compressors arranged in series, rather than in parallel, each compressor in the series steadily increases the temperature, and around the 3rd or 4th compressor phase change starts occurring, from gas to liquid. We were able to achieve temps as high as ~250C at the end of the compressor series. Since the delta between ambient temperature and the liquefied refrigerant was so great, absolutely amazing rates of heat transfer were achievable, and by the time the liquid refrigerant cooled down to about 100C, we would send it back to the evaporating loop to pull heat from the internal loop. 1. The cooling fins of the condensing part of the external loop are exposed, requiring frequent cleaning and maintenance. This can probably be automated for large-scale operations. 2. Refrigerant decay occurs due to wide swings in temperature after a few thousand passes through the evaporating-condensing loop. This needs better research and development work on different types of refrigerants. 3. The external loop needs to be much sturdier, which increases the thickness of the tubing on the condensing side to handle the absolutely crazy high pressures for HVAC. More fins and better materials, namely copper, are preferred. With current copper prices, not so cheap. 4. This absolutely requires higher electricity demand than a process which involves evaporating off some water to get rid of the heat.
One proposed solution was to have mega-clusters of data centers with local battery backups. E.g. Google, FB, and Microsoft build data centers in Wyoming and power them with the wind energy there, with a gigantic battery backup so operations run 24x7, and the wind turbines and battery backup are managed by a company that Google, FB, and Microsoft create together. This has the disadvantage of bringing talent too close to competition, and one unnamed company outright refused to even consider it. 5. This is difficult work, as there are no known standards anywhere for this design, so whoever does this first will probably be writing the standards or have great influence on them. 6. Unless multiple companies come together, or someone is willing to give this 100% commitment and establish the research and legwork, this is too expensive for any one company to spend that much in R&D and capex.
R12 as a refrigerant is durable but ozone-destructive; could it be designated for specialty use cases only, with exceptions, to provide cooling? Geo-cooling?
@the_expidition427 I know that is not viable in data centers; geothermal only works well in locations with a balanced heating and cooling load. If there is an imbalance, the system will warm the ground temps and make itself less efficient. The exception would be an aquifer, but that gets uncomfortable.
They are power hungry; in Ireland there is a massive number of these. A study estimated that by 2026, 35% of all power generation in the country will be required for data centers alone.
This is the real issue...We are giving these data centers old power plants....new power plants...new Super powerlines...new data collection tech...We sold our land water and frequencies to Big Data...We are dumb as F#_=
As someone who lives in Quebec, it always stuns me that more datacenters aren't built here. Our grid is 98% renewable, we have plenty of water, and our temperature is below zero half the year.
The problem is your awful internet infrastructure. The reason that Northern VA has so many data centers is that they can be directly connected to the big fiber-optic backbone that runs under the Northeast US, allowing for pretty low latency in the most important market in the world. Quebec’s problem is that outside of Montreal, Quebec is effectively just empty space as far as internet infrastructure is concerned. No big population centers, no big fiber trunk lines, no reason to put a data center there.
Building a data center in a non-centralized location adds a lot of unnecessary infrastructure costs. The amount of bandwidth these centers have to swallow cannot be overstated. We operate our servers mostly in Europe and had to do entire datacenter migrations just because the underlying infrastructure caused too much strain on everyone's bandwidth. (Paris has some horrendous infra and at peak hours suffers immense packet loss to certain European regions, for example.) So it's not as simple as just plopping it down in a cold region.
To say Arizona 'experiences droughts from time to time' is a huge understatement. The whole US southwest has been in a perpetual drought for over 10 years. If the current rate of water pulled from the Colorado River continues to grow, the river one day won't make it to the Pacific Ocean.
This is something that has become a problem here in Chile. Because of all the networks and infrastructure present in the country, and especially in the capital Santiago, the city is becoming the hub for the Southern Cone's Internet; that means there are several data centers now, with more in planning/building stages. In the next years Google wants to expand its current service while AWS and Microsoft build new facilities all around Santiago, but the capital is a water-stressed zone, far from important rivers or the ocean, and air quality is generally bad. That has created problems with the local community because of the consumption of water, to the point that Google's project got stuck in legal procedures to obtain its environmental permit, while Amazon had to move its DC outside the metropolitan area of Santiago to get the green light.
Commercial Laundries. Anything that can use Warm to Hot air or water. AWS (Amazon) could heat their Distribution centers as needed. All we need to do is PLAN an interlocking infrastructure. Hub - Spoke. Heat production at the Hub. Heat consumption out on the spokes.
@@c1ph3rpunk It's pretty cold here in the north just now at 12:28 hrs on 15th July, in so-called summer. People have their heating on in their homes, and greenhouses with heating have it running. The municipal composting site is running heat exchangers for nearby heating. He did mention in the video locating datacentres in more polar locations. British Sugar once co-located greenhouses with their factories to produce tomatoes, but they switched to a more 'tropical crop', aaiii. Some heat could be dumped in the sea and could support further industry such as alginate farming to mitigate environmental negatives. Lots of industrial processes need heat all year round, e.g. lumber drying.
There’s a startup here in the UK called Deep Green that has a more unique means of recycling that heat: heating swimming pools. Given how much more expensive it’s become in recent years for local councils to heat public swimming pools, it cuts their heating costs while also allowing for a great way to cool data centres installed beneath them!
I think this comment kinda misses the forest for the trees. You don't need to run DX CRAC units. Almost every midsize data center in my area (South Texas) just runs rows and rows of air-cooled screw chillers. Screw compressors are actually more efficient on their own than the centrifugal compressors more common on water-cooled chillers, but of course that's lost when there are so many motors running for a bank of air-cooled units as opposed to only a few water-cooled ones, and because DX compression is more energy intensive than simply pumping water around for evaporative cooling. Besides the savings on makeup water costs, a big advantage of large numbers of air-cooled units is the sheer redundancy of having dozens of machines that can easily bear the extra load should something fail.
@@jfbeam You'd be surprised how close in efficiency they are. On an annualized basis they're often nearly as efficient, because in the cooler months they run an augmented thermosyphon loop, but miss by just a few points. We spent months arguing with California about this and how the slight increase in energy use, traded for lower water usage, was well worth it, but they were hell-bent on forcing evaporative cooling despite being a desert.
You could easily use the waste heat from data centers to run absorption chillers, reducing the electricity and water usage needed to cool the buildings.
This reminds me of Rain World, minor spoilers but not too bad: In rain world one of the main plot premises is that these superstructure city-sized, absolutely gargantuan biomechanical supercomputers require a small sea's worth of water to cool themselves while running their computations and processes. The reason the game is called rain world is because the exhausted steam and evaporated water accumulates in such mass that it completely altered the world's global climate. What we're doing is a much smaller scale but still a massive amount of water being displaced and evaporated in places that water typically isn't found. I wouldn't be surprised if we notice effects on the local climates these massive data centers are located.
Depends on the operator. Some operators even use DIRECT evaporative cooling (where it can literally rain inside a data center) to minimize the energy costs.
We've tried using reclaimed water; it's an absolute mess. The chemicals used eat at all the cooling equipment, so you need a huge amount of post-treatment. Furthermore, when you evaporate that water it carries a lot of those chemicals into populated areas (i.e. Ashburn).
@ThylineTheGay lol but the ceo won't get hundred million dollars and shareholders won't get their $0.14 dividends who cares if the world is burning. Think of the stocks!
My company uses air handlers the size of 4 shipping containers (100 of them in the newest building designs) that put out 100k cfm. We run free cooling mode up to around 83F dry bulb outside air temp.
Your mention of Arizona brought to mind TSMC's construction of two fabs in that state. TSMC will use water extracted from the already depleted aquifer as part of its production process. The seven stages of washing the silicon wafers use considerable groundwater, and some of these stages need toxic chemicals. As part of their agreement with the state government, this water will be processed through a desalination plant and returned to the aquifer. How well the process is monitored for toxic chemicals returned to the aquifer depends on the vigilance of the state government. American environmental protection is not at the top of the list for excellence worldwide. It will be interesting to see how well they are policed when the new TSMC fabs become operational. Thanks for another excellent presentation.
This only happens because we have, historically and currently, dramatically underpriced water. When your water is nearly free, it makes sense to reduce your power bill (which costs you actual money) by consuming some water for evaporative cooling (which is close to free). Same thing with power plants; they use evaporative cooling because the water costs them a tiny fraction of what it's worth. It's entirely possible to have an electric grid that consumes no water whatsoever; renewables do this naturally, and even thermal power plants can be cooled in other ways (at the cost of consuming some of the produced electricity to drive the cooling). Data centers could entirely operate from air conditioning (to air or ground) or, even more efficiently as you note, with free-cooling (potentially coupled to watercooling inside the facility). As you pointed out, hard drives are one of the more temperature-sensitive components, and they'll still tolerate 45C or so just fine; well above ambient temperatures in most locations. There's no reason you even need to use air conditioning if you can transfer the heat from your components (which only need to be kept somewhere between 40C and 90C) directly to the surrounding air and get rid of the heat passively.
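The pricing asymmetry described above can be made concrete with a toy model (editor's sketch; every price and consumption figure below is a hypothetical placeholder chosen only to illustrate the shape of the trade-off, not real data):

```python
# Toy annual-cost comparison: evaporative vs dry (air-side) cooling.
# All inputs are illustrative assumptions.

def annual_cost(heat_mwh, kwh_overhead_per_mwh, price_per_kwh,
                liters_per_mwh, price_per_1000_l):
    energy = heat_mwh * kwh_overhead_per_mwh * price_per_kwh
    water = heat_mwh * liters_per_mwh / 1000 * price_per_1000_l
    return energy + water

heat = 100_000  # MWh of heat rejected per year (assumed facility size)

# Evaporative: little electrical overhead, lots of water (~1,600 L per MWh)
evap = annual_cost(heat, 50, 0.10, 1600, 1.0)
# Dry cooling: no water, but far more fan/compressor energy
dry = annual_cost(heat, 200, 0.10, 0, 1.0)
print(evap, dry)  # 660000.0 2000000.0 - cheap water keeps evaporative ahead
```

With water priced near free, the water term barely registers against the energy term; raise the water price an order of magnitude and the comparison flips, which is exactly the commenter's point about underpricing.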
I think a case could be made for establishing acres of greenhouses around such data-centers, where the heat could be used to grow various vegetables and fruits for nearby markets. This would obviate the need to air-freight at least some of such produce from distant countries, thereby helping to cut CO2 emissions, create jobs and maybe reducing prices of those products.
I paid patreon to see this video. Thermodynamics! When it comes to harddrives, the benefit isn't about the ultimate temperatures, but temperature fluctuations. Keeping HDDs at a steady temperature increases life. After all, they are precision machines. It's also worth noting that water consumption is modified by the cost savings, i.e. they don't want to build cooling towers to recycle water.
Back in the day, our hospital moved to a new data center, and when we were 100% operational, we lost a chiller followed by the backup (a rag stuck somewhere?? and evidently failover wasn't tested? Anyway…). It got really hot, and really humid, really quick. It was at the end of the day, so fortunately a lot of us sysadmins were still there, shutting down hundreds of VMs, the hardware, disk storage… etc, all in an effort to keep our AIX Epic EMR servers/storage up. The AIX was a giant p690 and the storage was a full rack of spinning disk - with two of them for redundancy (in the same freakin' room!! We lost that battle, but got them geographically dispersed later.). When the air temp hit 120 deg, I was cleared to shut them down, but then the chiller came back online. The point: we were losing disks all over the place, often at the start but gradually back to normal after over a year. Oh yeah, this happened in Wisconsin in the late fall. Not like we had windows to prop up some box fans, but still ironic a bit.
I'll never get bored of your self-commercial, not when it's always that short and straight to the point. Every other good creator on YouTube has included some form of sponsoring into their videos. Well, at least I can easily right-arrow it!
Even in the wetter Eastern US, these water demands are getting too high. Indiana is planning to pipe a ton of water from the Wabash to sustain industry in the middle of the state. Turns out there's nothing preventing them from pumping the river dry.
I am happy to see you directly equate power usage with water usage. Almost no one addresses that fact when they evaluate system types. Watching the video though, lots of small details are a bit off. They don't change the overall message, but worth mentioning regardless. It seems you're well versed in energy theory, but not the HVAC industry. - CRAC is normally pronounced "crack" - In the evaporative loop, there is normally a water-cooled chiller between the two loops. - The system you described with two loops primarily cooled by water is a water-side economizer. They are some of the most efficient systems in many climate and weather conditions. - Direct cooling with outside air should never come at the expense of dehumidification; the AC part of HVAC cools and dehumidifies. Often air-side economizers are programmed with enthalpy controls to prevent this issue. - All of the systems described here are part of the HVAC systems.
Well, when you take out the heat and use it to heat neighboring apartments and their hot water, you get much better power efficiency. The problem with legacy operators, like Google, is that they boast loudly about "how efficiently we can lose the heat", when the real question is "where can we dump this heat for re-use?". Google uses cold sea water and boasts about their "efficiency of losing heat energy!", while real and smart engineers pump the heat to the area where it can be used. Again, not too applicable in Singapore, but there are data centers, also Google data centers, in areas where heating is needed, and a lot of it in certain months of the year.
4:18... no way all those deer just happened to be standing there haha. Fantastic video as always Asianometry. Didn't think I'd be graced with a crash course on Datacenter cooling tonight! Well done sir.
Data Center HVAC Engineer here. Your description of the recooling stages is mostly outdated today. No datacenter that I know of uses an open secondary chiller; they have almost all been phased out. We use 2 closed loops: one for the whitespace, one as a recooling loop for the chillers. They can run in "free cooling", where the system just chills the water via plate heat exchangers without running compression-based cooling. Here in Europe we run free cooling around 60% of the year. If we need emergency cooling, we can switch to an adiabatic solution which uses water mist to increase the efficiency of the recoolers. Water cooling is nice, but except for some high-density solutions we build, the cost of use and maintenance isn't worth it. Everybody seems to hate on the energy used by datacenters, but what is the alternative? We are operating about 13 DCs with PUEs from 1.37 to 1.19. There is not much more to gain here.
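The PUE figures quoted above are just the ratio of total facility power to IT power. A minimal sketch, with hypothetical load numbers for illustration:

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power.
    1.0 would mean zero cooling/lighting/distribution overhead."""
    return total_facility_kw / it_load_kw

# Hypothetical site with 1,000 kW of IT load:
print(pue(1370.0, 1000.0))  # 1.37 -- compression cooling running
print(pue(1190.0, 1000.0))  # 1.19 -- mostly free cooling via heat exchangers
```

The closer the overhead shrinks toward zero, the closer PUE approaches its floor of 1.0, which is why the commenter says there is not much more to gain at 1.19.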
Cornell University has operated a deep lake cooling system since the year 2000. Cornell is close to Cayuga Lake, one of the Finger Lakes. These lakes are pretty deep - down near the bottom, the water temperature is consistently low. The system draws from the bottom, heat exchanges it with a closed loop which then sends chilled water up the hill to Cornell. All of Cornell's campus is cooled this way, as is part of the local school district. Cornell has a website with a lot of details, including annual greenhouse gas savings. Prior to the deep lake cooling system, Cornell chilled its water using a plant that was powered by... coal. There's another system using Lake Ontario water in Toronto. Anywhere you're close to deep water you'll find it nice and cool at the bottom, because cold water sinks. I believe there was a demo system in Hawaii using ocean water. The potential of the oceans for this kind of thing is massive, but salt water is an issue, and if you're not in a place that's already cold, you need to be near deep water. The potential of the Great Lakes for this kind of thing is enormous too. You'd think Iceland would be a natural place to site data centers. Inherently cold, near a lot of cold ocean water, tons of hydropower energy, etc.
Tons of earthquakes and volcanoes in Iceland, too. But yeah, you make excellent points, especially with the Great Lakes scenario. I wonder if data center consultants prefer a more steady state climate throughout the year, though, like Arizona, as opposed to a place like Wisconsin, with wild swings of temp and humidity during the same time frame.
I've thought that building massive data centers close to the Bruce Power site would make sense. They can produce a lot of clean power and use the lake to cool the data center. They already use the lake to cool the nuclear power plant. They would need to run large fiber cables there, but that would be pretty cheap. They can use the same right of way the power lines are using.
This is why the Pacific Northwest is such a good location for data centers, more specifically along the Columbia River. There are a number of dams, including Grand Coulee, along with large wind farms to provide cheap, locally sourced power, as well as the Columbia River itself to provide water when needed. Free cooling is also a well-known and well-used factor in these data centers, as unlike Arizona, Oregon and Washington are temperate with no lack of rainfall. That last bit should be taken with a pinch of salt, as east of the Cascades - Eastern Oregon and Washington - is high desert that gets plenty hot and sees far less rain than the temperate rainforest that makes up the west of the states, including the Cascades.
Many reasons. You want them near people for latency reasons, far away from each other for redundancy (single power outage doesn't affect all of them) and there is not as much power available in cold places. You'd need to burn fossil fuels to power them up there.
@1:15 lol... 5,000 servers is by no means anywhere near "hyperscale". You can fit 5,000 servers in roughly 6 or 7 rows of 15 racks; most people have more square feet in their homes than that. Maybe add another two zeros and you're getting close to hyperscale.
So datacenters are massively wasting water and heating the atmosphere and they don't really know at what temperature their components need to be??? WTF!!!
But we stupidly build datacenters where it's hot, because it's cheap to do so. So there isn't much use for all the waste heat in Arizona or the Southeast.
1:53 and that's a stupid calculation, because up to 30% of "IT load" is used for server fans. And I have data to prove it: our company builds water-cooled high-density servers where ALL cooling systems (water pumps, radiator fans, etc.) use only up to 6% (usually ~4%) of server power.
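The commenter's point - that server fans are billed as "IT load" and so flatter the PUE - can be made concrete with a rough sketch. The 1.2 reported PUE below is an illustrative number, not from the video; only the fan fractions come from the comment:

```python
def effective_pue(reported_pue: float, fan_fraction: float) -> float:
    """Recompute PUE treating internal server fans as cooling overhead
    rather than useful IT load. fan_fraction is the share of measured
    IT power that actually drives the servers' own fans."""
    useful_it = 1.0 - fan_fraction      # compute power left after fans
    return reported_pue / useful_it     # total power per unit of real compute

print(round(effective_pue(1.2, 0.30), 2))  # 1.71 -- air-cooled, 30% fan power
print(round(effective_pue(1.2, 0.05), 2))  # 1.26 -- water-cooled, ~5% pump power
```

Under these assumptions, a facility that reports a respectable 1.2 PUE is spending far more overhead per unit of actual compute when a third of the "IT load" is fans.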
It appears that power efficiency is unfortunately not really a design goal for many of these servers. They seem to lack many sensors, which leads them to run the fans at a much higher speed than they need to. The amount of energy saved could be very significant with better firmware design. I'm also very surprised that nobody has come up with the idea of centralized fans (i.e. one large fan to cool multiple servers). This would allow them to run silently and in a much more power efficient way I think.
Fantastic video. I recently saw a liquid cooling DC lab completed where liquid was either used for immersion or was piped through heatsinks on the processors. I set it up with a dual loop as per your examples in the video, but ran the hot process cooling loop through a dry cooler with fan assist. No need for evaporative cooling, as the fluid was coming out towards 40°C and the chips themselves - as you say - could run even hotter (60°C was a reasonable target). Look up a company called 'Iceotope', who are UK leaders in direct liquid cooling.
Worked in a datacenter with economizer chillers, and all racks were set up with pod-style cold aisle containment. In the winter we hit a PUE of 1.05; it was wild.
@PinkFZeppelin the reason it worked was that while it was collocation, that site was only cage customers (15+ cabinets). We worked with the customers to make sure all unused space in the racks was blanked off and had automatic louvered vent tiles that were tied into a pretty nice temperature monitoring system. The site had five 200-ton chillers when I was there and was installing the last two when I left. It was great working there, since everything was pretty much brand new, so we rarely had anything that would break.
I think you addressed another issue. Why do 2 golf courses use as much water as a datacenter. These datacenters serve millions of people and handle banking, communications etc. While these golf courses serve like 1000 members for leisure. 😊
Why not? It's easier to spend millions of gallons of water on a shitty golf course than to put some money into R&D to figure out how to make something eco-friendly that is nice to walk on at all times of the year, and the government certainly doesn't care. Water everywhere, even in the West, is a commodity, and treated as nothing more than dirt.
They need to stop evaporating water for cooling. Convert everything over to refrigeration and just use refrigerant to air heat exchangers or refrigerant to water (glycol and water) heat exchangers, just using the water to move the heat around. They’ll use a lot more electricity, maybe that’s why they don’t do this, water is cheap for now. I work at an industrial plant in the desert, they used to have evaporative chillers, we’ve recently had to switch over to refrigeration. The only water that’s used is make up for evaporation off of our holding pit and spillage.
There are also power plants that do not use evaporation chillers for their steam condensers. There are some in the desert not far from me that use air cooled condensers.
Joy Cone moved their production from Phoenix to Flagstaff (elevation 7,000 feet), because they were able to cool the cones down more quickly after baking with the cooler air at that altitude. Free cooling indeed, and when the wind blew the right direction, it smelled great.
This is why Finland has had lots of data centre investment. We go down to -15 to -30 C even in south - like in case of Hamina. On top of this Finland has lots of renewable and nuclear power keeping our electricity cheap and green on average. Also even the most remote bit of Finland is relatively close to urban development and good infrastructure.
Also a relatively new 144Tbps cable going to Rostock that cost 100m€. There is also planning phase now for the next cable C-Lion2 and also a cable going via Norway to the Atlantic, Ireland and the US.
There is a misconception I keep hearing and I want to know where this notion comes from, or it's a misconception as far as my understanding and that's this issue of power consumption inside electronics being turned into heat, as if this means if 100KW is being used at any given moment, it's THAT amount of power generating heat but for the LIFE of me I don't understand the physics of THAT one, so someone PLEASE enlighten me. My understanding of electric power is that it's a form of energy. Energy can be converted to do different things. For instance, in a simple home PC with a modern CPU and say that CPU has 10 billion transistors in it. What you are saying is that all power is converted to heat. What the hell was switching those transistors on and off? You can run electronics to where a unit is putting off almost no heat simply because you run it at low frequencies. The unit is still doing WORK, switching transistors on and off along with making other components work. Slow down a CPU to 1.5 GHz max and tell me if you need that big honking cooler on it. As far as I know, switching transistors on and off is REAL WORK. Computing is REAL WORK. Heat is generated, yes, but I can run a brand new Intel or AMD CPU and have it generate almost no heat and the efficiency of the circuit will be CRAZY GOOD, because I slowed it down. Heat is generated as a result of reactance in electronic circuits that have clock signals driving them, where a clock signal is very much an AC signal (I've put an O'scope to plenty clock signals). You also get SOME heat loss as a result of switching transistors on and off but for the LIFE of me once again I don't understand how that's 100% of the input power or you NEVER would have switched the transistors on or off, which is REAL WORK. So, there is SOME heat loss from switching transistors on and off. 500W being consumed is NOT 500W getting turned into heat. 
Get a small electric heater and run it around 500W and let me know if that's what your PC feels like and if it is I REALLY feel sorry for you. There is MORE heat loss from reactance in electronic circuits driven by high speed clock signals, WHEN you push that clock signal up to the point where the die starts on that exponential curve of inefficiency. Every process node has different characteristics, one is power efficiency at different clock speeds and that's an exponential curve, so the faster you clock the die, the faster the rate of change in inefficiency of the die. It's not linear, it's exponential. There is a point where pushing the clock signal faster isn't worth it because the power consumption is not worth the small gains in REAL WORK, the actual computing being accomplished.
@@simontist Dude it took POWER to turn on and off those billions of transistors. Duh? Once again people say statements like that and totally fail to realize a computer is doing REAL work. So, what's happening is power being used inside the computer is doing work (work uses energy, look it up) and has heat as a BY-PRODUCT. Power is energy. I worked on electronic systems for 20 years, having a good chunk of a computer science and electrical engineering degree which was required to work on 60s - early 80s computer systems. I COMPLETELY understand the equation P = IE. That doesn't change the fact that electronics CONSUME power when doing work. P does not = heat. P is the total power, no matter what that power did, in the very simple equation of P = IE.
@@johndoh5182 I guess I'm coming from a thermodynamics perspective where work has a specific meaning, that is force*distance. EDIT: Though I guess all those electrons getting moved around probably involves force and distance, it's all something like frictional losses.. the system as a whole doesn't do physical work.
Also in Arizona, we are a disaster-free zone. We have a ton of data centers here and lots of companies headquarter here. Data centers mainly use water chillers here, and we use treated wastewater for data center cooling. And a lot of data centers are putting on massive amounts of solar panels to offset their energy. Plus, we have a multi-grid with nuclear and hydroelectric, so all data centers in the Phoenix area are on both grids (separate owners, APS/SRP) as full redundancy, where each server and network device is fed from a UPS but has two outlets: one plugs into one UPS/grid and the other plugs into the other grid/UPS. Almost every other home has solar backfeeding the grid, and the natural gas demand plant is located in the Four Corners area.
I think we have a real issue with what we consider a necessity for data center usage: e-commerce: yes. Facebook, Instagram, Snapchat:not so much. The same people who wring their hands and clutch their pearls over global warming and water usage will spend hours doom scrolling on social media. Show you care by stopping with the memes, selfies and other nonsense that give you a dopamine rush when somebody gives you a "like". If I'm supposed to switch to an electric car to soothe your conscience, then put the phone down to soothe mine.
I once built a watercooled PC and overclocked it hard, was a nice bucket list item but honestly after doing maintenance on that thing for years, I developed a newfound love for a simple and low maintenance rig with mid range air cooled hardware. Currently rocking an i5-12400, RTX 4070, 32GB of DDR4 3600 C18, a SeaSonic PSU, a variety of storage, and a well ventilated case. Runs like a dream.
Well recently, I may or may not be involved in a data center project that not only is the largest in the world, but may or may not be filled entirely with graphics cards, not storage. Allegedly, the customer is trying to achieve AGI, basically a super intelligence, utilizing unfathomable compute power all running in parallel, and there is a race among tech giants to gain that kind of power. Or so I’ve heard. Allegedly.
I don't think you understand how tier one ISPs charge for traffic. There is tremendous economic incentive to locate data centers near data consumers. "You can get anything you want, but you ain't gonna get it for free"
I can't help but think that all this stuff we are doing to reduce "global warming" caused by burning fossil fuels and the like is just being undone by all the stuff we now use that generates massive amounts of heat. As for moving data centres to colder regions (anywhere closer to the Arctic) - is that just going to cause the ice to melt quicker? What we should be looking at is making all this tech run cooler and more efficiently... and cheap enough to replace old tech. (My home computers are old... but I can't afford to replace them with anything newer. Even when considering the costs, new kit would still cost more than what I could save in energy bills.)
Heat dissipates from earth through the atmosphere into space, CO2 prevents that. I'd guess that producing more heat wouldn't matter if we could remove all that excess CO2 from the atmosphere.
8:52 they're trying to restore their water sources, but the question is how are they doing it? They can take it from a water source that creates a new water problem in that area.
I liked the comment about 100% of data center energy is converted to heat. So it struck me that like the old saying about 'The statue was already there, I just removed the extra marble.' i think you could say 'The data was already there, I just removed the extra heat (in a very precise way).'
8:00 That's just nonsense. Would you have shipped trillions of liters of water from US to Africa if US didn't have datacenters? Classic environmentalist sentimental argument.
No, your HR data, sales data, inventory data, shipping data, credit card info, home address, purchase preferences, all user agreements, legal documents, compliance audits. Shall I go on? All your email, your daily calendar, the television shows you watch, the interface to the camera in your TV, the microphone in your phone. I could keep going.
xAI is building a new facility in Memphis, TN. Memphis sits on one of the largest, cleanest supplies of fresh water in the world. Memphis doesn't even filter the water coming out of the ground before it goes into our distribution system, but it has the cleanest water in the country according to the EPA. The xAI facility is expected to draw 1 million gallons per day.
Crypto is estimated to be 3% of global power use already. Who knows what AI is adding to that total. I wonder at what point/how the issue will be addressed? We'd have made significant progress already towards using all green energy if these new tech industry technologies didn't keep driving up global energy usage.
Proof of work crypto takes up probably 99% of the share which is bitcoin in this case as it is most dominant. Proof of stake is very minimal in terms of power usage.
Right, I think it’s time we switch off all the computers and return to a level of technological advancement that still forces us to think for ourselves.
lol literally warming up the ocean by taking the water out heating it and putting it back lol. Obviously it’s a drop in the pond probably has a near zero impact but still funny to think about.
It's important to remember that water is not consumed in the same way that fossil fuels are. Water is simply relocated from its natural source and released back into the environment.
apologies but this is at best an oversimplification - water sources rely on _replenishment_ and the actual replenishment processes for some sources can be measured in decades or centuries, or in some cases not at _all_. this is particularly true of groundwater sources which involve very slow processes and usually ENORMOUS (subcontinental, I do mean enormous) reservoir sizes meaning that the supply is for practical purposes, finite. loss to these sources in america is well documented. usage has increased SEVENTEEN times since 1940. 40% of tracked wells have posted record depletion levels in the last decade, and that in turn is reflecting in the exceptionally poor performance of cropland that largely overlaps almost perfectly with the most depleted and water-scarce areas. hell, I'm not even touching on rivers, lakes, reservoirs. freshwater usage, depletion and general management is a serious problem. 3%. Three percent.
@@humanbeing9079 your contribution is worth subjectively some amount, but categorically less than what OP posted. Do better than the people you criticise, and maybe your vague, uncontrolled efforts to self-validate will become praxis instead of poison.
It would be really interesting to build a massive facility to co-locate a powerplant, water desalination, and a data center all in one huge facility. Might be worth a study to see what theoretical efficiency savings there could be.
One thing to realize: data centers are likely the most efficient compute we can get. If they didn't exist, companies would build their own on-premise data centers, which are always more inefficient. So you either take this, inefficient decentralized data centers, or no computing at all.
Might be the worst recommended video ever. Big data, water. We in this channel are obssesed with water. Datacenters, water. Water. Hard disks need to be at the right temperature.. lol, what? 😆
Yeah, but when companies like Google or Meta preach about "muh climate change" and wasting water, it is a teeny-tiny bit hypocritical and ridiculous, don't you think?
We have a growing data center empire in Quincy, WA. They buy dirt cheap electricity from the Columbia River dams. I believe all the cooling water they use is relatively brackish ground water which is treated on site. You can see mineral ponds in the Google earth images. Every data center I'm familiar with operates with economizers (outside air) before using mechanical cooling whenever possible. Cold aisle temps typically kept around 80 degrees F with hot aisles at 100 degrees F. Fanwalls are the biggest innovation I've seen lately.
Do note that evaporating water is not required for power generation or cooling. It's just more efficient in drier locations and physically a much smaller footprint. Locally power generation has shifted to dry fluid coolers instead. (like a radiator in a car)
Many electrical power plants in downtown Detroit that used to burn coal, and have since been converted to natural gas, make extra income by selling the waste low-pressure steam from their generator turbines to the downtown business district office towers for heating in the wintertime. If you drive around downtown Detroit you see what the locals call Hell's steam vents releasing steam (I assume from leaky pipes) up through manhole covers into the local avenues. The city had its own power generating lighting plants that have since been sold off to DTE (the local power utility). These same generating plants supply electrical power year-round to the downtown business district. The expended steam is then returned to the same generating plants as condensate to be reheated back into superheated steam for powering the turbines. In the summer they supply chilled water from evaporative chiller units for downtown air conditioning. This shared system has been in place in Detroit since the 1920s.

I used to service DTE's data center, and they used to brag about how cheap their steam heating bills were in the wintertime. Steam heat is also used to thaw out sidewalks, stairways to office buildings, and some streets in Michigan. If data centers located further north in Michigan, they might find more customers for their waste heat and save water by recycling it back from heating customers. Many large Michigan cities already have the under-street steam tunnel infrastructure in place.

We also have a lot of fresh water around the Great Lakes area for cooling. The only condition is that if they draw water out, they need to filter it before they return it as run-off water, to maintain the lake and river levels. Water drawn from the Great Lakes runs cool, around 50 degrees, all year long, especially when the lakes freeze over. The only issue we have is a lack of new electrical generating sources.
Smaller scale nuclear plants that can be licensed more quickly might help to solve that issue. Many power plants in Michigan are called co-generation plants as they supply both electricity and local steam heat to their industrial customers. There is also a lot of unlit fiber installed in Michigan for transmitting data. Moving North makes more sense to me than locating in hot water scarce states like Arizona.
Google's datacenter in Finland is going to be expanded, and at the same time they'll stop dumping the excess heat into the sea. They're building a heat recapture system which will feed the heat into the central heating loop of the city of Hamina, for free. The excess heat from Google will be enough to keep the houses of Hamina heated during the winter.
Great idea!
How long until Google makes them start paying for the heat? Of course they'll require them to use Google products to monitor them (in more ways than people are told). Google ONLY has Google's interest in mind. If you make a deal with a tech company you are making a deal with the devil, and the day will come when the devil changes the contract and demands its pound of flesh.
That aside - for now this is good because the heat isn't just wasted.
Then there is the Hetzner DC, which destroys all trust in the green babble they go on about. They literally built it on a property bordering the district heating piping that runs to a backup plant.
The closest and arguably easiest hookup to a district heating system and they went all air cooling! 😂
Saw a video of a similar project in Paris. Some of the heat will be used to heat an Olympic pool that will be used in the Olympics.
meanwhile in america a certain special interest group does everything they can to sabotage green energy efforts because it would cut into their private energy company profits a lil
we have a new facebook data center here in new mexico that facebook has quadrupled the size of. our state gave them free unlimited water. between this and the nestle bottling plant next door im super worried about these 2 businesses lowering the watertable in my desert town and maybe even depleting the aquifer
Already happened in Mt Shasta, gl
Mexico naturally faces water scarcity and inequality. Approximately 70% of the country's territory is close to day zero in terms of water availability. Despite a 15-year drought and low water levels in the Rio Grande, the City of Las Cruces, New Mexico, has a secure water future, having been carefully planned and managed since 1930. Las Cruces Utilities draws water entirely from approximately 72 wells located around the city, tapping into depths of up to 1,290 feet in the Mesilla and Jornada bolsons. They do not rely on surface water from the Rio Grande. The Nestlé bottling plant in Veracruz, Mexico, made significant investments in environmental technologies. It uses wastewater treatment systems to ensure 100% water recirculation, zero wastewater discharges, and zero waste to landfills. Additionally, the factory consumes 100% green electricity and utilizes a biomass boiler that generates energy from biological waste produced during the coffee process. Facebook will sell you a water tank.
The age of watering your garden is over. The age of sinkholes is upon you🧌
The aquifer situation is something that is really bad and almost never talked about, especially out in the American West, where you have unbelievable quantities of water being pulled out with no sign of stopping, and aquifers take centuries or millennia to refill through seepage. Here where I live in NZ, in many areas the aquifer water has been deemed UNSAFE TO DRINK because several decades of intensive fertiliser application for dairy farming have been seeping down into the aquifer and have raised nitrate levels to an unacceptable level. Very cool.
@@jimster1111 How bad are bottling plants really? The amounts of water involved seem rather minuscule. Just like water for drinking is much less than water for agriculture or industry.
Here's the thing. In NL we pay some 70% extra per energy unit. It is extra tax that is used for subsidies on renewables. So from that tax money the province is littered with wind turbines. So that should be to the benefit of the people, right? Wrong. There's a municipality that acts like a banana republic and brought a whole host of datacenters into their area. So our tax money is invested in wind and solar, but the benefits go to the companies that own the datacenters. This municipality is called Hollands Kroon. But that's not all. For over a year now the provincial water supplier has been issuing op-eds in papers and news outlets about a shortage of water, due to salinisation of ground water and use by people living here. They threaten to cap our allowed water usage because we have the audacity to wash our cars, water the plants, and be so antisocial as to take showers. Bottom line: when corrupt aldermen squander the resources we pay for, it is the people that get reamed for using energy and water.
Why did you agree to pay 70% extra, and why don't your political representatives work for you?
Because Europe
Don't forget to recycle and that also climate change is caused by individuals and not corporations.
@@the_expidition427
Stench of European Socialism
@@FranceGaulGallia not all of that 70% is tax for renewables, but the point stands. The government decided that it was necessary, and since the government is democratically elected, 'we' decided.
Regarding the water usage, that is more nuanced. Water companies (semi government, meaning they are extremely regulated private companies) are obligated to provide water for households first as it is a primary living requirement. So capping water usage is unlikely and even impossible to enforce.
In Minnesota, we get free cooling for data centers when the ambient outside temp is below about 32 degrees. It works using an external machine called a "dry cooler" which is basically a large fan cooled radiator that cools the external water loop. It gets rid of enough heat to require no evaporative cooling during colder months, so evaporative cooling works in the summer and free cooling in the winter.
HVAC engineer here. If they would just choose to build these data centers in a cold enough climate, Alaska for example, they could cool the whole facility entirely with outdoor air practically year round. For certain climate regions, the IECC (International Energy Conservation Code) currently requires most HVAC systems to use an economizer. When outside air is sufficiently cool and dry, an enthalpy economizer works by mixing outdoor air with return air to achieve the desired supply air temperature, 55 degrees Fahrenheit. The only energy required is what is needed to power the fan motor. These systems save massive amounts of energy in colder months. Office, industrial, and large buildings require cooling year round and only need heat at the perimeter of the building and during morning start-up. I don't know why they choose to build these global data centers in hot deserts.
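The mixing an economizer does can be sketched as a simple sensible-heat balance. This ignores humidity, which a real enthalpy controller would also check, and the 75°F return temperature below is a made-up illustrative value:

```python
def outdoor_air_fraction(return_f: float, outdoor_f: float,
                         supply_f: float = 55.0) -> float:
    """Fraction of outdoor air to blend with return air so the mixed
    stream hits the supply setpoint (sensible mixing only)."""
    if outdoor_f >= supply_f:
        return 1.0  # outdoor air alone can't reach setpoint; take 100%
    return min(1.0, (return_f - supply_f) / (return_f - outdoor_f))

print(round(outdoor_air_fraction(75.0, 40.0), 2))  # 0.57 -- cold day, mostly return air
print(round(outdoor_air_fraction(75.0, 54.0), 2))  # 0.95 -- mild day, nearly all outdoor air
```

The colder the outdoor air, the smaller the fraction needed, which is why the only running cost on a cold day is the supply fan.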
By the way, an error in this video. Outdoor air is generally cleaner than indoor air. That is why it is mandatory to have outdoor air brought in for ventilation for buildings that don't have operable windows. A ventilation schedule is one of the most basic things a code official will require on mechanical drawings. Outdoor air ventilation is so important, it often governs the sizing of equipment. A conference room for example requires 5 CFM per person of outdoor air with an assumed 1 person for every 20 square feet and 0.06 CFM per square foot of outdoor air. That works out to 0.31CFM per square foot. That means a 500 square foot conference room requires 155 CFM of outdoor air. The catch is, a typical HVAC unit can only cool a certain percentage of outdoor air vs return air in hot summer months. A safe rule of thumb is 20% for my climate zone. New units can some times do more than 25% and dedicated outdoor air units can do 100%. That means a 500 square foot conference room would need 775 CFM total and would require two tons or 24,000 BTU of cooling. That is far more cooling than the room will need to satisfy temperature.
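The conference-room numbers above follow directly from those two ventilation rates. A quick sketch reproducing them (the ~400 CFM-per-ton figure is a common sizing rule of thumb, not from the comment):

```python
def outdoor_air_cfm(area_sqft: float, cfm_per_person: float = 5.0,
                    sqft_per_person: float = 20.0,
                    cfm_per_sqft: float = 0.06) -> float:
    """Required outdoor air: a per-person rate plus a per-square-foot rate."""
    people = area_sqft / sqft_per_person
    return cfm_per_person * people + cfm_per_sqft * area_sqft

oa = outdoor_air_cfm(500)     # 155.0 CFM of outdoor air for the 500 sq ft room
total = oa / 0.20             # 775.0 CFM total if the unit tolerates only 20% OA
tons = total / 400            # ~1.94 tons by the ~400 CFM/ton rule of thumb
print(oa, total, round(tons, 2))  # sized up to a 2-ton / 24,000 BTU unit
```

The ventilation requirement, not the room's temperature load, ends up driving the equipment size, which is exactly the comment's point.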
For much of the work being done in a data center, locality matters. Latency becomes a huge issue, and latency doesn't just make responses slower; it literally limits the maximum transfer speeds. People are investigating how to use cold areas, but it's going to be very complex and require deep logistics to get around latency issues.
> why they choose to build these global data centers in hot deserts.
Latency is a part of it.
Using the Alaska example, Fairbanks to San Francisco introduces an additional ~50 ms of round-trip latency (5,000 km), which is not acceptable for many applications.
Building local means that 50 ms latency becomes 1-2 ms.
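The ~50 ms figure is consistent with a round trip through fiber, where light travels at roughly c divided by the glass's refractive index. A back-of-envelope sketch (the 1.47 index is a typical assumed value, and it treats the route as a straight run):

```python
SPEED_OF_LIGHT_KM_PER_MS = 299_792.458 / 1000  # ~299.8 km per millisecond in vacuum
FIBER_INDEX = 1.47                             # typical refractive index of optical fiber

def fiber_rtt_ms(distance_km: float) -> float:
    """Round-trip time over a straight fiber run of the given length."""
    one_way_ms = distance_km * FIBER_INDEX / SPEED_OF_LIGHT_KM_PER_MS
    return 2 * one_way_ms

print(round(fiber_rtt_ms(5000), 1))  # ~49.0 ms -- Fairbanks to San Francisco and back
print(round(fiber_rtt_ms(100), 1))   # ~1.0 ms -- a nearby metro data center
```

Real routes are longer than the straight-line distance and add switching and queuing delay, so 50 ms is if anything optimistic for the Alaska case.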
From these companies' perspectives, if a problem can be solved with money (paying more for cooling, etc) it's not a problem.
@@AlexanderHamilton-df9dj - Light in fiber can be beaten across the Atlantic by Starlink in terms of latency. When the system is at full capacity it'll be able to beat all existing connections whenever you're hopping outside of your city/town (give or take). This is worth billions alone for stock trading etc.
So expect the locality of some kinds of datacenter to become less important as more swarms launch
@@AlexanderHamilton-df9dj @AncientTyrant Why would locality matter for GLOBAL data centers being used world wide?
Datacenters used to be central. Then the Internet boom happened and all the world pulls data from those centers. That doesn't scale, so data center bandwidth capacity is distributed to population centers that actually use it.
They set up this stuff in dry areas because evaporative cooling is way more efficient in dry air.
I can just about imagine trying to use evaporative cooling in humid Singapore..
I wonder how much of that evaporated water comes back down again in, say, Arizona?
You could use salt-water for evaporative cooling (where it's available) and essentially get desalination?
That generally creates a lot of maintenance challenges.
As I understand it, a few of the SW desert states in the US offer subsidies to tech companies to site their data centers in their territory.
Where groundwater is pumped out for cooling.
Fossil fuels to generate power, groundwater depleted, to run the newest tech goldrush!
Desalination is when you end up with fresh water. Evaporating salt-water results in you ending up with salt encrusted equipment. You're creating salt.
@@LazerDon271 Yes, I've worked with cooling towers, the water is regularly partially discharged and then topped up, especially with salt water systems.
And when discharged, it's high salinity and full of the chemicals you need to add to prevent corrosion/scaling/fouling etc. Plus the biocides.
I remember a system I used to test in Southern Italy.
The operator told me that a few years before, a technician from Northern Europe had set the hypochlorite injection to be intermittent, to reduce waste during discharge.
The condenser of the power plant lost efficiency over time, so they had to stop everything for maintenance. They found the tubing coated with mussels inside.
I mean, this water problem is really a thermal dissipation problem as far as I can see - and since large-scale computing is limited in density by thermal dissipation, it just means that the "problem" is a foregone conclusion. Optimizing anything to improve thermal dissipation will just result in denser datacenters that pack more computing power, consume more energy, and run into the same thermal limits. They will never stop having this "problem" until a different bottleneck is hit or until water stops being the cheapest/densest thermal management option.
At the end of the day, it all boils down to the same old engineering problems - reducing power consumption, using renewable generation, more efficient computation, etc. And again, it will lead to more powerful datacenters, not less thirsty datacenters.
Yep. The only thing that will make less thirsty data centers is less demand. Not happening. Demand for compute will only increase, probably forever.
@@ronjon7942 You may be wrong about that. AI is quite rapidly killing the internet by spamming social media with mountains of bad AI-generated content, which is being interacted with and boosted by AI-enabled bots.
Jevons paradox: increased efficiency leads to increased consumption 😂
Power plants have the same problem... the reject heat has to go somewhere.
They could use air cooling, it'd just be more expensive (and take more energy)
Water is used for condensing cooling because it's cheaper than air cooled radiators, or massive ground loops of pipes which would have almost no water use.
I think the real problem when talking water policy with people is that numbers like billions of gallons sound large but don't mean much. Farmers use the acre-foot (the amount of water needed to cover 1 acre to a depth of 1 foot), which converts to 1,233 cubic meters, or 325,850 gallons. For context, the Mackenzie River in Northern Canada dumps an average flow of 10,000 cubic meters per second into the Arctic Ocean.
Google's 4.3 billion gallons of water converts into 1,627 seconds of the Mackenzie's average flow, or about 27 minutes.
All of Google's data center water needs are met by half an hour of flow of a river that is almost unused for human activity. The total economic activity of the Big Mac might be expressed in millions of dollars, not tens of millions.
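For anyone who wants to reproduce the conversion, here is a sketch using the figures quoted in this thread (4.3 billion US gallons for Google, ~10,000 m³/s average discharge for the Mackenzie):

```python
# Putting "billions of gallons" into river-flow and acre-foot terms.

GAL_TO_M3 = 0.003785   # 1 US gallon is roughly 0.003785 cubic meters
ACRE_FOOT_M3 = 1233.0  # 1 acre-foot is roughly 1,233 cubic meters

google_m3 = 4.3e9 * GAL_TO_M3    # ≈ 16.3 million m^3
seconds = google_m3 / 10_000     # seconds of Mackenzie River flow
print(round(seconds), "s, about", round(seconds / 60), "min")
print(round(google_m3 / ACRE_FOOT_M3), "acre-feet")
```

That works out to roughly 1,628 seconds of river flow, or about 13,200 acre-feet, which is the kind of unit a farmer or water manager would actually reason in.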
The water issue and the fear of misusing water is a peak first-world problem. We could solve it by accepting a return of slightly warmer water into the municipal system, negating all the issues; are you going to care that your cold tap water is 2 degrees warmer than it was before the data center used it for cooling? We could also force data centers to use closed-loop air cooling, in practice the same as your computer's air cooling, only with huge fans moving massive amounts of air. We could have buried water loops dumping heat into the earth. Or just bite the bullet, build lots of cheap nuclear power, and run energy-intensive refrigeration systems.
Cooling towers are the cheapest solution so long as water is legal to use. In terms of water use they are fantastically efficient and almost benign to the environment. That's why we use condensing steam turbines in power generation: cooling the steam to its condensation point pulls a vacuum on the turbine, which provides more monetary value in electrical energy than it costs to reheat the steam back up to its operating temperature or to pay for the water use.
Best comment I've read to date. I thought cooling steam was wasteful.
Thermodynamics is fun!!!!!
This is exactly what I want to do as a career, not necessarily inhibit ai progress, but take an objective look at the problem with a realistic view and help make, for example, use of water cooling illegal. If you’re to do it, you’re to do it right
Is the cheap nuclear power in the room with us?
@@theowainwright7406 Depends on if you live in the US Midwest or not.
I know it is a minor detail, but the water that is being cooled in the cooling towers does not produce steam. That water evaporates into water vapor (the evaporation process is what cools the remaining water). Steam, per definition, is water vapor that has a temperature above the boiling point of water, as that is only possible if the pressure is higher than the normal atmospheric pressure.
I was a bit unsure about this subject. 'Vapor' is the gas phase of water, so it should technically be able to reach any temperature over 100 C until it phases over into plasma, right? Guess it doesn't help that in my language 'steam' and 'vapor' is the same word: damp.
@@megalonoobiacinc4863 Practically, you would be right. Vapor pressure is a function of temperature. You could run a steam turbine from 50 °C water; your pressure would be a miserable 0.12 atm. Assuming the cold side is 20 °C, your Carnot efficiency would be around 10%.
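The ~10% figure above follows directly from the Carnot limit; a minimal check (temperatures must be converted to an absolute scale first):

```python
# Carnot efficiency for a heat engine between two temperatures in Celsius.

def carnot_efficiency(t_hot_c: float, t_cold_c: float) -> float:
    t_hot_k = t_hot_c + 273.15
    t_cold_k = t_cold_c + 273.15
    return 1 - t_cold_k / t_hot_k

print(round(carnot_efficiency(50, 20), 3))  # ≈ 0.093, i.e. just under 10%
```

This is the theoretical maximum; a real low-pressure turbine would recover noticeably less.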
@@-yttrium-1187 old steam ships did that with their multistage engines and turbines such that the last stage was below atmospheric pressure.
The Titanic started with 215psi @394F to the HP cylinder and the output of the LP cylinders was 9psi @ 188F which fed the turbine and a final output of 1psi @102F.
@@megalonoobiacinc4863 Dry steam does exist.
Evaporation also returns the water to the water cycle, and the added cloud cover increases the planet's albedo, so it's not a major climate problem as far as the 'greenhouse' effect goes (which doesn't actually work like a greenhouse anyway, but you know). The real worry, if there is one, is "crypto", with an energy footprint larger than the IEA's global datacenter usage figure: straight convective heating of the atmosphere with huge thermal losses, absolutely insane stuff compared to the regulated datacentres going on there.
0:12 3 hospitals or 2 18-hole golf courses
The comparison we all can relate to 😂😂😂
How many streams of piss would that be?? haha I like the idea of using the preheated water for steam turbines, it's a win-win.
Americans will do anything to avoid the metric system 😂
All I got out of this video is how much water golf courses waste
if only my water bills were expressed in golf courses... how unfortunate they chose cubic metres instead!
@@brianjz This joke is being brought too far.
Like "30 raccon tails and 9/16th possum tongues" far.
Here in Germany (especially in the Frankfurt area) we're working on using the waste heat for communal heating, as you said. The trick is combining it with a (big) heat pump instead of direct heating, because then the pumped medium can be cooler and transported farther while maintaining the overall energy capacity. Additionally, fluid-to-fluid exchange is much more efficient!
Imho we can no longer afford to let the heat from those giant data centers go to waste!
The datacenters in The Dalles, in the Pacific Northwest, dump heat into the river and do considerable harm to the salmon populations.
Who cares as long as shareholder value is being created amirite? /s
Forests will die if the salmon fail
@@swaggitypigfig8413 seriously who cares, we eat millions of salmon every year. Where are people crying about that?
@@onagain2796 because "considerable harm to the salmon populations"
Relying on rivers for cooling or producing electricity is generally stupid.
The same year the Russia-Ukraine war started, France had a drought which led to low flows in their rivers. That meant many of their nuclear reactors had to be shut down.
That combined with the new natural gas supply issues because of the war led to extremely high electricity prices.
One test which I was a part of as a chemical engineer was simply rejecting waste heat to the atmosphere without the aid of cooling towers.
There are two cooling loops in the data center: an internal cooling loop, and an external evaporating-condensing loop that the internal loop rejects its heat into.
On the external loop, different refrigerants were used; in some cases the refrigerants were special mixtures or developed specifically for this task. They were passed through compressors arranged in series rather than in parallel, each compressor steadily increasing the temperature, and around the 3rd or 4th compressor is where the phase change from gas to liquid starts occurring.
We were able to achieve temps as high as ~250 °C at the end of the compressor series. Since the delta between ambient temperature and the liquefied refrigerant was so great, absolutely amazing rates of heat transfer were achievable, and by the time the liquid refrigerant had cooled down to about 100 °C, we would send it back to the evaporating loop to pull heat from the internal loop.
1. The cooling fins of the condensing part of the external loop are exposed, requiring frequent cleaning and maintenance. This can probably be automated for large-scale operations.
2. Refrigerant decay occurs due to wide swings in temperature after a few thousand passes through the evaporating-condensing loop. This needs better research and development work on different types of refrigerants.
3. The external loop needs to be much sturdier, which increases tubing thickness on the condensing side to handle the absolutely crazy high pressures. More fins and better materials, preferably copper, are needed, and with current copper prices that's not cheap.
4. This absolutely requires more electricity than a process that evaporates off some water to get rid of the heat. One proposed solution was mega-clusters of data centers with local battery backups: e.g., Google, FB and Microsoft build data centers in Wyoming and power them with the wind energy there, with a gigantic battery backup so operations run 24x7, the wind turbines and battery backup managed by a company that Google, FB and Microsoft create together. This has the disadvantage of bringing talent too close to competition, and one unnamed company outright refused to even consider it.
5. This is difficult work, as there are no known standards anywhere for this design, so whoever does this first will probably write the standards, or at least have great influence on them.
6. Unless multiple companies come together, or someone is willing to give this 100% commitment and establish the research and legwork, it is too expensive for any one company to spend that much on R&D and capex.
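One reason the high-temperature cascade described above pays off: convective heat rejection scales roughly linearly with the temperature difference to ambient (Q = U·A·ΔT). A toy illustration; the U and A values are made up for demonstration and have nothing to do with the actual test rig:

```python
# Newton's-law-of-cooling sketch: hotter working fluid, faster heat rejection.

def heat_rejected_kw(u_w_m2k: float, area_m2: float,
                     t_refrigerant_c: float, t_ambient_c: float) -> float:
    """Heat rejected across a condenser surface, in kW (Q = U * A * dT)."""
    return u_w_m2k * area_m2 * (t_refrigerant_c - t_ambient_c) / 1000

# Same hypothetical coil (U = 50 W/m^2K, A = 100 m^2), same 40 C ambient:
hot = heat_rejected_kw(50, 100, 250, 40)   # 1050 kW with a 250 C fluid
warm = heat_rejected_kw(50, 100, 80, 40)   # 200 kW with an 80 C fluid
print(hot / warm)                          # 5.25x more heat rejected
```

This is why pushing the refrigerant to ~250 °C lets a dry condenser shed heat fast enough to compete with evaporative cooling, at the cost of the compressor electricity noted in point 4.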
R12 is durable as a refrigerant, but it's ozone-destructive and restricted to specialty use cases with exceptions. Could it be designated for this, to provide cooling? Geo-cooling?
You perfectly described what Supermicro is offering in the US…
@the_expidition427 I know that; it's not viable in data centers. Geothermal only works well in locations with a balanced heating and cooling load. If there's an imbalance, the system will warm the ground and make itself less efficient.
The exception would be an aquifer, but that gets uncomfortable.
They are power hungry; in Ireland there is a massive number of these. A study estimated that by 2026, 35% of all power generation in the country will be required for data centers alone.
Gotta feed that good ol' silicon!
This is the real issue...We are giving these data centers old power plants....new power plants...new Super powerlines...new data collection tech...We sold our land water and frequencies to Big Data...We are dumb as F#_=
As someone who lives in Quebec, it always stuns me that more data centers aren't built here. Our grid is 98% renewable, we have plenty of water, and our temperature is below zero half the year.
The problem is your awful internet infrastructure. The reason that Northern VA has so many data centers is that they can be directly connected to the big fiber-optic backbone that runs under the Northeast US, allowing for pretty low latency in the most important market in the world. Quebec’s problem is that outside of Montreal, Quebec is effectively just empty space as far as internet infrastructure is concerned. No big population centers, no big fiber trunk lines, no reason to put a data center there.
Building a data center in a non centralized location adds a lot of unnecessary infrastructure costs. The amount of bandwidth these centers have to swallow cannot be understated.
We operate our servers mostly in Europe and had to do entire datacenter migrations just because the underlying infrastructure caused too much strain on everyone's bandwidth.
(Paris in Europe has some horrendous infra and at peak hours suffers immense packet loss to certain European regions, for example)
So it's not as simple as just plopping it down in a cold region
Paris can be considered non-central due to how its major European internet lines are set up.
Because clearly geographically it's in a good location
Microsoft is actually investing 500 million USD in data centers in the greater Quebec City area. Google it : )
When you say renewable what are you referring to? Solar?
To say Arizona 'experiences droughts from time to time' is a huge understatement. The whole US Southwest has been in a perpetual drought for over 10 years. If the current rate of water pulled from the Colorado River continues to grow, one day the river won't make it to the Pacific Ocean.
It hasn't made it to the Gulf of Mexico for 20 years.
@@daveeyes Technically it's the "Gulf of California"
Watching this while pooping in a large data center
Go and shout in the datacenter
Ahhh another Equinix customer
On the toilet i hope.
if you take 16 minutes to poop you should go see a doctor
This is something that has become a problem here in Chile. Because of all the networks and infrastructure present in the country, and especially in the capital Santiago, the city is becoming the hub for the Southern Cone's internet; that means there are several data centers now, with more in planning/building stages.
In the next few years Google wants to expand its current presence while AWS and Microsoft build new facilities around Santiago, but the capital is a water-stressed zone, far from major rivers or the ocean, and air quality is generally bad. That has created problems with the local community over water consumption, to the point that Google's project got stuck in legal procedures to obtain its environmental permit, while Amazon had to move its DC outside the metropolitan area of Santiago to get the green light.
wait what TWO golf courses?
Water consumption, runoff, and reuse has been greatly improved.
@@Katchi_is that greatly improved from 130 million gallons? Or greatly improved to 130 million gallons?
@@Peter-xx6tz to
Or 3 hospitals.
Wait, how many tubs of water or showers would that be? lol
Co-locate greenhouses with datacentres.
Commercial laundries. Anything that can use warm-to-hot air or water. AWS (Amazon) could heat their distribution centers as needed. All we need to do is PLAN an interlocking infrastructure. Hub and spoke: heat production at the hub, heat consumption out on the spokes.
Except during the time of year you have cooling costs, summer, you don't need heat in a greenhouse. In fact, you need cooling; fans work.
Community heating grids, industrial processing, etc
@@c1ph3rpunk smart thought
@@c1ph3rpunk It's pretty cold here in the north just now, at 12:28 on the 15th of July, in so-called summer. People have the heating on in their homes, and greenhouses with heating have it running. The municipal composting site runs heat exchangers for nearby heating. He did mention in the video locating datacentres in more polar locations. British Sugar once co-located greenhouses with their factories to produce tomatoes, but they switched to a more 'tropical crop', aaiii. Some heat could be dumped in the sea and could support further industry, such as alginate farming, to mitigate environmental negatives. Lots of industrial processes need heat all year round, e.g. lumber drying.
There’s a startup here in the UK called Deep Green that has a more unique means of recycling that heat: heating swimming pools.
Given how much more expensive it’s become in recent years for local councils to heat public swimming pools, it cuts their heating costs while also allowing for a great way to cool data centres installed beneath them!
They also have CRAC units that run on refrigerant instead of chilled water and cooling towers.
Yeap. And they're quite a lot less energy efficient.
@@jfbeam Oh no, so Google's "Bottom-Line" will shrink by a couple Million?! Anywho, fu*k em'
@@ligmasack9038 energy usage = greenhouse gas emissions. It's a choice between GHG or water consumption.
I think this comment kinda misses the forest for the trees. You don't need to run DX CRAC units. Almost every midsize data center in my area (South Texas) just runs rows and rows of air-cooled screw chillers. Screw compressors are actually more efficient on their own than the centrifugal compressors more common on water-cooled chillers, but of course that advantage is lost when so many motors are running across a bank of air-cooled units as opposed to only a few water-cooled ones, and because DX compression is more energy intensive than simply pumping water around for evaporative cooling. Beyond the savings on makeup water costs, a big advantage of large numbers of air-cooled units is the sheer redundancy of having dozens of machines that can easily bear the extra load should something fail.
@@jfbeam You'd be surprised how close in efficiency they are. On an annualized basis they're often nearly as efficient because in the cooler months they run an augmented thermosyphon loop, but miss by just a few points. We spent months arguing with California about this and how the slight increase trade in energy for water usage was well worth it, but they were hell bent on forcing evaporative cooling despite being a desert.
Would the person in charge of the computer room air conditioning system be referred to as "The CRAC Head"?
Get ready for big data to control your local water supply
You could easily use the waste heat from data centers to run absorption chillers, reducing the electricity and water usage needed to cool the buildings.
This reminds me of Rain World, minor spoilers but not too bad:
In Rain World, one of the main plot premises is that these city-sized, absolutely gargantuan biomechanical supercomputers require a small sea's worth of water to cool themselves while running their computations and processes. The reason the game is called Rain World is that the exhausted steam and evaporated water accumulate in such mass that they completely altered the world's climate. What we're doing is on a much smaller scale, but it's still a massive amount of water being displaced and evaporated in places water typically isn't found. I wouldn't be surprised if we noticed effects on the local climates where these massive data centers are located.
rain worl
That's crazy; no data center I've ever seen uses evaporative cooling. Every one I've ever been to uses air conditioners.
Depends on the operator. Some operators even use DIRECT evaporative cooling (where it can literally rain inside a data center) to minimize the energy costs.
You’ve probably never been to an actual data center then, just a lab. Some of these DC campuses literally pull GW of power …..GIGAwatts.
@@miamisasquatch It does not rain inside those facilities. They have outside air economizers to manage humidity.
the most shocking part of this analysis to me is that MORE THAN 50% OF THE WATER USED FOR COOLING IS POTABLE WATER....WTFF
We've tried using reclaim water, it's an absolute mess. The chemicals used eat at all the cooling equipment so you need a huge amount of post treatment. Furthermore when you evaporate that water it's carrying a lot of those chemicals into populated areas (i.e. Ashburn).
@@miamisasquatch then don't build the bloody datacentre. we have enough. really. people's LIVES should come first
@ThylineTheGay lol but the ceo won't get hundred million dollars and shareholders won't get their $0.14 dividends who cares if the world is burning. Think of the stocks!
@@etherealrose2139 why won't anyone think of the billionaires???
what kind of water would you prefer in your coolant pipes?
My company uses air handlers the size of 4 shipping containers (100 of them in the newest building designs) that put out 100k cfm. We run free cooling mode up to around 83F dry bulb outside air temp.
what's "cfm"?
@@krumuvecis cubic feet per minute
Your mention of Arizona brought to mind TSMC's construction of two fabs in that state. TSMC will use water extracted from the already depleted Aquafier as part of its production process. The seven stages of washing the silicon wafers use considerable groundwater, and some of these stages need toxic chemicals. As part of their agreement with the state government, this water will be processed through a desalination plant and returned to the aquifer. How well the process is monitored for toxic chemicals returned to the aquifer depends on the vigilance of the state government, and American environmental protection is not at the top of the worldwide list for excellence. It will be interesting to see how well they are policed when the new TSMC fabs become operational.
Thanks for another excellent presentation.
WTF is "Aquafier"?
Do you mean aquifer?
Bad look.
@@ronlipsius I'm not edykated like wot you is.
This only happens because we have, historically and currently, dramatically underpriced water. When your water is nearly free, it makes sense to reduce your power bill (which costs you actual money) by consuming some water for evaporative cooling (which is close to free). Same thing with power plants; they use evaporative cooling because the water costs them a tiny fraction of what it's worth.
It's entirely possible to have an electric grid that consumes no water whatsoever; renewables do this naturally, and even thermal power plants can be cooled in other ways (at the cost of consuming some of the produced electricity to drive the cooling). Data centers could entirely operate from air conditioning (to air or ground) or, even more efficiently as you note, with free-cooling (potentially coupled to watercooling inside the facility). As you pointed out, hard drives are one of the more temperature-sensitive components, and they'll still tolerate 45 °C or so just fine; well above ambient temperatures in most locations. There's no reason you even need to use air conditioning if you can transfer the heat from your components (which only need to be kept somewhere between 40 °C and 90 °C) directly to the surrounding air and get rid of the heat passively.
Once externalities are priced in, so much of modern capitalism ceases to function
@@manysnakes Or would function a lot better.
That is very interesting, thanks for sharing.
I think a case could be made for establishing acres of greenhouses around such data-centers, where the heat could be used to grow various vegetables and fruits for nearby markets. This would obviate the need to air-freight at least some of such produce from distant countries, thereby helping to cut CO2 emissions, create jobs and maybe reducing prices of those products.
I paid Patreon to see this video. Thermodynamics! When it comes to hard drives, the benefit isn't the ultimate temperature but the temperature fluctuations: keeping HDDs at a steady temperature increases their life. After all, they are precision machines.
It's also worth noting that water consumption is modified by the cost savings, i.e. they don't want to build cooling towers to recycle water.
Back in the day, our hospital moved to a new data center, and when we were 100% operational we lost a chiller, followed by the backup (a rag stuck somewhere?? and evidently failover wasn't tested? Anyway…). It got really hot, and really humid, really quick. It was at the end of the day, so fortunately a lot of us sysadmins were still there, shutting down hundreds of VMs, the hardware, disk storage, etc., all in an effort to keep our AIX Epic EMR servers/storage up. The AIX box was a giant p690 and the storage was a full rack of spinning disk, with two of them for redundancy (in the same freakin' room!! We lost that battle, but got them geographically dispersed later.). When the air temp hit 120 °F, I was cleared to shut them down, but then the chiller came back online.
The point: we were losing disk all over the place, often at the start but gradually back to normal after over a year.
Oh yeah, this happened in Wisconsin in the late fall. Not like we had windows to prop box fans in, but still a bit ironic.
Temperature fluctuations if severe and frequent enough can cause sockets, slots, and chips to break off the motherboards
I'll never get bored of your self-promotion, not when it's always that short and straight to the point. Every other good creator on YouTube has included some form of sponsorship in their videos. Well, at least I can easily right-arrow through it!
Even in the wetter Eastern US, these water demands are getting too high. Indiana is planning to pipe a ton of water from the Wabash to sustain industry in the middle of the state.
Turns out there's nothing preventing them from pumping the river dry.
Nothing indeed. Water is the most precious resource yet the governments continue to act like it's some abstract infinite water fall.
I am happy to see you directly tie power usage to water usage. Almost no one seems to address that when they evaluate system types.
Watching the video though, lots of small details are a bit off. They don't change the overall message, but they're worth mentioning regardless.
It seems you're well versed in energy theory, but not the HVAC industry.
- CRAC is normally pronounced "crack"
- In the evaporative loop, there is normally a water cooled chiller in between the two loops.
- the system you described with two loops primarily cooled by water is a water side economizer. They are some of the most efficient systems in many climate and weather conditions.
- direct cooling with the outside air should never come at the expense of dehumidification, the AC part of the HVAC cools and dehumidifies. Often air-side economizers are programmed with enthalpy controls to prevent this issue.
- all of the systems described here are a part of the HVAC systems
Well, when you take the heat out and use it to heat neighboring apartments and their hot water, you get much better overall efficiency. The problem with legacy operators like Google is that they boast loudly about "how efficiently we can lose the heat", when the real question is "where can we dump this heat for re-use?". Google uses cold sea water and boasts about its "efficiency at losing heat energy!", while real, smart engineers pump the heat to the surrounding area to be used elsewhere.
Again, not too applicable in Singapore, but there are data centers, including Google data centers, in areas where heating is needed, and a lot of it, for certain months of the year.
Where are you pumping the heat to be used in the spring, summer and fall?
4:18... no way all those deer just happened to be standing there haha. Fantastic video as always Asianometry. Didn't think I'd be graced with a crash course on Datacenter cooling tonight! Well done sir.
Data center HVAC engineer here. Your description of the recooling stages is mostly outdated today. No data center that I know of uses an open secondary chiller; they have almost all been phased out. We use two closed loops, one for the whitespace and one as a recooling loop for the chillers. They can be run in "free cooling", where the system just chills the water via plate heat exchangers without running compression-based cooling. Here in Europe we run free cooling around 60% of the year. If we need emergency cooling, we can switch to an adiabatic solution which uses water mist to increase the efficiency of the recoolers.
Water cooling is nice, but except for some high-density solutions we build, the cost to use and maintain isn't worth it.
Everybody seems to hate on the energy used by data centers, but what is the alternative? We are operating about 13 DCs with PUEs from 1.37 to 1.19. There is not much more to gain here.
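For readers unfamiliar with the metric: the 1.37-1.19 figures above are PUE (power usage effectiveness) values, total facility power divided by IT power, so 1.0 would mean zero cooling and distribution overhead. A quick sketch of what those numbers mean for a hypothetical 10 MW IT load (the load figure is illustrative, not from the comment):

```python
# PUE and the non-IT overhead it implies.

def pue(total_facility_kw: float, it_kw: float) -> float:
    """Power usage effectiveness: total facility power over IT power."""
    return total_facility_kw / it_kw

def cooling_overhead_kw(it_kw: float, pue_value: float) -> float:
    """Power spent on everything except the IT load itself."""
    return it_kw * (pue_value - 1)

# For a hypothetical 10 MW IT load:
print(round(cooling_overhead_kw(10_000, 1.37)))  # 3700 kW of overhead
print(round(cooling_overhead_kw(10_000, 1.19)))  # 1900 kW of overhead
```

Going from 1.37 to 1.19 nearly halves the overhead, which is why the commenter says there isn't much headroom left below ~1.2.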
'Water Credits' - smh.
Just as good as carbon credits!
Can you say "Shell game?"
Just found your channel. Thanks for the high quality explanations.
Also, good job YT algo for finding a channel I'm not already subscribed to.
Cornell University has operated a deep lake cooling system since the year 2000. Cornell is close to Cayuga Lake, one of the Finger Lakes. These lakes are pretty deep - down near the bottom, the water temperature is consistently low. The system draws from the bottom, heat exchanges it with a closed loop which then sends chilled water up the hill to Cornell. The entire of Cornell's campus is cooled this way, as is part of the local school district. Cornell has a website that has a lot of details including annual greenhouse gas savings. Prior to the deep lake cooling system, Cornell chilled its water using a plant that was powered by... coal.
There's another system using Lake Ontario water in Toronto. Anywhere you're close to deep water you'll find it nice and cool at the bottom because cold water sinks. I believe there was a demo system in Hawaii using ocean water.
The potential of the oceans for this kind of thing is massive, but salt water is an issue and if you're not in a place that's already cold, you need to be near deep water. The potential of the Great Lakes for this kind of thing is enormous too.
You'd think Iceland would be a natural place to site data centers. Inherently cold, near a lot of cold ocean water, tons of hydropower energy, etc.
Tons of earthquakes and volcanoes in Iceland, too. But yeah, you make excellent points, especially with the Great Lakes scenario. I wonder if data center consultants prefer a more steady state climate throughout the year, though, like Arizona, as opposed to a place like Wisconsin, with wild swings of temp and humidity during the same time frame.
Ive thought that building massive data centers close to the Bruce Power site would make sense. They can produce a lot of clean power and use the lake to cool the data center. They already use the lake to cool the nuclear power plant. They would need to run large fiber cables there but that would be pretty cheap. They can use the same right of way the power lines are using.
This is why the Pacific Northwest is such a good location for data centers, more specifically along the Columbia River. There are a number of dams, including Grand Coulee, along with large wind farms to provide cheap, locally sourced power, as well as the Columbia River to provide the water when needed. Free cooling is also a well-known and well-used factor in these data centers, as unlike Arizona, Oregon and Washington are temperate with no lack of rainfall. That last bit should be taken with a pinch of salt: east of the Cascades, aka Eastern Oregon and Washington, is high desert that gets plenty hot and sees far less rain than the temperate rainforest that makes up the west of the states, including the Cascades.
Why not have most of the data centers in cold regions?
Many reasons. You want them near people for latency reasons, far away from each other for redundancy (single power outage doesn't affect all of them) and there is not as much power available in cold places. You'd need to burn fossil fuels to power them up there.
Thanks!
@1:15 lol... 5000 servers is by no means anywhere near 'hyperscale'. You can fit 5000 servers in roughly 6 or 7 rows of 15 racks; most people have more sq ft in their homes than that... maybe add another two zeros and you're getting close to 'hyperscale'.
My data centre fits entirely in my cranium.
Another advantage to the desert is that nighttime temperatures can drop dramatically from the daytime temperatures.
Not here in AZ anymore. There are lots of data centers. They need to put them in Minneapolis.
Not with all the pavement; the low hits 97 °F if you're lucky.
So datacenters are massively wasting water and heating the atmosphere and they don't really know at what temperature their components need to be??? WTF!!!
But we stupidly build data centers where it's hot, because it's cheap to do so. So there isn't much use for all the waste heat in Arizona or the southeast.
Or it has to do with network infrastructure and latency.
In Spain, depending on water availability, approx. 70% of golf courses must by law use recycled water from wastewater plants.
1:53 and that's a stupid calculation, because up to 30% of "IT load" is used for server fans.
And I have data to prove it, because our company builds water-cooled high-density servers where ALL cooling systems (water pumps, radiator fans, etc.) use only up to 6% (usually ~4%) of server power
It appears that power efficiency is unfortunately not really a design goal for many of these servers. They seem to lack many sensors, which leads them to run the fans at a much higher speed than they need to. The amount of energy saved could be very significant with better firmware design. I'm also very surprised that nobody has come up with the idea of centralized fans (i.e. one large fan to cool multiple servers). This would allow them to run silently and in a much more power efficient way I think.
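To make the fan-power point above concrete, here's a quick sketch with hypothetical numbers showing how counting server fans as "IT load" flatters PUE (which is just total facility power divided by IT power):

```python
def pue(total_facility_kw, it_kw):
    """Power Usage Effectiveness: total facility power / IT power."""
    return total_facility_kw / it_kw

# Hypothetical site: 1000 kW of "IT load" plus 200 kW of facility overhead.
it = 1000.0
overhead = 200.0
print(pue(it + overhead, it))  # 1.2 as conventionally reported

# If 30% of that "IT load" is really server fans (i.e. cooling),
# moving it into the overhead column gives an "effective" PUE:
compute = it * 0.70
cooling = overhead + it * 0.30
print(round(pue(compute + cooling, compute), 2))  # 1.71
```

Same facility, same watts; only the bookkeeping changed.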
Fantastic video. I recently saw a liquid cooling DC lab completed where liquid was either used for immersion or was piped through heatsinks on the processors. I set it up with a dual loop as per your examples in the video, but ran the hot process cooling loop through a dry cooler with fan assist. No need for evaporative cooling as the fluid was coming out towards 40 °C and the chips themselves - as you say - could run even hotter (60 °C was a reasonable target). Look up a company called 'Iceotope' who are UK leaders in direct liquid cooling.
Really a best in class channel
Worked in a datacenter with economizer chillers, and all racks were set up with pod-style cold-aisle containment. In the winter we hit a PUE of 1.05; it was wild.
I worked at a small experimental site that did the same. Like a few MW. Never seen it when you get to what I’d consider a real production data center
@PinkFZeppelin the reason it worked was that while it was collocation, that site was only cage customers (15+ cabinets). We worked with the customers to make sure all unused space in the racks was blanked off and had automatic louvered vent tiles that were tied into a pretty nice temperature monitoring system. Site had five 200-ton chillers when I was there and was installing the last two when I left.
Was great working there since everything was pretty much brand new so we rarely had anything that would break.
I think you addressed another issue: why do 2 golf courses use as much water as a datacenter? These datacenters serve millions of people and handle banking, communications, etc., while these golf courses serve like 1000 members for leisure. 😊
Let's go golfing dude.
Because money.
Why not? It's easier to spend millions of gallons of water on a shitty golf course than to put some money into R&D and figure out how to make something eco-friendly that is nice to walk on at all times of the year; the government certainly doesn't care. Water everywhere, even in the West, is a commodity, treated as nothing more than dirt.
Can’t wait till they get so big it rains when they vent the steam 💀
You can't see steam.
That is just water vapor.
@@asbestosfibers1325 Yea, steam and water-vapor are essentially interchangeable, since they’re the same thing (but at different temps)
I’m just using ‘steam’ as a way to describe water that got so hot it turned into gas, even though the correct thing to say would be water vapor
They need to stop evaporating water for cooling. Convert everything over to refrigeration and just use refrigerant to air heat exchangers or refrigerant to water (glycol and water) heat exchangers, just using the water to move the heat around. They’ll use a lot more electricity, maybe that’s why they don’t do this, water is cheap for now. I work at an industrial plant in the desert, they used to have evaporative chillers, we’ve recently had to switch over to refrigeration. The only water that’s used is make up for evaporation off of our holding pit and spillage.
There are also power plants that do not use evaporation chillers for their steam condensers. There are some in the desert not far from me that use air cooled condensers.
Power generation also consumes water (he said so in the video), so it isn't as straightforward as just adding HVAC everywhere
@@Ubya_ I addressed that in my reply to my first comment.
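For scale on the thread above, a back-of-envelope on why evaporative cooling is so water-hungry, using only the latent heat of vaporization (a rounded, assumed constant; real towers also lose water to drift and blowdown):

```python
# Water evaporated per MWh of heat rejected, counting only the latent
# heat of vaporization (real towers also lose water to drift/blowdown).
latent_heat_mj_per_kg = 2.4   # approx. for water at ~30-40 °C
mj_per_mwh = 3600.0           # 1 MWh = 3600 MJ

kg_per_mwh = mj_per_mwh / latent_heat_mj_per_kg
print(f"~{kg_per_mwh:.0f} kg (roughly litres) of water per MWh of heat")
```

So roughly 1.5 tonnes of water evaporates for every MWh of heat a tower rejects, which is why dry coolers trade water for extra electricity.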
Joy Cone moved their production from Phoenix to Flagstaff (elevation 7,000 feet), because they were able to cool the cones down more quickly after baking with the cooler air at that altitude. Free cooling indeed, and when the wind blew the right direction, it smelled great.
Joy cone?
Are those the little flat bottomed soft-serve cones that taste like styrofoam?
15:00 1.21 Jigawatts? Are we still talking about data centers, or is there a heavily modified DeLorean hiding around here somewhere?
This is why Finland has had lots of data centre investment. We go down to -15 to -30 °C even in the south, as in the case of Hamina. On top of this, Finland has lots of renewable and nuclear power, keeping our electricity cheap and green on average. Also, even the most remote bit of Finland is relatively close to urban development and good infrastructure.
Also a relatively new 144Tbps cable going to Rostock that cost 100m€.
The next cable, C-Lion2, is now in the planning phase, along with a cable going via Norway to the Atlantic, Ireland and the US.
There is a misconception I keep hearing and I want to know where this notion comes from (or it's a misconception as far as my understanding goes): this issue of power consumption inside electronics being turned into heat, as if 100 kW being used at any given moment means THAT amount of power is generating heat. For the LIFE of me I don't understand the physics of THAT one, so someone PLEASE enlighten me.
My understanding of electric power is that it's a form of energy. Energy can be converted to do different things. For instance, in a simple home PC with a modern CPU and say that CPU has 10 billion transistors in it. What you are saying is that all power is converted to heat. What the hell was switching those transistors on and off? You can run electronics to where a unit is putting off almost no heat simply because you run it at low frequencies. The unit is still doing WORK, switching transistors on and off along with making other components work.
Slow down a CPU to 1.5 GHz max and tell me if you need that big honking cooler on it.
As far as I know, switching transistors on and off is REAL WORK. Computing is REAL WORK. Heat is generated, yes, but I can run a brand new Intel or AMD CPU and have it generate almost no heat and the efficiency of the circuit will be CRAZY GOOD, because I slowed it down.
Heat is generated as a result of reactance in electronic circuits that have clock signals driving them, where a clock signal is very much an AC signal (I've put an O'scope to plenty of clock signals). You also get SOME heat loss as a result of switching transistors on and off, but for the LIFE of me, once again, I don't understand how that's 100% of the input power, or you NEVER would have switched the transistors on or off, which is REAL WORK.
So, there is SOME heat loss from switching transistors on and off. 500W being consumed is NOT 500W getting turned into heat. Get a small electric heater and run it around 500W and let me know if that's what your PC feels like and if it is I REALLY feel sorry for you.
There is MORE heat loss from reactance in electronic circuits driven by high speed clock signals, WHEN you push that clock signal up to the point where the die starts on that exponential curve of inefficiency. Every process node has different characteristics, one is power efficiency at different clock speeds and that's an exponential curve, so the faster you clock the die, the faster the rate of change in inefficiency of the die. It's not linear, it's exponential. There is a point where pushing the clock signal faster isn't worth it because the power consumption is not worth the small gains in REAL WORK, the actual computing being accomplished.
Power = Voltage * Current. There's no energy stored in or outputted from the machine, so all electric power consumed is converted to heat.
@@simontist Dude it took POWER to turn on and off those billions of transistors. Duh?
Once again people say statements like that and totally fail to realize a computer is doing REAL work. So, what's happening is power being used inside the computer is doing work (work uses energy, look it up) and has heat as a BY-PRODUCT. Power is energy.
I worked on electronic systems for 20 years, having a good chunk of a computer science and electrical engineering degree which was required to work on 60s - early 80s computer systems. I COMPLETELY understand the equation P = IE. That doesn't change the fact that electronics CONSUME power when doing work. P does not = heat. P is the total power, no matter what that power did, in the very simple equation of P = IE.
@@johndoh5182 I guess I'm coming from a thermodynamics perspective where work has a specific meaning, that is force*distance. EDIT: Though I guess all those electrons getting moved around probably involves force and distance, it's all something like frictional losses.. the system as a whole doesn't do physical work.
Also, in Arizona we are a disaster-free zone. We have a ton of data centers here and lots of companies headquarter here. Data centers mainly use water chillers here, and we use treated wastewater for data center cooling. And a lot of data centers are putting on massive amounts of solar panels to offset their energy. Plus, we have a multi-grid with nuclear and hydroelectric, so all data centers are on both grids (separate owners, APS/SRP) as full redundancy here in the Phoenix area, where each server and piece of network equipment is fed from a UPS but has two outlets: one plugs into one UPS/grid and the other plugs into the other grid/UPS. Almost every other home has solar backfeeding the grid, and the natural gas demand plant is located in the Four Corners area.
So the game rain world is real?
This is how it starts. Next is bioengineered robot animals.
@@Arock-pu9zv bio engineering is a bop
I did a T-Mobile facility with the raised floors; they also have roughly 10 million conduits running everywhere. It's such a pain to paint between them.
I think we have a real issue with what we consider a necessity for data center usage. E-commerce: yes. Facebook, Instagram, Snapchat: not so much. The same people who wring their hands and clutch their pearls over global warming and water usage will spend hours doomscrolling on social media. Show you care by stopping with the memes, selfies and other nonsense that give you a dopamine rush when somebody gives you a "like". If I'm supposed to switch to an electric car to soothe your conscience, then put the phone down to soothe mine.
Yeah, e-commerce should be purely HTML with some images; that would cut down energy consumption a ton.
I like the straightforward reporting style of your videos
Ditch the golf courses. Total waste of water.
Golf is a wonderful sport.
Golf is a nice walk in the park ruined by a little white ball.
@@tonycosta3302 Is it the ball... or the cup?
Technically they don't use up the water, as it's not wasted, you dummies.
If it helps, they do use recycled water that's not up to human consumption standards. But… there are a LOT of courses in AZ.
I once built a watercooled PC and overclocked it hard, was a nice bucket list item but honestly after doing maintenance on that thing for years, I developed a newfound love for a simple and low maintenance rig with mid range air cooled hardware. Currently rocking an i5-12400, RTX 4070, 32GB of DDR4 3600 C18, a SeaSonic PSU, a variety of storage, and a well ventilated case. Runs like a dream.
Why do we need to do this again?
Well recently, I may or may not be involved in a data center project that not only is the largest in the world, but may or may not be filled entirely with graphics cards, not storage. Allegedly, the customer is trying to achieve AGI, basically a super intelligence, utilizing unfathomable compute power all running in parallel, and there is a race among tech giants to gain that kind of power. Or so I’ve heard. Allegedly.
@@NameSpaceVoid I also would like to marry Sydney Sweeney, although the chances of that are better than AGI being achieved in the next 10-20 years.
I wasn't paying any attention, and you mentioned REITs investing into Data centres.... THAT caught my attention.
I feel like they could use that heat energy to boil salt water and extract fresh water
This idea needs to be shared far and wide
BnMGFoundation might fund this
That's a ridiculous idea and not how industrial desalination works. That would be one of the most inefficient uses of energy.
@@czerskipExplain?
@@czerskip
*Ponders thermo*
Darn.
the fact that two golf courses use as much water as a major data center is crazy to me
Training AIs does not need to be co-located with customers, so those data centres can be anywhere
They do need good internet connection though. AI training involves moving a lot of data.
I don't think you understand how tier one ISPs charge for traffic. There is tremendous economic incentive to locate data centers near data consumers. "You can get anything you want, but you ain't gonna get it for free"
I work at a data center for Meta here in Tennessee. This is wild
I can't help but think that all this stuff we are doing to reduce "global warming" caused by burning fossil fuels and the like is just being undone by all the stuff we now use that generates mass amounts of heat.
As for moving data centers to colder regions (anywhere closer to the Arctic): is that just going to cause the ice to melt quicker?
What we should be looking at is making all this tech run cooler and more efficient.... and cheap enough to replace old tech.
(My home computers are old... but i can't afford to replace them with anything newer. Even when considering the costs, new kit would still cost more than what i could save in energy bills)
Global warming isn't caused by us releasing heat; it's caused by us releasing CO2, which prevents heat from escaping the Earth into space.
Heat dissipates from Earth through the atmosphere into space, and CO2 slows that down. I'd guess that producing more heat wouldn't matter if we could remove all that excess CO2 from the atmosphere.
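Rough numbers back this up: direct waste heat from all human energy use is tiny compared to the extra heat trapped by greenhouse gases. The figures below are approximate, order-of-magnitude assumptions, not measurements:

```python
# Order-of-magnitude comparison (assumed round figures):
earth_area_m2 = 5.1e14      # Earth's surface area
primary_energy_tw = 18.0    # global primary energy use, roughly 18 TW
waste_heat_w_m2 = primary_energy_tw * 1e12 / earth_area_m2

ghg_forcing_w_m2 = 2.7      # approx. anthropogenic greenhouse forcing

print(f"direct waste heat:  {waste_heat_w_m2:.3f} W/m^2")
print(f"greenhouse forcing: {ghg_forcing_w_m2} W/m^2")
print(f"ratio: ~{ghg_forcing_w_m2 / waste_heat_w_m2:.0f}x")
```

So the heat data centers dump directly is a rounding error next to the warming from the CO2 emitted to power them.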
8:52 they're trying to restore their water sources, but the question is how they're doing it. They could be taking it from another water source in a way that creates a new water problem in that area.
Nerds in cubicles excessively sweating causing the heat problem.
I liked the comment about 100% of data center energy is converted to heat. So it struck me that like the old saying about 'The statue was already there, I just removed the extra marble.' i think you could say 'The data was already there, I just removed the extra heat (in a very precise way).'
8:00 That's just nonsense. Would you have shipped trillions of liters of water from US to Africa if US didn't have datacenters? Classic environmentalist sentimental argument.
Recently here in Chennai, India, a lot of data centres are being deployed.
This video is useful for learning about water consumption in data centres.
And for what? P0rn, crypto, facebook, and surveillance. The internet had so much promise
The youngsters gotta keep track of the cardashipusses
No, your HR data, sales data, inventory data, shipping data, credit card info, home address, purchase preferences, all user agreements, legal documents, compliance audits. Shall I go on?
All your email, your daily calendar, the television shows you watch, the interface to the camera in your TV, the microphone in your phone.
I could keep going.
@c1ph3rpunk hr? More like nkvd
@@longiusaescius2537 I’ve put in and used more than one crappy HRIS that’s accessed via the Internet. Dayforce comes to mind most recently.
xAI is building a new facility in Memphis, TN. Memphis sits on one of the largest, cleanest supplies of fresh water in the world. Memphis doesn't even filter the water coming out of the ground before it goes into our distribution system, and it has the cleanest water in the country according to the EPA. The xAI facility is expected to draw 1 million gallons per day.
Crypto is estimated to be 3% of global power use already. Who knows what AI is adding to that total.
I wonder at what point/how the issue will be addressed? We'd have made significant progress already towards using all green energy if these new tech industry technologies didn't keep driving up global energy usage.
Proof of work crypto takes up probably 99% of the share which is bitcoin in this case as it is most dominant. Proof of stake is very minimal in terms of power usage.
How long til we see submerged data centers?
Right, I think it’s time we switch off all the computers and return to a level of technological advancement that still forces us to think for ourselves.
lol, literally warming up the ocean by taking the water out, heating it, and putting it back. Obviously it's a drop in the pond and probably has near-zero impact, but still funny to think about.
It's important to remember that water is not consumed in the same way that fossil fuels are. Water is simply relocated from its natural source and released back into the environment.
tell that to all the people dying of thirst
apologies but this is at best an oversimplification - water sources rely on _replenishment_ and the actual replenishment processes for some sources can be measured in decades or centuries, or in some cases not at _all_. this is particularly true of groundwater sources which involve very slow processes and usually ENORMOUS (subcontinental, I do mean enormous) reservoir sizes meaning that the supply is for practical purposes, finite.
loss to these sources in america is well documented. usage has increased SEVENTEEN times since 1940. 40% of tracked wells have posted record depletion levels in the last decade, and that in turn is reflecting in the exceptionally poor performance of cropland that largely overlaps almost perfectly with the most depleted and water-scarce areas.
hell, I'm not even touching on rivers, lakes, reservoirs. freshwater usage, depletion and general management is a serious problem. 3%. Three percent.
Water pulled from aquifers can in most cases be considered a non-renewable resource.
Idiot of the year award goes to camperlab
@@humanbeing9079 your contribution is worth an amount that is subjective, but categorically less than what OP posted.
Do better than the people you criticise, and maybe your vague, uncontrolled efforts to self-validate will become praxis instead of poison.
It would be really interesting to build a massive facility to co-locate a powerplant, water desalination, and a data center all in one huge facility. Might be worth a study to see what theoretical efficiency savings there could be.
Using water is not wasting water; it doesn't go into space after it's used. Who cares?
This sounds like something a villain would say after destroying the water supply of an entire region
One thing to realize: data centers are likely the most efficient compute we can get. If they didn't exist, companies would build their own on-premise data centers, which are always more inefficient.
So the choice is this, inefficient decentralized data centers, or no computing at all.
Might be the worst recommended video ever. Big data, water. We in this channel are obsessed with water. Datacenters, water. Water. Hard disks need to be at the right temperature... lol, what? 😆
Yeah, but when companies like Google or Meta preach about "muh climate change" and wasting water, it is a teeny-tiny bit hypocritical and ridiculous, don't you think?
We have a growing data center empire in Quincy, WA. They buy dirt-cheap electricity from the Columbia River dams. I believe all the cooling water they use is relatively brackish groundwater which is treated on site. You can see mineral ponds in the Google Earth images. Every data center I'm familiar with operates with economizers (outside air) before using mechanical cooling whenever possible. Cold aisle temps are typically kept around 80 degrees F with hot aisles at 100 degrees F. Fanwalls are the biggest innovation I've seen lately.
So... You're saying that air comes out of the crac?
Glad to see I'm not the only one.
Why not design them to use sea/ocean water? The steam could be condensed to make fresh water.
Do note that evaporating water is not required for power generation or cooling. It's just more efficient in drier locations and physically a much smaller footprint. Locally power generation has shifted to dry fluid coolers instead. (like a radiator in a car)
So what you're saying is, for every Data center we build, we need to get rid of two golf courses? Well, I'm sold.
Data centres are much more important than golf courses, that’s for sure.
"Open a damn window" - it solves so many problems!
Many electrical power plants in downtown Detroit that used to burn coal and have since been converted to natural gas make extra income by selling their waste low-pressure steam from their generator turbines to the local downtown business district office towers for heating in the wintertime. If you drive around downtown Detroit you see what the locals call Hell's Steam vents releasing steam (I assume from leaky pipes) up through manhole covers to the local avenues. The city had its own power generating lighting plants that have since been sold off to DTE (the local power utility). These same generating plants supply electrical power year-round to the downtown business district. The expended steam is then returned to the same generating plants as condensate to be reheated back into superheated steam for powering the turbines. In the summer they supply chilled water from evaporative cooling units for downtown air conditioning. This shared system has been in place in Detroit since the 1920s. I used to service DTE's data center and they used to brag about how cheap their steam heating bills were in the wintertime. Steam heat is also used to thaw out sidewalks, stairways to office buildings and some streets in Michigan.
If data centers were located further north in Michigan they might find more customers for their waste heat and save water by recycling it back from heating customers. Many large Michigan cities already have the under-street steam tunnel infrastructure in place. We also have a lot of fresh water around the Great Lakes area for cooling. The only condition is that if they draw water out they need to filter it before they return it as run-off water, to maintain the lake and river levels. Water drawn from the Great Lakes runs cool, around 50 degrees, all year long and especially when the lakes freeze over. The only issue we have is a lack of new electrical generating sources. Smaller-scale nuclear plants that can be licensed more quickly might help to solve that issue. Many power plants in Michigan are called co-generation plants as they supply both electricity and local steam heat to their industrial customers. There is also a lot of unlit fiber installed in Michigan for transmitting data. Moving north makes more sense to me than locating in hot, water-scarce states like Arizona.