*These videos take a long time to make.* If you would like to buy Paul a coffee to say thanks, the links are below: ☕
PayPal: www.paypal.me/TheEngineerinMindset
Channel membership: ruclips.net/channel/UCk0fGHsCEzGig-rSzkfCjMwjoin
Patreon: www.patreon.com/theengineeringmindset
All I see is wasted energy. Where there is heat exchange there is the potential to generate electricity.
Please make a video on how HVAC system is designed and installed at hospitals.
2 DAYS AGO?
THIS WAS UPLOADED TODAY, HOW COULD THIS COMMENT BE 2 DAYS AGO?
They should try to use Gallium-nitride technology for the power supplies and things to reduce heat.
Very cool. I work for a large data center operator that builds slab-floor, hot aisle contained data halls. We employ multi-mode air handlers that are not in the data hall but outside. Cold air is ducted to above the server cabinets. Our air handlers have direct expansion, indirect expansion via an evaporative cooling tower system, direct evaporative cooling in the unit AND access to economizer/free cooling when conditions allow. We even have areas where, using the outside evap coolers, we have liquid cooling piped into the server cabinets. We went away from CRACs due to the potential issues they presented to the servers when failures happened in the data halls; too risky. The setup we have allows plenty of redundancy, both for individual unit component failure and total capacity, as well as efficiency for cooling and power. Thanks for the video!
very cool hahahaha
A data center video on how the critical load is maintained during power outage by generators, ATS’s, UPS’s, PDU’s, and static switch PDU’s would be cool. There’s so many configurations though.
I was told by a DC manager that liquid immersion cooling will replace all of the above on new DCs over the next 10 years. It's apparently the next-gen server cooling system. No CRACs, CRAHs, AHUs, chillers, raised floors, hot/cold aisle containment etc. Would be nice to see a vid on that.
As a server tech I was sent to do a SAN upgrade at a customer's in-house datacenter. Expecting to be in there for hours, I brought a nice warm jacket. When I walked into the DC it was like stepping into a sauna. The air con system had failed, there were buckets of water catching leaking AC, and they had house fans plugged in trying to cool all the equipment. There were hundreds of red flashing LEDs on all the server and storage equipment in the racks. I have also encountered a datacenter AC failure with water leaking from the roof soaking the racks below, with staff frantically calling their server storage hardware vendors to log warranty calls and of course not mentioning the flood the gear was exposed to. Cooling failure in a DC is catastrophic; you'd better have a redundant solution in place.
I'm surprised this video talks about the raised floor design so much. None of the new data centers I have worked on in the last 7 years use any type of raised floor for cooling.
There's a couple of instances in the video where newer non-raised floor designs are shown.
All Tier III data centers use a raised floor design.
@@maheshmurali2697 My only experience is North America, but I know that here not all Tier 3 facilities use a raised floor. I'm currently standing in a Tier 3 with slab floors.
Both solid floors with overhead cooling and raised floors are still very common. Raised floors are probably more common in high-performance supercomputing centers, though, and they are also more common for liquid-cooled systems.
The company I work for is a large tech company with multiple modern data centres; they are building more as we speak, and they are all raised floor.
I've worked in both mid-sized data centers and computer labs back in the 1970s and 80s. Back then, single mini-computer installations were similar to today's data centers in that they were installed on raised floors with significant quantities of cooled air, as those O.G. computers produced a large amount of heat that needed to be constantly removed. Sharing the space near a cooled computer meant wearing a heavy jacket or parka(!), unless you love arctic conditions LOL! I believe the air was being pumped from the floor at about 40°F, not much warmer than the interior of a refrigerator, at hundreds or thousands of cubic feet per minute. Nowadays, people complain if their laptop gets a bit warm or they can hear those whisper-quiet cooling fans. In that data center, you'd be lucky if you could hear your own thoughts. It's like a constant-speed hurricane! I'm sure there are many more clever solutions to cooling nowadays.
This is an amazing video. I'm a journeyman electrician and have built a few of these data centers in Wyoming. I'm currently trying to become a critical facility engineer for one. It's like a whole other apprenticeship; this video is amazing and is a great refresher for me. Thank you for your hard work.
I visited a couple about 10 years ago while working with the chillers. It amazed me that the emergency generator can start up, come to full speed and be powering the building in less than 60 seconds. The chilled water system usually has about a 10 min reserve of chilled water, so if one chiller goes down, the spare can come up to speed before that is all used up.
@@snax_4820 That's right, and it would cost them millions.
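As a back-of-envelope check on that 10-minute chilled-water reserve, here is a minimal sketch sizing the stored water needed to ride through a chiller restart. The 1 MW load and the 6°C allowable temperature rise are assumed placeholder values, not figures from the comments.

```python
# Rough ride-through sizing sketch; all inputs are assumed, illustrative values.
LOAD_KW = 1000.0          # heat to absorb while the spare chiller starts (assumed 1 MW)
RIDE_THROUGH_S = 10 * 60  # 10 minutes of reserve, in seconds
CP_WATER = 4.186          # specific heat of water, kJ/(kg*K)
DELTA_T_K = 6.0           # allowable rise in stored water temperature (assumed)

energy_kj = LOAD_KW * RIDE_THROUGH_S            # heat to soak up during ride-through
mass_kg = energy_kj / (CP_WATER * DELTA_T_K)    # water mass needed to absorb it
volume_m3 = mass_kg / 1000.0                    # ~1000 kg of water per cubic metre

print(f"Stored chilled water needed: ~{volume_m3:.0f} m^3")  # roughly 24 m^3 here
```

With these assumptions the buffer works out to a few tens of cubic metres, which is why large buffer tanks (or generous pipework volume) are a common part of chilled-water plant design.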
Very interesting video!
A good tip for efficiency is to explain to the customers/rack owners that blanking panels and correctly installed equipment are mandatory...
No matter how smart your mechanical cooling system and cold aisles are, if the equipment you want to cool is not installed properly, you will always have an issue.
I used to work at the 9/11 Memorial as an engineer. The data centers were on top alert at all times, and we had more than a few emergencies where the temperature climbed from 60 to near 82°F in minutes. The Port Authority server room had two constantly running Data Aire units, and you literally had to wear a jacket if you were working inside for any length of time.
I loved being a data centre engineer, the best job I ever had, spoilt only by clueless managers without data centre experience who would brush off your improvement suggestions, only to mention them months later in front of the client so that it made them look clever.
DCs run mainly on bullshit nowadays.
This is a very well explained video. I think the performance of the cooler is important, but in the end the most important thing is to effectively carry away and dissipate the generated heat; it seems the actual cooling energy consumption can be reduced this way.
I think it is good to optimize the airflow to dissipate heat effectively.
In the 50-120 MW DCs I'm running, we use indirect air cooling (plus spray water). The chiller just kicks in to add cooled water and mix it into the flow if temps are higher. CAC is normal; HAC is newer and not common yet. Efficiency gains could also be made if customers agreed to run their intake not at 22-23°C ±2°C (some old-school folks even want 19°C) but more like 25°C ±3°C. From our calculation that's 10-15% less cooling power needed.
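As a rough illustration of why a warmer intake setpoint cuts mechanical cooling, the sketch below counts how many hours a year an economizer alone could meet a given supply setpoint. The weather profile is synthetic and the 3°C heat-exchanger approach is an assumed value, so the numbers are only indicative and are not the 10-15% figure quoted above.

```python
import random

# Stand-in outdoor temperature profile: 8760 hourly values between 10 and 22 °C.
# This is a placeholder, not real weather data.
random.seed(0)
outdoor_c = [10 + 12 * random.random() for _ in range(8760)]

APPROACH_C = 3.0  # assumed indirect heat-exchanger approach temperature


def free_cooling_hours(supply_setpoint_c):
    """Hours per year where outdoor air alone could deliver the supply setpoint."""
    return sum(1 for t in outdoor_c if t <= supply_setpoint_c - APPROACH_C)


for setpoint in (19.0, 23.0, 25.0):
    hours = free_cooling_hours(setpoint)
    print(f"Supply setpoint {setpoint:>4}°C: ~{hours} h/yr without the chiller")
```

The warmer the allowed supply air, the more of the year the chiller can stay off, which is where the energy saving comes from.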
Another Engineering Mindset banger!!!! You've taken the previous video to the next level!
I appreciate that!
Great that you made vido about Datacenter, I was waiting for one from you. Good work 👏
Great video. As a DC engineer I enjoyed it.
Good explanation. The data center that I worked in evolved from overcooling the room to keep the servers happy to adding containment with an automation system that used 3 temp sensors on the face of each cabinet door to control the CRAC unit's fan speed and supply temperature. The automation system would learn the cooling requirements of the room and worked quite well. The only problem was that when additional cabinets were added, it required additional sensors and automation system programming.
Did that reduce energy consumption? By how much, in %?
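For readers curious what that kind of control looks like, here is a minimal, hypothetical sketch: drive CRAC fan speed from the hottest of three cabinet-door sensors using a proportional-only controller. The 24°C setpoint, gain and speed limits are made-up values, and the real system described above also learned the room and adjusted supply temperature, which this sketch does not attempt.

```python
# Hypothetical proportional control of CRAC fan speed from cabinet-door sensors.
SETPOINT_C = 24.0               # assumed target inlet temperature
GAIN = 0.08                     # assumed fan-speed fraction added per °C of error
MIN_SPEED, MAX_SPEED = 0.3, 1.0  # assumed fan speed limits (fraction of full speed)


def crac_fan_speed(sensor_temps_c):
    """Proportional-only control on the hottest cabinet-door sensor."""
    error = max(sensor_temps_c) - SETPOINT_C
    speed = MIN_SPEED + GAIN * max(error, 0.0)
    return min(max(speed, MIN_SPEED), MAX_SPEED)


print(crac_fan_speed([22.5, 23.1, 24.8]))  # mild hotspot -> modest speed-up
print(crac_fan_speed([25.0, 28.4, 27.2]))  # larger error  -> faster fans
```

Real building-automation loops typically add integral action, sensor fault handling and coordination between neighbouring CRACs, but the basic idea is the same.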
Thank you so much! Because of this video my college presentation went very, very well and my teacher also liked the information. Thank you so much!! 🙏❤️
I work at a data center. I'll say this is a good video.
UPS and battery monitoring technician here; I've worked in data centers for the past 8 years. I've been at data centers that use that evaporative (they called it adiabatic) cooling where they should not have. I was in a battery room that was 88 degrees F (31 C) and probably 100 percent humidity. Not a good environment for batteries. This was a major company, but I can't say who because of an NDA.
It's so, so incredibly loud inside of a DC. Lots of fun too.
BEAUTIFULLY EXPLAINED
I had a job building systems to cool data centers. That was my favorite job.
@2:13. Just an FYI, the correct term is "raised floor". The phrase "suspended floor" implies the floor is hanging from a tension system, much like the deck of a suspension bridge. 😉
As a design engineer for a company focused on data center cooling - can confirm
Though we technically call chilled water units CRAHs, for computer room air handler.
One I've built is a hot aisle/cold aisle design: air/mist evaporative cooling on the second floor, forced down through the roof of the data hall, and then the hot air is removed and either mixed back in or expelled.
Hi Paul! I've been following your channel for quite some time now! When I was writing my Bachelor's thesis I gave an overview of absorption HVAC systems (such as heat pumps and chillers), and I remember there were quite a few articles about the use of absorption chillers in data centers. One article analyzed a solution implemented in a data center in Arizona (a pretty hot climate) in which small finned tubes were run around the physical server casings, taking away much of the heat; this fluid then accumulated in a hot tank, kept at the desired temperature with the help of some solar panels. The hot fluid stored there was fed to a LiBr-water absorption chiller in order to cool the server room. Hence the server room would be cooled by the same heat the servers were producing! The absorption chiller itself was cooled with an external water source, water that, if I remember correctly, was then cooled in a cooling tower.
Do you guys think this could be a viable solution? What problems would it encounter?
Yes, it does work. It can't produce enough cooling to handle everything on its own and it isn't very efficient, but it is a way to offset other mechanical cooling. We have covered how the absorption chiller works in an old video, check it out.
Great video, but the animation at 3:50 wrongly shows the coolant flowing into both sides of the evaporator. Overall though, a great overview with enough specifics.
Thanks for another cool video!
HA !
Even with optional airflow equipment, data center operations folks still seem to have a knack for installing the intake on the hot aisle and the exhaust on the cold aisle...
Servers are being designed to withstand ever higher temperatures so that the cooling load reduces drastically. DCs have been designed for hot aisle temperatures up to 38°C to take significant load off the chilled water system.
The difference between CHW supply and return temperatures has to be as large as possible to decrease the pumping GPM and thus the load.
All inverter (partial-load) motors provide higher efficiency at lower speeds, so designing Tier 4 DCs with N+N redundancy and running both systems (2N chillers with inverter compressors and 2N CRACs with EC fans) at partial load provides higher efficiency.
Of course, very little of the above applies to colder American climates with free cooling possibilities, but servers with higher temperature tolerances always help.
While I'll agree with you that servers are being built to withstand higher temperatures (I have walked into rooms that were 95 to 100 degrees and everything was still running), the problem is the optical equipment, which is delicate and often starts to suffer physical damage above 90 degrees.
Do you know what the average outlet water temperatures from the chillers are for these applications? Would the water need to go below 0°C?
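To put rough numbers on the ΔT and partial-load points raised in this thread, here is a minimal sketch using the common US rule of thumb GPM ≈ BTU/hr ÷ (500 × ΔT°F) together with the fan affinity law (power scaling with the cube of speed). The 1 MW load and the ΔT values are assumed for illustration only.

```python
def chw_gpm(load_btu_hr, delta_t_f):
    """Rule-of-thumb chilled-water flow: GPM = BTU/hr / (500 * dT in °F)."""
    return load_btu_hr / (500.0 * delta_t_f)


def fan_power_fraction(speed_fraction):
    """Fan affinity law: shaft power scales roughly with the cube of speed."""
    return speed_fraction ** 3


load_btu_hr = 1_000 * 3_412  # assumed 1 MW of IT load expressed in BTU/hr

print(f"Flow at 10°F dT: {chw_gpm(load_btu_hr, 10):,.0f} GPM")
print(f"Flow at 16°F dT: {chw_gpm(load_btu_hr, 16):,.0f} GPM")

# Two CRAC fans at 50% speed move the same air as one at 100%,
# but together draw roughly 2 * 0.5^3 = 25% of the power.
print(f"Power fraction, 2 fans at 50% speed: {2 * fan_power_fraction(0.5):.2f}")
```

A wider chilled-water ΔT cuts the required flow (and pump power) almost in proportion, and running twice the fans at half speed is where the 2N-at-partial-load efficiency argument comes from.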
A datacenter I work at from time to time has hot aisle design. It's great, until you have to do maintenance inside that area...
😂
I've just been given a project recently to do inside the hot aisles 🙃😂
@@Thispersonsaysso
I feel your pain.
I work in one of those! No one likes hot aisle work lol
There's discussion in the media about fresh water usage by DC cooling systems. Given the closed cooling circuits and fluid to air heat exchangers involved, where is this water lost? Some thermal power stations lose water to atmosphere when condensing LP steam in their cooling towers but I can't see why they must use potable quality water for this if they don't have access to river or seawater.
I swear to God, this channel is absolutely underrated and you deserve all the likes from the engineering community! Thanks for sharing this brilliant knowledge!
I took way too long to find this again lol. I should probably save it to watch later.
Thanks for sharing. A quick question if I may: for DX CRAC units, why is the compressor always installed in the indoor unit?
Perfect, practical… thanks.
Some manufacturers also keep heaters inside the CRAC unit; they operate after deep cooling during the dehumidification process.
I didn't see that covered here. How is dehumidification handled in the video?
A bit behind the times here. Built cold aisle data centers 20 years ago. Technologies have moved on.
Could we use the waste cold energy released from the Liquefied Natural Gas (LNG) regasification process to help with cooling costs?
I assume that would work but I would think that there are way more data centers than LNG terminals, although both LNG terminals and data centers are located close to cities.
But there are other ways to reduce cooling costs. For example, the server room in the office where I work recycles the server heat throughout the building 8 months of the year and heats the building for free except on the very coldest days.
Yes sir... I'm a technician on PACU units, especially Vertiv 😁😁👍👍
Cool you used the Google DC in Fredericia, Denmark as your zoom-in DC in the beginning of the video 👍
Isn't water cooling inside the computer more effective than air cooling? In an underwater data center it seems like the most doable option too.
To increase efficiency we could reduce the oxygen level to reduce the formation of rust. This would require infrastructure, but would it be worth the significant cost?
It would reduce the risk of fire. But it's very energy intensive to maintain a low oxygen environment.
@@EngineeringMindset maybe the underwater server does it well without any extra energy
It could also be dangerous for humans
@@joecool4656 It's not very dangerous; I work in a few centres with lower oxygen levels. You can easily work in there after doing a check-up, or you can just turn the oxygen level back up, which also works. We use it to prevent fire.
Can someone explain to me why you would want to do this? Is rust actually a problem in a climate controlled environment? I've owned many dozens of servers and have never considered rust and have never seen rust in a server. Maybe rust would be a problem at edge deployments like cell phone towers, but that's not what this video is about. And why would lower oxygen be helpful except for the reduced fire risk?
I'm enjoying your channel.
Please any updates on manually uploading video subtitles?
I installed many cooling units in data centers. They also had Halon fire suppression systems in them. I always worried about setting off the Halon system while working in them; Halon gas eats up the oxygen in the room quickly.
I would like to ask: is there any difference between placing the CRAC in line with the cold aisle versus in line with the hot aisle? Which is more efficient, and can I calculate it?
Question: Is it possible to harness the heat from the hot air flowing in the ceiling and turn it into energy?
I have a dumb thought of placing a Stirling engine (which I discovered through a YouTube recommendation) at the top of the ceiling where the hot air flows.
The refrigerant flow at 3:30 is incorrect as it indicates coolant flowing from both ends of the piping towards the evaporator, thus having no coolant exiting the evaporator.
Well spotted
never been to a data center, however my future may involve going to one someday, and not for a tour
What is the priority for cooling a data center, processors or storage? If it's processors then the best way would be in oil that is then cooled and recirculated. Usually a swamp cooler outside.
That's an interesting idea. Can you provide a link to an information site?
Using that method the oil will never get below the outdoor temperature.
Can use for personal cpu knowledge too. Pretty good :)
I live in Denmark close to an Apple data center, and the plan is that part of my central heating will come from the data center from 2024.
Regarding floor cooling: why do we always send the cold air circulation only from the bottom of the data center? Sometimes I wonder why we can't put the entire cooling system under the false floor of the DC, including the water tank itself.
I mean the DC room itself would house part of its chilling unit under the false floor.
P.S. While maintaining all types of precautions and safety factors.
What temperature is maintained in a data center?
It depends which industry guide you choose to follow. Some suggest supply air around 23°C, but you need to consider your data center and equipment to understand whether that is suitable.
It also depends on the server technology, new equipment can handle higher temperatures. Google is running warmer temperatures than a lot of others. But 68 F still seems to be the sweet spot.
Humidification is a lot more important than you’re letting on. Also I don’t know how other places do it but our towers are vented from directly beneath so there’s no chance of recirculating.
And here I thought it was the neat clothes and hairstyles that made them cool.
They could liquid cool the servers with water blocks similar to what you would use on your PC CPU and GPU.
They also make northbridge and southbridge water blocks, as well as hard drive and SSD water blocks.
The best tip for saving energy in a data centre is don't buy an iPhone, look for alternatives that don't track and share all your data, and use cash, say no to CBDC. Help these guys save a fortune on cooling 😉
I don't get how the evaporator coil is positioned completely horizontally; from my understanding it should be installed with at least 60 degrees of inclination.
In reality it is, but this is a simplified 3D model. It's missing 90% of the components inside; it's just shown as an illustrative representation.
Some data centers use cold aisle and hot aisle containment, and some use cooling towers, while others use a different form. They are crazy. They have massive generators, and they are normally powered directly from the power source, i.e. hydroelectric dams. One building can generate 1 trillion a year, and one section of the building can generate 500 million to 500 billion. I currently work at one such site. I work on ones that do not use refrigerant due to size; they are designed to be replaced after 10 years...
How about a video on oscillators? How an inductor and capacitor in parallel circuit can make an oscillator. How they are used to make frequencies for radio applications. And finally, talk about quartz crystal oscillators.
Not sure about the evaporative cooling method. Doesn't all the moisture cause eventual corrosion of components and interconnects, shorts in the equipment, and possible data corruption and/or loss? 🤔
They should be installed near buildings in the city to heat water for showers and for heating in the winter, in parallel with the central system, when the server does not need to be cooled.
Sir, which one is more efficient and easier: a water cooling or an air cooling system?
Very good video. I love it.
Please do a video on HCPV
Wow so cool bro 👍🙏🇬🇧
I work on data center cooling units, both DX and CHW. I am cool.
Good job👍
Something about the animation at 3:47 was confusing me... Then I realized the pipes going into the evaporator are both flowing in and neither are flowing out.
Good work! Thank you!
Who usually works in these data centers? As in, what job titles? I've worked on site before but was too scared to talk to the guys inside the data center.
Can you do a video about hospital and operation theatre air-conditioning systems?
Hospital systems are normal VAV systems with hot water reheat boxes, and the majority of theatres now are all giant packaged units.
could the removed hot air then be used to turn a turbine, and convert some of the waste heat back into electricity?
I have been working in data centers for my entire career. I have seen every one of these layouts and cooling systems. One huge challenge is getting the computers and networking gear to have the proper air flow direction, not to mention the, mostly older, systems that move the air in on one side and out on the other. The hot air containment is the easiest to work in from my experience, provided the hot area has air movement to keep the temperature down to a reasonable level.
One item that always seems to be left out is the noise. Granted, that is outside the scope of your video, but it is something you don't hear much about. You wouldn't believe how loud it gets inside a data center, especially inside a hot aisle containment area. All those 20,000 RPM fans pumping air into a small enclosed space are deafening. Ear protection is a must. You showed some B-roll of people in a datacenter wearing hard hats. That is bullshit; nobody ever does that. You need footage where everyone has ear protection.
Also, all the stock video ever shows is neat and clean rooms, all organized and all with super clean wiring. That does happen, but only in a room that is managed properly. I have been in many colos (colocation data centers where you can rent 1 or more racks) where many of the racks are just a spider web of tangled cables. It is a major problem that good datacenter managers spend a lot of time policing.
10/10 nice!
You didn't talk much about humidity control, which is very important in data centers and is the main difference between AHU and CRAC units.
Speaking of humidity control: the testing I'm involved in was delayed for weeks because a consultant assumed they could control the humidity to below 70% without a dehumidifier, in a humid country.
I'm in Florida. We have external humidifiers in the rooms in addition to the humidifier in the air handler.
Do you know if the raised floors are insulated to slow heat transfer? Thanks
They should/could be, but many aren't
@@EngineeringMindset Thank you!
I have not seen that in data centers or semiconductor fabs. They are moving a lot of air relatively fast, so it may not add value.
Also, they install a lot of utilities under the floor, including power; those systems come up through the floor.
Is there any cooling system in smartphones? 🤔 If there's it's too small!
I have designed and built over 30 data centers worldwide.
We are using DAHU Fans for Cooling.
Accurate and correct until the last part: at 8:20, racks don't exhaust air to the rear but to the top. Data center cooling still lacks efficiency/innovation, though. For example, the humans working in data centers don't need to be cooled, nor does the building/room containing the racks; cool air could simply be sucked in underneath the racks, but no.
The US DOE sets specific performance requirements in addition to local mechanical codes.
Btw, thanks for the video; I'd been looking for it since I heard of raised floor cooling.
That was useful
Hey man I'm "just chilling out"
Good video
I am subscribed!
1:22 The image no server owner wants to see, and you show it to everyone ;)
I hope Google's server manager doesn't want to kill you ;-)
Funny thing: I got into servers a few months ago and might start hosting a local VPN.
Just chilling out
Yes evean those. For my R4H18 R4X2 working probbly up runnning verey safe good basstromechs sisco to for now goood server rooms good to.
So it's just a name safe privit to good 👍.
Can anyone explain how a hot aisle containment system is more efficient than a cold aisle containment system, and by roughly how much?
Hey Paul, I work in the immersion cooling datacenter industry. Drop me a line to see how we can work together.
Good video, well done, but mathematically it's probably the easiest calculation (for an A-grade student). :))
just chilling out
Nicely spotted
Why not just open data centers in the Arctic?
Cool man