Data Center Cooling - How Are Data Centres Cooled? Cold Aisle Containment, HVACR

  • Published: 26 Dec 2024

Comments • 210

  • @EngineeringMindset · 3 years ago +28

    *These videos take a long time to make.* If you would like to buy Paul a coffee to say thanks, links below: ☕
    PayPal: www.paypal.me/TheEngineerinMindset
    Channel membership: ruclips.net/channel/UCk0fGHsCEzGig-rSzkfCjMwjoin
    Patreon: www.patreon.com/theengineeringmindset

    • @frankh.3849 · 3 years ago

      All I see is wasted energy. Where there is heat exchange there is the potential to generate electricity.

    • @buntyshukla2625 · 3 years ago +2

      Please make a video on how HVAC systems are designed and installed at hospitals.

    • @DanielBerzinskas · 3 years ago

      2 DAYS AGO?

    • @DanielBerzinskas · 3 years ago

      THIS WAS UPLOADED TODAY, HOW COULD THIS COMMENT BE 2 DAYS AGO?

    • @fbi-federalblyatofinvestig3853 · 3 years ago +1

      They should try to use gallium nitride (GaN) technology for the power supplies and things to reduce heat.

  • @ELuciferC · 3 years ago +11

    Very cool. I work for a large data center that builds slab-on-grade, hot aisle contained data halls. We employ multi-mode air handlers that are not in the data hall but outside. Cold air is ducted to above the server cabinets. Our air handlers have direct expansion, indirect expansion via an evaporative cooling tower system, direct evaporative cooling in the unit, AND access to economizer/free cooling when conditions allow. We even have areas where, using the outside evap coolers, we have liquid cooling piped into the server cabinets. We went away from CRAC units due to the potential issues they presented to the servers when failures happened in the data halls; too risky. Having the setup we do allows plenty of redundancy, both for individual unit component failure and total capacity, as well as efficiency for cooling and power. Thanks for the video!
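
The mode stack described above is essentially a least-energy-first sequencing problem. Below is a minimal sketch of how such a multi-mode air handler might choose between its modes from outdoor conditions; the thresholds, mode names, and function are illustrative assumptions, not the commenter's actual control sequence.

```python
# Illustrative mode sequencing for a multi-mode data center air handler.
# Thresholds are hypothetical; real sequences are tuned per site and
# per the operator's allowable supply-air envelope.

def select_cooling_mode(outdoor_db_c: float, outdoor_wb_c: float,
                        supply_setpoint_c: float = 24.0) -> str:
    """Pick the cheapest cooling mode that can still meet the supply setpoint."""
    if outdoor_db_c <= supply_setpoint_c - 2.0:
        return "economizer"            # free cooling: outside air alone
    if outdoor_wb_c <= supply_setpoint_c - 4.0:
        return "direct_evaporative"    # evap cooling approaches wet-bulb temp
    if outdoor_wb_c <= supply_setpoint_c:
        return "indirect_evaporative"  # cooling tower assists, supply air stays dry
    return "direct_expansion"          # fall back to mechanical (DX) cooling

if __name__ == "__main__":
    for db, wb in [(15, 12), (28, 18), (30, 22), (38, 27)]:
        print(f"{db} °C DB / {wb} °C WB -> {select_cooling_mode(db, wb)}")
```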

  • @CjMooseChuckle_1 · 3 years ago +25

    A data center video on how the critical load is maintained during a power outage by generators, ATSs, UPSs, PDUs, and static switch PDUs would be cool. There are so many configurations though.

  • @hvacdesignsolutions · 3 years ago +6

    I was told by a DC manager that liquid immersion cooling will replace all of the above on new DCs over the next 10 years. It's the next-gen server cooling system, apparently. No CRACs, CRAHs, AHUs, chillers, raised floors, hot/cold aisle containment, etc. Would be nice to see a vid on that.

  • @garyburke301 · 3 years ago +12

    As a server tech I was sent to do a SAN upgrade at a customer's in-house datacenter. Expecting to be in there for hours, I brought a nice warm jacket. When I walked into the DC it was like stepping into a sauna. The air con system had failed, there were buckets of water catching leaking AC, and they had house fans plugged in trying to cool all the equipment. There were hundreds of red flashing LEDs on all the server and storage equipment in the racks. I have also encountered datacenter AC failure with water leaking from the roof, soaking the racks below, with staff frantically calling their server storage hardware vendors to log warranty calls and, of course, not mentioning the flood the equipment was exposed to. Cooling failure in a DC is catastrophic; you'd better have a redundant solution in place.

  • @benlappin · 3 years ago +94

    I'm surprised this video talks about the raised floor design so much. No new data center I have worked on in the last 7 years uses any type of raised floor for cooling.

    • @EngineeringMindset · 3 years ago +21

      There are a couple of instances in the video where newer non-raised-floor designs are shown.

    • @maheshmurali2697 · 3 years ago +12

      All Tier III facilities use a raised floor design

    • @benlappin · 3 years ago +10

      @@maheshmurali2697 My only experience is North America, but I know that here not all Tier 3 facilities use raised floors. I'm currently standing in a Tier 3 with slab floors.

    • @SoloRenegade · 3 years ago +9

      Both solid floors with overhead cooling, and raised floors are still very common. Raised floors are possibly more likely used in high performance supercomputing centers though. Raised floors are more common for liquid cooled systems too.

    • @Thispersonsaysso · 3 years ago +7

      The company I work for is a large tech company with multiple modern data centres; they are building more as we speak, and they are all raised floor.

  • @sfperalta · 3 years ago +9

    I've worked in both mid-sized data centers and computer labs back in the 1970s and 80s. Back then, single mini-computer installations were similar to today's data centers in that they were installed on raised floors with significant quantities of cooled air, as those O.G. computers produced a large amount of heat that needed to be constantly removed. Sharing the space near a cooled computer meant wearing a heavy jacket or parka(!), unless you love arctic conditions LOL! I believe the air was being pumped from the floor at about 40 degF, not much warmer than the interior of a refrigerator, at hundreds or thousands of cubic feet per minute. Nowadays, people complain if their laptop gets a bit warm or they can hear those whisper-quiet cooling fans. In that data center, you'd be lucky if you could hear your own thoughts. It's like a constant-speed hurricane! I'm sure there are many more clever solutions to cooling nowadays.

  • @justinhour3879 · 4 months ago

    This is an amazing video. I'm a journeyman electrician and have built a few of these data centers in Wyoming. I'm currently trying to become a critical facility engineer for one. It's like a whole other apprenticeship; this video is amazing and is a great refresher for me. Thank you for your hard work.

  • @glennnickey3160 · 3 years ago +9

    I've been to a couple about 10 years ago, working with the chillers. It amazed me that the emergency generator can start up, come to full speed, and power the building in less than 60 seconds. The chilled water system usually has about a 10 min. reserve of chilled water, so if one chiller goes down, the spare can come up to speed before that is all used up (see the sketch after this thread).

    • @glennnickey3160 · 3 years ago +1

      @@snax_4820 That's right, and it would cost them millions.
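
The 10-minute figure above follows directly from the sensible heat capacity of the stored water. Here is a minimal sketch of the sizing arithmetic; the tank volume, usable temperature rise, and IT load below are illustrative assumptions, not numbers from the comment.

```python
# Ride-through time of a chilled-water buffer tank:
#   t = V * rho * cp * dT / P
# where dT is the usable temperature rise before the supply water is too warm.

RHO = 1000.0  # kg/m^3, density of water
CP = 4186.0   # J/(kg*K), specific heat of water

def ride_through_minutes(volume_m3: float, usable_dt_k: float, heat_load_w: float) -> float:
    energy_j = volume_m3 * RHO * CP * usable_dt_k
    return energy_j / heat_load_w / 60.0

# Example: a 25 m^3 buffer with a 6 K usable rise against 1 MW of heat
print(ride_through_minutes(25.0, 6.0, 1.0e6))  # ≈ 10.5 minutes
```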

  • @rolands.7870 · 3 years ago +6

    Very interesting video!
    A good tip for efficiency is to explain to the customers/rack owners that blanking panels and correctly installed equipment are mandatory.
    No matter how smart you build your mechanical cooling system and cold aisles, if the equipment you want to cool is not installed properly, you will always have an issue.

  • @ptsmknbatgirl · 3 years ago +1

    I used to work in the 9/11 Memorial as an engineer. Data centers were top alert at all times, and we had more than a few emergencies where the temp climbed from 60 to near 82 °F in minutes. The Port Authority server room had two constantly running Data Aire units, and you literally had to wear a jacket if you were working inside for any length of time.

  • @tinytonymaloney7832 · 3 years ago +3

    I loved being a data centre engineer, best job I ever had, spoilt only by clueless managers without data centre experience, dismissing your improvement suggestions only to mention them months later in front of the client so it made them look clever.
    DCs run mainly on bullshit nowadays.

  • @topotw2 · 2 years ago +2

    This is a very well explained video. Cooler performance is important, but in the end the most important thing is to effectively convect away and dissipate the generated heat; it seems the actual cooling energy consumption can be reduced through this.
    I think it is good to optimize the airflow to effectively dissipate heat.

  • @zenja42 · 3 years ago +2

    In the 50-120 MW DCs I'm running, we use indirect air cooling (+ spray water). The chiller just kicks in to add cooled water and mix it into the flow if temps are higher. CAC is normal; HAC is newer and not common yet. Efficiency gains could also be made if customers agreed to run their intake not at 22-23 °C ±2 °C (some old-school folks even want 19 °C) but more like 25 °C ±3 °C. From our calculation that's 10-15% less cooling power needed.
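
A back-of-envelope way to sanity-check that figure: chiller energy falls as the supply setpoint rises, and a commonly cited rule of thumb is roughly 2-4% less chiller energy per °C of setpoint increase. The rule, the rate, and the numbers below are assumptions for illustration; a higher setpoint also unlocks more free-cooling hours, which this sketch ignores.

```python
# Rough chiller-energy saving from raising the supply-air setpoint,
# assuming a linear 2-4 %/°C rule of thumb. Real savings depend on the
# plant, the climate, and the extra economizer hours the setpoint unlocks.

def chiller_saving_pct(old_setpoint_c: float, new_setpoint_c: float,
                       pct_per_deg_c: float = 3.0) -> float:
    return (new_setpoint_c - old_setpoint_c) * pct_per_deg_c

# 22.5 °C -> 25 °C gives ≈ 7.5 % at 3 %/°C and 10 % at 4 %/°C,
# the same ballpark as the 10-15 % quoted above once extra
# free-cooling hours are counted.
print(chiller_saving_pct(22.5, 25.0))       # 7.5
print(chiller_saving_pct(22.5, 25.0, 4.0))  # 10.0
```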

  • @Z901Z · 3 years ago +1

    Another Engineering Mindset banger!!!! You've taken the previous video to the next level!

  • @apm8396 · 3 years ago +1

    Great that you made a video about data centers; I was waiting for one from you. Good work 👏

  • @maheshmurali2697 · 3 years ago +5

    Great video. As a DC engineer I enjoyed it.

  • @tomg721 · 3 years ago +11

    Good explanation. The data center that I worked in evolved from overcooling the room to keep the servers happy to adding containment with an automation system that installed 3 temp sensors on the face of the cabinet doors to control each CRAC unit's fan speed and supply temperature. The automation system would learn the cooling requirements of the room and worked quite well. The only problem was that when additional cabinets were added, it would require additional sensors and automation system programming.
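
The scheme described is, at its core, a feedback loop from cabinet-face temperatures to CRAC fan speed. Below is a minimal proportional-control sketch of one plausible version; the setpoint, gain, and limits are hypothetical, and the real system's learning behaviour is not modelled.

```python
# Sketch of CRAC fan-speed control driven by three cabinet-door sensors.
# Proportional control only; the automation system described above also
# learns the room's cooling requirements over time, which this does not.

def crac_fan_speed_pct(door_temps_c: list[float],
                       target_c: float = 24.0,
                       min_pct: float = 30.0,
                       gain_pct_per_k: float = 15.0) -> float:
    """Raise fan speed in proportion to the hottest door-face sensor reading."""
    error_k = max(door_temps_c) - target_c
    speed = min_pct + gain_pct_per_k * max(error_k, 0.0)
    return min(speed, 100.0)

# Hottest door is 2.5 K above target -> 30 % + 15 %/K * 2.5 K = 67.5 %
print(crac_fan_speed_pct([23.1, 24.8, 26.5]))
```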

  • @brawlerbible2620 · 2 years ago +1

    Thank you so much! Because of this video my college presentation went very, very well, and my teacher also liked the information. Thank you so much!! 🙏❤️

  • @Rando_Suave · 3 years ago +1

    I work at a data center. I'll say this is a good video.

  • @CjMooseChuckle_1 · 3 years ago

    UPS and battery monitoring technician here. Worked in data centers for the past 8 years. I've been at data centers that use evaporative (they called it adiabatic) cooling where they should not have. I was in a battery room that was 88 degrees F (31 C) at probably 100 percent humidity. Not a good environment for batteries. This was a major company, but I can't say who because of an NDA.

  • @randykitchleburger2780 · 1 year ago +1

    It's so, so incredibly loud inside of a DC. Lots of fun too.

  • @rockysubu8384 · 2 years ago +1

    BEAUTIFULLY EXPLAINED

  • @Jokreher · 3 years ago +1

    I had a job building systems to cool data centers. That was my favorite job.

  • @michaelgarito4176 · 3 years ago +1

    @2:13. Just an FYI, the correct term is "raised floor". The phrase "suspended floor" implies the floor is hanging from a tension system, much like the deck of a suspension bridge. 😉

  • @miamisasquatch · 3 years ago

    As a design engineer for a company focused on data center cooling - can confirm

    • @miamisasquatch · 3 years ago +1

      Though we technically call chilled water units CRAHs, for computer room air handler.

  • @the.bearded.gunner5618 · 2 years ago

    One I've built is a hot aisle/cold aisle design, air/mist evaporative cooled from the second floor and forced down through the roof of the data hall; the hot air is then removed and either mixed or expelled.

  • @vittoriopiaser9233 · 3 years ago +1

    Hi Paul! I've been following your channel for quite some time now! When I was writing my Bachelor's thesis I gave an overview of absorption HVAC systems (such as heat pumps and chillers), and I remember quite a few articles that talked about the use of absorption chillers in data centers. One article analyzed a solution implemented in a data center in Arizona (a pretty hot climate) in which small finned tubes were run around the physical server casings, taking away much of the heat; this fluid then accumulated in a hot tank, kept at the desired temperature with the help of some solar panels. The stored hot fluid was fed to a LiBr-water absorption chiller in order to cool the server room. Hence the server room would be cooled by the same heat the servers were producing! The absorption chiller was cooled with an external water source, water that, if I remember correctly, was used and then cooled in a cooling tower.
    Do you guys think this could be a viable solution? What problems would it encounter?

    • @EngineeringMindset · 3 years ago +2

      Yes, it does work. It can't produce enough cooling to completely cool the space and it isn't very efficient, but it is a way to offset other mechanical cooling. We have covered how the absorption chiller works in an old video, check it out.

  • @tristanwegner · 2 years ago +1

    Great video, but the animation at 3:50 wrongly shows the coolant flowing into both sides of the evaporator. But overall a great overview with enough specifics.

  • @Pood369 · 3 years ago +3

    Thanks for another cool video!

  • @crazyredneck7244 · 3 years ago

    Even with optimal airflow equipment, data center operations folk still seem to have a knack for installing intakes on the hot aisle and exhausts on the cold aisle...

  • @roshanramesh627 · 3 years ago +3

    Servers are being designed every day to withstand higher temperatures so that the cooling load reduces drastically. DCs have been designed for hot aisle temperatures up to 38 °C to save significant load on the chilled water system.

    The difference between CHW temperatures has to be as large as possible to decrease the pumping GPM and thus the load (see the worked example after this thread).

    All inverter (partial load) motors provide higher efficiency at lower speeds. So designing Tier 4 DCs with N+N redundancy and running both systems (2N chillers with inverter compressors & 2N CRACs with EC fans) at partial loads provides higher efficiencies.
    Of course, very little of the above applies to colder American climates with free cooling possibilities, but servers with higher temperature tolerance always help.

    • @MrXjoeharperx · 2 years ago

      I'll agree with you that servers are being built to withstand higher temperatures; I have walked into rooms that were 95 to 100 degrees and everything was still running. The problem is the optical servers, which are delicate and often start to suffer physical damage above 90 degrees.
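
To make the delta-T-to-GPM relationship in the parent comment concrete: for water, load, flow, and temperature difference are tied together by Q [BTU/h] ≈ 500 × GPM × ΔT [°F], and on a fixed system curve pump power scales roughly with the cube of flow. The load and delta-T values below are illustrative.

```python
# Why a wider chilled-water delta-T cuts pumping load.
# Q [BTU/h] ≈ 500 * GPM * dT_F   (water at standard conditions)

BTU_PER_H_PER_MW = 3.412e6

def gpm_for_load(load_mw: float, delta_t_f: float) -> float:
    return load_mw * BTU_PER_H_PER_MW / (500.0 * delta_t_f)

narrow = gpm_for_load(1.0, 10.0)  # ≈ 682 GPM at a 10 °F delta-T
wide = gpm_for_load(1.0, 18.0)    # ≈ 379 GPM at an 18 °F delta-T

# Pump affinity laws: on the same system curve, power ~ flow^3, so the
# wider delta-T needs only about 17 % of the pumping power.
print(narrow, wide, (wide / narrow) ** 3)
```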

  • @octaviovinoly · 6 months ago

    Do you know what the average outlet water temperatures from the chillers are for these applications? Would the water need to go below 0°C?

  • @RedmilesShark · 3 years ago +10

    A datacenter I work at from time to time has a hot aisle design. It's great, until you have to do maintenance inside that area...

    • @EngineeringMindset · 3 years ago +5

      😂

    • @Thispersonsaysso · 3 years ago +1

      I've just been given a project recently to do inside the hot aisles 🙃😂

    • @RedmilesShark · 3 years ago +1

      @@Thispersonsaysso
      I feel your pain.

    • @ELuciferC · 3 years ago

      I work in one of those! No one likes hot aisle work lol

  • @alans9806 · 2 months ago

    There's discussion in the media about fresh water usage by DC cooling systems. Given the closed cooling circuits and fluid-to-air heat exchangers involved, where is this water lost? Some thermal power stations lose water to the atmosphere when condensing LP steam in their cooling towers, but I can't see why they must use potable-quality water for this if they don't have access to river or seawater.

  • @Rodrigo540 · 3 years ago

    I swear to God, this channel is absolutely underrated and you deserve all the likes from the engineering community! Thanks for sharing this brilliant knowledge!

  • @Chitose_ · 3 months ago

    I took way too long to find this again lol. I should probably save this to watch later.

  • @mohamedfergany5611 · 2 years ago

    Thanks for sharing. A quick question if I may: for DX CRAC units, why is the compressor always installed in the indoor unit?

  • @mohammadalshaikhhasan5091 · 3 years ago

    Perfect, practical… thanks.
    Some manufacturers also keep heaters inside the CRAC unit; they operate after deep cooling during the dehumidification process.
    I didn't see that in the video: how is dehumidification handled in the video?

  • @tsanger121 · 3 years ago +1

    A bit behind the times here. Built cold aisle data centers 20 years ago. Technologies have moved on.

  • @justlisten82 · 3 years ago +2

    Could we use the waste cold energy released from the Liquefied Natural Gas (LNG) regasification process to help with cooling costs?

    • @TheVonMatrices · 3 years ago +3

      I assume that would work but I would think that there are way more data centers than LNG terminals, although both LNG terminals and data centers are located close to cities.
      But there are other ways to reduce cooling costs. For example, the server room in the office where I work recycles the server heat throughout the building 8 months of the year and heats the building for free except on the very coldest days.

  • @benjangotong9265 · 3 years ago

    Yes sir... I'm a technician on PACU units... especially Vertiv 😁😁👍👍

  • @DeStoreholmskeBaner · 3 years ago

    Cool that you used the Google DC in Fredericia, Denmark as your zoom-in DC at the beginning of the video 👍

  • @jucom756 · 2 years ago +1

    Isn't water cooling inside the computer more effective than air cooling? In underwater data centers it seems like the most doable option too.

  • @Theaverageyoutuber-c8v · 3 years ago +3

    To increase efficiency we could reduce oxygen levels to reduce the formation of rust. This would require infrastructure, but would it be worth the significant cost?

    • @EngineeringMindset · 3 years ago +3

      It would reduce the risk of fire. But it's very energy intensive to maintain a low oxygen environment.

    • @RahulKumar-ve4jm · 3 years ago +2

      @@EngineeringMindset Maybe the underwater server does it well without any extra energy.

    • @joecool4656 · 3 years ago

      It could also be dangerous for humans

    • @XDnikiDX · 3 years ago

      @@joecool4656 It's not very dangerous; I work in a few centres with lower oxygen levels. You can easily work in there after doing a check-up, or just turning the oxygen level higher does it also. We use it to prevent fire.

    • @TheVonMatrices · 3 years ago +1

      Can someone explain to me why you would want to do this? Is rust actually a problem in a climate-controlled environment? I've owned many dozens of servers and have never considered rust, and have never seen rust in a server. Maybe rust would be a problem at edge deployments like cell phone towers, but that's not what this video is about. And why would lower oxygen be helpful except for the reduced fire risk?

  • @bitebonumbere1426 · 3 years ago +1

    I'm enjoying your channel.
    Are there any updates on manually uploading video subtitles?

  • @johnheggie8064 · 3 years ago +1

    I installed many cooling units in data centers. They also had Halon fire suppression systems in them. I always worried about setting off the Halon system while working in them; Halon gas eats up the oxygen in the room quickly.

  • @mettcbsd4790 · 3 years ago

    I would like to ask: is there any difference between placing the CRAC in line with the cold aisle or in line with the hot aisle? Which is more efficient? Can I calculate it?

  • @---------______ · 3 years ago

    Question: Is it possible to harness the heat from the hot air flowing in the ceiling into energy?
    Because I have a dumb thought of placing a Stirling engine (which I discovered via YT recommendations) at the top of the ceiling where the hot air flows.

  • @prototypo8359 · 3 years ago +2

    The refrigerant flow at 3:30 is incorrect as it indicates coolant flowing from both ends of the piping towards the evaporator, thus having no coolant exiting the evaporator.

  • @captainkeyes9913 · 1 year ago +1

    Never been to a data center; however, my future may involve going to one someday, and not for a tour.

  • @philselkin4776 · 3 years ago

    What is the priority for cooling a data center, processors or storage? If it's processors, then the best way would be immersion in oil that is then cooled and recirculated, usually with a swamp cooler outside.

    • @paulmcclung9383 · 3 years ago

      That's an interesting idea. Can you provide a link to an information site?

    • @MrXjoeharperx · 2 years ago

      Using that method the oil will never get below the outdoor temperature.

  • @andrewzhu8753 · 1 year ago

    Can be used for personal CPU knowledge too. Pretty good :)

  • @Thorsted67 · 2 years ago

    I live in Denmark close to an Apple data center, and the plan is that part of my central heating will come from the data center from 2024.

  • @shankarnathmajumder · 3 years ago

    Regarding floor cooling: why do we always intend to flow the cold air circulation only from the bottom of the data center? Sometimes I just think: why can't we put the entire cooling system under the false floor of the DC, including the water tank itself?
    I mean to say the DC room itself would be able to house part of its chilling unit under the false floor.
    P.S. While maintaining all types of precautions and safety factors.

  • @mp2669 · 3 years ago +3

    What temperature is maintained in a data center?

    • @EngineeringMindset · 3 years ago

      It depends which industry guide you choose to follow. Some suggest supply air around 23 °C, but you need to consider your data center and equipment to understand if that is suitable.

    • @paulmcclung9383 · 3 years ago

      It also depends on the server technology; new equipment can handle higher temperatures. Google is running warmer temperatures than a lot of others. But 68 °F (20 °C) still seems to be the sweet spot.

  • @timothycampbell8053 · 3 years ago +1

    Humidification is a lot more important than you're letting on. Also, I don't know how other places do it, but our towers are vented from directly beneath, so there's no chance of recirculation.

  • @michaellinner7772 · 3 years ago

    And here I thought it was the neat clothes and hairstyles that made them cool.

  • @ejonesss · 2 years ago

    They could liquid cool the servers with waterblocks similar to what you would use on your PC CPU and GPU.
    They also make northbridge and southbridge as well as hard drive and SSD waterblocks.

  • @knottyinks1 · 4 months ago

    The best tip for saving energy in a data centre: don't buy an iPhone, look for alternatives that don't track and share all your data, and use cash. Say no to CBDCs. Help these guys save a fortune on cooling 😉

  • @carlosbetancourt9228 · 3 years ago +1

    I don't get why the position of the evaporator coil is completely horizontal; from my understanding it should be installed with at least 60 degrees of inclination.

    • @EngineeringMindset · 3 years ago +2

      In reality it is, but this is a simplified 3D model. It's missing 90% of the components inside; it's just shown as an illustrative representation.

  • @chrisl6263 · 2 years ago

    Some data centers use cold aisle and hot aisle, and some use cooling towers, while others use a different form. They are crazy. They have massive generators, and they are normally powered directly from the power source, i.e. hydroelectric dams. One building can generate 1 trillion a year, and one section of the building can generate 500 million to 500 billion. I currently work at one such site. I work on ones that do not use refrigerant due to size; they are designed to be replaced after 10 years...

  • @jimvalim1567 · 3 years ago

    How about a video on oscillators? How an inductor and capacitor in a parallel circuit can make an oscillator. How they are used to make frequencies for radio applications. And finally, talk about quartz crystal oscillators.

  • @GothGuy885 · 1 year ago

    Not sure about the evaporative cooling method. Doesn't all the moisture cause eventual corrosion of components and interconnects, shorts in the equipment, and possible data corruption and/or loss? 🤔

  • @LascuLars · 8 months ago

    They should be installed near buildings in the city, to heat the water for showers and to provide heat in the winter in parallel with the central heating for when the servers do not need to be cooled.

  • @siddeshwarapm5613 · 9 months ago

    Sir, which one is more efficient and easier: a water cooling or an air cooling system?

  • @sakinhossain9226 · 3 years ago

    Very good video. I love it.

  • @Rick-d6t · 1 month ago

    Please do a video on HCPV

  • @psvyme48paulh45 · 3 years ago

    Wow so cool bro 👍🙏🇬🇧

  • @maness2112 · 3 years ago +1

    I work on data center cooling units, both DX and CHW. I am cool.

  • @miteshrembo4594 · 3 years ago

    Good job👍

  • @Igneusflama · 3 years ago

    Something about the animation at 3:47 was confusing me... Then I realized the pipes going into the evaporator are both flowing in and neither is flowing out.

  • @SorokinAU · 3 years ago

    good work! thank you!)

  • @highwood18 · 1 year ago

    Who usually works in these data centers? As in job titles?? I worked on one before but was too scared to talk to the guys inside the data center.

  • @jishnumohanpillai6820 · 3 years ago +1

    Can you do a video about hospital and operation theatre air-conditioning systems?

    • @MrXjoeharperx · 2 years ago

      Hospital systems are normal VAV systems with hot water reheat boxes, and the majority of theatres now are all giant packaged units.

  • @Andrew90046zero · 3 years ago

    Could the removed hot air then be used to turn a turbine and convert some of the waste heat back into electricity?

  • @zadrik1337 · 3 years ago +4

    I have been working in data centers for my entire career. I have seen every one of these layouts and cooling systems. One huge challenge is getting the computers and networking gear to have the proper airflow direction, not to mention the (mostly older) systems that move the air in on one side and out on the other. The hot air containment is the easiest to work in from my experience, provided the hot area has air movement to keep the temperature down to a reasonable level.
    One item that always seems to be left out is the noise. Granted, that is outside the scope of your video, but it is something you don't hear much about. You wouldn't believe how loud it gets inside a data center, especially inside of a hot aisle containment area. All those 20,000 RPM fans pumping air into a small enclosed space are deafening. Ear protection is a must. You showed some B-roll of people in a datacenter wearing hard hats. That is bullshit. Nobody ever does that. You need one where everyone has ear protection.
    Also, all the stock video ever shows is neat and clean rooms, all organized and all with super clean wiring. That does happen, but only in a room that is managed properly. I have been in many colos (colocation data centers where you can rent 1 or more racks) where many of the racks are just a spider web of tangled cables. It is a major problem that good datacenter managers spend a lot of time policing.

  • @DanielBerzinskas · 3 years ago +1

    10/10 nice!

  • @kasimshaikh3750 · 3 years ago +1

    You didn't talk much about humidity control, which is very important in data centers and is the main difference between AHUs and CRAC units.

    • @j.l1848 · 3 years ago +1

      Speaking of humidity control: the testing I am involved in was delayed for weeks because some consultant assumed they could control the humidity to below 70% without a dehumidifier in a humid country.

    • @MrXjoeharperx · 2 years ago

      I'm in Florida. We have external humidifiers in the rooms in addition to the humidifier in the air handler.

  • @joecool4656 · 3 years ago

    Do you know if the raised floors are insulated to slow heat transfer? Thanks

    • @EngineeringMindset · 3 years ago +1

      They should/could be, but many aren't

    • @joecool4656 · 3 years ago

      @@EngineeringMindset Thank you!

    • @paulmcclung9383 · 3 years ago

      I have not seen that in data centers or semiconductor fabs. They are moving a lot of air relatively fast, so it may not add value.
      Also, they install a lot of utilities under the floor, including power. Those systems come up through the floor.

  • @aley4644 · 3 years ago

    Is there any cooling system in smartphones? 🤔 If there is, it's very small!

  • @thomastexwilson7323 · 3 years ago

    I have designed and built over 30 data centers worldwide.

  • @mazharali9900 · 3 years ago

    We are using DAHU fans for cooling.

  • @zodiacfml · 3 years ago

    Accurate and correct until the last part: at 8:20, racks don't exhaust air to the rear but to the top. Data center cooling still lacks efficiency/innovation though; for example, humans working in data centers don't need to be cooled, nor does the building/room containing the racks. Cool air could simply be sucked in underneath the racks, but no.

  • @paulmcclung9383 · 3 years ago

    US DOE sets a specific performance requirement in addition to local mechanical codes.

  • @buntyshukla2625 · 3 years ago +1

    Btw, thanks for the video; I had been looking for one ever since I heard of raised floor cooling.

  • @farnoodshabafroozan4968 · 3 years ago

    That was useful

  • @scorpio4041 · 3 years ago +1

    Hey man I'm "just chilling out"

  • @bayou__ · 1 year ago

    Good video

  • @DanielBerzinskas · 3 years ago

    I am subscribed!

  • @hquanngd · 3 years ago +2

    1:22 The image no server owner wants to see, and you show it to everyone ;)
    I hope Google's server manager doesn't want to kill you ;-)

  • @Spozinbro · 1 year ago

    Funny thing, I got into servers a few months ago and might start hosting a local VPN.

  • @brlinf06398 · 10 months ago

    Just chilling out

  • @AsstromechR4-M1AsstromechR4-M1 · 10 months ago +1

    Yes, even those. For my R4H18 and R4X2, probably up and running very safe; good astromechs, Cisco too, for now. Good server rooms too.

  • @XchemicleX · 3 years ago

    Can anyone explain how a hot aisle containment system is more efficient than a cold aisle containment system, and by how much (approx.)?

  • @raulalvareztenor8928 · 3 years ago

    Hey Paul, I work in the immersion cooling datacenter industry. Drop me a line to see how we can work together.

  • @iliapopovich · 3 years ago

    Good video, well done, but mathematically it's probably the easiest calculation (for an A-grade student). :))

  • @vigneshwaranms · 3 years ago

    just chilling out

  • @qjtvaddict · 3 years ago

    Why not just open data centers in the Arctic?

  • @MrSabram07 · 3 years ago

    Cool man