A Fun Data Center Tour at PhoenixNAP

  • Published: Dec 1, 2024

Comments • 313

  • @beloved_lover
    @beloved_lover 3 years ago +148

    More of these tours, always interesting to see different Data Centers.

  • @Jessassin
    @Jessassin 3 years ago +67

    I have a half cage colo there! Super cool to see a video about this!!!

  • @RakeshSharma_PCTeKReviews
    @RakeshSharma_PCTeKReviews 3 years ago +10

    Thanks PhoenixNAP for the wonderful look inside your data center.

  • @danielchester5131
    @danielchester5131 3 years ago +61

    I project managed a millimeter wave radio on the roof and full rack colo project in that facility. Really cool to actually see it!

    • @rjy8960
      @rjy8960 3 years ago +4

      Wow!
      I've recently set up a system for the QO-100 / Es'Hail geostationary satellite for narrowband amateur radio - uplink on 2.4GHz and downlink on 10GHz. I'm really hoping to do something in the mm spectrum in the future. That must have been an interesting project :)

    • @alexthelion335
      @alexthelion335 2 years ago

      Nice!

  • @OTechnology
    @OTechnology 3 years ago +48

    Showing the behind the scenes on the cooling system is awesome!

  • @JeffGeerling
    @JeffGeerling 3 years ago +123

    7:40 - /me sees humidifier in a DC, freaks out...
    Then I realized this is in the arid land of Phoenix, and not a humid swamp like St. Louis!

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  3 years ago +12

      Yea! Pretty darn cool.

    • @lasbrujazz
      @lasbrujazz 3 years ago +9

      Now, go make Pi-powered humidifier.

    • @chiragsukhala
      @chiragsukhala 3 years ago +5

      wow, even Jeff is here. just don't trip those breakers if you are red shirt Jeff

    • @chiragsukhala
      @chiragsukhala 3 years ago +5

      from Pi Cluster to Pi Data Center, we all are in this together.

    • @jfbeam
      @jfbeam 3 years ago +1

      They're usually inside the air handlers.

  • @davidcarroll2908
    @davidcarroll2908 2 years ago +14

    As a contractor who has built computer rooms before, this was impressive; they have spared no expense to make this attractive to all customers. They even put glass storefronts in their mechanical rooms, with double and triple redundancy. All I can say is wow; it would take just about an act of war to take this offline.

  • @AWPneeson
    @AWPneeson 3 years ago +32

    NOW this is some cool behind the scenes action. awesome stuff

  • @JonMasters
    @JonMasters 3 years ago +7

    This is useful for folks who don’t get a chance to visit datacenters. Thanks for doing it!

  • @Chopancho93
    @Chopancho93 3 years ago +9

    Please make more of these tours. It's always amazing to see Data Centers.

  • @garyseaman6105
    @garyseaman6105 3 years ago +1

    Very interesting indeed. Thank you PhoenixNAP and STH.

  • @acruzp
    @acruzp 3 years ago +1

    Frank is incredibly well spoken and clear.

  • @bahmanhatami2573
    @bahmanhatami2573 3 years ago +7

    Those guys seem really kind and not at all selfish. Good job to you and them both.

  • @alexgravenor
    @alexgravenor 3 years ago +18

    Amazing video :)
    Well shot and edited. Great content

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  3 years ago +10

      Joe did a great job on this. It was a big help not to have to shoot the video myself.

  • @marktackman2886
    @marktackman2886 3 years ago +9

    Thank you for the transparency disclosure.

  • @setharnold9764
    @setharnold9764 3 years ago +6

    Awesome stuff, best infomercial I've watched in a long time :D I hope other data center folks want in on this. Thanks

  • @The_Personal_Picks_SnM
    @The_Personal_Picks_SnM 1 year ago

    Thanks a lot for making this video and making it possible for people to see what an actual datacenter and its components look like.

  • @stucorbishley
    @stucorbishley 3 years ago +8

    This was fantastic! I've been in the bowels of a few DCs, but my god, that's one heck of a facility. Thanks for shooting this! 😃

  • @silentbyte33
    @silentbyte33 3 years ago +2

    I absolutely enjoyed this video! This is by far the most in-depth video I've seen about a data center!

  • @harrisongilbert
    @harrisongilbert 3 years ago +5

    Great tour! I’d love to see more of these videos!

  • @lennygemar1021
    @lennygemar1021 3 years ago +1

    Great video. Thanks for posting this. I used to work in IT data centers in the late 90s/early 2000s but haven't been in a big one since probably 2006. They've come a long way.

  • @AZwupatki
    @AZwupatki 3 years ago

    I worked in an IDC in the Phoenix area and loved it in the summertime, as I would have to work under the floor and freeze my arse off. Watching this brings back memories; good times, and it was an awesome job.

  • @johntrussell7228
    @johntrussell7228 3 years ago +11

    I'm always fascinated by how many physical security measures there are for data centers.

    • @mrmotofy
      @mrmotofy 3 years ago +3

      Data security is critical these days. With a few clicks you can be wealthy, broke or have 8 warrants from multiple states. People NEED to start taking it more seriously, it's only gonna get worse. Ask anyone who has had identity theft. Your accounts are locked down, can't put gas in your car, buy anything, make your mortgage payments, receive paychecks etc. It can be devastating for a year or more and destroy credit etc.

  • @davelamont
    @davelamont 2 years ago +1

    This was a great video. I've always wanted to tour a data center, and here you tour a very large one. Great content!

  • @VidarStorm
    @VidarStorm 3 years ago +1

    I was a systems engineer for 15 years at a small-medium sized data center. It was a great experience. Now I am a cloud and virtualization architect for a large infrastructure where we maintain servers in two data centers as well as disaster recovery with a large public cloud provider. Phoenix NAP looked great! But I must say that my IT Disneyland was Switch (formerly SuperNAP) in Vegas. That place sets the standard for all carrier connectivity as well as unique heat/air management. Hard to get a tour though. I was lucky to get my tour there.
    Thanks for the tour at Phoenix NAP!

    • @chumpmu1
      @chumpmu1 2 years ago

      Agreed. This is a great DC, but Switch is on an entirely different level. Glad to have the opportunity to have a full rack co-lo there. If only I could work out of it everyday!!

  • @BadAssAdministrator
    @BadAssAdministrator 2 years ago

    Just ordered a dedicated server from these folks as a DR site. Thanks STH!

  • @TomWhi
    @TomWhi 3 years ago +1

    Great video, they’re obviously very passionate about what they do!

  • @apefu
    @apefu 3 years ago +23

    This was cool. Major nerd creds for making this!
    I've worked with quite a few data centers over the years but they looked like small hobby projects compared to this :)

  • @jpshanuson7192
    @jpshanuson7192 3 years ago +1

    Nice. I racked two full cabs there about 3 years ago. Great facility

  • @Ogorodovd
    @Ogorodovd 3 years ago +1

    Really cool! Love these kind of educational/doc-style videos!

  • @youtubecommenter4069
    @youtubecommenter4069 3 years ago +1

    Hey Patrick, this is so cool. Me flashing back to class trips for practicals then back to write about the facilities. Please, do this more.

  • @SOF006
    @SOF006 3 years ago +1

    Very cool, data centres fascinate me. It's impressive seeing all that equipment in one location; it makes me wonder what it's all doing.

  • @MyAeroMove
    @MyAeroMove 3 years ago +2

    Awesome intro into macro scale!

  • @noahneutral7557
    @noahneutral7557 3 years ago +1

    I live there and I work at Sky Harbor! I hope you had a great time here! I enjoyed the video.

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  3 years ago +1

      Sweet! Thanks for the kind words and have a great weekend.

    • @noahneutral7557
      @noahneutral7557 3 years ago

      @@ServeTheHomeVideo you too! Thanks!

  • @rjy8960
    @rjy8960 3 years ago +1

    Brilliant! Thanks Patrick!
    I live in the UK and our HQ is in Phoenix, so due to Covid I haven't had the chance to go for over a year. It was so nice to see a few images of Sky Harbour and the sun!
    Great video - I'm always fascinated to learn about real infrastructure. At home I have a 1Gbps link and a backup 4G connection. The main connection went down on Thursday and I have to wait till Tuesday morning to get it back. Not very happy.
    I really enjoy the STH channel. Stay safe.

    • @noahneutral7557
      @noahneutral7557 3 years ago +1

      KPHX is a great airport! I work there as a ramper!

    • @rjy8960
      @rjy8960 3 years ago

      @@noahneutral7557 I miss it! Happy days, hopefully soon to come back! :)
      Stay safe!

  • @EggHead2103
    @EggHead2103 3 years ago +10

    Very cool video, and awesome what kind of access they gave. Being in the Phoenix area, I can definitely corroborate that Datacenters are on the rise here, in spite of the environmental factors (high heat, potential for low water supply).
    Might need to apply there 🤔.

  • @MAG320
    @MAG320 2 years ago

    Magnificent all the way.
    I understand the whole security part, but they are also getting free marketing. You got my like.

  • @Nobe_Oddy
    @Nobe_Oddy 3 years ago +1

    THAT PLACE IS AMAZING!!! WOW!!!!!!!

  • @Slackw4x
    @Slackw4x 3 years ago +2

    I'm still proud of managing data on my NAS in my living room

  • @riccardik
    @riccardik 3 years ago +4

    pleaaseee do more of those videos :D crazy interesting

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  3 years ago

      Glad you think so. Anything in particular you found interesting?

  • @nicholascherry5962
    @nicholascherry5962 3 years ago +1

    This man is super technical, knows his stuff, and doesn't mind sharing his knowledge. Great show!!

  • @deanrhodenizer938
    @deanrhodenizer938 3 years ago +3

    Greetings from Canada and thank you for a great tour given the security considerations involved. I was surprised that the facility only had 60 seconds of capability for running on battery. You must have a lot of confidence in those generators moving from the OFF state to full load very quickly. I was also surprised by the amount of cooling capability you can accommodate - 43 kW in a single cabinet. That is enough heat generation in one cabinet that it is like a controlled fire - impressive (a quick conversion to cooling tons is sketched after this thread). I was surprised by the location selected given how difficult heat dissipation would likely be in Phoenix. I guess having Tier 1 ISP availability from multiple providers counts for a lot. It is too bad that there is no good way (presently at least) to put all the waste heat that gets generated to some useful purpose. Thanks again.

    • @poitiers2853
      @poitiers2853 3 years ago

      Historically, Phoenix always averaged about 76°F in the summertime until they installed all of those data centers.
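
A quick aside on the 43 kW-per-cabinet figure discussed above: this minimal Python sketch converts that electrical load into the cooling it demands. The conversion factors are standard constants; none of the numbers come from the video itself.

```python
# Convert an IT load in kW into BTU/hr and tons of refrigeration.
BTU_PER_WATT_HOUR = 3.412   # 1 W dissipated = 3.412 BTU/hr
BTU_PER_TON_HOUR = 12_000   # 1 ton of refrigeration = 12,000 BTU/hr

def cooling_tons(load_kw: float) -> float:
    btu_per_hour = load_kw * 1_000 * BTU_PER_WATT_HOUR
    return btu_per_hour / BTU_PER_TON_HOUR

# One 43 kW cabinet needs roughly 12.2 tons of cooling on its own.
print(f"{cooling_tons(43):.1f} tons")
```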

  • @strongium9900
    @strongium9900 2 years ago +1

    What a cool learning experience.

  • @thimslugga
    @thimslugga 3 years ago +10

    Ha, I've been to this facility and installed/turned up equipment for my previous employer, who was one of their larger colo customers. I'm surprised they didn't talk about the no-cardboard rule. They are super strict about bringing in anything remotely close to cardboard, like Cisco license envelopes, and if you tailgate through the mantraps they will come over the speaker and give you a hard time.

    • @SkynetCyb
      @SkynetCyb 3 years ago

      Why no cardboard?

    • @thimslugga
      @thimslugga 3 years ago +1

      @@SkynetCyb shedding of dust and contamination. When you tear cardboard you see little particles fly in the air

    • @SkynetCyb
      @SkynetCyb 3 years ago

      @@thimslugga That's pretty smart, I never would've thought about it, I thought they had filters in place for this use case though?

    • @jfbeam
      @jfbeam 3 years ago +2

      @@SkynetCyb They don't want that crap being sucked into the air handlers. Dust is murder in a DC. I had one "lab" (~400sqft) that was 100% isolated from the building HVAC. The filters on the CRAC stayed like new for 5 years... until the a**holes cut a 1sqft hole in the wall; the filters were clogged in less than a week. (and two servers were killed by drywall dust -- fucks with power supplies frying the cpus.)

    • @BullCheatFR
      @BullCheatFR 3 years ago

      They seem ok with it if it's something like a reusable box or anything you're not going to tear open basically.

  • @Agakir
    @Agakir 3 years ago +1

    I have been in a few data centers and backup data centers, and a few of them, due to company specifications, were in the same location as a communication junction.

  • @stevejoseph1664
    @stevejoseph1664 3 years ago +1

    Thanks Frank.

  • @redtails
    @redtails 3 years ago +4

    Crazy to see all that physical security with steel cages around the racks. Makes sense if you consider there's 10s of millions of $$ in each rack nowadays, even more with GPU or HDD clusters.

    • @lennygemar1021
      @lennygemar1021 3 years ago +4

      While the hardware and software may have a high value, the real value they're protecting is the service each client provides. Imagine you're a client like AWS or Google where your revenue is directly tied to your up-time and data throughput. All that security and redundancy directly contribute to a company's bottom line.

    • @BullCheatFR
      @BullCheatFR 3 years ago +1

      Also makes sense when you consider any other customer could go to your rack and pick their way in

  • @latemhh5577
    @latemhh5577 3 years ago +3

    Tours are always interesting

  • @cheddarcheese
    @cheddarcheese 3 years ago +1

    Welcome to Phoenix!

  • @GGBeyond
    @GGBeyond 3 years ago +4

    I'd love to see more of this kind of content. If possible, I'd like to see hardware in the racks and what they're being used for. I have my own full-rack in a colo and I'd love to get some ideas.

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  3 years ago +3

      Usually showing hardware in racks is not allowed. Trying to address the hardware side with our hardware reviews

    • @AchwaqKhalid
      @AchwaqKhalid 3 years ago +3

      +++

    • @jfbeam
      @jfbeam 3 years ago

      It's customer gear, so "none of your business."

  • @suntzu1409
    @suntzu1409 3 years ago +3

    Would love detailed tours of storage racks, compute racks, networking racks, etc. in data centers

  • @huplim
    @huplim 3 years ago +1

    Awesome stuff!
    More of this please

  • @fooey88
    @fooey88 3 years ago +1

    Great video! Thanks for sharing.

  • @rileyhayes1493
    @rileyhayes1493 3 years ago +2

    Seeing Telstra on the glass really threw me; I knew they had gear in other countries but didn't realise they did SD-WAN/VPN/other datacentre-related hosting. Very cool!

  • @lucheanywhite
    @lucheanywhite 7 months ago +1

    Very informative ‼️

  • @gustavb6062
    @gustavb6062 3 years ago +2

    More content like this, awesome

  • @seccentral
    @seccentral 3 years ago +1

    Thanks for sharing; would love some tours of other multi-tenant DCs, as well as some private ones if ever possible, like Apple, Tesla, etc.

  • @ArcticSilverFox1
    @ArcticSilverFox1 3 years ago +1

    Data center appliances (servers, switches, storage, etc.) operate on 110-240V. PSUs can deliver more output power at higher input voltages, which is why almost all data center equipment is run on 208V or higher. If you check any server PSU, you will see three power outputs listed on it, one for each voltage range. Blade chassis almost always require 240V, although you can purchase 120V PSUs for a blade chassis in the rare case someone wants to run it in their office or in a quarter cabinet. Large high-power (8 kW and higher) 0U PDUs are 208V or higher (like 208V 3-phase 30 A). There are very few 0U PDUs that are 120V, which is why he said 120V is usually requested by small customers wanting a quarter cabinet. (A quick current calculation is sketched after this thread.)

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  3 years ago

      A long time ago (2015) we did a piece on 120V v. 208V using HPE power supplies. www.servethehome.com/120v-208v-power-consumption/
      Good points in your comment. Hopefully can incorporate them on the next tour.

    • @mrmotofy
      @mrmotofy 3 years ago

      Yep, more power with less copper/heat is just simpler. The norms aren't generally there because they're better... it's kind of random; popularity, political influence or cost drive it. Think Beta vs VHS - who won? Another good one I researched was the Phillips screw vs the square drive or Torx. We got the Phillips, which actually strips like it was designed to do for its original application. We use it thanks to Henry Ford, who did a cost analysis of the square drive (the Canadian Robertson) vs the Phillips. He calculated he could get the same job done for less with the Phillips... then it became so common that it's still the annoyance we have to suffer with today... which is slowly changing.
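
To make the 120V-vs-208V point above concrete, here is a minimal Python sketch of the current drawn by the same load at different supply voltages. The 8 kW load and the 80% continuous-load breaker sizing are illustrative assumptions, not figures from the video.

```python
import math

def amps_single_phase(watts: float, volts: float) -> float:
    """Current for a single-phase load (power factor ignored)."""
    return watts / volts

def amps_three_phase(watts: float, volts_line_to_line: float) -> float:
    """Current per phase for a balanced three-phase load (power factor ignored)."""
    return watts / (math.sqrt(3) * volts_line_to_line)

LOAD_W = 8_000  # hypothetical 8 kW cabinet
for label, amps in [
    ("120V single-phase", amps_single_phase(LOAD_W, 120)),
    ("208V single-phase", amps_single_phase(LOAD_W, 208)),
    ("208V three-phase ", amps_three_phase(LOAD_W, 208)),
]:
    # Keep the continuous load at or below 80% of the breaker rating.
    print(f"{label}: {amps:5.1f} A (breaker should be >= {amps / 0.8:.0f} A)")
```

The same 8 kW that needs roughly 67 A at 120V fits in about 22 A per phase at 208V three-phase, which is one reason the larger PDUs the comment mentions are all 208V or higher.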

  • @robertwolfiii8711
    @robertwolfiii8711 2 years ago

    Thank you for showing us

  • @Miphen0707
    @Miphen0707 3 years ago +15

    Would like to see how they go about servicing the hardware on servers, disk drives, switches etc.

    • @ws2940
      @ws2940 3 years ago

      It usually depends on whether the hardware is from a vendor that offers a warranty, and on whether it is monitored offsite by the vendor. A lot of HW vendors have contracted out their field tech work to Insyte Global and Infosys. If the hardware has a warranty, they usually send a tech out with a part to repair it, or the part is shipped to the site for the tech to use when they arrive. The faulty parts are shipped back to the vendor. The sole exception is anything that retains media, such as hard drives; depending on the client, they can be shredded onsite by companies such as ProShred, or the client has onsite degaussing/shredding equipment to take care of the drives. If the hardware is out of warranty, then whoever runs it will (hopefully) have parts or replacements on site or readily on hand to repair it, depending on whether the client has chosen to employ onsite personnel to repair their hardware or has other arrangements (contracts with IT firms to provide techs that reboot/repair/replace hardware within SLA).

    • @Miphen0707
      @Miphen0707 3 years ago +1

      @@ws2940 Thank you for the very much appreciated and detailed information. Greatly enjoyed the story and explanation. Regards, Michael from Australia.

    • @watcher206
      @watcher206 1 year ago

      @Michael Enright Do you happen to remember what was said? It appears that the comment you replied to got taken down.

  • @aaronchamberlain4698
    @aaronchamberlain4698 3 years ago +2

    Good video. Had to laugh at 10:09. I had to learn LabVIEW in school, and it immediately jumps out at me when a control panel is built with it.

  • @topendtrucker
    @topendtrucker 3 years ago +8

    Seeing Telstra etched into the glass was interesting .. an Australian telecommunications company

    • @Healed
      @Healed 3 years ago

      Telstra are all over the world, just like the other names etched on the glass - www.telstra.co.uk/en/products/cloud/colocation

  • @somerandomguy1533
    @somerandomguy1533 3 years ago +1

    That was a really cool video!

  • @carlchristenisnes6763
    @carlchristenisnes6763 3 years ago +2

    Great video, love datacenters

  • @Nick-zu9sn
    @Nick-zu9sn 3 years ago +1

    Great, just great. Thanks!

  • @mauisam1
    @mauisam1 3 years ago +1

    Super cool !!! Thanks.

  • @johnmijo
    @johnmijo 3 years ago +48

    Hmm, how can you NOT like a Data Center Tour, unless you are one of PhoenixNap's competitors :p

  • @Dmitriy.0
    @Dmitriy.0 3 years ago +4

    And here I am with my 40TB NAS on a decade old hardware, in a dinky 12U open-frame rack home lab.

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  3 years ago +2

      That is the gateway!

    • @mrmotofy
      @mrmotofy 3 years ago

      It all starts somewhere. I started way back with his and hers computers and wanting to share internet access through a 56k dialup modem. Some research to find out what I needed. Then off to the store and looking at prices of 25ft patch cords...wow I can buy the bulk wire and couple connectors for so much less...just need the tools and learn to do it. I knew that was only the beginning.

  • @Jonathan-iq4hl
    @Jonathan-iq4hl 3 years ago +1

    I also work in a data center, but this is the first time I have seen a UPS and battery system designed to work outdoors.

  • @angryjoshi
    @angryjoshi 1 year ago +2

    PhoenixNAP in AZ is a nice one; we have a cage in there too. I think I even saw it in the video lol

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  1 year ago

      We may be getting a cage there next year.

    • @angryjoshi
      @angryjoshi 1 year ago

      @@ServeTheHomeVideo Maybe we'll become cage neighbors; we're on the lower floor. Although some space has freed up, there isn't much left tho haha

  • @visvamba
    @visvamba 3 years ago +2

    More of this kind of content!

  • @capability-snob
    @capability-snob 3 years ago +1

    5:23 my goodness, is that an altix?? Lovely retro vibes!

  • @АбракадабраКобра259
    @АбракадабраКобра259 3 years ago +7

    Holy cow... Never been to a DC. This is crazy stuff. So redundant.

    • @mrmotofy
      @mrmotofy 3 years ago

      2 is 1 and 1 is none

  • @LearnEnglishWithMatta
    @LearnEnglishWithMatta 2 years ago +1

    Great video. 💪🏽

  • @Miphen0707
    @Miphen0707 3 years ago +7

    I wonder what the stats and plans are for running out of space, even though they are adequately supplied with power.

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  3 years ago +11

      They just announced that they are building a 500,000 sq ft facility next to this one which is 200,000 sq ft. I think that is how they are solving for space.

    • @Miphen0707
      @Miphen0707 3 years ago

      @@ServeTheHomeVideo thank you

  • @andreavergani7414
    @andreavergani7414 3 years ago +1

    Wow! Soo cool. You were really lucky to film inside that datacenter. I am jealous ahhah

  • @LoneRiderz
    @LoneRiderz 3 years ago +1

    Awesome video!

  • @michaelocampo9986
    @michaelocampo9986 3 years ago +4

    Very cool. Great to hear about their bare-metal services running on high density servers from Supermicro! :) Nice work, Patrick! Keep it up.

  • @pkt1213
    @pkt1213 3 years ago +2

    Working in a place like this seems like it would be pretty cool.

    • @ptmnc1
      @ptmnc1 3 years ago +1

      Well it is nice on a hot day. But not for too long: no clocks, no windows, loud white noise, can all get almost disorienting after a number of hours.

  • @MajesticNerd
    @MajesticNerd 3 years ago +1

    Data centers always make such a show of physical security. It's important and needed, of course. More than once I've been to one, having passed through multiple levels of security and tech to get in, only to see a roll-up door to the parking lot open with guys standing around smoking or shooting the shit. Or a side door where employees and friends go in and out without passing through the gauntlet. While important, a lot of that is for show from what I've seen. That wasn't at PhoenixNAP, but at several top-tier DCs around the country. Some of the biggest DCs, industrial ones primarily for telcos and that sort of thing, have nobody onsite; a keycard and maybe a fingerprint to get in, and you're in the DC. None of the fluff, just the stuff you need.

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  3 years ago +1

      Actually, we did tour the dock. There are multiple levels of security there, but there was equipment on pallets that we did not want to show.

    • @MajesticNerd
      @MajesticNerd 3 years ago

      @@ServeTheHomeVideo That is cool, and glad they are consistent. Mine was more a general comment on how much emphasis data centers put on physical security, or the impression of physical security at the front, when it's super rare that someone crashes through a door or tries to Mission Impossible into server cages and racks. There is far more risk in most of these places over the network and internet than in the building or at the doors. Since I mentioned Mission Impossible, a pet peeve of mine that you touched on: in movies, data centers are almost always dead silent. If you spend any time in one, you learn quickly to bring ear protection, as it gets to you over time.

  • @amateurambience
    @amateurambience 2 years ago +1

    Learned a lot! Thanks

  • @wudchk
    @wudchk 3 years ago +1

    You should get a tour of Switch NAP in Vegas. It's crazy impressive.

  • @sevenredundent7256
    @sevenredundent7256 2 years ago +1

    Ah, I am proud to see my employer cools this facility, and I am even prouder to work for them now.

  • @scbtripwire
    @scbtripwire 3 years ago +1

    Man I would love to see Pen testers work their magic here!

    • @jfbeam
      @jfbeam 3 years ago +3

      Having worked in a few DC's that like to show this sort of thing on tours, I'm 99% certain the facility would fail within seconds if anyone looks beyond the tour route. 90% of the security of any such facility is in the "first layer"... the general difficulty to get on the floor in the first place. Unfortunately, for a commercial DC, that's not much of a barrier. Once on the floor, it's pretty easy to walk into areas you aren't supposed to, and those cages are mostly just for show; they don't offer a great deal of resistance. Customers bank on security watching all of the (thousands) of cameras.
      (I would hope PhoenixNAP hasn't done what _so_ many other places do... have unsecured doors bypassing the theater shown to customers. I've seen BANK data centers doing that.)

  • @ArcticSilverFox1
    @ArcticSilverFox1 3 years ago +3

    Another question worth asking: how do they compete with larger hosting companies like OVH? PhoenixNAP's bare-metal server rental cost is a lot higher than OVH's.

  • @Iamdebug
    @Iamdebug 3 years ago +2

    A 43 kW rack seems fairly out there, but then I look at my half-full cabinet that can use 15 kW if I let it, and suddenly that seems a lot closer.

  • @cdoublejj
    @cdoublejj 3 years ago +1

    HA! They still have the waterfall wall!

  • @lukewalker3905
    @lukewalker3905 3 years ago +5

    Hey Patrick, it would be amazing if you could include Celsius temps at 7:31; just spare a thought for the rest of the world that uses metric. You already went to the effort of making the graphic, so just take the extra 10 secs to throw metric on it. Thanks! (A quick conversion is sketched after this thread.)

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  3 years ago +1

      Ha! Giving me too much credit here. Joe edited/ made the graphic and overall did a great job. We will add metric in the future. I usually try to on other reviews just did not get it in this one.

    • @lukewalker3905
      @lukewalker3905 3 years ago

      @@ServeTheHomeVideo Thank you sir, other than that great video!

    • @mrmotofy
      @mrmotofy 3 years ago

      Maybe you clowns in the metric world should switch to a real Imperial system LOL J/K. If it makes you feel any better, technically the US is on the metric system. The govt officially switched years ago... they just have no power to force private business to switch over, so we're sort of stuck in a middle ground of both.

    • @lunascomments3024
      @lunascomments3024 3 years ago +1

      22.2 °C I think.

    • @KaesOner
      @KaesOner 3 years ago +1

      You're interested in videos about data centers, yet you can't even use Google? It would have taken you a quarter of the time to find the answer online compared with writing out your question.
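
For completeness, here is the conversion behind the 22.2 °C figure quoted above; a minimal Python sketch, with 72 °F (the figure another commenter mentions) used as the example input.

```python
def fahrenheit_to_celsius(temp_f: float) -> float:
    """Standard Fahrenheit-to-Celsius conversion."""
    return (temp_f - 32) * 5 / 9

# 72 °F comes out to roughly 22.2 °C, matching the reply above.
print(f"{fahrenheit_to_celsius(72.0):.1f} °C")
```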

  • @ewenchan1239
    @ewenchan1239 3 years ago +1

    I wonder how much power the cooling facilities take...
    Pardon the pun though, this is REALLY cool!!!
    I've visited a Telus datacenter in Toronto before as well and my dad used to work at a bank so I got to see a number of data centers, albeit not at this hyperscaler scale.

  • @Mndezthecreator
    @Mndezthecreator 3 years ago +1

    Wooow this is insane!

  • @powell.christopher
    @powell.christopher 3 years ago +1

    Very cool

  • @ihameed
    @ihameed 3 years ago +1

    Loved it. The only other detailed video about data centers I've seen is one from the UK. I keep wondering how long techs have to be on the floor in that noisy environment.

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  3 years ago

      Usually you have heavy hearing protection on. We just did not have it here for filming and so we were going off the floor every few min.

  • @samybenzekry
    @samybenzekry 2 years ago +1

    Mr. McClarthy said that they can cool a 44 kW rack with no special containment units (12 minutes into the video). That's really impressive. I may be wrong, but I'm thinking it's probably not a standard 42U rack? I would love to see it. :)
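
As a rough feel for what cooling a rack that dense takes, here is a minimal airflow estimate in Python. The 25 °F air temperature rise across the rack and the standard sensible-heat factor are my assumptions, not figures from the video.

```python
BTU_PER_WATT_HOUR = 3.412  # 1 W dissipated = 3.412 BTU/hr

def required_cfm(load_watts: float, delta_t_f: float) -> float:
    """Airflow (CFM) to remove a sensible heat load: BTU/hr = 1.08 * CFM * dT(°F)."""
    return load_watts * BTU_PER_WATT_HOUR / (1.08 * delta_t_f)

# A 44 kW rack with a 25 °F rise needs on the order of 5,600 CFM of airflow.
print(f"{required_cfm(44_000, 25):,.0f} CFM")
```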

  • @GanDtech
    @GanDtech 3 years ago +1

    Nice boxes

  • @bc5891
    @bc5891 3 years ago +2

    4:52 - investing in a datacenter with NO hot room ("not hot aisle"), so the heat is in some way constantly mixing with cool air, is not a good move; considering the amount of money that DC is making, they are actually being very inefficient. My DC has the APC heat room with 4 rows of 20 48U racks, 3100 servers, 4 EMC SAN units and 4 Cisco cores in the 4 corners of the space. The entire room is cooled with 30 tons, using 29% of its cooling ability, and the heat it generates is recycled and used in the winter to heat the building office space.
    Datacenters are supposed to be greenish and efficient, not designed to be power-hungry cities. I would hate to see the price tag for all that power and cooling equipment.

    • @mrmotofy
      @mrmotofy 3 years ago

      Yep, anyone with a basic understanding of HVAC understands that. Even minimal blocking off between aisles with glass or plexiglass or something would dramatically increase efficiency.

    • @binba9
      @binba9 3 years ago +1

      Yeah that part surprised me. I'm planning a server room now and hot aisle containment seems like a no brainer.

  • @ahuachapan2
    @ahuachapan2 3 years ago +2

    this IS a datacenter!

  • @twg2984
    @twg2984 2 years ago +1

    Since when is 72 degrees "very cold" lol. Great video!

  • @gdrriley420
    @gdrriley420 3 years ago +1

    Being in an HPC data center, I forget that 30 kW+ is rare in racks.
    Also, man, is that a cold DC now; I'm used to them being high 70s / low 80s.

  • @trissylegs
    @trissylegs 2 years ago +1

    3:17. Was pretty surprised to see Telstra (Australia's largest ISP) on there.
    I suppose it's related to them part-owning the Australia-US fibre connections.

  • @benwu7980
    @benwu7980 3 years ago +2

    Great video! I worked at a telecoms DC for a few years on the 'physical' side; it was always fun.
    One thing I didn't quite get was them saying their generators can run 'indefinitely'? (A rough runtime estimate is sketched at the end of this thread.)

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  3 years ago +5

      I think the idea is that the diesel supply contracts and tanks ensure they can survive imaginable outages

    • @benwu7980
      @benwu7980 3 years ago

      @@ServeTheHomeVideo Ty for the reply, my first engagement on the channel :)
      I still do not view that as 'indefinite', however, nor immune to some imaginable scenarios.

    • @jfbeam
      @jfbeam 3 years ago +4

      Depends on their supply of fuel. Many claim "indefinite" because they're connected to a pipeline, vs. on-site tanks that have to be refilled from tankers. Ask some of the guys in NYC how "indefinite" that feed turned out to be. (not very. pipelines need power to function, too.)

    • @chumpmu1
      @chumpmu1 2 years ago

      Indefinite is parsing language for sure. But generally, since they probably host gov’t clients, they probably have a Tier II endorsement for fuel delivery during a disaster.
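
On the "indefinitely" question above, here is a minimal back-of-the-envelope sketch in Python of how long on-site fuel actually lasts. Every number is a hypothetical placeholder (the tank size, the load, and the rough 0.07 gal/kWh diesel burn rate), not a figure from the video.

```python
def runtime_hours(tank_gallons: float, load_kw: float,
                  gallons_per_kwh: float = 0.07) -> float:
    """Hours of generator runtime from on-site fuel at a given electrical load.

    gallons_per_kwh is a rough rule of thumb for a diesel genset near full load.
    """
    return tank_gallons / (load_kw * gallons_per_kwh)

# Hypothetical example: a 20,000-gallon tank carrying a 2 MW load lasts roughly
# 6 days, after which "indefinite" depends entirely on fuel-delivery contracts.
print(f"{runtime_hours(20_000, 2_000):.0f} hours")
```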