More of these tours, always interesting to see different Data Centers.
I have a half cage colo there! Super cool to see a video about this!!!
Awesome!
What's the pricing there for power, space, and transit?
@@ServeTheHomeVideo I would also love to know
Thanks, PhoenixNAP, for the wonderful inside look at your data center.
I project-managed a millimeter-wave radio on the roof and a full-rack colo project in that facility. Really cool to actually see it!
Wow!
I've recently set up a system for the QO-100 / Es'Hail geostationary satellite for narrowband amateur radio: uplink on 2.4 GHz and downlink on 10 GHz. I'm really hoping to do something in the mm spectrum in the future. That must have been an interesting project :)
Nice!
Showing the behind the scenes on the cooling system is awesome!
Thanks! It was super cool to see.
7:40 - /me sees humidifier in a DC, freaks out...
Then I realized this is in the arid land of Phoenix, and not a humid swamp like St. Louis!
Yea! Pretty darn cool.
Now, go make a Pi-powered humidifier.
Wow, even Jeff is here. Just don't trip those breakers if you're Red Shirt Jeff.
From Pi cluster to Pi data center, we're all in this together.
They're usually inside the air handlers.
As a contractor who has built computer rooms before, I found this impressive. They have spared no expense to make this attractive to all customers; they even put glass storefronts in their mechanical rooms, with double and triple redundancy everywhere. All I can say is wow. It would take just about an act of war to take this offline.
NOW this is some cool behind-the-scenes action. Awesome stuff.
This is useful for folks who don’t get a chance to visit datacenters. Thanks for doing it!
Please make more of these tours. It's always amazing to see Data Centers.
Very interesting indeed. Thank you PhoenixNAP and STH.
Frank is incredibly well spoken and clear.
Those guys seem really kind and not at all selfish with their knowledge. Good job to you and to them both.
Amazing video :)
Well shot and edited. Great content
Joe did a great job on this. It was a big help not to have to shoot the video myself.
Thank you for the transparency disclosure.
Awesome stuff, best infomercial I've watched in a long time :D I hope other data center folks want in on this. Thanks
Glad you enjoyed it!
Thanks a lot for making this video and making it possible for people to see what an actual data center and its components look like.
This was fantastic! I've been in the bowels of few DCs, but my god, that's one heck of a facility. Thanks for shooting this! 😃
I absolutely enjoyed this video! It is by far the most in-depth video I have seen about a data center!
Thanks Michael
Great tour! I’d love to see more of these videos!
Great video. Thanks for posting this. I used to work in IT data centers in the late 90s/early 2000s but haven't been in a big one since probably 2006. They've come a long way.
I worked in an IDC in the Phoenix area and loved it in the summertime, as I would have to work under the floor and freeze my arse off. Watching this brings back memories. Good times, and it was an awesome job.
I'm always fascinated by how many physical security measures there are for data centers.
Data security is critical these days. With a few clicks you can be wealthy, broke or have 8 warrants from multiple states. People NEED to start taking it more seriously, it's only gonna get worse. Ask anyone who has had identity theft. Your accounts are locked down, can't put gas in your car, buy anything, make your mortgage payments, receive paychecks etc. It can be devastating for a year or more and destroy credit etc.
This was a great video. I've always wanted to tour a data center, and here you do a tour of a very large one. Great content!
I was a systems engineer for 15 years at a small-to-medium-sized data center. It was a great experience. Now I am a cloud and virtualization architect for a large infrastructure where we maintain servers in two data centers as well as disaster recovery with a large public cloud provider. Phoenix NAP looked great! But I must say that my IT Disneyland was Switch (formerly SuperNAP) in Vegas. That place sets the standard for carrier connectivity as well as unique heat/air management. It's hard to get a tour, though; I was lucky to get mine.
Thanks for the tour at Phoenix NAP!
Agreed. This is a great DC, but Switch is on an entirely different level. Glad to have the opportunity to have a full-rack colo there. If only I could work out of it every day!!
Just ordered a dedicated server from these folks as a DR site. Thanks STH!
Great video, they’re obviously very passionate about what they do!
This was cool. Major nerd creds for making this!
I've worked with quite a few data centers over the years but they looked like small hobby projects compared to this :)
Nice. I racked two full cabs there about 3 years ago. Great facility
Really cool! Love these kinds of educational/doc-style videos!
Hey Patrick, this is so cool. Me flashing back to class trips for practicals then back to write about the facilities. Please, do this more.
Very cool, data centres fascinate me. It's impressive seeing all that equipment in one location; it makes me wonder what it's all doing.
Awesome intro into macro scale!
I live there and I work at Sky Harbor! I hope you had a great time here! I enjoyed the video.
Sweet! Thanks for the kind words and have a great weekend.
@@ServeTheHomeVideo you too! Thanks!
Brilliant! Thanks Patrick!
I live in the UK and HQ is in Phoenix, so due to Covid I haven't had the chance to go for over a year. It was so nice to see a few images of Sky Harbour and the sun!
Great video - I'm always fascinated to learn about real infrastructure. At home I have a 1Gbps link and a backup 4G connection. The main connection went down on Thursday and I'm having to wait till Tuesday morning to get it back. Not very happy.
I really enjoy the STH channel. Stay safe.
KPHX is a great airport! I work there as a ramper!
@@noahneutral7557 I miss it! Happy days, hopefully soon to come back! :)
Stay safe!
Very cool video, and awesome what kind of access they gave. Being in the Phoenix area, I can definitely confirm that data centers are on the rise here, in spite of the environmental factors (high heat, potential for low water supply).
Might need to apply there 🤔.
Magnificent all the way.
I understand the whole security part, but they are also getting free marketing. You got my like.
THAT PLACE IS AMAZING!!! WOW!!!!!!!
I'm still proud of managing data on my NAS in my living room.
pleaaseee do more of those videos :D crazy interesting
Glad you think so. Anything in particular you found interesting?
This man is super technical, knows his stuff, and doesn't mind sharing his knowledge. Great show!!
Greetings from Canada, and thank you for a great tour given the security considerations involved. I was surprised that the facility only has 60 seconds of runtime on battery. You must have a lot of confidence in those generators going from the OFF state to full load very quickly. I was also surprised by the amount of cooling capability you can accommodate: 43 kW in a single cabinet. That is enough heat generation that such a cabinet is like a controlled fire. Impressive. I was surprised by the location selected, given how difficult heat dissipation would likely be in Phoenix; I guess having Tier 1 ISP availability from multiple providers counts for a lot. It is too bad that there is no good way (presently at least) to put all the waste heat that gets generated to some useful purpose. Thanks again.
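For a sense of scale on that 43 kW figure, here is a rough back-of-the-envelope sketch in Python using the standard sensible-heat rule of thumb (BTU/hr ≈ 1.08 × CFM × ΔT in °F at sea level); the 20-25°F inlet-to-outlet temperature rise is an assumed typical server delta-T, not a number from the video:

rack_kw = 43.0                            # figure quoted in the comment above
btu_per_hr = rack_kw * 1000 * 3.412       # 1 W ~= 3.412 BTU/hr

for delta_t_f in (20, 25):                # assumed server inlet-to-outlet rise
    cfm = btu_per_hr / (1.08 * delta_t_f)
    print(f"dT = {delta_t_f} F -> roughly {cfm:,.0f} CFM of airflow for the rack")
# dT = 20 F -> roughly 6,792 CFM; dT = 25 F -> roughly 5,434 CFM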
Historically, Phoenix always averaged about 76°F in the summertime until they installed all of those data centers.
What a cool learning experience.
Glad you enjoyed
Ha I’ve been to this facility and installed / turned up equipment for my previous employer who was one of their larger colo customers. I’m surprised they didn’t talk about the no cardboard rule. They are super strict on bringing anything remotely close to cardboard like Cisco license envelopes and if you tailgate through the man traps they will come over the speaker and give ya a hard time.
Why no cardboard?
@@SkynetCyb shedding of dust and contamination. When you tear cardboard you see little particles fly in the air
@@thimslugga That's pretty smart, I never would've thought about it, I thought they had filters in place for this use case though?
@@SkynetCyb They don't want that crap being sucked into the air handlers. Dust is murder in a DC. I had one "lab" (~400 sq ft) that was 100% isolated from the building HVAC. The filters on the CRAC stayed like new for 5 years... until the a**holes cut a 1 sq ft hole in the wall; the filters were clogged in less than a week. (And two servers were killed by drywall dust, which gets into the power supplies and fries the CPUs.)
They seem ok with it if it's something like a reusable box or anything you're not going to tear open basically.
I have been in a few data centers and backup data centers... and a few of them, due to company specifications, were in the same location as a communications junction.
Thanks Frank.
Crazy to see all that physical security with steel cages around the racks. Makes sense if you consider there's 10s of millions of $$ in each rack nowadays, even more with GPU or HDD clusters.
While the hardware and software may have a high value, the real value they're protecting is the service each client provides. Imagine you're a client like AWS or Google where your revenue is directly tied to your up-time and data throughput. All that security and redundancy directly contribute to a company's bottom line.
Also makes sense when you consider any other customer could go to your rack and pick their way in
Tours are always interesting
Welcome to Phoenix!
I'd love to see more of this kind of content. If possible, I'd like to see hardware in the racks and what they're being used for. I have my own full-rack in a colo and I'd love to get some ideas.
Usually showing hardware in racks is not allowed. Trying to address the hardware side with our hardware reviews
+++
It's customer gear, so "none of your business."
Would love detailed tours of storage racks, compute racks, networking racks, etc. in data centers
Awesome stuff!
More of this please
Great video! Thanks for sharing.
Thanks for the kind words
Seeing Telstra on the glass really threw me. I knew they had gear in other countries but didn't realise they did SD-WAN/VPN/other datacentre-related hosting. Very cool!
Very informative ‼️
More content like this, awesome
Thanks for sharing. Would love some tours of other multi-tenant DCs, as well as some private ones if ever possible, like Apple, Tesla, etc.
Data center appliances (servers, switches, storage, etc.) operate on 110-240V. PSUs deliver more output power, and run more efficiently, at higher input voltages, which is why almost all data center equipment is run at 208V or higher. If you check any server PSU, you will see up to three output ratings listed on it, one for each input voltage range. Blade chassis almost always require 240V, although you can purchase 120V PSUs for a blade chassis in the rare case someone wants to run it at their office or in a quarter cabinet. Large high-power (8kW and higher) 0U PDUs are 208V or higher (like 208V 3-phase 30A). There are very few 0U PDUs that are 120V, which is why the guy said 120V is usually requested by small customers wanting a quarter cabinet.
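To make the higher-voltage point concrete, here is a minimal sketch of usable power per circuit. It assumes the common practice of derating breakers to 80% for continuous loads, and the circuit sizes are typical examples rather than anything specific to this facility; the roughly 8.6 kW result for a 208V 3-phase 30A feed lines up with the "8kW and higher" 0U PDUs mentioned above.

import math

DERATE = 0.8  # typical continuous-load derating on a breaker

def single_phase_kw(volts, amps):
    # usable power on a single-phase circuit after derating
    return volts * amps * DERATE / 1000

def three_phase_kw(volts_line_to_line, amps):
    # usable power on a three-phase circuit (line-to-line voltage) after derating
    return math.sqrt(3) * volts_line_to_line * amps * DERATE / 1000

print(f"120V / 20A single-phase: {single_phase_kw(120, 20):.2f} kW")  # ~1.92 kW
print(f"208V / 30A single-phase: {single_phase_kw(208, 30):.2f} kW")  # ~4.99 kW
print(f"208V / 30A three-phase:  {three_phase_kw(208, 30):.2f} kW")   # ~8.65 kW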
A long time ago (2015) we did a piece on 120V v. 208V using HPE power supplies. www.servethehome.com/120v-208v-power-consumption/
Good points in your comment. Hopefully we can incorporate them in the next tour.
Yep, more power with less copper/heat is just simpler. The norms aren't generally chosen because they're better; it's often popularity, political influence, or cost that drives them. Think Beta vs VHS: who won??? Another good one I researched was the Phillips screw vs the square drive or Torx. We got the Phillips, which actually strips out like it was designed to do for its original application. But we use it thanks to Henry Ford, who did a cost analysis of the square drive (Robertson, and Canadian) versus the Phillips. He calculated he could get the same job done for less with the Phillips... then that became so common that it's still the annoyance we have to suffer with today, which is slowly changing.
Thank you for showing us.
Would like to see how they go about servicing the hardware: servers, disk drives, switches, etc.
It usually depends on whether the hardware is from a vendor that offers a warranty, and whether it is monitored offsite by the vendor. A lot of HW vendors have contracted out their field tech work to Insyte Global and Infosys. If the hardware has a warranty, they usually send a tech out with a part to repair it, or the part is shipped to the site for the tech to use when they arrive. The faulty parts are shipped back to the vendor. The sole exception is anything that retains media, such as hard drives. Depending on the client, those can be shredded onsite by companies such as ProShred, or the client has onsite degaussing/shredding equipment to take care of hard drives. If the hardware is out of warranty, then whoever runs it will (hopefully) have parts or replacements on site or readily on hand to repair it, depending on whether the client has chosen to employ onsite personnel to repair their hardware or has other arrangements (contracts with IT firms to provide techs that reboot/repair/replace hardware within SLA).
@@ws2940 Thank you for the very much appreciated and detailed information. Greatly enjoyed the story and explanation. Regards, Michael from Australia.
@Michael Enright Do you happen to remember what was said? It appears that the comment you replied to got taken down.
Good video. Had to laugh at 10:09. I had to learn LabVIEW in school, and it always immediately jumps out at me when a control panel is built with it.
Seeing Telstra, an Australian telecommunications company, etched into the glass was interesting.
Telstra are all over the world, every bit as much as the other names etched on the glass: www.telstra.co.uk/en/products/cloud/colocation
That was a really cool video!
Great video, love datacenters
Thanks for watching!
It's quite amazing that "old tech" DGs can sync at the speed required. Love it.
Great, just great. Thanks!
Super cool !!! Thanks.
Hmm, how can you NOT like a Data Center Tour, unless you are one of PhoenixNap's competitors :p
And here I am with my 40TB NAS on a decade old hardware, in a dinky 12U open-frame rack home lab.
That is the gateway!
It all starts somewhere. I started way back with his-and-hers computers and wanting to share internet access through a 56k dialup modem. Some research to find out what I needed, then off to the store to look at prices of 25 ft patch cords... wow, I could buy the bulk wire and a couple of connectors for so much less; I just needed the tools and to learn how to do it. I knew that was only the beginning.
I also work in a data center, but this is the first time I've seen a UPS and battery system designed to work outdoors.
PhoenixNAP in AZ is a nice one. We have a cage in there too; I think I even saw it in the video lol
We may be getting a cage there next year.
@@ServeTheHomeVideo Maybe we'll become cage neighbors. We're on the lower floor; although some cages have freed up, there isn't much space left tho haha
More of this kind of content!
5:23 my goodness, is that an Altix?? Lovely retro vibes!
Holy cow... Never been to a DC. This is crazy stuff. So redundant.
2 is 1 and 1 is none
Great video. 💪🏽
I wonder what the stats and plans are for running out of space, even though they are adequately supplied with power.
They just announced that they are building a 500,000 sq ft facility next to this one which is 200,000 sq ft. I think that is how they are solving for space.
@@ServeTheHomeVideo thank you
Wow! Soo cool. You are really lucky to have been able to film inside that data center. I am jealous ahhah
Awesome video!
Very cool. Great to hear about their bare-metal services running on high density servers from Supermicro! :) Nice work, Patrick! Keep it up.
Working in a place like this seems like it would be pretty cool.
Well, it is nice on a hot day. But not for too long: no clocks, no windows, and loud white noise can all get almost disorienting after a number of hours.
Data centers always make such a show of physical security. It's important and needed, of course. But more than once I've been to one, having passed through multiple levels of security and tech to get in, only to see a roll-up door to the parking lot open with guys standing around smoking or shooting the shit, or a side door where employees and friends go in and out without passing through the gauntlet. While important, a lot of that is for show from what I've seen. That wasn't at PhoenixNAP, but at several top-tier DCs around the country. Some of the biggest DCs, the industrial ones primarily for telcos and that sort of thing, have nobody onsite; it's a keycard and maybe a fingerprint to get in, and you're in the DC. None of the fluff, just the stuff you need.
We actually did tour the dock. There were multiple levels of security there, but there was equipment on pallets that we did not want to show.
@@ServeTheHomeVideo That is cool, and glad they are consistent. Mine was more a general comment on how much emphasis data centers place on physical security, or the impression of physical security at the front, when it's super rare that someone crashes through a door or tries to Mission Impossible into server cages and racks. There is far more risk in most of these places over the network and internet than in the building or the doors. Since I mentioned Mission Impossible, a pet peeve of mine that you touched on: in movies, data centers are almost always dead silent. If you spend any time in one, you learn quickly to bring ear protection, as it gets to you over time.
Learned a lot! Thanks.
You should get a tour of Switch NAP in Vegas. It's crazy impressive.
They tend not to allow filming
Ah, I am proud to see that my employer cools this facility. I am even prouder to work for them now.
Man I would love to see Pen testers work their magic here!
Having worked in a few DCs that like to show this sort of thing on tours, I'm 99% certain the facility would fail within seconds if anyone looked beyond the tour route. 90% of the security of any such facility is in the "first layer": the general difficulty of getting on the floor in the first place. Unfortunately, for a commercial DC, that's not much of a barrier. Once on the floor, it's pretty easy to walk into areas you aren't supposed to, and those cages are mostly just for show; they don't offer a great deal of resistance. Customers bank on security watching all of the thousands of cameras.
(I would hope PhoenixNAP hasn't done what _so_ many other places do... have unsecured doors bypassing the theater shown to customers. I've seen BANK data centers doing that.)
Another question worth asking: how do they compete with larger hosting companies like OVH? PhoenixNAP's bare-metal server rental cost is a lot higher than OVH's.
A 43 kW rack seems fairly out there, but then I look at my half-full cabinet that can use 15 kW if I let it, and suddenly that seems a lot closer.
HA! They still have the waterfall wall!
Hey Patrick, it would be amazing if you could include Celsius temps at 7:31. Just spare a thought for the rest of the world that uses metric. You already went to the effort of making the graphic; just take the extra 10 secs to throw metric on it. Thanks!
Ha! Giving me too much credit here. Joe edited/made the graphic and overall did a great job. We will add metric in the future. I usually try to on other reviews; it just did not make it into this one.
@@ServeTheHomeVideo Thank you sir, other than that great video!
Maybe you clowns in the metric world should switch to a real Imperial system LOL J/K. If it makes you feel any better, technically the US is on the metric system. The govt officially switched years ago; they just have no power to force private businesses to switch over, so we're sort of stuck in a middle ground of both.
22.2 °C I think.
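A quick check of the conversion, assuming the on-screen figure was 72°F (the value another commenter quotes):

def f_to_c(f):
    # standard Fahrenheit-to-Celsius conversion
    return (f - 32) * 5 / 9

print(f"{f_to_c(72):.1f} C")  # 22.2 C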
You're interested in videos about data centers, yet you can't even use Google? It would have taken you a quarter of the time to find the answer online that it took to write your question.
I wonder how much power the cooling facilities take...
Pardon the pun though, this is REALLY cool!!!
I've visited a Telus datacenter in Toronto before as well and my dad used to work at a bank so I got to see a number of data centers, albeit not at this hyperscaler scale.
Wooow this is insane!
Very cool
Loved it. The only other detailed video about data centers I've seen is one from the UK. I keep wondering how long techs have to be on the floor in that noisy environment.
Usually you have heavy hearing protection on. We just did not have it here for filming, so we were going off the floor every few minutes.
Mr. McClarthy said that they can cool a 44 kW rack with no special containment units (12 minutes into the video). That's really impressive. I may be wrong, but I'm thinking it's probably not a standard 42U rack? I would love to see it. :)
Nice boxes
4:52 Investing in a datacenter with no hot room, not even a hot aisle, so the heat is constantly mixing with cool air: not a good move, and considering the amount of money that DC is making, they are actually being very inefficient. My DC has the APC heat room with 4 rows of 20 48U racks, 3,100 servers, 4 EMC SAN units, and 4 Cisco cores in the 4 corners of the space. The entire room is cooled with 30 tons running at 29% of its cooling capacity, and the heat it generates is recycled and used in the winter to heat the building office space.
Datacenters are supposed to be greenish and efficient, not designed to be power-hungry cities. I would hate to see the price tag for all that power and cooling equipment.
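As a rough translation of those cooling units into kW (1 ton of refrigeration = 12,000 BTU/hr ≈ 3.517 kW; the 30 tons and 29% figures are the commenter's own, not measurements from the video):

TON_KW = 3.517                      # kW of heat removal per ton of refrigeration

capacity_kw = 30 * TON_KW           # 30 tons of installed cooling
load_kw = capacity_kw * 0.29        # running at 29% of capacity
print(f"capacity ~{capacity_kw:.0f} kW, implied heat load ~{load_kw:.0f} kW")
# capacity ~106 kW, implied heat load ~31 kW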
Yep, a basic understanding of HVAC tells you that. Even minimal blocking off between aisles with glass or plexiglass or something would dramatically increase efficiency.
Yeah that part surprised me. I'm planning a server room now and hot aisle containment seems like a no brainer.
this IS a datacenter!
Since when is 72 degrees "very cold" lol. Great video!
Compared to summer in AZ!
Being in an HPC data center, I forget that 30 kW+ is rare in racks.
Also, man, that is a cold DC. I'm used to them being high 70s, low 80s.
3:17. Was pretty surprised to see Telstra (Australia's largest ISP) on there.
I suppose it's related to them part-owning the Australia-US fibre connections.
Great video! I worked at a telecoms DC for a few years on the 'physical' side; it was always fun.
One thing I didn't quite get was them saying their generators can run 'indefinitely'?
I think the idea is that the diesel supply contracts and tanks ensure they can survive imaginable outages
@@ServeTheHomeVideo Ty for the reply, a first engagement on the channel :)
I still do not view that as 'indefinite', however, nor immune to some imaginable scenarios.
Depends on their supply of fuel. Many claim "indefinite" because they're connected to a pipeline, vs. on-site tanks that have to be refilled from tankers. Ask some of the guys in NYC how "indefinite" that feed turned out to be. (Not very: pipelines need power to function, too.)
Indefinite is parsing language for sure. But generally, since they probably host gov’t clients, they probably have a Tier II endorsement for fuel delivery during a disaster.
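To put "indefinite" in perspective, here is a minimal runtime sketch. The tank size, load, and the roughly 7 gallons/hour per 100 kW full-load burn rate are generic rule-of-thumb assumptions for illustration, not PhoenixNAP figures; in practice "indefinite" really means refueling contracts that can keep pace with that burn rate.

def runtime_hours(tank_gallons, load_kw, gal_per_hr_per_100kw=7.0):
    # hours of generator runtime before the on-site tank needs a refill
    burn_rate_gph = load_kw / 100 * gal_per_hr_per_100kw
    return tank_gallons / burn_rate_gph

# e.g. a hypothetical 20,000 gallon tank carrying 2 MW of load
print(f"~{runtime_hours(20_000, 2_000):.0f} hours to the first refuel")  # ~143 hours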