Excellent. It's always great to see the engine room of the data centre. Hot and cold aisles are often abstract terms with little meaning until now we can see the mechanism by which it operates. Great explanation. Resilience is the key term here.
Would be nice if you guys can make a video on how to build a server rack from scratch.
Thanks for all these great videos !!!
Thanks so much for this. Great Stuff.
Very good information thanks for sharing hope to see more content.
Thanks for the Tour Mate! Nice and Impressive.
I've learned a lot about evaporative cooling in your facilities and how efficient it is - but there weren't any numbers. Could you provide a percentage for total efficiency, or a comparison with other systems?
Also, thanks for the illustrations of the airflow, those helped me to further understand the matter!
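No official numbers here either, but for anyone curious how evaporative-cooling "efficiency" is usually quantified: it's typically reported as saturation (wet-bulb) effectiveness. A minimal sketch with made-up example temperatures - nothing below comes from the video:

```python
def saturation_effectiveness(t_dry_in, t_out, t_wb_in):
    """Fraction of the maximum possible (wet-bulb) cooling actually achieved."""
    return (t_dry_in - t_out) / (t_dry_in - t_wb_in)

# Illustrative only: 28 degC dry-bulb intake, 18 degC wet-bulb,
# air leaving the evaporative stage at 20 degC.
eps = saturation_effectiveness(28.0, 20.0, 18.0)
print(f"{eps:.0%}")  # 80%
```

Well-designed direct evaporative stages commonly reach 80-90% effectiveness, but the facility's actual figure would have to come from the operator.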
Love your videos. This was particularly interesting, as I've always wondered about server farm cooling. A follow-up video would be really interesting, where you go through the next part of the cooling: how the air flows and is distributed within the cells themselves and around the servers.
2:16 Thought it was some chocolates when you're hungry in the data center.
Thanks
Thanks 👍
Great to see this stuff!
Interesting. Please post more videos on data centres.
Thanks for sharing the video!
As a network engineering student, I'm very interested to see this.
Very interesting. In addition to @Scania_V8_Rat's comment, I'd like to know how you achieve the target humidity for the air going into the racks. I can only imagine that the incoming (cooled) air is pretty humid due to the water vapour being used. How is the humidity extracted before the air reaches the electrical equipment, which often has humidity requirements of
Guess that question got left unanswered
Most equipment just requires non-condensing humidity levels, so my guess is that they're running with the dew point very near the ambient temperature. On paper, it may be allowable, since the running computers prevent temperature drops that would cause condensation, and since ASHRAE is not a UK entity.
However, in real life, it's risky. High humidity is a larger cause of data centre failures than low humidity. Humidity has to be extremely low to cause triboelectric (static) problems, but by 65% RH, corrosion starts to cause early failures. It won't affect conformally coated electronics or gold-plated connectors, so it may be possible to avoid higher failure rates if you stick to SSDs and helium-sealed drives for storage and use hardened equipment; but if your customer is running more conventional drives, plan on the MTBF being much lower than the manufacturer's stated values.
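To put a rough number on the "dew point near ambient" point above: dew point can be estimated from temperature and RH with the Magnus approximation. A sketch with commonly used constants and assumed example conditions - none of these figures come from the video:

```python
import math

def dew_point_c(temp_c, rh_percent):
    """Approximate dew point in degC via the Magnus formula."""
    a, b = 17.62, 243.12  # common Magnus constants, valid roughly -45..60 degC
    gamma = math.log(rh_percent / 100.0) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)

# A 24 degC cold aisle at 65% RH (the corrosion-onset level mentioned above)
# already has its dew point only ~7 degC below ambient.
print(round(dew_point_c(24.0, 65.0), 1))  # 17.0
```

With so little margin, any cold surface (a chilled pipe, an outside wall) is a potential condensation site, which is why running near-saturated air is risky even when it's "non-condensing" on paper.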
THX4UP1
Very cool :)
It was an interesting video for me.
Very cool.
TIL data centers use glorified swamp coolers.
Cool. What manufacturer do you have for the BMS?
Heat transferred in = energy stored
It's quite shocking just how fast a data centre can heat up if the air conditioning fails...
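A back-of-envelope illustration of why it heats up so fast: the air in a hall has very little thermal mass compared with a large IT load. All figures below are illustrative assumptions, not from the video:

```python
# Rate of air temperature rise if cooling stops: dT/dt = P / (rho * V * c_p)
rho_air = 1.2     # kg/m^3, density of air
c_p_air = 1005.0  # J/(kg*K), specific heat of air
volume = 2000.0   # m^3 of hall air (assumed)
power = 500e3     # W of IT load dumped into that air (assumed)

rate_k_per_min = power / (rho_air * volume * c_p_air) * 60
print(f"~{rate_k_per_min:.1f} K per minute")  # ~12.4 K per minute
```

In practice the racks, floor and walls absorb some of that heat, so the real rise is slower - but "minutes, not hours" is the right mental model for how long you have before thermal shutdown.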
Yep, I used to service their equipment; not many backups. When they fail, it's "fix it now" - doesn't matter the time of day or the cost. Did a $12,000 repair at 2 a.m. with four other guys.
You could be a bit greener if you used your own ear protection instead of disposable earplugs.
You could be a bit greener if you didn't waste those bits for your frivolous comment.
So many arrows, yet I have so little understanding of how this actually works
Sooo there are no cameras in that air industry room? Now I see how an experienced hacker can breach your facility - through the roof, cut the hatch, through the ducts... BAM! He is inside the air control room. From there he can sabotage the ventilation system with a smaalll explosion, or he can follow the air stream to get into the actual server room! SCHWOOSH! Now he is in, and he is ready to steal the hard disk drive with my naked photos I used to send to my girlfriend!!! And leave the way he came in.
All I wanted to say is - please don't mark that hard disk drive, I dare you, I double dare you ;(
You first have to get near the building. I've been to a few data centres, and they usually have large fences surrounding the property and a single access gate that requires two days' notice if you want to visit; you also have to phone in your car's registration, or security won't open the gate for you.
@@therickman1990 You are wrong. I am getting onto the roof of the data centre by parachuting in. It's easy and unnoticeable. Remember, I am the best hacker advisor!
It's quite amusing. For people who have no idea how all this stuff works, these videos seem amazing. To people who do know what they're talking about, these guys look like a bunch of cowboys. Sure, it's a 'plant room', and they're using fresh-air cooling - that certainly does not make this one of the 'greenest' data centres in the country! At least they have a decent air-recycling system - it actually looks like someone thought that through. I can pretty much guarantee he was only wearing the hard hat and ear protectors for the video - under normal circumstances, they wouldn't even think of bothering.
"Plant room" hahaha
Filmed on a potato