I was once a short-term generator tech. AT&T was a good customer of ours. One of the bigger sites was the "switch," where all the local calls are routed through. From there the calls went microwave tower to microwave tower all around the country; the whole east coast has microwave towers down the coast, so it doesn't need wire/fiber to go all the way, then over to fiber depending on how far across the planet, Detroit for instance. The other cool generator job was when the main 1-megawatt standby unit in a switch building needed a part. It then needs a backup for the backup generator, so we brought in a trailer generator: an 8-cylinder twin-turbo diesel Cat. OH MY GAWD, the startup on that is violent, and the airflow for the radiator would suck a golf ball through a garden hose. Very fun, but I had to go make a bit more money at a desk job.

Love your channel. I'm also here to say my old black-box-heater PoE EdgeRouter X died. Replaced it with a Lite 8. The firmware it had (or suggested) was broken. They eventually fixed it; my google-fu wasn't good enough to figure out how to put an older firmware on it, or maybe it was just my basic PC-hosted controller (up to date). It wouldn't take an adoption and hold; it would fault out over and over, reset after reset. A firmware update came along a few weeks later and voila. I do like the switch setup, and that it's managed in the controller instead of standalone like the old EdgeRouter. Time for some cameras I think.
They are called Central Offices (COs). Nothing is on microwave anymore, at least nothing significant. Everything is on fiber: both coasts, everything in between, as well as oceanic. Power consumption is even less than back in the 90s. You go into a CO today and I guarantee there are empty floors that were jam-packed with telephony gear. Hell, one floor would be dedicated just to terminating all the outside plant cable pairs.
Most data centers I've been in are more chill about cell phones. Even the super high security ones (security armed with automatic weapons on full display!)
Depends on a lot of things. I've been to some with fully enclosed racks that didn't care *too* much but then again if I rocked in with a video camera I'm sure they'd've asked me to hand it over...
I almost purchased this data center when I was CEO of Ann Arbor-based DigitalRealm (in fact, I bought a house across from it because I was confident and expecting to close on it, but Nexcess won the bidding war at the time). So I instead built out a DC in downtown Ann Arbor's Key Bank Building (Cisco had vacated the site it used to build the BFR, which had required serious augments to the building).
It was built by Carter, which designed facilities for Level3, UUNET/WorldCom, and Verizon. It's in close proximity to several fiber conduits, and it had ample power in the area (which reduced the cost to augment). I believe it was originally designed to supplement backbone provider AEGIS's other location on Outer Drive. I did a lot of BGP/Internet peering consulting for ISPs and CLECs in my youth in the mid 90s, and it's rarely about cheap land and more about the area's infrastructure. Otherwise that facility would be larger.
I'm definitely going to borrow the doors in the floor idea when I renovate my basement. I don't have enough room for a raised floor but I can put the doors between the joists in the ceiling.
So sad to see the generator and those UPSes no longer in use... with all the spare parts. I'd want to keep them from being taken apart... but I'm too far away, unfortunately.
Hmm... NetApp has a data center in the Research Triangle that MIGHT be filmable. They promote it to sell copies of it... reason? It is FAR more efficient than most data centers... the front of the racks is a reasonable human temp... the back? Very warm... They have huge diesel generators to keep things alive during the usual Duke Power problems... One of the nice side effects of having the racks isolated is that a large percentage of the noise goes away when the backs of the racks are sort of isolated from the fronts... (Lord knows a NetApp Filer is NOT a quiet animal...)
Whooo, that's crazy. I drive by this place on a daily and the only reason I knew this was a datacenter was because I applied to work there. Fun to see the inside of it finally!
I swear, after building and commissioning a bunch of TV stations (which are not too unlike data centers), concrete floor tiles are the shittiest things to cut holes in for racks... and then when you cable the rack and pull the cables, the steel tops and bottoms are like razor blades if you don't put capping on them.
Also the misters in the corners of the ceiling that spray water into the air if humidity drops. As for the cut fiber: any new client will NEVER reuse any fiber or cables, so the vacating company will just cut it and pull the servers out. Faster and cheaper. Reusing any cables in a 24x7 datacenter operation is a big no-no. Murphy will show up!!!
>any new client will NEVER re use any fiber
Never underestimate corporate cheapness. I could easily see a small operation letting it ride. And FWIW it's singlemode fiber, pretty easy to test and re-run, so I wouldn't blame 'em too much.
That is actually not true. You can have phones but cannot take pictures. You can, however, take pictures of your own equipment within your own cage if you are in a colo. This goes for many data centers I've been in around the DFW, Austin, and San Antonio areas.
I worked in the Data Center for the Atlanta Federal Reserve back around the year 2000. I was the LMS administrator for the first online training server they installed. It was about the same size as the one you were visiting. Nowadays that Data Center has been shrunk down to a small room with one cabinet of servers. :)
Why did they cut the fiber??? Like if you're not gonna take it, why make it harder for the next person coming??? I don't see how that assisted the previous people in any way.
Some building leases require that to be done, and it's easier to cut it as well. And of course it's a security risk, as you don't know what the last person may have done to it.
Another factor is likely speed of removal: unplugging might take a few seconds per line, times 50 lines, versus just 5 seconds to slice them all.
I remember once I had set up my desk on a cardboard box in a place like that while I was installing some servers, and they probably had well over a million dollars' worth of equipment; I remember thinking of the irony of the value of my desk...
I had always heard you couldn't cut fiber like that, but I didn't know if that went by the wayside when they switched to plastic core over the older glass style. I never get to terminate those, so I've always been curious. Cool video btw; I've only heard a quiet datacenter room when we're starting up a NEW warehouse.
@@mrmotofy the issue is that the conduit was too small and the LC end separated, leaving just the rip cord attached. It's OM5 outdoor jacketing and LC connector replacements won't slide over. So if it were damaged it would be right at the end and hopefully didn't travel back too far (and I have service loop slack to play with). I ran a light tester on each pair and it gets there, but I don't know how to test intensity. Guess I need to find a local fiber tech to come show me how it's done. I have a cleaver and all the other stuff I should need except proper-fitting end connectors. Thanks for replying, because I assumed the whole cable was done for.
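Testing "intensity" here usually means measuring optical loss with a light source and power meter, then comparing it against a loss budget. A minimal sketch of that arithmetic; every number below is an assumed illustrative value, not something from this thread:

```python
# Rough optical loss-budget check (illustrative numbers only).
# Measured loss = launch power (dBm) - received power (dBm);
# compare it against the sum of expected component losses.

def expected_loss_db(km: float, connectors: int, splices: int,
                     fiber_db_per_km: float = 3.0,   # multimode @ 850 nm, worst case
                     connector_db: float = 0.75,      # TIA max per mated pair
                     splice_db: float = 0.3) -> float:
    return km * fiber_db_per_km + connectors * connector_db + splices * splice_db

launch_dbm, received_dbm = -20.0, -22.1          # example meter readings
measured = launch_dbm - received_dbm             # 2.10 dB measured loss
budget = expected_loss_db(km=0.05, connectors=2, splices=1)
print(f"measured {measured:.2f} dB vs budget {budget:.2f} dB ->",
      "pass" if measured <= budget else "investigate")
```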
7:50 The good old ONS 15454! These suckers are built solid as rocks! Depending on the age and use case, that could be a ROADM with transponders or a SONET mux! Cisco still uses these chassis to this day as the NCS 2015 and NCS 2006!
One thing I have wondered: a flywheel UPS makes sense because you just need enough time for the generator to fire up, but is the real future in better battery tech like LFP? The idea being that with all this "green" generation being variable throughout the day, the UPS doubles as a power buffer: it soaks up extra power when renewables are plentiful, say in the middle of a sunny day, and discharges when they are not, say in the evening when the sun goes down and everyone comes home and turns things on.

Another thing I have wondered: would a dual-fuel natural gas and propane generator make more sense than diesel? By default you would run on natural gas, which should be effectively limitless and cleaner to burn than diesel, and fall back to stored propane if the gas supply fails, say an earthquake ruptures the lines. It would be less to manage than shipping in diesel on a regular basis and having to do something with all that fuel before it goes bad. Propane never goes bad, so if you go years or decades without disruption to your natural gas supply, it can just sit in the tank. Natural gas is piped all over the place and usually still works in a power outage. You could also have an LNG tanker-truck hookup for emergency use, in case you can't get a propane tanker in a disaster, or if you find LNG cheaper to source and know you will likely use it up before it boils off. Maybe even get into some co-generation with the power utility to give those generators a good workout, especially since gas-fired plants are generally used as peakers these days. With good battery tech and good generator tech, a data center could run off batteries during peak grid demand, when power is most expensive, and dump power onto the grid from its generators at the same time. Some of these newer data centers are so big they could use grid-scale generation equipment anyway, just for the site.

I find this backup power stuff usually works best when it is set up for daily use. When backup power is really only a backup that rarely gets touched, that's when you find it doesn't work when you need it. Even regular test firing may not tell you everything you need to know, and it wastes a lot of fuel just to learn the thing still turns on. There have been data centers built where backup power does not cover cooling, so the whole place shuts down anyway, because you cannot run a data center without cooling. Solutions like the ones above make money for the operator, so it's easier to justify to the bean counters that they be sized to cover everything.

Data centers should also adopt a 400V DC distribution system. An old data center like this wastes tons of power on all these conversions. With 400V DC, you do one conversion to the DC standard, the UPS batteries float at the bus voltage, and the power goes straight to the servers. There is no real conversion in the UPS beyond a two-way charger that brings the batteries to the right voltage before they connect to the main distribution bus.

It is straight from the main converter to the 400V DC power bus, the UPS batteries sit inline with the bus, and the bus goes straight to the servers. The server power supplies need no AC/DC conversion hardware, which is generally about half of the supply, skipping straight to the DC-DC step-down. Much more efficient and cheaper to mass-implement. The electric motors running the cooling system are also much more efficient as variable-speed inverter-driven motors, which could run from the 400V DC bus. For grid balancing, the main converters feeding the 400V buses could swing from delivering twice the load (charging all the UPS batteries at once) to completely shut off (the data center running entirely on batteries), varying continuously in between. That also gives you good battery-health stats on a daily basis, and with multiple UPS battery banks in parallel, you can take a problem pack offline and swap it out. LFP batteries should take that daily duty cycle for decades. Losses in such a setup are minimal: LFP is pretty efficient, and you cut out the conversion steps that reduce efficiency and raise the installation cost of existing grid-scale battery deployments. The conversion hardware is already there to power your servers; you just piggyback off it to charge the batteries, the batteries discharge directly into the servers with no conversion hardware in between, and the servers drop some of their internal conversion hardware, making it all even more efficient.
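A back-of-envelope check on the conversion-chain argument above; the stage efficiencies below are assumptions for illustration, not measured figures:

```python
# Compare end-to-end efficiency of two conversion chains
# (illustrative efficiency assumptions, not measured data).
from math import prod

# Legacy double-conversion AC: rectifier -> inverter -> server PSU AC/DC -> DC/DC
ac_chain = [0.96, 0.95, 0.94, 0.95]

# 400V DC bus: one front-end rectifier -> server DC/DC (batteries float on the bus)
dc_chain = [0.96, 0.95]

ac_eff, dc_eff = prod(ac_chain), prod(dc_chain)
print(f"AC double-conversion chain: {ac_eff:.1%}")   # ~81.4%
print(f"400V DC bus chain:          {dc_eff:.1%}")   # ~91.2%
print(f"loss avoided:               {dc_eff - ac_eff:.1%} of IT load")
```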
It's worth pointing out that this particular datacenter had not been actively used, aside from 1 or 2 colo clients, for the better part of the last 7-8 years. Maybe even longer. The primary datacenter is much larger and has all the fancy bells and whistles you speak of :)
That's not an ASR, it's an optical chassis (looks like maybe an ONS 15454). It would have connected to AT&T's SONET network ring and been used to break out whatever circuits were coming into the datacenter (OC-3, OC-12, etc.).
I worked in that DC for years, many years ago. The fire suppression system is a pre-action dry-pipe type where the pipes in the ceiling to the sprinkler heads are full of air. Digital smoke / heat detectors would need to trigger to charge the system with water, then the sprinkler head element would also need to burst to allow water to exit the heads. This is why the system in the front closet was so complex. It prevents someone from flooding the DC by accidentally breaking a head with a ladder.
The building also has some interesting history. In the late 1990s, the 'Godfather of Spam', Alan Ralsky hosted his servers in that DC. There is a big long story of it I'm sure somewhere still around in the internet archives. In the rafters of the back MDF room with the cut fiber, there should still be some binders full of paperwork up in the ceiling for the original fiber orders with his name on them.
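For readers unfamiliar with pre-action systems, the two-stage sequence described above boils down to an AND condition: detection charges the dry pipes, and only a burst head element actually releases water. A minimal sketch of that logic (my own illustrative model, not the actual panel firmware):

```python
# Minimal model of two-stage pre-action dry-pipe behavior.

def water_discharges(detector_tripped: bool, head_burst: bool) -> bool:
    pipes_charged = detector_tripped      # stage 1: detection charges the dry pipes
    return pipes_charged and head_burst   # stage 2: heat bursts the sprinkler head

# A ladder knocking off a head (no smoke/heat detection) stays dry:
assert not water_discharges(detector_tripped=False, head_burst=True)
# A real fire trips detection AND bursts the head, so water flows:
assert water_discharges(detector_tripped=True, head_burst=True)
```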
That's interesting, I can probably get back into that building again to look for them.
Hi Brad! hope you're doin' well
It looks like it was an old mainframe data center back in the day too!
Same here! Had an office there and you know what, I need to go back.
Let me see how much they've got it up for, and perhaps I will lease it.
@@kieranwilliams3052 How many megawatts is it?
The only thing creepier than a quiet datacenter is a datacenter that all of a sudden gets quiet when you're working in the power room.
For sure
Got the t-shirt, the sounds of all the UPS's beeping still haunt me 🤣
Was in a server room/datacenter back in 2007, installing several beefy new VMware hosts. The boss was in there watching, with his hand over the emergency cutoff for the UPS, playing with the little cover that spins to give access to the cutoff button. Well, the server room went completely quiet in a split second; you should have seen the look on his face... What happened: the UPS had a single point of failure, a control board that just happened to go at that moment, as we were powering up something like the 4th or 5th server. The massive UPS had been purchased a few years earlier, and the manufacturer had issued a recall to replace the board, but there was a disconnect between the manufacturer, the vendor that performed the install, and the customer, who never received the notice that the board needed replacing. Single points of failure were the typical modus operandi for the company (the company with the servers): a single HVAC system, a single massive UPS, single entry/exit, no generator, internet circuits that were nowhere near redundant, etc., etc.
As an electrician who often works in datacenters, that's my biggest fear.
Back when I worked with cooling datacenters, my policy was to never screw with the firmware in more than one unit at a time, because of previous mishaps. I had a workmate who was a bit on the wild side: he flashed them as quick as he could, done via Ethernet, and I ran around frantically configuring all the parameters. Once done, every single unit suddenly shut down without explanation. Fortunately they all rebooted and were up again within minutes, but that time in between was scary.
Hey Tom, I'm a DC manager + administrator. We use flywheel UPSes and love them; I hated paying for battery replacements every 3-5 years. Such a waste of money. I also vastly prefer aisle containment over just balls-to-the-wall CRAC cooling like this; there's a big waste in efficiency and PUE there too. Thankfully our DC is in a dry state, so we utilize evaporative cooling in tandem with CRACs and CRAHs. Lastly, I think comms under the raised floor is the worst: a huge pain to install and maintain, and it usually blocks air more than if you put power-only down there. Comms from the top down, power from the bottom up is our preferred method. Also, who the heck cuts fiber?! Sounds like a great way to accidentally blind someone if they forgot to disconnect the transmitting side.
The last few new DCs I've been into, the raised floor, if they even went with one, was just a plenum. They're in Zone 4 earthquake territory though, so they're kind of at a disadvantage with the extra bracing underneath for the bays. Plus, if you have to move a beast like a CRS-1 or another pre-configured bay, you have to temporarily crib the travel path under the floor. These were primarily -48V DC powered telecom facilities, so they used tank cells that, with at least annual attention, last decades. They use a lot of space though.
With a generator and transfer switch, smaller, less expensive batteries can be used... I thought the maintenance on the flywheels' "vacuum bearings" made flywheels less desirable, but I admit battery maintenance was a real pain.
@@johngermain5146 yeah, if your facility is primarily AC powered, the flywheel setup makes a lot of sense. Most everything is -48V DC with telco. You have E911 requirements as well, so it's kind of belt-and-suspenders: they'll have 8 hours of battery plus enough fuel that the generators can often run for days, depending on projected worst-case resupply. Went to one that had batteries in every conceivable space, plus two 10,000-gallon diesel tanks and N+1 generators on site. Multi-megawatt-sized facility though. Well, MVA probably, officially, but I'm a DC guy and watts make more sense to me.
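On the MVA vs. watts point: real power is apparent power times power factor. A one-liner with assumed numbers, purely for illustration:

```python
# Apparent power (MVA) vs. real power (MW): real = apparent * power factor.
apparent_mva = 5.0      # hypothetical utility service rating
power_factor = 0.9      # assumed; modern rectifier plants often run ~0.9 or better
real_mw = apparent_mva * power_factor
print(f"{apparent_mva} MVA service at pf {power_factor} -> {real_mw:.1f} MW usable")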
Water-cooled rear doors are way better than aisle containment. More expensive upfront for sure, but they last a long time; ours are already well over a decade old and look like new. Basically a large car radiator, and those last the lifetime of the car these days, out in the open, not in a nice clean, dry data centre.
What I noticed was the lack of fire suppression.
I've seen those flywheel systems and they are really cool. Spinning super fast in a vacuum, and sure enough they provided enough power until the generators came online.
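Rough math on why a flywheel can bridge a generator start; rotor inertia, speed band, and load below are all assumed values, not specs from the video:

```python
# Flywheel ride-through estimate: E = 1/2 * I * w^2; usable energy is the
# difference between full-speed and minimum-speed energy. All values assumed.
import math

inertia = 0.8                      # kg*m^2, assumed rotor moment of inertia
rpm_full, rpm_min = 36_000, 18_000 # assumed operating speed band
w_full = rpm_full * 2 * math.pi / 60
w_min  = rpm_min  * 2 * math.pi / 60

usable_j = 0.5 * inertia * (w_full**2 - w_min**2)
load_w = 250_000                   # assumed 250 kW critical load
print(f"ride-through: {usable_j / load_w:.1f} s")  # ~17 s, covers a ~10 s gen start
```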
I wish I had a sub floor for wires and possible flood!
Yes, my studio is in the basement and I have two water alarms and soon I will be installing backup power and a SECOND sump pump, just in case..
@@LAWRENCESYSTEMS check out the triple safe pump. 2 AC pumps (primary/backup) and a 12VDC pump for those bad days.
@@LAWRENCESYSTEMS I have a hot spare for my sump now, after my last one failed and I had a nice pool of water under the rack.
Can anyone say GENERATOR INLET and inverter for your car? 🙏 😉
I use pallets : p
As a person still working in a datacenter, and I have been for 35 years, I find this sad. There were once people working there whose jobs are most likely gone, or sent overseas.
Nah, just moved across town. A newer/bigger datacenter was built to replace this one less than 10 miles away. This one was kept alive for a number of years with just a few colo clients in there and a skeleton crew to manage it.
This is hilarious... Tom has NEVER looked so gleefully happy in any of his videos. Totally nerding out about a decommissioned data center, like a kid in his favorite candy store. That's probably the best testament to what a great IT guy you must be, LOL.
Cheers
I'm always excited when I'm in a data center or server room!
@@LAWRENCESYSTEMS ha ha I can tell … Nice!
*Starts digging a hole in the basement to create a false floor*
Wife: what now?
Me: This is how the big boys do it!
2:45 the floor isn’t primarily designed so the cables can traverse underneath - it’s for the airflow…CRAC units distribute the airflow from underneath. This is why you see the grated/perforated floor tiles and large gaps under the cabinets.
Also, the air conditioning pushes air under the floor, and the holes in front of the cabinets let cold air come up in front of the servers. Hot air fills the room and gets pulled out at the ceiling.
Also, raised floor tiles have colors. When a floor has grey lanes of tiles, that means a heavier-duty weight limit, and you can use a pallet/pallet jack on it. If floor tiles are white, you can NOT use a pallet/pallet jack, as the tiles aren't spec'd to hold that weight; you have to unpack and carry items onto the raised floor.
Not going to lie, seeing him tapping the EPO in a shut-down data center made my heart jump. I am an electrician / rack / fiber duct installer for data centers.
I got the chance to hit the EPO once when we needed to test it in our data center.
I just decommissioned one of our oldest DCs a few months ago. No raised floors; it was all RTU A/C units and ladder racks, dual generators, and an APC Symmetra 80kVA. It wasn't fully redundant, which is why, with the lease end coming up, it was a no-brainer to start over with new equipment in a newer location.
I used to work in this datacenter when it was owned by a web hosting company before that company was acquired by a bigger web hosting company.
Yes, they did get bought out
The flywheel at the other data center that you referred to is the UPS. The flywheel is not a replacement for a UPS; it's just a different type of energy storage vs. lead-acid batteries. The Liebert unit you showed is an always-inverting type, where it's converting DC to AC for the loads all the time.
When I was in field service back in the mid and late 90s, I worked in many data centers. I remember cutting out bus & tag cables underneath the floor, as well as running fiber and plugging cables into devices and mainframes.
Great tour, and definitely eerily quiet! That pretty much looks like the first DC I worked in (built in the '90s), with all the cabs out in the open; everything's gone all environmentally friendly and energy-saving since then with enclosed hot/cold aisles. It's amazing the amount of kit that telcos just abandon when sites get decommissioned. I've got a ton of OLT kit sitting in a shed at home from old circuits; when asked if they want them back, the telcos have just said to scrap them. No idea what use they'd be, but hey! 😅
That’s wicked cool! I love seeing decommissioned buildings like that, I got the privilege of seeing several buildings on my university’s campus shortly before they were torn down.
I work in a datacenter, and every time I see a video with an empty one it feels so creepy, because I'm so used to the noise and it's almost haunting to see that kind of environment and hear nothing.
I remember in one data center the UPS had failed and we were running solely on commercial power. (I don't know the specifics of why the backup generators weren't connected.) The power went out and everything went dark and silent save for the emergency lights. It's amazing how accustomed to noise you get; when it suddenly stops, you become VERY aware of the absence. It's so creepy.
@@kwith I don't want to even imagine that. We have policies, plans, and procedures in place, but that kind of thing happening is straight up a nightmare to me. Ideal setting for a horror game to bug me more than normal XD
@@joshuawaterhousify Well fortunately the data center in question was for Dev and QA environments, nothing in production was affected. So a few projects were impacted and some dev workers were annoyed and inconvenienced, but nothing major or critical was affected.
But the creep factor was in full effect lol.
@kwith that's something at least; if it's going to happen, that's where you want it to be XD but yeah, I can imagine it would still be creepy as all hell, and would not want to be there :P
@joshuawaterhousify yea, unfortunately I didn't have much of a choice. Someone had to do the troubleshooting and verify that everything came back up. I pulled the short straw on that one haha
Data Centers aren't supposed to have water sprinklers for fire suppression. Water ruins the data. Halon would be a better choice.
A question about the genny: I worked for a company, and their corporate data center and IT department were in part of our building. They had two independent generators, one diesel and the second natural gas; the primary was the gas unit. They had 7 days' worth of diesel on hand. I left in '96; do people spec single or double generators now? Is it just down to how paranoid management is?
I have a mid 90s comms room/ server room at work, it’s only about 30ftx30ft. When our systems were upgraded to fibre and internet around 2010 a new room was built and services switched over. The old room was abandoned, everything powered down and left as it was, only the lights work now 😂. What was once state of the art equipment is now just electronic waste. As we don’t need the space just now, it is cheaper to leave it alone. Cool video. All the best.
I wouldn't say the floor is lifted "for the cables to go under"; that is just a nice feature that comes with raising it. Floors are raised for flooding and airflow reasons.
I understand the security concerns for not filming and/or noting the address of an active data centre, but I am curious what the legal reason would be.
Legal department looks for the least amount of issues and telling us NO is the easiest way. There are sometimes customer names on cabinets and even where there is not the customers may get nervous knowing people are there filming.
@@LAWRENCESYSTEMS >and even where there is not the customers may get nervous knowing people are there filming.
I don't understand this. Why would the customer be nervous? Is this a case of "security through obscurity"?
There's often lots of information that can be gleaned from filming which alone might not be important, but in combination with other public or private info it might be hazardous to corporate espionage targets or similar.
Many times contracts have clauses in them about revealing who is where and what they have, that kind of thing.
A lot of facilities are colocation (so it's a lot of companies in one data center) and I've been in some data centers with high profile clients - like hospitals, payment processors, even a lot of local telcos will use colo over running their own data centers - and that just paints a big target on them. And in a lot of cases, these larger clients will make the data center sign their own NDAs and legal contracts. So to make it easy, Legal just puts a blanket "NO" across the board. Especially because I've seen racks and cages where those high profile clients will leave out laptops, printed diagrams, etc. since they're in a secure facility, so if you walk by with a camera and capture confidential information, then the data center could be in hot water.
I worked in data centers for 10 years and so much of this is familiar. The Liebert CPCs, the single-mode fiber hanging everywhere, the ladder racks racing across the ceiling, yellow fiber trough; good memories. I still work in one periodically, but not to the extent I used to. That raised floor: I worked in one that had a 12-inch raised floor, one that was 18 inches, and one that was 36 inches. (Yes, three feet!)
From the looks of it, they didn't have cold air containment in the cold aisles, did they? Were those Wright Line racks? I remember the older DUC70 cabinets, that ugly beige color. THOSE were a pain in the ass to install in.
Did anyone ever fall in the 36in floor?
@@deepspacecow2644 Not that I know of but we are pretty strict on safety. So when a tile is taken up and we were underneath, we had to put up cones and signs. Our safety guy was VERY militant about that and rightfully so.
I loved testing the generators and transfer switches both from the UPS and by killing power. It was a frequent testing requirement from the insurance co. I also liked it when the halon fire system went off.
>I also liked it when the halon fire system went off
That does NOT sound like fun.
@@Jamesaepp expensive too, causes shunt trip of power also
That UPS battery stack'll kill you right dead if you touch the wrong thing, powered off or not. Please be careful and warn people not to go poking around inside room UPSes.
Back in 2004 I was working at Phoenix Sky Harbor Airport in their little datacenter on-site, probably 1/3 the size of this one. It ran all the flight displays and the parking garage space counters. I was plugging in a new rack, and apparently they had wired 208V to a 120V plug under the tile. Big flash and pop out of the rack, followed by an uneasy silence as the entire room got quiet. Within minutes managers/supervisors were busting down the doors. Fun times.
Frankly it's a little surprising that that would even do much - virtually everything I've powered in a DC is 240v tolerant (with some requiring 200+V), but I can see something in 2004 being just old enough to be problematic.
Or if it was a plug-in UPS, which I could see being the case in a little server room like you're talking about.
Well, if they were stupid enough to set it up such that a single shorted (overloaded) outlet could take down the entire room, they totally deserve to be shamed.
I haven't visited many datacentres in my time, but working for an ISP I do visit the same few fairly often. What surprises me in this video is the lack of physical security: the building seems so exposed from the outside. The datacentres I typically visit are surrounded by double barbed-wire fences, have security patrols, and have all internal rooms protected by very restrictive access control, not to mention all power systems, including generators, built into the buildings. They're not so strict on phone usage though: while taking pictures is strictly forbidden, no one takes your phone on the way in, for example; it's more of a trust basis.
What type of facility was yours: owned by a single company, or was it colo space?
All that modern "security" is just theater. It's there to impress customers. A place like this one is more secure by the fact few know it's there. And even fewer know what ("who's stuff") is in there.
That's one Tiny datacenter.
yup
I worked building maintenance at the one in Atlanta (40 Perimeter Center, 1978) and ours was 84k sq ft over 2 stories, the bottom being the actual "data" portion and the upper being where all the programmers were. We did all the electrical, plumbing, HVAC, and workstations.
@@michaelmartin8036 Tiny as well. I used to walk around Iron Mountain Phoenix.
Relic of a "long past" age. These days, one could have more power in a single rack than that entire building. (I've used "small" [4U] blade centers that used 4kW... the whole rack only had 5.7k fed to it.)
I got a glimpse of where the ILLIAC IV computer used to be on the University of Illinois campus in Urbana, years ago. I think there was a Cray there at the time. For all I know there aren’t any important computers in that building any more, over thirty years later.
Raised floor, big cooling system, not all that different, accounting for the changes in technology.
Oh, and we always snipped fiber when doing rebuilds/moves. So much quicker, and fiber is actually very cheap. I miss configuring Cisco and Fujitsu ONS platforms; very unique vs. "normal" switching platforms!
I've actually driven past there a few times, and I always assumed it was a closed bar or something like that, lol. Great video!
Looks a lot like the development labs that I've either been in or operated. Loved the raised floor, makes life really easy. I've also been in telephone central offices where everything's overhead. Great stuff!
1:02 even if it was powered up, you could still touch all that
We had a UPS explode in our data center in about 2001. Big fire, and lots of water damage, because the fire department couldn't get in; the guy who had the keys got into a car accident rushing to the site. Now the center has no UPS in the server rooms; a fire wall sits between them and the UPS.
@@wd5vd page 2: They used concrete saws and cut a hole in the wall after what seemed a long wait.
WHY would you chop the fibre and leave the racks?! (I did notice that there were no PDUs. Gotta love a 60A 208V 3-phase PDU per rack, plus the redundant one.)
Security procedures vary all over the place. Some places just had badges (and maybe a code). Some have you put your hand on a scanner AND badge in. One I was in had people locks: door 1 had to be closed before door 2 would even accept the hand + badge.
I think they call those "mantraps". (edit: it's one word, not two)
Very cool! That building is a very short drive from my house! Driven by it countless times without even thinking about it. It's right by that really weird intersection with Carlysle and Pelham!
Maybe they can bring those UPS batteries to power my house and help deal with my constant DTE power outages...
Would be interesting to see inside the AT&T building nearby on Michigan Ave with the door labeled "collocation entrance."
Currently work in a former data center like that one, with the same raised floors and Liebert HVAC. They kept the layout even without actual server racks. The Lieberts break down a lot. The room has two. With just one, we exceed 110 degrees just from ambient. Really need two to be survivable and again, we don't even have servers. The best part about the Lieberts is how they smell like a cross between old socks and an old fridge.
Tom was def in his happy place roaming around there.
I love server rooms and data centers
7:54 Cisco 15454. Previously Cerent, until Cisco did what Cisco does and bought them. It dates the operation a fair bit. I could probably still provision one from the ground up despite not having logged into one for 15 years.
The datacenter in Southfield is much more interesting, though they probably won't let you film there. I hadn't seen this one since it was decommissioned; pretty cool.
These UPS caps are expensive, up to 100 euros new.
They cook off from heat easily. Had one pair killed by lightning; the rest of the UPS survived.
Such a cute little data center! Most people don't know these exist, let alone the scale of modern, huge, ones.
Ah, a good example of a data center where you least expect it. These are actually very common throughout the world.
The main reason for keeping a data center's location a secret is purely hardware security. Having direct access to a server is far more dangerous than almost any software attack.
Data centers that just offer VMs tend to be more relaxed about this type of security, since the hardware itself is more or less just a standardized box without any labels hinting at who rents it. I.e., it is far, far harder to do a targeted attack on a specific organization. But even a lot of these data centers hide their locations.
In Stockholm, where I live, I know of 2 data centers that are in actual cold-war-era bunkers. Their locations are fairly well known, but that doesn't make them easier to access, since it is a literal bunker. Though even highly secure facilities have been targeted through more indirect means before; just dropping a few "thumb drives" outside the premises to try to pull off a HID attack isn't a new thing. This becomes a lot harder if you don't even know where the secure facility is located.
I thought "finally" at the end when I saw you with the mini hammer and the emergency glass. That would have been the most tempting. You controlled yourself well.
"normally can't open these" man that's so wild compared to places where I've been in computer lab support server rooms and its like "yeah just go find the breaker turn it on" with various 120V/208V/480V different 3 phase delta/wye, and 60/400Hz power systems. The plumbing stuff and UPSs are neat though, that part I can't see places I've been.
How long ago did Nexcess decommission it? It looks like they still have it listed on their datacenters page. I wonder how much noise the neighbors, such as the dance studio next door, heard.
Not sure but the generator was last serviced in 2022 per the tags on it.
I think we only finally decommissioned it a year or so ago. Re: noise: the generator only ran during anticipated/actual power outages. Other than that, the walls and thick doors keep most of the noise inside.
I work in HVAC controls. Those holes and the AC units are normally called CRAC units (computer room air conditioning).
Even though it's quiet, I can still hear it!
Cabling is often cut like this to make sure the gear isn't reusable: if the tenant were to sell it to the building owner/landlord and there were ANY issues down the line, there could be a giant legal mess that the previous tenant might get pulled into. That, plus the obvious "not my job"/expediency thing, and sometimes it's to spite the building owner/landlord. There are also sometimes shady contractor things in the agreements: "tenant will use our preferred vendor for network cabling install work".
Looks like it was a small local business DC, especially with only 1 generator. Any DC I've been to (and I have equipment at 4 locations) has at least 5 gennies, usually N+1. Two major DCs I'm at also have contracts with fuel (diesel) suppliers so that when a major storm is inbound, they not only have their multiple-thousand-gallon underground tanks but also a couple of tankers parked in the parking lot in case of an extended outage. The innards and operations of DCs are fascinating. Only a few minutes into your video, but so far very interesting.
That's a small data center; when I worked for Nestle, our server room was bigger: 5 IBM mainframes and 200 Compaq servers.
It wasn't a data center, but I worked in a place that had the raised machine room flooring in our main workroom. Funny, just seeing that ramp into the room and those floor tiles triggered old memories of clocking in.
Who tracks all the racks and what is connected? Is there some kind of master tracker?
I was once a short-term generator tech. AT&T was a good customer of ours. One of the bigger sites was the "switches" that all the local calls are routed through. Then the calls go microwave tower to microwave tower all around the country. For instance, the whole east coast has microwave towers all down the coast, so it doesn't need any wire/fiber to go all the way. Then to fiber, maybe, depending on how far across the planet. Detroit
The other cool generator job was when the main standby 1-megawatt one in a switch building needed a part. It then needed a backup of the backup generator, so they brought in a trailer generator: an 8-cylinder twin-turbo Cat diesel. OH MY GAWD, the startup on that is violent, and the airflow for the radiator will suck a golf ball through a garden hose. Very fun, but I had to go make a bit more money at a desk job.
Love your channel
I'm also here to say my old black box heater of a PoE EdgeRouter X died. Replaced it with a Lite 8. The firmware it had, or suggested, was broken. They eventually fixed it, as my Google-fu wasn't good enough to figure out how to put an older firmware on it. Or maybe it was just my unique basic PC-hosted controller (up to date). It wouldn't take an adoption and hold; it would fault out, over and over, reset after reset. A firmware update came along a few weeks later. Voila. Great.
I do like the switch setup, and that it's set up in the controller instead of standalone like the old EdgeRouter.
Time for some cameras, I think.
They are called Central Offices (COs). Nothing is on microwave anymore, at least nothing significant. Everything is on fiber: both coasts, everything in between, as well as oceanic. Power consumption is even less than back in the 90's. You go into a CO today and I guarantee there are empty floors that were once jam-packed with telephony gear. Hell, one floor would be dedicated just to terminating all the outside plant cable pairs.
Most data centers I've been in are more chill about cell phones. Even the super high security ones (security armed with automatic weapons on full display!)
Depends on a lot of things. I've been to some with fully enclosed racks that didn't care *too* much but then again if I rocked in with a video camera I'm sure they'd've asked me to hand it over...
Be careful with those raised tiles. I had one pinch my hand before putting it back down.
How do you decide which cables are under floor and which are over the top?
Why was there a data center in Dearborn of all places?
Yeah, weird placement, being adjacent to a few other businesses like that and backing right up to residential. It's hiding in plain sight.
Low land prices, I would imagine
en.wikipedia.org/wiki/Apex_Global_Internet_Services @@LAWRENCESYSTEMS The place has a long and interesting history.
I almost purchased this data center when I was CEO of Ann Arbor-based DigitalRealm (in fact, I bought a house across from it because I was confident and expecting to close on it, but Nexcess won the bidding war at the time). So I instead built out a DC in downtown Ann Arbor's Key Bank Building (where Cisco had vacated the site they used to build the BFR, which required them to make serious augments to the building).
It was built by Carter, which designed facilities for Level3, UUNET/Worldcom, and Verizon. It has close proximity to several fiber conduits, and the area had ample power (which reduced the cost to augment). I believe it was originally designed to supplement backbone provider AEGIS's other location on Outer Drive. I did a lot of BGP/Internet peering consulting for ISPs and CLECs in my youth in the mid-90s, and it's rarely about cheap land and more about the area's infrastructure. Otherwise that facility would be larger.
Relatively small data center compared to what I used to work in. The components are always similar and usually just differ in scope and footprint.
I'm definitely going to borrow the doors in the floor idea when I renovate my basement. I don't have enough room for a raised floor but I can put the doors between the joists in the ceiling.
I'll take "What is a fire stop" for 1 million Alex.
As long as the doors are made of something fire resistant they are as much a fire stop as can lights.
the amount of power in that little room holy smokes!
love your joyful excitement..
Fascinating! Great video. Thanks.
Glad you enjoyed it!
So sad to see the generator and those UPS which are no longer in use... with all the spare parts. I want to keep them from being taken apart... but I'm too far away unfortunately.
Hmm... NetApp has a data center in the Research Triangle that MIGHT be filmable. They promote it to sell copies of it... reason? It is FAR more efficient than most data centers... the front of racks is a reasonable human temp.. the back? Very warm... They have huge diesel generators to keep things alive during the normal Duke Power problems... One of the nice side effects of having the racks isolated is that a large percentage of noise goes away when you have the backs of the racks sort of isolated from the fronts... (Lord knows a NetApp Filer is NOT a quiet animal...)
Whooo, thats crazy. I drive by this place on a daily and the only reason I knew this was a datacenter was because I applied to work there. Fun to see the inside of it finally!
I swear, after building and commissioning a bunch of TV stations (which are not too unlike data centers), concrete floor tiles are the shittiest things to cut holes in for racks... and then when you cable the rack and pull the cables, the steel tops and bottoms are like razor blades if you don't put capping on them.
Also the misters in the corners of the ceiling that spray water into the air if humidity drops.
Also, re the cut fiber: any new client will NEVER reuse any fiber or cables, so the vacating company will just cut it and pull servers out. Faster and cheaper. Reusing any cables in a 24x7 datacenter operation is a big no-no. Murphy will show up!!!
> any new client will NEVER reuse any fiber
Never underestimate corporate cheapness. I could easily see a small operation letting it ride. And FWIW it's singlemode fiber, pretty easy to test and run, so I wouldn't blame 'em too much.
That is actually not true. You can have phones, but cannot take pictures. Now, you can take pictures of your own equipment within your own cage if you are in a colo. This holds for many data centers I've been in in the DFW, Austin, and San Antonio areas.
After you get accustomed to a 500+ (and growing) acre DC, this looks smaller than a comms closet... how quickly we forget...
I have had to snip a lot of fibre panels out before; it just feels wrong cutting a fibre, even more so a bundle of like 20!
I worked in the Data Center for the Atlanta Federal Reserve back around the year 2000. I was the LMS administrator for the first online training server they installed. It was about the same size as the one you were visiting. Nowadays that Data Center has been shrunk down to a small room with one cabinet of servers. :)
Eyyyyyy the company I used to work for had a brand that had a lot of servers in that datacenter! It was a pain in the ass.
Where I worked I had to crawl under the raised floor to trace cables. Too many tape drives to lift the tiles.
Wait, shouldn't the sprinkler system put out argon instead of water in a data center?!
In a modern one, yes
Why did they cut the fiber??? Like if you're not gonna take it, why make it harder for the next person coming??? I don't see how that assisted the previous people in any way.
Some building leases require that to be done, and it's easier to cut it as well. And of course it's a security risk, as you don't know what the last person may have done to it.
It saved them from having to put blanking caps on all the switch ports. Maybe they forgot to bring caps or were just lazy.
Another factor is likely speed of removal of the equipment: unplugging them might take a few seconds per line, times 50 lines, or just 5 seconds to slice them all.
Really neat to see. Thank you for sharing!
Hopefully you hit up Iversen's while you were there, awesome bakery.
They are good
I remember once I had set up my desk on a cardboard box in a place like that while I was installing some servers, and they probably had well over a million dollars' worth of equipment; I was thinking of the irony of the value of my desk...
I had always heard you couldn't cut fiber like that, but I didn't know if that went by the wayside when they switched to plastic core over the older glass style. I never get to terminate those, so I've always been curious. Cool video, btw. I've only heard a quiet datacenter room when we're starting a NEW warehouse.
Well, not if you plan to use it, but for demo... who cares.
@@mrmotofy I asked bc I have that scenario, but the cable is also underground. Trying to fix without a full replacement.
@@TannerWood2k1 With proper cutting and equipment it can be done. But in the middle of a run it may not be possible. Run conduit just for that reason.
@@mrmotofy The issue is that the conduit was too small and the LC end separated, leaving just the rip cord attached. It's OM5 outdoor jacketing, and LC connector replacements won't slide over. So if it were damaged, it would be right at the end and hopefully wouldn't travel back too far (and I have service loop slack to play with). I ran a light tester on each pair and it gets there, but I don't know how to test intensity. Guess I need to find a local fiber tech to come show me how it's done. I have a cleaver and all the other stuff I should need except proper-fitting end connectors. Thanks for replying, bc I assumed the whole cable was done for.
A flywheel would technically still be a type of UPS, just not an electrochemical (battery) kind.
7:50 The good old ONS 15454! These suckers are built solid as rocks! Depending on the age and use case, that could be a ROADM with transponders or a SONET MUX!
Cisco still uses these chassis to this day as NCS 2015s and NCS 2006s!
One thing I have wondered: while a flywheel UPS makes sense because you just need enough time for the generator to fire up, is the real future in better battery tech like LFP batteries? The idea being that with all of this 'green' power generation tech being unreliable and variable throughout the day, the UPS doubles as a power buffer: it soaks up extra power when renewables are plentiful, say in the middle of a sunny day, and then discharges when they are not, say in the evening when the sun goes down and everyone comes home and turns on stuff.
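To make that buffer idea concrete, here's a minimal sketch of the charge/discharge logic, assuming a hypothetical LFP bank; all the capacities, rates, and time windows are made up for illustration, not anything from the video:

```python
# Hypothetical sketch of a UPS battery bank doubling as a daily power buffer.
# All numbers (capacity, rates, windows) are assumptions for illustration.

BATTERY_KWH = 2000.0        # usable LFP capacity
CHARGE_KW = 500.0           # max charge rate
DISCHARGE_KW = 500.0        # max discharge rate
RESERVE_FRACTION = 0.3      # never dip below this; it's still a UPS first

def battery_setpoint(hour: int, solar_surplus_kw: float, soc_kwh: float) -> float:
    """Signed battery power in kW: positive = charge, negative = discharge."""
    floor_kwh = BATTERY_KWH * RESERVE_FRACTION
    headroom_kwh = BATTERY_KWH - soc_kwh
    if solar_surplus_kw > 0 and headroom_kwh > 0:
        # Midday: soak up excess renewable generation
        # (limited by surplus, rate, and headroom, assuming 1-hour steps).
        return min(solar_surplus_kw, CHARGE_KW, headroom_kwh)
    if 17 <= hour <= 21 and soc_kwh > floor_kwh:
        # Evening peak: discharge, but only down to the UPS reserve floor.
        return -min(DISCHARGE_KW, soc_kwh - floor_kwh)
    return 0.0

# 6 pm, no solar, battery at 80%: discharges 500 kW into the load.
print(battery_setpoint(18, 0.0, 1600.0))   # -> -500.0
```

The key design point is the reserve floor: the bank only trades the energy above it, so the UPS function is never compromised by the buffering.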
Another thing I have wondered: would it make more sense to do a dual-fuel natural gas and propane generator instead of diesel? The idea being that by default you would run on natural gas, which should be limitless and not as bad for the environment to burn as diesel, and fall back to stored propane in the event the natural gas supply fails, say if an earthquake ruptures lines. Just thinking it would be less to manage than shipping in diesel on a regular basis and having to do something with all of that fuel before it goes bad. Propane never goes bad, so if you go years or decades without disruption to your natural gas supply, the propane can just sit in the tank. Natural gas is typically piped all over the place and usually still works in a power outage. You could also have an LNG tanker truck hookup for emergency use, in case you can't get a propane tanker in a disaster situation, or if you just find it is cheaper to source the LNG and know you will likely use it up before it boils off. Maybe even get into some co-generation with the power utility to help give those generators a good workout, especially as natural-gas-fired power plants are generally used as peaker plants these days.
If you had good battery tech and good generator tech, maybe a data center could run off of batteries during peak grid demand, when power is most expensive, and dump power onto the grid from its generators at the same time. Some of these newer data centers are so big that they could use grid-scale power generation equipment anyway just for the site. Anyway, I find this backup power stuff usually works best if it is set up in a way where it is in daily use. It is when the backup power is really only a backup that rarely gets touched that you find it doesn't work when you need it. Even test-firing on a regular basis may not tell you everything you need to know, and it wastes a lot of fuel just to confirm that the thing still turns on. There have been data centers put together where backup power does not cover cooling, so the whole place shuts down anyway when something happens, because you cannot run a data center without cooling. The kinds of solutions talked about here would be making money for the operator, so it is easier to justify to the bean counters that the system has to be sized up to cover everything.
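Putting rough numbers on that peak-shaving idea, here's a back-of-envelope calculation; every figure below (load, tariffs, efficiency) is an assumption for illustration:

```python
# Back-of-envelope daily savings from shifting a data center's peak-hour
# consumption onto batteries charged off-peak. All figures are assumed.

load_mw = 5.0             # steady facility load
peak_hours = 4            # hours of peak pricing per day
peak_price = 0.30         # $/kWh during peak
offpeak_price = 0.10      # $/kWh off-peak
roundtrip_eff = 0.90      # assumed LFP round-trip efficiency

shifted_kwh = load_mw * 1000 * peak_hours          # energy moved each day
cost_at_peak = shifted_kwh * peak_price            # what peak rates would cost
cost_shifted = (shifted_kwh / roundtrip_eff) * offpeak_price  # charging cost

print(f"Buying at peak rates:  ${cost_at_peak:,.0f}/day")
print(f"Shifting via battery:  ${cost_shifted:,.0f}/day")
print(f"Daily saving:          ${cost_at_peak - cost_shifted:,.0f}")
```

Under those made-up tariffs the shift saves a few thousand dollars a day, which is the kind of figure that makes daily-use backup power easier to sell to the bean counters.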
Data centers also need to adopt a 400V DC system. When you think about it, an old data center like this wastes tons of power on all of these power conversions. With a 400V DC system, you do one conversion to the 400V DC standard; the UPS batteries are charged to the bus voltage and then just connect directly onto the 400V power bus. There is no real conversion in the UPS beyond a two-way charger to get the batteries to the right voltage before connecting to the main power distribution bus. The power then goes direct to the servers: straight from the main converter to the 400V DC power bus, with the UPS batteries inline on the bus and the bus going straight to the servers. The server power supplies don't need any AC/DC conversion hardware, which is generally about half of the power supply; they skip straight to the DC-DC step-down stage after it. Much more efficient and cheap to mass-implement. Electric motors running the cooling system are also much more efficient when they are variable-speed, inverter-driven motors, which could get power from the 400V DC bus. For grid-balancing applications, the main power converters feeding the 400V DC buses could go from delivering, say, twice the load (charging all of the UPS batteries at the same time) to completely shut off, with the data center running entirely off the batteries, and constantly vary in between while balancing the grid. This would provide good stats on battery health on a daily basis. With the ability to have multiple UPS battery banks in parallel, you can take a problem battery pack offline and swap it out. With LFP batteries, they should be able to take this daily use case for decades. With a setup like this, losses in the system are minimal: LFP batteries are pretty efficient, and you cut out conversion steps that reduce the efficiency and raise the installation cost of existing grid-scale battery deployments. The conversion hardware is already there to power your servers; you just piggyback off it to charge up the batteries, the batteries can discharge directly into the servers without conversion hardware in between, and the servers themselves drop some of their internal conversion hardware, making it all even more efficient.
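For a feel of why cutting conversion stages matters, here's a rough efficiency-chain comparison; the per-stage efficiencies are assumed ballpark values for illustration, not measurements of any real gear:

```python
# Rough comparison: classic AC double-conversion UPS path vs a 400 V DC bus.
# Per-stage efficiencies are assumed ballpark numbers, not measured values.

# AC path: UPS rectifier -> UPS inverter -> server PSU AC/DC -> server DC/DC
ac_chain = 0.95 * 0.95 * 0.94 * 0.97

# DC path: one front-end rectifier to the 400 V bus -> server DC/DC only
dc_chain = 0.96 * 0.97

print(f"AC double-conversion chain: {ac_chain:.1%}")   # ~82%
print(f"400 V DC bus chain:         {dc_chain:.1%}")   # ~93%
print(f"More power to the servers:  {dc_chain / ac_chain - 1:.1%}")
```

Even with generous numbers for the AC stages, dropping two conversions is worth roughly ten percent of the facility's power before you count the benefit of the batteries sitting directly on the bus.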
I used to colo a server there. Bummer to find out it’s shut down.
I wonder why companies abandon these places; too much maintenance cost or something...
So many racks that can be used in home labs :P
you mean will be?
I’ve never seen a datacenter so small before! That’s quite incredible!
Is this the DC1 by Nexcess?
I think this used to be a Nexcess datacenter.
Too bad you didn't have that LTT pull, that flywheel system sounds interesting, now I know what I'm gonna google/rabbit hole tonight
Yes, hope to review one some day
Wow, that’s really old equipment, 80s / early 90s I reckon
With a lot of the power equipment, that stuff lasts forever with maintenance. The EdgeIron switch under the desk points to mid-to-late 2000s for me.
No VESDA? Water instead of gas? Single genset? Plenum cooling? No perimeter fence? No STS for A/B feed to racks? Meh..
It's worth pointing out that this particular datacenter has not been actively used, aside from 1 or 2 colo clients, for the better part of the last 7-8 years. Maybe even longer. The primary datacenter is much larger and has all the fancy bells and whistles you speak of :)
Have you worked in any data centers in Northern VA?
I have not
@@LAWRENCESYSTEMS The buildings are insane. Would love to see what’s inside
I've found that the oil heater on those Cummins Generators can be "very hot" even when not running
Yes, it can be!
For me, this quote sums up the video: "I want to touch everything" mwhahahaha
Yeah, SO MANY SWITCHES & BUTTONS!
Interesting video and walkthrough
Glad you enjoyed it
This video holds simultaneous records for the most and least switches ever to appear in a (non-talking heads) Lawrence Systems video.
It’s a Cisco ASR; the data center was probably running a multi-homed ASN
That's not an ASR; it's an optical chassis (looks like maybe an ONS 15454). It would have connected to AT&T's SONET network ring and been used to break out whatever circuits were coming into the datacenter (OC-3, OC-12, etc.)
Hi Tom! Only one generator? Where is the redundancy? BTW, you sound very excited :D
Where is it located?
Dearborn Michigan
This is so cool, thanks for showing it