What's Inside This Decommissioned Data Center?

  • Published: 26 Dec 2024

Comments • 289

  • @bradthx
    @bradthx Год назад +191

    I worked in that DC for years, many years ago. The fire suppression system is a pre-action dry-pipe type where the pipes in the ceiling to the sprinkler heads are full of air. Digital smoke / heat detectors would need to trigger to charge the system with water, then the sprinkler head element would also need to burst to allow water to exit the heads. This is why the system in the front closet was so complex. It prevents someone from flooding the DC by accidentally breaking a head with a ladder.
    The building also has some interesting history. In the late 1990s, the 'Godfather of Spam', Alan Ralsky, hosted his servers in that DC. I'm sure there's a long write-up of it still around in the internet archives somewhere. In the rafters of the back MDF room with the cut fiber, there should still be some binders full of paperwork for the original fiber orders with his name on them.
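
    A minimal sketch of the two-trigger interlock described above (illustrative only; the names and the detection logic are simplified assumptions, not the actual panel's behavior):

        def water_released(smoke_or_heat_detector_tripped: bool,
                           sprinkler_head_element_burst: bool) -> bool:
            """Pre-action dry-pipe logic: water reaches the room only if BOTH
            conditions hold. A broken head alone only vents air from the dry pipes."""
            pipes_charged_with_water = smoke_or_heat_detector_tripped  # valve opens on detection
            return pipes_charged_with_water and sprinkler_head_element_burst

        # A ladder knocking off a head without any detector alarm releases no water:
        assert water_released(False, True) is False
        # An actual fire trips a detector AND bursts a head, so water flows:
        assert water_released(True, True) is True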

    • @LAWRENCESYSTEMS
      @LAWRENCESYSTEMS  Год назад +34

      That's interesting, I can probably get back into that building again to look for them.

    • @Adrian-jj4xk
      @Adrian-jj4xk Год назад +2

      Hi Brad! hope you're doin' well

    • @davidew98
      @davidew98 Год назад +2

      It looks like it was an old mainframe data center back in the day too!

    • @kieranwilliams3052
      @kieranwilliams3052 Год назад +6

      Same here! Had an office there and, you know what, I need to go back.
      Let me see how much they have it listed for, and perhaps I will lease it.

    • @jhippl
      @jhippl Год назад

      @@kieranwilliams3052 How many megawatts is it?

  • @sandy1653
    @sandy1653 Год назад +200

    The only thing creepier than a quiet datacenter is a datacenter that all of a sudden gets quiet when you're working in the power room.

    • @LAWRENCESYSTEMS
      @LAWRENCESYSTEMS  Год назад +9

      For sure

    • @electroshed
      @electroshed Год назад +7

      Got the t-shirt; the sound of all the UPSes beeping still haunts me 🤣

    • @MrPir84free
      @MrPir84free Год назад +9

      Was in a server room/datacenter back in 2007, installing several beefy new VMware hosts. The boss was in there watching, with his hand over the emergency cutoff for the UPS, playing with the little cover that spins to give access to the cutoff button. Well, the server room went completely quiet in a split second; you should have seen the look on his face. What happened: the UPS had a single point of failure, a control board that just happened to go at that moment, as we were powering up about the 4th or 5th server. The massive UPS had been purchased a few years earlier and the manufacturer had issued a recall to replace the board, but there was a disconnect between the manufacturer, the vendor that performed the install, and the customer, who had never received the notice that the board needed to be replaced. Single points of failure were the typical modus operandi for the company (the company with the servers): a single HVAC system, a single massive UPS, single entry/exit, no generator, internet circuits that were nowhere near redundant, etc., etc.

    • @DMSparky
      @DMSparky Год назад +3

      As an electrician who often works in datacenters, that's my biggest fear.

    • @hansmuller1625
      @hansmuller1625 Год назад +4

      Back when I worked on datacenter cooling, my policy was to never screw with the firmware in more than one unit at a time, because of previous mishaps. I had a workmate who was a bit on the wild side: he flashed them as quickly as he could over Ethernet, and I ran around frantically configuring all the parameters. Once done, every single unit suddenly shut down without explanation. Fortunately they all rebooted and were up again within minutes, but the time in between was scary.

  • @TheMongolPrime
    @TheMongolPrime Год назад +86

    Hey Tom, I'm a DC manager + administrator. We use flywheel UPSes and love them. I hated paying for the battery replacements every 3-5 years; such a waste of money. I also vastly prefer aisle containment rather than just balls-to-the-wall CRAC cooling like this. Big waste in efficiency and PUE there too. Thankfully our DC is in a dry state, so we utilize evaporative cooling in tandem with CRACs and CRAHs. Lastly, I think comms under the raised floor is the worst: a huge pain to install and maintain, and it usually blocks air more than if you put power-only down there. Comms from the top down, power from the bottom up is our preferred method. Also, who the heck cuts fiber?! Sounds like a great way to accidentally blind someone if they forgot to disconnect the transmitting side.
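
    For anyone unfamiliar with the PUE mentioned above, it is just total facility power divided by IT power; a quick sketch with made-up numbers:

        def pue(total_facility_kw: float, it_load_kw: float) -> float:
            """Power Usage Effectiveness = total facility power / IT equipment power.
            1.0 would mean every watt goes to IT gear; real sites are higher."""
            return total_facility_kw / it_load_kw

        # Hypothetical numbers: open-room CRAC cooling vs. contained aisles.
        print(pue(total_facility_kw=800, it_load_kw=400))  # 2.0 -- half the power is overhead
        print(pue(total_facility_kw=560, it_load_kw=400))  # 1.4 -- tighter containment, less waste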

    • @davewood406
      @davewood406 Год назад +1

      The last few new DCs I've been into, the raised floor, if they even went with one, was just a plenum. Zone 4 EQ, though, so they are kind of at a disadvantage with the extra bracing underneath for the bays. Plus, if you have to move a beast like a CRS-1 or another pre-configured bay, you have to temporarily crib the travel path under the floor. These were primarily -48V DC powered telecom facilities, so they used tank cells that, with at least annual attention, last decades. They use a lot of space though.

    • @johngermain5146
      @johngermain5146 Год назад +2

      With the generator and transfer switch, smaller, less expensive batteries can be used... I thought the maintenance on the flywheels' "vacuum bearings" made flywheels less desirable, but I admit battery maintenance was a real pain.

    • @davewood406
      @davewood406 Год назад

      @@johngermain5146 Yeah, if your facility is primarily AC powered, the flywheel setup makes a lot of sense. Most everything is 48V DC with telco. You have E911 requirements as well, so it's kind of belt and suspenders, where they'll have 8 hours of battery plus enough fuel that the generators can often run for days, depending on projected worst-case resupply. Went to one that had batteries in every conceivable space plus two 10,000-gallon diesel tanks. N+1 generators on site. Multi-megawatt-sized facility, though. Well, MVA probably, officially, but I'm a DC guy and watts make more sense to me.
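
      On the MVA-versus-watts aside: multiplying apparent power by an assumed power factor gives real power, e.g. (numbers are illustrative only):

          def real_power_mw(apparent_power_mva: float, power_factor: float) -> float:
              """Real power (MW) = apparent power (MVA) x power factor."""
              return apparent_power_mva * power_factor

          # e.g. a hypothetical 2 MVA service at a 0.9 power factor delivers 1.8 MW usable
          print(real_power_mw(2.0, 0.9))  # 1.8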

    • @jonathanbuzzard1376
      @jonathanbuzzard1376 Год назад +1

      Water-cooled rear doors are way better than aisle containment. More expensive up front for sure, but they last a long time. Ours are already well over a decade old and look like new. Basically a large car radiator, and those last the lifetime of the car these days, out in the open, not in a nice clean dry data centre.
      What I noticed was the lack of fire suppression.

    • @fpskywalker
      @fpskywalker Год назад

      I’ve seen those flywheel systems and they are really cool. Spinning super fast in a vacuum, and sure enough they provided enough power until the generators came online.

  • @TechnoTim
    @TechnoTim Год назад +55

    I wish I had a sub floor for wires and possible flood!

    • @LAWRENCESYSTEMS
      @LAWRENCESYSTEMS  Год назад +21

      Yes, my studio is in the basement and I have two water alarms and soon I will be installing backup power and a SECOND sump pump, just in case..

    • @TheDillio187
      @TheDillio187 Год назад +7

      @@LAWRENCESYSTEMS check out the triple safe pump. 2 AC pumps (primary/backup) and a 12VDC pump for those bad days.

    • @JeffGeerling
      @JeffGeerling Год назад +2

      @@LAWRENCESYSTEMS I have a hot spare to my sump now, after my last one failed and I had a nice pool of water under the rack.

    • @jamess1787
      @jamess1787 Год назад

      Can anyone say GENERATOR INLET and inverter for your car? 🙏 😉

    • @AdrianTesta-Avila
      @AdrianTesta-Avila Год назад +1

      I use pallets : p

  • @repro7780
    @repro7780 Год назад +2

    As a person still working in a datacenter, and who has been for 35 years, I find this sad. There were once people working there, whose jobs are most likely gone, or sent overseas.

    • @franklaszlo8166
      @franklaszlo8166 Год назад +1

      Nah, just moved across town. A newer/bigger datacenter was built to replace this one less than 10 miles away. This one was kept alive for a number of years with just a few colo clients in there and a skeleton crew to manage it.

  • @carldonohue4806
    @carldonohue4806 Год назад +39

    This is hilarious... Tom has NEVER looked so gleefully happy in any of his videos. Totally nerding out about a decommissioned data center, like a kid in his favorite candy store. That's probably the best testament to what a great IT guy you must be, LOL.
    Cheers

    • @LAWRENCESYSTEMS
      @LAWRENCESYSTEMS  Год назад +10

      I'm always excited when I'm in a data center or server room!

    • @carldonohue4806
      @carldonohue4806 Год назад

      @@LAWRENCESYSTEMS ha ha I can tell … Nice!

  • @thenoisyelectron
    @thenoisyelectron Год назад +17

    *Starts digging a hole in the basement to create a false floor*
    Wife: what now?
    Me: This is how the big boys do it!

  • @Av-vd3wk
    @Av-vd3wk Год назад +5

    2:45 the floor isn’t primarily designed so the cables can traverse underneath - it’s for the airflow…CRAC units distribute the airflow from underneath. This is why you see the grated/perforated floor tiles and large gaps under the cabinets.

  • @MikeHarris1984
    @MikeHarris1984 Год назад +6

    Also, the air conditioning pushes air under the floor, and the perforated tiles in front of the cabinets let cold air come up in front of the servers. Hot air fills the room and is pulled out through the ceiling.
    Also, the raised floor has colors. When a floor has grey lanes of tiles, that means it has a heavier-duty weight limit and you can use a pallet/pallet jack on it. If the floor tiles are white, you can NOT use a pallet/pallet jack, as it's not spec'd to hold that weight. You have to unpack and carry items onto the raised floor.
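
    A rough sense of scale for that underfloor cold-air supply, using the standard sensible-heat approximation (the 5 kW load and 20 °F rise below are assumed numbers for illustration):

        def required_airflow_cfm(it_load_kw: float, delta_t_f: float = 20.0) -> float:
            """Approximate sensible-heat rule: BTU/hr ~= 1.08 * CFM * dT(F).
            Solve for CFM given an IT load and the supply/return temperature rise."""
            btu_per_hr = it_load_kw * 3412  # 1 kW ~= 3412 BTU/hr
            return btu_per_hr / (1.08 * delta_t_f)

        # A 5 kW cabinet with a 20 F rise needs roughly 790 CFM of cold supply air:
        print(round(required_airflow_cfm(5.0)))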

  • @ryanm9318
    @ryanm9318 Год назад +6

    Not going to lie, seeing him tapping the EPO in a shut-down data center made my heart jump. I am an electrician / rack / fiber duct installer for data centers.

    • @mjackstewart
      @mjackstewart Год назад

      I got the chance to hit the EPO once when we needed to test it in our data center.

  • @LOCOLAPTOP
    @LOCOLAPTOP Год назад +1

    I just decommissioned one of our oldest DCs a few months ago. No raised floors; it was all RTU A/C units and ladder racks, dual generators, an APC Symmetra 80kVA. It wasn't fully redundant, which is why, with the lease end coming up, it was a no-brainer to start over with new equipment in a newer location.

  • @itsmeohmeandme
    @itsmeohmeandme Год назад +4

    I used to work in this datacenter when it was owned by a web hosting company before that company was acquired by a bigger web hosting company.

  • @warsurplus
    @warsurplus Год назад +3

    The flywheel at the other data center that you referred to is the UPS. The flywheel isn't instead of a UPS; it's just a different type of energy storage vs. lead-acid batteries. The Liebert unit you showed is an always-inverting type, where it's converting DC to AC for the loads all the time.
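
    A back-of-the-envelope view of why a flywheel only has to bridge the generator start (rotor inertia, speed, and load below are assumed values, not specs of the Liebert or any particular product):

        import math

        def flywheel_ride_through_s(inertia_kg_m2: float, rpm: float, load_kw: float) -> float:
            """Stored kinetic energy E = 0.5 * I * w^2; divide by load power for seconds
            of ride-through (ignoring losses and the usable-speed window)."""
            omega = rpm * 2 * math.pi / 60          # rad/s
            energy_j = 0.5 * inertia_kg_m2 * omega ** 2
            return energy_j / (load_kw * 1000)

        # A hypothetical 15 kg*m^2 rotor at 7,700 rpm feeding a 250 kW load:
        print(round(flywheel_ride_through_s(15, 7700, 250), 1))  # ~20 s, enough to cover a ~10 s genset start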

  • @deo-max9229
    @deo-max9229 Год назад

    When I was in field service back in the mid and late 90s, I worked in many data centers. I remember cutting out bus & tag cables underneath the floor, as well as running fiber and plugging cables into devices and mainframes.

  • @therealchayd
    @therealchayd Год назад +1

    Great tour and definitely eerily quiet! That pretty much looks like the first DC I worked in (built in the '90s), with all the cabs out in the open. Everything has gone all environmentally friendly and energy-saving since then, with enclosed hot/cold aisles. It's amazing the amount of kit that telcos just abandon when sites get decommissioned. I've got a ton of OLT kit sitting in a shed at home from old circuits where, when asked if they want them back, the telcos have just said to scrap them. No idea what use they'd be, but hey! 😅

  • @the48thronin97
    @the48thronin97 Год назад +3

    That’s wicked cool! I love seeing decommissioned buildings like that, I got the privilege of seeing several buildings on my university’s campus shortly before they were torn down.

  • @joshuawaterhousify
    @joshuawaterhousify Год назад +23

    I work in a datacenter, and every time I see a video with an empty one it feels so creepy, because I'm so used to the noise and it's almost haunting to see that kind of environment and hear nothing.

    • @kwith
      @kwith Год назад +4

      I remember in one data center the UPS had failed and we were running solely on commercial power. (I don't know the specifics of why the backup generators weren't connected.) The power went out and everything went dark and silent save for the emergency lights. It's amazing how accustomed to noise you get; when it suddenly stops you become VERY aware of the absence. It's so creepy.

    • @joshuawaterhousify
      @joshuawaterhousify Год назад

      @@kwith I don't want to even imagine that. We have policies, plans, and procedures in place, but that kind of thing happening is straight up a nightmare to me. An ideal setting for a horror game; that would bug me even more than usual XD

    • @kwith
      @kwith Год назад +2

      @@joshuawaterhousify Well fortunately the data center in question was for Dev and QA environments, nothing in production was affected. So a few projects were impacted and some dev workers were annoyed and inconvenienced, but nothing major or critical was affected.
      But the creep factor was in full effect lol.

    • @joshuawaterhousify
      @joshuawaterhousify Год назад

      @kwith that's something at least; if it's going to happen, that's where you want it to be XD but yeah, I can imagine it would still be creepy as all hell, and would not want to be there :P

    • @kwith
      @kwith Год назад +1

      @joshuawaterhousify Yea, unfortunately I didn't have much of a choice. Someone had to do some troubleshooting and verify that everything came back up. I pulled the short straw on that one haha

  • @Natespage
    @Natespage Год назад +1

    Data Centers aren't supposed to have water sprinklers for fire suppression. Water ruins the data. Halon would be a better choice.

  • @mikecumbo7531
    @mikecumbo7531 Год назад +1

    A question about the genny: I worked for a company whose corporate data center and IT department were in part of our building. They had two independent generators, one diesel and the second natural gas. The primary was the gas unit. They had 7 days' worth of diesel on hand. I left in '96; do people spec single or dual generators now? Is it just down to how paranoid management is?

  • @TechOne7671
    @TechOne7671 Год назад +1

    I have a mid-90s comms room / server room at work; it's only about 30ft x 30ft. When our systems were upgraded to fibre and internet around 2010, a new room was built and services switched over. The old room was abandoned, everything powered down and left as it was; only the lights work now 😂. What was once state-of-the-art equipment is now just electronic waste. As we don't need the space just now, it is cheaper to leave it alone. Cool video. All the best.

  • @derkaderkamohamadallaackbarnut
    @derkaderkamohamadallaackbarnut Год назад +1

    I wouldn't say the floor is lifted "for the cables to go under"; that is just a nice feature that comes with raising it. Floors are raised for flooding and airflow reasons.

  • @jasonpeters9532
    @jasonpeters9532 Год назад +7

    I understand the security concerns for not filming and/or noting the address of an active data centre, but I am curious what the legal reason would be.

    • @LAWRENCESYSTEMS
      @LAWRENCESYSTEMS  Год назад +15

      The legal department looks for the fewest possible issues, and telling us NO is the easiest way. There are sometimes customer names on cabinets, and even where there are not, the customers may get nervous knowing people are there filming.

    • @a.m.653
      @a.m.653 Год назад +4

      @@LAWRENCESYSTEMS >and even where there is not the customers may get nervous knowing people are there filming.
      I don't understand this. Why would the customer be nervous? Is this a case of "security through obscurity"?

    • @boneappletee6416
      @boneappletee6416 Год назад +7

      There's often lots of information that can be gleaned from filming, which alone might not be important but in combination with other public or private info might be hazardous to corporate espionage targets or similar.

    • @mrmotofy
      @mrmotofy Год назад +1

      Many times contracts have clauses in them about revealing who is where and what they have, that kind of thing.

    • @TravisNewton1
      @TravisNewton1 Год назад +5

      A lot of facilities are colocation (so it's a lot of companies in one data center) and I've been in some data centers with high profile clients - like hospitals, payment processors, even a lot of local telcos will use colo over running their own data centers - and that just paints a big target on them. And in a lot of cases, these larger clients will make the data center sign their own NDAs and legal contracts. So to make it easy, Legal just puts a blanket "NO" across the board. Especially because I've seen racks and cages where those high profile clients will leave out laptops, printed diagrams, etc. since they're in a secure facility, so if you walk by with a camera and capture confidential information, then the data center could be in hot water.

  • @kwith
    @kwith Год назад +2

    I worked in data centers for 10 years and so much of this is familiar. The Liebert CPCs, the single-mode fiber hanging everywhere, the ladder racks racing across the ceiling, yellow fiber trough: good memories. I still work in one periodically, but not to the extent I used to. As for that raised floor, I worked in one that had a 12-inch raised floor, one that was 18 inches and one that was 36 inches. (Yes, three feet!)
    From the looks of it, they didn't have cold air containment in the cold aisles, did they? Were those Writeline racks? I remember the older DUC70 cabinets, that ugly beige color. THOSE were a pain in the ass to install in.

    • @deepspacecow2644
      @deepspacecow2644 Год назад

      Did anyone ever fall in the 36in floor?

    • @kwith
      @kwith Год назад

      @@deepspacecow2644 Not that I know of, but we were pretty strict on safety. When a tile was taken up and we were underneath, we had to put up cones and signs. Our safety guy was VERY militant about that, and rightfully so.

  • @johngermain5146
    @johngermain5146 Год назад +2

    I loved testing the generators and transfer switches both from the UPS and by killing power. It was a frequent testing requirement from the insurance co. I also liked it when the halon fire system went off.

    • @Jamesaepp
      @Jamesaepp Год назад +1

      >I also liked it when the halon fire system went off
      That does NOT sound like fun.

    • @johngermain5146
      @johngermain5146 Год назад +1

      @@Jamesaepp Expensive too; it causes a shunt trip of the power as well.

  • @TheAntibozo
    @TheAntibozo Год назад +6

    That UPS battery stack'll kill you right dead if you touch the wrong thing, powered off or not. Please be careful and warn people not to go poking around inside room UPSes.

  • @v12alpine
    @v12alpine Год назад +2

    Back in 2004 I was working at Phoenix Sky Harbor Airport in their little datacenter on-site, probably 1/3 the size of this one. It ran all the flight displays and the parking garage space counters. I was plugging in a new rack, and apparently they had wired 208V to a 120V plug under the tile. Big flash and pop out of the rack, followed by an uneasy silence as the entire room got quiet. Within minutes managers/supervisors were busting down the doors. Fun times.

    • @UnreasonableSteve
      @UnreasonableSteve Год назад

      Frankly it's a little surprising that that would even do much - virtually everything I've powered in a DC is 240v tolerant (with some requiring 200+V), but I can see something in 2004 being just old enough to be problematic.
      Or if it was a plug-in UPS, which I could see being the case in a little server room like you're talking about.

    • @jfbeam
      @jfbeam 10 месяцев назад +1

      Well, if they were stupid enough to set it up such that a single shorted (overloaded) outlet could take down the entire room, they totally deserve to be shamed.

  • @rfitzgerald2004
    @rfitzgerald2004 Год назад +2

    I haven't visited many datacentres in my time, but working for an ISP I do visit the same few fairly often. What surprises me in this video is the lack of physical security; the building seems so exposed from the outside. The datacentres I typically visit are surrounded by double barbed-wire fences, have security patrols, and have all internal rooms protected by very restrictive access control. Not to mention all power systems, including generators, being built into the buildings. They're not so strict on phone usage, though: while taking pictures is strictly forbidden, no one takes your phone on the way in, for example; it's more of a trust basis.
    What type of facility was this one: owned by a single company, or was it colo space?

    • @jfbeam
      @jfbeam 10 месяцев назад

      All that modern "security" is just theater. It's there to impress customers. A place like this one is more secure by the fact that few know it's there. And even fewer know what ("whose stuff") is in there.

  • @jeffm2787
    @jeffm2787 Год назад +15

    That's one Tiny datacenter.

    • @LAWRENCESYSTEMS
      @LAWRENCESYSTEMS  Год назад +4

      yup

    • @michaelmartin8036
      @michaelmartin8036 Год назад

      I worked building maintenance at the one in Atlanta (40 Perimeter Center, 1978) and ours was 84k sq ft: two stories, the bottom being the actual "data" portion and the upper being where all the programmers were. We did all the electrical, plumbing, HVAC and workstations.

    • @jeffm2787
      @jeffm2787 Год назад

      @@michaelmartin8036 Tiny as well. I used to walk around Iron Mountain Phoenix.

    • @jfbeam
      @jfbeam 10 месяцев назад

      Relic of a "long past" age. These days, one could have more power in a single rack than that entire building. (I've used "small" [4U] blade centers that drew 4kW... the whole rack only had 5.7kW fed to it.)

  • @markiangooley
    @markiangooley Год назад +1

    I got a glimpse of where the ILLIAC IV computer used to be on the University of Illinois campus in Urbana, years ago. I think there was a Cray there at the time. For all I know there aren’t any important computers in that building any more, over thirty years later.
    Raised floor, big cooling system, not all that different, accounting for the changes in technology.

  • @michaelhess4825
    @michaelhess4825 Год назад +1

    Oh, and we always snipped fiber when doing rebuilds/moves. So much quicker, and fiber is actually very cheap. I miss configuring Cisco and Fujitsu ONS platforms, very unique vs. "normal" switching platforms!

  • @WM-ln4dz
    @WM-ln4dz Год назад +1

    I've actually driven past there a few times, and I always assumed it was a closed bar or something like that, lol. Great video!

  • @diamonddave45
    @diamonddave45 Год назад +4

    Looks a lot like the development labs that I've either been in or operated. Loved the raised floor, makes life really easy. I've also been in telephone central offices where everything's overhead. Great stuff!

  • @wesley1983
    @wesley1983 Год назад +1

    1:02 even if it was powered up, you could still touch all that

  • @Parkhill57
    @Parkhill57 Год назад +2

    We had a UPS explode in our data center in about 2001. Big fire, and lots of water damage, because the fire department couldn't get in. The guy who had the keys got into a car accident rushing to the site. Now the center has no UPS in the server rooms; there's a fire wall between them.

    • @Parkhill57
      @Parkhill57 11 месяцев назад

      @@wd5vd page 2: They used concrete saws and cut a hole in the wall after what seemed like a long wait.

  • @steveurbach3093
    @steveurbach3093 Год назад +3

    WHY would you chop the fibre and leave the racks?! (I did notice that there were no PDUs. Gotta love 60A 208V 3-phase PDUs per rack, plus the redundant one.)
    Security procedures vary all over the place. Some just had badges (and maybe a code). Some have you put your hand on a scanner and badge in. One I was in had people locks: door 1 had to be closed before door 2 would even accept the hand + badge.
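
    For the 60A 208V 3-phase PDUs mentioned in the comment above, the continuous power budget per PDU works out roughly as follows (unity power factor and the usual 80% continuous-load derating are assumptions):

        import math

        def three_phase_kw(volts_ll: float, amps: float, derate: float = 0.8) -> float:
            """P ~= sqrt(3) * line-to-line volts * amps, derated for continuous load
            (unity power factor assumed for simplicity)."""
            return math.sqrt(3) * volts_ll * amps * derate / 1000

        # One 60 A, 208 V three-phase PDU: about 17.3 kW continuous;
        # an A+B redundant pair shares that budget so either side can carry the rack alone.
        print(round(three_phase_kw(208, 60), 1))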

    • @bmbiz
      @bmbiz Год назад

      I think they call those "mantraps". (edit: it's one word, not two)

  • @rocktheworld2k6
    @rocktheworld2k6 Год назад +1

    Very cool! That building is a very short drive from my house! Driven by it countless times without even thinking about it. It's right by that really weird intersection with Carlysle and Pelham!
    Maybe they can bring those UPS batteries to power my house and help deal with my constant DTE power outages...
    Would be interesting to see inside the AT&T building nearby on Michigan Ave with the door labeled "collocation entrance."

  • @LatitudeSky
    @LatitudeSky Год назад

    Currently work in a former data center like that one, with the same raised floors and Liebert HVAC. They kept the layout even without actual server racks. The Lieberts break down a lot. The room has two. With just one, we exceed 110 degrees just from ambient. Really need two to be survivable and again, we don't even have servers. The best part about the Lieberts is how they smell like a cross between old socks and an old fridge.

  • @xrayjoe
    @xrayjoe Год назад +2

    Tom was def in his happy place roaming around there.

  • @ShainAndrews
    @ShainAndrews Год назад +1

    7:54 Cisco 15454. Previously Cerent, until Cisco did what Cisco does and bought them. It dates the operation a fair bit. I could probably still provision one from the ground up despite not having logged into one for 15 years.

  • @franklaszlo8166
    @franklaszlo8166 Год назад +4

    The datacenter in Southfield is much more interesting, though they probably won't let you film there. I hadn't seen this one since it was decommissioned; pretty cool.

  • @ljubomirculibrk4097
    @ljubomirculibrk4097 Год назад +3

    These UPS caps are expensive, up to 100 euros new.
    They cook off from heat easily; I had one pair killed by lightning while the rest of the UPS survived.

  • @michaelhess4825
    @michaelhess4825 Год назад +1

    Such a cute little data center! Most people don't know these exist, let alone the scale of modern, huge, ones.

  • @todayonthebench
    @todayonthebench Год назад +4

    Ah, a good example of a data center where you least expect it. These are actually very common throughout the world.
    The main reason for keeping a data center's location a secret is purely hardware security. Having direct access to a server is far more dangerous than almost any software attack.
    Data centers that just offer VMs tend to be more relaxed about this type of security, since the hardware itself is more or less just a standardized box without any labels hinting at who rents it. I.e., it is far, far harder to do a targeted attack on a specific organization. But even a lot of these data centers hide.
    In Stockholm, where I live, I know of two data centers that are in actual cold-war-era bunkers. Their locations are fairly well known, but that doesn't make them easier to access, since they are literal bunkers. Though even highly secure facilities have been targeted through more indirect means before; just dropping a few "thumb drives" outside the premises to try to pull off an HID attack isn't a new thing. This becomes a lot harder if you don't even know where the secure facility is located.

  • @PaKa-kj3rj
    @PaKa-kj3rj Год назад +1

    I thought "finally" at the end when I saw you with the mini hammer and the emergency glass. That would have been the most tempting. You controlled yourself well.

  • @matthewmiller6068
    @matthewmiller6068 Год назад +1

    "normally can't open these" man that's so wild compared to places where I've been in computer lab support server rooms and its like "yeah just go find the breaker turn it on" with various 120V/208V/480V different 3 phase delta/wye, and 60/400Hz power systems. The plumbing stuff and UPSs are neat though, that part I can't see places I've been.

  • @Coderjo.
    @Coderjo. Год назад +1

    How long ago did Nexcess decommission it? It looks like they still have it listed on their datacenters page. I wonder how much noise the neighbors, such as the dance studio next door, heard.

    • @LAWRENCESYSTEMS
      @LAWRENCESYSTEMS  Год назад

      Not sure but the generator was last serviced in 2022 per the tags on it.

    • @Adrian-jj4xk
      @Adrian-jj4xk Год назад +2

      I think we only finally decommissioned it a year or so ago. Re: noise, the generator only ran during anticipated/actual power outages. Other than that, the walls and thick doors keep most of the noise inside.

  • @djjones2407
    @djjones2407 Год назад

    I work in HVAC controls. Those AC units (the ones feeding the holes in the floor) are normally called CRAC units (computer room air conditioning).

  • @jprice1485
    @jprice1485 Год назад +2

    Even though it's quiet, I can still hear it!

  • @loofers
    @loofers Год назад

    Cabling is often cut like this due to not wanting the gear to be reusable: if the tenant were to sell it to the building owner/landlord, and there were ANY issues down the line, there could be a giant legal mess that the previous tenant might get pulled into. That, and the obvious "not my job"/expediency thing, and sometimes to spite the building owner/landlord. Also sometimes there are shady contractor things in the agreements, like "tenant will use our preferred vendor for network cabling install work".

  • @TheMrDrMs
    @TheMrDrMs Год назад

    Looks like it was a small local-business DC, especially with only one generator. Any DC I've been to (and I have equipment at 4 locations) has at least 5 gennies, usually N+1. Two major DCs I'm at also have contracts with fuel (diesel) suppliers so that when a major storm is inbound, they not only have their multiple-thousand-gallon underground tanks, but also have a couple of tankers parked in the parking lot in case of an extended outage. The innards and operations of DCs are fascinating. Only a few minutes into your video, but so far very interesting.

  • @couttsw
    @couttsw Год назад

    That's a small data center; when I worked for Nestle, our server room was bigger: 5 IBM mainframes and 200 Compaq servers.

  • @david.mcmahan
    @david.mcmahan Год назад

    It wasn't a data center, but I worked in a place that had the raised machine-room flooring in our main workroom. Funny, just seeing that ramp into the room and those floor tiles triggered old memories of clocking in.

  • @TonyPadgett
    @TonyPadgett Год назад

    Who tracks all the racks and what is connected? Is there some kind of master tracker?

  • @scoobtoober2975
    @scoobtoober2975 Год назад

    I was once a short-term generator tech. AT&T was a good customer of ours. One of the bigger sites was the "switches" where all the local calls are routed through. Then the calls go microwave tower to microwave tower all around the country; for instance, the whole east coast has microwave towers all down the coast, so it doesn't need any wire/fiber to go all the way, then to fiber maybe, depending on how far across the planet. Detroit.
    The other cool generator job was when the main standby 1-megawatt one needed a part in a switch building. It then needed a backup of the backup generator, so they brought in a trailer generator: an 8-cylinder twin-turbo diesel Cat. OH MY GAWD, the startup on that is violent, and the airflow for the radiator will suck a golf ball through a garden hose. Very fun, but I had to go make a bit more money at a desk job.
    Love your channel.
    I'm also here to say my old black-box heater of a PoE EdgeRouter X died. Replaced it with a Lite 8. The firmware it had, or suggested, was broken. They eventually fixed it, as my google-fu wasn't good enough to figure out how to put an older firmware on it. Or maybe it was just my basic PC-hosted controller (up-to-date). It wouldn't take an adoption and hold; it would fault out, over and over, reset after reset. A firmware update came along a few weeks later, and voila. Great.
    I do like the switch setup, and that it's configured in the controller instead of standalone like the old EdgeRouter.
    Time for some cameras, I think.

    • @ShainAndrews
      @ShainAndrews Год назад

      They are called central offices (COs). Nothing is on microwave anymore, at least nothing significant. Everything is on fiber: both coasts, everything in between, as well as oceanic. Power consumption is even less than back in the '90s. You go into a CO today and I guarantee there are empty floors that were once jam-packed with telephony gear. Hell, one floor would be dedicated just to terminating all the outside-plant cable pairs.

  • @Heizenberg32
    @Heizenberg32 Год назад +2

    Most data centers I've been in are more chill about cell phones. Even the super high security ones (security armed with automatic weapons on full display!)

    • @UnreasonableSteve
      @UnreasonableSteve Год назад +1

      Depends on a lot of things. I've been to some with fully enclosed racks that didn't care *too* much but then again if I rocked in with a video camera I'm sure they'd've asked me to hand it over...

  • @novellahub
    @novellahub Год назад +1

    Be careful with those raised tiles. I had one pinch my hand before I could put it back down.

  • @TonyPadgett
    @TonyPadgett Год назад

    How do you decide which cables are under floor and which are over the top?

  • @RandomTechWZ
    @RandomTechWZ Год назад +3

    Why was there a data center in Dearborn of all places?

    • @LAWRENCESYSTEMS
      @LAWRENCESYSTEMS  Год назад +4

      Yeah, weird placement, being adjacent to a few other businesses like that and backing right up to residential. It's hiding in plain sight.

    • @Heizenberg32
      @Heizenberg32 Год назад +1

      Low land prices, I would imagine

    • @cjon256netstudy4
      @cjon256netstudy4 Год назад

      en.wikipedia.org/wiki/Apex_Global_Internet_Services @@LAWRENCESYSTEMS The place has a long and interesting history.

    • @allthingsdigital2434
      @allthingsdigital2434 Год назад +1

      I almost purchased this data center when I was CEO of Ann Arbor-based DigitalRealm (in fact, I bought a house across from it because I was confident and expecting to close on it, but Nexcess won the bidding war at the time). So I instead built out a DC in downtown Ann Arbor's Key Bank Building (where Cisco had vacated the site they used to build the BFR, which required them to make serious augments to the building).

    • @allthingsdigital2434
      @allthingsdigital2434 Год назад +2

      It was built by Carter, which designed facilities for Level3, UUNET/Worldcom, and Verizon. It has close proximity to several fiber conduits and it had ample power in the area (which reduced the cost to augment). I believe it was originally designed to supplement backbone provider AEGIS's other location on Outer Drive. I did a lot of BGP/Internet peering consulting for ISPs and CLECs in my youth in the mid-90s, and it's rarely about cheap land and more about the area's infrastructure. Otherwise that facility would be larger.

  • @privateaddress4025
    @privateaddress4025 Год назад

    Relatively small data center compared to what I used to work in. The components are always similar and usually just differ in scope and footprint.

  • @haxwithaxe
    @haxwithaxe Год назад

    I'm definitely going to borrow the doors in the floor idea when I renovate my basement. I don't have enough room for a raised floor but I can put the doors between the joists in the ceiling.

    • @ShainAndrews
      @ShainAndrews Год назад

      I'll take "What is a fire stop" for 1 million Alex.

    • @haxwithaxe
      @haxwithaxe Год назад

      As long as the doors are made of something fire resistant they are as much a fire stop as can lights.

  • @TurnRacing
    @TurnRacing Год назад

    the amount of power in that little room holy smokes!

  • @alumseal
    @alumseal Год назад

    love your joyful excitement..

  • @jix177
    @jix177 Год назад +1

    Fascinating! Great video. Thanks.

  • @ben_doom1958
    @ben_doom1958 Год назад

    So sad to see the generator and those UPSes which are no longer in use... with all the spare parts. I want to keep them from being taken apart... but I'm too far away, unfortunately.

  • @Billblom
    @Billblom Год назад +1

    Hmm... NetApp has a data center in the Research Triangle that MIGHT be filmable. They promote it to sell copies of it... reason? It is FAR more efficient than most data centers... the front of racks is a reasonable human temp.. the back? Very warm... They have huge diesel generators to keep things alive during the normal Duke Power problems... One of the nice side effects of having the racks isolated is that a large percentage of noise goes away when you have the backs of the racks sort of isolated from the fronts... (Lord knows a NetApp Filer is NOT a quiet animal...)

  • @Bonehead321123
    @Bonehead321123 Год назад +3

    Whooo, that's crazy. I drive by this place on a daily basis, and the only reason I knew this was a datacenter was because I applied to work there. Fun to see the inside of it finally!

  • @OPB682
    @OPB682 Год назад +1

    I swear, after building and commissioning a bunch of TV stations (which are not too unlike data centers), concrete floor tiles are the shittiest things to cut holes in for racks... and then when you cable the rack and pull the cables, the steel tops and bottoms are like razor blades if you don't put capping on them.

  • @MikeHarris1984
    @MikeHarris1984 Год назад +3

    Also, note the misters in the corners on the ceiling. They spray water into the air if humidity drops.
    Also, on the cut fiber: any new client will NEVER reuse any fiber or cables, so the vacating company will just cut it and pull the servers out. Faster and cheaper. Reusing any cables in a 24x7 datacenter operation is a big no-no. Murphy will show up!!!

    • @UnreasonableSteve
      @UnreasonableSteve Год назад +1

      >any new client will NEVER re use any fiber
      Never underestimate corporate cheapness. I could easily see a small operation letting it ride. And FWIW it's single-mode fiber, pretty easy to test and run, so I wouldn't blame 'em too much.

  • @killabandit
    @killabandit Год назад +1

    That is actually not true. You can have phones, but you cannot take pictures. Now, you can take pictures of your own equipment within your own cage if you are in a colo. This is the case for many data centers I've been in in the DFW, Austin and San Antonio areas.

  • @MM-vl8ic
    @MM-vl8ic Год назад

    After you get accustomed to a 500+ (and growing) acre DC, this one looks smaller than a comms closet... how quickly we forget...

  • @andljoy
    @andljoy Год назад

    I have had to snip a lot of fibre panels out before; it just feels wrong cutting a fibre, even more so a bundle of like 20!

  • @brianh9358
    @brianh9358 Год назад +2

    I worked in the Data Center for the Atlanta Federal Reserve back around the year 2000. I was the LMS administrator for the first online training server they installed. It was about the same size as the one you were visiting. Nowadays that Data Center has been shrunk down to a small room with one cabinet of servers. :)

  • @concernedcitizen2031
    @concernedcitizen2031 Год назад

    Eyyyyyy the company I used to work for had a brand that had a lot of servers in that datacenter! It was a pain in the ass.

  • @davidwhelchel8774
    @davidwhelchel8774 Год назад

    Where I worked I had to crawl under the raised floor to trace cables. Too many tape drives to lift the tiles.

  • @fluffyflextail
    @fluffyflextail Год назад

    Wait, shouldn't the sprinkler system put out argon instead of water in a data center?!

  • @freshslaya
    @freshslaya Год назад

    Why did they cut the fiber??? Like if you're not gonna take it, why make it harder for the next person coming??? I don't see how that assisted the previous people in any way.

    • @ElmokillaXDK
      @ElmokillaXDK Год назад +1

      Some building leases require that to be done, and it's easier to cut it as well. And of course it's a security risk, as you don't know what the last person may have done to it.

    • @mrfrenzy.
      @mrfrenzy. Год назад

      it saved them from having to put blanking caps on all the switch ports. Maybe they forgot to bring caps or were just lazy.

    • @PortersMob
      @PortersMob Год назад +2

      Another factor is likely speed of removal of the equipment: unplugging them might take a few seconds per line, times 50 lines, or just 5 seconds to slice them all.

  • @richardhyman6981
    @richardhyman6981 Год назад

    Really neat to see. Thank you for sharing!

  • @UnderLoK
    @UnderLoK Год назад +1

    Hopefully you hit up Iversen's while you were there, awesome bakery.

  • @Mikesco3
    @Mikesco3 Год назад

    I remember once I had set up my desk on a cardboard box in a place like that while I was installing some servers, and they probably had well over a million dollars' worth of equipment. I was thinking of the irony of the value of my desk...

  • @TannerWood2k1
    @TannerWood2k1 Год назад +1

    I had always heard you couldn't cut fiber like that, but I didn't know if that went by the wayside when they switched to plastic core over the older glass style. I never get to terminate those, so I've always been curious. Cool video, btw. I've only heard a quiet datacenter room when we're starting a NEW warehouse.

    • @mrmotofy
      @mrmotofy Год назад

      Well not if you plan to use it but for demo...who cares

    • @TannerWood2k1
      @TannerWood2k1 Год назад

      @@mrmotofy I asked bc I have that scenario, but the cable is also underground. Trying to fix without a full replacement.

    • @mrmotofy
      @mrmotofy Год назад

      @@TannerWood2k1 With proper cutting and equipment it can be done, but in the middle of a run it may not be possible. Run conduit just for that reason.

    • @TannerWood2k1
      @TannerWood2k1 Год назад

      @@mrmotofy The issue is that the conduit was too small and the LC end separated, leaving just the rip cord attached. It's OM5 outdoor jacketing and LC connector replacements won't slide over. So if it were damaged it would be right at the end and hopefully not travel back too far (and I have service loop slack to play with). I ran a light tester on each pair and it gets there, but I don't know how to test intensity. Guess I need to find a local fiber tech to come show me how it's done. I have a cleaver and all the other stuff I should need except proper-fitting end connectors. Thanks for replying, because I assumed the whole cable was done for.
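
      On "testing intensity": what a fiber tech typically measures is insertion loss in dB, the difference between a reference power reading and the far-end reading. A minimal sketch (the dBm values below are made up):

          import math

          def insertion_loss_db(launch_power_dbm: float, received_power_dbm: float) -> float:
              """Loss in dB is just the difference of the two dBm readings."""
              return launch_power_dbm - received_power_dbm

          def db_from_watts(p_in_w: float, p_out_w: float) -> float:
              """Equivalent ratio form: 10 * log10(Pin / Pout)."""
              return 10 * math.log10(p_in_w / p_out_w)

          # e.g. -20.0 dBm launched from a light source, -22.3 dBm read at the power meter:
          print(round(insertion_loss_db(-20.0, -22.3), 2))  # 2.3 dB of loss across the link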

  • @peter_smyth
    @peter_smyth Год назад

    A flywheel would technically still be a type of UPS, just not an electrochemical (battery) kind.

  • @brekmister
    @brekmister Год назад +2

    7:50 The good old ONS 15454! These suckers are built solid as rocks! Depending on the age and use case, that could be a ROADM with transponders or a SONET mux!
    Cisco still uses these chassis to this day as the NCS 2015 and NCS 2006!

  • @ChaJ67
    @ChaJ67 11 месяцев назад

    One thing I have wondered: while a flywheel UPS makes sense because you just need enough time for the generator to fire up, is the real future in better battery tech like LFP batteries? The idea being that with all of this 'green' power generation being unreliable and variable throughout the day, the UPS doubles as a power buffer, soaking up extra power when renewables are plentiful, say in the middle of a sunny day, and then discharging when they are not, say in the evening when the sun goes down and everyone comes home and turns on stuff.
    Another thing I have wondered is whether it would make more sense to do a dual-fuel natural gas and propane generator instead of diesel. The idea being that by default you would run on natural gas, which should be limitless and not as bad for the environment to burn as diesel, and fall back to stored propane in the event the natural gas supply fails, say an earthquake ruptures lines. Just thinking it would be less to manage than shipping in diesel on a regular basis and having to do something with all of that fuel before it goes bad. Propane never goes bad, so if you go years or decades without disruption to your natural gas supply, the propane can just sit in the tank. Natural gas is typically piped all over the place and usually still works in a power outage. You could also have an LNG tanker truck hookup for emergency use in case you can't get a propane tanker in a disaster situation, or if you just find it is cheaper to source the LNG and know you will likely use it up before it evaporates. Maybe even get into some co-generation with the power utility to help give those generators a good workout, especially as natural gas fired power plants are generally used as peaker plants these days.
    If you had good battery tech and good generator tech, maybe a data center could run off of batteries during peak grid demand when power is most expensive and dump power onto the grid from its generators at the same time. Especially since some of these newer data centers are so big, they could use grid-scale power generation equipment anyway just for the site. Anyways, I find this backup power stuff usually works best if it is set up in a way where it is in daily use. It is when the backup power is really only a backup that rarely gets touched that you find it doesn't work when you need it. Even just test-firing on a regular basis may not tell you everything you need to know, and it wastes a lot of fuel for nothing other than knowing that the thing still turns on. There have been data centers put together where backup power does not cover cooling, and so the whole place shuts down anyway when something happens, because you cannot run a data center without cooling. The kinds of solutions talked about here will be making money for the operator, so it's easier to justify to the bean counters that it has to be sized up to cover everything.
    Data centers also need to adopt a 400V DC system. When you think about it, an old data center like this wastes tons of power on doing all of these power conversions. With a 400V DC system, you do one conversion to the 400V DC standard, the UPSes are charged to the voltage of the other in-line UPSes, and then just directly connect into the 400V power bus. There is no real conversion in the UPS beyond a two way charger to get the batteries to the right voltage before connecting to the main power distribution bus. The power then goes direct to the servers. It is straight main converter to the 400V DC power bus, the UPS batteries are inline with the power bus, and the power bus goes straight to the servers. The server power supplies don't have any AC/DC conversion hardware, which is generally about half of the power supply, instead just skipping to the DC-DC step down after. Much more efficient and cheap to mass implement. Electric motors running the cooling system are also much more efficient when running variable speed inverter driven motors, which could get power from a 400V DC power bus. For grid balancing applications, the main power converters to the 400 V DC buses could go from say delivering twice the power and thus charging all of the UPS batteries at the same time to completely shut off and the data center running completely off of the batteries and constantly varying in-between while balancing the grid. This will provide good stats on battery health on a daily basis. With the ability to have multiple UPS battery banks in parallel, you can take a problem battery pack offline and swap it out. With LFP batteries, they should be able to take this daily use case for decades. With a setup like this losses in the system are very minimal as LFP batteries are pretty efficient and you cut out conversion steps that reduce the efficiency and raise the installation cost of existing grid scale battery deployments. I mean the conversion hardware is already there to power your servers, you just piggyback off of it to also charge up batteries and then the batteries can discharge directly into the servers without conversion hardware in-between and the servers themselves drop some of the internal conversion hardware, making it all even more efficient to do.
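
    The conversion-chain argument in the comment above can be sketched by multiplying per-stage efficiencies; every number below is an illustrative assumption, not a measurement of any real product:

        from math import prod

        def chain_efficiency(stages: list[float]) -> float:
            """Overall efficiency of a power path is the product of its stage efficiencies."""
            return prod(stages)

        # Legacy AC path: rectifier -> battery/inverter (double conversion) -> server PSU AC/DC -> DC/DC
        legacy_ac = chain_efficiency([0.96, 0.94, 0.94, 0.95])
        # Hypothetical 400 V DC bus: one front-end rectifier -> server DC/DC only
        dc_bus = chain_efficiency([0.97, 0.95])
        print(f"legacy AC chain: {legacy_ac:.1%}, 400V DC bus: {dc_bus:.1%}")  # roughly 81% vs 92%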

  • @sph33r
    @sph33r Год назад

    I used to colo a server there. Bummer to find out it’s shut down.

  • @Dodge34
    @Dodge34 Год назад

    I wonder why companies abandon these places: too much maintenance cost or something...

  • @PupShepardRubberized
    @PupShepardRubberized Год назад +3

    So many racks that can be used in home labs :P

  • @thelol1759
    @thelol1759 Год назад +1

    I’ve never seen a datacenter so small before! That’s quite incredible!

  • @galitako7991
    @galitako7991 Год назад

    Is this the DC1 by Nexcess?

  • @Mack_Dingo
    @Mack_Dingo Год назад +1

    Too bad you didn't have that LTT pull, that flywheel system sounds interesting, now I know what I'm gonna google/rabbit hole tonight

  • @a9503128
    @a9503128 Год назад +1

    Wow, that’s really old equipment: '80s / early '90s, I reckon.

    • @UnreasonableSteve
      @UnreasonableSteve Год назад

      With a lot of the power equipment, that stuff lasts forever with maintenance. The edgeiron switch under the desk points to mid-late 2000s for me.

  • @Manx0Mann
    @Manx0Mann Год назад +1

    No VESDA? Water instead of gas? Single genset? Plenum cooling? No perimeter fence? No STS for A/B feeds to racks? Meh...

    • @franklaszlo8166
      @franklaszlo8166 Год назад

      It's worth pointing out that this particular datacenter had not been actively used, aside from 1 or 2 colo clients, for the better part of the last 7-8 years. Maybe even longer. The primary datacenter is much larger and has all the fancy bells and whistles you speak of :)

  • @Mcohen20
    @Mcohen20 Год назад

    Have you worked in any data centers in Northern VA?

    • @LAWRENCESYSTEMS
      @LAWRENCESYSTEMS  Год назад +1

      I have not

    • @Mcohen20
      @Mcohen20 Год назад

      @@LAWRENCESYSTEMS The buildings are insane. Would love to see what’s inside.

  • @johngermain5146
    @johngermain5146 Год назад

    I've found that the oil heater on those Cummins Generators can be "very hot" even when not running

  • @paullproductions
    @paullproductions Год назад +3

    For me, this quote sums up the video: "I want to touch everything" mwhahahaha

  • @giovannimai8828
    @giovannimai8828 Год назад +1

    Interesting video and walkthrough

  • @MiniArts159
    @MiniArts159 Год назад

    This video holds simultaneous records for the most and least switches ever to appear in a (non-talking heads) Lawrence Systems video.

  • @bcm50
    @bcm50 Год назад

    It’s a Cisco ASR; the data center was probably running a multi-homed ASN.

    • @BrianLanders
      @BrianLanders Год назад +1

      That's not an ASR; it's an optical chassis (looks like maybe an ONS 15454). It would have connected to AT&T's SONET ring and been used to break out whatever circuits were coming into the datacenter (OC-3, OC-12, etc.).

  • @sledgeHammerRulez
    @sledgeHammerRulez Год назад

    Hi Tom! Only one generator? Where is the redundancy? BTW, you sound very excited :D

  • @osvaldocoelho3665
    @osvaldocoelho3665 3 месяца назад

    where is it located?

  • @mrlithium69
    @mrlithium69 Год назад

    this is so cool thanks for showing