This INCREDIBLE Short-depth Server has a Storage Secret

  • Published: 8 Nov 2024

Comments • 116

  • @justbendev2324
    @justbendev2324 1 year ago +25

    This is Patrick from STH presenting another insane server at an insane cost x)
    I love the products your channel presents, but damn, it's too expensive for me. (crying)

    • @lurick
      @lurick 1 year ago +4

      Haha, yeah
      They look so cool but $$$ makes me sad :(

    • @acuteaura
      @acuteaura 1 year ago +7

      just imagine the jet engine sound in your home lab and you'll feel slightly less bad ;)

    • @Jaabaa_Prime
      @Jaabaa_Prime 1 year ago +2

      Presenting new and awesome servers is a great thing. Whether you can afford (or justify) the price is not the reviewer's problem. I think that Patrick/STH was, as always, honest in his review.

    • @marcogenovesi8570
      @marcogenovesi8570 1 year ago +2

      It's ok to nerd out on expensive "big boy" hardware. Car people do it all the time

    • @justbendev2324
      @justbendev2324 1 year ago +1

      @@Jaabaa_Prime are you okay Ken?
      I think you are a bit on edge over a simple harmless joke.
      I hope you are doing well. 🙏🏼

  • @gowinfanless
    @gowinfanless 1 year ago +6

    Amazing!! Such a super strong 1U, especially since it comes with a redundant power supply!

  • @christopherjackson2157
    @christopherjackson2157 1 year ago +4

    I usually recommend people upgrade when a server starts to hit around 80 percent load, and aim for 60 percent load with the replacement. So around 70 percent average server load sounds just about right lol
    60 percent means your workload can scale 1.5x over the 5-year lifespan of the box. And frankly, if you're expecting more growth... you probably want to upgrade every 2 or 3 years instead of buying for 5.
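    The rule of thumb above is just a ratio, sketched below for anyone who wants to plug in their own numbers (the function and its defaults are this editor's illustration, not from the comment):

```python
# Toy sketch of the sizing rule above: buy so the box starts at ~60% load,
# replace when it reaches ~80%. Purely illustrative.
def growth_headroom(start_load_pct: float, replace_at_pct: float) -> float:
    """How many times the workload can grow before hitting the
    replacement threshold, starting from start_load_pct."""
    return replace_at_pct / start_load_pct

# Starting at 60% and replacing at 80% leaves ~1.33x growth before the
# next upgrade; growing all the way to 100% capacity would be ~1.67x.
print(round(growth_headroom(60, 80), 2))   # 1.33
print(round(growth_headroom(60, 100), 2))  # 1.67
```

    The commenter's "1.5x" sits between the 80% replacement trigger and full capacity, which matches the spirit of the rule.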

  • @jolness1
    @jolness1 1 year ago +8

    Love the lstopo diagram; it's always cool to see the topology, and it helps me conceptualize things better.

    • @ServeTheHomeVideo
      @ServeTheHomeVideo 1 year ago

      I totally agree. We try to put the lstopo or system block diagram in every server review (and most motherboards at this point.)

  • @TehFreek
      @TehFreek 1 year ago +5

    Eh, I'd be happy to give up the internal M.2 drives and SATA drives if it meant I could get another two external NVMe drives.

  • @helmutweinberger4971
    @helmutweinberger4971 1 year ago +5

    I really like it and am excited for the pricing. I hope that you can also mount it the other way around, as the orientation could be a problem in the datacenter.

  • @movax20h
    @movax20h 1 year ago +3

    So, so close, yet not quite perfect. The PSU should be at the back and the drives at the front. There are so few manufacturers that do it right (there are a few from Supermicro, but I just gave up and built my own from scratch instead). But really nice to see it has U.2, M.2, EDSFF, and SATA. Everything doubled. Quite versatile indeed. 3 PCIe slots is also great. I would prefer one of them to be OCP instead, but I guess that is not bad either. 80W idle, oh, that is meh.

  • @radeksparowski7174
    @radeksparowski7174 1 year ago +3

    Gimme that, and a few industrial-grade 30/60TB SSDs for my new homelab... got it on my todo list right after a self-sufficient solar-powered catamaran and before my own island, after I win a considerable jackpot...

  • @alexrosenberg_tube
    @alexrosenberg_tube 1 year ago +3

    Seems like they missed a trick to be learned from the 2019 Mac Pro: most of those cables are unnecessary when you design a tightly-integrated system like this.

    • @fuzzyfuzzyfungus
      @fuzzyfuzzyfungus 1 year ago +1

      They probably could have cut more cables if they really wanted to, but going by the manual the system doesn't actually look that tightly integrated. There are two variants based on PSU length (the 650W short-PSU one is shown here; in the 1300W unit the front 2.5in bay is gone because the PSUs occupy more space, and that is where the power distribution board goes instead), which means the PSUs can't just plug into the motherboard, since the attachment point isn't in the same place.
      The E1.S cage is also optional, which means more cabling; and the I/O on the intake-fan side is all cabled because no part of the motherboard is even close to it.
      If they'd been willing to do a spin of the motherboard exclusive to a 650W config with E1.S populated, they could have had the PSUs connect directly to it, integrated the power distribution board, and connected the cabled I/O and E1.S directly to the motherboard. And if they'd spent more money on some sort of sliding carriers, like you see on some switches, they probably could have done away with the fan power cables. But that sounds like a lot of lost versatility, extra motherboard PCB area, and mechanical complexity just to get rid of a few wire bundles.

    • @ServeTheHomeVideo
      @ServeTheHomeVideo 1 year ago

      Yes

  • @JasonsLabVideos
    @JasonsLabVideos 1 year ago +3

    Sweet. The question for me is: how loud is it?

  • @gordslater
    @gordslater 1 year ago +1

    I've been running all my SFF and tower desktops back-to-front for years; the only mod is to add a 2-way 2-pin Dupont splitter on the front-panel power-button header and fit a push-button switch that I thread through a vent hole in the rear panel (hot-glue it in place).
    I've never understood why desktop cases have all the connectivity at the back out of sight and all the styling bullcrap up front.

    • @nadtz
      @nadtz 1 year ago

      Silverstone tried something different with the I/O at the top instead of the back, but I guess it never took off. Loved my Raven case that was set up that way, though.

  • @nadtz
    @nadtz 1 year ago +4

    I've always loved the 'pizza box' form factor; this thing packs a lot of hardware into that tiny case.

    • @briceperdue7587
      @briceperdue7587 1 year ago

      Allowing you to use a 2-post rack in legacy locations, these things are awesome. Supermicro still dominates this space today.

    • @sativagirl1885
      @sativagirl1885 1 year ago +1

      $parc one up, Cheech 🙂

    • @nadtz
      @nadtz 1 year ago

      @@briceperdue7587 I probably built a few hundred 512L's once upon a time. I could knock one together in like 15 minutes, but obviously it was nowhere near as dense as this server.

  • @thebesttechs
    @thebesttechs 1 year ago +1

    Awesome review! Keep it up.

  • @criticalhartzer
    @criticalhartzer 1 year ago +2

    Just a heads up: the link to the main site article is broken at the moment; it's missing the servethehome part.
    The server is super interesting though, really like it.

  • @RetroBerner
    @RetroBerner 1 year ago +1

    It would be pretty silly if the rack ears weren't reversible. Are the fans in push or pull configuration?

    • @Eifer91
      @Eifer91 1 year ago +2

      From the pictures on the Asus website, it seems that the fans are pulling air from the PSU side in both the R (ears on the traditional side) and F (ears on the PSU side, like the version shown in the video) versions.

  • @jeremybarber2837
    @jeremybarber2837 1 year ago +3

    Love the variety of storage options. So fun.

  • @ChristianSamsel
    @ChristianSamsel 1 year ago

    I'd actually consider buying one, but I have not found any seller in Germany. Paper launch?

  • @pycontiki
    @pycontiki 1 year ago +1

    Your link to the article is messed up

  • @karolromanski6192
    @karolromanski6192 1 year ago +4

    Can you make a video about the Xeon lines of processors? With Core i3/i5/i7 everything is easy. With servers, not so much.

  • @Jaabaa_Prime
    @Jaabaa_Prime 1 year ago +1

    7:06 Incomplete review, we must know if we can put the server into a rack the right way around! 🙂 Off topic, but is STH getting a "production" 45drives HL15 to review?

    • @ServeTheHomeVideo
      @ServeTheHomeVideo 1 year ago

      We might. I really want to do the 2.5" NVMe 45drives system though.

  • @markbooth3066
    @markbooth3066 1 year ago

    The Asus web page for this product does show the rack ears on the front, rather than the back.

    • @Eifer91
      @Eifer91 1 year ago +2

      The Asus website shows both versions; the F version has the rack ears on the PSU side, the R on the other side.

    • @markbooth3066
      @markbooth3066 1 year ago +1

      It's good that it's been confirmed that it has the mounting holes for the ears to be used either way.

  • @BloodyIron
    @BloodyIron 1 year ago +2

    3:55 I think you mean RAID 1, not RAID 0...
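    For anyone new to the distinction raised here, a minimal sketch of the arithmetic (illustrative only; the function is this editor's, not from the video): RAID 1 mirrors drives for redundancy, RAID 0 stripes them for capacity with none.

```python
# Illustrative RAID 0 vs RAID 1 arithmetic for a small array.
def raid_summary(level: int, drive_tb: float, drives: int = 2) -> dict:
    if level == 0:  # striping: sum of capacities, zero failures tolerated
        return {"usable_tb": drive_tb * drives, "failures_tolerated": 0}
    if level == 1:  # mirroring: one drive's capacity, survives drives-1 failures
        return {"usable_tb": drive_tb, "failures_tolerated": drives - 1}
    raise ValueError("only RAID 0 and RAID 1 in this sketch")

print(raid_summary(0, 4.0))  # {'usable_tb': 8.0, 'failures_tolerated': 0}
print(raid_summary(1, 4.0))  # {'usable_tb': 4.0, 'failures_tolerated': 1}
```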

  • @knier
    @knier 1 year ago +1

    the site article link in the description is broken :(

  • @SimmanGodz
    @SimmanGodz 11 months ago +1

    You sure the rack ears aren't just on backwards?

  • @kenzieduckmoo
    @kenzieduckmoo 1 year ago

    Is this video from before the move, or did you just set up your studio exactly as it was in Texas?

  • @bits2646
    @bits2646 1 year ago

    Are those 2 front NVMe bays U.3 ready?

  • @AI-xi4jk
    @AI-xi4jk 1 year ago +1

    How much would that be? Can I put a 300-watt dual-slot GPU in it?

    • @ServeTheHomeVideo
      @ServeTheHomeVideo 1 year ago +1

      It might work, but you might also need bigger PSUs with a 300W GPU installed.
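      The caution above can be sanity-checked with back-of-the-envelope numbers (a sketch only: the 650W rating matches the smaller PSU option mentioned elsewhere in the thread, while the 200W system estimate and 20% safety margin are this editor's assumptions):

```python
# Rough PSU headroom check for adding a 300W GPU. All numbers except the
# GPU's 300W are assumptions for illustration.
def psu_headroom_w(psu_w: float, system_w: float, gpu_w: float,
                   margin: float = 0.20) -> float:
    """Watts left after the load, keeping a safety margin of the rating."""
    return psu_w * (1.0 - margin) - (system_w + gpu_w)

# 650W PSU, ~200W assumed for the rest of the system, 300W GPU:
print(round(psu_headroom_w(650, 200, 300), 1))  # 20.0 -> very tight
```

      Only ~20W of margin left, which is why a bigger PSU is advised.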

  • @oscarcharliezulu
    @oscarcharliezulu 1 year ago +1

    This is some sexy hardware.

  • @tushg4061
    @tushg4061 11 months ago

    What is the best server for $1800 or below?

  • @GameCyborgCh
    @GameCyborgCh 1 year ago

    Jesus, the heatsink on the VRM

  • @allanwind295
    @allanwind295 1 year ago +3

    Hey Patrick, do you plan on hosting your videos elsewhere (if YouTube succeeds in driving your audience away)? I subscribed to the feed on your website a while ago. What's the point of the kitchen-sink storage options, as opposed to different versions of the server each supporting one of the options (besides the PCIe card)? I wouldn't want to stock 5 different spare parts. On the other hand, I would want at least 3 of a given type for RAID 5.

    • @ServeTheHomeVideo
      @ServeTheHomeVideo 1 year ago

      Probably not in the near term, but we might start doing vertical for elsewhere when we get the new studio in Nov.

  • @insu_na
    @insu_na 1 year ago

    I really don't like that most servers have the rack ears in the front... Sometimes I want to do maintenance on the hardware (swap a bad memory module, a bad GPU, whatever), and with most servers I have to take the server fully out of the rack for that, because the lid of the chassis can't open while the server can't be pulled all the way out of the rack. If I could pull those servers out the back, though, that wouldn't be an issue at all...

    • @jonathanbuzzard1376
      @jonathanbuzzard1376 1 year ago +2

      Duh, that's why you get sliding rails. If you are cheaping out and getting static rails, then sucks to be you.

    • @ServeTheHomeVideo
      @ServeTheHomeVideo 1 year ago +1

      I prefer always having rails. That is a minority preference, however.

    • @jonathanbuzzard1376
      @jonathanbuzzard1376 1 year ago

      @@ServeTheHomeVideo I never understood that, as it is a health and safety issue. If you don't have sliding rails, then, especially if the device is high up in a rack, you are going to need two people for any maintenance rather than one. Even if you decide you don't care about dropping a server on yourself or doing your back in during maintenance, dropping the server would turn simple maintenance, such as replacing a faulty DIMM or fan, into a very bad day. Spoiling the ship for a ha'p'orth of tar, frankly. At this point I take issue with vendors of Infiniband, Fibre Channel, and Omnipath switches: a pox on you all.

    • @TharangaSanjeewa-ee3ek
      @TharangaSanjeewa-ee3ek 1 year ago

      ​@@jonathanbuzzard1376ì7

  • @pdeakneam
    @pdeakneam 1 year ago +1

    Hello, I am really new to this technology, and thank you for explaining it clearly.

  • @seanmchughnt
    @seanmchughnt 1 year ago

    Can you get reverse-airflow fans if you do indeed mount it "the normal way"?

    • @nadtz
      @nadtz 1 year ago +2

      You get the RS2-R if you want normal orientation.

    • @Eifer91
      @Eifer91 1 year ago

      @@nadtz When you look at the pictures on the Asus website, the fans seem to be mounted in the same orientation for both R and F versions (you can see the flow-direction arrows on the fans). If that is the case, the R version will blow hot air out the front. Definitely something to take into consideration when mounting those versions in a rack.

    • @ServeTheHomeVideo
      @ServeTheHomeVideo 1 year ago

      Yes.

  • @psycl0ptic
    @psycl0ptic 1 year ago +29

    “Redundancy in raid 0”?

    • @Kurata_Nemui
      @Kurata_Nemui 1 year ago +11

      Redundancy in the chances of losing your data.

    • @poofygoof
      @poofygoof 1 year ago +9

      "redundancy or raid 0" is maybe what he intended to say?

    • @mph8759
      @mph8759 1 year ago +3

      It was a silent “or”, emphasised by the “whatever”

    • @0joebloe
      @0joebloe 1 year ago

      Perfect for backup

    • @markarca6360
      @markarca6360 1 year ago +1

      This is Scary RAID, you are scared of losing all of your data.

  • @sativagirl1885
    @sativagirl1885 1 year ago +1

    Q: What can server vendors do with ARM v9 4nm with unlimited TDP?

    • @ServeTheHomeVideo
      @ServeTheHomeVideo 1 year ago +1

      4nm will help a lot. Remember though, the actual core IP on a modern CPU is very small compared to all the other IP for things like the fabric, memory controllers, PCIe controllers, and especially cache cells.

  • @eman0828
    @eman0828 1 year ago

    The extra storage on the back of the server is nothing new. I've seen server chassis with those when shopping around for a new chassis for my custom server build.

  • @benjaminsmith3625
    @benjaminsmith3625 1 year ago

    I'm not sure why you'd want *different* kinds of storage? It just sounds like a pain to spec initially and then a pain to upgrade, let alone hold any spares for.

    • @ServeTheHomeVideo
      @ServeTheHomeVideo 1 year ago

      On the other hand, they are there. That was the amazing part.

  • @kristopherleslie8343
    @kristopherleslie8343 1 year ago +1

    dope

  • @alexjenkins7102
    @alexjenkins7102 1 year ago

    How loud?

  • @DonaldMolter
    @DonaldMolter 1 year ago

    I need it

  • @jonathanbuzzard1376
    @jonathanbuzzard1376 1 year ago +1

    Can you get an option for SFP+? Because 10GBase-T is useless in a data centre. Oh, and the X710 was first released in Q4 2014, so that is a nine-year-old chipset and not remotely recent. Clearly, you have been on the crack pipe. I had to double-check, because I knew our compute nodes purchased in 2017 all have X710 NICs.

    • @allanwind295
      @allanwind295 1 year ago

      Why is 10GBaseT useless in a data center?

    • @ServeTheHomeVideo
      @ServeTheHomeVideo 1 year ago

      There are a lot of folks still rocking X540-T2's and older. Also, the updated X710 10Gbase-T (check the X710-T2L for an example) is from 2019. Fortville required a re-spin when it was still a 40GbE/10GbE SFP+ part, and so it went through revisions.

    • @jonathanbuzzard1376
      @jonathanbuzzard1376 1 year ago

      @@allanwind295 For lots of reasons. Firstly, 99% of data centre switches are SFP+ because, for starters, they are cheaper and lower power, so running costs are significantly lower. Within a rack, or even to an adjacent rack, a DAC cable is lower power and thus cheaper. If I need more reach than a DAC cable can manage, well, it's SFP+, so I will just stick in some SR transceivers and go fibre optic. Breakout DAC cables are awful to work with and super inflexible; much better to go optical and break out to a patch panel for a bunch of flexibility. I have hundreds of 10Gbps ports at work and not a single 10GBase-T among them. If someone presented me with a server that had them, my first port of call would be to swap the NIC for something with SFP+. I was also talking with our head of networks (I work in HPC) and they don't do 10GBase-T either. In the entire university there is the grand sum of one port using a transceiver, because some dumb research group broke procurement rules and purchased a DGX box and wants to use its 10GBase-T port. They are a problem group anyway, because they didn't consult on anything before making the purchase, so we are having to do a major power upgrade to accommodate them.

    • @ChristianSamsel
      @ChristianSamsel 1 year ago

      Well, this is clearly positioned to be deployed in an edge environment. Copper isn't wrong, and it has room for expansion.

  • @JohnCillian
    @JohnCillian 1 year ago

    Every new video Pat looks bigger. Watch your weight 😂

    • @ServeTheHomeVideo
      @ServeTheHomeVideo 1 year ago

      This was recorded near the end of 32 flights in August/ September. Very rough :-/

  • @mohamedfawzy1295
    @mohamedfawzy1295 1 year ago

    ر

  • @fateichmann
    @fateichmann 1 year ago +1

    Second! :)

  • @charles2000wang
    @charles2000wang 1 year ago +1

    First!

  • @colinreece3452
    @colinreece3452 1 year ago

    Geez, take a breath; the constant chatter is off-putting, I had to leave.

  • @be-kind00
    @be-kind00 1 year ago

    I want the motherboard inside this thing. What is the part number? @servethehome

  • @احمدامام-ك3ص
    @احمدامام-ك3ص 1 year ago

    Zoom in. YouTube