The Perfect Home Server 2024 – 56TB, ECC, IPMI, Quiet & (kind of) Compact

In this video, we're gonna build a perfect home server slash NAS that you can use for virtualization, backups, media, "Linux ISOs", Docker containers, and really, anything else you can think of. It's gonna be powerful, quiet, power efficient, reasonably compact and easily upgradable, and it's also gonna have all the features you would expect from a big boy capital-S Server, including things like ECC memory and remote management. And the best part? It's gonna cost us less than 500 bucks.

I've put a lot of thought into this build, and I'm fairly confident that if you're looking to build a power-efficient, storage-focused home server on a budget, this is pretty much the best build to go for. Oh, and this time around, you won't even need a 3D printer to do it.

So let's jump into it right after a word from today's sponsor, CodeRabbit.

CodeRabbit is an AI-powered code review tool that helps you and your team merge code changes faster without sacrificing code quality. Instead of just pointing out potential problems with your code, CodeRabbit also provides fixes and explains the reasoning behind its suggestions. It also includes automatic PR summaries, file-change walkthroughs, and highlights for code and configuration security issues. And it's not just AI either: CodeRabbit can run popular linters like Biome, Ruff, PHPStan and so on, and lets you write your own ast-grep rules.

CodeRabbit has reviewed over 5 million PRs, is installed on a million repositories, has 15k+ daily developer interactions, and is used by 1000+ organizations. And the best part? It's completely free for open-source projects. So go ahead and try CodeRabbit today! Check out the link down below and use the code WOLFGANG1MFREE to get one month of CodeRabbit AI for free. Thank you, CodeRabbit, for sponsoring today's video, and now, let's get back to our build.

Starting with our CPU and motherboard: I decided to build this system around the AMD AM4 platform, and that's because AM4 is pretty much the best bang for the buck right now when it comes to building a home server. You get ECC support out of the box, regardless of the chipset; you get PCIe bifurcation, which is great for NVMe adapters; and on the G-series CPUs you also get integrated graphics with support for hardware video transcoding, even though it's not as good as Intel's integrated graphics.

And since AM4 is technically a "legacy" platform, the parts are dirt cheap, and assuming you've picked the right CPU, you can actually build a pretty power-efficient machine with it. So let's start with the right CPU. This is the Ryzen 5 PRO 4650G: a 6-core, 12-thread CPU with integrated graphics that's based on the Zen 2 architecture.

I went with the 4650G due to a pretty much unbeatable combination of integrated graphics with support for hardware video transcoding, a reasonable price on the reseller market, great power efficiency at idle due to its monolithic design, and support for unregistered ECC memory. Now, being a PRO-series chip, this CPU is not officially sold to private customers. However, you can usually find it pretty easily on websites like eBay or AliExpress, for anywhere between 90 and 140 euros.

If you can't find this particular chip, the 4350G, with its 4 cores and 8 threads, is also a great option. I would personally avoid anything older than the 4th gen chips when it comes to the G-series CPUs, because Zen 2, which the 4th gen G-series chips are based on, is the first generation where AMD fixed the "crashing at idle" bug that plagued the Zen 1 and Zen+ chips. At the same time, if you don't care about power efficiency or video transcoding, you can pretty much use any 3rd gen or newer AM4 CPU that you can get your hands on. And no worries, you'll still be able to get video output from it, even if it doesn't have integrated graphics.

How does that work? Well, let me introduce you to our motherboard. This is the Gigabyte MC12-LE0: a microATX motherboard based on the B550 chipset, which comes with 6 SATA ports, two PCIe 4.0 slots and an integrated BMC with support for web-based KVM and remote management.

A BMC is basically a separate tiny ARM-based chip on the motherboard that works independently of the main machine and has its own Ethernet port. It can be used to turn your server on and off independently of the OS, change BIOS settings and fan speeds, and control the machine remotely, all from a WebUI. I bought this motherboard on eBay for 59€, and at first, I was like, what's the catch? Sure, there are some cheap AM4 motherboards out there, but you'd be pretty hard pressed to find any B550 board for 59€, let alone a server-grade motherboard with a whole ass BMC.
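To make the remote-control part a bit more concrete: BMCs like this one typically also speak IPMI, so you can poke at the machine from another box on the network. Here's a minimal sketch using ipmitool, assuming IPMI over LAN is enabled on the BMC; the IP address and credentials are placeholders for your own setup.

```
# Minimal sketch, assuming IPMI over LAN is enabled on the BMC.
# The IP address, username and password are placeholders.
ipmitool -I lanplus -H 192.168.1.50 -U admin -P 'yourpassword' chassis power status   # is the machine on?
ipmitool -I lanplus -H 192.168.1.50 -U admin -P 'yourpassword' chassis power on       # power it on remotely
ipmitool -I lanplus -H 192.168.1.50 -U admin -P 'yourpassword' sensor list            # read fan speeds and temperatures
```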

And yes, on some server and workstation boards with a BMC, you actually can't use the internal CPU graphics at all, even for things that don't strictly need a display output, like video transcoding. But on this board, provided you have BIOS version F14 or newer, there is actually an option in the BIOS that lets you activate the internal graphics chip. And here's the catch: the BIOS update process on this board is… slightly convoluted. Long story short, updating it via the BMC interface won't work; you have to do it through the EFI shell. I'll leave a link to a guide that I wrote outlining all the steps down below. Other than that, it's a good board with solid I/O, a nice WebUI for remote control, and a really compelling price.
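As for what the EFI shell update roughly looks like, here's a sketch of the general shape. The flash utility and BIOS file names below are placeholders, since the actual files come from Gigabyte's BIOS package; follow the guide linked below for the exact steps.

```
# Rough outline only -- the flash utility and BIOS file names are placeholders.
# 1. Copy the extracted BIOS package onto a FAT32 USB stick and boot into the EFI shell.
Shell> map -r                      # list filesystems; the USB stick usually shows up as fs0:
Shell> fs0:                        # switch to the USB stick
FS0:\> dir                         # check that the flash utility and BIOS image are there
FS0:\> flashtool.efi MC12-LE0.F14  # run the flash utility against the image (placeholder names)
```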

The board does, however, seem to only be available in Europe, but if you can't get this particular board, no problem. AM4 boards are cheap and plentiful on the used market, and most of them support ECC memory, regardless of the chipset. You will be losing the BMC functionality, but that's not the end of the world, especially with devices like the NanoKVM being available.

To cool our CPU, we'll be using the Thermalright AXP90-X36. As you'll see in a minute, we won't have a lot of room for the CPU cooler, and at a height of 37mm, the AXP90 is a pretty decent little cooler that should be able to handle our 65W CPU with no issues. The cooler includes a little tool for tightening the nuts on the back of the motherboard, and here, just don't go crazy on them and try not to overtighten the cooler.

As for the memory, we'll be using this 16 GB kit of unregistered ECC RAM from Kingston. This motherboard has been known to be a bit picky about RAM, so when in doubt, look at the QVL on Gigabyte's website.

Keep in mind that unlike some other motherboards, where the memory channels alternate first-second first-second, this board actually has the memory channels grouped together, so first-first second-second. Now, the reason we're going with 16 gigs of RAM and not 32 this time around is that this motherboard has 4 RAM slots and not 2. Not everyone needs 32 gigs of RAM on their home server, and if you do find yourself running out of memory, you can always upgrade by buying two more 8 gig sticks. On the other hand, if you find that even 64 gigs of RAM is not enough for you, you can install 128 gigs in this bad boy, as opposed to "just" 64 on an ITX board.

Now, regular viewers will know that I am obsessed with small form factor server builds: they're slick, they're compact, they can easily fit into your living room setup or on a shelf. That being said, let's face it, small form factor builds come with a lot of limitations. Mini-ITX motherboards are usually rarer, more expensive, and come with fewer ports and features than standard ATX boards. Likewise, SFX power supplies are usually more expensive and less power efficient at idle than their ATX counterparts.

If you remember the idle power efficiency spreadsheet that I compiled earlier this year, most SFX power supplies in the top 10 for 20W power efficiency are either impossible to find, discontinued, or cost a lot of money. The only exception to that rule when it comes to storage server builds are… cases, for some reason. There are a ton of really good mini-ITX NAS cases: the Fractal Node 304, the Jonsbo N2 and N3, the Silverstone SST-DS380, and many of the no-name NAS cases like this one from Eolized.

On the other hand, space-optimized microATX cases with 6+ hard drive slots are pretty much non-existent as a product category. Most microATX NAS cases on the market today are either so huge you might as well buy a tower case, or are limited to SFX or Flex power supplies. The only sub-20-liter microATX NAS case I've seen so far is the Jonsbo N4. Buuuuut unfortunately, it's just not a really good case. The wooden front panel looks nice, but it's also pretty fragile and is very likely to get damaged in shipping.

The internal layout is cramped, so when installing the power supply, the cables will press on the drives and squish the SATA cables. At the same time, the cable runs are longer than in your typical SFF case, and with some motherboard and PSU combinations, the 24-pin and EPS cables will end up being too short. The cooling is lackluster, with only one 120mm fan to cool potentially 6 hard drives and 2 SSDs, and there's no option to install an additional fan in the motherboard chamber.

So, I've looked far and wide for a potential contender, and I found this: the Sagittarius 8-bay NAS case from AliExpress. It can fit a microATX motherboard, a full-size ATX power supply, and 8x 3.5-inch hard drives. It's also got a dual-chamber cooling design with mounts for four 120mm fans in total, four full-size PCIe slots, and all of that at a similar footprint to the Jonsbo N4. It also fits perfectly into my TV shelf! And by the way, huge thanks to Peter Brockie for his video on both the Jonsbo N4 and the Sagittarius case. You should definitely check it out if you're choosing between those two cases.

All in all, I think that this case is a great compromise between an ITX-based NAS case, with its limited features and cooling, and a huge tower-style ATX case, with its, well, hugeness. My only gripe with the case is that it needs 4 Molex connectors for the SATA backplane. I've seen other cases with 8 hard drive slots that only need two, and I wish they had gone that way instead. Now, as of making this video, this case is only sold in China, and I personally bought it for 158€ on AliExpress... half of which was shipping costs! Is that a lot of money for a no-name case? Yes.

And believe me when I say that I did a lot of research on this, and at the time of filming this video, this is pretty much the best-value microATX storage-focused case on the market. Speaking of storage, the case has two 4-drive backplanes with a fairly simple mounting system. There are no drive caddies involved; instead, you basically screw these rubber nubs onto your drive, as well as this metal pull tab. You can then use the rubber nubs to slide your drive into the rails on the case, and use the metal pull tab to take a drive out.

Now, on this particular motherboard, 4 of the SATA ports are facing to the right, which won't give us a lot of room to plug stuff in once we put the motherboard into the case. Because of that, I'm actually gonna plug in the SATA cables before putting the motherboard into the case, and before installing the fans. We can then lower the motherboard into the case, and then route the SATA cables below the motherboard and into this window at the bottom; here, some thin low-profile SATA cables would definitely make your life easier.

Now, some people might ask, "why do you even need 8 hard drives in a home NAS, especially when SSDs are so cheap?" And the answer is: I don't.

I have three hard drives in my current home server slash NAS, as well as four 2TB SSDs and one SATA boot drive. And if there was a good space-optimized microATX case with three 3.5" and five 2.5" bays, I would use that instead. Some other people might ask, "why not just get a case with fewer hard drive bays and buy higher-capacity drives?" And the answer to that is: I don't expect you guys to buy a new set of drives every time you build a new NAS or a home server.

A lot of people have a mishmash of different-capacity drives at home, and being able to put all of your existing mismatched 4, 6 or 8 terabyte drives into one machine and build a home server with that is very important for a lot of people. For what it's worth, this motherboard doesn't even have 8 SATA ports, but that's something that can easily be fixed with a cheap HBA or a PCIe SATA controller. To help us cool our drives, I'll also be installing four Arctic P12 PWM PST fans.

These are amazing value at 26€ for a pack of five, and provide a really good balance of noise level and performance. The P12 PWM PST fans come with two connectors, one male and one female, which lets you daisy-chain the fans and removes the need for a separate fan hub or fan splitter. Now, our motherboard actually has six (yes, six) fan connectors, so we don't actually need to daisy-chain the fans, but it could be pretty useful on some other motherboards with just 1 or 2 fan connectors. I'll be installing two fans in the motherboard chamber, and two in the drive chamber.

As you can see, the mounts for the fans are literally just the same holes that the case uses for airflow, and they're perfectly spaced. Personally though, I would prefer if the case manufacturer shipped white screws for the fans, since the case doesn't look as clean with the metallic screws. But that's just a little nitpick. This is also the moment where you'd want to plug in and cable manage the USB connector, as well as the front panel connectors, or in this case, connector, singular.

This case only has a connector for the power button, and that's it. No power LED, no reset button, nada. For the PSU, I decided to go with the Cooler Master MWE 550 Bronze. It's a fairly inexpensive 550W power supply, and I chose it for several reasons. First off, it only cost me 44€, and second, it scored pretty well in the Cybenetics efficiency tests:

79.3% efficiency at a 20W load, which is just 1.6% lower than last year's PSU, the Corsair RM550x, which costs more than twice as much.

And sure, the MWE 550 is not modular and doesn't have a zero-RPM fan mode, but I think that for that price, we can forgive some missing features. The only problem with this particular PSU is that it has three Molex connectors, whereas we need four. Luckily, this can easily be solved with a Molex splitter. These are fairly cheap and have a way simpler construction than Molex-to-SATA adapters, which can be a fire hazard.

Before installing the PSU into the case, make sure to plug in the CPU power cable, since the PSU will cover part of the motherboard when installed. You'll also want to feed the Molex connector cable, as well as any connectors that you won't need, for example SATA, through the hole in the top of the case.

This will make cable management a bit easier. I'll also feed the PCIe cables into the window, since I'm not gonna be using any PCIe devices that need PCIe power. Now, as you can see, the power supply takes up quite a lot of space in the case, but thanks to how the CPU socket is placed on this particular motherboard, it's not covering the entire CPU cooler.

If you're planning to install a beefier CPU, however, it might be a better idea to use an AIO. And with some zip ties and elbow grease, the cable management doesn't look half bad, at least for a non-modular power supply. Now, last but not least, let's talk storage. For the boot drive, I got this 256 gig NVMe SSD off of Amazon for around 25€. 256 gigs should be plenty for a boot drive, and yeah, there's nothing more to say about it, I guess.

For the hard drives, I'll be using three 16TB Seagate Exos X18 drives. I bought those drives refurbished off of Amazon about a year and a half ago, and they've been working great ever since. They're quiet, cool, reliable and fairly power efficient. Honestly, what else would you even want from a hard drive? Now sure, not everyone is gonna want to buy a refurbished drive, and not everyone's gonna have a great offer for refurbished drives in the first place. So make sure to check out diskprices.com and see what drives are currently the best bang for the buck where you live.

There are a few kinds of drives that I would personally avoid, though. If you plan to use ZFS, avoid SMR drives. SMR is a technology that lets drive manufacturers fit more terabytes on a platter, but it also slows down writes dramatically, especially sustained random writes. This doesn't seem to be a problem with regular file systems or even more traditional software RAID, but ZFS performance seems to suffer A LOT from SMR drives. These days, even drives that are marketed as "prosumer" or "business" drives can be SMR.

For instance, Western Digital's Red lineup has the Red Pro and Red Plus, which are not SMR, but it also has plain Red drives, which are. So definitely do your research. If you appreciate peace and quiet, don't buy older enterprise 7200RPM drives, especially the ones filled with air and not helium. The same goes for Seagate Exos models older than the X18; these are very loud. Air-filled drives also tend to run hotter than helium-filled drives, so you will have to run your fans faster, resulting in more noise. Other than that, do your research and make sure you get the drive setup that works for you.

Apart from my 3 Seagate drives, I'll also be using four 2TB SATA SSDs, which I'm gonna put in the hard drive slots using 3.5-inch to 2.5-inch adapters. Obviously, you don't necessarily need 4 SSDs for your home server/NAS setup. I'm a content creator, and I do a lot of video editing over a 10 gig connection, so for me, having a lot of relatively fast storage is really important. For any other use case, however, a couple of 1 or 2 terabyte drives is probably gonna be more than enough. You can use them as a ZFS special device in TrueNAS, or set them up as a cache pool in Unraid, and using SSD storage in your home server can drastically improve your read/write performance.
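For reference, if you're managing ZFS from the command line rather than through the TrueNAS UI, adding a special vdev looks roughly like this. The pool name and device paths are placeholders, and the special vdev should be mirrored, because losing it means losing the whole pool.

```
# Sketch only -- "tank" and the device paths are placeholders for your own pool.
# Add a mirrored special vdev (metadata and small blocks) to an existing pool:
zpool add tank special mirror /dev/disk/by-id/ata-SSD_ONE /dev/disk/by-id/ata-SSD_TWO

# Optionally also store small files (up to 128K) on the special vdev:
zfs set special_small_blocks=128K tank
```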

This case also has mounting holes for two 2.5" drives at the bottom of the motherboard chamber, so you don't necessarily need to use 3.5-to-2.5-inch adapters like I did here. Now, another nitpick I have with this case is the fact that it uses regular separate SATA connectors for the backplane, instead of a unified SFF-style connector. So, as I already mentioned, I would highly recommend buying some kind of thin low-profile SATA cables, like these ones from AliExpress.

These will drastically reduce the cable clutter. Now, as I mentioned before, this motherboard has two PCIe slots, so we can extend the functionality of our home server even further. If you need fast networking, you can add a 10 gigabit network card, like this Intel X710-DA2. If you need better video transcoding performance in Plex or Jellyfin, you can add something like this Intel Arc GPU. And if you want to use all 8 hard drive bays, you can add a PCIe SATA controller, like this one.

Basically, the world is your oyster. Personally, I added a 10 gigabit SFP+ card from Intel and a 6-port PCIe SATA controller, for 12 SATA ports in total. And with that, our home server build is complete. It's powerful, it has lots of space, and it looks gorgeous... once we put the side panels on and hide all the cable clutter. But the best part is the silence.

If you're coming to this server after years of running enterprise rack servers, this will be music to your ears. But what about the power consumption? Well, although this build doesn't win any contests in terms of idle power draw, it still doesn't disappoint, drawing just 12-15W at idle with no storage devices connected apart from the boot drive. In this case, "idle" means that the computer is running some services and Docker containers, but has no intensive tasks or workloads running. Of those 12-15W, 3.3W is consumed by the BMC, which stays powered even when the machine is turned off but still plugged into the wall. With my entire storage and networking setup connected, so 3 hard drives in spin-down mode, 4 cache SSDs and one boot drive, a 10 gig networking card, and an external SATA controller, this machine draws 30 watts from the wall at idle.

Finally, with all hard drives spinning, the entire system consumes 37 watts from the wall. Running 24/7, 30 watts works out to about 22 kWh a month and 37 watts to about 27 kWh, so at a price of 30 cents per kWh, this machine will cost you roughly 6 to 9 euros a month. To reach those numbers, I enabled all the settings that had to do with C-states in the BIOS, ran `powertop --auto-tune` from the OS, and also explicitly enabled ASPM on all devices that support it.
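Here's roughly what the software side of that looks like, as a sketch rather than the exact script; whether the power-saving ASPM states actually stick depends on your devices and on the BIOS.

```
# Rough sketch of the idle-power tweaks (run as root), not the exact script from the description.
# Apply powertop's suggested tunables (USB autosuspend, SATA link power management, etc.):
powertop --auto-tune

# Prefer power-saving ASPM states on PCIe links where the hardware allows it:
echo powersave > /sys/module/pcie_aspm/parameters/policy

# Check which links actually negotiated ASPM:
lspci -vv | grep -i aspm
```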

I'm gonna leave a link to the script I'm using down in the video description. Now comes the big part: let's talk about the price. At the time of making this video, this build cost me just under 500€ after tax. Feel free to pause and look at the individual components. To put this into perspective: at half the price of Synology's latest 6-bay NAS offering, you get four times the CPU performance, four times the RAM, and twice the PCIe slots. And even though our machine technically has 6 hard drive slots out of the box, you can upgrade it to 8 3.5" slots and 2 2.5" slots for less than 50 bucks, without buying any expansion modules.

And sure, our build is a bit bigger than your typical off-the-shelf NAS, but is it 500€ bigger? Well, I'll let you decide. Now, if you don't care about the form factor, you can actually make this build even cheaper by getting a random old ATX case from your local electronics recycler and simply 3D printing some 5.25-inch-to-3.5-inch bay adapters. Still, I think this case really brings this build together: it gives you a ton of space for storage at a relatively small footprint, and it also has great ventilation, something that's often missing on your typical office PC case.

Now, the last thing we're gonna do is log in to our remote management panel and change the fan settings. The default fan curve is a tiny bit steep, and we also want to separate our chassis fans from our CPU fan. Once you plug a LAN cable into the management port on your motherboard, you should be able to see the IP address of the BMC in your router's WebUI. That is, if you're a normal human being who doesn't run NixOS on their router. Me, I'm just gonna run arp and look for the MAC address of the networking card.
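In case you want to do the same, it's a one-liner; the MAC prefix below is just a placeholder, so substitute the actual MAC of the board's management NIC (or a recognizable chunk of it).

```
# Sketch: find the BMC's IP by its MAC address ("aa:bb:cc" is a placeholder prefix).
ip neigh show | grep -i "aa:bb:cc"
# Or with the classic tool:
arp -a | grep -i "aa:bb:cc"
```

Now we need to log in.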

The default username is admin, and the password is the part of the motherboard's serial number after the last slash. You can usually find it on a sticker in the middle of the motherboard. And don't worry, you can change the password later.

After that, we're gonna go to Settings, Fan Profiles, and copy the default fan settings. And here, feel free to configure the curve to your liking. Personally, here's what my CPU curve looks like, and this is my chassis fan curve. Unfortunately, the BMC's fan controller is not supported by lm-sensors, so you can't control the fan speed from the OS itself, which is a shame, because controlling the fan speed based on HDD temperature with something like hddfancontrol would be really useful. However, the hddfancontrol author has mentioned that they're working on a way to control fan speeds using an external script, so that could solve the problem.

Okay, so now that we've built the server and made it practically silent, what about the performance? What can this machine do? Well, I've already put the Ryzen 4650G through its paces in this video, so I won't dig too deep into it today. But long story short, the 4650G is very capable and will handle pretty much any home server task you can throw at it: media playback, Docker containers, game servers, virtual machines, you name it. It's also way more powerful than the server we built last year, which was based on an Intel N5105 CPU.

So what do you do after building yourself a storage-focused home server? Well, you guys have asked me for a software version of this video for a long time, and the truth is, that's actually a way more difficult video to make! I've scrapped at least three scripts while trying.

But this time, I'm actually making a "how to build a home server" video from a software perspective! If the video is already out, you should see a link up here, but if it's not, feel free to write a comment calling me a poopyhead and asking when the video is coming out. Anyway, that's gonna be it for this video. I hope you guys enjoyed it, and as usual, I would like to thank my Patrons.
