Techno Tim HomeLab Server Room Tour! (Late 2022)
It's that time of year again: time for my annual home lab and server tour, where I walk through all of the network and server systems I have running here in my home lab. Things have changed quite a bit.
If you recall from last year's tour, I revamped my server room to be, well, more of a server room. It went from a simple storage room to a centralized room for all of my networking, servers, and technology. I thought for sure when I completed this that I wouldn't have to change anything. But you know how it goes.
Labs are ever changing. This year I've added more networking, more servers, more cameras, better cable management, better UPS power management, a new rack, and even more RGB, if that's even possible. This year has been all about taking my home lab to the next level and setting up some things for next year. So without further ado, let's go check out my home lab. So here's my new and improved home lab and my server room.
And over here is a bunch of stuff that really hasn't changed. Over here I have my fiber modem. Then I have my patch panel that connects to all of my devices around the house and then connects into my server rack, and we'll go into that here in a second. My Raspberry Pis are still kind of doing the same thing.
I'll talk about those later on in another video when I review my services. Two Hue hubs: they were getting close to the limit for devices you can have on one hub, so I put in another one. Then I have a mini UniFi Flex PoE-powered switch right there, which basically just gives me VLANs to all of these devices. My Pi Zero is still doing the ping and the Wake-on-LAN for all of these servers, and then my HDHomeRun records my TV; it connects to Plex and to an antenna up in the attic, and that's how I watch all of my TV, especially PBS. And this whole entire panel is connected to this surge protector right here.
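The Pi Zero's Wake-on-LAN job is simple enough to sketch. Here's a minimal, hypothetical version (the MAC address and anything around it are made up for illustration, not the actual setup): a WOL "magic packet" is just six 0xFF bytes followed by the target machine's MAC address repeated 16 times, broadcast over UDP.

```python
import socket

def make_magic_packet(mac: str) -> bytes:
    """Build a Wake-on-LAN magic packet: 6 x 0xFF followed by the
    target MAC address repeated 16 times (102 bytes total)."""
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    if len(mac_bytes) != 6:
        raise ValueError("MAC address must be 6 bytes")
    return b"\xff" * 6 + mac_bytes * 16

def wake(mac: str, broadcast: str = "255.255.255.255", port: int = 9) -> None:
    """Broadcast the magic packet on the LAN (UDP port 9 is conventional)."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(make_magic_packet(mac), (broadcast, port))
```

A cron job on the Pi could call `wake("aa:bb:cc:dd:ee:ff")` with each server's MAC (hypothetical address shown) on a schedule.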
This is all backed up by battery power from this. And then over here, it's just room to grow. I don't even know if I'm going to grow. I kind of hope I don't. But I have the surge protector there now on battery backup and ready to put things on the wall.
So and then if we chase this down, like I mentioned, I have this umbilical cord that goes up and it goes inside of my rack over there. And you can see my professional placement of this access point. This was kind of here for testing, but it's actually doing okay, pointing up from the basement and serving some of my devices. So I need to find a more permanent place for it.
But this is as good as it gets for now, for one in the basement. Let's come back here and look at the server rack. So this is my new server rack. I covered this in a previous video, but it's a Sysracks server rack. It's fully enclosed. I don't know if you can tell, but there's glass right there and it has a door here on the front.
Having this enclosed does wonders for the sound. I don't know if you can hear, but it's a lot quieter now than it was before. And I have more devices in here now.
The only downside is that it traps heat, so you can see it's 82 degrees in there. I have this fan kick on when the thermostat in here reaches 85; it'll blow that heat out and it stays on for literally 30 seconds, because it just pulls that hot air right out, and then the fan kicks back off. And this sensor isn't really touching anything; it's actually hovering right up there next to what's giving off a lot of heat. So this reading is kind of deceiving.
It's really not that hot everywhere in there; it's just that hot where the sensor is placed right now. So anyways, let's open this up and see what's new, changed, and improved. Okay, so for my patch panels: I don't know if you saw this in a previous video, but earlier this year I upgraded my PoE switch, which is in the middle. I had a 24-port, which is now down there and not being used (it will be soon), but I replaced it with a 48-port PoE switch, and this one is pretty awesome.
I may or may not have done it just so I can have 48 ports so that these line up properly. No, I'm kidding. Actually, that one had 24 ports and they were all in use, so I was running out of ports and decided to buy another switch that has 48 ports, all PoE.
So I don't have to worry about it, and I can connect all of my devices. It also has four 10-gig ports right here; I have a couple in here for testing, and then an uplink down to my UDM. So now I have two of these patch panels with pass-through keystones on them, wired on the back and looped down.
So I don't know, it's kind of nice having a switch in the middle and your two patch panels above and below so you can wire them up neatly. This is the first of many unpopular opinions: networking in the front.
If you don't like networking in the front, you're going to see a lot more you probably don't like in the front either. But let's move down. So next is this device, my UDM SE. This isn't my UDM Pro; I sold the UDM Pro to a friend and upgraded to a UDM SE.
Honestly, I didn't need to upgrade. It does give me some additional PoE ports, which is nice, but really I wanted to upgrade because it has 2.5 gigabit on the WAN. Not that I'm getting 2.5 gigabit internet any time soon.
But you know, upgrades. It drives a lot of cameras, drives my VLANs, drives everything; it's really the hub of my whole entire network. And you can see over here it has a ten-gig uplink to my switch, and then it has another uplink to port 24. Port 24 is my WAN, which goes up and into my fiber modem. So that red one is my WAN.
It's the only one that's a different color, and that's just so I know, hey, this is the WAN, if I'm in the back messing with plugs. So then we see this uplink going to this device right here.
So let's talk about this. Another unpopular opinion — probably a hugely unpopular one, because it's rarely ever done, or maybe you haven't formed an opinion yet but you won't like it — which is power in the front. The reason why I did this was a couple of reasons. One, I think it looks super nice.
I like having the power in front to be able to easily add or remove devices. It did add some complexity in the back for wiring, but I think it looks nice. This PDU Pro is good looking, and it's a shame to put it in the back where you can't see it.
I don't know. Maybe I'm weird. Let me know if you think power in front like this looks okay. But I did find these plugs right here that made it a little bit more acceptable. All these are turned at a 90 degree angle and they point down. So then, because I have that, I had to find these brackets right here, these brush panels that were big enough to fit a power adapter through them, either this side or the other side.
It was a little more challenging than I thought, but I found them and they work perfect. So if you want some of those, I'll leave links in the description below. But anyways, this thing is awesome. I can plug in so many devices into here.
I think it's a total of 16 or 17, maybe more, maybe 24. I'll have to do the math here in a little bit, or count. You can plug in four USB-C devices, and then one, two, three, four, five, six, seven, eight, nine, ten, 11, 12, 13, 14, 15, 16. Okay, 16. My motion detection up there isn't detecting me, so I need to move more so that doesn't happen. But anyways, you can plug in up to 16 devices into here.
You can remotely toggle these on and off, which is really nice, and on top of that you get metrics for how much power they're drawing. So when I was looking for a PDU, I started looking at them and they were kind of expensive to begin with. And then I thought, hey, I want some smart remote ones. Then I saw UniFi had them, and you can see I already have some UniFi devices.
So I said, hey, what the heck? And the super nice thing, like I mentioned, is that you can toggle the power on and off of all of these devices. Right now my network devices are plugged into here, because all of this is running off of battery backup all the way down there, but that won't always be the case. These will soon be running off a different UPS where I can toggle them on and off separately, because I want to separate my network devices — the power for cameras, access points, switches, and everything else — from my compute machines. Worst case scenario, if I had to shut things down, I'm going to shut the compute down first, so that I can still power my home security and lights and everything else over here.
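That "shut compute down first" priority can be sketched as plain logic. This is a hypothetical sketch — the outlet names and runtime thresholds are made up, and the real setup would toggle outlets through the controller — but it captures the idea: shed compute nodes early and keep network and security gear alive until the battery is nearly gone.

```python
# Hypothetical load-shedding priority for a PDU on battery power.
# Outlet group names and thresholds are illustrative only.
SHED_ORDER = [
    ("compute", ["hydra", "dorigo", "storinator"]),  # shut down first
    ("network", ["switch-48-poe", "udm-se"]),        # keep alive longest
]

def outlets_to_shed(minutes_remaining: float, threshold: float = 30.0) -> list:
    """Return outlet names to power off, highest-priority first.

    Compute nodes are shed as soon as estimated runtime drops below
    `threshold` minutes; network/security gear (cameras, APs, switches)
    is only shed once runtime falls below half of that."""
    groups = dict(SHED_ORDER)
    shed = []
    if minutes_remaining < threshold:
        shed += groups["compute"]
    if minutes_remaining < threshold / 2:
        shed += groups["network"]
    return shed
```

So with an hour of runtime left nothing gets shed, at 20 minutes the compute nodes go, and below 15 minutes the network gear goes too.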
So anyways, it's a pretty cool device. If you want to see a video on it, I'll gladly show you. And then it has this, which I think is basically to be able to fail over or load balance two UDM Pros, so you can have not only failover internet but failover UDMs that go into here, with the back being your internet connection going in there. Anyways, I haven't played with that because I only have one UDM, but if you want to see a video totally on this, let me know; I'm sure I'll put one together.
So next down here is my old 24-port switch. I'm not using that anymore; it's going to go somewhere else, so not much to say about that. But this switch right here replaced this one, and I thought about selling it, but I actually think I'm going to repurpose it. I don't see the point of selling it at a steep discount
and then buying something similar later on; I might as well keep it. And this UPS here is going to be the one that powers all of my networking devices. I don't have that fully hooked up yet.
Once I do, I'll be able to toggle on all of these devices via this, and they'll all be on battery backup via this as well. Okay, so now let's talk about compute. On the top I have my two 1U servers; these 1U servers are my Supermicro servers. I haven't changed anything about these at all.
They've been exactly the way they have been since almost the day I installed them. I mean, I doubled the RAM shortly after, but I haven't touched them; they've just been running solid nonstop. Both of these 1U servers have an Intel Xeon.
I think it's the E5-2684, which is 2.5 gigahertz and has 14 cores and 28 threads. It's a single socket; each of these is single socket. I did that purposely because I didn't want to draw a ton of power.
They do draw a lot of power, but not as much as dual socket. And it's a newer platform, relatively efficient, but it is a server, so it's going to draw a little more power. They both have 256 gigs of RAM and they're basically my hypervisors.
So these are two of my Proxmox nodes: the top one, Dorigo, runs nine VMs; the bottom one, Hydra, runs 17 VMs. So there's a lot of testing going on in that one, and I'll deep dive into those later on in my services tour. Next is this Storinator.
This is a Storinator. So this Storinator is kind of in use right now; it's getting ready to take on some new roles, but I haven't done any upgrades to it besides lights and fans and power supplies — and that was actually a downgrade, just so it'd be a little bit quieter, a little bit prettier, and draw a little less power. And really, you wouldn't even do that if you were buying one for your organization.
But I'm at home and this is my server, so that's what I did. But no changes otherwise. It's an AV15, and it has an Intel Xeon Silver — it's the 4216: 16 cores, 32 threads, at 2.1 gigahertz. It's a Supermicro board inside, but that's all OEM from 45 Drives. It has an HBA, it has dual gigabit NICs and dual ten-gigabit NICs as well, and it has 128 gigabytes of RAM.
And then as far as storage goes, it has enterprise hard drives: seven 14-terabyte Seagate Exos drives, a raw capacity of almost 100 terabytes. I think I'm going to move my NAS to here, but I haven't fully decided yet.
I might still use Proxmox on it, I might still virtualize the NAS, but we'll see. And down here is my older PC conversion. This is the PC that I used as a desktop, but it's just an older Intel Core i7.
It still has virtualization capabilities, so I'm going to continue to use it. It has two video cards in there now that I was using both to convert video and to do some streaming, but for the most part I've just been using it as another backup system. It has lots of hard drives in there, and it backs up a lot of the things you see in here at night: it wakes up, backs up, shuts down. But I might find a different role for it, maybe put the hard drives in there. Haven't decided yet. Okay.
As far as storage goes, one of the last things is this disk shelf. This disk shelf has been here a while. I haven't touched it besides moving it into this case, but it still has all of my drives. This whole entire array is passed through to TrueNAS, which is virtualized inside of this one.
And the controllers pass through down to here. So I have my TrueNAS ZFS pool running on these drives, which holds all of the data for the whole network. I might be getting rid of this.
I'm not sure yet. It does draw quite a bit of power. Now that I have this PDU Pro, I can actually see how much power each individual device is pulling, and I think with all of these drives, this is pulling about 120 watts — and this machine pulls less with more drives and compute in it.
So I think I might get rid of this disk shelf, or maybe keep it somewhere else, and then move the data into here. We'll see; haven't decided yet. Future video. And then way down on the bottom is my UPS.
So this is a Tripp Lite UPS, a rackmount one, and it's super duper nice. I actually have a battery backup extension on the back; I was able to mount them facing back to back so that I didn't take up another 2U of space. It's probably not recommended, because there's only a four or five inch gap in between the two, and you don't want to be digging around in between those in the dark.
But I know it's in there and I've run all of the power cables I need to out of there. So I did it back to back. Not recommended for a corporate environment, but fantastic for a home lab because I wanted to conserve some space.
So now that we see that, let's take a look at the back. You can check out my cable management and I'll show you how everything is wired up. Okay. All of this connects down here — I call this the umbilical cord — going up into the server rack from the top; it comes down here and then runs down this channel to all the devices. So the wiring's
not the best. I mean, I'm kind of proud of it; it's a lot better than what I had before. But if you could see up there, it's a little tangled, and I'm going to sort that out a little bit. But for the most part, all the cables are bound together.
And then what I did was I got all of these clips and created these channels going all the way down so that the cables stay pretty organized, and in between I used some Velcro. I mean, this isn't a super fancy Tom Lawrence job or a Mactelecom Networks job; those guys are experts, not me. This is a software engineer's way of wiring.
But I think it turned out okay. Definitely better than it was before. But as I mentioned, I decided to do network on the right and then on the left I did power.
And you can see I used these same channels all the way back here to run power. So there's the unpopular decision of running all the power from the front to the back, but I think it turned out okay with these. And so now I have all of my power here; it comes down, it's all labeled, and then it goes down into each respective system.
And you can see I have quite a few labels — I went crazy with the painter's tape just so I knew what each device was, and then I thought, eh, I'll leave it. Anyways, up at the top here, I just have this cheap PDU that came with the server rack.
It's good enough, but up here are devices that aren't critical. So you can see my two Philips Hue light strips are plugged into there, and I have the fans. Arguably the fans are critical, but at the same time, if it goes to battery power long enough, I'm going to start shutting stuff down, so I didn't really want to spend the power on the fans.
I might change my mind — convince me otherwise in the comments. Another quick callout is the lights in here. I know I use a ton of RGB, but now you can actually see the utility of it.
When I have this RGB on in just clear white, I can see really well in my server rack all the way around; every corner is covered, except when you get down here it's a little less so, but it still looks okay. That might be a little dark on the video, but it's more than enough light to see what I'm doing. So even though a lot of times I change them to pink and blue and purple, most of the time I'm using them in a normal white because I'm working on my servers. I mentioned this in the first tour, but I have a motion sensor up there that, when it detects motion, turns on the lights in this room as well as in here, and it defaults to white light.
So as soon as I walk into my server room, I can see right in there; when I leave and there's no motion for 5 minutes, it turns everything off. And that's the same with the fan on the Storinator up front that I didn't mention — that light doesn't always stay on either. After motion isn't detected for 5 minutes, it'll turn off the fans within the Storinator as well. That's because I hooked it up to a Zigbee controller, but that's for a different video.
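The 5-minute motion timeout described above is easy to model. Here's a minimal sketch (the on/off callbacks and the polling loop are assumptions — in reality this logic lives in the smart home hub and Zigbee controller): motion turns the device on, and a periodic check turns it off after 300 seconds of quiet.

```python
class MotionAutoOff:
    """Turn a device on when motion is seen and off after a quiet period.

    The 5-minute (300 s) default mirrors the behavior described above;
    the `on`/`off` callbacks stand in for whatever actually controls
    the lights or fans."""

    def __init__(self, on, off, timeout_s: float = 300.0):
        self.on, self.off = on, off
        self.timeout_s = timeout_s
        self.last_motion = None
        self.is_on = False

    def motion_detected(self, now: float) -> None:
        """Record motion at time `now` and power on if currently off."""
        self.last_motion = now
        if not self.is_on:
            self.on()
            self.is_on = True

    def tick(self, now: float) -> None:
        """Call periodically; powers off after timeout_s with no motion."""
        if self.is_on and self.last_motion is not None \
                and now - self.last_motion >= self.timeout_s:
            self.off()
            self.is_on = False
```

A hub would call `motion_detected()` on each sensor event and `tick()` every few seconds with the current timestamp.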
So anyways, as you can see, I think I did an okay job wiring. You can see up there network cables going down to Hydra underneath, network cables going down to the Storinator there, network cables going down, so on and so forth — same with my PC conversion. And then for my disk shelf, it's just power, and I only have one power supply plugged in; that's really just to save power.
But if that one dies, I'll be in big trouble. At least I have a backup; I just didn't want to run them both at the same time, to conserve power. And the same goes for the power side in general: I noticed that most power supplies on cases or servers are on the left, so I decided to put all the power on the left.
So these are all marked going down and up here into this channel. The channel's getting kind of fat right here, but I think it's good enough for now, and when I add devices here, it's going to add some more bulk to this.
But I think I can squeeze these open a little bit more. These are super nice — let me get you a close-up — and if you don't have these, you should grab some. These can bolt on anywhere; anywhere you can get a nut, you can bolt these on, and you can fit cables in there and it keeps them really nice and tight.
And you can even bend this if you want a more open or closed and then get your wires in there and they kind of stay out of the way. And then down here is the UPS battery extension. So this is an extender that's connected to the first set of batteries up front. And so this gives me more runtime. Last time I checked, I could get over an hour of runtime on these servers with this battery unit, this extension along with the front one. That's pretty awesome.
A lot better than the 9 to 10 minutes I had before, but I'm going to see if I can get some more runtime out of that. I don't know — I'll be adding devices, which will take away from runtime, but I'll also be removing deprecated machines, which should give me a little more capacity and a little more runtime.
And the same as in the front, the back also has a door. You can probably hear the noise went down considerably — I can close this door and I'm not yelling anymore. It got pretty quiet, and that's why I wanted an enclosed case. So this was my home lab tour. What did you think? Too big? Out of control? Crazy?
Let me know in the comments below. In 2022, I expanded quite a bit. I ran out of rack space this year, and so it forced me to get a new rack, which then forced me to reconsider a lot of my options and start to shuffle things around.
But don't let this discourage you from building your own home lab. It doesn't need to be as big and as crazy and as power hungry as this. As I always say, it can be as small as a Raspberry Pi.
It can be an old used PC, or it can be virtual machines on the desktop you might be using right now. All of these are great places to learn about technology. In 2023, I have a lot of things I'm excited to share with you, some of which will be going right in here — that's part of the reason why I did expand. But before we talk about 2023, I should talk about 2022, and really thank you all so much for making this channel what it is.
This year, again, has been another incredible year for my channel. And I couldn't have done it without you. Seriously, I mean it. But thank you so much for sharing this with friends and colleagues and coworkers because it's really helped the channel tremendously.
And you might not know this, but I learn just as much, if not more, from you guys in the comments and in Discord than you probably do from me. So I'm forever grateful for that, too. Thanks again for all your support. And remember, if you found anything in this video helpful, don't forget to like and subscribe.
Thanks for watching.