I diagrammed out my home lab/home server setup, mostly to keep a complete overview of how everything connects. I didn't want to get bogged down in aesthetics around colour scheme or layout -- as you can no doubt tell. After a while, the diagramming started to feel like a meme where I was trying to convey some crazy conspiracy theory on a wall of pinned paperwork and connecting threads. I think I am done documenting everything. But now I am wondering how obsessive I should be about detailing every little thing, like VLANs and IP assignments. I don't really care if it looks like a dog's dinner; I really just care about "okay, where does this wire go to?" Is that the right approach?
This is how it works for me. I am using the homelab to learn new things. Part of that learning process is getting things into production and maintaining them, because managing a production environment is one of the things I want to learn.
The meaning of "homelab" has changed over the years. Originally it was literally just having the hardware you'd find in the lab at home, e.g. you were taking classes for a CCNA, and instead of going to the school's lab for hands-on time with the hardware, you'd just replicate the setup at 'home'. Nothing in the setup would be relied on beyond the specific thing you're testing in the moment. If you're going to stick to the original intent of the name, anything beyond "lab" use wouldn't be "homelab".
Now it skews more toward meaning anything you're using to learn the technology, even if you're using it as the equivalent of production and rely on it being up as part of your daily life.
You diagram? I just keep throwing crap into the mix and trying to remember which VLAN and IP scheme it's supposed to use and which device has access. Order is for work; chaos is for personal enjoyment.
I am working on the build log for the jukebox project. It'll be on GitHub eventually.
I have a $700 Tyan motherboard in my workstation. When I was moving the motherboard from one case to another, I scraped the underside of the motherboard against the metal case and broke off a number of small SMT caps and resistors. In the middle of the pandemic. In the middle of a project. So I had to jump on Amazon and have a new motherboard shipped to me next day whilst I RMA'd the damaged one. What do you say when you break your workstation motherboard in a moment of casual clumsiness? "Well... fuck!"
The jukebox is a "retro" jukebox: wood grain, lots of tactile buttons. Two 14" 4160x1100 pixel touch screens with a vacuum fluorescent display graphic effect show the tracks. Click an arcade button, play that track. So it looks like those old-style jukebox devices you'd find in a diner. There are also two 1920x1080 flexible touchscreens (though I have them encased so they are just permanently curved touchscreen displays) that let you navigate the full library, show album artwork, provide a search box, etc. It's all driven by a single Raspberry Pi with a 4TB USB SSD for storage, and everything syncs to the music directory on my Synology NAS.
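The sync itself is nothing fancy: a scheduled one-way pull so the Pi always has a local mirror of the library. A minimal sketch of the idea in Python, assuming rsync over SSH with key auth already set up -- the hostname and paths are placeholders, not my actual layout:

```python
#!/usr/bin/env python3
# Minimal sketch: one-way pull of the music library from the NAS to the
# jukebox's local USB SSD. Assumes rsync is installed and SSH keys are in
# place; the hostname and paths are hypothetical placeholders.
import subprocess
import sys

NAS_SOURCE = "nas.local:/volume1/music/"  # hypothetical Synology share
LOCAL_DEST = "/mnt/jukebox-ssd/music/"    # hypothetical USB SSD mount point

def sync() -> int:
    # -a preserves metadata; --delete keeps the local copy an exact mirror.
    result = subprocess.run(
        ["rsync", "-a", "--delete", "--partial", NAS_SOURCE, LOCAL_DEST]
    )
    return result.returncode

if __name__ == "__main__":
    sys.exit(sync())
```

Run it from cron nightly and the jukebox keeps playing from the local SSD even when the NAS is unreachable.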
Am I in deep shit? Only if I don't clean the litter box in the "Cat Bathroom." So the only thing that can really go wrong is the power going out. Everything else is sort of redundant, and you can route around it by moving a few cables. I guess the UDM Pro SE taking a shit would cause me some issues. Or the cable modem. Everything else, though used daily and to its fullest extent, simply means those services, e.g. the music server, become temporarily unavailable. No real disasters in over 20 years. The backup Synology NAS is effectively a failover for essential services, e.g. AdGuard, but even if both Synology devices are down, there's backup DNS resolving on the UDM and also with Quad9.
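The DNS chain is the easiest piece to picture. Here's a minimal sketch of that fallback order using dnspython -- the private addresses are hypothetical stand-ins for the two AdGuard instances and the UDM; only 9.9.9.9 (Quad9) is a real public resolver:

```python
# Minimal sketch: walk a prioritized list of resolvers and return the first
# answer. The private addresses are hypothetical stand-ins for the two
# AdGuard instances and the UDM; 9.9.9.9 is Quad9's public resolver.
import dns.resolver  # dnspython

RESOLVERS = ["192.168.1.2", "192.168.1.3", "192.168.1.1", "9.9.9.9"]

def resolve_with_failover(hostname: str) -> str:
    for server in RESOLVERS:
        resolver = dns.resolver.Resolver(configure=False)
        resolver.nameservers = [server]
        resolver.lifetime = 2.0  # give up quickly and try the next one
        try:
            return resolver.resolve(hostname, "A")[0].address
        except Exception:
            continue  # resolver down or query failed; fall through
    raise RuntimeError(f"no resolver could answer for {hostname}")
```

In reality clients get that ordering from DHCP rather than from a script; the sketch just shows why losing both Synology boxes still leaves working name resolution.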
Is it 88 cores? 4 in the NUC, 28x2 in Storm, 28 in oh-fuck. Never really thought of it that way. I'd like some 8490H Xeons but I cannot justify it right now.
I don't see a single other person mentioning it, so I'll just say it: 52TB of flash storage alone is enough to make me jealous. 52TB of flash storage in an RV is just a few more layers on top.
Sad that the picture-wall project repository isn't public on GitHub -- I'd hoped to see it in action. It seems very neat.
When I started doing informal change-control reviews with the family, scheduling disruptive work outside of peak windows to avoid "user impact", and keeping critical hot spares available, haha.
Yes, I love the EQ references. Still my favorite MMO TO DATE, pre-Luclin.
Only game where a death would make me pretty late for work.
Having all the clients in your diagram is a bit much, but it’s your diagram. We have over 1M client endpoints and just call them "users", or we used to -- things change. Love your setup, though. My home lab has been in use for years. My powered-off servers and switches could make another lab for someone, but they're too much power draw for me.
It stops becoming a "homelab" when it's impacting your family.
My entire "homelab" is contained in one box -- if something fails on it, no one cares; it won't impact anyone. My home network is just a simple mesh setup.
This is something I don't think a lot of people consider. If you're the one who set this up, and you expire tomorrow, do you really want the burden of figuring it all out to fall on your spouse/partner/kids???
Diagrams are one thing, but cable traces are another. Have you thought about using something like RackTables or NetBox to create a detailed source of truth and document device inventory, their ports, and the connections between everything?
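Once it's populated, "where does this wire go?" becomes a query instead of a wall of threads. A minimal sketch using pynetbox, NetBox's Python client -- the URL, token, and device name are placeholders, and it assumes the devices, interfaces, and cables have already been entered:

```python
# Minimal sketch: list a device's interfaces and whether each port is cabled,
# using pynetbox (NetBox's Python API client). The URL, token, and device
# name are hypothetical placeholders.
import pynetbox

nb = pynetbox.api("https://netbox.example.lan", token="0123456789abcdef")

device = nb.dcim.devices.get(name="udm-pro-se")  # hypothetical device name
for iface in nb.dcim.interfaces.filter(device_id=device.id):
    if iface.cable:  # cable record linking this port to its far end
        print(f"{device.name}:{iface.name} -> cable #{iface.cable.id}")
    else:
        print(f"{device.name}:{iface.name} -> unpatched")
```

The advantage over a static diagram is that the same data can answer port-level questions and be re-rendered as a topology view whenever things change.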
If you can't power it off without losing services or data then it's not a lab. Don't get confused by this sub where people call their home infrastructures "lab".
It’s a homelab when I’m just screwing around with some new software to test it out. It becomes a homeserver when I need to schedule maintenance windows to update things.
That’s part of the reason why I scaled back a lot of the stuff I had at home a few years ago. I maintain enough of this stuff at work. I don’t want to do it at home.
I feel like this is much more than many people on here are ready to undertake.
Also, "homelab" is a kinda vague designation, so it stops being one when you say so. I know people who call a NAS running a couple of containers their homelab, so go crazy.
A lab is a testing space, a playground, something that can be brought up and down and broken and fixed at will. It will be destroyed and rebuilt frequently.
As soon as it stops being possible to do that without someone (even if just yourself) getting annoyed that a service or functionality isn't working, then you've graduated to homeproduction/homeserver/homedatacentre (depending on its size!).
The growth has been purely organic. I cannot say any of it is really planned ahead of time. I use 16U vertical rails for each rack, and then build a cabinet around them that works for the space it is in, e.g. 32U in the cat bathroom rack, which is 16U side-by-side with another 16U. The arcade cabinet rack is technically 16U, but I only have 6U of rails in there, as the other space is pull-out drawers to make it easier to work on the workstations without having to deal with cabling issues. 16U at the RV.
For permanent infra, I tend to buy new, because I want that extended warranty and am not interested in buying somebody else's problem. For projects, it is a mix of eBay finds and roadside or e-waste center salvage. I don't watch TV, but I probably own more 55" 4K TVs than any one person I know, because I salvage them (people in big cities throw out all sorts of stuff with minor electrical faults) and then turn them into personal projects, e.g. a touchscreen cat toy, a waterfall ring-toss game in the door of an art gallery, a virtual window.
Some days it feels like everything is held together with string and chewing gum.
Homelab and homeservices should be two different things, separated as much as possible, though they can share some pieces, like the network. Not sure why you need to trace what's connected where by hand: use the UniFi network diagram and some IPAM solution to track VLANs and IP addresses.
When I did, I had an AD domain and DHCP running on it, and the whole media server part as well. I am working to separate the two, and have moved my lab from the servers I was using onto old Dell desktops since my power bill went up. Anything like my game server, media server, CasaOS server, and AD server I have moved to micro PCs, so when I feel like blowing stuff up it won't affect the entire house.
Bro, you are running a small/medium-size office at this point, not a home lab. This MF has a rack in 2 different data centers. What are you using those racks for? Offsite backups? Redundancy?
God damn, dude. You are running a medium-size office at this point, not a home lab. Also, why did you decide to go with 2 different data centers, and what's the purpose of the data center in Texas (I can't see the text very well from mobile)? Also, what is your current IP scheme for home and the DCs?
The Ohio data center runs my personal server. It handles email, personal websites such as justin-lloyd.com (down and to the right if you're on desktop), and offsite backup. The Texas data center is a web application server hosting a "funny pictures" website I am building in my spare time. MS SQL => Kestrel => NGINX => ATS => Android/iOS/Web/Terminal clients.