The 8x 10Gb backplane for the basement has arrived - I'll install it after work so I don't interrupt school internet and such.
Let’s do this.
Racked up - working on the link aggregation bonds now.
Alrighty, all configured (I kept doing it backwards after moving the controller...that's on me). 20Gb to the 24x PoE switch, currently 2Gb to upstairs (ready to go 20Gb), and the UXG-Pro routing 10Gb through to this spine. Overall: looking good.
Also wasn't happy with the front-panel port on the CloudKey - so I routed it back through the rear port and patch panel:
Alrighty, let's start upstairs. First up is the rack, patch panel, keystone bits, and a USW-Pro-24-POE. I'm not sure if I'm going with an 8x 10Gb spine switch upstairs or the slightly larger 28x 10Gb + 4x 25Gb option (depends on availability). Still, step 1 is 2x 10Gb to the basement.
Overall: I'm converting my existing layout of some network gear, UPSes, and drives in 2 IKEA units with desktop systems on top to...a rack! That'll give me more office space. Here's a recent-ish pic of the current layout - picture the right side as a 15U rack instead:
I'll have less gear overall in the end - that second black tower will become a 2U-3U server chassis. The main desktop (new build, more to come) will most likely be a 3U chassis. 2U for switches, 1U for the patch panel, and 2-4U for the UPS at the bottom (likely 1500VA + an expansion battery...we'll see!)
The drives and Synology in the upper left, the Time Capsule in the upper right, etc. will all be collapsed into that second tower's replacement. Maybe a custom build with TrueNAS, maybe Synology - not sure yet. Likely a ~6-8x 12TB option for photo backup, Plex, etc. Suggestions welcome!
People keep telling me a rack in the office is too noisy, but I don't buy that!
First: racks are silent, they make no noise. Things *in* them make noise, and we're going to be paying attention to that. Lower wattage everywhere possible means less to cool and longer UPS runtime.
The rack will have soundproofing and cooling - and this will likely be my first water cooling build, in the 3U chassis for my main machine. The file/Plex/etc. server I'd like to have a GPU in for offline transcoding and a lower-TDP processor - basically: quiet and low power.
Maybe an afternoon project…
Alrighty, we’re back from a short rest - let’s pick this back up. Time to build the 15U rack for the office:
First layer assembled with casters on - this is the inner rack and sound baffling:
Vertical rails in and sides on. Note for anyone doing this: I recommend a different step ordering - swap steps 2 and 4, since the rails easily block access to the top screws.
Annnnnd we've hit a big snag. Evidently they didn't drill holes for the hinges on one side. Or more precisely, the side panel positions matter and I have to disassemble it and invert the left side - blahhhhhh, fuck, what a waste of time. The instructions are just the 1 sheet from earlier.
Luckily, with the vertical rails providing support, we could do some surgery without a lot of hassle…just the 2 panels. Now we're back on track:
Cable panel installed - cords exit through here (the sound baffling has a slot in it) for minimal noise leakage:
Alrighty - hinges, doors, and locks installed. Fully assembled! Really not a bad experience other than the panel direction oops. I’m exhausted though, because it’s quite heavy overall…kids and school tomorrow, after that we’ll move stuff into it :)
For reference, here’s the pair (2 deep) of IKEA shelves this is replacing:
Been lazy about this - finally settled on (ordered) a UPS setup. I'm going with a PR3000RTXL2U unit (3000VA, sinewave) with a BP48VP2U02 expansion unit (+3360VA). That should be a decent amount of backup runtime for the desktop and network gear in the bottom 4U of rack space.
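For a rough sense of what "decent runtime" means, here's the back-of-the-envelope math I'm using - the load and usable battery numbers below are just illustrative assumptions, not the actual specs of these units, so the manufacturer's runtime charts are the real source:

```python
# Back-of-the-envelope UPS runtime estimate. Every number here is an
# assumption for illustration - not the actual specs of these units.
load_watts = 600            # assumed steady draw: desktop + switches + misc
usable_battery_wh = 1400    # assumed usable energy across main unit + expansion pack
inverter_efficiency = 0.85  # rough DC-to-AC conversion efficiency

runtime_minutes = usable_battery_wh * inverter_efficiency / load_watts * 60
print(f"~{runtime_minutes:.0f} minutes at {load_watts}W")  # ~119 minutes
```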
Well, oops. SOMEBODY (...it was me...) read the input specs off the wrong UPS model the other day. A 30 amp circuit install for the office has now been added to the project - the UPS needs a NEMA L5-30P connection for power.
As I'm listing parts out to run a new 30 amp circuit to my office for the rack...okay, yes, the UPS I'm going with miiiiiiight be overkill. Only a little though. If we sell this house, explaining the NEMA L5-30P in a bedroom will be fun at least.
If we ever sell this house, I've decided I'm going to advertise this as the ultimate guest bedroom. You could even hook up an RV.
"Why is there a dryer outlet in here?"
"That's not a dryer outlet, it's for an RV"
"..."
"It's a guest room"
I giggled
First up is the battery backup expansion unit. I’m installing it on the bottom so that the plugs on the main unit are more easily accessible (being a bit higher). Rails in first:
Unit racked - handles and faceplate to make things accessible and pretty:
If anyone's curious - the backup unit charges itself (it'll be on a separate 20 amp circuit already in the room). This pic isn't quite right because the main unit in my case uses a NEMA L5-30P, but it's the same connection principle:
Still in a hardware shortage so I've been really lazy, but I did rack the main UPS unit this afternoon. Here it is connected up. And yes, I'll have to address that 30 amp outlet next:
Initial patch panel assembly. It's not too beefy because local systems will connect via 10-25Gb SFP+/SFP28 cabling - this is purely for uplinks to other devices.
Time for a 20 amp & 30 amp circuit pair for the new UPS and expansion units…and since we're crawling around in the attic anyway, another Cat 6a run to the basement. Let's do this.
Step 1: lay out all the tools & supplies you know you’ll use so there’s no pausing to go hunting.
In retrospect, it's obvious I need to tear the office apart to get at where I want these outlets easily…so I guess we're switching to the rack today too. Starting by tearing out the tech not required for the day:
Yummy dust! *cough* Enclosed rack should help with this greatly:
Disassembly complete - I now have a temporary/minimal network setup so APs upstairs still have PoE.
Please take pride in your work, boys and girls.
Stopped to cut some boxes out - almost ready to drop the new power and cat6a runs down:
Basement leg is done for all 3 - getting the office side drop in next. Attics suck.
Well, shit.
Fishing line destroyed…shoulda started with the camera…
But, the lines are run - breaking for dinner and probably the day. I was lying in insulation for a while fighting that one, ugh. My brother-in-law was with me for the latter part of this…can't imagine doing that one alone.
Okay, so I see people thinking Twitter will go down as it loses engineers - fail whale everywhere. But that's not how it happens. That's the worst case, where things go catastrophically wrong.
What actually happens is small fires, no longer addressed, burn into wildfires users see.
What a lot of people don't know is every major platform/service/whatever has people addressing small fires every day. Fires 99.9% of users don't realize happened. Maybe a failure, maybe a bad rollout, maybe...whatever, doesn't matter. Stuff goes sideways, minorly, all the time.
How often does this happen where you are? Okay, now multiply by massive scale. Bigger teams: more things happen. There are also internet issues outside your control, attackers performing DDOS attacks...lots of things.
Lots of replies about how building the tech isn't hard, building the userbase is.
Both are hard. And both go hand in hand: you don't take on the cost of building a high-scale platform before you need it - you need users, revenue, etc. to grow as you go. They're very intertwined.
To be clear, most people are claiming they can *replicate* something in a weekend...which still isn't true, but bear with me.
The original had to figure out how to get to the current state. They didn't know what users wanted. They had some guesses and changed directions a lot.
Most companies start this way, they think they have a solution for some need.
Many guess wrong. They're not around anymore. Remember: most don't make it. Talking about a current company is heavy survivorship bias.
"I could do what they did 5000x faster" is exceedingly arrogant.
I think huge scale is hard to imagine for most people because human brains rely on relations and comparisons to what they've seen before. Something that's orders of magnitude past anything you've encountered thus far is hard to really comprehend quickly. It's so foreign.
And I'm not claiming to be good at this either, but I have seen enough to not make assumptions about those unknowns.
I ran a medium-scale website with 9 prod webservers. My current role deploys to millions. I still haven't quite wrapped my head around that, over a year in.
And let's be clear: I don't want to discourage people from seeing big things and thinking "I could build that". That's an awesome attitude. I hope you do one day.
I want to discourage calling what you don't understand "easy". It almost always isn't. That's why so few exist.
I need to do a write-up on the common problem of stampeding in web apps at some point. It's one of those problems that isn't obvious until you see it for what it is, and then it's super obvious and you always want to design for it.
The basics: say you have 1,000 requests per second - pick any number beyond a handful. When a thing they're all using disappears (say, a cache entry expires), they're all then waiting on you to get that thing back (e.g. re-caching it). That's unavoidable unless you want to 404.
But are they *all* re-caching it?
If they’re all causing a cache fetch, you suddenly go from 100% of requests using the cache to 0% using the cache and 100% slamming your backend. This can exhaust a lot of things quickly (bandwidth, ops, quotas, CPU, etc.). That’s not awesome!
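The usual fix is to let exactly one request rebuild the missing value while everyone else waits (or serves slightly stale data). Here's a minimal sketch of that idea - the names and the in-process dict cache are just for illustration, not any particular framework's API; in a real multi-server setup you'd want a distributed lock or your cache library's built-in stampede protection:

```python
import threading
import time

# Minimal in-process sketch - the cache, lock registry, and function names
# here are illustrative, not from any specific library.
_cache = {}
_locks = {}
_locks_guard = threading.Lock()


def _lock_for(key):
    # One lock per cache key, so unrelated keys don't block each other.
    with _locks_guard:
        return _locks.setdefault(key, threading.Lock())


def get_or_rebuild(key, rebuild, ttl=60):
    """Return the cached value; on a miss, let only one caller rebuild it."""
    entry = _cache.get(key)
    if entry and entry["expires"] > time.time():
        return entry["value"]

    with _lock_for(key):
        # Re-check after acquiring the lock: another request may have
        # repopulated the cache while we were waiting.
        entry = _cache.get(key)
        if entry and entry["expires"] > time.time():
            return entry["value"]

        value = rebuild()  # the one expensive backend/database call
        _cache[key] = {"value": value, "expires": time.time() + ttl}
        return value


# Example: 1,000 concurrent misses on "homepage" -> one backend call, not 1,000.
# result = get_or_rebuild("homepage", expensive_homepage_query)
```

Same idea at any layer: the goal is that a cache miss triggers one backend fetch, not one per in-flight request.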
If you live in Missouri, please be sure not to view source on this tweet, you damn HTML source hackers.
Good question. No. I am not afraid of being sued for this tweet. It's very obvious that the people suing here have never used the internet, so they won't see this.