@Nick_Craver@infosec.exchange
Dec 10, 2020 · 42 tweets
The 8x 10Gb backplane for the basement has arrived - I'll install this after work so I don't interrupt school internet and such.
Let’s do this.
Racked up - working on the aggregate bonds now.
Alrighty, all configured (I kept doing it backwards after moving the controller...that's on me). 20Gb to the 24x PoE, currently 2Gb to upstairs (ready to go 20Gb), and the UXG-Pro routing 10Gb through to this spine. Overall: looking good.
Also wasn’t happy with using the front panel port on the CloudKey - so I routed it back through the rear port and patch panel:
Alrighty, let's start upstairs. First up is the rack, patch panel, keystone bits, and a USW-Pro-24-POE. I'm not sure if I'm going with an 8x 10Gb spine switch upstairs or the slightly larger 28x 10Gb + 4x 25Gb option (depends on availability). Still, step 1 is 2x 10Gb to the basement.
Overall: I'm converting my existing layout of some network gear, UPSes, and drives in 2 IKEA units with desktop systems on top to...a rack! That'll give me more office space overall. Here's a recent-ish pic of the current layout. Picture the right side as a 15U rack instead:
I'll have less gear overall in the end - that second black tower will become a 2U-3U server chassis. The main desktop (new build, more to come) will most likely be a 3U chassis. 2U for switches, 1U for the patch panel, and 2-4U for the UPS at the bottom (likely 1500VA + an expansion battery...we'll see!)
The drives and Synology in the upper left, the Time Capsule in the upper right, etc. will all be collapsed into that second tower's replacement. Maybe a custom build with TrueNAS, maybe Synology - not sure yet. Likely a ~6-8x 12TB option for photo backup, Plex, etc. Suggestions welcome!
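For a rough feel of what an 8x 12TB pool nets out to, here's a sketch of the usable-capacity math - assuming (my assumption, not stated in the thread) two-disk redundancy like RAID-Z2 or SHR-2, and ignoring filesystem overhead:

```python
# Rough usable-capacity estimate for a hypothetical 8x 12TB pool with
# two-disk redundancy (RAID-Z2 / SHR-2). Drive "12TB" is decimal
# (10^12 bytes); operating systems report binary TiB, hence the gap.

def usable_tib(drives: int, parity: int, drive_tb: float) -> float:
    """Approximate usable space in TiB, ignoring filesystem overhead."""
    data_bytes = (drives - parity) * drive_tb * 10**12
    return data_bytes / 2**40

print(round(usable_tib(8, 2, 12), 1))   # ~65.5 TiB usable from 96 TB raw
```

So "96 TB of drives" lands closer to 65 TiB of reported space before snapshots and overhead - worth knowing before sizing for a photo archive.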
People keep telling me a rack in the office is too noisy, but I don't buy that!

First: racks are silent, they make no noise. Things *in* them make noise, and we're going to be paying attention to that. Lower wattage everywhere possible means less to cool and longer UPS runtime.
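The "lower wattage means longer UPS runtime" point falls out of simple math: runtime scales roughly inversely with load. A back-of-the-envelope sketch, with illustrative numbers (the battery capacity and efficiency figures here are assumptions, not specs from the thread):

```python
# Back-of-the-envelope UPS runtime: runtime scales roughly inversely
# with load, so every watt shaved stretches the battery. The 864 Wh
# figure (e.g. a 48V / 18Ah pack) and 90% inverter efficiency are
# illustrative assumptions; real UPS runtime curves are non-linear.

def runtime_minutes(battery_wh: float, load_w: float,
                    inverter_eff: float = 0.9) -> float:
    """Estimated runtime in minutes for a given load (linear approximation)."""
    return battery_wh * inverter_eff / load_w * 60

# Halving the load roughly doubles the runtime:
print(round(runtime_minutes(864, 600), 1))   # ~77.8 min
print(round(runtime_minutes(864, 300), 1))   # ~155.5 min
```

Real batteries do somewhat better than linear at light loads, but the direction holds: efficient gear buys both quiet and runtime.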
The rack will have soundproofing and cooling - and this will likely be my first water cooling build, for the 3U server chassis housing my main machine. The file/Plex/etc. server I'd like to have a GPU in for offline transcoding, plus a lower-TDP processor - basically: quiet unless it's working.
Maybe an afternoon project…
Alrighty, we’re back from a short rest - let’s pick this back up. Time to build the 15U rack for the office:
First layer assembled with casters on - this is the inner rack and sound baffling:
Vertical rails in and sides on. A note for anyone doing this: I recommend a different step ordering - swap steps 2 and 4, since the rails easily block access to the top screws.
Annnnnd we’ve hit a big snag. Evidently they didn’t drill holes for the hinges on one side. Or more precisely: the side positions matter, and I have to disassemble it and invert the left side. Blahhhhhh, fuck, what a waste of time. Instructions are just the 1 sheet from earlier.
Luckily, with the vertical rails supporting it, we could do some surgery without a lot of hassle…just the 2 panels. Now we’re back on track:
Cable panel installed - cords exit through here (the sound baffling has a slot in it) for minimum noise leak:
Alrighty - hinges, doors, and locks installed. Fully assembled! Really not a bad experience other than the panel direction oops. I’m exhausted though, because it’s quite heavy overall…kids and school tomorrow, after that we’ll move stuff into it :)
For reference, here’s the pair (2 deep) of IKEA shelves this is replacing:
Been lazy about this - finally settled on (ordered) a UPS setup. I'm going with a PR3000RTXL2U unit (3000VA, sinewave) with a BP48VP2U02 expansion unit (+3360VA). That should be a decent amount of backup runtime for the desktop and network gear in the bottom 4U of rack space.
Well, oops. SOMEBODY (...it was me...) read the input specs off the wrong UPS model. A 30 amp circuit install for the office has now been added to the project - the unit needs a NEMA L5-30P for power.
As I'm listing parts out to run a new 30 amp circuit to my office for the rack...okay, yes, the UPS I'm going with miiiiiiight be overkill. Only a little though. If we sell this house, explaining the NEMA L5-30P in a bedroom will be fun at least.
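The reason the 3000VA unit forces a new circuit is simple division: at a nominal 120V, full load is 25A, which exceeds what a standard 15 or 20 amp branch circuit can supply. A sketch of the arithmetic - illustrative only, not an electrical-code reference (consult an electrician for actual circuit sizing):

```python
# Why a 3000VA UPS outgrows a standard household circuit: at 120V
# nominal, full load draws 25A - more than a 20A breaker can carry -
# hence the 30A circuit and NEMA L5-30P plug. Illustrative math only;
# real sizing follows electrical code, not this script.

VOLTS = 120  # nominal North American branch-circuit voltage

def full_load_amps(volt_amps: float) -> float:
    """Current drawn at full rated load on a 120V circuit."""
    return volt_amps / VOLTS

print(full_load_amps(3000))   # 25.0 - exceeds a 20A breaker
print(full_load_amps(1500))   # 12.5 - fits a standard 15A circuit
```

This is also why the originally-planned 1500VA unit would have been a plug-and-play swap, and the 3000VA one is a weekend of attic crawling.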
If we ever sell this house, I've decided I'm going to advertise this as the ultimate guest bedroom. You could even hook up an RV.
"Why is there a dryer outlet in here?"
"That's not a dryer outlet, it's for an RV"
"..."
"It's a guest room"
I giggled
First up is the battery backup expansion unit. I’m installing it on the bottom so that the plugs on the main unit are more easily accessible (being a bit higher). Rails in first:
Unit racked - handles and faceplate to make things accessible and pretty:
If anyone's curious - the backup unit charges itself (it'll be on a separate 20 amp circuit already in the room). This pic isn't quite right because the main unit in my case uses a NEMA L5-30P, but it's the same connection principle:
Still in a hardware shortage, so I’ve been really lazy - but I did rack the main UPS unit this afternoon. Here it is connected up. And yes, I’ll have to address that 30 amp outlet next:
Patch panel initial assembly. It’s not too beefy because local systems will be connecting via 10-25Gb SFP+/SFP28 cabling - this is purely for uplinks to other devices.
Time for a 20 amp & 30 amp circuit pair for the new UPS and expansion units…and since we’re crawling around in the attic, another Cat 6a run to the basement. Let’s do this.

Step 1: lay out all the tools & supplies you know you’ll use so there’s no pausing to go hunting.
In retrospect, it’s obvious I need to tear the office apart to easily get at where I want these outlets…so I guess we’re switching to the rack today too. Starting by tearing out tech that isn’t required for the day:
Yummy dust! *cough* The enclosed rack should help with this greatly:
Disassembly complete - I now have a temporary/minimal network setup so the APs upstairs still have PoE.
Please take pride in your work, boys and girls.
Stopped to cut some boxes out - almost ready to drop the new power and Cat 6a runs down:
Basement leg is done for all 3 - getting the office-side drop in next. Attics suck.
Well, shit.
Fishing line destroyed…shoulda started with the camera…
But the lines are run - breaking for dinner and probably for the day. I was lying in insulation for a while fighting that one, ugh. My brother-in-law was with me for the latter part of this…can’t imagine doing that one alone.

