
A few months ago, as users started throwing more API traffic at the Akita client, we started seeing multi-GB memory spikes. 🙀

At first, we hoped for an easy solution. There was none.

This is the story of @markgritter's 25-day journey taming memory usage in a #golang client. 1/

[Image: Memory spikes in the Akita CLI]
The first thing @markgritter did, as you would do in most languages, was to profile the system for obvious bottlenecks.
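The thread doesn't show Mark's setup, but in Go this usually means the built-in pprof tooling. A minimal sketch of exposing heap profiles from a long-running client like the Akita CLI; the port choice and overall structure here are illustrative:

```go
package main

import (
	"log"
	"net/http"
	_ "net/http/pprof" // side effect: registers /debug/pprof/* on the default mux
)

func main() {
	// Serve profiling endpoints on a side port so heap profiles
	// can be pulled from the running process.
	go func() {
		log.Println(http.ListenAndServe("localhost:6060", nil))
	}()

	// ... packet capture and the rest of the client would run here ...
	select {}
}
```

A heap profile is then a `go tool pprof http://localhost:6060/debug/pprof/heap` away, and its `inuse_space` view is what surfaces memory that is retained but no longer used.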

The profile turned one up, in the #gopacket reassembly buffer. Memory was persisting even though it wasn't getting used.

Mark fixed this, but the problem persisted. 2/

[Image: Profile showing bottleneck in reassembly]
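The thread doesn't say what the fix was, but a common way to stop gopacket's tcpassembly buffers from retaining stale connection data is to flush old state on a timer. A sketch under that assumption; both durations are made up, not what Akita used:

```go
package capture

import (
	"time"

	"github.com/google/gopacket/tcpassembly"
)

// flushStale periodically evicts reassembly state for connections
// that have gone quiet, so their buffered bytes can be collected.
func flushStale(assembler *tcpassembly.Assembler) {
	ticker := time.NewTicker(30 * time.Second)
	defer ticker.Stop()
	for range ticker.C {
		// Discard buffered data for any connection not seen in the
		// last two minutes, releasing the pages that held it.
		assembler.FlushOlderThan(time.Now().Add(-2 * time.Minute))
	}
}
```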
At this point, the team and I weighed possible next steps.

Since #golang is memory-managed (2x memory overhead no matter what 🙀) and does not expose much control over the garbage collector, we worried it might not be possible to get memory usage low enough. 3/
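For context on how little control that is: the main knobs Go exposes are the GOGC target (settable at runtime) and a call that returns freed memory to the OS. A sketch of both; the specific values are illustrative, not something from the thread:

```go
package main

import (
	"runtime/debug"
	"time"
)

func main() {
	// GOGC defaults to 100, which lets the heap grow to roughly 2x
	// the live set between collections; that is the "2x overhead"
	// mentioned above. Lowering it trades GC CPU time for memory.
	debug.SetGCPercent(50)

	// Ask the runtime to hand freed pages back to the OS; it does
	// this on its own eventually, but only lazily.
	go func() {
		for range time.Tick(time.Minute) {
			debug.FreeOSMemory()
		}
	}()

	// ... the rest of the client ...
	select {}
}
```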