LaurieWired · May 27
Want to recognize a song from just a few seconds of distorted audio?

Use Constellation Maps.

The math is brilliantly simple.

With just a handful of bytes, discarding 99% of the waveform, you can recognize a unique fingerprint across hundreds of millions of tracks.
First, chop up the audio into few-second windows.

Take an FFT of each window, then extract the local peaks. Each maximum becomes a “star” on an x-y plot of time vs. frequency.

Pair nearby stars into clusters and hash the result. Boom, a noise-resistant fingerprint.
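
Here’s a minimal sketch of that pipeline in Python (assumes SciPy/NumPy and a mono float array of samples; the window size, peak threshold, and fan-out are illustrative choices, not Shazam’s published parameters):

```python
import numpy as np
from scipy import signal

def constellation_hashes(samples, rate, fan_out=5, max_dt=64):
    """Toy constellation-map fingerprinter (illustrative, not Shazam's exact scheme)."""
    # 1. Chop the audio into short windows and FFT each one (a spectrogram).
    freqs, times, spec = signal.spectrogram(samples, fs=rate,
                                            nperseg=4096, noverlap=2048)

    # 2. Keep only the loud local maxima in each window: the "stars".
    threshold = spec.mean() + 2 * spec.std()
    stars = []  # (window_index, frequency_bin)
    for t_idx in range(spec.shape[1]):
        peak_bins, _ = signal.find_peaks(spec[:, t_idx], height=threshold)
        stars.extend((t_idx, f_bin) for f_bin in peak_bins)

    # 3. Pair each star with a handful of later stars and hash
    #    (freq1, freq2, time delta) into a compact 32-bit fingerprint.
    hashes = []
    for i, (t1, f1) in enumerate(stars):
        for t2, f2 in stars[i + 1 : i + 1 + fan_out]:
            dt = t2 - t1
            if 0 < dt <= max_dt:
                hashes.append(hash((int(f1), int(f2), int(dt))) & 0xFFFFFFFF)
    return hashes
```

Matching a noisy clip then boils down to counting how many of its hashes also appear in a track’s stored hashes at a consistent time offset.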
Constellation maps were the basis for Shazam.

Shazam today is a polished iOS/Android app, but the tech actually started back in 2003!

The early marketing was kind of hilarious: dial a phone number and hold up your Nokia to "listen". The ID'd track was sent back via SMS.
Today’s tech is more advanced, often using DNNs and vector embeddings, but the early constellation algorithms are still fun to play with.

Panako is one of the more advanced open-source tools.

Some have used the tech for wildlife monitoring, or even generating fingerprints of mechanical pumps / motors to detect anomalies!

If you’d like to learn more, check out this paper that gives an overview of Panako’s tech:
archives.ismir.net/ismir2014/pape…

More from @lauriewired

May 23
I miss the insanity of 80s processor design.

Intel’s iAPX 432 was a “micromainframe”.

It had no general purpose registers, supported object orientation *directly*, and performed garbage collection on-chip.

It was also 23x slower than an 8086. Here's why it failed.
Intel targeted Ada so aggressively that C support was an afterthought.

The problem was that, particularly at the time, the Ada compiler was extremely untuned and immature.

Scalar instructions were basically never used; *everything* went through huge object-oriented calls.
The “micromainframe” moniker wasn’t just marketing. One I/O chip could stitch together 63 CPUs on a single bus.

It was essentially memory-safe in hardware; dangling pointers were impossible at the ISA level.

Partners like BiiN suggested using the CPU for nuclear-reactor control.
May 22
NTIRE is the coolest conference you’ve never heard of.

Deleting motion blur? Sure.
Night Vision? No problem.

Every year, labs compete on categories like hyperspectral restoration, satellite image enhancement, even raindrop removal (think car sensors)! Some highlights:
Low-light enhancement is always popular.

Retinexformer, shown here, took 2nd place in the 2024 contest.

A *TINY* transformer-based model, it runs in about 0.5 seconds for a 6K image on a single 3090. Only 1.6M parameters (<2MB of weights at INT8)!
Maybe motion blur removal is more your thing.

UAVs are often used to examine wind turbine blades for early failure warning. The drone’s movement plus the blades’ rotational velocity pose a challenge.

Here’s the 2021 winner, DeblurGANv2, taking ~0.19s of processing per image.
May 21
What if an OS fit entirely inside the CPU’s Cache?

Turns out we’ve been doing it for decades.

CNK, the OS for IBM’s Blue Gene Supercomputer, is just 5,000 lines of tight C++.

Designed to “eliminate OS noise”, it lives in the cache after just a few milliseconds of boot.
Kernels that “live” in the cache are common for HPC.

Cray’s Catamount microkernel (~2005) used a similar method for jitter-free timing.

Huge Pages, Statically Mapped Memory, and a lack of scheduling are all typical aspects of these systems.
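
As a rough user-space analogue of the first of those tricks, here’s a hedged Python sketch (Linux-only, Python 3.10+) that reserves and pins a single 2 MiB huge page, trading many 4 KiB TLB entries for one and keeping the region from ever being paged out. The real CNK and Catamount kernels map memory statically in-kernel rather than through mmap, so this is only an illustration of the huge-page idea:

```python
import ctypes
import ctypes.util
import mmap

# Reserve one 2 MiB huge page (Linux-only, Python 3.10+ for MAP_HUGETLB).
# Requires huge pages to be pre-configured, e.g.:
#   echo 64 | sudo tee /proc/sys/vm/nr_hugepages
HUGE_PAGE = 2 * 1024 * 1024
buf = mmap.mmap(-1, HUGE_PAGE,
                flags=mmap.MAP_PRIVATE | mmap.MAP_ANONYMOUS | mmap.MAP_HUGETLB)

# Pin the region with mlock() so the kernel never pages it out.
libc = ctypes.CDLL(ctypes.util.find_library("c"), use_errno=True)
addr = ctypes.addressof(ctypes.c_char.from_buffer(buf))
if libc.mlock(ctypes.c_void_p(addr), ctypes.c_size_t(HUGE_PAGE)) != 0:
    raise OSError(ctypes.get_errno(), "mlock failed (try raising RLIMIT_MEMLOCK)")

buf[:16] = b"hot-path data..."  # the working set now sits behind one TLB entry
```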

What about the modern era?
Modern CPUs are *insane*.

L3 sizes exceed a GIGABYTE per socket (see Genoa-X).

Many HPC labs run the hot path in lightweight kernels (LWKs), outsourcing file I/O and syscalls to separate nodes, all with the intent of reducing µs-level jitter. Determinism is the name of the game.
May 13
TDP (Thermal Design Power) of CPUs is a garbage metric that misleads consumers.

In the Pentium era, an 89W TDP meant just that: expect to dissipate 89W of heat in the worst case.

With Alder Lake, a 125W CPU can draw ~241W indefinitely!

Here's the goofy math:
CPUs didn’t really know how to idle until the early 2000s. They just kinda ran full bore all the time.

With the introduction of C-States, various parts of the processor could be shut down, saving power when the computer was doing nothing.

Of course, this was HUGE for laptops.
If you temporarily downclock a CPU, why not upclock it as well?

Around 2008, Intel came up with the concept of Turbo Boost and the “energy bucket”.

Short spikes *above* TDP are allowed, as long as the 28-second moving average stays below TDP.

Okay, nothing too crazy yet.
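
A toy model of that budget in Python (illustrative only: real firmware tracks a hardware energy counter rather than this simple moving average; the 125 W / 241 W limits are the Alder Lake figures from above):

```python
# Toy model of the "energy bucket": bursts above PL1 (TDP) are allowed until the
# ~28 s moving average of power catches up. Real firmware tracks a hardware
# energy budget; this EWMA is only an approximation of the behavior.
def granted_power(requested_watts, dt=0.1, pl1=125.0, pl2=241.0, tau=28.0):
    """Yield the power the CPU is allowed to draw at each timestep."""
    avg = 0.0  # running average of recent power draw
    for want in requested_watts:
        # Burst up to PL2 while the moving average is under PL1, else clamp to PL1.
        grant = min(want, pl2 if avg < pl1 else pl1)
        # Exponentially weighted moving average with time constant tau.
        alpha = dt / tau
        avg += alpha * (grant - avg)
        yield grant

# A 60-second demand for maximum power (100 ms steps):
grants = list(granted_power([241.0] * 600))
print(f"t=0s: {grants[0]:.0f} W   t=60s: {grants[-1]:.0f} W")  # 241 W -> 125 W
```

With these numbers the chip holds 241 W for roughly the first 20 seconds, then settles back to 125 W once the average catches up.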
May 12
What if humanity forgot how to make CPUs?

Imagine Zero Tape-out Day (Z-Day), the moment when no further silicon designs ever get manufactured. Advanced core designs fare especially badly.

Assuming we keep our existing supply, here’s how it would play out:
Z-Day + 1 Year:

Cloud providers freeze capacity. Compute prices skyrocket.

Black’s Equation is brutal; the smaller the node, the faster electromigration kills the chip.

Savvy consumers immediately undervolt and aggressively cool their CPUs, buying precious extra years.
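
For a feel of why undervolting and cooling buy years, here’s a hedged Python sketch of Black’s equation, MTTF = A * J^-n * exp(Ea / kT); the exponent and activation energy below are typical textbook values, not numbers for any specific chip:

```python
import math

# Black's equation for electromigration: MTTF = A * J**(-n) * exp(Ea / (k * T)).
# Illustrative constants: n ~= 2 (current-density exponent), Ea ~= 0.7 eV for
# copper interconnect; the prefactor A cancels when comparing operating points.
K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant, eV/K
N_EXP = 2.0
EA_EV = 0.7

def relative_mttf(j_ratio, temp_c):
    """MTTF relative to a baseline at j_ratio = 1.0 and 85 C junction temp."""
    t_base = 85.0 + 273.15
    t_new = temp_c + 273.15
    return (j_ratio ** -N_EXP) * math.exp(
        EA_EV / K_BOLTZMANN_EV * (1.0 / t_new - 1.0 / t_base))

# Undervolting cuts current density ~10%; better cooling drops Tj from 85 C to 65 C:
print(f"lifetime multiplier: {relative_mttf(0.9, 65.0):.1f}x")
```

With those numbers, a 10% cut in current density plus a 20 °C cooler junction multiplies the expected electromigration lifetime by roughly 4.7x.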
Z-Day + 3yrs:

The black market booms; Xeons are worth more than gold. Governments prioritize power, comms, and finance. Military supply remains stable, leaning on stockpiled spares.

Datacenters desperately strip hardware from donor boards, the first "shrink" of cloud compute.
Mar 27
HTTP code 418 began as an April Fool’s prank to signal “I’m a teapot”.

Later, it acted as a DDoS defense mechanism during the Ukraine war.

In 1998, the HTCPCP standard defined communication for…coffee pots.

Turns out, that wasn't the only thing brewing.
Even the original RFC pokes fun at itself, stating:

"This has a serious purpose – it identifies many of the ways in which HTTP has been extended inappropriately."

Despite the joke, frameworks began treating the error code as valid, leading to unexpected consequences.
Code 418 became a method for developers to confuse attackers.

Instead of an ordinary 404, 418 acted as a marker for suspicious or invalid requests.

It’s essentially a wink at the attacker saying, “I know you’re here, and I know what you’re trying to do.”
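
A hedged sketch of that trick using only Python’s standard library (the "suspicious" paths and responses are made up for illustration; a real deployment would do this in its web framework or WAF):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical paths that only scanners and bots ever probe for.
SUSPICIOUS_PREFIXES = ("/wp-admin", "/.env", "/phpmyadmin")

class TeapotHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path.startswith(SUSPICIOUS_PREFIXES):
            # Answer probes with 418 instead of 404: harmless to real users,
            # but a clear marker in logs (and a wink at the attacker).
            self.send_response(418, "I'm a teapot")
            self.end_headers()
            self.wfile.write(b"short and stout\n")
        else:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"hello\n")

if __name__ == "__main__":
    HTTPServer(("", 8080), TeapotHandler).serve_forever()
```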
