Have you ever implemented a dynamic function dispatcher in Python, where you can register functions at runtime using the decorator syntax? (Like the routers in FastAPI.)
I did this recently, and I am going to teach you how to do it! I'll walk you through it in the thread below.
We will build an event handler to catch arbitrary events and dynamically execute a function to handle the event.
(A good example is catching events in a webhook listener.)
Time to use some decorator magic!
The usage is straightforward: 1) instantiate the EventHandler, 2) register handler functions for specific events, 3) pass the event to the EventHandler instance when caught.
The EventHandler has an internal dictionary holding (event, function) key-value pairs. This is where the individual event handler functions are registered.
The magic happens in the `EventHandler.register_event()` method, which is not just a decorator. It is a **decorator factory**.
It returns a parametrized decorator, whose only job is to store functions in the EventHandler's dictionary.
Since `self` is captured in its closure, the parametrized decorator can access it, and so it can register functions in the `self.handlers` dictionary.
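Putting the pieces together, here is a minimal sketch of what such an EventHandler could look like. The exact attribute and method names beyond `register_event` and `handlers` are my own illustration, not taken verbatim from the thread:

```python
class EventHandler:
    def __init__(self):
        # Maps event name -> handler function.
        self.handlers = {}

    def register_event(self, event):
        # Decorator factory: returns a decorator parametrized by `event`.
        def decorator(func):
            # `self` is captured in the closure, so we can register here.
            self.handlers[event] = func
            return func  # return the function unchanged
        return decorator

    def __call__(self, event, *args, **kwargs):
        # Dispatch the event to the registered handler function.
        return self.handlers[event](*args, **kwargs)


handler = EventHandler()

@handler.register_event("user.created")
def on_user_created(name):
    return f"created {name}"

print(handler("user.created", "Ada"))  # → created Ada
```

Note that the decorator returns `func` unchanged: registration is a side effect, so the decorated function can still be called directly.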
Why not just use a dictionary instead of EventHandler? I have two reasons.
1) The decorator syntax is much more convenient than manually jamming each function into a dictionary.
2) You can easily implement additional functionality. What if you are building a webhook event handler, and you need to verify its signature before doing anything? Just create a validator method and slam it into `__call__`.
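To illustrate point 2, here is a hedged sketch of a webhook variant that verifies an HMAC signature in `__call__` before dispatching. The class and method names (`WebhookHandler`, `_verify`) and the SHA-256 HMAC scheme are my own assumptions for the example:

```python
import hashlib
import hmac


class WebhookHandler:
    def __init__(self, secret: bytes):
        self.secret = secret
        self.handlers = {}

    def register_event(self, event):
        def decorator(func):
            self.handlers[event] = func
            return func
        return decorator

    def _verify(self, payload: bytes, signature: str) -> bool:
        # Compare the sender's signature against our own HMAC of the payload.
        expected = hmac.new(self.secret, payload, hashlib.sha256).hexdigest()
        return hmac.compare_digest(expected, signature)

    def __call__(self, event, payload: bytes, signature: str):
        # Validation happens once here, before any handler runs.
        if not self._verify(payload, signature):
            raise ValueError("invalid webhook signature")
        return self.handlers[event](payload)
```

With a plain dictionary, every handler would have to repeat the signature check; centralizing it in `__call__` keeps the handlers focused on the event itself.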
Decorators are a fantastic feature in Python. If you are wondering, this is very similar to how Flask and FastAPI implement routing.
What is your favorite application of function decorators?
Neural networks are getting HUGE. In their @stateofaireport 2020, @NathanBenaich and @soundboy visualized how the number of parameters grew for breakthrough architectures. The result below is staggering.
What can you do to compress neural networks?
👇A thread.
1⃣ Neural network pruning: iteratively removing connections after training. Turns out that in some cases, 90%+ of the weights can be removed without noticeable performance loss.
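As a toy illustration of the idea, here is a one-shot magnitude-pruning sketch in NumPy (the milestone papers typically prune and fine-tune iteratively, which this skips); `magnitude_prune` is my own illustrative helper, not from any of the papers:

```python
import numpy as np


def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero out the smallest-magnitude weights until `sparsity` fraction is zero."""
    k = int(weights.size * sparsity)
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value becomes the pruning threshold.
    threshold = np.partition(np.abs(weights).ravel(), k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask


rng = np.random.default_rng(0)
w = rng.normal(size=(100, 100))
pruned = magnitude_prune(w, 0.9)
print(float(np.mean(pruned == 0)))  # ~0.9: 90% of weights removed
```

In practice, the surviving weights are then fine-tuned, and the prune/fine-tune cycle is repeated to reach high sparsity without the performance loss mentioned above.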
A few selected milestone papers:
📰Optimal Brain Damage by @ylecun, John S. Denker, and @SaraASolla. As far as I know, this was the one where the idea was introduced.
1️⃣ If you struggle to understand determinants, stop what you are doing and check out this video by @3blue1brown. It will make your brain explode.
2⃣ Sometimes, it is hard to figure out what a concept represents by looking at how it is calculated. The determinant of a matrix is calculated as a sum over all permutations of the column indices, where each term is a signed product of matrix entries (the Leibniz formula).
3⃣ However, this definition doesn't reveal anything about what the determinant means. In fact, it is quite simple: it describes how volumes scale under the corresponding linear transformation (with the sign telling you whether orientation flips).
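To make the permutation sum concrete, here is a direct (and deliberately inefficient) implementation of the Leibniz formula; `det` and `sign` are illustrative names of my own:

```python
from itertools import permutations
from math import prod


def sign(perm):
    # Parity of a permutation: -1 for an odd number of inversions, +1 otherwise.
    inversions = sum(
        1
        for i in range(len(perm))
        for j in range(i + 1, len(perm))
        if perm[i] > perm[j]
    )
    return -1 if inversions % 2 else 1


def det(A):
    n = len(A)
    # Leibniz formula: sum over all permutations of the column indices.
    return sum(
        sign(p) * prod(A[i][p[i]] for i in range(n))
        for p in permutations(range(n))
    )


# The volume interpretation in action: scaling x by 2 and y by 3
# scales areas by 6, and the determinant agrees.
print(det([[2, 0], [0, 3]]))  # → 6
```

Swapping the two axes gives `det([[0, 1], [1, 0]]) == -1`: the area is preserved, but the orientation flips, which is exactly what the sign records.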