If you read about O3 finding an SMB bug in the Linux kernel: I did a few tests, and what I suspected looks true: Gemini 2.5 Pro can identify the vulnerability more easily. My success rate is so high that running the following prompt a few times is enough: gist.github.com/antirez/8b76cd…
1. No more than 5% in single stocks, 95% in ETFs. I especially like MSCI World. In general, wide-spectrum ETFs for 70%; more specialized ones for the remaining 30% if you wish, like high-dividend companies, semiconductors, ...
2. I always enter (and re-enter after an exit) the market incrementally. I buy 8% per month or so. Markets are crazy in the short term: I like to get an average price.
3. When everybody is super happy about the market and the boy at the vegetable store tells me that he and his cat are buying this new egg company with 10x leverage, well, if I have decent gains (>= 10%) I exit the market, selling from 80% to 100%.
Welcome to the LoRa thread. Here I'll talk about LoRa, a modulation used in RF communications that is a game changer for certain applications, and about my work on a LoRa-based WAN protocol and implementation: my open source project called FreakWAN.
An obvious way to communicate wirelessly is to use WiFi or Bluetooth, but these technologies require an access point (in the WiFi case) and the range is limited (for both). So, in order to communicate point-to-point over the ISM (open to all) frequencies, other techniques are used.
Your digital thermometer, car key or gate remote is likely to use either OOK or FSK modulation. The first just transmits or stops transmitting in patterns, in order to tell 0 from 1, while the other moves between two nearby frequencies F1 and F2 to communicate bits.
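The two modulations above can be sketched in a few lines of Python. The sample rate, carrier frequencies and bit duration here are made-up illustrative values, not taken from any real device:

```python
import math

SAMPLE_RATE = 8000       # samples per second (illustrative)
SYMBOL_SAMPLES = 80      # samples per bit (illustrative)
F_CARRIER = 1000.0       # OOK carrier frequency, Hz
F1, F2 = 1000.0, 1200.0  # FSK frequencies for bit 0 and bit 1, Hz

def ook(bits):
    """On-off keying: transmit the carrier for a 1, silence for a 0."""
    out = []
    for b in bits:
        for n in range(SYMBOL_SAMPLES):
            t = n / SAMPLE_RATE
            out.append(math.sin(2*math.pi*F_CARRIER*t) if b == '1' else 0.0)
    return out

def fsk(bits):
    """Frequency-shift keying: transmit at F1 or F2 depending on the bit."""
    out = []
    for b in bits:
        f = F2 if b == '1' else F1
        for n in range(SYMBOL_SAMPLES):
            t = n / SAMPLE_RATE
            out.append(math.sin(2*math.pi*f*t))
    return out
```

Note that LoRa itself works differently: it uses chirp spread spectrum, where the frequency sweeps continuously within each symbol, which is what buys the long range.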
Ok folks, better to write a thread to make things clearer. A few days ago I wrote that the new LLM released by Meta is not open source. Many people replied that I'm dumb and that of course it is open source, the license is GPL. Remember the following words: AI IS NOT CODE.
The code part of AI is only a set of tools needed in order to implement the algorithms described in the papers. The final result of AI (talking of neural networks, at this point 99.99% of what matters), that is, the weights, is not code in any way, of course.
So while the tools to work with AI are valuable and useful, the point about AI is: 1) Papers describing new ways to train neural networks. 2) Hardware + GPU time, in order to train models. The models that will change our society can't be trained by individuals on their PCs.
Flipper Zero: The Thread. I received a Flipper Zero a few days ago, and since I'm idling here at my parents' house for the holidays, I spent a lot of time playing with it. This thread captures my impressions about the device.
First: the Flipper should put many hardware companies to shame. The user experience is *so* good. Everything works on the first try. The Android app immediately connects with the device and updates the firmware. It can stream the screen in real time and access the file system.
The battery is already charged when you get the device. The animations are great, the applications are well designed, and it never crashes despite the fact that it is still beta code. A few selected coders and designers show how much big companies suck at designing hardware and software.
Sometimes I like to rediscover algorithms from scratch. Years ago I blogged (in Italian) about the algorithm to enumerate the permutations of N elements recursively (oldblog.antirez.com/post/102). Yesterday I wanted to rediscover the equivalent lexicographic algorithm.
The algorithm is well known and widely used in practice, so why tweet about it? Because when you do the exercise of rediscovering algorithms, you develop an intuition about why they work. This intuition helps you remember the algorithm forever, which does not happen when you are just told about it.
The lexicographic algorithm for permutations is cool because it is completely stateless. Given a sequence of N elements, say ABC, it is composed of just a NEXT() function that emits the next permutation of ABC, so that lexicographically no permutation exists between ABC and the new one.
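The NEXT() function can be sketched in Python as the classic find-pivot / swap / reverse-suffix procedure (my framing here, the thread doesn't spell out the steps):

```python
def next_permutation(seq):
    """Return the next permutation of seq in lexicographic order, or
    None if seq is already the last one (fully descending). Stateless:
    it only needs the current permutation as input."""
    a = list(seq)
    # 1. Find the rightmost index i such that a[i] < a[i+1].
    #    Everything to the right of i is a descending suffix.
    i = len(a) - 2
    while i >= 0 and a[i] >= a[i+1]:
        i -= 1
    if i < 0:
        return None  # Last permutation reached.
    # 2. Find the rightmost j > i with a[j] > a[i] and swap them:
    #    a[j] is the smallest element of the suffix bigger than a[i].
    j = len(a) - 1
    while a[j] <= a[i]:
        j -= 1
    a[i], a[j] = a[j], a[i]
    # 3. The suffix is still descending: reverse it to get the
    #    smallest possible tail, hence the immediate next permutation.
    a[i+1:] = reversed(a[i+1:])
    return a
```

For example, starting from ['A','B','C'] and calling it repeatedly yields ACB, BAC, BCA, CAB, CBA, then None.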
We often say: those were the cool times! The days of the Commodore 64 and 6502 coding, where every instruction counted, every cycle, every byte of memory. Not like now, when junior programmers burn gigabytes of memory and seconds of CPU time to emit a single web page! [Boomer voice]. But...
We can make things like this again. Force your managers to understand the carbon footprint of the software you are using and building. Don't put just scalability in the mix, but also performance per watt. This way we can help the environment and return to a saner coding approach.
When I was in charge of the Redis code base, over the years I spent many days optimizing the CPU usage of idle Redis instances. You know? Redis is so massively popular that there are a huge number of idle instances everywhere. Imagine the cumulative environmental cost of all that.