"You gotta have options" - that's a line a jewelry salesman once used on my wife, and it has become an inside joke in our family. That sales pitch, however, is worth taking seriously in all sorts of life situations. 1/
I endured some of the biggest setbacks in my life when I found myself in situations where I had just a few bad options, or even worse, just one terrible one. Over the years I found myself unconsciously working to maximize the number of options that I had. 2/
There are many ways that you can increase your optionality, and most of them don't require you to have access to outsize resources. 3/
1. Invest in yourself and in skills that have a wide range of applications, especially professional ones. 4/
2. Alternatively, invest a small amount of time and energy in many skills and projects, each of which can potentially have a large impact. You don't want to spread yourself too thin, though; there is an optimal number of these to be involved in. 5/
3. Avoid getting stuck in commitments with no easy way out in the event of a failure. 6/
4. Always have an exit strategy. Make plans well in advance, while things are still going well with whatever you are involved in at the moment. You don't need a detailed plan, just a set of thought-out guidelines. 7/
5. Optimize for impact, not appearances. Lots of unsexy options that work are better than a few sexy ones that won't get you far. 8/
A few weeks ago I came across a tweet by a prominent ML/AI developer and researcher that promoted a new post about the use of transformer-based neural networks for tabular data classification. 1/27
The post was on Keras’ official site, and it seemed like a good opportunity to learn how to build transformers with Keras, something that I’ve been meaning to do for a while. 2/27
However, one part of the post and the tweet bothered me. It claimed that the model matched “the performance of tree-based ensemble models.” As those who know me well know, I am pretty bullish on tree-based ensemble models, 3/27
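For a sense of what that kind of model looks like, here is a minimal sketch (my own, not the code from the Keras post) of a transformer-style classifier for tabular data: each categorical column becomes an embedded "token", the tokens go through self-attention blocks, and the result is concatenated with the numerical columns for a small MLP head. The shared embedding table, the layer sizes, and the binary target are all simplifying assumptions on my part.

```python
from tensorflow import keras
from tensorflow.keras import layers


def build_tabular_transformer(num_categorical, vocab_size, num_numerical,
                              embed_dim=32, num_heads=4, num_blocks=2):
    """Toy transformer-style classifier for tabular data (illustrative sizes)."""
    cat_in = keras.Input(shape=(num_categorical,), dtype="int64", name="categorical")
    num_in = keras.Input(shape=(num_numerical,), name="numerical")

    # One embedding vector per categorical "token"; assumes the integer codes of
    # all columns share one vocabulary (offset per column so they don't collide).
    x = layers.Embedding(input_dim=vocab_size, output_dim=embed_dim)(cat_in)

    for _ in range(num_blocks):
        # Self-attention across the categorical tokens, with residual connections,
        # layer normalization, and a position-wise feed-forward layer.
        attn = layers.MultiHeadAttention(num_heads=num_heads, key_dim=embed_dim)(x, x)
        x = layers.LayerNormalization()(layers.Add()([x, attn]))
        ff = layers.Dense(embed_dim, activation="gelu")(x)
        x = layers.LayerNormalization()(layers.Add()([x, ff]))

    # Flatten the contextualized tokens and append the raw numerical features.
    x = layers.Concatenate()([layers.Flatten()(x), num_in])
    x = layers.Dense(64, activation="relu")(x)
    out = layers.Dense(1, activation="sigmoid")(x)  # binary target assumed

    model = keras.Model(inputs=[cat_in, num_in], outputs=out)
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    return model
```

A per-column embedding table would be more faithful to how these models are usually described; the shared, offset-encoded vocabulary just keeps the sketch short.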
The current issue of @Nature has three articles that show how to make those error-correcting mechanisms achieve over 99% accuracy, which would make silicon-based qubits a viable option for large-scale quantum computing devices.
I've worked for 4 different tech companies in various Data Science roles. For my day job I have never ever had to deal with text, audio, video, or image data. 1/4
Based on the informal conversations I've had with other data scientists, this seems to be the case for the vast majority of them. 2/4
Almost a year later this remains largely true: for the *core job* DS/ML work, I have still not used any of the aforementioned data types. However, for work-related/affiliated *research* I have worked with lots of text data. 3/4
2/ A year ago I was approached with a unique and exciting opportunity: I was asked to help out with setting up the Kaggle Open Vaccine competition, where the goal would be to come up with a Machine Learning model for the stability of RNA molecules.
3/ This is of pressing importance for the development of mRNA vaccines. The task seemed a bit daunting, since I had no prior experience with RNA or Biophysics, but I wanted to help out any way I could.
One of the unfortunate consequences of Kaggle's inability to host tabular data competitions any more will be that the fine art of feature engineering will slowly fade away. Feature engineering is rarely, if ever, covered in ML courses and textbooks. 1/
There is very little formal research on it, especially on how to come up with domain-specific nontrivial features. These features are often far more important for all aspects of the modeling pipeline than improved algorithms. 2/
I certainly would have never realized any of this were it not for tabular Kaggle competitions. There, over many years, a community treasure trove of incredible tricks and insights accumulated, most of them unique. 3/
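To give a concrete flavor of the kind of tricks I mean, here is a small illustration with made-up column names (a toy sketch, not taken from any particular competition): frequency encoding, group-level aggregates, and simple ratio features, the sort of hand-crafted signals that tabular Kaggle solutions tend to lean on.

```python
import pandas as pd


def add_basic_features(df: pd.DataFrame) -> pd.DataFrame:
    """Add a few typical hand-crafted features; column names are hypothetical."""
    out = df.copy()

    # Frequency encoding: how common each category is in the whole dataset.
    out["merchant_freq"] = out["merchant_id"].map(out["merchant_id"].value_counts())

    # Group-level aggregates: compare each row against its group's statistics.
    grp = out.groupby("customer_id")["amount"]
    out["amount_mean_by_customer"] = grp.transform("mean")
    out["amount_dev_from_customer_mean"] = out["amount"] - out["amount_mean_by_customer"]

    # Ratio / interaction features between raw columns.
    out["amount_per_item"] = out["amount"] / out["num_items"].clip(lower=1)

    return out
```

None of these require fancy algorithms; their value comes entirely from domain knowledge about which groupings and ratios are meaningful for the problem at hand.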