This past weekend there was a major announcement by the @anacondainc team at #PyCon2022 - the launch of PyScript, an in-HTML interface for Python. This is something many of us in the Python community have wanted for a long time.
•Python in the browser: Enable drop-in content, external file hosting, and application hosting without relying on server-side configuration
2/6
•Python ecosystem: Run many popular packages of Python and the scientific stack (such as numpy, pandas, scikit-learn, and more)
•Python with JavaScript: Bi-directional communication between Python and JavaScript objects and namespaces
3/6
•Environment management: Allow users to define what packages and files to include for the page code to run
•Visual application development: Use readily available curated UI components, such as buttons, containers, text boxes, and more
4/6
•Flexible framework: Create and share new pluggable and extensible components directly in Python
5/6
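To make this concrete, here is a minimal "hello world" page in the style of the launch-era alpha release. The asset URLs and the `<py-script>` tag are as shown in the launch announcement, but they may well change as the project evolves, so treat this as a sketch rather than a stable API:

```html
<!DOCTYPE html>
<html>
  <head>
    <!-- Launch-era (alpha) PyScript assets; URLs may have changed since -->
    <link rel="stylesheet" href="https://pyscript.net/alpha/pyscript.css" />
    <script defer src="https://pyscript.net/alpha/pyscript.js"></script>
  </head>
  <body>
    <!-- Python runs directly in the page - no server-side setup needed -->
    <py-script> print("Hello from Python in the browser!") </py-script>
  </body>
</html>
```

Save this as a plain `.html` file and open it in a browser - no server required, which is exactly the "drop-in content" point above.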
"A fox knows many things, but a hedgehog knows one big thing." - Archilochus
This was a quote that Isaiah Berlin expanded into one of his essays, first published in 1953. The quote was used to classify writers and thinkers into two big categories - those who view the world
1/18
in terms of a single large idea (Plato, Pascal, Hegel, Ibsen), and those who draw from a wide variety of experiences (Aristotle, Shakespeare, Balzac, Joyce) and do not attempt to construct a single overarching worldview.
2/18
In Berlin’s subsequent telling, the essay was meant as a whimsical exploration of that classification rather than a rigorous treatment. Nonetheless, the idea behind it was taken up by many “thought leaders” and incorporated into *their* worldviews.
Here is a snapshot of a (small) subset of all of my Coding, Data Science, and Machine Learning books. This collection would get you close to 98%-99% of all the necessary core skills to be a good Data Scientist. 1/6
No, I have not read all of these books. I have picked up a lot of what’s covered in them from various, mostly online, resources. If you are very dedicated and motivated you could do the same, but having them all in easily accessible hard copy resources is a great advantage. 2/6
Full disclosure: I have gotten many of these books for free as review copies, and a few others have been written by friends and acquaintances. Nonetheless, I believe I am being fair when saying that these are all high-quality resources that would be worth your time and money. 3/6
#GTC22 is just around the corner, and it is a fully online and free event. There are numerous incredible and valuable sessions. If you are an academic, researcher, or an early-career ML/AI aspirant, the following sessions might be of particular interest to you. 🧵 👇 1/7
5 Steps to Building a Career in AI: You’ll hear from AI experts as they give you insight into their journeys and discuss the top five most practical steps you can take when beginning a career in artificial intelligence. 2/7
Fighting Diseases with High-resolution GPU-accelerated Molecular Dynamics Simulations. The rise of multi-GPU systems allows us to perform long timescale simulations either on supercomputers or in the cloud while enabling increased simulation accuracy. 3/7 nvda.ws/3ie4QYS
OK, here is my honest take on when to use which approach/technique with a given dataset. These are my rules of thumb, and the caveats could fill the entire internet. 1. Up to a few hundred datapoints, use stats 2. For a few hundred to a few thousand, use linear/logistic regression 1/
3. Between a few thousand and about 10,000 it's anyone's guess. Gradient boosters generally do well here, with other "classical" algorithms (SVM, for instance) sometimes shining. 2/
4. Many thousands to about a billion datapoints is where Gradient Boosted trees rule. If you need just one algorithm, go with this. You'll never go wrong. 3/
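The rules of thumb above can be sketched as a simple dispatch function. The exact cutoffs are my own illustrative interpretations of "a few hundred", "a few thousand", etc. - heuristics to be validated against your own data, not laws:

```python
def recommend_approach(n_samples: int) -> str:
    """Rough rule-of-thumb model choice by dataset size.

    Cutoffs are illustrative; always compare approaches on your own data.
    """
    if n_samples <= 300:
        return "classical statistics"          # small samples: tests, intervals
    elif n_samples <= 3_000:
        return "linear/logistic regression"    # simple, robust baselines
    elif n_samples <= 10_000:
        return "gradient boosting or SVM"      # the "anyone's guess" zone
    elif n_samples <= 1_000_000_000:
        return "gradient boosted trees"        # the workhorse regime
    else:
        return "outside these rules of thumb"  # beyond the ranges discussed
```

Note that the boundaries are fuzzy by design: in practice you would try the neighboring approach as well whenever your dataset sits near a cutoff.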
As anyone with even a cursory knowledge of AI history knows, there have been several AI Winters - periods of cooling interest (and drops in funding) in AI research. All of these came about after the realization that the then-dominant AI paradigms were somehow limited. 1/7
For at least a decade now we have been enjoying an unprecedented AI Springtime. A perfect storm of major advances in algorithms (deep learning), computational architecture (GPUs), and the availability of large, high-quality datasets has enabled the field to grow - exponentially! 2/7
However, in the real world there is no such thing as endless exponential growth. What may seem like an exponential curve inevitably turns out to be the fast-rising part of a logistic curve. It is hard to speculate when the fastest part of the growth will end, though. 3/7
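The exponential-vs-logistic point can be made concrete in a few lines of Python: early in its rise, a logistic curve is numerically almost identical to an exponential, and only past its midpoint does it bend toward its ceiling. The parameter values below are arbitrary, chosen purely for illustration:

```python
import math

def logistic(t: float, K: float = 1.0, r: float = 1.0, t0: float = 0.0) -> float:
    """Logistic curve with carrying capacity K, growth rate r, midpoint t0."""
    return K / (1.0 + math.exp(-r * (t - t0)))

def exponential(t: float, K: float = 1.0, r: float = 1.0, t0: float = 0.0) -> float:
    """The exponential that matches the logistic's early-time behavior."""
    return K * math.exp(r * (t - t0))

# Well before the midpoint, the two curves are nearly indistinguishable...
print(logistic(-5), exponential(-5))   # both ~0.0067
# ...but past the midpoint the logistic saturates while the exponential explodes.
print(logistic(5), exponential(5))     # ~0.993 vs ~148.4
```

This is exactly why "it looks exponential so far" tells you very little about where the inflection point is: the data from the early regime fits both curves equally well.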