Just 3 days ago, I had the pleasure of watching #rstudioconf2022 kick off.
I've been attending since 2018 and watching even longer than that.
And I was just a normal spectator in the audience until this happened.
@topepos and @juliasilge's keynote showcased the open source work their team has been doing to build #tidymodels, the best machine learning ecosystem in R.
And then they brought this slide up.
Max and Julia then talked about how community members have been expanding the ecosystem:
- textrecipes for text
- censored for survival modeling
- stacks for ensembles
And then they announced me and my work on Modeltime for Time Series!!!
I had no clue this was going to happen.
Just a spectator in the back.
My friends on either side of me went nuts. Hugs, high-fives, and all.
My students in my slack channel went even more nuts.
Throughout the rest of the week, I was on cloud nine.
My students who were at the conf introduced themselves.
Much of our discussion centered on Max & Julia's keynote and the exposure modeltime got.
And none of this would be possible without the support of one company: RStudio / Posit.
So, I'm honored to be part of something bigger than just a programming language.
And if you'd like to learn more about what I do, I'll share a few links.
The first is my modeltime package for #timeseries.
It's been a 2+ year passion project to build the premier time series forecasting system in R.
It now has multiple extensions: ensembles, resampling, deep learning, and more.
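If you're curious what that looks like in practice, here's a minimal sketch of the core workflow (mirroring the package's getting-started flow; the dataset and model choice are just for illustration):

```r
library(tidymodels)
library(modeltime)
library(timetk)

# Monthly sample series that ships with timetk
data <- m4_monthly %>% filter(id == "M750")

# Time-ordered train/test split (no shuffling)
splits <- initial_time_split(data, prop = 0.9)

# Fit an auto-ARIMA through the parsnip-style interface
model_fit_arima <- arima_reg() %>%
  set_engine("auto_arima") %>%
  fit(value ~ date, data = training(splits))

# Register the model, calibrate on the test set, and forecast
modeltime_table(model_fit_arima) %>%
  modeltime_calibrate(new_data = testing(splits)) %>%
  modeltime_forecast(new_data = testing(splits), actual_data = data)
```

Swap in any parsnip-compatible model and the table/calibrate/forecast steps stay the same.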
A group of 50 AI researchers (ByteDance, Alibaba, Tencent + universities) just dropped a 303-page field guide on code models + coding agents.
And the takeaways are not what most people assume.
Here are the highlights I’m thinking about (as someone who lives in Python + agents):
1) Small models can punch way above their weight
If you do RL the right way (RLVR / verifiable rewards), a smaller open model can close the gap with the giants on reasoning-style coding tasks. (A toy sketch of what a verifiable reward looks like follows this list.)
2) Python is weirdly hard for models
Mixing languages in pretraining helps… until it doesn’t. Python’s dynamic typing can create negative transfer vs. statically typed languages. Meanwhile pairs like Java↔C# or JS↔TS have strong “synergy.”
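To make "verifiable rewards" concrete, here's a toy sketch (my own illustration, not from the paper): the reward is simply whether generated code passes its unit tests, an automatically checkable pass/fail signal that RL can optimize against. The function name and harness are hypothetical.

```r
# Toy illustration of a verifiable reward for coding RL (my sketch, not the paper's):
# run candidate code plus its tests in a subprocess; reward 1 on pass, 0 on fail.
verifiable_reward <- function(candidate_code, test_code, timeout_sec = 10) {
  script <- tempfile(fileext = ".R")
  writeLines(c(candidate_code, test_code), script)

  # Rscript exits non-zero if the tests error out; timeout guards against hangs
  status <- suppressWarnings(
    system2("Rscript", shQuote(script), stdout = FALSE, stderr = FALSE,
            timeout = timeout_sec)
  )
  if (identical(status, 0L)) 1 else 0
}

# A correct candidate earns reward 1, a buggy one earns 0:
verifiable_reward("add <- function(a, b) a + b", "stopifnot(add(2, 3) == 5)")  # 1
verifiable_reward("add <- function(a, b) a - b", "stopifnot(add(2, 3) == 5)")  # 0
```

Because the reward is checkable by a program rather than a judge model, it's hard to game, which is what makes it work so well for RL on code.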