Lead dev of @sciml_org, VP of Modeling and Simulation @JuliaHubInc, Director of Scientific Research @pumas_ai, and Research Staff @mit_csail. #julialang #sciml
nature.com/articles/s4159…
In this paper we detail how #julialang's core compute model yields faster code, with a detailed calculation of how #python interpreter overhead and kernel launch costs affect simulation performance. It's pretty cool that one can calculate the expected 100x difference with pen and paper.
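A back-of-the-envelope version of that overhead calculation. The numbers below are illustrative assumptions of mine, not the paper's measured values: if each interpreted op or kernel launch costs ~10 µs of dispatch but only ~0.1 µs of useful work, overhead dominates by roughly 100x.

```python
# Toy model of interpreter/kernel-launch overhead vs. compiled code.
# All numbers here are illustrative assumptions, not measured values.
launch_overhead_us = 10.0   # assumed cost to dispatch one op/kernel
work_per_op_us = 0.1        # assumed useful compute per op (small ODE RHS)
n_ops = 1_000_000           # ops in one simulation

compiled_time = n_ops * work_per_op_us                      # no per-op dispatch
interpreted_time = n_ops * (launch_overhead_us + work_per_op_us)

slowdown = interpreted_time / compiled_time
print(f"expected slowdown: {slowdown:.0f}x")  # 101x with these numbers
```

The point is that the slowdown is (overhead + work) / work, so shrinking per-op work (as in small ODE right-hand sides) makes the fixed dispatch cost the whole story.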
Mar 31, 2023 • 9 tweets • 5 min read
#sciml #machinelearning in chemical engineering using prior scientific knowledge of chemical processes? New paper: we dive deep into using universal differential equation hybrid models and see how well gray boxes can recover the dynamics. arxiv.org/abs/2303.13555 #julialang
For learning these cases, we used neural networks mixed with known physical dynamics, combined with orthogonal collocation on finite elements (OCFEM) to obtain a stable simulation and estimation process.
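A minimal sketch of the universal differential equation idea in Python/NumPy, not the paper's #julialang/OCFEM implementation: the right-hand side is known physics plus a small neural-network correction. The decay rate, network weights, and Euler stepping here are toy assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy single-hidden-layer network standing in for the unknown dynamics term.
W1, b1 = rng.normal(size=(8, 1)) * 0.1, np.zeros((8, 1))
W2, b2 = rng.normal(size=(1, 8)) * 0.1, np.zeros((1, 1))

def nn(u):
    h = np.tanh(W1 @ u + b1)
    return W2 @ h + b2

def rhs(u, k=0.5):
    # Universal differential equation: known physics (-k*u) + learned residual.
    return -k * u + nn(u)

# Forward-Euler integration just for the sketch (the paper uses OCFEM
# precisely because naive stepping like this can be unstable).
u, dt = np.array([[1.0]]), 0.01
for _ in range(500):
    u = u + dt * rhs(u)
print(u)
```

In the gray-box setting the weights would be trained so that the simulated trajectory matches data, with the known physics term constraining what the network has to learn.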
Oct 18, 2022 • 19 tweets • 8 min read
Differentiable programming (dP) is great: train neural networks to match anything w/ gradients! ODEs? Neural ODEs. Physics? Yes. Agent-Based models? Nope, not differentiable... or are they? Check out our new paper at NeurIPS on Stochastic dP!🧵
arxiv.org/abs/2210.08572
Problem: if you flip a coin with probability p of being heads, how do you write code that takes the derivative with respect to p? Of course that's not well-defined: the coin gives a 0 or 1, so it admits no "small changes". Is there a better definition?
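To make the problem concrete, here is the classic workaround, the score-function (REINFORCE) estimator — a standard baseline, not the stochastic differentiable programming method the paper introduces. For X ~ Bernoulli(p), d/dp E[f(X)] = E[f(X) · d/dp log P(X; p)], which is unbiased but often high-variance:

```python
import numpy as np

def score_function_grad(f, p, n=200_000, seed=0):
    """Unbiased Monte Carlo estimate of d/dp E[f(X)] for X ~ Bernoulli(p)."""
    rng = np.random.default_rng(seed)
    x = (rng.random(n) < p).astype(float)
    # Score: d/dp log P(x; p) = x/p - (1-x)/(1-p)
    score = x / p - (1 - x) / (1 - p)
    return np.mean(f(x) * score)

# Sanity check: E[X] = p, so the true derivative is 1 for any p.
est = score_function_grad(lambda x: x, p=0.3)
print(est)  # ≈ 1.0 up to Monte Carlo noise
```

Note that the pathwise derivative of any single coin flip is 0 almost everywhere, yet the derivative of the expectation is 1 — that gap between per-sample and in-expectation derivatives is exactly what a better definition has to close.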