Machine learning with hard constraints: Neural Differential-Algebraic Equations (DAEs) as a general formalism


We recently released a new manuscript, Semi-Explicit Neural DAEs: Learning Long-Horizon Dynamical Systems with Algebraic Constraints, where we showed a way to develop neural networks in which an arbitrary constraint function can be imposed directly throughout the evolution equation to near floating point accuracy. However, in true academic form the paper gets straight to the point about the architecture. Here I want to elaborate on the mathematical structures that surround the object, particularly the differential-algebraic equation (DAE): how its various formulations lead to the various architectures (such as stabilized neural ODEs), along with the related architectures which haven't had a paper yet, how you'd build them, and in what circumstances they would make sense.
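To ground the formalism before the post itself: a semi-explicit DAE can be written in mass-matrix form, where zero rows of the mass matrix encode the algebraic constraints. A minimal sketch in DifferentialEquations.jl, using the classic Robertson chemistry problem (my own illustrative example, not the neural architecture from the paper):

```julia
using OrdinaryDiffEq, LinearAlgebra

# Semi-explicit DAE in mass-matrix form M*u' = f(u, p, t):
# zero rows of M turn those equations into algebraic constraints.
# Here the third equation enforces conservation of mass exactly.
function rober_dae!(du, u, p, t)
    y1, y2, y3 = u
    du[1] = -0.04y1 + 1e4 * y2 * y3
    du[2] = 0.04y1 - 1e4 * y2 * y3 - 3e7 * y2^2
    du[3] = y1 + y2 + y3 - 1.0   # algebraic constraint
end

M = Diagonal([1.0, 1.0, 0.0])    # third row is algebraic
f = ODEFunction(rober_dae!, mass_matrix = M)
prob = ODEProblem(f, [1.0, 0.0, 0.0], (0.0, 1e5))
sol = solve(prob, Rodas5P())     # a mass-matrix-capable stiff solver
```

The paper's contribution is, roughly, putting a neural network in the differential part of such a system while keeping the constraint rows hard.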

READ MORE

How chaotic is chaos? How some AI for Science / SciML papers are overstating accuracy claims


Just how chaotic are chaotic systems? Many of you may have heard of "the butterfly effect" but don't quite know the mathematics behind such systems. What I want to demonstrate is the "sensitive dependence on initial conditions" property of chaotic systems and just how sensitive these systems are. The reason this has come up is that I have seen some AI papers claiming to be able to predict the time series of a chaotic system (many more can be found online too; I am just highlighting a few random ones). What I want to bring to the forefront is an examination of what is really being claimed: just how hard is it to actually forecast a chaotic system? And if they aren't doing that, what have they done instead?

Quick Understanding of Chaos: Sensitive Dependence and the Shadowing Lemma

First of … READ MORE
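The "sensitive dependence" property described above can be demonstrated in a few lines. A minimal sketch with the Lorenz system (my own illustration, not code from the post):

```julia
using OrdinaryDiffEq

# Lorenz system: the canonical chaotic attractor
function lorenz!(du, u, p, t)
    du[1] = 10.0 * (u[2] - u[1])
    du[2] = u[1] * (28.0 - u[3]) - u[2]
    du[3] = u[1] * u[2] - (8 / 3) * u[3]
end

u0 = [1.0, 0.0, 0.0]
prob  = ODEProblem(lorenz!, u0, (0.0, 50.0))
prob2 = ODEProblem(lorenz!, u0 .+ 1e-10, (0.0, 50.0))  # perturb by 1e-10

sol  = solve(prob,  Vern9(), abstol = 1e-12, reltol = 1e-12)
sol2 = solve(prob2, Vern9(), abstol = 1e-12, reltol = 1e-12)

# Despite the tiny perturbation and tight tolerances, the two
# trajectories are fully decorrelated well before t = 50.
@show sol(50.0) .- sol2(50.0)
```

A perturbation at the tenth decimal place grows to an O(1) difference: this is the scale of difficulty any claim of long-horizon chaotic forecasting has to confront.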

A Hands on Introduction to Applied Scientific Machine Learning / Physics-Informed Learning


Presented at JuliaEO25

This is a hands-on introduction to Scientific Machine Learning that does not assume a background in machine learning. We start from scratch, showing the mathematical basis of "what is a neural network?", all the way up through adding physical intuition to the neural network and using it for applications ranging from solving problems in epidemic outbreaks to improving sensor tracking of Formula 1 cars.

Open Source Component-Based Modeling with ModelingToolkit


Component-based modeling systems such as Simulink and Dymola allow for building scientific models in a way that can be composed. For example, Bob can build a model of an engine, and Alice can build a model of a drive shaft, and you can then connect the two models and have a model of a car. These kinds of tools are used all throughout industrial modeling and simulation in order to allow for "separation of concerns", allowing experts to engineer their domain and compose the final digital twins with reusable scientific modules. But what about open source? In this talk we will introduce ModelingToolkit, an open source component-based modeling framework that allows for composing pre-built models and scales to large high-fidelity digital twins.

PyData is … READ MORE

The Numerical Analysis of Differentiable Simulation: Automatic Differentiation Can Be Incorrect


ISCL Seminar Series

The Numerical Analysis of Differentiable Simulation: How Automatic Differentiation of Physics Can Give Incorrect Derivatives

Scientific machine learning (SciML) relies heavily on automatic differentiation (AD), the process of constructing gradients of programs which integrate machine learning into mechanistic models for the purpose of gradient-based optimization. While these differentiable programming approaches pitch an idea of "simply put the simulator into a loss function and use AD", it turns out there are a lot more subtle details to consider in practice. In this talk we will dive into the numerical analysis of differentiable simulation and ask the question: how numerically stable and robust is AD? We will use examples from the Python-based Jax (diffrax) and PyTorch (torchdiffeq) libraries in order to demonstrate how canonical … READ MORE

JuliaSim: Building a Product which improves Open Source Sustainability


January 26 2025 in Differential Equations, HPC, Julia, Scientific ML | Tags: | Author: Christopher Rackauckas

How do you build products that support open source communities? In this non-technical talk with OpenTeams I discuss how the MIT Julia Lab, PumasAI, and JuliaHub have all been essential pillars of the Julia open source community in its goal of achieving sustainable open science. If you've ever been curious about the difference between the Julia Lab and JuliaHub, the evolution of these groups, and what kinds of different contributions they make to the open source community, in this talk I go through as many details as I could!

Differences Between Methods for Solving Stiff ODEs


April 6 2024 in Differential Equations, Mathematics | Tags: | Author: Christopher Rackauckas

I found these notes from August 2018 and thought they might be useful so I am posting them verbatim.

A stiff ordinary differential equation is a difficult problem to integrate. However, many of the ODE solver suites offer quite a few different choices for this kind of problem; DifferentialEquations.jl offers almost 200 different choices, for example. In this article we will dig into what the differences between these integrators really are so that you can more easily find which one will be most efficient for your problem.

Quick Overview (tl;dr)

  1. BDF, Rosenbrock, and ESDIRK methods are the standard choices
  2. For small equations, Rosenbrock methods have performance advantages
  3. For very stiff systems, Rosenbrock and Rosenbrock-W methods do not require convergence of Newton's method and thus can take larger steps, being more efficient
  4. BDF integrators are only L-stable (and A-stable) to order 2, so if the problem is … READ MORE
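The tl;dr above translates directly into solver choices in DifferentialEquations.jl. A minimal sketch trying both families on a stiff problem (the solver names are my illustrative picks, not a recommendation from the notes):

```julia
using OrdinaryDiffEq

# Stiff Van der Pol oscillator (stiffness parameter 1e6)
function vdp!(du, u, p, t)
    du[1] = u[2]
    du[2] = 1e6 * ((1 - u[1]^2) * u[2] - u[1])
end

prob = ODEProblem(vdp!, [2.0, 0.0], (0.0, 6.3))

# Rosenbrock method: no Newton iteration to converge,
# often the winner on small systems (points 2 and 3)
sol_ros = solve(prob, Rodas5P())

# BDF method: the classic choice for large systems,
# but only L-stable up to order 2 (point 4)
sol_bdf = solve(prob, QNDF())
```

Swapping the algorithm argument is all it takes to benchmark the families against each other on your own problem.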

Symbolic-Numerics: how compiler smarts can help improve the performance of numerical methods (nonlinear solvers in Julia)


Many problems can be reduced down to solving f(x)=0, maybe even more than you think! Solving a stiff differential equation? Finding out where the ball hits the ground? Solving an inverse problem to find the parameters to fit a model? In this talk we'll showcase how SciML's NonlinearSolve.jl is a general system for solving nonlinear equations and demonstrate its ability to efficiently handle these kinds of problems with high stability and performance. We will focus on how compilers are being integrated into the numerical stack so that many of the things that were manual before, such as defining sparsity patterns, Jacobians, and adjoints, are all automated out-of-the-box, making it greatly outperform purely numerical codes like SciPy or NLsolve.jl.
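As a small taste of the f(x)=0 interface described above (a minimal sketch; the talk covers much more, including the compiler-driven automation):

```julia
using NonlinearSolve

# Solve f(u, p) = 0 for f(u, p) = u^2 - p, i.e. find sqrt(p)
f(u, p) = u .^ 2 .- p
prob = NonlinearProblem(f, [1.0], 2.0)  # initial guess [1.0], parameter p = 2.0
sol = solve(prob, NewtonRaphson())
sol.u  # ≈ [1.4142...], with the Jacobian derived automatically
```

The same `NonlinearProblem`/`solve` pattern scales from this scalar toy up to the large sparse systems inside stiff ODE solvers.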

PyData Global 2023

Semantic Versioning (Semver) is flawed, and Downgrade CI is required to fix it


Semantic versioning is great. If you don't know what it is, it's just a versioning scheme for software that goes MAJOR.MINOR.PATCH, where

  1. MAJOR version when you make incompatible API changes
  2. MINOR version when you add functionality in a backward compatible manner
  3. PATCH version when you make backward compatible bug fixes

That's all it is, but it's a pretty good system. If you see that someone has updated their package from v3.2.0 to v3.2.1, then you know you can just take that update: it's just a patch, so it won't break your code. You can easily accept patch updates. Meanwhile, if you see they released v3.3.0, then you know that some new features were added, but it's still safe for you to update. This lets you declare compatibility with v3.3.0, so that if a different package requires it, great, you can both use it! … READ MORE
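The three rules above map directly onto version comparisons. For instance, Julia's built-in `VersionNumber` literals encode exactly this MAJOR.MINOR.PATCH scheme (my own illustration, not from the post):

```julia
current = v"3.2.0"

v"3.2.1" > current               # true: a patch release, safe to take
v"3.3.0" > current               # true: new features, still backward compatible
v"3.3.0".major == current.major  # true: same major, same public API
v"4.0.0".major == current.major  # false: a major bump may break your code
```

It is precisely the gap between what these comparisons promise and what packages actually deliver that the post's proposed Downgrade CI is meant to close.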

ChatGPT performs better on Julia than Python (and R) for Large Language Model (LLM) Code Generation. Why?


Machine learning is all about examples. The more data you have, the better it should perform, right? With the rise of ChatGPT and Large Language Models (LLMs) as code-helping tools, it was thus a natural assumption that the most popular languages, like Python, would be the best for LLMs. But because of the increased productivity, I tend to use a lot of Julia, a language with an estimated user base of around a million programmers. For this reason, people have often asked me how it fares with ChatGPT, GitHub Copilot, etc., and so I checked out those pieces and … was stunned. It's really good. It seemed better than Python, actually?

The data is in: Julia does well with ChatGPT

This question was recently put to the test by a researcher named Alessio Buscemi in A Comparative Study … READ MORE