#autodiff

2025-04-02

Package: https://www.kernel-operations.io/rkeops/ | useR! video: www.youtube.com/watch?v=5DDd... #rstats #kernels #gpu #autodiff


2025-04-02

The worst part about having a permanent position is that you see all these interesting other jobs popping up yet can't apply - this one in #autodiff: jobs.inria.fr/public/classic/e

I tried using analytical normal vector calculation on a fractal. It turns out that even without any antialiasing, the image becomes much less noisy.

#sdf #indiedev #compiler #autodiff
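
A minimal sketch of the idea (mine, not the original project, which uses its own compiler; `sdf_sphere` is just a stand-in for the fractal distance estimator): the analytical normal of a signed distance field is its normalized gradient, which autodiff gives exactly instead of approximating by finite differences.

```python
import jax
import jax.numpy as jnp

def sdf_sphere(p, radius=1.0):
    # Stand-in distance estimator: signed distance to a sphere at the origin.
    return jnp.linalg.norm(p) - radius

def analytical_normal(sdf, p):
    # The surface normal is the normalized gradient of the distance field,
    # computed exactly by autodiff instead of by finite differences.
    g = jax.grad(sdf)(p)
    return g / jnp.linalg.norm(g)

p = jnp.array([0.6, 0.8, 0.0])
print(analytical_normal(sdf_sphere, p))  # ≈ [0.6, 0.8, 0.0] on a unit sphere
```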

Got control flow working; now I can use the fractals from the distance estimator compendium:

jbaker.graphics/writings/DEC.h

#fractal #sdf #autodiff #render #rendering

Using the gradient length of the signed distance field, we can see where the function is non-Euclidean: an exact Euclidean SDF has unit gradient length everywhere, so deviations from 1 mark regions where the distance estimate is distorted.

#sdf #autodiff #render
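
A small follow-up sketch under the same assumptions as the example above (JAX, toy field in place of the fractal estimator): computing |∇f| with autodiff and checking how far it strays from 1.

```python
import jax
import jax.numpy as jnp

def gradient_length(sdf, p):
    # |∇f(p)| equals 1 for a true Euclidean SDF (the eikonal equation);
    # values far from 1 flag regions where the field is distorted.
    return jnp.linalg.norm(jax.grad(sdf)(p))

def squished_sphere(p):
    # Deliberately non-Euclidean "distance": anisotropic scaling breaks |∇f| = 1.
    return jnp.linalg.norm(p * jnp.array([1.0, 2.0, 1.0])) - 1.0

print(gradient_length(squished_sphere, jnp.array([0.0, 0.7, 0.0])))  # ≈ 2.0, not 1.0
```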

2023-12-10

Beyond Backpropagation - Higher Order, Forward and Reverse-mode Automatic Differentiation for Tensorken
open.substack.com/pub/getcode/

#Rust #rustlang #autodiff

2023-05-07

I am personally fascinated by "adjoint optimization in CFD", e.g. people using autodiff on entire physical simulations to improve shapes of objects.

At the same time, it seems to be used comparatively rarely in industry; most design processes only run forward simulations.

Does anyone here have an idea *why* that is? Boost for reach please...

Seth Axen 🪓 :julia: sethaxen@bayes.club
2023-04-21

Recently I've gotten really excited about #Enzyme as the future of #autodiff in #JuliaLang, in particular because it supports more language features than #Zygote (e.g. mutation, fast control flow, and preserving structural sparsity). I've started getting acquainted with its rules system, and I have some first impressions by comparison to #ChainRules. 🧵

2022-12-06

#chatgpt explaining #autodiff in various ways. I am very impressed by this #llm: not just its knowledge, but also how it works subtle humor into its explanations.

Fabian Pedregosa fabian@sigmoid.social
2022-11-29

On Thursday I'll be at #NeurIPS2022 presenting a paper on our new system for #autodiff of implicit functions. A 🧵on the paper (arxiv.org/abs/2105.15183)
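
A rough sketch of the underlying idea (my own toy example, not the paper's library or notation): if x*(θ) is defined implicitly by F(x*, θ) = 0, the implicit function theorem gives its derivative from the partials of F at the solution, so the solver itself never has to be differentiated.

```python
import jax

# Toy optimality condition: x*(theta) solves F(x, theta) = 0.
def F(x, theta):
    return 2.0 * (x - theta) + 4.0 * x**3

def solve(theta, x=0.0, steps=200, lr=0.1):
    # Any black-box solver is fine; its iterations are never differentiated.
    for _ in range(steps):
        x = x - lr * F(x, theta)
    return x

def dx_dtheta(theta):
    # Implicit function theorem: dx*/dtheta = -(dF/dx)^{-1} dF/dtheta at x*.
    x_star = solve(theta)
    dF_dx = jax.grad(F, argnums=0)(x_star, theta)
    dF_dtheta = jax.grad(F, argnums=1)(x_star, theta)
    return -dF_dtheta / dF_dx

print(dx_dtheta(0.5))
```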

2022-11-27

What happens when you use #autodiff and let your nonsmooth iterative algorithm run to convergence?

With J. Bolte & E. Pauwels, we show that under a contraction assumption, the derivatives of the algorithm converge linearly!
Preprint: arxiv.org/abs/2206.00457

I will present this work this week at #NEURIPS2022
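
A toy illustration of that setting (my construction, not the paper's example): differentiating the unrolled iterates of a contracting fixed-point map, and watching the derivatives approach the derivative of the limit.

```python
import jax
import jax.numpy as jnp

def T(x, theta):
    # Contracting fixed-point map: |dT/dx| = |0.5 sin(x)| <= 0.5 < 1.
    return 0.5 * jnp.cos(x) + theta

def x_after_k_steps(theta, k):
    # Differentiating this unrolls the iteration:
    #   dx_{k+1}/dθ = (∂T/∂x) dx_k/dθ + ∂T/∂θ,
    # a linear recursion that converges linearly under the contraction.
    x = 0.0
    for _ in range(k):
        x = T(x, theta)
    return x

for k in (1, 5, 20, 50):
    print(k, jax.grad(x_after_k_steps)(0.3, k))
# Converges to the fixed point's derivative 1 / (1 + 0.5 * sin(x*)).
```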

Fabian Pedregosa fabian@sigmoid.social
2022-11-27

Next week I'll be at #NeurIPS2022 presenting a couple of papers. The first one is on #autodiff through #optimization (aka #unrolling) and its bizarre convergence properties. A 🧵 on the paper (arxiv.org/pdf/2209.13271.pdf) (1/9)

Stijn De Baerdemacker cortogantese
2022-11-20

I know it is "old", but I think I just fell in love with automatic differentiation

Huge potential (and power) to speed things up. Also: it is the cute math that stole my ❤️ (as always)

source:
twitter.com/Michielstock/statu
(in Dutch)

Need to dig in further, but insights/thoughts/pointers are strongly appreciated!

2022-11-14

Technical Q. Anyone know how to do recursive binary checkpointing ("treeverse") over a number of steps that isn't determined until runtime? E.g. for an adaptive ODE solver.

Classically, the number of steps is assumed to be known in advance, I think.

#autodiff
#machinelearning
#honestly_I_have_no_idea_what_hashtag_to_use_for_obscure_technical_questions
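
For reference, a sketch of the classical scheme when the step count is known in advance; the question above is exactly how to do this when the count is only discovered at runtime. `step` and `adjoint_step` are hypothetical stand-ins for one forward solver step and its reverse/adjoint pass.

```python
def reverse_sweep(state, lo, hi, step, adjoint_step, cotangent):
    """Recursive binary checkpointing ("treeverse"-style halving) over steps
    lo..hi-1, given the forward state at step lo. Memory is O(log n) states,
    at the cost of O(n log n) recomputed forward steps."""
    if hi - lo == 1:
        # Base case: a single step, reverse it directly.
        return adjoint_step(state, cotangent)
    mid = (lo + hi) // 2
    # Recompute forward to the midpoint and keep that state as a checkpoint.
    mid_state = state
    for _ in range(lo, mid):
        mid_state = step(mid_state)
    # Reverse the second half first, then the first half from the original state.
    cotangent = reverse_sweep(mid_state, mid, hi, step, adjoint_step, cotangent)
    return reverse_sweep(state, lo, mid, step, adjoint_step, cotangent)
```

Here `adjoint_step(state, cotangent)` is assumed to recompute (or have cached) the single forward step starting from `state` and return the pulled-back cotangent.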

Seth Axen 🪓 :julia: sethaxen@bayes.club
2022-11-11
Seth Axen 🪓 :julia: sethaxen@fosstodon.org
2022-11-08

With the core techniques for deriving #AutoDiff rules in that page, we can work out rules for complex functions like matrix factorizations. See for example this blog post on deriving rules for the LU decomposition: sethaxen.com/blog/2021/02/diff

Seth Axen 🪓 :julia: sethaxen@fosstodon.org
2022-11-08

In my opinion*, this page from the ChainRules docs is the best intro to working out automatic differentiation rules: juliadiff.org/ChainRulesCore.j

* disclaimer: I wrote it with lots of community input
#AutoDiff #JuliaLang #calculus #gradient
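
A rough analogue of what such a hand-written rule looks like, sketched in JAX rather than ChainRules (which is Julia); the function and its rule here are my own example. The point is the same: supply the primal plus its pullback (vector-Jacobian product) instead of letting the AD system trace through the computation.

```python
import jax
import jax.numpy as jnp

@jax.custom_vjp
def logdet(A):
    # Primal: log|det A| via a numerically stable slogdet.
    return jnp.linalg.slogdet(A)[1]

def logdet_fwd(A):
    # Forward pass saves whatever the pullback needs (here just A).
    return logdet(A), A

def logdet_bwd(A, cotangent):
    # Hand-derived rule: d logdet(A) = tr(A^{-1} dA), so the pullback of a
    # scalar cotangent is cotangent * A^{-T}.
    return (cotangent * jnp.linalg.inv(A).T,)

logdet.defvjp(logdet_fwd, logdet_bwd)

A = jnp.array([[2.0, 0.5], [0.1, 3.0]])
print(jax.grad(logdet)(A))   # uses the hand-written rule
print(jnp.linalg.inv(A).T)   # analytical gradient, should match
```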

Seth Axen 🪓 :julia: sethaxen@fosstodon.org
2022-11-01

I just migrated from @sethaxen@mastodon.social to this new account at fosstodon.org, so time for a reintroduction!

I'm a #MachineLearning engineer with a focus on probabilistic programming (#probprog) at @unituebingen, where I help scientists use ML for their research. In the office and out, one of my main passions is #FOSS, and I work on a number of #opensource packages, mostly in #JuliaLang :julia: with a focus on #probprog, #manifolds, and #autodiff.

#introduction

Seth Axen 🪓 sethaxen
2022-11-01

@johnryan Yeah I do, and it's great that we can use arbitrary Julia code within our models. This is because most of the language is differentiable and code is composable, which is not the case for most PPLs.

For research, Julia could come in handy for writing and transforming custom kernels without fussing with CUDA, as some posts in that thread note, but I have no experience with this.
