#automaticdifferentiation

2026-01-13

New publication: doi.org/10.1038/s41524-025-018

Our work on AD-DFPT, a unification of #automaticdifferentiation with linear response for #densityfunctionaltheory, is published in npj Computational Materials. We show examples for #property prediction, #uncertainty propagation, the design of #materials and #machinelearning of new #dft models. #condensedmatter #dftk

Dr. Chris Rackauckas :julia: @chrisrackauckas@fosstodon.org
2025-11-17

#SciML fact of the day: automatic differentiation fails to give the correct derivative on a lot of very simple functions 😱 😱 😱. #julialang #automaticdifferentiation

youtube.com/shorts/KTguZpL9Zz8
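
One classic trap of this kind (a minimal sketch of the failure mode, not necessarily the example from the video): a function that is mathematically smooth everywhere but special-cases a point with a branch, so AD differentiates only the branch that was taken.

```julia
using ForwardDiff

# Mathematically this is x^2 everywhere, but the special-cased branch at x = 1
# returns a constant, so AD differentiates the constant instead of x^2.
f(x) = x == 1.0 ? 1.0 : x^2

ForwardDiff.derivative(f, 1.0)   # 0.0 -- the true derivative is 2
ForwardDiff.derivative(f, 2.0)   # 4.0 -- away from the branch all is fine
```

The function values are unchanged, yet the derivative at the branch point is silently wrong.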

2025-09-10

New preprint: arxiv.org/abs/2509.07785

We present an implementation of AD-DFPT, a unification of #automaticdifferentiation with classical #dfpt response techniques for #densityfunctionaltheory (#dft). We demonstrate its use for #property prediction, #uncertainty propagation, the design of new #materials, as well as the #machinelearning of new #dft models.

#condensedmatter #planewave #response #physics #simulation #computation
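
A toy picture of what "differentiating a self-consistent solution" means (my own sketch, not DFTK's API; the preprint's point is precisely to couple AD to DFPT response equations rather than brute-forcing derivatives through the SCF loop):

```julia
using ForwardDiff

# Toy "self-consistent field": solve the fixed-point equation x = tanh(ε - x/2)
# for a perturbation strength ε, then ask AD for the response dx/dε of the
# converged solution by pushing dual numbers through the iteration.
function scf(ε; tol=1e-12, maxiter=1_000)
    x = zero(ε)
    for _ in 1:maxiter
        xnew = tanh(ε - x / 2)
        abs(xnew - x) < tol && return xnew
        x = xnew
    end
    return x
end

ForwardDiff.derivative(scf, 0.3)   # response of the converged x to ε
```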

Michael Herbst @herbst@social.epfl.ch
2025-08-27

This week the @MatMat group takes part in the #psik conference (psik2025.net/) at #epfl with plenty of cutting-edge talks on #materials #modeling and simulations of #condensedmatter.

My contribution was a short talk on #error quantification and propagation in #densityfunctionaltheory simulations, leveraging the built-in #automaticdifferentiation framework of the #dftk code for automatic gradient computation.

Slides: michael-herbst.com/talks/2025.
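
The gradient-based error propagation idea in one line: for a property f(p) with parameter covariance Σ, first-order propagation gives σ_f² = ∇f(p)ᵀ Σ ∇f(p). A toy sketch (the property function and numbers below are made up; this is not DFTK code):

```julia
using ForwardDiff, LinearAlgebra

# Stand-in for a simulated property depending on uncertain model parameters p.
property(p) = p[1]^2 * exp(-p[2])

p = [1.3, 0.7]                           # parameter values
Σ = [0.01 0.0; 0.0 0.04]                 # assumed parameter covariance

g  = ForwardDiff.gradient(property, p)   # sensitivities ∂property/∂pᵢ via AD
σ² = dot(g, Σ * g)                       # first-order propagated variance
```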

Michael Herbst @herbst@social.epfl.ch
2025-06-24

As part of the #cecam workshop on perspectives of the atomistic simulation environment (#ase), I delivered a talk on our #materials #modeling ecosystem juliamolsim.org, written in the #julialang programming language, and showed some examples: #automaticdifferentiation through the simulation pipeline, seamless #gpu usage, #error propagation and more.

Slides: michael-herbst.com/talks/2025.
#julialang demo: michael-herbst.com/talks/2025.

#dftk #densityfunctionaltheory #condensedmatter #planewave #simulation

2024-11-24

Have you ever thought 💡 of using JAX as a 🧮 #automaticdifferentiation engine in 💻 finite element simulations? Boost the performance 🐇 of computationally expensive hyperelastic material models with #jit in 🔍 FElupe! 🚀 🚀

github.com/adtzlr/felupe

#python #jax #finiteelementmethod #scientificcomputing #computationalmechanics #fea #fem #hyperelasticity

[Image: a magnifying glass (logo of FElupe)]
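
The core idea, sketched in Julia rather than FElupe's actual Python/JAX API (the energy function and constants below are made up for illustration): a hyperelastic material model is just a scalar strain-energy density, and AD turns it into the stress without hand-derived tensor calculus.

```julia
using ForwardDiff, LinearAlgebra

# Compressible neo-Hookean strain energy as a function of the deformation
# gradient F (material constants chosen arbitrarily for the sketch).
function ψ(F; μ=1.0, λ=10.0)
    J = det(F)
    C = F' * F
    return μ / 2 * (tr(C) - 3) - μ * log(J) + λ / 2 * log(J)^2
end

F = [1.10 0.05 0.00;
     0.00 0.95 0.00;
     0.00 0.00 1.02]               # some deformation gradient

P = ForwardDiff.gradient(ψ, F)     # first Piola-Kirchhoff stress P = ∂ψ/∂F
```
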
Dr. P. M. Secular @secular@mathstodon.xyz
2023-09-06

This paper from @jenseisert and colleagues sounds interesting!

"The incorporation of automatic differentiation in tensor networks algorithms has ultimately enabled a new, flexible way for variational simulation of ground states and excited states. In this work, we review the state of the art of the variational iPEPS framework. We present and explain the functioning of an efficient, comprehensive and general tensor network library for the simulation of infinite two-dimensional systems using iPEPS, with support for flexible unit cells and different lattice geometries."

scirate.com/arxiv/2308.12358

#quantum #TensorNetwork #computational #physics #AutomaticDifferentiation #iPEPS
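
The "AD enables variational simulation" point in caricature (a tiny dense toy, nothing iPEPS- or library-specific): minimise an energy expectation value by gradient descent, with the gradient supplied by reverse-mode AD.

```julia
using Zygote, LinearAlgebra

# Rayleigh quotient of a small "Hamiltonian"; its minimum over ψ is the ground
# state energy, and reverse-mode AD provides the gradient for the optimiser.
H = [1.0  0.3 0.0;
     0.3 -0.5 0.2;
     0.0  0.2 2.0]
energy(ψ) = dot(ψ, H * ψ) / dot(ψ, ψ)

function minimise(energy; n=3, steps=500, η=0.1)
    ψ = randn(n)
    for _ in 1:steps
        ψ -= η * Zygote.gradient(energy, ψ)[1]
    end
    return ψ
end

ψ = minimise(energy)
energy(ψ)            # ≈ eigmin(H)
```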

Virgile Andreani @Armavica@fosstodon.org
2023-07-14

I really enjoyed the talk by Manuel Drehwald at #RustSciComp23, who sketched an exciting future for #AutomaticDifferentiation in #Rust with #LLVM #Enzyme, which should be integrated directly into the compiler within a couple of months.

If I understood correctly, the idea is to differentiate code at the LLVM IR level, *after optimization* (and to do another pass of optimization after that). This can produce faster code than the AD engines that operate at the source code level.
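
Enzyme already has a Julia front-end (Enzyme.jl) built on the same LLVM-level machinery; a minimal sketch of what calling it looks like there (the Rust integration discussed in the talk was still work in progress):

```julia
using Enzyme

square(x) = x * x

# Reverse-mode derivative of a scalar function. Enzyme differentiates the
# optimized LLVM IR of `square`, not its surface-level Julia source.
autodiff(Reverse, square, Active, Active(3.0))   # derivative 2x = 6.0 at x = 3
```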

2023-05-15

#CFP for `Differentiable Almost Everything: Differentiable Relaxations, Algorithms, Operators, and Simulators`, a workshop at #ICML

differentiable.xyz/

twitter.com/FHKPetersen/status

Differentiable programming is a powerful tool, so I am quite interested in this workshop (especially as a user of #JuliaLang, which has fantastic #AD support).

#AutomaticDifferentiation #ML

Marco 🌳 Zocca @ocramz@sigmoid.social
2023-01-27

not to brag or anything but my ad-delcont library has been an inspiration to this :)

github.com/konn/ad-delcont-pri

this is a line of work that uses delimited continuations to implement reverse-mode #AutomaticDifferentiation, rather than reifying the program into a graph. As such, it enables a nice purely functional API, and this latest incarnation performs pretty well too

#machinelearning #functionalprogramming #haskell
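
A rough sketch of the underlying idea in Julia (explicit continuation-passing style with plain closures, not true delimited continuations, and not the ad-delcont API): every primitive first runs the rest of the program on its output and only afterwards pushes the accumulated adjoint back to its inputs, so no tape or graph is ever materialised.

```julia
# Each variable carries a value and an adjoint accumulator.
mutable struct Num
    val::Float64
    grad::Float64
end

# A primitive takes an explicit continuation `k` standing for "the rest of the
# program". After `k` returns, `out.grad` holds the full downstream adjoint,
# which is then propagated to the inputs -- reverse mode without a tape.
function mul(a::Num, b::Num, k)
    out = Num(a.val * b.val, 0.0)
    k(out)
    a.grad += out.grad * b.val
    b.grad += out.grad * a.val
    nothing
end

function add(a::Num, b::Num, k)
    out = Num(a.val + b.val, 0.0)
    k(out)
    a.grad += out.grad
    b.grad += out.grad
    nothing
end

function grad(f, x0)
    x = Num(x0, 0.0)
    f(x, out -> (out.grad = 1.0))   # seed the adjoint of the final output
    x.grad
end

# The program is written in CPS: each result is handed to its continuation.
# d/dx (x*x + x) at x = 3 is 2x + 1, so this returns 7.0.
grad(3.0) do x, k
    mul(x, x, xx -> add(xx, x, k))
end
```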

Stephen De Gabrielle @spdegabrielle@types.pl
2022-12-28

Understanding and Implementing Automatic Differentiation
2022-12-04 :: racket, math, machine-learning, projects, tutorials
By: Mike Delmonaco

quasarbright.github.io/blog/20

#Racket #RacketLang #RacketLanguage #AI #MachineLearning #AutomaticDifferentiation #tutorial

2022-12-23

Call for Talk Proposals for the Enzyme (#AutomaticDifferentiation in #LLVM) Conference.

enzyme.mit.edu/conference

#JuliaLang #AD

aspuru @aspuru
2022-12-01

Check out our @uoft work on inverse design for the Hückel method using automatic differentiation.

arxiv.org/abs/2211.16763

Led by Rodrigo Vargas and Kjell Jorner

2022-11-21

📺 I started a new video series on primitive rules for #automaticdifferentiation: youtube.com/watch?v=PwSaD50jTv
It starts with scalar rules, continues with vector/array rules, and finishes with some results from using the implicit function theorem.

Primitive rules build the basis for automatically differentiating through arbitrary computer programs.

New video to be released every three days :)
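
A scalar primitive rule in the ChainRules sense, as a minimal hedged example (my own toy function, not one from the videos): the pullback states how a cotangent of the output maps to cotangents of the inputs, and AD engines that consume ChainRules pick it up automatically.

```julia
using ChainRulesCore, Zygote

# A "black box" scalar primitive AD should not trace into, e.g. a foreign call.
myexpsq(x) = exp(x^2)

# Hand-written reverse-mode rule: return the primal value together with a
# pullback mapping the output cotangent ȳ to the input cotangents.
function ChainRulesCore.rrule(::typeof(myexpsq), x)
    y = myexpsq(x)
    myexpsq_pullback(ȳ) = (NoTangent(), 2x * y * ȳ)
    return y, myexpsq_pullback
end

Zygote.gradient(myexpsq, 1.0)   # (2e ≈ 5.44,) since d/dx exp(x^2) = 2x exp(x^2)
```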

2022-11-10

A consistently solid #JuliaLang YouTube Channel: youtube.com/c/MachineLearningS

It mainly covers some advanced topics in one of the strongest areas of #Julia, #AutomaticDifferentiation, especially when applied to the scientific computing domain. For example: youtu.be/e4O6Z9o_D0k

Most topics also have a video covering them using #Jax or one of the specialized #PyTorch or #TensorFlow extensions (e.g., TensorFlow Distributions).

2021-02-01

Found a bug in #zoomasm's 360 projection (the distance estimate scaling was wrong).

Trying to do the maths by hand is too hard, so I copied my #GLSL #DualNumber implementation (for #AutomaticDifferentiation) from my fragm-examples repository, minus the #CPreProcessor macro hell, plus some quaternion-to-rotation-matrix code ported from Python that I found online.

Now it looks OK-ish in the #equirectangular view; I need to render some tests at various orientations and inject spatial metadata for viewing in VLC to be more sure I got it right...
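
For context, the dual-number trick in a dozen lines of Julia rather than GLSL (a generic sketch, not the fragm-examples code): carry (value, derivative) pairs through every arithmetic operation.

```julia
# Forward-mode AD with dual numbers: v is the value, d the derivative with
# respect to the chosen input variable.
struct Dual
    v::Float64
    d::Float64
end
Base.:+(a::Dual, b::Dual) = Dual(a.v + b.v, a.d + b.d)
Base.:*(a::Dual, b::Dual) = Dual(a.v * b.v, a.d * b.v + a.v * b.d)
Base.sqrt(a::Dual) = Dual(sqrt(a.v), a.d / (2 * sqrt(a.v)))

# Derivative of f(x) = sqrt(x*x + x) at x = 4: seed the input with d = 1.
x = Dual(4.0, 1.0)
y = sqrt(x * x + x)   # y.v ≈ 4.472 (the value), y.d ≈ 1.006 (the derivative)
```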
