#GaussianProcesses

2024-05-01

@charleemos I have found both the #PyMC tutorials (pymc.io/projects/docs/en/lates) and the #Stan User's Guide (mc-stan.org/docs/stan-users-gu) on #GaussianProcesses good for getting your hands dirty. Seeing GPs in action and fiddling with hyperparameters was helpful for me to understand the mathematical underpinnings.
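In the spirit of fiddling with hyperparameters: here's a minimal NumPy sketch (my own toy example, not taken from either tutorial) that draws prior samples from a squared-exponential GP so you can see what the lengthscale does:

```python
import numpy as np

def rbf(x, ell, sigma_f=1.0):
    """Squared-exponential kernel matrix for 1-D inputs."""
    d = x[:, None] - x[None, :]
    return sigma_f**2 * np.exp(-0.5 * (d / ell) ** 2)

rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 100)

# Shorter lengthscales give wigglier prior functions.
draws = {}
for ell in (0.3, 1.0, 3.0):
    K = rbf(x, ell) + 1e-9 * np.eye(len(x))  # jitter for numerical stability
    draws[ell] = rng.multivariate_normal(np.zeros(len(x)), K, size=3)
```

Plotting `draws[ell].T` against `x` for each lengthscale makes the effect obvious; `sigma_f` just scales the amplitude.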

2024-03-21

Cordial congrats to Michel Talagrand on winning this year's Abel Prize, well deserved! His work on bounding #stochastic_process suprema is also of great value for #cosmology! Imagine if we lacked knowledge of bounds on #GaussianProcesses!

youtu.be/wDIqCN7E7VA?si=Jb68Qc

#AbelPrize #AbelPrize2024 #MichelTalagrand
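Context for the "bounds" remark (my paraphrase, stated from memory, so treat the constants as schematic): Talagrand's generic-chaining / majorizing-measures theorem characterizes the expected supremum of a centred Gaussian process (X_t)_{t∈T}, under the canonical metric d(s,t) = sqrt(E(X_s − X_t)²), up to universal constants c and C:

```latex
c\,\gamma_2(T,d) \;\le\; \mathbb{E}\sup_{t \in T} X_t \;\le\; C\,\gamma_2(T,d),
\qquad
\gamma_2(T,d) = \inf_{(\mathcal{A}_n)} \sup_{t \in T} \sum_{n \ge 0} 2^{n/2} \operatorname{diam} \mathcal{A}_n(t),
```

where the infimum runs over admissible sequences of partitions of T and A_n(t) denotes the cell of the n-th partition containing t.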

2023-06-26

Via Alexander Terenin: stochastic gradient descent can be used as an efficient approximate sampling algorithm for Gaussian processes. Looks super cool: arxiv.org/abs/2306.11589

#GaussianProcesses #Bayesian

A figure comparing several approximation methods for Gaussian process inference to the exact solution.

Legend: Comparison of stochastic gradient descent (SGD), conjugate gradients (CG) and SVGP (a variational approximation) for GP inference with a squared exponential kernel on 10k datapoints from sin(2x) + cos(5x) with observation noise N(0, 0.5). We draw 2000 function samples with all methods by running them for 10 minutes on an RTX 2070 GPU. Infill asymptotics considers x_i ∼ N(0, 1). A large number of points near zero results in a very ill-conditioned kernel matrix, preventing CG from converging. SGD converges in all of input space except at the edges of the data. SVGP can summarize the data with only 20 inducing points. Note that CG converges to the exact solution if one uses more compute, but produces significant errors if stopped too early, as occurs under the given compute budget. Large-domain asymptotics considers data on a regular grid with fixed spacing. This problem is better conditioned, allowing SGD and CG to recover the exact solution. However, 1024 inducing points are not enough for SVGP to summarize the data.
Jarvist Moore Frost
2023-06-08

Funded (£20,410 4-year tax-free stipend) PhD positions available with me at Imperial College London

Excitement, adventure and really wild things!

Next interview round closing Friday 14th July 2023.

More details and exemplar projects in this Google doc: docs.google.com/document/d/1wi

2023-02-26

@tylerjburch Yes, I hear you 😓

You've likely already fixed your installation and I'm not sure whether you're using #jupyter, but I found this guide really helpful:

pkseeg.com/post/jupyter-venv/

Now, I always warn people never to mess with the base installation of #python on a machine, but to use (virtual) environments instead.

Good luck with the #GaussianProcesses, I'm going through the #pymc tutorials for it right now 😎

2023-02-05

I'm eyeballs deep into understanding #GaussianProcesses (GPs). There are great resources out there, but I can thoroughly recommend this introductory paper on #Distill by Görtler et al. The interactive plots are a great aid to intuition: doi.org/10.23915/distill.00017

2023-01-25

Let's say I have samples of a bounded (but potentially noisy) function at some fixed interval of points. How would I go about determining the likelihood that my observed data was sampled from an *increasing* function with heteroscedastic noise? #machinelearning #statistics #GaussianProcesses

I have, once again, made the strange choice to write a blog post. This one is about Gaussian processes and, particularly, about what the Markov property looks like when you don't have a linear notion of time to help you define a past and a present.

Like all my GP posts, this one is wildly technical but with an aim towards being somewhat useful. The information here is hard to find unless you want to read a 400-page book translated from Russian.
dansblog.netlify.app/posts/202

#GaussianProcesses #machinelearning
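A tiny sanity check of the Markov property in the familiar 1-D time setting (my own toy, not from the post): for the exponential (Ornstein–Uhlenbeck / Matérn-1/2) kernel, conditioning on a middle observation screens the past from the future, while the squared-exponential kernel has no such screening:

```python
import numpy as np

def cond_cov(k, a, b, c):
    """Cov(f(a), f(c) | f(b)) for a zero-mean GP with covariance function k."""
    return k(a, c) - k(a, b) * k(b, c) / k(b, b)

exp_k = lambda s, t, ell=1.0: np.exp(-abs(s - t) / ell)             # Markov in 1-D
rbf_k = lambda s, t, ell=1.0: np.exp(-0.5 * ((s - t) / ell) ** 2)   # not Markov

markov_gap = cond_cov(exp_k, 0.0, 1.0, 2.5)  # ~0: f(1) screens f(0) from f(2.5)
rbf_gap = cond_cov(rbf_k, 0.0, 1.0, 2.5)     # clearly nonzero
```

The exponential case vanishes because k(a, c) = k(a, b) · k(b, c) whenever a < b < c, which is exactly the factorization that breaks down once "time" is replaced by a graph or a tree.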

ML ⇌ Science Colaboratory (mlcolab@fediscience.org)
2022-12-20

Watch our own @sethaxen summarize our recent #NeurIPS2022 workshop paper on modeling European #paleoclimate using #GaussianProcesses!

youtu.be/ZFiJHmZbpZA

@ml4science @unituebingen @sommer @alvaro

Seth Axen 🪓 :julia: (sethaxen@bayes.club)
2022-11-29

👋 This is my first time attending @NeuripsConf (virtually to reduce carbon emissions).

On Friday I'll join the workshop "Gaussian Processes, Spatiotemporal Modeling, and Decision-making Systems," where we have a paper, poster, and lightning talk on GPs for modeling #paleoclimate.

If you're attending and want to chat about #GaussianProcesses, probabilistic programming (#ProbProg), or @ArviZ, ping me!

#NeurIPS2022

Seth Axen 🪓 :julia: (sethaxen@bayes.club)
2022-11-17

Check out some results from one of our current projects! #Spatiotemporal modeling of European #paleoclimate using doubly sparse #GaussianProcesses

arxiv.org/abs/2211.08160

ML ⇌ Science Colaboratory (mlcolab@fediscience.org)
2022-11-17

Our preprint "Spatiotemporal modeling of European #paleoclimate using doubly sparse Gaussian processes" is now on #arXiv!

This is one of the outcomes of a cooperation we (@sethaxen, Alex Gessner, and Álvaro Tejero-Cantero) are currently running with @sommer and Nils Weitzel.

The paper, as well as a lightning talk and poster, were accepted to the #NeurIPS2022 workshop on #GaussianProcesses, #Spatiotemporal Modeling, and Decision-making Systems #GPSMDMS

arxiv.org/abs/2211.08160

Overview of data and consensus model presented in the paper. Multiple spatiotemporally gridded simulations are combined with reconstructions from fossilized pollen proxies to construct a consensus model. While the data are discrete in space and time, the consensus model is continuous in both space and time, and at any spatiotemporal point, the marginal posterior distribution of temperature can be queried.
Richard McElreath 🦔 (rlmcelreath@nerdculture.de)
2022-11-08

I am a harmless wandering anthropologist, bringing 20 hours of free #CausalInference and #BayesianStatistics instruction to your door. From foundations of inference through DAGs, #MultilevelModels & poststratified causal effects to #GaussianProcesses, Bayesian imputation & ODEs. Theatrical trailer below. Playlist: youtube.com/playlist?list=PLDc

I just fixed some typos in my blogpost on priors for #GaussianProcesses. The way you know it's my blog is that the guy who emailed me said "the equation after footnote 99 doesn't match how you used it after footnote 108".

#MachineLearning #statistics #bayesian #Stan

dansblog.netlify.app/posts/202
