#dimensionalityReduction

2025-05-26

We are excited to welcome Prof. Alejandro Rodriguez Garcia from the Abdus Salam International Centre for Theoretical Physics (ICTP) to Enabla! In his lecture, Alex continues the topic started by Marcello and explores the use of unsupervised machine learning techniques in many-body quantum systems, highlighting how dimensionality reduction can illuminate structure within complex data. Particular emphasis is placed on Principal Component Analysis (PCA) as a key method for maximizing variance while reducing dimensionality. This lecture sets the stage for future topics such as clustering and manifold learning.

🎥 Join us for this #OpenAccess lecture and take advantage of Enabla's unique features to ask questions directly to Prof. Rodriguez Garcia and engage in discussions with the community: enabla.com/pub/1112/about

Don't miss this opportunity to enhance your knowledge at the intersection of data mining and quantum physics!

#MachineLearning #UnsupervisedLearning #DimensionalityReduction #QuantumSystems #PCA #DataMining #OpenScience

The "Unsupervised Machine Learning and Dimensional Reduction in Many-Body Quantum Systems" lecture by Prof. Alejandro Rodriguez Garcia from ICTP is now in Open Access on Enabla! 1.1 hours, video, English.
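The variance-maximizing idea behind PCA mentioned above can be sketched in a few lines of NumPy (a toy illustration on invented data, not material from the lecture itself):

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy data: 200 samples of 5 correlated features generated from 2 latent factors.
X = rng.normal(size=(200, 2)) @ rng.normal(size=(2, 5)) + 0.05 * rng.normal(size=(200, 5))

# PCA via SVD of the mean-centered data matrix.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

# Variance captured by each principal component, in decreasing order.
explained = S**2 / (len(X) - 1)
ratio = explained / explained.sum()

# Project onto the first two components: the 2-D view with maximal variance.
Z = Xc @ Vt[:2].T
print(ratio[:2].sum())  # near 1: two components carry almost all the variance
```

Because the toy data has only two latent factors plus small noise, nearly all variance concentrates in the first two components.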
2025-01-19

Biologists, stop putting UMAP plots in your papers

#UMAP is a powerful tool for exploratory data analysis, but without a clear understanding of how it works, it can easily lead to confusion and misinterpretation.

simplystatistics.org/posts/202

#clustering #dimensionalityreduction #dataviz

Fabrizio Musacchio pixeltracker@sigmoid.social
2024-10-25

We just completed a new course on #DimensionalityReduction in #Neuroscience, and the full teaching material 🐍💻 is now freely available (CC BY 4.0 license):

๐ŸŒ fabriziomusacchio.com/blog/202

The course is designed to provide an introductory overview of the application of dimensionality reduction techniques for neuroscientists and data scientists alike, focusing on how to handle the increasingly high-dimensional datasets generated by modern neuroscience research.

#PythonTutorial #CompNeuro

PCA of neural data color-coded by time. The data is projected onto the first two principal components, revealing a time-dependent structure.
Harald Klinke HxxxKxxx@det.social
2024-01-25

By the way, this is the original article that presents t-SNE, published in November 2008:
jmlr.org/papers/volume9/vander
T-distributed stochastic neighbor embedding (t-SNE) is a statistical method for visualizing high-dimensional data in 2 or 3 dimensions.
#DataVisualization #tSNE #MachineLearning #DimensionalityReduction #DataScience #AI #DataAnalysis #DataAnalytics
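For readers who want to try the method described above, scikit-learn ships a t-SNE implementation; the three-cluster data below is invented for the demo (a sketch, assuming scikit-learn is available):

```python
import numpy as np
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)
# Three well-separated 10-D Gaussian clusters, 20 points each.
centers = rng.normal(scale=10.0, size=(3, 10))
X = np.vstack([c + rng.normal(size=(20, 10)) for c in centers])

# t-SNE embeds the 60 high-dimensional points into 2-D while trying to
# preserve local neighborhoods. perplexity ~ effective number of
# neighbors; it must be smaller than the number of samples.
emb = TSNE(n_components=2, perplexity=10, random_state=0).fit_transform(X)
print(emb.shape)  # (60, 2)
```

Note that t-SNE distances between clusters in the 2-D plot are not meaningful; only local neighborhood structure is preserved.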

2023-12-23

“Regardless of how we do dimensionality reduction, if the assumptions and biases underlying a method are not understood then it can be possible to see things in the data that aren’t there.”

#PCA #DimensionalityReduction #statistics

doi.org/10.1073/pnas.231916912

Fabrizio Musacchio pixeltracker@sigmoid.social
2023-12-21

Why the simplest explanation isn’t always the best: #DimensionalityReduction techniques such as #PCA can see structures that do not exist and miss structures that do exist.

โœ๏ธ Dyer & Kording ( @kordinglab) (2023)
๐ŸŒ pnas.org/doi/10.1073/pnas.2319

#DataAnalysis #CompNeuro

PCA can see structure that does not exist and miss structure that exists. All these datasets have the same principal components. (Left) If the data are Gaussian, then PCA is the ideal technique, extracting all the structure that is there. (Middle) When data are not Gaussian, PCA may “see” dimensions that do not exist, in this case stemming from there being multiple Gaussians. In such a case, relaxing the assumption of orthogonality could allow a model to extract the relevant aspects. (Right) When data are highly structured but not simple (also see ref. 3), PCA will not discover the relevant structure, but will see structure that is in a way not even there. Indeed, in this case of a single line graph, another technique, such as Isomap, would discover that the whole dinosaur is just a single line, or a 1D manifold embedded in 2D.
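The caption's point about the right-hand panel can be reproduced with a toy 1-D arc: PCA reports two comparably strong directions, while Isomap (scikit-learn's implementation, assumed available) recovers the single underlying coordinate:

```python
import numpy as np
from sklearn.manifold import Isomap

rng = np.random.default_rng(0)
# Data on a 1-D manifold (an arc of the unit circle) embedded in 2-D:
# structured, but neither Gaussian nor linear.
t = np.sort(rng.uniform(0, 3 * np.pi / 2, 300))
X = np.column_stack([np.cos(t), np.sin(t)])

# PCA: both components carry substantial variance, wrongly suggesting
# genuinely 2-D structure.
Xc = X - X.mean(axis=0)
var = np.linalg.svd(Xc, compute_uv=False) ** 2
print(var / var.sum())  # no single dominant direction

# Isomap with one component recovers the arc's 1-D parameterization:
# its coordinate tracks the arc-length parameter t almost perfectly.
z = Isomap(n_neighbors=10, n_components=1).fit_transform(X).ravel()
corr = abs(np.corrcoef(z, t)[0, 1])
print(corr)  # close to 1
```

This mirrors the dinosaur example: a nonlinear method that respects geodesic distances finds the 1-D manifold that PCA's orthogonal projections cannot.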
Victoria Stuart 🇨🇦 🏳️‍⚧️ persagen
2023-07-12

...
Addenda (cont'd)

Manifold hypothesis
en.wikipedia.org/wiki/Manifold

Many high-dimensional data sets (requiring many variables) in the real world actually lie along low-dimensional latent manifolds in that high-dimensional space (described by a smaller number of variables).

This principle may underpin the effectiveness of ML algorithms in describing high-dimensional data sets by considering a few common features.
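The simplest (linear) instance of this hypothesis is easy to demonstrate: data generated from two latent variables but recorded in fifty coordinates has rank two, which the singular values reveal immediately (a NumPy sketch with invented data; real manifolds are curved, so this is only the degenerate linear case):

```python
import numpy as np

rng = np.random.default_rng(1)
# 2 latent variables generate 50 observed ones: the data "lives" on a
# 2-D subspace of the 50-D ambient space.
latent = rng.normal(size=(500, 2))
A = rng.normal(size=(2, 50))
X = latent @ A  # 500 samples, 50 "measured" variables

s = np.linalg.svd(X - X.mean(axis=0), compute_uv=False)
print(s[:3])  # only two singular values are non-negligible
```

For curved manifolds the rank argument no longer applies globally, which is why nonlinear methods (Isomap, UMAP, t-SNE) exist at all.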

2023-06-01

🌌🔬 BEP39: the Dimensionality Reduction-Based Networks proposal (docs.google.com/document/d/1GT)! Capture high-dimensional brain data complexity and explore their lower-dimensional representation with BIDS. #BrainDataAnalysis #DimensionalityReduction

2023-05-16

Ahh I've been so excited for this paper to come out for ages!! No affiliation, just think it's super cool:

"Collection Space Navigator" for exploring projections of visual art collections

Honestly, when I first saw this, it wasn't the art applications that intrigued me so much as the value it offers for understanding 'slices' through high-dimensional space.

Demo: collection-space-navigator.git

Website: collection-space-navigator.git

#machinelearning #dimensionalityreduction #arts #datavisualization

katch wreck katchwreck
2023-04-11

`It is considered a non-linear approach as the mapping cannot be represented as a linear combination of the original variables as possible in techniques such as principal component analysis, which also makes it more difficult to use for classification applications`

en.wikipedia.org/wiki/Sammon_m
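The quoted contrast can be made concrete: a fitted PCA is an explicit affine map that can be applied to any new sample in closed form, which Sammon mapping (like t-SNE) does not provide. A minimal NumPy sketch on toy data:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 6))  # training data, 6 variables

# Fit PCA: the embedding is the linear (affine) map  x -> (x - mu) @ W.
mu = X.mean(axis=0)
_, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
W = Vt[:2].T  # 6x2: each output coordinate is a linear combination of inputs

# Any NEW point is embedded by the same closed-form rule -- exactly
# what a Sammon map, defined only on the training points, lacks.
x_new = rng.normal(size=6)
z_new = (x_new - mu) @ W
print(z_new.shape)  # (2,)
```

This explicit out-of-sample map is also why PCA projections are easy to reuse inside classifiers, as the quote notes.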

Clément Canonne ccanonne@mathstodon.xyz
2022-11-11

Alright, that's all for today's #quiz. As "usual" (OK, this is new here, but usual for Twitter I guess), answers and discussion in 2-3 days; in the meantime, please ask questions, or post comments below!

#dimensionalityreduction

8/end

Mark Crowley compthink
2022-11-09
