#statisticallearning

Warwick Statistics @warwickstats
2025-07-08

These publications showcase Warwick Statistics' commitment to pushing the boundaries of statistical machine learning and AI methodology. Interested in our research? Connect with our department to explore potential collaborations!

Mattia Rigotti @matrig
2024-06-20

"Neural Redshift: Random Networks are not Random Functions", Teney et al.
arxiv.org/abs/2403.02241

Counters the notion that Neural Networks have an inherent "simplicity bias". Instead, inductive bias depends on components such as ReLUs, residual connections, and LayerNorm, which can be tuned to build architectures with a bias for any level of complexity.

Figure 3 of the linked paper, with caption:
"Comparison of functions implemented by random MLPs (2D input, 3 hidden layers). ReLU and TanH architectures are biased towards different functions despite their universal approximation capabilities. ReLU architectures have the unique property of maintaining their simplicity bias regardless of weight magnitude."
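
To make the claim concrete, here is a small NumPy sketch (not the authors' code; the "roughness" count below is just an ad-hoc proxy for function complexity) that evaluates random ReLU and tanh MLPs of the same size along a 1D slice of their 2D input and counts how often the output changes direction:

import numpy as np

def random_mlp(x, activation, hidden=256, layers=3, weight_scale=1.0, seed=0):
    """Evaluate a randomly initialized MLP (2D input, scalar output) at points x."""
    rng = np.random.default_rng(seed)
    h = x
    dim_in = x.shape[1]
    for _ in range(layers):
        W = rng.normal(0.0, weight_scale / np.sqrt(dim_in), size=(dim_in, hidden))
        b = rng.normal(0.0, 0.1, size=hidden)
        h = activation(h @ W + b)
        dim_in = hidden
    w_out = rng.normal(0.0, 1.0 / np.sqrt(dim_in), size=(dim_in, 1))
    return (h @ w_out).ravel()

def roughness(y, eps=1e-9):
    """Crude, amplitude-invariant complexity proxy: count direction changes."""
    d = np.diff(y)
    d = d[np.abs(d) > eps]
    return int(np.sum(np.sign(d[1:]) != np.sign(d[:-1])))

t = np.linspace(-2.0, 2.0, 4000)
x = np.stack([t, np.zeros_like(t)], axis=1)   # a 1D slice through the 2D input space
relu = lambda z: np.maximum(z, 0.0)

for scale in (1.0, 4.0):   # small vs. large weight magnitude
    y_relu = random_mlp(x, relu, weight_scale=scale)
    y_tanh = random_mlp(x, np.tanh, weight_scale=scale)
    print(f"weight scale {scale}: ReLU roughness {roughness(y_relu)}, "
          f"tanh roughness {roughness(y_tanh)}")

With larger weights the tanh network's output typically becomes far more wiggly, while the ReLU network stays piecewise linear with roughly the same number of kinks, which is the figure's point about weight magnitude.
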
2023-11-03

It always trips me up when I read “principle component analysis”, “reclusive feature elimination”, “gradual boosting”, etc. I’d be concerned about the validity of everything else.

Principal vs. principle I can understand, but “reclusive”? That is so far out…

#ml #datascience #statisticallearning

Xenia Schmalz @xenia_ks@dizl.de
2023-03-07

New #preprint: "Rules and statistics: What if it’s both? A basic computational model of statistical learning in reading acquisition", describing my first attempt at a #ComputationalModel: osf.io/5b76z. The model aims to explain how #StatisticalRegularities can be an integral part of orthographic systems, while little evidence points to a relationship between #StatisticalLearning ability and #reading performance. Feedback welcome! :D
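
A toy illustration of the kind of statistical regularity the preprint is about (this is not Schmalz's model; the miniature lexicon and the one-letter-to-one-phoneme alignment are invented for the example): counting how consistently each letter maps onto a sound across a small word list.

from collections import Counter, defaultdict

lexicon = [            # (spelling, phonemes), aligned one letter to one phoneme
    ("cat", "kat"),
    ("cot", "kot"),
    ("city", "sIti"),  # 'c' is inconsistent here: /k/ vs /s/
    ("tip", "tIp"),
]

counts = defaultdict(Counter)
for spelling, phonemes in lexicon:
    for letter, phoneme in zip(spelling, phonemes):
        counts[letter][phoneme] += 1

for letter, phoneme_counts in sorted(counts.items()):
    total = sum(phoneme_counts.values())
    best_phoneme, best_count = phoneme_counts.most_common(1)[0]
    print(f"{letter!r} -> {best_phoneme!r} with consistency {best_count / total:.2f}")

For the actual model and data, see the OSF link above.
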

Xenia Schmalz @xenia_ks@dizl.de
2023-01-10

I wrote a paper on #StatisticalLearning, #reading, and #dyslexia during maternity leave in 2021, and realise now I have very little recollection of what I actually wrote... mdpi.com/2076-3425/11/9/1143

2022-12-14

At last, the final (corrected) version is available.
Neat modeling of the processing of non-adjacent dependencies by Noémi Éltető and Peter Dayan. I am very grateful to be part of this project.
journals.plos.org/ploscompbiol
#statisticallearning #bayesian
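
For readers unfamiliar with the term, here is a minimal sketch of what a non-adjacent dependency is (this is not Éltető and Dayan's model; the three-symbol grammar is invented for illustration): the item two positions back predicts the current item, while the immediately preceding item is uninformative.

import random
from collections import Counter, defaultdict

random.seed(0)

def generate(n_triples):
    """Lag-2 rule: 'A' predicts '1' and 'B' predicts '2' across a random filler."""
    seq = []
    for _ in range(n_triples):
        head = random.choice("AB")
        filler = random.choice("xyz")
        seq += [head, filler, "1" if head == "A" else "2"]
    return seq

seq = generate(5000)

def prediction_accuracy(lag):
    """Majority-vote accuracy for predicting '1'/'2' from the item `lag` steps back."""
    table = defaultdict(Counter)
    for t in range(lag, len(seq)):
        if seq[t] in ("1", "2"):
            table[seq[t - lag]][seq[t]] += 1
    correct = sum(counts.most_common(1)[0][1] for counts in table.values())
    total = sum(sum(counts.values()) for counts in table.values())
    return correct / total

print("predicting from 1 item back:", round(prediction_accuracy(1), 2))   # ~0.5, chance
print("predicting from 2 items back:", round(prediction_accuracy(2), 2))  # 1.0, the dependency

With enough data the lag-2 context predicts perfectly while the lag-1 context stays at chance, which is what makes non-adjacent dependencies hard for purely adjacent (bigram) learners.
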

2022-11-19

An #introduction post: I love #Statistics and #Learning, which means that I love both #StatisticalLearning and #LearningStatistics. Currently, most of my professional focus is on #teaching #ResearchMethods and Statistics to #undergraduate #psychmajors. I also love #photography, #travel, and #food (who doesn’t), and have recently figured out a way to combine all of these loves into a three-week #StudyAbroad trip to #Japan where I get to teach a class on #PsychologyOfLanguage.

Xinkai Du @xinkaidu
2022-11-08

Hi all ^^

I'm a PhD student in ClinicalPsych at the University of Oslo and Modum Bad Psychiatric Hospital.

Currently I am investigating mental morbidity trajectories in the COVID-19 pandemic by applying advanced statistical methods to population-based registry resources, biobanks, and large-scale data.

Due to my background in PsychMethods, I am also interested in the estimation of network models.

Nice to meet everyone!

Daryl Feehely (old account) @dfeehely@mastodon.cloud
2018-03-02

Found This Week #96:

#Music, #statisticallearning, #smartbus, #smartglass, crypto canon, Google TPUs, Microsoft #autonomousdriving cookbook, Caoga Caoga, Tokyo crowd control and an aircraft emergency slide!

medium.com/@dfeehely/found-thi mastodon.cloud/media/MvVSQGOvr
