#parametrization

2025-04-30

Parametrization: PyTest vs Robot Framework

In this article you will find a side-by-side comparison of two popular test-automation frameworks, PyTest and Robot Framework. There is already a good general comparison of the two on Habr; here I focus on how easy each makes it to parametrize tests.

habr.com/ru/companies/beget/ar
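For readers new to either tool, this is roughly what parametrization looks like on the pytest side (an illustrative sketch, not code from the article); Robot Framework expresses the same idea with a [Template] setting and one data row per case.

```python
# Minimal pytest parametrization sketch (illustrative, not from the article):
# one test function expands into three independent test cases.
import pytest

@pytest.mark.parametrize(
    "a, b, expected",
    [
        (1, 2, 3),
        (2, 3, 5),
        (10, -4, 6),
    ],
)
def test_addition(a, b, expected):
    assert a + b == expected
```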

#robotframework #pytest #testautomation #testing #parametrization

∞ 𝕁uan ℂarlos (jcponcemath@mathstodon.xyz)
2025-02-12

Applying matrix diagonalisation in the classroom with #GeoGebra: parametrising the intersection of a sphere and plane

In collaboration with Bradley Welch

tandfonline.com/doi/full/10.10
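The parametrization in the title can be sketched in a few lines (my own illustration, not code from the paper or its GeoGebra applets): a plane n·x = d cuts the sphere |x − c| = R in a circle, traced as m + r(cos t · u + sin t · v).

```python
# Sketch of the standard parametrization of a sphere-plane intersection.
import numpy as np

def sphere_plane_circle(c, R, n, d, num=100):
    n = n / np.linalg.norm(n)      # unit plane normal
    rho = d - n @ c                # signed distance from sphere center to plane
    assert abs(rho) < R, "plane does not cut the sphere"
    m = c + rho * n                # center of the intersection circle
    r = np.sqrt(R**2 - rho**2)     # circle radius, by Pythagoras
    u = np.cross(n, [1.0, 0.0, 0.0])
    if np.linalg.norm(u) < 1e-9:   # n happened to be parallel to the x-axis
        u = np.cross(n, [0.0, 1.0, 0.0])
    u /= np.linalg.norm(u)
    v = np.cross(n, u)             # u, v: orthonormal basis of the plane
    t = np.linspace(0.0, 2.0 * np.pi, num)
    return m + r * (np.cos(t)[:, None] * u + np.sin(t)[:, None] * v)
```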

#dynamic #geometric #systems #linear #algebra #vectors #parametrization

Victoria Stuart 🇨🇦 🏳️‍⚧️ (persagen)
2023-08-12

Pruning neural networks using Bayesian inference
arxiv.org/abs/2308.02451

* neural network (NN) pruning is highly effective at reducing the computational & memory demands of large NNs
* novel approach using Bayesian inference that integrates seamlessly into the training procedure
* leverages the posterior probabilities of the NN before and after pruning, enabling calculation of Bayes factors (sketched below)
* achieves sparsity while maintaining accuracy
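A rough flavor of posterior-based pruning (a common signal-to-noise heuristic, not the paper's Bayes-factor procedure):

```python
# Hypothetical illustration, not the paper's algorithm: prune weights by the
# posterior signal-to-noise ratio |mu| / sigma, assuming a diagonal Gaussian
# posterior N(mu, sigma^2) per weight (e.g. from variational inference).
import numpy as np

def snr_prune_mask(mu, sigma, keep_ratio=0.1):
    """Keep the weights whose posterior is least consistent with zero."""
    snr = np.abs(mu) / sigma
    return snr >= np.quantile(snr, 1.0 - keep_ratio)  # True = keep weight

rng = np.random.default_rng(0)
mu = rng.normal(size=10_000)               # posterior means
sigma = rng.uniform(0.1, 1.0, 10_000)      # posterior standard deviations
mask = snr_prune_mask(mu, sigma)
print(f"sparsity: {1 - mask.mean():.0%}")  # ~90% of weights pruned
```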

Victoria Stuart 🇨🇦 🏳️‍⚧️ (persagen)
2023-07-12

On the curvature of the loss landscape
arxiv.org/abs/2307.04719

A main challenge in modern deep learning is to understand why such over-parameterized models perform so well when trained on finite data ... we consider the loss landscape as an embedded Riemannian manifold ... we focus on the scalar curvature, which can be computed analytically for our manifold ...

Manifolds: en.wikipedia.org/wiki/Manifold
...
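For intuition about curvature of a loss surface (a toy Hessian probe, not the paper's scalar-curvature computation):

```python
# Toy illustration: an over-parameterized 2-parameter loss has a flat valley
# of minima; its Hessian at a minimum has one ~zero eigenvalue (flat
# direction) and one positive eigenvalue (curved direction).
import numpy as np

def loss(w):
    return (w[0] * w[1] - 1.0) ** 2    # minimized on the curve w0 * w1 = 1

def hessian(f, w, eps=1e-4):
    n = len(w)
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            ei, ej = np.eye(n)[i] * eps, np.eye(n)[j] * eps
            H[i, j] = (f(w + ei + ej) - f(w + ei - ej)
                       - f(w - ei + ej) + f(w - ei - ej)) / (4 * eps**2)
    return H

w = np.array([1.0, 1.0])                     # a point on the minimum manifold
print(np.linalg.eigvalsh(hessian(loss, w)))  # approx [0, 4]
```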

Victoria Stuart 🇨🇦 🏳️‍⚧️ (persagen)
2023-07-12

Extending the Forward Forward Algorithm
arxiv.org/abs/2307.04205

The Forward Forward algorithm (Geoffrey Hinton, 2022-11) is an alternative to backpropagation for training neural networks (NN)

Backpropagation - the most widely used and successful optimization algorithm for training NNs - has 3 important limitations ...

Hinton's paper: cs.toronto.edu/~hinton/FFA13.p
Discussion: bdtechtalks.com/2022/12/19/for
...
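The core idea fits in a few lines (a simplified sketch of the Forward Forward update, not Hinton's full recipe):

```python
# Each layer is trained locally: raise its "goodness" (sum of squared
# activations) on positive data, lower it on negative data. No backprop.
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(784, 500))    # one layer's weights

def forward(x):
    return np.maximum(x @ W, 0.0)             # ReLU activations

def local_update(x_pos, x_neg, lr=0.03):
    # Hinton optimizes a logistic loss around a goodness threshold theta;
    # this sketch simply follows the raw goodness gradient, 2 * x^T h.
    global W
    for x, sign in ((x_pos, +1.0), (x_neg, -1.0)):
        h = forward(x)
        W += sign * lr * (2.0 * x.T @ h) / len(x)
```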

Victoria Stuart 🇨🇦 🏳️‍⚧️ (persagen)
2023-07-10

Loss Functions and Metrics in Deep Learning. A Review
arxiv.org/abs/2307.02694

One of the essential components of deep learning is the choice of the loss function and performance metrics used to train and evaluate models.

This paper reviews the most prevalent loss functions and performance measurements in deep learning.
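For concreteness, the two most common choices look like this (illustrative definitions, not excerpts from the review):

```python
# The default regression loss and the default classification loss.
import numpy as np

def mse(y_true, y_pred):
    """Mean squared error."""
    return np.mean((y_true - y_pred) ** 2)

def cross_entropy(y_onehot, p_pred, eps=1e-12):
    """Categorical cross-entropy for one-hot targets and softmax outputs."""
    return -np.mean(np.sum(y_onehot * np.log(p_pred + eps), axis=1))
```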

Victoria Stuart 🇨🇦 🏳️‍⚧️ (persagen)
2023-07-10

Pruning vs Quantization: Which is Better?
arxiv.org/abs/2307.02973

* Pruning removes weights, reducing the memory footprint
* Quantization (4-bit, 8-bit matrix multiplication; ...) reduces the bit-width used for both weights and computation in neural networks, leading to predictable memory savings and reductions in the necessary compute

In most cases quantization outperforms pruning.
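A toy version of the comparison (my illustration, not the paper's benchmark): approximate the same weight matrix by magnitude pruning and by coarse uniform quantization, then compare reconstruction error.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(256, 256))

# (a) magnitude pruning to 25% density: zero the 75% smallest weights
k = int(0.25 * W.size)
threshold = np.sort(np.abs(W), axis=None)[-k]
W_pruned = np.where(np.abs(W) >= threshold, W, 0.0)

# (b) uniform quantization to 8 levels (3 bits) over the weight range
levels = 8
lo, hi = W.min(), W.max()
step = (hi - lo) / (levels - 1)
W_quant = lo + step * np.round((W - lo) / step)

print("pruning error:     ", np.linalg.norm(W - W_pruned))
print("quantization error:", np.linalg.norm(W - W_quant))
```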

Victoria Stuart 🇨🇦 🏳️‍⚧️ (persagen)
2023-06-23

Training Transformers with 4-bit Integers
arxiv.org/abs/2306.11987

... we propose a training method for transformers with matrix multiplications implemented with the INT4 arithmetic. Training with an ultra-low INT4 precision is challenging ... we carefully analyze the specific structures of activation & gradients in transformers to propose dedicated quantizers for them. For forward propagation, we identify ...
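A minimal symmetric INT4 quantizer, for intuition (illustrative; the paper's dedicated quantizers for activations and gradients are considerably more involved):

```python
import numpy as np

def quantize_int4(x):
    """Map floats to integers in [-8, 7] with one per-tensor scale."""
    scale = np.abs(x).max() / 7.0
    q = np.clip(np.round(x / scale), -8, 7).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

x = np.random.default_rng(0).normal(size=(4, 4)).astype(np.float32)
q, s = quantize_int4(x)
print(np.abs(x - dequantize(q, s)).max())    # worst-case rounding error
```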

2023-01-15

"Seasonal snow is sensitive to climate change, and is always taken as a signal of local climate changes. Against the background of global warming, the annual snow cover in the Northern Hemisphere is following an overall decreasing trend. Since snow plays an important role in the water cycle and has significant effects on atmospheric circulation, it is important to be able to simulate it well in climate models".

#climatechange #snow #models #parametrization
eurekalert.org/news-releases/9

Pierre-Simon Laplace (LearnBayesStats@mstdn.science)
2023-01-10

ever wanted to learn more about #Bayesian model #parametrization? then episode 74 of our #podcast is just right for you! let @aseyboldt introduce the idea in this excerpt before diving into the nitty-gritty details
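The canonical example of why parametrization matters in Bayesian models (a guess at the episode's content, but the standard textbook case) is the centered vs. non-centered hierarchical model:

```python
# Centered: theta ~ Normal(mu, tau) ties theta to tau and creates the
# hard-to-sample "funnel". Non-centered: theta = mu + tau * z with
# z ~ Normal(0, 1), so the sampled variable z is independent of tau.
import numpy as np

rng = np.random.default_rng(0)
mu, tau = 0.0, np.exp(rng.normal())      # hypothetical group-level parameters

theta_centered = rng.normal(mu, tau, size=8)

z = rng.normal(size=8)
theta_noncentered = mu + tau * z         # same distribution, better geometry
```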

Pustam | पुस्तम | পুস্তম 🇳🇵 (pustam_egr@mathstodon.xyz)
2023-01-05

John von Neumann once claimed, "with 4 parameters, I can fit an elephant, and with 5, I can make him wiggle his trunk."
\[x(t)=\displaystyle\sum_{k=0}^\infty\left(A_k^x\cos(kt)+B_k^x\sin(kt) \right)\]
\[y(t)=\displaystyle\sum_{k=0}^\infty\left(A_k^y\cos(kt)+B_k^y\sin(kt) \right)\]
Here's a paper proving that von Neumann's claim is valid! 🔗 aapt.scitation.org/doi/10.1119
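A sketch of evaluating that truncated series (placeholder coefficients, not the paper's fitted elephant parameters):

```python
# Trace (x(t), y(t)) from truncated Fourier coefficients A_k, B_k.
import numpy as np

def fourier_curve(A, B, num=1000):
    """Evaluate sum_k A[k] cos(k t) + B[k] sin(k t) for t in [0, 2*pi)."""
    t = np.linspace(0.0, 2.0 * np.pi, num)
    k = np.arange(len(A))[:, None]
    return A @ np.cos(k * t) + B @ np.sin(k * t)

# Placeholder coefficients; the paper packs the actual curve into just
# four complex parameters (plus a fifth for the wiggling trunk).
Ax, Bx = np.array([0.0, 0.0, 18.0]), np.array([0.0, 50.0, 0.0])
Ay, By = np.array([0.0, -60.0, 0.0]), np.array([0.0, -30.0, 8.0])
x, y = fourier_curve(Ax, Bx), fourier_curve(Ay, By)
```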
#Neumann #JohnVonNeumann #VonNeumann #FourierSeries #parameters #complexparameters #parametrization #mathematics #maths
