#LinearAlgebra

2026-01-18

@typeswitch
tl;dr solution at bottom

I guess we need to clarify what the allowed operation is.
e.g. if I am allowed to ask whether any (natural?) linear combination of outcomes has probability 1/2, then I can just ask whether 3*P(rolling i) = 1/2.

If the question I'm allowed to ask is
"is P(rolling i, j, or k) = 1/2?", then we are essentially asking for an invertible 6x6 matrix with one row of 1s and five rows each containing a permutation of [1, 1, 1, 0, 0, 0]. In which case, again, you are using 5 tests.

That is certainly brute-forceable by computer (at most (6 choose 3) choose 5 = (20 choose 5) = 15,504 meaningfully different possibilities), but there may be a clever rank argument that settles whether it's possible.
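For what it's worth, that brute force is only a few lines of code. Here's a minimal sketch (my own, using numpy and itertools; it treats "meaningfully different" as simply "a distinct set of five subset-rows") that enumerates the candidates and counts how many give an invertible matrix:

```python
# Enumerate 6x6 matrices with one all-ones row and five rows that are
# permutations of [1, 1, 1, 0, 0, 0], and count the invertible ones.
from itertools import combinations
import numpy as np

# The 20 possible "is P(rolling i, j, or k) = 1/2?" rows.
rows = [np.array([1 if i in s else 0 for i in range(6)])
        for s in combinations(range(6), 3)]

ones = np.ones(6)
invertible = 0
for choice in combinations(rows, 5):        # (20 choose 5) = 15,504 candidates
    M = np.vstack([ones, *choice])
    if np.linalg.matrix_rank(M) == 6:       # full rank: the six answers pin down the distribution
        invertible += 1

print(f"{invertible} of 15504 candidate matrices are invertible")
```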

Okay, I did some more thinking while doing my laundry, and here's a strategy:

If a + b + c = 0.5 and d + b + c = 0.5, then a = d. Repeat with modifications four more times to show that all six probabilities are equal to one another.

The problem with that is that doing it naively requires six tests (seven equations, if we include that they all sum to one), my bad. So you need to take advantage of all the information you've gathered in the first four iterations and choose smartly in the last one.

Here is a solution written as a matrix equation

\(\begin{pmatrix}
1 & 1 & 1 & 1 & 1 & 1 \\
1 & 1 & 1 & 0 & 0 & 0\\
1 & 1 & 0 & 1 & 0 & 0 \\
1 & 1 & 0 & 0 & 1 & 0\\
1 & 1 & 0 & 0 & 0 & 1 \\
0 & 1 & 1 & 1 & 0 & 0\\
\end{pmatrix}
\cdot \begin{pmatrix}
1/6 \\ 1/6 \\ 1/6 \\ 1/6 \\ 1/6 \\ 1/6
\end{pmatrix}
= \begin{pmatrix}
1 \\ 1/2 \\ 1/2 \\ 1/2 \\ 1/2 \\ 1/2
\end{pmatrix}
\)
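(A quick sanity check, not from the original thread: the matrix above is invertible, so "yes" to all five 1/2-questions plus the fact that the probabilities sum to 1 forces the uniform distribution. In numpy:)

```python
import numpy as np

# The test matrix from the post: one "sums to 1" row plus five "equals 1/2" rows.
M = np.array([
    [1, 1, 1, 1, 1, 1],
    [1, 1, 1, 0, 0, 0],
    [1, 1, 0, 1, 0, 0],
    [1, 1, 0, 0, 1, 0],
    [1, 1, 0, 0, 0, 1],
    [0, 1, 1, 1, 0, 0],
], dtype=float)
b = np.array([1, 0.5, 0.5, 0.5, 0.5, 0.5])

print(np.linalg.det(M))       # nonzero, so the answers determine p uniquely
print(np.linalg.solve(M, b))  # [1/6 1/6 1/6 1/6 1/6 1/6]
```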

One final comment, because I can't help myself. It shouldn't be too hard to generalise this pattern to other *even* sizes of dice.

#linearalgebra #probability

Jason Bowen 🇺🇦 jbowen@mast.hpc.social
2025-12-30

When I switched my major in college from physics to mathematics, I met with the undergraduate advisor for the department to sketch out courses. She (Kathy Davis at the University of Texas) said, "you can never learn enough linear algebra."

As time has gone on, I keep going back to that as probably the deepest truth I've ever been told.

#mathematics #mathematicseducation #linearalgebra #universityoftexas

Ebokify ebokify
2025-12-18
โš Antoine Chambert-Loirantoinechambertloir@mathstodon.xyz
2025-11-30

The determinant of transvections (an update). New blog post on Freedom math dance.

In the previous post, I had explained how I could prove a general version of the classic fact that transvections have determinant 1. Here, I show that one can do more with less effort!

freedommathdance.blogspot.com/

#math #LinearAlgebra

N-gated Hacker News ngate
2025-11-14

Ah, the age-old enigma: 🧙‍♂️ words that defy . Clearly, the only solution is to whip out some 🧮 linear algebra! Because why struggle with when you can just slam it on a and call it a day? 🙄
aethermug.com/posts/linear-alg

2025-11-08

just in case anyone is interested in how to do complex linear algebra, including complex tensor products, without using complex numbers - using real linear operators \(J\) with \(J\circ J=-I\) instead of complex scalars - here are some links.  

The TLDR version on a poster:
mwaa.math.indianapolis.iu.edu/

Using a basis, matrices, and summing over indices - sections 4&5 of this paper:
users.pfw.edu/CoffmanA/pdf/bas

Without a basis (but still a lot of notation) - Chapter 5 starting on page 195, with tensor products starting with Example 5.74 on page 208:
users.pfw.edu/CoffmanA/pdf/boo
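(A toy illustration of the idea, my own sketch rather than anything from the links above: on \(\mathbb{R}^2\), rotation by 90° is a real operator \(J\) with \(J\circ J=-I\), and multiplying by the complex scalar \(a+bi\) corresponds to applying the real operator \(aI+bJ\).)

```python
import numpy as np

# J = "multiplication by i", viewed as a real-linear operator on R^2 ~ C.
I2 = np.eye(2)
J = np.array([[0.0, -1.0],
              [1.0,  0.0]])
assert np.allclose(J @ J, -I2)                 # J o J = -I

def as_real_operator(a, b):
    """The real 2x2 operator playing the role of the complex scalar a + bi."""
    return a * I2 + b * J

# (a + bi)(x + iy) agrees with (aI + bJ) applied to the real vector (x, y).
a, b, x, y = 2.0, -3.0, 0.5, 1.5
z = complex(a, b) * complex(x, y)
v = as_real_operator(a, b) @ np.array([x, y])
assert np.allclose(v, [z.real, z.imag])
```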
#LinearAlgebra #ComplexNumbers #NotASubToot

โš Antoine Chambert-Loirantoinechambertloir@mathstodon.xyz
2025-11-06

The determinant of transvections. New blog post on Freedom Math Dance

A transvection in a K-vector space V is a linear map T(f,v) of the form x↦x+f(x)v, where f is a linear form and v is a vector such that f(v)=0. It is known that such a linear map is invertible, with inverse given by f and −v. More precisely, one has T(f,0)=id and T(f,v+w)=T(f,v)∘T(f,w). In finite dimension, these maps have determinant 1 and it is known that they generate the special linear group SL(V), the group of linear automorphisms of determinant 1.
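(Not from the post, but the invertibility check is one line, using only f(v)=0: composing T(f,v) with T(f,−v) gives

\(T(f,-v)\bigl(T(f,v)(x)\bigr) = \bigl(x + f(x)v\bigr) - f\bigl(x + f(x)v\bigr)\,v = x + f(x)v - \bigl(f(x) + f(x)f(v)\bigr)v = x,\)

since \(f(v)=0\).)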

When I started formalizing in Lean the theory of the special linear group, the question raised itself of the appropriate generality for such results. In particular, what happens when one replaces the field K with a ring R and the K-vector space V with an R-module?

freedommathdance.blogspot.com/

#math #LinearAlgebra #AbstractAlgebra

Dyalog dyalog
2025-11-03

2013-05: Write a function that produces an n×n identity matrix (see apl.quest/2013/5/ to test your solution and view ours).
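(Not the APL answer the quest is after, but a quick sketch in Python/NumPy of two ways to do it; the second mimics the classic "reshape a 1 followed by n zeros" idiom.)

```python
import numpy as np

def identity_builtin(n):
    # The direct way.
    return np.eye(n, dtype=int)

def identity_reshape(n):
    # Cyclically tile [1, 0, 0, ..., 0] (length n+1) into an n x n array;
    # the 1s land exactly on the diagonal.
    return np.resize([1] + [0] * n, (n, n))

assert (identity_reshape(5) == identity_builtin(5)).all()
```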

luke.shaw@ironarray.io luke_shaw_ironarray
2025-10-28

💊 IronPill 2 💊
In the second of our series of short videos ("ironPills") showcasing ironArray's work, we see how Blosc2 can be used to power heavy-duty linear algebra (100GB!) workflows
⚡ 1.5-2x faster than PyTorch + h5py!
🧱 automated chunking optimised for your machine's cache hierarchy
🐍 simple one-line syntax blosc2.matmul(A, B, urlpath='out.b2nd')

See blog here: ironarray.io/blog/la-blosc
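(A rough guess at what that one-liner looks like in context, based only on the post above; the array-creation calls are my assumption, see the blog for the real workflow.)

```python
import numpy as np
import blosc2

# Small stand-in arrays; the post's point is that A and B can be far larger
# than RAM, with chunking tuned to the machine's cache hierarchy.
A = blosc2.asarray(np.random.rand(1024, 1024))
B = blosc2.asarray(np.random.rand(1024, 1024))

# The one-liner quoted in the post: the product is stored on disk as out.b2nd.
C = blosc2.matmul(A, B, urlpath='out.b2nd')
print(C.shape)
```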




N-gated Hacker News ngate
2025-10-07

🎨🤔 Wow, who knew linear algebra needed pictures to explain something as ancient as Gaussian elimination! Let's solve the great mystery of nickels and pennies with some doodles, 'cause algebra just isn't mathy enough without it! 💰📉
ducktyped.org/p/an-illustrated

Eastern Long-Whiskered Billy parslii@mountains.social
2025-10-06

What is the technical way to say that I flatten a matrix into a vector to do an operation? I am doing this to put some terms in a loss function to regularize a matrix.

I compute the vector 1-norm on the elements of the matrix M.

^^ is this sufficient? Or is there a better way to say it?
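(One common way to say it: you apply the vec operator (flatten M into a vector) and take the vector 1-norm, which is the same as the entrywise 1-norm of M, i.e. the sum of the absolute values of its entries. A small NumPy sketch of the equivalence, not the original poster's code:)

```python
import numpy as np

M = np.array([[1.0, -2.0],
              [3.0, -4.0]])

# "Flatten the matrix into a vector" = vec(M); then take the vector 1-norm.
penalty = np.linalg.norm(M.ravel(), ord=1)

# Equivalently: the entrywise 1-norm of M, i.e. the sum of |entries|.
assert penalty == np.abs(M).sum()   # 10.0
```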

#math #mathematics #datascience #data #machinelearning #deeplearning #linearalgebra

2025-10-01
Linear Algebra Done Right by Sheldon Axler looks like an excellent textbook. If I ever end up teaching or tutoring undergraduate linear algebra again I think I'd try it as a text. The book is open access and is available at https://linear.axler.net (I have no affiliation with the author; I just like the book!)

One thing I like about this book is its approach to eigenvalues and eigenvectors. Most linear algebra books present eigenvalues as roots of the "characteristic polynomial", which is built from the "determinant", which in turn has some formula defining it. These objects are rarely motivated geometrically, and so you're left with limited understanding of just what an eigenvalue is or why linear transformations on finite-dimensional vector spaces must have them. Axler avoids determinants till Chapter 9 of the book, focusing instead on linear operators. The fact that operators must have eigenvalues pops out of the observation that iterating an operator on a given non-zero starting vector results in a set of vectors that must eventually become linearly dependent. This fact also leads to the development of the characteristic polynomial; you can then come at the determinant from this, more geometric, perspective.
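(Roughly, the existence argument over \(\mathbb{C}\), sketched from memory rather than quoted from the book: if \(V\) is \(n\)-dimensional and \(v \ne 0\), the \(n+1\) vectors

\(v, \; Tv, \; T^2v, \; \dots, \; T^n v\)

must be linearly dependent, so some nonconstant polynomial \(p\) satisfies \(p(T)v = 0\). Factoring over \(\mathbb{C}\), \(p(T) = c\,(T-\lambda_1 I)\cdots(T-\lambda_m I)\), and since \(p(T)\) sends the nonzero vector \(v\) to \(0\), at least one factor \(T-\lambda_j I\) fails to be injective; that \(\lambda_j\) is an eigenvalue of \(T\).)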

#math #teaching #LinearAlgebra
