#lambdacalculus

2025-12-14

I'm not understanding this intuitively.

I can't think of examples of a formal definition (in λD) that are NOT correct.

Help welcome !

To my mind, it seems I can derive *:□ from any environment and context because it is the axiomatic (sort) rule.
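For reference, here is the rule as I understand it from the earlier λC chapters (hedging: the λD version adds a definition environment Δ in front of the context, so I may be misreading how it carries over):

```latex
% The (sort) axiom, as I remember the lambda-C formulation:
\[
\text{(sort)}\qquad \emptyset \;\vdash\; * : \square
\]
% It only licenses the judgement in the EMPTY context; a non-empty
% context needs repeated (weak)ening steps on top of it:
\[
\text{(weak)}\qquad
\frac{\Gamma \vdash A : B \qquad \Gamma \vdash C : s}
     {\Gamma,\, x : C \;\vdash\; A : B}
\]
```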

#maths #typetheory #lambdacalculus

definition of a "correct" definition
2025-12-06

What is "linearisation" is a partial order?

This hasn't been explained in the textbook so I'm guessing it just means writing out the partial order

A <- B <- C ... etc

where A, B, C are the formal definitions, and the arrows show "depends on / makes use of"
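If that guess is right, a linearisation is just a linear extension of the partial order: list the definitions in some total order where everything comes after the definitions it uses (a topological sort). A tiny sketch, with made-up definition names and a hypothetical dependency table, nothing from the book:

```haskell
import Data.List (partition)

-- Hypothetical dependency table: each definition paired with the
-- definitions it makes use of.
deps :: [(String, [String])]
deps = [("C", ["B"]), ("B", ["A"]), ("A", [])]

-- Naive linearisation: repeatedly emit every definition whose
-- dependencies have already been emitted, i.e. build a linear
-- extension of the "depends on" partial order.
linearise :: [(String, [String])] -> [String]
linearise = go []
  where
    go _    []   = []
    go done defs =
      let (ready, rest) = partition (\(_, uses) -> all (`elem` done) uses) defs
      in case map fst ready of
           []    -> error "cyclic dependencies"
           names -> names ++ go (done ++ names) rest

main :: IO ()
main = print (linearise deps)   -- ["A","B","C"]
```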

---

ps this is chapter 9 on formal definitions building up to λ𝐷₀, from Type Theory and Formal Proof (Nederpelt)

#maths #typetheory #proof #lambdacalculus

exercise 9.1 from type theory and formal proof using the term "linearisations".
2025-11-26

Got to do a decimal->binary algorithm in Binary #LambdaCalculus. 273 bits so far. Who would have thought that destructuring cons cells is such an addictive pattern that costs so few bits?

2025-11-26

My main complaint about #Scheme is that it’s “not practical,” being non-portable and batteries-excluded for any sufficiently complex case. And yet my current situation (my #LambdaCalculus experiments, aspirations for Web-resident software that’s NOT necessarily #JavaScript, and overall feeling that #CommonLisp is quite trippy when I write it) kind of nudges me towards Scheme. The appeal is real.

@Sandra is partly to blame for that, showing all these beautiful combinators off in her posts. Combinators are cool, and Scheme is (almost) a perfect language for them!

2025-11-25

300 bits! John Tromp managed to cut my Binary #LambdaCalculus sorting algo down to 300 bits! Now I’m sure there’s no other algo beating that (33) byte count. Or is there maybe an x86 instruction (there always is) to sort bytes in memory?

github.com/tromp/AIT/issues/6#

#theWorkshop

2025-11-24

Hell yeah, my Binary #LambdaCalculus Mergesort is now officially part of John Tromp’s AIT repository! github.com/tromp/AIT/blob/mast

I’m pretty sure my amateur #Haskell translation made the algo gain some bits, but it must be pretty good still!

2025-11-24

Managed to codegolf Binary #LambdaCalculus Mergesort to 367 bits. Which is laudable, considering that the comparison function has to manage an infinitely weird classic BLC input format: github.com/tromp/AIT/issues/6#

2025-11-23

Even given John Tromp’s peculiar I/O format used for Binary #LambdaCalculus and the need to process it, I was able to use my sorting algorithm and codegolf it down to 418 bits, which is less than the 436 he reported. I did it in an applicative system, which means I can replace applicative combinators with normal-order ones and get to around 400 bits. But I’m leaving that for later, when John can test my programs himself.

2025-11-22

Being the compatibility freak I am, I’m adding support for Tromp’s Binary #LambdaCalculus IO formats (lists of booleans (bits), and lists of lists of booleans (bytes)) to my cl-blc (codeberg.org/aartaka/cl-blc) toolkit.

I don’t like Tromp’s formats, because they waste a lot of space for questionable gain and cause a lot of inconvenience. Like, you want to compare two ASCII chars, and even that makes you traverse lists of bits for them, requiring recursion and boolean ops. Instead, just… compare char values as integers?
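To make the complaint concrete, here is roughly what a less-than on two chars looks like once they arrive as MSB-first bit lists, versus the integer comparison I’d rather write. A sketch in Haskell rather than Lisp; ltBits and ltInt are made-up names, not cl-blc code:

```haskell
type Bit = Bool

-- Under the bytes-as-bit-lists format, comparing two chars means
-- walking two 8-element lists until the first differing bit decides.
ltBits :: [Bit] -> [Bit] -> Bool
ltBits (a:as) (b:bs)
  | a == b    = ltBits as bs      -- same bit so far: keep recursing
  | otherwise = not a && b        -- 0 < 1 at the first difference
ltBits _ _ = False                -- equal (or exhausted) lists

-- Versus the structured alternative: just compare char values as integers.
ltInt :: Int -> Int -> Bool
ltInt = (<)

main :: IO ()
main = print (ltBits [False,True,False,False,False,False,False,True]   -- 'A' = 65
                     [False,True,False,False,False,False,True,False])  -- 'B' = 66
```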

I hope I won’t have to drop the “BLC” name because I diverged from Tromp’s spec on I/O in favor of more structured inputs/outputs.

2025-11-21

Update: I codegolfed my Binary #LambdaCalculus sorting algorithm down to 248 bits. 31 bytes to sort a list of things with a custom comparator function! 341 bits if you bundle an integer comparator with it, down from 370.

2025-11-20

So I was always worried about Binary #LambdaCalculus, because it requires quite a bit of implementation support, with full-blown closures et al. I thought I wouldn’t get to use BLC until I actually implemented my own Lambda Calculus engine, or at least a #Lisp.

But today I checked. And sure enough, John Tromp, the creator of BLC, generates BLC blobs from equivalent Haskell code. And he runs the Haskell code instead of BLC, avoiding the need to implement closures altogether! That’s smart and reassuring—now I can do BLC without worrying that I have to implement something first. I can just compile to raw #CommonLisp lambdas for running and write my lambdas in whatever format I want, only compiling them to BLC for final publishing and transport. Because that’s what BLC is for—transport of algorithms in the most succinct form.
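A minimal sketch of that “borrow the host’s closures” trick, in Haskell rather than Common Lisp (Term, Val and eval are illustrative names, not Tromp’s or cl-blc’s actual code): de Bruijn terms evaluate straight into host-language functions, so there is no closure machinery to implement.

```haskell
-- Plain de Bruijn terms (1-based indices, like BLC's unary encoding).
data Term = Var Int | Lam Term | App Term Term

-- Host values: the host language's own closures stand in for ours.
newtype Val = Fun (Val -> Val)

eval :: [Val] -> Term -> Val
eval env (Var n)   = env !! (n - 1)                -- look up the n-th enclosing binder
eval env (Lam b)   = Fun (\v -> eval (v : env) b)  -- a host closure!
eval env (App f a) = let Fun g = eval env f in g (eval env a)

main :: IO ()
main = case eval [] (App (Lam (Var 1)) (Lam (Var 1))) of
         Fun _ -> putStrLn "evaluated to a host closure"
```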

2025-11-20

John Tromp used Binary #LambdaCalculus as a benchmark for how many bits a certain algorithm might occupy when encoded optimally. He reached results as crazy as 300 bits (not bytes!) for a sorting algorithm!

BLC’s power is due to a smart choice of encoding: lambda abstraction as a 00 two-bit pattern, function application (always single-argument) as another (01) two-bit pattern, and parameter reference as a unary-encoded de Bruijn index (e.g. 1111110 for the sixth enclosing binder).
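That encoding fits in a few lines. A sketch: the Term datatype and encode are my names, the 00/01/unary bit patterns are Tromp’s.

```haskell
data Term = Var Int        -- de Bruijn index, 1-based
          | Lam Term       -- lambda abstraction
          | App Term Term  -- (single-argument) application

encode :: Term -> String
encode (Var n)   = replicate n '1' ++ "0"        -- unary index: Var 6 -> "1111110"
encode (Lam b)   = "00" ++ encode b              -- abstraction prefix
encode (App f a) = "01" ++ encode f ++ encode a  -- application prefix

main :: IO ()
main = putStrLn (encode (Lam (Var 1)))           -- identity \x.x -> "0010"
```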

And choice of primitives to encode was smart too—you can express a lot if you have function abstraction and function application! #OOP can be entirely closure-driven, and numbers, booleans, and lists are all Church-encodable.

But what often isn’t counted in Tromp’s feats of bit-saving is… input data encoding. Sure, your sorting algo is 300 bits, but does it matter if a (10 20 30) list takes, like, 380 bits? For Church-encoded numbers, the bit count grows linearly with the number (at roughly 4*n pace), for example. So, for moderately big numbers, BLC is actually suboptimal and starts losing to binary/von Neumann encodings. And if we take that into account and consider an algorithm’s inputs as part of its complexity score, we get really bad numbers.
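A quick way to check that growth, reusing the same made-up term type to count bits (just a sketch; the exact constant per successor depends on how you count, but the growth is clearly linear):

```haskell
data Term = Var Int | Lam Term | App Term Term

-- BLC bit cost of a term: 00/01 prefixes cost 2 bits, a variable
-- costs its unary index plus the terminating 0.
size :: Term -> Int
size (Var n)   = n + 1
size (Lam b)   = 2 + size b
size (App f a) = 2 + size f + size a

-- Church numeral n: \f.\x. f (f (... x)) with n applications of f.
church :: Int -> Term
church n = Lam (Lam (iterate (App (Var 2)) (Var 1) !! n))

main :: IO ()
main = print [size (church n) | n <- [0, 10, 20, 30]]  -- [6,56,106,156]: linear in n
```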

But then, do we always need to bundle our assumptions with our programs? I’m uncertain.

BLC is still beautiful though, go check it out!

N-gated Hacker News (ngate)
2025-11-16

🚀✨ Behold the arcane magic of de Bruijn numerals: where pure lambda calculus meets a cryptic maze of indices, and 🧙‍♂️ arithmetic becomes an Olympic event in mental gymnastics. 🤹‍♂️ Math enthusiasts rejoice, for you’ll need a PhD in esoteric algorithms just to decode this masterpiece! 📜🔮
text.marvinborner.de/2023-08-2

2025-10-29

Understanding Lambda Calculus 6: the iteration problem. An analysis of common issues in lambda theory, useful for functional programming and computer science. #LambdaCalculus #LýThuyếtLambda #LậpTrìnhHàm

reddit.com/r/programming/comme

2025-10-27

Instantiation vs substitution.

I'm trying to understand the difference - in the context of lambda calculus and formal definitions in type theory.

The textbook I'm following isn't that clear.

Would welcome guidance.
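My tentative reading so far (happy to be corrected): substitution replaces a free variable inside a term, while instantiation fills in the parameter list of a definition, which then unfolds via substitution. Roughly:

```latex
% Substitution: replace a free variable in a term, e.g.
\[
(\lambda y.\, x\,y)[x := N] \;=\; \lambda y.\, N\,y
\]
% Instantiation: given a definition  a(x_1,\dots,x_n) := M,
% an instance  a(N_1,\dots,N_n)  unfolds to
\[
M[x_1 := N_1, \dots, x_n := N_n]
\]
```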

#lambdacalculus #typetheory #maths

textbook sample mentioning substitution and instantiation
2025-10-20

Normalisation by evaluation - Wikipedia
A term is first interpreted into a denotational model of the λ-term structure, and then a canonical (β-normal and η-long) representative is extracted by reifying the denotation. Such an essentially semantic, reduction-free, approach differs from the more traditional syntactic, reduction-based, description of normalisation as reductions in a term rewrite system where β-reductions are allowed deep inside λ-terms.

The denotational semantics of (closed) terms in the meta-language interprets the constructs of the syntax in terms of features of the meta-language; thus, lam is interpreted as abstraction, app as application, etc. The semantic objects constructed are as follows:

"By induction on the structure of types, it follows that if the semantic object S denotes a well-typed term s of type τ, then reifying the object (i.e., ↓τ S) produces the β-normal η-long form of s. All that remains is, therefore, to construct the initial semantic interpretation S from a syntactic term s. This operation, written ∥s∥Γ, where Γ is a context of bindings, proceeds by induction solely on the term structure:"
#lambdacalculus
en.wikipedia.org/wiki/Normalis
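A minimal untyped sketch of the idea in Haskell (Term, Sem, eval and reify are illustrative names, not the article’s code; being untyped, it only reaches β-normal forms, without the η-long part): evaluate terms into a semantic domain, then reify the result back into syntax.

```haskell
data Term = Var Int | Lam Term | App Term Term   -- de Bruijn indices (0-based)
  deriving Show

data Sem = SLam (Sem -> Sem)       -- functions live in the meta-language
         | SNeu Neutral            -- stuck terms headed by a free variable
data Neutral = NVar Int | NApp Neutral Sem

eval :: [Sem] -> Term -> Sem
eval env (Var i)   = env !! i
eval env (Lam b)   = SLam (\v -> eval (v : env) b)
eval env (App f a) = case eval env f of
  SLam g -> g (eval env a)
  SNeu n -> SNeu (NApp n (eval env a))

-- Reify: read a semantic value back as a beta-normal term,
-- using de Bruijn *levels* for the fresh variables.
reify :: Int -> Sem -> Term
reify d (SLam f) = Lam (reify (d + 1) (f (SNeu (NVar d))))
reify d (SNeu n) = reifyN d n

reifyN :: Int -> Neutral -> Term
reifyN d (NVar k)   = Var (d - k - 1)   -- convert level back to index
reifyN d (NApp n v) = App (reifyN d n) (reify d v)

normalise :: Term -> Term
normalise = reify 0 . eval []

main :: IO ()
main = print (normalise (App (Lam (Var 0)) (Lam (Lam (Var 1)))))
-- prints Lam (Lam (Var 1)): the redex reduced away, K in normal form
```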

𝚛𝚊𝚝 (rat@social.sdf.org)
2025-10-18

"I claimed in the end of the video that this was the first example of animated beta-reductions of visual lambda expressions. Paul Brauner has some videos here: • Lambda Diagrams [1] , although they do not explicitly animate the mechanics of one step of beta-reduction! I probably should have chosen my words more carefully."
youtube.com/watch?v=RcVA8Nj6HE
[1]youtube.com/playlist?list=PLi8
#lambdacalculus #combinators #animation #representation #church #tromp #turing
