#entropy

2026-01-23

🚀 Introducing the Entropy Stability Engine (ESE): a real-time entropy-measurement tool for detecting deterministic loops in LLMs. ✅ Ultra-lightweight, runs on a ZTE Blade A71 (3 GB RAM) via Pydroid 3. ✅ Monitors a token window (default 5), raises an alert when entropy drops below a threshold, and responds by injecting random noise. 🌱 Helps cut wasted GPU cycles and save on token and energy costs; well suited to edge computing. #LLM #AI #Entropy #GreenAI #EdgeComputing #CôngNghệ #MãNguồn #HiệuSuất #TiếtKiệm

reddit.com/r/Loc
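The post describes ESE only at a high level. As a rough illustration of the mechanism it names (a sliding token window whose Shannon entropy is checked against a threshold), here is a minimal Python sketch; the `EntropyMonitor` class, its parameter names, and the default threshold are hypothetical, not ESE's actual API.

```python
import math
from collections import Counter, deque

def shannon_entropy(tokens):
    """Shannon entropy (bits) of the token distribution in a window."""
    counts = Counter(tokens)
    total = len(tokens)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

class EntropyMonitor:
    """Sliding-window entropy watch, loosely modeled on the post's
    description. Names and defaults are illustrative, not ESE's API."""

    def __init__(self, window=5, threshold=1.0):
        self.window = deque(maxlen=window)
        self.threshold = threshold

    def push(self, token):
        self.window.append(token)
        if len(self.window) < self.window.maxlen:
            return False  # not enough context yet
        # Alert when the window's entropy collapses below the threshold,
        # a rough sign of a deterministic loop (e.g. repeated tokens).
        return shannon_entropy(self.window) < self.threshold

monitor = EntropyMonitor(window=5, threshold=1.0)
alerts = [monitor.push(t) for t in ["a", "b", "c", "a", "a", "a", "a", "a"]]
```

Once the stream degenerates into repeats of `"a"`, the window entropy falls below the threshold and the monitor starts flagging.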

@lattera you do great work and you get things done #nomad bsd iso #entropy pools #jails

WITHIN OBLIVION - ENTROPY [OFFICIAL EP STREAM] (2025) SW EXCLUSIVE

peertube.gravitywell.xyz/w/f8P

The Magic Church (themagicchurch)
2026-01-18

Clarity doesn’t always come from tightening the mind.
Sometimes it arrives when familiar structures soften.
Neuroscience calls this increased brain entropy:
a temporary loosening that allows new perspectives to emerge.
Not chaos.
Not collapse.

Insight lives at the boundary between structure and openness.
And integration is what turns that openness into wisdom.

2026-01-16

It took a while, but I'm finally back to writing my blog 😎

The first installment for 2026 is an easy introduction to calculating information #entropy for optical spectra (or for any signal, really).

In my blog, I focus on #data analysis (#chemometrics, machine learning) applied to optical and near-infrared #spectroscopy. Smoothing, or denoising, is one of the most common steps in working with spectroscopy data, and information entropy can be used as a criterion to guide the smoothing process.

Better still, the entropy of the derivative of a signal can help with that, because it accounts for the shape of the signal more naturally.

Read more at nirpyresearch.com/information-

#MachineLearning #NIR #Physics
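The blog's premise above can be illustrated numerically. This sketch assumes a simple histogram-based entropy estimate and a plain moving-average smoother, not necessarily the methods the blog actually uses: smoothing a noisy synthetic spectrum lowers the entropy of its derivative.

```python
import numpy as np

def hist_entropy(x, bins=32):
    """Shannon entropy (bits) of the histogram of a signal's values."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log2(p)).sum())

# Noisy synthetic "spectrum": a Gaussian band plus white noise.
rng = np.random.default_rng(0)
wl = np.linspace(0.0, 1.0, 500)
clean = np.exp(-((wl - 0.5) ** 2) / 0.005)
noisy = clean + rng.normal(0.0, 0.05, wl.size)

# Simple moving-average smoother (a stand-in for e.g. Savitzky-Golay).
smoothed = np.convolve(noisy, np.ones(11) / 11, mode="same")

# The derivative of noise is broad-band, so denoising concentrates the
# derivative's value distribution and its entropy drops.
h_noisy = hist_entropy(np.diff(noisy))
h_smooth = hist_entropy(np.diff(smoothed))
```

The derivative is used here for the reason the post gives: it reflects the shape of the signal, so noise dominates it before smoothing and the entropy difference is easy to see.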

2026-01-09

Lego Smart Play. Next up, the Replicator Con; non-renewables don’t produce more non-renewables or renewables. Turning Earth into technological waste just to defeat human rights couldn’t be more absurd! The road map built on non-renewables only leads to Armageddon. #Entropy #DRDI #TheFederation #EmpireIsBoring #WW3

2026-01-08

“Reality is that which, when you stop believing in it, doesn’t go away”*…

A road trip in Sausalito, California during wildfire season, September 2020. Photo by Gabrielle Lurie/The San Francisco Chronicle/Getty Images

Reality is tough. Everything eats and is eaten. Everything destroys and is destroyed.

In a way that challenges lots of our deeply-seated conceptions (your correspondent’s, anyway), philosopher (and self-proclaimed pessimist) Drew Dalton invokes the laws of thermodynamics to argue that it is our moral duty to strike back at the Universe…

Reality is not what you think it is. It is not the foundation of our joyful flourishing. It is not an eternally renewing resource, nor something that would, were it not for our excessive intervention and reckless consumption, continue to harmoniously expand into the future. The truth is that reality is not nearly so benevolent. Like everything else that exists – stars, microbes, oil, dolphins, shadows, dust and cities – we are nothing more than cups destined to shatter endlessly through time until there is nothing left to break. This, according to the conclusions of scientists over the past two centuries, is the quiet horror that structures existence itself.

We might think this realisation belongs to the past – a closed chapter of 19th-century science – but we are still living through the consequences of the thermodynamic revolution. Just as the full metaphysical implications of the Copernican revolution took centuries to unfold, we have yet to fully grasp the philosophical and existential consequences of entropic decay. We have yet to conceive of reality as it truly is. Instead, philosophers cling to an ancient idea of the Universe in which everything keeps growing and flourishing. According to this view, existence is good. Reality is good.

But what would our metaphysics and ethics look like if we learned that reality was against us?…

Read on for his provocative argument that philosophers must grapple with the meaning of thermodynamics: “Reality is evil,” from @dmdalton.bsky.social in @aeon.co.

Dalton further explores these ideas in his book The Matter of Evil: From Speculative Realism to Ethical Pessimism (2023)

* Philip K. Dick

###

As we wrestle with reality, we might send somewhat sunnier birthday greetings to Stephen William Hawking CH CBE FRS FRSA; he was born on this date in 1942.  A theoretical physicist and cosmologist, he is probably best known in his professional circles for his work with Roger Penrose on gravitational singularity theorems in the framework of general relativity, for his theoretical prediction that black holes emit radiation (now called Hawking radiation), and for his support of the many-worlds interpretation of quantum mechanics.

But Hawking is more broadly known as a popularizer of science.  His A Brief History of Time stayed on the British Sunday Times best-seller list for over four years (a record-breaking 237 weeks), and has sold over 10 million copies worldwide.

“We have this one life to appreciate the grand design of the universe, and for that, I am extremely grateful.”

source

#ABriefHistoryOfTime #blackHoles #Cosmology #entropy #evil #history #humor #lawsOfThermodynamics #philosophy #Physics #reality #Science #StephenHawking #thermodynamics
A woman and three children sitting in a car with an orange-tinted sky, suggesting a smoky or apocalyptic atmosphere.
A black and white portrait of Stephen Hawking smiling while seated in a wheelchair, in an office setting with a computer in the background.
2026-01-06

I've just released Steer v0.4, a tool that filters AI responses based on Shannon entropy. If entropy < 3.5, the answer is blocked and contrastive data is generated for DPO. Open source, deployed as a service mesh to stop sycophancy at the source. Has anyone used an entropy filter in production? #AI #MachineLearning #Entropy #OpenSource #AIVietnam #Lập_trình

reddit.com/r/LocalLLaMA/commen
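As a sketch of the filter described above: the post gives the 3.5-bit threshold but not the entropy unit, so this assumes per-character Shannon entropy, and `passes_filter` is a hypothetical name, not Steer's actual interface.

```python
import math
from collections import Counter

THRESHOLD = 3.5  # bits, as quoted in the post

def char_entropy(text):
    """Per-character Shannon entropy of a string, in bits."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def passes_filter(response, threshold=THRESHOLD):
    """Accept a response only if its character distribution is varied
    enough; repetitive, looping text scores low. A guess at the
    mechanism -- not Steer v0.4's actual API."""
    return char_entropy(response) >= threshold

varied = "Entropy measures the average surprise of a distribution."
looped = "yes yes yes yes yes yes yes yes"
```

Ordinary English prose lands around 4 bits per character under this estimator, while degenerate repetition falls closer to 2, so a 3.5-bit cutoff plausibly separates the two.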

Don Curren 🇨🇦🇺🇦 (dbcurren.bsky.social@bsky.brid.gy)
2025-12-30

The #Aeon piece linked to below argues that we still haven’t really come to grips with the implications of #entropy. I wrote earlier this year about a book that tries to do that in the context of #economics. Rightly or wrongly, I also dragged in #Trump. doncurren.blogspot.com/2025/09/entr...

RE: https://bsky.app/profile/did:plc:5zca2ola2zxpkw37w4f3wxtu/post/3lxmkfhzfn22a



Srijit Kumar Bhadra (srijit@hachyderm.io)
2025-12-26

Entropy generally refers to disorder or uncertainty. Greater entropy means greater uncertainty.

Reading the recent TIME article titled “2026 Will Mark a New World Disorder” [1] led me to think that, in a metaphorical sense, geopolitical entropy, i.e. the growing tendency toward disorder, fragmentation, and unpredictability in international relations, is on the rise. Global power structures have shifted from relatively stable and centralized systems to ones that are more fragmented and less predictable.

Unlike physical entropy in isolated systems, geopolitical entropy is not necessarily irreversible. Human actions, through diplomacy, new institutions, or alliances, can restore order and reduce disorder. Yet, current trends such as multipolarity, declining hegemony, and the spread of emerging threats suggest that geopolitical entropy will keep increasing unless strong corrective measures are taken.

1. time.com/7341023/2026-new-worl

#GeoPolitics #Disorder #Entropy
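The opening definition above, that greater entropy means greater uncertainty, can be made concrete with Shannon's formula H = -Σ p·log₂(p): a uniform distribution over outcomes maximizes uncertainty, while a near-certain one scores close to zero.

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Four equally likely outcomes (maximal uncertainty) vs a near-certain one.
uniform = [0.25, 0.25, 0.25, 0.25]   # H = 2 bits
skewed  = [0.97, 0.01, 0.01, 0.01]   # H ≈ 0.24 bits
```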

Technoholic.me (technoholic)
2025-12-21

🌠 The second law of thermodynamics states that energy spontaneously tends to disperse. Just like how your hot coffee cools down!

2025-12-18

A scientist has just announced LOGOS-ZERO, a new framework that replaces traditional RLHF with a loss function grounded in thermodynamics. The goal: make hallucinations and logic errors "energetically costly" during AI inference. The paper also discusses the L.A.D. phenomenon (hidden errors caused by semantic complexity) in today's leading models. Seeking opinions on the mathematical feasibility of the entropy penalty function in a custom kernel.

#AIAlignment #LOGOSZERO #NhiệtĐộngLựcHọc #Haliongan #LAD #ENTROPY #DeepLearning #AIResearch

Nutt Los (NuttLos)
2025-12-17

"Fortunately, the Earth is not a closed system, because we have the Sun."

I love it. 😄

youtube.com/watch?v=evsBkl8dG8k

Fabio Manganiello (fabio@manganiello.eu)
2025-12-15

@androcat @Ntropic 0 days since having a discussion on the meaning of #entropy 🙂

And a discussion comparing the Boltzmann and Von Neumann/Shannon interpretations would drag me here for hours (and after days of both scientific and philosophical thought exercises over their implications, I still can’t come up with a definition that satisfies both the “thermodynamic” and the “information” interpretations).

Let me clarify what I mean by “low/high entropy of a closed system” in this context with a better (but also abused) analogy when we talk about entropy: the coffee and milk case.

It’s an old adage taught to kids from an early age that when you mix coffee and milk you start with a “low entropy” system (coffee and milk as separate states with their own boundaries) and end up with a “high entropy” system (coffee and milk mixed together).

The underlying concept in this definition of entropy is that of “reversibility” of a reaction, not necessarily the amount of information or independent variables required to accurately model the system. I like this definition better because the principle of least action and the arrow of time just emerge naturally as corollaries of it.

It’s basically much easier to start with a system where the two components are separate and end up with a system where they are mixed than the other way around. This is a definition that even a kid can understand without delving into the logarithmic growth of information states. As @androcat intuitively but eloquently put it:

When a recursive system eats its own shit, it’s not leaking information to some other system, it’s just overwriting it through averaging.

This is exactly what I mean by “high entropy” state in this context.

You can’t get the original coffee grains out of your latte unless you apply some serious molecular wizardry (which btw would just transfer entropy from your cup to whatever machinery you’re using to do your molecular distillation, which is where the concept of “what are the boundaries of your system, where do you need to put your entropic probes” becomes important).

Similarly, you can’t get the original girlfriend meme picture starting from the final “three asexual and featureless abuelas on a black background” picture unless you consume all the energy resources of the planet to do a backwards search through all the possible permutations of stable diffusion that may have led to that outcome (and maybe adding a huge additional adversarial network on top to tell how “realistic” each prediction is).

According to a strictly “informational” definition, however, the entropy of the final system in both cases (latte and three asexual abuelas) may actually be low. After all, we don’t need much information to model a low-variance system that converged onto a local minimum: it’s just the average value.

But the arrow of time of events that led to that state matters here: was the system in that state because the latte or the three abuelas were already in that averaged out state since the moment of the big bang, or is it the output of a system that we can reasonably approximate as a closed one (your coffee cup or a specific snapshot of weights saved on an OpenAI server) which recursively ate back its own shit through a tight feedback loop and simply ended up with a very low-variance normal distribution?

This is the conflict point I see between the two dominant interpretations. And I see it mainly as a problem on the “information” interpretation, because by focusing on “how many independent variables do I need to accurately describe the state of the system?” it settles for a stateless definition that misses out the dimension of time (and reversibility).
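The point above, that a converged, low-variance system needs little information to describe, can be checked numerically. This sketch assumes a fixed binning over the value range (an arbitrary modeling choice): the "mixed", averaged-out state has lower histogram entropy than the "separated" one.

```python
import numpy as np

def hist_entropy(samples, bins):
    """Shannon entropy (bits) of samples under a fixed binning of [0, 1]."""
    counts, _ = np.histogram(samples, bins=bins, range=(0.0, 1.0))
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(1)
# "Separated" coffee/milk: values cluster at the two extremes.
separated = np.concatenate([rng.uniform(0.0, 0.1, 500),
                            rng.uniform(0.9, 1.0, 500)])
# "Mixed" latte after recursive averaging: everything near the mean.
mixed = rng.normal(0.5, 0.01, 1000)

h_separated = hist_entropy(separated, bins=50)
h_mixed = hist_entropy(mixed, bins=50)
```

Which is exactly the conflict the post describes: the stateless informational measure reports the averaged-out end state as low-entropy, even though thermodynamically it is the irreversible, "high-entropy" outcome of the mixing.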

entropy in action, one of my favorite themes, "nature and time reclaiming our own efforts" (also I hope nothing important was in there 👀 )

#NaturePhotography #forest #entropy #ozymandias #reclamation

a thick length of ridged plastic tubing running horizontally across the ground, bleached white from age and cracked about 40% open, with some leafy forest greens growing around and inside it
Bernd von Mallinckrodt 🦋 (vonmallinckrodt.bsky.social@bsky.brid.gy)
2025-12-07

In the #MallinckrodtCycle … #MallinckrodtZyklus, #chaos and #100%order are two forms of the same #entropy. The #OntologyOfVibration #OntologieDerSchwingung … shows that life exists only in between. 🖖

David W. Jones (dancingtreefrog)
2025-12-05

Entropy is the ultimate form of enshittification.
