#Inference

2025-05-22

'On Consistent Bayesian Inference from Synthetic Data', by Ossi Räisä, Joonas Jälkö, Antti Honkela.

jmlr.org/papers/v26/23-1428.ht

#bayesian #privacy #inference

Don Curren 🇨🇦🇺🇦 dbcurren.bsky.social@bsky.brid.gy
2025-05-21

“#Inference is actually quite close to a #theoryofeverything – including #evolution, #consciousness, and #life itself. It is #abduction all the way down.” (The process of abduction may be much more pervasive than the relatively rare use of the word “abduction” would suggest) aeon.co/essays/consc...

Consciousness is not a thing, ...

Hacker News h4ckernews
2025-05-21

Inference, not training, represents an increasing majority of #AI’s energy demands and will continue to do so in the near future. It’s now estimated that 80–90% of computing power for AI is used for #inference. technologyreview.com/2025/05/2

N-gated Hacker News ngate
2025-05-20

🎉 Behold! The llm-d project emerges from the depths of the abyss, promising the holy grail of Kubernetes-native distributed inference. 🤖 Because who doesn't want their LLMs served with extra buzzwords and a side of "competitive performance per dollar"? 🍽️
llm-d.ai/blog/llm-d-announce

Dr Mircea Zloteanu 🌼🐝 mzloteanu
2025-05-19

#346 Jeffreys-Lindley paradox

Thoughts: I like this short explanation of the "paradox" of why frequentist and Bayesian inference can differ.

michael-franke.github.io/intro
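
As a rough numerical illustration (my own sketch, not taken from the linked chapter): with a large sample, a result sitting exactly at the p ≈ 0.05 threshold can still yield a Bayes factor that clearly favours the point null. The sample size n and the prior scale tau below are arbitrary choices made for the demonstration.

```python
# Minimal sketch of the Jeffreys-Lindley paradox: a "just significant"
# frequentist result with large n can strongly support H0 in Bayesian terms.
from math import sqrt
from scipy.stats import norm

n, sigma, tau, z = 100_000, 1.0, 1.0, 1.96   # z fixed at the 5% threshold
xbar = z * sigma / sqrt(n)                   # observed mean implied by z

p_value = 2 * (1 - norm.cdf(z))              # two-sided frequentist p-value

# Marginal likelihoods of xbar under H0: theta = 0 and H1: theta ~ N(0, tau^2)
m0 = norm.pdf(xbar, loc=0, scale=sigma / sqrt(n))
m1 = norm.pdf(xbar, loc=0, scale=sqrt(tau**2 + sigma**2 / n))
bf01 = m0 / m1                               # Bayes factor in favour of H0

print(f"two-sided p-value ~ {p_value:.3f}, Bayes factor BF01 ~ {bf01:.1f}")
# prints roughly: p-value ~ 0.050, BF01 ~ 46, i.e. "significant" yet pro-null
```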

2025-04-26

How to run an LLM locally when its weights do not fit into [video] memory

Some people prefer not to rely only on cloud services and also run an LLM at home. For example, this lets you run fine-tuned, uncensored models, or avoid sending your personal documents to the cloud. Or even run inhumane experiments on an LLM so that a future superintelligence/skynet won't hold it against you later. There are many models optimized to run fast on devices with little memory. Unfortunately, the weights of the most advanced models, the ones that play in the same league as the best online models, take up hundreds of gigabytes. For example, the 8-bit weights of Deepseek R1-671B occupy 700 gigabytes, and the q4-quantized weights about 350. You can quantize down to 1 bit, bringing the size to roughly 90 gigabytes, but such a model is nearly useless. There are also many high-quality finetunes based on Mistral-Large-instruct-130B, Qwen2.5-72B, and llama3.3-70B whose weights likewise do not fit into the memory of even the top-tier consumer GPUs.
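
For flavour, here is a minimal sketch of the usual workaround when a quantized GGUF model is still larger than VRAM: load it with the llama-cpp-python bindings and offload only as many layers as fit on the GPU, leaving the rest in system RAM. This is my own illustration, not the article's recipe (per the hashtags, the article focuses on llama.cpp and Apple hardware); the model path and layer count are placeholders.

```python
# Sketch of partial GPU offload with llama-cpp-python: weights stay
# memory-mapped in system RAM, and only `n_gpu_layers` transformer layers
# are pushed to VRAM. Model path and layer count are placeholders.
from llama_cpp import Llama

llm = Llama(
    model_path="models/some-large-model-q4_k_m.gguf",  # hypothetical GGUF file
    n_gpu_layers=20,   # offload only the layers that actually fit into VRAM
    n_ctx=4096,        # context window size
)

out = llm("Why might a q4-quantized 70B model still not fit on one GPU?",
          max_tokens=128)
print(out["choices"][0]["text"])
```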

habr.com/ru/articles/904172/

#llm #inference #llamacpp #apple

2025-04-22

'DAGs as Minimal I-maps for the Induced Models of Causal Bayesian Networks under Conditioning', by Xiangdong Xie, Jiahua Guo, Yi Sun.

jmlr.org/papers/v26/23-0002.ht

#inference #causal #bayesian

N-gated Hacker News ngate
2025-04-21

🚀 Local LLM inference: where meets in a glorious code spaghetti that no sane developer wants to untangle. 🎉 It's like building a rocket 🚀 only to realize you forgot the launchpad—works great in theory but crashes spectacularly in the real world. 🌎🔧
medium.com/@aazo11/local-llm-i

☮ ♥ ♬ 🧑‍💻 peterrenshaw@ioc.exchange
2025-04-17

Day 19 cont 🙏⛪️🕍🕌⛩️🛕 💽🧑‍💻

“The #LiberalParty has accidentally left part of its email provider’s #subscriber details exposed, revealing the types of #data harvested by the party during the #election campaign.

This gives rare #insight into some of the specific kinds of data the party is keeping on voters, including whether they are “predicted Chinese”, “predicted Jewish”, a “strong Liberal” and other #PersonalInformation.”

#AusPol / #DataScience / #inference / #voters / #Liberal / #LNP / #Nationals <crikey.com.au/2025/04/17/victo>

2025-04-08

Learn how to optimize LLMs for faster inference and better testing using powerful open source tools — boost performance without breaking the bank, with Sho Akiyama & Andre Rusli at #FOSSASIASummit2025

🔗 Click here youtu.be/8BJLqJ7_xcc?si=eBL0tr to watch on the FOSSASIA YouTube channel
#LLM #AI #Inference #OpenSourceTools #FOSSASIA

WIST Quotations wist@my-place.social
2025-04-01

A quotation from Robert Bolt

CROMWELL: Yet is there a man in this court, is there a man in this country, who does not know Sir Thomas More’s opinion of this title? Of course not! But how can that be? Because this silence betokened — nay, this silence was — not silence at all, but most eloquent denial!
 
MORE: (with some of the academic’s impatience for a shoddy line of reasoning) Not so, Mr. Secretary, the maxim is “qui tacet consentire”: The maxim of the law is: (very carefully) “Silence Gives Consent.” If therefore you wish to construe what my silence “betokened,” you must construe that I consented, not that I denied.
 
CROMWELL: Is that in fact what the world construes from it? Do you pretend that is what you wish the world to construe from it?
 
MORE: The world must construe according to its wits. This court must construe according to the law.

Robert Bolt (1924-1995) English dramatist
A Man for All Seasons, play, Act 2 (1960)

Sourcing, notes: wist.info/bolt-robert/76003/

#quote #quotes #quotation #qotd #consent #thomasmore #denial #inference #interpretation #silence
