AI's Memorization Crisis - The Atlantic
https://www.theatlantic.com/technology/2026/01/ai-memorization-research/685552/
> Large language models don’t “learn”—they copy. And that could change everything for the tech industry.
“A large fraction of what #LLMs do is mostly just #memorization …” open.substack.com/pub/garymarc...
Deconstructing Geoffrey Hinton...
Aharon Azulay (@AharonAzulay)
The author says this matches their own observations, noting that these systems can remember even the numerical details of obscure arXiv papers. From a research and verification standpoint, the comment speaks to models' memorization (recall) abilities and their behavior around data provenance.
A quotation from Bill Watterson
CALVIN: As you can see, I have memorized this utterly useless piece of information long enough to pass a test question. I now intend to forget it forever. You’ve taught me nothing except how to cynically manipulate the system. Congratulations.
Bill Watterson (b. 1958) American cartoonist
Calvin and Hobbes (1994-01-27)
More about this quote: wist.info/watterson-bill/81087…
#quote #quotes #quotation #qotd #billwatterson #calvinandhobbes #cramming #cynicism #education #learning #lesson #memorization #rotememorization #school #teaching #test
Ah, the age-old quest for finding the perfect #memorization hack! 🤔 Well, here comes the #spaced #repetition article, promising to revolutionize your brain with a sprinkle of #Haskell and #nootropics. ✨ Just remember, if this method was truly foolproof, #Gwern would be running Apple by now, not blogging about it. 😂
https://gwern.net/spaced-repetition #hacks #HackerNews #ngated
As always: #OpenData persistently available at:
Du, K. (2025). Reconstructing Shuffled Text (Derived Text Formats) [Data set]. Zenodo. https://doi.org/10.5281/zenodo.17198425
#CLS #CCLS25 #DTF #LiteraryComputing #LLM #Memorization
In our own work, we researched memorization in language models for code and methods for making them regurgitate training data:
> From the training data identified as potentially extractable, we were able to extract 47% from a CodeGen-Mono-16B code completion model.
> We also observe that models memorise more as their parameter count grows, and that their pre-training data are also vulnerable to attack.
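The extraction measurement above can be sketched roughly as follows — a toy illustration, not the paper's exact protocol (the function names and the verbatim-match criterion are assumptions): prompt the model with a prefix drawn from its training data and count a sample as extracted when the generated continuation reproduces the true suffix verbatim.

```python
def is_memorized(generated: str, true_suffix: str, k: int = 50) -> bool:
    """Count a sample as extracted when the first k characters of the
    generated continuation match the training-data suffix verbatim."""
    return generated[:k] == true_suffix[:k]

def extraction_rate(pairs, k: int = 50) -> float:
    """Fraction of (generated, true_suffix) pairs that match verbatim."""
    if not pairs:
        return 0.0
    hits = sum(is_memorized(g, s, k) for g, s in pairs)
    return hits / len(pairs)

# Toy usage: in practice `generated` would come from prompting the code
# model (e.g. CodeGen-Mono-16B) with a training-set prefix.
pairs = [
    ("def add(a, b):\n    return a + b", "def add(a, b):\n    return a + b"),
    ("def add(a, b):\n    return a - b", "def add(a, b):\n    return a + b"),
]
print(extraction_rate(pairs, k=30))  # 1 of 2 continuations matches -> 0.5
```

Real evaluations typically compare token sequences rather than raw characters and must first identify which training samples are "potentially extractable" at all, which is where the 47% denominator in the quote comes from.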
Ruling in GEMA v. OpenAI:
> Both the memorization within the language models and the reproduction of the song lyrics in the chatbot's outputs constitute infringements of the copyright exploitation rights.
https://www.justiz.bayern.de/gerichte-und-behoerden/landgericht/muenchen-1/presse/2025/11.php
From Memorization to Reasoning in the Spectrum of Loss Curvature
https://arxiv.org/abs/2510.24256
#HackerNews #Memorization #Reasoning #LossCurvature #MachineLearning #AIResearch
The New York Times thinks a turtle poem will "win your heart" 🐢💔—because nothing screams "captivating" like slow-moving reptiles and deep dives into poetic gravity. 🎼✨ Meanwhile, they offer a #game to help memorize it, as if anyone is clamoring to recite turtle verses at parties. 🎉📜
https://www.nytimes.com/interactive/2025/06/12/books/kay-ryan-turtle-poem.html #turtlepoem #NewYorkTimes #poetry #memorization #heartwarming #HackerNews #ngated
Interesting, "GPT-style models have a fixed memorization capacity of approximately 3.6 bits per parameter."
https://venturebeat.com/ai/how-much-information-do-llms-really-memorize-now-we-know-thanks-to-meta-google-nvidia-and-cornell/
#ai #memorization #llm
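Taking the headline figure at face value, the capacity estimate converts to concrete storage sizes with simple arithmetic — a back-of-the-envelope sketch (the 7B model size is a hypothetical example, not from the article):

```python
BITS_PER_PARAM = 3.6  # headline estimate quoted above

def memorization_capacity_bytes(n_params: float) -> float:
    """Total fixed memorization capacity in bytes for a model with
    n_params parameters, at 3.6 bits per parameter."""
    return n_params * BITS_PER_PARAM / 8  # 8 bits per byte

# e.g. a hypothetical 7B-parameter model:
gb = memorization_capacity_bytes(7e9) / 1e9
print(f"{gb:.2f} GB")  # 7e9 * 3.6 / 8 bytes = 3.15 GB
```

So even a mid-sized model's raw capacity is a few gigabytes — small relative to a multi-terabyte training corpus, which is consistent with memorization being concentrated on repeated or atypical samples.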
How much information do LLMs really memorize? Now we know, thanks to Meta, Google, Nvidia and Cornell https://venturebeat.com/ai/how-much-information-do-llms-really-memorize-now-we-know-thanks-to-meta-google-nvidia-and-cornell/ #AI #memorization #copyright
I gladly return to the subject of the ineptitude of our education. Its goal has been to make us not good or wise, but learned; it has attained this goal. It has not taught us to follow and embrace virtue and wisdom, but has imprinted in us their derivation and etymology. We know how to decline virtue, if we cannot love it. If we do not know what wisdom is by practice and experience, we know it by jargon and by rote.
[Je retombe volontiers sur ce discours de l’ineptie de nostre institution : Elle a eu pour sa fin, de nous faire, non bons & sages, mais sçavans : elle y est arrivée. Elle ne nous a pas appris de suyvre & embrasser la vertu & la prudence : mais elle nous en a imprimé la derivation & l’etymologie. Nous sçavons decliner vertu, si nous ne sçavons l’aymer. Si nous ne sçavons que c’est que prudence par effect, & par experience, nous le sçavons par jargon & par cœur.]
Michel de Montaigne (1533-1592) French essayist
Essay (1578), “Of Presumption [De la Presomption],” Essays, Book 2, ch. 17 (2.17) (1595) [tr. Frame (1943)]
Sourcing, notes, alternate translations: wist.info/montaigne-michel-de/…
#quote #quotes #quotation #qotd #montaigne #education #learning #meaning #memorization #morality #rote #school #understanding #virtue #wisdom
We readily inquire, “Does he know Greek or Latin?” “Can he write poetry and prose?” But what matters most is what we put last: “Has he become better and wiser?” We ought to find out not who understands most but who understands best. We work merely to fill the memory, leaving the understanding and the sense of right and wrong empty.
[Nous enquerons volontiers, Sçait-il du Grec ou du Latin ? escrit-il en vers ou en prose ? mais, s’il est devenu meilleur ou plus advisé, c’estoit le principal, & c’est ce qui demeure derriere. Il falloit s’enquerir qui est mieux sçavant, non qui est plus sçavant. Nous ne travaillons qu’à remplir la memoire, & laissons l’entendement & la conscience vuide.]
Michel de Montaigne (1533-1592) French essayist
Essay (1572-1578), “Of Pedantry [Du pedantisme],” Essays, Book 1, ch. 24 (1.24) (1595) [tr. Screech (1987), ch. 25]
Sourcing, notes, alternate translations: wist.info/montaigne-michel-de/…
#quote #quotes #quotation #Montaigne #comprehension #education #evaluation #improvement #learning #memorization #rubric #school #student #teaching #understanding #wisdom
A quotation from William Feather
An education isn’t how much you have committed to memory, or even how much you know. It’s being able to differentiate between what you do know and what you don’t. It’s knowing where to go to find out what you need to know, and it’s knowing how to use the information once you get it.
William Feather (1889-1981) American publisher, author
(Attributed)
Sourcing, notes: wist.info/feather-william/1479…
#quote #quotes #quotation #application #competence #education #ignorance #knowledge #memorization #research
Counting to high numbers and reciting poems to suppress evil thoughts in Charles Dickens’s “Hard Times” (1854). #111Words #CharlesDickens #HardTimes #Poetry #Counting #Recitation #Memorization https://andrewjshields.blogspot.com/2025/03/counting-to-high-numbers-and-reciting.html