Man Strava should not have been allowed to buy Fat Map. Like the whole functionality is just gone
Staff Rust-lang developer. Backcountry powersports nerd. Adventure motorcyclist. Dirtbiker. Snowmobiler. Ski tourer. Trad rock climber. Former dungeon master.
Delegitimization of “AI” by only ever saying “AI” in quotes while it has no actual intelligence
If you happen to be looking for bad redactions in a large set of data files today for some reason, there's an open source tool for that.
"AI" was always a marketing term. Machine-learning--that-works being marketed as "AI" in 2025 is doing so to ride the LLM hype.
also, all the considerable money in ML - buckets of it - comes from the chatbot vendors. This leaves the field functionally corrupt.
if they don't want the stain of the chatbot? Their move.
@juliangruber in theory but not in practice.
For minds (actual intelligence) every output even just thought with no externalized physical action is also an input which changes the mind. It’s a system not a function.
Note that this is not really essential to capitalism specifically ... It’s more a derivative work of related financial inventions.
If we want to talk about real-world problems, stock markets that let you earn on the derivatives of investments are probably the most morally corrupt worldwide issue
Also hot take: having any share in the stock market (granted or bought) (especially in mutual funds) is more morally contemptible than using LLMs
It’s so fucking stupid that people call LLMs “AI”.
Artificial Intelligence does not exist. At least not for any useful definition of “intelligence”.
Making an input-output box good at predicting what next output should be from the input is not intelligence.
The last quote is from https://statmodeling.stat.columbia.edu/2014/08/04/correlation-even-imply-correlation/
And, this is just a fancy way to say “LLMs are incapable of ground truth”.
It seems to me that, logically, one should never trust the output of a language model on a topic the reader has no prior knowledge of, because its only understanding is language, which is not perfectly descriptive; that, I would assert, reduces to correlation all the way down for descriptors, and “correlation does not even imply correlation”.
Man everything on Amazon these days is knock-off shit. It’s becoming hard to get quality niche items in Canada (once again, for the worse)
cursor, no, you can't change your rule files to fit your solution, you asshole
What could possibly go wrong?
monomorphization is all fun and games until one function generates 234,836 lines of LLVM IR
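For context on the joke above: monomorphization means the compiler stamps out a separate copy of a generic function for every concrete type it is called with, so generated code volume scales with instantiations. A minimal sketch (a hypothetical `largest` function, not the author's code; each call site with a new type adds another monomorphized copy to the LLVM IR):

```rust
// A generic function: the compiler emits one monomorphized copy
// per concrete type it is instantiated with.
fn largest<T: PartialOrd + Copy>(items: &[T]) -> T {
    let mut max = items[0];
    for &x in &items[1..] {
        if x > max {
            max = x;
        }
    }
    max
}

fn main() {
    // Two instantiations here: largest::<i32> and largest::<f64>.
    // Each becomes its own function in the compiled output.
    assert_eq!(largest(&[1, 5, 3]), 5);
    assert_eq!(largest(&[1.5, 0.5]), 1.5);
}
```

Multiply that by deeply nested generics and many type parameters, and a single function can indeed balloon into hundreds of thousands of lines of IR.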