if you don't get it: it's fine
Abstraction maker, abstraction breaker. FUN FACT: things I prefix with FUN FACT are sometimes fun and sometimes factual, but very rarely both.
if you don't get it: it's fine
brought a Geiger counter to the orchestra pit
30 or so
@pkhuong that's how they get you!
@lritter welcome to the light side
For context, Lou Wilson is an improv comedian/actor, and this character is definitely him doing a bit (the ads/plugs are all fake). The game here is that this is basically a double act where the straight-man part is played by what seems to be an actual therapist who is just having none of it the entire time.
this is awkward and uncomfortable and yet I can't look away
"Trauma Dump with Lou Wilson" https://www.youtube.com/watch?v=-Y018tpsJJg this is a very strange type of performance art but it's mesmerizing.
I'M SORRY SIR, BUT THAT IS NOT HOW RECEIPTS WORK
maybe I'm just too German but I don't think I'll ever get over the "Do you need a receipt?" - "Yes." - *hands me a blank receipt for me to fill out myself* interaction I've had with multiple cabbies in the last 15 years
SEO reedwagon
artichoke is extra. artless chokes are available for free
@lritter @Doomed_Daniel anyway, so:
1. no actual Turing-complete computing device has ever been, or will ever be, built. (Because infinite storage doesn't exist.)
2. the reason we bother with these idealized models is that incorporating limits into the formalisms makes everything _so much_ harder to work with
3. still, it is wise to remember that everything is an (unreliable) DFA
4. "they're deterministic machines but our ability to understand them is limited" https://www.sigmicro.org/media/oralhistories/colwell.pdf p. 154
@lritter @Doomed_Daniel I point this out because it leads somewhere interesting: so even if our parsers for non-regular langs are actually still parsers for regular langs, why do we bother with them?
And the answer is that _especially_ when subject to storage constraints etc., expressive power really matters.
So while you could turn your matched-parens parser with some bound n on nesting depth into a DFA with extra states, that's just no way to live. :) (And it may itself blow up your storage limits.)
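A minimal sketch of that trade-off (the function names and the explicit transition table are my illustration, not from the thread): the same depth-bounded matched-parens recognizer written twice, once as a counter and once as the materialized DFA. The counter needs about log2(max_depth) bits; the table needs a state per depth.

```python
def accepts_counter(s, max_depth):
    # Counter version: one integer, roughly log2(max_depth) bits of state.
    depth = 0
    for c in s:
        if c == '(':
            depth += 1
        elif c == ')':
            depth -= 1
        else:
            return False
        if depth < 0 or depth > max_depth:
            return False
    return depth == 0

def accepts_dfa(s, max_depth):
    # "DFA with extra states" version: one state per depth 0..max_depth,
    # plus a reject sink, spelled out as an explicit transition table.
    REJECT = max_depth + 1
    delta = {(q, '('): (q + 1 if q < max_depth else REJECT)
             for q in range(max_depth + 1)}
    delta.update({(q, ')'): (q - 1 if q > 0 else REJECT)
                  for q in range(max_depth + 1)})
    q = 0
    for c in s:
        q = delta.get((q, c), REJECT)
        if q == REJECT:
            return False
    return q == 0

for s in ["(())", "(()", "())(", ""]:
    assert accepts_counter(s, 8) == accepts_dfa(s, 8)
```

The table is already max_depth + 2 states for a two-symbol alphabet; with a real token set it multiplies out fast, which is the "no way to live" part.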
@lritter @Doomed_Daniel And for practical parsers you don't need galactic values of n. For example, every lang impl that uses recursive (not manual-stack) traversals of ASTs will generally barf for very moderate values of n (as in, a few hundred to a few tens of thousands), and hardened codebases that try to handle adversarial inputs will usually still give up after a couple million or so.
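A toy demonstration of that barfing point (the parser is my sketch, not anyone's real impl): a recursive matcher for fully nested parens in CPython dies at the default recursion limit of about 1000 frames, nowhere near any storage bound.

```python
import sys

def parse(s, i=0):
    # Parses one fully nested group "((...))"; recursion depth equals
    # nesting depth, so the call stack is the hidden pushdown stack.
    if i < len(s) and s[i] == '(':
        i = parse(s, i + 1)
        assert i < len(s) and s[i] == ')', "unbalanced"
        return i + 1
    return i

n = 10_000
try:
    parse('(' * n + ')' * n)
    print("parsed nesting depth", n)
except RecursionError:
    print("barfed well below n =", n,
          "(sys.getrecursionlimit() =", sys.getrecursionlimit(), ")")
```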
@lritter @Doomed_Daniel it's just
(^n )^n
i.e. n open parens, then n close parens, for some n. And then you choose n to be large enough to not fit into all data storage available on the planet. (Multiply it by a couple quadrillion more, for a healthy safety margin.)
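Rough arithmetic for that choice of n (the storage figure is my ballpark, not the thread's): even the leanest recognizer has to count to n, which takes about log2(n) bits, so it suffices to pick an n whose counter alone outgrows the planet.

```python
# Ballpark: global digital storage is commonly estimated on the order of
# 10**23 bits; multiply by "a couple quadrillion" per the safety margin.
planet_bits = 10**23 * 10**15

# A recognizer for (^n )^n must distinguish n + 1 nesting depths, i.e. hold
# at least ceil(log2(n + 1)) bits. So pick, on paper only (never evaluate!):
#     n = 2**(planet_bits + 1)
# Then the depth counter alone needs more bits than exist on Earth.
print(f"the counter alone would need > {planet_bits:.2e} bits")
```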
@lritter @Doomed_Daniel Along the same lines, the distinction between different types of automata is theoretically very significant but practically speaking, any automaton that's ever been implemented is a finite state machine.
Never mind Turing completeness, in this universe we can't even build an actual pushdown automaton. Even for the most basic "matched parens" context-free grammar, I can give you a trivial input that's in the language but that no actual recognizer will ever be able to accept.
@Doomed_Daniel @lritter and if the next state is a pure function of the current state, then revisiting any state means you're stuck in a cycle forever.
i.e. if you have N bits of state total, then any program that terminates does so in at most 2^N steps.
@Doomed_Daniel @lritter it's not RAM and it's not about increasing memory usage, it's purely about the size of the state space.
For this statement to actually work as Leonard intends, you also need to include all other parts of the state (e.g. register values, including internal microarchitectural ones) in the calculation.
It's ultimately just the pigeonhole principle: assume the total size of the machine state is N bits. Then within 2^N + 1 steps, you must have a repeated state.
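The pigeonhole argument is easy to watch at toy scale (an assumed 8-bit "machine", not anything from the thread): iterate any pure next-state function and a state must repeat within 2^N + 1 steps, after which you're in a cycle.

```python
def demo(step, n_bits, start=0):
    # Iterate a deterministic next-state function over an n_bits-wide state.
    # By pigeonhole a state repeats within 2**n_bits + 1 steps, and since
    # the next state is a pure function of the current one, everything from
    # the first repeat onward is a cycle.
    seen = {}
    state = start
    for t in range(2**n_bits + 1):
        if state in seen:
            print(f"state {state} repeated at step {t} "
                  f"(first seen at step {seen[state]}) -> cycle")
            return
        seen[state] = t
        state = step(state)
    raise AssertionError("unreachable, by pigeonhole")

# A toy 8-bit "CPU": the next state is a pure function of the current state.
demo(lambda s: (s * 5 + 17) % 256, n_bits=8)
```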
@lritter rape and pillage. https://www.youtube.com/watch?v=uJqEKYbh-LU
when a pig does tricks with its back legs, that's sleight of ham
@pervognsen But you keep harping on the logistics of it and I don't see what purpose any of it serves?