Discussion about this post

Swen Werner:

Autoregression is a generative method, not a linguistic description. Language sequences can be generated autoregressively, but language itself is structured by grammar, logic, and semantic constraints. These structures encode operations such as comparison, implication, and hierarchy, and it is precisely these features that exceed local token dependency. Transformer models simulate reasoning by statistically approximating such constraints, but that is neither cognition nor conceptual understanding. Recognizing language as structured logic is not a novel claim. What is novel, and dangerously so, is your assertion that sequence generation alone constitutes comprehension. That is an error in understanding.
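[For readers new to the term: "autoregressive" here just means factoring the sequence probability as p(x_1, ..., x_T) = ∏_t p(x_t | x_<t) and sampling one token at a time. A minimal Python sketch of that loop, where `next_token_distribution` is a toy stand-in for any trained model:

```python
import random

def next_token_distribution(context):
    """Toy stand-in for a trained model: maps a context to a
    distribution over the next token. This bigram table only looks
    at the last token; a real transformer conditions on the whole
    context, but the sampling loop below is the same."""
    table = {
        "the": {"cat": 0.5, "dog": 0.5},
        "cat": {"sat": 0.7, "ran": 0.3},
        "dog": {"sat": 0.4, "ran": 0.6},
        "sat": {"<eos>": 1.0},
        "ran": {"<eos>": 1.0},
    }
    return table[context[-1]]

def generate(prompt, max_len=10):
    tokens = list(prompt)
    for _ in range(max_len):
        dist = next_token_distribution(tokens)
        # The autoregressive step: sample x_t ~ p(x_t | x_<t).
        choices, weights = zip(*dist.items())
        nxt = random.choices(choices, weights=weights)[0]
        if nxt == "<eos>":
            break
        tokens.append(nxt)
    return tokens

print(generate(["the"]))  # e.g. ['the', 'cat', 'sat']
```

Nothing in this loop represents implication or hierarchy explicitly; any such structure can only show up as statistical regularity in the learned distribution, which is the distinction being drawn above.]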

Takim Williams:

Cool! I think an autoregressive model of cognition also explains the prevalence of contradictory beliefs better than traditional models do. An individual simply has the propensity to espouse different beliefs in different contexts...
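[A toy illustration of that point, assuming nothing beyond standard conditional sampling: a single model of p(claim | context) can assign high probability to mutually contradictory claims under different contexts, with no global consistency check reconciling them. The contexts, claims, and probabilities below are invented for illustration:

```python
# Toy conditional "belief" model: P(claim | context). Nothing here
# enforces consistency across contexts, so contradictory claims can
# each be the modal output in their own context.
p_claim_given_context = {
    "talking to a vegan friend": {
        "eating meat is wrong": 0.8,
        "eating meat is fine": 0.2,
    },
    "ordering at a steakhouse": {
        "eating meat is wrong": 0.1,
        "eating meat is fine": 0.9,
    },
}

def espoused_claim(context):
    # Return the highest-probability claim for this context only.
    dist = p_claim_given_context[context]
    return max(dist, key=dist.get)

print(espoused_claim("talking to a vegan friend"))  # "eating meat is wrong"
print(espoused_claim("ordering at a steakhouse"))   # "eating meat is fine"
```
]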
