AI, Human Cognition and Knowledge Collapse – Daron Acemoglu

From the abstract: "We study how generative AI, and in particular agentic AI, shapes human learning incentives and the long-run evolution of society’s information ecosystem... Learning exhibits economies of scope: costly human effort jointly produces a private signal about their own context and a “thin” public signal that accumulates into the community’s stock of general knowledge, generating a learning externality... The model highlights a sharp dynamic tension: while agentic AI can improve contemporaneous decision quality, it can also erode learning incentives that sustain long-run collective knowledge."
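The dynamic tension the abstract describes can be illustrated with a toy simulation. This is not Acemoglu's actual model; the functional forms, parameters, and the `simulate` helper below are all made up for illustration. The idea is just that effortful human learning is worse today but feeds a public knowledge stock, while leaning on AI is better today but lets the stock decay.

```python
# Toy sketch (NOT the paper's model): invented functional forms and
# parameters, chosen only to illustrate the short-run vs long-run trade-off.

def simulate(use_ai: bool, periods: int = 50, delta: float = 0.05):
    """Return per-period decision quality under a crude knowledge dynamic."""
    K = 1.0                  # initial stock of collective knowledge
    quality = []
    for _ in range(periods):
        if use_ai:
            effort = 0.0     # AI answers directly, so no costly learning
            q = 0.9 * K      # decisions lean heavily on the existing stock
        else:
            effort = 1.0     # humans dig for a private signal themselves
            q = 0.6 * K      # slower and noisier today...
        quality.append(q)
        # ...but effort also emits a "thin" public signal into the stock,
        # while the stock depreciates at rate delta either way.
        K = (1 - delta) * K + 0.1 * effort
    return quality

no_ai = simulate(use_ai=False)
with_ai = simulate(use_ai=True)
print(with_ai[0] > no_ai[0])    # True: AI wins contemporaneously
print(with_ai[-1] < no_ai[-1])  # True: the stock erodes, AI loses long run
```

With these arbitrary numbers the AI path starts with higher decision quality but ends lower, because no one replenishes the knowledge stock it draws on; the learning externality is exactly that individual users ignore the `0.1 * effort` term when deciding whether to exert effort.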

3 points | by aanet 9 hours ago

3 comments

  • allinonetools_ 17 minutes ago
    One thing I have been noticing is that when AI answers everything instantly, people stop digging deeper themselves. It helps with speed, but it may reduce the small bits of learning that normally accumulate over time. The long-term effect on how we build shared knowledge will be interesting to watch.
  • ahmed-fathi 9 hours ago
    When you struggle through a hard problem, you get two things: the answer, and a slightly sharper mind. AI gives you the first and skips the second. That's fine once. Scaled across an entire generation of knowledge workers, over a decade that's the collapse Acemoglu is worried about. We're not just outsourcing tasks. We're outsourcing the friction that makes people grow.
  • Cognitive_2026 6 hours ago
    The collapse is real, but it isn't inevitable; it's mostly a design choice. Most AI tools optimize for speed and answer quality, which removes the friction that builds understanding. But tools could be designed differently: ask users for their attempt first, explain reasoning after answers, or gradually remove guidance as skill grows. The tricky part is that "makes you think harder" usually hurts engagement metrics. Curious: are there any tools already trying to do this well?