---
authors: lada_kesseler
video: https://www.youtube.com/watch?v=_LSK2bVf0Lc&t=968s
---

# Limited Focus (Obstacle)

## Description

LLMs have limited attention. Everything you load into context competes for that attention.

When too much is loaded at once, the model does one of two things:

- Dilutes attention across everything (stays shallow)
- Fixates on the wrong parts (misses what matters)

## Impact

- Worse performance on all tasks when context is too broad
- Even explicit ground rules get ignored
- A longer, focused context outperforms a shorter, scattered one
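One practical response to this obstacle is to curate context deliberately: select only the material relevant to the current task, up to a budget, rather than loading everything. A minimal sketch (the function name `select_context` and the toy keyword scorer are illustrative assumptions, not from the talk):

```python
# Hypothetical sketch: keep context focused by selecting only the
# snippets most relevant to the task, within a token budget.
def select_context(snippets, relevance, budget):
    """snippets: list of (text, token_count); relevance: text -> score."""
    # Rank by relevance, highest first (Python's sort is stable,
    # so ties keep their original order).
    ranked = sorted(snippets, key=lambda s: relevance(s[0]), reverse=True)
    chosen, used = [], 0
    for text, tokens in ranked:
        if used + tokens > budget:
            continue  # skip anything that would blow the budget
        chosen.append(text)
        used += tokens
    return chosen

snippets = [("api docs for parser", 40), ("unrelated changelog", 30),
            ("parser bug report", 50), ("style guide", 20)]
relevance = lambda t: float("parser" in t)  # toy keyword scorer
print(select_context(snippets, relevance, budget=100))
# → ['api docs for parser', 'parser bug report']
```

The point is not the scoring mechanism (in practice that might be embeddings or manual curation) but the discipline: everything admitted to context competes for attention, so admission should be selective.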