Karpathy mentioned entropy collapse in LLMs, where they stop doing interesting things because they just don’t have interesting things in their training data
tempting to think they just need randomness, but i think that would make it worse
they need memory & experience
i tend to prefer hiring people who are on their second career bc i think it avoids some of the same entropy collapse that happens in LLMs
entropy is the opposite of randomness, it’s just a new source of dense signal
meh, by “entropy” i mean “low entropy”, i.e. the opposite of what i said. just vibe with it
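for reference, a toy sketch of what "entropy" of a next-token distribution means here (the distributions are made up, not from any real model): a collapsed, low-entropy distribution is nearly certain of one token, while a spread-out, high-entropy one looks like randomness

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# "collapsed" distribution: the model is nearly certain of one token
collapsed = [0.97, 0.01, 0.01, 0.01]
# spread-out distribution: many tokens are equally plausible
spread = [0.25, 0.25, 0.25, 0.25]

print(entropy(collapsed))  # ~0.24 bits -> low entropy, predictable/repetitive
print(entropy(spread))     # 2.0 bits  -> high entropy, looks like randomness
```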