“LLMs are deterministic!”
no they’re not
“well, if you set temperature = 0”
still no
“and remove floating point calculations?”
keep going
“and only serve one query at a time?”
you’re so close
“LLMs are deterministic!”
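a minimal sketch of what the "remove floating point calculations" step is pointing at: float addition isn't associative, so the same terms accumulated in a different order give a different answer (plain Python floats here, i.e. IEEE 754 doubles):

```python
# Float addition is not associative: same terms, different
# grouping, different low-order bits.
a = (0.1 + 0.2) + 0.3   # accumulate left to right
b = 0.1 + (0.2 + 0.3)   # same terms, different order

print(a)        # 0.6000000000000001
print(b)        # 0.6
print(a == b)   # False
```

so even at temperature = 0, any kernel whose reduction order depends on batch size or thread scheduling can emit slightly different logits for the same prompt.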
Thinking Machines wrote a superb blog post on determinism in LLMs, and it is fascinating (if you like computer architecture; otherwise you're going to tear your hair out)
spoiler: did you know GPUs have instructions that aren’t deterministic?
thinkingmachines.ai/blog/defeati...
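to make the spoiler concrete, here's a sketch in plain Python standing in for a GPU atomic reduction (think atomicAdd): each "thread" commits its contribution in whatever order the hardware happens to schedule it, and because float addition order matters, the same inputs can yield different sums. the values are contrived to make the rounding visible:

```python
import itertools

# Stand-in for a GPU atomic reduction: each "thread" adds its
# contribution in whatever order it happens to finish.
contributions = [1.0, 1e16, -1e16]  # contrived so rounding is dramatic

results = set()
for order in itertools.permutations(contributions):
    total = 0.0
    for x in order:
        total += x  # the atomicAdd stand-in
    results.add(total)

print(results)  # two distinct answers from the same three numbers
```

(1.0 + 1e16 rounds the 1.0 away entirely, so orders that cancel the big terms first keep it, and orders that don't lose it.)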
while you’re at it, Groq (not Grok) does fast inference by removing all non-determinism, ALL THE WAY DOWN
it’s a cool trade-off. they basically turn hardware into a compiler problem
blog.codingconfessions.com/p/groq-lpu-d...
strong “buckle up!” vibes