dear GPT-5, i’m pretty sure “storagely” is not a word, but that oddly makes a...
honestly this model really does feel “big”, but in a totally different way than R1, K2 & Opus. like a whole ass new direction for depth
it’s like that nerdy kid whose words can’t keep up with their brain
1 hour later
my reading on softmax & entropy in LLMs has led me into economics
you can indeed calculate the “temperature” (the same quantity as the LLM sampling parameter) of an economy’s wealth distribution across the years
and it’s been done:
apps3.aps.org/aps/units/ma...
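(side-note sketch: in that econophysics line of work, money in a closed economy tends toward a Boltzmann-Gibbs exponential, P(m) ∝ exp(−m/T), and the “temperature” T comes out as the average money per agent. here’s a minimal sketch of fitting that temperature; the income data below is synthetic, not taken from the linked paper)

```python
import numpy as np

rng = np.random.default_rng(0)

# synthetic "economy": exponentially distributed money, true temperature T = 50
true_T = 50.0
money = rng.exponential(scale=true_T, size=100_000)

# for a Boltzmann-Gibbs distribution P(m) = (1/T) exp(-m/T),
# the maximum-likelihood estimate of T is just the mean money per agent
T_hat = money.mean()
print(f"fitted temperature: {T_hat:.2f}")  # ~50

# sanity check: log of the histogram should be linear in m with slope -1/T
counts, edges = np.histogram(money, bins=50, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
mask = counts > 0
slope = np.polyfit(centers[mask], np.log(counts[mask]), 1)[0]
print(f"temperature from log-linear fit: {-1 / slope:.2f}")
```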
expanding slightly: a low temperature drives the “wealth” (probability mass, for LLMs) into fewer hands, i.e. the tokens with the highest logits
whereas higher temperatures redistribute it more equally, without regard to the model’s (economy’s) biases
so temperature is a knob that can either exaggerate or wash out the model’s internal bias or structure
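(a minimal sketch of the analogy in code: the same logits sampled at three temperatures; the logit values are made up, but the concentrate-vs-flatten effect is the general one)

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax: divide logits by T before normalizing."""
    z = np.asarray(logits, dtype=float) / temperature
    z -= z.max()  # subtract max for numerical stability
    p = np.exp(z)
    return p / p.sum()

# made-up logits with a built-in "bias": one token is clearly preferred
logits = [4.0, 2.0, 1.0, 0.5]

for T in (0.2, 1.0, 5.0):
    p = softmax(logits, T)
    print(f"T={T:<4} probs={np.round(p, 3)}")

# T=0.2 -> nearly all mass on the top logit ("wealth" in few hands)
# T=1.0 -> the model's raw preferences
# T=5.0 -> close to uniform, the bias mostly washed out
```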