Apparently OpenAI plans on releasing “Onion” next week, potentially as GPT-5.2 or GPT-5.5
Shallotpeat = a huge new model, fixing pretraining bugs in GPT-4.5
Onion = lessons learned from shallotpeat applied to an efficient GPT-5 base (i think)
www.theinformation.com/articles/ope...
all of this seems triggered directly by Google’s Gemini 3 Pro
Gemini proved that pre-training scaling is not dead. Word is that while OpenAI is absurdly good at post-training, Google is so good at pre-training that post-training gains just fall into their lap
i think sama’s bet is that if they can catch up on their pre-training game, openai will utterly dominate. and if they don’t, Google will
so yeah, pretraining isn’t dead, not at all