OpenAI — GPT6 will be about continual learning
Anthropic — ???
GDM — pushing context out on smaller models
Chinese labs — hordes of sparse/long attention algos
it seems like everyone is betting on:
1. continual learning
2. that long context enables it
fwiw the alternatives are:
Thinking Machines — LoRAs (rough sketch after this list)
Cartridges — distill the long context into a small, trained KV cache
memory finetuning — halfway between cartridges & LoRAs
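for concreteness, a minimal sketch of the LoRA route: freeze the base weights and train a tiny low-rank update per task or user, cheap enough to treat as swappable memory. this is illustrative PyTorch only, not any lab's actual implementation; the class name and hyperparameters are made up.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Frozen base linear layer plus a trainable low-rank update (illustrative)."""
    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # base model stays frozen
        # only these low-rank factors are trained per task / user "memory"
        self.lora_A = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(base.out_features, rank))  # zero init: no change at start
        self.scale = alpha / rank

    def forward(self, x):
        # frozen base output plus the learned low-rank correction
        return self.base(x) + (x @ self.lora_A.T @ self.lora_B.T) * self.scale

layer = LoRALinear(nn.Linear(512, 512), rank=8)
trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
print(trainable)  # 8192 trainable params vs ~262k frozen in the base layer
```

the appeal vs. pure long context: the per-user state is a few thousand weights you can merge, swap, or keep training, rather than tokens the model has to re-attend to on every call.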