Bluesky Thread

the GPT-5 system prompt explicitly says not to ask clarifying questions


i feel like we're hitting the bitter lesson on theory of mind. they're trying to fix every last behavior bug, but human preferences are blatantly contradictory; you simply need to understand the person
Yann Dubois
@yanndubs
X.com
Fixing this is very high on the priority list for the next version!
The reason it says this in the system prompt is that the model was asking too many clarification questions (and thinking for a long time on each), which IMO was even worse UX.
part of the problem is most people probably don't type enough for the AI to pick up a theory of mind, and there are no facial expressions to read, so it's not exactly possible for the AI to solve this one computationally
26 likes, 1 repost
