Bluesky Thread

codex truncated tool calls

apparently it’s because the tokenizer isn’t open source and isn’t compatible with the one that tiktoken ships

the API call to measure token length is too slow, and they don’t want to cut it too close, so they just truncate
Mario Zechner (@badlogicgames) on X.com
This is even funnier. Codex will truncate any Bash/MCP tool output to 256 lines or 10 KB. If the tool call outputs more than that, the model only gets to see the first 128 lines and the last 128 lines, but nothing in the middle.
Been like that since August. No wonder poor GPT is slow going in circles trying to understand WTF it's actually seeing.
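The middle-out truncation described in that post could be sketched roughly like this; the 256-line cap and the omission marker are assumptions for illustration, not Codex's actual implementation (which reportedly also applies a 10 KB byte cap):

```shell
# Hypothetical sketch: keep the first 128 and last 128 lines of a file,
# drop everything in between, and note how many lines were omitted.
truncate_middle() {
    file="$1"
    max=256
    n=$(wc -l < "$file")
    if [ "$n" -le "$max" ]; then
        cat "$file"
    else
        head -n $((max / 2)) "$file"
        echo "... [$((n - max)) lines omitted] ..."
        tail -n $((max / 2)) "$file"
    fi
}
```

Run on a 300-line file, this would emit lines 1 to 128, a marker noting 44 omitted lines, then lines 173 to 300.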
from what i’ve seen, gpt5 in codex will aggressively self-limit, e.g. using `sed -n 10,20p` to read small chunks instead of the full file.

so I'm not sure how bad a problem it is in practice
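The chunked-reading pattern mentioned above is easy to reproduce by hand; the temp file here is just a stand-in for whatever the model would be inspecting:

```shell
# Minimal demo: check the file's size first, then pull a small window
# of lines with sed instead of reading the whole file at once.
tmp=$(mktemp)
seq 1 1000 > "$tmp"
wc -l < "$tmp"          # how long is the file?
sed -n '10,20p' "$tmp"  # print only lines 10 through 20
rm -f "$tmp"
```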