codex truncated tool calls
apparently it’s because the tokenizer isn’t open source and isn’t compatible with the one that ships with tiktoken
measuring length via an API call is too slow, and they don’t want to cut it too close to the limit, so they just truncate
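roughly the idea, as a minimal sketch (the function name, the 10 KiB cap, and the head/tail strategy are all made up for illustration, not Codex’s actual implementation):

```python
# Hypothetical sketch: cap tool output by bytes instead of counting tokens.
# Exact counting would need the model's own tokenizer (not the one tiktoken
# ships) or a slow API round trip, so a conservative byte budget stands in.
MAX_OUTPUT_BYTES = 10 * 1024  # assumed cap, not the real value


def truncate_tool_output(output: str, limit: int = MAX_OUTPUT_BYTES) -> str:
    data = output.encode("utf-8")
    if len(data) <= limit:
        return output
    # keep the head and tail, drop the middle, and say so
    head = data[: limit // 2].decode("utf-8", errors="ignore")
    tail = data[-(limit // 2):].decode("utf-8", errors="ignore")
    return f"{head}\n[... output truncated ...]\n{tail}"
```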
from what i’ve seen, gpt5 in codex will aggressively self-limit, e.g. using `sed -n 10,20p` to read small chunks instead of the full file.
so not sure how bad of a problem it is in practice
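for illustration, the same chunked read in a few lines of Python (hypothetical helper, not something Codex exposes):

```python
# Hypothetical sketch: read only a small line range, same idea as
# `sed -n '10,20p' file` -- grab lines 10-20 instead of the whole file.
def read_line_range(path: str, start: int, end: int) -> str:
    with open(path, encoding="utf-8") as f:
        return "".join(line for i, line in enumerate(f, 1) if start <= i <= end)


print(read_line_range("example.py", 10, 20))  # "example.py" is a placeholder
```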