Expected Behavior
The `deepseek-v4-pro` model natively supports a 1M token context window. When configuring and using this model within the Command Code agent, it should be able to utilize the full 1M context limit to handle large-scale codebase analysis and reasoning tasks.
Actual Behavior
The context limit for the `deepseek-v4-pro` model in the agent is currently restricted to 200k tokens. When the provided context exceeds this 200k limit, the agent cannot process it fully, which underutilizes the model's actual capability.
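To illustrate the practical impact, here is a rough sketch (not part of the original report) of how a mid-sized codebase can exceed a 200k token cap while still fitting comfortably in a 1M window. The ~4 characters-per-token heuristic and the example codebase size are assumptions for illustration only.

```python
# Illustration only: rough token estimates against the two context limits.
# The 4 chars/token ratio is a common heuristic, not an exact tokenizer count.

CHARS_PER_TOKEN = 4          # assumed rough average for code and English text
RESTRICTED_LIMIT = 200_000   # cap currently observed in the agent
NATIVE_LIMIT = 1_000_000     # window deepseek-v4-pro natively supports


def estimate_tokens(num_chars: int) -> int:
    """Crude token estimate from a character count."""
    return num_chars // CHARS_PER_TOKEN


def fits(num_chars: int, limit: int) -> bool:
    """Whether text of this size fits within a given token limit."""
    return estimate_tokens(num_chars) <= limit


# A hypothetical codebase of ~2 million characters (~500k estimated tokens)
codebase_chars = 2_000_000
print(fits(codebase_chars, RESTRICTED_LIMIT))  # False: exceeds the 200k cap
print(fits(codebase_chars, NATIVE_LIMIT))      # True: within the 1M window
```

Under these assumptions, any input in roughly the 200k-1M token range is exactly the workload the cap blocks.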
Command Code Version
0.25.1
Operating System
macOS
Terminal/IDE
No response
Shell
No response
Additional context
No response