whichllm — Browse and compare AI model specs and pricing

LLM Gateway

DeepSeek V4 Pro

deepseek-thinking

Model overview

DeepSeek V4 Pro is an AI model available via LLM Gateway with a 1,000,000-token context window and text input support.

Published pricing is $1.74 per 1M input tokens and $3.48 per 1M output tokens.

Suggested use cases:
  • Workloads with text inputs and text outputs.
  • Agent and tool workflows that need function calling.
  • Reasoning-heavy prompts where stepwise problem solving matters.
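As a quick sanity check on the published pricing, a request's cost can be estimated from its token counts. A minimal sketch; the helper name and the example token counts are illustrative, and only the $1.74 / $3.48 per-1M rates come from the listing above:

```python
def request_cost(input_tokens: int, output_tokens: int,
                 input_per_m: float = 1.74,       # published input rate, $/1M tokens
                 output_per_m: float = 3.48) -> float:  # published output rate, $/1M tokens
    """Estimate the USD cost of one request at the published per-1M-token rates."""
    return (input_tokens / 1_000_000 * input_per_m
            + output_tokens / 1_000_000 * output_per_m)

# e.g. a 50k-token prompt with a 2k-token completion:
print(f"${request_cost(50_000, 2_000):.4f}")
```

The same arithmetic by hand: 0.05M input tokens × $1.74 + 0.002M output tokens × $3.48 ≈ $0.094 per request.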
Model ID: deepseek-v4-pro
Provider: LLM Gateway
Family: deepseek-thinking
Status: -
Knowledge Cutoff: 2025-05
Release Date: 2026-04-24
Input Modalities: text
Output Modalities: text
Context Window: 1,000,000 tokens
Input Limit: -
Output Limit: 384,000 tokens
Tool Calling: Yes
Reasoning: Yes
Structured Output: Yes
Temperature Control: Yes
Open Weights: Yes
Input Cost / 1M tokens: $1.74
Output Cost / 1M tokens: $3.48
Reasoning Cost / 1M tokens: -
Cache Read Cost / 1M tokens: $0.14
Cache Write Cost / 1M tokens: -
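Because cached prompt tokens are billed at the $0.14 cache-read rate instead of the $1.74 input rate, workloads that reuse a long shared prefix pay substantially less for input. A minimal sketch of that comparison; the function name and the 100k/80k token split are illustrative assumptions, while the two rates come from the table above:

```python
def input_cost(total_tokens: int, cached_tokens: int = 0,
               fresh_per_m: float = 1.74,        # published input rate, $/1M tokens
               cached_per_m: float = 0.14) -> float:  # published cache-read rate, $/1M tokens
    """Input cost when part of the prompt is served from the provider's cache."""
    fresh_tokens = total_tokens - cached_tokens
    return (fresh_tokens / 1_000_000 * fresh_per_m
            + cached_tokens / 1_000_000 * cached_per_m)

# A 100k-token prompt whose first 80k tokens hit the cache, vs fully uncached:
print(f"${input_cost(100_000, 80_000):.4f} cached vs ${input_cost(100_000):.4f} uncached")
```

At these rates a fully cached prefix costs roughly 8% of the uncached price, so the savings grow quickly for agent loops that resend the same system prompt and tool definitions on every turn.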