DeepSeek
Code · 2024-06-17

DeepSeek Coder V2

DeepSeek Coder V2 is one of the leading open-source coding models.

#coding#open-source#MoE

Overview

DeepSeek Coder V2 is a Mixture-of-Experts model that ranks among the strongest open-source coding assistants. It supports 338 programming languages and handles complex repository-level tasks.

Unique Factor

Support for 338 programming languages and top-tier coding performance.

Key Capabilities

338 programming languages
MoE architecture
Coding SOTA
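To make the MoE capability above concrete, here is a minimal sketch of top-k expert routing, the core idea behind Mixture-of-Experts layers: a router scores every expert per token, but only the k best experts actually run. All names (`moe_forward`, `gate_w`, the toy expert functions) are illustrative assumptions, not DeepSeek's implementation.

```python
import numpy as np

def moe_forward(x, gate_w, experts, k=2):
    """Route one token through the top-k experts of a toy MoE layer.

    x: (d,) token embedding; gate_w: (d, n_experts) router weights;
    experts: list of callables, each mapping (d,) -> (d,).
    """
    logits = x @ gate_w                    # router score for each expert
    top = np.argsort(logits)[-k:]          # indices of the k highest-scoring experts
    weights = np.exp(logits[top])
    weights /= weights.sum()               # softmax over only the selected experts
    # Only the chosen experts execute, so per-token compute scales with k,
    # not with the total expert count (how 236B total params stay cheap to run).
    return sum(w * experts[i](x) for w, i in zip(weights, top))

# Toy usage: 4 experts, each a distinct linear map over an 8-dim embedding.
rng = np.random.default_rng(0)
d, n = 8, 4
experts = [lambda v, W=rng.standard_normal((d, d)) * 0.1: v @ W for _ in range(n)]
gate_w = rng.standard_normal((d, n))
y = moe_forward(rng.standard_normal(d), gate_w, experts)
print(y.shape)  # (8,)
```

The design point this illustrates is why the model lists 236B parameters yet serves cheaply: only a small fraction of experts is active for any given token.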

Benchmarks

MMLU: 81%
HumanEval (coding): 90.2%
GPQA Diamond: 65%
MATH: 80%

Top Use Cases

Legacy Code Migration

Converting code between less common languages.

Example: “Convert this Fortran 77 code to modern C++.”
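As a sketch of how the migration prompt above could be sent to the model, the snippet below builds an OpenAI-style chat request body. The endpoint URL, the `deepseek-coder` model identifier, and the message schema are assumptions based on DeepSeek's OpenAI-compatible API, not details taken from this page.

```python
import json

API_URL = "https://api.deepseek.com/chat/completions"  # assumed endpoint

# Hypothetical legacy snippet standing in for real Fortran 77 input.
fortran_snippet = """      PROGRAM MEAN
      REAL A(3)
      DATA A /1.0, 2.0, 3.0/
      PRINT *, (A(1)+A(2)+A(3))/3.0
      END"""

request_body = {
    "model": "deepseek-coder",  # assumed model identifier
    "messages": [
        {"role": "system",
         "content": "You are a careful code-migration assistant."},
        {"role": "user",
         "content": "Convert this Fortran 77 code to modern C++:\n" + fortran_snippet},
    ],
    "temperature": 0.0,  # deterministic output suits migration tasks
}

# The body would be POSTed to API_URL with an Authorization header.
print(json.dumps(request_body)["model" in request_body])
```

Setting `temperature` to 0 is a common choice for migration work, where reproducible output matters more than variety.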

Technical Specs

Context: 128,000 tokens
Params: 236B (MoE)
License: MIT
Arch: MoE

API Pricing

Input: $0.14 / 1M tokens

Output: $0.28 / 1M tokens

✓ Free tier available
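The listed rates make cost estimates a one-line calculation. A small sketch, using the input/output prices quoted above and a hypothetical call size:

```python
# Rates from the pricing section: $0.14 / 1M input, $0.28 / 1M output tokens.
IN_RATE, OUT_RATE = 0.14, 0.28  # USD per million tokens

def cost_usd(input_tokens: int, output_tokens: int) -> float:
    """Bill for one API call at the listed per-token rates."""
    return input_tokens / 1e6 * IN_RATE + output_tokens / 1e6 * OUT_RATE

# Example: a 100k-token repository-level prompt plus a 4k-token answer.
print(f"${cost_usd(100_000, 4_000):.4f}")  # → $0.0151
```

Even a prompt filling most of the 128K context costs only a few cents at these rates, which is why repository-level tasks are practical here.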

Developer

DeepSeek is a Chinese AI research lab delivering frontier-level open-source models.

Prompt Library

Browse Coding Prompts


Previous Version

DeepSeek Coder V1.5