GitHub Advisory Database
A security vulnerability database that includes CVEs and GitHub-originated security advisories from the world of open source software.
GitHub reviewed advisories by ecosystem:
  All reviewed: 5,000+
  Composer: 4,743
  Erlang: 35
  GitHub Actions: 29
  Go: 2,315
  Maven: 5,000+
  npm: 3,949
  NuGet: 711
  pip: 3,729
  Pub: 12
  RubyGems: 920
  Rust: 965
  Swift: 38

Unreviewed advisories:
  All unreviewed: 5,000+
20 advisories
vLLM Tool Schema allows DoS via Malformed pattern and type Fields
  Moderate · CVE-2025-48944 · published for vllm (pip) · May 28, 2025

vLLM allows clients to crash the openai server with invalid regex
  Moderate · CVE-2025-48943 · published for vllm (pip) · May 28, 2025

vLLM DOS: Remotely kill vllm over http with invalid JSON schema
  Moderate · CVE-2025-48942 · published for vllm (pip) · May 28, 2025

vLLM has a Weakness in MultiModalHasher Image Hashing Implementation
  Moderate · CVE-2025-46722 · published for vllm (pip) · May 28, 2025

Potential Timing Side-Channel Vulnerability in vLLM’s Chunk-Based Prefix Caching
  Low · CVE-2025-46570 · published for vllm (pip) · May 28, 2025

vLLM vulnerable to Regular Expression Denial of Service
  Moderate · GHSA-j828-28rj-hfhp · published for vllm (pip) · May 28, 2025

vLLM has a Regular Expression Denial of Service (ReDoS, Exponential Complexity) Vulnerability in `pythonic_tool_parser.py`
  Moderate · CVE-2025-48887 · published for vllm (pip) · May 28, 2025

vLLM Allows Remote Code Execution via PyNcclPipe Communication Service
  Critical · CVE-2025-47277 · published for vllm (pip) · May 20, 2025

Remote Code Execution Vulnerability in vLLM Multi-Node Cluster Configuration
  High · CVE-2025-30165 · published for vllm (pip) · May 6, 2025

phi4mm: Quadratic Time Complexity in Input Token Processing leads to denial of service
  Moderate · CVE-2025-46560 · published for vllm (pip) · Apr 29, 2025

vLLM Vulnerable to Remote Code Execution via Mooncake Integration
  Critical · CVE-2025-32444 · published for vllm (pip) · Apr 29, 2025

Data exposure via ZeroMQ on multi-node vLLM deployment
  High · CVE-2025-30202 · published for vllm (pip) · Apr 29, 2025

CVE-2025-24357 Malicious model remote code execution fix bypass with PyTorch < 2.6.0
  Critical · GHSA-ggpf-24jw-3fcw · published for vllm (pip) · Apr 23, 2025

vLLM vulnerable to Denial of Service by abusing xgrammar cache
  Moderate · GHSA-hf3c-wxg2-49q9 · published for vllm (pip) · Apr 15, 2025

xgrammar Vulnerable to Denial of Service (DoS) by abusing unbounded cache in memory
  Moderate · CVE-2025-32381 · published for xgrammar (pip) · Apr 9, 2025

vLLM deserialization vulnerability in vllm.distributed.GroupCoordinator.recv_object
  Critical · CVE-2024-9052 · published for vllm (pip) · Mar 20, 2025

vLLM Allows Remote Code Execution via Mooncake Integration
  Critical · CVE-2025-29783 · published for vllm (pip) · Mar 19, 2025

vLLM denial of service via outlines unbounded cache on disk
  Moderate · CVE-2025-29770 · published for vllm (pip) · Mar 19, 2025

vLLM uses Python 3.12 built-in hash() which leads to predictable hash collisions in prefix cache
  Low · CVE-2025-25183 · published for vllm (pip) · Feb 6, 2025

vllm: Malicious model to RCE by torch.load in hf_model_weights_iterator
  High · CVE-2025-24357 · published for vllm (pip) · Jan 27, 2025
ProTip! Advisories are also available from the GraphQL API.
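For example, a query against the `securityAdvisories` connection of GitHub's GraphQL schema (field names as published in that schema) might look like:

```graphql
# Fetch the five most recently published advisories
query RecentAdvisories {
  securityAdvisories(first: 5, orderBy: {field: PUBLISHED_AT, direction: DESC}) {
    nodes {
      ghsaId
      summary
      severity
      publishedAt
      identifiers {
        type
        value
      }
    }
  }
}
```

The query is sent as a POST to the GraphQL endpoint and requires an authenticated token.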