CVE-2026-40164: Critical Hash Collision Vulnerability in jq JSON Processor Enables DoS Attacks
A hardcoded seed in jq's MurmurHash3 implementation allows attackers to craft malicious JSON payloads causing severe CPU exhaustion. The vulnerability affects CI/CD pipelines and web services processing JSON data.
# The JSON Processing Weakness That Could Freeze Your Computer
jq is a popular command-line tool that helps programmers process and analyze JSON data — think of it as a specialized search engine for data files. The problem is that jq uses a predictable method to organize this data internally, like a filing system that assigns every document to a drawer using the same publicly known rule.
An attacker can exploit this by crafting a malicious JSON file that deliberately abuses that predictable rule. Instead of the normal near-instant lookup, the computer has to search through mountains of data one item at a time. It's like someone deliberately filing 100,000 sheets of paper in a single drawer so that finding anything takes forever.
The real impact: if you run jq on untrusted data — say, processing user-uploaded files or data from an unverified source — an attacker could make your computer grind to a halt, consuming all available processing power. For companies processing large amounts of data automatically, this could crash servers or make services unavailable.
Who's most at risk? DevOps teams and data engineers who use jq to process files from the internet, plus any web services that use jq internally without carefully vetting their inputs.
What should you do? First, update jq to a version that includes the fix — the developers have already patched this in the source code, so install the newest release as soon as your package manager offers it. Second, be cautious about processing JSON files from unknown or untrusted sources. Third, if you're a technical user, consider validating file sizes and key counts before processing with jq as a temporary safeguard.
This vulnerability is a good reminder that even boring-sounding data processing tools need security attention, especially when they're used on files you don't fully trust.
Want the full technical analysis? Click "Technical" above.
The jq command-line JSON processor, widely used across development environments and production systems, contains a critical vulnerability (CVE-2026-40164) that enables attackers to cause severe denial-of-service conditions through hash collision attacks. The vulnerability stems from the use of a hardcoded, publicly visible seed (0x432A9843) in the MurmurHash3 algorithm that backs jq's JSON object hash tables.
The flaw allows malicious actors to precompute hash collisions offline and craft JSON payloads that degrade hash table lookups from O(1) average case to O(n), which in turn drives jq expressions that traverse the object to O(n²) overall. The result is severe CPU exhaustion and resource depletion, particularly on automated systems that process untrusted JSON data.
Technical details
The vulnerability exists in jq's hash table implementation prior to commit 0c7d133c3c7e37c00b6d46b658a02244fdd3c784. The core issue lies in the predictable nature of the MurmurHash3 seed value, which remains constant across all installations and executions of jq.
MurmurHash3, while generally providing good hash distribution, becomes vulnerable when the seed value is known to attackers. The hardcoded seed 0x432A9843 is embedded in the source code and publicly accessible, enabling adversaries to reverse-engineer hash collision scenarios offline.
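The collision search hinges on being able to reproduce the hash locally. As a minimal sketch, here is the public 32-bit x86 MurmurHash3 algorithm in Python, parameterized on the seed; this follows the reference algorithm, not jq's actual C code, so the exact byte handling jq applies to keys is an assumption here:

```python
def murmur3_32(data: bytes, seed: int) -> int:
    """Public-reference MurmurHash3_x86_32, parameterized on the seed."""
    c1, c2 = 0xCC9E2D51, 0x1B873593
    h = seed & 0xFFFFFFFF
    n = len(data) - (len(data) % 4)
    # Body: mix each 4-byte little-endian block into the state.
    for i in range(0, n, 4):
        k = int.from_bytes(data[i:i + 4], "little")
        k = (k * c1) & 0xFFFFFFFF
        k = ((k << 15) | (k >> 17)) & 0xFFFFFFFF
        k = (k * c2) & 0xFFFFFFFF
        h ^= k
        h = ((h << 13) | (h >> 19)) & 0xFFFFFFFF
        h = (h * 5 + 0xE6546B64) & 0xFFFFFFFF
    # Tail: fold in the remaining 1-3 bytes, if any.
    k = 0
    tail = data[n:]
    if len(tail) >= 3:
        k ^= tail[2] << 16
    if len(tail) >= 2:
        k ^= tail[1] << 8
    if len(tail) >= 1:
        k ^= tail[0]
        k = (k * c1) & 0xFFFFFFFF
        k = ((k << 15) | (k >> 17)) & 0xFFFFFFFF
        k = (k * c2) & 0xFFFFFFFF
        h ^= k
    # Finalization: avalanche the bits.
    h ^= len(data)
    h ^= h >> 16
    h = (h * 0x85EBCA6B) & 0xFFFFFFFF
    h ^= h >> 13
    h = (h * 0xC2B2AE35) & 0xFFFFFFFF
    h ^= h >> 16
    return h
```

With the seed fixed at 0x432A9843, every call to `murmur3_32(key, 0x432A9843)` is fully reproducible on the attacker's machine, which is exactly what makes offline collision search possible.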
When processing JSON objects, jq stores key-value pairs in hash tables for efficient lookup operations. Under normal circumstances, this provides O(1) average-case performance. However, when multiple keys hash to the same bucket due to intentional collisions, the hash table degrades to linked-list behavior, requiring linear search through all colliding entries.
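To make the degradation concrete, the following toy separate-chaining table (a sketch for illustration, not jq's implementation) counts key comparisons when bucket indices are spread out versus forced into a single bucket:

```python
# Toy separate-chaining hash table that counts key comparisons,
# to contrast spread-out bucket indices with forced collisions.
class ChainedTable:
    def __init__(self, nbuckets: int = 64):
        self.buckets = [[] for _ in range(nbuckets)]
        self.comparisons = 0

    def insert(self, bucket_index: int, key: str, value: int) -> None:
        chain = self.buckets[bucket_index % len(self.buckets)]
        for i, (k, _) in enumerate(chain):
            self.comparisons += 1          # linear scan of the chain
            if k == key:
                chain[i] = (key, value)
                return
        chain.append((key, value))

n = 1000
spread, collided = ChainedTable(), ChainedTable()
for i in range(n):
    spread.insert(i, f"k{i}", i)    # indices spread across buckets
    collided.insert(0, f"k{i}", i)  # every key forced into bucket 0
# collided performs n*(n-1)/2 comparisons in total: a quadratic blow-up
# from the same number of inserts.
```

The spread table does a few thousand comparisons for 1,000 inserts; the collided table does nearly half a million, which is the O(n²) behavior the advisory describes.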
The attack leverages this weakness by constructing JSON objects in which numerous keys (typically requiring a payload of roughly 100KB) all resolve to the same hash bucket. Each lookup then costs O(n), and because building or traversing the object performs a lookup per key, the total work grows to O(n²).
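The offline precomputation step can be sketched as follows. CRC32 stands in for jq's seeded hash (an assumption purely for illustration); the mechanics are identical for any hash function the attacker can reproduce:

```python
import itertools
import json
import zlib

NBUCKETS = 1024   # assumed table size; a real attack targets jq's sizing
TARGET = 0        # bucket every crafted key should land in
WANTED = 500      # number of colliding keys to collect

def bucket(key: str) -> int:
    # Stand-in for the victim's predictable (seeded) hash function.
    return zlib.crc32(key.encode()) % NBUCKETS

# Offline search: keep only candidate keys that land in the target bucket.
colliding = []
for i in itertools.count():
    key = f"k{i}"
    if bucket(key) == TARGET:
        colliding.append(key)
        if len(colliding) == WANTED:
            break

# Every key collides, so on the victim each lookup scans one long chain.
payload = json.dumps({key: 1 for key in colliding})
```

Note that the search runs entirely on the attacker's hardware with no traffic to the target, which is why the preparation phase leaves nothing to detect.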
Attack vector and exploitation
Exploitation requires an attacker to supply a specially crafted JSON payload to a system running vulnerable jq versions. The attack vector is particularly effective because:
Offline preparation: Attackers can precompute hash collisions without interacting with target systems, making detection difficult during the preparation phase.
Small payload, large impact: A relatively modest ~100KB JSON file can cause severe resource exhaustion, making this an efficient attack vector.
Universal applicability: The same malicious payload works against any vulnerable jq installation due to the hardcoded seed.
Common exploitation scenarios include submitting malicious JSON through web APIs that use jq for data processing, uploading crafted configuration files to systems that parse them with jq, or targeting CI/CD pipelines that process JSON artifacts. The attack doesn't require authentication in many cases, as jq is often used to process publicly submitted data.
Affected systems
This vulnerability affects all jq installations across multiple platforms prior to the security patch. Given jq's cross-platform nature, vulnerable systems include:
Linux distributions with jq packages installed via package managers (apt, yum, pacman), macOS systems with jq installed through Homebrew or MacPorts, Windows environments running jq through WSL, Cygwin, or native builds, and containerized applications including jq in their images.
Particularly at risk are automated systems such as CI/CD pipelines processing JSON artifacts, web services using jq for API data transformation, log processing systems that parse JSON logs with jq, and data analytics platforms leveraging jq for JSON manipulation.
The vulnerability impacts both direct jq usage and applications that embed jq functionality through language bindings or system calls.
Detection and indicators of compromise
Organizations should monitor for several key indicators that may suggest exploitation attempts:
Performance anomalies: Unusual CPU spikes coinciding with JSON processing operations, especially when processing time appears disproportionate to payload size.
Resource exhaustion patterns: Systems experiencing memory pressure or becoming unresponsive during JSON parsing tasks that previously completed quickly.
Payload analysis: JSON inputs containing unusually high numbers of object keys, particularly when keys appear randomly generated or follow specific patterns designed to create hash collisions.
Log analysis should focus on identifying requests containing large JSON objects with suspicious key structures. Network monitoring can help detect repeated submissions of similar large JSON payloads that may indicate ongoing exploitation attempts.
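A lightweight triage heuristic is to measure how concentrated an object's keys are under a predictable hash. The sketch below again uses CRC32 as a stand-in (an assumption); real monitoring would need the hash function and seed the deployed jq build actually uses:

```python
import collections
import zlib

def bucket_skew(doc: dict, nbuckets: int = 1024) -> float:
    # Fraction of keys landing in the single most-populated bucket.
    # Values near 1.0 on large objects suggest engineered collisions.
    if not doc:
        return 0.0
    counts = collections.Counter(
        zlib.crc32(key.encode()) % nbuckets for key in doc
    )
    return max(counts.values()) / len(doc)
```

Benign objects spread keys roughly evenly, so the ratio stays low for large objects, while an engineered payload concentrates almost all keys in one bucket.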
Remediation
Immediate remediation requires updating jq to a version containing commit 0c7d133c3c7e37c00b6d46b658a02244fdd3c784 or later. This patch implements dynamic seed generation, eliminating the predictable hash collision vulnerability.
Short-term mitigations for systems unable to update immediately include implementing input validation to reject JSON objects exceeding reasonable key count thresholds, deploying rate limiting on endpoints processing JSON with jq, and monitoring resource usage to detect potential exploitation attempts.
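As a sketch of the input-validation idea, the guard below rejects oversized or key-dense documents before they reach jq; the specific thresholds are assumptions to tune per workload:

```python
import json

MAX_BYTES = 1_000_000   # assumed size ceiling
MAX_KEYS = 10_000       # assumed ceiling on total object keys

def count_keys(node) -> int:
    # Total object keys anywhere in the document, not just the top level.
    if isinstance(node, dict):
        return len(node) + sum(count_keys(v) for v in node.values())
    if isinstance(node, list):
        return sum(count_keys(v) for v in node)
    return 0

def safe_for_jq(raw: bytes) -> bool:
    # Reject oversized, malformed, or key-dense documents up front.
    if len(raw) > MAX_BYTES:
        return False
    try:
        doc = json.loads(raw)
    except ValueError:
        return False
    return count_keys(doc) <= MAX_KEYS
```

A wrapper script would pass the raw bytes to jq only when `safe_for_jq` returns True; since collision payloads need thousands of colliding keys to matter, a key-count ceiling removes most of the attack surface while the patch rolls out.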
Long-term security measures should include establishing automated vulnerability scanning for jq installations, implementing robust input validation for all JSON processing workflows, and considering alternative JSON processors for high-risk environments during the transition period.
CypherByte assessment
CypherByte rates this vulnerability as high risk due to its broad impact potential and ease of exploitation. The combination of universal applicability across all jq installations and the ability to precompute attacks offline makes this particularly dangerous for organizations processing untrusted JSON data.
The vulnerability's impact on automated systems amplifies its severity, as successful exploitation can disrupt critical business processes with minimal attacker effort. Organizations should prioritize patching efforts, particularly for internet-facing systems and automated processing pipelines.
While no in-the-wild exploitation has been observed, the straightforward nature of this attack vector suggests that proof-of-concept exploits will likely emerge quickly once technical details become widely known.