Denial of Service (Resource exhaustion)
Description
The commit prevents a potential Denial of Service (memory/resource exhaustion) by capping how much of the HTTP error response body is read when promscrape receives a non-200 response. Previously, the code read the entire response body (io.ReadAll) for non-200 HTTP statuses, which could allow a malicious or misbehaving endpoint to exhaust memory (OOM) by returning a very large body. The fix introduces a limited reader (GetLimitedReader) with a maximum size (maxScrapeSize+1) and reads only up to that bound, mitigating unbounded reads and the resulting memory exhaustion.
Impact: This changes the behavior of error-paths in lib/promscrape/client.go, ensuring that error bodies are not read in full when the endpoint is misbehaving or malicious. This is a genuine security improvement, not just a dependency bump or a cosmetic change.
Proof of Concept
PoC concept (not a full exploit harness):
1) Set up a local HTTP server that simulates a misbehaving metrics endpoint by returning a very large response body with a non-200 status.
Example server (runnable Go, for illustration):
package main

import (
	"net/http"
)

// hugeErrorHandler simulates a misbehaving metrics endpoint: it returns a
// non-200 status and streams a very large body without buffering it in memory.
func hugeErrorHandler(w http.ResponseWriter, r *http.Request) {
	w.WriteHeader(http.StatusInternalServerError)
	chunk := make([]byte, 1024*1024) // 1 MB chunks
	for i := 0; i < 200; i++ {       // ~200 MB total
		if _, err := w.Write(chunk); err != nil {
			break
		}
		if f, ok := w.(http.Flusher); ok {
			f.Flush()
		}
	}
}

func main() {
	http.HandleFunc("/metrics", hugeErrorHandler)
	http.ListenAndServe(":8080", nil)
}
2) Simulate the pre-fix client behavior by reading the entire body into memory on a non-200 status. This mirrors what the vulnerable code path did before the patch:
package main

import (
	"io"
	"log"
	"net/http"
)

func main() {
	resp, err := http.Get("http://localhost:8080/metrics")
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()
	// Unbounded read (pre-fix behavior): the entire body is buffered in
	// memory regardless of its size.
	body, err := io.ReadAll(resp.Body)
	if err != nil {
		log.Fatal(err)
	}
	log.Printf("read %d bytes", len(body))
}
3) Expected outcome:
- Before the fix: the client could exhaust memory attempting to read a very large error body from a misbehaving endpoint.
- After the fix: the read is bounded by maxScrapeSize+1, preventing unbounded memory growth. You should observe that the client reads only up to the configured limit and avoids OOM even when the endpoint sends a huge payload.
Prerequisites: Run the server and the client locally, ensuring the client under test uses the patched code path (promscrape ReadData with the limited reader). The PoC demonstrates the risk and mitigation but should be executed in a controlled environment to avoid resource exhaustion.
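The end-to-end mitigation can also be demonstrated self-contained with net/http/httptest, avoiding a long-running local server. This is a sketch, not the actual promscrape code: io.LimitReader stands in for ioutil.GetLimitedReader, and the 1 MB maxScrapeSize is an assumed value for illustration:

```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"net/http/httptest"
)

// newHugeErrorServer streams a large non-200 body, standing in for the
// misbehaving endpoint from step 1 of the PoC.
func newHugeErrorServer(megabytes int) *httptest.Server {
	return httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		w.WriteHeader(http.StatusInternalServerError)
		chunk := make([]byte, 1<<20)
		for i := 0; i < megabytes; i++ {
			if _, err := w.Write(chunk); err != nil {
				return
			}
		}
	}))
}

// readBoundedErrorBody applies the post-fix pattern: the read is capped at
// maxScrapeSize+1 bytes, so memory use stays bounded however large the body is.
func readBoundedErrorBody(url string, maxScrapeSize int64) ([]byte, error) {
	resp, err := http.Get(url)
	if err != nil {
		return nil, err
	}
	defer resp.Body.Close()
	return io.ReadAll(io.LimitReader(resp.Body, maxScrapeSize+1))
}

func main() {
	srv := newHugeErrorServer(16) // ~16 MB error body
	defer srv.Close()
	const maxScrapeSize = 1 << 20 // assumed 1 MB limit for illustration
	body, err := readBoundedErrorBody(srv.URL, maxScrapeSize)
	if err != nil {
		panic(err)
	}
	fmt.Println(len(body) == maxScrapeSize+1) // true: the read stopped at the bound
}
```

Only maxScrapeSize+1 of the ~16 MB payload is ever buffered, which is exactly the post-fix outcome described in step 3.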
Commit Details
Author: Nikolay
Date: 2026-04-16 20:50 UTC
Message:
lib/promscrape: prevent unbounded scrape error body read
Previously, on non-200 HTTP status codes, lib/promscrape performed an
unbounded body read, which could potentially result in OOM.
This commit adds a maxScrapeSize limit to error response body reads,
protecting against malicious or misbehaving metrics endpoints.
Triage Assessment
Vulnerability Type: Denial of Service (Resource exhaustion)
Confidence: HIGH
Reasoning:
The commit limits the maximum size of the response body read for non-200 HTTP responses to prevent unbounded reads that could exhaust memory (OOM) or degrade service. This mitigates a potential denial-of-service vulnerability from malicious or misbehaving endpoints.
Verification Assessment
Vulnerability Type: Denial of Service (Resource exhaustion)
Confidence: HIGH
Affected Versions: <1.139.0
Code Diff
diff --git a/lib/promscrape/client.go b/lib/promscrape/client.go
index 84e9b18c8f563..080d586fa20ad 100644
--- a/lib/promscrape/client.go
+++ b/lib/promscrape/client.go
@@ -159,7 +159,9 @@ func (c *client) ReadData(dst *chunkedbuffer.Buffer) (bool, error) {
if resp.StatusCode != http.StatusOK {
metrics.GetOrCreateCounter(fmt.Sprintf(`vm_promscrape_scrapes_total{status_code="%d"}`, resp.StatusCode)).Inc()
- respBody, err := io.ReadAll(resp.Body)
+ lr := ioutil.GetLimitedReader(resp.Body, c.maxScrapeSize+1)
+ respBody, err := io.ReadAll(lr)
+ ioutil.PutLimitedReader(lr)
if err != nil {
respBody = []byte(err.Error())
}