python.large_response_memory
Stability: Medium
Detects HTTP response body reads that load entire responses into memory without size limits, which can cause out-of-memory errors.
Why It Matters
Unbounded response reads can:
- Crash your service — A large response can exhaust available memory
- Enable DoS attacks — Malicious endpoints can send huge responses
- Cause cascading failures — OOM kills can affect other services on the same node
- Degrade performance — Large memory allocations slow down garbage collection
Example
```python
import requests

# ❌ Before (unbounded memory)
response = requests.get(url)
data = response.content  # Loads entire response into memory

# Also problematic
data = response.json()  # Parses entire response
```

```python
# ✅ After (streaming with limits)
MAX_SIZE = 10 * 1024 * 1024  # 10 MB limit

response = requests.get(url, stream=True)
content_length = int(response.headers.get('content-length', 0))
if content_length > MAX_SIZE:
    raise ValueError(f"Response too large: {content_length}")

# Stream in chunks
chunks = []
size = 0
for chunk in response.iter_content(chunk_size=8192):
    size += len(chunk)
    if size > MAX_SIZE:
        raise ValueError("Response exceeded maximum size")
    chunks.append(chunk)

data = b''.join(chunks)
```
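If the caller needs parsed JSON, decode it only after the bounded read. A minimal sketch reusing `data` from the example above:

```python
import json

# Safe counterpart to response.json(): parse only the size-checked bytes.
payload = json.loads(data)
```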
What Unfault Detects
- `response.content` without a Content-Length check
- `response.json()` on responses from external services
- `response.text` on unbounded responses
- `aiohttp` response reads without size limits
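For illustration, a snippet containing the kinds of reads listed above; the endpoint URL is hypothetical:

```python
import requests

response = requests.get("https://api.example.com/export")  # hypothetical endpoint

data = response.content  # read without a Content-Length check
obj = response.json()    # parses an unbounded external response
text = response.text     # decodes an unbounded response
```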
Auto-Fix
Unfault generates patches that add streaming and size limits:
```python
# With aiohttp
async with session.get(url) as response:
    if response.content_length and response.content_length > MAX_SIZE:
        raise ValueError("Response too large")
    data = await response.read()  # Size already validated via Content-Length
```
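`response.content_length` is `None` for chunked transfers, so the check above cannot guard those responses. A minimal sketch of a streamed, size-bounded read, assuming the same `session` and `MAX_SIZE` as above:

```python
# Bound the body even when no Content-Length header is sent.
async with session.get(url) as response:
    chunks = []
    size = 0
    async for chunk in response.content.iter_chunked(8192):
        size += len(chunk)
        if size > MAX_SIZE:
            raise ValueError("Response exceeded maximum size")
        chunks.append(chunk)
    data = b''.join(chunks)
```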
Configuration
`httpx.Limits` governs connection pooling rather than response size, so bound the body by streaming, just as with `requests`:

```python
import httpx

MAX_SIZE = 10 * 1024 * 1024  # 10 MB limit

with httpx.Client() as client:
    # client.stream() defers reading the body until it is iterated in chunks
    with client.stream("GET", url) as response:
        chunks = []
        size = 0
        for chunk in response.iter_bytes(chunk_size=8192):
            size += len(chunk)
            if size > MAX_SIZE:
                raise ValueError("Response exceeded maximum size")
            chunks.append(chunk)
        data = b''.join(chunks)
```
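For async services the same pattern works with `httpx.AsyncClient`. A sketch; the helper name `fetch_bounded` is hypothetical:

```python
import httpx

async def fetch_bounded(url: str, max_size: int) -> bytes:
    # Hypothetical helper: stream the body and enforce max_size.
    async with httpx.AsyncClient() as client:
        async with client.stream("GET", url) as response:
            chunks = []
            size = 0
            async for chunk in response.aiter_bytes(chunk_size=8192):
                size += len(chunk)
                if size > max_size:
                    raise ValueError("Response exceeded maximum size")
                chunks.append(chunk)
            return b''.join(chunks)
```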