# python.large_response_memory

Stability: Medium

Detects HTTP response body reads that load entire responses into memory without size limits, which can cause out-of-memory errors.

Unbounded response reads can:

  • Crash your service — A large response can exhaust available memory
  • Enable DoS attacks — Malicious endpoints can send huge responses
  • Cause cascading failures — OOM kills can affect other services on the same node
  • Degrade performance — Large memory allocations slow down garbage collection
```python
# ❌ Before (unbounded memory)
response = requests.get(url)
data = response.content  # Loads entire response into memory

# Also problematic
data = response.json()  # Parses entire response
```

```python
# ✅ After (streaming with limits)
MAX_SIZE = 10 * 1024 * 1024  # 10MB limit

response = requests.get(url, stream=True)
content_length = int(response.headers.get('content-length', 0))
if content_length > MAX_SIZE:
    raise ValueError(f"Response too large: {content_length}")

# Stream in chunks, enforcing the limit as bytes arrive
chunks = []
size = 0
for chunk in response.iter_content(chunk_size=8192):
    size += len(chunk)
    if size > MAX_SIZE:
        raise ValueError("Response exceeded maximum size")
    chunks.append(chunk)
data = b''.join(chunks)
```
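
Where this guard recurs across call sites, it can live in one helper. A minimal sketch, with `bounded_get` as a hypothetical name:

```python
import requests

def bounded_get(url: str, max_size: int = 10 * 1024 * 1024) -> bytes:
    """Fetch url, raising ValueError if the body exceeds max_size bytes."""
    response = requests.get(url, stream=True)
    response.raise_for_status()
    # Reject early when the server declares an oversized body
    declared = int(response.headers.get('content-length', 0))
    if declared > max_size:
        raise ValueError(f"Response too large: {declared}")
    # Enforce the cap while streaming, since Content-Length may be absent
    body = bytearray()
    for chunk in response.iter_content(chunk_size=8192):
        body.extend(chunk)
        if len(body) > max_size:
            raise ValueError("Response exceeded maximum size")
    return bytes(body)
```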
This rule flags:

  • response.content without a Content-Length check
  • response.json() on responses from external services (see the sketch after this list)
  • response.text on unbounded responses
  • aiohttp response reads without size limits
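
For the `response.json()` case, parse only after the body has been read under the cap. A sketch reusing the hypothetical `bounded_get` helper above (the URL is illustrative):

```python
import json

# Parse only once the size-checked body is fully in hand
data = json.loads(bounded_get("https://api.example.com/items"))
```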

Unfault generates patches that add streaming and size limits:

```python
# With aiohttp: reject a declared oversize, then stream with a hard cap
async with session.get(url) as response:
    if response.content_length and response.content_length > MAX_SIZE:
        raise ValueError("Response too large")
    body = bytearray()
    async for chunk in response.content.iter_chunked(8192):
        body.extend(chunk)
        if len(body) > MAX_SIZE:
            raise ValueError("Response exceeded maximum size")
    data = bytes(body)
```

Content-Length is absent on chunked responses, so the loop enforces the cap even when the header check cannot.
```python
# httpx: no built-in body-size cap either; stream and enforce it manually
# (httpx.Limits configures connection pooling, not content length)
import httpx

with httpx.Client() as client:
    with client.stream("GET", url) as response:
        body = bytearray()
        for chunk in response.iter_bytes(chunk_size=8192):
            body.extend(chunk)
            if len(body) > MAX_SIZE:
                raise ValueError("Response exceeded maximum size")
data = bytes(body)
```
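
For async callers, the same guard ports directly to `httpx.AsyncClient`. A sketch, with `fetch_bounded` as a hypothetical name:

```python
import httpx

async def fetch_bounded(url: str, max_size: int = 10 * 1024 * 1024) -> bytes:
    """Fetch url asynchronously, raising if the body exceeds max_size bytes."""
    async with httpx.AsyncClient() as client:
        async with client.stream("GET", url) as response:
            # Enforce the byte cap chunk by chunk, as in the sync examples
            body = bytearray()
            async for chunk in response.aiter_bytes(chunk_size=8192):
                body.extend(chunk)
                if len(body) > max_size:
                    raise ValueError("Response exceeded maximum size")
    return bytes(body)
```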