python.unbounded_memory
Stability: High
Detects operations that can consume unbounded memory, leading to out-of-memory errors and service crashes.
Why It Matters
Unbounded memory operations can:
- Crash your service — OOM killer terminates the process
- Degrade performance — Memory pressure causes GC thrashing
- Affect neighbors — Other services on the same node suffer
- Enable DoS attacks — Malicious input triggers memory exhaustion
Example
```python
# ❌ Before (unbounded memory)
def process_file(file_path):
    with open(file_path) as f:
        lines = f.readlines()  # Loads entire file into memory
        return process_lines(lines)

def fetch_all_users():
    return list(User.query.all())  # Loads all users into memory
```

```python
# ✅ After (bounded/streaming)
def process_file(file_path):
    with open(file_path) as f:
        for line in f:  # Streams line by line
            yield process_line(line)

def fetch_users_batch(batch_size=1000):
    offset = 0
    while True:
        batch = User.query.offset(offset).limit(batch_size).all()
        if not batch:
            break
        yield from batch
        offset += batch_size
```
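Note that the streaming `process_file` is now a generator, so callers iterate over results instead of receiving a list. A minimal usage sketch (`handle` is a hypothetical consumer):

```python
# The generator yields one processed line at a time;
# nothing beyond the current line is held in memory.
for result in process_file("/var/log/app.log"):
    handle(result)  # hypothetical per-line consumer
```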
What Unfault Detects
- `file.readlines()` without size limits
- `file.read()` without a size parameter
- ORM queries that load all records (`query.all()`, `list(query)`)
- Unbounded list comprehensions with external data
- Growing collections without bounds (see the sketch below)
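The last two patterns are easy to miss in review. A hypothetical sketch of both; `fetch_rows`, `event_stream`, and `normalize` are illustrative names, not a real API:

```python
# Unbounded list comprehension over external data:
# materializes every row before any work happens.
rows = [normalize(row) for row in fetch_rows()]  # flagged

# Growing collection without bounds: the set grows for
# the lifetime of the process and is never capped.
seen_ids = set()
for event in event_stream():
    seen_ids.add(event.id)  # flagged
```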
Auto-Fix
Unfault generates patches that add streaming and pagination:

```python
# Streaming file reads
def read_large_file(path, chunk_size=8192):
    with open(path, 'rb') as f:
        while chunk := f.read(chunk_size):
            yield chunk

# Paginated database queries
def iter_all_items(session, model, batch_size=1000):
    offset = 0
    while True:
        batch = session.query(model).limit(batch_size).offset(offset).all()
        if not batch:
            return
        yield from batch
        offset += batch_size
```
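As a usage sketch, the paginated helper replaces a bulk `query.all()` call directly. Here `session` and `User` are assumed to be a SQLAlchemy session and model, and `send_newsletter` is hypothetical:

```python
# Iterates every user in constant memory: at most one
# batch of 500 rows is resident at a time.
for user in iter_all_items(session, User, batch_size=500):
    send_newsletter(user)  # hypothetical per-user work
```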
Memory-Safe Patterns

```python
# Bounded collections
from collections import deque

# Fixed-size buffer
buffer = deque(maxlen=1000)

# Memory-mapped files for large data
import mmap

with open('large_file.bin', 'rb') as f:
    mm = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
    # Process without loading entire file
```
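As a usage sketch of the bounded-buffer pattern, a `maxlen` deque makes a natural rolling window: once the cap is reached, each append evicts the oldest entry, so memory stays constant however long the process runs:

```python
from collections import deque

# Rolling window of the most recent 1000 latency samples.
recent_latencies = deque(maxlen=1000)

def record(sample_ms: float) -> float:
    """Record a sample and return the rolling average."""
    recent_latencies.append(sample_ms)  # oldest sample evicted at capacity
    return sum(recent_latencies) / len(recent_latencies)
```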