This document summarizes a lecture on stream processing frameworks. It introduces Storm and Spark Streaming as two popular frameworks. Storm processes tuples in real time using spouts (stream sources) and bolts (processing operators), but it can lose operator state during failures. Spark Streaming instead discretizes the stream into micro-batches; because each batch is processed deterministically as an RDD, lost partitions can be recomputed from their lineage, giving fault tolerance. This model also offers higher throughput and exactly-once processing semantics compared to Storm, at the cost of the added latency of the batch interval.
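The micro-batch idea above can be illustrated with a small sketch (plain Python, not the actual Spark Streaming API; the function names and batch format are hypothetical). Incoming timestamped events are grouped into fixed-interval batches, and each batch is processed by a deterministic function, so a lost result can be reproduced exactly by rerunning the function on the retained input batch:

```python
def discretize(events, batch_interval, process):
    """Group (timestamp, value) events into fixed-interval micro-batches
    and apply a deterministic function to each batch.

    Illustrative sketch only; Spark Streaming performs this over
    distributed RDDs rather than in-memory lists.
    """
    batches = {}
    for ts, value in events:
        # Assign each event to the batch covering its timestamp.
        batches.setdefault(ts // batch_interval, []).append(value)
    # Deterministic processing: recomputing `process` on the same
    # input batch always reproduces the same result, which is what
    # makes lineage-based recovery possible.
    return {b: process(vals) for b, vals in sorted(batches.items())}

events = [(0, 2), (1, 3), (2, 5), (3, 7)]
results = discretize(events, batch_interval=2, process=sum)
# batch 0 covers t=0..1 -> 2+3 = 5; batch 1 covers t=2..3 -> 5+7 = 12
```

If the result for batch 1 were lost, rerunning `process` over its retained input `[5, 7]` would regenerate the identical value, mirroring how Spark recomputes lost RDD partitions.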