Monitoring Spark Applications
Tzach Zohar @ Kenshoo, March 2016

This document discusses monitoring Spark applications. It covers:

- using the Spark UI to monitor jobs, stages, and tasks;
- using the Spark REST API to access monitoring data programmatically;
- configuring Spark metric sinks, such as Graphite, to export Spark's internal metrics;
- creating applicative metrics to monitor your own application's behavior.

The key points: monitoring matters for catching failures, performance problems, and correctness issues, and for understanding your data; Spark provides useful built-in tools, but applicative metrics are also valuable; and Graphite is well suited to analyzing metric trends over time.
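Exporting Spark's internal metrics to Graphite is configured through a metrics.properties file (pointed to by spark.metrics.conf). A minimal sketch; the host and prefix values are placeholders you would replace with your own:

```properties
# Send metrics from all Spark instances (driver, executors, ...) to Graphite.
*.sink.graphite.class=org.apache.spark.metrics.sink.GraphiteSink
*.sink.graphite.host=graphite.example.com
*.sink.graphite.port=2003
*.sink.graphite.period=10
*.sink.graphite.unit=seconds
# Optional namespace prefix for all reported metrics:
*.sink.graphite.prefix=myapp
```

With this in place, Spark's executor, JVM, and scheduler metrics appear in Graphite, where their trends over time can be charted and compared across runs.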
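The Spark REST API mentioned above serves monitoring data as JSON under /api/v1 on the driver's UI port (4040 by default), e.g. `curl http://localhost:4040/api/v1/applications`. A minimal sketch of consuming it; the sample response below is illustrative, not taken from the talk:

```python
import json

# A hypothetical sample of the JSON returned by
# http://localhost:4040/api/v1/applications (illustrative values only).
sample_response = json.dumps([
    {"id": "app-20160301120000-0001", "name": "my-spark-job",
     "attempts": [{"completed": False}]},
    {"id": "app-20160229110000-0000", "name": "older-job",
     "attempts": [{"completed": True}]},
])

def running_app_ids(response_text):
    """Return ids of applications whose latest attempt has not completed."""
    apps = json.loads(response_text)
    return [app["id"] for app in apps if not app["attempts"][-1]["completed"]]

print(running_app_ids(sample_response))  # the still-running application(s)
```

The same API exposes per-job, per-stage, and per-executor endpoints (e.g. /api/v1/applications/[app-id]/stages), so the same pattern extends to stage-level monitoring.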
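Applicative metrics can be pushed to the same Graphite instance the Spark sinks write to. One simple option is Graphite's plaintext protocol, which accepts "name value timestamp" lines on port 2003. A minimal sketch; the host and metric name below are hypothetical:

```python
import socket
import time

def format_graphite_line(name, value, timestamp=None):
    """Format one metric in Graphite's plaintext protocol: 'name value timestamp'."""
    if timestamp is None:
        timestamp = int(time.time())
    return "%s %s %d\n" % (name, value, timestamp)

def send_metric(host, port, name, value):
    """Send a single applicative metric (e.g. records processed by our job)."""
    line = format_graphite_line(name, value)
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall(line.encode("ascii"))

# Usage (hypothetical host and metric name):
# send_metric("graphite.example.com", 2003, "myapp.records_processed", 12345)
print(format_graphite_line("myapp.records_processed", 12345, 1456790400))
```

In practice a metrics library (e.g. Dropwizard Metrics with its Graphite reporter) handles batching and scheduling for you, but the wire format is as simple as shown above.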