This document discusses how data refineries use machine learning and artificial intelligence to create jobs. It gives examples of data refineries analyzing large datasets with algorithms and tools such as TensorFlow, work that requires roles in data engineering, data analysis, and data wrangling to gather, clean, transform, and interpret the data. Descartes Labs is highlighted as one such refinery: it applies tools such as Kubernetes to satellite imagery, advancing scientific understanding and opening new fields of analysis in agriculture and volcanology.
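The gather, clean, transform, and interpret workflow mentioned above can be sketched in miniature. This is a hypothetical illustration only: the record structure and the NDVI-style vegetation readings are invented for the example and are not taken from Descartes Labs' actual pipeline or any real dataset.

```python
from collections import defaultdict

# Gather: raw records as they might arrive from an ingest step.
# Values are strings and may be missing, as raw feeds often are.
raw_records = [
    {"field_id": "A1", "ndvi": "0.61"},
    {"field_id": "A1", "ndvi": ""},      # missing reading
    {"field_id": "B2", "ndvi": "0.47"},
]

# Clean: drop records with missing values and parse strings to floats.
clean = [
    {"field_id": r["field_id"], "ndvi": float(r["ndvi"])}
    for r in raw_records
    if r["ndvi"]
]

# Transform: aggregate per field into a derived feature that an
# analyst could then interpret.
readings = defaultdict(list)
for r in clean:
    readings[r["field_id"]].append(r["ndvi"])
averages = {k: sum(v) / len(v) for k, v in readings.items()}
```

In a real refinery each stage would be a distributed job rather than a list comprehension, but the shape of the work the roles above perform is the same.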