The document introduces Conditional Neural Processes (CNPs) and Neural Processes (NPs), models that learn distributions over functions rather than single functions, combining the flexibility of neural networks with the uncertainty-aware predictions of Gaussian Processes. A CNP directly parameterizes the predictive distribution at target inputs conditioned on a set of context observations, while an NP additionally introduces a global latent variable, which lets it draw coherent function samples and yields better-calibrated uncertainty estimates. The paper covers the architecture, computational complexity, and training procedure of both models, and presents experiments demonstrating their efficacy on tasks such as few-shot regression and image completion.
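To make the conditioning mechanism concrete, here is a minimal sketch of a CNP forward pass in PyTorch. The module names, layer sizes, and the softplus variance parameterization are illustrative assumptions, not the paper's exact implementation; the essential ingredients it does reflect are the per-pair encoder, the order-invariant mean aggregation, and a decoder that outputs a predictive mean and standard deviation. An NP would differ by replacing the deterministic representation with a sample from a global latent variable.

```python
import torch
import torch.nn as nn

class CNP(nn.Module):
    """Minimal CNP sketch (illustrative, not the paper's exact architecture)."""
    def __init__(self, x_dim=1, y_dim=1, r_dim=128):
        super().__init__()
        # Encoder h: maps each context pair (x_i, y_i) to a representation r_i.
        self.encoder = nn.Sequential(
            nn.Linear(x_dim + y_dim, r_dim), nn.ReLU(),
            nn.Linear(r_dim, r_dim),
        )
        # Decoder g: maps (aggregated representation, target x) to mean and std.
        self.decoder = nn.Sequential(
            nn.Linear(r_dim + x_dim, r_dim), nn.ReLU(),
            nn.Linear(r_dim, 2 * y_dim),
        )

    def forward(self, x_ctx, y_ctx, x_tgt):
        # Encode each context pair, then aggregate by the mean so the
        # representation is invariant to the ordering of observations.
        r_i = self.encoder(torch.cat([x_ctx, y_ctx], dim=-1))  # (B, N_ctx, r_dim)
        r = r_i.mean(dim=1, keepdim=True)                      # (B, 1, r_dim)
        r = r.expand(-1, x_tgt.size(1), -1)                    # broadcast to targets
        out = self.decoder(torch.cat([r, x_tgt], dim=-1))
        mu, log_sigma = out.chunk(2, dim=-1)
        # Bounded-below std for numerical stability (an assumed choice).
        sigma = 0.1 + 0.9 * nn.functional.softplus(log_sigma)
        return mu, sigma

# Training minimizes the negative log-likelihood of the targets
# under the predicted Gaussian (hypothetical random data shown).
model = CNP()
x_ctx, y_ctx = torch.randn(16, 10, 1), torch.randn(16, 10, 1)
x_tgt, y_tgt = torch.randn(16, 20, 1), torch.randn(16, 20, 1)
mu, sigma = model(x_ctx, y_ctx, x_tgt)
loss = -torch.distributions.Normal(mu, sigma).log_prob(y_tgt).mean()
```

Because each context point is encoded independently and aggregation is a mean, a prediction over all targets costs O(n + m) encoder/decoder evaluations for n context and m target points, in contrast to the O((n + m)^3) scaling of exact Gaussian Process inference.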