1. Yuta Kashino presented Edward, a probabilistic programming library built on TensorFlow. Edward lets users define probabilistic models and perform Bayesian inference with techniques such as MCMC and variational inference (a minimal example follows this list).
2. Dropout was discussed as a way to approximate Bayesian inference in neural networks and to estimate predictive uncertainty in deep learning; as a side benefit, dropout also regularizes the network and helps prevent overfitting (see the Monte Carlo dropout sketch below).
3. Edward examples using TensorFlow were shown for probabilistic models such as Bayesian linear regression and hidden Markov models; models can be defined concisely and inference run in a few lines of code (see the regression sketch after this list).
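As a rough illustration of the Edward workflow from item 1, the sketch below defines a normal model with an unknown mean and infers it with Hamiltonian Monte Carlo. It follows the Edward 1.x API (`edward.models`, `ed.HMC`); the toy data and hyperparameters are my own assumptions, not code from the talk.

```python
import edward as ed
import numpy as np
import tensorflow as tf
from edward.models import Empirical, Normal

# Illustrative toy data: 50 draws from a normal with unknown mean (here 3.0).
x_data = np.random.normal(3.0, 1.0, size=50).astype(np.float32)

# Model: mu ~ Normal(0, 10); x_i ~ Normal(mu, 1)
mu = Normal(loc=0.0, scale=10.0)
x = Normal(loc=tf.ones(50) * mu, scale=1.0)

# MCMC: approximate p(mu | x) with an empirical distribution of HMC samples.
qmu = Empirical(params=tf.Variable(tf.zeros(1000)))
inference = ed.HMC({mu: qmu}, data={x: x_data})
inference.run(step_size=0.1, n_steps=5)

# Posterior mean of mu after sampling.
print(ed.get_session().run(qmu.mean()))
```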
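For item 2, the uncertainty idea (commonly associated with Gal and Ghahramani's "dropout as a Bayesian approximation") is usually applied as Monte Carlo dropout: keep dropout active at prediction time and average many stochastic forward passes. Below is a minimal sketch using tf.keras; the architecture, dropout rate, and dummy data are assumptions for illustration, and the model is untrained, so only the sampling pattern matters.

```python
import numpy as np
import tensorflow as tf

# A small feed-forward network with dropout between layers (illustrative only).
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(10,)),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(1),
])

x = np.random.randn(5, 10).astype("float32")  # dummy inputs

# Monte Carlo dropout: training=True keeps dropout on at prediction time,
# so each forward pass samples a different thinned subnetwork.
samples = np.stack([model(x, training=True).numpy() for _ in range(100)])
pred_mean = samples.mean(axis=0)  # predictive mean
pred_std = samples.std(axis=0)    # spread across passes ~ model uncertainty
```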
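For item 3, a Bayesian linear regression in Edward looks roughly like the sketch below (it follows the pattern of Edward's supervised regression tutorial): priors on the weights and bias, a variational family over them, and variational inference via `ed.KLqp`. The toy data is an assumption for illustration; a hidden Markov model would follow the same define-then-infer pattern but is omitted here for brevity.

```python
import edward as ed
import numpy as np
import tensorflow as tf
from edward.models import Normal

N, D = 50, 3  # number of data points, number of features

# Illustrative toy data from a known linear model with Gaussian noise.
w_true = np.random.randn(D).astype(np.float32)
X_train = np.random.randn(N, D).astype(np.float32)
y_train = (X_train.dot(w_true) + 0.1 * np.random.randn(N)).astype(np.float32)

# Model: y ~ Normal(Xw + b, 1) with standard normal priors on w and b.
X = tf.placeholder(tf.float32, [N, D])
w = Normal(loc=tf.zeros(D), scale=tf.ones(D))
b = Normal(loc=tf.zeros(1), scale=tf.ones(1))
y = Normal(loc=ed.dot(X, w) + b, scale=tf.ones(N))

# Variational family: fully factorized Gaussians over w and b.
qw = Normal(loc=tf.Variable(tf.random_normal([D])),
            scale=tf.nn.softplus(tf.Variable(tf.random_normal([D]))))
qb = Normal(loc=tf.Variable(tf.random_normal([1])),
            scale=tf.nn.softplus(tf.Variable(tf.random_normal([1]))))

# Variational inference: minimize the KL divergence given the observed data.
inference = ed.KLqp({w: qw, b: qb}, data={X: X_train, y: y_train})
inference.run(n_samples=5, n_iter=250)
```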