By Jurgen Van Gael - http://jvangael.github.io/ - @jvangael
As data scientists and decision makers, we are surrounded by uncertainty: data is noisy, missing, wrong, or inherently uncertain. Statistics offers a wide set of theories and tools for dealing with this uncertainty, yet most people are unaware of a unifying theory of it. In this talk I want to introduce the audience to a branch of statistics called Bayesian reasoning, which is a unifying, consistent, logical, and, most importantly, successful way of dealing with uncertainty.
Over the past two centuries there have been many proposals for dealing with uncertainty (e.g. frequentist probability, fuzzy logic, ...). Under the influence of early 20th-century statisticians, the Bayesian formalism was somewhat pushed into the background of the statistical scene. More recently, though, partly to the credit of computer science, Bayesian thinking has seen a revival. So what, and how much, should a data scientist or decision maker know about Bayesian thinking?
My talk will consist of four parts. In the first part, I will explain the central dogma of Bayesian thinking: Bayes' rule. This simple equation (four variables, one multiplication, and one division!) describes how we should update our beliefs about the world in light of new data. I will discuss evidence from neuroscience and psychology that the brain uses Bayesian mechanisms to reason about the world. Unfortunately, sometimes the brain fails miserably at taking all the variables of Bayes' rule into account.
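To make the "four variables, one multiplication, one division" claim concrete, here is a minimal sketch of Bayes' rule on a hypothetical example (the disease prevalence and test accuracy figures are illustrative, not from the talk):

```python
# Bayes' rule: P(H|D) = P(D|H) * P(H) / P(D)
# Hypothetical setup: a disease affects 1% of people; a test is
# 95% sensitive and 90% specific. What is P(disease | positive test)?

p_h = 0.01              # prior: P(disease)
p_d_given_h = 0.95      # likelihood: P(positive | disease)
p_d_given_not_h = 0.10  # P(positive | no disease) = 1 - specificity

# evidence: P(positive), summed over both hypotheses
p_d = p_d_given_h * p_h + p_d_given_not_h * (1 - p_h)

posterior = p_d_given_h * p_h / p_d  # one multiplication, one division
print(round(posterior, 3))
```

Even with a positive test, the posterior is under 9% — a classic case where intuition ignores the prior, which is exactly the failure mode mentioned above.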
This leads to the second part of the talk, where I will illustrate Bayes' rule as a tool for decision makers to reason about uncertainty.
In the third part of the talk I will give an example of how we can build machine learning systems around Bayes' rule. The key idea here is that Bayes' rule allows us to keep track of uncertainty about the world. In this part I will show a Bayesian machine learning system in action.
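The "keeping track of uncertainty" idea can be sketched with a Beta-Bernoulli model, a standard Bayesian building block (the click-through scenario and the data here are hypothetical, not the system from the talk). Because the Beta prior is conjugate to the Bernoulli likelihood, applying Bayes' rule after each observation reduces to counting:

```python
# Hypothetical click-through-rate estimator: a Beta(alpha, beta)
# distribution over the unknown click rate, updated per observation.

alpha, beta = 1.0, 1.0  # Beta(1, 1) = uniform prior over the click rate

observations = [1, 0, 0, 1, 1, 0, 1, 1]  # 1 = click, 0 = no click
for click in observations:
    if click:
        alpha += 1  # one more observed success
    else:
        beta += 1   # one more observed failure

# The full posterior Beta(alpha, beta) is retained, not just a point estimate.
posterior_mean = alpha / (alpha + beta)
posterior_var = (alpha * beta) / ((alpha + beta) ** 2 * (alpha + beta + 1))
print(posterior_mean, posterior_var)
```

The system never collapses its belief to a single number: the variance quantifies how unsure it still is, and it shrinks as data accumulates.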
In the final part of the talk I will introduce the concept of “Probabilistic Programming”. Probabilistic programming is a nascent programming paradigm that introduces “uncertain variables” as first-class citizens of a programming language and then uses Bayes' rule to execute the programs.
Looking at machine learning conferences over the last few years, the Bayesian framework has been prominent. In this talk I want to help the audience understand how the Bayesian framework can help them in their data mining and decision making processes. If people leave the talk thinking Bayes' rule is the E=mc^2 of data science, I will consider the presentation a success.