March 02, 2016
by Francesco Gadaleta
Produced by: worldofpiggy.com
Markov Chain Monte Carlo with full conditional calculations
At some point, most statistical problems require sampling, that is, generating observations from a specific distribution. Prior knowledge and likelihood are the essential components with which Bayesian theory models a number of real-world problems. Bayesian statistics therefore rests on the fundamental task of sampling from a distribution, however complicated it may be, and then computing summary statistics such as the mean and variance to describe the observations that represent the assumptions (the model).
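The "full conditional calculations" in the title refer to Gibbs sampling, where each variable is drawn in turn from its distribution conditioned on all the others. As a minimal sketch (my own toy example, not necessarily the one discussed in the episode), here is a Gibbs sampler for a bivariate standard normal with correlation rho, where each full conditional happens to be a univariate normal:

```python
import numpy as np

rng = np.random.default_rng(0)

# Bivariate standard normal with correlation rho: each full conditional
# is univariate normal, x | y ~ N(rho * y, 1 - rho^2), and symmetrically for y.
rho = 0.8
cond_sd = np.sqrt(1.0 - rho ** 2)

n_samples = 10_000
x, y = 0.0, 0.0
samples = np.empty((n_samples, 2))
for i in range(n_samples):
    x = rng.normal(rho * y, cond_sd)  # draw x from its full conditional
    y = rng.normal(rho * x, cond_sd)  # draw y from its full conditional
    samples[i] = x, y

kept = samples[1000:]                 # discard burn-in
mean, var = kept.mean(axis=0), kept.var(axis=0)
```

After burn-in, the sample means should be near 0, the variances near 1, and the empirical correlation near rho, which is exactly the "summary statistics describing the observations" step mentioned above.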
In this episode, we learn how to do this.
In addition, I explain how Hamiltonian Monte Carlo sampling works and why we should all use it whenever we can. Following the shownotes makes it easier to understand some of the mathematical formulas, which are better read than listened to.
Enjoy the show!