Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
Get early access to Alex's next live-cohort courses!
Intro to Bayes Course (first 2 lessons free)
Advanced Regression Course (first 2 lessons free)
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!
Visit our Patreon page to unlock exclusive Bayesian swag ;)
Takeaways:
BART as a core tool: Gabriel explains how Bayesian Additive Regression Trees provide robust uncertainty quantification and serve as a reliable baseline model in many domains.
Rust for performance: His Rust re-implementation of BART dramatically improves speed and scalability, making it feasible for larger datasets and real-world IoT applications.
Strengths and trade-offs: BART avoids overfitting and handles missing data gracefully, though it is slower than other tree-based approaches.
Big data meets Bayes: Gabriel shares strategies for applying Bayesian methods with big data, including when variational inference helps balance scale with rigor.
Optimization and decision-making: He highlights how BART models can be embedded into optimization frameworks, opening doors for sequential decision-making.
Open source matters: Gabriel emphasizes the importance of communities like PyMC and Bambi, encouraging newcomers to start with small contributions.
Chapters:
05:10 – From economics to IoT and Bayesian statistics
18:55 – Introduction to BART (Bayesian Additive Regression Trees)
24:40 – Re-implementing BART in Rust for speed and scalability
32:05 – Comparing BART with Gaussian Processes and other tree methods
39:50 – Strengths and limitations of BART
47:15 – Handling missing data and different likelihoods
54:30 – Variational inference and big data challenges
01:01:10 – Embedding BART into optimization and decision-making frameworks
01:08:45 – Open source, PyMC, and community support
01:15:20 – Advice for newcomers
01:20:55 – Future of BART, Rust, and probabilistic programming
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian...
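For listeners who want to try the kind of model discussed in this episode, below is a minimal sketch of a BART regression using the pymc-bart package (pip install pymc-bart). The data, number of trees, and priors are purely illustrative and not Gabriel's actual setup; treat it as a starting point rather than a reference implementation.

```python
# Minimal sketch: a BART regression in PyMC with the pymc-bart package.
# Data and hyperparameters here are illustrative only.
import numpy as np
import pymc as pm
import pymc_bart as pmb

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                      # 200 rows, 5 covariates
y = X[:, 0] ** 2 + np.sin(X[:, 1]) + rng.normal(scale=0.3, size=200)

with pm.Model() as model:
    # Sum-of-trees prior over the latent mean function (m = number of trees)
    mu = pmb.BART("mu", X=X, Y=y, m=50)
    sigma = pm.HalfNormal("sigma", 1.0)
    pm.Normal("y_obs", mu=mu, sigma=sigma, observed=y)
    idata = pm.sample()                            # posterior over trees and sigma
```

Sampling returns a full posterior over the sum-of-trees function, so posterior predictive draws give exactly the kind of uncertainty bands the takeaways describe as BART's main selling point.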
On this episode we have Raheem Jarbo, better known as Mega Ran. We talked about his life as a digital content creator, his experience as a foster parent, the process of adopting his son, the nerves around adopting his son, his love of wrestling and video games, creating music for kids, writing ska songs, and touring. Make sure to check out his new album Buddy's Magic Treehouse wherever you listen to music! Send us a text. Support the show. Follow us on Facebook and Instagram @theimperfectdadspodcast. Look for new episodes of The Imperfect Dads Podcast every Monday.
Get early access to Alex's next live-cohort courses! Today's clip is from episode 141 of the podcast, with Sam Witty. Alex and Sam discuss the ChiRho project, delving into the intricacies of causal inference, particularly focusing on Do-Calculus, regression discontinuity designs, and Bayesian structural causal inference. They explain ChiRho's design philosophy, emphasizing its modular and extensible nature, and highlight the importance of efficient estimation in causal inference, making complex statistical methods accessible to users without extensive expertise. Get the full discussion here. Intro to Bayes Course (first 2 lessons free). Advanced Regression Course (first 2 lessons free). Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work! Visit our Patreon page to unlock exclusive Bayesian swag ;) Transcript: This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!Get early access to Alex's next live-cohort courses!Enroll in the Causal AI workshop, to learn live with Alex (15% off if you're a Patron of the show)Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!Visit our Patreon page to unlock exclusive Bayesian swag ;)Takeaways:Causal inference is crucial for understanding the impact of interventions in various fields.ChiRho is a causal probabilistic programming language that bridges mechanistic and data-driven models.ChiRho allows for easy manipulation of causal models and counterfactual reasoning.The design of ChiRho emphasizes modularity and extensibility for diverse applications.Causal inference requires careful consideration of assumptions and model structures.Real-world applications of causal inference can lead to significant insights in science and engineering.Collaboration and communication are key in translating causal questions into actionable models.The future of causal inference lies in integrating probabilistic programming with scientific discovery.Chapters:05:53 Bridging Mechanistic and Data-Driven Models09:13 Understanding Causal Probabilistic Programming12:10 ChiRho and Its Design Principles15:03 ChiRho's Functionality and Use Cases17:55 Counterfactual Worlds and Mediation Analysis20:47 Efficient Estimation in ChiRho24:08 Future Directions for Causal AI50:21 Understanding the Do-Operator in Causal Inference56:45 ChiRho's Role in Causal Inference and Bayesian Modeling01:01:36 Roadmap and Future Developments for ChiRho01:05:29 Real-World Applications of Causal Probabilistic Programming01:10:51 Challenges in Causal Inference Adoption01:11:50 The Importance of Causal Claims in Research01:18:11 Bayesian Approaches to Causal Inference01:22:08 Combining Gaussian Processes with Causal Inference01:28:27 Future Directions in Probabilistic Programming and Causal InferenceThank you to my Patrons for making this episode possible!Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad...
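As a plain-NumPy illustration of the do-operator discussed above (this is not ChiRho's API, just a hand-rolled structural causal model with made-up coefficients), the sketch below contrasts conditioning on X with intervening on X in a tiny confounded system.

```python
# Conceptual sketch of the do-operator: a tiny structural causal model
# Z -> X -> Y with confounding Z -> Y, simulated in NumPy.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

def simulate(do_x=None):
    z = rng.normal(size=n)                        # confounder
    if do_x is None:
        x = 0.8 * z + rng.normal(size=n)          # observational mechanism
    else:
        x = np.full(n, do_x)                      # intervention: cut the Z -> X edge
    y = 1.5 * x + 2.0 * z + rng.normal(size=n)
    return x, y

# Observational: E[Y | X ≈ 1] picks up the confounder's effect
x_obs, y_obs = simulate()
cond = y_obs[np.abs(x_obs - 1.0) < 0.05].mean()

# Interventional: E[Y | do(X = 1)] reflects only the causal path X -> Y
_, y_do = simulate(do_x=1.0)
interv = y_do.mean()

print(f"E[Y | X=1]     ≈ {cond:.2f}")    # ≈ 1.5 + 2.0 * E[Z | X=1], biased upward
print(f"E[Y | do(X=1)] ≈ {interv:.2f}")  # ≈ 1.5
```

The gap between the two estimates is exactly the confounding bias that the do-operator removes; a causal PPL like ChiRho automates this kind of model surgery instead of requiring a hand-written intervention branch.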
Get early access to Alex's next live-cohort courses!Today's clip is from episode 140 of the podcast, with Ron Yurko.Alex and Ron discuss the challenges of model deployment, and the complexities of modeling player contributions in team sports like soccer and football.They emphasize the importance of understanding replacement levels, the Going Deep framework in football analytics, and the need for proper modeling of expected points. Additionally, they share insights on teaching Bayesian modeling to students and the difficulties they face in grasping the concepts of model writing and application.Get the full discussion here.Intro to Bayes Course (first 2 lessons free)Advanced Regression Course (first 2 lessons free)Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!Visit our Patreon page to unlock exclusive Bayesian swag ;)TranscriptThis is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!Intro to Bayes Course (first 2 lessons free)Advanced Regression Course (first 2 lessons free)Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!Visit our Patreon page to unlock exclusive Bayesian swag ;)Takeaways:Teaching students to write out their own models is crucial.Developing a sports analytics portfolio is essential for aspiring analysts.Modeling expectations in sports analytics can be misleading.Tracking data can significantly improve player performance models.Ron encourages students to engage in active learning through projects.The importance of understanding the dependency structure in data is vital.Ron aims to integrate more diverse sports analytics topics into his teaching.Chapters:03:51 The Journey into Sports Analytics15:20 The Evolution of Bayesian Statistics in Sports26:01 Innovations in NFL WAR Modeling39:23 Causal Modeling in Sports Analytics46:29 Defining Replacement Levels in Sports48:26 The Going Deep Framework and Big Data in Football52:47 Modeling Expectations in Football Data55:40 Teaching Statistical Concepts in Sports Analytics01:01:54 The Importance of Model Building in Education01:04:46 Statistical Thinking in Sports Analytics01:10:55 Innovative Research in Player Movement01:15:47 Exploring Data Needs in American Football01:18:43 Building a Sports Analytics PortfolioThank you to my Patrons for making this episode possible!Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M,...
Today's clip is from episode 139 of the podcast, with Max Balandat. Alex and Max discuss the integration of BoTorch with PyTorch, exploring its applications in Bayesian optimization and Gaussian processes. They highlight the advantages of using GPyTorch for structured matrices and the flexibility it offers for research. The discussion also covers the motivations behind building BoTorch, the importance of open-source culture at Meta, and the role of PyTorch in modern machine learning. Get the full discussion here. Attend Alex's tutorial at PyData Berlin: A Beginner's Guide to State Space Modeling. Intro to Bayes Course (first 2 lessons free). Advanced Regression Course (first 2 lessons free). Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work! Visit our Patreon page to unlock exclusive Bayesian swag ;) Transcript: This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!Intro to Bayes Course (first 2 lessons free)Advanced Regression Course (first 2 lessons free)Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!Visit our Patreon page to unlock exclusive Bayesian swag ;)Takeaways:BoTorch is designed for researchers who want flexibility in Bayesian optimization.The integration of BoTorch with PyTorch allows for differentiable programming.Scalability at Meta involves careful software engineering practices and testing.Open-source contributions enhance the development and community engagement of BoTorch.LLMs can help incorporate human knowledge into optimization processes.Max emphasizes the importance of clear communication of uncertainty to stakeholders.The role of a researcher in industry is often more application-focused than in academia.Max's team at Meta works on adaptive experimentation and Bayesian optimization.Chapters:08:51 Understanding BoTorch12:12 Use Cases and Flexibility of BoTorch15:02 Integration with PyTorch and GPyTorch17:57 Practical Applications of BoTorch20:50 Open Source Culture at Meta and BoTorch's Development43:10 The Power of Open Source Collaboration47:49 Scalability Challenges at Meta51:02 Balancing Depth and Breadth in Problem Solving55:08 Communicating Uncertainty to Stakeholders01:00:53 Learning from Missteps in Research01:05:06 Integrating External Contributions into BoTorch01:08:00 The Future of Optimization with LLMsThank you to my Patrons for making this episode possible!Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode,...
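For context on what a single BoTorch iteration looks like, here is a minimal sketch following the standard BoTorch/GPyTorch quick-start pattern (SingleTaskGP surrogate, expected improvement, acquisition optimization). The objective, bounds, and settings are toy placeholders, and exact call signatures can shift between BoTorch releases, so check the docs for your version.

```python
# Minimal sketch of one Bayesian-optimization step with BoTorch + GPyTorch.
import torch
from botorch.models import SingleTaskGP
from botorch.fit import fit_gpytorch_mll
from botorch.acquisition import ExpectedImprovement
from botorch.optim import optimize_acqf
from gpytorch.mlls import ExactMarginalLogLikelihood

def objective(x):                                   # toy 1-D function to maximize
    return -(x - 0.3) ** 2

train_X = torch.rand(8, 1, dtype=torch.double)      # initial design
train_Y = objective(train_X)

gp = SingleTaskGP(train_X, train_Y)                 # GP surrogate over the objective
fit_gpytorch_mll(ExactMarginalLogLikelihood(gp.likelihood, gp))

acq = ExpectedImprovement(gp, best_f=train_Y.max())
bounds = torch.tensor([[0.0], [1.0]], dtype=torch.double)
candidate, _ = optimize_acqf(acq, bounds=bounds, q=1, num_restarts=5, raw_samples=32)
print("next point to evaluate:", candidate)
```

In practice you would evaluate the objective at `candidate`, append it to the training data, and repeat; everything stays differentiable through PyTorch, which is the integration point the episode highlights.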
Today's clip is from episode 138 of the podcast, with Mélodie Monod, François-Xavier Briol and Yingzhen Li.During this live show at Imperial College London, Alex and his guests delve into the complexities and advancements in Bayesian deep learning, focusing on uncertainty quantification, the integration of machine learning tools, and the challenges faced in simulation-based inference.The speakers discuss their current projects, the evolution of Bayesian models, and the need for better computational tools in the field.Get the full discussion here.Attend Alex's tutorial at PyData Berlin: A Beginner's Guide to State Space Modeling Intro to Bayes Course (first 2 lessons free)Advanced Regression Course (first 2 lessons free)Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!Visit our Patreon page to unlock exclusive Bayesian swag ;)TranscriptThis is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!Intro to Bayes Course (first 2 lessons free)Advanced Regression Course (first 2 lessons free)Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!Visit our Patreon page to unlock exclusive Bayesian swag ;)Takeaways:Bayesian deep learning is a growing field with many challenges.Current research focuses on applying Bayesian methods to neural networks.Diffusion methods are emerging as a new approach for uncertainty quantification.The integration of machine learning tools into Bayesian models is a key area of research.The complexity of Bayesian neural networks poses significant computational challenges.Future research will focus on improving methods for uncertainty quantification. Generalized Bayesian inference offers a more robust approach to uncertainty.Uncertainty quantification is crucial in fields like medicine and epidemiology.Detecting out-of-distribution examples is essential for model reliability.Exploration-exploitation trade-off is vital in reinforcement learning.Marginal likelihood can be misleading for model selection.The integration of Bayesian methods in LLMs presents unique challenges.Chapters:00:00 Introduction to Bayesian Deep Learning03:12 Panelist Introductions and Backgrounds10:37 Current Research and Challenges in Bayesian Deep Learning18:04 Contrasting Approaches: Bayesian vs. Machine Learning26:09 Tools and Techniques for Bayesian Deep Learning31:18 Innovative Methods in Uncertainty Quantification36:23 Generalized Bayesian Inference and Its Implications41:38 Robust Bayesian Inference and Gaussian Processes44:24 Software Development in Bayesian Statistics46:51 Understanding Uncertainty in Language Models50:03 Hallucinations in Language Models53:48 Bayesian Neural Networks vs Traditional Neural Networks58:00 Challenges with Likelihood Assumptions01:01:22 Practical Applications of Uncertainty Quantification01:04:33 Meta Decision-Making with Uncertainty01:06:50 Exploring Bayesian Priors in Neural Networks01:09:17 Model Complexity and Data Signal01:12:10 Marginal Likelihood and Model Selection01:15:03 Implementing Bayesian Methods in LLMs01:19:21 Out-of-Distribution Detection in LLMsThank you to my Patrons for making this episode possible!Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer,...
Today's clip is from episode 137 of the podcast, with Robert Ness.Alex and Robert discuss the intersection of causal inference and deep learning, emphasizing the importance of understanding causal concepts in statistical modeling. The discussion also covers the evolution of probabilistic machine learning, the role of inductive biases, and the potential of large language models in causal analysis, highlighting their ability to translate natural language into formal causal queries.Get the full conversation here.Attend Alex's tutorial at PyData Berlin: A Beginner's Guide to State Space Modeling Intro to Bayes Course (first 2 lessons free)Advanced Regression Course (first 2 lessons free)Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!Visit our Patreon page to unlock exclusive Bayesian swag ;)TranscriptThis is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
MEGA-RAN returns to TALKTIMELIVE.COM to discuss his latest project, "Buddy's Magic Tree House," which is #1 on iTunes children's chart, his venture through fatherhood, as well as his work with the NEW DAY's brand new WWE theme song, and much more.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!Intro to Bayes Course (first 2 lessons free)Advanced Regression Course (first 2 lessons free)Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!Visit our Patreon page to unlock exclusive Bayesian swag ;)Takeaways:Causal assumptions are crucial for statistical modeling.Deep learning can be integrated with causal models.Statistical rigor is essential in evaluating LLMs.Causal representation learning is a growing field.Inductive biases in AI should match key mechanisms.Causal AI can improve decision-making processes.The future of AI lies in understanding causal relationships.Chapters:00:00 Introduction to Causal AI and Its Importance16:34 The Journey to Writing Causal AI28:05 Integrating Graphical Causality with Deep Learning40:10 The Evolution of Probabilistic Machine Learning44:34 Practical Applications of Causal AI with LLMs49:48 Exploring Multimodal Models and Causality56:15 Tools and Frameworks for Causal AI01:03:19 Statistical Rigor in Evaluating LLMs01:12:22 Causal Thinking in Real-World Deployments01:19:52 Trade-offs in Generative Causal Models01:25:14 Future of Causal Generative ModelingThank you to my Patrons for making this episode possible!Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Marcus Nölke, Maggi Mackintosh, Grant...
On this very fun and sweet episode, Brian is joined by the wonderful rapper/DJ/author/producer Mega Ran (@mega_ran) for a chat about touring, MAGFest, making kids music, the joys of collaborating with your own children, tricky concert etiquette, and more! This is a REALLY good one, folks. Be sure to check out Mega Ran's new kids album Buddy's Magic Tree House and all of his other music here: https://linktr.ee/megaran. Check out our Patreon and support the show at patreon.com/leightonnight! Follow us on Twitter at @leightonnight and on Instagram/TikTok at @leighton_night. You can find Brian on Twitter/Instagram at @bwecht, and Leighton at @graylish (Twitter)/@buttchamps (Instagram). Hosted on Acast. See acast.com/privacy for more information.
Teacher, rapper, author and content creator Mega Ran joins "Draped in Gold." Jack and Flobo are talking Goldberg, All In, and whether or not there's a new WWE Transfer Portal.
Today's clip is from episode 136 of the podcast, with Haavard Rue & Janet van Niekerk.Alex, Haavard and Janet explore the world of Bayesian inference with INLA, a fast and deterministic method that revolutionizes how we handle large datasets and complex models. Discover the power of INLA, and why it can make your models go much faster! Get the full conversation here.Intro to Bayes Course (first 2 lessons free)Advanced Regression Course (first 2 lessons free)Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!Visit our Patreon page to unlock exclusive Bayesian swag ;)TranscriptThis is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!Intro to Bayes Course (first 2 lessons free)Advanced Regression Course (first 2 lessons free)Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!Visit our Patreon page to unlock exclusive Bayesian swag ;)Takeaways:INLA is a fast, deterministic method for Bayesian inference.INLA is particularly useful for large datasets and complex models.The R INLA package is widely used for implementing INLA methodology.INLA has been applied in various fields, including epidemiology and air quality control.Computational challenges in INLA are minimal compared to MCMC methods.The Smart Gradient method enhances the efficiency of INLA.INLA can handle various likelihoods, not just Gaussian.SPDs allow for more efficient computations in spatial modeling.The new INLA methodology scales better for large datasets, especially in medical imaging.Priors in Bayesian models can significantly impact the results and should be chosen carefully.Penalized complexity priors (PC priors) help prevent overfitting in models.Understanding the underlying mathematics of priors is crucial for effective modeling.The integration of GPUs in computational methods is a key future direction for INLA.The development of new sparse solvers is essential for handling larger models efficiently.Chapters:06:06 Understanding INLA: A Comparison with MCMC08:46 Applications of INLA in Real-World Scenarios11:58 Latent Gaussian Models and Their Importance15:12 Impactful Applications of INLA in Health and Environment18:09 Computational Challenges and Solutions in INLA21:06 Stochastic Partial Differential Equations in Spatial Modeling23:55 Future Directions and Innovations in INLA39:51 Exploring Stochastic Differential Equations43:02 Advancements in INLA Methodology50:40 Getting Started with INLA56:25 Understanding Priors in Bayesian ModelsThank you to my Patrons for making this episode possible!Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad
Get 10% off Hugo's "Building LLM Applications for Data Scientists and Software Engineers" online course!Today's clip is from episode 135 of the podcast, with Teemu Säilynoja.Alex and Teemu discuss the importance of simulation-based calibration (SBC). They explore the practical implementation of SBC in probabilistic programming languages, the challenges faced in developing SBC methods, and the significance of both prior and posterior SBC in ensuring model reliability. The discussion emphasizes the need for careful model implementation and inference algorithms to achieve accurate calibration.Get the full conversation here.Intro to Bayes Course (first 2 lessons free)Advanced Regression Course (first 2 lessons free)Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!Visit our Patreon page to unlock exclusive Bayesian swag ;)TranscriptThis is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
Intro to Bayes Course (first 2 lessons free)
Advanced Regression Course (first 2 lessons free)
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!
Visit our Patreon page to unlock exclusive Bayesian swag ;)
Takeaways:
Teemu focuses on calibration assessments and predictive checking in Bayesian workflows.
Simulation-based calibration (SBC) checks model implementation.
SBC involves drawing realizations from the prior and generating prior predictive data.
Visual predictive checking is crucial for assessing model predictions.
Prior predictive checks should be done before looking at data.
Posterior SBC focuses on the area of parameter space most relevant to the data.
Challenges in SBC include inference time.
Visualizations complement numerical metrics in Bayesian modeling.
Amortized Bayesian inference benefits from SBC for quick posterior checks.
The calibration of Bayesian models is more intuitive than that of frequentist models.
Choosing the right visualization depends on data characteristics.
Using multiple visualization methods can reveal different insights.
Visualizations should be viewed as models of the data.
Goodness-of-fit tests can enhance visualization accuracy.
Uncertainty visualization is crucial but often overlooked.
Chapters:
09:53 Understanding Simulation-Based Calibration (SBC)
15:03 Practical Applications of SBC in Bayesian Modeling
22:19 Challenges in Developing Posterior SBC
29:41 The Role of SBC in Amortized Bayesian Inference
33:47 The Importance of Visual Predictive Checking
36:50 Predictive Checking and Model Fitting
38:08 The Importance of Visual Checks
40:54 Choosing Visualization Types
49:06 Visualizations as Models
55:02 Uncertainty Visualization in Bayesian Modeling
01:00:05 Future Trends in Probabilistic Modeling
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Chad Scherrer, Zwelithini Tunyiswa, Bertrand...
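To make the SBC loop in the takeaways concrete, here is a self-contained NumPy sketch for a conjugate normal-mean model (a generic illustration, not Teemu's implementation): draw a parameter from the prior, simulate prior predictive data, record the rank of the true parameter among posterior draws, and repeat. With a correct model and correct inference, the ranks are uniform.

```python
# Self-contained sketch of simulation-based calibration for a conjugate model:
# unknown mean mu ~ N(0, 1), data y_i ~ N(mu, 1). The posterior is known in
# closed form here, so "posterior draws" are exact; with MCMC you would use
# the sampler's draws instead.
import numpy as np

rng = np.random.default_rng(2)
n_sims, n_obs, n_draws = 1000, 20, 100
ranks = np.empty(n_sims, dtype=int)

for s in range(n_sims):
    mu_true = rng.normal(0.0, 1.0)                     # draw from the prior
    y = rng.normal(mu_true, 1.0, size=n_obs)           # prior predictive data
    post_var = 1.0 / (1.0 + n_obs)                     # conjugate posterior
    post_mean = post_var * y.sum()
    draws = rng.normal(post_mean, np.sqrt(post_var), size=n_draws)
    ranks[s] = (draws < mu_true).sum()                 # rank of the true value

# If model + inference are correct, ranks are uniform on {0, ..., n_draws};
# a histogram or ECDF-difference plot of `ranks` should look flat.
```

Departures from uniformity (U-shapes, skew, spikes at the edges) are the diagnostic signal: they indicate a bug in the model implementation or an inference algorithm that is over- or under-dispersed.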
ICYMI, I'll be in London next week for a live episode of the Learning Bayesian Statistics podcast.
Today's clip is from episode 134 of the podcast, with David Kohns.Alex and David discuss the future of probabilistic programming, focusing on advancements in time series modeling, model selection, and the integration of AI in prior elicitation. The discussion highlights the importance of setting appropriate priors, the challenges of computational workflows, and the potential of normalizing flows to enhance Bayesian inference.Get the full discussion here.Intro to Bayes Course (first 2 lessons free)Advanced Regression Course (first 2 lessons free)Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!Visit our Patreon page to unlock exclusive Bayesian swag ;)TranscriptThis is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!Intro to Bayes Course (first 2 lessons free)Advanced Regression Course (first 2 lessons free)Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!Visit our Patreon page to unlock exclusive Bayesian swag ;)Takeaways:Setting appropriate priors is crucial to avoid overfitting in models.R-squared can be used effectively in Bayesian frameworks for model evaluation.Dynamic regression can incorporate time-varying coefficients to capture changing relationships.Predictively consistent priors enhance model interpretability and performance.Identifiability is a challenge in time series models.State space models provide structure compared to Gaussian processes.Priors influence the model's ability to explain variance.Starting with simple models can reveal interesting dynamics.Understanding the relationship between states and variance is key.State-space models allow for dynamic analysis of time series data.AI can enhance the process of prior elicitation in statistical models.Chapters:10:09 Understanding State Space Models14:53 Predictively Consistent Priors20:02 Dynamic Regression and AR Models25:08 Inflation Forecasting50:49 Understanding Time Series Data and Economic Analysis57:04 Exploring Dynamic Regression Models01:05:52 The Role of Priors01:15:36 Future Trends in Probabilistic Programming01:20:05 Innovations in Bayesian Model SelectionThank you to my Patrons for making this episode possible!Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki...
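As a small illustration of the dynamic-regression idea from this episode (a coefficient that drifts over time, driven by innovations), here is a generic PyMC sketch with simulated data. It is not David's model: the random walk is built explicitly from cumulative innovations so the "shocks" are visible, and all scales and priors are invented.

```python
# Minimal sketch of a dynamic regression with a time-varying coefficient in PyMC.
# The random walk is built from innovations via a cumulative sum, which makes the
# "innovations = external shocks" interpretation explicit. Toy data only.
import numpy as np
import pymc as pm
import pytensor.tensor as pt

rng = np.random.default_rng(3)
T = 200
x = rng.normal(size=T)
beta_true = np.cumsum(rng.normal(0, 0.05, size=T))        # slowly drifting coefficient
y = beta_true * x + rng.normal(0, 0.2, size=T)

with pm.Model() as model:
    sigma_beta = pm.HalfNormal("sigma_beta", 0.1)          # innovation scale
    innovations = pm.Normal("innovations", 0.0, sigma_beta, shape=T)
    beta = pm.Deterministic("beta", pt.cumsum(innovations))  # Gaussian random walk
    sigma_y = pm.HalfNormal("sigma_y", 0.5)
    pm.Normal("y_obs", mu=beta * x, sigma=sigma_y, observed=y)
    idata = pm.sample()
```

The prior on `sigma_beta` is doing the work the episode warns about: too loose and the coefficient path soaks up noise (overfitting), too tight and it cannot track genuine changes in the relationship.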
Today's clip is from episode 133 of the podcast, with Sean Pinkney & Adrian Seyboldt.The conversation delves into the concept of Zero-Sum Normal and its application in statistical modeling, particularly in hierarchical models. Alex, Sean and Adrian discuss the implications of using zero-sum constraints, the challenges of incorporating new data points, and the importance of distinguishing between sample and population effects. They also explore practical solutions for making predictions based on population parameters and the potential for developing tools to facilitate these processes.Get the full discussion here.Intro to Bayes Course (first 2 lessons free)Advanced Regression Course (first 2 lessons free)Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!Visit our Patreon page to unlock exclusive Bayesian swag ;)TranscriptThis is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!Intro to Bayes Course (first 2 lessons free)Advanced Regression Course (first 2 lessons free)Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!Visit our Patreon page to unlock exclusive Bayesian swag ;) Takeaways:Zero Sum constraints allow for better sampling and estimation in hierarchical models.Understanding the difference between population and sample means is crucial.A library for zero-sum normal effects would be beneficial.Practical solutions can yield decent predictions even with limitations.Cholesky parameterization can be adapted for positive correlation matrices.Understanding the geometry of sampling spaces is crucial.The relationship between eigenvalues and sampling is complex.Collaboration and sharing knowledge enhance research outcomes.Innovative approaches can simplify complex statistical problems.Chapters:03:35 Sean Pinkney's Journey to Bayesian Modeling11:21 The Zero-Sum Normal Project Explained18:52 Technical Insights on Zero-Sum Constraints32:04 Handling New Elements in Bayesian Models36:19 Understanding Population Parameters and Predictions49:11 Exploring Flexible Cholesky Parameterization01:07:23 Closing Thoughts and Future DirectionsThank you to my Patrons for making this episode possible!Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary, Blake Walters, Jonathan Morgan, Francesco Madrisotti, Ivy Huang, Gary...
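For readers who want to see the constraint in code, here is a minimal hierarchical sketch using PyMC's ZeroSumNormal distribution. The data, group structure, and priors are illustrative, and giving sigma its own prior is an assumption that may need adjusting depending on your PyMC version.

```python
# Minimal sketch of a sum-to-zero group effect with PyMC's ZeroSumNormal.
# The constraint makes the global intercept identifiable as the population mean
# instead of trading off against the group offsets. Toy data only.
import numpy as np
import pymc as pm

rng = np.random.default_rng(4)
n_groups, n_obs = 6, 300
group_idx = rng.integers(0, n_groups, size=n_obs)
y = 2.0 + rng.normal(0, 1, size=n_groups)[group_idx] + rng.normal(0, 0.5, size=n_obs)

with pm.Model() as model:
    intercept = pm.Normal("intercept", 0.0, 5.0)
    sigma_group = pm.HalfNormal("sigma_group", 1.0)
    # Offsets constrained to sum to zero across the group axis
    group_effect = pm.ZeroSumNormal("group_effect", sigma=sigma_group, shape=n_groups)
    sigma = pm.HalfNormal("sigma", 1.0)
    pm.Normal("y_obs", mu=intercept + group_effect[group_idx], sigma=sigma, observed=y)
    idata = pm.sample()
```

The sample-versus-population distinction discussed in the episode shows up here too: the zero-sum offsets describe the groups you observed, so predicting for a brand-new group still requires drawing a fresh effect from the population distribution.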
Today's clip is from episode 132 of the podcast, with Tom Griffiths.Tom and Alex Andorra discuss the fundamental differences between human intelligence and artificial intelligence, emphasizing the constraints that shape human cognition, such as limited data, computational resources, and communication bandwidth. They explore how AI systems currently learn and the potential for aligning AI with human cognitive processes. The discussion also delves into the implications of AI in enhancing human decision-making and the importance of understanding human biases to create more effective AI systems.Get the full discussion here.Intro to Bayes Course (first 2 lessons free)Advanced Regression Course (first 2 lessons free)Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!Visit our Patreon page to unlock exclusive Bayesian swag ;)TranscriptThis is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!Check out Hugo's latest episode with Fei-Fei Li, on How Human-Centered AI Actually Gets BuiltIntro to Bayes Course (first 2 lessons free)Advanced Regression Course (first 2 lessons free)Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!Visit our Patreon page to unlock exclusive Bayesian swag ;)Takeaways:Computational cognitive science seeks to understand intelligence mathematically.Bayesian statistics is crucial for understanding human cognition.Inductive biases help explain how humans learn from limited data.Eliciting prior distributions can reveal implicit beliefs.The wisdom of individuals can provide richer insights than averaging group responses.Generative AI can mimic human cognitive processes.Human intelligence is shaped by constraints of data, computation, and communication.AI systems operate under different constraints than human cognition. Human intelligence differs fundamentally from machine intelligence.Generative AI can complement and enhance human learning.AI systems currently lack intrinsic human compatibility.Language training in AI helps align its understanding with human perspectives.Reinforcement learning from human feedback can lead to misalignment of AI goals.Representational alignment can improve AI's understanding of human concepts.AI can help humans make better decisions by providing relevant information.Research should focus on solving problems rather than just methods.Chapters:00:00 Understanding Computational Cognitive Science13:52 Bayesian Models and Human Cognition29:50 Eliciting Implicit Prior Distributions38:07 The Relationship Between Human and AI Intelligence45:15 Aligning Human and Machine Preferences50:26 Innovations in AI and Human Interaction55:35 Resource Rationality in Decision Making01:00:07 Language Learning in AI Models
Today's clip is from episode 131 of the podcast, with Luke Bornn.Luke and Alex discuss the application of generative models in sports analytics. They emphasize the importance of Bayesian modeling to account for uncertainty and contextual variations in player data. The discussion also covers the challenges of balancing model complexity with computational efficiency, the innovative ways to hack Bayesian models for improved performance, and the significance of understanding model fitting and discretization in statistical modeling.Get the full discussion here.Intro to Bayes Course (first 2 lessons free)Advanced Regression Course (first 2 lessons free)Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!Visit our Patreon page to unlock exclusive Bayesian swag ;)TranscriptThis is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!Intro to Bayes Course (first 2 lessons free)Advanced Regression Course (first 2 lessons free)Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!Visit our Patreon page to unlock exclusive Bayesian swag ;)Thank you to my Patrons for making this episode possible!Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary, Blake Walters, Jonathan Morgan, Francesco Madrisotti, Ivy Huang, Gary Clarke, Robert Flannery, Rasmus Hindström, Stefan, Corey Abshire, Mike Loncaric, David McCormick, Ronald Legere, Sergio Dolia, Michael Cao, Yiğit Aşık and Suyog Chandramouli.Takeaways:Player tracking data revolutionized sports analytics.Decision-making in sports involves managing uncertainty and budget constraints.Luke emphasizes the importance of portfolio optimization in team management.Clubs with high budgets can afford inefficiencies in player acquisition.Statistical methods provide a probabilistic approach to player value.Removing human bias is crucial in sports decision-making.Understanding player performance distributions aids in contract decisions.The goal is to maximize performance value per dollar spent.Model validation in sports requires focusing on edge cases.
Mania SZN is finally over, and the @BrokenPencilBC (@Suave4Mayor x @DanjahOne) keeps you up-to-date with the most fitting recap of last weekend's event. Also on deck: everyone's favorite Unc gets caught dabbling in public—again, Mega Ran gets a surprise pop-up, LaGreca doesn't like what he saw and tells the world his true feelings, a Hip-Hop homework assignment, and tons more. Check your preferred streaming home & set a reminder. Like. Rate. Share. Most importantly, Subscribe for auto-delivery. https://pods.link/brokenpencilbc Available on all streaming platforms. #BrokenPencilLogic #YouCantWriteThis #PriceJustWentUp #MarkMyWords #FTCF #WCW #WWE #NXT #AEW #ROH #ImpactWrestling #NJPW #NWA #Podcast #NowStreaming #ApplePodcasts #Spotify #Pandora #TuneIn #prowrestling #VidaHermosaCigars #CerwinVega
Today's clip is from episode 130 of the podcast, with epidemiological modeler Adam Kucharski. This conversation explores the critical role of epidemic modeling during the COVID-19 pandemic, highlighting how these models informed public health decisions and the relationship between modeling and policy. The discussion emphasizes the need for improved communication and understanding of data among the public and policymakers. Get the full discussion at https://learnbayesstats.com/episode/129-bayesian-deep-learning-ai-for-science-vincent-fortuin. Intro to Bayes Course (first 2 lessons free). Advanced Regression Course (first 2 lessons free). Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work! Visit our Patreon page to unlock exclusive Bayesian swag ;) Transcript: This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!Intro to Bayes Course (first 2 lessons free)Advanced Regression Course (first 2 lessons free)Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!Visit our Patreon page to unlock exclusive Bayesian swag ;)Thank you to my Patrons for making this episode possible!Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary, Blake Walters, Jonathan Morgan, Francesco Madrisotti, Ivy Huang, Gary Clarke, Robert Flannery, Rasmus Hindström, Stefan, Corey Abshire, Mike Loncaric, David McCormick, Ronald Legere, Sergio Dolia, Michael Cao, Yiğit Aşık and Suyog Chandramouli.Takeaways:Epidemiology requires a blend of mathematical and statistical understanding.Models are essential for informing public health decisions during epidemics.The COVID-19 pandemic highlighted the importance of rapid modeling.Misconceptions about data can lead to misunderstandings in public health.Effective communication is crucial for conveying complex epidemiological concepts.Epidemic thinking can be applied to various fields, including marketing and finance.Public health policies should be informed by robust modeling and data analysis.Automation can help streamline data analysis in epidemic response.Understanding the limitations of models...
Today's clip is from episode 129 of the podcast, with AI expert and researcher Vincent Fortuin.This conversation delves into the intricacies of Bayesian deep learning, contrasting it with traditional deep learning and exploring its applications and challenges.Get the full discussion at https://learnbayesstats.com/episode/129-bayesian-deep-learning-ai-for-science-vincent-fortuinIntro to Bayes Course (first 2 lessons free)Advanced Regression Course (first 2 lessons free)Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!Visit our Patreon page to unlock exclusive Bayesian swag ;)TranscriptThis is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
Intro to Bayes Course (first 2 lessons free)
Advanced Regression Course (first 2 lessons free)
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!
Visit our Patreon page to unlock exclusive Bayesian swag ;)
Takeaways:
The hype around AI in science often fails to deliver practical results.
Bayesian deep learning combines the strengths of deep learning and Bayesian statistics.
Fine-tuning LLMs with Bayesian methods improves prediction calibration.
There is no single dominant library for Bayesian deep learning yet.
Real-world applications of Bayesian deep learning exist in various fields.
Prior knowledge is crucial for the effectiveness of Bayesian deep learning.
Data efficiency in AI can be enhanced by incorporating prior knowledge.
Generative AI and Bayesian deep learning can inform each other.
The complexity of a problem influences the choice between Bayesian and traditional deep learning.
Meta-learning enhances the efficiency of Bayesian models.
PAC-Bayesian theory merges Bayesian and frequentist ideas.
Laplace inference offers a cost-effective approximation.
Subspace inference can optimize parameter efficiency.
Bayesian deep learning is crucial for reliable predictions.
Effective communication of uncertainty is essential.
Realistic benchmarks are needed for Bayesian methods.
Collaboration and communication in the AI community are vital.
Chapters:
00:00 Introduction to Bayesian Deep Learning
04:24 Vincent Fortuin's Journey to Bayesian Deep Learning
11:52 Understanding Bayesian Deep Learning
16:29 Current Landscape of Bayesian Libraries
21:11 Real-World Applications of Bayesian Deep Learning
23:33 When to Use Bayesian Deep Learning
28:22 Data Efficiency in AI and Generative Modeling
30:18 Integrating Bayesian Knowledge into Generative Models
31:44 The Role of Meta-Learning in Bayesian Deep Learning
34:06 Understanding PAC-Bayesian Theory
37:55 Algorithms for Bayesian Deep Learning Models
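To ground the "Laplace inference offers a cost-effective approximation" takeaway, here is a hand-rolled NumPy/SciPy sketch of the Laplace approximation for a Bayesian logistic regression (a generic illustration, not code from the episode or any particular library): find the MAP, then approximate the posterior with a Gaussian whose covariance is the inverse Hessian at the MAP.

```python
# Laplace approximation for Bayesian logistic regression:
# posterior ≈ N(w_MAP, H^{-1}), where H is the Hessian of the negative
# log-posterior at the MAP. All data and prior scales are illustrative.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(7)
n, d = 500, 3
X = rng.normal(size=(n, d))
w_true = np.array([1.5, -2.0, 0.5])
y = rng.binomial(1, 1 / (1 + np.exp(-X @ w_true)))

prior_var = 10.0

def neg_log_posterior(w):
    logits = X @ w
    # Bernoulli negative log-likelihood in log-sum-exp form, plus a Gaussian prior
    nll = np.sum(np.logaddexp(0.0, logits) - y * logits)
    return nll + 0.5 * np.sum(w**2) / prior_var

w_map = minimize(neg_log_posterior, np.zeros(d), method="BFGS").x

# Hessian at the MAP (closed form for logistic regression)
p = 1 / (1 + np.exp(-X @ w_map))
H = X.T @ (X * (p * (1 - p))[:, None]) + np.eye(d) / prior_var
posterior_cov = np.linalg.inv(H)                     # Laplace covariance

print("MAP estimate:", w_map)
print("approx. posterior sd:", np.sqrt(np.diag(posterior_cov)))
```

The appeal for deep learning is that the same recipe applies after ordinary training: one extra (possibly structured) Hessian computation buys calibrated uncertainty on top of a point estimate, which is far cheaper than full posterior sampling over network weights.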
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!Intro to Bayes Course (first 2 lessons free)Advanced Regression Course (first 2 lessons free)Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!Visit our Patreon page to unlock exclusive Bayesian swag ;)Takeaways:Matt emphasizes the importance of Bayesian statistics in scenarios with limited data.Communicating insights to coaches is a crucial skill for data analysts.Building a data team requires understanding the needs of the coaching staff.Player recruitment is a significant focus in football analytics.The integration of data science in sports is still evolving.Effective data modeling must consider the practical application in games.Collaboration between data analysts and coaches enhances decision-making.Having a robust data infrastructure is essential for efficient analysis.The landscape of sports analytics is becoming increasingly competitive. Player recruitment involves analyzing various data models.Biases in traditional football statistics can skew player evaluations.Statistical techniques should leverage the structure of football data.Tracking data opens new avenues for understanding player movements.The role of data analysis in football will continue to grow.Aspiring analysts should focus on curiosity and practical experience.Chapters:00:00 Introduction to Football Analytics and Matt's Journey04:54 The Role of Bayesian Methods in Football10:20 Challenges in Communicating Data Insights17:03 Building Relationships with Coaches22:09 The Structure of the Data Team at Como26:18 Focus on Player Recruitment and Transfer Strategies28:48 January Transfer Window Insights30:54 Biases in Football Data Analysis34:11 Comparative Analysis of Men's and Women's Football36:55 Statistical Techniques in Football Analysis42:48 The Impact of Tracking Data on Football Analysis45:49 The Future of Data-Driven Football Strategies47:27 Advice for Aspiring Football Analysts
Full show: https://kNOwBETTERHIPHOP.com Artist Played: Okito, Aahmean, Mozaic, conshus, Mega Ran, O-Super, EyeQ, Silas Short, TOKiMONSTA, Mez, Hyldon, Adrian Younge, oreglo, Cautious Clay, DJ Ess, De La Soul, Butcher Brown, Phonte, Devin Morrison, Vvslegend, Billionaire Boyscout, Serebii, Crafty 893, Dave Dar, Kannon Salim Dar, MC Wicks, Planet Asia, Equipto, Archltect, spill tab, Amerigo Gazaway, Cavendish Archive, Slick Rick, OutKast, GOODie MOb, IMAKEMADBEATS
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!Intro to Bayes Course (first 2 lessons free)Advanced Regression Course (first 2 lessons free)Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!Visit our Patreon page to unlock exclusive Bayesian swag ;)Thank you to my Patrons for making this episode possible!Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary, Blake Walters, Jonathan Morgan, Francesco Madrisotti, Ivy Huang, Gary Clarke, Robert Flannery, Rasmus Hindström, Stefan, Corey Abshire, Mike Loncaric, David McCormick, Ronald Legere, Sergio Dolia and Michael Cao.Takeaways:Sharks play a crucial role in maintaining healthy ocean ecosystems.Bayesian statistics are particularly useful in data-poor environments like ecology.Teaching Bayesian statistics requires a shift in mindset from traditional statistical methods.The shark meat trade is significant and often overlooked.Ray meat trade is as large as shark meat trade, with specific markets dominating.Understanding the ecological roles of species is essential for effective conservation.Causal language is important in ecological research and should be encouraged.Evidence-driven decision-making is crucial in balancing human and ecological needs.Expert opinions are...
In our second-ever crossover episode with X-Pod '97, Mega Ran and Marcos join Christian Bladt, Eric Conner (briefly) and Jonathan London (even briefer) as they look back at the entire first season of Your Friendly Neighborhood Spider-Man on Disney+. Learn more about your ad choices. Visit megaphone.fm/adchoices
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!Intro to Bayes Course (first 2 lessons free)Advanced Regression Course (first 2 lessons free)Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!Visit our Patreon page to unlock exclusive Bayesian swag ;)Takeaways:Marketing analytics is crucial for understanding customer behavior.PyMC Marketing offers tools for customer lifetime value analysis.Media mix modeling helps allocate marketing spend effectively.Customer Lifetime Value (CLV) models are essential for understanding long-term customer behavior.Productionizing models is essential for real-world applications.Productionizing models involves challenges like model artifact storage and version control.MLflow integration enhances model tracking and management.The open-source community fosters collaboration and innovation.Understanding time series is vital in marketing analytics.Continuous learning is key in the evolving field of data science.Chapters:00:00 Introduction to Will Dean and His Work10:48 Diving into PyMC Marketing17:10 Understanding Media Mix Modeling25:54 Challenges in Productionizing Models35:27 Exploring Customer Lifetime Value Models44:10 Learning and Development in Data ScienceThank you to my Patrons for making this episode possible!Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz,...
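To sketch what a media mix model does under the hood, here is a conceptual raw-PyMC example with geometric adstock and a saturating channel response. This is deliberately not the pymc-marketing API, which evolves across releases; the data, decay rate, and priors are invented for illustration.

```python
# Conceptual media-mix-model sketch in raw PyMC (not the pymc-marketing API):
# geometric adstock carries over past spend, a saturating transform caps the
# response, and the channel contributes linearly to sales. Toy data only.
import numpy as np
import pymc as pm

rng = np.random.default_rng(5)
T = 104                                            # two years of weekly data
spend = rng.gamma(2.0, 1.0, size=T)

def geometric_adstock(x, alpha, l_max=8):
    """Carry over a fraction alpha of past spend for up to l_max lags."""
    w = alpha ** np.arange(l_max)
    return np.convolve(x, w)[: len(x)]

adstocked = geometric_adstock(spend, alpha=0.6)    # decay fixed here for simplicity
sales = 10 + 3 * np.tanh(adstocked / 2) + rng.normal(0, 0.5, size=T)  # simulated truth

with pm.Model() as mmm:
    intercept = pm.Normal("intercept", 10, 5)
    beta = pm.HalfNormal("beta", 5)                # channel effect size
    half_sat = pm.Gamma("half_sat", 2, 1)          # saturation scale
    sigma = pm.HalfNormal("sigma", 1)
    channel_effect = beta * pm.math.tanh(adstocked / half_sat)
    pm.Normal("sales", mu=intercept + channel_effect, sigma=sigma, observed=sales)
    idata = pm.sample()
```

PyMC Marketing packages these ingredients (adstock, saturation, priors, and diagnostics) behind a higher-level interface, which is what makes the productionizing and MLflow-tracking workflow discussed in the episode practical.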
Tonight we have the return of our guy Mega Ran! We will talk to him about his latest song being the new theme for WWE's The New Day and what happened at the Raw on Netflix premiere. We also will talk about being at the Grammys and his latest project Buddy's Magic Tree House! Plus you know we have to chime in on the Royal Rumble! What's next for Jey Uso? What is CM Punk's favor and how will it play out? Cory will also be talking some UFC 312, and we will talk @ Piece Promotions, where Cory and Kyle will be live on commentary! Get great clips of your content using OPUS CLIP! CHECK OUT OUR CODE https://www.opus.pro/?via=Ko3C
Tonight we have the return of our guy Mega Ran! We will talk to him about his latest song being the new theme for WWE's The New Day and what happened at the Raw on Netflix premiere. We also will talk about being at the Grammys and his latest project Buddy's Magic Tree House! Plus you know we have to chime in on the Royal Rumble! What's next for Jey Uso? What is CM Punk's favor and how will it play out? Cory will also be talking some UFC 312, and we will talk @ Piece Promotions, where Cory and Kyle will be live on commentary! #TheNewDay #NewDay #WWE #WWERaw #SmackDown #MegaRan #RawonNetflix #WWEonNetflix #RoyalRumble #Grammys
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
Intro to Bayes Course (first 2 lessons free)
Advanced Regression Course (first 2 lessons free)
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!
Visit our Patreon page to unlock exclusive Bayesian swag ;)
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary, Blake Walters, Jonathan Morgan, Francesco Madrisotti, Ivy Huang, Gary Clarke, Robert Flannery, Rasmus Hindström, Stefan, Corey Abshire and Mike Loncaric.
Takeaways:
- The evolution of sports modeling is tied to the availability of high-frequency data.
- Bayesian methods are valuable in handling messy, hierarchical data (a minimal partial-pooling sketch follows below).
- Communication between data scientists and decision-makers is crucial for effective model use.
- Models are often wrong, and learning from mistakes is part of the process.
- Simplicity in models can sometimes yield better results than complexity.
- The integration of analytics in sports is still developing, with opportunities in various sports.
- Transparency in research and development teams enhances decision-making.
- Understanding uncertainty in models is essential for informed decisions.
- The balance between point estimates and full distributions is a...
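The "messy, hierarchical data" point lends itself to a tiny code illustration: a partial-pooling model in PyMC where per-team effects come from a shared population distribution, so sparsely observed teams borrow strength from the rest of the league. Team counts, sample sizes, the outcome metric, and all priors below are invented for illustration.

```python
# Partial pooling across teams with very uneven sample sizes (synthetic data).
import numpy as np
import pymc as pm

rng = np.random.default_rng(0)
n_teams = 12
games_per_team = rng.integers(5, 40, n_teams)              # some teams barely observed
team_idx = np.repeat(np.arange(n_teams), games_per_team)   # team index for each game
y = rng.normal(rng.normal(0, 1, n_teams)[team_idx], 2.0)   # some per-game metric

with pm.Model() as hier:
    mu = pm.Normal("mu", 0, 2)                  # league-wide average
    tau = pm.HalfNormal("tau", 1)               # between-team spread
    team = pm.Normal("team", mu, tau, shape=n_teams)  # partially pooled team effects
    sigma = pm.HalfNormal("sigma", 2)
    pm.Normal("obs", team[team_idx], sigma, observed=y)
    idata = pm.sample(1000, tune=1000, chains=2, random_seed=1)
```

The shared `mu` and `tau` are what pull noisy, small-sample team estimates toward the league average, which is the practical payoff of the hierarchical structure discussed in the episode.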
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!
Visit our Patreon page to unlock exclusive Bayesian swag ;)
Takeaways:
- Bayesian statistics offers a robust framework for econometric modeling.
- State space models provide a comprehensive way to understand time series data.
- Gaussian random walks serve as a foundational model in time series analysis.
- Innovations represent external shocks that can significantly impact forecasts.
- Understanding the assumptions behind models is key to effective forecasting.
- Complex models are not always better; simplicity can be powerful.
- Forecasting requires careful consideration of potential disruptions.
- Understanding observed and hidden states is crucial in modeling.
- Latent abilities can be modeled as Gaussian random walks.
- State space models can be highly flexible and diverse.
- Composability allows for the integration of different model components.
- Trends in time series should reflect real-world dynamics.
- Seasonality can be captured through Fourier bases.
- AR components help model residuals in time series data.
- Exogenous regression components can enhance state space models.
- Causal analysis in time series often involves interventions and counterfactuals.
- Time-varying regression allows for dynamic relationships between variables.
- Kalman filters were originally developed for tracking rockets in space.
- The Kalman filter iteratively updates beliefs based on new data (see the sketch after these notes).
- Missing data can be treated as hidden states in the Kalman filter framework.
- The Kalman filter is a practical application of Bayes' theorem in a sequential context.
- Understanding the dynamics of systems is crucial for effective modeling.
- The state space module in PyMC simplifies complex time series modeling tasks.
Chapters:
00:00 Introduction to Jesse Grabowski and Time Series Analysis
04:33 Jesse's Journey into Bayesian Statistics
10:51 Exploring State Space Models
18:28 Understanding State Space Models and Their Components
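Since several takeaways revolve around Gaussian random walks and the Kalman filter, here is a self-contained local-level filter in plain NumPy: the hidden state follows a random walk, and each observation updates the belief about it, which is the sequential Bayes' theorem idea mentioned above. Noise scales and the simulated sensor dropout are made-up values.

```python
# Local-level Kalman filter: hidden state = Gaussian random walk, noisy observations.
import numpy as np

rng = np.random.default_rng(42)
T, q, r = 200, 0.1, 1.0                                  # steps, state noise var, obs noise var
true_state = np.cumsum(rng.normal(0, np.sqrt(q), T))     # latent Gaussian random walk
y = true_state + rng.normal(0, np.sqrt(r), T)            # noisy observations
y[60:70] = np.nan                                        # pretend a sensor dropped out

m, P = 0.0, 10.0                                         # prior mean and variance for the state
filtered = np.empty(T)
for t in range(T):
    # Predict: the random walk just adds state noise to the current belief.
    m_pred, P_pred = m, P + q
    if np.isnan(y[t]):
        # Missing data: skip the update and carry the prediction forward.
        m, P = m_pred, P_pred
    else:
        # Update: the Kalman gain blends the prediction with the new observation.
        K = P_pred / (P_pred + r)
        m = m_pred + K * (y[t] - m_pred)
        P = (1 - K) * P_pred
    filtered[t] = m
```

The statespace tooling in PyMC wraps this kind of filtering recursion behind a model-building API, which is why the takeaway calls it a simplification: you declare the components and the filter runs under the hood.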
https://spinitron.com/WSFM/pl/20094601/Radio-Active-Kids
Send us a text
Please Like and Subscribe to my YouTube Channel for more Great Guests and Content: www.youtube.com/@thedanlevelyshow/streams
Bag of Tricks Cat is an underground rapper based in Glendale, Arizona. His artistic persona reflects a non-gimmicky approach to hip-hop, emphasizing authenticity and creativity over commercial appeal. He has built a reputation for his unique style and lyrical content, which often resonates with fans looking for substance in music. Bag of Tricks Cat has collaborated with various artists across the hip-hop spectrum, including notable figures such as Mega Ran from Philadelphia, D12 from Detroit, ASTRAY from Saginaw, MI, Whitney Peyton, WILLY NORTHPOLE from Arizona, and Smoke DZA from New York. These collaborations highlight his versatility and ability to blend different styles within the genre. He has also performed internationally, showcasing his music beyond local venues.
Follow "The Dan Levely Show" on
Facebook: http://www.facebook.com/thedanlevelyshow
Instagram: http://www.instagram.com/thedanlevelyshow
YouTube: http://www.youtube.com/@thedanlevelyshow/streams
Twitter: http://www.twitter.com/danlevelyshow
*THE VIEWS, OPINIONS, OR COMMENTS EXPRESSED ON "THE DAN LEVELY SHOW" BY ANY GUEST BEING INTERVIEWED ARE THOSE OF THE GUEST AND DO NOT REFLECT OR REPRESENT THE VIEWS AND OPINIONS HELD BY "THE DAN LEVELY SHOW"*
Support the show
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!
Visit our Patreon page to unlock exclusive Bayesian swag ;)
Takeaways:
- BART models are non-parametric Bayesian models that approximate functions by summing trees (a minimal sketch follows after these notes).
- BART is recommended for quick modeling without extensive domain knowledge.
- PyMC-BART allows mixing BART models with various likelihoods and other models.
- Variable importance can be easily interpreted using BART models.
- PreliZ aims to provide better tools for prior elicitation in Bayesian statistics.
- The integration of BART with Bambi could enhance exploratory modeling.
- Teaching Bayesian statistics involves practical problem-solving approaches.
- Future developments in PyMC-BART include significant speed improvements.
- Prior predictive distributions can aid in understanding model behavior.
- Interactive learning tools can enhance understanding of statistical concepts.
- Integrating PreliZ with PyMC improves workflow transparency.
- ArviZ 1.0 is being completely rewritten for better usability.
- Prior elicitation is crucial in Bayesian modeling.
- Point intervals and forest plots are effective for visualizing complex data.
Chapters:
00:00 Introduction to Osvaldo Martin and Bayesian Statistics
08:12 Exploring Bayesian Additive Regression Trees (BART)
18:45 Prior Elicitation and the PreliZ Package
29:56 Teaching Bayesian Statistics and Future Directions
45:59 Exploring Prior Predictive Distributions
52:08 Interactive Modeling with PreliZ
54:06 The Evolution of ArviZ
01:01:23 Advancements in ArviZ 1.0
01:06:20 Educational Initiatives in Bayesian Statistics
01:12:33 The Future of Bayesian Methods
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin...
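As a companion to the BART takeaways, here is a minimal PyMC-BART regression sketch. It assumes the pymc-bart package with its BART(name, X, Y, m=...) call pattern; the synthetic data and the number of trees are illustrative choices, not a recommendation from the episode.

```python
# Minimal PyMC-BART regression on synthetic data.
import numpy as np
import pymc as pm
import pymc_bart as pmb

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(200, 3))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + rng.normal(0, 0.2, 200)

with pm.Model() as model:
    # The sum-of-trees prior approximates the unknown regression function.
    mu = pmb.BART("mu", X, y, m=50)            # m = number of trees in the ensemble
    sigma = pm.HalfNormal("sigma", 1.0)
    pm.Normal("obs", mu=mu, sigma=sigma, observed=y)
    # pm.sample() assigns the specialized tree sampler to the BART part automatically.
    idata = pm.sample(draws=1000, tune=1000, chains=2, random_seed=1)
```

Because `mu` is just another variable in the model, it can be combined with other likelihoods and model components, which is the "mixing BART with other models" point in the takeaways.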
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!
Visit our Patreon page to unlock exclusive Bayesian swag ;)
Takeaways:
- Effective data science education requires feedback and rapid iteration.
- Building LLM applications presents unique challenges and opportunities.
- The software development lifecycle for AI differs from traditional methods.
- Collaboration between data scientists and software engineers is crucial.
- Hugo's new course focuses on practical applications of LLMs.
- Continuous learning is essential in the fast-evolving tech landscape.
- Engaging learners through practical exercises enhances education.
- POC purgatory refers to the challenges faced in deploying LLM-powered software.
- Focusing on first principles can help overcome integration issues in AI.
- Aspiring data scientists should prioritize problem-solving over specific tools.
- Engagement with different parts of an organization is crucial for data scientists.
- Quick paths to value generation can help gain buy-in for data projects.
- Multimodal models are an exciting trend in AI development.
- Probabilistic programming has potential for future growth in data science.
- Continuous learning and curiosity are vital in the evolving field of data science.
Chapters:
09:13 Hugo's Journey in Data Science and Education
14:57 The Appeal of Bayesian Statistics
19:36 Learning and Teaching in Data Science
24:53 Key Ingredients for Effective Data Science Education
28:44 Podcasting Journey and Insights
36:10 Building LLM Applications: Course Overview
42:08 Navigating the Software Development Lifecycle
48:06 Overcoming Proof of Concept Purgatory
55:35 Guidance for Aspiring Data Scientists
01:03:25 Exciting Trends in Data Science and AI
01:10:51 Balancing Multiple Roles in Data Science
01:15:23 Envisioning Accessible Data Science for All
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!
Visit our Patreon page to unlock exclusive Bayesian swag ;)
Takeaways:
- CFA is commonly used in psychometrics to validate theoretical constructs (a toy example follows after these notes).
- Theoretical structure is crucial in confirmatory factor analysis.
- Bayesian approaches offer flexibility in modeling complex relationships.
- Model validation involves both global and local fit measures.
- Sensitivity analysis is vital in Bayesian modeling to avoid skewed results.
- Complex models should be justified by their ability to answer specific questions.
- The choice of model complexity should balance fit and theoretical relevance.
- Fitting models to real data builds confidence in their validity.
- Divergences in model fitting indicate potential issues with model specification.
- Factor analysis can help clarify causal relationships between variables.
- Survey data is a valuable resource for understanding complex phenomena.
- Philosophical training enhances logical reasoning in data science.
- Causal inference is increasingly recognized in industry applications.
- Effective communication is essential for data scientists.
- Understanding confounding is crucial for accurate modeling.
Chapters:
10:11 Understanding Structural Equation Modeling (SEM) and Confirmatory Factor Analysis (CFA)
20:11 Application of SEM and CFA in HR Analytics
30:10 Challenges and Advantages of Bayesian Approaches in SEM and CFA
33:58 Evaluating Bayesian Models
39:50 Challenges in Model Building
44:15 Causal Relationships in SEM and CFA
49:01 Practical Applications of SEM and CFA
51:47 Influence of Philosophy on Data Science
54:51 Designing Models with Confounding in Mind
57:39 Future Trends in Causal Inference
01:00:03 Advice for Aspiring Data Scientists
01:02:48 Future Research Directions
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy,
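To ground the CFA discussion, here is a toy one-factor Bayesian confirmatory factor analysis in PyMC: several observed survey items are assumed to load on a single latent construct, with the first loading fixed to 1 for identification. Item count, priors, and the identification choice are illustrative assumptions, not the guest's actual models.

```python
# Toy one-factor Bayesian CFA on synthetic survey items.
import numpy as np
import pymc as pm

rng = np.random.default_rng(0)
n, items = 300, 6
true_eta = rng.normal(size=n)
true_load = np.array([1.0, 0.8, 1.2, 0.9, 1.1, 0.7])
Y = true_eta[:, None] * true_load + rng.normal(0, 0.5, size=(n, items))

with pm.Model() as cfa:
    eta = pm.Normal("eta", 0, 1, shape=n)                    # latent factor scores
    lam_free = pm.Normal("lam_free", 1, 0.5, shape=items - 1)
    lam = pm.math.concatenate([np.ones(1), lam_free])        # fix first loading = 1
    nu = pm.Normal("nu", 0, 1, shape=items)                  # item intercepts
    psi = pm.HalfNormal("psi", 1, shape=items)               # residual item SDs
    mu = nu + eta[:, None] * lam
    pm.Normal("Y", mu=mu, sigma=psi, observed=Y)
    idata = pm.sample(1000, tune=1000, chains=2, target_accept=0.9, random_seed=1)
```

Fixing one loading (or the factor variance) is what makes the latent scale identifiable; watching for divergences during sampling is exactly the model-specification diagnostic mentioned in the takeaways.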
Tony, Kelsey, and special guest Mega Ran sat down for a live show at Thunderbird Lounge in Phoenix, AZ to celebrate Tony's birthday by talking about aliens in the ocean, conspiracy theories at Disney, and taking on a Horror Hot Sauce Trivia Challenge that ends badly for one of the contestants. Scaredycast is presented by Evil Izzy's Haunted Emporium in Phoenix, AZ! Head to Evil Izzy's for your spooky costume and make-up needs or grab some sweet horror merch! This episode is sponsored by: ValuSesh! Want to feel the vibes, but don't want to spend an arm and a leg? Sesh For Less and use code SCAREDY at checkout! If you're in Arizona, be sure to visit Polar Bear's Pop Culture Shop for all your retro toy collecting needs! Check out our YouTube, where you can now WATCH episodes of Scaredycast! And follow us on social! Become a PATRON to support the show and get spooky exclusive content! Original music by Mangy Bones. Get your horror movie news, reviews, and thoughts at HorrorPress.com! True crime, haunted happenings, UFO sightings, horror movies, and cryptid creatures. All the spooky you can endure inside one little horror podcast. Quench the thirst of your morbid curiosity when you check out Scaredycast! Visit Scaredycast.com for updates on the show, live show event dates, merch, and more!
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!
Visit our Patreon page to unlock exclusive Bayesian swag ;)
-------------------------
Love the insights from this episode? Make sure you never miss a beat with Chatpods! Whether you're commuting, working out, or just on the go, Chatpods lets you capture and summarize key takeaways effortlessly. Save time, stay organized, and keep your thoughts at your fingertips. Download Chatpods directly from the App Store or Google Play and use it to listen to this podcast today! https://www.chatpods.com/?fr=LearningBayesianStatistics
-------------------------
Takeaways:
- Epidemiology focuses on health at various scales, while biology often looks at micro-level details.
- Bayesian statistics helps connect models to data and quantify uncertainty.
- Recent advancements in data collection have improved the quality of epidemiological research.
- Collaboration between domain experts and statisticians is essential for effective research.
- The COVID-19 pandemic has led to increased data availability and international cooperation.
- Modeling infectious diseases requires understanding complex dynamics and statistical methods (see the sketch after these notes).
- Challenges in coding and communication between disciplines can hinder progress.
- Innovations in machine learning and neural networks are shaping the future of epidemiology.
- The importance of understanding the context and limitations of data in research.
Chapters:
00:00 Introduction to Bayesian Statistics and Epidemiology
03:35 Guest Backgrounds and Their Journey
10:04 Understanding Computational Biology vs. Epidemiology
16:11 The Role of Bayesian Statistics in Epidemiology
21:40 Recent Projects and Applications in Epidemiology
31:30...
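For the infectious-disease takeaway, here is a minimal discrete-time SIR simulation in NumPy: the deterministic transmission core that a Bayesian observation model for reported cases would typically wrap. Parameter values are illustrative, not from the episode.

```python
# Discrete-time SIR dynamics: susceptible -> infected -> recovered.
import numpy as np

def sir(beta=0.3, gamma=0.1, N=1_000_000, I0=10, days=200):
    S, I, R = N - I0, I0, 0
    out = np.zeros((days, 3))
    for t in range(days):
        new_inf = beta * S * I / N          # mass-action new infections per day
        new_rec = gamma * I                 # recoveries per day
        S, I, R = S - new_inf, I + new_inf - new_rec, R + new_rec
        out[t] = S, I, R
    return out

traj = sir()
print("peak infections:", int(traj[:, 1].max()), "on day", int(traj[:, 1].argmax()))
```

In a Bayesian workflow, `beta` and `gamma` would get priors and the simulated incidence would feed a noisy observation model for reported counts, which is how the "complex dynamics plus statistical methods" combination plays out in practice.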
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!
Visit our Patreon page to unlock exclusive Bayesian swag ;)
Takeaways:
- Bob's research focuses on corruption and political economy.
- Measuring corruption is challenging due to the unobservable nature of the behavior.
- The challenge of studying corruption lies in obtaining honest data.
- Innovative survey techniques, like randomized response, can help gather sensitive data (a minimal sketch follows after these notes).
- Non-traditional backgrounds can enhance statistical research perspectives.
- Bayesian methods are particularly useful for estimating latent variables.
- Bayesian methods shine in situations with prior information.
- Expert surveys can help estimate uncertain outcomes effectively.
- Bob's novel, 'The Bayesian Heatman,' explores academia through a fictional lens.
- Writing fiction can enhance academic writing skills and creativity.
- The importance of community in statistics is emphasized, especially in the Stan community.
- Real-time online surveys could revolutionize data collection in social science.
Chapters:
00:00 Introduction to Bayesian Statistics and Bob Kubinec
06:01 Bob's Academic Journey and Research Focus
12:40 Measuring Corruption: Challenges and Methods
18:54 Transition from Government to Academia
26:41 The Influence of Non-Traditional Backgrounds in Statistics
34:51 Bayesian Methods in Political Science Research
42:08 Bayesian Methods in COVID Measurement
51:12 The Journey of Writing a Novel
01:00:24 The Intersection of Fiction and Academia
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell,...
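The randomized-response technique mentioned in the takeaways is easy to sketch. Under a forced-response design (one plausible variant, not necessarily the exact one used in Bob's surveys), each respondent answers the sensitive question truthfully with probability p and otherwise gives a die-determined random answer, so the observed "yes" rate can be inverted for the true prevalence. The design probability and survey counts below are made up.

```python
# Forced-response randomized-response design with a Bayesian prevalence estimate.
import numpy as np
import pymc as pm

p_truth = 2 / 3                 # probability a respondent answers the real question
n, yes_count = 500, 210         # hypothetical survey results

with pm.Model() as rr_model:
    pi = pm.Beta("pi", 1, 1)                         # latent prevalence of the sensitive behavior
    p_yes = p_truth * pi + (1 - p_truth) / 2         # P(observed "yes") under the design
    pm.Binomial("yes", n=n, p=p_yes, observed=yes_count)
    # Quick plug-in check of the posterior: (yes_count/n - (1 - p_truth)/2) / p_truth
    idata = pm.sample(1000, tune=1000, chains=2, random_seed=1)
```

The randomization gives respondents plausible deniability for any single answer, while the known design probability still lets the model recover the population-level quantity, which is why it suits questions people are reluctant to answer honestly.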