- Videos · 47
- 158,602 views
Tom Faulkenberry
Joined Dec 7, 2011
"A hierarchical Bayesian shifted Wald model with censoring" -- Virtual MathPsych 2023
This is my talk for Virtual MathPsych 2023. The Github repo I mentioned in the talk is here: github.com/tomfaulkenberry/censoredShiftedWald
151 views
Videos
Bayesian statistics -- Lecture 5 -- Bayesian t-tests
2.9K views · 2 years ago
Bayesian statistics Lecture 5 Bayesian t-tests In this video, we walk through the basics of the Bayesian t-test, paying particular attention to the default Cauchy prior and where it comes from. Then I demonstrate a full Bayesian analysis of a t-test in JASP. Along the way, I provide a template for writing the results from the output given in JASP. Some links: - the R code I used for the plots i...
Bayesian statistics -- Lecture 4 -- Bayesian inference for correlations in JASP
1.1K views · 2 years ago
Bayesian statistics Lecture 4 Bayesian inference for correlations in JASP In this video, we review the definition of correlation, discuss the Bayesian workflow, then demonstrate a full Bayesian analysis of a correlation in JASP. Along the way, I provide a template for writing the results from the output given in JASP. Some links: - the R code I used for the plots in the video: github.com/tomfau...
Bayesian statistics -- Lecture 3 -- Tools for computing Bayes factors
1.4K views · 2 years ago
Lecture 3 Tools for computing Bayes factors In this video, we review the definition of the Bayes factor, discuss how to transform posterior odds to posterior probability, introduce the Savage-Dickey method, and finally, we use Jasp to analyze our binomial data. Some links: - the R code I used for the plots in the video: github.com/tomfaulkenberry/courses/blob/master/canvas/bayes/lecture3.R - th...
Bayesian statistics -- Lecture 2 -- The language and concepts of Bayesian statistics
1.3K views · 2 years ago
Lecture 2 The language and concepts of Bayesian statistics In this video, I introduce Bayes' theorem and the language of Bayesian inference. The focus is on visualizing concepts rather than computation. Some links: - excellent intro to Bayes' Theorem (from 3Blue1Brown): ruclips.net/video/HZGCoVF3YvM/видео.html - the R code I used in the video: github.com/tomfaulkenberry/courses/blob/master/canv...
Bayesian statistics -- Lecture 1 -- Classical inference with the binomial model
1.4K views · 2 years ago
Lecture 1 - Classical inference with the binomial model In this video, I cover the elements of classical statistical inference using the binomial model. I present both the conceptual background as well as a hands-on example using R. Some links: - The "Emily Rosa" study: jamanetwork.com/journals/jama/fullarticle/187390 - the R code I used in the video: github.com/tomfaulkenberry/courses/blob/mas...
Bayesian statistics -- Lecture 0 -- Getting started / Installing R and Jasp
984 views · 2 years ago
Lecture 0 - Getting started / Installing R and Jasp In this video, I introduce the course and walk through how to install R and Jasp. www.r-project.org rstudio.com jasp-stats.org Course: PSYC 4390/5090: Bayesian Statistics Dr. Tom Faulkenberry (Tarleton State University)
Experimental Design Lecture 9 - Mediation analysis
409 views · 3 years ago
Lecture 9 - Mediation analysis In this video, we introduce mediation analysis and discuss both estimation of paths in mediation models, as well as how to test mediation hypotheses. Some links: - grade data: raw.githubusercontent.com/tomfaulkenberry/courses/master/canvas/5301/grades.csv - David Kenny's website: davidakenny.net/cm/mediate.htm Course: PSYC 5316: Advanced Quantitative Methods & Exp...
Psychometrics - Lecture 9 - Structural equation modeling
709 views · 3 years ago
Lecture 9: Structural equation modeling Here, we learn how to use the SEM module in JASP to build, test, and modify structural equation models. In addition, we discuss using the BIC as a tool for model comparison. Note: the beginning of the video refers to this as Lecture 10; you can ignore this! Course: PSYC 4301/5322: Tests and Measurements / Psychometrics Dr. Tom Faulkenberry (Tarleton State ...
Experimental Design Lecture 8 - Linear regression models
415 views · 3 years ago
Lecture 8 - Linear regression models In this video, we introduce the basic linear regression model as a tool for (1) prediction, and (2) testing relationships between variables. Course: PSYC 5316: Advanced Quantitative Methods & Experimental Design Dr. Tom Faulkenberry (Tarleton State University)
Psychometrics - Lecture 8 - Confirmatory factor analysis
722 views · 3 years ago
Lecture 8: Confirmatory factor analysis Here, we learn how to use JASP to build and test factor analysis models. Also, we talk about the mechanics of path diagrams. Note: the beginning of the video refers to this as Lecture 9; you can ignore this! Course: PSYC 4301/5322: Tests and Measurements / Psychometrics Dr. Tom Faulkenberry (Tarleton State University)
Experimental Design Lecture 7 - Bayesian analysis of covariance (ANCOVA)
1.8K views · 3 years ago
Lecture 7 - Bayesian Analysis of covariance Building on Lecture 6, we talk about Bayesian ANCOVA in JASP and hopefully remove some of the mystery around the different output tables. Some links: - Lecture 6 video: ruclips.net/video/Gt4EjtBORU8/видео.html - my JASP blog post: jasp-stats.org/2020/11/26/how-to-do-bayesian-linear-regression-in-jasp-a-case-study-on-teaching-statistics/ Course: PSYC 5...
Psychometrics - Lecture 7 - Exploratory factor analysis
970 views · 3 years ago
Lecture 7: Exploratory factor analysis Here, we learn how to use JASP to uncover "clusters" of related items in a scale, which we interpret as the multiple "dimensions" of a psychological construct. Note: the beginning of the video refers to this as Lecture 8; you can ignore this! Course: PSYC 4301/5322: Tests and Measurements / Psychometrics Dr. Tom Faulkenberry (Tarleton State University)
Gamma function approximations for computing Bayes factors - MAA Texas Section 2021
149 views · 3 years ago
In this talk, I describe some recent work on using approximations of gamma functions to compute closed-form Bayes factors in two-group designs.
Experimental Design Lecture 6 - Analysis of covariance
533 views · 3 years ago
Lecture 6 - Analysis of covariance Here we discuss how to compare group means while simultaneously accounting for (or "controlling for") group differences on some other variable (i.e., a covariate). Course: PSYC 5316: Advanced Quantitative Methods & Experimental Design Dr. Tom Faulkenberry (Tarleton State University)
Psychometrics - Lecture 5 - Estimating reliability coefficients
988 views · 3 years ago
Experimental Design Lecture 5 - Factorial analysis of variance
402 views · 3 years ago
Experimental Design Lecture 4 - Repeated measures analysis of variance
279 views · 3 years ago
Psychometrics - Lecture 4 - Introduction to reliability and classical test theory
2K views · 3 years ago
Psychometrics - Lecture 3 - statistical properties of composite scores
872 views · 3 years ago
Experimental Design Lecture 3 - Inference for single factor designs
289 views · 3 years ago
Experimental Design Lecture 2 - Introduction to Bayesian inference
374 views · 3 years ago
Psychometrics - Lecture 2 - measuring variability and association
897 views · 3 years ago
Experimental Design Lecture 1 - Review of classical inference
640 views · 3 years ago
Psychometrics - Lecture 1 - Computing percentile ranks
2.7K views · 3 years ago
PSYC 2317 - Lecture 10 - Bayesian hypothesis testing
9K views · 3 years ago
PSYC 2317 - Lecture 9 - Doing tests with JASP / Intro to Analysis of Variance
1.3K views · 3 years ago
PSYC 2317 - Lecture 8 - Confidence intervals for t-test designs
1.6K views · 3 years ago
PSYC 2317 - Lecture 7 - The independent samples t-test
1.4K views · 3 years ago
PSYC 2317 - Lecture 6 - Introduction to the t-test
1.7K views · 3 years ago
Such a comprehensive lecture on ANCOVA. The best I have found so far.
Hi, Dr Faulkenberry. Your video is so helpful. Thank you so much.
Thank you for explaining statistical concepts so I can understand them.
Thank you so much. I finally understood, thanks to your explanations and examples. Looking forward to watching the other videos. I am planning to start psychology in one year.
The major problem with the video is that it brings in software just when the explanation was going smoothly. Why can't you solve the problem manually with the details? It is easy to explain: calculate the probability of observing the data under the null (<= 50) and alternative (> 50) using the t-distribution and take the ratio, using the posterior distribution of the mean. So some explanation of the prior and posterior distribution is required.
Thanks for taking the time to leave your comment! I agree...conceptually...that it *should* be this simple. The issue is that calculating the marginal likelihood (i.e., p(data | H1)) is not easy at all. In this case, it requires integrating the likelihood over the prior distribution, which almost always requires a software solution (because the integrals rarely admit closed form solutions...though when they do, it's very nice!). And, because this is an introductory video in the context of a course and book where the software (PsyStat app) has been used throughout, using it for these Bayesian tests is (I think) a natural thing to do.
@@TomFaulkenberry I agree that software is required to do the marginals. What I personally like is to bring the theory and derivations fully up to the point of numerical calculation and then leave the finer details of the calculations to the software. I understand that it is an introductory course. Thanks for your reply.
Watching these now for my 500 level class and I WISH I would have had this class as my intro to stats in undergrad. You’re making it so easy to understand.
Very clear! Thank you very much!
Thank you so much, these videos are really helpful and you are a really good teacher!
Loved the video, except for the Arsenal banner above the blackboard... Overall still a great video though, haha
Can you show me how to prove the formula you mentioned: rho_xt = sigma_t / sigma_x? I am very interested in the process of proving it but I don't know how to start. ❤
Hey, I wonder: can we use this to compare different kinds of ML models, such as logistic regression vs. random forest? They are all parametric models, so can this still work?
Hellas flag behind, I LOLed! Go Greece!
Question: do you run into problems with the falsification paradigm when you say that evidence supports the alternative hypothesis?
very deep question...thanks! The idea of using Bayes factors to support a hypothesis (rather than rejecting) is typically seen as counter to a Popperian view of science (the falsification paradigm). Gelman's (2011) paper "Induction and deduction in Bayesian data analysis" is a good read on this...
@@TomFaulkenberry Thank you. As an outsider (I am definitely not a statistician) I started to realize that the Bayesian approach *seemed* counter to the Popperian paradigm, but haven't seen it discussed. I'll definitely give that paper a read.
Hello Tom, what is the most important point I would need to know as a statistical methods in psych student? I plan to take this course in the Fall, but not sure what to expect to prepare for success.
great question! The most important thing to know is that *you can do this!*. Just read, go slowly through the basics, and keep practicing...
@TomFaulkenberry Thank you for your words of encouragement, I will keep this in mind.
thanks, very useful for my thesis
Thank you so much for making these videos!
Taking Statistics for Behavioral Sciences and your book and this lecture has helped me tremendously!! Thank you!
Now there's a mediation module in JASP
Dr Faulkenberry, could you please upload the remaining lectures? I find these lectures extremely helpful, I'm also preparing for an exam, and your lectures are helping me a lot.
Doesn’t the bayes factor also depend on the power for H1? You need to specify a predicted effect size for H1, right? Or some kind of prior
I did my statistics course 30 years ago and life took me down a different path. Now I have decided to apply to a master's degree in psychology and needed some refreshment. It's just what I needed. Do you also have a playlist for regression? Of course I'm going to sit and learn the material more thoroughly, and I hope to pass the test successfully. Thank you, Tom!
thank you for sharing this! I do have some videos about linear regression (both traditional and Bayesian). The best place to look is in my playlist on Experimental Design.
@@TomFaulkenberry Thank you! I will! You have an excellent pedagogical intelligence! (if this is the right word?). It's unbelievable how the same material can be easy to understand with the right teacher. I'm so grateful for all the professors in the USA who share academic level courses . Bless you!
This is by far the best lecture on Bayesian analysis: short, clear, easy to understand, and packed with information. You make statistics seem easy and fun.
I felt the need to tell you how much I appreciate the time, effort, and energy involved with your creation of these statistical videos. They are truly helpful and informative without being overwhelming. I have found that they facilitate my ability to learn and understand the logic behind statistics so that I am significantly better prepared for my grad school stats modules than I would be without them. I have even shared your videos with my peers and friends who are at other universities. You are amazing. I appreciate you. Your videos make a difference. Thank you for contributing to my success in statistics.
thank you for sharing this with me! I appreciate your comments, and I'm glad you found the videos helpful.
Thanks!
I know this video is a few years old now, but I want to thank you for this. I am taking an evaluation and assessment course for my graduate psychology course and after reading the chapters on this I was lost. After viewing this video it could not have been more clear. I owe you my life, thank you!
thank you! I appreciate your comment, and I'm glad to hear that the video is useful!
Find it hard to read textbooks myself and thanks so much for these videos!
Can you explain how to use jamovi?
Tom, hi. I might be wrong, but at 0:54 SS2 is wrong: SS2 = (43-45)^2 + (49-45)^2 + (35-45)^2 + (51-45)^2 = 4 + 16 + 100 + 36 = 156, not 184. All the following calculations depend on that. SS1, which is computed the same way, is correct.
haha...not only that, but the mean is incorrect too! Thanks for pointing this out, and I'm a bit embarrassed to have never gone back and caught this error. On one hand, the calculations in the video are self-consistent, as they are done completely from the summaries that I gave. The complication that arises is if someone tries to do this example in software, they certainly won't get the same answer.
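The arithmetic in this exchange can be checked in a few lines of Python (the data values are taken from the comment above; the variable names are mine):

```python
# Verify the comment's arithmetic: group 2 data as shown around 0:54.
group2 = [43, 49, 35, 51]

# The true mean is 178 / 4 = 44.5, not the 45 used in the video
mean2 = sum(group2) / len(group2)

# SS computed with the video's mean of 45: 4 + 16 + 100 + 36 = 156
ss2_video_mean = sum((x - 45) ** 2 for x in group2)

# SS computed with the correct mean: 2.25 + 20.25 + 90.25 + 42.25 = 155.0
ss2_true_mean = sum((x - mean2) ** 2 for x in group2)

print(mean2, ss2_video_mean, ss2_true_mean)  # 44.5 156 155.0
```

So both points stand: with the mean used in the video the sum of squares is 156 (not 184), and the mean itself should have been 44.5.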
great teacher!! thank you!
really helpful, thank you!!
Can you explain the prior probability distributions for H0 and H1? While H0 is delta == 50, H1 is delta != 50. I can see the prior distribution for H0 could be a normal distribution with mean = 50 and sd = something really tight, but I don't know about the prior for H1. Perhaps I am overthinking….
Is this applicable for z-test or test with proportions as well?
For z-test, the answer is "yes", because the z-test and t-test are really the same thing (the only difference is that a z-test uses a known value for SD, whereas the t-test uses an estimated value...for the Bayes factor, you can input the z-score in as the t-score). For proportions, you could use a normal approximation. While there are some technical concerns about using t-tests in this case, it is unclear what effect this has on Bayes factors, so this might be a good research question!
in case anyone takes issues with my saying that z and t are the "same thing", I'm talking about the form of the test statistic, not the assumptions of the test, etc. For the Bayes factor calculation context, the form of the test statistic is what matters.
@@TomFaulkenberry really appreciate the reply and the videos that you've created
You have to be the best statistics teacher ever
Could you show the formulas for converting t-stats to bayes factor, or are there library functions in Python or R?
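There are library functions: in R, for example, the BayesFactor package can compute a Bayes factor directly from a t statistic (e.g., `ttest.tstat`). A simple closed-form alternative is the BIC approximation to the Bayes factor, which needs only t, the sample size, and the degrees of freedom. A hedged sketch in Python (this is the BIC approximation, not necessarily the exact default Bayes factor used in the videos; the function name and example values are mine):

```python
# BIC approximation to the Bayes factor from a t statistic:
#   BF01 ~= sqrt(n) * (1 + t^2/df)^(-n/2)
# Note this approximates BF01 (evidence for H0 over H1);
# take the reciprocal for BF10.
import math

def bf01_bic(t, n, df):
    """BIC approximation to BF01 from a t statistic and sample size n."""
    return math.sqrt(n) * (1 + t ** 2 / df) ** (-n / 2)

t, n = 2.5, 20              # illustrative one-sample example, df = n - 1
bf01 = bf01_bic(t, n, n - 1)
print(bf01, 1 / bf01)       # BF01 and BF10
```

At t = 0 the formula reduces to sqrt(n), i.e., evidence for the null that grows with sample size, which matches the usual behavior of Bayes factors.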
Very helpful example for me. Wouldn’t it be helpful to add the corresponding p-value calculation to make it easier for frequentists to accept?
❤
Thank you! this was a great help in conducting Bayesian Ancova.
You're a great professor. I can tell you really care about the work you're doing. Thank you for that! Students can feel when a professor truly cares.
Thank you for this lecture. It helps a lot.
Just discovered these videos. Thanks very much Doc!
Thank you for posting your amazing formula app! This has taken such a burden off my brain! I appreciate you posting these videos as you do such a superb job explaining these concepts. I will actually pass my Psych Statistics class because of you! THANK YOU THANK YOU THANK YOU!!!
These lectures are so clear and not at all jargony. HUGE kudos to you! These are awesome!!
Hey, can you share the file link for SAQ8 here?
here you go! github.com/tomfaulkenberry/courses/blob/master/canvas/5322/SAQ8.csv
Infinitely thankful you've uploaded these lectures on youtube! I've gone through most of your bayesian videos, and they've made me more comfortable using bayesian analysis for my projects.
This seems easy enough :'D the formulas are just made to look scary. I'm starting this course next week, wish me luck!
You are a blessing🫶🏽