- 53 videos
- 77,336 views
Bill Basener
Joined Jan 11, 2019
M1 Code Walkthrough
Walking through the code for module 1 in Statistical Learning for Remote Sensing. We take the time to discuss details of many common Python commands, especially for working with image arrays and plots.
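Since the walkthrough centers on Python commands for image arrays and plots, here is a minimal sketch of the typical patterns; the cube shape, band index, and file name are invented for illustration and are not taken from the module code:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # render without a display
import matplotlib.pyplot as plt

# Image cubes are commonly stored as (rows, cols, bands) numpy arrays.
im = np.random.rand(100, 120, 50)

band = im[:, :, 10]       # one band as a 2-D grayscale image
spectrum = im[40, 60, :]  # the full spectrum of one pixel

# Flatten to a (pixels x bands) matrix for per-band statistics.
X = im.reshape(-1, im.shape[2])

fig, (ax1, ax2) = plt.subplots(1, 2)
ax1.imshow(band, cmap="gray")
ax1.set_title("Band 10")
ax2.plot(spectrum)
ax2.set_title("Pixel (40, 60) spectrum")
fig.savefig("band_and_spectrum.png")
```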
Views: 36
Videos
HyperspectralPy - Open image and Create Regions of Interest
45 views · 6 months ago
This is a quick tutorial on how to open a hyperspectral image, create ROIs, and view a spectral library using the open source HyperspectralPy GUI-based software that can be installed via pip install.
YOLO8 Object Detection with LiDAR - Part 4
397 views · 10 months ago
This is video #4 in our series on object detection in LiDAR data with YOLO8. I will demonstrate how to train your own YOLO8 model on your labeled LiDAR data.
YOLO8 Object Detection with LiDAR - Part 3
502 views · 10 months ago
This is video #3 in our series on object detection in LiDAR data with YOLO8. I will demonstrate how to label LiDAR raster files using Roboflow.
YOLO8 Object Detection with LiDAR - Part 2
685 views · 10 months ago
This is video #2 in our series on object detection in LiDAR data with YOLO8. I will demonstrate how to convert a LiDAR point cloud to a raster file for use in YOLO8.
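The core of the point-cloud-to-raster step can be sketched in plain numpy. Real LiDAR files would be read with a library such as laspy; the points, cell size, and max-height rule below are illustrative assumptions, not the video's yololidarTool code:

```python
import numpy as np

def points_to_raster(x, y, z, cell=1.0):
    """Bin scattered (x, y, z) points into a grid, keeping the maximum
    height per cell -- a simple digital surface model."""
    cols = ((x - x.min()) / cell).astype(int)
    rows = ((y - y.min()) / cell).astype(int)
    raster = np.full((rows.max() + 1, cols.max() + 1), np.nan)
    for r, c, h in zip(rows, cols, z):
        if np.isnan(raster[r, c]) or h > raster[r, c]:
            raster[r, c] = h
    return raster

# Synthetic stand-in for a point cloud: 1000 points over a 10 m x 10 m tile.
rng = np.random.default_rng(0)
x, y = rng.uniform(0, 10, 1000), rng.uniform(0, 10, 1000)
z = rng.uniform(0, 5, 1000)
dsm = points_to_raster(x, y, z)
```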
YOLO8 Object Detection with LiDAR - Part 1
389 views · 10 months ago
This is video #1 in our series on object detection in LiDAR data with YOLO8. I will demonstrate how to download LiDAR data from the USGS website.
TensorFlow Tutorial Pt.1
224 views · 1 year ago
Demo of how to code and optimize neural networks in TensorFlow. In this first part we discuss a classification network. All code is available in my GitHub at: github.com/wbasener/Neural-Netork-From-Scratch-in-Python/blob/main/M2_6_Tutorial_neural_nets_with_keras.ipynb. I assume you know what a neural network is, but little prior coding is required. The goal is to get over the initial challenge ...
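As a taste of what such a classification network computes, here is a forward pass written from scratch in numpy rather than TensorFlow; the layer sizes and weights are made up, and the video's actual model is in the linked notebook:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def softmax(x):
    e = np.exp(x - x.max(axis=1, keepdims=True))  # subtract max for stability
    return e / e.sum(axis=1, keepdims=True)

# Two dense layers, like a tiny Keras Sequential classifier:
# Dense(16, activation="relu") -> Dense(3, activation="softmax")
W1, b1 = rng.normal(0, 0.1, (4, 16)), np.zeros(16)
W2, b2 = rng.normal(0, 0.1, (16, 3)), np.zeros(3)

def predict(X):
    return softmax(relu(X @ W1 + b1) @ W2 + b2)

probs = predict(rng.normal(size=(5, 4)))  # 5 samples, 4 features, 3 classes
```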
TensorFlow Tutorial Pt.2
21 views · 1 year ago
Demo of how to code and optimize neural networks in TensorFlow - Part 2 focuses on regression using a California house price dataset. All code is available in my GitHub at: github.com/wbasener/Neural-Ne.... I assume you know what a neural network is, but little prior coding is required. The goal is to get over the initial challenge in coding using TensorFlow that many people encounter. We go ov...
TensorFlow Tutorial Pt.3
23 views · 1 year ago
Demo of how to code and optimize neural networks in TensorFlow - Part 3 focuses on optimizing parameters using random grid search and Bayesian optimization. All code is available in my GitHub at: github.com/wbasener/Neural-Ne.... I assume you know what a neural network is, but little prior coding is required. The goal is to get over the initial challenge in coding using TensorFlow that many peo...
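Random grid search itself is only a few lines. This sketch uses a toy stand-in for the validation loss; the search space and loss function are invented for illustration:

```python
import random

random.seed(0)

def val_loss(lr, units):
    # Stand-in for "train a network, return validation loss";
    # in practice this would build and fit a TensorFlow model.
    return (lr - 0.01) ** 2 + (units - 64) ** 2 / 1e4

space = {"lr": [1e-4, 1e-3, 1e-2, 1e-1], "units": [16, 32, 64, 128]}

best, best_loss = None, float("inf")
for _ in range(10):  # evaluate 10 random points of the 16-point grid
    trial = {k: random.choice(v) for k, v in space.items()}
    loss = val_loss(**trial)
    if loss < best_loss:
        best, best_loss = trial, loss
```

Bayesian optimization replaces the `random.choice` draws with a model that proposes promising points based on the losses seen so far.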
Bills D Smother Ravens - were they good or lucky?
72 views · 3 years ago
In this video we take a look at the Bills' performance in 2020 and see what most predictions of the Bills-Ravens game got wrong. Hint - the Bills have a great defense now, and we can see it is a trend by peeking a little into their season stats. (Also, sorry about the audio. I used a headset instead of my mic to avoid some background noise.)
Josh Allen, by the Numbers
428 views · 3 years ago
Who are the best QBs in the NFL? Is Josh Allen elite? We take a look at the numbers to see who makes the cut after the 13th week of the 2020 season. I am a Prof. of Data Science at the University of Virginia and Emeritus Prof. of Math at the Rochester Institute of Technology.
Machine Learning 10.1 - Exploratory Data Analysis
111 views · 3 years ago
In this video, you will learn tools for exploratory data analysis. These tools allow a person to view data and look for trends and structures. Here, you will explore the terminology and goals for visualizations and unsupervised learning.
Machine Learning 10.2 - PCA Visualizations
212 views · 3 years ago
In this video, we will use PCA (Principal Components Analysis) for dimension reduction and to view high-dimensional data. We used principal components in Module 4 as part of the underlying math for Gaussian regression methods, and we used PCA for regularization in Module 6. Principal components provide a useful mathematical framework for modeling/measuring the shape of data, and, in this module...
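The mechanics of PCA for dimension reduction fit in a few numpy lines. The synthetic data here is invented: nearly rank-2 in 5 dimensions, so two components capture most of the variance:

```python
import numpy as np

rng = np.random.default_rng(1)
# 200 samples in 5 dimensions, generated from 2 latent factors plus noise.
X = rng.normal(size=(200, 2)) @ rng.normal(size=(2, 5)) \
    + rng.normal(0, 0.1, (200, 5))

Xc = X - X.mean(axis=0)               # center the data
C = np.cov(Xc, rowvar=False)          # 5x5 covariance matrix
vals, vecs = np.linalg.eigh(C)        # eigenvalues in ascending order
order = np.argsort(vals)[::-1]        # re-sort descending by variance
vals, vecs = vals[order], vecs[:, order]

scores = Xc @ vecs[:, :2]             # data projected onto the first 2 PCs
explained = vals[:2].sum() / vals.sum()
```

A scatter plot of `scores` is the usual 2-D visualization of the high-dimensional data.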
Machine Learning 9.4 - R Lab Support Vector Machines
182 views · 3 years ago
Machine Learning 9.3 - Support Vector Machines
338 views · 3 years ago
Machine Learning 9.1 - Maximum Margin Classifier
6K views · 3 years ago
Machine Learning 9.2 - Soft Margins and the Support Vector Classifier
415 views · 3 years ago
Machine Learning 8.5 - R Lab, Random Forest and Tree Ensembles
146 views · 3 years ago
Machine Learning 8.4 - Boosting Ensembles
110 views · 3 years ago
Machine Learning 8.2 - Random Forests
192 views · 3 years ago
Machine Learning 7.4 - R Lab, Decision Trees
280 views · 3 years ago
Machine Learning 7.3 - Advantages and Disadvantages of Trees
165 views · 3 years ago
Machine Learning 7.2 - Classification Trees
140 views · 3 years ago
Machine Learning 7.1 - Regression Trees
190 views · 3 years ago
Machine Learning 6.4 - R Lab, Nonlinear Regression
119 views · 3 years ago
Machine Learning 6.3 - Generalized Additive Models
749 views · 3 years ago
Machine Learning 6.2 - Regression Splines and Local Regression
903 views · 3 years ago
Machine Learning 6.1 - Polynomial Regression and Step Functions
824 views · 3 years ago
Thank you for your explanation. I also think at 8:15 the multivariate normal distribution's probability density function should have $\sqrt{|\Sigma|}$ in the denominator (rather than $|\Sigma|$ as you have currently) and it also may be helpful to viewers to let them know that $p$ represents the dimension of the space we are considering
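For reference, the corrected density the comment describes, with p the dimension of the space:

```latex
f(\mathbf{x}) =
  \frac{1}{(2\pi)^{p/2}\,\sqrt{|\Sigma|}}
  \exp\!\left(
    -\tfrac{1}{2}
    (\mathbf{x}-\boldsymbol{\mu})^{\top} \Sigma^{-1} (\mathbf{x}-\boldsymbol{\mu})
  \right)
```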
Thank you so much! 🙏🙏👍👍❤️❤️Are you able to provide slides for your videos, Prof Basener?
Super clear and simple. Thanks!
This beats my MIT lecture. Will be coming back for more!
can you share these slides in the videos with me?
Good explanation. Funny that whenever he received an email notification I went to check my inbox ='')
I enjoyed watching your video, thank you. I will watch more of your machine learning videos, thank you!
Here is the link to the download site: apps.nationalmap.gov/lidar-explorer/#/
Sir, where can I find yololidarTool.py? Can you provide that file?
Thank you for sharing this. RF and SVM are the way to go with point clouds.
❤❤
Very great video! Thank you professor!! :)
Thank you
How do you get the values of 0.15 and 0.02? I'm getting different values.
Agreed. I got approximately 0.18 and 0.003, respectively.
A very good and concise explanation, even starting with the explanation of likelihood. Very well done!
Why do the stepwise functions have diagonals (slope != 0) joining the parts? shouldn't they all be joined by vertical lines, since they are continuous and yield either a 0 or a constant value?
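The diagonals are a plotting artifact rather than part of the step function: `plt.plot` connects sampled points with straight segments, so a jump that falls between two coarse sample points is drawn as a slanted line. Evaluating the function on a dense grid makes the jumps look vertical. A small sketch with invented knots and values:

```python
import numpy as np

def step(x, knots=(2, 4, 6), values=(0, 1, 3, 2)):
    """Piecewise-constant step function: values[i] on the i-th interval."""
    return np.asarray(values)[np.searchsorted(knots, x)]

coarse = np.linspace(0, 8, 9)    # few samples: plotted jumps look slanted
dense = np.linspace(0, 8, 801)   # many samples: jumps look vertical
y_coarse, y_dense = step(coarse), step(dense)
```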
The NFL is changing Bill! Let's up the weight on rushing yards (...I'll admit I'm a Baltimore fan).
Perfect
Thanks!
Excellent
10:48 ohhhhh, I was just going back and forth between the sections on LDA and QDA in three different textbooks (An Introduction to Statistical Learning, Applied Predictive Analytics, and Elements of Statistical Learning) for well over an hour and that multivariate normal pdf was really throwing me off big time. Mostly because of the capital sigma to the negative 1st power term, I didn't realize it was literally a capital sigma, I kept thinking it was a summation of something!
Good job. It is very easy to follow and understand
I was trying to read it myself, but you made it so much simpler.
Thanks! I am glad it was helpful.
Good explanation. I hope there is always an example of the implementation.
yoooo. This really helped me, my guy. Good work.
So glad it was helpful! Thanks!
Thanks, professor!
Very useful information, thank you professor!
I am glad it's helpful! Thanks for the kind words.
13:42 correction: higher p-values indicate poor (insignificant) predictors; predictors with low p-values are actually the good ones.
Thanks for pointing that out. You are exactly right - I said it the opposite of what I should have said!
9:15 you say we should expect 51% since up/(up+down) days equals 51%, but we should expect 50% accuracy with random guessing (via frequentist inference). 51% does not represent the number of times you correctly call the market up AND the number of times you correctly call the market down, which is what the confusion matrix captures. So (up days / (up days + down days)) does not represent accuracy; the confusion matrix gives accuracy as (up==up and down==down) over the total number of days. The confusion matrix is therefore not the same as your 51%; you cannot compare the 52% with the 51%.
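To make the distinction concrete: accuracy comes from the diagonal of the confusion matrix, while the up-day fraction is just a row sum over the total. A sketch with an invented confusion matrix:

```python
import numpy as np

#                 predicted Down  predicted Up
cm = np.array([[   35,            145],   # actual Down
               [   30,            290]])  # actual Up

n = cm.sum()
accuracy = np.trace(cm) / n    # (correct Down + correct Up) / all days
up_fraction = cm[1].sum() / n  # base rate of Up days, NOT an accuracy
```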
thanks, can you do a video on neural networks from ISLR textbook?
So basically, ridge/lasso regression penalize for the *size* of the coefficients while aic/bic subset selection penalizes for the *number* of coefficients
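That distinction can be written out directly: ridge adds λ·Σβ² (the size of the coefficients) to the fit error, while an AIC-style criterion adds 2k for k nonzero coefficients (the count). A toy comparison with invented data and coefficient vectors:

```python
import numpy as np

def rss(beta, X, y):
    return float(((y - X @ beta) ** 2).sum())

def ridge_loss(beta, X, y, lam):
    # Penalizes the SIZE of the coefficients.
    return rss(beta, X, y) + lam * float((beta ** 2).sum())

def aic(beta, X, y):
    # Gaussian-likelihood AIC: penalizes the NUMBER of coefficients.
    n, k = len(y), int(np.count_nonzero(beta))
    return n * np.log(rss(beta, X, y) / n) + 2 * k

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = X @ np.array([1.0, 0.0, -2.0]) + rng.normal(0, 0.1, 50)

full = np.array([1.0, 0.2, -2.0])    # keeps a small useless coefficient
subset = np.array([1.0, 0.0, -2.0])  # drops it instead
```

Both criteria prefer `subset` here, but for different reasons: ridge because the penalty shrinks with Σβ², AIC because k drops from 3 to 2.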
This is an excellent series. Thank you so much for taking the time to make these
very good video, thank you professor
I am glad it is helpful. Thank you for the kind words!
Thank you sir, well explained.
Thanks!
Blender!!! Shocked and Surprised !! Awesome 👍👍👍
You are so great. Keep up please.
could you share the slide?
Interesting and clear explanation! Thank you very much, this will help me in writing my thesis!
How did your thesis go?
Thank you for sharing this, and thumbs up for visualization in Blender :)
Hi! If the classes are assumed to be normally distributed, does that imply that the features making up an observation are normally distributed as well?
Yes. If each class has a multivariate normal distribution, then each individual feature variable has a single-variable normal distribution.
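In symbols, the marginal of each coordinate of a multivariate normal is univariate normal:

```latex
\mathbf{X} \sim \mathcal{N}_p(\boldsymbol{\mu}, \Sigma)
\quad\Longrightarrow\quad
X_i \sim \mathcal{N}(\mu_i, \Sigma_{ii}) \ \text{ for each } i
```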
Thank you so much! Cleared a lot of my doubts.
Thanks for this! I needed to clarify these methods in particular, was reading about them in ISLR
THANK YOU SO MUCH!!
Glad it was helpful!
Great video. Many thanks
How to classify LiDAR point cloud using machine learning in R.
Great video. I love crunching numbers and building reports like this. I'm definitely not anywhere close to being on your level considering you're a professor, but I've always felt like I should be in a similar field just based on my interests and skillset. How cool would it be to be a sports data analyst?
Passion and knowing your field are more important than education. I had a few students who now work for MLB teams as analysts, and part of me is jealous of them. I love that places like PFF put out enough stats that anyone can be an analyst now.