- Videos: 12
- Views: 139,717
StatMike
United States
Joined Sep 1, 2020
Hi, I'm Mike 👋
I am a lifetime learner with a background in statistics. The common denominator for 11 years as a statistician, 8 years in statistical software, and now a growing career in Google Cloud is that I love to learn almost as much as I love to share what I've learned to help others.
I am passionate about computational engineering and work tirelessly to broaden my exposure. I utilize a wide range of skills and tools to enable deeper inferential and predictive evaluations in highly creative ways. I believe cloud computing is the radical change of our generation that gives us a chance to skip rethinking what we do and instead think anew all over again.
End-to-End: Pipeline Orchestration (KFP) - BigQuery (BQML) Model For Endpoint Update [notebook 03C]
In [notebook 03a] we trained a model using BigQuery ML (BQML). In [notebook 03b] we used Vertex AI to upload the BQML model and deploy it to a live endpoint for online predictions. In this [notebook 03c] we build a Kubeflow Pipelines (KFP) pipeline to conditionally replace the model on the online endpoint with a better model. Vertex AI Pipelines runs Kubeflow pipelines as a managed service, so we don't have to worry about infrastructure. This makes it incredibly straightforward to build a pipeline that orchestrates all the steps of a challenger model scenario (a minimal code sketch follows the step list below):
➡️ Train a Challenger Model
➡️ Retrieve evaluation metrics for the Challenger Model
➡️ Retrieve evaluation metrics for ...
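For orientation, here is a minimal sketch of that conditional swap as a KFP v2 pipeline. The component bodies, the AUC metric, and the `champion_auc` input are illustrative assumptions, not the notebook's exact code:

```python
# Hypothetical challenger-pipeline skeleton (KFP v2); the real notebook
# pulls metrics from BQML's ML.EVALUATE and deploys via Vertex AI.
from kfp import dsl

@dsl.component(base_image="python:3.10")
def get_challenger_auc() -> float:
    # Placeholder: query the challenger model's evaluation metrics here.
    return 0.0

@dsl.component(base_image="python:3.10")
def deploy_challenger():
    # Placeholder: upload the model and swap it onto the endpoint here
    # (aiplatform.Model.upload + Endpoint.deploy in the real workflow).
    pass

@dsl.pipeline(name="bqml-challenger-pipeline")
def challenger_pipeline(champion_auc: float):
    challenger = get_challenger_auc()
    # Only replace the deployed model when the challenger scores higher.
    with dsl.Condition(challenger.output > champion_auc, name="challenger-wins"):
        deploy_challenger()
```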
Views: 3,222
Videos
End-to-End: ML with TensorFlow in Jupyter with Tensorflow I/O BigQuery Reader [notebook 05]
3.8K views · 2 years ago
An end-to-end workflow using a Jupyter Notebook hosted by Vertex AI Workbench to train an ML model with TensorFlow within the notebook. Training data is read using the TensorFlow I/O reader for BigQuery. Many deep explanations along the way, including using TensorBoard to evaluate the model training. The final model is deployed to a Vertex AI Endpoint and online predictions are demonstrated usin...
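The core of that BigQuery read looks roughly like this; the project, dataset, table, and column names below are placeholders, not necessarily the notebook's schema:

```python
# Sketch of the TensorFlow I/O BigQuery reader (names are assumed).
import tensorflow as tf
from tensorflow_io.bigquery import BigQueryClient

PROJECT_ID = "my-project"  # placeholder

client = BigQueryClient()
session = client.read_session(
    f"projects/{PROJECT_ID}",
    PROJECT_ID,
    "fraud_prepped",                     # table (assumed)
    "fraud",                             # dataset (assumed)
    selected_fields=["Amount", "Class"],
    output_types=[tf.float64, tf.int64],
    requested_streams=2,
)
# parallel_read_rows() returns a tf.data.Dataset of column dicts,
# ready to map into (features, label) tuples and batch for training.
ds = session.parallel_read_rows().batch(512)
```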
Part 2 - End-To-End: Pipeline Orchestration (KFP) - AutoML in Vertex AI for ML Ops [notebook 02c]
2K views · 2 years ago
Today we revisit a previous video and add a section for model evaluation using the Vertex AI API. See how easy it is to retrieve a vast array of evaluation metrics for AutoML models. The original video: ruclips.net/video/1gHJgY7AXAs/видео.html An end-to-end workflow using Pipelines within Vertex AI on Google Cloud Platform. We will use AutoML to train a machine learning model. A walkthrough of ...
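In SDK terms, that retrieval boils down to something like the following sketch; the project, location, and model resource ID are placeholders:

```python
# Hedged sketch: list AutoML evaluation metrics with the Vertex AI SDK.
from google.cloud import aiplatform

aiplatform.init(project="my-project", location="us-central1")  # assumed
model = aiplatform.Model("1234567890")  # placeholder model resource ID
for evaluation in model.list_model_evaluations():
    # For tabular classification, .metrics includes values such as
    # auPrc, auRoc, and logLoss.
    print(evaluation.metrics)
```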
Part 2 - End-To-End: Interactive Code (Python) - AutoML in Vertex AI for ML Ops [notebook 02b]
2.6K views · 2 years ago
Today we revisit a previous video and add a section for model evaluation using the Vertex AI API. See how easy it is to retrieve a vast array of evaluation metrics for AutoML models. The original video: ruclips.net/video/GOxHYfCLc6U/видео.html An end-to-end workflow using Python clients for Vertex AI on Google Cloud Platform. We will use AutoML to train a machine learning model. A walkthrough o...
ML with SQL in BigQuery to Online Predictions in Vertex AI for ML Operations [notebook 03b]
2.7K views · 2 years ago
An end-to-end workflow using the Python clients for Vertex AI on Google Cloud Platform. We export a model created with BigQuery ML and use it for online predictions in Vertex AI. This video follows the notebook 03b - Vertex AI BigQuery Machine Learning (BQML) - Online Predictions with BQML Models. The Notebook followed in this video is an older version - link for the version in the video: githu...
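A compact sketch of that export-then-deploy flow is below. The project, dataset, model, and bucket names are assumptions, and the serving container assumes the BQML model exported as a TensorFlow SavedModel:

```python
# Export a BQML model to GCS, upload it to Vertex AI, and deploy it to
# an endpoint for online predictions (all resource names are assumed).
from google.cloud import bigquery, aiplatform

bq = bigquery.Client(project="my-project")
bq.query(
    "EXPORT MODEL `my-project.fraud.fraud_lr` "
    "OPTIONS(URI = 'gs://my-bucket/models/fraud_lr')"
).result()

aiplatform.init(project="my-project", location="us-central1")
model = aiplatform.Model.upload(
    display_name="bqml-fraud-lr",
    artifact_uri="gs://my-bucket/models/fraud_lr",
    serving_container_image_uri=(
        "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-8:latest"
    ),
)
endpoint = model.deploy(machine_type="n1-standard-2")
# endpoint.predict(instances=[...]) then serves online requests.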
End-To-End: ML with SQL in BigQuery (BQML) [notebook 03a]
6K views · 2 years ago
An end-to-end workflow using the Python client for BigQuery on Google Cloud Platform. We use BigQuery ML to train a model using SQL! A walkthrough of all the steps from connecting to data sources, training a model, evaluating the final model, and requesting predictions from multiple clients. A few deep dives along the way including model explainability! This video follows the notebook 03a - Big...
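The heart of the workflow is a BQML CREATE MODEL statement run through the BigQuery Python client. A minimal sketch, with project, dataset, and column names as placeholders:

```python
# Illustrative BQML training and evaluation via the BigQuery client.
from google.cloud import bigquery

bq = bigquery.Client(project="my-project")
bq.query("""
CREATE OR REPLACE MODEL `my-project.fraud.fraud_lr`
OPTIONS (model_type = 'LOGISTIC_REG', input_label_cols = ['Class']) AS
SELECT * EXCEPT (transaction_id, splits)
FROM `my-project.fraud.fraud_prepped`
WHERE splits = 'TRAIN'
""").result()  # blocks until training completes

metrics = bq.query(
    "SELECT * FROM ML.EVALUATE(MODEL `my-project.fraud.fraud_lr`)"
).to_dataframe()
```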
End-To-End: Pipeline Orchestration (KFP) - AutoML in Vertex AI for ML Operations [notebook 02c]
10K views · 2 years ago
An end-to-end workflow using Pipelines within Vertex AI on Google Cloud Platform. We will use AutoML to train a machine learning model. A walkthrough of building a repeatable pipeline to orchestrate all the steps from connecting to data sources, training a model, evaluating the final model, deploying to an online endpoint and requesting predictions from multiple clients. A few deep dives along ...
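A compact sketch of this orchestration pattern using the prebuilt Google Cloud pipeline components; the display names, target column, and training budget below are assumptions:

```python
# Dataset -> AutoML training -> endpoint deploy, chained as a KFP
# pipeline with google-cloud-pipeline-components (names assumed).
from kfp import dsl
from google_cloud_pipeline_components.v1.dataset import TabularDatasetCreateOp
from google_cloud_pipeline_components.v1.automl.training_job import (
    AutoMLTabularTrainingJobRunOp,
)
from google_cloud_pipeline_components.v1.endpoint import (
    EndpointCreateOp,
    ModelDeployOp,
)

@dsl.pipeline(name="automl-tabular-pipeline")
def pipeline(project: str, bq_source: str):
    ds = TabularDatasetCreateOp(
        project=project, display_name="fraud", bq_source=bq_source
    )
    train = AutoMLTabularTrainingJobRunOp(
        project=project,
        display_name="fraud-automl",
        optimization_prediction_type="classification",
        dataset=ds.outputs["dataset"],
        target_column="Class",          # assumed label column
        budget_milli_node_hours=1000,
    )
    endpoint = EndpointCreateOp(project=project, display_name="fraud-endpoint")
    ModelDeployOp(
        endpoint=endpoint.outputs["endpoint"],
        model=train.outputs["model"],
        dedicated_resources_machine_type="n1-standard-2",
        dedicated_resources_min_replica_count=1,
        dedicated_resources_max_replica_count=1,
    )
```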
End-To-End: Interactive Code (Python) - AutoML in Vertex AI for ML Operations [notebook 02b]
7K views · 2 years ago
An end-to-end workflow using Python clients for Vertex AI on Google Cloud Platform. We will use AutoML to train a machine learning model. A walkthrough of all the steps from connecting to data sources, training a model, evaluating the final model, deploying to an online endpoint and requesting predictions from multiple clients. A few deep dives along the way including model explainability! This...
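The same flow in plain Python with the Vertex AI SDK looks roughly like this; the table and column names are placeholders:

```python
# AutoML tabular training and deployment with the Vertex AI SDK.
from google.cloud import aiplatform

aiplatform.init(project="my-project", location="us-central1")
dataset = aiplatform.TabularDataset.create(
    display_name="fraud",
    bq_source="bq://my-project.fraud.fraud_prepped",
)
job = aiplatform.AutoMLTabularTrainingJob(
    display_name="fraud-automl",
    optimization_prediction_type="classification",
)
model = job.run(
    dataset=dataset,
    target_column="Class",              # assumed label column
    budget_milli_node_hours=1000,
)
endpoint = model.deploy(machine_type="n1-standard-2")
# endpoint.predict(instances=[...]) returns online predictions.
```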
End-To-End: No Code - AutoML in Vertex AI for ML Operations [notebook 02a]
27K views · 2 years ago
An end-to-end workflow completely within the Vertex AI interface in the Google Cloud Console. We will use AutoML to train a machine learning model. A walkthrough of all the steps from connecting to data sources, training a model, evaluating the final model, deploying to an online endpoint and requesting predictions from multiple clients. A few deep dives along the way! This video follows the no...
Introduction - Vertex AI for ML Operations
40K views · 2 years ago
Introduction to the playlist of end-to-end workflow walkthroughs for machine learning operations using Google Cloud Platform's Vertex AI. GitHub Repository: github.com/statmike/vertex-ai-mlops
Timeline:
0:00 - Introduction
3:25 - The GitHub Repository
4:40 - Walkthrough List of Workflows
6:40 - Q&A - What is not covered
7:30 - Q&A - Are all the videos needed?
8:12 - Q&A - How do I learn ML?
12:...
Environment Setup - Vertex AI for ML Operations [notebook 00]
24K views · 2 years ago
A walkthrough of creating a Google Cloud Platform project and setting up the environment for this series of end-to-end workflows. This video follows the readme and the first notebook [00 - Environment Setup] in the repository. GitHub Repository: github.com/statmike/vertex-ai-mlops The Notebook followed in this video: github.com/statmike/vertex-ai-mlops/blob/main/00 - Setup/00 - Environment Setu...
Data Source - Vertex AI for ML Operations [notebook 01]
12K views · 2 years ago
A walkthrough of creating the data source for this project using BigQuery. We will import, review, and prepare the data for use in machine learning workflows. This video follows the notebook [01 - BigQuery - Table Data Source] in the repository. GitHub Repository: github.com/statmike/vertex-ai-mlops The Notebook followed in this video: github.com/statmike/vertex-ai-mlops/blob/main/01 - Data Sou...
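A sketch of the import-and-prepare step: copy a source table and add a random TRAIN/VALIDATE/TEST split column. The project and table names are placeholders; the series uses a public fraud-detection sample as its source:

```python
# Create the BigQuery dataset and prepped table used downstream.
from google.cloud import bigquery

bq = bigquery.Client(project="my-project")
bq.create_dataset("fraud", exists_ok=True)
bq.query("""
CREATE OR REPLACE TABLE `my-project.fraud.fraud_prepped` AS
SELECT *,
  GENERATE_UUID() AS transaction_id,
  CASE
    WHEN RAND() < 0.8 THEN 'TRAIN'     -- ~80% of rows
    WHEN RAND() < 0.5 THEN 'VALIDATE'  -- ~10% of rows
    ELSE 'TEST'                        -- ~10% of rows
  END AS splits
FROM `bigquery-public-data.ml_datasets.ulb_fraud_detection`
""").result()
```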
Amazing Mike, thank you very much for the course.
Mike, great video, and the GitHub repo is great. I noticed the current version of the repository (as of 10/09/2024) has some differences from the one taught in the class.
Great videos. I've learned a lot here. What I am asking myself: given the small number of frauds in the dataset, doesn't it make sense to verify that we've got a similar ratio of frauds in all 3 subsets after dividing the dataset? Because it is possible that the test data will contain only a very small number of frauds, or none.
Your videos helped me through my dissertation ❤ So detailed, in-depth, and knowledgeable.
In love with your work! High quality content! It would be great if you could create content on other topics too!
How do you schedule a batch prediction job, or predict a number of records at a time after deployment?
You deserve tons of cookies, Mike! I really appreciate your job! Thanks!
Hey, Mike! I have just finished this first Intro video and I want to thank you so much for this content! Right now, I'm looking for content to learn more model deployment on Cloud using Google Cloud and I believe I'll find such a rich content here and perhaps the answers that will help me solve the problems I'm facing right now. Thank you so much, and keep it up! The ML Community needs more people like you!
A great introduction to BigQuery. I am interested in how BigQuery compares against Snowflake these days, and against other products on the market.
Hi Mike, do you know why I keep getting quota limit exceeded errors?
I need to create a legal chatbot for the Indian context, so I need to collect textual data: the Indian Penal Code, various court verdicts in PDF format, IPC sections with their punishments, and similar registered crimes from any part of the country. How can I collect that?
Thanks Mike. Hope new videos are coming soon!
Hi, instead of using BigQuery, can we use Cloud Storage to store CSV datasets and then use them in the notebook? Looking forward to the CSV video. ❤
Absolutely! This video showed how to set up the TensorFlow I/O BigQuery reader, but since it's Python you can read from anywhere you have a connection to. And loading from GCS is made super easy with the automatic mounting: cloud.google.com/vertex-ai/docs/training/cloud-storage-file-system
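To make that mounting concrete: inside a Vertex AI training job, buckets appear under /gcs/, so a CSV reads like a local file. A tiny sketch, with the bucket and path as placeholders:

```python
# Reading a GCS-hosted CSV through the automatic /gcs/ FUSE mount
# available inside Vertex AI training jobs (path is a placeholder).
import pandas as pd

df = pd.read_csv("/gcs/my-bucket/data/fraud.csv")
```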
Hi Mike, nice video! Instead of storing the data in SQL, is it possible to store it in CSV?
Definitely! While the video shows BQ, it is also possible to use CSV in Cloud Storage. Here is a link that will help: cloud.google.com/vertex-ai/docs/tabular-data/classification-regression/prepare-data#import-source
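For completeness, a one-liner sketch of pointing a Vertex AI tabular dataset at a CSV in Cloud Storage instead of BigQuery (the project and URI are placeholders):

```python
# Tabular dataset from a CSV in Cloud Storage (URI is assumed).
from google.cloud import aiplatform

aiplatform.init(project="my-project", location="us-central1")
dataset = aiplatform.TabularDataset.create(
    display_name="fraud-csv",
    gcs_source=["gs://my-bucket/data/fraud.csv"],
)
```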
@@statmike-channel I would like to see a video on building a complete pipeline: fetching data from an API, storing it in BigQuery, using that data to train the model, and deploying it to an endpoint. Hopefully you will make one.
Hi Mike, thank you for your video. I have a question: is there a way to fetch data from an API and use it in Vertex AI instead of uploading a dataset? Also, how do I automate the fetching process, for example for something like a weather app?
I noticed this comment is on an AutoML video. For training you would need to pre-gather the data and store it. For serving you could create an endpoint that does fetching as part of the serving. I like using FastAPI with a custom container for prediction: cloud.google.com/vertex-ai/docs/predictions/use-custom-container
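A bare-bones sketch of that pattern as a FastAPI app for a Vertex AI custom prediction container. The fetch_weather() helper is hypothetical, standing in for "fetch external data at serving time"; Vertex AI supplies the AIP_* route environment variables:

```python
# Minimal FastAPI serving app for a Vertex AI custom container.
import os
from fastapi import FastAPI, Request

app = FastAPI()

def fetch_weather(location: str) -> dict:
    # Hypothetical helper: call an external weather API here.
    return {"temp_c": 20.0}

@app.get(os.environ.get("AIP_HEALTH_ROUTE", "/health"))
def health():
    return {"status": "ok"}

@app.post(os.environ.get("AIP_PREDICT_ROUTE", "/predict"))
async def predict(request: Request):
    body = await request.json()
    predictions = []
    for instance in body["instances"]:
        # Enrich each request instance with freshly fetched data.
        features = {**instance, **fetch_weather(instance.get("location", ""))}
        predictions.append(features)  # real model inference would go here
    return {"predictions": predictions}
```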
@@statmike-channel can you make a video about it? I would really love to see your videos on it. Loving the playlist so far!
@@statmike-channel Hi, is FastAPI required? Can we just serve using Vertex AI?
I usually don't comment, but Mike, I need more videos from you!!
Awesome Mike! Even after 2 years, this is one of the best! Thanks Mike!
When modifying the version data using the suggestions (which describe false positives, false negatives, and true positives) located under "labels" in Model Registry / Evaluate / (selected label, i.e. '0' or '1'), where are the changes reflected, if anywhere? I don't see any changes being made to the dataset the version was trained on after making changes. How can you use these changes and continue modifying the dataset alongside the changes made with the suggestions?
Thank you brother from India
If anyone tries the notebook for the first time in 2024, it won't work. You have to click "Open in Vertex AI Workbench" and let it deploy the notebook first.
Very informative ! Thanks for sharing
Bravo... the content is brilliant and the method is effective. A larger "font_size" would have been more convenient.
What kind of experience do I need to take this on seriously? I work in the IT field but have no experience in machine learning or coding.
Thank you for your great tutorials. I want to migrate a project from legacy to Vertex AI. Is there any migration toolkit or tutorial I can use to learn the basics? Thank you.
Hi Mike, wonderful videos and your knowledge is amazing in this field. Looking forward you to resume your videos :)
Thanks Mike for such great content. I’ve learnt a whole lot from this. Cheers!
Great stuff! Enjoyed the excellent explanations and following along with the cloned GitHub repo. I have a few questions. 1) Since I changed a few lines in the notebooks, what happens if I commit the changes: is git going to try updating your repo? 2) I would like to see the underlying model architecture: how can I see this? Perhaps the answers are coming in the following videos.
Great content, thank you very much!
Thanks so much Mike! This is just want I needed!
Thanks so much for this Mike! I'm super grateful!
Let's work together to make the practice of AI and ML more collaborative, accurate, and more approachable to a wider and more connected audience. - StatMike!
Thank you. I really appreciate your effort. 😊
Thank you for the playlist. Amazing quality. Just wanted to say that you are amazing and your content is appreciated.
Amazing content, incredibly valuable! Can't believe this is free 🙌💫
This is a great resource, looking forwards to checking out your other videos. Thanks so much for putting these up.
Thank you for the series! What tweaking did you do to the model in order to make it better than the previous one? Wouldn't AutoML in BQ always produce the same result given the same training data?
I love this!
My feeling just after watching only the first video is that I will learn a lot by following the next videos in the playlist. You talk in an exciting and enjoyable way. I love the Bayes formula on the wall; it is the only simple formula that usually takes me a bit of time to remember, and it is my favorite probabilistic formula. Thanks for making this video, especially in your beautiful office.
I may have an astonishing project that does the impossible... I need an ML setup to take it to the next level... It took me forever to code the one-of-a-kind tool, but I really need an ML model to analyze it. I have not been able to get AI to work on it properly... probably because no one has EVER seen a data set like this. I really need someone to help with this. FULL NDA.
Great Content. This is exactly what a beginner needs to start working with vertex AI.
Thanks man, this helped a lot.
I like this series; this is the series I don't watch at 2x playback speed.
Ha! Same!
Hey Mike, I don't understand how you create a folder under BigQuery. Can you explain?
Hi Mike, it seems like the services, tools, and interface have had a facelift. I could not locate the Notebook API within Workbench. I'm new to the GCP platform and trying to follow your videos with today's GCP ecosystem. Please advise. Thanks. BTW, awesome content!!!
Hello Mike! I followed your videos and after 2 model training sessions I had $72 to pay to Google. I appreciate what you're sharing with us, but please mention that this solution with Google Cloud machine learning is not for personal use, because the costs are too high and regular folks who just want to learn ML cannot afford them. I like the framework and the tools provided by Google, but if there is no way to just use my personal hardware to run the ML, then there is no point in learning this stuff. Once again, I appreciate your effort to share the knowledge with us.
I already moved to an MLOps engineer role; the GitHub repo and videos helped me a lot. Thanks Mike.
Hi Mike, this is an excellent video. Is it possible to have the pipeline trigger automatically when the data refreshes, either in Google BigQuery or in the source CSV file?
Hi Mike, we are not excluding columns like target, transaction_id, and split in batch predictions, and it's still working fine. Is it automatically picking up only the required columns?
Could you do a segment where the data source is external, such as Snowflake using connectors? I have a use case where I need to read data from Snowflake into Vertex AI, then I need to write the output back to Snowflake. I can do this manually, but I need to authenticate to SF and copy/paste the token manually into the notebook. There has to be a way to do this automatically so I can deploy the model.
Super cool series. There are 12 videos in this series. Are there more videos for 06, 07, etc.? Thank you Mike for the very detailed explanations.