Great job! I followed your instructions, and everything started working smoothly for me. Your tutorial is fantastic - keep up the excellent work! You have the potential to reach 1 million subscribers. Keep pushing forward!!!
That is great to hear! Thank you for the kind words :)
Please answer my question: I need to do the same thing as you did in this video. My Python script works just fine under Google Cloud Shell, but I am still having trouble making it work as a Cloud Function. The goal is to schedule the execution of the function: it extracts data from a website and saves it to a Google Sheet. Any clue?
Did you find a solution?
Thanks a lot for sharing your knowledge!! Greetings from Mexico
Really interesting, thank you!
Great tutorial!
Thank you!
Hello, are you still active and available for questions? I would like to ask if it's possible to have a function that starts and shuts down a VM within Google Cloud itself?
Thanks
Hi, I love your videos so far! Can you make a video on Vertex AI Pipelines with cloud scheduling for model training and deployment to an application? Looking forward to your reply!
I have a question: what if I had a script that uses the Selenium (web scraping) library? There would be a problem, right? Because as far as I understand, it needs a driver installed on your machine to work 😪
You can install libraries using requirements.txt; beyond that it could be a challenge if you want to customize the execution environment.
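For example, a requirements.txt placed next to main.py could look like this (a minimal sketch; the packages and pinned versions are illustrative assumptions; selenium itself can be listed the same way, but the browser binary it drives is the part that is hard to provide on Cloud Functions):

    # requirements.txt - installed automatically when the function is deployed
    requests==2.31.0
    beautifulsoup4==4.12.2
    google-cloud-storage==2.10.0
    google-cloud-bigquery==3.11.4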
Awesome. Thank you
Where does the script get its internet from to load URLs/APIs when it's in the cloud? Is it generated, or does it come from a mast on the provider's grounds?
Internet access is provided by Google Cloud Platform; there is no need to do anything additional once the script is successfully deployed to Cloud Functions.
Hi! First of all, great video! Really simple and intuitive. It worked for me when I called only one function inside hello_pubsub, but when I tried to call several others, from other .py files, it looked like the function ran perfectly but produced no results. Is there a way to make Cloud Functions wait for each function to finish before moving on to the next one? Thanks
In other words, I need the second function to get the result from the first, and so on. That's why I need it to wait.
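For what it's worth, plain Python calls inside the entry point already run one after another, since each call blocks until it returns. A minimal sketch, with extract/transform/load as hypothetical placeholder names:

    # main.py - hello_pubsub is the Pub/Sub entry point from the video
    def extract():
        # hypothetical step 1: fetch the raw data
        return {"rows": [1, 2, 3]}

    def transform(raw):
        # hypothetical step 2: receives step 1's result
        return [r * 2 for r in raw["rows"]]

    def load(rows):
        # hypothetical step 3: persist the transformed rows
        print(f"loaded {len(rows)} rows")

    def hello_pubsub(event, context):
        raw = extract()        # blocks until finished
        rows = transform(raw)  # always sees extract()'s result
        load(rows)

If the calls really produce no results in this shape, the cause may be inside one of the imported functions rather than in the ordering.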
Lol, it's not working. Can't deploy. What could be the problem? I did exactly what you did, except I couldn't create the same bucket, so I named mine c4ds1 and changed the code accordingly.
Great tutorial. Wondering: why save the data for versioning as CSV and not Parquet? Wouldn't that allow you to create easy partitions, save on storage, and expedite processing/querying?
The CSV format was just for demonstration purposes. You are absolutely right that other file formats could be better suited for this task!
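For example, swapping the CSV write for Parquet is usually a one-line change with pandas. A sketch with a placeholder bucket name; pyarrow and gcsfs would need to be in requirements.txt for this to work:

    import pandas as pd

    df = pd.DataFrame({"date": ["2023-01-01"], "value": [42]})  # sample data
    # CSV, as in the video:
    df.to_csv("gs://my-bucket/data.csv", index=False)
    # Parquet alternative: columnar, compressed, partition-friendly
    df.to_parquet("gs://my-bucket/data.parquet", index=False)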
Thanks for the detailed explanation. I tried to follow the first method but am facing some issues. Basically, I could load the data into Cloud Storage, but when I try to load it into BigQuery, I receive a 404 error saying the request could not be served. Any idea why this happens? FYI, in case it helps: I modified the Python script a bit for my local environment, and from my laptop I could load the data into Cloud Storage and then upload it into BigQuery successfully. However, when I try to automate it in Google Cloud Functions, I face the 404 issue. Any comment would be appreciated.
Thanks for your comment. As a first step, check whether all the services are located in the same region (e.g., all services in the EU).
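For example, you could pin the location explicitly when loading from Cloud Storage into BigQuery. A minimal sketch with hypothetical project, bucket, and table names:

    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
    )
    load_job = client.load_table_from_uri(
        "gs://my-bucket/data.csv",
        "my-project.my_dataset.my_table",
        job_config=job_config,
        location="EU",  # must match the dataset's region
    )
    load_job.result()  # a region mismatch typically surfaces as a 404 NotFound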
Can we do this using Google Compute Engine with a GPU? How can we do that?
There is no easy, direct way to do it with Cloud Functions. For a GPU you would need to use Google Compute Engine and select a GPU machine type, but the process of automating script execution there is quite different and isn't covered in my tutorial.
Excellent!
What is the zip file? I didn't understand what it's about.
It's the Python source code.
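Concretely, for a Python function the archive usually contains just two files. A minimal example layout (the zip name is an assumption):

    function-source.zip
    ├── main.py           # entry point, e.g. hello_pubsub(event, context)
    └── requirements.txt  # third-party dependencies installed at deploy time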
Very helpful tutorial. Can you do the GUI code for Environment version 2 for the function?
Hey, I am not quite sure what you are asking about; could you clarify your question?
@@cloud4datascience772 Thanks for your response. You're using the first-generation environment for this example, and I was able to replicate it fine. But I'm trying to deploy a new version in the 2nd generation environment to have more control over the resources for my function, and the code did not work. So I figured out that for the 2nd generation environment there must be some changes to the code. Once again, thanks for this tutorial!
@@freddiemarrero6512 I see; from my experience only minor changes to the code should be needed for 2nd gen. I am not planning to cover this in my tutorials in the near future, so I can only recommend the official documentation on that topic: cloud.google.com/sdk/gcloud/reference/functions/deploy or cloud.google.com/functions/docs/create-deploy-gcloud
Good luck!
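For what it's worth, the usual code change for a 2nd gen Pub/Sub trigger is switching to the CloudEvent style from functions-framework. A minimal sketch, not tested against this exact setup:

    # 2nd gen entry point (add "functions-framework" to requirements.txt)
    import base64
    import functions_framework

    @functions_framework.cloud_event
    def hello_pubsub(cloud_event):
        # the Pub/Sub payload now lives under cloud_event.data["message"]
        payload = base64.b64decode(cloud_event.data["message"]["data"]).decode()
        print(f"Received: {payload}")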
Is it free?
There is a free tier on GCP; a lot of the services that I am using in this video are available for free up to a certain level of usage: cloud.google.com/free
Generally, running this workload with Cloud Functions should be a cost-effective solution.
@@cloud4datascience772 But I have to pay 300 dollars in advance, and I'm just a student, so I don't have it.
Hi! This tutorial is great, but I encountered one problem while running the function: google.api_core.exceptions.NotFound: 404 Not found: Table arya-teste:arya_dataset.arya-leads was not found in location southamerica-east1. Any clue how to solve this?
Hey, are you sure that your dataset exists at the time of running the function?
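One defensive option is to create the dataset in code before loading into it. A minimal sketch using the project, dataset, and region from your error message:

    from google.cloud import bigquery

    client = bigquery.Client()
    dataset = bigquery.Dataset("arya-teste.arya_dataset")
    dataset.location = "southamerica-east1"  # must match where the load job runs
    # exists_ok=True makes this a no-op when the dataset already exists
    client.create_dataset(dataset, exists_ok=True)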
@@cloud4datascience772 Yes, I think it was just a temporary error.