Automate Python script execution on GCP

  • Published: 5 Jan 2025

Comments • 38

  • @umamaheshmeka1032 · 10 months ago +2

    Great job! I followed your instructions, and everything started working smoothly for me. Your tutorial is fantastic - keep up the excellent work! You have the potential to reach 1 million subscribers. Keep pushing forward!!!

  • @MohamedFerrouga · 9 months ago +6

    Please answer my question - I need to do the same thing as you did in this video. My Python script works just fine in Google Cloud Shell. However, I am still having trouble making it work as a Cloud Function. The purpose is to schedule the execution of the function. It consists of extracting data from a website and saving it to a Google Sheet. I was able to run it in Google Cloud Shell. Any clue?
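
    A minimal sketch of the usual missing piece - Cloud Functions needs a defined entry point wrapping the script (scrape_to_sheet is a hypothetical stand-in for the commenter's code):

        # main.py - 1st-gen Pub/Sub entry point, as used in the video;
        # Cloud Scheduler can publish to the trigger topic on a schedule.
        def scrape_to_sheet():
            ...  # extract data from the website and write it to a Google Sheet

        def hello_pubsub(event, context):
            scrape_to_sheet()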

  • @patriciodiaz2377 · 6 months ago +3

    Thanks a lot for sharing your knowledge!! Greetings from Mexico

  • @marianklose1197 · 1 year ago

    Really interesting, thank you!

  • @jsuhwork1970 · 1 year ago +1

    great tutorial!

  • @EternalAI-v9b · 3 months ago

    Hello, are you still active and available for questions? I would like to ask if it's possible to have a function that runs and shuts down a VM within Google Cloud itself?
    Thanks
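
    A minimal sketch of one way to do that (project, zone, and instance names are hypothetical; the function's service account needs Compute Engine permissions such as roles/compute.instanceAdmin.v1):

        from google.cloud import compute_v1

        def stop_vm(event, context):
            operation = compute_v1.InstancesClient().stop(
                project="my-project", zone="us-central1-a", instance="my-vm"
            )
            operation.result()  # block until the VM is actually stopped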

  • @algorithmo134 · 4 months ago

    Hi, I love your videos so far! Can you make a video on Vertex AI Pipelines with Cloud Scheduler for model training and deployment to an application? Looking forward to your reply!

  • @patriciodiaz2377 · 6 months ago

    I have a question: what if I had a script using the Selenium (web scraping) library? There would be a problem, right? Because as far as I understand, it needs a driver installed on your machine to work 😪

    • @cloud4datascience772 · 6 months ago

      You can install libraries using requirements.txt; otherwise it could be a challenge if you want to customize the execution environment.
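
      For example, a requirements.txt along these lines (versions are illustrative) gets the libraries installed at deploy time - though Selenium's browser driver is exactly the kind of environment customization that requirements.txt alone cannot provide:

          # requirements.txt - installed automatically when the function deploys
          selenium==4.21.0
          pandas==2.2.2
          google-cloud-storage==2.16.0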

  • @AlexBurov-wk8my · 1 year ago

    Awesome. Thank you

  • @GTFS-lg7tw · 1 year ago +1

    Where does the script get its internet from to load URLs/APIs when it's in the cloud? Is it generated, or does it come from a mast on the provider's grounds?

    • @cloud4datascience772 · 1 year ago

      Internet access is provided by Google Cloud Platform; there is no need to do anything additional once the script is successfully deployed to Cloud Functions.
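
      A minimal sketch showing that outbound requests work once deployed (the URL is just a placeholder, and requests must be listed in requirements.txt):

          import requests

          def hello_pubsub(event, context):
              # Outbound HTTPS works out of the box in Cloud Functions
              response = requests.get("https://httpbin.org/get", timeout=10)
              print(response.status_code)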

  • @victorricardo8482 · 11 months ago

    Hi! First of all, great video! Really simple and intuitive. It worked for me when I called only one function inside hello_pubsub, but when I tried to call several others from other .py files, it looked like the function ran perfectly but produced no results. Is there a way to make the Cloud Function wait for each function to finish before moving on to the next one? Thanks

    • @victorricardo8482 · 11 months ago

      In other words, I need the second function to get the result from the first, and so on. That's why I need it to wait.
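
      A minimal sketch of that pattern (step_one and step_two are hypothetical stand-ins for the imported functions) - plain Python calls inside the entry point already run one after another, so each step can simply receive the previous step's return value:

          from my_steps import step_one, step_two  # hypothetical local module

          def hello_pubsub(event, context):
              first = step_one()        # a plain call blocks until it returns
              second = step_two(first)  # so the next step sees the result
              print(second)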

  • @Rajdeep6452 · 9 months ago

    Lol, it's not working - I can't deploy. What could be the problem? I did exactly what you did, except I couldn't create the same bucket, so I named mine c4ds1 and changed the code accordingly.

  • @fafenley00 · 1 year ago

    Great tutorial. Wondering - why save the data for versioning as CSV and not Parquet? Wouldn't that allow you to create easy partitions, save on storage, and expedite processing/querying?

    • @cloud4datascience772 · 1 year ago

      The CSV format was just for demonstration purposes. You are absolutely right that there are other file formats that could be better for this task!
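
      For instance, a minimal Parquet variant (assuming pyarrow and gcsfs are in requirements.txt; the bucket name is illustrative):

          import pandas as pd

          df = pd.DataFrame({"date": ["2025-01-05"], "value": [42]})  # toy data

          # Parquet is columnar and compressed, so it is usually smaller
          # in storage and faster to query than CSV.
          df.to_parquet("gs://c4ds1/data.parquet", index=False)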

  • @LamBecky · 1 year ago

    Thanks for the detailed explanation. I tried to follow the first method but am facing some issues. Basically, I could load the data into Cloud Storage, but when I try to load it into BigQuery, I receive a 404 error saying the request could not be served. Any ideas why this happens? FYI, in case it helps: I modified the Python script a bit for my local environment, and with that I could load data into Cloud Storage and then upload it into BigQuery successfully from my laptop. However, when I try to automate it in Google Cloud Functions, I face the 404 issue. Any comment would be appreciated.

    • @cloud4datascience772 · 1 year ago

      Thanks for your comment. As a first step, you can check whether the services are all located within the same region (e.g. all services in the EU).
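
      A minimal sketch of pinning everything to one region (region, bucket, and table ids are illustrative):

          from google.cloud import bigquery

          client = bigquery.Client(location="EU")
          job = client.load_table_from_uri(
              "gs://c4ds1/data.csv",             # source file in Cloud Storage
              "my-project.my_dataset.my_table",  # hypothetical destination table
              job_config=bigquery.LoadJobConfig(
                  source_format=bigquery.SourceFormat.CSV, autodetect=True
              ),
          )
          job.result()  # a 404 here often means a dataset/region mismatch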

  • @JishnuMittapalli · 10 months ago

    Can we do this using Google Compute with a GPU? How can we do that?

    • @cloud4datascience772 · 10 months ago

      There is no easy, direct way to do it with Cloud Functions. For GPUs you would need to use Google Compute Engine and select a GPU machine type, but the process of automating script execution would be much different, and it's not covered in my tutorial.
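
      A rough sketch with the Compute Engine API (project, zone, image, and machine/accelerator types are illustrative, and the zone must have GPU quota):

          from google.cloud import compute_v1

          def create_gpu_vm(project: str, zone: str = "us-central1-a") -> None:
              instance = compute_v1.Instance(
                  name="gpu-worker",
                  machine_type=f"zones/{zone}/machineTypes/n1-standard-4",
                  disks=[
                      compute_v1.AttachedDisk(
                          boot=True,
                          auto_delete=True,
                          initialize_params=compute_v1.AttachedDiskInitializeParams(
                              source_image="projects/debian-cloud/global/images/family/debian-12"
                          ),
                      )
                  ],
                  network_interfaces=[
                      compute_v1.NetworkInterface(network="global/networks/default")
                  ],
                  guest_accelerators=[
                      compute_v1.AcceleratorConfig(
                          accelerator_type=f"zones/{zone}/acceleratorTypes/nvidia-tesla-t4",
                          accelerator_count=1,
                      )
                  ],
                  # GPU VMs cannot live-migrate; maintenance must terminate them
                  scheduling=compute_v1.Scheduling(on_host_maintenance="TERMINATE"),
              )
              operation = compute_v1.InstancesClient().insert(
                  project=project, zone=zone, instance_resource=instance
              )
              operation.result()  # wait for the create operation to finish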

  • @marcelomaia4274 · 1 year ago

    Excellent!

  • @flosrv3194 · 7 months ago

    What is the zip file? I didn't understand what it is about...

  • @freddiemarrero6512 · 1 year ago

    Very helpful tutorial. Can you do the GUI code for Environment version 2 for the function?

    • @cloud4datascience772 · 1 year ago

      Hey, I am not quite sure what you are asking about; could you clarify your question?

    • @freddiemarrero6512 · 1 year ago

      @cloud4datascience772 Thanks for your response. You're using the first-generation environment in this example. I was able to replicate it fine, but I'm trying to deploy a new version on the 2nd-generation environment to have more control over the resources for my function, and the code did not work. So I figured that for the 2nd-generation environment there should be some changes to the code. Once again, thanks for this tutorial!

    • @cloud4datascience772 · 1 year ago +1

      @freddiemarrero6512 I see; from my experience there should only be minor changes to the code for 2nd gen. I am not planning to cover this in my tutorials in the near future, so I can only recommend the official documentation on the topic: cloud.google.com/sdk/gcloud/reference/functions/deploy or cloud.google.com/functions/docs/create-deploy-gcloud
      Good luck!
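
      A minimal sketch of the gen-2 signature change (assuming the same Pub/Sub trigger as in the video):

          import functions_framework  # gen 2 routes through the Functions Framework

          # In 2nd gen, a Pub/Sub trigger delivers a CloudEvent instead of
          # the 1st-gen (event, context) pair - the main code change.
          @functions_framework.cloud_event
          def hello_pubsub(cloud_event):
              message = cloud_event.data["message"]  # Pub/Sub envelope
              print(message.get("data"))             # base64-encoded payload, if any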

  • @nguyenvantruong8141 · 1 year ago

    Is it free?

    • @cloud4datascience772 · 1 year ago

      There is a free tier on GCP, and a lot of the services I am using in this video are available for free up to a certain level of usage: cloud.google.com/free
      Generally, running the workload with Cloud Functions should be a cost-effective solution.

    • @nguyenvantruong8141 · 1 year ago

      @cloud4datascience772 But I have to pay 300 dollars in advance, and I'm just a student so I don't have it

  • @rogerrendon8976 · 1 year ago

    Hi! This tutorial is great, but I encountered one problem while running the function: google.api_core.exceptions.NotFound: 404 Not found: Table arya-teste:arya_dataset.arya-leads was not found in location southamerica-east1. Any clue how to solve this?

    • @cloud4datascience772 · 1 year ago +1

      Hey, are you sure that your dataset exists at the time of running the function?
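
      A quick sketch for checking that (the table id is copied from the error above; adjust it to your project):

          from google.cloud import bigquery
          from google.api_core.exceptions import NotFound

          client = bigquery.Client()
          try:
              client.get_table("arya-teste.arya_dataset.arya-leads")
          except NotFound:
              # create the dataset/table first, in the region the job expects
              print("Table is missing")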

    • @rogerrendon8976 · 1 year ago

      @cloud4datascience772 Yes, I think it was just a temporary error.