GCP Cloud Functions for BigQuery events | Extract data from a BigQuery table to a GCS bucket in CSV

  • Published: 1 Feb 2025
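At its core, the export the title describes is a BigQuery extract job. A minimal sketch, with hypothetical project, dataset, table, and bucket names:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical names; substitute your own project, dataset, table, and bucket.
table_id = "my-project.my_dataset.my_table"
# For tables larger than 1 GB, use a wildcard URI like .../my_table-*.csv
destination_uri = "gs://my-bucket/exports/my_table.csv"

extract_job = client.extract_table(
    table_id,
    destination_uri,
    job_config=bigquery.ExtractJobConfig(destination_format="CSV"),
    location="US",  # must match the dataset's location
)
extract_job.result()  # block until the export job finishes
```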

Comments • 11

  • @AshishDukare-vr6xb • 7 months ago

    Thanks for this video. Keep making more such videos.

  • @AnitaRaut-e8c • 5 months ago

    Thanks in advance for your response.
    How can we identify the path of a specific resource?
    For example, in the video you specified the path as projects/{project_name}/datasets/....
    Do we need to grant additional privileges to our default service account to execute the Eventarc trigger?
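One way to answer the path question empirically: have the triggered function print protoPayload.resourceName from the audit-log event. A minimal sketch, assuming a 2nd-gen function behind an Eventarc "google.cloud.audit.log.v1.written" trigger filtered on BigQuery; the trigger's service account typically also needs roles/eventarc.eventReceiver:

```python
import functions_framework

# Sketch: the Eventarc audit-log payload carries the exact resource path
# that fired the event.
@functions_framework.cloud_event
def log_resource_path(cloud_event):
    payload = cloud_event.data.get("protoPayload", {})
    # e.g. "projects/my-project/datasets/my_dataset/tables/my_table"
    print("Resource path:", payload.get("resourceName", "<missing>"))
```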

  • @nishitnishikant8548 • 10 months ago +1

    Great video, just a suggestion: change the starting tune. Those few seconds are painful to the ears due to the pitch; a smooth melody would be much more welcome.

  • @zramzscinece_tech5310 • 2 years ago

    Thanks for providing the code for your videos.

  • @eduardoestayatenas1728 • 1 year ago

    Does this apply to a stored procedure instead of a query?
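It should: a stored procedure is invoked as an ordinary query job via CALL, so the same job events fire. A sketch with a hypothetical procedure name:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical procedure name. CALL runs as a regular query job, so a stored
# procedure should produce the same BigQuery job audit-log events as a query.
job = client.query("CALL `my-project.my_dataset.my_procedure`()")
job.result()  # wait for the procedure to complete
```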

  • @soumyadevporiya • 1 year ago

    Doesn't the BigQuery service account need Pub/Sub permissions?
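With an Eventarc audit-log trigger the Pub/Sub transport is managed by Eventarc itself, so it is usually the trigger's service account, not BigQuery's, that needs a grant (roles/eventarc.eventReceiver). A sketch of adding that binding programmatically, with hypothetical project and account names:

```python
from google.cloud import resourcemanager_v3
from google.iam.v1 import policy_pb2

# Hypothetical names; substitute your project and the trigger's service account.
project_resource = "projects/my-project"
member = "serviceAccount:123456789-compute@developer.gserviceaccount.com"

client = resourcemanager_v3.ProjectsClient()
policy = client.get_iam_policy(request={"resource": project_resource})
policy.bindings.append(
    policy_pb2.Binding(role="roles/eventarc.eventReceiver", members=[member])
)
client.set_iam_policy(request={"resource": project_resource, "policy": policy})
```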

  • @deniskotnik1272 • 1 year ago

    This can be seen as a CDC (change data capture) process.

  • @sriselvi3704 • 2 years ago

    Hi bro, I have a scenario; please answer if you know, it would be a great help.
    I have 2 files (one .csv and one .json) on my system, and I have to move them to a bucket that has 3 folders (raw_zone, cleaning_zone, and a destination folder).
    1) What are the ways to move those 2 files to the bucket? I know how to move files via Cloud Shell and by uploading to the bucket manually; is there any other method?
    2) I need to trigger the pipeline, which should be designed so that duplicates and null values are removed from the files. For triggering I have used Cloud Functions (when a file is loaded into the bucket, it should sense it and trigger the cleaning pipeline). What code should I put in the Cloud Function to trigger the pipeline? After triggering, the file goes through the pipeline, the cleaning (removing duplicates and nulls) happens, and the output moves to cleaning_zone.
    If possible, please help.
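Besides Cloud Shell and the console, files can be uploaded with gsutil cp or the Cloud Storage client libraries. For the trigger part, a common pattern is a function on the bucket's object-finalized event that cleans the file inline. A minimal sketch, assuming the hypothetical folder names from the comment and pandas plus gcsfs available in the runtime:

```python
import functions_framework
import pandas as pd

# Sketch: uploads land under raw_zone/ and cleaned output goes to
# cleaning_zone/ in the same bucket. Reading gs:// paths with pandas
# requires gcsfs in the function's dependencies.
@functions_framework.cloud_event
def clean_uploaded_file(cloud_event):
    data = cloud_event.data  # GCS "object finalized" event payload
    bucket, name = data["bucket"], data["name"]
    if not name.startswith("raw_zone/"):
        return  # react only to files landing in the raw zone

    src = f"gs://{bucket}/{name}"
    if name.endswith(".csv"):
        df = pd.read_csv(src)
    else:
        df = pd.read_json(src)  # assumes a plain JSON records file

    df = df.drop_duplicates().dropna()  # the cleaning step

    out = f"gs://{bucket}/cleaning_zone/{name.rsplit('/', 1)[-1]}"
    df.to_csv(out, index=False)
```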