Thanks for this video. Keep making more such videos
Thanks in advance for your response.
How can we identify the path of a specific resource?
For example, in the video, you specified the path as: projects/{project_name}/datasets/....
Do we need to grant additional privileges to our default service account for executing the Eventarc trigger?
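In case it helps, here is a minimal sketch of how that resource path typically shows up when an Eventarc audit-log trigger for BigQuery invokes a 2nd-gen Cloud Function. The project, dataset, and table names below are hypothetical, and this assumes the audit-log style trigger shown in the video rather than being the definitive setup:

```python
# Minimal sketch (Python, Cloud Functions 2nd gen) -- assumes an Eventarc
# trigger on BigQuery audit logs; all names here are hypothetical examples.
import functions_framework


@functions_framework.cloud_event
def on_bigquery_event(cloud_event):
    # For audit-log triggers, the event data is the Cloud Audit Log entry.
    payload = cloud_event.data.get("protoPayload", {})

    # The resource path follows the pattern mentioned in the video, e.g.
    # projects/my-project/datasets/my_dataset/tables/my_table
    resource_name = payload.get("resourceName", "")
    print(f"Event received for resource: {resource_name}")
```

On the permissions question: if the trigger cannot invoke the function, the usual fix is to grant the trigger's service account roles/eventarc.eventReceiver (and roles/run.invoker on the underlying service), but please verify that against your own setup rather than taking it as definitive.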
Great video. Just a suggestion: change the starting tune. Those first few seconds are painful to the ears because of the pitch; a smoother melody would be much more welcome.
Taken care of in the latest videos.
Thanks for providing the code for the videos.
Does this apply to a stored procedure instead of a query?
Doesn't the BigQuery service account need Pub/Sub permission for this?
This can be seen as a CDC (change data capture) process.
Hi, bro.
I have a scenario; please answer if you know this, it would be a great help.
I have two files on my system (one .csv and one .json), and I have to move them to a bucket. The bucket has three folders: raw_zone, cleaning_zone, and a destination folder.
1) What are the ways I can move those two files to the bucket? I know how to do it via Cloud Shell and by uploading normally through the console; is there any other method?
2) I need to trigger a pipeline (designed so that duplicates and null values are removed from the files). For triggering I have used Cloud Functions: when a file lands in the bucket, it should detect it and trigger the cleaning pipeline. What code should I put in the Cloud Function to trigger the pipeline?
After triggering, the file goes to the pipeline, the cleaning happens (removing duplicates and nulls), and the output is moved to cleaning_zone.
If possible, please help me, bro.
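One possible shape for that Cloud Function, sketched under assumptions: a 2nd-gen function with a Cloud Storage "object finalized" trigger, pandas available as a dependency, and folders named raw_zone/ and cleaning_zone/ as described above. The bucket and file names are only illustrative, and this is a sketch of one approach, not the only way to build the pipeline:

```python
# Rough sketch only -- assumes a Cloud Storage "object finalized" trigger,
# pandas installed, and raw_zone/ and cleaning_zone/ folders in the bucket.
import os

import functions_framework
import pandas as pd
from google.cloud import storage


@functions_framework.cloud_event
def clean_uploaded_file(cloud_event):
    data = cloud_event.data
    bucket_name = data["bucket"]
    object_name = data["name"]  # e.g. "raw_zone/sales.csv"

    # Only react to files landing in the raw zone.
    if not object_name.startswith("raw_zone/"):
        return

    client = storage.Client()
    bucket = client.bucket(bucket_name)
    local_path = os.path.join("/tmp", os.path.basename(object_name))
    bucket.blob(object_name).download_to_filename(local_path)

    # Load CSV or JSON into a DataFrame, then drop duplicate and null rows.
    if object_name.endswith(".csv"):
        df = pd.read_csv(local_path)
    else:
        df = pd.read_json(local_path)
    df = df.drop_duplicates().dropna()

    # Write the cleaned output to the cleaning_zone folder as CSV.
    cleaned_path = local_path + ".cleaned.csv"
    df.to_csv(cleaned_path, index=False)
    base = os.path.basename(object_name).rsplit(".", 1)[0]
    bucket.blob(f"cleaning_zone/{base}_cleaned.csv").upload_from_filename(cleaned_path)
```

As for other ways to get the files into the bucket besides Cloud Shell and the console upload: `gsutil cp` or `gcloud storage cp` from a local terminal, and the Cloud Storage client libraries (for example `blob.upload_from_filename` in Python, as used above) are the common alternatives.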
No, bro.