Tune into our #AskGoogleCloud premiere on Friday, March 12 10AM PT for answers and a chance to chat live with Google Cloud’s serverless experts → goo.gle/3bDubsN
Get $300 and start running workloads for free → goo.gle/39OlevP
This is such a great video. We need more content like this: use cases and solutions, simple, in 5 minutes. Love it.
Nice video! I think it's time for another comparison between Cloud Run, App Engine, and Cloud Functions.
Absolutely my favorite video about Cloud Run!
Using Cloud Run. Can't complain much. Runs very well.
This type of use case based approach is really helpful! ❤️
Glad you think so!
Useful info delivered in a fun presentation :) Like!
Very helpful video, thanks a lot. Such real-scenario use case videos help us a lot to understand the different services and their suitable workloads.
Can you please share documentation for the REST API? As Dina mentions, it is an out-of-the-box solution, but I could not find it on GCP.
Interesting short format
Great video. For use case number 4, where there is an intensive ETL job, is Cloud Run a good fit? What if the container's memory and CPU cannot handle the ETL load? Will it auto-scale the resources in this scenario, just like Dataflow does?
Cloud Run could be a good fit as it scales up automatically. You might also want to consider doing ELT inside BigQuery instead of traditional ETL. See the video "How L’Oreal built a data warehouse on Google Cloud".
@@TheMomander Hi, thank you for the reply. I am unsure whether the container actually scales when there is a single but heavy ETL job instead of multiple requests. Will the backend instances scale in such a scenario? AFAIK, since it will be a single request, it should run on a single container. Curious how autoscaling will take place when the ETL job is memory-intensive. Thanks!
@@ambeshsingh1251 Let's say you want to do ETL on a million records. If you have a physical computer, you might start a program that loops over all the records and processes them one at a time. If the program crashes halfway through, half of the records won't be processed. By contrast, the serverless way to do ETL would be to create a Pub/Sub message for each record and build a Cloud Run service that is triggered by Pub/Sub. Your Cloud Run code would be simpler because it only processes a single record, your program would require less memory for the same reason, your job would scale up well, and if your code crashes halfway through, only a single record is affected.
Also, do check out the L'Oreal video I mentioned above. They have a very scalable and well-working ELT pipeline. Their approach is slightly different from what I outlined in the paragraph above.
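The one-message-per-record approach above can be sketched as a small handler. This is a minimal illustration, not an official sample: `transform` is a hypothetical placeholder for your ETL logic, and the web-framework wiring a real Cloud Run service would need is omitted. The envelope shape matches what a Pub/Sub push subscription delivers: the record is base64-encoded JSON under `message.data`.

```python
import base64
import json


def transform(record):
    """Hypothetical per-record transformation; replace with real ETL logic."""
    return {k: str(v).upper() for k, v in record.items()}


def handle_pubsub(envelope):
    """Handle one Pub/Sub push delivery for one record.

    Returns an HTTP-style (body, status) pair. A 2xx status acks the
    message; anything else makes Pub/Sub redeliver it, so if the code
    crashes, only this single record is retried.
    """
    if not envelope or "message" not in envelope:
        return "Bad Request: no Pub/Sub message", 400
    record = json.loads(base64.b64decode(envelope["message"]["data"]))
    transform(record)  # write the transformed record to your sink here
    return "", 204
```

Because each invocation touches exactly one record, memory stays small and Cloud Run can fan out to as many instances as the backlog needs.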
@@TheMomander Thank you Martin, I will definitely check out that video 😊. Also, putting each record into Pub/Sub as a separate message is an efficient approach. In my case, a few heavy files arrive in GCS, and I need to open them, transform them, and write to a sink. Pub/Sub is sending the GCS URIs of those files, on which Cloud Run is supposed to do the ETL. Since the file sizes are big (multiple GBs), I doubt whether Cloud Run is a good fit. I know Dataflow is a suitable choice for this, but I just wanted to see if these scenarios can be handled via Cloud Run's scalability.
@@ambeshsingh1251 Each Cloud Run instance can have a maximum of 32 GiB of memory and can run for up to 60 minutes. Be aware that serverless platforms like Cloud Run generally scale horizontally, not vertically. In other words, they scale up by spinning up more instances, not by giving each instance more resources.
It sounds like Dataflow can do the job for you. If you were to use Cloud Run, you may want to build one Cloud Run service that receives the Pub/Sub message about the file upload, parses the file, and sends one Pub/Sub message per record. Then a second Cloud Run service would process each record, one record per invocation.
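The fan-out step of that first service can be sketched as follows. This is a hand-wavy illustration under assumptions: the file is newline-delimited JSON, and `publish` stands in for a real Pub/Sub publisher call (e.g. a publisher client bound to a topic). Injecting `publish` as a callable keeps the logic testable without GCP credentials.

```python
import json


def fan_out(lines, publish):
    """Split a newline-delimited JSON file into one Pub/Sub message per record.

    `lines` is an iterable of text lines (e.g. streamed from the GCS file);
    `publish` is called once per record with the encoded message payload.
    Returns the number of messages published.
    """
    count = 0
    for line in lines:
        line = line.strip()
        if not line:
            continue  # skip blank lines
        payload = json.dumps({"record": json.loads(line)}).encode("utf-8")
        publish(payload)
        count += 1
    return count
```

Streaming the file line by line keeps the first service's memory flat even for multi-GB inputs; the second service then handles each record independently, one per invocation.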
That was very useful, I enjoyed it! Thank you.
Glad it was helpful!
Really helpful, thanks!
You're welcome!
I would say those are business cases, or app procedures. I was expecting workload cases like "handle 10k requests per minute for payments" or "response times of 2s at a peak of 1,000 requests per second on catalog browsing". But an informative video nonetheless.
Does Cloud Run support GPU?
I would like details about using WordPress.
It's great.
A bit cringy, but informative
Good video! I want to create a Telegram bot. Is Cloud Run recommended for that?
Sorry, I don't know enough about Telegram bots to answer that question. If it's enough for the bot to respond to HTTP calls, you can run it on Cloud Run. If the bot needs to be running all the time, you're better off running it on a virtual machine (Google Compute Engine).
Hmm, it could work using the Pub/Sub + Cloud Run integration.
👏👏👍