Pub/Sub tips and tricks
- Published: Aug 15, 2024
- Dead-letter queues → goo.gle/3u5dLkl
- Message ordering → goo.gle/2M5CaVK
- Replaying past messages → goo.gle/3du08oP
Pub/Sub is an asynchronous messaging service that can help you easily run serverless applications. However, there are some common issues that one may run into when using Pub/Sub. In this episode of Serverless Expeditions, we run through some common Pub/Sub issues that our fictitious ride-hailing app has run into - such as handling backlogged messages, unordered messages, and much more. Watch to learn some useful tips about Pub/Sub for your serverless applications!
Timestamps:
0:00 - Intro
1:38 - Using dead letter queues to deal with malformed messages
5:55 - Using message ordering to prevent bugs
9:20 - Replaying past messages to test new versions of your application
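The dead-letter behavior covered in the first segment can be illustrated with a small, self-contained sketch. This is not the Pub/Sub client API; it is a toy simulation of the redelivery loop, with an illustrative `max_delivery_attempts` of 5 (the minimum Pub/Sub allows):

```python
# Toy simulation of dead-letter forwarding: a nacked message is
# redelivered until the subscriber acks it or the delivery-attempt
# limit is hit, after which Pub/Sub forwards it to the dead-letter
# topic. The names here are illustrative, not the client library API.

def deliver(message, process, max_delivery_attempts=5):
    """Redeliver a message until process() succeeds or attempts run out.

    Returns "acked" or "dead-lettered".
    """
    for attempt in range(1, max_delivery_attempts + 1):
        if process(message):
            return "acked"          # subscriber acknowledged the message
    return "dead-lettered"          # forwarded to the dead-letter topic

# A malformed message the subscriber can never parse keeps failing:
print(deliver("{bad json", process=lambda m: False))   # dead-lettered
print(deliver('{"ride": 1}', process=lambda m: True))  # acked
```

In the real service you configure this by attaching a dead-letter topic and `max_delivery_attempts` to the subscription; no subscriber code forwards the message itself.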
Check out more episodes of Serverless Expeditions → goo.gle/Serverl...
Subscribe to get all the episodes as they come out → goo.gle/GCP
#ServerlessExpeditions #ServerlessExpeditionsExtended
product: Cloud Pub/Sub; fullname: Tianzi Cai, Martin Omander;
Tune in to our #AskGoogleCloud premiere on Friday, March 12 at 10 AM PT for a chance to ask Google Cloud's serverless experts any questions regarding serverless architectures → goo.gle/3evqXcU
Get $300 and start running workloads for free → goo.gle/2Zvv4wE
Help me
A great example used, very clear! Nice job!
Ordering is important. Might try to share more in detail next time. Thx. 👍
I also want to see ordering!
@@Tenelia More details about ordering -- got it! We are adding this to our list of new episode ideas.
Great presentation, Martin & Wes!!
Excellent tips! Please, a Dataflow tips-and-tricks episode would be awesome, and a video about real-time analytics (Pub/Sub + Dataflow + Tensorflow)
There's a streaming example that uses Pub/Sub + Dataflow + Tensorflow, which is part of a longer tutorial that showcases batch processing.
Try to search for "molecules-walkthrough" or "Machine Learning with Apache Beam and TensorFlow" in the GCP docs.
Hope you find that tutorial useful, and thank you for watching and for your comment.
Thanks Tianzi! I found the example cloud.google.com/dataflow/docs/samples/molecules-walkthrough I'll try it.
Excellent idea. Got more insights about pub/sub through this.
Thanks for the tips. Love this format.
For message ordering, what if "request a ride" fails in the dispatcher? Since message ordering can't be supported via dead-lettering, if "request ride" and "cancel ride" are pulled together in the same pullResponse, the subscriber can process "cancel ride" even if "request ride" fails.
Really nice tips! Ordering messages feature will help a lot of people and projects.
What will happen if ordering is enabled and dead-lettering as well? Will good messages still be sent in order even though some of them are missing between the good ones?
To answer your first question, if the subscription attached to the dead-letter topic has ordering enabled, then messages (sent with an ordering key) that you receive from that subscription will be ordered.
To answer your second question, (good) messages not forwarded to the dead-letter topic will not be received by the subscription attached to the dead-letter topic.
Thanks for watching and for your questions!
Excellent one!!! Thanks @Tianzi :)
I tried the dead letter topic one but with mixed luck :(
It worked when I tested without any subscriber: I pushed a message and checked the delivery attempts, and after 5 attempts it pushed the message to the dead-letter topic.
When I used Dataflow as a subscriber and intentionally threw an exception from the subscriber code, it did not work. The message is not getting acknowledged, as I can see from the unacked message count, but it never reached the dead-letter topic, so the error count keeps increasing in Dataflow. I have not written any code inside the subscriber to push the message to the dead-letter topic, as I think it should be pushed automatically. Am I missing anything?
Thank you Pinkan. So glad to know that you gave it a try. And thank you for the question.
May I ask how you "intentionally threw an exception from subscriber code" in your pipeline? `PubsubIO.readMessages()` returns `org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage`, which is unlike `com.google.pubsub.v1.PubsubMessage` and does not have an `ack()` or `nack()` method.
I think it's for this reason that all Pub/Sub messages arriving in the Beam/Dataflow pipeline will be considered acknowledged. "The Dataflow runner's implementation of PubsubIO automatically acknowledges messages once they have been successfully processed by the first fused stage (and side-effects of that processing have been written to persistent storage)." [1] Does this make sense?
[1]: cloud.google.com/dataflow/docs/concepts/streaming-with-cloud-pubsub#integration-features
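For contrast, outside of Dataflow a plain streaming-pull subscriber controls acknowledgement itself, which is what lets dead-lettering kick in. A toy sketch of that callback pattern - the `Message` class below is a stand-in for the client library's message object, which exposes the same `ack()`/`nack()` methods:

```python
# Sketch of the ack/nack pattern in a plain (non-Dataflow) subscriber
# callback: ack on success, nack on failure so redelivery and, after
# enough attempts, dead-lettering can occur. Message is a stand-in.
import json

class Message:
    """Stand-in for the streaming-pull message object."""
    def __init__(self, data):
        self.data = data
        self.state = None
    def ack(self):
        self.state = "acked"
    def nack(self):
        self.state = "nacked"      # schedules redelivery

def callback(message):
    try:
        json.loads(message.data)   # "process" the message
        message.ack()
    except ValueError:
        message.nack()             # let redelivery / dead-lettering kick in

good, bad = Message('{"ride": 1}'), Message("{not json")
callback(good)
callback(bad)
print(good.state, bad.state)  # acked nacked
```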
@Google Cloud Tech. When you guys show the demo in the console, it doesn't benefit everyone. Most companies are doing it using IAS or the Python API. So please provide more tutorials using code...
You bring up a good point. We often use the console as it is more visual, easier to understand, and shows the concepts across programming languages, not just for one language. We'll see if we can include more code in the future!
What's the behavior when the publisher sets message ordering and publishes messages with an ordering key, and there are multiple subscribers listening to the same topic but only one of them is interested in ordered messages?
If you are using any of the official client libraries, messages of the same ordering key will be delivered to the same subscriber. Messages published with no ordering keys can be pulled by subscribers to a subscription with ordering enabled in any order. Thanks for watching and for your question!
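The delivery guarantee described in the answer above can be pictured with a toy router: messages sharing an ordering key go to the same subscriber in publish order, while keyless messages can land anywhere. All names here are illustrative, not the client API:

```python
# Toy illustration of ordering-key routing: same key -> same subscriber,
# preserving publish order. Keyless messages could go to any subscriber;
# they are pinned to subscriber 0 here only for simplicity.
from collections import defaultdict

def assign(messages, num_subscribers):
    """Route (ordering_key, payload) pairs to subscriber indexes."""
    routes = defaultdict(list)
    for key, payload in messages:
        target = hash(key) % num_subscribers if key else 0
        routes[target].append(payload)
    return dict(routes)

# Both messages share the key "ride-42", so one subscriber receives
# them in publish order: request before cancel.
msgs = [("ride-42", "request_ride"), ("ride-42", "cancel_ride")]
print(assign(msgs, num_subscribers=3))
```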
What are the calls between the Dispatcher and Fleet manager (get-available_drivers, send-available-drivers)? Are these also pub/sub? How are these communicating?
I tested code running on Cloud Run to add 100 messages to a queue and it took .04s per message. That is 40ms per message add. The message was only like 32 bytes.
Does this sound like it is too slow?
Any way to speed this up?
Have a look at "Publish with batching settings" in PubSub. That may help speed it up.
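The batching idea mentioned above is easy to see with a toy batcher: publishing N small messages in batches of B costs roughly N/B publish round trips instead of N. The real client exposes this through publisher batch settings (max messages, max bytes, max latency); this sketch only shows the mechanics:

```python
# Toy batcher: group messages into publish batches of at most
# max_messages, so 250 tiny messages need 3 round trips, not 250.
# Illustrative only; the real client batches for you once configured.

def batch(messages, max_messages=100):
    """Split a list of messages into batches of at most max_messages."""
    return [messages[i:i + max_messages]
            for i in range(0, len(messages), max_messages)]

msgs = [f"m{i}".encode() for i in range(250)]
batches = batch(msgs)
print(len(batches))  # 3
```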
@@TheMomander Thx, just ran into this in the docs a few minutes ago... :) Appreciate the reply
Interesting intro
Great Video thanks
Glad you enjoyed it
Great
Too bad dead-lettering doesn't work for event-driven functions since you cannot manually NACK.
Hm, I'm not sure I understand the full context. A "NACK" in the case of event-driven functions means any response code that's not 102, 200, 201, 202, or 204. It's possible to write your function to return failure code where necessary. Could you tell me more about what's on your mind?
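The ack/nack rule in the reply above boils down to a status-code check, which can be written out directly (the function name is illustrative):

```python
# For push/event-driven delivery, these HTTP status codes count as an
# ack; any other code is treated as a nack and triggers redelivery.
ACK_CODES = {102, 200, 201, 202, 204}

def is_acked(status_code):
    """True if Pub/Sub treats this response code as an acknowledgement."""
    return status_code in ACK_CODES

print(is_acked(204), is_acked(500))  # True False
```

So returning, say, a 500 from the function is the way to "manually NACK" a message.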
explains it better than I can 😄
nice