Lambda triggers - Trigger lambda function when a file is uploaded to Amazon S3

  • Published: 5 Aug 2024
  • In this video we will learn how to trigger a Lambda function whenever a new file gets uploaded to an S3 bucket. We also want to process the file and store its content in a DynamoDB table (see the handler sketch after the links below).
    Timestamps:
    - Intro: 0:00
    - DynamoDB and S3 configuration: 1:18
    - Create lambda function and IAM config: 1:58
    - Add Lambda trigger: 2:48
    - Write lambda code to process event: 3:27
    - Time for testing!: 6:06
    Code example: github.com/endre-synnes/pytho...
    JSON file: github.com/endre-synnes/pytho...
    Follow me on Github: github.com/endre-synnes
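
    As a rough illustration of the handler built in the video — a minimal sketch, not the repository's actual code; the table name "files", its string key "id", and the assumption that the uploaded file is a JSON object are all hypothetical:

    ```python
    import json
    from decimal import Decimal
    from urllib.parse import unquote_plus

    import boto3

    s3 = boto3.client("s3")
    dynamodb = boto3.resource("dynamodb")
    table = dynamodb.Table("files")  # hypothetical table with string partition key "id"


    def lambda_handler(event, context):
        # An S3 "ObjectCreated" notification carries one record per uploaded object.
        for record in event["Records"]:
            bucket = record["s3"]["bucket"]["name"]
            # Object keys arrive URL-encoded in S3 events (e.g. spaces become "+").
            key = unquote_plus(record["s3"]["object"]["key"])

            # Fetch the uploaded file and parse it as JSON. DynamoDB rejects
            # Python floats, so numbers are parsed as Decimal instead.
            response = s3.get_object(Bucket=bucket, Key=key)
            content = json.loads(response["Body"].read(), parse_float=Decimal)

            # Store the parsed content, keyed by the object name.
            table.put_item(Item={"id": key, **content})

        return {"statusCode": 200}
    ```

    The function's execution role needs at least s3:GetObject on the bucket and dynamodb:PutItem on the table, which is roughly what the IAM configuration step at 1:58 covers.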

Comments • 2

  • @user-qq5os6xf9k • 1 year ago +1

    Thanks Endre, helpful video. What I'm looking for is whether we can create a queue for the files that drop into an S3 bucket, so that those files are processed in order. As far as I know, whenever new files drop into a bucket, the attached Lambda runs for each file separately, with the files unaware of each other (tell me if I'm wrong). For example, when I upload 10 files to a bucket at the same time, I want the second file to trigger the Lambda (or whatever) only after the first file is completely processed, and so on. Is there a way to do that? All I can find is that we can use SQS to trigger Lambda, but I don't think that's what I'm looking for.

    • @EndreSynnes • 11 months ago

      Sorry for the late response! Thank you so much! 😄
      Well, this may be a bit complicated to answer in a YouTube comment, but I'll try 😅 There are of course several ways to solve this (also if you really need sequential processing), but one way could be:
      First, have a Lambda function that triggers on upload to S3 and only stores a record containing the filename, the upload timestamp, and a status ("new_file") in a DynamoDB table. Then maybe another function that looks for the oldest file (by timestamp) with status "new_file" and starts processing it from S3. Once the file is processed, update the DynamoDB record's status to "complete".
      After that, it could look for other records with status "new_file" and send an event which triggers the processing of the next record, until all records are marked "complete".
      This may not be the best solution, but it may work if I understood your use case correctly 😄
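
      A minimal sketch of this idea, assuming a hypothetical "file_queue" table with partition key "filename" — the attribute names and the handle_file step are illustrative, not a tested design:

      ```python
      import boto3
      from boto3.dynamodb.conditions import Attr

      dynamodb = boto3.resource("dynamodb")
      table = dynamodb.Table("file_queue")  # hypothetical table, partition key "filename"
      s3 = boto3.client("s3")


      def on_upload(event, context):
          # S3-triggered Lambda: only record the upload, don't process it yet.
          for record in event["Records"]:
              table.put_item(Item={
                  "filename": record["s3"]["object"]["key"],
                  "bucket": record["s3"]["bucket"]["name"],
                  "upload_timestamp": record["eventTime"],
                  "status": "new_file",
              })


      def handle_file(data: bytes) -> None:
          # Placeholder for the actual per-file processing.
          print(f"processing {len(data)} bytes")


      def process_next(event, context):
          # Find the oldest record still marked "new_file". A full scan is fine
          # for a sketch; a real table would query a GSI on status + timestamp.
          items = table.scan(FilterExpression=Attr("status").eq("new_file"))["Items"]
          if not items:
              return  # queue drained, all files processed

          oldest = min(items, key=lambda item: item["upload_timestamp"])
          obj = s3.get_object(Bucket=oldest["bucket"], Key=oldest["filename"])
          handle_file(obj["Body"].read())

          # Mark the record complete. "status" is a DynamoDB reserved word,
          # hence the expression attribute name.
          table.update_item(
              Key={"filename": oldest["filename"]},
              UpdateExpression="SET #s = :done",
              ExpressionAttributeNames={"#s": "status"},
              ExpressionAttributeValues={":done": "complete"},
          )
      ```

      Something (a DynamoDB Stream, an EventBridge rule, or the tail of process_next itself) would still have to re-invoke process_next until the queue drains, and a real implementation would want a conditional update so two concurrent invocations can't claim the same file.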