AWS Lambda Function & S3 trigger | tutorial - How to copy files from one bucket to another

  • Published: 12 Dec 2024

Comments • 32

  • @krishnakumarmishra5406
    @krishnakumarmishra5406 1 year ago +4

    Hey, this code works 100%, just follow the steps correctly! Thank you!

  • @Tony-bc4wc
    @Tony-bc4wc 1 year ago +2

    It works and is very useful, thanks!

  • @spooki2000
    @spooki2000 10 months ago +2

    Hey, what if I want to move all the files from a folder to another bucket, but I don't want to move the folder as well? I want to move only the files.

    • @WojciechLepczynski
      @WojciechLepczynski  10 months ago +1

      I think I wrote about it on my blog. If you move all the files, the folder will be deleted automatically, but after moving the files you can recreate it if you want.
      You can create a folder with put_object:
      import boto3
      client = boto3.client('s3')
      response = client.put_object(Bucket='test-745df33637-source-lambda-copy', Body='', Key='test-folder/')

    • @spooki2000
      @spooki2000 10 months ago +1

      @@WojciechLepczynski Hey, thanks for replying, it helped out a lot.

    • @WojciechLepczynski
      @WojciechLepczynski  10 months ago +1

      No problem, it's nice that it was useful, and thanks for the feedback.

  • @jecalad
    @jecalad 1 year ago +1

    Hi, thanks for the video; however, for a bucket with a lot of files it doesn't work or times out. What are the limits of Lambda for doing this, and what could be an option to achieve it?

    • @WojciechLepczynski
      @WojciechLepczynski  1 year ago +2

      If you just want to copy files, there are better ways. I used Lambda for a different purpose: when someone adds a file you can, for example, change its size, format or other parameters and send the file to another place after completion.
      Lambda can run for a maximum of 15 minutes; if you want to copy files, it's better to use the CLI and the cp, mv or sync commands: lepczynski.it/en/aws_en/aws-s3-cli-some-tips-for-automation
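
      If you prefer to stay in Python rather than the CLI, here is a rough boto3 sketch of copying a large prefix outside Lambda's 15-minute limit (bucket names and prefix are placeholders):

      import boto3

      s3 = boto3.client('s3')
      paginator = s3.get_paginator('list_objects_v2')

      # Page through the source prefix so buckets with many objects are handled.
      for page in paginator.paginate(Bucket='source-bucket', Prefix='images/'):
          for obj in page.get('Contents', []):
              # copy() uses managed (multipart) transfers, so large files work too.
              s3.copy({'Bucket': 'source-bucket', 'Key': obj['Key']},
                      'destination-bucket', obj['Key'])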

  • @theinstigatorr
    @theinstigatorr 3 days ago +1

    Hi, it worked, but I have a question. When I create a folder called images and upload the files to the source bucket, the files appear in the destination bucket; however, the images folder disappears from the source bucket once this completes. Was it designed this way? If not, what modifications do I need to make to preserve the images folder in the source bucket?

    • @WojciechLepczynski
      @WojciechLepczynski  3 days ago +1

      I think I wrote about it on my blog and somewhere in the comments. If you move all the files, the folder will be deleted automatically, but after moving the files you can recreate it if you want.
      You can create a folder with put_object:
      import boto3
      client = boto3.client('s3')
      response = client.put_object(Bucket='test-745df33637-source-lambda-copy', Body='', Key='test-folder/')

  • @kaviyageethaM
    @kaviyageethaM 1 year ago +2

    Can I apply the same if the S3 buckets are in different AWS accounts?

    • @WojciechLepczynski
      @WojciechLepczynski  1 year ago +1

      You can add the appropriate permissions and call the function from another account.
      It is important that S3 and the Lambda function are in the same region.
      repost.aws/knowledge-center/lambda-s3-cross-account-function-invoke
      Another way: you can also create an additional function that calls other functions, e.g.
      S3 (Account A) --> Lambda (Account A) --> Lambda (Account B)
      ruclips.net/video/Qrm84k9vRXg/видео.html
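
      A rough boto3 sketch of granting the cross-account invoke permission (account ID, bucket and function names are placeholders; the Lambda execution role and the bucket policy still need matching S3 permissions):

      import boto3

      lambda_client = boto3.client('lambda')

      # Allow S3 in account A (111122223333) to invoke the function that lives in account B.
      lambda_client.add_permission(
          FunctionName='copy-files-function',
          StatementId='AllowS3CrossAccountInvoke',
          Action='lambda:InvokeFunction',
          Principal='s3.amazonaws.com',
          SourceArn='arn:aws:s3:::source-bucket-in-account-a',
          SourceAccount='111122223333',
      )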

    • @kaviyageethaM
      @kaviyageethaM 1 year ago +1

      @@WojciechLepczynski Thank you so much, I'll try and come back.

    • @WojciechLepczynski
      @WojciechLepczynski  1 year ago +1

      No problem, thanks

  • @arturpater9947
    @arturpater9947 1 year ago +1

    Hey, how should I modify the code and trigger if I want the Lambda function to be invoked every time I insert a file into one of the subfolders? So in your example, let's say under 'images/' there were 3 more subfolders and I want this function to work when a file is inserted into one of them. I tried changing the prefix to 'images/*' and the same for the trigger, but it's not working. I'd appreciate some help with that.

    • @WojciechLepczynski
      @WojciechLepczynski  1 year ago +1

      Calling the function depends on the S3 trigger; you only change the Prefix from 'images/' to 'images/subfolder1/' (3:30 in the video). Thanks to this, the function will be launched only when you add something to the 'subfolder1' subfolder in images.
      You'll probably have to create a new trigger because AWS won't let you change the current one. Once you create the trigger, wait a few minutes and then test it.
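
      If you want to set that trigger prefix from code instead of the console, a rough boto3 sketch could look like this (the function ARN is a placeholder):

      import boto3

      s3 = boto3.client('s3')

      # Invoke the Lambda function only for objects created under images/subfolder1/.
      # Note: this call replaces the bucket's existing notification configuration.
      s3.put_bucket_notification_configuration(
          Bucket='test-745df33637-source-lambda-copy',
          NotificationConfiguration={
              'LambdaFunctionConfigurations': [{
                  'LambdaFunctionArn': 'arn:aws:lambda:us-east-1:111122223333:function:copy-files-function',
                  'Events': ['s3:ObjectCreated:*'],
                  'Filter': {'Key': {'FilterRules': [
                      {'Name': 'prefix', 'Value': 'images/subfolder1/'}
                  ]}}
              }]
          }
      )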

    • @arturpater9947
      @arturpater9947 1 year ago +1

      @@WojciechLepczynski Got it. The thing is, I'm uploading data from Kinesis to S3 on a daily basis and each day it creates another subfolder. So in my case 'images' is really '2023', then I have the subfolder '11', and under that '6' for yesterday, '7' for today, etc. So I'd like to copy files from one bucket to another when, for example, tomorrow it inserts some files into the newly created subfolder '8'. I mean, I don't want to specify subfolder '8' as a source/trigger, but I want it to also work for the day after tomorrow, etc. Is it possible to achieve something like this with this function?

    • @WojciechLepczynski
      @WojciechLepczynski  1 year ago +1

      @arturpater9947 It is possible. If I understood correctly, just specify the main folder, e.g. '2023', as the prefix. It should also work for all files and subfolders in it.
      For me, the trigger is the 'images' folder, and if I create several subfolders in it and create a file, it also calls the function.
      You can also use CloudWatch Events as a trigger. You can specify that you move data from a folder and all its subfolders to a different location every day at a specific time. It all depends on what you want to achieve.
      Of course, I recommend testing first. It may also turn out that you will need to improve the code in the Lambda function.
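
      A rough boto3 sketch of the CloudWatch Events / EventBridge schedule idea (rule name, schedule and function ARN are placeholders; the function also needs an events.amazonaws.com invoke permission):

      import boto3

      events = boto3.client('events')
      function_arn = 'arn:aws:lambda:us-east-1:111122223333:function:copy-files-function'

      # Run the copy function every day at 02:00 UTC instead of reacting to each S3 upload.
      events.put_rule(
          Name='daily-s3-copy',
          ScheduleExpression='cron(0 2 * * ? *)',
          State='ENABLED',
      )
      events.put_targets(
          Rule='daily-s3-copy',
          Targets=[{'Id': 'copy-files-lambda', 'Arn': function_arn}],
      )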

  • @ankitranjan88
    @ankitranjan88 1 year ago +2

    Thanks

  • @manojmac1901
    @manojmac1901 2 years ago +1

    Can similar logic be applied to copy a file from local HDFS to S3?

    • @WojciechLepczynski
      @WojciechLepczynski  2 years ago +3

      Check this: aws.amazon.com/blogs/storage/using-aws-datasync-to-move-data-from-hadoop-to-amazon-s3/

  • @gopikrishnachodapaneedi850
    @gopikrishnachodapaneedi850 2 years ago +1

    I tried the exact steps you did, but it is not replicated in my S3 bucket.

    • @WojciechLepczynski
      @WojciechLepczynski  2 years ago +1

      It should work; check whether you haven't missed something. On my blog I have described it step by step. What error are you getting? What is not working?

    • @gopikrishnachodapaneedi850
      @gopikrishnachodapaneedi850 2 years ago

      @@WojciechLepczynski Actually I am not getting any error. I took your policy, modified it with my S3 ARNs, and changed the source and destination bucket names in the Lambda code. I also created a trigger on the source bucket. Everything is fine, but I am not getting a result in the destination bucket after uploading a file to the image/ folder.

    • @gopikrishnachodapaneedi850
      @gopikrishnachodapaneedi850 2 years ago +2

      @@WojciechLepczynski It is working, I forgot to run the test. Thanks for that. Nice blog, very useful and interesting, keep going. This task is for moving; what if I want to take a copy of the same bucket to another bucket as a backup? What changes do I need to make in the Lambda function code?

    • @WojciechLepczynski
      @WojciechLepczynski  2 years ago +3

      The function deletes the files on this line: s3.Object(bucket.name, obj.key).delete() - remove it if you only want to copy.
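
      A rough sketch of the handler copying instead of moving, with the delete line removed (the destination bucket name is a placeholder and the prefix follows the video's 'images/' example):

      import boto3

      s3 = boto3.resource('s3')

      def lambda_handler(event, context):
          source = s3.Bucket('test-745df33637-source-lambda-copy')
          for obj in source.objects.filter(Prefix='images/'):
              copy_source = {'Bucket': source.name, 'Key': obj.key}
              # Copy to the destination bucket; without the delete() call the
              # source object stays in place, so this is a backup, not a move.
              s3.meta.client.copy(copy_source, 'test-745df33637-destination-lambda-copy', obj.key)
          return 'Files copied'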

    • @carlosperal5163
      @carlosperal5163 2 years ago +1

      @@gopikrishnachodapaneedi850 Hello, the same happened to me; what do you mean by 'run test'?

  • @RasikaVenkadesh
    @RasikaVenkadesh 1 year ago

    I tried too, but it is not reflected in the destination bucket.

    • @WojciechLepczynski
      @WojciechLepczynski  1 year ago +2

      You can check the logs first and make sure you have the right permissions.
      You can copy the function code from my blog. On the blog you will also find a detailed description. Regards