AWS Lambda function | Copy files from one S3 bucket to another S3 bucket as soon as uploaded

  • Published: 12 Dec 2024

Comments • 67

  • @bryantTheFatBadger · 4 years ago · +1

    This is literally the best solution walk-thru I have watched on YT. Clear, instructive, and it actually works. You are a hero!

  • @renukasrivastava1167 · 10 months ago

    Thank you for such a simple and good explanation.

  • @MrElsocio · 3 years ago

    This is pretty awesome, thanks! We can also do the same with CRR or SRR within S3, but it's a very helpful video for understanding Lambda. Thanks again :)

  • @eminedogan3125 · 2 years ago

    Great video, thank you for the clear explanation!

  • @PawanKumar-gl4yw · 2 years ago

    Great explanation. I would like to know: when Lambda copies data from the source bucket to the target bucket, where does it store the data? And if the data is, say, 1 TB, how would Lambda work?

  • @knandi73 · 2 years ago · +1

    There is a CSV file on the local computer, and it has been uploaded to AWS S3.
    If any changes are made to that CSV file on the local computer, the changes should be reflected in AWS S3 automatically using an AWS Lambda function.
    What are the steps to achieve this?

  • @multitaskprueba1 · 2 years ago

    Fantastic video! Thank you! You are a genius!

  • @sunnysandeep202 · 4 years ago · +1

    Great article. It helped me a lot.

  • @AshokSharma-yv6mw · 2 years ago

    Good tutorial. However, the first 15 lines of Python/Boto3 code for the Lambda trigger are not readable. Please share.

  • @kalyanijagtap4448 · 3 years ago

    Great video, sir. You have explained it very well.

  • @AjishPrabhakar · 1 year ago

    But this can easily fail for large file uploads, e.g. a file size over 500 GB: the Lambda runtime execution timeout will be hit.

  • @NamasteErwin · 1 year ago

    Hi, that's great info, and thanks for the tutorial. I have a question that, if answered, would solve my problem: I have a custom app integrated with AWS EventBridge, and I want events to be targeted outside of AWS; one target we are using is Google Cloud Storage. Would a similar Python script solve my problem?

  • @Videos-rj1ek · 2 years ago

    Can we see the log of this copy event (Lambda copying)? You put in print statements; do they publish to CloudWatch?

  • @shivamgarg4958 · 1 year ago

    Can you create a Lambda function to compress images using Python?

  • @poppadoesitpropa · 3 years ago

    Great demo. Any chance there is an AWS Lambda to copy from S3 to FSx for Windows?

  • @hardikmaghrola · 2 years ago

    Create an AWS Lambda function to count the number of words in a text file. The general requirements are as follows (a sketch follows below):
    Use the AWS Management Console to develop a Lambda function in Python and to create its required resources.
    Report the word count in an email using an Amazon Simple Notification Service (SNS) topic. Optionally, also send the result in an SMS (text) message.
    Format the response message as follows:
    The word count in the textFileName file is nnn.
    Replace textFileName with the name of the file.
    Specify the email subject line as: Word Count Result
    Automatically trigger the function when the text file is uploaded to an Amazon S3 bucket.
    Test the function by uploading several text files with different word counts to the S3 bucket.
    Forward the email produced by one of your tests to your instructor along with a screenshot of your Lambda function.
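
    A minimal sketch of such a word-count function, reusing the S3-trigger pattern from the video's copy code; the SNS topic ARN and region are hypothetical placeholders:

    import boto3
    import urllib.parse

    s3 = boto3.client('s3')
    sns = boto3.client('sns')
    TOPIC_ARN = 'arn:aws:sns:us-east-1:123456789012:word-count-topic'  # hypothetical

    def lambda_handler(event, context):
        bucket = event['Records'][0]['s3']['bucket']['name']
        key = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'])
        # Read the uploaded text file and count whitespace-separated words
        body = s3.get_object(Bucket=bucket, Key=key)['Body'].read().decode('utf-8')
        message = "The word count in the {} file is {}.".format(key, len(body.split()))
        # Subscribers of the topic (email and, optionally, SMS) receive the result
        sns.publish(TopicArn=TOPIC_ARN, Subject='Word Count Result', Message=message)
        return message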

  • @franklinbulmez4989 · 1 year ago

    How can I copy only the file itself, and not the whole prefix where it resides?
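
    A one-line tweak in the video's handler would do this; assuming the same variable names as the video's code, keep only the final path component as the destination key:

    # Drop the prefix, keep only the file name
    target_key = object_key.split('/')[-1]
    s3.copy_object(Bucket=target_bucket, Key=target_key, CopySource=copy_source)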

  • @baluchittela3016 · 2 years ago

    Thanks for your clear explanation. I followed your steps as you said, but I am getting errors while running the Lambda function. Could you please help me ASAP?
    Error:
    {
      "errorMessage": "module 'urllib' has no attribute 'unquote_plus'",
      "errorType": "AttributeError",
      "requestId": "0cb2294e-a023-4ab2-8395-05f70689e10f",
      "stackTrace": [
        "  File \"/var/task/lambda_function.py\", line 17, in lambda_handler\n    object_key = urllib.unquote_plus(event['Records'][0]['s3']['object']['key'])\n"
      ]
    }
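
    For reference: that traceback comes from calling the Python 2 urllib API on a Python 3 runtime. In Python 3 the function lives in urllib.parse, so line 17 would become:

    import urllib.parse
    object_key = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'])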

  • @balajikubendran9120 · 4 years ago · +1

    Hi @Prabhakar,
    I need to unzip a zip file in the sub-bucket. Is it possible to extract the zip file into its sub-bucket? Can you please advise?

  • @shreyashmakadia8951 · 3 years ago

    Error when creating the trigger: "Unable to validate the following destination configurations".

  • @abelrozario2757 · 3 years ago

    Thank you 👍🏻. Can we do a similar copy using a different AWS account for the input S3 bucket?

  • @mohammadanas6755 · 1 year ago

    Sir, I want to transfer a file from one AWS S3 bucket to a different AWS S3 bucket using a bash script.

  • @kudlamolka1429 · 2 years ago

    Is the source_bucket name obtained from the trigger?

  • @mejiger · 2 years ago · +1

    Nice one, but Python 2.7 is not supported on AWS anymore, and the code is not working for me on Python 3+.

    • @carlosperal5163 · 2 years ago

      Same

    • @TonySpark-er2hj · 1 year ago

      @@carlosperal5163
      import boto3
      import urllib.parse

      """Code snippet for copying objects from an AWS source S3 bucket to a target S3 bucket as soon as objects are uploaded to the source S3 bucket
      @author: Prabhakar G
      """

      print("*" * 80)
      print("Initializing..")
      print("*" * 80)

      s3 = boto3.client('s3')

      def lambda_handler(event, context):
          source_bucket = event['Records'][0]['s3']['bucket']['name']
          # Object keys arrive URL-encoded in the event; decode them first
          object_key = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'])
          target_bucket = 'techhub-output-data-andy'
          copy_source = {'Bucket': source_bucket, 'Key': object_key}
          print("Source bucket : ", source_bucket)
          print("Target bucket : ", target_bucket)
          print("Log Stream name: ", context.log_stream_name)
          print("Log Group name: ", context.log_group_name)
          print("Request ID: ", context.aws_request_id)
          print("Mem. limits(MB): ", context.memory_limit_in_mb)
          try:
              print("Using waiter to wait for object to persist through the s3 service")
              waiter = s3.get_waiter('object_exists')
              waiter.wait(Bucket=source_bucket, Key=object_key)
              s3.copy_object(Bucket=target_bucket, Key=object_key, CopySource=copy_source)
              return 'Successfully copied ' + object_key
          except Exception as err:
              print("Error - " + str(err))
              raise
      This works for me with the newer version of Python (well, 3.7 anyway) :) cheers

  • @satyamKumar-mr5gc · 1 year ago

    How can I run this program for a long time?

  • @jatin_khera · 3 years ago

    I have one doubt: if the bucket contains multiple objects and a file in one particular folder is overwritten, will that be reflected in the new bucket as well?

  • @kudaykumar1261 · 3 years ago

    Thank you so much, sir... it really works.

  • @shreerangaraju1013 · 2 years ago

    Where's the event JSON for this?

  • @lakshmisharon5756 · 2 years ago

    I tried the code but it's not working for me, and I don't know why. It's not getting copied to the target bucket.
    Can anyone help?

  • @dianaan2080 · 3 years ago

    I am getting KeyError: 'Records'. What to do?

  • @vishwarajgupta1963 · 3 years ago

    Hi sir, do you teach as well? I am looking for Lambda coaching.

  • @sunnysandeep202 · 4 years ago

    Sir, I want to insert data into and fetch data from PostgreSQL through C# using Lambda. Kindly help me here.

  • @dodokwak · 3 years ago

    Thank you. I also use two buckets (a destination and an incoming one) plus a Lambda function for resizing images.
    On the server side I use django-storages and its AWS_S3_CUSTOM_DOMAIN constant, which points to the bucket with resized images. The buckets and their objects have public access. Everything works almost well, but I've got a strange bug: a 404 error when getting an image for the first time, which turns into 200 OK after a refresh. Has anybody had the same issue?

  • @ramyahello · 4 years ago

    Good video. Please upload a video on what to do if you want to add prefix and suffix filters, e.g. TXT files to one bucket and JPG files to another (see the sketch below).
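
    One way to do that routing inside the handler (a sketch, not from the video; both target bucket names are hypothetical). S3 event notifications also support prefix/suffix filters natively, so two separate triggers would work as well:

    import urllib.parse

    def pick_target_bucket(raw_key):
        # Route by file extension: .txt to one bucket, .jpg to another
        key = urllib.parse.unquote_plus(raw_key).lower()
        if key.endswith('.txt'):
            return 'techhub-output-text'    # hypothetical
        if key.endswith('.jpg'):
            return 'techhub-output-images'  # hypothetical
        return None  # ignore other extensions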

  • @shivagyaneshwar1106 · 2 years ago · +1

    Getting this error:
    {
      "errorMessage": "'Records'",
      "errorType": "KeyError",
      "stackTrace": [
        "  File \"/var/task/lambda_function.py\", line 17, in lambda_handler\n    source_bucket = event['Records'][0]['s3']['bucket']['name']\n"
      ]
    }

  • @abhishekroxz · 2 years ago

    What if the data size is huge, say 10 TB? Can we transfer the entire data set within 15 minutes?

    • @technologyhub1503 · 2 years ago

      The Lambda example above is meant to demonstrate Lambda's ability to perform operations on an S3 bucket.
      For huge data, around 10 TB+, we can perform the transfer between buckets using one of the following options (see the sketch below):
      1. Cross-region replication or same-region replication
      2. S3 Batch Operations
      3. S3DistCp with Amazon EMR
      4. AWS DataSync
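
      For option 1, a sketch of enabling replication with boto3 (versioning must already be enabled on both buckets; the bucket names and role ARN are hypothetical):

      import boto3

      s3 = boto3.client('s3')
      s3.put_bucket_replication(
          Bucket='techhub-input-data',  # hypothetical source bucket
          ReplicationConfiguration={
              'Role': 'arn:aws:iam::123456789012:role/s3-replication-role',  # hypothetical
              'Rules': [{
                  'ID': 'replicate-everything',
                  'Status': 'Enabled',
                  'Priority': 1,
                  'Filter': {},  # empty filter = replicate the whole bucket
                  'DeleteMarkerReplication': {'Status': 'Disabled'},
                  'Destination': {'Bucket': 'arn:aws:s3:::techhub-output-data'},
              }],
          },
      )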

  • @syedahmadzada3166 · 2 years ago

    Your video is really helpful, but the code keeps giving me an issue on line 16.

  • @tokunbokazeem1299 · 4 years ago

    Great video. Can you upload the text to Secrets Manager instead of another S3 bucket?

  • @eladlevi47 · 2 years ago

    Does someone have a walkthrough for the same procedure but with Python 3.x?

    • @devashree8884 · 2 years ago

      import boto3
      import urllib.parse

      print("*" * 80)
      print("Initializing..")
      print("*" * 80)

      s3 = boto3.client('s3')

      def lambda_handler(event, context):
          source_bucket = event['Records'][0]['s3']['bucket']['name']
          # Object keys arrive URL-encoded in the event; decode them first
          object_key = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'])
          target_bucket = 'name-of-your-target-bucket'
          copy_source = {'Bucket': source_bucket, 'Key': object_key}
          print("Source bucket : ", source_bucket)
          print("Target bucket : ", target_bucket)
          print("Log Stream name: ", context.log_stream_name)
          print("Log Group name: ", context.log_group_name)
          print("Request ID: ", context.aws_request_id)
          print("Mem. limits(MB): ", context.memory_limit_in_mb)
          try:
              print("Using waiter to wait for object to persist through the s3 service")
              waiter = s3.get_waiter('object_exists')
              waiter.wait(Bucket=source_bucket, Key=object_key)
              s3.copy_object(Bucket=target_bucket, Key=object_key, CopySource=copy_source)
              return 'Successfully copied files'
          except Exception as err:
              print("Error - " + str(err))
              raise

  • @PraveenKumar-ic5zo · 1 year ago

    Nice Video.

  • @gridofmemories · 4 years ago

    I followed the video, but on uploading to my source bucket the file is not copied to the target bucket.

    • @davidcloes9048 · 3 years ago

      My uploaded file was also not copied to the target bucket. I had inadvertently not attached the AmazonS3FullAccess policy to the role I had created. I only noticed because I had also neglected to add the AWSLambdaBasicExecutionRole policy to the role, so monitoring wasn't working either. I attached them both and voilà! The file was copied to the second bucket.
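
      For anyone hitting the same thing, attaching those two AWS managed policies with boto3 would look like this (the role name is a hypothetical placeholder; the policy ARNs are the standard managed ones):

      import boto3

      iam = boto3.client('iam')
      role_name = 'lambda-s3-copy-role'  # hypothetical: your Lambda execution role

      # S3 access so the function can read and copy objects
      iam.attach_role_policy(
          RoleName=role_name,
          PolicyArn='arn:aws:iam::aws:policy/AmazonS3FullAccess',
      )
      # CloudWatch Logs access so the function's print output is captured
      iam.attach_role_policy(
          RoleName=role_name,
          PolicyArn='arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole',
      )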

    • @sekmer009 · 3 years ago

      @@davidcloes9048 I followed all the steps, but it still didn't copy to the destination. Can you help, please? Is there anything to set in permissions or to enable on the S3 bucket? One more observation: I don't see the Enable checkbox while creating the trigger. Won't this work on a basic AWS user login?

    • @RaamVersion2O · 3 years ago · +1

      You should keep the runtime at Python 2.7; only then will it work.

    • @dianaan2080 · 3 years ago · +2

      I am getting KeyError: 'Records'

  • @nithinbhandari3075 · 3 years ago

    Nice Video.
    Thanks.

  • @bikramchandradas4120 · 4 years ago

    Can anybody help me create a website? In the website dashboard I have to put AWS start/stop options, to give users the ability to use their own VPS server.

  • @sunnysandeep202 · 4 years ago

    Can I delete an object from the destination bucket as soon as the object with the same name is deleted from the source bucket, using a Lambda function? If yes, how can I do that?

    • @technologyhub1503 · 4 years ago · +2

      Yes, we can tweak the Lambda function code as per our requirement. We can delete an object, copy an object, use the copied object's data to insert into MySQL, PostgreSQL, or DynamoDB, and we can also use the data for an Alexa training data set, etc.
      If we want to delete an object from the destination bucket as soon as the source bucket object with that name is deleted:
      1. Apply the Lambda function on the source bucket with a DELETE event.
      2. As soon as you delete a file from the source bucket, first cross-verify whether the same object/file already exists in the destination bucket, then run a code snippet to delete the object from the destination bucket (a sketch follows below):
      s3.delete_object(Bucket=bucket, Key=destination_object_key)
      Please let me know if you need any help on the same.
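
      A sketch of such a delete handler along the lines of the reply above (the trigger would be an s3:ObjectRemoved event on the source bucket; the target bucket name is hypothetical):

      import boto3
      import urllib.parse
      from botocore.exceptions import ClientError

      s3 = boto3.client('s3')
      TARGET_BUCKET = 'techhub-output-data'  # hypothetical destination bucket

      def lambda_handler(event, context):
          # Fired by an ObjectRemoved event on the source bucket
          object_key = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'])
          # Cross-verify the object exists in the destination before deleting
          try:
              s3.head_object(Bucket=TARGET_BUCKET, Key=object_key)
          except ClientError:
              return 'Nothing to delete: ' + object_key + ' not found in ' + TARGET_BUCKET
          s3.delete_object(Bucket=TARGET_BUCKET, Key=object_key)
          return 'Deleted ' + object_key + ' from ' + TARGET_BUCKET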

  • @divyanshjha7672 · 1 year ago

    It just didn't work :(

  • @whathowwhywhenandhere9168 · 3 years ago · +3

    While running the above code I am getting this error; please, somebody help.
    Response:
    {
      "errorMessage": "'Records'",
      "errorType": "KeyError",
      "stackTrace": [
        "  File \"/var/task/lambda_function.py\", line 17, in lambda_handler\n    source_bucket = event['Records'][0]['s3']['bucket']['name']\n"
      ]
    }

    • @jaimearielchitaybautista6719 · 3 years ago

      I have the same error

    • @saipranav3153 · 3 years ago

      Upload a file to S3 and let the trigger execute the function.
      If I am not wrong, I guess you have used a test event to execute the Lambda (see the sample event below).
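
      For reference, a minimal test event shaped like a real S3 notification, so the handler's event['Records'][0][...] lookups succeed when testing by hand (bucket name and key are hypothetical):

      # Paste an event of this shape into the Lambda console's test dialog
      test_event = {
          "Records": [{
              "s3": {
                  "bucket": {"name": "techhub-input-data"},  # hypothetical
                  "object": {"key": "example%20file.txt"},   # keys arrive URL-encoded
              }
          }]
      }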

    • @shaikfarheen8906 · 2 years ago

      I am getting the same thing... please give me a solution.

    • @smdmatheen2245 · 2 years ago

      Did anyone fix the error?

  • @krishnamurali8522 · 4 years ago

    Super

  • @gandheshiva8943 · 4 years ago

    Thank you

  • @abhilashak1628 · 2 years ago

    Hi, the solution did not work for me. Can you help me? Can you share your mail ID so that I can share the error details?