AWS S3 File Upload + Lambda Trigger (Tutorial In Python) | Step by Step Guide

  • Published: 11 Aug 2019
  • S3 is an easy-to-use, all-purpose data store. Frequently we use it to dump large amounts of data for later analysis. Using S3 Put events with Lambda, we can shift this process to an event-driven architecture. In this video, I show you how to set up an S3 Put event trigger that invokes a Lambda function every time a file is added to your S3 bucket. This allows for event-driven processing and other neat applications!
    Looking to get hands-on experience building on AWS with a REAL project? Check out my course - The AWS Learning Accelerator! courses.beabetterdev.com/cour...
    00:12 Example Overview
    01:07 Creating a Lambda Function
    02:40 Lambda Code Walkthrough
    03:49 Creating the S3 Trigger
    06:00 Validating the S3 Trigger
    06:32 Testing with a File Upload
    06:56 Validating the Lambda Invocation
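
    For reference, a minimal sketch of the kind of handler built in the video, assuming the Python 3 runtime and the standard S3 Put event shape (bucket and key names below are placeholders):

    import json
    import urllib.parse
    import boto3

    s3 = boto3.client('s3')  # created once, reused across warm invocations

    def lambda_handler(event, context):
        # Each S3 Put notification delivers one or more Records describing the upload
        record = event['Records'][0]
        bucket = record['s3']['bucket']['name']
        key = urllib.parse.unquote_plus(record['s3']['object']['key'])

        # Fetch and deserialize the uploaded JSON file
        body = s3.get_object(Bucket=bucket, Key=key)['Body'].read().decode()
        data = json.loads(body)
        print(f"New object s3://{bucket}/{key}: {data}")
        return 'Success!'
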
    🎉SUPPORT BE A BETTER DEV🎉
    Become a Patron: / beabetterdev
    📚 MY RECOMMENDED READING LIST FOR SOFTWARE DEVELOPERS📚
    Clean Code - amzn.to/37T7xdP
    Clean Architecture - amzn.to/3sCEGCe
    Head First Design Patterns - amzn.to/37WXAMy
    Domain-Driven Design - amzn.to/3aWSW2W
    Code Complete - amzn.to/3ksQDrB
    The Pragmatic Programmer - amzn.to/3uH4kaQ
    Algorithms - amzn.to/3syvyP5
    Working Effectively with Legacy Code - amzn.to/3kvMza7
    Refactoring - amzn.to/3r6FQ8U
    🎙 MY RECORDING EQUIPMENT 🎙
    Shure SM58 Microphone - amzn.to/3r5Hrf9
    Behringer UM2 Audio Interface - amzn.to/2MuEllM
    XLR Cable - amzn.to/3uGyZFx
    Acoustic Sound Absorbing Foam Panels - amzn.to/3ktIrY6
    Desk Microphone Mount - amzn.to/3qXMVIO
    Logitech C920s Webcam - amzn.to/303zGu9
    Fujifilm XS10 Camera - amzn.to/3uGa30E
    Fujifilm XF 35mm F2 Lens - amzn.to/3rentPe
    Neewer 2 Piece Studio Lights - amzn.to/3uyoa8p
    💻 MY DESKTOP EQUIPMENT 💻
    Dell 34 inch Ultrawide Monitor - amzn.to/2NJwph6
    Autonomous ErgoChair 2 - bit.ly/2YzomEm
    Autonomous SmartDesk 2 Standing Desk - bit.ly/2YzomEm
    MX Master 3 Productivity Mouse - amzn.to/3aYwKVZ
    Das Keyboard Prime 13 MX Brown Mechanical - amzn.to/3uH6VBF
    Veikk A15 Drawing Tablet - amzn.to/3uBRWsN
    ☁Topics covered include:
    - S3 Bucket Creation
    - S3 File Upload
    - S3 Put Event
    - S3 Event Subscription to Lambda
    - CloudWatch Logs
    🌎 Find me here:
    Twitter - / beabetterdevv
    Instagram - / beabetterdevv
    Patreon - Donations help fund additional content - / beabetterdev
    #S3
    #Lambda
    #AWS
    #Serverless

Comments • 158

  • @BeABetterDev
    @BeABetterDev  2 years ago +2

    Looking to become an expert on AWS Lambda? Check out my new course: AWS Lambda - A Practical Guide
    www.udemy.com/course/aws-lambda-a-practical-guide/?referralCode=F6D1A50467E579C65372

  • @w0lfmanf14
    @w0lfmanf14 5 years ago +17

    Long awaited real world use case and simplification of AWS solutions / products.
    Kudos and keep it up.

    • @BeABetterDev
      @BeABetterDev  5 years ago +2

      Thanks for the support. If you have any topics you'd like me to cover please let me know!

  • @2112jonr
    @2112jonr 3 years ago +2

    Contains ALL of the details needed, yet concise and clear. Great tutorial for anyone new to AWS, thank you! :-)

  • @arjunrajapppan4768
    @arjunrajapppan4768 2 years ago

    This video really helps. Explanation and demo are so precise. Please keep doing this great work.

  • @claytonnighthawk7837
    @claytonnighthawk7837 3 years ago +1

    Your videos on Step Functions and Lambdas have been incredibly helpful in designing a new serverless billing ingestion process for work. Thank you so much!

    • @BeABetterDev
      @BeABetterDev  3 years ago

      You're very welcome Clayton! Super glad this was helpful :)

  • @BenMatern
    @BenMatern 4 years ago +5

    Short and sweet. This is the video I've been looking for, thanks heaps.

  • @josephblessingh2384
    @josephblessingh2384 3 years ago

    Such a simple explanation!! Thank you for your effort! At least 18k viewers were benefitted by your video!

  • @matthewyong492
    @matthewyong492 2 years ago +6

    I still reference old content and WOW, the explanations and video quality have come a long way in just two years.

    • @BeABetterDev
      @BeABetterDev  2 years ago +1

      Sometimes I look back on my old videos and think "what was I thinking!?". Glad you agree I've come a long way - It's been quite a journey!

  • @tpshundal8855
    @tpshundal8855 4 years ago +8

    I used to be an Azure guy but just recently switched to AWS, and I found your videos very helpful! Moreover, you do an even better job of keeping the video to specific information and don't drag it out. Good work!! and keep going.

    • @BeABetterDev
      @BeABetterDev  4 years ago +3

      Thank you so much for your support! It's comments like yours that keep me motivated to keep going :)

  • @DataTranslator
    @DataTranslator 1 year ago

    clear, to the point, and useful. Thank you !!

  • @brockobama257
    @brockobama257 3 years ago

    Hurray! I vaguely understand what we did and it worked!

  • @thebengalurugirl
    @thebengalurugirl 3 years ago

    Your videos are really helpful to understand the concepts practically

    • @BeABetterDev
      @BeABetterDev  3 years ago

      Thanks Sweta! Glad you enjoyed :)

  • @flwi
    @flwi 4 years ago

    Thanks for this little tutorial! Didn't know you can filter for specific filetypes on the invocation.

    • @BeABetterDev
      @BeABetterDev  4 years ago

      You're very welcome Florian! Glad you learned something new!
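
      On that file-type filter: the same prefix/suffix filtering the console exposes can also be set programmatically. A minimal boto3 sketch with placeholder names (note this call replaces the bucket's existing notification configuration):

      import boto3

      s3 = boto3.client('s3')

      # Invoke the function only for .json objects uploaded under uploads/
      s3.put_bucket_notification_configuration(
          Bucket='my-example-bucket',  # placeholder
          NotificationConfiguration={
              'LambdaFunctionConfigurations': [{
                  'Id': 'json-uploads-to-lambda',
                  'LambdaFunctionArn': 'arn:aws:lambda:us-east-1:123456789012:function:my-function',  # placeholder
                  'Events': ['s3:ObjectCreated:Put'],
                  'Filter': {'Key': {'FilterRules': [
                      {'Name': 'prefix', 'Value': 'uploads/'},
                      {'Name': 'suffix', 'Value': '.json'},
                  ]}},
              }]
          },
      )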

  • @PawanMusing
    @PawanMusing 4 years ago +1

    Really awesome explanation with simple example. Keep coming.....

  • @AbdurrahmanKocukcu
    @AbdurrahmanKocukcu 3 years ago +1

    Very lean and clean :) Thanks

  • @raj-nq8ke
    @raj-nq8ke 1 year ago

    Great tutorial

  • @richardlanglois5183
    @richardlanglois5183 3 years ago +1

    Great presentation!

  • @shazinfy
    @shazinfy 2 years ago +1

    Thanks for precise video.

  • @rashmuify
    @rashmuify 2 years ago

    Thanks for the video, I follow your videos as they are very nicely explained. Request you to please upload a video on how AWS sagemaker works?

  • @clearthinking5441
    @clearthinking5441 3 years ago +1

    Very well explained.

  • @avinashgodvin9900
    @avinashgodvin9900 4 years ago +1

    awesome video . very informative

  • @merinmaria99
    @merinmaria99 4 years ago +1

    Brilliant. Do you have a video that shows SNS Topic event upload to S3?

  • @tigour99
    @tigour99 4 years ago +1

    Thanks a lot, very precise. Any info on what that code would be for Node?

  • @MohdGaus-dg9ii
    @MohdGaus-dg9ii 2 years ago +1

    Saved my job 💕

  • @havokbaphomet666
    @havokbaphomet666 3 years ago +1

    Cheers from Brazil!

    • @BeABetterDev
      @BeABetterDev  3 years ago

      Thanks for watching Payador :) Stay safe!

  • @sahilarora5731
    @sahilarora5731 3 years ago

    Hi,
    can you please let me know if there is any way to invoke or call AWS Appflow from lambda?

  • @timmyzheng6049
    @timmyzheng6049 2 years ago

    Thank you, this is very helpful. Quick question: suppose I use Step Functions to run a few Lambda functions in a batch style, started by a CloudWatch schedule event, but I also need these Lambda functions to wait until particular S3 objects are uploaded before they run. How does the S3 trigger work here? Will the Lambda functions wait until they get S3 events?

  • @vishalchavhan6731
    @vishalchavhan6731 4 years ago

    Great job

  • @vishnuvs870
    @vishnuvs870 2 years ago

    Hi sir, thanks for the tutorial. I finished this tutorial,
    but I have one scenario involving JSON validation: when a JSON file is uploaded to the bucket, the Lambda must validate it. I don't know how to do that.
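
    One stdlib-only way to approach that validation, as a sketch assuming the uploaded file should contain a top-level "transactions" list (adjust the checks to your own schema):

    import json

    def validate_transactions_document(raw_bytes):
        """Return (is_valid, error_message) for an uploaded JSON payload."""
        try:
            data = json.loads(raw_bytes.decode('utf-8'))
        except (UnicodeDecodeError, json.JSONDecodeError) as exc:
            return False, f"Not valid JSON: {exc}"

        if not isinstance(data, dict):
            return False, "Top-level JSON value must be an object"
        transactions = data.get('transactions')
        if not isinstance(transactions, list):
            return False, "Missing or non-list 'transactions' field"
        for i, record in enumerate(transactions):
            if not isinstance(record, dict) or 'transType' not in record:
                return False, f"Transaction {i} is missing 'transType'"
        return True, ""

    Inside the handler this would be called on response['Body'].read(); on failure the object could, for example, be moved to an errors/ prefix, or the error raised so it shows up in CloudWatch.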

  • @holyproton8855
    @holyproton8855 4 years ago +1

    Thanks boss!

  • @sriadityab4794
    @sriadityab4794 2 years ago +1

    How do we handle it if there are multiple files dropped in S3 at the same time and we need to trigger one Glue job? How should we handle the Lambda here? Any help is appreciated.

  • @PIYUSH61004
    @PIYUSH61004 2 years ago

    Hey! I am facing an issue. My lambda function and S3 bucket both are in us-west-2 region but while creating event notification, I am not able to see my lambda function in the drop down list. What could be the potential issue?

  • @kemosabe5598
    @kemosabe5598 2 years ago +1

    thanks, good content

  • @amjds1341
    @amjds1341 3 years ago

    Duno why people would dislike such a nice video

  • @vishalchavhan6731
    @vishalchavhan6731 4 years ago

    Can you share a way to create a generic python lambda function which will copy data from multiple s3 buckets to corresponding tables in redshift in such a way that my credentials are not visible to anyone

  • @chandanp3693
    @chandanp3693 2 years ago

    Thanks for the step by step guide. I have a question. If we follow this approach, the lambda will trigger every time a new file is created in s3. So, if we get 10 files simultaneously, it's going to invoke lambda 10 times. Is it possible to invoke lambda only once in such cases?

  • @jesicadcruz8656
    @jesicadcruz8656 2 years ago

    This video was helpful in understanding how Events work. I had a doubt- Is it possible to retrieve the file path of the object? How do we write a lambda function for storing the file path to a database that is triggered every time a put event happens in a bucket?

  • @sibamuns
    @sibamuns 2 years ago

    I want to trigger a lambda for a file that’s two layers down in the s3 bucket (I.e s3/folder1/folder2/my-file.txt). How do I configure the lambda to ingest only files put in folder2?

  • @ashwinraghunath8425
    @ashwinraghunath8425 2 years ago

    So I was able to trigger lambda based on S3:ObjectCreated by uploading a file. But what happens when the whole S3 bucket region is down? In this case, the file won't be uploaded to the bucket and won't trigger the lambda. In this case, what do we do to still process the file when the bucket region is down?

  • @leonidasgarcia887
    @leonidasgarcia887 2 years ago

    if i wanted to call that data variable outside the function to look at the data locally, how would that look like?

  • @marbas9215
    @marbas9215 4 years ago

    Thanks for the video! How would you add an email notification to this process when the file is processed to notify that it's processed or if there was some kind of error while processing?

    • @simonrous2191
      @simonrous2191 2 years ago

      maybe invoke SNS and then configure the email there?
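
      Building on that SNS suggestion, a sketch of publishing a notification from the handler, assuming a hypothetical topic ARN that already has a confirmed email subscription:

      import boto3

      sns = boto3.client('sns')
      TOPIC_ARN = 'arn:aws:sns:us-east-1:123456789012:file-processing-alerts'  # placeholder

      def notify(subject, message):
          # Every confirmed subscriber on the topic (e.g. an email address) receives this
          sns.publish(TopicArn=TOPIC_ARN, Subject=subject, Message=message)

      # e.g. inside the handler's try/except:
      #   notify('File processed', f'Processed s3://{bucket}/{key}')
      #   notify('File processing FAILED', f'Error on s3://{bucket}/{key}: {e}')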

  • @vijayyadav1002
    @vijayyadav1002 2 years ago

    i am trying to write an xml file into s3 bucket using lambda but it is timing
    out. Do you know what could be the reason?

  • @juanbenitez6565
    @juanbenitez6565 3 years ago

    Nice video, is it possible to do all that S3 config from a file?

  • @lalitda4011
    @lalitda4011 2 years ago +1

    @Be A Better Dev
    Hi, thanks for the video. I have a question: let's say we have an application that lets users upload a file which goes to S3, and the file is processed by Lambda. In the event that user A and user B upload a file to S3 with a negligible time difference, which file is processed first, the file from user A or user B? So basically, does S3 maintain the order of files passed to Lambda to work on? Please guide. Thanks.

    • @BeABetterDev
      @BeABetterDev  2 years ago

      Hi Lalit,
      Order would not be preserved - hope this clarifies!

  • @dineshkumar-xj7wp
    @dineshkumar-xj7wp 2 years ago

    How do I configure an email alert if the routine daily object is not uploaded to the S3 bucket? Can you help?

  • @arfatbagwan48
    @arfatbagwan48 2 months ago

    Can an S3 event send a notification to Lambda for a GET request? If yes, what is the difference between using an S3 event vs. S3 Object Lambda? Please help me out.

  • @charmikhambhati3014
    @charmikhambhati3014 4 years ago +1

    How to read metadata sent with the file that is uploaded

  • @user-lg3bs4et4o
    @user-lg3bs4et4o 8 months ago

    How can I get the code and materials you use, such as the transaction.json etc.?

  • @prabeshm8056
    @prabeshm8056 3 years ago

    Can i take "screenshot" when the snapshots are created ?

  • @dianaan2080
    @dianaan2080 3 years ago

    My Lambda is triggering but I am getting an error when getting the bucket name (KeyError). What do I have to do?

  • @iammrchetan
    @iammrchetan 2 years ago

    Can I get the complete code?
    I have tried the same Python code in the Lambda function, but it just gives errors in CloudWatch.

  • @TonyFraser
    @TonyFraser 2 years ago +1

    Thanks for this! I quickly fought through a couple of issues with imports and such, but woot this hit the spot! As soon as I got your demo running it was only another two hours for me to get the same running in SAM. Now I just gotta figure out how to CloudFormation that S3 event trigger into a SAM CF template, and voilà!

    • @BeABetterDev
      @BeABetterDev  2 years ago

      Glad I was able to help Tony. Sounds like you got quite a project on your hands!

  • @sanjayplays5010
    @sanjayplays5010 4 years ago +2

    It might be useful to have a video on how to use the Boto3 library. I find the documentation pretty unclear as to which option to use. For example, if I want to write to an S3 bucket, do I use resource, client, or bucket etc?

    • @BeABetterDev
      @BeABetterDev  4 years ago +1

      Great suggestion Sanjay, thank you!
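
      On the client-vs-resource question: both can write to a bucket; the resource API is a higher-level, object-oriented wrapper over the client. A quick sketch with placeholder names:

      import boto3

      # Low-level client: mirrors the S3 API call by call
      s3_client = boto3.client('s3')
      s3_client.put_object(Bucket='my-example-bucket', Key='report.json', Body=b'{"ok": true}')

      # Higher-level resource: same operation through an object wrapper
      s3_resource = boto3.resource('s3')
      s3_resource.Object('my-example-bucket', 'report.json').put(Body=b'{"ok": true}')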

  • @lokeshkarnam6521
    @lokeshkarnam6521 3 years ago

    Unable to find the Python code in the description; if you don't mind, kindly share the code....

  • @LepiaDesu
    @LepiaDesu 3 years ago +1

    I am new to all of this, but is there a way to do all of this explained in the video via CLI?

    • @BeABetterDev
      @BeABetterDev  3 years ago

      Hi Lepia,
      Absolutely you can do all of this via CLI commands. It may be a bit tricky though since you will need to find the correct API calls to make and arguments to provide.
      Thanks for watching and stay safe!
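
      Concretely, the two calls the console performs when you add the trigger are roughly (a) granting S3 permission to invoke the function and (b) writing the bucket notification configuration (see the boto3 sketch further up); the CLI equivalents are aws lambda add-permission and aws s3api put-bucket-notification-configuration. A boto3 sketch of the permission call, with placeholder names:

      import boto3

      lambda_client = boto3.client('lambda')

      # Allow S3 (for this specific bucket) to invoke the function
      lambda_client.add_permission(
          FunctionName='my-function',                  # placeholder
          StatementId='AllowS3Invoke',
          Action='lambda:InvokeFunction',
          Principal='s3.amazonaws.com',
          SourceArn='arn:aws:s3:::my-example-bucket',  # placeholder
      )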

  • @igorgarabajiv8322
    @igorgarabajiv8322 4 years ago

    Both S3-Lambda videos are good. What architecture would you suggest when needing to "listen" to the S3 bucket (on a local computer) and download content when a file is uploaded to S3 from another system?

    • @BeABetterDev
      @BeABetterDev  4 years ago +1

      Hi Igor,
      Couple ways of doing this. The most robust would be S3 Put Notification -> SNS Topic -> Invoke HTTP Endpoint on your local machine. This way, your local application can get invoked by the S3 Put event!

    • @igorgarabajiv8322
      @igorgarabajiv8322 4 years ago

      @@BeABetterDev will give it a try very soon. Thank you!
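
      A sketch of the subscription step for that setup, assuming a hypothetical topic and a publicly reachable HTTPS endpoint (SNS sends a confirmation request that the endpoint must acknowledge before deliveries start):

      import boto3

      sns = boto3.client('sns')

      # Subscribe the local machine's public HTTPS endpoint to the topic S3 publishes to
      sns.subscribe(
          TopicArn='arn:aws:sns:us-east-1:123456789012:s3-upload-events',  # placeholder
          Protocol='https',
          Endpoint='https://my-home-server.example.com/s3-hook',           # placeholder
      )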

  • @anubhavbiswas4142
    @anubhavbiswas4142 3 years ago

    I get YOU ARE NOT ALLOWED TO EDIT THIS SETTINGS PLZ CONTACT UR ADMIN - while I try to save the notification event setup for the bucket... Plz help

  • @kudaykumar1261
    @kudaykumar1261 3 years ago +2

    Everything is fine but the final log event is getting the error: "urllib" not defined.

    • @hannahlivnat9698
      @hannahlivnat9698 3 years ago +1

      Had the same error and just added 'import urllib' to the top!
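
      Since the handler uses urllib.parse.unquote_plus, the safest form of that fix is to import the submodule explicitly:

      import urllib.parse

      def lambda_handler(event, context):
          # Percent-decode the object key delivered in the S3 event
          key = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'], encoding='utf-8')
          print(key)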

  • @deepakkumar-yl6nc
    @deepakkumar-yl6nc 1 year ago

    How do we generate this log in an S3 bucket instead of CloudWatch?

  • @davidespinoza2104
    @davidespinoza2104 4 months ago

    Thank you! I've a question: what happens if I upload the same file again? Is the Lambda executed?

  • @elmirach4706
    @elmirach4706 4 years ago +2

    I started exactly the same but I didn't get CloudWatch and S3 as layers. How to add them?

    • @BeABetterDev
      @BeABetterDev  4 years ago

      Hi elmira,
      Are you certain you selected a role with these permissions during Lambda creation? If you select the 'create a new role' option during setup, it should set up CloudWatch correctly.

    • @elmirach4706
      @elmirach4706 4 years ago +1

      @@BeABetterDev yes I'm sure, although the whole view/console has changed now; the resources/destinations etc. have moved, so it's difficult to orient

  • @ChandraShekar-ts6nu
    @ChandraShekar-ts6nu 3 years ago

    Can you please share a way to create a python lambda function which will copy csv data from S3 bucket to corresponding tables in Postgres

    • @abcdleonix2345
      @abcdleonix2345 2 years ago

      By any chance did you figure out or have the python lambda function

  • @interviewdedo
    @interviewdedo 3 years ago

    can you please share me the json file link

  • @abhishekkumar-ahk
    @abhishekkumar-ahk 3 years ago

    I want to add CloudWatch content to my RDS table after the Lambda is triggered. Is it possible, and if yes, how?

    • @BeABetterDev
      @BeABetterDev  3 years ago

      Hi Abhishek,
      You would need to use one of the cloudwatch APIs to query the data from your lambda function, and write it into your RDS table.
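
      A sketch of the reading half of that, using the CloudWatch Logs API; the log group name is a placeholder, and the RDS write would use whichever database driver you package with the function (e.g. pymysql or psycopg2, which are not bundled in the default runtime):

      import time
      import boto3

      logs = boto3.client('logs')

      def fetch_recent_log_messages(log_group='/aws/lambda/my-function', minutes=15):
          now_ms = int(time.time() * 1000)
          resp = logs.filter_log_events(
              logGroupName=log_group,
              startTime=now_ms - minutes * 60 * 1000,
              endTime=now_ms,
          )
          # Each event carries the raw log line; insert these into the RDS table
          return [e['message'] for e in resp.get('events', [])]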

  • @roshm4484
    @roshm4484 3 years ago

    Hi Bro,
    Can you cover terraform with s3 bucket.
    Thanks in advance.

  • @veelglorie
    @veelglorie 5 years ago +1

    Good video, can you talk about multiple websites using AWS?
    I mean, if I want to serve a coffee bar site and a blog, do I need to use only one CloudFront distribution, or generate multiple ones?
    I would like to see a video talking about serving multiple websites using AWS, serverless preferred.

    • @BeABetterDev
      @BeABetterDev  4 years ago +1

      Hi Lucas, thanks for the question. There are many ways to achieve this. The first thing that comes to mind is for you to use an EC2 Instance that hosts your applications. You can use the machine to host two separate stacks (LAMP, MEAN, or any other really). Here is an example (its a bit overkill in the redundancy category if you ask me), but it should help you get started: aws.amazon.com/getting-started/projects/launch-lamp-web-app/

    • @veelglorie
      @veelglorie 4 years ago

      @@BeABetterDev thanks hey!

  • @JohnJohnson-sf5wl
    @JohnJohnson-sf5wl 2 years ago +1

    I enjoyed your video. Can anyone help? I understood everything but I didn’t get where he edited the codes. Is that a lambda app? I mean the black screen on which he edited the codes. Is that a third party app ? Can anyone help?

    • @BeABetterDev
      @BeABetterDev  2 years ago

      Hi John, the code is a Lambda-based application. Hope this helps clarify.

  • @pavanvarmamanthena9957
    @pavanvarmamanthena9957 3 years ago +1

    can you please do a video on how to trigger a glue etl job when a file is uploaded to s3?

    • @BeABetterDev
      @BeABetterDev  3 years ago +1

      Great idea for a video Pavan, I've added it to my backlog. Cheers.

    • @pavanvarmamanthena9957
      @pavanvarmamanthena9957 3 years ago

      @@BeABetterDev thank you so much
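
      Until that video exists, a hedged sketch of the usual pattern: the S3-triggered Lambda just starts the Glue job and passes the object location as job arguments (job and argument names below are hypothetical):

      import boto3

      glue = boto3.client('glue')

      def lambda_handler(event, context):
          record = event['Records'][0]
          bucket = record['s3']['bucket']['name']
          key = record['s3']['object']['key']

          # Kick off the ETL job for the newly uploaded object
          run = glue.start_job_run(
              JobName='my-etl-job',  # placeholder Glue job name
              Arguments={'--source_bucket': bucket, '--source_key': key},
          )
          print(f"Started Glue job run {run['JobRunId']} for s3://{bucket}/{key}")
          return run['JobRunId']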

  • @manasupadhyay3883
    @manasupadhyay3883 3 years ago

    I am using the same code (and also tried with some other code) and I am getting a KeyError:
    {
        "errorMessage": "'Records'",
        "errorType": "KeyError",
        "stackTrace": [
            [
                "/var/task/lambda_function.py",
                11,
                "lambda_handler",
                "bucket = event['Records'][0]['s3']['bucket']['name']"
            ]
        ]
    }
    Can you help me with any suggestions?

    • @BeABetterDev
      @BeABetterDev  3 years ago

      Hi Manas, make sure you are changing the code to account for whatever your bucket name is.
      Daniel

    • @manasupadhyay3883
      @manasupadhyay3883 3 years ago

      @@BeABetterDev I am changing the code but it's not happening

  • @nisargpatel5718
    @nisargpatel5718 4 years ago

    Hello,
    Could you please make a video on AWS Lambda functions with Firehose?

  • @OsantoR
    @OsantoR 3 years ago +1

    I really like your videos, but do u think u can do tutorials with AWS CDK ?

  • @gwulfwud
    @gwulfwud 3 years ago +1

    you ever done serverless? can you do a tutorial on setting up a local environment with serverless?

  • @oguzynx
    @oguzynx 2 years ago

    where is the code ? where can we download it

  • @valentinoforever
    @valentinoforever 2 years ago

    Nice, however it's missing a fundamental part: getting the response from Lambda into your app. I can't find the way...

  • @geisyalviarez6852
    @geisyalviarez6852 3 years ago +1

    Hello, could you send the code please?

  • @edmundchua2944
    @edmundchua2944 3 years ago

    where is the script? also you edited it, how do you expect a newbie to know what to do? thank U

  • @prishupanday318
    @prishupanday318 4 years ago

    download the object file to local system using lambda function.

    • @BeABetterDev
      @BeABetterDev  4 years ago

      Hi Prishu, thank you for the suggestion. I used it as a basis for my recent video about how to download an S3 file from Lambda. See here: ruclips.net/video/6LvtSmJhVRE/видео.html
      Thanks again

  • @codeinglife
    @codeinglife 1 year ago

    Hello sir, we need the S3 download request count for my website.

  • @johncollera
    @johncollera 2 years ago

    Do you prefer windows or mac?

    • @BeABetterDev
      @BeABetterDev  2 years ago +1

      Hi John,
      These days, I actually prefer Mac. I feel like I run into fewer headaches when developing. Maybe I'm biased though, since I use a Mac at work for most of my development.

  • @techwithraj11
    @techwithraj11 2 years ago

    name 's3' is not defined at line 10 in lambda function
    Anyone else got the same error?

    • @techwithraj11
      @techwithraj11 2 years ago +1

      Got the answer from other videos in the series: you have to add
      import boto3
      s3 = boto3.client('s3')
      at the top of the function.

  • @saikumar255
    @saikumar255 5 months ago

    Try to provide the data set so that we can also implement it all.

  • @billykovalsky8149
    @billykovalsky8149 4 years ago +1

    Doesn't work for me. There's no indication that I put a file in s3 at all. Nothing happens, no logs, nothing

    • @BeABetterDev
      @BeABetterDev  4 years ago

      Hi Billy.
      Are you sure you have precisely followed the instructions in the video including setting the appropriate IAM roles on your lambda?
      Thanks,

    • @billykovalsky8149
      @billykovalsky8149 4 years ago

      @@BeABetterDev Yes. For some reason, when I changed 'PUT' to 'All object create events' on the S3 trigger it started working. Still no clue why.

    • @holyproton8855
      @holyproton8855 4 years ago

      @billykovalsky You might need to attach a policy to your Lambda permissions.
      Steps after you created the permission to read items stored on the S3 bucket:
      1. Go to IAM and click Roles
      2. Click on the role name you created (I called mine s3ReadOnly)
      3. Click Attach policy, search for CloudWatchFullAccess, and attach it to your role
      This should provide you with logs if an event is triggered.
      If your log group isn't created (which should happen automatically), you can create one yourself with the naming convention: /aws/lambda/<your-function-name>
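
      The same attach step can be done programmatically; a sketch assuming the role name from the steps above:

      import boto3

      iam = boto3.client('iam')

      # Step 3 as an API call: attach the managed CloudWatch policy to the role
      iam.attach_role_policy(
          RoleName='s3ReadOnly',  # role name used in the steps above
          PolicyArn='arn:aws:iam::aws:policy/CloudWatchFullAccess',
      )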

  • @mightye6669
    @mightye6669 2 years ago

    can you please include ur source code in future videos so I dont have to type too much. pls and thx

  • @TheReal_E.IRIZARRY
    @TheReal_E.IRIZARRY 2 years ago

    7:26 "Uh yah let's go over to the CloudWatch log screen". Totally omitting the click-bait "View logs in CloudWatch logs" button on the Lambda screen. 🤣🙃😂🤣 The OP just thought it was a torrent phishing button placed there by the graces of a rouge site claiming they are his online AWS Lambda console.

  • @gopiselvamrgs4726
    @gopiselvamrgs4726 2 years ago

    [ERROR] KeyError: 'Records'

  • @856shaileshkumar
    @856shaileshkumar 2 years ago +1

    Beware: reading and writing to the same S3 bucket can make your bill skyrocket. So just follow each step correctly, or try not to upload the file again to S3.

  • @ivaturianil
    @ivaturianil 4 years ago

    Thanks for the video explanation. Getting the below issue when running "test" for the lambda code:
    {
        "errorMessage": "'Records'",
        "errorType": "KeyError",
        "stackTrace": [
            [
                "/var/task/lambda_function.py",
                14,
                "lambda_handler",
                "bucket = event['Records'][0]['s3']['bucket']['name']"
            ]
        ]
    }

    • @BeABetterDev
      @BeABetterDev  4 years ago

      Hi Anil,
      You could possibly print out the 'event' object at the start of your lambda function to see if it contains any records. This may help you debug the problem.

    • @ivaturianil
      @ivaturianil 4 years ago

      @@BeABetterDev ​ When i upload the file, it is working fine. The actual issue is - I am trying to test the function using "test". Thanks for your quick reply.

    • @BeABetterDev
      @BeABetterDev  4 years ago +1

      Hi Anil, in order for that to work you need to provide an input to the test event that is similar to the one that triggers the lambda. This may require some research on your end.
      Alternatively you can hook up your bucket to the lambda and upload/delete/re-upload the file to trigger the lambda.

    • @ivaturianil
      @ivaturianil 4 years ago

      @@BeABetterDev Thank You. I just got to know that. FYIP, This is my first Lambda function.
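
      For console testing as described above, the test event just needs to look like a real S3 Put notification. A trimmed-down sketch (real events carry more fields, and the Lambda console's sample event templates include an S3 Put example you can adapt):

      import json

      # Minimal stand-in for the event S3 sends on upload; only the fields the
      # tutorial handler reads are included here
      sample_event = {
          "Records": [
              {
                  "s3": {
                      "bucket": {"name": "my-example-bucket"},   # placeholder
                      "object": {"key": "transactions.json"}     # placeholder
                  }
              }
          ]
      }

      # Paste the JSON form of this dict into the console's Test dialog, or call
      # the handler directly when experimenting locally:
      # lambda_handler(sample_event, None)
      print(json.dumps(sample_event, indent=2))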

  • @raym6415
    @raym6415 2 years ago

    I am not sure why you made a video if we are supposed to guess what to write or click on!! What is that? Lowest quality video setting on your computer?! 😅

  • @balajivelaga3088
    @balajivelaga3088 2 years ago

    Where is the code? You could have put it on GitHub.

    • @Bayashat2002
      @Bayashat2002 1 month ago

      import urllib.parse
      import json
      import boto3

      # Initialize S3 client
      s3 = boto3.client('s3')

      def lambda_handler(event, context):
          # 1. Get the bucket name
          bucket = event['Records'][0]['s3']['bucket']['name']

          # 2. Get the file/key name
          key = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'], encoding='utf-8')

          try:
              # 3. Fetch the file from S3
              response = s3.get_object(Bucket=bucket, Key=key)

              # 4. Deserialize the file's content
              text = response["Body"].read().decode()
              data = json.loads(text)

              # 5. Print the content
              print(data)

              # 6. Parse and print the transactions
              transaction = data["transactions"]
              for record in transaction:
                  print(f"Transaction Type: {record['transType']}")
              return "Success!"
          except Exception as e:
              print(e)
              raise e

  • @greendsnow
    @greendsnow 2 years ago

    Lacks intuition so hard, kills brain cells trying to figure out.

  • @greendsnow
    @greendsnow 2 years ago

    extremely complicated compared to GCS

  • @D.horiz0n
    @D.horiz0n 3 years ago +2

    LIES

  • @shuvo3001
    @shuvo3001 2 years ago +2

    The Python code, with some changes, which worked for me:
    import json
    import urllib.parse
    import boto3

    def lambda_handler(event, context):
        # TODO implement
        bucket = event['Records'][0]['s3']['bucket']['name']
        key = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'], encoding='utf-8')
        try:
            s3 = boto3.client('s3')
            response = s3.get_object(Bucket=bucket, Key=key)
            text = response["Body"].read().decode()
            data = json.loads(text)
            print(data)
            transactions = data['transactions']
            for record in transactions:
                print(record['transType'])
            return 'Success!'
        except Exception as e:
            print(e)
            raise e

    • @Bayashat2002
      @Bayashat2002 1 month ago

      yes, for now, we should explicitly import the needed libraries.