AWS Lambda: load a JSON file from S3 and put it in DynamoDB

  • Published: 26 Jul 2024
  • www.udemy.com/course/masterin...
    For online/classroom trainings, contact +91988661111
    Join the Udemy course Mastering AWS CloudFormation: www.udemy.com/mastering-aws-c...

Comments • 122

  • @JavaHomeCloud
    @JavaHomeCloud  5 years ago +1

    For online classroom trainings & project support, please contact phone number +919886611117

    • @gittin_funky
      @gittin_funky 5 years ago +1

      Why do you need to upload the JSON to an S3 bucket? Is it possible to upload the JSON directly to DynamoDB? Apologies, I don't know much about AWS as I'm just learning.

    • @CodySMitchell
      @CodySMitchell 5 years ago

      I am getting this error:
      [ERROR] ClientError: An error occurred (ValidationException) when calling the PutItem operation: One or more parameter values were invalid: Missing the key object_id in the item
      Traceback (most recent call last):
      File "/var/task/lambda_function.py", line 14, in lambda_handler
      table.put_item(Item = jsonDict)
      File "/var/runtime/boto3/resources/factory.py", line 520, in do_action
      response = action(self, *args, **kwargs)
      File "/var/runtime/boto3/resources/action.py", line 83, in __call__
      response = getattr(parent.meta.client, operation_name)(**params)
      File "/var/runtime/botocore/client.py", line 320, in _api_call
      return self._make_api_call(operation_name, kwargs)
      File "/var/runtime/botocore/client.py", line 623, in _make_api_call
      raise error_class(parsed_response, operation_name)
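For anyone hitting the same ValidationException: it means the item dict handed to put_item is missing the table's partition key (object_id for that table). A minimal sketch of the fix — the key name and sample values below are stand-ins, not from the video:

```python
import json

def with_partition_key(raw_json, key_name, key_value):
    """Parse a JSON item and make sure it carries the table's partition key,
    which DynamoDB's put_item requires on every item."""
    item = json.loads(raw_json)
    item.setdefault(key_name, key_value)  # only fill it in if it's missing
    return item

item = with_partition_key('{"Name": "Hari", "Age": 26}', "object_id", "emp-001")
# item can now be passed to table.put_item(Item=item) without the
# "Missing the key object_id" ValidationException.
```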

  • @mridulbansal8590
    @mridulbansal8590 6 years ago +1

    The way you explain it makes it look like a piece of cake. Looking forward to more of these.

  • @whyCHANNELwhy
    @whyCHANNELwhy 4 years ago +4

    This is one of the cleanest solutions I have seen

  • @davehaughton
    @davehaughton 4 years ago +2

    Thanks very much; it was very clear and easy to follow, definitely a concept that anyone wanting to write to DynamoDB from a JSON file should follow. Great stuff.

  • @mikewright4967
    @mikewright4967 6 years ago

    Thank you! I've been trying to split my s3 object all day! Lifesaver

  • @andrewiandreleonardo1765
    @andrewiandreleonardo1765 2 years ago +1

    I know this is from 3 years ago, but please watch this; it saved my whole semester :))

  • @hkmehandiratta
    @hkmehandiratta 3 years ago +3

    Thanks for uploading. The tutorial explains the concepts very clearly. I believe it's AWS that has modified the event schema; I had to modify 2 lines as of today to get the bucket name and the JSON file name, i.e.:
    bucket = event['Records'][0]['s3']['bucket']['name']
    json_file_name = event['Records'][0]['s3']['object']['key']
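Those corrected lookups can be sketched as a small helper; the event layout is the one given in the comment above, and the bucket/key names in the sample event are made up:

```python
def extract_bucket_and_key(event):
    """Pull the S3 bucket name and object key out of an S3 trigger event,
    using the current Records[0].s3 layout noted above."""
    record = event["Records"][0]["s3"]
    return record["bucket"]["name"], record["object"]["key"]

# A trimmed-down S3 put-event, shaped like what Lambda receives:
sample_event = {
    "Records": [
        {"s3": {"bucket": {"name": "my-bucket"},
                "object": {"key": "employees.json"}}}
    ]
}
bucket, json_file_name = extract_bucket_and_key(sample_event)
```

Inside the handler, `bucket` and `json_file_name` then feed `s3_client.get_object(Bucket=bucket, Key=json_file_name)` as in the video.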

    • @ronnic1
      @ronnic1 3 years ago +1

      Can't thank you enough for posting the correction

    • @hkmehandiratta
      @hkmehandiratta 3 years ago +1

      Dear @@ronnic1, thanks for taking the time and sharing your knowledge. Also, thank you for acknowledging my bit of effort. Regards

  • @kamleshjoshi4419
    @kamleshjoshi4419 5 years ago

    Good explanation. Thanks for sharing the knowledge!!

  • @reshaknarayan3944
    @reshaknarayan3944 6 years ago +1

    Crystal clear. Please upload a video on importing multiple JSON files into DynamoDB.

  • @kfrankola4
    @kfrankola4 5 years ago +2

    AWESOME VIDEO! Extremely clear and well-informed instruction. Thank you!

  • @wilsoncheng6134
    @wilsoncheng6134 4 years ago +1

    Very good and clear! Very easy to follow!

  • @prannoyroy5312
    @prannoyroy5312 4 years ago +1

    Good tutorial and use case shown👍

  • @subha9333
    @subha9333 3 years ago +1

    You are a godsend... I am very grateful to you for clearing up so many concepts together...

  • @DeiseZen
    @DeiseZen 2 years ago +1

    You are so good at explaining things! Very grateful for your generosity!

  • @AhumadaMauricio
    @AhumadaMauricio 2 years ago +1

    Thanks for the video. Easy to follow and worked as advertised. Thanks

  • @kofio7581
    @kofio7581 4 years ago +1

    Really great explanation!

  • @pranabsarkar
    @pranabsarkar 4 years ago +1

    Thanks for the nice explanation!

  • @ronakkataria485
    @ronakkataria485 6 years ago

    Thank you, sir. That was very helpful.

  • @multitaskprueba1
    @multitaskprueba1 2 years ago

    Fantastic video! Thank you so much! You are a genius!

  • @mayukhg
    @mayukhg 5 years ago

    The video is a great help... thx

  • @tristansmith5267
    @tristansmith5267 3 years ago +2

    Thank you so much! This was such a clear and informative video and really helped me with my project.

  • @divyatirthadas
    @divyatirthadas 4 years ago +1

    Great information. How would I use Athena here to read from S3 & send that information to Aurora? Also, how would you handle multiple records?

  • @hariprasadshetty6296
    @hariprasadshetty6296 4 years ago

    Hello,
    crystal clear explanation. Thank you for this.
    Do you have a video series on Lambda functions with Python?

  • @andynelson2340
    @andynelson2340 3 years ago +1

    I got it to work! Thanks.

  • @shufanxing
    @shufanxing 3 years ago +1

    Thank you very much. It is very clear.

  • @RashaadFontenot
    @RashaadFontenot 6 years ago

    Thank you

  • @vishwasgupta3180
    @vishwasgupta3180 5 years ago

    You are awesome

  • @anasbaligh2
    @anasbaligh2 5 years ago

    This works for a local JSON file; what if I need to do that for an online JSON file and an API?

  • @DylanHillard
    @DylanHillard 2 years ago

    When you create the lambda function, your destinations to CloudWatch, DynamoDB, and S3 auto-populate. When I create my function they are not included in the flowchart and my data doesn't transmit. How do you have those destinations populate on the creation of the function?

  • @jvalal
    @jvalal 5 years ago

    How is the data getting into the Employees DB when we didn't create it?

  • @RanganathPanibhathe
    @RanganathPanibhathe 3 years ago

    Any suggestion on how this lambda can be implemented in Java as well? That'd be great :)

  • @navyashree5887
    @navyashree5887 2 years ago

    Hi Hari, I have one question: if I want to trigger a Terraform script to create an ECS cluster for new products automatically using Lambda, with an SNS topic message as an input parameter to the Lambda function, how can I do it?

  • @srinivasnara8708
    @srinivasnara8708 4 years ago +1

    How can we insert a bulk-data JSON file into DynamoDB?

  • @virgin1785
    @virgin1785 6 years ago +3

    I would also like to know how to insert multiple rows from a JSON file into DynamoDB from S3 using a similar Lambda function.

    • @yuanmiau
      @yuanmiau 5 years ago +1

      Any solution? In my case I have a flat file with 7000 records. How can I load it with the minimum number of calls to DynamoDB?

  • @cazador_0454
    @cazador_0454 6 years ago +2

    If anyone is still looking for a way to loop and persist:
    for line in jsonDict:
        print(line)
        table.put_item(Item=line)
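For the bulk-load questions in this thread: the loop above issues one PutItem request per record. boto3's `Table.batch_writer()` instead buffers writes into BatchWriteItem requests of up to 25 items each. A sketch — the chunking helper below just illustrates that 25-item grouping, and the table name is an assumption carried over from the video:

```python
def chunk(items, size=25):
    """Group records into lists of at most `size` items -- the per-request
    limit DynamoDB's BatchWriteItem enforces."""
    return [items[i:i + size] for i in range(0, len(items), size)]

records = [{"emp_id": str(i)} for i in range(60)]
batches = chunk(records)  # 60 records -> groups of 25, 25 and 10

# With boto3 (not run here), batch_writer does this grouping for you:
#   with boto3.resource("dynamodb").Table("employees").batch_writer() as batch:
#       for item in records:
#           batch.put_item(Item=item)
```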

    • @sutrangi
      @sutrangi 4 years ago

      Getting an "Extra data" error:
      Extra data: line 2 column 1 (char 521): JSONDecodeError
      Traceback (most recent call last):
      File "/var/task/lambda_function.py", line 12, in lambda_handler
      jsonDict = json.loads(jsonFileReader)
      File "/var/lang/lib/python3.6/json/__init__.py", line 354, in loads
      return _default_decoder.decode(s)
      File "/var/lang/lib/python3.6/json/decoder.py", line 342, in decode
      raise JSONDecodeError("Extra data", s, end)
      json.decoder.JSONDecodeError: Extra data: line 2 column 1 (char 521)
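"Extra data: line 2 column 1" from json.loads usually means the file holds several JSON documents, one per line (JSON Lines), while json.loads expects exactly one document. A sketch of parsing such a file line by line (the sample records are made up):

```python
import json

def parse_json_lines(body_text):
    """Parse a file with one JSON object per line; calling json.loads on
    the whole body would raise "Extra data" at the start of line 2."""
    return [json.loads(line) for line in body_text.splitlines() if line.strip()]

items = parse_json_lines('{"emp_id": "1"}\n{"emp_id": "2"}\n')
# Each dict in `items` can then be written individually with put_item.
```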

    • @nishgupta29
      @nishgupta29 4 years ago

      Hey, were you able to resolve this?

  • @igoralves1
    @igoralves1 4 years ago +1

    Today (Dec 2019) it seems that things are not the same in AWS. I followed the tutorial but I can't see the list of resources on the right side of the Lambda function, like at 5:21. What is going on? Any idea?

  • @sushilmenon5644
    @sushilmenon5644 5 years ago +2

    I constantly get this error in CloudWatch even though I replicated the exact steps and code. Not sure what is wrong here.
    table.put_Item(Item=jsonDict)
    AttributeError: 'dynamodb.Table' object has no attribute 'put_Item'

    • @StratosBotsaris
      @StratosBotsaris 3 years ago

      import boto3
      dynamodb = boto3.resource('dynamodb')
      table = dynamodb.Table('employees')
      table.put_item(Item=jsonDict)

  • @thabisanisibanda6930
    @thabisanisibanda6930 4 years ago

    Thanks man

  • @GloDBSec
    @GloDBSec 6 years ago

    Very good, thx a lot. Question: my bucket is in US-West and my Lambda function is apparently in US-East, and both must be in the same region. I can't find a way to assign my function to US-West. Any idea? Thx

    • @JavaHomeCloud
      @JavaHomeCloud  6 years ago

      GloDBSec
      In the Lambda code, while getting the S3 client, mention the region of your S3 bucket; hopefully it should work.
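A sketch of what that looks like in the Lambda code; `us-west-2` below is a stand-in for the bucket's actual region:

```python
import boto3

# Pin the S3 client to the bucket's region so a Lambda deployed in
# another region can still read the object.
s3_client = boto3.client("s3", region_name="us-west-2")
```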

  • @jeissonmartinez154
    @jeissonmartinez154 5 years ago

    What is the limit for rows?

  • @walidsliti
    @walidsliti 6 years ago

    Can you please provide me the code to load CSV files from S3 to DynamoDB? Thanks

  • @mrlive221
    @mrlive221 5 years ago +1

    Hi, great work! I have one question: I want to store data from an external API in AWS and show some of the data in a view. How is that possible? Can you help me?

    • @JavaHomeCloud
      @JavaHomeCloud  5 years ago

      Can you add more to your question

    • @mrlive221
      @mrlive221 5 years ago

      @@JavaHomeCloud, I want to fetch data from different external APIs and store it in DynamoDB or any other database. How can I do it? Will I have to use Lambda with S3, or how is it possible?

  • @DavisTibbz
    @DavisTibbz 3 years ago

    Wheeeew! Why so complex compared to other cloud solutions??

  • @mallikarjunsangannavar907
    @mallikarjunsangannavar907 1 year ago

    Any idea on how to do the same thing using a YAML file?

  • @ravitiwary9993
    @ravitiwary9993 6 years ago

    I am getting the below error:
    An error occurred (ValidationException) when calling the PutItem operation: One or more parameter values were invalid: Type mismatch for key line expected: S actual: M: ClientError
    Traceback (most recent call last):
    File "/var/task/lambda_function.py", line 13, in lambda_handler
    table.put_item(Item=jsonDict)
    File "/var/runtime/boto3/resources/factory.py", line 520, in do_action
    response = action(self, *args, **kwargs)
    File "/var/runtime/boto3/resources/action.py", line 83, in __call__
    response = getattr(parent.meta.client, operation_name)(**params)
    File "/var/runtime/botocore/client.py", line 314, in _api_call
    return self._make_api_call(operation_name, kwargs)
    File "/var/runtime/botocore/client.py", line 612, in _make_api_call
    raise error_class(parsed_response, operation_name)
    botocore.exceptions.ClientError: An error occurred (ValidationException) when calling the PutItem operation: One or more parameter values were invalid: Type mismatch for key line expected: S actual: M

    • @JavaHomeCloud
      @JavaHomeCloud  6 years ago

      It's expecting a String and you are passing a Map:
      "more parameter values were invalid: Type mismatch for key line expected: S actual: M"
      Verify your code.
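Concretely: the table declared its key attribute as a string (S), but the parsed JSON supplies a nested object for that field, which boto3 sends as a map (M). One way out, sketched below with a hypothetical key name, is to flatten the offending field to a string; restructuring the source JSON so the key is a plain string is the cleaner fix:

```python
import json

def coerce_key_to_string(item, key_name):
    """If the table's key attribute (declared type S) arrives as a nested
    dict (sent as type M), serialise it to a string so put_item accepts it."""
    if isinstance(item.get(key_name), dict):
        item[key_name] = json.dumps(item[key_name], sort_keys=True)
    return item

fixed = coerce_key_to_string({"line": {"page": 1}}, "line")  # "line" is hypothetical
```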

  • @vijayanand6854
    @vijayanand6854 4 years ago

    I am getting this error:
    {
    "errorMessage": "'str' object has no attribute 'loads'",
    "errorType": "AttributeError",
    "stackTrace": [
    " File \"/var/task/lambda_function.py\", line 12, in lambda_handler
    jsonDict = jsonFileReader.loads('jsonFileReader')
    "
    ]
    }
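The stack trace above shows the cause: `.loads()` was called on the file contents (a plain str) instead of on the json module. The fix, sketched with an inline stand-in for the S3 object body:

```python
import json

json_file_reader = '{"emp_id": "3", "Name": "Hari"}'  # stand-in for the S3 object body
json_dict = json.loads(json_file_reader)  # json.loads(<string>), not <string>.loads(...)
```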

  • @sekharcnanda
    @sekharcnanda 4 years ago

    Good video.
    I have a question: what if the file is huge, so Lambda cannot process it in 15 minutes and gives a timeout error? How can we then process the file and save its information into the DB?

    • @JavaHomeCloud
      @JavaHomeCloud  4 years ago

      We have to choose a different approach, for example:
      1. AWS EMR
      2. Run on EC2
      3. AWS Batch
      4. etc.

  • @walidsliti
    @walidsliti 6 years ago

    What do we need to change if we want to import CSV files from S3 to DynamoDB?

    • @JavaHomeCloud
      @JavaHomeCloud  6 years ago +1

      Walid Sliti
      We have to add the csv Python module to the code and read the data from the CSV file.
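A sketch of that CSV reading step with the stdlib csv module; the column names below are made up, and each resulting dict can then go to put_item the same way as the JSON case:

```python
import csv
import io

def csv_rows_to_items(csv_text):
    """Turn CSV text (e.g. the decoded S3 object body) into a list of
    dicts keyed by the header row; all values arrive as strings."""
    return list(csv.DictReader(io.StringIO(csv_text)))

items = csv_rows_to_items("emp_id,Name\n1,Hari\n2,Sri\n")
```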

    • @walidsliti
      @walidsliti 6 years ago

      I tried but it doesn't work.

  • @smpa3481
    @smpa3481 3 years ago

    Can you please provide information on how to transfer data from DynamoDB to S3 using Lambda?

  • @geonjulee3641
    @geonjulee3641 4 years ago +1

    I set the permissions the same way, but the destination is empty. How can I add it?

  • @hollyzhang6491
    @hollyzhang6491 3 years ago

    Thanks for sharing! I followed the steps one by one and all the log info is right, but I still cannot see the data in DynamoDB. Why is that?

  • @yuanmiau
    @yuanmiau 5 years ago +1

    In my case I have a flat file with 7000 records. How can I load it with the minimum number of calls to DynamoDB?

  • @mandalarian
    @mandalarian 5 years ago +1

    JSON example at 8:46

  • @dosani786
    @dosani786 5 years ago

    Getting an "object not defined" error on this line: json_obj = s3_client.get_object(Bucket=bucketname,Key=filename)
    print(Json_obj)

  • @mlsivaprasad
    @mlsivaprasad 6 years ago

    I am unable to find the logs in CloudWatch. Is there any additional configuration required?

    • @JavaHomeCloud
      @JavaHomeCloud  6 years ago +1

      Sivaprasad ML
      No additional configuration required; check that your Lambda has permissions to send logs to CloudWatch.

    • @michalpesicka8152
      @michalpesicka8152 5 years ago

      It helped me to attach the CloudWatchLogsFullAccess policy to the role.

  • @nishgupta29
    @nishgupta29 4 years ago

    Is anyone else getting "[ERROR] KeyError: 'Records'" while running this?
    Can anyone help please? The code is exactly the same as in the video.

    • @electronlibre4163
      @electronlibre4163 4 years ago

      Check your IAM policy and role; it looks like Lambda could not connect to S3.

    • @StratosBotsaris
      @StratosBotsaris 3 years ago +1

      At your Lambda function page, you need to click the drop-down on the **Test** button and choose **Configure test event**. There you need to create a new test event with the following data inside:
      {
        "Records": [
          {
            "s3": {
              "bucket": {
                "name": "your-bucket",
                "arn": "arn:aws:s3:::your-bucket"
              },
              "object": {
                "key": "your_data.json"
              }
            }
          }
        ]
      }
      where:
      1. *your-bucket* is the name of the bucket you have created in S3
      2. *your_data.json* is a JSON file that you have uploaded to your S3 bucket

  • @nikhilgupta4995
    @nikhilgupta4995 5 years ago

    I am getting an error while putting the dictionary into DynamoDB:
    b'{
    "emp_id":"3",
    \t"Name":"Hari",
    \t"Age""26,
    \t"Location":["USA"]
    }'
    error:- Expecting ':' delimiter: line 4 column 7 (char 42)

    • @LokanathV
      @LokanathV 5 years ago +1

      ':' is missing after "Age" in your JSON file

  • @msharimca
    @msharimca 5 years ago

    Can you please let me know how to load data into Redshift from S3? Here you show DynamoDB; I want the same kind of video for Redshift. Kindly help us.

    • @JavaHomeCloud
      @JavaHomeCloud  5 years ago

      Hi Hari, multiple options are there for your requirement; use either Lambda or Glue.

  • @srigiri8702
    @srigiri8702 3 years ago +1

    Is it possible to upload a CSV file instead of JSON?

  • @bmaxx9558
    @bmaxx9558 6 years ago

    Is there any way to copy zip files of CloudTrail logs from S3 to DynamoDB?

    • @JavaHomeCloud
      @JavaHomeCloud  6 years ago

      Yes

    • @bmaxx9558
      @bmaxx9558 6 years ago

      Would you please record a session on how to unzip and read CloudTrail log files in S3, if possible?
      Thanks in advance

  • @kirankumarkavali3897
    @kirankumarkavali3897 6 years ago

    In the above video he showed us how to insert only one item, but I have to insert multiple items. How can we do that? Please help me do this through iteration.

    • @vishwasgupta3180
      @vishwasgupta3180 5 years ago

      1. Iterate over those objects using a for loop and insert each one, OR
      2. Try to find if there is any bulk method like load_many, similar to put_item

    • @yuanmiau
      @yuanmiau 5 years ago

      I need to load many records (5000 records) in one request only... is that possible?

    • @nishgupta29
      @nishgupta29 4 years ago

      Hey, did you get through this? Getting the same issue.

  • @rohitveeradhi
    @rohitveeradhi 4 years ago

    Is this code only for a single entry, or for multiple entries in the JSON file?

    • @JavaHomeCloud
      @JavaHomeCloud  4 years ago

      This is for a single entry.

    • @rohitveeradhi
      @rohitveeradhi 4 years ago

      @@JavaHomeCloud Can you please send the code for multiple entries?

  • @akashdeepmunjal7398
    @akashdeepmunjal7398 4 years ago

    I followed the same steps but I am getting this error:
    File "/var/task/lambda_function.py", line 6, in lambda_handler
    bucket = event['Records'][0]['s3']['bucket']['name']
    KeyError: 'Records'

    • @nishgupta29
      @nishgupta29 4 years ago

      Getting the same error; were you able to figure it out?

    • @luizmt2
      @luizmt2 4 years ago

      Same error here.
      I found out that the problem is related to the bucket name and file name: when I explicitly set both, it works fine. I've got to fix this problem in order to get the file names dynamically.
      Here are the edited lines:
      bucket = 'my-bucket-name' #event['Records'][0]['s3']['bucket']['name']
      json_file_name = 'my-file-name.json' #event['Records'][0]['object']['key']

    • @StratosBotsaris
      @StratosBotsaris 3 years ago

      At your Lambda function page, you need to click the drop-down on the **Test** button and choose **Configure test event**. There you need to create a new test event with the following data inside:
      {
        "Records": [
          {
            "s3": {
              "bucket": {
                "name": "your-bucket",
                "arn": "arn:aws:s3:::your-bucket"
              },
              "object": {
                "key": "your_data.json"
              }
            }
          }
        ]
      }
      where:
      1. *your-bucket* is the name of the bucket you have created in S3
      2. *your_data.json* is a JSON file that you have uploaded to your S3 bucket

  • @kiit-mech-mech-pz8vd
    @kiit-mech-mech-pz8vd 4 years ago

    Getting the below error:
    Parameter validation failed:
    Missing required parameter in input: "Key"
    Unknown parameter in input: "key", must be one of: Bucket, IfMatch, IfModifiedSince, IfNoneMatch, IfUnmodifiedSince, Key, Range, ResponseCacheControl, ResponseContentDisposition, ResponseContentEncoding, ResponseContentLanguage, ResponseContentType, ResponseExpires, VersionId, SSECustomerAlgorithm, SSECustomerKey, SSECustomerKeyMD5, RequestPayer, PartNumber: ParamValidationError
    Traceback (most recent call last):
    File "/var/task/lambda_function.py", line 8, in lambda_handler
    json_object = s3_client.get_object(Bucket=bucket, key=json_file_name)
    File "/var/runtime/botocore/client.py", line 272, in _api_call
    return self._make_api_call(operation_name, kwargs)
    File "/var/runtime/botocore/client.py", line 549, in _make_api_call
    api_params, operation_model, context=request_context)
    File "/var/runtime/botocore/client.py", line 597, in _convert_to_request_dict
    api_params, operation_model)
    File "/var/runtime/botocore/validate.py", line 297, in serialize_to_request
    raise ParamValidationError(report=report.generate_report())
    botocore.exceptions.ParamValidationError: Parameter validation failed:
    Missing required parameter in input: "Key"
    Unknown parameter in input: "key", must be one of: Bucket, IfMatch, IfModifiedSince, IfNoneMatch, IfUnmodifiedSince, Key, Range, ResponseCacheControl, ResponseContentDisposition, ResponseContentEncoding, ResponseContentLanguage, ResponseContentType, ResponseExpires, VersionId, SSECustomerAlgorithm, SSECustomerKey, SSECustomerKeyMD5, RequestPayer, PartNumber
    Why is this error coming up, and how do I resolve it?
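The traceback already names the fix: get_object takes Key with a capital K, and the call passed key. Keyword argument names are case-sensitive in Python, as this stand-in mirroring the parameter shape (not the real boto3 client) shows:

```python
def get_object(Bucket, Key):
    """Stand-in mirroring boto3 get_object's parameter names; the real
    client validates them in the same case-sensitive way."""
    return {"Bucket": Bucket, "Key": Key}

resp = get_object(Bucket="my-bucket", Key="data.json")  # capital K: accepted
# get_object(Bucket="my-bucket", key="data.json") would raise a TypeError,
# just as the real client raises ParamValidationError for the lowercase "key".
```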

  • @codercoder2697
    @codercoder2697 4 years ago

    I know ur name sir Hari...

  • @HappyHour2024
    @HappyHour2024 3 years ago

    Hello sir, I am facing an error while creating an IAM policy; the error is "parser error message" and I am not able to create the policy.
    Here is the application:
    aws.amazon.com/blogs/machine-learning/build-your-own-text-to-speech-applications-with-amazon-polly/
    I'll be very thankful for your help; please do help, sir

  • @reddyeswar8014
    @reddyeswar8014 3 years ago

    Hello,
    I need Python along with AWS services.