Stream AWS DynamoDB data to AWS OpenSearch Index using DynamoDB Lambda Triggers

  • Published: Sep 7, 2024
  • This video demonstrates how to set up DynamoDB Lambda triggers to push data to an OpenSearch index.
    DynamoDB table setup: • AWS DynamoDB Streams |...
    OpenSearch Kibana Domain: • AWS OpenSearch | Creat...
    GitHub repository: github.com/lis...

Comments • 20

  • @rajeshsahu605
    @rajeshsahu605 a year ago +1

    Thank you so much for sharing this video! I particularly enjoyed the way you explained the concept. Your examples really helped me understand it better. Thank you again for sharing your knowledge and expertise!
    I have a couple of questions:
    1. Is it mandatory to create a domain for OpenSearch, given that we are pushing the data using Lambda?
    2. As we are using API Gateway, how can we access the API from Postman to see the results?
    Could you possibly cover the above in a future video? Thank you again for the great content!

    • @listentolearn2363
      @listentolearn2363  a year ago

      Glad it was helpful. Thanks for your support!
      1. Yes, it is mandatory to create an OpenSearch domain for this use case, as we are pushing the data to OpenSearch.
      2. Sure, I will post a video on API Gateway and how to access it using Postman soon.
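
      For context on point 1, here is a minimal sketch (not the exact code from the
      linked repository) of what such a Lambda does: it receives DynamoDB stream
      records and indexes them into the OpenSearch domain over HTTPS. The
      environment-variable names and the single "id" key attribute are assumptions
      for illustration.

          import os

          import requests
          from aws_requests_auth.aws_auth import AWSRequestsAuth
          from dynamodb_json import json_util

          # Placeholder configuration -- substitute your own domain endpoint and index.
          HOST = os.environ["OPENSEARCH_HOST"]   # e.g. search-demo-abc.us-east-1.es.amazonaws.com
          INDEX = os.environ.get("INDEX", "demo-index")
          REGION = os.environ.get("REGION", "us-east-1")

          # Sign requests with the Lambda execution role's credentials.
          auth = AWSRequestsAuth(
              aws_access_key=os.environ["AWS_ACCESS_KEY_ID"],
              aws_secret_access_key=os.environ["AWS_SECRET_ACCESS_KEY"],
              aws_token=os.environ.get("AWS_SESSION_TOKEN"),
              aws_host=HOST,
              aws_region=REGION,
              aws_service="es",
          )

          def lambda_handler(event, context):
              for record in event["Records"]:
                  # Hypothetical single string key named "id"; adjust to your table's schema.
                  doc_id = record["dynamodb"]["Keys"]["id"]["S"]
                  if record["eventName"] in ("INSERT", "MODIFY"):
                      # NewImage is in DynamoDB's typed JSON; unmarshal it to plain JSON.
                      doc = json_util.loads(record["dynamodb"]["NewImage"])
                      requests.put(f"https://{HOST}/{INDEX}/_doc/{doc_id}",
                                   auth=auth, json=doc, timeout=10)
                  elif record["eventName"] == "REMOVE":
                      requests.delete(f"https://{HOST}/{INDEX}/_doc/{doc_id}",
                                      auth=auth, timeout=10)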

  • @tacowilco7515
    @tacowilco7515 6 months ago

    This is an awesome tutorial. Very useful. Thanks!

  • @ameetsings
    @ameetsings 11 months ago

    One question: in OpenSearch, any document is uniquely identified by _id. How can we set it from our own combination of fields so that no duplicate records are created?

    • @listentolearn2363
      @listentolearn2363  10 months ago

      The easiest way is to use a concatenation of the hash and sort keys of the record, which will always be unique.
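
      A minimal sketch of that idea (with hypothetical key names): derive the _id
      deterministically from the item's key attributes, so re-processing the same
      item overwrites the existing document instead of creating a duplicate.

          def make_doc_id(record):
              # Keys holds the hash (and optional sort) key in DynamoDB's typed JSON,
              # e.g. {"pk": {"S": "user#1"}, "sk": {"S": "2024-09-07"}}.
              keys = record["dynamodb"]["Keys"]
              # Concatenate the key values in a stable attribute order; the key pair
              # is unique per item, so the resulting _id is too.
              return "#".join(str(v) for attr in sorted(keys) for v in keys[attr].values())

      Indexing with an explicit _id (e.g. PUT .../_doc/{make_doc_id(record)}) then
      makes the writes idempotent.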

  • @yichenwang8008
    @yichenwang8008 11 months ago

    Would you please list the Python libraries installed? I suggest removing boto3 and botocore from the GitHub zip; they are too big, making the code unreadable in the Lambda console. Besides, they are installed by default in Lambda, so there is no need to install them again.

    • @listentolearn2363
      @listentolearn2363  10 months ago

      aws_requests_auth
      boto3
      botocore
      certifi
      charset_normalizer
      dynamodb_json
      elasticsearch
      idna
      jmespath
      requests
      s3transfer
      simplejson
      six.py
      urllib3
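
      Most of those are transitive dependencies; a requirements list that would
      pull in roughly the same set (PyPI package names assumed from the module
      names above) is:

          # requirements.txt -- install into the package directory before zipping:
          #   pip install -t . -r requirements.txt
          aws-requests-auth
          boto3
          dynamodb-json
          elasticsearch
          requests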

  • @MrVinaysharma0
    @MrVinaysharma0 9 months ago +1

    Where is the Python code?

    • @listentolearn2363
      @listentolearn2363  9 months ago

      github.com/listentolearn/aws-opensearch-dynamodb-stream

  • @arunverma6384
    @arunverma6384 a year ago

    Getting the error 'PROBLEM: Function call failed' for 'stream-lambda' on each table modification (insert, update, delete).

    • @listentolearn2363
      @listentolearn2363  a year ago

      Hi Arun, I hope you updated the Lambda environment variable values to suit your use case? Could you please share the CloudWatch logs? They should give you more details.
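
      As a hedged illustration of that pattern (the variable names here are
      hypothetical, not necessarily the ones the repository uses): the Lambda reads
      its target domain and index from environment variables instead of hard-coding
      them, so each viewer must set their own values in the function configuration.

          import os

          OPENSEARCH_HOST = os.environ["OPENSEARCH_HOST"]  # raises KeyError if unset
          INDEX = os.environ.get("INDEX", "demo-index")    # optional, with a default

          # Anything printed here lands in the function's CloudWatch log group,
          # which is where the details behind 'Function call failed' show up.
          print(f"Indexing into {INDEX} at {OPENSEARCH_HOST}")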

    • @gaganpreetsingh5956
      @gaganpreetsingh5956 a year ago

      I am facing the same issue, and I have followed every step you did, even
      declaring the correct environment variables as well:

          RequestError: RequestError(400, 'illegal_argument_exception', 'Action/metadata line [1] contains an unknown parameter [_type]')
          Traceback (most recent call last):
            File "/var/task/lambda_function.py", line 78, in lambda_handler
              pushBatch(actions)
            File "/var/task/lambda_function.py", line 45, in pushBatch
              (success, failed) = helpers.bulk(elasticsearch, actions, stats_only=True)
            File "/var/task/elasticsearch/helpers/actions.py", line 314, in bulk
              for ok, item in streaming_bulk(client, actions, *args, **kwargs):
            File "/var/task/elasticsearch/helpers/actions.py", line 235, in streaming_bulk
              for data, (ok, info) in zip(
            File "/var/task/elasticsearch/helpers/actions.py", line 130, in _process_bulk_chunk
              raise e
            File "/var/task/elasticsearch/helpers/actions.py", line 126, in _process_bulk_chunk
              resp = client.bulk("\n".join(bulk_actions) + "\n", *args, **kwargs)
            File "/var/task/elasticsearch/client/utils.py", line 101, in _wrapped
              return func(*args, params=params, **kwargs)
            File "/var/task/elasticsearch/client/__init__.py", line 1576, in bulk
              return self.transport.perform_request(
            File "/var/task/elasticsearch/transport.py", line 402, in perform_request
              status, headers_response, data = connection.perform_request(
            File "/var/task/elasticsearch/connection/http_requests.py", line 186, in perform_request
              self._raise_error(response.status_code, raw_data)
            File "/var/task/elasticsearch/connection/base.py", line 253, in _raise_error
              raise HTTP_EXCEPTIONS.get(status_code, TransportError)(

    • @listentolearn2363
      @listentolearn2363  11 months ago

      @gaganpreetsingh5956 You might be using a different version of elasticsearch.
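
      The traceback above points at the likely culprit: Elasticsearch 7+ and
      OpenSearch removed mapping types, so a bulk action that still carries a
      "_type" field is rejected with the 400 shown. A hedged sketch of the fix
      (the function and index names here are illustrative, not the repository's):

          from elasticsearch import helpers

          def push_batch(es_client, docs, index="demo-index"):
              actions = [
                  {
                      "_op_type": "index",
                      "_index": index,
                      # no "_type" key -- it was removed in Elasticsearch 7+/OpenSearch
                      "_id": doc_id,
                      "_source": source,
                  }
                  for doc_id, source in docs
              ]
              # stats_only=True returns (success_count, failed_count).
              return helpers.bulk(es_client, actions, stats_only=True)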

  • @yichenwang8008
    @yichenwang8008 a year ago

    So even if the Lambda is in a VPC, the DDB stream can still reach it? Cool! But how? DDB is not in any VPC, so how does it deliver the JSON to a Lambda inside a VPC?

  • @louisperianayagam5230
    @louisperianayagam5230 a year ago

    Very nice video, but I am getting the error below:

        {
          "errorMessage": "'Records'",
          "errorType": "KeyError",
          "stackTrace": [
            "  File \"/var/task/lambda_function.py\", line 52, in lambda_handler\n    records = event['Records']\n"
          ]
        }

    • @listentolearn2363
      @listentolearn2363  a year ago

      Hi Louis,
      Records is one of the mandatory objects within a DynamoDB stream event.
      Could you please try printing the event itself within the Lambda function to
      see what is actually in it? Add print(event) before line 52.
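
      A minimal illustration of that debugging step, with a defensive guard added
      in case the function is invoked with a non-stream test event (an assumption
      on my part; the repository's handler reads event['Records'] directly):

          def lambda_handler(event, context):
              print(event)                        # the raw event appears in CloudWatch logs
              records = event.get("Records", [])  # empty list instead of KeyError
              for record in records:
                  ...                             # process each stream record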

    • @louisperianayagam5230
      @louisperianayagam5230 a year ago

      @listentolearn2363 After adding the print statement, I am getting the error
      below:

          {
            "errorMessage": "Unable to import module 'lambda_function': No module named 'lambda_function'",
            "errorType": "Runtime.ImportModuleError",
            "stackTrace": []
          }

    • @listentolearn2363
      @listentolearn2363  a year ago

      Please make sure that you are zipping the contents of the directory and not the directory itself, so that lambda_function.py sits at the root of the zip (for example, run zip -r ../function.zip . from inside the directory).