AWS Database Migration Service (DMS) // MySQL to S3

  • Published: 10 Sep 2024

Comments • 29

  • @KahanDataSolutions  3 years ago

    ►► The Starter Guide for Modern Data (Free PDF) → www.kahandatasolutions.com/startermds

  • @richardkoranteng9337  2 years ago +3

    Straight to the point, informative, and easy to consume content. Thanks

  • @josepaulo9054  1 year ago

    congratulations on the initiative

  • @singhsandeep  3 years ago +5

    Hey Kahan, very informative video. I had a few doubts. If I am using DMS for CDC, then each change will create a new object in S3, thereby raising costs (because of the huge number of S3 writes). Is there a way to mitigate this?
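
    One way to reduce the object count, assuming DMS's S3 target settings behave as
    documented, is to let DMS batch CDC changes into fewer, larger files with the
    CdcMaxBatchInterval and CdcMinFileSize settings. A minimal boto3 sketch, where the
    endpoint identifier, bucket, and role ARN are placeholders:

        # Create an S3 target endpoint that flushes CDC files on a size/time threshold
        # instead of writing one object per change.
        import boto3

        dms = boto3.client("dms")
        dms.create_endpoint(
            EndpointIdentifier="mysql-to-s3-target",   # placeholder name
            EndpointType="target",
            EngineName="s3",
            S3Settings={
                "ServiceAccessRoleArn": "arn:aws:iam::123456789012:role/dms-s3-role",  # placeholder
                "BucketName": "my-dms-bucket",         # placeholder
                "BucketFolder": "cdc",
                # A CDC file is written when either threshold is reached:
                "CdcMaxBatchInterval": 300,            # max seconds between file writes
                "CdcMinFileSize": 32000,               # min file size in KB
            },
        )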

  • @denysmurynka  1 year ago

    damn, it helps me a lot
    too easy! thanks

  • @prasann26  1 year ago

    AWESOME Sir !!!

  • @Alwinlcw  1 year ago

    Hi Kahan, any idea where the binary log file is located, and are there cases where changes to the source aren't captured by DMS? Where would be the place to troubleshoot that?
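
    For a MySQL source, DMS change capture reads the binary log (binlog); on RDS the
    binlog files live on the instance itself, so you can only inspect them through SQL,
    and DMS task errors show up in the task's CloudWatch logs if logging is enabled. A
    rough troubleshooting sketch using pymysql, with placeholder host and credentials
    (the rds_set_configuration call applies to RDS/Aurora MySQL only):

        # Check that the source is producing binlogs that DMS CDC can read.
        import pymysql

        conn = pymysql.connect(host="my-rds-host", user="admin", password="change-me")
        with conn.cursor() as cur:
            # Binary logging must be ON and in ROW format for DMS CDC.
            cur.execute("SHOW VARIABLES LIKE 'log_bin'")
            print(cur.fetchone())
            cur.execute("SHOW VARIABLES LIKE 'binlog_format'")
            print(cur.fetchone())
            # The binlog files currently available on the server.
            cur.execute("SHOW BINARY LOGS")
            for row in cur.fetchall():
                print(row)
            # On RDS, binlogs are purged quickly unless retention is raised.
            cur.execute("CALL mysql.rds_set_configuration('binlog retention hours', 24)")
        conn.close()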

  • @hifsamalik2942  1 year ago

    How can I use this approach to store all the update changes in one CSV file (the updated result) rather than storing each update in an individual CSV?
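
    As far as I know the S3 target itself has no single-file option; each CDC batch
    becomes its own object. One approach is a small compaction job that folds the change
    files into one consolidated CSV. A rough sketch with several assumptions: the CDC
    files carry column headers (AddColumnName enabled), include the Op column (I/U/D),
    and "id" is the primary key; bucket and prefix are placeholders:

        # Fold all CDC CSVs under a prefix into one file holding the latest state per row.
        import boto3
        import pandas as pd

        s3 = boto3.client("s3")
        BUCKET, PREFIX = "my-dms-bucket", "cdc/mydb/orders/"   # placeholders

        objects = []
        for page in s3.get_paginator("list_objects_v2").paginate(Bucket=BUCKET, Prefix=PREFIX):
            objects.extend(page.get("Contents", []))

        frames = []
        for obj in sorted(objects, key=lambda o: o["LastModified"]):   # oldest first
            body = s3.get_object(Bucket=BUCKET, Key=obj["Key"])["Body"]
            frames.append(pd.read_csv(body))

        changes = pd.concat(frames, ignore_index=True)
        latest = changes.drop_duplicates(subset=["id"], keep="last")   # newest change per key
        latest = latest[latest["Op"] != "D"].drop(columns=["Op"])      # discard deleted rows
        latest.to_csv("consolidated.csv", index=False)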

  • @josepaulo9054  1 year ago

    Good afternoon, I have a question about DMS.
    I have a source table with 50 columns and I want to generate a table with only 10 columns at the destination. Can I configure this within DMS?

  • @seqthomas1345  2 years ago

    Is there a free way to set this up for small databases? Even the cheapest option costs ~$150 a year ($0.018 an hour).

  • @cheapthrills1824  3 years ago +1

    Legend🔥🤜🤛

  • @vishalrajmane7649  3 years ago

    My DMS source endpoint is failing with the error: cannot connect to ODBC provider (ODBC general error).

  • @hoangminhninh9133  2 years ago

    Thanks for your video. If my MySQL instance is on-premises, is there a solution to move the data to S3 like this?

  • @dimmycentury  2 years ago

    How do you handle performance issues impacting your DB when you have a lot of transactions replicating to S3?

    • @KahanDataSolutions  2 years ago

      You could ramp up your database specs to give it more memory/compute power. Or perhaps switch from event replication via DMS to a more batch-oriented loading process that replicates data during times of less activity (e.g., overnight).

  • @williamlinck9971  1 year ago

    Hi Kahan, amazing content. How would you feel about making a video on setting up DMS in Terraform? In my experience, companies tend to use Terraform code to organize the whole stack, which makes console-based setups like this really uncommon in real life. I've got some code examples if you want; this is content that isn't found on RUclips, so I think it would be great for your channel as well.

  • @iamdare  2 years ago

    Thanks for the video. Please, how did you create that RDS endpoint?
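
    Assuming this means the DMS source endpoint that points at the RDS instance: in the
    console it is DMS > Endpoints > Create endpoint, using the RDS instance's hostname
    (shown on the RDS console's Connectivity & security tab) as the server name. A
    minimal boto3 sketch of the same thing, with all values as placeholders:

        # Source endpoint for an RDS MySQL instance.
        import boto3

        dms = boto3.client("dms")
        dms.create_endpoint(
            EndpointIdentifier="mysql-rds-source",                    # placeholder
            EndpointType="source",
            EngineName="mysql",
            ServerName="mydb.abc123xyz.us-east-1.rds.amazonaws.com",  # RDS hostname (placeholder)
            Port=3306,
            Username="admin",                                         # placeholder
            Password="change-me",                                     # placeholder
        )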

  • @richa6695  2 years ago +1

    Can you use DMS for business logic (for a specific use case) as well, or should it only be used for data migration?

    • @KahanDataSolutions  2 years ago +1

      Hi Richa - I'm not 100% sure if this is possible. However, I would still recommend keeping this focused only on data migration.
      Adding logic at this step will add more complexity to your overall architecture.
      You may end up having logic happening in various places, making it difficult for other developers/stakeholders to put the pieces together.

    • @richa6695  2 years ago +1

      @@KahanDataSolutions Thanks for the quick reply. I have a use case where I need to capture the data changes in a Postgres DB and push the messages to SQS. This use case can easily be achieved with Debezium and Kafka, but is there any alternative to do the same thing, preferably with AWS services? Like pushing data changes in Postgres to SQS?

    • @KahanDataSolutions  2 years ago

      @@richa6695 Perhaps you could use Lambda functions. The trigger could be the migration, and then the endpoint would be something in SQS.
      I have not tried this personally, but that's my first thought.
      Let me know if you end up implementing a solution!
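
      One AWS-native pattern in that direction (a sketch, not something from the video):
      point DMS at an S3 bucket as in this setup, add an S3 event notification on the
      CDC prefix, and have a Lambda forward each change row to SQS. The queue URL,
      bucket layout and CSV format below are assumptions:

          # Lambda handler: triggered by S3 "object created" events for new CDC files,
          # forwards every row to an SQS queue.
          import csv
          import io
          import json
          import os

          import boto3

          s3 = boto3.client("s3")
          sqs = boto3.client("sqs")
          QUEUE_URL = os.environ["QUEUE_URL"]   # set on the Lambda; placeholder

          def handler(event, context):
              for record in event["Records"]:   # one event may list several new objects
                  bucket = record["s3"]["bucket"]["name"]
                  key = record["s3"]["object"]["key"]
                  body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
                  # CDC CSV rows start with an Op column (I/U/D); headers depend on settings.
                  for row in csv.reader(io.StringIO(body)):
                      sqs.send_message(QueueUrl=QUEUE_URL, MessageBody=json.dumps(row))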

  • @dariswanjanwerip.2210  3 years ago

    Can you give a list of the services, policies, and roles that are used?
    I want to implement this migration using an IAM user.

    • @KahanDataSolutions  3 years ago

      This video will walk you through the permissions that were used for this - ruclips.net/video/9n28d8ezrLQ/видео.html

  • @petergriffin513  1 year ago

    What's your monitor's refresh rate?

    • @KahanDataSolutions  1 year ago

      I'm not sure (sorry!).
      But I'm a huge Family Guy fan. Can't believe Peter Griffin is also an aspiring data engineer!

    • @petergriffin513  1 year ago

      @@KahanDataSolutions lol, no worries.

  • @vishalrajmane7649  3 years ago

    Can you help me with this?

    • @KahanDataSolutions  3 years ago

      Hi, unfortunately I won't be able to help you here. This sounds like an issue specific to your set-up. Try searching on Google or other forums for the same error. Good luck!