Migration of an On-Premises Oracle Database to AWS RDS Using Oracle Data Pump and S3 Integration

  • Published: 25 Nov 2024

Comments • 84

  • @biggieboi358
    @biggieboi358 2 years ago +1

    Your channel is underrated. Keep posting. Good job!

    • @InfraXpertzz
      @InfraXpertzz  2 years ago +1

      Glad to see such nice comments. Please share within your circle and support us.

  • @RAJ-dn2ox
    @RAJ-dn2ox 4 years ago +1

    Excellent real-time migration video.

    • @InfraXpertzz
      @InfraXpertzz  4 years ago

      Thanks a lot for the nice comments. Please subscribe, like, and share to support us.

  • @ManojYadav-lw8tu
    @ManojYadav-lw8tu 4 years ago +1

    Very good explanation.

    • @InfraXpertzz
      @InfraXpertzz  4 years ago

      Thanks for the nice review. Please do check our AWS RDS playlist for more videos. We also request you to subscribe and share our channel to support us.

  • @mojowinskyibero8161
    @mojowinskyibero8161 2 years ago +1

    Thanks

    • @InfraXpertzz
      @InfraXpertzz  2 years ago

      You are most welcome. Glad you liked it.

  • @fortunemasamba5782
    @fortunemasamba5782 8 months ago

    Thank you, Sir.

  • @pankajmohapatra1990
    @pankajmohapatra1990 4 years ago

    Nice and very informative video.

    • @InfraXpertzz
      @InfraXpertzz  4 years ago

      Thanks for the valuable comments 🙂

  • @rehantayyab82
    @rehantayyab82 3 years ago +1

    You have good stuff. Do you have a video on migrating a RAC setup to AWS or OCI?

    • @InfraXpertzz
      @InfraXpertzz  3 years ago

      Thanks for the nice comments. We have videos on creating a RAC database on AWS. AWS RDS doesn't support RAC (it supports Multi-AZ instead), so the RAC setup needs to be done on an EC2 server. And migrating to a database hosted on an EC2 server is similar to migrating to any on-premises server or VM.

  • @vivekpuurkayastha1580
    @vivekpuurkayastha1580 1 year ago

    Hi, is it possible to automate this process? My use case: I will have an Oracle dump file in S3 and need to select some tables from the dump file, store them back to S3, transform them, write the results back to S3, and then read them using Athena... In this process I assume we can selectively import tables using INCLUDE/EXCLUDE into an RDS Oracle instance and then store files from the RDS instance into S3 as CSV/Parquet using an ETL tool... This is a weekly load and each time we will get a full data dump, so I believe once I write the data to S3 as CSV I will need to drop the tables from RDS. Can you please guide me on an overall approach?
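
    The selective-import step assumed in this question maps to DBMS_DATAPUMP.METADATA_FILTER on the RDS side. A minimal sketch, assuming the dump file sample.dmp is already in DATA_PUMP_DIR and using hypothetical table names TAB1 and TAB2:

    DECLARE
      v_hdnl NUMBER;
    BEGIN
      -- Open a TABLE-mode import job against the dump already in DATA_PUMP_DIR
      v_hdnl := DBMS_DATAPUMP.OPEN(operation => 'IMPORT', job_mode => 'TABLE', job_name => null);
      DBMS_DATAPUMP.ADD_FILE(
        handle    => v_hdnl,
        filename  => 'sample.dmp',
        directory => 'DATA_PUMP_DIR',
        filetype  => dbms_datapump.ku$_file_type_dump_file);
      DBMS_DATAPUMP.ADD_FILE(
        handle    => v_hdnl,
        filename  => 'sample_imp.log',
        directory => 'DATA_PUMP_DIR',
        filetype  => dbms_datapump.ku$_file_type_log_file);
      -- Import only the named tables (the PL/SQL equivalent of an impdp INCLUDE/TABLES clause);
      -- TAB1 and TAB2 are hypothetical names
      DBMS_DATAPUMP.METADATA_FILTER(v_hdnl, 'NAME_EXPR', 'IN (''TAB1'', ''TAB2'')', 'TABLE');
      DBMS_DATAPUMP.START_JOB(v_hdnl);
    END;
    /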

  • @moedweik9523
    @moedweik9523 1 year ago +1

    Excellent video! Would this also work for a 200-300 GB Oracle dump file created on a RHEL server? Do I need 300 GB of local disk space on my laptop and 300 GB of S3 storage? It seems you are moving the file from the on-premises server to a local Windows machine and then uploading it to AWS S3.

    • @InfraXpertzz
      @InfraXpertzz  1 year ago

      Glad you liked it. Yes, it will work for the size you mentioned. We had to transfer to a local Windows machine since the VM used here couldn't communicate with the outside world. You can send the dump directly to S3 from any server with the AWS CLI enabled, or send it to a jump server with more than 300 GB of space before transferring it to S3.
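
      Once the dump is in S3, the RDS side pulls it into DATA_PUMP_DIR with the rdsadmin S3-integration task; a minimal sketch, assuming the S3 integration option is attached to the instance and using a hypothetical bucket name my-dump-bucket:

      -- Returns a task ID whose log can be checked for progress
      SELECT rdsadmin.rdsadmin_s3_tasks.download_from_s3(
               p_bucket_name    => 'my-dump-bucket',   -- hypothetical bucket
               p_directory_name => 'DATA_PUMP_DIR')
             AS task_id
        FROM dual;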

  • @pratibhay07
    @pratibhay07 3 years ago +1

    Nice demo. This method is good for a small database. Can you make one video on CloudEndure migration?

    • @InfraXpertzz
      @InfraXpertzz  3 years ago

      Thanks a lot for the nice comments. We will look into CloudEndure migration and will definitely try to post a video once we get through it.

    • @InfraXpertzz
      @InfraXpertzz  3 years ago

      We have posted a session on CloudEndure migration. Please check it out. Hope you find it helpful.

  • @chandramunagala2040
    @chandramunagala2040 3 years ago +1

    Thank you, it's really helpful. Do you have any videos on exporting schemas/tablespaces from an Oracle database?

    • @InfraXpertzz
      @InfraXpertzz  3 years ago

      Thanks for the comments. For exporting schemas/tablespaces from RDS, the procedure is the same; only the Data Pump syntax changes. You can refer to the following link for the syntax: docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Oracle.Procedural.Importing.html#Oracle.Procedural.Importing.DataPumpDBLink.Step6
      The syntax is:
      DECLARE
        v_hdnl NUMBER;
      BEGIN
        v_hdnl := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'SCHEMA', job_name => null);
        DBMS_DATAPUMP.ADD_FILE(
          handle    => v_hdnl,
          filename  => 'sample.dmp',
          directory => 'DATA_PUMP_DIR',
          filetype  => dbms_datapump.ku$_file_type_dump_file);
        DBMS_DATAPUMP.ADD_FILE(
          handle    => v_hdnl,
          filename  => 'sample_exp.log',
          directory => 'DATA_PUMP_DIR',
          filetype  => dbms_datapump.ku$_file_type_log_file);
        DBMS_DATAPUMP.METADATA_FILTER(v_hdnl, 'SCHEMA_EXPR', 'IN (''SCHEMA_1'')');
        DBMS_DATAPUMP.START_JOB(v_hdnl);
      END;
      /
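
      After the export above completes, the dump file can be pushed from the RDS instance's DATA_PUMP_DIR to S3 with the companion upload task; a minimal sketch, assuming the same S3 integration setup and a hypothetical bucket name my-dump-bucket:

      SELECT rdsadmin.rdsadmin_s3_tasks.upload_to_s3(
               p_bucket_name    => 'my-dump-bucket',   -- hypothetical bucket
               p_prefix         => 'sample',           -- only upload files whose names start with 'sample'
               p_s3_prefix      => '',
               p_directory_name => 'DATA_PUMP_DIR')
             AS task_id
        FROM dual;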

  • @prisillarachel2007
    @prisillarachel2007 1 year ago

    Please explain how to migrate a full DB export from EC2 and import it on Oracle RDS.

  • @mahesh667
    @mahesh667 3 years ago +1

    Thanks a lot! Rather than using PL/SQL procedures and packages, is there any issue with migrating using the Data Pump utility (expdp/impdp)? I chose impdp. Your comments, please.

    • @InfraXpertzz
      @InfraXpertzz  3 years ago

      We thought we had already replied, hence the delay in responding. How would you use impdp with RDS other than through PL/SQL procedures and packages, when you do not have a login to the DB server? To use impdp, we need access to the DB server, right?

    • @mahesh667
      @mahesh667 3 years ago +1

      @@InfraXpertzz Thanks for the reply. One option is to install the Oracle client on a remote Windows/Linux machine; then we can use the Data Pump utility, SQL*Plus, and other utilities. I have done a 1 TB DB import using Data Pump on my 18c RDS.

    • @InfraXpertzz
      @InfraXpertzz  3 years ago

      Is this through a DB link? I haven't tried this yet, but it looks like a good method.

    • @mahesh667
      @mahesh667 3 years ago

      @@InfraXpertzz Not using a DB link; it uses a TNS string.

    • @InfraXpertzz
      @InfraXpertzz  3 years ago +2

      Thanks for the confirmation. We will plan to explore this too.
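
      For readers who do want the DB-link route raised above: the AWS page linked earlier in this comment section (the DataPumpDBLink section) copies the dump file into the RDS instance's DATA_PUMP_DIR over a database link using DBMS_FILE_TRANSFER. A minimal sketch, run on the source database, assuming a hypothetical database link named to_rds and a dump file sample.dmp in the source DATA_PUMP_DIR:

      BEGIN
        DBMS_FILE_TRANSFER.PUT_FILE(
          source_directory_object      => 'DATA_PUMP_DIR',
          source_file_name             => 'sample.dmp',
          destination_directory_object => 'DATA_PUMP_DIR',
          destination_file_name        => 'sample_copied.dmp',
          destination_database         => 'to_rds');  -- hypothetical DB link to the RDS instance
      END;
      /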

  • @BiniAshin
    @BiniAshin 1 year ago

    Nice video and good explanation. I have a question: can we use Data Pump to import CSV data files? I have a use case where the data is in CSV and I want to upload it to RDS. Can this be done using RDS-S3 integration and Data Pump?

    • @InfraXpertzz
      @InfraXpertzz  1 year ago

      Apologies, but I haven't tried it yet, so I can't comment much on this.

  • @vinothkumarsundaram5383
    @vinothkumarsundaram5383 2 years ago

    Really very good explanation, thanks a lot. One doubt: is there any limit on S3 storage for transferring the dump? If the dump is large, how do we speed up the transfer and add storage in S3?

    • @InfraXpertzz
      @InfraXpertzz  2 years ago +1

      Thanks a lot for the nice comments. I am not aware of any limitations, though. Still, if you are planning to work on this in your project environment, I would suggest raising a case with AWS to get more details.

  • @frankfrank6463
    @frankfrank6463 2 years ago +1

    If you are on a corporate network, don't you have to open the firewall for port 1521 to the RDS endpoint or IP address so your SQL Developer client can connect to the AWS RDS service on 1521?

    • @InfraXpertzz
      @InfraXpertzz  2 years ago

      Yes, you have to. The port also needs to be opened in the AWS security group for whichever network you are on.

    • @frankfrank6463
      @frankfrank6463 2 years ago

      @@InfraXpertzz It might take security a few weeks to open up port 1521 on the corporate firewall. Do you know if there is a way to connect to RDS using a SQL*Plus client on the same machine where the RDS instance runs, so I can execute the SQL commands to run Data Pump?

    • @InfraXpertzz
      @InfraXpertzz  2 years ago

      If port 1521 is opened in the AWS security group, you can create an EC2 instance, install the Oracle client there, and connect using SQL*Plus.

    • @frankfrank6463
      @frankfrank6463 2 years ago

      The problem with using EC2 to connect to RDS is that AWS will charge extra for the EC2 instance. Port 1521 is open in the security group attached to the RDS instance. I even tested access on port 1521 from my home computer and it did not work. When I run traceroute, I do see traffic leaving the local network toward AWS servers, but then I start seeing timeouts.

  • @frankfrank6463
    @frankfrank6463 2 years ago

    Can you explain how Oracle SQL Developer running on your local machine connected to the Oracle database in AWS RDS? Also, don't you have to create an Oracle instance in AWS and link it to RDS before you do this migration?

    • @InfraXpertzz
      @InfraXpertzz  2 years ago

      With the RDS credentials and connectivity details, you can connect from your local SQL Developer. Please check the full video for an understanding. RDS is the database itself, so there is no need to create a separate Oracle instance in AWS.

    • @frankfrank6463
      @frankfrank6463 2 years ago

      @@InfraXpertzz My thinking is that RDS is a relational database service that can support MySQL, SQL Server, Oracle, etc. I saw another video where they created an Oracle instance using RDS. Also, are you connecting from your local SQL Developer to AWS using a VPN connection and port 1521?

  • @jalnil75
    @jalnil75 3 years ago +1

    Do you have all the steps written in Notepad or Word?

    • @InfraXpertzz
      @InfraXpertzz  3 years ago

      Please check the blog link in the video description. You will find the steps there.

  • @rejectedideas1827
    @rejectedideas1827 3 years ago

    I want to transfer just a few tables to S3 and stop, and this should happen automatically on a schedule. After that, another team will take care of the S3 data.
    Is that possible, without RDS in the picture?

    • @InfraXpertzz
      @InfraXpertzz  3 years ago

      You can do this through a DBMS_SCHEDULER job, or you can even set up a cron job on EC2 to take the export through a network link. See the sketch below for the scheduler route.
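
      A minimal sketch of the scheduler route, assuming a hypothetical stored procedure MY_SCHEMA.EXPORT_TABLES_TO_S3 that wraps the DBMS_DATAPUMP export and the S3 upload shown elsewhere in this thread:

      BEGIN
        DBMS_SCHEDULER.CREATE_JOB(
          job_name        => 'WEEKLY_TABLE_EXPORT',            -- hypothetical job name
          job_type        => 'STORED_PROCEDURE',
          job_action      => 'MY_SCHEMA.EXPORT_TABLES_TO_S3',  -- hypothetical wrapper procedure
          repeat_interval => 'FREQ=WEEKLY; BYDAY=SUN; BYHOUR=2',
          enabled         => TRUE);
      END;
      /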

  • @fafonso54
    @fafonso54 3 years ago +1

    Could you at least add English subtitles? The pronunciation is very difficult.

    • @InfraXpertzz
      @InfraXpertzz  3 years ago

      You can check the blog from the link provided in the video description.

  • @curiousenough5906
    @curiousenough5906 3 years ago

    Can this on-prem Oracle DB be converted to AWS PostgreSQL via S3 integration?

    • @InfraXpertzz
      @InfraXpertzz  3 years ago +1

      Sorry, we need to check on that. Using DMS and AWS SCT it is possible, but with S3 integration we cannot comment much on it.

    • @curiousenough5906
      @curiousenough5906 3 years ago +1

      @@InfraXpertzz Cool, do you have a video/reference on AWS SCT and DMS?

    • @InfraXpertzz
      @InfraXpertzz  3 years ago

      Yes, it's there; you can check the playlist. It covers Oracle to MySQL. This is a demo shown in a test environment; in real life you might encounter more issues while converting, which need to be handled.

    • @InfraXpertzz
      @InfraXpertzz  3 years ago +1

      You can also refer to the blog below:
      infraxpertzz.com/heterogenous-database-migration-oracle-to-mysql-using-aws-sct-and-aws-dms/

    • @curiousenough5906
      @curiousenough5906 3 years ago

      @@InfraXpertzz thank you, appreciated :)

  • @mahesh667
    @mahesh667 3 years ago

    Hi, can you do me a favor, please?
    I have been trying to modify an init parameter in AWS RDS with the command ALTER SYSTEM SET pga_aggregate_limit=0 SCOPE=BOTH; but I am unable to run it. Can you please help me with the relevant AWS RDS way to modify the same parameter? It's a little urgent, thanks.

    • @InfraXpertzz
      @InfraXpertzz  3 years ago +1

      Most parameters need to be changed through parameter groups in the RDS console. However, I will check and let you know whether this is feasible.
      One more thing I noticed: the pga_aggregate_limit parameter is a static one in AWS. If you go to the parameter group and search for pga, you will see the apply type is static, so SCOPE=BOTH won't work here.

    • @InfraXpertzz
      @InfraXpertzz  3 years ago

      Please go through the link below for the tasks you can do from the command line:
      docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Appendix.Oracle.CommonDBATasks.html

    • @mahesh667
      @mahesh667 3 years ago +1

      @@InfraXpertzz Thanks a lot; we have changed it from the RDS console and it's working as expected now.

    • @InfraXpertzz
      @InfraXpertzz  3 years ago

      Did you have to restart the database? It shows as a static parameter in the parameter group.

    • @mahesh667
      @mahesh667 3 years ago +1

      @@InfraXpertzz Yes; as it's static in AWS, after the change it showed 'pending reboot' to take effect, so we rebooted the instance.

  • @qazisaifhussain6562
    @qazisaifhussain6562 3 years ago

    Please share your document.

    • @InfraXpertzz
      @InfraXpertzz  3 years ago

      Please check the description for links.