Thank you very much, Praveen, super explanation. ☺
Sure, np!!
To-the-point tutorial, thanks man.
Glad it helped
Exactly what I was looking for. Thanks!
Great to hear!
Thanks a lot brother.
Sure, Np!!
Nice video. Just one question: why did you select Data Lake Store instead of Blob storage? I am working on a project, so a quick answer would be appreciated. Thanks
I used ADLS for this exercise. We can choose the storage service based on our requirements: if the data volume is less than 250 GB, I prefer Blob. ADLS is designed for big data analytics, and we can query data directly in ADLS, which we can't do in Blob. There are a lot of differences between Blob and ADLS; please go through those modules to get a better idea. Thanks.
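To make one of those differences concrete: ADLS Gen2 has a hierarchical namespace where directories are first-class objects, which a flat Blob container doesn't natively model. A minimal sketch using the azure-storage-file-datalake package; the connection string, filesystem, and folder names are placeholders, not values from the video:

```python
# A minimal sketch, assuming the azure-storage-file-datalake package and a
# placeholder connection string; it walks paths under a directory, something
# a flat Blob container has no native notion of.
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient.from_connection_string("<connection-string>")
fs = service.get_file_system_client("raw")   # ADLS Gen2 filesystem (container)

# Walk the hierarchical namespace: directories are real objects here.
for path in fs.get_paths(path="sales/2024"):
    print(path.name, "dir" if path.is_directory else "file")
```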
Thank you for this video. My requirement is that data accumulates in an S3 bucket daily, and we want to pull it into Azure daily. Can we do that? I guess the scheduling option was there where we chose "Run Once"; could you please confirm? Also, how can we access this data in Power BI? Could you please help with that as well.
Yes, you can do that with Azure Data Factory or Azure Synapse Studio. I made recent videos on this use case. Thanks.
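For the scheduling part, a minimal sketch of a daily schedule trigger via the azure-mgmt-datafactory Python SDK, assuming a pipeline named "CopyS3ToAdls" (sketched later in the thread); all names, IDs, and times are placeholders:

```python
# A minimal sketch, assuming the azure-mgmt-datafactory SDK and a pipeline
# named "CopyS3ToAdls"; names, IDs, and times below are placeholders.
from datetime import datetime, timezone

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    TriggerResource, ScheduleTrigger, ScheduleTriggerRecurrence,
    TriggerPipelineReference, PipelineReference,
)

RG, DF = "<resource-group>", "<factory-name>"
adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

trigger = ScheduleTrigger(
    recurrence=ScheduleTriggerRecurrence(
        frequency="Day", interval=1,                      # once per day
        start_time=datetime(2024, 1, 1, 2, 0, tzinfo=timezone.utc),
        time_zone="UTC"),
    pipelines=[TriggerPipelineReference(
        pipeline_reference=PipelineReference(
            type="PipelineReference", reference_name="CopyS3ToAdls"))],
)
adf.triggers.create_or_update(RG, DF, "DailyS3Copy",
                              TriggerResource(properties=trigger))
# Triggers are created stopped; start it so scheduled runs actually fire.
adf.triggers.begin_start(RG, DF, "DailyS3Copy")
```

For the Power BI side, the built-in Azure Data Lake Storage Gen2 connector can then read the copied files directly from the destination account.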
Can you please create a video for the reverse, i.e. the data source is Azure Gen2 and the target is AWS S3?
Yes sure!!
Can we schedule this? A file is stored in S3 daily, so I need to copy it daily from S3 to Azure storage.
Yes, we can schedule the pipeline (see the trigger sketch above).
We want to move 1.5 TB of data from an AWS S3 bucket to Azure storage.
Question: can we go with Gen2 storage or Data Lake Storage?
To move the data, is an S2S VPN connection needed between AWS and Azure, or is it not required when we use ADF?
A quick response would be appreciated.
Yes, sure, you can go with ADLS Gen2.
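On the VPN question: ADF's copy activity reaches both services over their public HTTPS endpoints by default, so an S2S VPN between AWS and Azure is not required for this. For reference, a minimal sketch of the S3-to-ADLS Gen2 copy through the azure-mgmt-datafactory Python SDK; every resource name, key, and path below is a placeholder assumption, not a value from the video:

```python
# A minimal sketch of an S3 -> ADLS Gen2 copy via the azure-mgmt-datafactory
# SDK; all names, keys, and paths are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    LinkedServiceResource, AmazonS3LinkedService, AzureBlobFSLinkedService,
    SecureString, DatasetResource, BinaryDataset, AmazonS3Location,
    AzureBlobFSLocation, LinkedServiceReference, DatasetReference,
    PipelineResource, CopyActivity, BinarySource, BinarySink,
)

RG, DF = "<resource-group>", "<factory-name>"
adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Linked services: source bucket and destination ADLS Gen2 account.
adf.linked_services.create_or_update(RG, DF, "S3Source", LinkedServiceResource(
    properties=AmazonS3LinkedService(
        access_key_id="<access-key-id>",
        secret_access_key=SecureString(value="<secret-access-key>"))))
adf.linked_services.create_or_update(RG, DF, "AdlsSink", LinkedServiceResource(
    properties=AzureBlobFSLinkedService(
        url="https://<account>.dfs.core.windows.net",
        account_key="<storage-account-key>")))

# Binary datasets so files are copied as-is, with no format parsing.
adf.datasets.create_or_update(RG, DF, "S3Files", DatasetResource(
    properties=BinaryDataset(
        linked_service_name=LinkedServiceReference(
            type="LinkedServiceReference", reference_name="S3Source"),
        location=AmazonS3Location(bucket_name="<bucket>", folder_path="data"))))
adf.datasets.create_or_update(RG, DF, "AdlsFiles", DatasetResource(
    properties=BinaryDataset(
        linked_service_name=LinkedServiceReference(
            type="LinkedServiceReference", reference_name="AdlsSink"),
        location=AzureBlobFSLocation(file_system="raw", folder_path="landing"))))

# One pipeline with a single copy activity, then kick off a run.
adf.pipelines.create_or_update(RG, DF, "CopyS3ToAdls", PipelineResource(
    activities=[CopyActivity(
        name="CopyFromS3",
        inputs=[DatasetReference(type="DatasetReference", reference_name="S3Files")],
        outputs=[DatasetReference(type="DatasetReference", reference_name="AdlsFiles")],
        source=BinarySource(), sink=BinarySink())]))
run = adf.pipelines.create_run(RG, DF, "CopyS3ToAdls")
print("run id:", run.run_id)
```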
I got this error: "The bucket you are attempting to access must be addressed using the specified endpoint." How can we resolve it, please?
Did you follow the steps that I performed in this module?
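This message usually means the bucket lives in a different AWS region than the endpoint the connection is calling. A minimal sketch, assuming the azure-mgmt-datafactory SDK and placeholder names, that recreates the S3 linked service with an explicit regional service URL:

```python
# A minimal sketch: recreate the S3 linked service with an explicit regional
# service URL so requests are addressed to the bucket's own region.
# All names and keys below are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    LinkedServiceResource, AmazonS3LinkedService, SecureString,
)

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

s3_ls = AmazonS3LinkedService(
    access_key_id="<access-key-id>",
    secret_access_key=SecureString(value="<secret-access-key>"),
    # Match this to the bucket's region, e.g. eu-west-1 for an Ireland bucket.
    service_url="https://s3.eu-west-1.amazonaws.com",
)
client.linked_services.create_or_update(
    "<resource-group>", "<factory-name>", "S3LinkedService",
    LinkedServiceResource(properties=s3_ls),
)
```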
How can we then copy the data from the container to Azure SQL tables?
Hi Dhyaanesh, there are multiple modules available on the channel that cover this. Thanks.
Nice video. How much does it cost to transfer data from S3 to Azure?
It depends. Here is the AWS calculator:
calculator.aws/#/
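The main charge is typically AWS data-transfer-out (egress), plus the copy-activity charges on the Azure side. Purely illustrative arithmetic with an assumed rate; check the calculator for real, current pricing:

```python
# Illustrative only: the egress rate is an assumed placeholder, not current
# AWS pricing; use calculator.aws for real numbers.
data_gb = 1.5 * 1024        # e.g. the 1.5 TB mentioned above, in GB
assumed_rate = 0.09         # assumed $/GB for S3 data transfer out
print(f"~${data_gb * assumed_rate:,.2f} in AWS egress")  # ~$138.24
```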
I have tried this process a few times and keep getting a failed connection to my S3: "Credential is incorrect or do not have the permission to access. Access Denied".
Would you be able to make a video of how you created your S3 bucket?
Sure. Here is the video module.
ruclips.net/video/Cij5_4yLHLA/видео.html
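One quick check in the meantime: try the same key pair outside ADF. A minimal sketch with boto3 (bucket and keys are placeholders); if this also gets Access Denied, the IAM user lacks s3:ListBucket / s3:GetObject on the bucket rather than ADF being misconfigured:

```python
# A minimal sketch, assuming boto3 and the same key pair used in the ADF
# linked service; it simply lists a few objects in the bucket.
import boto3

s3 = boto3.client(
    "s3",
    aws_access_key_id="<access-key-id>",
    aws_secret_access_key="<secret-access-key>",
)
resp = s3.list_objects_v2(Bucket="<bucket-name>", MaxKeys=5)
for obj in resp.get("Contents", []):
    print(obj["Key"])
```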
Can you please create a video on how to copy data from an AWS S3 bucket to Azure using AWS DataSync?
Yes sure!