Thanks a lot, Sir! It helped me a lot, works like a charm.
Respect from Lahore, Pakistan.
You are always helpful. I frequently visit your channel for new updates. Thanks for the material you provide at no cost.
Quick question:
Can we make this process automated? For example, whenever I insert something in bucket-1, it should reflect in bucket-2
(like CRR within an individual account). Thanks in advance!
Thank you very much for being a regular on our channel. Share the goodness!
Why not use CRR itself? If not, an S3 Event trigger to a Lambda (to copy) would do it. If you want the Lambda to be written, let us know.
You can write a replication policy in S3; it will replicate new data that arrives in the source bucket.
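For anyone who wants to try that, here is a rough sketch of setting replication up from the CLI; the bucket names, account ID, and role ARN are placeholders, not values from the video:
# Versioning must be enabled on both buckets before replication can be configured
aws s3api put-bucket-versioning --bucket source-bucket --versioning-configuration Status=Enabled
aws s3api put-bucket-versioning --bucket destination-bucket --versioning-configuration Status=Enabled
# Attach a replication rule that sends new objects to the destination bucket
aws s3api put-bucket-replication --bucket source-bucket --replication-configuration '{
  "Role": "arn:aws:iam::111122223333:role/s3-replication-role",
  "Rules": [{
    "Status": "Enabled",
    "Priority": 1,
    "Filter": {},
    "DeleteMarkerReplication": {"Status": "Disabled"},
    "Destination": {"Bucket": "arn:aws:s3:::destination-bucket"}
  }]
}'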
Very helpful, it works, thanks. I want to know two things.
1. I am transferring files for corporate purposes. It has been copying for 2 hours. What if I stop and start syncing again tomorrow? Will it resume, or create duplicate copies?
2. If objects are added to the source bucket later, can I sync them again with this command to copy those objects?
I have just tried CRR (just now, for you):
1/ It resumes; no duplicate copies. Objects are version controlled, so you will get a new version.
2/ I will explore CRR/SRR with replica modification sync, which takes care of these edge cases.
If you need further assistance, reach out to us by email; we should be able to help you.
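If it helps, you can also preview what a re-run of the sync would actually transfer before committing to it; --dryrun is a standard aws s3 sync flag, and the bucket names below are placeholders:
# Lists the copy operations sync would perform without executing them;
# objects that already match in size and timestamp are skipped
aws s3 sync s3://source-bucket s3://destination-bucket --dryrun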
@@ValaxyTechnologies Thanks for your help
It's working fine, but when I checked the source and destination buckets, the objects differ: the storage size is the same but the object count is different.
Very useful info.
Have a question:
If I add new files to the source bucket, is it going to sync them to the destination bucket automatically, or do we have to run the sync command again?
Hi - I am a big fan of your videos. They are very informative. A BIG THANKS to you.
I am trying to copy all the buckets, folders, and files from one account to another account while retaining the ACLs, and later keep the syncing process running using S3 Event Notification. The video above shows this for only one bucket, but I have 100+ buckets. How do I do it? I tried using CrossFTP and s3s3mirror, but neither of them worked. Am I missing something?
Email us; we can have a call and discuss how to do it.
@@ValaxyTechnologies Can you please share your email address?
Thanks a lot for explaining this very well.
My pleasure
Nice explanation.
Thank you. Hope you liked it :)
What is the instance (@172.31.82.194) you used to execute the CLI? Is it the destination or some other machine? I couldn't figure it out from your video narration.
That is my local desktop configured with the AWS CLI (i.e., with access keys/secret keys). It could be any machine with the appropriate credentials configured.
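For anyone reproducing this, wiring up any machine usually amounts to the following; the profile name is just an example:
# Prompts for the access key, secret key, default region, and output format
aws configure --profile sync-demo
# Any s3 command can then run under those credentials
aws s3 ls --profile sync-demo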
Explained very well. Could you please make a video on automated S3-to-S3 data transfer in the same region? Thanks.
Super
Thanks
Now you can follow us on Instagram as well
instagram.com/valaxytechnologies/
Is there any option to do this in Node.js through the aws-sdk npm package?
How do I make the file transfer more secure? In this case the file gets transferred over the default public S3 endpoint (public IP). What if I want to do the transfer over a private IP, for example VPC endpoints or a VPN? Is that possible?
Ln 7, Col 33: Missing ARN Field: Resource ARNs must include at least 6 fields and include the following structure: arn:partition:service:region:account:resource
I am getting the above error; could you please guide me?
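That error generally means a Resource entry in the policy is missing one of the colon-separated ARN fields. For S3, the region and account fields are left empty, but their colons must still be present; a sketch with a placeholder bucket name:
"Resource": [
  "arn:aws:s3:::my-example-bucket",
  "arn:aws:s3:::my-example-bucket/*"
]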
Is there a way to copy the objects along with their permissions?
Yes, it should be possible. What have you tried so far?
In my source bucket some objects are public and some are private. When I moved all the objects to the destination using the aws sync command, all the objects in the destination became private.
@@narenpushparaju8416 Try using the --acl option. This might interest you - github.com/aws/aws-cli/issues/3215
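For reference, that option applies one canned ACL to everything copied, so per-object public/private settings are not carried over individually; a minimal sketch with placeholder bucket names:
aws s3 sync s3://source-bucket s3://destination-bucket --acl bucket-owner-full-control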
Hello Sir, how do I sync a source S3 bucket to another account's S3 bucket when both are in the same region and access should go through the STS session token service?
Same Region Replication
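Since the question above mentions STS session tokens, a rough sketch of running a sync under temporary credentials; the role ARN and bucket names are placeholders:
# Assume a role in the other account and note the temporary credentials in the output
aws sts assume-role --role-arn arn:aws:iam::111122223333:role/s3-sync-role --role-session-name s3-sync
# Export the AccessKeyId, SecretAccessKey, and SessionToken values from that output
export AWS_ACCESS_KEY_ID=...
export AWS_SECRET_ACCESS_KEY=...
export AWS_SESSION_TOKEN=...
aws s3 sync s3://source-bucket s3://destination-bucket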
Tried this; it did not work. I am getting the below error:
fatal error: An error occurred (AccessDenied) when calling the ListObjects operation: Access Denied
What permissions does the AWS CLI user have?
I was able to resolve this after adding a policy to the user that allows full access to S3 objects.
Full permissions are almost always overkill (though they solve the issue). If you are doing this for production deployments and are going to use it regularly, you should use more restrictive permissions.
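As a sketch of what "more restrictive" could look like for a one-way sync; the user name, policy name, and bucket names are placeholders to adjust to your setup:
# List permission on both buckets, read on source objects, write on destination objects
aws iam put-user-policy --user-name sync-user --policy-name s3-sync-minimal --policy-document '{
  "Version": "2012-10-17",
  "Statement": [
    {"Effect": "Allow", "Action": "s3:ListBucket",
     "Resource": ["arn:aws:s3:::source-bucket", "arn:aws:s3:::destination-bucket"]},
    {"Effect": "Allow", "Action": "s3:GetObject", "Resource": "arn:aws:s3:::source-bucket/*"},
    {"Effect": "Allow", "Action": "s3:PutObject", "Resource": "arn:aws:s3:::destination-bucket/*"}
  ]
}'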
Can we automate the process so that it copies every time something new is added to the source bucket?
Yep - Check out S3 Event Notification - ruclips.net/video/EGyuzMbXD0Y/видео.html
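For the curious, hooking bucket events to a Lambda from the CLI looks roughly like this; the bucket name and function ARN are placeholders, and the Lambda also needs a resource-based permission allowing S3 to invoke it:
aws s3api put-bucket-notification-configuration --bucket source-bucket --notification-configuration '{
  "LambdaFunctionConfigurations": [{
    "LambdaFunctionArn": "arn:aws:lambda:us-east-1:111122223333:function:copy-to-destination",
    "Events": ["s3:ObjectCreated:*"]
  }]
}'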
@Valaxy Technologies, Hi, I am trying the following: two different accounts, replicating a bucket from account A to account B. The bucket gets replicated, but when I open a file in the destination account, I get a permission denied error.
In the destination bucket I see the following: Server-side encryption: Access denied.
I am not sure how to fix this. I am also not sure about the exact policy that should be on the source side to grant permission to open the files in the destination bucket. Please help.
Email us and we can discuss it, as there are quite a few things that could be misconfigured.
@@ValaxyTechnologies Can you please share the email address?
How do I set up the source and destination accounts with the AWS CLI?
aws s3 sync s3://my-us-west-2-bucket s3://my-us-east-1-bucket --source-region us-west-2 --region us-east-1
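A note on the two-account part of the question: the command above syncs across regions under a single set of credentials. For a cross-account sync you would run the same command as an identity that can read the source bucket and write the destination bucket (for example, granted via a bucket policy), optionally selected with a profile; the profile name here is a placeholder:
aws s3 sync s3://source-bucket s3://destination-bucket --profile account-with-access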
OK, fine, I got it, thanks. And can I use Transfer Acceleration on the destination bucket for the transfer? Will it work?
And how does the s3 sync work? Will it download files to the local machine and then upload them to the new bucket, or will it upload automatically?
All traffic stays on Amazon's side of the network; there is no downloading to local systems.
How do I sync multiple S3 buckets in the same account so that the destination bucket ends up with only unique files?
Can you elaborate on your requirement? Is this for a client, or just to clarify a learner's doubt?
@@ValaxyTechnologies We planned to sync an S3 bucket with another cloud storage provider, but it failed, so we are planning to sync multiple S3 buckets within the same account instead of a cross-account sync. How do we do this?
@@ValaxyTechnologies learner doubt
@@rahulkrishnam5923 What is the objective here? Just to create a copy of objects for backup? (If so, why not create multiple destination buckets and use S3 CRR?) What is failing, and what have you tried?
After copying to the destination bucket, the objects should automatically get deleted from the source bucket, and there should be two-way file sharing. Can this be done?