Eli is the first source of information when I want to learn something new or just review things I have not worked with in a while. He is clear, easy to understand and the material he uses to show you something is easy to get.
ELI IS A TEACHER WHO CAN TEACH
Two questions:
1) How to exclude more than one path?
Like: I don't want to keep /mnt and /media and /tmp
2) How to create a backup file for more than one path?
Like: I want to create an archive file, and I want it to include /usr/share and /bin and /var/www.
BTW sir, you're awesome.
Keep doing this. :)
tar -cvpzf your_backup.tar.gz --exclude={"/mnt","/media","/tmp"} /usr/share /bin /var/www
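A small self-contained check of that form, in a throwaway directory (the tree and file names are made up for the demo; repeating `--exclude` once per path is equivalent to the brace-expansion form, and works in any shell, not just bash):

```shell
# Build a throwaway tree so the exclude behaviour can be verified safely.
tmp=$(mktemp -d) && cd "$tmp"
mkdir -p keep mnt media
echo data > keep/file.txt
echo junk > mnt/skip.txt
echo junk > media/skip.txt

# Multiple excludes and multiple paths in one create command.
tar -cvpzf backup.tar.gz --exclude=mnt --exclude=media keep mnt media

# Listing the archive shows only keep/ made it in.
tar -tzf backup.tar.gz
```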
Einstein said that any intelligent fool can make things bigger, more complex.
But it takes a touch of genius and lots of courage to move something in the opposite direction.
Thank You for this tutorial
Thank you so much for making Linux so simple to understand. You are the best teacher ever! I watched all the 9 videos from introduction to the end, every class was very easy to understand and I would be glad to watch all your classes. You just earned a subscriber.
Here we are, some 10 years later, and THIS is still the best way I could find to back up my Linux systems (maintaining permissions and symlinks) to an NFS-mounted NAS drive and AWS S3 bucket.
Great instruction. Just moved a website to a dedicated server and plan on using this to perform backups. Thanks dude!
you are a star. after this many years, no one has such a great explanation
Watching in 2022. Eleven years later this is still very useful. thank you.
I just went through your whole Linux course tonight, massively appreciated, very clearly presented.
Eli you have by far the best tutorials on RUclips! I've learned so much from these linux videos and I love how you focus on what matters!
what a king you are! That is how things should be explained. I run into many tutorials that skip points and leave people confused. Thank you
The most excellent teacher I have seen in my life since I started school in 1978
Can't wait for tom to watch this so i'm watching it now... watched your whole Linux class straight and i loved it.. THANKS ALOT!
Thx Eli you are the bomb!! I've watched about 35 of your videos so far and haven't been disappointed yet, thanks for making it easy to grasp!
I wish I would have found you weeks ago! Thank you for explaining this down to the meaning of each character. Much appreciated!
you made my life a lot easier with this video.....keep posting these kind of videos...you are too good in teaching eli
Thanks Eli for your patient and clear class. I learned a lot from you.
wOw!!!!!!
I never thought I would learn Linux this fast.... 2 days and I know a hell of a lot of things....
THUMBS UP ELI !!!! U R AWESOME
Thank you for your efforts to share your knowledge with everyone who wants it. These are very helpful to me.
Best video, your narration is so good it's easy to keep track, and your recap actually makes it hard to forget! Thank you so much! Looking forward to more videos :)
You can backup /
If you need a bare metal backup you can just use a piece of backup software.
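When archiving `/` with tar, the usual caveat is to exclude the pseudo-filesystems and the archive itself. A sketch of the command (the exclude list below is the customary one, not from the video; adjust paths to taste):

```shell
sudo tar -cvpzf /backup.tar.gz \
    --exclude=/backup.tar.gz \
    --exclude=/proc --exclude=/sys --exclude=/dev --exclude=/run \
    --exclude=/mnt --exclude=/media --exclude=/tmp \
    /
```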
Woow AWESOME course!
You are taking time explaining every single thing, + you are speaking at a perfect pace for a non-native English speaker
Thumb up + Subscribed
Did the 19 who disliked your video try it on Windows?
God bless you Eli! Your videos are lifesavers! Thank you ! I wish I could replace my current teacher with you.
Nano is an excellent editor for those finding Vim too difficult.
Fantastic tutorial. I am so happy that I found your channel. Keep up the great work.
thank you sir for your valuable videos. I learned lots of thing from basic to advance from your tutorials.
thanks a lot
Awesome job Eli! I had so much fun and boosted my confidence. Cheers!
Excellent Video. Thanks for helping me understand Linux a little better!
Eli, you are great, nice video and presentation. I enjoy watching all of your video.
hi Eli, I have an idea for part two of this tutorial - I intend to do a backup of my system, so that if it ever gets corrupted (after an update/upgrade), I could go back to the previous working Linux from before the corruption.
That's something I am missing in this tutorial.
BTW - your tutorials are amazing! I've seen around 20 to date and they are so easy to understand! thank you for sharing your knowledge with us publicly. cheers!
Awesome I was searching for a linux backup program .... but this is so simple ... thanks!
This was so well explained. Thanks a lot ETCG.
appreciate your videos lessons and logical well expressed manner 👍
All very good! Except using wordpress :-) Thank you for sharing! Just want to add that running the tar -c.. command does NOT add new files or update old ones inside an existing tar.gz file - it overwrites the whole archive every time, just take it into consideration! Also, you can recover a single file from the tar by naming that file after the archive on the extract command.
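A quick demo of the single-file restore (throwaway paths; with GNU tar you name the member after the archive):

```shell
# Make a tiny archive with two files in it.
tmp=$(mktemp -d) && cd "$tmp"
mkdir -p www/a www/b
echo one > www/a/one.txt
echo two > www/b/two.txt
tar -czf site.tar.gz www

rm -rf www                           # simulate data loss
tar -xzf site.tar.gz www/a/one.txt   # restore just the one file
cat www/a/one.txt
```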
Eli you are a awesome teacher
Yes you can, everything you can do with the server can be done with the desktop using the terminal
Thank you kind brother. You are a gentleman and a scholar.
Linux has a very helpful command called "file". This command can tell you what a file contains based on the contents of the file. It can tell you that a file is a compressed file, tar file, mp3 file, python file, etc. etc. So if you do create a compressed tar file called "Bob", you can identify it years later!
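For instance (a throwaway tarball deliberately named with no extension at all):

```shell
# file(1) inspects a file's magic bytes, not its name.
tmp=$(mktemp -d) && cd "$tmp"
echo hello > data.txt
tar -czf Bob data.txt   # a gzip-compressed tarball called just "Bob"
file Bob                # reports gzip compressed data despite the name
```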
thanks.. Happy to say u are my Linux teacher ..
Thanks for All lectures
I've been enjoying these tutorials and find them very helpful. Three questions: 1. When extracting from a tarball, why must you tell tar the file is compressed? Can't it figure that out itself? 2. If I want to schedule a cron job to be performed every Tuesday and Thursday, would I enter "24" in the day-of-week location? 3. Suppose I want to schedule a backup once per day with cron, but I don't want the tarball created today overwritten by the tarball created tomorrow. I assume there's a way of avoiding that.
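For later readers, partial answers (worth checking against your own man pages): modern GNU tar auto-detects compression when extracting, so `-z` is often optional with `-x`; cron's day-of-week field takes a comma-separated list, so Tuesday and Thursday is `2,4` (a literal `24` is out of range); and a date in the filename keeps daily tarballs from overwriting each other. A sketch with placeholder paths:

```shell
# crontab entries (m h dom mon dow  command):
#   0 1 * * 2,4  tar -czf /backups/www.tar.gz /var/www                # 1:00 Tue & Thu
#   0 1 * * *    tar -czf /backups/www-$(date +\%F).tar.gz /var/www   # daily, date-stamped
# (inside a crontab, % must be escaped as \% -- an unescaped % starts stdin text)

# The date-stamp trick on its own:
echo "www-$(date +%F).tar.gz"
```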
Great series...Thank you Eli.
The way you presented the Linux videos was really fantastic and it was crystal clear. Can you share the link with me if you have the complete Linux course videos?
It's very easy to understand, thanks very much.
thank you very much. You are a really great trainer.
There is a lot more to tell about tar. The t option/switch/argument for me is the sugar when I use tar. With the "t" option/switch/argument you can look inside a tar archive :) Also it's possible to extract just one file after you have had a look with the "t" option. About sudo - I believe Eli has configured sudo to be usable without a password for a certain non-authoritative user, to operate the system as root. It's wrong - sudo loses the concept of why it was made :) Also using sudo in cron - it's wrong for most of us; in Eli's case, as I said, I believe he has made one non-authoritative user usable without a password. Also - sudo - it is not in all Linux flavours/distributions. Eli, I would enhance this video course. But!!! Even for me it was great to listen - you are a great orator. Good job.
Best video yet, you saved my brain thank you!
tq eli, u r making my life easy
Excellent tutorial series on Ubuntu Linux !
Many thanks buddy. You got yourself a subscriber
I had struggled the whole day and then found this video, thank u very much. Please make some videos on Apache and MySQL. thanks eli
Parameters can always be mixed; for example, ls doesn't care if you type "ls -RQ1" or "ls -1QR" :)
You forgot to mention that you have to start the cron daemon with /etc/init.d/cron start
I did not start the cron daemon with that and it works - Kubuntu 18.04
Excellent video again!!
Still such powerful videos
I like your channel so much. You are the best teacher I ever had!
You are an amazing guy. Too bad I can't join you in a real classroom with my laptop :)
Have a nice day. Greetings from Algeria
Thanks for the video.... I was struggling with this. This video helped me solve my problem.
Eli, I love your tutorials and how much you simplify them. Can you make a video on Red Hat or CentOS kickstart? I mean the distribution doesn't matter if it's a Linux kickstart.
Love your videos - educational while entertaining :)
Sir,can you please do a video on Shell scripting(Bash and Perl)?
Thank you for the detailed explanations of it all.
It depends on what you want to compress. Video/audio/image files aren't really cut out for compressing. Text files probably profit the most.
LOL, Dang it ELI!!!
I watched this video trying to remember if its:
....--exclude=/mnt.... or
....--exclude = /mnt... with spaces
Eli: "I'm not going to exclude anything, just to make life easier..."
LOL that was funny
This video helped me a lot
Thanks !!!
thanks a lot, your lessons are very informative and useful.
Great tut!
Thanks a lot, you can explain amazingly simple.
OMG, thank you. This is one of the most useful tuts
superb sir thanks for upload
just realized that your description has .taz.gz as the file extension. It won't affect the file at all, just something I noticed.
Solid overview Eli.
I miss this Eli
Great teacher
Thanks a lot Eli
thank u so much, ur amazing and awesome for everyone interested
You are awesome sir
I haven't researched these questions yet, so I'm sure that there's an answer out for this but it would have been nice if they had been covered in the video:
1) When running these scheduled cron jobs, is there a way for it to check to see if hardware is active and above a certain level first (say, CPU above 95% and the fans are going nuts), or if some specific program is already running, wait for that task to finish before proceeding with the backup? It sure would suck if you suddenly had cron trying to do a full / backup when you're running some extremely intensive program.
2) When a new cron job backup is run, will it overwrite the existing tarball in that directory, or does it automatically increment the tarball filenames in some fashion? I.e. backup.tar.gz1, backup.tar.gz2, etc. I'm assuming that it just overwrites the old file. Does anyone happen to know how you would get cron to avoid overwriting previously created backups?
3) This is really great tutorial for full backups, but how would you go about performing differential or incremental backups? My guess is that the answer is going to be "Just always do Full backups", but there have to be at least *some* cases out there where doing full backups, even with compression, simply wouldn't be acceptable (think 20TB backup when only a 1kb txt file has been modified).
1) yes you can. there are two ways of doing it: the first is checking whether the program is already running (e.g. ps aux | grep), and the other is checking the current load with top.
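Those checks can be wrapped into the cron job itself. A hypothetical guard script (the program name `some_heavy_job`, the load threshold, and the directories are all made-up examples; `--listed-incremental` is GNU tar's incremental-backup mechanism, answering question 3 as well):

```shell
#!/bin/sh
# Skip the backup if a given program is running or the machine is busy.
src=$(mktemp -d); dest=$(mktemp -d)   # stand-ins for the real source/target
echo hello > "$src/file.txt"

if pgrep -x some_heavy_job >/dev/null; then          # hypothetical program
    echo "some_heavy_job is running: skipping backup"
elif [ "$(awk '{print int($1)}' /proc/loadavg)" -ge 64 ]; then  # example threshold
    echo "load too high: skipping backup"
else
    # Date-stamped name, so yesterday's tarball survives; with GNU tar,
    # --listed-incremental makes every run after the first one incremental.
    tar -czf "$dest/backup-$(date +%F).tar.gz" \
        --listed-incremental="$dest/state.snar" -C "$src" .
fi
ls "$dest"
```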
I like your lessons - very clear and simple. I am learning a lot from your lessons.
I have one question. How can we backup to external hard drive from the command line. I am new to linux and I would appreciate if you teach us about this. Thanks.
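One common answer (the device name and mount point below are examples, not from the video - check `lsblk` for your own drive): mount the external disk, then point the tar destination at it.

```shell
# Sketch only -- adjust /dev/sdb1 and the mount point to your system:
# sudo mkdir -p /mnt/usbdrive
# sudo mount /dev/sdb1 /mnt/usbdrive              # attach the external drive
# sudo tar -czf /mnt/usbdrive/home-$(date +%F).tar.gz /home
# sudo umount /mnt/usbdrive                       # safe to unplug afterwards
```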
Sorry Eli, I know your video is good.. but in the video how did you get wwwbackup.tar.gz?
Did you just go through vi and make that empty file or what? Whenever I enter the same command, it says it's not a file or directory or permission denied..
Your video is so helpful and informative. Can you post another video on how to automatically back up all users' home directories using tar and shell scripting?
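The loop would look something like this (a sketch, not from the video: `homes` stands in for /home and `out` for the backup target, both throwaway directories here so the script is safe to run):

```shell
#!/bin/sh
# One dated tarball per home directory.
homes=$(mktemp -d); out=$(mktemp -d)
mkdir -p "$homes/alice" "$homes/bob"       # fake users for the demo
echo notes > "$homes/alice/notes.txt"

for dir in "$homes"/*/; do
    user=$(basename "$dir")
    tar -czf "$out/$user-$(date +%F).tar.gz" -C "$homes" "$user"
done
ls "$out"
```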
hi Eli,
I watched the video again and you explain a bit how to back up parts or the whole system and how to recover the tarball file, but I was thinking about a step-by-step backup of the whole system and especially a detailed guide on how to recover that system, so that it will not have to be re-installed, but just recovered from the tarball file. For me it looks kind of like rocket science, but you would probably say it was all explained already... Anyway, maybe I will find it somewhere on google. :)
Eli thanks for the informational video. Quick question: when you did crontab -e and it opened up crontab and had
the # m h dom mon dow cheat-sheet, had you already input that into the crontab file for the sake of the video, or did earlier versions of Linux distributions already have that? I am using CentOS 7 and it doesn't have that little cheat-sheet. I know that it's just a comment because of the # sign but I was just wondering. Thank you :)
Can you do one with Rsync? They way you convey everything is great!
Awesome !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
Eli thanks for all these videos.
Small problem on this one:
In the lesson „Linux Backup with TAR and Cron Jobs" something with the order of the virtual sessions is messed up. The terminal sessions start in the middle and later from the beginning.
Eli I have been a huge fan of yours since my first computer back in 2001 and now in 2015 i am in need of a new computer within 6 months or less. What OS based computer would you recommend for running ecommerce apps? Windows or MAC?
***** or LINUX?
Hi Eli. Thank you for really helpful course. Excellent presentation of the material.
hey eli ...gr8 job man.!!! please upload more videos on servers like ftp, samba, dns, nfs. thnx a lot again
But what about, say, February 28th or 29th if applicable for a scheduled cron job?
If I want to run a backup on the 30th of every month, would I be right in assuming that if the month does not have 30 days then it would revert to running the job on the closest day?
How about bi-weekly backups?
I assume that's also setting the day interval to 14, right?
Awesome tutorial man, I love your videos.
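For anyone landing here with the same questions (worth confirming against crontab(5)): cron does not fall back to the nearest day - a job on the 30th simply never fires in February - and `*/14` in the day-of-month field restarts every month (1st, 15th, 29th), so it is not a true 14-day interval. The `backup.sh` path below is a placeholder:

```shell
# crontab fields: m h dom mon dow  command
# 0 2 30 * *    /usr/local/bin/backup.sh   # skipped entirely in February
# 0 2 */14 * *  /usr/local/bin/backup.sh   # fires on the 1st, 15th, 29th each month
# 0 2 1,15 * *  /usr/local/bin/backup.sh   # common "twice a month" compromise
```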
Is there an advanced playlist? This one was perfect!
You were still doing backups with tar in 2011? Srsly?
Sheesh, I stopped using tar for backups around 2000. That's when I learned that rsync existed. To another disk on the same server there's no advantage with rsync (but no disadvantage either). Rsync over ssh to a disk on another server and you get what is effectively a full backup but with data transfer amounting to less than an incremental backup. Rsync is designed to work well even with high-latency connections, so with the reduced data transfer, daily full backups to remote sites are feasible. No faffing about backing stuff up onto removable media and remembering to take it offsite.
Plus you can have rsync automatically populate a directory with the files that changed or were deleted. So in one operation you can have a full backup and a decremental backup (like incremental but going back in time) with data transferred less than for an incremental backup.
Or if you want you can have it create the equivalent of Microsoft's shadow volumes giving what appear to be full daily backups but with actual storage only of the unique files (don't forget to allocate some extra inodes to that disk when you create the filesystem) and data transfer amounting to less than an incremental backup.
Tar was obsolete 16 years ago. You really ought to consider removing this video because it's giving people a *bad* solution to making backups. The only time tar is sensible is for archiving to tape, and archiving to tape isn't sensible when you can rsync to disk. So unless you're the NSA and keeping vast amounts of data forever, rsync is a far better solution.
Yes, much better than tar. I prefer Bacula.
Also why back up e.g. /usr? You need only: the package list, /etc, /var (and /srv, /root, /home, etc).
Nice troll. You prefer bacula. With four arcane components to configure, requiring a degree in rocket surgery to figure out. That goes horribly wrong if there is even minor clock skew. That can purge backups which later incremental backups depend on, rendering the incrementals useless and your recovery strategy fucked. That transfers an entire multi-gigabyte file if a single byte changes (tar has the same failing, rsync does not). Optimized for backup to tape, sucks big time for backup to disk. Plus many other lesser problems.
Nice front-end, shame it's a bitch to configure and performs crappily. Yep, perfectly sane alternative to rsync, or even command-line tar. Marginally preferable to using pencil and paper to write down the pattern of 1s and 0s of the files, then restoring using a hex editor to type them back in.
Oh, and the name means "penis bones" (something most mammals have, humans are one of the few exceptions).
But it looks pretty, and that's what counts, eh? Oh, and it's always handy when you want to troll a discussion about sane backup strategies.
Bacula isn't hard to use or understand. You can also use BackupPC or Amanda or rsync or even Back In Time. Nobody forces you.
> Yes, much better than tar.
I say it about rsync.
To clarify:
> Bacula isn't hard to use or understand.
I mean, if you're a sysadmin you may be forced to learn it. For a desktop user it's too much imho. Simpler options are Back In Time or dejadup, etc.
grt job eli..
thank u
i saw ur videos
they help me...
thank u
Great video...
hello computer guy, your videos are good and they help us to become IT professionals
useful... superb
Hi Eli I have a question for you, is there a way to backup an entire ubuntu system and restore it with this command or any other terminal like command? I am playing with a vm that is in a hypervisor which I do not have access to and I would like to have that vm backed up to a local machine (through ssh) and how will I go about restoring it?
To my understanding, since I do not have access to the hypervisor I cannot use Clonezilla or any other tool like that; I do not have access to the boot menu and such. So I am guessing (from what I have encountered) that it is impossible to back up the vm, and it is recommended to back up only certain files, for example just the www folder instead of the whole system.
Hi Eli! Great videos! Just for you to know, there is a problem (at least from my view) at 11:17. It does not show you typing the commands for the tar... it only shows when it's finishing! Best regards!
Adrian Sandol he does show the command afterwards.
Great video. Question. What if you wanted to transmit that backup off site to another Linux machine on a schedule? How would you do that?
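One common approach (the host, user, and paths below are placeholders; assumes SSH key authentication is already set up): push the tarball over SSH with rsync or scp from the same crontab that creates it.

```shell
# crontab entries (m h dom mon dow  command):
# 30 1 * * *  tar -czf /backups/www.tar.gz /var/www
# 45 1 * * *  rsync -az /backups/www.tar.gz user@backuphost:/srv/backups/
# or, with scp instead of rsync:
# 45 1 * * *  scp /backups/www.tar.gz user@backuphost:/srv/backups/
```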
*** 5 STAR *** for you as always ...