Thanks. I don't get why this stuff is not used by most people. I haven't done Windows primarily for almost two decades now, but it makes my head explode trying to understand how little of the core functionality is put to use by many Windows admins. RTFM seems to not happen. Your video is a good thing[tm] and can save millions for companies all over the world, if people just watch and apply.
Your videos are super helpful for the 70-740 exam.
I have been watching many of your videos. Thank you for taking the time to create such useful material.
Thank you, Tony! Much appreciated!
Hi Nick, I have watched almost every major topic video you created, to cross-check our own techniques and pick up things to keep. I found your videos very helpful: the detailed explanation of every topic helps those working in IT as admins get up to speed quickly. The information provided, your clear voice, and the steps relevant to each topic are all quite good. Keep providing more tips & tricks. Thanks
You are a great man, Nick. Just love it, you are awesome, dude. Keep it up.
Love to watch your videos. Easy to understand and very helpful for us.
Awesome video, great explanation and examples. Keep up the good work.
Amazing Feature! Congrats for your video. Thanks from Brazil.
This was a great video! I'm glad you took the time out to explain every detail.
Thank You - Excellent explanation and demonstration. Great Video!
Thank you very much for your excellent videos. Keep the good work up.
Very helpful video, and every detail is explained thoroughly.
Awesome video, very well explained. Bravo!!!
Of course, you are much appreciated for the excellent explanation. Keep it up please
I joined your informative channel tonight.
Thank you for your support, Sener!
Thank you for your excellent explanation. Just one request: when you use PowerShell commands, could you please include them in the description of your upcoming videos? Thanks once again.
I like your videos
Perfect job 👍🏻
Thank you
Great video. You are legit. Thanks for doing the research, explaining pros, cons, and caveats, and for answering the most important question: WHY? A lot of YouTube tutorial content is just guys clicking through things with techno/house music in the background, or regurgitating useless talking points that you already know if you're watching the tutorial.
Awesome video and useful feature. I'm gonna be your follower for sure.
Hi Vielside, thank you so much for the support!
Really good and informative videos from you
Thanks
Thank you, petjoh2.
Nice job, thanks for sharing!
Awesome work. It's really, really helpful.
It's very useful. Thanks, Nick, for creating such a wonderful video.
Thank you for the support!
Such an awesome video, thank you very much for this big help!!
Awesome tutorials man! Thanks.
Very useful for the 70-740 exam. A few more videos and you could make a business like CBT Nuggets, selling the videos as a tutorial for preparing for the MCSA exams.
Thank you Daniel for noticing this! I have a long road in front of me before achieving this. I want to be more experienced both in the technology itself and in teaching.
Love this!! Any diff with windows 2019? Thank you.
Great video!
Nice tutorial
Thank you so much
INDIA
Data Deduplication dramatically reduces data storage requirements. When identical files are found, a single copy is kept and the duplicates are replaced with pointers to it. Users can interact with the pointers as if they were the original file, and when a user modifies the file, a copy with their modifications is created for their use.
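The behavior described above can be sketched in PowerShell; this is a minimal example for enabling and checking the feature, assuming a Windows Server data volume D: (the drive letter is an assumption):

```powershell
# Install the Data Deduplication feature (Windows Server)
Install-WindowsFeature -Name FS-Data-Deduplication

# Enable deduplication on the D: volume for general-purpose file shares
Enable-DedupVolume -Volume "D:" -UsageType Default

# Kick off an optimization job and inspect the results
Start-DedupJob -Volume "D:" -Type Optimization
Get-DedupStatus -Volume "D:" | Format-List SavedSpace, OptimizedFilesCount
```

The `-UsageType` parameter also accepts `HyperV` and `Backup` for VDI and backup workloads, which tune how aggressively open and recently written files are optimized.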
Great video, thanks for sharing
Great content! I like your videos.
Thank you for the support!
You rock, I like your videos.
Excellent Video
Thank you Naresh!
When you were choosing the file system type for the volume, you said you can use either NTFS or ReFS. Deduplication does not work with ReFS, at least not yet. What I am wondering, though, is if you can get both the benefits of deduplication and ReFS by creating a virtual hard disk on an NTFS partition, then formatting the virtual hard disk inside the file server VM as ReFS. Would the host NTFS file system then be able to dedupe the data on the virtual ReFS disk?
Hi xravxx, thank you for mentioning that. You are right, Data Dedup only works on NTFS. About the scenario you mentioned: although it could work and you would get some benefits, if something goes wrong Microsoft could say it is not a supported configuration. I would be cautious about doing this in a large production environment.
I don't think it will work. The virtual disk is, say, a .vmdk file which is the size of the data in the virtual drive and is then mounted as a ReFS drive. Dedup (running only on NTFS) will see one huge file, so it can't optimize the data inside the virtual disk. Does someone have another point of view? Maybe I got something wrong here. :) (but I don't think so, hehe)
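One detail worth checking here: Windows Server dedup optimizes variable-size chunks (roughly 32–128 KB) within files rather than whole files, which is why later versions added a Hyper-V usage type aimed at large, open VHD/VHDX files. A hedged sketch for testing the idea on a lab volume (the E: drive letter is an assumption):

```powershell
# Dedup chunks within files, so even one large VHDX can be optimized.
# The HyperV usage type is tuned for open virtual disk files.
Enable-DedupVolume -Volume "E:" -UsageType HyperV

# Run an optimization job synchronously, then inspect per-volume savings
Start-DedupJob -Volume "E:" -Type Optimization -Wait
Get-DedupVolume -Volume "E:" | Format-List Volume, SavingsRate, SavedSpace
```

Whether the nested ReFS scenario is supported is a separate question; measuring `SavingsRate` on a test copy of the data would settle whether it is worth pursuing.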
Any hint about [-InputOutputThrottleLevel] and [-ThrottleLimit]? I need the dedup job to be more aggressive, as I have large volumes (~10 TB) and it takes forever. Should I set
-InputOutputThrottleLevel High -ThrottleLimit 80
or
-InputOutputThrottleLevel Low -ThrottleLimit 20
Which one will use more IOPS so the dedup can finish within 24 hours? The explanation in the official Microsoft docs is kind of confusing.
Hi anghel, you can check blogs.technet.microsoft.com/filecab/2014/12/04/sizing-volumes-for-data-deduplication-in-windows-server/
It is written for Windows Server 2012 R2 but can help. I/O throttling ensures that dedup jobs don't interfere with other I/O-intensive processes: High reduces the interference (the job throttles itself more), while Low increases it (the job runs more aggressively). Be sure to plan carefully, as it can impact performance.
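Putting that together, a sketch of an "aggressive" run for a large volume (the drive letter is assumed, and parameter availability varies by Windows Server version):

```powershell
# Low I/O throttling + high priority lets the job consume more IOPS.
# -Memory is the percentage of system memory the job may use.
Start-DedupJob -Volume "D:" -Type Optimization `
    -InputOutputThrottleLevel Low -Priority High -Memory 50

# Watch the job's progress
Get-DedupJob -Volume "D:"
```

For a ~10 TB volume, also note Microsoft's sizing guidance in the linked post: a single dedup job processes a volume with one thread on 2012 R2, so very large volumes may simply need to be split.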
Quick question: how would you automate the process? In the example it is launched manually. Would you use the scheduler for that?
Hi Eric, in general this should be automated; scheduled dedup jobs run at specific times.
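For reference, the built-in schedules can be listed and extended with the Dedup cmdlets; the schedule name and times below are purely illustrative:

```powershell
# List the default deduplication schedules created with the feature
Get-DedupSchedule

# A hypothetical custom schedule: run optimization nightly at 02:00
# on weekdays, with a 6-hour window
New-DedupSchedule -Name "NightlyOptimize" -Type Optimization `
    -Start 02:00 -DurationHours 6 `
    -Days Monday,Tuesday,Wednesday,Thursday,Friday
```

Besides `Optimization`, schedules can also be created for `GarbageCollection` and `Scrubbing` jobs, which reclaim unreferenced chunks and verify chunk-store integrity.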
How will that affect performance, given that we have a lot of traffic to the shared drive?
Thanks in advance
Hi Fenazz, Microsoft claims the hit on performance is minimal, but it depends on the actual load. Windows Server 2016 offers fine-tuning of Data Deduplication that can improve overall performance so it does not impact file operations.
thank you nick! :)
Thank you ZM C!
I think MS is able to save 77% because it's the same file (ISO). I think the performance will be poor, especially if we regularly open the data. But thanks for the video. The only use for this could be SSDs, but I use SSDs to have a faster system, and I'm not sure I'd trust MS to manage my data.
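The 77% figure is workload-specific; rather than assuming, you can estimate savings on existing data before enabling dedup and measure them afterwards (the share path below is an assumption):

```powershell
# Estimate potential savings on existing data before enabling the feature
# (DDPEval.exe ships with the Data Deduplication feature)
& "$env:SystemRoot\System32\ddpeval.exe" "E:\Shares"

# After dedup has run, check the actual savings per volume
Get-DedupVolume | Format-Table Volume, Capacity, SavedSpace, SavingsRate
```

Typical published savings range from roughly 30–50% for general file shares up to 80%+ for VHD libraries and ISO collections, which is consistent with the number quoted for the demo.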
Thank you boss
Good Job, ;)
Dude,youre a FUCKING LEGEND! thank you so much
Good video, but I think the idea of Data Deduplication is not really effective. HDs are very cheap; wasting our time going through the long MS steps is stupid. But as usual, MS always adds extra steps to everything that could be done quickly.
Reference pls :)
I knew he was Bulgarian :) Windows des... ten
are you drowning in your own saliva? Worst presentation of anything I have ever seen/heard