Just paused the video to comment.. Yes sap videos please.. No one out there explains azure as good as you.. Would be eager to learn sap on azure from this channel. Thanks
Thanks for the feedback and suggestion Rohan, I will see what I can do!
Great video, Azure NetApp files.
Thanks!
Excellent work Dean.
Thanks Frank!
This video is great, it helped me a lot with preparing for AZ-140. I'm going to take the exam next week :)
Awesome! Please let me know when you pass!
@@AzureAcademy I passed the exam, it was awesome!!! I couldn't do it without all the resources available in your channel. I'm absolutely grateful :)
Awesome, Congratulations Christian !!!
👍🎉👏🏆🙌👍
Thank you, Yes it would be great if you could start a SAP series !!
My very first video was showing a SAP lab build through PowerShell in the Azure classic days
I also have ARM Templates that do that build today on my GitHub
What would you want to see in a SAP video series?
Dean, I like your videos and I shared them with my colleagues. I vote for a session on SAP on Azure, especially for those of us who don't know the kinds of SAP workloads. In my work, we need to provide sizing or design recommendations for SAP, but I am confused by all the SAP terminology, like NW, HANA, Business Suite ....
Thanks for the feedback...Stay tuned!
Great demo again. Can you do a demo on analyzing storage performance, both from the OS level and from the Azure storage side?
I will look into that...thanks for the recommendation!
Splendid!
Thanks very much!
Thanks for sharing 🙏💕😌🙏
happy to help!
Hi Dean, do you know how many of the NetApp features the Azure resource is using? I mean, does it work with data deduplication and compression? Also, snapshotting seems to be a manual process, which I reckon can be automated with a script. Where is that snapshot stored and what costs does it involve? I couldn't find any info after a quick browse through the docs. Using your example, if you have a volume of 100 GB and create a snapshot, will pool usage be 200 GB out of the 4 TB?
Snapshot Capacity Consumption doc --
docs.microsoft.com/en-us/azure/azure-netapp-files/azure-netapp-files-cost-model#capacity-consumption-of-snapshots
I will get back to you on the other questions
Here is some info from Kirk Ryan on your question:
I wrote the scheduler in use at many customers today (github.com/kirkryan/anfScheduler/tree/master); it's an Azure Logic App. Under NDA we can share that this scheduling will be added to the blade in the future.
Snapshots are stored within the volume itself, and the only overhead is the block-level change delta between snapshots. So, for example:
200GB volume with 100GB database
First snapshot = no space overhead
Second snapshot = the rate of change between the first and second snapshot at the block level.
So if we say that 1GB of data changed per day and we took one daily snapshot, then the snapshot overhead would be 1GB. This overhead is released when you no longer retain the snapshot, and the space is returned to the parent volume. The primary benefit of this is instant restore of data.
Also (under NDA), ANF is gaining cross-region replication (currently in private preview), which means the whole volume, including all your snapshots, is replicated to a second region asynchronously for you (minimum every 10 minutes).
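The overhead arithmetic Kirk describes can be sketched as a quick back-of-the-envelope calculation (a hypothetical illustration of the cost model, not an ANF API):

```python
def snapshot_overhead_gb(daily_change_gb, retained_snapshots):
    """Estimate ANF snapshot overhead: each retained snapshot only
    stores the block-level delta since the previous snapshot, and
    the first snapshot costs no extra space."""
    # With n retained daily snapshots, roughly (n - 1) intervals'
    # worth of changed blocks are pinned in the pool.
    return daily_change_gb * max(retained_snapshots - 1, 0)

# Example from the thread: 1 GB/day of change, one daily snapshot.
# Retaining 7 days of snapshots pins about 6 GB of extra pool space.
print(snapshot_overhead_gb(1, 7))  # → 6
```

Once a snapshot falls out of retention, its delta is released back into the parent volume, so the overhead stays proportional to change rate times retention, not to volume size.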
@@AzureAcademy cheers!
😊
Thank you.
Anytime
@@AzureAcademy You're welcome. By the way, I started my Azure journey with you about 3 years ago; now I am a few weeks from becoming a solution architect; you must have noticed me already. My honest impression: for putting so much effort into posting so much quality content, I think you should consider going slower. It's not like you're in a closed-in environment where you're trying to impress a new starter at work; you're actually on a world platform. Anybody can tune in to your tutorial, you see! Please browse the competition, and you'll see what I mean. There's so much content out there on these subjects. I could name the top guys. I know you wish to grow, so maybe this will help in future?
Take care
AWESOME... Congratulations @Bijou, that is a great journey! Please contact me offline and let me know which other creators you think I should take my pace from... I can always improve 😁 Thanks for the feedback, and good luck to you in the new role!
@@AzureAcademy Yes of course! I just tried, but it appears I can't find a way to send messages on YouTube. I've not used it for anything like that. Although, I believe if you send me a message I should see a notification. I will reply then. Thanks
my email is listed on the about page on my channel
Please give us use case for SAP solution
SAP HANA is an in-memory database... but you need large amounts of very fast storage. On top of that, Azure NetApp Files supports CIFS (SMB) and NFS from the same service. The benefit is that you can use one service for all your file needs in the cloud.
You are the best big thank you
You're welcome! Please share The Azure Academy with others so they can learn too.
+Mohammed Khalid Saleh That is awesome to hear, thanks!
This is a great tutorial again.
So, will we be charged for 4 TB if we provision a 4 TB capacity pool, or only for the actual data residing in the share?
Here is the cost model doc for ANF --
docs.microsoft.com/en-us/azure/azure-netapp-files/azure-netapp-files-cost-model
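Short version of what that doc describes: ANF bills on the provisioned capacity pool size, not on the data actually stored in the volumes. A rough sketch (the per-GiB rate below is a made-up placeholder; check the pricing page for your region and service level):

```python
def monthly_pool_cost(provisioned_tib, price_per_gib_month):
    """ANF bills on the *provisioned* capacity pool size, regardless
    of how much data actually resides in the volumes.
    price_per_gib_month is a placeholder rate, not a real price."""
    return provisioned_tib * 1024 * price_per_gib_month

# Hypothetical rate of $0.20 per GiB/month for a 4 TiB pool:
print(round(monthly_pool_cost(4, 0.20), 2))  # → 819.2
```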
@@AzureAcademy ok got it thanks
anytime!
Could we use Azure NetApp Files with storage tiering?
If you are asking whether you can set up Standard, Premium, or Ultra capacity pools and use them for different workloads... YES.
Or are you talking about some other kind of storage tiering?
Please create videos for SAP on Azure
anything specific on SAP you want us to cover?
@@AzureAcademy Please cover SAP and HANA implementation on Azure certified VMs, and use cases for storage like Azure Disks and NetApp
Thanks - Adeel
I'll build it following the certified SAP HANA requirements
Hello Dean, what are the bare minimum permissions users need for ANF to be used with FSLogix?
FSLogix permissions can be found here
docs.microsoft.com/en-us/fslogix/fslogix-storage-config-ht
Hello Dean. Can you let me know if it is possible to configure an auto-scaling solution for the ANF volume through Logic Apps and an Automation account to meet the demand during the daily logon storm? Something where we can expand the Premium pool from 1 TiB to 4 TiB to go from a throughput of 64 MiB/s to 256 MiB/s, and then change it back to 1 TiB after a 1-hour window.
I believe you can do this... it would probably need something like Azure DevOps or an Azure Automation account to execute the scripts, plus something like the scaling tool to monitor what is going on and decide whether to scale up or down.
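The scheduling logic itself is simple. Premium pools deliver throughput proportional to provisioned size (64 MiB/s per TiB, which is where the 1 TiB → 64 MiB/s and 4 TiB → 256 MiB/s numbers in the question come from), so a runbook or Logic App only needs to decide the target size for the current time and issue a pool resize. A sketch of the decision logic — the window times and sizes are illustrative, and the actual resize call to the ANF capacity pool is left out:

```python
from datetime import time

PREMIUM_MIBPS_PER_TIB = 64  # Premium tier: 64 MiB/s per provisioned TiB

def pool_throughput_mibps(size_tib, mibps_per_tib=PREMIUM_MIBPS_PER_TIB):
    """Throughput an ANF pool delivers at a given provisioned size."""
    return size_tib * mibps_per_tib

def target_pool_size_tib(now, storm_start=time(8, 0), storm_end=time(9, 0)):
    """Scale to 4 TiB during the logon-storm window, 1 TiB otherwise.
    A scheduled Automation runbook or Logic App would call this and
    then resize the capacity pool to the returned size."""
    return 4 if storm_start <= now < storm_end else 1

print(pool_throughput_mibps(target_pool_size_tib(time(8, 30))))  # → 256
print(pool_throughput_mibps(target_pool_size_tib(time(12, 0))))  # → 64
```

One caveat worth testing before relying on this: make sure the scale-down back to 1 TiB succeeds while users are connected, since a pool can't shrink below its allocated volume quotas.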
Fantastic video walk-through. I do have an issue/question though. I'm trying to create a volume but receive an error because ANF only seems to use unsigned LDAP (389) queries to create the volume (virtual computer object) in AD. Our AD requires LDAP signing (per MS security recommendations). Is there any workaround for this? (The best I could think of would be temporarily allowing unsigned LDAP to get the computer object(s) created, then switching the policy back.)
I see there is a feedback request created here: feedback.azure.com/forums/926410-azure-netapp-files-anf/suggestions/40090933-enable-ldaps-support
There isn't a way to do this as far as I know... but thanks for mentioning it and putting in the feedback. I will also pass that one along to the product group.
I'd prefer to see SAP videos
Great… Is there any particular area of SAP you’re interested in?
@@AzureAcademy Thanks for the reply.
A general perspective will help your subscribers understand the concepts better!!
Sounds good
SAP
OK...will start working on it.
SAP
Anything specific on SAP?