I spent half a day, then I found this and figure it out. Thank you so much.
Really, the way you explain things is great... thankful for your updates on Azure.
Always happy to be helpful :)
✨ *Question of the day* : Do you use Environments in the new YAML Pipelines experience?
🆘 NEED HELP? 🆘
Book a 1:1 Consultation with CoderDave: geni.us/cdconsult
We can talk about GitHub, Azure DevOps, or any other DevOps tool or project you need help with!
🙏🏻SUPPORT THE CHANNEL🙏🏻
Buy me a coffee: www.buymeacoffee.com/CoderDave
PayPal me donation: paypal.me/dabenveg
Good Explanation - All the best brother.
Thank you so much 🙂
Is a VM or Kubernetes resource required for the YAML release pipeline?
No, it is not. You can deploy to anything, and it runs directly in Azure DevOps.
can you please explain about task group and variable groups?
I'll try my best to make a video about it. Stay tuned ;)
Thanks Dave, great video.
Glad you enjoyed it
Hello, I added the environment in the YAML file as you described, but I get the error "Unexpected property environment". What is missing?
Good question. No, it does not. If the branch is not updated, then the auto-merge won't happen until you update it.
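For anyone hitting the "Unexpected property environment" error: the `environment` keyword is only valid inside a deployment job, not a regular job. A minimal sketch (the environment name and steps are illustrative):

```yaml
jobs:
# 'environment' belongs to a deployment job ('- deployment:', not '- job:');
# placing it under a regular job causes "Unexpected property environment".
- deployment: DeployWeb
  displayName: Deploy to dev
  pool:
    vmImage: ubuntu-latest
  environment: dev            # illustrative Environment name
  strategy:
    runOnce:
      deploy:
        steps:
        - script: echo Deploying...
```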
Hi.. I am looking forward to have something on.. how we can migrate azure devops pipelines into Jenkins. Please guide
Hi. I'm sorry but in many years of being into DevOps I've never had to do a migration from Azure DevOps Pipelines to Jenkins. :) it is usually the opposite (from Jenkins to AzDO Pipelines). I did a quick search and I haven't found anything, so I guess the only way is manually re-writing everything in Jenkins
Ok, but how do I configure Environments with classic pipelines (GUI)? I have added the Deployment Agent but it does not have an element to specify the Environment.... How can I do it?
Hey there, as I mentioned in the video, Environments are available only in YAML Pipelines, not in Classic
Hi,
Can you show me how to create an environment for a Classic (GUI) pipeline?
You did it in YAML; please share a link that shows how to do it for Classic.
Unfortunately I cannot. Environments are a feature only of YAML Pipelines, Classic Pipelines cannot use Environments
@@CoderDave thank you for the confirmation.
Excellent presentation. Clear explanation.
Thanks!
Nice explanation. I do not get why you can define pool though like in docs examples
Thanks :) I'm not sure what you mean, can you clarify your question?
Would this be useful while creating classic pipelines for Database?
While this would be certainly useful for Database deployments, unfortunately environments are not available in Classic Pipelines but only in YAML Pipelines
Hi Dave, I need some help. My pipeline is failing with a timeout because the process takes more than 1 hour to complete. It is a heavy solution that gets npm packages from the internet as well as from Artifacts. After that it needs to build 35 modules in order to produce the binaries. Can you please help with what can be done here? Can some default functionality be used? Thank you in advance for the help.
One thing I can think of is using a cache to retrieve your npm packages, instead of going to retrieve them from internet every time (see this ruclips.net/video/HfJcN9gWleM/видео.html)
Same approach can probably be used for modules. Also, I wonder if there is a way you can build those things in parallel using multiple jobs
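As a sketch of the caching idea, assuming npm and a package-lock.json at the repo root, the Cache task can restore the npm cache between runs:

```yaml
variables:
  npm_config_cache: $(Pipeline.Workspace)/.npm   # npm honors this variable

steps:
- task: Cache@2
  displayName: Cache npm packages
  inputs:
    key: 'npm | "$(Agent.OS)" | package-lock.json'
    restoreKeys: |
      npm | "$(Agent.OS)"
    path: $(npm_config_cache)
- script: npm ci
  displayName: Install dependencies
```

For the 35 modules, splitting the build across multiple jobs lets them run in parallel on separate agents, at the cost of publishing and downloading intermediate artifacts between jobs.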
What tool are you using to draw on screen while recording?
Hey. I'm using ZoomIt: docs.microsoft.com/en-us/sysinternals/downloads/zoomit
@CoderDave: Hello Sir,
I have a Laravel (PHP) project installed on a VM. How can I set up CI/CD for it? Please help.
Hey, I would start by looking at the Environments for VMs in Azure Pipelines. I have an entire video about that 😉 here it is: ruclips.net/video/zBr7cl6ASMQ/видео.html
Thank you so much for the video
Thanks for the support ☺️
Is there a way to link an environment to a release pipeline deployment?
I’m afraid not. Environments are usable only with YAML Pipelines
@@CoderDave oh :( So for deployments we need to decide whether to use YAML environments or the release pipeline, right? We can't mix both.
Well, you could mix (there are ways to kick off a YAML Pipeline from a Release Pipeline and vice versa) but I would recommend sticking to a single approach. Both have pros and cons 😅
can you please refer the git repo link which you have used as code
sorry, I don't have that repo anymore
Thanks for this presentation. I have had trouble managing my build agent environment (SDK). To deploy it, I used a stage in my build pipeline that executes on a condition. It seems that even though my build agents are physical machines, I can use this to manage the pool deployment with the advantage of traceability. Do we manage those deployments the same way as the others (I mean in a git repo)?
Great work Dave as usual. So technically speaking "Release Pipelines" are obsolete?
Thanks! I wouldn't say they are obsolete, but they "serve a different purpose". In the release pipelines you still get some things you do not have in YAML, like post-deployment approvals and gates (even though the latter are almost matched in YAML already). And some of the triggers are different as well; for example, in Release Pipelines you can have artifacts coming from GitHub Releases... But yes, YAML Pipelines get Environments... ;)
Looking forward to the next video in the series
Thanks. It will come soon, and it will be about using Environments with AKS and Kubernetes
I am using a VMSS agent (Windows) in my Azure DevOps pipeline, and it runs a PowerShell script, but I am getting an "Az.Accounts module not found" error. How can I fix it? Your assistance is much appreciated.
You can check if that module is installed with
Get-InstalledModule Az.Accounts -AllVersions
If it is, try to uninstall it and reinstall it with
Uninstall-Module -Name Az.Accounts
Install-Module -Name Az.Accounts
How do you create an Environment when the environment is an App Service?
There is no specific Environment resource type for App Service. Just create a generic one.
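A hedged sketch of that pattern: a deployment job targeting a generic Environment, deploying with the Azure Web App task. The environment, service connection, and app names are placeholders:

```yaml
jobs:
- deployment: DeployWebApp
  pool:
    vmImage: ubuntu-latest
  environment: production          # generic Environment, no resource attached
  strategy:
    runOnce:
      deploy:
        steps:
        - task: AzureWebApp@1
          inputs:
            azureSubscription: my-service-connection   # placeholder connection
            appName: my-app-service                    # placeholder app name
            package: $(Pipeline.Workspace)/drop/app.zip
```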
Do you have a demo for associating Azure DevOps releases, release pipelines, etc., with Work Items / Boards? How do you utilize that setting "report results to work item or to boards"? Can anyone else help?
There are 2 links sections on Work Item form - Deployment and Development. It is more or less clear how data appear under "Development". I cannot reproduce the "Deployment" data.
I was looking to make that video already, because those settings can be confusing. Stay tuned because that will happen soon 😉 if I can, already next week!
The video on this is coming out next Tuesday :) Don't miss it!
Hey @gregdvorkin, here you have the video explaining all those settings: ruclips.net/video/EHbPpQNoBvI/видео.html
Let me know what you think
great explanation! thanks!
Glad it was helpful!
Thanks for this video, made a lot of sense, just one question though, can you make a video on environment where the deployment target is a private AKS cluster? Tried connecting now got some errors.
I will, thanks for the suggestion. Stay tuned
Thank you for the DevOps series. Can you cover the trigger portion with YAML as well? Something like: when an image is pushed to Azure Container Registry, a trigger in the YAML should fire, then the stage should run and finally deploy to K8s automatically. I know how to do that with the classic UI, but I'm looking to do the same with YAML.
Thanks for the suggestion, I will definitely plan for it. Stay tuned ;)
Hi, I know it's been a while but I finally have that video out: ruclips.net/video/FHnPJ8FBjLM/видео.html Let me know what you think!
Did you find a way to create these environments by using CLI or REST API ?
Unfortunately, both the REST APIs and the CLI don't currently support managing environments.
Do you use one environment for multiple apps or one environment per app? For example: dev-app1, dev-app2 etc. or just one "dev" environment for all?
I would normally use one environment per app, unless multiple apps use the same resources. Let me explain... unlike what we used to do on-prem, in the cloud with PaaS (where I normally deploy) resources are usually tied to a single application. So an environment for me would be, for example, a pool of App Services or a K8s namespace... which I normally use for a single app.
@@CoderDave Thanks for the answer and video, very helpful. I use app services in this case. 😁
More than welcome ☺️
Having a hard time picturing how this is supposed to scale. I manage 62 services each deploying to 16 different regions each with the ability to deploy independently. That would mean just shy of 1000 rows in that environment table to manually set up and maintain permissions for.
And that’s not counting anything before production deployments, I’ve got 40 devs each with their own ability to deploy their 62 services. Not to mention I have to share my ADO Project with over 150 other teams who are in 50+ regions with their own set of microservices. Finding my 1000 environments would be darn near impossible.
Classic isn't great at my scale when it comes to releases, but pipeline deployments seem like a nonstarter. Is there something I'm missing?
SUBSCRIBED! You've got some great vids, thanks so much for uploading all the tech info, like candy to a kid in a tech store. : - )
Thank you. I try my best to help... happy to know my videos are helpful ☺️
Best explanation!
Thank you. The second video of this series, about Azure DevOps Environments for Kubernetes, is coming next Tuesday😉
Hello Dave, I am a developer on a team of 5. Each of us has a branch to work on. I need to deploy to certain VMs and test my code before raising a Pull Request to the main branch. Should each member create a personal pipeline, ending up with multiple repeated pipelines, or can we set up a common pipeline that just targets the right environment?
What I would do is create 5 Environments (one for each team member) in Azure DevOps, and then create only 1 Pipeline. I would then have a conditional in the Pipeline that, depending on the branch it is doing CI from (one of the 5 you mentioned), deploys to the correct environment.
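One possible sketch of that conditional, with two of the five members shown and all branch and environment names illustrative: each deployment job is gated on the triggering branch and targets that member's Environment.

```yaml
trigger:
  branches:
    include:
    - alice/*        # illustrative per-member branch prefixes
    - bob/*

jobs:
- deployment: DeployAlice
  condition: startsWith(variables['Build.SourceBranch'], 'refs/heads/alice/')
  pool:
    vmImage: ubuntu-latest
  environment: dev-alice          # one Environment per team member
  strategy:
    runOnce:
      deploy:
        steps:
        - script: echo Deploying alice's branch

- deployment: DeployBob
  condition: startsWith(variables['Build.SourceBranch'], 'refs/heads/bob/')
  pool:
    vmImage: ubuntu-latest
  environment: dev-bob
  strategy:
    runOnce:
      deploy:
        steps:
        - script: echo Deploying bob's branch
```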
Can anyone share how to add .env and .env.production to your dev and production pipelines?
Hi. What is your use case?
@@CoderDave I have 2 pipelines, one for development and another for production. How can I attach the .env and .env.production files to the development and production environments, so that when I release, the particular environment gets deployed to App Center for testers to test?
Thanks for the details. There’s a lot to discuss about this, won’t fit in YT comments 😄 why don’t you join my Discord server (link in the video description) and ask the question again there in the AzDO questions section so we can discuss about it 😉
Super !!
Thank you! Cheers!
What has you deployed to any of the target machines?
Sorry, not sure I understand the question 😬
@@CoderDave sorry for the unclear question. I don't see you deploy anything to any target resources in this demo? Can you copy files to target machines or run command on target machines? For example, change a configuration or create/delete folder etc...
Yes yes, in the demo I haven't actually deployed anything, to keep it simple and less time-consuming, but you can of course deploy in the normal way as you usually would, and you get all the cool stuff from the environments.
I made another video using environments specifically for virtual machines, check it out if you want: ruclips.net/video/zBr7cl6ASMQ/видео.html
@@CoderDave I still don't see in this new video where you actually push software code or run any code on the target machines to do anything you could call a deployment. What you are actually doing is just executing empty tasks.
You just replace the "empty tasks" with whatever you use for deployment. It could be a "Copy Files" task, an SSH or PowerShell task, etc.… that depends on what you actually have to achieve 😀
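As a sketch of what those replacement steps might look like when targeting a VM over SSH (the service connection name, paths, and service name are placeholders):

```yaml
strategy:
  runOnce:
    deploy:
      steps:
      - task: CopyFilesOverSSH@0         # copy the build output to the VM
        inputs:
          sshEndpoint: my-vm-connection  # placeholder SSH service connection
          sourceFolder: $(Pipeline.Workspace)/drop
          targetFolder: /var/www/app
      - task: SSH@0                      # then run a command on the VM
        inputs:
          sshEndpoint: my-vm-connection
          runOptions: inline
          inline: sudo systemctl restart myapp   # placeholder service name
```

Note that with a VM Environment resource, the deploy steps already run on the target machine, so plain copy and script steps work there too.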
Hello Dave! Many thanks for your content. I've been researching, and by far you are the one that best explains it (around Azure Pipelines). I have one question: I developed some Python code to run on a daily basis (it makes API calls and moves data). I have it configured on a Linux machine (using cron jobs). I would like to move it to Azure DevOps: I want to be able to change the code in Repos and have DevOps update it on the Virtual Machine and keep it running. I am a little bit lost; I don't even know if Pipelines is meant for this or for other stuff like apps.
Hi Pablo. First of all, thanks for your compliments. I try my best to produce high quality content.
About your question, what you mentioned can absolutely be done with Pipelines. To respond automatically to a push to your repo you can use the normal CI trigger in Pipelines (from master or from another branch if you want). Then, to deploy the code to a VM you can use the deployment step (it can be done via SSH, File Copy, or other methods... those depend on how you want to deploy). I'd recommend using the YAML pipelines for this, so you can take advantage of the "Environments for Virtual Machines". I have some examples in this video here: ruclips.net/video/zBr7cl6ASMQ/видео.html
@CoderDave Many thanks Dave!!
You’re more than welcome.
@@CoderDave Thank you Dave! I finally have my code linked to Azure DevOps, and now every time I update my code, it is updated on my Virtual Machine. I have one last doubt on which I would like your opinion. I am working mainly with data, not apps; my scripts mostly request data from multiple sources, wrap it, clean it, and upload it somewhere. On the Virtual Machine I am doing that with a cron job scheduled to run every day or every hour. I have the feeling that Pipelines are more for releasing code; am I wrong? What is your opinion on that? I could create a cron job on the virtual machine pointing to the release, but I would like to keep everything in the same place, because that would be more workable. I think this is a question for people who work with data and would like to use DevOps for that kind of script. Thanks in advance for your help and work, and sorry for so many questions...
Yes, Pipelines is more of a CI/CD service; however, it can be used as a sort-of automation engine as well. If you want to have everything managed by Pipelines (so you don't have to manually create and maintain the cron job, for example), I would have a pipeline with a schedule trigger (which, by the way, accepts the same cron syntax) and have it execute the data loading/massaging script on the host via SSH or anything like it.
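A sketch of that schedule trigger, using the familiar cron syntax (the branch name and time are illustrative):

```yaml
schedules:
- cron: "0 3 * * *"          # same syntax as crontab: every day at 03:00 UTC
  displayName: Nightly data load
  branches:
    include:
    - main
  always: true               # run even when there are no new commits
```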
Hi, very well explained. I have one question, Can we have 2 different environments (eg Dev/Test) share the same resource? If yes, can we have different approvals/checks for each?
Yes, absolutely. Remember that the "Environment" in AzDO is just a logical representation of your resources
Good video. One observation: I did not have any pipeline defined nor knowledge of pipelines yet and it threw me off when you cut and pasted an existing pipeline into the environment... I had to stop the video and jump into another video where I could learn about creating a pipeline :(
Hey, sorry to hear that. Yes, that video was targeting people who already have knowledge of Pipelines, because the concept of Environments builds on top of Pipelines.
Typical DevOps tuition. 1. Shows us how to create environment - great. 2. Here is a pipeline I already created? How do we follow along - we cannot.
While it is true that I've used an already created pipeline (the video is about environments, not pipelines per se 😀), I showed the relevant parts of the pipeline needed to work with the environment.
Environments are an "advanced" feature of Pipelines, so working with them requires some degree of experience with Pipelines. If you are new to Pipelines, I'd advise you to check some other videos before this one 😉