Azure DevOps Pipeline and Image Builder
- Published: Nov 28, 2024
- In this video, we go over using an Azure DevOps pipeline to automate the image build process with Azure Image Builder. This Azure DevOps tutorial goes over using Azure DevOps with VS Code to manage files. We then build a YAML pipeline with Azure CLI, ARM template deployments and PowerShell to build an image.
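The flow described above (Azure CLI to copy files, an ARM template deployment, then kicking off the image build) can be sketched as a YAML pipeline. This is a minimal sketch, not the repo's actual pipeline: the service connection, storage account, resource group, and template names are all assumptions.

```yaml
# azure-pipelines.yml -- hypothetical sketch of the build flow described above
trigger:
  branches:
    include:
      - main

pool:
  vmImage: 'ubuntu-latest'

steps:
  # 1. Copy the customization script to blob storage with the Azure CLI
  - task: AzureCLI@2
    inputs:
      azureSubscription: 'my-service-connection'   # assumed service connection name
      scriptType: 'bash'
      scriptLocation: 'inlineScript'
      inlineScript: |
        az storage blob upload \
          --account-name mystorageacct \
          --container-name scripts \
          --name customize.ps1 \
          --file ./scripts/customize.ps1 \
          --overwrite

  # 2. Deploy the Azure Image Builder image template with an ARM template
  - task: AzureResourceManagerTemplateDeployment@3
    inputs:
      deploymentScope: 'Resource Group'
      azureResourceManagerConnection: 'my-service-connection'
      resourceGroupName: 'rg-imagebuilder'
      location: 'eastus'
      csmFile: 'imageTemplate.json'

  # 3. Start the image build (this runs in the background in Azure)
  - task: AzureCLI@2
    inputs:
      azureSubscription: 'my-service-connection'
      scriptType: 'bash'
      scriptLocation: 'inlineScript'
      inlineScript: |
        az image builder run \
          --resource-group rg-imagebuilder \
          --name myImageTemplate
```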
Links
Udemy:
www.udemy.com/...
GitHub Repo:
github.com/tsr...
Image Builder Playlist:
• Azure Image Builder
DevOps Playlist:
• Azure DevOps
This is the most underrated channel on YouTube. A few months back I started my job as a Data Engineer, and your videos have helped me in every way possible. Thank you, @Travis Roberts
Thanks for sharing, I appreciate it!
Travis, I just want to say that I find your content and presentations spot on. I cannot fault this for someone who needs that 'visual' comprehension!
Travis, this is a fantastic tutorial! Thank you so much.
Thank you Travis, another great demo! So much easier than dealing with Packer, specifically when building Windows and troubleshooting WinRM communications :) Could you demo how you would build an OS customization pipeline, say, to add an existing VM to a session host pool (updating the registry key with the host pool key)? Very interested to see your approach on that.
Thanks! You may have picked up on what I'm building up to :)
Great work Travis! A quality tutorial, and there are only a few on DevOps on YouTube.
A good suggestion would be a tutorial on PowerShell scripts in tasks to save files in blob storage
When I looked at the options available for moving the file to blob storage, the Azure CLI turned up as the easiest one. If using Windows, AzCopy with PowerShell could be a good option.
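For reference, a sketch of both options mentioned; the account, container, and file names are placeholders:

```shell
# Azure CLI: upload a file to blob storage
az storage blob upload \
  --account-name mystorageacct \
  --container-name scripts \
  --name software.zip \
  --file ./software.zip \
  --overwrite

# Windows alternative: AzCopy run from PowerShell with a SAS URL
# azcopy copy ".\software.zip" "https://mystorageacct.blob.core.windows.net/scripts/software.zip?<SAS>"
```

Both commands need an authenticated session (`az login`, or a valid SAS token for AzCopy), so they are shown here as command sketches rather than a runnable script.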
Great tutorial!
What about using the DevOps task for Azure Image Builder in the pipeline? Did you try that out as well?
If so, which way would you recommend?
Image template building takes a long time in the background. If I wanted to deploy a VM using that built image, how would I do that in a DevOps pipeline?
Use a template that references the Azure Compute Gallery image definition. The template deployment is used in the pipeline.
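As a sketch, the VM deployment template's storage profile would point at the gallery image definition by resource ID; the subscription, resource group, gallery, and definition names below are placeholders:

```json
{
  "storageProfile": {
    "imageReference": {
      "id": "/subscriptions/<sub-id>/resourceGroups/rg-images/providers/Microsoft.Compute/galleries/myGallery/images/myImageDef"
    }
  }
}
```

Referencing the image definition ID (without a `/versions/...` segment) deploys from the latest image version, which fits a pipeline that keeps publishing new versions.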
Another great video, Travis.
I have a question...
Is it possible to use DevOps, pipelines, and the Image Builder to apply Windows patches to a pre-existing Server 2019 image?
The scenario is that our existing server image does not get patched regularly... I need to deploy a VM, patch it, run Sysprep, and then capture the newly patched server as an image and add it as the next version number in Azure Compute Gallery.
So I'm wondering if this is something possible in DevOps and pipelines?
Do you have any guides for something like this?
Cheers
@Ciraltos great video. I have a question regarding step 1, the AzureCLI copy-script-to-blob-storage task. Can the blob storage account be from another subscription? I mean a different value than the azureSubscription input.
I published a similar video yesterday where I used a SAS URL for the script and software archive file. Using the SAS URL, it shouldn't matter what subscription the file is in.
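A sketch of generating that SAS URL with the Azure CLI; the account, container, blob name, and expiry are placeholders:

```shell
# Generate a read-only SAS token for the script blob, valid until the given expiry
sas=$(az storage blob generate-sas \
  --account-name mystorageacct \
  --container-name scripts \
  --name customize.ps1 \
  --permissions r \
  --expiry 2025-01-01T00:00:00Z \
  --https-only \
  --output tsv)

# The full SAS URL is just an authenticated HTTPS link, so it works
# regardless of which subscription runs the image build
echo "https://mystorageacct.blob.core.windows.net/scripts/customize.ps1?$sas"
```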
Hello Travis,
Thank you so much for these. This playlist is the sole reason I now have some level of proficiency with Azure Image builder. Quick question though, say I have a PowerShell script for the customization to be run on the image. I have some variables in this script and I would like to pass parameters to them during the build process. How can I achieve this?
Who writes those ".ps1" scripts? I know cloud engineers or cloud ops can write the .json file, or even a DevOps engineer, but what about the installation script files?
Thanks for this great tutorial!! Is it possible to customize an Ubuntu image?
How else can I trigger the pipeline if the archive is changed?
Thanks for this video Travis - really sums it up nicely and great idea to use SAS to access storage account. I'm wanting to follow this approach but cannot get it to work. I know the SAS is valid, but the image build fails with: Not authorized to access the resource. Any ideas? It feels like the SAS is not usable by the image builder, or is not being passed over correctly. Thanks
Try connecting to the storage account with Azure Storage Explorer and the SAS key. That will verify whether there is a problem with the SAS key. Also, check for networking, NSGs, or Private Endpoints that may limit connectivity.
Thanks,
Travis
@Ciraltos Thanks for getting back to me Travis. Looks like the issue was actually in DevOps and the way it was handling the variable string (I am generating the SAS on the fly and passing it as a parameter to the ARM template). I've switched to string replacement in the ARM template rather than concatenation and parameters, and all seems to be OK now.
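One way to do the string replacement described here is a pipeline step that substitutes a placeholder in the template file before the deployment task runs. A sketch; the `__SAS_URL__` placeholder, the `sasUrl` pipeline variable, and the file path are assumptions:

```yaml
# Replace a __SAS_URL__ placeholder in the ARM template before deploying it
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      $template = Get-Content -Path 'imageTemplate.json' -Raw
      $template = $template -replace '__SAS_URL__', '$(sasUrl)'
      Set-Content -Path 'imageTemplate.json' -Value $template
```

This sidesteps quoting and concatenation issues that can arise when a SAS token (which contains `&` and `%` characters) is passed through template parameters.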
Hello Sir,
How is the work-life balance in DevOps jobs? Asking as I want to start a career path where I can make a decent amount of money and enjoy my life. So if that's the case, is DevOps for me?
I can't speak directly to DevOps, but I can to IT in general. Work-life balance depends more on the company than the field. There are companies with overworked and understaffed employees, and these companies will ultimately have high turnover. I would only suggest that environment if you are breaking into the field and need a place to get started. If you find yourself in that type of org, stick it out long enough to look good on the resume and find something better.
There are a lot of good places to work that respect work-life balance. It's not uncommon to work a night or weekend to meet a deadline or finish a project, but that should be the exception, not the norm. My philosophy is: if a company wants flexibility in your personal time, you should get flexibility in your work time. Set good boundaries and be flexible, but don't get taken advantage of.
Also, as with all of IT, dedicate time to learning new skills. This industry changes rapidly, and continuous learning is needed to stay relevant. It doesn't have to be a lot of time, but enough to keep up with trends.
There's no reason to add replication to the build process. Validate your builds, then replicate. If a build fails for whatever reason (a mistake in updating a script?), putting replication in the pipeline just wastes time.