Azure DevOps Pipelines with Terraform and Stages

  • Published: Jul 15, 2024
  • In this video, we build an Azure DevOps pipeline with stages to deploy resources built with Terraform. We start by reviewing the environment, including the code and the storage account used for the backend Terraform state. We add the Terraform extension to DevOps and create a multi-stage Azure DevOps pipeline with tasks to install Terraform and to initialize, validate, plan, and apply the Terraform configuration. Next, we create a second pipeline to run the destroy command to clean up the deployment.
    00:00 - Start
    01:37 - Demo Overview
    02:59 - Add a Service Connection
    04:17 - Add the Terraform Extension
    05:45 - Create the Pipeline
    06:45 - Add the Validate Stage
    08:04 - Add Terraform Install Task
    09:03 - Add Terraform Init Task
    10:41 - Add Terraform Verification Task
    11:18 - Add the Deploy Stage
    12:41 - Create the Deployment Jobs
    13:54 - Add Terraform Plan Task
    14:33 - Add Terraform Apply Task
    15:09 - Run the Pipeline
    17:23 - Create a Destroy Pipeline
    Links:
    Terraform Playlist
    • Getting Started with T...
    Azure DevOps Playlist
    • Azure DevOps
    Files on GitHub
    github.com/tsrob50/PublicDevOps
    Zero to Hero with Azure Virtual Desktop
    www.udemy.com/course/zero-to-...
    Hybrid Identity with Windows AD and Azure AD
    www.udemy.com/course/hybrid-i...
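The stages and tasks listed in the chapters above can be sketched as a single YAML pipeline. This is a hedged outline, not the exact file from the video: the task names come from the Microsoft DevLabs Terraform extension, and the service connection, resource group, storage account, container, and key names are placeholders.

```yaml
trigger: none

pool:
  vmImage: ubuntu-latest

stages:
  - stage: validate
    jobs:
      - job: validate
        steps:
          - task: TerraformInstaller@1
            inputs:
              terraformVersion: latest
          - task: TerraformTaskV4@4
            displayName: terraform init
            inputs:
              provider: azurerm
              command: init
              backendServiceArm: 'my-service-connection'      # placeholder
              backendAzureRmResourceGroupName: 'rg-tfstate'   # placeholder
              backendAzureRmStorageAccountName: 'sttfstate'   # placeholder
              backendAzureRmContainerName: 'tfstate'
              backendAzureRmKey: 'terraform.tfstate'
          - task: TerraformTaskV4@4
            displayName: terraform validate
            inputs:
              provider: azurerm
              command: validate
  - stage: deploy
    dependsOn: validate
    jobs:
      - job: deploy
        steps:
          # Each stage runs on a fresh agent, so Terraform is installed
          # and initialized again with the same backend inputs as above.
          - task: TerraformInstaller@1
            inputs:
              terraformVersion: latest
          - task: TerraformTaskV4@4
            displayName: terraform init
            inputs:
              provider: azurerm
              command: init
              backendServiceArm: 'my-service-connection'
              backendAzureRmResourceGroupName: 'rg-tfstate'
              backendAzureRmStorageAccountName: 'sttfstate'
              backendAzureRmContainerName: 'tfstate'
              backendAzureRmKey: 'terraform.tfstate'
          - task: TerraformTaskV4@4
            displayName: terraform plan
            inputs:
              provider: azurerm
              command: plan
              environmentServiceNameAzureRM: 'my-service-connection'
          - task: TerraformTaskV4@4
            displayName: terraform apply
            inputs:
              provider: azurerm
              command: apply
              environmentServiceNameAzureRM: 'my-service-connection'
```

A destroy pipeline, as in the last chapter, follows the same shape with `command: destroy` in place of plan/apply.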

Comments • 66

  • @jakegoodlett9608
    @jakegoodlett9608 1 year ago +2

    Great video, we're moving away from classic pipelines so this was very helpful.

  • @johnvbrennan
    @johnvbrennan 1 year ago

    Great video. Really did a good job of explaining yaml stages and use of terraform tasks.

  • @marybecken2358
    @marybecken2358 1 year ago +1

    great explanation- thank you!

  • @Fullywtf
    @Fullywtf 1 year ago

    After a bunch of googling, this video was gold. Thank you!

  • @Kevin.Gilbert
    @Kevin.Gilbert 6 months ago +3

    Thank you so much for making this video. It took a process that is quite daunting to new DevOps engineers and made it very easy to follow.

  • @leopoldocardoso2970
    @leopoldocardoso2970 3 days ago

    Thank you so much for making this video. Amazing

  • @dansmith5381
    @dansmith5381 1 year ago +1

    Thanks Travis, this was a great video for getting started with pipelines using yaml rather than using the classic pipeline builder UI

  • @Max-cq6hl
    @Max-cq6hl 2 months ago +1

    Exactly what I was looking for. Tyty!

  • @Carlesgl81
    @Carlesgl81 1 year ago

    Great Help Travis!

  • @akash25446
    @akash25446 8 months ago

    Wow, great explanation. This helped me clear up my concepts from the basics. Very professional!! Thanks a lot!

  • @jarrrred
    @jarrrred 3 months ago

    I super appreciate this video. Thank you! This was an awesome tutorial.

  • @rajojha449
    @rajojha449 1 month ago

    Great teaching, to the point and exactly what is needed. Thanks for the content

  • @nageshwarkatta910
    @nageshwarkatta910 10 months ago

    Wow, it's a nice video. Great explanation of deploying different stages using an Azure pipeline. We were facing a couple of issues with the destroy stage, and we finally fixed them based on your video. Thanks so much.

  • @AshokKumar-js4st
    @AshokKumar-js4st 2 months ago

    amazing.. excellent.. superb explanation

  • @suhailnepal9819
    @suhailnepal9819 1 year ago

    Thank you, everything is well explained in a simple manner.

    • @Ciraltos
      @Ciraltos  1 year ago

      Glad it was helpful!

  • @kevinm2567
    @kevinm2567 1 year ago +1

    Wow! I just witnessed a professional at work. Subscribed.

  • @kaio100ken
    @kaio100ken 3 months ago

    I don't know if I commented this earlier, but what a great video.

  • @gally.streetphotography
    @gally.streetphotography 9 months ago

    Great video

  • @andrehahnemarsaioli7730
    @andrehahnemarsaioli7730 1 year ago

    Awesome! thanks

  • @cloudhandle
    @cloudhandle 2 months ago

    Clear cut what was needed.

  • @MrMitsumatsu
    @MrMitsumatsu 1 year ago +1

    You rock, keep it coming

  • @kumarnandi3592
    @kumarnandi3592 1 year ago

    thank you sir ❤

  • @tambahako628
    @tambahako628 1 year ago

    Nice!

  • @saiedbilal4305
    @saiedbilal4305 5 months ago

    this is just wow !!

  • @user-sm6rn3ku3u
    @user-sm6rn3ku3u 6 days ago

    Jai ho

  • @omarrob6894
    @omarrob6894 1 year ago

    Thanks for your wonderful videos. Just wondering, have you done any CI/CD videos?

  • @oldmonk5600
    @oldmonk5600 1 year ago +2

    You explained each step of the YAML pipeline very well. Can you please make a video on multistage deployment as well? Appreciate your help; also, I couldn't find the YAML file in the Git repo you provided.

  • @kbrajeshwaran
    @kbrajeshwaran 1 year ago

    Much appreciate your efforts. I need to understand how to deploy Azure resources like VNet and subnet, VM, SQL PaaS, and Key Vault for different subscriptions via an Azure pipeline, and I also have questions on branch pipelines for the respective environments.

  • @MrFallout86
    @MrFallout86 1 year ago +8

    Great video. But how does this approach accommodate different environments? Wouldn't it be better to run the majority of these commands in the release pipeline, so you can distinguish the environment in the release lanes?

  • @gerardr727
    @gerardr727 1 year ago

    Great job! Super easy to understand and follow, thank you very much.
    That being said, I have a question:
    How do you manage the storage for multiple deployments? Do you use the same one?
    Let's imagine that we have a pipeline to deploy RG1 for project1,
    and another pipeline to deploy RG2 for project2.
    Can/should all this TF state be stored in the same blob container?
    How can I use a TF destroy only for RG1 and keep RG2 available?
    Thanks in advance
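One common pattern that addresses the question above (a sketch, not the video's exact setup; names are placeholders): keep all state files in one blob container and give each project's pipeline its own `key`, so a destroy run from project1's pipeline only touches project1's state.

```yaml
# project1 pipeline: init task backend inputs (placeholders)
backendAzureRmContainerName: 'tfstate'
backendAzureRmKey: 'project1.tfstate'

# project2 pipeline: same container, separate state file
# backendAzureRmKey: 'project2.tfstate'
```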

  • @lmb25315
    @lmb25315 1 year ago

    Have always loved your videos my man. First time posting a question here. What is a solution in Azure or Windows to auto-deploy an Azure File Share to Windows VMs as a drive letter? I have tried using the PowerShell connect script to run on startup via GPO (for servers or users in OUs) but have been unsuccessful. Thanks!

  • @DavidBenOtto
    @DavidBenOtto 1 year ago

    Great video. I would like to know how to import existing resources in my pipelines.

  • @SameedAhmed
    @SameedAhmed 1 year ago

    Thanks

  • @rest822
    @rest822 13 days ago

    Excellent video. Question: How do we move the lock.hcl file back to our ADO repository?

  • @madhurshukla23jan
    @madhurshukla23jan 5 months ago

    Great explanation, though I was not able to find the pipelines in the Git repo; only the TF files are stored there.

  • @marsamuk
    @marsamuk 1 year ago

    Hi Travis. What'll happen if you enable a trigger on the main branch, then make a code change and commit? Will it re-create the same resources? From what I understand it won't, because the current resources will exist in tfstate. Trying to understand how this will work in a prod environment where code changes should deploy new infra. Thanks

  • @user-dw3ov3jh4d
    @user-dw3ov3jh4d 9 months ago +1

    6:26 where do you find the backend storage key? Is that from ADO or from the Azure portal?

  • @rahul128ful
    @rahul128ful 1 year ago

    How did you get the variables you are specifying under variables from line no 11?

  • @_devik
    @_devik 1 year ago

    In the .gitignore you have the .tfvars file, yet the file is, and needs to be, present in the repo?

  • @LokendraSinghmourya1983
    @LokendraSinghmourya1983 1 year ago

    It's detailed. Thanks for the help. Is it possible to share this sample pipeline?

  • @SeChelios
    @SeChelios 1 year ago

    Hi, how did you use variables in the backend block in the Terraform configuration? Terraform doesn't allow that.
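For reference (a sketch, not the exact files from the video): Terraform does reject variables inside a `backend` block, but it allows a partial backend configuration whose values are supplied at `terraform init` time, which is what the pipeline task's backend inputs do.

```hcl
terraform {
  # Partial backend configuration: no values here on purpose.
  # They are supplied at init time, either by the pipeline task's
  # backendAzureRm* inputs or on the CLI, e.g.:
  #   terraform init -backend-config="storage_account_name=sttfstate"
  backend "azurerm" {}
}
```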

  • @andyhuynh2450
    @andyhuynh2450 6 months ago

    Hi Travis. If I run Terraform in VS Code, everything seems to work. When I push this to DevOps and run the pipeline, it errors on init stating the tfstate RG, storage, and tfstate already exist. How do I run the pipeline without getting errors saying the tfstate has changed?

  • @adamzachary6947
    @adamzachary6947 1 year ago

    Great! But I have a quick question. What if I need to add an environment between the two stages, for manual approval?
    Say I have created the same two stages, the first one to validate and plan; if the plan completes successfully it moves to the second stage, the apply stage. But before I run the apply, I want to make sure that someone manually approves the pipeline to apply the changes ...

    • @bibinvarughese9132
      @bibinvarughese9132 11 months ago +1

      There is a task called "ManualValidation". You can add it in between the stages, and the recipients can do a validation to decide whether to approve or not.
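A hedged sketch of that approach (stage names and the email address are placeholders): the ManualValidation task must run in an agentless (server) job, placed between the validate and deploy stages.

```yaml
- stage: approve
  dependsOn: validate
  jobs:
    - job: waitForApproval
      pool: server              # agentless job required for ManualValidation
      timeoutInMinutes: 1440    # fail the stage if nobody responds in a day
      steps:
        - task: ManualValidation@0
          inputs:
            notifyUsers: 'someone@example.com'   # placeholder
            instructions: 'Review the plan output, then approve or reject.'
            onTimeout: reject
```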

  • @jordanfox470
    @jordanfox470 1 year ago

    @travis is there a way to create the backend storage in the project itself, or is that just not best practice?

    • @Ciraltos
      @Ciraltos  1 year ago +1

      The backend storage has to be available for the initialization. It has to be in place before the apply command runs.

  • @socisback
    @socisback 1 year ago +1

    This only works if you first manually create the resource group, storage account, and container in Azure. It fails otherwise!
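That matches how Terraform's azurerm backend works: `init` only reads the backend, it never creates it. A one-time setup sketch with the Azure CLI (all names and the location are placeholders; adjust to your environment):

```shell
# Create the resource group that holds the state storage account
az group create --name rg-tfstate --location eastus

# Create the storage account (name must be globally unique)
az storage account create \
  --name sttfstatedemo123 \
  --resource-group rg-tfstate \
  --sku Standard_LRS

# Create the blob container that will hold the .tfstate files
az storage container create \
  --name tfstate \
  --account-name sttfstatedemo123 \
  --auth-mode login
```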

  • @cloudhandle
    @cloudhandle 2 months ago

    So because the YAML file needs the Terraform installer at the top, inside the stages you can just put validate, apply, or destroy without preceding them with the 'terraform' keyword?
    E.g.:
    just 'validate' instead of the 'terraform validate' command?
    Please correct me if I am wrong.
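That is how the DevLabs extension's task works, as far as I can tell: the `command` input takes only the subcommand, and the task invokes the Terraform binary put on the PATH by the installer task. A minimal sketch:

```yaml
- task: TerraformTaskV4@4
  displayName: terraform validate
  inputs:
    provider: azurerm
    command: validate   # the task supplies the 'terraform' binary itself
```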

  • @brunoccs
    @brunoccs 2 months ago

    I ran into a problem: no file was created for the tfstate in the container.

  • @satishraju5188
    @satishraju5188 10 months ago

    Hi @Travis, is there a way we can avoid initializing Terraform multiple times in the pipeline?
    Thanks,
    Satish

    • @scerons
      @scerons 3 months ago +1

      Hi! I believe you can use self-hosted agents when running your pipeline; that would allow you to preserve data and pass it on from one stage to the next. Haven't really tried it myself though.

    • @reya4182
      @reya4182 3 months ago

      @scerons that makes sense, will try it... thank you

  • @bhavanibattina4864
    @bhavanibattina4864 3 months ago +1

    @Travis if we are not providing the working directory in the pipeline steps, how does the pipeline know where the code is?

    • @MinDTraX
      @MinDTraX 1 month ago

      did you manage to find the answer?
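For what it's worth, the DevLabs task has a `workingDirectory` input that defaults to the repository checkout folder (`$(System.DefaultWorkingDirectory)`), which is why a pipeline with the .tf files at the repo root works without setting it. A hedged sketch (the `terraform` subfolder is a placeholder):

```yaml
- task: TerraformTaskV4@4
  inputs:
    provider: azurerm
    command: validate
    # Only needed when the .tf files are not at the repo root:
    workingDirectory: '$(System.DefaultWorkingDirectory)/terraform'
```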

  • @ThomasJSweet
    @ThomasJSweet 1 year ago

    Anyone know a workaround for: Failed to get existing workspaces: containers.Client#ListBlobs: Failure responding to request: StatusCode=403. The App Registration has "owner" and Storage Account Blob owner. Thx?

    • @ThomasJSweet
      @ThomasJSweet 1 year ago

      I found it, it is due to restricting the storage account to a specific IP. It seems that for this to work, the storage account must be "accessible from all networks", even if you have "allow azure services on the trusted services list to access this storage account" selected.
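Re-opening the storage account as described in the reply above can be scripted with the Azure CLI (account and group names are placeholders; note the security trade-off of allowing all networks):

```shell
# Allow access from all networks so hosted agents can reach the state blob
az storage account update \
  --name sttfstatedemo123 \
  --resource-group rg-tfstate \
  --default-action Allow
```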

  • @rajeshchamanthula3201
    @rajeshchamanthula3201 4 months ago +1


    │ Error: No configuration files
    │
    │ Apply requires configuration to be present. Applying without a
    │ configuration would mark everything for destruction, which is normally not
    │ what is desired. If you would like to destroy everything, run 'terraform
    │ destroy' instead.

    ##[error]Terraform command 'apply' failed with exit code '1'.

    • @MinDTraX
      @MinDTraX 1 month ago

      I have the same issue, did you manage to resolve it?

  • @anilrepala505
    @anilrepala505 1 year ago

    The destroy pipeline is not deleting the resources 😑

  • @ozshiffytech5389
    @ozshiffytech5389 1 year ago +1

    The video is blurry

  • @ermiaskeno1986
    @ermiaskeno1986 4 days ago

    Thank you for your great content. I am getting an error running the Azure DevOps pipeline; it's not initializing Terraform. I'd appreciate your suggestions. /opt/hostedtoolcache/terraform/0.14.11/x64/terraform init -backend-config=storage_account_name=xxx -backend-config=container_name=xxx -backend-config=key=xxx -backend-config=resource_group_name=xxx -backend-config=arm_subscription_id=xxx -backend-config=arm_tenant_id=*** -backend-config=arm_client_id=*** -backend-config=arm_client_secret=***
    ##[error]Error: There was an error when attempting to execute the process '/opt/hostedtoolcache/terraform/0.14.11/x64/terraform'. This may indicate the process failed to start. Error: spawn /opt/hostedtoolcache/terraform/1.0.0/x64/terraform ENOENT
    Finishing: terraform init