Provision Storage Account using Terraform and Azure DevOps Pipeline

  • Published: 22 May 2021
  • 1. Install Terraform
    2. Execute Terraform command locally
    3. Create Terraform configuration file(s)
    4. Create Azure DevOps project
    5. Push this configuration file to DevOps project
    6. Create build pipeline
    7. Create release pipeline
    8. Test on Azure portal
    #ingeniusyt #Terraform #AzureDevOps
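As a sketch of step 3, a minimal Terraform configuration for the storage account might look like this (the provider version, resource group, and account names are illustrative placeholders, not taken from the video):

```hcl
terraform {
  required_providers {
    azurerm = {
      source  = "hashicorp/azurerm"
      version = "~> 3.0"
    }
  }
}

# Authenticates via the Azure CLI locally, or the pipeline's service connection
provider "azurerm" {
  features {}
}

resource "azurerm_resource_group" "rg" {
  name     = "my-resource-group"
  location = "eastus"
}

resource "azurerm_storage_account" "sa" {
  # Storage account names must be globally unique, lowercase, 3-24 characters
  name                     = "mystorageacct12345"
  resource_group_name      = azurerm_resource_group.rg.name
  location                 = azurerm_resource_group.rg.location
  account_tier             = "Standard"
  account_replication_type = "LRS"
}
```

Running `terraform init`, `terraform plan`, and `terraform apply` against this file locally covers steps 1-3 before the same configuration is pushed to the DevOps project.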

Comments • 18

  • @AKASHDPakash
    @AKASHDPakash A month ago

    Good one, Abhishek. Thanks for the video

  • @anandsys25
    @anandsys25 5 months ago +1

    From the Azure portal GUI we could have done this task in minutes; why are we complicating it by creating a Terraform script and then incorporating it into DevOps, etc.? Anyway, the information is good.

  • @BharatSingh-hf1yt
    @BharatSingh-hf1yt 2 years ago

    Great Video
    really helpful

  • @karthikkarthik100
    @karthikkarthik100 2 years ago +1

    Sir, when you added the SPN, you clicked on the checkbox which would give access to the pipeline. That's the reason no RBAC permission was needed later.

    • @InGeniusYT
      @InGeniusYT  A year ago

      Could you please make the question more specific? I am not able to understand it clearly.

  • @harikasinghbondili7931
    @harikasinghbondili7931 A year ago +1

    I'm trying to run this code, but in the release pipeline it is showing an exit code (1) error.

  • @VTECLiam
    @VTECLiam 2 years ago

    Hi there, thanks for this video.
    Where are the tfstate file and the .lck file stored? I assumed they were going to be saved into the tfstate container that was set up in the process? It hasn't saved there for me. Does that mean it's saved as an artefact? The whole idea is to work collaboratively on the tfstate file so multiple engineers can get to it and work with it. Do you know how this works with ADO and what best practice is around this?

    • @InGeniusYT
      @InGeniusYT  A year ago

      In Terraform, the tfstate file and the .lck file (lock file) are stored in a location specified by the backend configuration. The backend configuration determines where and how the state file is stored and accessed.
      The state file is typically stored remotely to enable collaboration and shared access among multiple engineers. Some commonly used backends for storing the tfstate file include Azure Blob Storage, AWS S3, or a version control system like Git. The choice of backend depends on your specific requirements and infrastructure setup.
      When using Azure DevOps (ADO) pipelines, you have several options for storing the tfstate file:
      - Azure Blob Storage: you can configure Terraform to store the tfstate file in an Azure Blob Storage container. This allows multiple engineers to access and collaborate on the state file.
      - Azure DevOps Repos: if you're using Azure Repos as your version control system, you can store the tfstate file alongside your code in the repository. However, this method may not be suitable for larger projects or when strict state locking is required.
      - Terraform Cloud or Terraform Enterprise: HashiCorp provides a paid service called Terraform Cloud (or Terraform Enterprise for self-hosted installations) that offers advanced collaboration features, including state management and locking. With this approach, the tfstate file is stored securely in the cloud, and multiple engineers can work on it simultaneously.
      To determine the best practice for your specific scenario, consider factors such as team size, security requirements, infrastructure complexity, and desired collaboration features. It's generally recommended to use a remote backend for larger projects with multiple engineers to ensure concurrent access, state locking, and version control of the tfstate file.
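      For example, the `azurerm` backend stores the state in a blob container and handles locking via blob leases (the resource names below are illustrative placeholders):

      ```hcl
      terraform {
        backend "azurerm" {
          resource_group_name  = "tfstate-rg"           # pre-existing resource group
          storage_account_name = "tfstatestorage12345"  # pre-existing storage account
          container_name       = "tfstate"              # blob container for state files
          key                  = "prod.terraform.tfstate"
        }
      }
      ```

      After adding this block, `terraform init` migrates the state to the container, and every pipeline run reads and writes the same shared state file.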

  • @Teraflickstale
    @Teraflickstale 10 months ago

    Thank you for this informative video. Can you please suggest reasons for getting the below error?
    \terraform\1.5.7\x64\terraform.exe' failed with exit code 1

  • @ItsmeRajivS
    @ItsmeRajivS 2 years ago

    Good video, but please share how to create a storage account or any resource with YAML.

    • @InGeniusYT
      @InGeniusYT  A year ago

      To create a storage account or any Azure resource using YAML in Azure DevOps (ADO) pipeline, you can use the Azure Resource Manager (ARM) template deployment task. Here's an example YAML snippet that demonstrates how to create a storage account:
      ```yaml
      - task: AzureResourceManagerTemplateDeployment@3
        inputs:
          deploymentScope: 'Resource Group'
          azureResourceManagerConnection: 'MyAzureServiceConnection' # Replace with your Azure service connection name
          subscriptionId: '12345678-1234-1234-1234-1234567890ab' # Replace with your Azure subscription ID
          action: 'Create Or Update Resource Group'
          resourceGroupName: 'my-resource-group' # Replace with your desired resource group name
          location: 'eastus' # Replace with your desired location
          templateLocation: 'Linked artifact'
          csmFile: 'path/to/your/arm-template.json' # Replace with the path to your ARM template JSON file
          deploymentMode: 'Incremental'
      ```
      In the above example, you need to replace the placeholders with the appropriate values:
      - `MyAzureServiceConnection`: Replace it with the name of your Azure service connection defined in ADO.
      - `12345678-1234-1234-1234-1234567890ab`: Replace it with your Azure subscription ID.
      - `my-resource-group`: Specify the name of the resource group you want to create.
      - `eastus`: Specify the desired Azure region/location.
      - `path/to/your/arm-template.json`: Provide the path to your ARM template JSON file, which defines the storage account resource.
      Make sure you have the ARM template file available in your repository, and adjust the path accordingly in the YAML snippet. The ARM template file should contain the necessary resource definitions for the storage account.
      You can modify this example YAML snippet based on your requirements to create different Azure resources using ARM templates in your ADO pipeline.

  • @ajaysriram2351
    @ajaysriram2351 2 years ago

    Why do we need to create a storage account when we are configuring the subscription?

    • @InGeniusYT
      @InGeniusYT  A year ago

      When configuring an Azure subscription, you do not explicitly create a storage account as part of that process. The storage account is a separate Azure resource that provides a scalable and secure place to store data for various services and applications.

  • @TiteufMela
    @TiteufMela A year ago

    hi,
    How do I tell Terraform not to delete existing resources when launching the CI/CD in Azure DevOps?

    • @InGeniusYT
      @InGeniusYT  A year ago

      The `-refresh=false` flag instructs Terraform not to refresh the state of resources during execution. Note, however, that this flag alone does not prevent deletions: Terraform will still destroy resources that have been removed from the configuration or that the plan marks for replacement. To protect specific resources, review the `terraform plan` output before applying and use the `prevent_destroy` lifecycle argument.
      In your Azure DevOps pipeline YAML file, you can add this flag as an argument when running the Terraform command. Here's an example:
      ```yaml
      - task: TerraformCLI@0
        displayName: 'Terraform Apply'
        inputs:
          command: 'apply'
          workingDirectory: 'path/to/terraform/configuration'
          commandOptions: '-refresh=false'
      ```
      In the above example, the `Terraform Apply` task is executed with the `-refresh=false` option, so Terraform skips refreshing the state of existing resources before applying.
      Make sure to adjust the `workingDirectory` to the path where your Terraform configuration files are located within your repository.
      Note that skipping the refresh does not, by itself, guarantee that existing resources will not be deleted, so always review the plan output and keep your Terraform configuration up to date so that it accurately reflects the state of your infrastructure.
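      A more direct safeguard than relying on `-refresh=false` is Terraform's `prevent_destroy` lifecycle argument, which makes any plan that would delete the resource fail with an error (the names below are illustrative placeholders):

      ```hcl
      resource "azurerm_storage_account" "sa" {
        name                     = "mystorageacct12345"
        resource_group_name      = "my-resource-group"
        location                 = "eastus"
        account_tier             = "Standard"
        account_replication_type = "LRS"

        lifecycle {
          # terraform apply errors out instead of destroying this resource
          prevent_destroy = true
        }
      }
      ```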

    • @TiteufMela
      @TiteufMela A year ago

      Thank you very much

  • @raoazurecloud3775
    @raoazurecloud3775 3 years ago

    Can you share a video on the Azure DevOps REST APIs working in the background to create the ADO infrastructure?

    • @InGeniusYT
      @InGeniusYT  A year ago

      Sure, this video will come soon