Hi. Interesting. I am facing some issues while writing integration tests for external Git modules. Can I raise a support request for this?
Hi! Can we run tests for modules published with tags instead of branches? We are not able to change anything in our configuration.
Currently, the integrated testing feature is limited to the branch-based module publishing method. Extending this to tag-based modules is something we plan to address in the future based on customer feedback.
How can we get a code coverage report (e.g., what percentage of tests passed or failed)?
Terraform and Terraform Cloud don’t natively show the % of passed/failed tests, but if you’re using the CLI workflow, adding the -json flag (i.e. terraform test -json) will output the test summary in JSON format that you can parse:
{"@level":"info","@message":"Success! 2 passed, 0 failed.","@module":"terraform.ui","@timestamp":"2024-04-16T12:01:52.025416-04:00","test_summary":{"status":"pass","passed":2,"failed":0,"errored":0,"skipped":0},"type":"test_summary"}
From this, you could calculate the percent of tests that passed and failed.
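For example, a minimal sketch in Python that does that calculation (assuming the -json output includes a test_summary event shaped like the example above, with passed, failed, errored, and skipped counts) could look like this:

# Sketch: run `terraform test -json` and report the percentage of tests that passed.
# Assumes the output contains a "test_summary" event like the example shown above.
import json
import subprocess
import sys

result = subprocess.run(
    ["terraform", "test", "-json"],
    capture_output=True,
    text=True,
)

for line in result.stdout.splitlines():
    try:
        event = json.loads(line)
    except json.JSONDecodeError:
        continue  # skip any non-JSON output lines
    if event.get("type") == "test_summary":
        summary = event["test_summary"]
        # Counting skipped tests in the total is a choice; adjust to taste.
        total = summary["passed"] + summary["failed"] + summary["errored"] + summary["skipped"]
        if total:
            pct = 100 * summary["passed"] / total
            print(f"{summary['passed']}/{total} tests passed ({pct:.1f}%)")
        break
else:
    sys.exit("No test_summary event found in terraform test output")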
How can we utilize Terraform test to assess modifications or additions to an existing infrastructure, such as introducing a new VM or modifying existing ones, especially when there are already 50 VMs in the environment? This scenario arises frequently when clients are reluctant to allocate a separate testing environment or disrupt the existing setup.
The Terraform test framework is intended to be used while developing module code, not during normal operations like your scenario. Tests always execute against a temporary, in-memory state; they don't operate on existing state. For your scenario, a combination of an approval workflow around examining plan outputs before applying changes, and something like the Sentinel policy framework available in HCP Terraform, would be more relevant than the Terraform test framework.