Advanced Terraform

Testing with terraform test


Terraform 1.6 introduced a native testing framework built directly into the CLI. Before that, the only way to test modules was to provision real infrastructure with tools like Terratest and then destroy it. The native framework adds proper unit tests (no infrastructure required) and structured integration tests, making it practical to test modules as part of a standard CI/CD pipeline.

Why Test Terraform?

Infrastructure code without tests drifts silently. A module that worked in 2022 may produce wrong outputs, missing tags, or broken security group rules by 2024 after enough "small fixes." Tests catch these regressions early: broken output values, tags that quietly disappear, validation rules that are never enforced, and conditional logic that no longer does what its author intended.

The Native Test Framework

The framework introduces two file types:

File type            Extension     Location
Test files           .tftest.hcl   Module root or tests/ subdirectory
Mock provider files  .tfmock.hcl   Referenced from test files
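
A typical layout for a module that uses both file types (paths are illustrative, matching the examples later in this section) might look like:

```
modules/networking/
├── main.tf
├── variables.tf
├── outputs.tf
└── tests/
    ├── unit.tftest.hcl
    ├── integration.tftest.hcl
    └── mocks/
        └── aws.tfmock.hcl
```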

Run all tests:

terraform test

Run a specific test file:

terraform test -filter=tests/naming.tftest.hcl

Writing Test Files

A test file contains one or more run blocks. Each run block is a test case — it applies the module with given inputs and runs assertions on the outputs:

# tests/basic.tftest.hcl

# Variables shared across all run blocks in this file
variables {
  environment  = "test"
  project_name = "myapp"
  vpc_cidr     = "10.0.0.0/16"
}

run "creates_vpc_with_correct_cidr" {
  command = plan   # don't actually create infrastructure

  assert {
    condition     = aws_vpc.main.cidr_block == var.vpc_cidr
    error_message = "VPC CIDR does not match input variable"
  }
}

run "applies_required_tags" {
  command = plan

  assert {
    condition     = aws_vpc.main.tags["Environment"] == var.environment
    error_message = "Environment tag not applied to VPC"
  }

  assert {
    condition     = aws_vpc.main.tags["Project"] == var.project_name
    error_message = "Project tag not applied to VPC"
  }
}

A run block can override variables for that specific test case:

run "prod_environment_uses_larger_instance" {
  command = plan

  variables {
    environment = "prod"   # override the file-level variable
  }

  assert {
    condition     = aws_instance.app.instance_type == "t3.large"
    error_message = "Prod environment should use t3.large"
  }
}

run "non_prod_uses_small_instance" {
  command = plan

  variables {
    environment = "dev"
  }

  assert {
    condition     = aws_instance.app.instance_type == "t3.micro"
    error_message = "Non-prod environment should use t3.micro"
  }
}
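
These two cases only make sense if the module actually picks the instance size from the environment. A minimal sketch of that logic (resource and variable names assumed from the tests above) might be:

```hcl
# main.tf (sketch — not the full module)

variable "environment" {
  type = string
}

resource "aws_instance" "app" {
  # prod gets a larger instance; every other environment stays small
  instance_type = var.environment == "prod" ? "t3.large" : "t3.micro"

  # ami, subnet_id, tags, etc. elided
}
```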

Assertions

Each assert block has a condition (any boolean expression) and an error_message shown when the condition is false.

Assertions can reference any value available after the run: resources, outputs, data sources, and locals:

run "subnet_cidrs_are_derived_from_vpc" {
  command = plan

  assert {
    # Terraform has no cidrcontains() built-in; instead, assert the subnet
    # CIDR matches the cidrsubnet() expression the module is assumed to use
    condition     = aws_subnet.public.cidr_block == cidrsubnet(var.vpc_cidr, 8, 0)
    error_message = "Public subnet CIDR is not carved from the VPC CIDR"
  }
}

run "security_group_has_egress" {
  command = plan

  assert {
    condition     = length(aws_security_group.app.egress) > 0
    error_message = "Security group must have at least one egress rule"
  }
}

run "bucket_name_matches_pattern" {
  command = plan

  assert {
    condition     = can(regex("^myapp-[a-z]+-[0-9]{12}$", aws_s3_bucket.data.bucket))
    error_message = "S3 bucket name does not match expected pattern"
  }
}

Assertions can also target a module's declared outputs directly:

# Module under test: modules/networking/
# outputs.tf exports: vpc_id, public_subnet_ids, private_subnet_ids

run "outputs_correct_subnet_count" {
  command = plan

  variables {
    availability_zones = ["us-east-1a", "us-east-1b", "us-east-1c"]
  }

  assert {
    condition     = length(output.public_subnet_ids) == 3
    error_message = "Should create one public subnet per AZ"
  }

  assert {
    condition     = length(output.private_subnet_ids) == 3
    error_message = "Should create one private subnet per AZ"
  }
}

Mocking Providers & Modules

Unit tests with command = plan still need a configured provider — even for a plan, Terraform calls the provider to validate schema. Mocking lets you skip provider authentication entirely.

Define a mock provider in a .tfmock.hcl file:

# tests/mocks/aws.tfmock.hcl

mock_provider "aws" {
  mock_resource "aws_vpc" {
    defaults = {
      id         = "vpc-mock12345"
      arn        = "arn:aws:ec2:us-east-1:123456789012:vpc/vpc-mock12345"
      cidr_block = "10.0.0.0/16"
    }
  }

  mock_resource "aws_subnet" {
    defaults = {
      id  = "subnet-mock12345"
      arn = "arn:aws:ec2:us-east-1:123456789012:subnet/subnet-mock12345"
    }
  }

  mock_data "aws_availability_zones" {
    defaults = {
      names = ["us-east-1a", "us-east-1b", "us-east-1c"]
    }
  }
}

Reference the mock in a test file:

# tests/unit.tftest.hcl

mock_provider "aws" {
  # source points to a directory containing .tfmock.hcl files,
  # not to an individual mock file
  source = "./tests/mocks"
}

variables {
  environment = "test"
  vpc_cidr    = "10.0.0.0/16"
}

run "vpc_tags_are_correct" {
  command = plan   # uses mocked provider — no AWS credentials needed

  assert {
    condition     = aws_vpc.main.tags["Environment"] == "test"
    error_message = "Environment tag missing or wrong"
  }
}

Module overrides let you stub out module dependencies without provisioning them. In a test file, an override_module block replaces a child module's real resources with fixed outputs:

override_module {
  target = module.networking
  outputs = {
    vpc_id             = "vpc-mock12345"
    private_subnet_ids = ["subnet-a", "subnet-b", "subnet-c"]
  }
}
🧭 Unit tests = plan + mocks. Integration tests = apply + real provider.

Unit tests run in milliseconds, need no credentials, and test logic. Integration tests take minutes, cost money, and test that the real provider accepts your configuration. Run unit tests on every commit; run integration tests on merges to main or nightly.

Integration Tests

Integration tests use command = apply to provision real infrastructure, check it, and then destroy it. Terraform handles the full lifecycle within the test run:

# tests/integration.tftest.hcl

variables {
  environment  = "test"
  project_name = "learniac-test"
  vpc_cidr     = "10.1.0.0/16"
}

run "provision_vpc" {
  command = apply   # actually creates the VPC

  assert {
    condition     = output.vpc_id != ""
    error_message = "VPC ID should not be empty after apply"
  }
}

run "verify_subnets_created" {
  command = apply

  assert {
    condition     = length(output.public_subnet_ids) == 3
    error_message = "Expected 3 public subnets"
  }
}

# Terraform destroys all resources created during the test run after all
# run blocks complete — no cleanup code needed

Multiple run blocks in a file share state — resources created in an earlier apply run are available to later runs. This lets you test sequences: create, then verify, then modify.
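
A create-then-modify sequence could be sketched like this (reusing the instance-sizing example from earlier; resource names are assumed):

```hcl
# tests/lifecycle.tftest.hcl (sketch)

run "create" {
  command = apply   # provisions with the file-level variables
}

run "scale_up" {
  command = apply

  variables {
    environment = "prod"   # modifies in place: state from "create" is reused
  }

  assert {
    condition     = aws_instance.app.instance_type == "t3.large"
    error_message = "Changing environment to prod should resize the instance"
  }
}
```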

expect_failures for negative testing

Test that invalid inputs produce validation errors:

run "rejects_invalid_environment" {
  command = plan

  variables {
    environment = "staging"   # not in the allowed list
  }

  expect_failures = [var.environment]
}

run "rejects_cidr_too_small" {
  command = plan

  variables {
    vpc_cidr = "10.0.0.0/28"   # too small for required subnets
  }

  expect_failures = [var.vpc_cidr]
}

If the plan succeeds when a failure was expected, the test fails. This ensures variable validation rules are actually enforced.
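
For these tests to mean anything, the module must actually declare the validation rules. A sketch of what they might look like (the allowed-values list and minimum prefix size are assumptions):

```hcl
# variables.tf (sketch — rules assumed to match the tests above)

variable "environment" {
  type = string
  validation {
    condition     = contains(["dev", "test", "prod"], var.environment)
    error_message = "Environment must be one of: dev, test, prod."
  }
}

variable "vpc_cidr" {
  type = string
  validation {
    # require at least a /24 so the module can carve out its subnets
    condition     = tonumber(split("/", var.vpc_cidr)[1]) <= 24
    error_message = "VPC CIDR must be /24 or larger."
  }
}
```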

Running Tests

# Run all tests (unit + integration)
terraform test

# Run only plan-based (unit) tests — skip apply tests
terraform test -filter=tests/unit.tftest.hcl

# Verbose output showing each assertion result
terraform test -verbose

# Run a specific test file
terraform test -filter=tests/naming.tftest.hcl

# Run with specific variable overrides
terraform test -var="environment=staging"

Output format:

$ terraform test -verbose

tests/unit.tftest.hcl... in progress
  run "vpc_tags_are_correct"... pass
  run "subnet_cidr_within_vpc"... pass
  run "security_group_has_egress"... pass
tests/unit.tftest.hcl... tearing down
tests/unit.tftest.hcl... pass

Success! 3 passed, 0 failed.

Tests in CI/CD

Add unit tests (mocked, plan-only) to the PR pipeline so every change is validated before merge:

name: Terraform Module Tests

on:
  pull_request:
    paths:
      - 'modules/**'
      - '*.tf'
  push:
    branches: [main]   # required so the integration job can run on merge

jobs:
  unit-tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - uses: hashicorp/setup-terraform@v3
        with:
          terraform_version: "~1.9"

      - name: Init
        run: terraform init

      - name: Unit Tests (no credentials needed — mocked)
        run: terraform test -filter=tests/unit.tftest.hcl

  integration-tests:
    runs-on: ubuntu-latest
    if: github.ref == 'refs/heads/main'   # only on push to main
    needs: unit-tests
    permissions:
      id-token: write   # required for OIDC role assumption
      contents: read
    steps:
      - uses: actions/checkout@v4

      - uses: hashicorp/setup-terraform@v3
        with:
          terraform_version: "~1.9"

      - name: Configure AWS Credentials
        uses: aws-actions/configure-aws-credentials@v4
        with:
          role-to-assume: ${{ secrets.AWS_ROLE_ARN }}
          aws-region: us-east-1

      - name: Init
        run: terraform init

      - name: Integration Tests
        run: terraform test -filter=tests/integration.tftest.hcl
🧭 Use a dedicated test AWS account

Integration tests provision and destroy real infrastructure. Use a separate AWS account or isolated VPC for test runs so that test failures can't affect production state, and costs stay predictable and auditable.

Key Takeaways