Modern Infrastructure as Code (IaC) workflows increasingly rely on intelligent tooling to improve productivity, reduce errors, and accelerate delivery. While Terraform provides a powerful declarative model for provisioning infrastructure, writing and maintaining Terraform configurations can still involve frequent context switching between documentation, provider schemas, and validation tools.
This is where the Model Context Protocol (MCP) enters the picture.
MCP is a standardized way for AI-powered tools to securely interact with external systems through structured, tool-based communication. Instead of relying purely on text prediction, an AI assistant can invoke specialized tools, such as Terraform-aware services, to retrieve schemas, validate configuration logic, or generate accurate resource definitions based on provider metadata.
A Terraform MCP Server acts as a bridge between Terraform and AI-enabled development environments such as Visual Studio Code. It exposes Terraform-specific capabilities (for example, provider inspection or configuration generation) in a structured way that AI tools can safely consume, resulting in a more accurate, context-aware development workflow.
In this tutorial, we will:
By the end of this guide, you will have a working Terraform MCP setup integrated into your local development environment and understand how it enhances your Infrastructure as Code workflow.
The Model Context Protocol (MCP) is an open-source protocol developed by Anthropic that standardizes how AI applications interact with external tools, services, and systems.
At a high level, MCP enables AI assistants to move beyond simple text prediction and safely execute structured operations, such as reading files, inspecting schemas, validating configurations, or querying APIs, through clearly defined tool interfaces.
MCP allows AI clients to:
MCP introduces a standardized communication layer between AI clients and executable tools. It is built around a simple but powerful architecture:
How MCP Works (Conceptual Flow):
With MCP servers, AI assistants can provide much more accurate and useful responses by relying on real-time or domain-specific data sources. For example, a Terraform-aware AI assistant can use an MCP server to retrieve the exact schema for an AWS S3 bucket resource, ensuring that any generated configuration is valid and up-to-date with the latest provider version.
In short, MCP transforms AI from a predictive assistant into a tool-aware development partner.
The Terraform MCP Server is designed specifically for the Terraform ecosystem. It enables AI models and MCP-aware tools to interact with Terraform provider documentation, modules, policies, and workspace information in real time, helping ensure that generated configurations are accurate, current, and based on up-to-date provider metadata rather than relying on potentially outdated training data.
Key Capabilities and Features
At its core, the Terraform MCP Server acts as a bridge between AI clients and Terraform-related data sources, such as the Terraform Registry, Terraform Enterprise, or HCP Terraform. By exposing a suite of specialized tools via MCP, the server allows AI clients (like Visual Studio Code, Claude Desktop, or other MCP-compatible editors) to request and receive precise, structured information about providers, modules, and other Terraform resources.
Its key capabilities include:

- Searching providers and retrieving their documentation from the Terraform Registry (for example, via the search_providers and get_provider_details tools)
- Searching public modules and retrieving their inputs, outputs, and usage details (search_modules)
- When connected to HCP Terraform or Terraform Enterprise, querying workspaces, private modules, and Sentinel policies
These capabilities allow AI assistants to dynamically query real provider data, significantly improving the reliability of generated Terraform configurations and reducing the risk of errors stemming from outdated or incorrect schema information.
How It Works
When connected to an AI client that supports MCP, the Terraform MCP Server exposes a set of tools and resources that the model can invoke automatically based on the user's prompts. For example, when you ask about an AWS resource configuration, the AI client can use tools such as search_providers and get_provider_details to fetch the exact documentation for the AWS provider and include accurate schema details in the output.
A typical workflow involves:
Transport and Session Modes
To accommodate different deployment scenarios, the Terraform MCP Server supports:

- stdio mode - the server communicates over standard input/output, suited to local, single-user sessions such as the VS Code setup in this tutorial.
- Streamable HTTP mode - the server runs as a network service, suited to shared or remote deployments.
Ecosystem Integration
Because it is built on the MCP standard, the Terraform MCP Server integrates with a variety of MCP-aware clients, editors, and AI assistants that support the protocol. This makes it easier for engineers to combine interactive AI workflows with Terraform authoring, whether locally in Visual Studio Code or in more advanced agent workflows.
Note: The feature remains in beta at the time of writing, so it is recommended for development and experimentation environments rather than critical production infrastructure.
This overview should help readers understand what the Terraform MCP Server is, what it enables, and how it fits into AI-assisted Terraform development before diving into installation and usage steps.
This tutorial illustrates how to set up the Terraform MCP Server in a local Windows-based development environment with Visual Studio Code.
At a high level, you need:

- Docker Desktop installed and running
- Visual Studio Code with the GitHub Copilot Chat extension
- A GitHub Copilot subscription (the Free plan is sufficient for testing)
- The Terraform MCP Server Docker image (pulled automatically on first run)
The MCP server acts as a bridge between GitHub Copilot (via MCP support in VS Code) and Terraform's documentation and registry APIs. GitHub Copilot detects when Terraform-specific context is needed and invokes MCP tools exposed by the Terraform MCP Server. The server retrieves structured, real-time provider and module data and passes it back to Copilot, which uses it to generate accurate Terraform configuration snippets. Docker simplifies the setup by allowing you to run the server without manually compiling or installing dependencies.
First of all, ensure Docker is installed and running. If required, follow the steps outlined in the Install Docker Desktop on Windows guide to set up Docker on your machine.
Next, add the GitHub Copilot Chat extension to VS Code and sign up for GitHub Copilot Free.
GitHub Copilot Free imposes monthly limits on the number of AI-generated completions, but it should be sufficient for testing and experimentation with the Terraform MCP Server. If you find yourself hitting limits, consider upgrading to one of the paid GitHub Copilot plans for higher usage quotas and additional features.
You can install the Terraform MCP Server in your user profile or in the current workspace. The workspace configuration is stored in a .vscode/mcp.json file. The user profile configuration is stored in the VS Code user settings directory, typically located at %APPDATA%\Code\User on Windows.
In this example, we will set up the Terraform MCP Server in the current workspace. To do this, create a .vscode/mcp.json file in the root of your workspace with the following content:
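The original snippet is not reproduced here; based on the HCP-enabled configuration shown later in this article, a minimal workspace configuration (the same Docker invocation without the Terraform Cloud token) would look like this — the 0.4.0 image tag is an assumption and can be replaced with the latest release:

```json
{
  "servers": {
    "terraform": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "hashicorp/terraform-mcp-server:0.4.0"
      ],
      "type": "stdio"
    }
  }
}
```

VS Code will start the container on demand and communicate with it over standard input/output, as indicated by the stdio type.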
After saving the file, go to the Extensions view, right-click the terraform server in the MCP SERVERS - INSTALLED section or select the gear icon, and then click Start Server.
In the Output panel, you should see logs indicating that the Terraform MCP Server has started successfully. You may see warning messages about TFE client issues, which is expected as we did not include any related configuration in the mcp.json file. The server will still function and provide provider documentation and registry data, but Terraform Cloud features will be unavailable until a valid token is provided.
Next, make sure the Terraform MCP Server tools are enabled. To do this, click the tool icon (Configure Tools...) at the bottom of the Chat box and ensure that the tools from the terraform MCP server are selected. You should see search_providers, get_provider_details, search_modules, and a few others in the list of available tools. If they are not selected, click terraform or select individual tools to enable them for use in the Chat.
At this stage the Terraform MCP Server is fully set up and ready to use in your VS Code workspace. In the next section we will show how to use it to get real-time provider information and generate Terraform configuration snippets based on the latest data from the Terraform Registry.
To validate that the Terraform MCP Server is working correctly, create a file, for example random_pet_example.tf, in your workspace with the following content:
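The original file contents are not shown here; a minimal starting point that pins the random provider (the version constraint is an assumption) would be:

```hcl
terraform {
  required_providers {
    random = {
      source  = "hashicorp/random"
      version = "~> 3.0"
    }
  }
}
```

The file only needs to declare the provider; the AI assistant will add the resource definitions in the next steps.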
Activate the Chat view in VS Code (Ctrl+Alt+I) and add the random_pet_example.tf file as context to the conversation. You can do this by dragging the file into the Chat view or by clicking the Add Context button in the Chat view.
In the Chat input box, type the following text and press Enter:
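The exact prompt from the original article is not shown; a prompt along these lines will exercise the MCP tools:

```text
#terraform Generate an example configuration for the random_pet resource using
the latest provider documentation and add it to random_pet_example.tf.
```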
The AI assistant will use the Terraform MCP Server tools (as instructed by the #terraform tag) to fetch real-time data from the Terraform documentation and provide a response based on the latest provider information. If you are satisfied with the response, click the Keep button to use the generated code. Otherwise, modify the prompt and click Send to have the assistant generate a new response, which may yield a different example.
You can also issue follow-up questions or request changes to the generated code, for example, to modify the outputs:
Below is the resulting code generated by the AI assistant based on our prompts (your results may differ):
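The generated code from the original article is not reproduced here; a configuration consistent with the note below (a random_pet resource with an optional var.prefix and an output) might look like:

```hcl
variable "prefix" {
  description = "Optional prefix for the generated pet name"
  type        = string
  default     = ""
}

resource "random_pet" "example" {
  prefix = var.prefix
  length = 2
}

output "pet_name" {
  description = "The generated random pet name"
  value       = random_pet.example.id
}
```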
Note: terraform apply may generate an error if the prefix attribute is assigned an empty string value (""). In such a case, provide a non-empty default value for var.prefix (for example, "test") or comment out the line with the prefix attribute in the random_pet.example resource block.
By default, Terraform MCP can retrieve information only from the public Terraform Registry. Integrating the Terraform MCP Server with HCP Terraform extends its capabilities to include organization and workspace-level data, private registry access, and other HCP-specific tools. This allows AI clients to generate configurations that are aware of the current state of your Terraform environment, including policies, runs, and private modules, rather than relying solely on public documentation and registry data.
To integrate Terraform MCP with HCP Terraform, update the MCP server configuration (.vscode/mcp.json) to include HCP-specific settings:
{
  "servers": {
    "terraform": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-e", "TFE_TOKEN=${input:tfe_token}",
        "hashicorp/terraform-mcp-server:0.4.0"
      ],
      "type": "stdio"
    }
  },
  "inputs": [
    {
      "type": "promptString",
      "id": "tfe_token",
      "description": "Terraform API Token",
      "password": true
    }
  ]
}
Once connected, the Terraform MCP Server will expose additional HCP Terraform-specific tools, for example:
- list_workspaces - Retrieve a list of workspaces in your HCP Terraform organization.
- get_workspace_details - Get detailed information about a specific workspace, including current runs, variables, and state versions.
- search_policies - List Sentinel policies associated with your organization or specific workspaces.
- get_policy_details - Retrieve details about a specific Sentinel policy, including its rules and enforcement levels.
- search_private_modules - Search for modules in your private registry within HCP Terraform.
- get_private_module_details - Get detailed information about a specific private module, including its input variables, outputs, and usage examples.

With HCP Terraform integration, AI clients can generate Terraform configurations that are not only accurate based on public provider data but also contextually aware of your organization's specific Terraform environment, policies, and private modules. This leads to more relevant and compliant configuration suggestions, improving both productivity and governance in your Terraform workflows.
Instruction files allow you to define custom rules and guidelines for how AI assistants should use MCP tools in specific contexts.
VS Code supports various instruction files, such as:
- .github/copilot-instructions.md - For GitHub Copilot. It is stored within the workspace's root directory and automatically applies to all chat requests in the workspace.
- AGENTS.md - For VS Code AI agents. One or more of these files can be created in the workspace's root or its subdirectories. Useful if you work with multiple AI agents in your workspace. Automatically applies to all chat requests in the workspace or a specific subfolder.
- *.instructions.md - For a specific MCP tool or a subset of files. It applies only to requests related to that tool or file type.
As a general guideline, start with a single .github/copilot-instructions.md file for project-wide coding standards. Add .instructions.md files when you need different rules for different file types or frameworks. Use AGENTS.md if you work with multiple AI agents in your workspace.
For example, you can describe the general documentation standard, naming conventions, and project structure in the .github/copilot-instructions.md file and add Terraform-specific coding and formatting standards in a separate terraform.instructions.md file.
In the instruction file, you can specify that the AI assistant should always use the Terraform MCP Server to fetch provider documentation and registry data, rather than relying on its training data.
Example: .github/copilot-instructions.md
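The original example file is not reproduced here; a sketch of project-wide instructions that direct the assistant to the MCP tools might be:

```markdown
# Project instructions

- This repository contains Terraform configurations for our infrastructure.
- Always use the Terraform MCP Server tools (for example, search_providers and
  get_provider_details) to fetch provider documentation and registry data
  instead of relying on training data.
- Keep generated code consistent with the existing files in this workspace.
```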
Example: .github/instructions/terraform.instructions.md
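Again, the original file is not shown; a sketch of a Terraform-scoped instruction file follows. It assumes the applyTo front-matter field that VS Code uses to limit an instruction file to matching paths:

```markdown
---
applyTo: "**/*.tf"
---

- Verify resource arguments against the latest provider documentation via the
  Terraform MCP Server before generating code.
- Pin provider versions in a required_providers block.
- Follow terraform fmt conventions: two-space indentation and aligned
  argument assignments.
```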
By creating instruction files in your repository, you can provide instructions that guide the AI's behavior when generating code or responding to prompts. This is especially useful for ensuring that the AI uses the Terraform MCP Server tools in a way that aligns with your project's conventions, policies, or specific requirements.
The Terraform MCP Server brings structured, tool-driven intelligence to Terraform development by connecting AI assistants to official Terraform data sources. Instead of relying solely on predictive text generation, AI tools can retrieve up-to-date provider documentation, module metadata, and, when integrated with HCP Terraform, workspace-level data.
By following this tutorial, you have learned to:
- Set up the Terraform MCP Server in Visual Studio Code with Docker and GitHub Copilot
- Validate the setup by generating Terraform code from real-time registry data
- Integrate the server with HCP Terraform for workspace- and policy-aware tooling
- Guide AI behavior with instruction files such as .github/copilot-instructions.md, AGENTS.md, and *.instructions.md files

When combined with Visual Studio Code and GitHub Copilot, Terraform MCP transforms AI from a general-purpose code assistant into a structured, Terraform-aware development partner. By providing real-time access to official documentation and registry data, it enables a more reliable and context-aware Infrastructure as Code workflow.
As AI-assisted Infrastructure as Code continues to evolve, Terraform MCP provides a practical and controlled foundation for adopting these capabilities in both individual and enterprise environments.
More Terraform Tutorials

- Getting Started with Terraform
- Understanding Terraform Variable Precedence
- Terraform Value Types Tutorial
- Terraform count Explained with Practical Examples
- Terraform for_each Tutorial with Practical Examples
- Exploring Terraform dynamic Blocks with GCP Examples
- Working with External Data in Terraform
- Terraform Modules FAQ