---
description: |
  The ansible Packer provisioner allows Ansible playbooks to be run to provision
  the machine.
layout: docs
page_title: Ansible - Provisioners
sidebar_title: Ansible (Remote)
---
# Ansible Provisioner
Type: `ansible`
The `ansible` Packer provisioner runs Ansible playbooks. It dynamically creates
an Ansible inventory file configured to use SSH, runs an SSH server, executes
`ansible-playbook`, and marshals Ansible plays through the SSH server to the
machine being provisioned by Packer.
-> **Note:** Any `remote_user` defined in tasks will be ignored. Packer
will always connect with the user given in the template for this
provisioner.
-> **Note:** Options below that use the Packer template engine won't be able to
accept jinja2 `{{ function }}` macro syntax in a way that can be preserved to
the Ansible run. If you need to set variables using Ansible macros, you need to
do so inside your playbooks or inventory files.
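For example, rather than passing a jinja2 expression through the provisioner's
`extra_arguments`, you can define it inside the playbook itself, where Ansible
evaluates it (a minimal sketch; the variable name is illustrative):
```yaml
# playbook.yml -- jinja2 here is evaluated by Ansible, not Packer
- hosts: default
  vars:
    primary_ip: "{{ ansible_default_ipv4.address }}"
  tasks:
    - name: show the resolved address
      debug:
        var: primary_ip
```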
Please see the [Debugging](#debugging), [Limitations](#limitations), or
[Troubleshooting](#troubleshooting) sections if you are having trouble getting
started.
## Basic Example
This is a fully functional template that will provision an image on
DigitalOcean. Replace the mock `api_token` value with your own.
Example Packer template:
<Tabs>
<Tab heading="JSON">
```json
{
  "builders": [
    {
      "type": "digitalocean",
      "api_token": "6a561151587389c7cf8faa2d83e94150a4202da0e2bad34dd2bf236018ffaeeb",
      "image": "ubuntu-14-04-x64",
      "region": "sfo1"
    }
  ],
  "provisioners": [
    {
      "type": "ansible",
      "playbook_file": "./playbook.yml"
    }
  ]
}
```
</Tab>
<Tab heading="HCL2">
```hcl
source "digitalocean" "example"{
api_token = "6a561151587389c7cf8faa2d83e94150a4202da0e2bad34dd2bf236018ffaeeb"
image = "ubuntu-14-04-x64"
region = "sfo1"
}
build {
sources = [
"source.digitalocean.example"
]
provisioner "ansible" {
playbook_file = "./playbook.yml"
}
}
```
</Tab>
</Tabs>
Example playbook:
```yaml
---
# playbook.yml
- name: 'Provision Image'
  hosts: default
  become: true

  tasks:
    - name: install Apache
      package:
        name: 'httpd'
        state: present
```
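Assuming the template above is saved as `template.json` alongside
`playbook.yml`, a standard build invocation runs the playbook against the
droplet:
```shell-session
$ packer build template.json
```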
## Configuration Reference
Required Parameters:
@include 'provisioner/ansible/Config-required.mdx'
Optional Parameters:
@include '/provisioner/ansible/Config-not-required.mdx'
@include 'provisioners/common-config.mdx'
## Default Extra Variables
In addition to being able to specify extra arguments using the
`extra_arguments` configuration, the provisioner automatically defines certain
commonly useful Ansible variables:
- `packer_build_name` is set to the name of the build that Packer is running.
This is most useful when Packer is making multiple builds and you want to
distinguish them slightly when using a common playbook.
- `packer_builder_type` is the type of the builder that was used to create
the machine that the script is running on. This is useful if you want to
run only certain parts of the playbook on systems built with certain
builders.
- `packer_http_addr` is set to the address of the builder's HTTP file-transfer
  server, if the builder provides one (such as hyperv, parallels, qemu,
  virtualbox, and vmware). You can use this address in your provisioner to
  download large files over HTTP. This may be useful if you're experiencing
  slower speeds using the default file provisioner; a file provisioner using
  the `winrm` communicator may experience these types of difficulties.
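For example, a play can branch on these variables. This minimal sketch (the
task and builder value are illustrative) runs only when the machine was built
by the `virtualbox-iso` builder:
```yaml
- name: run this step only on VirtualBox builds
  debug:
    msg: "build {{ packer_build_name }} via {{ packer_builder_type }}"
  when: packer_builder_type == "virtualbox-iso"
```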
## Debugging
To debug underlying issues with Ansible, add `"-vvvv"` to `"extra_arguments"`
to enable verbose logging.
<Tabs>
<Tab heading="JSON">
```json
"extra_arguments": [ "-vvvv" ]
```
</Tab>
<Tab heading="HCL2">
```hcl
extra_arguments = [ "-vvvv" ]
```
</Tab>
</Tabs>
## Limitations
### Redhat / CentOS
Redhat / CentOS builds have been known to fail with the following error unless
`sftp_command` is set to `/usr/libexec/openssh/sftp-server -e`:
```text
==> virtualbox-ovf: starting sftp subsystem
virtualbox-ovf: fatal: [default]: UNREACHABLE! => {"changed": false, "msg": "SSH Error: data could not be sent to the remote host. Make sure this host can be reached over ssh", "unreachable": true}
```
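A minimal sketch of that setting in the provisioner block; `use_sftp` is
included on the assumption that file transfer should be routed through the
SFTP server, so verify it against your Packer version:
```hcl
provisioner "ansible" {
  playbook_file = "./playbook.yml"
  # Point the SSH server at the sftp-server binary shipped with
  # Redhat/CentOS openssh; use_sftp routes file transfer through it.
  use_sftp     = true
  sftp_command = "/usr/libexec/openssh/sftp-server -e"
}
```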
### chroot communicator
Building within a chroot (e.g. `amazon-chroot`) requires changing the Ansible
connection to chroot and running Ansible as root/sudo.
<Tabs>
<Tab heading="JSON">
```json
{
  "builders": [
    {
      "type": "amazon-chroot",
      "mount_path": "/mnt/packer-amazon-chroot",
      "region": "us-east-1",
      "source_ami": "ami-123456"
    }
  ],
  "provisioners": [
    {
      "type": "ansible",
      "extra_arguments": [
        "--connection=chroot",
        "--inventory-file=/mnt/packer-amazon-chroot"
      ],
      "playbook_file": "main.yml"
    }
  ]
}
```
</Tab>
<Tab heading="HCL2">
```hcl
source "amazon-chroot" "example" {
mount_path = "/mnt/packer-amazon-chroot"
region = "us-east-1"
source_ami = "ami-123456"
}
build {
sources = [
"source.amazon-chroot.example"
]
provisioner "ansible" {
extra_arguments = [
"--connection=chroot",
"--inventory-file=/mnt/packer-amazon-chroot"
]
playbook_file = "main.yml"
}
}
```
</Tab>
</Tabs>
### WinRM Communicator
There are two possible methods for using Ansible with the WinRM communicator.
Please note that if you're having trouble getting Ansible to connect, you may
want to take a look at the script that the Ansible project provides to help
configure remoting for Ansible:
https://github.com/ansible/ansible/blob/devel/examples/scripts/ConfigureRemotingForAnsible.ps1
#### Method 1 (recommended)
The recommended way to use the WinRM communicator is to set `"use_proxy": false`
and let the Ansible provisioner handle the rest for you. If you are using WinRM
with HTTPS and a self-signed certificate, you will also have to set
`ansible_winrm_server_cert_validation=ignore` in your `extra_arguments`.
Below is a fully functioning Ansible example using WinRM:
<Tabs>
<Tab heading="JSON">
```json
{
  "builders": [
    {
      "type": "amazon-ebs",
      "region": "us-east-1",
      "instance_type": "t2.micro",
      "source_ami_filter": {
        "filters": {
          "virtualization-type": "hvm",
          "name": "*Windows_Server-2012*English-64Bit-Base*",
          "root-device-type": "ebs"
        },
        "most_recent": true,
        "owners": "amazon"
      },
      "ami_name": "test-ansible-packer",
      "user_data_file": "windows_bootstrap.txt",
      "communicator": "winrm",
      "force_deregister": true,
      "winrm_insecure": true,
      "winrm_username": "Administrator",
      "winrm_use_ssl": true
    }
  ],
  "provisioners": [
    {
      "type": "ansible",
      "playbook_file": "./playbook.yml",
      "user": "Administrator",
      "use_proxy": false,
      "extra_arguments": ["-e", "ansible_winrm_server_cert_validation=ignore"]
    }
  ]
}
```
</Tab>
<Tab heading="HCL2">
```hcl
source "amazon-ebs" "example" {
region = "us-east-1"
instance_type = "t2.micro"
source_ami_filter {
filters = {
"virtualization-type": "hvm",
"name": "*Windows_Server-2012*English-64Bit-Base*",
"root-device-type": "ebs"
}
most_recent = true
owners = ["amazon"]
}
ami_name = "test-ansible-packer"
user_data_file = "windows_bootstrap.txt"
communicator = "winrm"
force_deregister = true
winrm_username = "Administrator"
winrm_insecure = true
winrm_use_ssl = true
}
build {
sources = [
"source.amazon-ebs.example",
]
provisioner "ansible" {
playbook_file = "./playbooks/playbook-windows.yml"
user = "Administrator"
use_proxy = false
extra_arguments = [
"-e", "ansible_winrm_server_cert_validation=ignore"
]
}
}
```
</Tab>
</Tabs>
Note that you do have to set the "Administrator" user; otherwise Ansible
will default to using the user that is calling Packer, rather than the user
configured inside of the Packer communicator. For the contents of
`windows_bootstrap.txt`, see the WinRM docs for the `amazon-ebs` builder.
When running from macOS, you may see an error like:
```text
amazon-ebs: objc[9752]: +[__NSCFConstantString initialize] may have been in progress in another thread when fork() was called.
amazon-ebs: objc[9752]: +[__NSCFConstantString initialize] may have been in progress in another thread when fork() was called. We cannot safely call it or ignore it in the fork() child process. Crashing instead. Set a breakpoint on objc_initializeAfterForkError to debug.
amazon-ebs: ERROR! A worker was found in a dead state
```
If you see this, you may be able to work around the issue by telling Ansible to
explicitly not use any proxying; you can do this by setting the template option
<Tabs>
<Tab heading="JSON">
```json
"ansible_env_vars": ["no_proxy=\"*\""],
```
</Tab>
<Tab heading="HCL2">
```hcl
ansible_env_vars = ["no_proxy=\"*\""]
```
</Tab>
</Tabs>
in the above Ansible template.
#### Method 2 (Not recommended)
If you want to use the Packer SSH proxy, then you need a custom Ansible
connection plugin and a particular configuration. You need a directory named
`connection_plugins` next to the playbook, containing a file named
`packer.py` which implements the connection plugin. On versions of Ansible
before 2.4.x, the following works as the connection plugin:
```python
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type

from ansible.plugins.connection.ssh import Connection as SSHConnection


class Connection(SSHConnection):
    ''' ssh based connections for powershell via packer'''
    transport = 'packer'
    has_pipelining = True
    become_methods = []
    allow_executable = False
    module_implementation_preferences = ('.ps1', '')

    def __init__(self, *args, **kwargs):
        super(Connection, self).__init__(*args, **kwargs)
```
Newer versions of Ansible require all plugins to have a documentation string.
You can see if there is a plugin available for the version of Ansible you are
using
[here](https://github.com/hashicorp/packer/tree/master/examples/ansible/connection-plugin).
To create the plugin yourself, copy all of the `options` from the
`DOCUMENTATION` string of the [ssh.py Ansible connection
plugin](https://github.com/ansible/ansible/blob/devel/lib/ansible/plugins/connection/ssh.py)
for the version of Ansible you are using, and add them to a `packer.py` file
similar to the following:
```python
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type

from ansible.plugins.connection.ssh import Connection as SSHConnection

DOCUMENTATION = '''
connection: packer
short_description: ssh based connections for powershell via packer
description:
    - This connection plugin allows ansible to communicate to the target packer machines via ssh based connections for powershell.
author: Packer
version_added: na
options:
  **** Copy ALL the options from
  https://github.com/ansible/ansible/blob/devel/lib/ansible/plugins/connection/ssh.py
  for the version of Ansible you are using ****
'''


class Connection(SSHConnection):
    ''' ssh based connections for powershell via packer'''
    transport = 'packer'
    has_pipelining = True
    become_methods = []
    allow_executable = False
    module_implementation_preferences = ('.ps1', '')

    def __init__(self, *args, **kwargs):
        super(Connection, self).__init__(*args, **kwargs)
```
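With the plugin saved, the files next to your Packer template look like this
(the playbook name is taken from the template below):
```text
.
├── win-playbook.yml
└── connection_plugins
    └── packer.py
```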
This template should build a Windows Server 2012 image on Google Cloud
Platform:
```json
{
  "variables": {},
  "provisioners": [
    {
      "type": "ansible",
      "playbook_file": "./win-playbook.yml",
      "extra_arguments": [
        "--connection",
        "packer",
        "--extra-vars",
        "ansible_shell_type=powershell ansible_shell_executable=None"
      ]
    }
  ],
  "builders": [
    {
      "type": "googlecompute",
      "account_file": "{{ user `account_file`}}",
      "project_id": "{{user `project_id`}}",
      "source_image": "windows-server-2012-r2-dc-v20160916",
      "communicator": "winrm",
      "zone": "us-central1-a",
      "disk_size": 50,
      "winrm_username": "packer",
      "winrm_use_ssl": true,
      "winrm_insecure": true,
      "metadata": {
        "sysprep-specialize-script-cmd": "winrm set winrm/config/service/auth @{Basic=\"true\"}"
      }
    }
  ]
}
```
-> **Warning:** Please note that if you're setting up WinRM for provisioning, you'll probably want to turn it off or restrict its permissions as part of a shutdown script at the end of Packer's provisioning process. For more details on the why/how, check out this useful blog post and the associated code:
https://cloudywindows.io/post/winrm-for-provisioning-close-the-door-on-the-way-out-eh/
### Post i/o timeout errors
If you see
`unknown error: Post http://<ip>:<port>/wsman:dial tcp <ip>:<port>: i/o timeout`
errors while provisioning a Windows machine, try setting Ansible to copy files
over [ssh instead of
sftp](https://docs.ansible.com/ansible/latest/reference_appendices/config.html#envvar-ANSIBLE_SCP_IF_SSH).
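One way to do this from the template is through `ansible_env_vars`, using the
`ANSIBLE_SCP_IF_SSH` environment variable from the linked docs (a minimal
sketch):
```hcl
provisioner "ansible" {
  playbook_file = "./playbook.yml"
  # Force scp over the ssh transport instead of the sftp subsystem.
  ansible_env_vars = ["ANSIBLE_SCP_IF_SSH=True"]
}
```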
### Too many SSH keys
SSH servers only allow you to attempt to authenticate a certain number of
times. All of your loaded keys will be tried before the dynamically generated
key. If you have too many SSH keys loaded in your `ssh-agent`, the Ansible
provisioner may fail authentication with a message similar to this:
```text
googlecompute: fatal: [default]: UNREACHABLE! => {"changed": false, "msg": "Failed to connect to the host via ssh: Warning: Permanently added '[127.0.0.1]:62684' (RSA) to the list of known hosts.\r\nReceived disconnect from 127.0.0.1 port 62684:2: too many authentication failures\r\nAuthentication failed.\r\n", "unreachable": true}
```
To unload all keys from your `ssh-agent`, run:
```shell-session
$ ssh-add -D
```
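You can list the keys currently loaded in the agent before deciding to unload
them:
```shell-session
$ ssh-add -l
```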
### Become: yes
We recommend against running Packer as root; if you do, you won't be able to
successfully run your Ansible playbook as root, and `become: yes` will fail.
### Using a wrapping script for your Ansible call
Sometimes you may have extra setup that needs to run as part of your Ansible
invocation. The easiest way to do this is to write a small bash script and use
it as the `command` option in place of the default `ansible-playbook`. For
example, you may need to activate a Python virtualenv before calling Ansible.
To do this, you'd want to create a bash script like:
```shell
#!/bin/bash
source /tmp/venv/bin/activate && ANSIBLE_FORCE_COLOR=1 PYTHONUNBUFFERED=1 /tmp/venv/bin/ansible-playbook "$@"
```
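Remember to make the wrapper executable so Packer can invoke it:
```shell-session
$ chmod +x /Path/To/call_ansible.sh
```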
The Ansible provisioner template remains very simple. For example:
<Tabs>
<Tab heading="JSON">
```json
{
  "type": "ansible",
  "command": "/Path/To/call_ansible.sh",
  "playbook_file": "./playbook.yml"
}
```
</Tab>
<Tab heading="HCL2">
```hcl
provisioner "ansible" {
command = "/Path/To/call_ansible.sh"
playbook_file = "./playbook.yml"
}
```
</Tab>
</Tabs>
Note that we're calling `ansible-playbook` at the end of this script and
passing all command line arguments through into that call; this is necessary
for making sure that `--extra-vars` and other important Ansible arguments get
set. Note the quoting around `"$@"`, too; if you don't use quotes, any
arguments with spaces will not be read properly.
### Docker
When trying to use Ansible with Docker, it should "just work", but if it
doesn't you may need to tweak a few options:

- Change `ansible_connection` from "ssh" to "docker".
- Set a Docker container name via the `--name` option.

On a CI server you probably want to override `ansible_host` with a random name.
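For example, the `ansible_host` variable in the template below can be
overridden at build time; `$BUILD_NUMBER` here stands in for whatever unique
value your CI system provides:
```shell-session
$ packer build -var "ansible_host=packer-build-${BUILD_NUMBER}" template.json
```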
Example Packer template:
<Tabs>
<Tab heading="JSON">
```json
{
  "variables": {
    "ansible_host": "default",
    "ansible_connection": "docker"
  },
  "builders": [
    {
      "type": "docker",
      "image": "centos:7",
      "commit": true,
      "run_command": [ "-d", "-i", "-t", "--name", "{{user `ansible_host`}}", "{{.Image}}", "/bin/bash" ]
    }
  ],
  "provisioners": [
    {
      "type": "ansible",
      "groups": [ "webserver" ],
      "playbook_file": "./webserver.yml",
      "extra_arguments": [
        "--extra-vars",
        "ansible_host={{user `ansible_host`}} ansible_connection={{user `ansible_connection`}}"
      ]
    }
  ]
}
```
</Tab>
<Tab heading="HCL2">
```hcl
variable "ansible_host" {
default = "default"
}
variable "ansible_connection" {
default = "docker"
}
source "docker" "example" {
image = "centos:7"
commit = true
run_command = [ "-d", "-i", "-t", "--name", var.ansible_host, "{{.Image}}", "/bin/bash" ]
}
build {
sources = [
"source.docker.example"
]
provisioner "ansible" {
groups = [ "webserver" ]
playbook_file = "./webserver.yml"
extra_arguments = [
"--extra-vars",
"ansible_host=${var.ansible_host} ansible_connection=${var.ansible_connection}"
]
}
}
```
</Tab>
</Tabs>
Example playbook:
```yaml
- name: configure webserver
  hosts: webserver
  tasks:
    - name: install Apache
      yum:
        name: httpd
```
### Amazon Session Manager
When trying to use Ansible with Amazon's Session Manager, you may run into an
error where Ansible is unable to connect to the remote Amazon instance if the
local proxy adapter for Ansible ([use_proxy](#use_proxy)) is disabled.
The error may look something like the following:
```text
amazon-ebs: fatal: [default]: UNREACHABLE! => {"changed": false, "msg": "Failed to connect to the host via ssh: ssh: connect to host 127.0.0.1 port 8362: Connection timed out", "unreachable": true}
```
The error is caused by a limitation of Amazon's SSM default port-forwarding
session, which only allows one remote connection on the forwarded port. Since
Ansible's SSH communication does not use the local proxy adapter, it tries to
make a new SSH connection to the same forwarded localhost port and fails.
To work around this issue, Ansible can be configured, via a custom inventory
file, to use the AWS session-manager-plugin directly to create a new session
at runtime, separate from the one created by Packer, to connect to and
provision the instance.
-> **Warning:** Please note that the default region configured for the `aws`
CLI must match the build region where the instance is being provisioned,
otherwise you may run into a TargetNotConnected error. You can use
`AWS_DEFAULT_REGION` to temporarily override your configured region.
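For example, to match the `us-east-1` build region used in the full template
below (assuming the template is saved as `template.json`):
```shell-session
$ AWS_DEFAULT_REGION=us-east-1 packer build template.json
```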
<Tabs>
<Tab heading="JSON">
```json
"provisioners": [
{
"type": "ansible",
"use_proxy": false,
"ansible_env_vars": ["PACKER_BUILD_NAME={{ build_name }}"],
"playbook_file": "./playbooks/playbook_remote.yml",
"inventory_file_template": "{{ .HostAlias }} ansible_host={{ .ID }} ansible_user={{ .User }} ansible_ssh_common_args='-o StrictHostKeyChecking=no -o ProxyCommand=\"sh -c \\\"aws ssm start-session --target %h --document-name AWS-StartSSHSession --parameters portNumber=%p\\\"\"'\n"
}
]
```
</Tab>
<Tab heading="HCL2">
```hcl
provisioner "ansible" {
use_proxy = false
playbook_file = "./playbooks/playbook_remote.yml"
ansible_env_vars = ["PACKER_BUILD_NAME={{ build_name }}"]
inventory_file_template = "{{ .HostAlias }} ansible_host={{ .ID }} ansible_user={{ .User }} ansible_ssh_common_args='-o StrictHostKeyChecking=no -o ProxyCommand=\"sh -c \\\"aws ssm start-session --target %h --document-name AWS-StartSSHSession --parameters portNumber=%p\\\"\"'\n"
}
```
</Tab>
</Tabs>
Full Packer template example:
<Tabs>
<Tab heading="JSON">
```json
{
  "variables": {
    "instance_role": "SSMInstanceProfile"
  },
  "builders": [
    {
      "type": "amazon-ebs",
      "region": "us-east-1",
      "ami_name": "packer-ami-ansible",
      "instance_type": "t2.micro",
      "source_ami_filter": {
        "filters": {
          "virtualization-type": "hvm",
          "name": "ubuntu/images/*ubuntu-xenial-16.04-amd64-server-*",
          "root-device-type": "ebs"
        },
        "owners": [
          "099720109477"
        ],
        "most_recent": true
      },
      "communicator": "ssh",
      "ssh_username": "ubuntu",
      "ssh_interface": "session_manager",
      "iam_instance_profile": "{{user `instance_role`}}"
    }
  ],
  "provisioners": [
    {
      "type": "ansible",
      "use_proxy": false,
      "ansible_env_vars": ["PACKER_BUILD_NAME={{ build_name }}"],
      "playbook_file": "./playbooks/playbook_remote.yml",
      "inventory_file_template": "{{ .HostAlias }} ansible_host={{ .ID }} ansible_user={{ .User }} ansible_ssh_common_args='-o StrictHostKeyChecking=no -o ProxyCommand=\"sh -c \\\"aws ssm start-session --target %h --document-name AWS-StartSSHSession --parameters portNumber=%p\\\"\"'\n"
    }
  ]
}
```
</Tab>
<Tab heading="HCL2">
```hcl
variables {
  instance_role = "SSMInstanceProfile"
}

source "amazon-ebs" "ansible-example" {
  region        = "us-east-1"
  ami_name      = "packer-ami-ansible"
  instance_type = "t2.micro"

  source_ami_filter {
    filters = {
      name                  = "ubuntu/images/*ubuntu-xenial-16.04-amd64-server-*"
      "virtualization-type" = "hvm"
      "root-device-type"    = "ebs"
    }
    owners      = ["099720109477"]
    most_recent = true
  }

  communicator         = "ssh"
  ssh_username         = "ubuntu"
  ssh_interface        = "session_manager"
  iam_instance_profile = var.instance_role
}

build {
  sources = ["source.amazon-ebs.ansible-example"]

  provisioner "ansible" {
    use_proxy               = false
    playbook_file           = "./playbooks/playbook_remote.yml"
    ansible_env_vars        = ["PACKER_BUILD_NAME={{ build_name }}"]
    inventory_file_template = "{{ .HostAlias }} ansible_host={{ .ID }} ansible_user={{ .User }} ansible_ssh_common_args='-o StrictHostKeyChecking=no -o ProxyCommand=\"sh -c \\\"aws ssm start-session --target %h --document-name AWS-StartSSHSession --parameters portNumber=%p\\\"\"'\n"
  }
}
```
</Tab>
</Tabs>
## Troubleshooting
If you are using an Ansible version >= 2.8 and Packer hangs in the
"Gathering Facts" stage, this could be the result of a pipelining issue with
the proxy adapter that Packer uses. Setting `use_proxy: false` in your
Packer config should resolve the issue. In the future we will default to this
setting, so you won't have to, but for now it is a manual change you must make.
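A minimal sketch of that change in HCL2 (the playbook path is illustrative):
```hcl
provisioner "ansible" {
  # Disable the local proxy adapter to avoid the "Gathering Facts" hang.
  use_proxy     = false
  playbook_file = "./playbook.yml"
}
```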