
Packer won't correctly use private key for SSH auth in provisioning step

Asked by siride on Server Fault, February 11, 2021

I use Packer to build VirtualBox images, with the Ansible provisioner to set up the images. The builder step creates a temporary user (ssh_username and ssh_password), and the Ansible provisioner runs as that temporary user. Of course, I want to get rid of this user once it has set up our more secure, public-key-only user, so I added a second Ansible provisioning step that connects as the secure user and removes the insecure one. Or at least that was the plan: Ansible, run via Packer, is unable to actually connect to the VM this way.

Here is the relevant portion of the packer.json file:

"provisioners": [
    {
        "type": "ansible",
        "playbook_file": "playbooks/image/image.yml",
        "groups": [
            "{{user `ansible_group`}}"
        ],
        "user": "vagrant",
        "extra_arguments": [
            "--vault-password-file", "scripts/get-vault-password.sh",
            "-e", "global_configuration_user={{user `configuration_user`}}",
            "-e", "global_deployment_user={{user `deployment_user`}}",
            "-e", "ansible_ssh_pass=vagrant",
            "-vvvvv"
        ]
    },
    {
        "type": "ansible",
        "playbook_file": "playbooks/image/removeVagrant.yml",
        "groups": [
            "{{user `ansible_group`}}"
        ],
        "user": "{{user `configuration_user`}}",
        "extra_arguments": [
            "--vault-password-file", "scripts/get-vault-password.sh",
            "-e", "global_configuration_user={{user `configuration_user`}}",
            "-e", "global_deployment_user={{user `deployment_user`}}",
            "-e", "ansible_ssh_private_key_file=~/.ssh/id_{{user `configuration_user`}}_rsa",
            "-vvvvv"
        ]
    }
],
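For reference, removeVagrant.yml boils down to a user-removal task along these lines (a sketch; the exact contents are illustrative):

    ---
    # Sketch: delete the temporary build user and its home directory
    - hosts: all
      become: true
      tasks:
        - name: Remove the temporary vagrant user
          user:
            name: vagrant
            state: absent
            remove: yes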

The first provisioning step works without issue. It’s the second one that fails with permission denied. Ansible is trying to execute the following SSH command:

ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o Port=37947 -o 'IdentityFile="/home/redacted/.ssh/id_rsa"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=redacted -o ConnectTimeout=10 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null -o ControlPath=/home/redacted/.ansible/cp/ansible-ssh-%h-%p-%r 127.0.0.1 '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo ~/.ansible/tmp/ansible-tmp-1491233126.24-276699777493633 `" && echo ansible-tmp-1491233126.24-276699777493633="` echo ~/.ansible/tmp/ansible-tmp-1491233126.24-276699777493633 `" ) && sleep 0'"'"''

The relevant part of the SSH debug output is:

debug1: SSH2_MSG_NEWKEYS received
debug2: key: /home/redacted/.ssh/id_rsa, explicit, agent
debug2: key: redacted
debug2: key: redacted
debug2: key: redacted
debug2: key: redacted
debug2: key: redacted
debug2: key: redacted
debug2: key: redacted
debug3: send packet: type 5
debug3: receive packet: type 6
debug2: service_accept: ssh-userauth
debug1: SSH2_MSG_SERVICE_ACCEPT received
debug3: send packet: type 50
debug3: receive packet: type 51
debug1: Authentications that can continue: publickey
debug3: start over, passed a different list publickey
debug3: preferred gssapi-with-mic,gssapi-keyex,hostbased,publickey
debug3: authmethod_lookup publickey
debug3: remaining preferred: ,gssapi-keyex,hostbased,publickey
debug3: authmethod_is_enabled publickey
debug1: Next authentication method: publickey
debug1: Offering RSA public key: /home/redacted/.ssh/id_rsa
debug3: send_pubkey_test
debug3: send packet: type 50
debug2: we sent a publickey packet, wait for reply
debug3: receive packet: type 51
debug1: Authentications that can continue: publickey
debug1: Offering RSA public key: key2
debug3: send_pubkey_test
debug3: send packet: type 50
debug2: we sent a publickey packet, wait for reply
debug3: receive packet: type 51

It then tries each of the remaining keys in my .ssh directory, all of which fail, of course. The first key, the one that was explicitly requested, has already been rejected.

I ran Packer with -on-error=abort so that it would leave the VM up. I tried SSHing in with the same command and got a connection refused error. The reason, I discovered, is that the command targets an SSH proxy port that Packer sets up. When I use the actual forwarded port on the VirtualBox VM instead, the SSH connection succeeds. So the problem appears to lie with the SSH proxy, but I'm not sure how to make it behave.
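To illustrate (the user name and the second port number here are placeholders):

    # Refused: 37947 is the port Packer's SSH proxy was listening on,
    # which is gone once the build aborts
    ssh -i ~/.ssh/id_myuser_rsa -p 37947 myuser@127.0.0.1

    # Succeeds: 2222 stands in for the host port VirtualBox actually
    # forwards to the guest's port 22 (check the VM's NAT rules)
    ssh -i ~/.ssh/id_myuser_rsa -p 2222 myuser@127.0.0.1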

I also tried the ssh_authorized_key_file option (https://www.packer.io/docs/provisioners/ansible.html) in the Ansible provisioner section of my Packer configuration. In that case, Packer errored out saying it failed to parse the authorized key (the source code is here: https://github.com/bhcleek/packer-provisioner-ansible/blob/master/provisioner/ansible/provisioner.go), without saying what the actual problem was. The Go documentation for the SSH library says it expects the standard SSH keyfile format, or at least that's my best reading of it (https://godoc.org/golang.org/x/crypto/ssh#ParseAuthorizedKey).
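For what it's worth, ParseAuthorizedKey expects a single line in standard authorized_keys format, i.e. key type, base64 blob, optional comment, so presumably the file that ssh_authorized_key_file points at has to contain exactly that:

    ssh-rsa AAAAB3NzaC1yc2EAAAADAQAB... optional-comment

(The base64 blob is truncated here for illustration.)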

One Answer

I had this problem too and was never able to get the Packer SSH proxy to behave.

To stop Packer from creating the temporary key, you need to either bake the "provisioning key" into the AMI or have it exist on AWS ahead of time.

If you follow option 1, you need to provide both the ssh_private_key_file option in the builder config and set ssh_agent_auth to true, like so:

   "ssh_username": "ubuntu",
   "ssh_private_key_file": "../provision",
   "ssh_agent_auth": true,

If you follow option 2, provide the ssh_keypair_name option to the builder.
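A sketch of option 2, trimmed to the SSH-relevant options (key pair name and path are placeholders; per the Packer docs, ssh_keypair_name also needs ssh_private_key_file or ssh_agent_auth alongside it so Packer can actually authenticate):

    "builders": [
        {
            "type": "amazon-ebs",
            "ssh_username": "ubuntu",
            "ssh_keypair_name": "provision-key",
            "ssh_private_key_file": "~/.ssh/provision-key.pem"
        }
    ]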

In both cases, you'll still need to provide the user to the Ansible provisioner, but the specified key pair will be used instead of the temporary one Packer generates.

NB: When I removed the user from the box using the Ansible provisioner, it caused Ansible to fail. I suspect this is because Ansible can't connect to the target machine without going through the proxy, and you can't specify the proxy user in a provisioner. I had to perform the "remove user and shut down the machine" step in a single async call.
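A sketch of that single async task (user name and timings are illustrative; the sleep gives Ansible time to collect the task result before the connection drops):

    - name: Remove temporary user and power off in one shot
      # Fire-and-forget: Ansible never has to reconnect after the
      # user is gone
      become: true
      shell: sleep 5 && userdel -r vagrant && shutdown -h now
      async: 90
      poll: 0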

Answered by prater on February 11, 2021
