Ansible SSH forwarding doesn't seem to work with Vagrant

OK, strange question. I have SSH agent forwarding working with Vagrant, but I can't get it to work when using Ansible as a Vagrant provisioner.

I found out exactly what Ansible is executing and tried it myself from the command line; sure enough, it fails there too.

[/common/picsolve-ansible/u12.04%]ssh -o HostName=127.0.0.1 \
 -o User=vagrant -o  Port=2222 -o UserKnownHostsFile=/dev/null \
 -o StrictHostKeyChecking=no -o PasswordAuthentication=no \
 -o IdentityFile=/Users/bryanhunt/.vagrant.d/insecure_private_key \
 -o IdentitiesOnly=yes -o LogLevel=FATAL \
 -o ForwardAgent=yes "/bin/sh  \
 -c 'git clone git@github.com:bryan_picsolve/poc_docker.git /home/vagrant/poc_docker' "
Permission denied (publickey,password).

But when I just run vagrant ssh, agent forwarding works correctly, and I can check out my GitHub project with read/write access.

[/common/picsolve-ansible/u12.04%]vagrant ssh
vagrant@vagrant-ubuntu-precise-64:~$ /bin/sh  -c 'git clone git@github.com:bryan_picsolve/poc_docker.git /home/vagrant/poc_docker'
Cloning into '/home/vagrant/poc_docker'...
remote: Counting objects: 18, done.
remote: Compressing objects: 100% (14/14), done.
remote: Total 18 (delta 4), reused 0 (delta 0)
Receiving objects: 100% (18/18), done.
Resolving deltas: 100% (4/4), done.
vagrant@vagrant-ubuntu-precise-64:~$

Does anyone have any idea what is going on?

Update:

Using ps awux, I determined the exact command being executed by Vagrant.

I replicated it, and the git checkout worked.

 ssh [email protected] -p 2222 \
  -o Compression=yes \
  -o StrictHostKeyChecking=no \
  -o LogLevel=FATAL \
  -o StrictHostKeyChecking=no \
  -o UserKnownHostsFile=/dev/null \
  -o IdentitiesOnly=yes \
  -i /Users/bryanhunt/.vagrant.d/insecure_private_key \
  -o ForwardAgent=yes \
  -o LogLevel=DEBUG \
   "/bin/sh  -c 'git clone [email protected]:bryan_picsolve/poc_docker.git /home/vagrant/poc_docker' "
Ordonez answered 6/1, 2014 at 14:57 Comment(2)
Have you checked related questions #11956025 and #12924175? – Shulamith
I took a look, but they didn't directly address my issue. I've found out what was wrong. I'll post the solution now. – Ordonez

As of ansible 1.5 (devel aa2d6e47f0) last updated 2014/03/24 14:23:18 (GMT +100) and Vagrant 1.5.1, this now works.

My Vagrant configuration contains the following:

config.vm.provision "ansible" do |ansible|
  ansible.playbook = "../playbooks/basho_bench.yml"
  ansible.sudo = true
  ansible.host_key_checking = false
  ansible.verbose = 'vvvv'
  ansible.extra_vars = { ansible_ssh_user: 'vagrant',
                         ansible_connection: 'ssh',
                         ansible_ssh_args: '-o ForwardAgent=yes' }
end

It is also a good idea to explicitly disable sudo for tasks that need the forwarded agent, since the forwarded SSH_AUTH_SOCK belongs to the vagrant user and is not visible under sudo by default. For example, when using the Ansible git module, I do this:

- name: checkout basho_bench repository 
  sudo: no
  action: git repo=git@github.com:basho/basho_bench.git dest=basho_bench
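
A quick way to confirm that the forwarded agent is actually reachable by the vagrant user is a throwaway check task (a sketch in the same pre-1.9 syntax as above; ssh-add -l lists the keys the forwarded agent holds):

- name: confirm the forwarded agent is reachable
  sudo: no
  action: command ssh-add -l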
Ordonez answered 31/3, 2014 at 17:40 Comment(2)
From my experience, I believe that you have to be manually specifying an inventory in order for this to work. It didn't work for me if I just let vagrant make the inventory. – Pronominal
If you destroy and re-create your Vagrant box, ssh-agent forwarding will be silently disabled, unless you pass an empty known hosts file, per Ben Darnell’s answer: https://mcmap.net/q/527159/-ansible-ssh-forwarding-doesn-39-t-seem-to-work-with-vagrant – Wareing

The key difference appears to be the UserKnownHostsFile setting. Even with StrictHostKeyChecking turned off, ssh quietly disables certain features, including agent forwarding, when there is a conflicting entry in the known hosts file (such conflicts are common with Vagrant, since multiple VMs may have the same address at different times). It works for me if I point UserKnownHostsFile at /dev/null:

config.vm.provision "ansible" do |ansible|
  ansible.playbook = "playbook.yml"

  ansible.raw_ssh_args = ['-o UserKnownHostsFile=/dev/null']
end
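
Alternatively, rather than discarding the known hosts file wholesale, the single stale entry can be removed before provisioning (a sketch; adjust the port to your VM's forwarded SSH port):

ssh-keygen -R "[127.0.0.1]:2222"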
Luminous answered 16/5, 2014 at 20:30 Comment(1)
Wow, how many ways can this stuff break? I had it working before, but it got broken again; I followed your suggestion and it worked. This stuff is very necessary but very brittle, not helped by the brittleness of the ssh command; that util is really showing its age/cruft. – Infantryman

Here's a workaround:

Create an ansible.cfg file in the same directory as your Vagrantfile with the following lines:

[ssh_connection]
ssh_args = -o ForwardAgent=yes
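
If the stale-known-hosts conflict described in the answer above also applies, the relevant options can be combined on the same line (a sketch; ControlMaster=no is optional and mainly useful while debugging, per the ControlPersist caveat in a comment further down):

[ssh_connection]
ssh_args = -o ForwardAgent=yes -o UserKnownHostsFile=/dev/null -o ControlMaster=no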
Hebdomadary answered 10/2, 2014 at 15:39 Comment(2)
That works when using Ansible without Vagrant (I use the same config), but not (if I recall correctly) when using it with Vagrant. IMHO the hassle involved in getting this stuff working is a weak point in an otherwise fantastic tool. – Infantryman
ansible uses ssh in provisioning, and adds your vagrant VM to ~/.ssh/known_hosts. Agent forwarding depends on verified host keys, so before you run ansible to provision, it's better to remove any outdated key with: ssh-keygen -R [127.0.0.1]:2222 – Norvin

You can simply add this line to your Vagrantfile to enable SSH agent forwarding:

config.ssh.forward_agent = true

Note: Don't forget to execute the task with become: false
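
For example, a task sketch (the repository URL and destination path are placeholders):

- name: check out a private repository over the forwarded agent
  become: false   # run as the vagrant user so the forwarded SSH_AUTH_SOCK is visible
  git:
    repo: git@github.com:example/private-repo.git
    dest: /home/vagrant/private-repo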

Hope this will help.

Navarrette answered 17/1, 2015 at 20:44 Comment(1)
This is the only thing that worked for me, even though I set ssh_args = -A in the ansible.cfg (and do not forget to execute the task with become: false). – Bardwell

I've found that I need to do two separate things (on Ubuntu 12.04) to get it working:

  • the -o ForwardAgent thing that @Lorin mentions
  • adding /etc/sudoers.d/01-make_SSH_AUTH_SOCK_AVAILABLE with these contents (a verification sketch follows the list):

    Defaults env_keep += "SSH_AUTH_SOCK"
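
To check that the variable actually survives sudo, running something like this inside the VM should print the socket path and list the forwarded keys (a sketch):

sudo sh -c 'echo $SSH_AUTH_SOCK; ssh-add -l'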
    
Sheugh answered 8/3, 2014 at 5:21 Comment(2)
I'm pretty sure I've tried both. I'll try again next weekend with the latest Ansible and both your suggestions, thx, bryan – Ordonez
@Ordonez Beware the ControlMaster! As far as I can tell, that keeps your SSH connection alive for 60 seconds even if Vagrant has stopped. So if you make a change that affects how ssh works, it won't take effect if the original connection is running. I recommend that you delete the ControlMaster and ControlPersist options from the ansible.cfg, at least while you're debugging. – Sheugh

I struggled with a very similar problem for a few hours (Vagrant 1.7.2, ansible 1.9.4).

My symptoms:

failed: [vagrant1] => {"cmd": "/usr/bin/git ls-remote '' -h refs/heads/HEAD", "failed": true, "rc": 128}
stderr: Permission denied (publickey).
fatal: Could not read from remote repository.

Please make sure you have the correct access rights
and the repository exists.

msg: Permission denied (publickey).
fatal: Could not read from remote repository.

Please make sure you have the correct access rights
and the repository exists.

FATAL: all hosts have already failed -- aborting

SSH'ing into the guest, I found that my ssh-agent was forwarding as expected:

vagrant@vagrant-ubuntu-trusty-64:~$ ssh -T git@github.com
Hi baxline! You've successfully authenticated, but GitHub does not provide shell access.

However, when running the same check through Ansible from the host machine, the agent was not available:

$ ansible web -a "ssh-add -L"
vagrant1 | FAILED | rc=2 >>
Could not open a connection to your authentication agent.

After confirming that my ansible.cfg file was set up, as @Lorin noted, and my Vagrantfile set config.ssh.forward_agent = true, I still came up short.

The solution was to delete all lines in my host's ~/.ssh/known_hosts file that were associated with my guest. For me, they were the lines that started with:

[127.0.0.1]:2201 ssh-rsa
[127.0.0.1]:2222 ssh-rsa
[127.0.01]:2222 ssh-rsa
[127.0.0.1]:2200 ssh-rsa

Note that the third line has a funny IP address. I'm not certain, but I believe that line was the culprit. These lines accumulate as I destroy and re-create vagrant VMs.
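
Rather than hand-editing the file, ssh-keygen can remove each stale entry by host and port (a sketch; list whichever ports your VMs have used, and the mistyped [127.0.01] entry would need its own call):

for port in 2200 2201 2222; do
  ssh-keygen -R "[127.0.0.1]:$port"
done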

Freezer answered 13/1, 2016 at 16:47 Comment(1)
Top tip about the known_hosts file on the host machine (not the VM). This was my problem too. – Maharashtra
