SSH Fails Due to Key File Permissions When I Try to Provision a Vagrant VM with Ansible on Windows/Cygwin
I’m using Cygwin (CYGWIN_NT-6.3-WOW64) under Windows 8. I’m also running Vagrant (1.7.2) and Ansible (1.8.4). For completeness, my VirtualBox is 4.3.22.

Cygwin and Vagrant have been installed from their respective Windows install packages. I’m running Python 2.7.8 under Cygwin and used ‘pip install ansible’ to install Ansible.

All of these applications work fine in their own right. Cygwin works wonderfully; I use it as my shell all day, every day with no problems.

Vagrant and Virtualbox also work with no problems when I run Vagrant under Cygwin. Ansible works fine under Cygwin as well when I run plays or modules against the servers on my network.

The problem I run into is when I try to use Ansible to provision a Vagrant VM running locally.

For example, I vagrant up a VM and then draft a simple playbook to provision it. Here is the Vagrantfile:

VAGRANTFILE_API_VERSION = "2"
Vagrant.configure(VAGRANTFILE_API_VERSION) do |config|
  config.vm.define :drupal1 do |config|
    config.vm.box = "centos65-x86_64-updated"
    config.vm.hostname = "drupal1"
    config.vm.network "forwarded_port", guest: 80, host: 10080
    config.vm.network :private_network, ip: "192.168.56.101"
    config.vm.provider "virtualbox" do |v|
      v.name   = "Drupal Server 1"
      v.memory = 1024
    end
    config.vm.provision :ansible do |ansible|
      ansible.playbook = "provisioning/gather_facts.yml"
    end
  end
end

and the playbook:

---
- hosts: all
  gather_facts: yes

However, when I run ‘vagrant provision drupal1’, I get the following error:

$ vagrant provision drupal1
==> drupal1: Running provisioner: ansible...
PYTHONUNBUFFERED=1 ANSIBLE_FORCE_COLOR=true ANSIBLE_HOST_KEY_CHECKING=false
ANSIBLE_SSH_ARGS='-o UserKnownHostsFile=/dev/null -o ControlMaster=auto -o ControlPersist=60s'
ansible-playbook
  --private-key=C:/Users/mjenkins/workspace/Vagrant_VMs/Drupal1/.vagrant/machines/drupal1/virtualbox/private_key
  --user=vagrant --connection=ssh --limit='drupal1'
  --inventory-file=C:/Users/mjenkins/workspace/Vagrant_VMs/Drupal1/.vagrant/provisioners/ansible/inventory
  provisioning/gather_facts.yml

PLAY [all]

GATHERING FACTS
fatal: [drupal1] => private_key_file (C:/Users/mjenkins/workspace/Vagrant_VMs/Drupal1/.vagrant/machines/drupal1/virtualbox/private_key) is group-readable or world-readable and thus insecure - you will probably get an SSH failure

PLAY RECAP
        to retry, use: --limit @/home/mjenkins/gather_facts.retry

drupal1 : ok=0 changed=0 unreachable=1 failed=0

Ansible failed to complete successfully. Any error output should be visible above. Please fix these errors and try again.

Looking at the error, it’s plainly obvious that it has something to do with Ansible’s interpretation of my key and the file permissions on either it or the folder it’s in.

Here are a few observations and steps I’ve tried:

  1. I tried setting the permissions on the file and all the directories leading up to it in Cygwin. That is, chmod -R 700 .vagrant in the project directory. I still got the same error.

  2. The key file is being referenced using a Windows path, not a Cygwin path (odd, though, that the retry file in the --limit output has a Cygwin path). So I checked the permissions from the Windows side and changed them so that ‘Everyone’ has no access to .vagrant or any files/folders under it (see the icacls sketch after this list). I still got the same error.

  3. Then I thought there might still be some problem with the file permissions/paths between my Cygwin-based Ansible and Windows, so I installed Python for Windows, used that pip to install Ansible, set my paths to that location, created an ansible-playbook.bat file, and ran Vagrant from a Windows cmd shell. Glad to say that tool chain worked… but I still got the same problem.
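For reference, a minimal sketch of the Windows-side lockdown described in step 2, run as an icacls one-liner from a cmd shell in the project directory (this is an assumed reconstruction, not the exact command used):

:: Strip inherited ACEs, then grant only the current user full control, recursively.
icacls .vagrant /inheritance:r /grant:r "%USERNAME%:(OI)(CI)F" /T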

At this point I’m just about out of ideas, so I turn to you, friends of Stack Overflow, for your input.

Any thoughts on solving this problem?

Defector asked 12/3, 2015 at 22:12

Comments:
I've run into similar errors recently. It's due to an extra check that was added in Ansible 1.8. If you downgrade Ansible to 1.7.x, the problem will disappear. Ansible was not meant to run on Windows (as the control node), so formally it is not a bug; however, I'm still considering opening an issue. – Rivkarivkah
Thanks for the input, @MarcinPłonka. I'm not sure it's Ansible, though, in my case. I'm leaning more towards Vagrant failing in a Windowsy environment. When I use Ansible from the command line with the inventory and private key generated by Vagrant, everything works fine. That is, I am running ansible-playbook manually, not the 'vagrant provision' command. When I run the playbook standalone against the VMs, I connect fine and all steps are executed as expected. – Defector
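(For reference, a sketch of such a manual run, reusing the inventory and key paths from the provisioner output above; the exact flags used may have differed:)

ansible-playbook provisioning/gather_facts.yml \
    --inventory-file=.vagrant/provisioners/ansible/inventory \
    --private-key=.vagrant/machines/drupal1/virtualbox/private_key \
    --user=vagrant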
Your private key is wide open and accessible by anyone, and a check in the SSH client prevents using such keys.

Try changing the permissions on your private and public keys with chmod from Cygwin or Git Bash. On C:/Users/mjenkins/workspace/Vagrant_VMs/Drupal1/.vagrant/machines/drupal1/virtualbox/private_key, run chmod 700 private_key and confirm with ls -la that you see -rwx------.
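A minimal sketch of those commands from a Cygwin shell (the /cygdrive path is simply the Cygwin view of the Windows path above):

cd /cygdrive/c/Users/mjenkins/workspace/Vagrant_VMs/Drupal1/.vagrant/machines/drupal1/virtualbox
chmod 700 private_key
ls -la private_key    # expect: -rwx------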

Dried answered 15/3, 2015 at 17:54

Comments:
Thanks for the suggestion. I've confirmed the permissions on the keys from both the Windows and Cygwin sides, and it still fails with the same error. – Defector
It's not about those keys. Look at the message: private_key_file (C:/Users/mjenkins/workspace/Vagrant_VMs/Drupal1/.vagrant/machines/drupal1/virtualbox/private_key). It's about the key inside your Vagrant machine. Check the permissions on that. – Dried
The permissions on the keys inside the VM are set correctly by Vagrant. The .ssh dir is "drwx------" and authorized_keys is "-rw-------". As it is, connecting with Ansible outside of Vagrant works as expected, which further validates that the keys are set up correctly in the VM. – Defector
@Dried I am experiencing the same problem. Nothing happens when I try to change the chmod on private_key. Could you detail how to do what you describe? – Gazebo
Use sudo to change the file permissions if nothing happens. Also, the file must be owned by the user who owns the home folder that contains your .ssh directory. – Dried
BAAAH! I just commented out the check in lib/ansible/runner/connection.py.

Then I had to add the following to ansible.cfg:

[ssh_connection]
control_path = /tmp

Lavaliere answered 21/5, 2015 at 13:59

Comments:
(Why is it Ansible's business to perform SSH's checks??) – Lavaliere
Cryptography police: STAY RIGHT WHERE YOU ARE. – Lashandralashar
I had a similar issue and figured out a solution. I added the following entries to my Vagrantfile:

config.ssh.insert_key = false
config.ssh.private_key_path = "~/.vagrant.d/insecure_private_key"

and copied insecure_private_key from my Windows user folder into my Cygwin home at the path above. Afterwards I did a

chmod 700 ~/.vagrant.d/insecure_private_key

and as a last step I emptied this file in my Cygwin home:

~/.ssh/known_hosts

Once I reran the ansible-playbook command, I confirmed the prompt to add my localhost back to known_hosts, and the SSH connection worked.
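A minimal sketch of those steps from a Cygwin shell (the Windows user-folder path is an assumption; substitute your own username):

mkdir -p ~/.vagrant.d
# copy the key from the Windows-side .vagrant.d into the Cygwin home (path assumed)
cp /cygdrive/c/Users/<your-user>/.vagrant.d/insecure_private_key ~/.vagrant.d/
chmod 700 ~/.vagrant.d/insecure_private_key
# empty known_hosts so the host entry is re-added on the next connection
> ~/.ssh/known_hosts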

Burny answered 30/7, 2015 at 9:04
My solution was to override the synced folder's permission settings in the Vagrantfile with the following (block abridged):

Vagrant.configure(2) do |config|
  config.vm.synced_folder "./", "/vagrant",
    owner: "vagrant",
    mount_options: ["dmode=775,fmode=600"]
  ...
end
Autobiography answered 27/12, 2015 at 23:09

Comment:
Simple and functional, I like it. – Osric
Truly, it is much simpler if you understand what is happening:

  1. Vagrant keeps one folder for sharing files between the host and the VM: /vagrant. Everything in it has mode 777 and nothing can be done about that; even sudo chmod will not help, and you cannot change the mode.

  2. Ansible is asking you to reduce the mode so the key is not readable by group or world.

So the fix is as simple as copying the private key out of /vagrant/.vagrant/machines/yourmachine/virtualbox (or whichever provider directory applies) to, say, your home directory (~ or /root), then chmod-ing it to 700 and pointing to it from the inventory entry in your hosts file, as sketched below.
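A minimal sketch of that, run inside the VM (the machine name, destination path, and IP in the inventory line are placeholders):

# copy the key out of the world-readable synced folder, then lock it down
cp /vagrant/.vagrant/machines/yourmachine/virtualbox/private_key ~/private_key
chmod 700 ~/private_key

# hosts (inventory) entry, Ansible 1.x variable names:
# yourmachine ansible_ssh_host=192.168.56.101 ansible_ssh_user=vagrant ansible_ssh_private_key_file=~/private_key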

Karim answered 28/9, 2015 at 6:08
You could use the ansible_local provisioner for Vagrant, which installs Ansible inside the VM. If you work with multiple Vagrant virtual machines, it is useful to let one of them be the Ansible controller; that one then needs the private SSH key. This can be done in the Vagrantfile with:

  config.vm.provision "file", source: "~/.vagrant.d/insecure_private_key", destination: "/home/vagrant/.ssh/id_rsa"
  config.vm.provision "shell", inline: "chmod 600 /home/vagrant/.ssh/id_rsa"
Talaria answered 6/5, 2022 at 9:53
