vagrant-libvirt: vagrant up hangs on reconnecting with new ssh key (remote libvirt server)
Steps to reproduce
- set up remote SSH connection to libvirt server
- vagrant up
Expected behaviour
Vagrant should reconnect to the machine using the new SSH key and then carry on.
Actual behaviour
Vagrant hangs on this step and never reconnects to the machine.
System configuration
OS/Distro version: Arch Linux on both workstation and server
Libvirt version: libvirt-4.6.0-3
Output of vagrant version; vagrant plugin list:
$ vagrant version
Installed Version: 2.1.5
Latest Version: 2.1.5
You're running an up-to-date version of Vagrant!
$ vagrant plugin list
pkg-config (1.3.1, global)
vagrant-libvirt (0.0.43, global)
Output of VAGRANT_LOG=debug vagrant ... --provider=libvirt
DEBUG ssh: == Net-SSH connection debug-level log START ==
DEBUG ssh: D, [2018-09-20T08:59:19.027464 #6638] DEBUG -- net.ssh.transport.session[3fd9fc3f8c78]: establishing connection to 192.168.121.90:22 through proxy
D, [2018-09-20T08:59:19.261438 #6638] DEBUG -- net.ssh.transport.session[3fd9fc3f8c78]: connection established
I, [2018-09-20T08:59:19.261734 #6638] INFO -- net.ssh.transport.server_version[3fd9fc401ef4]: negotiating protocol version
D, [2018-09-20T08:59:19.261835 #6638] DEBUG -- net.ssh.transport.server_version[3fd9fc401ef4]: local is `SSH-2.0-Ruby/Net::SSH_5.0.2 x86_64-linux'
D, [2018-09-20T08:59:19.262048 #6638] DEBUG -- net.ssh.transport.server_version[3fd9fc401ef4]: remote is `SSH-2.0-OpenSSH_7.4'
I, [2018-09-20T08:59:19.262570 #6638] INFO -- net.ssh.transport.algorithms[3fd9fc4011e8]: sending KEXINIT
D, [2018-09-20T08:59:19.263264 #6638] DEBUG -- io[3fd9fc3f83e0]: queueing packet nr 0 type 20 len 1156
D, [2018-09-20T08:59:19.263445 #6638] DEBUG -- io[3fd9fc3f83e0]: sent 1160 bytes
D, [2018-09-20T08:59:19.268106 #6638] DEBUG -- io[3fd9fc3f83e0]: read 1280 bytes
D, [2018-09-20T08:59:19.268331 #6638] DEBUG -- io[3fd9fc3f83e0]: received packet nr 0 type 20 len 1276
I, [2018-09-20T08:59:19.268426 #6638] INFO -- net.ssh.transport.algorithms[3fd9fc4011e8]: got KEXINIT from server
I, [2018-09-20T08:59:19.268771 #6638] INFO -- net.ssh.transport.algorithms[3fd9fc4011e8]: negotiating algorithms
D, [2018-09-20T08:59:19.269003 #6638] DEBUG -- net.ssh.transport.algorithms[3fd9fc4011e8]: negotiated:
* kex: diffie-hellman-group-exchange-sha1
* host_key: ssh-rsa
* encryption_server: aes128-cbc
* encryption_client: aes128-cbc
* hmac_client: hmac-sha1
* hmac_server: hmac-sha1
* compression_client: none
* compression_server: none
* language_client:
* language_server:
D, [2018-09-20T08:59:19.269050 #6638] DEBUG -- net.ssh.transport.algorithms[3fd9fc4011e8]: exchanging keys
D, [2018-09-20T08:59:19.269370 #6638] DEBUG -- io[3fd9fc3f83e0]: queueing packet nr 1 type 34 len 20
D, [2018-09-20T08:59:19.269452 #6638] DEBUG -- io[3fd9fc3f83e0]: sent 24 bytes
D, [2018-09-20T08:59:19.275742 #6638] DEBUG -- io[3fd9fc3f83e0]: read 152 bytes
D, [2018-09-20T08:59:19.275979 #6638] DEBUG -- io[3fd9fc3f83e0]: received packet nr 1 type 31 len 148
D, [2018-09-20T08:59:19.279892 #6638] DEBUG -- io[3fd9fc3f83e0]: queueing packet nr 2 type 32 len 140
D, [2018-09-20T08:59:19.280011 #6638] DEBUG -- io[3fd9fc3f83e0]: sent 144 bytes
D, [2018-09-20T08:59:19.284508 #6638] DEBUG -- io[3fd9fc3f83e0]: read 720 bytes
D, [2018-09-20T08:59:19.284707 #6638] DEBUG -- io[3fd9fc3f83e0]: received packet nr 2 type 33 len 700
D, [2018-09-20T08:59:19.287389 #6638] DEBUG -- io[3fd9fc3f83e0]: queueing packet nr 3 type 21 len 20
D, [2018-09-20T08:59:19.287496 #6638] DEBUG -- io[3fd9fc3f83e0]: sent 24 bytes
D, [2018-09-20T08:59:19.287615 #6638] DEBUG -- io[3fd9fc3f83e0]: received packet nr 3 type 21 len 12
D, [2018-09-20T08:59:19.288268 #6638] DEBUG -- net.ssh.authentication.session[3fd9fc3c3f14]: beginning authentication of `vagrant'
D, [2018-09-20T08:59:19.288474 #6638] DEBUG -- io[3fd9fc3f83e0]: queueing packet nr 4 type 5 len 28
D, [2018-09-20T08:59:19.288610 #6638] DEBUG -- io[3fd9fc3f83e0]: sent 52 bytes
D, [2018-09-20T08:59:19.330664 #6638] DEBUG -- io[3fd9fc3f83e0]: read 52 bytes
D, [2018-09-20T08:59:19.330955 #6638] DEBUG -- io[3fd9fc3f83e0]: received packet nr 4 type 6 len 28
D, [2018-09-20T08:59:19.331176 #6638] DEBUG -- net.ssh.authentication.session[3fd9fc3c3f14]: trying none
D, [2018-09-20T08:59:19.331599 #6638] DEBUG -- io[3fd9fc3f83e0]: queueing packet nr 5 type 50 len 44
D, [2018-09-20T08:59:19.331705 #6638] DEBUG -- io[3fd9fc3f83e0]: sent 68 bytes
D, [2018-09-20T08:59:19.335411 #6638] DEBUG -- io[3fd9fc3f83e0]: read 84 bytes
D, [2018-09-20T08:59:19.335730 #6638] DEBUG -- io[3fd9fc3f83e0]: received packet nr 5 type 51 len 60
D, [2018-09-20T08:59:19.335938 #6638] DEBUG -- net.ssh.authentication.session[3fd9fc3c3f14]: allowed methods: publickey,gssapi-keyex,gssapi-with-mic
D, [2018-09-20T08:59:19.336059 #6638] DEBUG -- net.ssh.authentication.methods.none[3fd9fc3c222c]: none failed
D, [2018-09-20T08:59:19.336159 #6638] DEBUG -- net.ssh.authentication.session[3fd9fc3c3f14]: trying publickey
D, [2018-09-20T08:59:19.336637 #6638] DEBUG -- net.ssh.authentication.agent[3fd9fc3c0670]: connecting to ssh-agent
D, [2018-09-20T08:59:19.336925 #6638] DEBUG -- net.ssh.authentication.agent[3fd9fc3c0670]: sending agent request 1 len 44
D, [2018-09-20T08:59:19.337282 #6638] DEBUG -- net.ssh.authentication.agent[3fd9fc3c0670]: received agent packet 5 len 1
D, [2018-09-20T08:59:19.337406 #6638] DEBUG -- net.ssh.authentication.agent[3fd9fc3c0670]: sending agent request 11 len 0
D, [2018-09-20T08:59:19.337600 #6638] DEBUG -- net.ssh.authentication.agent[3fd9fc3c0670]: received agent packet 12 len 634
D, [2018-09-20T08:59:19.338293 #6638] DEBUG -- net.ssh.authentication.methods.publickey[3fd9fc3c0a30]: trying publickey (dd:3b:b8:2e:85:04:06:e9:ab:ff:a8:0a:c0:04:6e:d6)
D, [2018-09-20T08:59:19.338615 #6638] DEBUG -- io[3fd9fc3f83e0]: queueing packet nr 6 type 50 len 348
D, [2018-09-20T08:59:19.338705 #6638] DEBUG -- io[3fd9fc3f83e0]: sent 372 bytes
D, [2018-09-20T08:59:19.342707 #6638] DEBUG -- io[3fd9fc3f83e0]: read 324 bytes
D, [2018-09-20T08:59:19.342936 #6638] DEBUG -- io[3fd9fc3f83e0]: received packet nr 6 type 60 len 300
D, [2018-09-20T08:59:19.347911 #6638] DEBUG -- io[3fd9fc3f83e0]: queueing packet nr 7 type 50 len 620
D, [2018-09-20T08:59:19.348004 #6638] DEBUG -- io[3fd9fc3f83e0]: sent 644 bytes
D, [2018-09-20T08:59:19.357083 #6638] DEBUG -- io[3fd9fc3f83e0]: read 36 bytes
D, [2018-09-20T08:59:19.357518 #6638] DEBUG -- io[3fd9fc3f83e0]: received packet nr 7 type 52 len 12
D, [2018-09-20T08:59:19.357699 #6638] DEBUG -- net.ssh.authentication.methods.publickey[3fd9fc3c0a30]: publickey succeeded (dd:3b:b8:2e:85:04:06:e9:ab:ff:a8:0a:c0:04:6e:d6)
DEBUG ssh: == Net-SSH connection debug-level log END ==
INFO ssh: SSH is ready!
DEBUG ssh: Re-using SSH connection.
INFO ssh: Execute: (sudo=false)
DEBUG ssh: stderr: 41e57d38-b4f7-4e46-9c38-13873d338b86-vagrant-ssh
DEBUG ssh: Exit status: 0
DEBUG ssh: Checking key permissions: /home/spike/.vagrant.d/insecure_private_key
INFO interface: detail:
Vagrant insecure key detected. Vagrant will automatically replace
this with a newly generated keypair for better security.
INFO interface: detail: default:
default: Vagrant insecure key detected. Vagrant will automatically replace
default: this with a newly generated keypair for better security.
default:
default: Vagrant insecure key detected. Vagrant will automatically replace
default: this with a newly generated keypair for better security.
DEBUG ssh: Checking whether SSH is ready...
DEBUG ssh: Re-using SSH connection.
INFO ssh: SSH is ready!
DEBUG ssh: Re-using SSH connection.
INFO ssh: Execute: (sudo=false)
DEBUG ssh: stderr: 41e57d38-b4f7-4e46-9c38-13873d338b86-vagrant-ssh
DEBUG ssh: Exit status: 0
INFO guest: Autodetecting host type for [#<Vagrant::Machine: default (VagrantPlugins::ProviderLibvirt::Provider)>]
DEBUG guest: Trying: elementary
DEBUG ssh: Re-using SSH connection.
INFO ssh: Execute: if test -r /etc/os-release; then
source /etc/os-release && test 'xelementary' = "x$ID" && exit
fi
if test -x /usr/bin/lsb_release; then
/usr/bin/lsb_release -i 2>/dev/null | grep -qi 'elementary' && exit
fi
if test -r /etc/issue; then
cat /etc/issue | grep -qi 'elementary' && exit
fi
exit 1
(sudo=false)
DEBUG ssh: stderr: 41e57d38-b4f7-4e46-9c38-13873d338b86-vagrant-ssh
DEBUG ssh: Exit status: 1
DEBUG guest: Trying: mint
DEBUG ssh: Re-using SSH connection.
INFO ssh: Execute: if test -r /etc/os-release; then
source /etc/os-release && test 'xLinux Mint' = "x$ID" && exit
fi
if test -x /usr/bin/lsb_release; then
/usr/bin/lsb_release -i 2>/dev/null | grep -qi 'Linux Mint' && exit
fi
if test -r /etc/issue; then
cat /etc/issue | grep -qi 'Linux Mint' && exit
fi
exit 1
(sudo=false)
DEBUG ssh: stderr: 41e57d38-b4f7-4e46-9c38-13873d338b86-vagrant-ssh
DEBUG ssh: Exit status: 1
DEBUG guest: Trying: atomic
DEBUG ssh: Re-using SSH connection.
INFO ssh: Execute: grep 'ostree=' /proc/cmdline (sudo=false)
DEBUG ssh: stderr: 41e57d38-b4f7-4e46-9c38-13873d338b86-vagrant-ssh
DEBUG ssh: Exit status: 1
DEBUG guest: Trying: trisquel
DEBUG ssh: Re-using SSH connection.
INFO ssh: Execute: [ -x /usr/bin/lsb_release ] && /usr/bin/lsb_release -i 2>/dev/null | grep Trisquel (sudo=false)
DEBUG ssh: stderr: 41e57d38-b4f7-4e46-9c38-13873d338b86-vagrant-ssh
DEBUG ssh: Exit status: 1
DEBUG guest: Trying: amazon
DEBUG ssh: Re-using SSH connection.
INFO ssh: Execute: grep 'Amazon Linux' /etc/os-release (sudo=false)
DEBUG ssh: stderr: 41e57d38-b4f7-4e46-9c38-13873d338b86-vagrant-ssh
DEBUG ssh: Exit status: 1
DEBUG guest: Trying: dragonflybsd
DEBUG ssh: Re-using SSH connection.
INFO ssh: Execute: uname -s | grep -i 'DragonFly' (sudo=false)
DEBUG ssh: stderr: 41e57d38-b4f7-4e46-9c38-13873d338b86-vagrant-ssh
DEBUG ssh: Exit status: 1
DEBUG guest: Trying: kali
DEBUG ssh: Re-using SSH connection.
INFO ssh: Execute: if test -r /etc/os-release; then
source /etc/os-release && test 'xkali' = "x$ID" && exit
fi
if test -x /usr/bin/lsb_release; then
/usr/bin/lsb_release -i 2>/dev/null | grep -qi 'kali' && exit
fi
if test -r /etc/issue; then
cat /etc/issue | grep -qi 'kali' && exit
fi
exit 1
(sudo=false)
DEBUG ssh: stderr: 41e57d38-b4f7-4e46-9c38-13873d338b86-vagrant-ssh
DEBUG ssh: Exit status: 1
DEBUG guest: Trying: ubuntu
DEBUG ssh: Re-using SSH connection.
INFO ssh: Execute: if test -r /etc/os-release; then
source /etc/os-release && test 'xubuntu' = "x$ID" && exit
fi
if test -x /usr/bin/lsb_release; then
/usr/bin/lsb_release -i 2>/dev/null | grep -qi 'ubuntu' && exit
fi
if test -r /etc/issue; then
cat /etc/issue | grep -qi 'ubuntu' && exit
fi
exit 1
(sudo=false)
DEBUG ssh: stderr: 41e57d38-b4f7-4e46-9c38-13873d338b86-vagrant-ssh
DEBUG ssh: Exit status: 1
DEBUG guest: Trying: fedora
DEBUG ssh: Re-using SSH connection.
INFO ssh: Execute: grep 'Fedora release' /etc/redhat-release (sudo=false)
DEBUG ssh: stderr: 41e57d38-b4f7-4e46-9c38-13873d338b86-vagrant-ssh
DEBUG ssh: Exit status: 1
DEBUG guest: Trying: alt
DEBUG ssh: Re-using SSH connection.
INFO ssh: Execute: cat /etc/altlinux-release (sudo=false)
DEBUG ssh: stderr: 41e57d38-b4f7-4e46-9c38-13873d338b86-vagrant-ssh
DEBUG ssh: stderr: cat: /etc/altlinux-release: No such file or directory
DEBUG ssh: Exit status: 1
DEBUG guest: Trying: pld
DEBUG ssh: Re-using SSH connection.
INFO ssh: Execute: cat /etc/pld-release (sudo=false)
DEBUG ssh: stderr: 41e57d38-b4f7-4e46-9c38-13873d338b86-vagrant-ssh
DEBUG ssh: stderr: cat: /etc/pld-release: No such file or directory
DEBUG ssh: Exit status: 1
DEBUG guest: Trying: funtoo
DEBUG ssh: Re-using SSH connection.
INFO ssh: Execute: grep Funtoo /etc/gentoo-release (sudo=false)
DEBUG ssh: stderr: 41e57d38-b4f7-4e46-9c38-13873d338b86-vagrant-ssh
DEBUG ssh: stderr: grep: /etc/gentoo-release: No such file or directory
DEBUG ssh: Exit status: 2
DEBUG guest: Trying: tinycore
DEBUG ssh: Re-using SSH connection.
INFO ssh: Execute: if test -r /etc/os-release; then
source /etc/os-release && test 'xCore Linux' = "x$ID" && exit
fi
if test -x /usr/bin/lsb_release; then
/usr/bin/lsb_release -i 2>/dev/null | grep -qi 'Core Linux' && exit
fi
if test -r /etc/issue; then
cat /etc/issue | grep -qi 'Core Linux' && exit
fi
exit 1
(sudo=false)
DEBUG ssh: stderr: 41e57d38-b4f7-4e46-9c38-13873d338b86-vagrant-ssh
DEBUG ssh: Exit status: 1
DEBUG guest: Trying: netbsd
DEBUG ssh: Re-using SSH connection.
INFO ssh: Execute: uname -s | grep NetBSD (sudo=false)
DEBUG ssh: stderr: 41e57d38-b4f7-4e46-9c38-13873d338b86-vagrant-ssh
DEBUG ssh: Exit status: 1
DEBUG guest: Trying: gentoo
DEBUG ssh: Re-using SSH connection.
INFO ssh: Execute: grep Gentoo /etc/gentoo-release (sudo=false)
DEBUG ssh: stderr: 41e57d38-b4f7-4e46-9c38-13873d338b86-vagrant-ssh
DEBUG ssh: stderr: grep: /etc/gentoo-release
DEBUG ssh: stderr: : No such file or directory
DEBUG ssh: Exit status: 2
DEBUG guest: Trying: omnios
DEBUG ssh: Re-using SSH connection.
INFO ssh: Execute: cat /etc/release | grep -i OmniOS (sudo=false)
DEBUG ssh: stderr: 41e57d38-b4f7-4e46-9c38-13873d338b86-vagrant-ssh
DEBUG ssh: stderr: cat: /etc/release: No such file or directory
DEBUG ssh: Exit status: 1
DEBUG guest: Trying: redhat
DEBUG ssh: Re-using SSH connection.
INFO ssh: Execute: cat /etc/redhat-release (sudo=false)
DEBUG ssh: stderr: 41e57d38-b4f7-4e46-9c38-13873d338b86-vagrant-ssh
DEBUG ssh: Exit status: 0
INFO guest: Detected: redhat!
DEBUG guest: Searching for cap: insert_public_key
DEBUG guest: Checking in: redhat
DEBUG guest: Checking in: linux
DEBUG guest: Found cap: insert_public_key in linux
DEBUG ssh: Checking whether SSH is ready...
DEBUG ssh: Re-using SSH connection.
INFO ssh: SSH is ready!
DEBUG ssh: Re-using SSH connection.
INFO ssh: Execute: (sudo=false)
DEBUG ssh: stderr: 41e57d38-b4f7-4e46-9c38-13873d338b86-vagrant-ssh
DEBUG ssh: Exit status: 0
DEBUG guest: Searching for cap: remove_public_key
DEBUG guest: Checking in: redhat
DEBUG guest: Checking in: linux
DEBUG guest: Found cap: remove_public_key in linux
INFO ssh: Inserting key to avoid password: ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDEd+fyoYAkspBr+dpU1Y8kiS0fw6HsUD8tbxcTIS3en5z++1JRk4RY3tlvC+cMseH4sEwLOKsCbFaEectbFNGBYlUzIPBT3uao5SNUlsRLFj+u93incOG2EdEbap6syKrXd5u/IvWMuCkAlmxemx/bk9lQXkaRb2cMrWU47RcQCQ7Li4ELxtmjdeVFxOudsxiEV/sAT3Q7yNVfxWDiYxClYdN0KbfrK32F/+7r0LPtDyxX/sMgWJ+pA8sS2SmiT6C5ZO/xEdcwFFyaOujwiQwgkCgbVubWCTK7BxJZl1VCMjArITl1Sznd4oNmnNsLIsLsms85fsQAsFNm1gLI+44p vagrant
INFO interface: detail:
Inserting generated public key within guest...
INFO interface: detail: default:
default: Inserting generated public key within guest...
default:
default: Inserting generated public key within guest...
DEBUG ssh: Checking whether SSH is ready...
DEBUG ssh: Re-using SSH connection.
INFO ssh: SSH is ready!
DEBUG ssh: Re-using SSH connection.
INFO ssh: Execute: (sudo=false)
DEBUG ssh: stderr: 41e57d38-b4f7-4e46-9c38-13873d338b86-vagrant-ssh
DEBUG ssh: Exit status: 0
DEBUG guest: Searching for cap: insert_public_key
DEBUG guest: Checking in: redhat
DEBUG guest: Checking in: linux
DEBUG guest: Found cap: insert_public_key in linux
INFO guest: Execute capability: insert_public_key [#<Vagrant::Machine: default (VagrantPlugins::ProviderLibvirt::Provider)>, "ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDEd+fyoYAkspBr+dpU1Y8kiS0fw6HsUD8tbxcTIS3en5z++1JRk4RY3tlvC+cMseH4sEwLOKsCbFaEectbFNGBYlUzIPBT3uao5SNUlsRLFj+u93incOG2EdEbap6syKrXd5u/IvWMuCkAlmxemx/bk9lQXkaRb2cMrWU47RcQCQ7Li4ELxtmjdeVFxOudsxiEV/sAT3Q7yNVfxWDiYxClYdN0KbfrK32F/+7r0LPtDyxX/sMgWJ+pA8sS2SmiT6C5ZO/xEdcwFFyaOujwiQwgkCgbVubWCTK7BxJZl1VCMjArITl1Sznd4oNmnNsLIsLsms85fsQAsFNm1gLI+44p vagrant"] (redhat)
DEBUG ssh: Uploading: /tmp/vagrant-linux-insert-public-key20180920-6638-b8fvfl to /tmp/vagrant-insert-pubkey-1537448362
DEBUG ssh: Re-using SSH connection.
DEBUG ssh: Re-using SSH connection.
INFO ssh: Execute: mkdir -p ~/.ssh
chmod 0700 ~/.ssh
cat '/tmp/vagrant-insert-pubkey-1537448362' >> ~/.ssh/authorized_keys && chmod 0600 ~/.ssh/authorized_keys
result=$?
rm -f '/tmp/vagrant-insert-pubkey-1537448362'
exit $result
(sudo=false)
DEBUG ssh: stderr: 41e57d38-b4f7-4e46-9c38-13873d338b86-vagrant-ssh
DEBUG ssh: Exit status: 0
DEBUG host: Searching for cap: set_ssh_key_permissions
DEBUG host: Checking in: arch
DEBUG host: Checking in: linux
DEBUG host: Found cap: set_ssh_key_permissions in linux
DEBUG host: Searching for cap: set_ssh_key_permissions
DEBUG host: Checking in: arch
DEBUG host: Checking in: linux
DEBUG host: Found cap: set_ssh_key_permissions in linux
INFO host: Execute capability: set_ssh_key_permissions [#<Vagrant::Environment: /home/spike/Vagrant/tyrellcorp>, #<Pathname:/home/spike/Vagrant/tyrellcorp/.vagrant/machines/default/libvirt/private_key>] (arch)
INFO interface: detail: Removing insecure key from the guest if it's present...
INFO interface: detail: default: Removing insecure key from the guest if it's present...
default: Removing insecure key from the guest if it's present...
DEBUG ssh: Checking whether SSH is ready...
DEBUG ssh: Re-using SSH connection.
INFO ssh: SSH is ready!
DEBUG ssh: Re-using SSH connection.
INFO ssh: Execute: (sudo=false)
DEBUG ssh: stderr: 41e57d38-b4f7-4e46-9c38-13873d338b86-vagrant-ssh
DEBUG ssh: Exit status: 0
DEBUG guest: Searching for cap: remove_public_key
DEBUG guest: Checking in: redhat
DEBUG guest: Checking in: linux
DEBUG guest: Found cap: remove_public_key in linux
INFO guest: Execute capability: remove_public_key [#<Vagrant::Machine: default (VagrantPlugins::ProviderLibvirt::Provider)>, "ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA6NF8iallvQVp22WDkTkyrtvp9eWW6A8YVr+kz4TjGYe7gHzIw+niNltGEFHzD8+v1I2YJ6oXevct1YeS0o9HZyN1Q9qgCgzUFtdOKLv6IedplqoPkcmF0aYet2PkEDo3MlTBckFXPITAMzF8dJSIFo9D8HfdOV0IAdx4O7PtixWKn5y2hMNG0zQPyUecp4pzC6kivAIhyfHilFR61RGL+GPXQ2MWZWFYbAGjyiYJnAmCP3NOTd0jMZEnDkbUvxhMmBYSdETk1rRgm+R4LOzFUGaHqHDLKLX+FIPKcF96hrucXzcWyLbIbEgE98OHlnVYCzRdK8jlqm8tehUc9c9WhQ== vagrant insecure public key"] (redhat)
DEBUG ssh: Uploading: /tmp/vagrant-linux-remove-public-key20180920-6638-1ga9i76 to /tmp/vagrant-remove-pubkey-1537448363
DEBUG ssh: Re-using SSH connection.
DEBUG ssh: Re-using SSH connection.
INFO ssh: Execute: if test -f ~/.ssh/authorized_keys; then
grep -v -x -f '/tmp/vagrant-remove-pubkey-1537448363' ~/.ssh/authorized_keys > ~/.ssh/authorized_keys.tmp
mv ~/.ssh/authorized_keys.tmp ~/.ssh/authorized_keys && chmod 0600 ~/.ssh/authorized_keys
result=$?
fi
rm -f '/tmp/vagrant-remove-pubkey-1537448363'
exit $result
(sudo=false)
DEBUG ssh: stderr: 41e57d38-b4f7-4e46-9c38-13873d338b86-vagrant-ssh
DEBUG ssh: Exit status: 0
INFO interface: detail: Key inserted! Disconnecting and reconnecting using new SSH key...
INFO interface: detail: default: Key inserted! Disconnecting and reconnecting using new SSH key...
default: Key inserted! Disconnecting and reconnecting using new SSH key...
^C INFO interface: warn: Waiting for cleanup before exiting...
DEBUG ssh: Checking whether SSH is ready...
INFO interface: warn: ==> default: Waiting for cleanup before exiting...
==> default: Waiting for cleanup before exiting...
ERROR warden: Error occurred: Call to virDomainLookupByUUID failed: Cannot write data: Broken pipe
INFO warden: Beginning recovery process...
INFO warden: Calling recover: #<VagrantPlugins::ProviderLibvirt::Action::WaitTillUp:0x00007fb3f8984028>
ERROR warden: Error occurred: Call to virDomainLookupByUUID failed: internal error: client socket is closed
INFO warden: Beginning recovery process...
INFO warden: Calling recover: #<VagrantPlugins::ProviderLibvirt::Action::WaitTillUp:0x00007fb3f8984028>
ERROR warden: Error occurred: Call to virDomainLookupByUUID failed: internal error: client socket is closed
INFO warden: Beginning recovery process...
INFO warden: Calling recover: #<VagrantPlugins::ProviderLibvirt::Action::WaitTillUp:0x00007fb3f8984028>
ERROR warden: Error occurred: Call to virDomainLookupByUUID failed: internal error: client socket is closed
INFO warden: Beginning recovery process...
INFO warden: Calling recover: #<VagrantPlugins::ProviderLibvirt::Action::WaitTillUp:0x00007fb3f8984028>
ERROR warden: Error occurred: Call to virDomainLookupByUUID failed: internal error: client socket is closed
INFO warden: Beginning recovery process...
INFO warden: Calling recover: #<VagrantPlugins::ProviderLibvirt::Action::WaitTillUp:0x00007fb3f8984028>
ERROR warden: Error occurred: Call to virDomainLookupByUUID failed: internal error: client socket is closed
INFO warden: Beginning recovery process...
INFO warden: Calling recover: #<VagrantPlugins::ProviderLibvirt::Action::WaitTillUp:0x00007fb3f8984028>
ERROR warden: Error occurred: Call to virDomainLookupByUUID failed: internal error: client socket is closed
INFO warden: Beginning recovery process...
INFO warden: Calling recover: #<VagrantPlugins::ProviderLibvirt::Action::WaitTillUp:0x00007fb3f8984028>
ERROR warden: Error occurred: Call to virDomainLookupByUUID failed: internal error: client socket is closed
INFO warden: Beginning recovery process...
INFO warden: Calling recover: #<VagrantPlugins::ProviderLibvirt::Action::WaitTillUp:0x00007fb3f8984028>
ERROR warden: Error occurred: Call to virDomainLookupByUUID failed: internal error: client socket is closed
INFO warden: Beginning recovery process...
INFO warden: Calling recover: #<VagrantPlugins::ProviderLibvirt::Action::WaitTillUp:0x00007fb3f8984028>
ERROR warden: Error occurred: Call to virDomainLookupByUUID failed: internal error: client socket is closed
INFO warden: Beginning recovery process...
INFO warden: Calling recover: #<VagrantPlugins::ProviderLibvirt::Action::WaitTillUp:0x00007fb3f8984028>
ERROR warden: Error occurred: Call to virDomainLookupByUUID failed: internal error: client socket is closed
INFO warden: Beginning recovery process...
INFO warden: Calling recover: #<VagrantPlugins::ProviderLibvirt::Action::WaitTillUp:0x00007fb3f8984028>
ERROR warden: Error occurred: Call to virDomainLookupByUUID failed: internal error: client socket is closed
INFO warden: Beginning recovery process...
INFO warden: Calling recover: #<VagrantPlugins::ProviderLibvirt::Action::WaitTillUp:0x00007fb3f8984028>
ERROR warden: Error occurred: Call to virDomainLookupByUUID failed: internal error: client socket is closed
INFO warden: Beginning recovery process...
INFO warden: Calling recover: #<VagrantPlugins::ProviderLibvirt::Action::WaitTillUp:0x00007fb3f8984028>
ERROR warden: Error occurred: Call to virDomainLookupByUUID failed: internal error: client socket is closed
INFO warden: Beginning recovery process...
INFO warden: Calling recover: #<VagrantPlugins::ProviderLibvirt::Action::WaitTillUp:0x00007fb3f8984028>
ERROR warden: Error occurred: Call to virDomainLookupByUUID failed: internal error: client socket is closed
INFO warden: Beginning recovery process...
INFO warden: Calling recover: #<VagrantPlugins::ProviderLibvirt::Action::WaitTillUp:0x00007fb3f8984028>
ERROR warden: Error occurred: Call to virDomainLookupByUUID failed: internal error: client socket is closed
INFO warden: Beginning recovery process...
INFO warden: Calling recover: #<VagrantPlugins::ProviderLibvirt::Action::WaitTillUp:0x00007fb3f8984028>
ERROR warden: Error occurred: Call to virDomainLookupByUUID failed: internal error: client socket is closed
INFO warden: Beginning recovery process...
INFO warden: Calling recover: #<VagrantPlugins::ProviderLibvirt::Action::WaitTillUp:0x00007fb3f8984028>
ERROR warden: Error occurred: Call to virDomainLookupByUUID failed: internal error: client socket is closed
INFO warden: Beginning recovery process...
INFO warden: Calling recover: #<VagrantPlugins::ProviderLibvirt::Action::WaitTillUp:0x00007fb3f8984028>
ERROR warden: Error occurred: Call to virDomainLookupByUUID failed: internal error: client socket is closed
INFO warden: Beginning recovery process...
INFO warden: Recovery complete.
ERROR warden: Error occurred: Call to virDomainLookupByUUID failed: internal error: client socket is closed
INFO warden: Beginning recovery process...
INFO warden: Calling recover: #<Vagrant::Action::Builtin::Call:0x000055d5a9900d88>
INFO warden: Beginning recovery process...
INFO warden: Calling recover: #<VagrantPlugins::ProviderLibvirt::Action::WaitTillUp:0x00007fb3f8984028>
ERROR warden: Error occurred: Call to virDomainLookupByUUID failed: internal error: client socket is closed
INFO warden: Beginning recovery process...
INFO warden: Calling recover: #<Vagrant::Action::Builtin::Call:0x000055d5a9900d88>
INFO warden: Beginning recovery process...
INFO warden: Calling recover: #<VagrantPlugins::ProviderLibvirt::Action::WaitTillUp:0x00007fb3f8984028>
ERROR warden: Error occurred: Call to virDomainLookupByUUID failed: internal error: client socket is closed
INFO warden: Beginning recovery process...
INFO warden: Calling recover: #<Vagrant::Action::Builtin::Call:0x000055d5a9900d88>
INFO warden: Beginning recovery process...
INFO warden: Calling recover: #<VagrantPlugins::ProviderLibvirt::Action::WaitTillUp:0x00007fb3f8984028>
INFO environment: Released process lock: machine-action-3f7d73f31ece305c4ef7ad5f17d11177
INFO environment: Running hook: environment_unload
INFO runner: Preparing hooks for middleware sequence...
INFO runner: 2 hooks defined.
INFO runner: Running action: environment_unload #<Vagrant::Action::Builder:0x000055d5a93e6730>
Traceback (most recent call last):
26: from /opt/vagrant/embedded/gems/gems/vagrant-2.1.5/lib/vagrant/batch_action.rb:82:in `block (2 levels) in run'
25: from /opt/vagrant/embedded/gems/gems/vagrant-2.1.5/lib/vagrant/machine.rb:194:in `action'
24: from /opt/vagrant/embedded/gems/gems/vagrant-2.1.5/lib/vagrant/machine.rb:194:in `call'
23: from /opt/vagrant/embedded/gems/gems/vagrant-2.1.5/lib/vagrant/environment.rb:614:in `lock'
22: from /opt/vagrant/embedded/gems/gems/vagrant-2.1.5/lib/vagrant/machine.rb:208:in `block in action'
21: from /opt/vagrant/embedded/gems/gems/vagrant-2.1.5/lib/vagrant/machine.rb:239:in `action_raw'
20: from /opt/vagrant/embedded/gems/gems/vagrant-2.1.5/lib/vagrant/action/runner.rb:66:in `run'
19: from /opt/vagrant/embedded/gems/gems/vagrant-2.1.5/lib/vagrant/util/busy.rb:19:in `busy'
18: from /opt/vagrant/embedded/gems/gems/vagrant-2.1.5/lib/vagrant/action/runner.rb:66:in `block in run'
17: from /opt/vagrant/embedded/gems/gems/vagrant-2.1.5/lib/vagrant/action/builder.rb:116:in `call'
16: from /opt/vagrant/embedded/gems/gems/vagrant-2.1.5/lib/vagrant/action/warden.rb:28:in `call'
15: from /opt/vagrant/embedded/gems/gems/vagrant-2.1.5/lib/vagrant/action/warden.rb:53:in `rescue in call'
14: from /opt/vagrant/embedded/gems/gems/vagrant-2.1.5/lib/vagrant/action/warden.rb:64:in `recover'
13: from /opt/vagrant/embedded/gems/gems/vagrant-2.1.5/lib/vagrant/action/warden.rb:64:in `each'
12: from /opt/vagrant/embedded/gems/gems/vagrant-2.1.5/lib/vagrant/action/warden.rb:67:in `block in recover'
11: from /opt/vagrant/embedded/gems/gems/vagrant-2.1.5/lib/vagrant/action/builtin/call.rb:61:in `recover'
10: from /opt/vagrant/embedded/gems/gems/vagrant-2.1.5/lib/vagrant/action/warden.rb:64:in `recover'
9: from /opt/vagrant/embedded/gems/gems/vagrant-2.1.5/lib/vagrant/action/warden.rb:64:in `each'
8: from /opt/vagrant/embedded/gems/gems/vagrant-2.1.5/lib/vagrant/action/warden.rb:67:in `block in recover'
7: from /home/spike/.vagrant.d/gems/2.5.1/gems/vagrant-libvirt-0.0.43/lib/vagrant-libvirt/action/wait_till_up.rb:84:in `recover'
6: from /home/spike/.vagrant.d/gems/2.5.1/gems/vagrant-libvirt-0.0.43/lib/vagrant-libvirt/action/wait_till_up.rb:88:in `terminate'
5: from /home/spike/.vagrant.d/gems/2.5.1/gems/vagrant-libvirt-0.0.43/lib/vagrant-libvirt/provider.rb:101:in `state'
4: from /home/spike/.vagrant.d/gems/2.5.1/gems/vagrant-libvirt-0.0.43/lib/vagrant-libvirt/driver.rb:66:in `created?'
3: from /home/spike/.vagrant.d/gems/2.5.1/gems/vagrant-libvirt-0.0.43/lib/vagrant-libvirt/driver.rb:52:in `get_domain'
2: from /home/spike/.vagrant.d/gems/2.5.1/gems/fog-libvirt-0.5.0/lib/fog/libvirt/models/compute/servers.rb:15:in `get'
1: from /home/spike/.vagrant.d/gems/2.5.1/gems/fog-libvirt-0.5.0/lib/fog/libvirt/requests/compute/list_domains.rb:9:in `list_domains'
/home/spike/.vagrant.d/gems/2.5.1/gems/fog-libvirt-0.5.0/lib/fog/libvirt/requests/compute/list_domains.rb:9:in `lookup_domain_by_uuid': Call to virDomainLookupByUUID failed: internal error: client socket is closed (Libvirt::RetrieveError)
A Vagrantfile to reproduce the issue:
# -*- mode: ruby -*-
# vi: set ft=ruby :
Vagrant.configure("2") do |config|
  config.vm.box = "centos/7"

  config.vm.provider :libvirt do |libvirt|
    libvirt.host = "xxx.xxx.xxx.xx"
    libvirt.connect_via_ssh = "true"
    libvirt.username = "user"
    libvirt.id_ssh_key_file = "/home/user/my_private_key"
    libvirt.storage_pool_name = "VMs"
    libvirt.cpus = 2
    libvirt.memory = 2048
  end

  config.vm.provision "shell", inline: <<-SHELL
    yum install -y epel-release
    yum install -y wget vim git
    wget https://raw.githubusercontent.com/thestinger/termite/master/termite.terminfo
    tic -x termite.terminfo
    yum install -y docker-compose docker
    systemctl start docker
  SHELL
end
Are you using the upstream vagrant package or your distro's package? Distro
Additional Details
I also want to add that even though I have to Ctrl-C to get out of the hang, the box still works. I am able to vagrant ssh into the machine and to use other things like vagrant provision. So as best I can tell, the key is actually being reinserted, but for some reason the host machine has an issue connecting to the remote server during that process.
About this issue
- Original URL
- State: closed
- Created 6 years ago
- Comments: 21 (10 by maintainers)
Commits related to this issue
- Change proxy_command to use embedded OpenSSH functionality instead of (#1222) Solves vagrant not detecting end of ssh connection when a proxy is used, described in #921 Allows ssh argument popula... — committed to vagrant-libvirt/vagrant-libvirt by electrofelix 3 years ago
- Move proxy_command to config and support templating Migrate the proxy_command specification to the config and add support for user override template to be used for edge cases. Moving it to the config... — committed to vagrant-libvirt/vagrant-libvirt by electrofelix 3 years ago
- Move proxy_command to config and support templating (#1226) Migrate the proxy_command specification to the config and add support for user override template to be used for edge cases. Moving it to t... — committed to vagrant-libvirt/vagrant-libvirt by electrofelix 3 years ago
I just ran into this. I think https://github.com/vagrant-libvirt/vagrant-libvirt/issues/989 might have as well.
Trying the `-q0` modification from https://github.com/vagrant-libvirt/vagrant-libvirt/issues/921#issuecomment-464334757 seems to have fixed it.

I hope the fix can get merged into production soon. This bug blocks network configuration, so if you’re setting static IPs, you’re out of luck. Not to mention general provisioning steps.
@pigam that’s very useful to know; unfortunately I’m pretty certain that the `-N` option is not common across nc implementations, so it might not be suitable.
@pigam @binaryronin would you be able to test by modifying the code in `~/.vagrant.d/gems/<vagrant-ruby-version>/gems/vagrant-libvirt-<version>/lib/vagrant-libvirt/provider.rb` to change the following block of code: https://github.com/vagrant-libvirt/vagrant-libvirt/blob/c02905be14f95ddbc13d91ae39ff3061f7021fd5/lib/vagrant-libvirt/provider.rb#L71-L78
Replace the `nc %h %p` with either `-W %h:%p` or `nc -q0 %h %p`, and see if it works consistently when using the config option `connect_via_ssh = true`.

May need to consider how to allow users to explicitly control the `proxy_command` a bit better here, potentially by performing substitutions on a provided string, and maybe flagging a warning in the output that using a remote libvirt host may have issues with the set proxy_command if it doesn’t react to EOF as expected.
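For anyone testing this, here is a minimal sketch of the suggested edit. The surrounding block is paraphrased from the linked provider.rb lines, so the exact code in your installed version may differ slightly; the only intended change is the final argument of the proxy command:

if @machine.provider_config.connect_via_ssh
  ssh_info[:proxy_command] =
    "ssh '#{@machine.provider_config.host}' " \
    "-l '#{@machine.provider_config.username}' " \
    "-i '#{@machine.provider_config.id_ssh_key_file}' " \
    '-W %h:%p' # was: 'nc %h %p'; -W uses OpenSSH's built-in stdio forwarding
  # Alternatively, keep nc but make it exit on EOF
  # (on nc implementations that support -q):
  #   'nc -q0 %h %p'
end

Either variant makes the proxied connection shut down when Vagrant closes the session to swap keys, which is the EOF behaviour the default `nc %h %p` misses on some netcat implementations; the commits listed above ultimately went with the embedded OpenSSH `-W` approach and made the proxy_command user-configurable.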