Welcome to the Linux Foundation Forum!

SSH Smoke Test section - SSH Credential issue

Although I tested the connection to the VM to confirm the SSH configuration, and entered the private key for the sshUser several times, I still get the authentication failure identified in this ticket: https://issues.jenkins.io/browse/JENKINS-67564
The SSH Pipeline Steps plugin is documented as being built on top of the Groovy SSH library; does it rely on the SSH Credentials plugin?


  • gouravshah
    gouravshah Posts: 95

    To me it appears to be an issue on the side of the server you are connecting to. Please check the SSH daemon config, /etc/ssh/sshd_config, to see whether key-based authentication and root login are enabled. If you make changes to this config, you will also need to restart the SSH daemon for them to take effect.
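    For reference, a config fragment showing what the relevant sshd_config directives might look like (the values shown are examples, not necessarily what your VM needs):

    ```
    # /etc/ssh/sshd_config -- example directives
    PubkeyAuthentication yes
    PermitRootLogin prohibit-password

    # After editing, restart the daemon, e.g.:
    #   sudo systemctl restart sshd
    ```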

  • bauger
    bauger Posts: 37

    Thanks for your reply.
    For the last couple of days I have tried different options and got completely confused by the following points:
    1. Why do we need to store the SSH private key in Jenkins, and not the public key as in the video example where we connect to the dev VM from a personal laptop? The following article helped me understand why storing credentials in Jenkins is better practice than exporting the key to the build agent and hard-coding it in a script: https://support.cloudbees.com/hc/en-us/articles/222838288. At this point I am still not clear whether reversing the roles of the public and private keys means that an SSH daemon is embedded in the Jenkins master node, nor what the purpose of the ~/.ssh/authorized_keys file is (specifically if we declare the public key in the Google Cloud console).
    2. The Jenkins plugin documentation is sometimes very light; consequently I mixed up the purpose of the SSH build agent plugin with the SSH Pipeline Steps plugin. In my attempts to get a pod running the script, the only successful one used a jenkins/ssh-agent container declared through the podTemplate feature. Passing the pod template label defined through the UI (in the node management/cloud interface) and pointing to a jenkins/inbound-agent container, as suggested by the Kubernetes plugin documentation, does not work: when the pipeline build launches, the pods terminate quickly and new ones are triggered consecutively. Declaring a container pointing to jenkins/inbound-agent in the pipeline script has the same effect. In the jenkins/ssh-agent case (see code below), I used the same public key to open a connection between the transient pod and the dev VM to run the inspec command; that does not seem right...
    3. At some point I was also confused by the notion of ssh-agent; combined with all the previous points, my eyes started to squint from reading so much documentation...

    I think a diagram explaining the intertwined usage of SSH public/private keys in the current infrastructure (Jenkins running on Kubernetes, with transient build agent pods provisioned to connect to the master node through JNLP or SSH) would be helpful.


    Pipeline script:
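    The original snippet did not survive here; a minimal sketch of the podTemplate + SSH Pipeline Steps approach described above might look like the following (the credential ID 'dev-vm-ssh-key', the host name, and the inspec profile path are placeholders). Note that here the private key comes from the Jenkins credential store rather than reusing the public key:

    ```groovy
    // Sketch only -- credential ID, host, image and inspec profile path are assumptions.
    podTemplate(containers: [
        containerTemplate(name: 'ssh-client', image: 'jenkins/ssh-agent',
                          command: 'sleep', args: '99d')
    ]) {
        node(POD_LABEL) {
            container('ssh-client') {
                // Pull the private key from the Jenkins credential store
                withCredentials([sshUserPrivateKey(credentialsId: 'dev-vm-ssh-key',
                                                   keyFileVariable: 'KEYFILE',
                                                   usernameVariable: 'SSHUSER')]) {
                    // SSH Pipeline Steps plugin: describe the remote, then run a command on it
                    def remote = [name: 'dev-vm', host: 'dev-vm.example.com',
                                  user: env.SSHUSER, identityFile: env.KEYFILE,
                                  allowAnyHosts: true]
                    sshCommand remote: remote, command: 'inspec exec /path/to/profile'
                }
            }
        }
    }
    ```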

  • gouravshah
    gouravshah Posts: 95

    @bauger fundamentally, you need the private key configured on the side that initiates the connection (in this case the Jenkins host). You configure the public key on the destination that accepts the connection. Once this point is clear, the rest should fall into place.
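    As a concrete sketch (file names, user and host are arbitrary examples): generate a keypair on the initiating side, keep the private half there (in Jenkins it goes into a credential), and append the public half to the destination user's ~/.ssh/authorized_keys:

    ```shell
    # Generate an ed25519 keypair with no passphrase (demo names, run on the initiator).
    ssh-keygen -t ed25519 -N "" -f ./demo_key -C "jenkins-demo"

    # demo_key      -> PRIVATE key: stays on the initiator; in Jenkins, store it as an
    #                  "SSH Username with private key" credential.
    # demo_key.pub  -> PUBLIC key: goes to the destination, appended to
    #                  ~/.ssh/authorized_keys for the account you log in as, e.g.:
    #   ssh-copy-id -i ./demo_key.pub sshUser@dev-vm
    cat demo_key.pub
    ```
    
    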

    It's great to see you going through so much documentation and trying to understand how it works. I would recommend joining the office hours on Mon/Tue, so that I can answer all the questions you have and help you get more clarity the fastest possible way.
