elasticsearch-py: ssl verification fails despite verify_certs=false

In elasticsearch version 6.6.1 and elasticsearch-dsl version 6.1.0, SSL verification seems to ignore the verify_certs option. When set to False, the cert is still verified and verification fails on self-signed certs.

With elasticsearch 5.5.1 and elasticsearch-dsl 5.4.0, the verify_certs option works as expected.

client = Elasticsearch(hosts=['localhost'], verify_certs=False, timeout=60)

elasticsearch.exceptions.SSLError: ConnectionError([SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:777)) caused by: SSLError([SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:777))

About this issue

  • State: closed
  • Created 6 years ago
  • Reactions: 3
  • Comments: 28 (8 by maintainers)

Most upvoted comments

I went through the debugger a bunch and found that verify_certs is ignored when ca_certs is None (None is taken as “use defaults”, which results in cert verification being set to required). Simply set ca_certs to a falsy value other than None and it should work.

es = Elasticsearch("https://user:pass@myelasticsearch",
                   ca_certs=False,
                   verify_certs=False)

This seems to be an issue with the underlying Python library, but it’s difficult to figure that out due to the way keyword args are passed around in the Elasticsearch library.
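The None-vs-False distinction described above boils down to a pattern like this (a simplified sketch of the reported behavior, not the library's actual code; `pick_verify_mode` is a hypothetical helper):

```python
import ssl

def pick_verify_mode(ca_certs, verify_certs):
    # Hypothetical sketch of the reported behavior: ca_certs=None is
    # treated as "use defaults", which forces CERT_REQUIRED regardless
    # of verify_certs.
    if ca_certs is None:
        return ssl.CERT_REQUIRED
    return ssl.CERT_REQUIRED if verify_certs else ssl.CERT_NONE

# verify_certs=False is ignored while ca_certs is None...
assert pick_verify_mode(None, False) == ssl.CERT_REQUIRED
# ...but honored once ca_certs is any non-None falsy value.
assert pick_verify_mode(False, False) == ssl.CERT_NONE
```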

@gnarlyman thanks for the issue and the good eye. I’ll get this fixed asap.

But please note that the use of verify_certs is deprecated.

Please try creating an ssl_context object and set the verification mode on the context.

import ssl
from elasticsearch import Elasticsearch
from elasticsearch.connection import create_ssl_context

# use `cafile`, `cadata`, or `capath` to set your CA or CAs
ssl_context = create_ssl_context()
ssl_context.check_hostname = False
ssl_context.verify_mode = ssl.CERT_NONE

es = Elasticsearch('localhost', ssl_context=ssl_context, timeout=60)
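For reference, `create_ssl_context` is essentially a thin wrapper over the standard library, so an equivalent context can be built with `ssl` directly (a sketch; note that `check_hostname` must be disabled before `verify_mode` can be set to `CERT_NONE`, otherwise Python raises a `ValueError`):

```python
import ssl

context = ssl.create_default_context()
# Order matters: check_hostname must be turned off first, or setting
# verify_mode to CERT_NONE raises ValueError.
context.check_hostname = False
context.verify_mode = ssl.CERT_NONE
```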

I experienced a similar problem, and the following approach seems to work for elasticsearch 6.3.1 and urllib3 1.25.3.

from elasticsearch import Elasticsearch, RequestsHttpConnection

es = Elasticsearch([{'host': 'https://admin:admin@localhost:9200'}], 
                   verify_certs=False,
                   connection_class=RequestsHttpConnection)

In a nutshell, the default connection class is Urllib3HttpConnection, which raises the exception below:

elasticsearch.exceptions.SSLError: ConnectionError([SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:720)) caused by: SSLError([SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:720))

If the connection class is set to RequestsHttpConnection, just a warning message will appear: UserWarning: Connecting to https://localhost:9200 using SSL with verify_certs=False is insecure.

Hi @geajack ,

Have you tried the following code?

from elasticsearch import Elasticsearch, RequestsHttpConnection

es = Elasticsearch([{'host': 'https://admin:admin@localhost:9200'}], 
                   verify_certs=False,
                   connection_class=RequestsHttpConnection)

find more details here: https://github.com/elastic/elasticsearch-py/issues/712#issuecomment-497251933

For me, it only worked after removing the list and dict and simply using the raw connection string. ES version 6.6.0 and elasticsearch6 (6.4.2) Python package.

from elasticsearch import Elasticsearch, RequestsHttpConnection

es = Elasticsearch('https://admin:admin@localhost:9200', 
                   verify_certs=False,
                   connection_class=RequestsHttpConnection)

I gave the workaround a try and I could not make it work. I tried two variations, one with a cafile (from certifi) and one without a cafile when creating the SSL context (+ I also explicitly set verify_certs to False).

This is my test program:

import ssl
from elasticsearch import Elasticsearch
from elasticsearch.connection import create_ssl_context


def main():
    # no cafile!
    ssl_context = create_ssl_context()
    ssl_context.check_hostname = False
    ssl_context.verify_mode = ssl.CERT_NONE

    es = Elasticsearch(hosts=[{'host': 'localhost', 'port': 39200}],
                       scheme="https",
                       # to ensure that it does not use the default value `True`
                       verify_certs=False,
                       ssl_context=ssl_context,
                       http_auth=("rally", "rally-password"))
    es.info()


if __name__ == '__main__':
    main()

It always fails with:

elasticsearch.exceptions.SSLError: ConnectionError([SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:777)) caused by: SSLError([SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:777))

When running this in the REPL, I noticed that ssl_context.verify_mode had been set back to VerifyMode.CERT_REQUIRED after the (failing) call to es.info().

Tbh, I did not completely debug the issue. I think the reason is that when creating Urllib3HttpConnection, ca_certs is always set, and further down the line urllib3 overrides the verification mode again when a certificate is provided.

Did anyone manage to make it work with python3.5 and latest urllib3?

@vibha0411 , you can try downgrading elasticsearch to 7.9.1. It helped in my case. The new version is too strict.

I’m still having trouble with this. I have the Open Distro for Elasticsearch Docker container running on port 9200. I can get through to it with curl:

jack@Tower:~$ curl https://admin:admin@localhost:9200 --insecure
{
  "name" : "70dcf08de37f",
  "cluster_name" : "docker-cluster",
  "cluster_uuid" : "Ag1T5K2MR-aX8DbgWeO0AQ",
  "version" : {
    "number" : "7.1.1",
    "build_flavor" : "oss",
    "build_type" : "tar",
    "build_hash" : "7a013de",
    "build_date" : "2019-05-23T14:04:00.380842Z",
    "build_snapshot" : false,
    "lucene_version" : "8.0.0",
    "minimum_wire_compatibility_version" : "6.8.0",
    "minimum_index_compatibility_version" : "6.0.0-beta1"
  },
  "tagline" : "You Know, for Search"
}

Note that the connection is HTTPS, but the --insecure option tells curl not to check the certificate. However, this code fails in Python:

from elasticsearch import Elasticsearch

client = Elasticsearch(
    "https://admin:admin@localhost:9200",
    verify_certs=False
)

client.indices.create(index="sessions")

yielding the same error as in the OP:

elasticsearch.exceptions.SSLError: ConnectionError([SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1056)) caused by: SSLError([SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1056))

I’m using the latest version of elasticsearch as far as I know; I installed it with pip on Python 3.7 just today.