[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed when using KUBECONFIG env in config.load_kube_config(config_file= · Issue #2329 · kubernetes-client/python · GitHub

Open
d33psky opened this issue Jan 22, 2025 · 3 comments
Labels
kind/bug Categorizes issue or PR as related to a bug. lifecycle/stale Denotes an issue or PR has remained open with no activity and has become stale.

Comments

d33psky commented Jan 22, 2025

I do not have permission to reopen #1767, but the issue persists into 2025 and needs a fix or a generic workaround.

What happened (please include outputs or screenshots):
Setting the config_file argument of config.load_kube_config to point to a kube config different from the default ~/.kube/config fails with

urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='XXX', port=16443): Max retries exceeded with url: /api/v1/nodes (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1000)')))

(with the host replaced by XXX), while kubectl can use the same file just fine via the KUBECONFIG env var.

What you expected to happen:
kubernetes-client/python should be able to use a different config than the default one.

How to reproduce it (as minimally and precisely as possible):
Have a KUBECONFIG environment variable set to a kube config file different from the default ~/.kube/config, then run:

>>> import kubernetes
>>> import os
>>> os.path.exists(os.environ["KUBECONFIG"])
True
>>> kubernetes.config.load_kube_config(os.environ["KUBECONFIG"])
>>> v1 = kubernetes.client.CoreV1Api()
>>> v1.list_node()

This throws the SSLCertVerificationError error shown above.
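For context on where the CA bundle comes from: load_kube_config decodes the cluster's certificate-authority-data field from the kubeconfig into a temporary file and points the client's ssl_ca_cert at it, so if that field is missing or does not match the server's certificate chain, verification fails as shown above. A minimal stdlib-only sketch of that decode step (the cluster entry and certificate bytes below are fake placeholders, not a real kubeconfig or CA):

```python
import base64
import os
import tempfile

# Hypothetical stand-in for a kubeconfig "cluster" entry; the certificate
# bytes are fake placeholders, not a real CA.
fake_pem = b"-----BEGIN CERTIFICATE-----\nMIIB...\n-----END CERTIFICATE-----\n"
cluster = {
    "server": "https://127.0.0.1:16443",
    "certificate-authority-data": base64.b64encode(fake_pem).decode(),
}

# load_kube_config decodes certificate-authority-data to a temp file and
# uses that path as the CA bundle for TLS verification; a mismatch with
# the server's chain produces SSLCertVerificationError.
pem = base64.b64decode(cluster["certificate-authority-data"])
with tempfile.NamedTemporaryFile(delete=False, suffix=".pem") as f:
    f.write(pem)
print(f.name, os.path.exists(f.name))
```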

Anything else we need to know?:
Original ticket is #1767

Environment:

  • Kubernetes version (kubectl version): Client Version: v1.31.5, Kustomize Version: v5.4.2, Server Version: v1.30.8
  • OS (e.g., MacOS 10.13.6): Ubuntu 22.04.5 LTS
  • Python version (python --version): Python 3.12.3
  • Python client version (pip list | grep kubernetes): kubernetes 31.0.0
@d33psky d33psky added the kind/bug Categorizes issue or PR as related to a bug. label Jan 22, 2025
rossigee commented Jan 26, 2025

I hit this recently too. I think I understand what is going on, and I've put together a rough PR that solves it for me. Will submit shortly.

rossigee commented Jan 26, 2025

@d33psky - it seems it can be done like this (no patch necessary)...

    import os
    import kubernetes

    # Load the kubeconfig pointed to by KUBECONFIG, then override the CA
    # bundle on a copy of the default configuration.
    kubernetes.config.load_kube_config(os.environ["KUBECONFIG"])
    configuration = kubernetes.client.Configuration.get_default_copy()
    configuration.ssl_ca_cert = "/etc/ssl/certs/ca-certificates.crt"
    api_client = kubernetes.client.ApiClient(configuration=configuration)
    v1 = kubernetes.client.CoreV1Api(api_client)
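A note on that hard-coded path: /etc/ssl/certs/ca-certificates.crt is the Debian/Ubuntu system CA bundle. On other platforms the equivalent location can be discovered with the standard library before plugging it into ssl_ca_cert (a small stdlib sketch, nothing kubernetes-specific):

```python
import ssl

# Ask OpenSSL where it looks for trusted CAs by default; on Ubuntu the
# cafile typically resolves to /etc/ssl/certs/ca-certificates.crt.
paths = ssl.get_default_verify_paths()
print("cafile:", paths.cafile)
print("capath:", paths.capath)
print("openssl_cafile:", paths.openssl_cafile)
```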

@k8s-triage-robot

The Kubernetes project currently lacks enough contributors to adequately respond to all issues.

This bot triages un-triaged issues according to the following rules:

  • After 90d of inactivity, lifecycle/stale is applied
  • After 30d of inactivity since lifecycle/stale was applied, lifecycle/rotten is applied
  • After 30d of inactivity since lifecycle/rotten was applied, the issue is closed

You can:

  • Mark this issue as fresh with /remove-lifecycle stale
  • Close this issue with /close
  • Offer to help out with Issue Triage

Please send feedback to sig-contributor-experience at kubernetes/community.

/lifecycle stale

@k8s-ci-robot k8s-ci-robot added the lifecycle/stale Denotes an issue or PR has remained open with no activity and has become stale. label Apr 26, 2025