Getting HTTPError: HTTP Error 403: Forbidden when trying to load California Housing dataset · Issue #28297 · scikit-learn/scikit-learn · GitHub

Getting HTTPError: HTTP Error 403: Forbidden when trying to load California Housing dataset #28297


Closed
AryamanBhatia opened this issue Jan 28, 2024 · 36 comments

@AryamanBhatia

Describe the bug

When trying to load the dataset I get an error.

Steps/Code to Reproduce

from sklearn.datasets import fetch_california_housing
from sklearn.model_selection import train_test_split 
from sklearn.preprocessing import StandardScaler

housing = fetch_california_housing()

X_train_full, X_test, y_train_full, y_test = train_test_split(
    housing.data, housing.target)
X_train, X_valid, y_train, y_valid = train_test_split(X_train_full, y_train_full)

scaler = StandardScaler()
X_train_scaled = scaler.fit_transform(X_train)
X_valid_scaled = scaler.transform(X_valid)
X_test_scaled = scaler.transform(X_test)

Expected Results

Dataset loads

Actual Results

HTTPError                                 Traceback (most recent call last)
/var/folders/wx/mz49j6yd5514yjn5k60sk6900000gn/T/ipykernel_16344/1379907178.py in <module>
      3 from sklearn.preprocessing import StandardScaler
      4 
----> 5 housing = fetch_california_housing()
      6 
      7 X_train_full, X_test, y_train_full, y_test = train_test_split(

~/opt/anaconda3/lib/python3.9/site-packages/sklearn/datasets/_california_housing.py in fetch_california_housing(data_home, download_if_missing, return_X_y, as_frame)
    133     This dataset consists of 20,640 samples and 9 features.
    134     """
--> 135     data_home = get_data_home(data_home=data_home)
    136     if not exists(data_home):
    137         makedirs(data_home)

~/opt/anaconda3/lib/python3.9/site-packages/sklearn/datasets/_base.py in _fetch_remote(remote, dirname)

~/opt/anaconda3/lib/python3.9/urllib/request.py in urlretrieve(url, filename, reporthook, data)
    237     url_type, path = _splittype(url)
    238 
--> 239     with contextlib.closing(urlopen(url, data)) as fp:
    240         headers = fp.info()
    241 
...
--> 641         raise HTTPError(req.full_url, code, msg, hdrs, fp)
    642 
    643 class HTTPRedirectHandler(BaseHandler):

HTTPError: HTTP Error 403: Forbidden

Versions

System:
    python: 3.9.13 (main, Aug 25 2022, 18:29:29)  [Clang 12.0.0 ]
executable: /Users/aryamanbhatia/opt/anaconda3/bin/python
   machine: macOS-10.16-x86_64-i386-64bit

Python dependencies:
          pip: 22.2.2
   setuptools: 63.4.1
      sklearn: 1.0.2
        numpy: 1.26.3
        scipy: 1.9.1
       Cython: 0.29.32
       pandas: 1.4.4
   matplotlib: 3.5.2
       joblib: 1.1.0
threadpoolctl: 2.2.0

Built with OpenMP: True
@AryamanBhatia added the Bug and Needs Triage labels on Jan 28, 2024
@glemaitre
Member

I cannot reproduce. Either it was something transient or it is on your network side.

@minkhantDoctoral

I got the same error on Google Colab too. Someone said this bug occurs very frequently; sometimes it works, sometimes it doesn't. Is there any solution to it?

@AsierMM
AsierMM commented May 14, 2024

I tried to change the account and it worked. If you change the account you are working with and run the same code, it should be fine.

@tdtu98
tdtu98 commented May 25, 2024

Got the same error, don't know how to fix it.

@Diyago
Diyago commented May 25, 2024

Same here; it seems there has been no fix for half a year.

@Diyago
Diyago commented May 25, 2024

[screenshot]

@lemeb
lemeb commented May 27, 2024

Same here! (Based in France FYI, but using Colab)

@mateors6
mateors6 commented Jun 7, 2024

I got the same problem in Spain, using Colab. But if I run it on Spyder it works fine!

@glemaitre
Member
glemaitre commented Jun 7, 2024

I'm reopening this issue because we have too many reported issues.

@glemaitre glemaitre reopened this Jun 7, 2024
@glemaitre
Member

I would propose to store the dataset directly on OpenML and call fetch_openml internally.
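As a sketch of that idea, a generic fallback helper could look like the following. This is illustrative only: `fetch_with_fallback` is a hypothetical name, not scikit-learn API, and an OpenML mirror of the dataset is the proposal above, not something that exists yet.

```python
# Illustrative sketch: call a primary dataset loader and, only when it
# fails with HTTP 403, retry through a fallback loader. The function
# name and fallback strategy are hypothetical, not scikit-learn API.
from urllib.error import HTTPError


def fetch_with_fallback(primary, fallback):
    """Call `primary()`; if it raises an HTTP 403 error, call `fallback()`."""
    try:
        return primary()
    except HTTPError as exc:
        if exc.code == 403:
            return fallback()
        raise  # any other HTTP error is re-raised unchanged
```

For example, `primary` could be `sklearn.datasets.fetch_california_housing` and `fallback` a loader built on `fetch_openml`, once the dataset is mirrored there.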

@lesteve
Member
lesteve commented Jun 7, 2024

I was able to reproduce the HTTP 403 inside Colab. Note this is not only fetch_california_housing but all scikit-learn datasets on Figshare, so fetch_covtype, fetch_rcv1, etc ...

❯ git grep 'figshare'
sklearn/datasets/_california_housing.py:    url="https://ndownloader.figshare.com/files/5976036",
sklearn/datasets/_covtype.py:    url="https://ndownloader.figshare.com/files/5976039",
sklearn/datasets/_kddcup99.py:    url="https://ndownloader.figshare.com/files/5976045",
sklearn/datasets/_kddcup99.py:    url="https://ndownloader.figshare.com/files/5976042",
sklearn/datasets/_lfw.py:    url="https://ndownloader.figshare.com/files/5976018",
sklearn/datasets/_lfw.py:    url="https://ndownloader.figshare.com/files/5976015",
sklearn/datasets/_lfw.py:        url="https://ndownloader.figshare.com/files/5976012",
sklearn/datasets/_lfw.py:        url="https://ndownloader.figshare.com/files/5976009",
sklearn/datasets/_lfw.py:        url="https://ndownloader.figshare.com/files/5976006",
sklearn/datasets/_olivetti_faces.py:    url="https://ndownloader.figshare.com/files/5976027",
sklearn/datasets/_rcv1.py:        url="https://ndownloader.figshare.com/files/5976069",
sklearn/datasets/_rcv1.py:        url="https://ndownloader.figshare.com/files/5976066",
sklearn/datasets/_rcv1.py:        url="https://ndownloader.figshare.com/files/5976063",
sklearn/datasets/_rcv1.py:        url="https://ndownloader.figshare.com/files/5976060",
sklearn/datasets/_rcv1.py:        url="https://ndownloader.figshare.com/files/5976057",
sklearn/datasets/_rcv1.py:    url="https://ndownloader.figshare.com/files/5976048",
sklearn/datasets/_species_distributions.py:    url="https://ndownloader.figshare.com/files/5976075",
sklearn/datasets/_species_distributions.py:    url="https://ndownloader.figshare.com/files/5976078",
sklearn/datasets/_twenty_newsgroups.py:    url="https://ndownloader.figshare.com/files/5975967"

I am wondering whether Figshare is blocking some IP addresses; the HTTP 403 Forbidden is a bit unexpected ...

Retrying a few times, it seems to be quite consistent, i.e. if you keep retrying the problem does not go away.

@glemaitre
Member

I have contacted Figshare support to see whether they are aware of some limitation on their side.

@Ummyers
Ummyers commented Jun 8, 2024

[screenshot, 2024-06-08]

Same here! Using Google Colab, Mexico City.

@intagliated

I am also facing the same issue.

[screenshot]

As @AsierMM mentioned, changing the account fixes it. I hope there is a better fix :)

@ogrisel removed the Needs Triage label on Jun 10, 2024
@ogrisel
Member
ogrisel commented Jun 10, 2024

@glemaitre did you receive any feedback from Figshare? Is there a public discussion we can link to in order to monitor the resolution of this problem?

@ogrisel
Member
ogrisel commented Jun 10, 2024

Related: googlecolab/colabtools#4601

@glemaitre
Member

@glemaitre did you receive any feedback from figshare? Is there a public discussion to link to to monitor the resolution of this problem?

I have a ticket on the tracker: https://support.figshare.com/support/tickets/481164 but I don't think you can see it if you are not logged in.

For the moment, they are just asking me whether this is related to an outage period by checking the "status page" (https://status.figshare.com/). I'll answer that this is not the case.

@glemaitre
Member

Here, for tracking, is the discussion.

My answer:

It is really unlikely that this is linked to an outage or a service disruption. We got multiple reports. Be aware that this is not limited to our specific file: googlecolab/colabtools#4601
At first, one pattern we could observe was that the 403 error usually impacted people in Asia.
With Google Colab, this is more difficult to know since we don't have information about the node (and asking our users to debug this is quite tedious).
But since we observe this error quite repeatedly, it makes me think that this is really not related to service disruptions.

And the support answer:

The content seems to be restricted in parts of Asia, but not from our side.
We had people contacting us regarding this and once they switch to a VPN or use a different network they can access the file.
The issue is not from our side.
If there's anything else we can assist you with, please let us know.

@lesteve
Member
lesteve commented Jun 10, 2024

FWIW I tried again on Google Colab today and sklearn.datasets.fetch_california_housing worked ... In both working and non-working cases, the IP address I was getting inside Colab seemed to come from the US, according to websites like https://iplocation.com/.

Below I post the output of wget inside a Colab session where fetch_california_housing works. Not sure if there is a command that would be more useful to run next time we can reproduce the issue.

# Use !wget with ! at the beginning inside Colab
!wget https://ndownloader.figshare.com/files/5976036
--2024-06-10 14:27:58--  https://ndownloader.figshare.com/files/5976036
Resolving ndownloader.figshare.com (ndownloader.figshare.com)... 34.247.104.150, 34.248.16.15, 52.209.182.24, ...
Connecting to ndownloader.figshare.com (ndownloader.figshare.com)|34.247.104.150|:443... connected.
HTTP request sent, awaiting response... 302 Found
Location: https://s3-eu-west-1.amazonaws.com/pfigshare-u-files/5976036/cal_housing.tgz?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAIYCQYOYV5JSSROOA/20240610/eu-west-1/s3/aws4_request&X-Amz-Date=20240610T142758Z&X-Amz-Expires=10&X-Amz-SignedHeaders=host&X-Amz-Signature=70632a1b1291980f200baf2baa7aae34459f94a56d641f0a22c9291663dd04cc [following]
--2024-06-10 14:27:58--  https://s3-eu-west-1.amazonaws.com/pfigshare-u-files/5976036/cal_housing.tgz?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAIYCQYOYV5JSSROOA/20240610/eu-west-1/s3/aws4_request&X-Amz-Date=20240610T142758Z&X-Amz-Expires=10&X-Amz-SignedHeaders=host&X-Amz-Signature=70632a1b1291980f200baf2baa7aae34459f94a56d641f0a22c9291663dd04cc
Resolving s3-eu-west-1.amazonaws.com (s3-eu-west-1.amazonaws.com)... 52.92.36.48, 52.92.0.64, 52.218.28.243, ...
Connecting to s3-eu-west-1.amazonaws.com (s3-eu-west-1.amazonaws.com)|52.92.36.48|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 441963 (432K) [binary/octet-stream]
Saving to: ‘5976036’

5976036             100%[===================>] 431.60K  1.01MB/s    in 0.4s    

2024-06-10 14:27:59 (1.01 MB/s) - ‘5976036’ saved [441963/441963]

Since it seems like the request gets an HTTP 302 from ndownloader.figshare.com and is then redirected to an S3 bucket on AWS, a wild guess is that the HTTP 403 was coming from AWS. Edit: maybe not; at least the googlecolab/colabtools#4601 screenshot seems to indicate that the HTTP 403 happens on ndownloader.figshare.com.
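One way to check where the 403 originates (a sketch, assuming network access; `NoRedirect` and `first_hop_status` are made-up helper names, not library API) is to issue the request without following the 302:

```python
# Sketch: probe the figshare URL without following redirects, so we can
# tell which server (figshare or the S3 bucket) returns the HTTP 403.
import urllib.request
from urllib.error import HTTPError


class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # returning None makes urllib raise instead of following


def first_hop_status(url, timeout=10):
    """Return (status, Location header) of the first response, redirects not followed."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        with opener.open(url, timeout=timeout) as resp:
            return resp.status, None
    except HTTPError as exc:
        return exc.code, exc.headers.get("Location")
```

A 302 with an S3 `Location` at this first hop would mean figshare itself answered normally; a 403 here would pin the block on ndownloader.figshare.com.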

@lesteve
Member
lesteve commented Jun 11, 2024

This also happens sporadically in CI, by the way, e.g. this build failed with the error:

____________________________ [doctest] compose.rst _____________________________
285 the regressor that will be used for prediction, and the transformer that will
286 be applied to the target variable::
287 
288   >>> import numpy as np
289   >>> from sklearn.datasets import fetch_california_housing
290   >>> from sklearn.compose import TransformedTargetRegressor
291   >>> from sklearn.preprocessing import QuantileTransformer
292   >>> from sklearn.linear_model import LinearRegression
293   >>> from sklearn.model_selection import train_test_split
294   >>> X, y = fetch_california_housing(return_X_y=True)
UNEXPECTED EXCEPTION: <HTTPError 403: 'Forbidden'>
Traceback (most recent call last):
  File "/usr/share/miniconda/envs/testvenv/lib/python3.9/doctest.py", line 1334, in __run
    exec(compile(example.source, filename, "single",
  File "<doctest compose.rst[59]>", line 1, in <module>
  File "/home/vsts/work/1/s/sklearn/utils/_param_validation.py", line 213, in wrapper
    return func(*args, **kwargs)
  File "/home/vsts/work/1/s/sklearn/datasets/_california_housing.py", line 177, in fetch_california_housing
    archive_path = _fetch_remote(
  File "/home/vsts/work/1/s/sklearn/datasets/_base.py", line 1466, in _fetch_remote
    urlretrieve(remote.url, file_path)
  File "/usr/share/miniconda/envs/testvenv/lib/python3.9/urllib/request.py", line 239, in urlretrieve
    with contextlib.closing(urlopen(url, data)) as fp:
  File "/usr/share/miniconda/envs/testvenv/lib/python3.9/urllib/request.py", line 214, in urlopen
    return opener.open(url, data, timeout)
  File "/usr/share/miniconda/envs/testvenv/lib/python3.9/urllib/request.py", line 523, in open
    response = meth(req, response)
  File "/usr/share/miniconda/envs/testvenv/lib/python3.9/urllib/request.py", line 632, in http_response
    response = self.parent.error(
  File "/usr/share/miniconda/envs/testvenv/lib/python3.9/urllib/request.py", line 561, in error
    return self._call_chain(*args)
  File "/usr/share/miniconda/envs/testvenv/lib/python3.9/urllib/request.py", line 494, in _call_chain
    result = func(*args)
  File "/usr/share/miniconda/envs/testvenv/lib/python3.9/urllib/request.py", line 641, in http_error_default
    raise HTTPError(req.full_url, code, msg, hdrs, fp)
urllib.error.HTTPError: HTTP Error 403: Forbidden
/home/vsts/work/1/s/doc/modules/compose.rst:294: UnexpectedException

@prithivcendrol

[screenshot]

The same happens to me; I don't know why it is happening.

@ogrisel
Member
ogrisel commented Jun 11, 2024

@prithivcendrol could you please inspect some information about the public IP address of the host querying Figshare when the 403 error happens, with a tool like:

  • from the Linux/macOS command line:
curl ipinfo.io
  • from a Jupyter notebook cell on a Linux/macOS machine with curl installed:
!curl ipinfo.io
  • from a Python script (any OS):
from urllib.request import urlopen
import json
from pprint import pprint

pprint(json.load(urlopen("https://ipinfo.io")))

That would help us understand whether the problem is specific to queries coming from specific cloud data centers or regions of the world.

Note that some information returned by this command may be sensitive; please feel free to edit things out and keep only coarse-grained info.

@ogrisel
Member
ogrisel commented Jun 11, 2024

For information, I could just reproduce a 403 when running:

from sklearn.datasets import fetch_california_housing

fetch_california_housing()

on google colab and here is the ipinfo of nodes where I observed the 403 error:

{'city': 'Las Vegas',
 'country': 'US',
 'hostname': '220.127.125.34.bc.googleusercontent.com',
 'ip': '34.125.127.220',
 'loc': '36.1750,-115.1372',
 'org': 'AS396982 Google LLC',
 'postal': '89111',
 'readme': 'https://ipinfo.io/missingauth',
 'region': 'Nevada',
 'timezone': 'America/Los_Angeles'}
{'city': 'Las Vegas',
 'country': 'US',
 'hostname': '221.230.125.34.bc.googleusercontent.com',
 'ip': '34.125.230.221',
 'loc': '36.1750,-115.1372',
 'org': 'AS396982 Google LLC',
 'postal': '89111',
 'readme': 'https://ipinfo.io/missingauth',
 'region': 'Nevada',
 'timezone': 'America/Los_Angeles'}
{'city': 'Council Bluffs',
 'country': 'US',
 'hostname': '229.228.121.34.bc.googleusercontent.com',
 'ip': '34.121.228.229',
 'loc': '41.2619,-95.8608',
 'org': 'AS396982 Google LLC',
 'postal': '51502',
 'readme': 'https://ipinfo.io/missingauth',
 'region': 'Iowa',
 'timezone': 'America/Chicago'}
{'city': 'Groningen',
 'country': 'NL',
 'hostname': '219.242.91.34.bc.googleusercontent.com',
 'ip': '34.91.242.219',
 'loc': '53.2192,6.5667',
 'org': 'AS396982 Google LLC',
 'postal': '9711',
 'readme': 'https://ipinfo.io/missingauth',
 'region': 'Groningen',
 'timezone': 'Europe/Amsterdam'}
{'city': 'North Charleston',
 'country': 'US',
 'hostname': '218.47.74.34.bc.googleusercontent.com',
 'ip': '34.74.47.218',
 'loc': '32.8546,-79.9748',
 'org': 'AS396982 Google LLC',
 'postal': '29415',
 'readme': 'https://ipinfo.io/missingauth',
 'region': 'South Carolina',
 'timezone': 'America/New_York'}
{'city': 'Salt Lake City',
 'country': 'US',
 'hostname': '146.190.106.34.bc.googleusercontent.com',
 'ip': '34.106.190.146',
 'loc': '40.7608,-111.8911',
 'org': 'AS396982 Google LLC',
 'postal': '84101',
 'readme': 'https://ipinfo.io/missingauth',
 'region': 'Utah',
 'timezone': 'America/Denver'}

so this confirms that this is not Asia-specific.

EDIT: added more examples of ipinfo outputs from nodes with 403 errors.

@ogrisel
Member
ogrisel commented Jun 11, 2024

I restarted a colab session and tried the same. This time it worked (no 403 error) from a data-center in Taipei:

{'city': 'Taipei',
 'country': 'TW',
 'hostname': '228.229.221.35.bc.googleusercontent.com',
 'ip': '35.221.229.228',
 'loc': '25.0478,121.5319',
 'org': 'AS396982 Google LLC',
 'readme': 'https://ipinfo.io/missingauth',
 'region': 'Taiwan',
 'timezone': 'Asia/Taipei'}

Here are other examples of google colab hosts where I got no error:

{'city': 'The Dalles',
 'country': 'US',
 'hostname': '90.6.247.35.bc.googleusercontent.com',
 'ip': '35.247.6.90',
 'loc': '45.5946,-121.1787',
 'org': 'AS396982 Google LLC',
 'postal': '97058',
 'readme': 'https://ipinfo.io/missingauth',
 'region': 'Oregon',
 'timezone': 'America/Los_Angeles'}
{'city': 'North Charleston',
 'country': 'US',
 'hostname': '215.122.196.104.bc.googleusercontent.com',
 'ip': '104.196.122.215',
 'loc': '32.8546,-79.9748',
 'org': 'AS396982 Google LLC',
 'postal': '29415',
 'readme': 'https://ipinfo.io/missingauth',
 'region': 'South Carolina',
 'timezone': 'America/New_York'}
{'city': 'North Charleston',
 'country': 'US',
 'hostname': '189.78.148.34.bc.googleusercontent.com',
 'ip': '34.148.78.189',
 'loc': '32.8546,-79.9748',
 'org': 'AS396982 Google LLC',
 'postal': '29415',
 'readme': 'https://ipinfo.io/missingauth',
 'region': 'South Carolina',
 'timezone': 'America/New_York'}
{'city': 'North Charleston',
 'country': 'US',
 'hostname': '112.105.23.34.bc.googleusercontent.com',
 'ip': '34.23.105.112',
 'loc': '32.8546,-79.9748',
 'org': 'AS396982 Google LLC',
 'postal': '29415',
 'readme': 'https://ipinfo.io/missingauth',
 'region': 'South Carolina',
 'timezone': 'America/New_York'}
{'city': 'Washington',
 'country': 'US',
 'hostname': '97.202.150.34.bc.googleusercontent.com',
 'ip': '34.150.202.97',
 'loc': '38.8951,-77.0364',
 'org': 'AS396982 Google LLC',
 'postal': '20004',
 'readme': 'https://ipinfo.io/missingauth',
 'region': 'Washington, D.C.',
 'timezone': 'America/New_York'}
{'city': 'Council Bluffs',
 'country': 'US',
 'hostname': '244.131.42.34.bc.googleusercontent.com',
 'ip': '34.42.131.244',
 'loc': '41.2619,-95.8608',
 'org': 'AS396982 Google LLC',
 'postal': '51502',
 'readme': 'https://ipinfo.io/missingauth',
 'region': 'Iowa',
 'timezone': 'America/Chicago'}

@ogrisel
Member
ogrisel commented Jun 11, 2024

Note that I got one session from North Charleston that failed (403) (with IP address 34.74.47.218) and others that worked (with IP addresses 104.196.122.215, 34.148.78.189 and 34.23.105.112).

So this might not even be related to specific regions or data centers, but rather to specific IP addresses.

@lesteve
Member
lesteve commented Jun 11, 2024

@ogrisel you may want to indicate how you managed to restart a Colab session and get a different IP address, which may be a reasonable short-term work-around for people who encounter this issue, although I guess it depends a bit on how often you end up on a node with HTTP 403 issues ...

In my naive attempts, I was always getting the same IP address inside Colab, but probably this is because I don't know Colab very well ...

@ogrisel
Member
ogrisel commented Jun 11, 2024

Indeed, it's easy to disconnect the runtime (and then reconnect by re-executing the cells from the start of the notebook) using the following drop-down menu:

[screenshot of the drop-down menu]

You have roughly a 50/50 chance of being rebalanced to a node where the HTTP requests go through.

@gigit0000

It's strange: at the same time, and even from the same local machine, the code runs perfectly on Colab but fails on Kaggle. Maybe this info can help troubleshoot the issue.

@Abdullah-R

If you're experiencing this on Google Colab, the error seems to disappear if you switch to a TPU runtime (different hosting servers, I guess?).

@lesteve
Member
lesteve commented Jun 18, 2024

If you're experiencing this on Google Colab, the error seems to disappear if you switch to a TPU runtime (different servers hosting I guess?).

I think this is a work-around similar to disconnecting the runtime mentioned in #28297 (comment), although a wild guess is that the TPU servers are less likely to be blocked (at least for now).

@lesteve
Member
lesteve commented Jun 18, 2024

It's strange - the code runs perfectly on Colab but fails on Kaggle, at the same time even on the same local machine. Maybe this info can help troubleshoot the issue.

I tried on Kaggle notebooks and it seems like the problem indeed happens from time to time. The work-around is to click on the power button (Stop session), re-execute the code (you will get a different IP address), and cross your fingers 🤞; make sure you have internet enabled as well (just in case).

The debugging command I used is the following one, which downloads the file needed for California housing:

!wget https://ndownloader.figshare.com/files/5976036 --timeout 3

Here is a screenshot where it did not work (HTTP 403)
[screenshot]

Here is a screenshot where it did work (HTTP 200):
[screenshot]
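For reference, a rough Python equivalent of that wget probe could look like this (a sketch; the `probe` helper and its `opener` parameter are invented for illustration and testability):

```python
# Sketch: report the HTTP status code observed when fetching a URL.
# The `opener` parameter exists only to make the helper easy to test
# without network access; it defaults to urllib's urlopen.
from urllib.error import HTTPError
from urllib.request import urlopen


def probe(url, timeout=3, opener=urlopen):
    """Return the HTTP status code observed when fetching `url`."""
    try:
        with opener(url, timeout=timeout) as resp:
            return resp.status
    except HTTPError as exc:
        return exc.code
```

For example, `probe("https://ndownloader.figshare.com/files/5976036")` should return 200 on an unblocked node and 403 on a blocked one.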

@gigit0000

I tried on Kaggle notebooks and it seems like indeed the problem happens from time to time. The work-around is to click on the power button (Stop session), reexecute the code (you will get a different IP address) and cross your fingers 🤞,

@lesteve Thanks for your comment. Yes, that's exactly what I did. I just reported it to possibly help troubleshoot this issue :)

@lesteve
Member
lesteve commented Jun 19, 2024

Some kind of summary comment:

@gigit0000

@lesteve Thank you!

@lesteve
Member
lesteve commented Jul 3, 2024

I got an answer from Figshare support saying that they fixed the issue. I tried roughly 10 times on Colab and the same on Kaggle notebooks and I was not able to reproduce the issue 🎉.

I am going to close this one; if you encounter the issue again, please comment in this issue!

You can also try the documented work-arounds in #28297 (comment) and mention it in case they do not fix the issue.

@lesteve lesteve closed this as completed Jul 3, 2024
@gigit0000

@lesteve Great support - thank you!!!
