:lock: :robot: CI Update lock files for main CI build(s) :lock: :robot: by scikit-learn-bot · Pull Request #31429 · scikit-learn/scikit-learn · GitHub

🔒 🤖 CI Update lock files for main CI build(s) 🔒 🤖 #31429

Merged

Conversation

scikit-learn-bot
Contributor

Update lock files.

Note

If the CI tasks fail, create a new branch based on this PR and add the required fixes to that branch.

@scikit-learn-bot scikit-learn-bot force-pushed the auto-update-lock-files-main branch from 341b1a7 to d6a3d2a on May 26, 2025 05:05
github-actions bot commented May 26, 2025

✔️ Linting Passed

All linting checks passed. Your pull request is in excellent shape! ☀️

Generated for commit: 8ad1667. Link to the linter CI: here

@ogrisel
Member
ogrisel commented May 27, 2025

For the record, there is a single test failure happening in test_solver_consistency[seed1-20-float32-0.1-sag-None] on a macOS runner:

```
sklearn/linear_model/tests/test_ridge.py:783
>       assert_allclose(ridge.coef_, svd_ridge.coef_, atol=1e-3, rtol=1e-3)

...

E       AssertionError:
E       Not equal to tolerance rtol=0.001, atol=0.001
E
E       Mismatched elements: 1 / 30 (3.33%)
E       Max absolute difference among violations: 0.00572298
E       Max relative difference among violations: 0.0024261
```

The tolerance levels are already quite high for the float32 parametrization of the test, yet the worst violation is only about 2.5x the tolerance level.
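
For context, the failing assertion comes from a solver-consistency check that fits Ridge with an iterative solver and compares its coefficients against the direct "svd" solution. The sketch below is a simplified, hypothetical version of such a check; the dataset shape, alpha, max_iter and tol values are illustrative and not the actual test parametrization:

```python
# Minimal sketch of a Ridge solver-consistency check (simplified; the real
# test_solver_consistency is parametrized over seeds, dtypes, alphas and solvers).
import numpy as np
from numpy.testing import assert_allclose

from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge

X, y = make_regression(n_samples=100, n_features=30, random_state=1)
X, y = X.astype(np.float32), y.astype(np.float32)

# Reference coefficients from the direct "svd" solver.
svd_ridge = Ridge(alpha=0.1, solver="svd").fit(X, y)

# Iterative "sag" solver: its result depends on tol, max_iter and
# random_state (sample shuffling).
ridge = Ridge(
    alpha=0.1, solver="sag", max_iter=100_000, tol=1e-10, random_state=0
).fit(X, y)

# Both solvers should agree within a loose tolerance for float32 data;
# this is the kind of assertion that failed on the macOS runner.
assert_allclose(ridge.coef_, svd_ridge.coef_, atol=1e-3, rtol=1e-3)
```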

@ogrisel
Member
ogrisel commented May 27, 2025

This might be caused by the cython-3.1.0 to cython-3.1.1 upgrade. The other bumps seem unrelated.

@ogrisel
Member
ogrisel commented May 27, 2025

This test fails locally with the "sag" solver if I explore other seeds, so it seems to be a seed-sensitivity problem.

@ogrisel
Member
ogrisel commented May 27, 2025

Actually, the test's random seed is not propagated to Ridge's random_state param. This is probably the cause of the problem.
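
To illustrate why that matters (a hypothetical sketch, not the actual change from #31434): with solver="sag", Ridge shuffles samples during its stochastic iterations, so leaving random_state unset ties the result to NumPy's global random state, while forwarding an explicit seed makes the fit reproducible.

```python
# Hypothetical illustration only; the data and seed values are made up.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.RandomState(7)  # illustrative seed
X = rng.randn(200, 30).astype(np.float32)
y = rng.randn(200).astype(np.float32)

# random_state left unset: the "sag" sample shuffling is not tied to the
# test's seed, so the fitted coefficients can vary between runs.
ridge_unseeded = Ridge(alpha=0.1, solver="sag").fit(X, y)

# Forwarding an explicit seed makes the stochastic solver deterministic,
# so a comparison against the "svd" reference is no longer seed-sensitive.
ridge_seeded = Ridge(alpha=0.1, solver="sag", random_state=7).fit(X, y)
```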

@ogrisel
Member
ogrisel commented May 27, 2025

I pushed an empty commit to confirm that the failure is independent of the Cython update and to decouple this PR from the actual fix in #31434.

@ogrisel ogrisel enabled auto-merge (squash) May 27, 2025 09:57
@ogrisel ogrisel merged commit 89b395e into scikit-learn:main May 27, 2025
34 checks passed
jeremiedbb pushed a commit to jeremiedbb/scikit-learn that referenced this pull request May 30, 2025
Co-authored-by: Lock file bot <noreply@github.com>
Co-authored-by: Olivier Grisel <olivier.grisel@ensta.org>
elhambbi pushed a commit to elhambbi/scikit-learn that referenced this pull request Jun 1, 2025
Co-authored-by: Lock file bot <noreply@github.com>
Co-authored-by: Olivier Grisel <olivier.grisel@ensta.org>
jeremiedbb pushed a commit that referenced this pull request Jun 5, 2025
Co-authored-by: Lock file bot <noreply@github.com>
Co-authored-by: Olivier Grisel <olivier.grisel@ensta.org>