CI: revert LDFLAGS sysroot work-around by lesteve · Pull Request #22535 · scikit-learn/scikit-learn · GitHub

CI: revert LDFLAGS sysroot work-around #22535


Closed

Conversation

@lesteve (Member) commented Feb 18, 2022

This is to see if #20654 is still necessary.

Fix #20640 if CI passes
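
For reference, the work-around being reverted here forces a linker sysroot through LDFLAGS so that Anaconda's bundled compiler_compat/ld can resolve system libraries. A minimal sketch of that kind of setting (the exact flag value is an assumption here; the authoritative value is whatever #20654 added):

  # Sketch of the LDFLAGS sysroot work-around being reverted;
  # -Wl,--sysroot=/ is assumed, see #20654 for the actual change.
  export LDFLAGS="$LDFLAGS -Wl,--sysroot=/"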

@ogrisel (Member) commented Feb 18, 2022

There is a similar config in sklearn/_build_utils/openmp_helpers.py as well. Removing this one will also require running a full [cd build] to check that the wheels can still be built.
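
In scikit-learn's CI, a full wheel-building run is triggered by putting the [cd build] marker in a commit message. One minimal way to do that (the empty commit is just one option; any commit carrying the marker should behave the same):

  # Trigger the wheel-building (CD) workflows via the [cd build] commit marker.
  git commit --allow-empty -m "Trigger wheel builds [cd build]"
  git push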

@ogrisel (Member) commented Feb 18, 2022

And maybe a conda-forge feedstock build in a test branch pointing to a specific git hash of this branch.

@lesteve (Member, Author) commented Feb 18, 2022

> There is a similar config in sklearn/_build_utils/openmp_helpers.py as well. Removing this one will also require running a full [cd build] to check that the wheels can still be built.

OK, I just pushed a commit doing this.

@lesteve force-pushed the remove-build-against-system-fix branch from f680526 to f5d8161 on February 18, 2022 14:35
@ogrisel (Member) commented Feb 18, 2022

The OpenMP build problem is still there in:

  • Linux pylatest_pip_openblas_pandas: https://dev.azure.com/scikit-learn/scikit-learn/_build/results?buildId=38274&view=logs&j=78a0bf4f-79e5-5387-94ec-13e67d216d6e&t=75a90307-b084-59e7-ba24-7f7161804495&l=858

  gcc -pthread -B /usr/share/miniconda/envs/testvenv/compiler_compat objects/test_program.o -o test_program -fopenmp
  /usr/share/miniconda/envs/testvenv/compiler_compat/ld: warning: libdl.so.2, needed by /usr/lib/gcc/x86_64-linux-gnu/9/libgomp.so, not found (try using -rpath or -rpath-link)
  /usr/share/miniconda/envs/testvenv/compiler_compat/ld: /usr/lib/gcc/x86_64-linux-gnu/9/libgomp.so: undefined reference to `dlopen@GLIBC_2.2.5'
  /usr/share/miniconda/envs/testvenv/compiler_compat/ld: /usr/lib/gcc/x86_64-linux-gnu/9/libgomp.so: undefined reference to `dlerror@GLIBC_2.2.5'
  /usr/share/miniconda/envs/testvenv/compiler_compat/ld: /usr/lib/gcc/x86_64-linux-gnu/9/libgomp.so: undefined reference to `dlclose@GLIBC_2.2.5'
  /usr/share/miniconda/envs/testvenv/compiler_compat/ld: /usr/lib/gcc/x86_64-linux-gnu/9/libgomp.so: undefined reference to `dlsym@GLIBC_2.2.5'
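
The failure happens because Anaconda's compiler_compat/ld does not search the system library directories, so the transitive libdl.so.2 dependency of the system libgomp.so is not resolved. A hedged reproduction sketch of the failing link and of the sysroot work-around this PR reverts (the flag value is assumed from #20654):

  # Failing link, as in the log above: the compat ld cannot resolve libdl.so.2
  # needed by /usr/lib/gcc/x86_64-linux-gnu/9/libgomp.so.
  gcc -B /usr/share/miniconda/envs/testvenv/compiler_compat \
      objects/test_program.o -o test_program -fopenmp

  # Reverted work-around: point the linker at the system root so the transitive
  # libdl dependency is found again (exact flag assumed from #20654).
  gcc -B /usr/share/miniconda/envs/testvenv/compiler_compat \
      objects/test_program.o -o test_program -fopenmp -Wl,--sysroot=/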

@ogrisel (Member) commented Feb 18, 2022

Maybe we should just install the compilers meta-package from conda-forge in the Linux conda envs if we want to link against an OpenMP runtime shipped with the conda env.
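
The compilers meta-package on conda-forge pulls in a matched C/C++/Fortran toolchain (including its own binutils), so the compiler, the linker and the OpenMP runtime would all come from the conda environment. A sketch of what that could look like for a Linux conda env (the env name and package list are illustrative):

  # Pull the whole toolchain from conda-forge so that compiler, linker and
  # libgomp come from the conda env itself (env and package names illustrative).
  conda create -n sklearn-build -c conda-forge python compilers numpy scipy cython
  conda activate sklearn-build
  pip install --no-build-isolation --editable .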

@lesteve (Member, Author) commented Feb 18, 2022

I think the problem still exists, but only with the conda defaults channel (for some Python versions starting in 3.9.x; #20640 (comment) suggests the problem started with 3.9.6), so switching the pylatest_pip* builds to conda-forge may be a work-around. Note that these builds only install Python from conda; the rest of the dependencies are installed with pip.

There is still a build using conda defaults, but it uses Python 3.7 and is probably unaffected by this issue. We may still want to set the environment variable for this particular build, in case the Python version is bumped to 3.9 one day and we hit this problem again.
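
As a sketch, switching one of the pylatest_pip* builds to conda-forge could keep the same layout: only Python (and ccache) from conda, everything else from pip (the env name, Python version and package list below are illustrative):

  # Only Python + ccache come from conda (from conda-forge instead of defaults);
  # all other dependencies are still installed with pip.
  conda create -n testvenv -c conda-forge python=3.9 ccache
  conda activate testvenv
  pip install numpy scipy cython pandas pytest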


@jeremiedbb (Member) commented:

> I think the problem still exists, but only with the conda defaults channel

Could we completely get rid of using the defaults channel? At least for building sklearn: we can still test it with runtime dependencies from the defaults channel.

@lesteve (Member, Author) commented Feb 21, 2022

> At least for building sklearn: we can still test it with runtime dependencies from the defaults channel.

I am going to vote -1 on this, since having a different build and test environment would make my life harder for the CI lock files (#22448). Also, I am not sure how you would do it: build with conda-forge, then create a new environment with defaults, copy the scikit-learn folder with the built files into the new environment, and cross your fingers that everything works?

About completely getting rid of conda defaults: I guess some of #21337 may be relevant. One of the arguments for keeping it is that some (maybe a significant number of) users may get scikit-learn from conda defaults, and we want to make sure that nothing breaks.

To be honest, at the moment we are testing conda defaults in a fairly light manner:

  • the Python 3.7 build with minimal dependencies, so it does not really test the latest packages available in conda defaults
  • the pylatest_pip_openblas_pandas build, which installs only Python + ccache from conda and everything else with pip
  • the pylatest_pip_scipy_dev build, which installs only Python + ccache from conda and everything else with pip

@jeremiedbb (Member) commented:

> Also, I am not sure how you would do it: build with conda-forge, then create a new environment with defaults, copy the scikit-learn folder with the built files into the new environment, and cross your fingers that everything works?

Haha that's not what I had in mind :)

I would just always use conda-forge in our CI. I was just thinking that when testing our wheels, we might still want to test them in an env with deps from the defaults channel. But maybe not, and we just let the maintainers of the defaults channel do that.

> One of the arguments for keeping it is that some (maybe a significant number of) users may get scikit-learn from conda defaults

But there are a lot fewer users who build sklearn themselves, and for those I think we can slowly push them towards conda-forge, or at least only guarantee that everything works with conda-forge.

@jeremiedbb (Member) commented:

Seems like we still need that

@jeremiedbb closed this on Mar 11, 2022
@lesteve (Member, Author) commented Mar 12, 2022

Having spent quite some time on #22448 lately, I may revisit this at some point 😉

Development

Successfully merging this pull request may close these issues:

  • CI issue with OpenMP
3 participants