Install Request: Intel OneAPI compiler 2024.2 and MPI #581
Confirmed to go with 2024.2.
Base installer:
HPC Toolkit test install also done, so I can check which compiler and MPI versions are inside. Buildscript updated. The modulefile creator probably needs updating based on what the subcomponents are and where they live.
Have updated the modulefile creator for the MPI module (the easy part!).
Doing
Will not add
Done
There's a
CCL: might want
As a note, 2024 puts a unified view of most things into |
I think I have got everything in the buildscript now.
Install on:
Uh oh, 2024.2 does still have ifort, but it no longer has icc or icpc, and the clang-based icx and icpx require a newer GLIBC, so they aren't going to work.
So the only working compiler in there is ifort.
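(Not from the thread, but a quick way to verify that kind of GLIBC mismatch is to compare the GLIBC symbol versions the binary requires with what the system provides. This assumes icx is on the PATH after sourcing the oneAPI setup script.)

```bash
# System glibc version:
ldd --version | head -n 1

# Highest GLIBC symbol versions the icx binary requires; if these are newer
# than the system glibc above, the compiler will refuse to run:
objdump -T "$(command -v icx)" | grep -o 'GLIBC_[0-9.]*' | sort -Vu | tail -n 3
```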
The modulefiles I did make are now in the https://github.com/UCL-RITS/rcps-modulefiles/tree/rhel-upgrade branch of the modulefile repo, as
Faster way to check the environment:
That gives you everything except CCL, which is also included in our MPI module. (Haven't checked what CCL brings in if you were to source its own setup script.) Then do the same in a new terminal, so you don't have the environment changes, but source the whole oneAPI setup script.
You can then see only the things that were added by using a script like this (from https://www.unix.com/302281269-post7.html?s=6637ae638fba973573e4463fe2340d6c):
That lets you check more quickly against our existing modulefile builders in the buildscript that each one is setting the correct environment variables. You could also sort the diffs to make them easier to compare. For variables like LIBRARY_PATH you will get the whole combined path, which is set up as multiple prepend-path lines in the modulefile. This can be streamlined further. (Left as an exercise for the reader!)
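A minimal sketch of that before/after comparison (a stand-in for the linked script, not a copy of it; the install prefix in the path is an assumption, while `mpi/latest/env/vars.sh` and `setvars.sh` are the usual oneAPI layout):

```bash
#!/bin/bash
# Capture the environment, source a setup script, capture it again, and show
# only the lines that were added or changed. Run once sourcing the MPI vars.sh
# and once (in a fresh shell) sourcing the top-level setvars.sh, then compare.
env | sort > before.txt
source /path/to/oneapi/mpi/latest/env/vars.sh   # or: source /path/to/oneapi/setvars.sh
env | sort > after.txt

# Lines present only in after.txt, i.e. variables the script added or changed:
comm -13 before.txt after.txt
```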
The 2023 toolkits are no longer available.
We've installed Intel oneAPI 2024.0.1 with Intel MPI 2021.11 on Young using the base and HPC kit installers supplied by the user. After a correction to the module files this is now working for the user. Will also install on the other clusters with the installers in /shared/ucl/apps/pkg-store on each cluster:
Intel oneAPI 2024.0.1 is the latest version we can run on the clusters until we update the underlying OS to Red Hat 9 or equivalent. All done until then.
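For reference, a sketch of how the offline kit installers can be run unattended; the installer filenames and install prefix below are illustrative assumptions, not taken from this issue:

```bash
# Silent install of the Base Kit and HPC Kit into a shared prefix.
sh ./l_BaseKit_p_2024.0.1_offline.sh -a --silent --eula accept \
    --install-dir /shared/ucl/apps/oneapi/2024.0.1
sh ./l_HPCKit_p_2024.0.1_offline.sh -a --silent --eula accept \
    --install-dir /shared/ucl/apps/oneapi/2024.0.1
```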
IN:06711429
Requested to be able to compile new NWChem.
Was asked about known good compilers:
I'd rather go straight for the ones in oneAPI 2024.2, which is the current release, if possible.
There is the question of whether `FC`, `F77` and `F90` should point to `ifort` vs `ifx`. Spack has gone for `ifx` as the default Fortran compiler in their wrappers now. It sounds like NWChem is fine with `ifx`, but not everything works with it yet.

Am wondering if we should have two modulefiles for this install, one in beta-modules (for genuine beta reasons) that sets the Fortran compiler environment variables to `ifx`. (They can always be set after loading the module in either case, as sketched below, and build systems may choose differently anyway.)

(Note: NWChem itself will need https://nwchemgit.github.io/Compiling-NWChem.html#how-to-commodity-clusters-with-intel-omni-path for OmniPath with new GlobalArrays.)
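A minimal sketch of the "set after loading the module" option (the module name is hypothetical, not the real name from our repo):

```bash
# Load the oneAPI compiler module, then point the Fortran environment
# variables at ifx instead of ifort for this build:
module load compilers/intel/2024.2   # hypothetical module name
export FC=ifx F77=ifx F90=ifx
```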