Compare commits

...

64 Commits

Author SHA1 Message Date
Massimiliano Culpo 37242e9422 Set __version__ to v0.20.2 2023-08-29 11:18:59 +02:00
Harmen Stoppels 03d49523b7 GHA: pin flake8 to 6.0.0 to avoid style change 2023-08-29 11:18:38 +02:00
Jordan Galby 0d892ee678 Fix Spack freeze on install child process unexpected exit (#39015)
* Fix Spack freezing when the install child process becomes defunct

* Rename parent/child pipe to read/write to emphasize non-duplex mode
2023-08-18 14:11:53 +02:00
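A minimal sketch of the fix described above (hypothetical, heavily simplified from the actual change): a non-duplex `multiprocessing.Pipe` plus closing the parent's copy of the write end means `recv()` raises `EOFError` promptly if the child dies before sending a result, instead of hanging forever.

```python
import multiprocessing


def _child(write_pipe):
    # The child only ever writes its result to the pipe.
    write_pipe.send("ok")
    write_pipe.close()


if __name__ == "__main__":
    # duplex=False makes the roles explicit: parent reads, child writes.
    read_pipe, write_pipe = multiprocessing.Pipe(duplex=False)
    p = multiprocessing.Process(target=_child, args=(write_pipe,))
    p.start()
    # Close the parent's handle so the child owns the only writable end;
    # when the child exits (even abnormally), recv() sees EOF instead of
    # blocking forever.
    write_pipe.close()
    try:
        result = read_pipe.recv()
    except EOFError:
        p.join()
        raise RuntimeError(f"child stopped unexpectedly (exit {p.exitcode})")
    p.join()
```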
Jordan Galby 183e9212ce Fix package.py error handling bug (#39017) 2023-08-18 14:11:53 +02:00
Harmen Stoppels df4a2457a4 Fix broken semver regex (#39414) 2023-08-18 14:11:53 +02:00
Harmen Stoppels 63c55dd3ba Fix broken inode assertion (#39188) 2023-08-18 14:11:53 +02:00
Xavier Delaruelle d76baacf71 modules: use curly braces to enclose value in Tcl modulefile (#38375)
Use curly braces instead of quotes to enclose values or text in Tcl
modulefiles. Within curly braces, Tcl special characters like [, ] or $
are treated verbatim, whereas they are evaluated within quotes.

Curly braces are the Tcl-recommended way to enclose verbatim content [1].

Note: if curly brace characters are used within content, they must be
balanced. This point has been checked against the current repository, and
no unbalanced curly braces have been spotted.

Fixes #24243

[1] https://wiki.tcl-lang.org/page/Tcl+Minimal+Escaping+Style
2023-08-18 14:11:53 +02:00
Harmen Stoppels 765b2b8413 repo cache: use -inf default instead of 0 (#39214)
FastPackageChecker.modified_since should use a default number < 0

When the repo cache does not exist, Spack uses mtime 0. This causes the repo
cache not to be generated when the repo has mtime 0.

Some popular package managers, such as Spack itself, normalize mtime to 0
for reproducible tarballs. So when installing Spack with Spack from a
buildcache, the repo cache is never generated.

Also add some type hints
2023-08-18 14:11:53 +02:00
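A sketch of the idea with hypothetical names (not the actual `FastPackageChecker` code): default the comparison timestamp to `-inf` so an index whose files were mtime-normalized to 0 is still treated as modified.

```python
import math

# Hypothetical map of package name -> mtime of its package.py file.
mtimes = {"zlib": 0.0, "cmake": 1690000000.0}


def modified_since(since: float = -math.inf) -> list:
    # With a default of 0, a package.py whose mtime was normalized to 0
    # (as in a reproducible tarball) would never be reported as modified.
    # With -inf, mtime 0 still satisfies "mtime > since".
    return [name for name, mtime in mtimes.items() if mtime > since]


print(modified_since())  # ['zlib', 'cmake']
```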
Todd Gamblin 3cfeb72197
mypy: add more ignored modules to `pyproject.toml` (#38769)
`mypy` will check *all* imported packages, even optional dependencies outside your
project, and this can cause issues if you are targeting python versions *older* than the
one you're running. `mypy` will report issues in the latest versions of dependencies
as errors, even if installing on some older python would have installed an older version
of the dependency.

We saw this problem before with `numpy` in #34732. We've started seeing it with IPython
in #38704. This fixes the issue by exempting `IPython` and a number of other Spack
imports from `mypy` checking.
2023-07-11 14:00:41 +02:00
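For illustration, per-module exemptions like this are usually expressed with `mypy` overrides in `pyproject.toml`; the snippet below is a generic example of that mechanism, not necessarily Spack's exact configuration.

```toml
[[tool.mypy.overrides]]
# Skip type-checking for optional third-party imports whose latest
# releases may not support the oldest Python version we target.
module = ["IPython", "IPython.*", "numpy", "numpy.*"]
ignore_errors = true
ignore_missing_imports = true
```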
Massimiliano Culpo 58158ee743
Prevent "spack external find" to error out on wrong permissions (#38755)
fixes #38733
2023-07-11 14:00:41 +02:00
Harmen Stoppels ef5d110d4a
Fix multiple quadratic complexity issues in environments (#38771)
1. Fix O(n^2) iteration in `_get_overwrite_specs`
2. Early exit `get_by_hash` on full hash
3. Fix O(n^2) double lookup in `all_matching_specs` with hashes
4. Fix some legibility issues
2023-07-11 14:00:40 +02:00
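The general shape of fixes 1 and 3, as a generic sketch (not the actual environment code): replace a nested scan over two lists with a one-time index and hash lookups.

```python
# O(n*m): for every requested name, rescan the whole installed list.
def matching_quadratic(names, installed):
    return [s for n in names for s in installed if s.split("@")[0] == n]


# O(n + m): index installed specs by name once, then do constant-time lookups.
def matching_linear(names, installed):
    by_name = {}
    for s in installed:
        by_name.setdefault(s.split("@")[0], []).append(s)
    return [s for n in names for s in by_name.get(n, [])]


installed = ["zlib@1.2", "cmake@3.24", "zlib@1.3"]
print(matching_linear(["zlib"], installed))  # ['zlib@1.2', 'zlib@1.3']
```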
Massimiliano Culpo 769b01e6f0
Add CHANGELOG entry for v0.20.1 (#38836) 2023-07-11 14:00:37 +02:00
Massimiliano Culpo e12dc18149
Set __version__ to v0.20.2.dev0 2023-07-11 13:45:33 +02:00
Harmen Stoppels e8658d6493 Set version to v0.20.1 2023-07-07 12:08:59 +02:00
Massimiliano Culpo 0a4bd29ce5 Change mirror urls on the backport PR 2023-07-07 12:08:59 +02:00
Massimiliano Culpo 506b899676 Mark an unparse test xfail 2023-07-07 12:08:59 +02:00
Xavier Delaruelle c2103b27f6 modules: ignore more Modules variables in from_sourcing_file (#38455)
Update the list of excluded variables in the `from_sourcing_file` function
to cover all variables specific to Environment Modules or Lmod.
Specifically, add variables related to the definition of the `module()`,
`ml()` and `_module_raw()` Bash functions.

Fixes #13504
2023-07-07 12:08:59 +02:00
Xavier Delaruelle 1a2e845958 Add raw attribute to env.set command (#38465)
Update the `env.set` command and the underlying `SetEnv` object to add the
`raw` boolean attribute. `raw` is optional and set to False by default.
When set to True, value formatting is skipped for the object when
generating environment modifications.

With this change it is now possible to define environment variables
whose value contains variable reference syntax (like `{foo}` or `{}`)
that should be set as-is.

Fixes #29578
2023-07-07 12:08:59 +02:00
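A minimal sketch of the described behavior (hypothetical class, not the real `SetEnv` implementation): when `raw=True`, the value bypasses formatting and is exported verbatim.

```python
import os


class SetEnv:
    # Hypothetical simplification: `raw` skips value formatting entirely.
    def __init__(self, name, value, raw=False):
        self.name, self.value, self.raw = name, value, raw

    def execute(self, pkg_name):
        value = self.value if self.raw else self.value.format(name=pkg_name)
        os.environ[self.name] = value


# "{name}" is substituted normally...
SetEnv("GREETING", "hello {name}").execute("zlib")
assert os.environ["GREETING"] == "hello zlib"

# ...but with raw=True the braces survive as-is, which is what you want for
# values that legitimately contain "{}" reference syntax.
SetEnv("TEMPLATE", "{name}-{version}", raw=True).execute("zlib")
assert os.environ["TEMPLATE"] == "{name}-{version}"
```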
Greg Becker e5f270c8da show external status as [e] (#33792) 2023-07-07 12:08:59 +02:00
Gurkirat Singh 0fd224404a docs: add quotes around some values in a YAML example (#38412) 2023-07-07 12:08:59 +02:00
Xavier Delaruelle 215020f9bb modules: use depends-on to autoload module with Lmod on Tcl (#38347)
Update Tcl modulefile template to use the `depends-on` command to
autoload modules if Lmod is the current module tool.

Autoloading modules with the `module load` command in a Tcl modulefile does
not work well with Lmod to some extent. An attempt to unload then load the
designated module is performed each time such a command is encountered.
This may lead to a load storm that may not end correctly when there are a
large number of module dependencies.

The `depends-on` command should be used for Lmod instead of `module load`,
as it checks whether the module is already loaded and does not attempt to
reload it.

The Lua modulefile template already uses the `depends_on` command to
autoload dependencies. Thus it is already assumed that, to use Lmod with
Spack, Lmod must support the `depends_on` command (version 7.6+).

Environment Modules copes well with the `module load` command to autoload
dependencies (version 3.2+). The `depends-on` command is supported starting
with version 5.1 (as an alias of the `prereq-all` command), which was
released last year.

This change introduces a test to determine whether the module tool that
evaluates the modulefile is Lmod. If so, autoloaded dependencies are
defined with the `depends-on` command; otherwise the `module load` command
is used.

The test is based on the `LMOD_VERSION_MAJOR` environment variable, which
Lmod sets starting with version 5.1.

Fixes #36764
2023-07-07 12:08:59 +02:00
Xavier Delaruelle b366cb3c90 modules: append trailing delimiter to MANPATH when set (#36678)
Update modulefile templates to append a trailing delimiter to MANPATH
environment variable, if the modulefile sets it.

With a trailing delimiter at the end of MANPATH's value, man will search
the system man pages after searching the specific paths set.

By using append-path/append_path to add this element, the module tool
ensures it is appended only once. When the modulefile is unloaded, the
number of append attempts is decreased, so the trailing delimiter is
removed only when this number reaches 0.

Disclaimer: no path element should be appended to MANPATH by generated
modulefiles. Paths should always be prepended to ensure this variable's
value ends with the trailing delimiter.

Fixes #11355.
2023-07-07 12:08:59 +02:00
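The convention being relied on, sketched in Python (illustrative only): a trailing `:` in `MANPATH` tells `man` to also search the default system locations after the listed paths.

```python
def manpath_value(paths):
    # Join the module-provided paths and keep one trailing delimiter so
    # "man" falls back to the system man pages afterwards.
    return ":".join(paths) + ":"


print(manpath_value(["/opt/pkg/share/man", "/opt/other/share/man"]))
# -> /opt/pkg/share/man:/opt/other/share/man:
```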
Peter Scheibel e0bba8f4a3 pip is a PythonExtension, not a PythonPackage; it turns out we weren't doing our external surgery on things that inherited PythonExtension (#38186) 2023-07-07 12:08:59 +02:00
Jonathon Anderson 60195d72c9 containerize: use an ENTRYPOINT script (#37769) 2023-07-07 12:08:59 +02:00
Massimiliano Culpo 2008503a1f Fix compiler removal from command line (#38057)
* Improve lib/spack/spack/test/cmd/compiler.py

* Use "tmp_path" in the "mock_executable" fixture

* Return a pathlib.Path from mock_executable

* Fix mock_executable fixture on Windows

"mock_gcc" was very similar to mock_executable, so use the latter to reduce code duplication

* Remove wrong compiler cache, fix compiler removal

fixes #37996

_CACHE_CONFIG_FILES was both unneeded and wrong when called
subsequently with different scopes.

Here we remove that cache, and we fix an issue with compiler
removal triggered by having the same compiler spec in multiple
scopes.
2023-07-07 12:08:59 +02:00
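The pitfall behind `_CACHE_CONFIG_FILES`, in miniature (a generic sketch, not Spack code): a module-level cache that ignores its arguments returns stale data once a later call passes a different scope.

```python
_cache = None  # module-level cache, not keyed on the scope argument


def get_config(scope, load):
    global _cache
    if _cache is None:
        _cache = load(scope)  # the first caller's scope wins...
    return _cache  # ...and every later scope silently gets the same data


def get_config_fixed(scope, load):
    # Correct: look the data up per call (or key a cache on `scope`).
    return load(scope)


print(get_config("user", lambda s: {s: 1}))        # {'user': 1}
print(get_config("site", lambda s: {s: 1}))        # still {'user': 1}: stale!
print(get_config_fixed("site", lambda s: {s: 1}))  # {'site': 1}
```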
QuellynSnead 9df47aabdb libxcb/xcb-proto: Enable internal Python dependency (#37575)
In the past, Spack did not allow two different versions of the
same package within a DAG. That led to difficulties with packages
that still required Python 2 while other packages had already
switched to Python 3.

The libxcb and xcb-proto packages did not have Python 3 support
for a time. To get around this issue, Spack maintainers disabled
their dependency on an internal (i.e., Spack-provided) Python
(see #4145), forcing these packages to look for a system-provided
Python (see #7646).

This has worked for us all right, but with the arrival of our most
recent platform we seem to be missing the critical xcbgen Python
module on the system. Since most software has largely moved on to
Python 3 now, let's re-enable internal Spack dependencies for the
libxcb and xcb-proto packages.
2023-07-07 12:08:59 +02:00
Tiziano MĂĽller 80e90b924a Bugfix: cray manifest parsing regression (#37909)
fa7719a changed the syntax for specifying exact versions, which are
required for some compiler specs (including those read as part
of parsing a Cray manifest). This fixes that and also makes a
couple of other improvements to manifest parsing.

* Instantiate compiler specs with exact versions (fixes #37893)
* fix slingshot network detection (CPE 22.10+ has libcxi.so
  in /usr/lib64)
* "spack external find": add arg to ignore default dir for cray
  manifests
2023-07-07 12:08:59 +02:00
Jonathon Anderson c6ff664366 containers: don't install epel-release on Fedora (#37766) 2023-07-07 12:08:59 +02:00
Tamara Dahlgren d27debd940 Bugfix/tests: add slash to test log message (#37874) 2023-07-07 12:08:59 +02:00
Tamara Dahlgren c93b8bceb8 Bugfix/tests: write not append stand-alone test status (#37841) 2023-07-07 12:08:59 +02:00
Massimiliano Culpo f602c67606 Update RtD and Sphinx configuration (#38046) 2023-07-07 12:08:59 +02:00
Massimiliano Culpo 3a082f0112 archspec: fix entry in the JSON file (#37793) 2023-07-07 12:08:59 +02:00
Massimiliano Culpo 9fb25b7404 Memoize a few hot functions during module file generation (#37739) 2023-07-07 12:08:59 +02:00
Peter Scheibel 9924c92c40 Add explicit CMake .libs implementation that returns an empty list; same for .headers (#35816) 2023-07-07 12:08:59 +02:00
Massimiliano Culpo 16cb6ac1ed Simplify implementation of "get_compiler_config" (#37989) 2023-07-07 12:08:59 +02:00
Harmen Stoppels 5821746258 fix InternalConcretizerError msg (#37791) 2023-07-07 12:08:59 +02:00
Greg Becker d860083b08 bugfix: env concretize after remove (#37877) 2023-07-07 12:08:59 +02:00
Harmen Stoppels f2d3818d5c spack remove: fix traversal when user specs intersect (#37882)
drop unnecessary double loop over the matching user specs.
2023-07-07 12:08:59 +02:00
Harmen Stoppels 0052f330be Set version to v0.20.1-dev 2023-07-07 12:08:59 +02:00
Todd Gamblin 456db45c4a Update `CHANGELOG.md` for v0.20.0 2023-05-21 01:47:57 +02:00
Massimiliano Culpo e493ab31c6 Set version to 0.20.0 2023-05-19 18:46:40 +02:00
Harmen Stoppels e0f45b33e9 spack env create: generate a view when newly created env has concrete specs (#37799) 2023-05-19 18:46:40 +02:00
Massimiliano Culpo bb61ecb9b9 lmod: allow core compiler to be specified with a version range (#37789)
Use CompilerSpec with satisfies instead of string equality tests

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2023-05-19 15:14:17 +02:00
Greg Becker 9694225b80 compiler specs: do not print '@=' when clear from context (#37787)
Ensure that spack compiler add/find/list and lists of concrete specs
print the compiler effectively as {compiler.name}{@compiler.version}.

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2023-05-19 15:14:17 +02:00
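A sketch of the rendering rule (hypothetical helper, not the real `display_str` property): strip the `=` from `@=` when printing for humans, since exactness is clear from context.

```python
def display_compiler(name: str, version: str) -> str:
    # "=12.3.0" means "exactly 12.3.0"; for display we print plain "@12.3.0".
    shown = version[1:] if version.startswith("=") else version
    return f"{name}@{shown}"


print(display_compiler("gcc", "=12.3.0"))  # gcc@12.3.0
print(display_compiler("gcc", "12:"))      # gcc@12:
```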
Peter Scheibel 3b15e7bf41 Tk/Tcl packages: speed up file search (#35902) 2023-05-18 12:36:08 +02:00
Peter Scheibel ac5f0cc340 Bugfix: allow preferred new versions from externals (#37747) 2023-05-18 12:36:08 +02:00
Harmen Stoppels f67840511a lmod: fix build, bump patch version (#37744) 2023-05-18 12:36:08 +02:00
Massimiliano Culpo bd9cfa3a47 Limit deepcopy to just the initial "all" section (#37718)
Modifications:
- [x] Limit the scope of the deepcopy when initializing module file writers
2023-05-18 12:36:08 +02:00
Scott Wittenburg 96c262b13e gitlab ci: no copy-only pipelines w/ deprecated config (#37720)
Make it clear that copy-only pipelines are not supported while still
using the deprecated ci config format. Also ensure that the deprecated
stack does not fail on spack pipelines for tags.
2023-05-18 12:36:08 +02:00
Tamara Dahlgren d22fd79a0b spack test: fix stand-alone test suite status reporting (#37602)
* Fix reporting of packageless specs as having no tests

* Add test_test_output_multiple_specs with update to simple-standalone-test (and tests)

* Refactored test status summary; added more tests or checks
2023-05-18 12:36:08 +02:00
Massimiliano Culpo 8cf4bf7559 Fix `spack find` not able to display version ranges in compilers (#37715) 2023-05-18 12:36:08 +02:00
John W. Parent 14a703a4bb Windows: fix MSVC version handling (#37711)
MSVC compiler logic was using string parsing to extract the version
from the compiler spec, which was fragile. This broke in #37572, so it has
been fixed and made more robust by using attribute access.
2023-05-18 12:36:08 +02:00
Harmen Stoppels d7726f80e8 gha rhel8-platform-python: configure git safe.directory (#37708) 2023-05-18 12:36:08 +02:00
Peter Scheibel d69c3a6ab7 Requirements and preferences should not define (non-git) versions (#37687)
Ensure that requirements `packages:*:require:@x` and preferences `packages:*:version:[x]`
fail concretization when no version defined in the package satisfies `x`. This always holds
except for git versions -- they are defined on the fly.
2023-05-18 12:36:08 +02:00
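A sketch of the check (generic and simplified, not the concretizer code): fail fast when no version defined in the package satisfies the requested one, exempting git versions because they are created on the fly.

```python
def check_requirement(defined_versions, required, is_git=False):
    # `required` is a predicate over version strings (hypothetical interface).
    if is_git:
        return  # git versions are defined on the fly; nothing to check
    if not any(required(v) for v in defined_versions):
        raise ValueError("requirement matches no version defined in the package")


# A preference for 1.x is satisfiable...
check_requirement(["1.0", "1.2"], required=lambda v: v.startswith("1."))
# ...but asking for 2.0 should fail concretization immediately:
check_requirement(["1.0", "1.2"], required=lambda v: v == "2.0")  # raises
```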
Harmen Stoppels 1fd964140d gha bootstrap-dev-rhel8: configure git safe.directory (#37702)
git has been updated to something more recent
2023-05-18 12:36:08 +02:00
Harmen Stoppels c9bab946d4 check_modules_set_name: do not check for "enable" key (#37701) 2023-05-18 12:36:08 +02:00
Greg Becker 74a5cd2bb0 unify: when_possible and unify: true -- Bugfix for error in 37438 (#37681)
Two bugs came in from #37438

1. `unify: when_possible` was broken because of an incorrect assertion. Abstract/concrete
   spec pairs were compared against the results that were in the process of being computed,
   rather than against the previous results.
2. `unify: true` had an ordering bug that could mix the association between abstract and
   concrete specs

- [x] 1 is resolved by creating a lookup from old concrete specs to old abstract specs,
      and we use that to associate the "new" concrete specs that happen to be the old
      ones with their abstract specs (since those are stripped out for concretization)
- [x] 2 is resolved by combining the new and old abstract as lists instead of combining
      them as sets. This is important because `set() | set()` does not make any ordering
      promises, even though set ordering is otherwise guaranteed in `python@3.7:`
2023-05-18 12:36:08 +02:00
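The ordering fix in miniature (generic sketch): `set() | set()` gives no ordering guarantee, whereas concatenating lists and deduplicating through a dict (insertion-ordered in `python@3.7:`) keeps the association order stable.

```python
old_abstract = ["mpich", "zlib", "hdf5"]
new_abstract = ["zlib", "openmpi"]

# Unordered: the union may come back in a different order from run to run.
unordered = set(old_abstract) | set(new_abstract)

# Ordered and deduplicated: dicts preserve insertion order.
ordered = list(dict.fromkeys(old_abstract + new_abstract))
print(ordered)  # ['mpich', 'zlib', 'hdf5', 'openmpi']
```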
Carson Woods 151ce6f923 Improve package source code context display on error (#37655)
Spack displays package code context when it shouldn't (e.g., on `FetchError`s)
and doesn't display it when it should (e.g., when errors occur in builder classes).
The line attribution can sometimes be off by one, as well.

- [x] Display package context when errors occur in a subclass of `PackageBase`
- [x] Display package context when errors occur in a subclass of `BaseBuilder`
- [x] Do not display package context when errors occur in `PackageBase`,
      `BaseBuilder` or other core code that is not in a `package.py` file.
- [x] Fix off-by-one error for core code (don't subtract one from the line number *unless*
      it's in an actual `package.py` file).

---------

Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2023-05-18 12:36:08 +02:00
Scott Wittenburg 1c31ce82af gitlab ci: reduce job name length of build_systems pipeline (#37686) 2023-05-16 00:31:04 +02:00
Todd Gamblin caab2cbfd2 bugfix: allow reuse of packages from foreign namespaces
We currently throw a nasty error if you try to reuse packages from some other namespace
(e.g., OLCF), but we should be able to reuse patched local versions of builtin packages.

Right now the only obstacle to that is that we try to look up virtual info for unknown
namespaces, and we can't get the package from the repo to do that. We *can* assume that
a package with a known namespace is similar, and that its virtual provider information
is reasonably accurate, so we now do that. This isn't 100% accurate, but neither is
relying on the package itself, as it may have gone out of date.

The real solution here is virtual edge information, but this is a stopgap until we have
that.
2023-05-15 20:25:09 +02:00
Todd Gamblin a6f41006eb bugfix: don't look up virtual information for unknown packages
`spec_clauses()` attempts to look up package information for concrete specs in order to
determine which virtuals they may provide. This fails for renamed/deleted dependencies
of buildcaches and installed packages.

This will eventually be fixed by #35258, which adds virtual information on edges, but we
need a workaround to make older buildcaches usable.

- [x] make an exception for renamed packages and omit their virtual constraints
- [x] add a note that this will be solved by adding virtuals to edges
2023-05-15 20:25:09 +02:00
Todd Gamblin 18b4670d9f bugfix: don't look up patches from packages for concrete specs
The concretizer can fail with `reuse:true` if a buildcache or installation contains a
package with a dependency that has been renamed or deleted in the main repo (e.g.,
`netcdf` was refactored to `netcdf-c`, `netcdf-fortran`, etc., but there are still
binary packages with dependencies called `netcdf`).

We should still be able to install things for which we are missing `package.py` files.

`Spec.inject_patches_variant()` was failing this requirement by attempting to look up
the package class for concrete specs.  This isn't needed -- we can skip it.

- [x] swap two conditions in `Spec.inject_patches_variant()`
2023-05-15 20:25:09 +02:00
Harmen Stoppels 322fe415e4 Bump tutorial command (#37674) 2023-05-15 20:25:09 +02:00
Harmen Stoppels 096bfa4ba9 oneapi: before script load modules (#37678) 2023-05-15 20:25:09 +02:00
113 changed files with 2151 additions and 904 deletions

View File

@ -5,3 +5,8 @@ updates:
directory: "/"
schedule:
interval: "daily"
# Requirements to build documentation
- package-ecosystem: "pip"
directory: "/lib/spack/docs"
schedule:
interval: "daily"

View File

@ -137,6 +137,7 @@ jobs:
- name: Setup repo and non-root user
run: |
git --version
git config --global --add safe.directory /__w/spack/spack
git fetch --unshallow
. .github/workflows/setup_git.sh
useradd spack-test

View File

@ -44,7 +44,7 @@ jobs:
cache: 'pip'
- name: Install Python packages
run: |
python3 -m pip install --upgrade pip setuptools types-six black==23.1.0 mypy isort clingo flake8
python3 -m pip install --upgrade pip setuptools types-six black==23.1.0 mypy isort clingo flake8==6.0.0
- name: Setup git configuration
run: |
# Need this for the git tests to succeed.
@ -72,6 +72,7 @@ jobs:
- name: Setup repo and non-root user
run: |
git --version
git config --global --add safe.directory /__w/spack/spack
git fetch --unshallow
. .github/workflows/setup_git.sh
useradd spack-test

View File

@ -1,10 +1,16 @@
version: 2
build:
os: "ubuntu-22.04"
apt_packages:
- graphviz
tools:
python: "3.11"
sphinx:
configuration: lib/spack/docs/conf.py
fail_on_warning: true
python:
version: 3.7
install:
- requirements: lib/spack/docs/requirements.txt

View File

@ -1,3 +1,239 @@
# v0.20.1 (2023-07-10)
## Spack Bugfixes
- Specs removed from an environment were not actually removed if `--force` was not given (#37877)
- Speed-up module file generation (#37739)
- Hotfix for a few recipes that treat CMake as a link dependency (#35816)
- Fixed re-running stand-alone tests a second time, which was producing a spurious trailing failure (#37840)
- Fixed reading JSON manifest on Cray, reporting non-concrete specs (#37909)
- Fixed a few bugs when generating Dockerfiles from Spack (#37766,#37769)
- Fixed a few long-standing bugs when generating module files (#36678,#38347,#38465,#38455)
- Fixed issues with building Python extensions using an external Python (#38186)
- Fixed compiler removal from command line (#38057)
- Show external status as [e] (#33792)
- Backported `archspec` fixes (#37793)
- Improved a few error messages (#37791)
# v0.20.0 (2023-05-21)
`v0.20.0` is a major feature release.
## Features in this release
1. **`requires()` directive and enhanced package requirements**
We've added some more enhancements to requirements in Spack (#36286).
There is a new `requires()` directive for packages. `requires()` is the opposite of
`conflicts()`. You can use it to impose constraints on this package when certain
conditions are met:
```python
requires(
    "%apple-clang",
    when="platform=darwin",
    msg="This package builds only with clang on macOS"
)
```
More on this in [the docs](
https://spack.rtfd.io/en/latest/packaging_guide.html#conflicts-and-requirements).
You can also now add a `when:` clause to `requires:` in your `packages.yaml`
configuration or in an environment:
```yaml
packages:
  openmpi:
    require:
    - any_of: ["%gcc"]
      when: "@:4.1.4"
      message: "Only OpenMPI 4.1.5 and up can build with fancy compilers"
```
More details can be found [here](
https://spack.readthedocs.io/en/latest/build_settings.html#package-requirements)
2. **Exact versions**
Spack did not previously have a way to distinguish a version if it was a prefix of
some other version. For example, `@3.2` would match `3.2`, `3.2.1`, `3.2.2`, etc. You
can now match *exactly* `3.2` with `@=3.2`. This is useful, for example, if you need
to patch *only* the `3.2` version of a package. The new syntax is described in [the docs](
https://spack.readthedocs.io/en/latest/basic_usage.html#version-specifier).
Generally, when writing packages, you should prefer to use ranges like `@3.2` over
the specific versions, as this allows the concretizer more leeway when selecting
versions of dependencies. More details and recommendations are in the [packaging guide](
https://spack.readthedocs.io/en/latest/packaging_guide.html#ranges-versus-specific-versions).
See #36273 for full details on the version refactor.
3. **New testing interface**
Writing package tests is now much simpler with a new [test interface](
https://spack.readthedocs.io/en/latest/packaging_guide.html#stand-alone-tests).
Writing a test is now as easy as adding a method that starts with `test_`:
```python
class MyPackage(Package):
    ...

    def test_always_fails(self):
        """use assert to always fail"""
        assert False

    def test_example(self):
        """run installed example"""
        example = which(self.prefix.bin.example)
        example()
```
You can use Python's native `assert` statement to implement your checks -- no more
need to fiddle with `run_test` or other test framework methods. Spack will
introspect the class and run `test_*` methods when you run `spack test`.
4. **More stable concretization**
* Now, `spack concretize` will *only* concretize the new portions of the environment
and will not change existing parts of an environment unless you specify `--force`.
This has always been true for `unify:false`, but not for `unify:true` and
`unify:when_possible` environments. Now it is true for all of them (#37438, #37681).
* The concretizer has a new `--reuse-deps` argument that *only* reuses dependencies.
That is, it will always treat the *roots* of your environment as it would with
`--fresh`. This allows you to upgrade just the roots of your environment while
keeping everything else stable (#30990).
5. **Weekly develop snapshot releases**
Since last year, we have maintained a buildcache of `develop` at
https://binaries.spack.io/develop, but the cache can grow to contain so many builds
as to be unwieldy. When we get a stable `develop` build, we snapshot the release and
add a corresponding tag to the Spack repository. So, you can use a stack from a specific
day. There are now tags in the Spack repository like:
* `develop-2023-05-14`
* `develop-2023-05-18`
that correspond to build caches like:
* https://binaries.spack.io/develop-2023-05-14/e4s
* https://binaries.spack.io/develop-2023-05-18/e4s
We plan to store these snapshot releases weekly.
6. **Specs in buildcaches can be referenced by hash.**
* Previously, you could run `spack buildcache list` and see the hashes in
buildcaches, but referring to them by hash would fail.
* You can now run commands like `spack spec` and `spack install` and refer to
buildcache hashes directly, e.g. `spack install /abc123` (#35042)
7. **New package and buildcache index websites**
Our public websites for searching packages have been completely revamped and updated.
You can check them out here:
* *Package Index*: https://packages.spack.io
* *Buildcache Index*: https://cache.spack.io
Both are searchable and more interactive than before. Currently major releases are
shown; UI for browsing `develop` snapshots is coming soon.
8. **Default CMake and Meson build types are now Release**
Spack has historically defaulted to building with optimization and debugging, but
packages like `llvm` can be enormous with debug turned on. Our default build type for
all Spack packages is now `Release` (#36679, #37436). This has a number of benefits:
* much smaller binaries;
* higher default optimization level; and
* defining `NDEBUG` disables assertions, which may lead to further speedups.
You can still get the old behavior back through requirements and package preferences.
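For example (one possible configuration, assuming the standard CMake `build_type` variant), a package preference can restore the previous default:
```yaml
packages:
  all:
    variants: build_type=RelWithDebInfo
```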
## Other new commands and directives
* `spack checksum` can automatically add new versions to packages (#24532)
* new command: `spack pkg grep` to easily search package files (#34388)
* New `maintainers` directive (#35083)
* Add `spack buildcache push` (alias to `buildcache create`) (#34861)
* Allow using `-j` to control the parallelism of concretization (#37608)
* Add `--exclude` option to 'spack external find' (#35013)
## Other new features of note
* editing: add higher-precedence `SPACK_EDITOR` environment variable
* Many YAML formatting improvements from updating `ruamel.yaml` to the latest version
supporting Python 3.6 (#31091, #24885, #37008).
* Requirements and preferences should not define (non-git) versions (#37687, #37747)
* Environments now store spack version/commit in `spack.lock` (#32801)
* User can specify the name of the `packages` subdirectory in repositories (#36643)
* Add container images supporting RHEL alternatives (#36713)
* make version(...) kwargs explicit (#36998)
## Notable refactors
* buildcache create: reproducible tarballs (#35623)
* Bootstrap most of Spack dependencies using environments (#34029)
* Split `satisfies(..., strict=True/False)` into two functions (#35681)
* spack install: simplify behavior when inside environments (#35206)
## Binary cache and stack updates
* Major simplification of CI boilerplate in stacks (#34272, #36045)
* Many improvements to our CI pipeline's reliability
## Removals, Deprecations, and disablements
* Module file generation is disabled by default; you'll need to enable it to use it (#37258)
* Support for Python 2 was deprecated in `v0.19.0` and has been removed. `v0.20.0` only
supports Python 3.6 and higher.
* Deprecated target names are no longer recognized by Spack. Use generic names instead:
* `graviton` is now `cortex_a72`
* `graviton2` is now `neoverse_n1`
* `graviton3` is now `neoverse_v1`
* `blacklist` and `whitelist` in module configuration were deprecated in `v0.19.0` and are
removed in this release. Use `exclude` and `include` instead.
* The `ignore=` parameter of the `extends()` directive has been removed. It was not used by
any builtin packages and is no longer needed to avoid conflicts in environment views (#35588).
* Support for the old YAML buildcache format has been removed. It was deprecated in `v0.19.0` (#34347).
* `spack find --bootstrap` has been removed. It was deprecated in `v0.19.0`. Use `spack
--bootstrap find` instead (#33964).
* `spack bootstrap trust` and `spack bootstrap untrust` are now removed, having been
deprecated in `v0.19.0`. Use `spack bootstrap enable` and `spack bootstrap disable`.
* The `--mirror-name`, `--mirror-url`, and `--directory` options to buildcache and
mirror commands were deprecated in `v0.19.0` and have now been removed. They have been
replaced by positional arguments (#37457).
* Deprecate `env:` as top level environment key (#37424)
* deprecate buildcache create --rel, buildcache install --allow-root (#37285)
* Support for very old perl-like spec format strings (e.g., `$_$@$%@+$+$=`) has been
removed (#37425). This was deprecated in `v0.15` (#10556).
## Notable Bugfixes
* bugfix: don't fetch package metadata for unknown concrete specs (#36990)
* Improve package source code context display on error (#37655)
* Relax environment manifest filename requirements and lockfile identification criteria (#37413)
* `installer.py`: drop build edges of installed packages by default (#36707)
* Bugfix: package requirements with git commits (#35057, #36347)
* Package requirements: allow single specs in requirement lists (#36258)
* conditional variant values: allow boolean (#33939)
* spack uninstall: follow run/link edges on --dependents (#34058)
## Spack community stats
* 7,179 total packages, 499 new since `v0.19.0`
* 329 new Python packages
* 31 new R packages
* 336 people contributed to this release
* 317 committers to packages
* 62 committers to core
# v0.19.1 (2023-02-07)
### Spack Bugfixes

View File

@ -0,0 +1,16 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
# The name of the Pygments (syntax highlighting) style to use.
# We use our own extension of the default style with a few modifications
from pygments.styles.default import DefaultStyle
from pygments.token import Generic
class SpackStyle(DefaultStyle):
    styles = DefaultStyle.styles.copy()
    background_color = "#f4f4f8"
    styles[Generic.Output] = "#355"
    styles[Generic.Prompt] = "bold #346ec9"

View File

@ -149,7 +149,6 @@ graphviz_dot_args = [
# Get nice vector graphics
graphviz_output_format = "svg"
# Add any paths that contain templates here, relative to this directory.
templates_path = ["_templates"]
@ -233,30 +232,8 @@ nitpick_ignore = [
# If true, sectionauthor and moduleauthor directives will be shown in the
# output. They are ignored by default.
# show_authors = False
# The name of the Pygments (syntax highlighting) style to use.
# We use our own extension of the default style with a few modifications
from pygments.style import Style
from pygments.styles.default import DefaultStyle
from pygments.token import Comment, Generic, Text
class SpackStyle(DefaultStyle):
    styles = DefaultStyle.styles.copy()
    background_color = "#f4f4f8"
    styles[Generic.Output] = "#355"
    styles[Generic.Prompt] = "bold #346ec9"
import pkg_resources
dist = pkg_resources.Distribution(__file__)
sys.path.append(".") # make 'conf' module findable
ep = pkg_resources.EntryPoint.parse("spack = conf:SpackStyle", dist=dist)
dist._ep_map = {"pygments.styles": {"plugin1": ep}}
pkg_resources.working_set.add(dist)
pygments_style = "spack"
sys.path.append("./_pygments")
pygments_style = "style.SpackStyle"
# A list of ignored prefixes for module index sorting.
# modindex_common_prefix = []
@ -341,16 +318,15 @@ html_last_updated_fmt = "%b %d, %Y"
# Output file base name for HTML help builder.
htmlhelp_basename = "Spackdoc"
# -- Options for LaTeX output --------------------------------------------------
latex_elements = {
# The paper size ('letterpaper' or 'a4paper').
#'papersize': 'letterpaper',
# 'papersize': 'letterpaper',
# The font size ('10pt', '11pt' or '12pt').
#'pointsize': '10pt',
# 'pointsize': '10pt',
# Additional stuff for the LaTeX preamble.
#'preamble': '',
# 'preamble': '',
}
# Grouping the document tree into LaTeX files. List of tuples

View File

@ -616,7 +616,7 @@ to customize the generation of container recipes:
- No
* - ``os_packages:command``
- Tool used to manage system packages
- ``apt``, ``yum``, ``zypper``, ``apk``, ``yum_amazon``
- ``apt``, ``yum``, ``dnf``, ``dnf_epel``, ``zypper``, ``apk``, ``yum_amazon``
- Only with custom base images
* - ``os_packages:update``
- Whether or not to update the list of available packages

View File

@ -916,9 +916,9 @@ function, as shown in the example below:
.. code-block:: yaml
projections:
zlib: {name}-{version}
^mpi: {name}-{version}/{^mpi.name}-{^mpi.version}-{compiler.name}-{compiler.version}
all: {name}-{version}/{compiler.name}-{compiler.version}
zlib: "{name}-{version}"
^mpi: "{name}-{version}/{^mpi.name}-{^mpi.version}-{compiler.name}-{compiler.version}"
all: "{name}-{version}/{compiler.name}-{compiler.version}"
The entries in the projections configuration file must all be either
specs or the keyword ``all``. For each spec, the projection used will

View File

@ -1,13 +1,8 @@
# These dependencies should be installed using pip in order
# to build the documentation.
sphinx>=3.4,!=4.1.2,!=5.1.0
sphinxcontrib-programoutput
sphinx-design
sphinx-rtd-theme
python-levenshtein
# Restrict to docutils <0.17 to workaround a list rendering issue in sphinx.
# https://stackoverflow.com/questions/67542699
docutils <0.17
pygments <2.13
urllib3 <2
sphinx==6.2.1
sphinxcontrib-programoutput==0.17
sphinx_design==0.4.1
sphinx-rtd-theme==1.2.1
python-levenshtein==0.21.0
docutils==0.18.1
pygments==2.15.1
urllib3==2.0.2

View File

@ -18,7 +18,7 @@ archspec
* Homepage: https://pypi.python.org/pypi/archspec
* Usage: Labeling, comparison and detection of microarchitectures
* Version: 0.2.1 (commit 4b1f21802a23b536bbcce73d3c631a566b20e8bd)
* Version: 0.2.1 (commit 9e1117bd8a2f0581bced161f2a2e8d6294d0300b)
astunparse
----------------

View File

@ -2803,7 +2803,7 @@
"flags" : "-march=armv8.2-a+fp16+dotprod+crypto -mtune=cortex-a72"
},
{
"versions": "10.2",
"versions": "10.2:10.2.99",
"flags" : "-mcpu=zeus"
},
{

View File

@ -139,7 +139,7 @@ class OpenFileTracker(object):
def release_by_stat(self, stat):
key = (stat.st_dev, stat.st_ino, os.getpid())
open_file = self._descriptors.get(key)
assert open_file, "Attempted to close non-existing inode: %s" % stat.st_inode
assert open_file, "Attempted to close non-existing inode: %s" % stat.st_ino
open_file.refs -= 1
if not open_file.refs:

View File

@ -4,7 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
#: PEP440 canonical <major>.<minor>.<micro>.<devN> string
__version__ = "0.20.0.dev0"
__version__ = "0.20.2"
spack_version = __version__

View File

@ -1028,7 +1028,7 @@ def get_cmake_prefix_path(pkg):
def _setup_pkg_and_run(
serialized_pkg, function, kwargs, child_pipe, input_multiprocess_fd, jsfd1, jsfd2
serialized_pkg, function, kwargs, write_pipe, input_multiprocess_fd, jsfd1, jsfd2
):
context = kwargs.get("context", "build")
@ -1049,12 +1049,12 @@ def _setup_pkg_and_run(
pkg, dirty=kwargs.get("dirty", False), context=context
)
return_value = function(pkg, kwargs)
child_pipe.send(return_value)
write_pipe.send(return_value)
except StopPhase as e:
# Do not create a full ChildError from this, it's not an error
# it's a control statement.
child_pipe.send(e)
write_pipe.send(e)
except BaseException:
# catch ANYTHING that goes wrong in the child process
exc_type, exc, tb = sys.exc_info()
@ -1103,10 +1103,10 @@ def _setup_pkg_and_run(
context,
package_context,
)
child_pipe.send(ce)
write_pipe.send(ce)
finally:
child_pipe.close()
write_pipe.close()
if input_multiprocess_fd is not None:
input_multiprocess_fd.close()
@ -1150,7 +1150,7 @@ def start_build_process(pkg, function, kwargs):
For more information on `multiprocessing` child process creation
mechanisms, see https://docs.python.org/3/library/multiprocessing.html#contexts-and-start-methods
"""
parent_pipe, child_pipe = multiprocessing.Pipe()
read_pipe, write_pipe = multiprocessing.Pipe(duplex=False)
input_multiprocess_fd = None
jobserver_fd1 = None
jobserver_fd2 = None
@ -1175,7 +1175,7 @@ def start_build_process(pkg, function, kwargs):
serialized_pkg,
function,
kwargs,
child_pipe,
write_pipe,
input_multiprocess_fd,
jobserver_fd1,
jobserver_fd2,
@ -1184,6 +1184,12 @@ def start_build_process(pkg, function, kwargs):
p.start()
# We close the writable end of the pipe now to be sure that p is the
# only process which owns a handle for it. This ensures that when p
# closes its handle for the writable end, read_pipe.recv() will
# promptly report the readable end as being ready.
write_pipe.close()
except InstallError as e:
e.pkg = pkg
raise
@ -1193,7 +1199,16 @@ def start_build_process(pkg, function, kwargs):
if input_multiprocess_fd is not None:
input_multiprocess_fd.close()
child_result = parent_pipe.recv()
def exitcode_msg(p):
typ = "exit" if p.exitcode >= 0 else "signal"
return f"{typ} {abs(p.exitcode)}"
try:
child_result = read_pipe.recv()
except EOFError:
p.join()
raise InstallError(f"The process has stopped unexpectedly ({exitcode_msg(p)})")
p.join()
# If returns a StopPhase, raise it
@ -1213,9 +1228,16 @@ def start_build_process(pkg, function, kwargs):
child_result.print_context()
raise child_result
# Fallback. Usually caught beforehand in EOFError above.
if p.exitcode != 0:
raise InstallError(f"The process failed unexpectedly ({exitcode_msg(p)})")
return child_result
CONTEXT_BASES = (spack.package_base.PackageBase, spack.build_systems._checks.BaseBuilder)
def get_package_context(traceback, context=3):
"""Return some context for an error message when the build fails.
@ -1244,32 +1266,37 @@ def get_package_context(traceback, context=3):
stack = make_stack(traceback)
basenames = tuple(base.__name__ for base in CONTEXT_BASES)
for tb in stack:
frame = tb.tb_frame
if "self" in frame.f_locals:
# Find the first proper subclass of PackageBase.
# Find the first proper subclass of the PackageBase or BaseBuilder, but
# don't provide context if the code is actually in the base classes.
obj = frame.f_locals["self"]
if isinstance(obj, spack.package_base.PackageBase):
break
func = getattr(obj, tb.tb_frame.f_code.co_name, "")
if func:
typename, *_ = func.__qualname__.partition(".")
if isinstance(obj, CONTEXT_BASES) and typename not in basenames:
break
else:
return None
# We found obj, the Package implementation we care about.
# Point out the location in the install method where we failed.
lines = [
"{0}:{1:d}, in {2}:".format(
inspect.getfile(frame.f_code),
frame.f_lineno - 1, # subtract 1 because f_lineno is 0-indexed
frame.f_code.co_name,
)
]
filename = inspect.getfile(frame.f_code)
lineno = frame.f_lineno
if os.path.basename(filename) == "package.py":
# subtract 1 because we inject a magic import at the top of package files.
# TODO: get rid of the magic import.
lineno -= 1
lines = ["{0}:{1:d}, in {2}:".format(filename, lineno, frame.f_code.co_name)]
# Build a message showing context in the install method.
sourcelines, start = inspect.getsourcelines(frame)
# Calculate lineno of the error relative to the start of the function.
# Subtract 1 because f_lineno is 0-indexed.
fun_lineno = frame.f_lineno - start - 1
fun_lineno = lineno - start
start_ctx = max(0, fun_lineno - context)
sourcelines = sourcelines[start_ctx : fun_lineno + context + 1]
@ -1365,7 +1392,7 @@ class ChildError(InstallError):
test_log = join_path(os.path.dirname(self.log_name), spack_install_test_log)
if os.path.isfile(test_log):
out.write("\nSee test log for details:\n")
out.write(" {0}n".format(test_log))
out.write(" {0}\n".format(test_log))
return out.getvalue()

View File

@ -180,51 +180,6 @@ class PythonExtension(spack.package_base.PackageBase):
work_dir="spack-test",
)
class PythonPackage(PythonExtension):
"""Specialized class for packages that are built using pip."""
#: Package name, version, and extension on PyPI
pypi: Optional[str] = None
# To be used in UI queries that require to know which
# build-system class we are using
build_system_class = "PythonPackage"
#: Legacy buildsystem attribute used to deserialize and install old specs
legacy_buildsystem = "python_pip"
#: Callback names for install-time test
install_time_test_callbacks = ["test"]
build_system("python_pip")
with spack.multimethod.when("build_system=python_pip"):
extends("python")
depends_on("py-pip", type="build")
# FIXME: technically wheel is only needed when building from source, not when
# installing a downloaded wheel, but I don't want to add wheel as a dep to every
# package manually
depends_on("py-wheel", type="build")
py_namespace: Optional[str] = None
@lang.classproperty
def homepage(cls):
if cls.pypi:
name = cls.pypi.split("/")[0]
return "https://pypi.org/project/" + name + "/"
@lang.classproperty
def url(cls):
if cls.pypi:
return "https://files.pythonhosted.org/packages/source/" + cls.pypi[0] + "/" + cls.pypi
@lang.classproperty
def list_url(cls):
if cls.pypi:
name = cls.pypi.split("/")[0]
return "https://pypi.org/simple/" + name + "/"
def update_external_dependencies(self, extendee_spec=None):
"""
Ensure all external python packages have a python dependency
@ -270,6 +225,51 @@ class PythonPackage(PythonExtension):
python._mark_concrete()
self.spec.add_dependency_edge(python, deptypes=("build", "link", "run"))
class PythonPackage(PythonExtension):
"""Specialized class for packages that are built using pip."""
#: Package name, version, and extension on PyPI
pypi: Optional[str] = None
# To be used in UI queries that require to know which
# build-system class we are using
build_system_class = "PythonPackage"
#: Legacy buildsystem attribute used to deserialize and install old specs
legacy_buildsystem = "python_pip"
#: Callback names for install-time test
install_time_test_callbacks = ["test"]
build_system("python_pip")
with spack.multimethod.when("build_system=python_pip"):
extends("python")
depends_on("py-pip", type="build")
# FIXME: technically wheel is only needed when building from source, not when
# installing a downloaded wheel, but I don't want to add wheel as a dep to every
# package manually
depends_on("py-wheel", type="build")
py_namespace: Optional[str] = None
@lang.classproperty
def homepage(cls):
if cls.pypi:
name = cls.pypi.split("/")[0]
return "https://pypi.org/project/" + name + "/"
@lang.classproperty
def url(cls):
if cls.pypi:
return "https://files.pythonhosted.org/packages/source/" + cls.pypi[0] + "/" + cls.pypi
@lang.classproperty
def list_url(cls):
if cls.pypi:
name = cls.pypi.split("/")[0]
return "https://pypi.org/simple/" + name + "/"
def get_external_python_for_prefix(self):
"""
For an external package that extends python, find the most likely spec for the python

View File

@ -756,6 +756,7 @@ def generate_gitlab_ci_yaml(
# Get the joined "ci" config with all of the current scopes resolved
ci_config = cfg.get("ci")
config_deprecated = False
if not ci_config:
tty.warn("Environment does not have `ci` a configuration")
gitlabci_config = yaml_root.get("gitlab-ci")
@ -768,6 +769,7 @@ def generate_gitlab_ci_yaml(
)
translate_deprecated_config(gitlabci_config)
ci_config = gitlabci_config
config_deprecated = True
# Default target is gitlab...and only target is gitlab
if not ci_config.get("target", "gitlab") == "gitlab":
@ -831,6 +833,14 @@ def generate_gitlab_ci_yaml(
# Values: "spack_pull_request", "spack_protected_branch", or not set
spack_pipeline_type = os.environ.get("SPACK_PIPELINE_TYPE", None)
copy_only_pipeline = spack_pipeline_type == "spack_copy_only"
if copy_only_pipeline and config_deprecated:
tty.warn(
"SPACK_PIPELINE_TYPE=spack_copy_only is not supported when using\n",
"deprecated ci configuration, a no-op pipeline will be generated\n",
"instead.",
)
if "mirrors" not in yaml_root or len(yaml_root["mirrors"].values()) < 1:
tty.die("spack ci generate requires an env containing a mirror")
@ -1207,7 +1217,7 @@ def generate_gitlab_ci_yaml(
).format(c_spec, release_spec)
tty.debug(debug_msg)
if prune_dag and not rebuild_spec and spack_pipeline_type != "spack_copy_only":
if prune_dag and not rebuild_spec and not copy_only_pipeline:
tty.debug(
"Pruning {0}/{1}, does not need rebuild.".format(
release_spec.name, release_spec.dag_hash()
@ -1298,7 +1308,7 @@ def generate_gitlab_ci_yaml(
max_length_needs = length_needs
max_needs_job = job_name
if spack_pipeline_type != "spack_copy_only":
if not copy_only_pipeline:
output_object[job_name] = job_object
job_id += 1
@ -1330,7 +1340,7 @@ def generate_gitlab_ci_yaml(
"when": ["runner_system_failure", "stuck_or_timeout_failure", "script_failure"],
}
if spack_pipeline_type == "spack_copy_only":
if copy_only_pipeline and not config_deprecated:
stage_names.append("copy")
sync_job = copy.deepcopy(spack_ci_ir["jobs"]["copy"]["attributes"])
sync_job["stage"] = "copy"
@ -1474,12 +1484,18 @@ def generate_gitlab_ci_yaml(
sorted_output = cinw.needs_to_dependencies(sorted_output)
else:
# No jobs were generated
tty.debug("No specs to rebuild, generating no-op job")
noop_job = spack_ci_ir["jobs"]["noop"]["attributes"]
noop_job["retry"] = service_job_retries
sorted_output = {"no-specs-to-rebuild": noop_job}
if copy_only_pipeline and config_deprecated:
tty.debug("Generating no-op job as copy-only is unsupported here.")
noop_job["script"] = [
'echo "copy-only pipelines are not supported with deprecated ci configs"'
]
sorted_output = {"unsupported-copy": noop_job}
else:
tty.debug("No specs to rebuild, generating no-op job")
sorted_output = {"no-specs-to-rebuild": noop_job}
if known_broken_specs_encountered:
tty.error("This pipeline generated hashes known to be broken on develop:")

View File

@ -347,7 +347,7 @@ def iter_groups(specs, indent, all_headers):
spack.spec.architecture_color,
architecture if architecture else "no arch",
spack.spec.compiler_color,
f"{compiler.name}@{compiler.version}" if compiler else "no compiler",
f"{compiler.display_str}" if compiler else "no compiler",
)
# Sometimes we want to display specs that are not yet concretized.

View File

@ -53,7 +53,7 @@ def setup_parser(subparser):
"--scope",
choices=scopes,
metavar=scopes_metavar,
default=spack.config.default_modify_scope("compilers"),
default=None,
help="configuration scope to modify",
)
@ -98,7 +98,7 @@ def compiler_find(args):
config = spack.config.config
filename = config.get_config_filename(args.scope, "compilers")
tty.msg("Added %d new compiler%s to %s" % (n, s, filename))
colify(reversed(sorted(c.spec for c in new_compilers)), indent=4)
colify(reversed(sorted(c.spec.display_str for c in new_compilers)), indent=4)
else:
tty.msg("Found no new compilers")
tty.msg("Compilers are defined in the following files:")
@ -106,19 +106,21 @@ def compiler_find(args):
def compiler_remove(args):
cspec = spack.spec.CompilerSpec(args.compiler_spec)
compilers = spack.compilers.compilers_for_spec(cspec, scope=args.scope)
if not compilers:
tty.die("No compilers match spec %s" % cspec)
elif not args.all and len(compilers) > 1:
tty.error("Multiple compilers match spec %s. Choose one:" % cspec)
colify(reversed(sorted([c.spec for c in compilers])), indent=4)
compiler_spec = spack.spec.CompilerSpec(args.compiler_spec)
candidate_compilers = spack.compilers.compilers_for_spec(compiler_spec, scope=args.scope)
if not candidate_compilers:
tty.die("No compilers match spec %s" % compiler_spec)
if not args.all and len(candidate_compilers) > 1:
tty.error(f"Multiple compilers match spec {compiler_spec}. Choose one:")
colify(reversed(sorted([c.spec.display_str for c in candidate_compilers])), indent=4)
tty.msg("Or, use `spack compiler remove -a` to remove all of them.")
sys.exit(1)
for compiler in compilers:
spack.compilers.remove_compiler_from_config(compiler.spec, scope=args.scope)
tty.msg("Removed compiler %s" % compiler.spec)
for current_compiler in candidate_compilers:
spack.compilers.remove_compiler_from_config(current_compiler.spec, scope=args.scope)
tty.msg(f"{current_compiler.spec.display_str} has been removed")
def compiler_info(args):
@ -130,7 +132,7 @@ def compiler_info(args):
tty.die("No compilers match spec %s" % cspec)
else:
for c in compilers:
print(str(c.spec) + ":")
print(c.spec.display_str + ":")
print("\tpaths:")
for cpath in ["cc", "cxx", "f77", "fc"]:
print("\t\t%s = %s" % (cpath, getattr(c, cpath, None)))
@ -188,7 +190,7 @@ def compiler_list(args):
os_str += "-%s" % target
cname = "%s{%s} %s" % (spack.spec.compiler_color, name, os_str)
tty.hline(colorize(cname), char="-")
colify(reversed(sorted(c.spec for c in compilers)))
colify(reversed(sorted(c.spec.display_str for c in compilers)))
def compiler(parser, args):

View File

@ -302,7 +302,7 @@ def env_create(args):
# the environment should not include a view.
with_view = None
_env_create(
env = _env_create(
args.create_env,
init_file=args.envfile,
dir=args.dir,
@ -310,6 +310,9 @@ def env_create(args):
keep_relative=args.keep_relative,
)
# Generate views, only really useful for environments created from spack.lock files.
env.regenerate_views()
def _env_create(name_or_path, *, init_file=None, dir=False, with_view=None, keep_relative=False):
"""Create a new environment, with an optional yaml description.

View File

@ -79,6 +79,12 @@ def setup_parser(subparser):
read_cray_manifest.add_argument(
"--directory", default=None, help="specify a directory storing a group of manifest files"
)
read_cray_manifest.add_argument(
"--ignore-default-dir",
action="store_true",
default=False,
help="ignore the default directory of manifest files",
)
read_cray_manifest.add_argument(
"--dry-run",
action="store_true",
@ -177,11 +183,16 @@ def external_read_cray_manifest(args):
manifest_directory=args.directory,
dry_run=args.dry_run,
fail_on_error=args.fail_on_error,
ignore_default_dir=args.ignore_default_dir,
)
def _collect_and_consume_cray_manifest_files(
manifest_file=None, manifest_directory=None, dry_run=False, fail_on_error=False
manifest_file=None,
manifest_directory=None,
dry_run=False,
fail_on_error=False,
ignore_default_dir=False,
):
manifest_files = []
if manifest_file:
@ -191,7 +202,7 @@ def _collect_and_consume_cray_manifest_files(
if manifest_directory:
manifest_dirs.append(manifest_directory)
if os.path.isdir(cray_manifest.default_path):
if not ignore_default_dir and os.path.isdir(cray_manifest.default_path):
tty.debug(
"Cray manifest path {0} exists: collecting all files to read.".format(
cray_manifest.default_path

View File

@ -116,21 +116,23 @@ def one_spec_or_raise(specs):
def check_module_set_name(name):
modules_config = spack.config.get("modules")
valid_names = set(
[
key
for key, value in modules_config.items()
if isinstance(value, dict) and value.get("enable", [])
]
)
if "enable" in modules_config and modules_config["enable"]:
valid_names.add("default")
modules = spack.config.get("modules")
if name != "prefix_inspections" and name in modules:
return
if name not in valid_names:
msg = "Cannot use invalid module set %s." % name
msg += " Valid module set names are %s" % list(valid_names)
raise spack.config.ConfigError(msg)
names = [k for k in modules if k != "prefix_inspections"]
if not names:
raise spack.config.ConfigError(
f"Module set configuration is missing. Cannot use module set '{name}'"
)
pretty_names = "', '".join(names)
raise spack.config.ConfigError(
f"Cannot use invalid module set '{name}'.",
f"Valid module set names are: '{pretty_names}'.",
)
_missing_modules_warning = (

View File

@ -25,7 +25,7 @@ level = "long"
# tutorial configuration parameters
tutorial_branch = "releases/v0.19"
tutorial_branch = "releases/v0.20"
tutorial_mirror = "file:///mirror"
tutorial_key = os.path.join(spack.paths.share_path, "keys", "tutorial.pub")

View File

@ -37,7 +37,6 @@ _other_instance_vars = [
"implicit_rpaths",
"extra_rpaths",
]
_cache_config_file = []
# TODO: Caches at module level make it difficult to mock configurations in
# TODO: unit tests. It might be worth reworking their implementation.
@ -112,36 +111,26 @@ def _to_dict(compiler):
def get_compiler_config(scope=None, init_config=True):
"""Return the compiler configuration for the specified architecture."""
def init_compiler_config():
"""Compiler search used when Spack has no compilers."""
compilers = find_compilers()
compilers_dict = []
for compiler in compilers:
compilers_dict.append(_to_dict(compiler))
spack.config.set("compilers", compilers_dict, scope=scope)
config = spack.config.get("compilers", scope=scope) or []
if config or not init_config:
return config
merged_config = spack.config.get("compilers")
if merged_config:
return config
_init_compiler_config(scope=scope)
config = spack.config.get("compilers", scope=scope)
# Update the configuration if there are currently no compilers
# configured. Avoid updating automatically if there ARE site
# compilers configured but no user ones.
if not config and init_config:
if scope is None:
# We know no compilers were configured in any scope.
init_compiler_config()
config = spack.config.get("compilers", scope=scope)
elif scope == "user":
# Check the site config and update the user config if
# nothing is configured at the site level.
site_config = spack.config.get("compilers", scope="site")
sys_config = spack.config.get("compilers", scope="system")
if not site_config and not sys_config:
init_compiler_config()
config = spack.config.get("compilers", scope=scope)
return config
elif config:
return config
else:
return [] # Return empty list which we will later append to.
return config
def _init_compiler_config(*, scope):
"""Compiler search used when Spack has no compilers."""
compilers = find_compilers()
compilers_dict = []
for compiler in compilers:
compilers_dict.append(_to_dict(compiler))
spack.config.set("compilers", compilers_dict, scope=scope)
def compiler_config_files():
@ -165,52 +154,65 @@ def add_compilers_to_config(compilers, scope=None, init_config=True):
compiler_config = get_compiler_config(scope, init_config)
for compiler in compilers:
compiler_config.append(_to_dict(compiler))
global _cache_config_file
_cache_config_file = compiler_config
spack.config.set("compilers", compiler_config, scope=scope)
@_auto_compiler_spec
def remove_compiler_from_config(compiler_spec, scope=None):
"""Remove compilers from the config, by spec.
"""Remove compilers from configuration by spec.
If scope is None, all the scopes are searched for removal.
Arguments:
compiler_specs: a list of CompilerSpec objects.
scope: configuration scope to modify.
compiler_spec: compiler to be removed
scope: configuration scope to modify
"""
# Need a better way for this
global _cache_config_file
candidate_scopes = [scope]
if scope is None:
candidate_scopes = spack.config.config.scopes.keys()
removal_happened = False
for current_scope in candidate_scopes:
removal_happened |= _remove_compiler_from_scope(compiler_spec, scope=current_scope)
return removal_happened
def _remove_compiler_from_scope(compiler_spec, scope):
"""Removes a compiler from a specific configuration scope.
Args:
compiler_spec: compiler to be removed
scope: configuration scope under consideration
Returns:
True if one or more compiler entries were actually removed, False otherwise
"""
assert scope is not None, "a specific scope is needed when calling this function"
compiler_config = get_compiler_config(scope)
config_length = len(compiler_config)
filtered_compiler_config = [
comp
for comp in compiler_config
compiler_entry
for compiler_entry in compiler_config
if not spack.spec.parse_with_version_concrete(
comp["compiler"]["spec"], compiler=True
compiler_entry["compiler"]["spec"], compiler=True
).satisfies(compiler_spec)
]
# Update the cache for changes
_cache_config_file = filtered_compiler_config
if len(filtered_compiler_config) == config_length: # No items removed
CompilerSpecInsufficientlySpecificError(compiler_spec)
spack.config.set("compilers", filtered_compiler_config, scope=scope)
if len(filtered_compiler_config) == len(compiler_config):
return False
# We need to preserve the YAML type for comments, hence we are copying the
# items in the list that has just been retrieved
compiler_config[:] = filtered_compiler_config
spack.config.set("compilers", compiler_config, scope=scope)
return True
def all_compilers_config(scope=None, init_config=True):
"""Return a set of specs for all the compiler versions currently
available to build with. These are instances of CompilerSpec.
"""
# Get compilers for this architecture.
# Create a cache of the config file so we don't load all the time.
global _cache_config_file
if not _cache_config_file:
_cache_config_file = get_compiler_config(scope, init_config)
return _cache_config_file
else:
return _cache_config_file
return get_compiler_config(scope, init_config)
def all_compiler_specs(scope=None, init_config=True):

View File

@ -30,7 +30,7 @@ fortran_mapping = {
def get_valid_fortran_pth(comp_ver):
cl_ver = str(comp_ver).split("@")[1]
cl_ver = str(comp_ver)
sort_fn = lambda fc_ver: StrictVersion(fc_ver)
sort_fc_ver = sorted(list(avail_fc_version), key=sort_fn)
for ver in sort_fc_ver:
@ -75,7 +75,7 @@ class Msvc(Compiler):
# file based on compiler executable path.
def __init__(self, *args, **kwargs):
new_pth = [pth if pth else get_valid_fortran_pth(args[0]) for pth in args[3]]
new_pth = [pth if pth else get_valid_fortran_pth(args[0].version) for pth in args[3]]
args[3][:] = new_pth
super(Msvc, self).__init__(*args, **kwargs)
if os.getenv("ONEAPI_ROOT"):

View File

@ -1353,17 +1353,11 @@ def use_configuration(*scopes_or_paths):
configuration = _config_from(scopes_or_paths)
config.clear_caches(), configuration.clear_caches()
# Save and clear the current compiler cache
saved_compiler_cache = spack.compilers._cache_config_file
spack.compilers._cache_config_file = []
saved_config, config = config, configuration
try:
yield configuration
finally:
# Restore previous config files
spack.compilers._cache_config_file = saved_compiler_cache
config = saved_config

View File

@ -17,7 +17,7 @@
"template": "container/fedora_38.dockerfile",
"image": "docker.io/fedora:38"
},
"os_package_manager": "yum",
"os_package_manager": "dnf",
"build": "spack/fedora38",
"build_tags": {
"develop": "latest"
@ -31,7 +31,7 @@
"template": "container/fedora_37.dockerfile",
"image": "docker.io/fedora:37"
},
"os_package_manager": "yum",
"os_package_manager": "dnf",
"build": "spack/fedora37",
"build_tags": {
"develop": "latest"
@ -45,7 +45,7 @@
"template": "container/rockylinux_9.dockerfile",
"image": "docker.io/rockylinux:9"
},
"os_package_manager": "yum",
"os_package_manager": "dnf_epel",
"build": "spack/rockylinux9",
"build_tags": {
"develop": "latest"
@ -59,7 +59,7 @@
"template": "container/rockylinux_8.dockerfile",
"image": "docker.io/rockylinux:8"
},
"os_package_manager": "yum",
"os_package_manager": "dnf_epel",
"build": "spack/rockylinux8",
"build_tags": {
"develop": "latest"
@ -73,7 +73,7 @@
"template": "container/almalinux_9.dockerfile",
"image": "quay.io/almalinux/almalinux:9"
},
"os_package_manager": "yum",
"os_package_manager": "dnf_epel",
"build": "spack/almalinux9",
"build_tags": {
"develop": "latest"
@ -87,7 +87,7 @@
"template": "container/almalinux_8.dockerfile",
"image": "quay.io/almalinux/almalinux:8"
},
"os_package_manager": "yum",
"os_package_manager": "dnf_epel",
"build": "spack/almalinux8",
"build_tags": {
"develop": "latest"
@ -101,7 +101,7 @@
"template": "container/centos_stream.dockerfile",
"image": "quay.io/centos/centos:stream"
},
"os_package_manager": "yum",
"os_package_manager": "dnf_epel",
"build": "spack/centos-stream",
"final": {
"image": "quay.io/centos/centos:stream"
@ -185,6 +185,16 @@
"install": "apt-get -yqq install",
"clean": "rm -rf /var/lib/apt/lists/*"
},
"dnf": {
"update": "dnf update -y",
"install": "dnf install -y",
"clean": "rm -rf /var/cache/dnf && dnf clean all"
},
"dnf_epel": {
"update": "dnf update -y && dnf install -y epel-release && dnf update -y",
"install": "dnf install -y",
"clean": "rm -rf /var/cache/dnf && dnf clean all"
},
"yum": {
"update": "yum update -y && yum install -y epel-release && yum update -y",
"install": "yum install -y",

View File

@ -48,7 +48,8 @@ def translated_compiler_name(manifest_compiler_name):
def compiler_from_entry(entry):
compiler_name = translated_compiler_name(entry["name"])
paths = entry["executables"]
version = entry["version"]
# to instantiate a compiler class we may need a concrete version:
version = "={}".format(entry["version"])
arch = entry["arch"]
operating_system = arch["os"]
target = arch["target"]

View File

@ -112,10 +112,15 @@ def path_to_dict(search_paths):
# Reverse order of search directories so that a lib in the first
# entry overrides later entries
for search_path in reversed(search_paths):
for lib in os.listdir(search_path):
lib_path = os.path.join(search_path, lib)
if llnl.util.filesystem.is_readable_file(lib_path):
path_to_lib[lib_path] = lib
try:
for lib in os.listdir(search_path):
lib_path = os.path.join(search_path, lib)
if llnl.util.filesystem.is_readable_file(lib_path):
path_to_lib[lib_path] = lib
except OSError as e:
msg = f"cannot scan '{search_path}' for external software: {str(e)}"
llnl.util.tty.debug(msg)
return path_to_lib

View File

@ -39,7 +39,6 @@ import spack.spec
import spack.stage
import spack.store
import spack.subprocess_context
import spack.traverse
import spack.user_environment as uenv
import spack.util.cpus
import spack.util.environment
@ -51,6 +50,7 @@ import spack.util.spack_json as sjson
import spack.util.spack_yaml as syaml
import spack.util.url
import spack.version
from spack import traverse
from spack.filesystem_view import SimpleFilesystemView, inverse_view_func_parser, view_func_parser
from spack.installer import PackageInstaller
from spack.spec import Spec
@ -437,32 +437,6 @@ def _is_dev_spec_and_has_changed(spec):
return mtime > record.installation_time
def _spec_needs_overwrite(spec, changed_dev_specs):
"""Check whether the current spec needs to be overwritten because either it has
changed itself or one of its dependencies have changed
"""
# if it's not installed, we don't need to overwrite it
if not spec.installed:
return False
# If the spec itself has changed this is a trivial decision
if spec in changed_dev_specs:
return True
# if spec and all deps aren't dev builds, we don't need to overwrite it
if not any(spec.satisfies(c) for c in ("dev_path=*", "^dev_path=*")):
return False
# If any dep needs overwrite, or any dep is missing and is a dev build then
# overwrite this package
if any(
((not dep.installed) and dep.satisfies("dev_path=*"))
or _spec_needs_overwrite(dep, changed_dev_specs)
for dep in spec.traverse(root=False)
):
return True
def _error_on_nonempty_view_dir(new_root):
"""Defensively error when the target view path already exists and is not an
empty directory. This usually happens when the view symlink was removed, but
@ -647,18 +621,16 @@ class ViewDescriptor:
From the list of concretized user specs in the environment, flatten
the dags, and filter selected, installed specs, remove duplicates on dag hash.
"""
dag_hash = lambda spec: spec.dag_hash()
# With deps, requires traversal
if self.link == "all" or self.link == "run":
deptype = ("run") if self.link == "run" else ("link", "run")
specs = list(
spack.traverse.traverse_nodes(
concretized_root_specs, deptype=deptype, key=dag_hash
traverse.traverse_nodes(
concretized_root_specs, deptype=deptype, key=traverse.by_dag_hash
)
)
else:
specs = list(dedupe(concretized_root_specs, key=dag_hash))
specs = list(dedupe(concretized_root_specs, key=traverse.by_dag_hash))
# Filter selected, installed specs
with spack.store.db.read_transaction():
@ -1221,28 +1193,27 @@ class Environment:
old_specs = set(self.user_specs)
new_specs = set()
for spec in matches:
if spec in list_to_change:
try:
list_to_change.remove(spec)
self.update_stale_references(list_name)
new_specs = set(self.user_specs)
except spack.spec_list.SpecListError:
# define new specs list
new_specs = set(self.user_specs)
msg = f"Spec '{spec}' is part of a spec matrix and "
msg += f"cannot be removed from list '{list_to_change}'."
if force:
msg += " It will be removed from the concrete specs."
# Mock new specs, so we can remove this spec from concrete spec lists
new_specs.remove(spec)
tty.warn(msg)
if spec not in list_to_change:
continue
try:
list_to_change.remove(spec)
self.update_stale_references(list_name)
new_specs = set(self.user_specs)
except spack.spec_list.SpecListError:
# define new specs list
new_specs = set(self.user_specs)
msg = f"Spec '{spec}' is part of a spec matrix and "
msg += f"cannot be removed from list '{list_to_change}'."
if force:
msg += " It will be removed from the concrete specs."
# Mock new specs, so we can remove this spec from concrete spec lists
new_specs.remove(spec)
tty.warn(msg)
else:
if list_name == user_speclist_name:
self.manifest.remove_user_spec(str(spec))
else:
if list_name == user_speclist_name:
for user_spec in matches:
self.manifest.remove_user_spec(str(user_spec))
else:
for user_spec in matches:
self.manifest.remove_definition(str(user_spec), list_name=list_name)
self.manifest.remove_definition(str(spec), list_name=list_name)
# If force, update stale concretized specs
for spec in old_specs - new_specs:
@ -1352,6 +1323,10 @@ class Environment:
self.concretized_order = []
self.specs_by_hash = {}
# Remove concrete specs that no longer correlate to a user spec
for spec in set(self.concretized_user_specs) - set(self.user_specs):
self.deconcretize(spec)
# Pick the right concretization strategy
if self.unify == "when_possible":
return self._concretize_together_where_possible(tests=tests)
@ -1365,6 +1340,16 @@ class Environment:
msg = "concretization strategy not implemented [{0}]"
raise SpackEnvironmentError(msg.format(self.unify))
def deconcretize(self, spec):
# spec has to be a root of the environment
index = self.concretized_user_specs.index(spec)
dag_hash = self.concretized_order.pop(index)
del self.concretized_user_specs[index]
# If this was the only user spec that concretized to this concrete spec, remove it
if dag_hash not in self.concretized_order:
del self.specs_by_hash[dag_hash]
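Not part of the diff: the bookkeeping in deconcretize() above, reduced to plain data. Two user specs may concretize to the same DAG hash, so the hash-to-spec table is only pruned once the last occurrence leaves concretized_order. Names and hashes below are made up.
concretized_user_specs = ["zlib", "zlib+shared"]
concretized_order = ["abc123", "abc123"]
specs_by_hash = {"abc123": "<concrete zlib spec>"}
def deconcretize(spec):
    index = concretized_user_specs.index(spec)
    dag_hash = concretized_order.pop(index)
    del concretized_user_specs[index]
    if dag_hash not in concretized_order:  # last user spec with this hash
        del specs_by_hash[dag_hash]
deconcretize("zlib")
assert "abc123" in specs_by_hash      # still referenced by "zlib+shared"
deconcretize("zlib+shared")
assert "abc123" not in specs_by_hash  # now safe to drop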
def _get_specs_to_concretize(
self,
) -> Tuple[Set[spack.spec.Spec], Set[spack.spec.Spec], List[spack.spec.Spec]]:
@ -1402,6 +1387,10 @@ class Environment:
if not new_user_specs:
return []
old_concrete_to_abstract = {
concrete: abstract for (abstract, concrete) in self.concretized_specs()
}
self.concretized_user_specs = []
self.concretized_order = []
self.specs_by_hash = {}
@ -1413,11 +1402,13 @@ class Environment:
result = []
for abstract, concrete in sorted(result_by_user_spec.items()):
# If the "abstract" spec is a concrete spec from the previous concretization
# translate it back to an abstract spec. Otherwise, keep the abstract spec
abstract = old_concrete_to_abstract.get(abstract, abstract)
if abstract in new_user_specs:
result.append((abstract, concrete))
else:
assert (abstract, concrete) in result
self._add_concrete_spec(abstract, concrete)
return result
def _concretize_together(
@ -1436,7 +1427,7 @@ class Environment:
self.specs_by_hash = {}
try:
concrete_specs = spack.concretize.concretize_specs_together(
concrete_specs: List[spack.spec.Spec] = spack.concretize.concretize_specs_together(
*specs_to_concretize, tests=tests
)
except spack.error.UnsatisfiableSpecError as e:
@ -1455,11 +1446,14 @@ class Environment:
)
raise
# zip truncates the longer list, which is exactly what we want here
concretized_specs = [x for x in zip(new_user_specs | kept_user_specs, concrete_specs)]
# set() | set() does not preserve ordering, even though sets are ordered
ordered_user_specs = list(new_user_specs) + list(kept_user_specs)
concretized_specs = [x for x in zip(ordered_user_specs, concrete_specs)]
for abstract, concrete in concretized_specs:
self._add_concrete_spec(abstract, concrete)
return concretized_specs
# zip truncates the longer list, which is exactly what we want here
return list(zip(new_user_specs, concrete_specs))
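Not part of the diff: why the union was replaced by concatenation. Python's set union carries no ordering guarantee, so zipping it against an ordered list of concrete specs could silently misalign abstract/concrete pairs; concatenating the two lists makes the pairing deterministic.
new_user_specs, kept_user_specs = {"hdf5", "zlib"}, {"cmake"}
unioned = new_user_specs | kept_user_specs              # arbitrary iteration order
ordered = list(new_user_specs) + list(kept_user_specs)  # kept specs always last
assert set(ordered) == unioned   # same members ...
assert ordered[-1] == "cmake"    # ... but only this order is predictable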
def _concretize_separately(self, tests=False):
"""Concretization strategy that concretizes separately one
@ -1802,17 +1796,29 @@ class Environment:
self.specs_by_hash[h] = concrete
def _get_overwrite_specs(self):
# Collect all specs in the environment first before checking which ones
# to rebuild to avoid checking the same specs multiple times
specs_to_check = set()
for dag_hash in self.concretized_order:
root_spec = self.specs_by_hash[dag_hash]
specs_to_check.update(root_spec.traverse(root=True))
changed_dev_specs = set(s for s in specs_to_check if _is_dev_spec_and_has_changed(s))
# Find all dev specs that were modified.
changed_dev_specs = [
s
for s in traverse.traverse_nodes(
self.concrete_roots(), order="breadth", key=traverse.by_dag_hash
)
if _is_dev_spec_and_has_changed(s)
]
# Collect their hashes, and the hashes of their installed parents.
# Notice: with order=breadth all changed dev specs are at depth 0,
# even if they occur as parents of one another.
return [
s.dag_hash() for s in specs_to_check if _spec_needs_overwrite(s, changed_dev_specs)
spec.dag_hash()
for depth, spec in traverse.traverse_nodes(
changed_dev_specs,
root=True,
order="breadth",
depth=True,
direction="parents",
key=traverse.by_dag_hash,
)
if depth == 0 or spec.installed
]
def _install_log_links(self, spec):
@ -1919,7 +1925,7 @@ class Environment:
def all_specs(self):
"""Return all specs, even those a user spec would shadow."""
roots = [self.specs_by_hash[h] for h in self.concretized_order]
specs = [s for s in spack.traverse.traverse_nodes(roots, lambda s: s.dag_hash())]
specs = [s for s in traverse.traverse_nodes(roots, key=traverse.by_dag_hash)]
specs.sort()
return specs
@ -1965,13 +1971,18 @@ class Environment:
roots *without* associated user spec"""
return [root for _, root in self.concretized_specs()]
def get_by_hash(self, dag_hash):
matches = {}
roots = [self.specs_by_hash[h] for h in self.concretized_order]
for spec in spack.traverse.traverse_nodes(roots, key=lambda s: s.dag_hash()):
def get_by_hash(self, dag_hash: str) -> List[Spec]:
# If it's not a partial hash prefix we can early exit
early_exit = len(dag_hash) == 32
matches = []
for spec in traverse.traverse_nodes(
self.concrete_roots(), key=traverse.by_dag_hash, order="breadth"
):
if spec.dag_hash().startswith(dag_hash):
matches[spec.dag_hash()] = spec
return list(matches.values())
matches.append(spec)
if early_exit:
break
return matches
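Not part of the diff: the early-exit logic above in isolation. A full-length (32-character) hash can match at most one node, so the scan stops at the first hit instead of walking the whole DAG; hash values here are made up.
FULL_HASH_LEN = 32
def find_by_hash(hashes, prefix):
    early_exit = len(prefix) == FULL_HASH_LEN
    matches = []
    for h in hashes:
        if h.startswith(prefix):
            matches.append(h)
            if early_exit:
                break
    return matches
hashes = ["a" * 32, "b" * 32]
assert find_by_hash(hashes, "a") == ["a" * 32]       # prefix search scans all
assert find_by_hash(hashes, "a" * 32) == ["a" * 32]  # full hash stops at first hit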
def get_one_by_hash(self, dag_hash):
"""Returns the single spec from the environment which matches the
@ -1983,11 +1994,14 @@ class Environment:
def all_matching_specs(self, *specs: spack.spec.Spec) -> List[Spec]:
"""Returns all concretized specs in the environment satisfying any of the input specs"""
key = lambda s: s.dag_hash()
# Look up abstract hashes ahead of time, to avoid O(n^2) traversal.
specs = [s.lookup_hash() for s in specs]
# Avoid double lookup by directly calling _satisfies.
return [
s
for s in spack.traverse.traverse_nodes(self.concrete_roots(), key=key)
if any(s.satisfies(t) for t in specs)
for s in traverse.traverse_nodes(self.concrete_roots(), key=traverse.by_dag_hash)
if any(s._satisfies(t) for t in specs)
]
@spack.repo.autospec
@ -2011,9 +2025,9 @@ class Environment:
env_root_to_user = {root.dag_hash(): user for user, root in self.concretized_specs()}
root_matches, dep_matches = [], []
for env_spec in spack.traverse.traverse_nodes(
for env_spec in traverse.traverse_nodes(
specs=[root for _, root in self.concretized_specs()],
key=lambda s: s.dag_hash(),
key=traverse.by_dag_hash,
order="breadth",
):
if not env_spec.satisfies(spec):
@ -2087,8 +2101,8 @@ class Environment:
if recurse_dependencies:
specs.extend(
spack.traverse.traverse_nodes(
specs, root=False, deptype=("link", "run"), key=lambda s: s.dag_hash()
traverse.traverse_nodes(
specs, root=False, deptype=("link", "run"), key=traverse.by_dag_hash
)
)
@ -2097,9 +2111,7 @@ class Environment:
def _to_lockfile_dict(self):
"""Create a dictionary to store a lockfile for this environment."""
concrete_specs = {}
for s in spack.traverse.traverse_nodes(
self.specs_by_hash.values(), key=lambda s: s.dag_hash()
):
for s in traverse.traverse_nodes(self.specs_by_hash.values(), key=traverse.by_dag_hash):
spec_dict = s.node_dict_with_hashes(hash=ht.dag_hash)
# Assumes no legacy formats, since this was just created.
spec_dict[ht.dag_hash.name] = s.dag_hash()
@ -2256,7 +2268,7 @@ class Environment:
def update_environment_repository(self) -> None:
"""Updates the repository associated with the environment."""
for spec in spack.traverse.traverse_nodes(self.new_specs):
for spec in traverse.traverse_nodes(self.new_specs):
if not spec.concrete:
raise ValueError("specs passed to environment.write() must be concrete!")

View File

@ -215,6 +215,31 @@ def print_message(logger: LogType, msg: str, verbose: bool = False):
tty.info(msg, format="g")
def overall_status(current_status: "TestStatus", substatuses: List["TestStatus"]) -> "TestStatus":
"""Determine the overall status based on the current and associated sub status values.
Args:
current_status: current overall status, assumed to default to PASSED
substatuses: status of each test part or overall status of each test spec
Returns:
test status encompassing the main test and all subtests
"""
if current_status in [TestStatus.SKIPPED, TestStatus.NO_TESTS, TestStatus.FAILED]:
return current_status
skipped = 0
for status in substatuses:
if status == TestStatus.FAILED:
return status
elif status == TestStatus.SKIPPED:
skipped += 1
if skipped and skipped == len(substatuses):
return TestStatus.SKIPPED
return current_status
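Not part of the diff: a self-contained restatement of the aggregation rules above, with a stand-in enum whose member names match the TestStatus values used in the code. It only illustrates the intended precedence: failure wins, all-skipped propagates, otherwise the current status is kept.
import enum
from typing import List
class TestStatus(enum.Enum):  # stand-in for Spack's TestStatus
    NO_TESTS = 0
    FAILED = 1
    SKIPPED = 2
    PASSED = 3
def aggregate(current: TestStatus, substatuses: List[TestStatus]) -> TestStatus:
    if current in (TestStatus.SKIPPED, TestStatus.NO_TESTS, TestStatus.FAILED):
        return current
    if any(s is TestStatus.FAILED for s in substatuses):
        return TestStatus.FAILED
    if substatuses and all(s is TestStatus.SKIPPED for s in substatuses):
        return TestStatus.SKIPPED
    return current
assert aggregate(TestStatus.PASSED, [TestStatus.FAILED]) is TestStatus.FAILED
assert aggregate(TestStatus.PASSED, [TestStatus.SKIPPED]) is TestStatus.SKIPPED
assert aggregate(TestStatus.PASSED,
                 [TestStatus.PASSED, TestStatus.SKIPPED]) is TestStatus.PASSED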
class PackageTest:
"""The class that manages stand-alone (post-install) package tests."""
@ -308,14 +333,12 @@ class PackageTest:
# to start with the same name) may not have PASSED. This extra
# check is used to ensure the containing test part is not claiming
# to have passed when at least one subpart failed.
if status == TestStatus.PASSED:
for pname, substatus in self.test_parts.items():
if pname != part_name and pname.startswith(part_name):
if substatus == TestStatus.FAILED:
print(f"{substatus}: {part_name}{extra}")
self.test_parts[part_name] = substatus
self.counts[substatus] += 1
return
substatuses = []
for pname, substatus in self.test_parts.items():
if pname != part_name and pname.startswith(part_name):
substatuses.append(substatus)
if substatuses:
status = overall_status(status, substatuses)
print(f"{status}: {part_name}{extra}")
self.test_parts[part_name] = status
@ -420,6 +443,26 @@ class PackageTest:
lines.append(f"{totals:=^80}")
return lines
def write_tested_status(self):
"""Write the overall status to the tested file.
If there are any test part failures, then the tests failed. If all test
parts are skipped, then the tests were skipped. If any tests passed, then
the tests passed; otherwise, no tests were executed.
"""
status = TestStatus.NO_TESTS
if self.counts[TestStatus.FAILED] > 0:
status = TestStatus.FAILED
else:
skipped = self.counts[TestStatus.SKIPPED]
if skipped and self.parts() == skipped:
status = TestStatus.SKIPPED
elif self.counts[TestStatus.PASSED] > 0:
status = TestStatus.PASSED
with open(self.tested_file, "w") as f:
f.write(f"{status.value}\n")
@contextlib.contextmanager
def test_part(pkg: Pb, test_name: str, purpose: str, work_dir: str = ".", verbose: bool = False):
@ -654,8 +697,9 @@ def process_test_parts(pkg: Pb, test_specs: List[spack.spec.Spec], verbose: bool
try:
tests = test_functions(spec.package_class)
except spack.repo.UnknownPackageError:
# some virtuals don't have a package
tests = []
# Some virtuals don't have a package so we don't want to report
# them as not having tests when that isn't appropriate.
continue
if len(tests) == 0:
tester.status(spec.name, TestStatus.NO_TESTS)
@ -682,7 +726,7 @@ def process_test_parts(pkg: Pb, test_specs: List[spack.spec.Spec], verbose: bool
finally:
if tester.ran_tests():
fs.touch(tester.tested_file)
tester.write_tested_status()
# log one more test message to provide a completion timestamp
# for CDash reporting
@ -889,20 +933,15 @@ class TestSuite:
if remove_directory:
shutil.rmtree(test_dir)
tested = os.path.exists(self.tested_file_for_spec(spec))
if tested:
status = TestStatus.PASSED
else:
self.ensure_stage()
if spec.external and not externals:
status = TestStatus.SKIPPED
elif not spec.installed:
status = TestStatus.SKIPPED
else:
status = TestStatus.NO_TESTS
status = self.test_status(spec, externals)
self.counts[status] += 1
self.write_test_result(spec, status)
except SkipTest:
status = TestStatus.SKIPPED
self.counts[status] += 1
self.write_test_result(spec, TestStatus.SKIPPED)
except BaseException as exc:
status = TestStatus.FAILED
self.counts[status] += 1
@ -939,6 +978,31 @@ class TestSuite:
if failures:
raise TestSuiteFailure(failures)
def test_status(self, spec: spack.spec.Spec, externals: bool) -> Optional[TestStatus]:
"""Determine the overall test results status for the spec.
Args:
spec: instance of the spec under test
externals: ``True`` if externals are to be tested, else ``False``
Returns:
the spec's test status if available or ``None``
"""
tests_status_file = self.tested_file_for_spec(spec)
if not os.path.exists(tests_status_file):
self.ensure_stage()
if spec.external and not externals:
status = TestStatus.SKIPPED
elif not spec.installed:
status = TestStatus.SKIPPED
else:
status = TestStatus.NO_TESTS
return status
with open(tests_status_file, "r") as f:
value = (f.read()).strip("\n")
return TestStatus(int(value)) if value else TestStatus.NO_TESTS
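Not part of the diff: a round-trip of the tested-file protocol used by write_tested_status() and test_status() above. The writer stores the enum's integer value followed by a newline; the reader strips and reconstructs it. The enum here is a stand-in, and an in-memory buffer replaces the real file.
import enum
import io
class TestStatus(enum.Enum):  # stand-in for Spack's TestStatus
    NO_TESTS = 0
    PASSED = 3
buffer = io.StringIO()
buffer.write(f"{TestStatus.PASSED.value}\n")  # what the writer emits
value = buffer.getvalue().strip("\n")         # what the reader parses
status = TestStatus(int(value)) if value else TestStatus.NO_TESTS
assert status is TestStatus.PASSED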
def ensure_stage(self):
"""Ensure the test suite stage directory exists."""
if not os.path.exists(self.stage):

View File

@ -40,7 +40,7 @@ from typing import Optional
import llnl.util.filesystem
import llnl.util.tty as tty
from llnl.util.lang import dedupe
from llnl.util.lang import dedupe, memoized
import spack.build_environment
import spack.config
@ -170,17 +170,10 @@ def merge_config_rules(configuration, spec):
Returns:
dict: actions to be taken on the spec passed as an argument
"""
# Get the top-level configuration for the module type we are using
module_specific_configuration = copy.deepcopy(configuration)
# Construct a dictionary with the actions we need to perform on the spec
# passed as a parameter
# The keyword 'all' is always evaluated first, all the others are
# evaluated in order of appearance in the module file
spec_configuration = module_specific_configuration.pop("all", {})
for constraint, action in module_specific_configuration.items():
spec_configuration = copy.deepcopy(configuration.get("all", {}))
for constraint, action in configuration.items():
if spec.satisfies(constraint):
if hasattr(constraint, "override") and constraint.override:
spec_configuration = {}
@ -200,14 +193,14 @@ def merge_config_rules(configuration, spec):
# configuration
# Hash length in module files
hash_length = module_specific_configuration.get("hash_length", 7)
hash_length = configuration.get("hash_length", 7)
spec_configuration["hash_length"] = hash_length
verbose = module_specific_configuration.get("verbose", False)
verbose = configuration.get("verbose", False)
spec_configuration["verbose"] = verbose
# module defaults per-package
defaults = module_specific_configuration.get("defaults", [])
defaults = configuration.get("defaults", [])
spec_configuration["defaults"] = defaults
return spec_configuration
@ -678,7 +671,14 @@ class BaseContext(tengine.Context):
# the configure option section
return None
def modification_needs_formatting(self, modification):
"""Returns True if environment modification entry needs to be formatted."""
return (
not isinstance(modification, (spack.util.environment.SetEnv)) or not modification.raw
)
@tengine.context_property
@memoized
def environment_modifications(self):
"""List of environment modifications to be processed."""
# Modifications guessed by inspecting the spec prefix
@ -740,15 +740,29 @@ class BaseContext(tengine.Context):
_check_tokens_are_valid(x.name, message=msg)
# Transform them
x.name = spec.format(x.name, transform=transform)
try:
# Not every command has a value
x.value = spec.format(x.value)
except AttributeError:
pass
if self.modification_needs_formatting(x):
try:
# Not every command has a value
x.value = spec.format(x.value)
except AttributeError:
pass
x.name = str(x.name).replace("-", "_")
return [(type(x).__name__, x) for x in env if x.name not in exclude]
@tengine.context_property
def has_manpath_modifications(self):
"""True if MANPATH environment variable is modified."""
for modification_type, cmd in self.environment_modifications:
if not isinstance(
cmd, (spack.util.environment.PrependPath, spack.util.environment.AppendPath)
):
continue
if cmd.name == "MANPATH":
return True
else:
return False
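Not part of the diff: the trailing else above is Python's loop-else, which runs only when the for loop completes without break (or, as here, without returning early). A standalone illustration:
def mentions_manpath(names):
    for name in names:
        if name == "MANPATH":
            return True
    else:  # loop ran to completion: no MANPATH modification found
        return False
assert mentions_manpath(["PATH", "MANPATH"])
assert not mentions_manpath(["PATH", "LD_LIBRARY_PATH"])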
@tengine.context_property
def autoload(self):
"""List of modules that needs to be loaded automatically."""

View File

@ -7,7 +7,7 @@ import collections
import itertools
import os.path
import posixpath
from typing import Any, Dict
from typing import Any, Dict, List
import llnl.util.lang as lang
@ -56,7 +56,7 @@ def make_context(spec, module_set_name, explicit):
return LmodContext(conf)
def guess_core_compilers(name, store=False):
def guess_core_compilers(name, store=False) -> List[spack.spec.CompilerSpec]:
"""Guesses the list of core compilers installed in the system.
Args:
@ -64,21 +64,19 @@ def guess_core_compilers(name, store=False):
modules.yaml configuration file
Returns:
List of core compilers, if found, or None
List of found core compilers
"""
core_compilers = []
for compiler_config in spack.compilers.all_compilers_config():
for compiler in spack.compilers.all_compilers():
try:
compiler = compiler_config["compiler"]
# A compiler is considered to be a core compiler if any of the
# C, C++ or Fortran compilers reside in a system directory
is_system_compiler = any(
os.path.dirname(x) in spack.util.environment.SYSTEM_DIRS
for x in compiler["paths"].values()
if x is not None
os.path.dirname(getattr(compiler, x, "")) in spack.util.environment.SYSTEM_DIRS
for x in ("cc", "cxx", "f77", "fc")
)
if is_system_compiler:
core_compilers.append(str(compiler["spec"]))
core_compilers.append(compiler.spec)
except (KeyError, TypeError, AttributeError):
continue
@ -89,10 +87,10 @@ def guess_core_compilers(name, store=False):
modules_cfg = spack.config.get(
"modules:" + name, {}, scope=spack.config.default_modify_scope()
)
modules_cfg.setdefault("lmod", {})["core_compilers"] = core_compilers
modules_cfg.setdefault("lmod", {})["core_compilers"] = [str(x) for x in core_compilers]
spack.config.set("modules:" + name, modules_cfg, scope=spack.config.default_modify_scope())
return core_compilers or None
return core_compilers
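Not part of the diff: the "core compiler" test above, reduced to plain Python. SYSTEM_DIRS and the fake compiler object are assumptions standing in for spack.util.environment.SYSTEM_DIRS and a real Compiler; the `or ""` guards the None entries that the real code handles through its except clause.
import os
SYSTEM_DIRS = ["/usr/bin", "/bin", "/usr/local/bin"]
class FakeCompiler:
    cc, cxx, f77, fc = "/usr/bin/gcc", "/usr/bin/g++", None, None
compiler = FakeCompiler()
is_system_compiler = any(
    os.path.dirname(getattr(compiler, name, "") or "") in SYSTEM_DIRS
    for name in ("cc", "cxx", "f77", "fc")
)
assert is_system_compiler  # gcc and g++ live in /usr/bin, a system directory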
class LmodConfiguration(BaseConfiguration):
@ -104,7 +102,7 @@ class LmodConfiguration(BaseConfiguration):
default_projections = {"all": posixpath.join("{name}", "{version}")}
@property
def core_compilers(self):
def core_compilers(self) -> List[spack.spec.CompilerSpec]:
"""Returns the list of "Core" compilers
Raises:
@ -112,14 +110,18 @@ class LmodConfiguration(BaseConfiguration):
specified in the configuration file or the sequence
is empty
"""
value = configuration(self.name).get("core_compilers") or guess_core_compilers(
self.name, store=True
)
compilers = [
spack.spec.CompilerSpec(c) for c in configuration(self.name).get("core_compilers", [])
]
if not value:
if not compilers:
compilers = guess_core_compilers(self.name, store=True)
if not compilers:
msg = 'the key "core_compilers" must be set in modules.yaml'
raise CoreCompilersNotFoundError(msg)
return value
return compilers
@property
def core_specs(self):
@ -132,6 +134,7 @@ class LmodConfiguration(BaseConfiguration):
return configuration(self.name).get("filter_hierarchy_specs", {})
@property
@lang.memoized
def hierarchy_tokens(self):
"""Returns the list of tokens that are part of the modulefile
hierarchy. 'compiler' is always present.
@ -156,6 +159,7 @@ class LmodConfiguration(BaseConfiguration):
return tokens
@property
@lang.memoized
def requires(self):
"""Returns a dictionary mapping all the requirements of this spec
to the actual provider. 'compiler' is always present among the
@ -222,6 +226,7 @@ class LmodConfiguration(BaseConfiguration):
return available
@property
@lang.memoized
def missing(self):
"""Returns the list of tokens that are not available."""
return [x for x in self.hierarchy_tokens if x not in self.available]
@ -283,16 +288,18 @@ class LmodFileLayout(BaseFileLayout):
# If we are dealing with a core compiler, return 'Core'
core_compilers = self.conf.core_compilers
if name == "compiler" and str(value) in core_compilers:
if name == "compiler" and any(
spack.spec.CompilerSpec(value).satisfies(c) for c in core_compilers
):
return "Core"
# CompilerSpec does not have an hash, as we are not allowed to
# CompilerSpec does not have a hash, as we are not allowed to
# use different flavors of the same compiler
if name == "compiler":
return path_part_fmt.format(token=value)
# In case the hierarchy token refers to a virtual provider
# we need to append an hash to the version to distinguish
# we need to append a hash to the version to distinguish
# among flavors of the same library (e.g. openblas~openmp vs.
# openblas+openmp)
path = path_part_fmt.format(token=value)
@ -313,6 +320,7 @@ class LmodFileLayout(BaseFileLayout):
return parts
@property
@lang.memoized
def unlocked_paths(self):
"""Returns a dictionary mapping conditions to a list of unlocked
paths.
@ -424,6 +432,7 @@ class LmodContext(BaseContext):
return self.conf.missing
@tengine.context_property
@lang.memoized
def unlocked_paths(self):
"""Returns the list of paths that are unlocked unconditionally."""
layout = make_layout(self.spec, self.conf.name, self.conf.explicit)

View File

@ -1231,6 +1231,7 @@ class PackageBase(WindowsRPath, PackageViewMixin, metaclass=PackageMeta):
if any(dt in cls.dependencies[name][cond].type for cond in conds for dt in deptypes)
)
# TODO: allow more than one active extendee.
@property
def extendee_spec(self):
"""
@ -1246,7 +1247,6 @@ class PackageBase(WindowsRPath, PackageViewMixin, metaclass=PackageMeta):
if dep.name in self.extendees:
deps.append(dep)
# TODO: allow more than one active extendee.
if deps:
assert len(deps) == 1
return deps[0]
@ -1256,7 +1256,6 @@ class PackageBase(WindowsRPath, PackageViewMixin, metaclass=PackageMeta):
if self.spec._concrete:
return None
else:
# TODO: do something sane here with more than one extendee
# If it's not concrete, then return the spec from the
# extends() directive since that is all we know so far.
spec_str, kwargs = next(iter(self.extendees.items()))
@ -2017,7 +2016,8 @@ class PackageBase(WindowsRPath, PackageViewMixin, metaclass=PackageMeta):
# stack instead of from traceback.
# The traceback is truncated here, so we can't use it to
# traverse the stack.
m = "\n".join(spack.build_environment.get_package_context(tb))
context = spack.build_environment.get_package_context(tb)
m = "\n".join(context) if context else ""
exc = e # e is deleted after this block

View File

@ -37,7 +37,9 @@ _xc_craype_dir = "/opt/cray/pe/cdt"
def slingshot_network():
return os.path.exists("/opt/cray/pe") and os.path.exists("/lib64/libcxi.so")
return os.path.exists("/opt/cray/pe") and (
os.path.exists("/lib64/libcxi.so") or os.path.exists("/usr/lib64/libcxi.so")
)
def _target_name_from_craype_target_name(name):

View File

@ -686,7 +686,7 @@ def is_relocatable(spec):
Raises:
ValueError: if the spec is not installed
"""
if not spec.install_status():
if not spec.installed:
raise ValueError("spec is not installed [{0}]".format(str(spec)))
if spec.external or spec.virtual:

View File

@ -24,7 +24,7 @@ import sys
import traceback
import types
import uuid
from typing import Dict, Union
from typing import Any, Dict, List, Union
import llnl.util.filesystem as fs
import llnl.util.lang
@ -424,7 +424,7 @@ class FastPackageChecker(collections.abc.Mapping):
def last_mtime(self):
return max(sinfo.st_mtime for sinfo in self._packages_to_stats.values())
def modified_since(self, since):
def modified_since(self, since: float) -> List[str]:
return [name for name, sinfo in self._packages_to_stats.items() if sinfo.st_mtime > since]
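Not part of the diff: why a zero mtime is a hazardous sentinel for modified_since(). Some reproducible tarballs normalize file mtimes to 0, and such a package is invisible to a caller that passes 0 as its "last seen" time; a negative floor catches it. Data below is made up.
stats = {"zlib": 0.0}  # hypothetical package name -> st_mtime
def modified_since(since: float):
    return [name for name, mtime in stats.items() if mtime > since]
assert modified_since(0.0) == []                  # 0-mtime package is missed
assert modified_since(float("-inf")) == ["zlib"]  # a negative floor catches it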
def __getitem__(self, item):
@ -550,35 +550,34 @@ class RepoIndex(object):
when they're needed.
``Indexers`` should be added to the ``RepoIndex`` using
``add_index(name, indexer)``, and they should support the interface
``add_indexer(name, indexer)``, and they should support the interface
defined by ``Indexer``, so that the ``RepoIndex`` can read, generate,
and update stored indices.
Generated indexes are accessed by name via ``__getitem__()``.
Generated indexes are accessed by name via ``__getitem__()``."""
"""
def __init__(self, package_checker, namespace, cache):
def __init__(
self,
package_checker: FastPackageChecker,
namespace: str,
cache: spack.util.file_cache.FileCache,
):
self.checker = package_checker
self.packages_path = self.checker.packages_path
if sys.platform == "win32":
self.packages_path = spack.util.path.convert_to_posix_path(self.packages_path)
self.namespace = namespace
self.indexers = {}
self.indexes = {}
self.indexers: Dict[str, Indexer] = {}
self.indexes: Dict[str, Any] = {}
self.cache = cache
def add_indexer(self, name, indexer):
def add_indexer(self, name: str, indexer: Indexer):
"""Add an indexer to the repo index.
Arguments:
name (str): name of this indexer
indexer (object): an object that supports create(), read(),
write(), and get_index() operations
"""
name: name of this indexer
indexer: object implementing the ``Indexer`` interface"""
self.indexers[name] = indexer
def __getitem__(self, name):
@ -599,17 +598,15 @@ class RepoIndex(object):
because the main bottleneck here is loading all the packages. It
can take tens of seconds to regenerate sequentially, and we'd
rather only pay that cost once rather than on several
invocations.
"""
invocations."""
for name, indexer in self.indexers.items():
self.indexes[name] = self._build_index(name, indexer)
def _build_index(self, name, indexer):
def _build_index(self, name: str, indexer: Indexer):
"""Determine which packages need an update, and update indexes."""
# Filename of the provider index cache (we assume they're all json)
cache_filename = "{0}/{1}-index.json".format(name, self.namespace)
cache_filename = f"{name}/{self.namespace}-index.json"
# Compute which packages needs to be updated in the cache
index_mtime = self.cache.mtime(cache_filename)
@ -633,8 +630,7 @@ class RepoIndex(object):
needs_update = self.checker.modified_since(new_index_mtime)
for pkg_name in needs_update:
namespaced_name = "%s.%s" % (self.namespace, pkg_name)
indexer.update(namespaced_name)
indexer.update(f"{self.namespace}.{pkg_name}")
indexer.write(new)

View File

@ -861,9 +861,9 @@ class SpackSolverSetup(object):
def __init__(self, tests=False):
self.gen = None # set by setup()
self.declared_versions = {}
self.possible_versions = {}
self.deprecated_versions = {}
self.declared_versions = collections.defaultdict(list)
self.possible_versions = collections.defaultdict(set)
self.deprecated_versions = collections.defaultdict(set)
self.possible_virtuals = None
self.possible_compilers = []
@ -1669,9 +1669,34 @@ class SpackSolverSetup(object):
if concrete_build_deps or dtype != "build":
clauses.append(fn.attr("depends_on", spec.name, dep.name, dtype))
# Ensure Spack will not co-concretize this with another provider
# for the same virtual
for virtual in dep.package.virtuals_provided:
# TODO: We have to look up info from package.py here, but we'd
# TODO: like to avoid this entirely. We should not need to look
# TODO: up potentially wrong info if we have virtual edge info.
try:
try:
pkg = dep.package
except spack.repo.UnknownNamespaceError:
# Try to look up the package of the same name and use its
# providers. This is as good as we can do without edge info.
pkg_class = spack.repo.path.get_pkg_class(dep.name)
spec = spack.spec.Spec(f"{dep.name}@{dep.version}")
pkg = pkg_class(spec)
virtuals = pkg.virtuals_provided
except spack.repo.UnknownPackageError:
# Skip virtual node constraints for renamed/deleted packages,
# so their binaries can still be installed.
# NOTE: with current specs (which lack edge attributes) this
# can allow concretizations with two providers, but it's unlikely.
continue
# Don't concretize with two providers of the same virtual.
# See above for exception for unknown packages.
# TODO: we will eventually record provider information on edges,
# TODO: which avoids the need for the package lookup above.
for virtual in virtuals:
clauses.append(fn.attr("virtual_node", virtual.name))
clauses.append(fn.provider(dep.name, virtual.name))
@ -1697,10 +1722,6 @@ class SpackSolverSetup(object):
def build_version_dict(self, possible_pkgs):
"""Declare any versions in specs not declared in packages."""
self.declared_versions = collections.defaultdict(list)
self.possible_versions = collections.defaultdict(set)
self.deprecated_versions = collections.defaultdict(set)
packages_yaml = spack.config.get("packages")
packages_yaml = _normalize_packages_yaml(packages_yaml)
for pkg_name in possible_pkgs:
@ -1734,13 +1755,47 @@ class SpackSolverSetup(object):
# All the preferred version from packages.yaml, versions in external
# specs will be computed later
version_preferences = packages_yaml.get(pkg_name, {}).get("version", [])
for idx, v in enumerate(version_preferences):
# v can be a string so force it into an actual version for comparisons
ver = vn.Version(v)
version_defs = []
pkg_class = spack.repo.path.get_pkg_class(pkg_name)
for vstr in version_preferences:
v = vn.ver(vstr)
if isinstance(v, vn.GitVersion):
version_defs.append(v)
else:
satisfying_versions = self._check_for_defined_matching_versions(pkg_class, v)
# Amongst all defined versions satisfying this specific
# preference, the highest-numbered version is the
# most-preferred: therefore sort satisfying versions
# from greatest to least
version_defs.extend(sorted(satisfying_versions, reverse=True))
for weight, vdef in enumerate(llnl.util.lang.dedupe(version_defs)):
self.declared_versions[pkg_name].append(
DeclaredVersion(version=ver, idx=idx, origin=Provenance.PACKAGES_YAML)
DeclaredVersion(version=vdef, idx=weight, origin=Provenance.PACKAGES_YAML)
)
self.possible_versions[pkg_name].add(ver)
self.possible_versions[pkg_name].add(vdef)
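Not part of the diff: the enumerate-over-dedupe pattern above assigns lower weights to earlier, more preferred entries, which relies on dedupe preserving first-occurrence order. A minimal order-preserving equivalent of llnl.util.lang.dedupe:
def dedupe(sequence):
    return list(dict.fromkeys(sequence))  # dicts preserve insertion order
preferences = ["1.4", "1.2", "1.4"]  # hypothetical version preferences
weighted = list(enumerate(dedupe(preferences)))
assert weighted == [(0, "1.4"), (1, "1.2")]  # lower weight = more preferred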
def _check_for_defined_matching_versions(self, pkg_class, v):
"""Given a version specification (which may be a concrete version,
range, etc.), determine if any package.py version declarations
or externals define a version which satisfies it.
This is primarily for determining whether a version request (e.g.
version preferences, which should not themselves define versions)
refers to a defined version.
This function raises an exception if no satisfying versions are
found.
"""
pkg_name = pkg_class.name
satisfying_versions = list(x for x in pkg_class.versions if x.satisfies(v))
satisfying_versions.extend(x for x in self.possible_versions[pkg_name] if x.satisfies(v))
if not satisfying_versions:
raise spack.config.ConfigError(
"Preference for version {0} does not match any version"
" defined for {1} (in its package.py or any external)".format(str(v), pkg_name)
)
return satisfying_versions
def add_concrete_versions_from_specs(self, specs, origin):
"""Add concrete versions to possible versions from lists of CLI/dev specs."""
@ -2173,14 +2228,6 @@ class SpackSolverSetup(object):
# get possible compilers
self.possible_compilers = self.generate_possible_compilers(specs)
# traverse all specs and packages to build dict of possible versions
self.build_version_dict(possible)
self.add_concrete_versions_from_specs(specs, Provenance.SPEC)
self.add_concrete_versions_from_specs(dev_specs, Provenance.DEV_SPEC)
req_version_specs = _get_versioned_specs_from_pkg_requirements()
self.add_concrete_versions_from_specs(req_version_specs, Provenance.PACKAGE_REQUIREMENT)
self.gen.h1("Concrete input spec definitions")
self.define_concrete_input_specs(specs, possible)
@ -2208,6 +2255,14 @@ class SpackSolverSetup(object):
self.provider_requirements()
self.external_packages()
# traverse all specs and packages to build dict of possible versions
self.build_version_dict(possible)
self.add_concrete_versions_from_specs(specs, Provenance.SPEC)
self.add_concrete_versions_from_specs(dev_specs, Provenance.DEV_SPEC)
req_version_specs = self._get_versioned_specs_from_pkg_requirements()
self.add_concrete_versions_from_specs(req_version_specs, Provenance.PACKAGE_REQUIREMENT)
self.gen.h1("Package Constraints")
for pkg in sorted(self.pkgs):
self.gen.h2("Package rules: %s" % pkg)
@ -2254,55 +2309,78 @@ class SpackSolverSetup(object):
if self.concretize_everything:
self.gen.fact(fn.concretize_everything())
def _get_versioned_specs_from_pkg_requirements(self):
"""If package requirements mention versions that are not mentioned
elsewhere, then we need to collect those to mark them as possible
versions.
"""
req_version_specs = list()
config = spack.config.get("packages")
for pkg_name, d in config.items():
if pkg_name == "all":
continue
if "require" in d:
req_version_specs.extend(self._specs_from_requires(pkg_name, d["require"]))
return req_version_specs
def _get_versioned_specs_from_pkg_requirements():
"""If package requirements mention versions that are not mentioned
elsewhere, then we need to collect those to mark them as possible
versions.
"""
req_version_specs = list()
config = spack.config.get("packages")
for pkg_name, d in config.items():
if pkg_name == "all":
continue
if "require" in d:
req_version_specs.extend(_specs_from_requires(pkg_name, d["require"]))
return req_version_specs
def _specs_from_requires(pkg_name, section):
if isinstance(section, str):
spec = spack.spec.Spec(section)
if not spec.name:
spec.name = pkg_name
extracted_specs = [spec]
else:
spec_strs = []
for spec_group in section:
if isinstance(spec_group, str):
spec_strs.append(spec_group)
else:
# Otherwise it is an object. The object can contain a single
# "spec" constraint, or a list of them with "any_of" or
# "one_of" policy.
if "spec" in spec_group:
new_constraints = [spec_group["spec"]]
else:
key = "one_of" if "one_of" in spec_group else "any_of"
new_constraints = spec_group[key]
spec_strs.extend(new_constraints)
extracted_specs = []
for spec_str in spec_strs:
spec = spack.spec.Spec(spec_str)
def _specs_from_requires(self, pkg_name, section):
"""Collect specs from requirements which define versions (i.e. those that
have a concrete version). Requirements can define *new* versions if
they are included as part of an equivalence (hash=number) but not
otherwise.
"""
if isinstance(section, str):
spec = spack.spec.Spec(section)
if not spec.name:
spec.name = pkg_name
extracted_specs.append(spec)
extracted_specs = [spec]
else:
spec_strs = []
for spec_group in section:
if isinstance(spec_group, str):
spec_strs.append(spec_group)
else:
# Otherwise it is an object. The object can contain a single
# "spec" constraint, or a list of them with "any_of" or
# "one_of" policy.
if "spec" in spec_group:
new_constraints = [spec_group["spec"]]
else:
key = "one_of" if "one_of" in spec_group else "any_of"
new_constraints = spec_group[key]
spec_strs.extend(new_constraints)
version_specs = [x for x in extracted_specs if x.versions.concrete]
for spec in version_specs:
spec.attach_git_version_lookup()
return version_specs
extracted_specs = []
for spec_str in spec_strs:
spec = spack.spec.Spec(spec_str)
if not spec.name:
spec.name = pkg_name
extracted_specs.append(spec)
version_specs = []
for spec in extracted_specs:
if spec.versions.concrete:
# Note: this includes git versions
version_specs.append(spec)
continue
# Prefer spec's name if it exists, in case the spec is
# requiring a specific implementation inside of a virtual section
# e.g. packages:mpi:require:openmpi@4.0.1
pkg_class = spack.repo.path.get_pkg_class(spec.name or pkg_name)
satisfying_versions = self._check_for_defined_matching_versions(
pkg_class, spec.versions
)
# Version ranges ("@1.3" without the "=", "@1.2:1.4") and lists
# will end up here
ordered_satisfying_versions = sorted(satisfying_versions, reverse=True)
vspecs = list(spack.spec.Spec("@{0}".format(x)) for x in ordered_satisfying_versions)
version_specs.extend(vspecs)
for spec in version_specs:
spec.attach_git_version_lookup()
return version_specs
class SpecBuilder(object):
@ -2758,12 +2836,13 @@ class InternalConcretizerError(spack.error.UnsatisfiableSpecError):
"""
def __init__(self, provided, conflicts):
indented = [" %s\n" % conflict for conflict in conflicts]
error_msg = "".join(indented)
msg = "Spack concretizer internal error. Please submit a bug report"
msg += "\n Please include the command, environment if applicable,"
msg += "\n and the following error message."
msg = "\n %s is unsatisfiable, errors are:\n%s" % (provided, error_msg)
msg = (
"Spack concretizer internal error. Please submit a bug report and include the "
"command, environment if applicable and the following error message."
f"\n {provided} is unsatisfiable, errors are:"
)
msg += "".join([f"\n {conflict}" for conflict in conflicts])
super(spack.error.UnsatisfiableSpecError, self).__init__(msg)

View File

@ -50,6 +50,7 @@ line is a spec for a particular installation of the mpileaks package.
"""
import collections
import collections.abc
import enum
import io
import itertools
import os
@ -173,6 +174,16 @@ CLEARSIGN_FILE_REGEX = re.compile(
SPECFILE_FORMAT_VERSION = 3
# InstallStatus is used to map install statuses to symbols for display
# Options are artificially disjoint for display purposes
class InstallStatus(enum.Enum):
installed = "@g{[+]} "
upstream = "@g{[^]} "
external = "@g{[e]} "
absent = "@K{ - } "
missing = "@r{[-]} "
def colorize_spec(spec):
"""Returns a spec colorized according to the colors specified in
color_formats."""
@ -679,6 +690,16 @@ class CompilerSpec(object):
d = d["compiler"]
return CompilerSpec(d["name"], vn.VersionList.from_dict(d))
@property
def display_str(self):
"""Equivalent to {compiler.name}{@compiler.version} for Specs, without extra
@= for readability."""
if self.concrete:
return f"{self.name}@{self.version}"
elif self.versions != vn.any_version:
return f"{self.name}@{self.versions}"
return self.name
def __str__(self):
out = self.name
if self.versions and self.versions != vn.any_version:
@ -1730,14 +1751,14 @@ class Spec(object):
def short_spec(self):
"""Returns a version of the spec with the dependencies hashed
instead of completely enumerated."""
spec_format = "{name}{@version}{%compiler}"
spec_format = "{name}{@version}{%compiler.name}{@compiler.version}"
spec_format += "{variants}{arch=architecture}{/hash:7}"
return self.format(spec_format)
@property
def cshort_spec(self):
"""Returns an auto-colorized version of ``self.short_spec``."""
spec_format = "{name}{@version}{%compiler}"
spec_format = "{name}{@version}{%compiler.name}{@compiler.version}"
spec_format += "{variants}{arch=architecture}{/hash:7}"
return self.cformat(spec_format)
@ -2789,11 +2810,11 @@ class Spec(object):
# Also record all patches required on dependencies by
# depends_on(..., patch=...)
for dspec in root.traverse_edges(deptype=all, cover="edges", root=False):
pkg_deps = dspec.parent.package_class.dependencies
if dspec.spec.name not in pkg_deps:
if dspec.spec.concrete:
continue
if dspec.spec.concrete:
pkg_deps = dspec.parent.package_class.dependencies
if dspec.spec.name not in pkg_deps:
continue
patches = []
@ -4391,12 +4412,20 @@ class Spec(object):
def install_status(self):
"""Helper for tree to print DB install status."""
if not self.concrete:
return None
try:
record = spack.store.db.get_record(self)
return record.installed
except KeyError:
return None
return InstallStatus.absent
if self.external:
return InstallStatus.external
upstream, record = spack.store.db.query_by_spec_hash(self.dag_hash())
if not record:
return InstallStatus.absent
elif upstream and record.installed:
return InstallStatus.upstream
elif record.installed:
return InstallStatus.installed
else:
return InstallStatus.missing
def _installed_explicitly(self):
"""Helper for tree to print DB install status."""
@ -4410,7 +4439,10 @@ class Spec(object):
def tree(self, **kwargs):
"""Prints out this spec and its dependencies, tree-formatted
with indentation."""
with indentation.
The status function may output either a boolean or an InstallStatus
"""
color = kwargs.pop("color", clr.get_color_when())
depth = kwargs.pop("depth", False)
hashes = kwargs.pop("hashes", False)
@ -4442,14 +4474,12 @@ class Spec(object):
if status_fn:
status = status_fn(node)
if node.installed_upstream:
out += clr.colorize("@g{[^]} ", color=color)
elif status is None:
out += clr.colorize("@K{ - } ", color=color) # !installed
if status in list(InstallStatus):
out += clr.colorize(status.value, color=color)
elif status:
out += clr.colorize("@g{[+]} ", color=color) # installed
out += clr.colorize("@g{[+]} ", color=color)
else:
out += clr.colorize("@r{[-]} ", color=color) # missing
out += clr.colorize("@r{[-]} ", color=color)
if hashes:
out += clr.colorize("@K{%s} ", color=color) % node.dag_hash(hlen)

View File

@ -4,7 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import itertools
import textwrap
from typing import List
from typing import List, Optional, Tuple
import llnl.util.lang
@ -66,17 +66,17 @@ class Context(metaclass=ContextMeta):
return dict(d)
def make_environment(dirs=None):
"""Returns an configured environment for template rendering."""
@llnl.util.lang.memoized
def make_environment(dirs: Optional[Tuple[str, ...]] = None):
"""Returns a configured environment for template rendering."""
# Import at this scope to avoid slowing Spack startup down
import jinja2
if dirs is None:
# Default directories where to search for templates
builtins = spack.config.get("config:template_dirs", ["$spack/share/spack/templates"])
extensions = spack.extensions.get_template_dirs()
dirs = [canonicalize_path(d) for d in itertools.chain(builtins, extensions)]
# avoid importing this at the top level as it's used infrequently and
# slows down startup a bit.
import jinja2
dirs = tuple(canonicalize_path(d) for d in itertools.chain(builtins, extensions))
# Loader for the templates
loader = jinja2.FileSystemLoader(dirs)
@ -100,9 +100,15 @@ def quote(text):
return ['"{0}"'.format(line) for line in text]
def curly_quote(text):
"""Encloses each line of text in curly braces"""
return ["{{{0}}}".format(line) for line in text]
def _set_filters(env):
"""Sets custom filters to the template engine environment"""
env.filters["textwrap"] = textwrap.wrap
env.filters["prepend_to_line"] = prepend_to_line
env.filters["join"] = "\n".join
env.filters["quote"] = quote
env.filters["curly_quote"] = curly_quote

View File

@ -115,9 +115,6 @@ def default_config(tmpdir, config_directory, monkeypatch, install_mockery_mutabl
spack.config.config, old_config = cfg, spack.config.config
spack.config.config.set("repos", [spack.paths.mock_packages_path])
# This is essential, otherwise the cache will create weird side effects
# that will compromise subsequent tests if compilers.yaml is modified
monkeypatch.setattr(spack.compilers, "_cache_config_file", [])
njobs = spack.config.get("config:build_jobs")
if not njobs:
spack.config.set("config:build_jobs", 4, scope="user")

View File

@ -8,8 +8,6 @@ import sys
import pytest
import llnl.util.filesystem
import spack.compilers
import spack.main
import spack.version
@ -18,124 +16,8 @@ compiler = spack.main.SpackCommand("compiler")
@pytest.fixture
def mock_compiler_version():
return "4.5.3"
@pytest.fixture()
def mock_compiler_dir(tmpdir, mock_compiler_version):
"""Return a directory containing a fake, but detectable compiler."""
tmpdir.ensure("bin", dir=True)
bin_dir = tmpdir.join("bin")
gcc_path = bin_dir.join("gcc")
gxx_path = bin_dir.join("g++")
gfortran_path = bin_dir.join("gfortran")
gcc_path.write(
"""\
#!/bin/sh
for arg in "$@"; do
if [ "$arg" = -dumpversion ]; then
echo '%s'
fi
done
"""
% mock_compiler_version
)
# Create some mock compilers in the temporary directory
llnl.util.filesystem.set_executable(str(gcc_path))
gcc_path.copy(gxx_path, mode=True)
gcc_path.copy(gfortran_path, mode=True)
return str(tmpdir)
@pytest.mark.skipif(
sys.platform == "win32",
reason="Cannot execute bash \
script on Windows",
)
@pytest.mark.regression("11678,13138")
def test_compiler_find_without_paths(no_compilers_yaml, working_env, tmpdir):
with tmpdir.as_cwd():
with open("gcc", "w") as f:
f.write(
"""\
#!/bin/sh
echo "0.0.0"
"""
)
os.chmod("gcc", 0o700)
os.environ["PATH"] = str(tmpdir)
output = compiler("find", "--scope=site")
assert "gcc" in output
@pytest.mark.regression("17589")
def test_compiler_find_no_apple_gcc(no_compilers_yaml, working_env, tmpdir):
with tmpdir.as_cwd():
# make a script to emulate apple gcc's version args
with open("gcc", "w") as f:
f.write(
"""\
#!/bin/sh
if [ "$1" = "-dumpversion" ]; then
echo "4.2.1"
elif [ "$1" = "--version" ]; then
echo "Configured with: --prefix=/dummy"
echo "Apple clang version 11.0.0 (clang-1100.0.33.16)"
echo "Target: x86_64-apple-darwin18.7.0"
echo "Thread model: posix"
echo "InstalledDir: /dummy"
else
echo "clang: error: no input files"
fi
"""
)
os.chmod("gcc", 0o700)
os.environ["PATH"] = str(tmpdir)
output = compiler("find", "--scope=site")
assert "gcc" not in output
def test_compiler_remove(mutable_config, mock_packages):
assert spack.spec.CompilerSpec("gcc@=4.5.0") in spack.compilers.all_compiler_specs()
args = spack.util.pattern.Bunch(all=True, compiler_spec="gcc@4.5.0", add_paths=[], scope=None)
spack.cmd.compiler.compiler_remove(args)
assert spack.spec.CompilerSpec("gcc@=4.5.0") not in spack.compilers.all_compiler_specs()
@pytest.mark.skipif(
sys.platform == "win32",
reason="Cannot execute bash \
script on Windows",
)
def test_compiler_add(mutable_config, mock_packages, mock_compiler_dir, mock_compiler_version):
# Compilers available by default.
old_compilers = set(spack.compilers.all_compiler_specs())
args = spack.util.pattern.Bunch(
all=None, compiler_spec=None, add_paths=[mock_compiler_dir], scope=None
)
spack.cmd.compiler.compiler_find(args)
# Ensure new compiler is in there
new_compilers = set(spack.compilers.all_compiler_specs())
new_compiler = new_compilers - old_compilers
assert any(c.version == spack.version.Version(mock_compiler_version) for c in new_compiler)
@pytest.fixture
def clangdir(tmpdir):
"""Create a directory with some dummy compiler scripts in it.
def compilers_dir(mock_executable):
"""Create a directory with some mock compiler scripts in it.
Scripts are:
- clang
@ -145,11 +27,9 @@ def clangdir(tmpdir):
- gfortran-8
"""
with tmpdir.as_cwd():
with open("clang", "w") as f:
f.write(
"""\
#!/bin/sh
clang_path = mock_executable(
"clang",
output="""
if [ "$1" = "--version" ]; then
echo "clang version 11.0.0 (clang-1100.0.33.16)"
echo "Target: x86_64-apple-darwin18.7.0"
@ -159,12 +39,11 @@ else
echo "clang: error: no input files"
exit 1
fi
"""
)
shutil.copy("clang", "clang++")
""",
)
shutil.copy(clang_path, clang_path.parent / "clang++")
gcc_script = """\
#!/bin/sh
gcc_script = """
if [ "$1" = "-dumpversion" ]; then
echo "8"
elif [ "$1" = "-dumpfullversion" ]; then
@ -178,120 +57,187 @@ else
exit 1
fi
"""
with open("gcc-8", "w") as f:
f.write(gcc_script.format("gcc", "gcc-8"))
with open("g++-8", "w") as f:
f.write(gcc_script.format("g++", "g++-8"))
with open("gfortran-8", "w") as f:
f.write(gcc_script.format("GNU Fortran", "gfortran-8"))
os.chmod("clang", 0o700)
os.chmod("clang++", 0o700)
os.chmod("gcc-8", 0o700)
os.chmod("g++-8", 0o700)
os.chmod("gfortran-8", 0o700)
mock_executable("gcc-8", output=gcc_script.format("gcc", "gcc-8"))
mock_executable("g++-8", output=gcc_script.format("g++", "g++-8"))
mock_executable("gfortran-8", output=gcc_script.format("GNU Fortran", "gfortran-8"))
yield tmpdir
return clang_path.parent
@pytest.mark.skipif(
sys.platform == "win32",
reason="Cannot execute bash \
script on Windows",
)
@pytest.mark.regression("17590")
def test_compiler_find_mixed_suffixes(no_compilers_yaml, working_env, clangdir):
"""Ensure that we'll mix compilers with different suffixes when necessary."""
os.environ["PATH"] = str(clangdir)
@pytest.mark.skipif(sys.platform == "win32", reason="Cannot execute bash script on Windows")
@pytest.mark.regression("11678,13138")
def test_compiler_find_without_paths(no_compilers_yaml, working_env, mock_executable):
"""Tests that 'spack compiler find' looks into PATH by default, if no specific path
is given.
"""
gcc_path = mock_executable("gcc", output='echo "0.0.0"')
os.environ["PATH"] = str(gcc_path.parent)
output = compiler("find", "--scope=site")
assert "clang@=11.0.0" in output
assert "gcc@=8.4.0" in output
assert "gcc" in output
@pytest.mark.regression("17589")
def test_compiler_find_no_apple_gcc(no_compilers_yaml, working_env, mock_executable):
"""Tests that Spack won't mistake Apple's GCC as a "real" GCC, since it's really
Clang with a few tweaks.
"""
gcc_path = mock_executable(
"gcc",
output="""
if [ "$1" = "-dumpversion" ]; then
echo "4.2.1"
elif [ "$1" = "--version" ]; then
echo "Configured with: --prefix=/dummy"
echo "Apple clang version 11.0.0 (clang-1100.0.33.16)"
echo "Target: x86_64-apple-darwin18.7.0"
echo "Thread model: posix"
echo "InstalledDir: /dummy"
else
echo "clang: error: no input files"
fi
""",
)
os.environ["PATH"] = str(gcc_path.parent)
output = compiler("find", "--scope=site")
assert "gcc" not in output
@pytest.mark.regression("37996")
def test_compiler_remove(mutable_config, mock_packages):
"""Tests that we can remove a compiler from configuration."""
assert spack.spec.CompilerSpec("gcc@=4.5.0") in spack.compilers.all_compiler_specs()
args = spack.util.pattern.Bunch(all=True, compiler_spec="gcc@4.5.0", add_paths=[], scope=None)
spack.cmd.compiler.compiler_remove(args)
assert spack.spec.CompilerSpec("gcc@=4.5.0") not in spack.compilers.all_compiler_specs()
@pytest.mark.regression("37996")
def test_removing_compilers_from_multiple_scopes(mutable_config, mock_packages):
# Duplicate "site" scope into "user" scope
site_config = spack.config.get("compilers", scope="site")
spack.config.set("compilers", site_config, scope="user")
assert spack.spec.CompilerSpec("gcc@=4.5.0") in spack.compilers.all_compiler_specs()
args = spack.util.pattern.Bunch(all=True, compiler_spec="gcc@4.5.0", add_paths=[], scope=None)
spack.cmd.compiler.compiler_remove(args)
assert spack.spec.CompilerSpec("gcc@=4.5.0") not in spack.compilers.all_compiler_specs()
@pytest.mark.skipif(sys.platform == "win32", reason="Cannot execute bash script on Windows")
def test_compiler_add(mutable_config, mock_packages, mock_executable):
"""Tests that we can add a compiler to configuration."""
expected_version = "4.5.3"
gcc_path = mock_executable(
"gcc",
output=f"""\
for arg in "$@"; do
if [ "$arg" = -dumpversion ]; then
echo '{expected_version}'
fi
done
""",
)
bin_dir = gcc_path.parent
root_dir = bin_dir.parent
compilers_before_find = set(spack.compilers.all_compiler_specs())
args = spack.util.pattern.Bunch(
all=None, compiler_spec=None, add_paths=[str(root_dir)], scope=None
)
spack.cmd.compiler.compiler_find(args)
compilers_after_find = set(spack.compilers.all_compiler_specs())
compilers_added_by_find = compilers_after_find - compilers_before_find
assert len(compilers_added_by_find) == 1
new_compiler = compilers_added_by_find.pop()
assert new_compiler.version == spack.version.Version(expected_version)
@pytest.mark.skipif(sys.platform == "win32", reason="Cannot execute bash script on Windows")
@pytest.mark.regression("17590")
def test_compiler_find_mixed_suffixes(no_compilers_yaml, working_env, compilers_dir):
"""Ensure that we'll mix compilers with different suffixes when necessary."""
os.environ["PATH"] = str(compilers_dir)
output = compiler("find", "--scope=site")
assert "clang@11.0.0" in output
assert "gcc@8.4.0" in output
config = spack.compilers.get_compiler_config("site", False)
clang = next(c["compiler"] for c in config if c["compiler"]["spec"] == "clang@=11.0.0")
gcc = next(c["compiler"] for c in config if c["compiler"]["spec"] == "gcc@=8.4.0")
gfortran_path = str(compilers_dir / "gfortran-8")
assert clang["paths"] == {
"cc": str(clangdir.join("clang")),
"cxx": str(clangdir.join("clang++")),
"cc": str(compilers_dir / "clang"),
"cxx": str(compilers_dir / "clang++"),
# we only auto-detect mixed clang on macos
"f77": gfortran_path if sys.platform == "darwin" else None,
"fc": gfortran_path if sys.platform == "darwin" else None,
}
assert gcc["paths"] == {
"cc": str(clangdir.join("gcc-8")),
"cxx": str(clangdir.join("g++-8")),
"cc": str(compilers_dir / "gcc-8"),
"cxx": str(compilers_dir / "g++-8"),
"f77": gfortran_path,
"fc": gfortran_path,
}
@pytest.mark.skipif(sys.platform == "win32", reason="Cannot execute bash script on Windows")
@pytest.mark.regression("17590")
def test_compiler_find_prefer_no_suffix(no_compilers_yaml, working_env, compilers_dir):
"""Ensure that we'll pick 'clang' over 'clang-gpu' when there is a choice."""
clang_path = compilers_dir / "clang"
shutil.copy(clang_path, clang_path.parent / "clang-gpu")
shutil.copy(clang_path, clang_path.parent / "clang++-gpu")
os.environ["PATH"] = str(clangdir)
os.environ["PATH"] = str(compilers_dir)
output = compiler("find", "--scope=site")
assert "clang@=11.0.0" in output
assert "gcc@=8.4.0" in output
assert "clang@11.0.0" in output
assert "gcc@8.4.0" in output
config = spack.compilers.get_compiler_config("site", False)
clang = next(c["compiler"] for c in config if c["compiler"]["spec"] == "clang@=11.0.0")
assert clang["paths"]["cc"] == str(clangdir.join("clang"))
assert clang["paths"]["cxx"] == str(clangdir.join("clang++"))
assert clang["paths"]["cc"] == str(compilers_dir / "clang")
assert clang["paths"]["cxx"] == str(compilers_dir / "clang++")
@pytest.mark.skipif(sys.platform == "win32", reason="Cannot execute bash script on Windows")
def test_compiler_find_path_order(no_compilers_yaml, working_env, compilers_dir):
"""Ensure that we look for compilers in the same order as PATH, when there are duplicates"""
new_dir = compilers_dir / "first_in_path"
new_dir.mkdir()
for name in ("gcc-8", "g++-8", "gfortran-8"):
shutil.copy(compilers_dir / name, new_dir / name)
# Set PATH to have the new folder searched first
os.environ["PATH"] = "{}:{}".format(str(new_dir), str(compilers_dir))
compiler("find", "--scope=site")
config = spack.compilers.get_compiler_config("site", False)
gcc = next(c["compiler"] for c in config if c["compiler"]["spec"] == "gcc@=8.4.0")
assert gcc["paths"] == {
"cc": str(clangdir.join("first_in_path", "gcc-8")),
"cxx": str(clangdir.join("first_in_path", "g++-8")),
"f77": str(clangdir.join("first_in_path", "gfortran-8")),
"fc": str(clangdir.join("first_in_path", "gfortran-8")),
"cc": str(new_dir / "gcc-8"),
"cxx": str(new_dir / "g++-8"),
"f77": str(new_dir / "gfortran-8"),
"fc": str(new_dir / "gfortran-8"),
}
def test_compiler_list_empty(no_compilers_yaml, working_env, compilers_dir):
"""Spack should not automatically search for compilers when listing them and none are
available. And when stdout is not a tty like in tests, there should be no output and
no error exit code.
"""
os.environ["PATH"] = str(compilers_dir)
out = compiler("list")
assert not out
assert compiler.returncode == 0
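All of the compiler-command tests above share one recipe: fabricate a tiny shell script that answers a compiler's version queries, put its directory on PATH, and let `spack compiler find` parse the output. A self-contained sketch of that recipe using only the standard library (the directory and the 8.4.0 answer are illustrative, not taken from the fixtures):

import os
import stat
import subprocess
import tempfile

bin_dir = tempfile.mkdtemp()
fake_gcc = os.path.join(bin_dir, "gcc")
with open(fake_gcc, "w") as f:
    # Answer only the query that detection cares about
    f.write('#!/bin/sh\nif [ "$1" = "-dumpversion" ]; then echo "8.4.0"; fi\n')
os.chmod(fake_gcc, os.stat(fake_gcc).st_mode | stat.S_IEXEC)

env = dict(os.environ, PATH=bin_dir)
version = subprocess.run(
    ["gcc", "-dumpversion"], env=env, capture_output=True, text=True
).stdout.strip()
print(version)  # "8.4.0" -- the string detection turns into gcc@8.4.0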


@ -396,17 +396,16 @@ def test_dev_build_rebuild_on_source_changes(
with envdir.as_cwd():
with open("spack.yaml", "w") as f:
f.write(
"""\
f"""\
spack:
specs:
- {test_spec}@0.0.0
develop:
dev-build-test-install:
spec: dev-build-test-install@0.0.0
path: {build_dir}
"""
)
env("create", "test", "./spack.yaml")


@ -390,6 +390,19 @@ def test_remove_after_concretize():
assert not any(s.name == "mpileaks" for s in env_specs)
def test_remove_before_concretize():
e = ev.create("test")
e.unify = True
e.add("mpileaks")
e.concretize()
e.remove("mpileaks")
e.concretize()
assert not list(e.concretized_specs())
def test_remove_command():
env("create", "test")
assert "test" in env("list")
@ -906,7 +919,7 @@ packages:
mpileaks:
version: ["2.2"]
libelf:
version: ["0.8.11"]
version: ["0.8.10"]
"""
)
@ -3299,3 +3312,22 @@ def test_environment_created_in_users_location(mutable_config, tmpdir):
assert dir_name in out
assert env_dir in ev.root(dir_name)
assert os.path.isdir(os.path.join(env_dir, dir_name))
def test_environment_created_from_lockfile_has_view(mock_packages, tmpdir):
"""When an env is created from a lockfile, a view should be generated for it"""
env_a = str(tmpdir.join("a"))
env_b = str(tmpdir.join("b"))
# Create an environment and install a package in it
env("create", "-d", env_a)
with ev.Environment(env_a):
add("libelf")
install("--fake")
# Create another environment from the lockfile of the first environment
env("create", "-d", env_b, os.path.join(env_a, "spack.lock"))
# Make sure the view was created
with ev.Environment(env_b) as e:
assert os.path.isdir(e.view_path_default)


@ -44,9 +44,8 @@ def define_plat_exe(exe):
def test_find_external_single_package(mock_executable, executables_found, _platform_executables):
pkgs_to_check = [spack.repo.path.get_pkg_class("cmake")]
cmake_path = mock_executable("cmake", output="echo cmake version 1.foo")
executables_found({str(cmake_path): define_plat_exe("cmake")})
pkg_to_entries = spack.detection.by_executable(pkgs_to_check)
@ -71,7 +70,7 @@ def test_find_external_two_instances_same_package(
"cmake", output="echo cmake version 3.17.2", subdir=("base2", "bin")
)
cmake_exe = define_plat_exe("cmake")
executables_found({str(cmake_path1): cmake_exe, str(cmake_path2): cmake_exe})
pkg_to_entries = spack.detection.by_executable(pkgs_to_check)
@ -107,7 +106,7 @@ def test_get_executables(working_env, mock_executable):
cmake_path1 = mock_executable("cmake", output="echo cmake version 1.foo")
path_to_exe = spack.detection.executables_in_path([os.path.dirname(cmake_path1)])
cmake_exe = define_plat_exe("cmake")
assert path_to_exe[str(cmake_path1)] == cmake_exe
external = SpackCommand("external")
@ -334,7 +333,7 @@ def test_packages_yaml_format(mock_executable, mutable_config, monkeypatch, _pla
assert "extra_attributes" in external_gcc
extra_attributes = external_gcc["extra_attributes"]
assert "prefix" not in extra_attributes
assert extra_attributes["compilers"]["c"] == gcc_exe
assert extra_attributes["compilers"]["c"] == str(gcc_exe)
def test_overriding_prefix(mock_executable, mutable_config, monkeypatch, _platform_executables):
@ -397,3 +396,30 @@ def test_use_tags_for_detection(command_args, mock_executable, mutable_config, m
assert "The following specs have been" in output
assert "cmake" in output
assert "openssl" not in output
@pytest.mark.regression("38733")
@pytest.mark.skipif(sys.platform == "win32", reason="the test uses bash scripts")
def test_failures_in_scanning_do_not_result_in_an_error(
mock_executable, monkeypatch, mutable_config
):
"""Tests that scanning paths with wrong permissions, won't cause `external find` to error."""
cmake_exe1 = mock_executable(
"cmake", output="echo cmake version 3.19.1", subdir=("first", "bin")
)
cmake_exe2 = mock_executable(
"cmake", output="echo cmake version 3.23.3", subdir=("second", "bin")
)
# Remove access from the first directory executable
cmake_exe1.parent.chmod(0o600)
value = os.pathsep.join([str(cmake_exe1.parent), str(cmake_exe2.parent)])
monkeypatch.setenv("PATH", value)
output = external("find", "cmake")
assert external.returncode == 0
assert "The following specs have been" in output
assert "cmake" in output
assert "3.23.3" in output
assert "3.19.1" not in output


@ -357,3 +357,18 @@ def test_find_loaded(database, working_env):
output = find("--loaded")
expected = find()
assert output == expected
@pytest.mark.regression("37712")
def test_environment_with_version_range_in_compiler_doesnt_fail(tmp_path):
"""Tests that having an active environment with a root spec containing a compiler constrained
by a version range (i.e. @X.Y rather than the single version @=X.Y) doesn't result in an error
when invoking "spack find".
"""
test_environment = ev.create_in_dir(tmp_path)
test_environment.add("zlib %gcc@12.1.0")
test_environment.write()
with test_environment:
output = find()
assert "zlib%gcc@12.1.0" in output


@ -319,3 +319,17 @@ def test_report_filename_for_cdash(install_mockery_mutable_config, mock_fetch):
spack.cmd.common.arguments.sanitize_reporter_options(args)
filename = spack.cmd.test.report_filename(args, suite)
assert filename != "https://blahblah/submit.php?project=debugging"
def test_test_output_multiple_specs(
mock_test_stage, mock_packages, mock_archive, mock_fetch, install_mockery_mutable_config
):
"""Ensure proper reporting for suite with skipped, failing, and passed tests."""
install("test-error", "simple-standalone-test@0.9", "simple-standalone-test@1.0")
out = spack_test("run", "test-error", "simple-standalone-test", fail_on_error=False)
# Note that a spec with passing *and* skipped tests is still considered
# to have passed at this level. If you want to see the spec-specific
# part result summaries, you'll have to look at the "test-out.txt" files
# for each spec.
assert "1 failed, 2 passed of 3 specs" in out


@ -337,8 +337,6 @@ class TestConcretize(object):
# Get the compiler that matches the spec ("clang@=12.2.0")
compiler = spack.compilers.compiler_for_spec("clang@=12.2.0", spec.architecture)
# Configure spack to have two identical compilers with different flags
default_dict = spack.compilers._to_dict(compiler)
@ -2137,7 +2135,7 @@ class TestConcretize(object):
{
"compiler": {
"spec": "gcc@foo",
"paths": {"cc": gcc_path, "cxx": gcc_path, "f77": None, "fc": None},
"paths": {"cc": str(gcc_path), "cxx": str(gcc_path), "f77": None, "fc": None},
"operating_system": "debian6",
"modules": [],
}


@ -152,7 +152,9 @@ class TestConcretizePreferences(object):
assert spec.version == Version("2.2")
def test_preferred_versions_mixed_version_types(self):
update_packages("mixedversions", "version", ["2.0"])
if spack.config.get("config:concretizer") == "original":
pytest.skip("This behavior is not enforced for the old concretizer")
update_packages("mixedversions", "version", ["=2.0"])
spec = concretize("mixedversions")
assert spec.version == Version("2.0")
@ -228,6 +230,29 @@ mpileaks:
spec.concretize()
assert spec.version == Version("3.5.0")
def test_preferred_undefined_raises(self):
"""Preference should not specify an undefined version"""
if spack.config.get("config:concretizer") == "original":
pytest.xfail("This behavior is not enforced for the old concretizer")
update_packages("python", "version", ["3.5.0.1"])
spec = Spec("python")
with pytest.raises(spack.config.ConfigError):
spec.concretize()
def test_preferred_truncated(self):
"""Versions without "=" are treated as version ranges: if there is
a satisfying version defined in the package.py, we should use that
(don't define a new version).
"""
if spack.config.get("config:concretizer") == "original":
pytest.skip("This behavior is not enforced for the old concretizer")
update_packages("python", "version", ["3.5"])
spec = Spec("python")
spec.concretize()
assert spec.satisfies("@3.5.1")
def test_develop(self):
"""Test concretization with develop-like versions"""
spec = Spec("develop-test")


@ -66,6 +66,28 @@ class V(Package):
)
_pkgt = (
"t",
"""\
class T(Package):
version('2.1')
version('2.0')
depends_on('u', when='@2.1:')
""",
)
_pkgu = (
"u",
"""\
class U(Package):
version('1.1')
version('1.0')
""",
)
@pytest.fixture
def create_test_repo(tmpdir, mutable_config):
repo_path = str(tmpdir)
@ -79,7 +101,7 @@ repo:
)
packages_dir = tmpdir.join("packages")
for pkg_name, pkg_str in [_pkgx, _pkgy, _pkgv, _pkgt, _pkgu]:
pkg_dir = packages_dir.ensure(pkg_name, dir=True)
pkg_file = pkg_dir.join("package.py")
with open(str(pkg_file), "w") as f:
@ -144,6 +166,45 @@ packages:
Spec("x@1.1").concretize()
def test_require_undefined_version(concretize_scope, test_repo):
"""If a requirement specifies a numbered version that isn't in
the associated package.py and isn't part of a Git hash
equivalence (hash=number), then Spack should raise an error
(it is assumed this is a typo, and raising the error here
avoids a likely error when Spack attempts to fetch the version).
"""
if spack.config.get("config:concretizer") == "original":
pytest.skip("Original concretizer does not support configuration requirements")
conf_str = """\
packages:
x:
require: "@1.2"
"""
update_packages_config(conf_str)
with pytest.raises(spack.config.ConfigError):
Spec("x").concretize()
def test_require_truncated(concretize_scope, test_repo):
"""A requirement specifies a version range, with satisfying
versions defined in the package.py. Make sure we choose one
of the defined versions (vs. allowing the requirement to
define a new version).
"""
if spack.config.get("config:concretizer") == "original":
pytest.skip("Original concretizer does not support configuration requirements")
conf_str = """\
packages:
x:
require: "@1"
"""
update_packages_config(conf_str)
xspec = Spec("x").concretized()
assert xspec.satisfies("@1.1")
def test_git_user_supplied_reference_satisfaction(
concretize_scope, test_repo, mock_git_version_info, monkeypatch
):
@ -220,6 +281,40 @@ packages:
assert s1.version.ref == a_commit_hash
def test_requirement_adds_version_satisfies(
concretize_scope, test_repo, mock_git_version_info, monkeypatch
):
"""Make sure that new versions added by requirements are factored into
conditions. In this case create a new version that satisfies a
depends_on condition and make sure it is triggered (i.e. the
dependency is added).
"""
if spack.config.get("config:concretizer") == "original":
pytest.skip("Original concretizer does not support configuration" " requirements")
repo_path, filename, commits = mock_git_version_info
monkeypatch.setattr(
spack.package_base.PackageBase, "git", path_to_file_url(repo_path), raising=False
)
# Sanity check: early version of T does not include U
s0 = Spec("t@2.0").concretized()
assert not ("u" in s0)
conf_str = """\
packages:
t:
require: "@{0}=2.2"
""".format(
commits[0]
)
update_packages_config(conf_str)
s1 = Spec("t").concretized()
assert "u" in s1
assert s1.satisfies("@2.2")
def test_requirement_adds_git_hash_version(
concretize_scope, test_repo, mock_git_version_info, monkeypatch
):
@ -272,8 +367,11 @@ packages:
def test_preference_adds_new_version(
concretize_scope, test_repo, mock_git_version_info, monkeypatch
):
"""Normally a preference cannot define a new version, but that constraint
is ignored if the version is a Git hash-based version.
"""
if spack.config.get("config:concretizer") == "original":
pytest.skip("Original concretizer does not support configuration requirements")
pytest.skip("Original concretizer does not enforce this constraint for preferences")
repo_path, filename, commits = mock_git_version_info
monkeypatch.setattr(
@ -296,6 +394,29 @@ packages:
assert not s3.satisfies("@2.3")
def test_external_adds_new_version_that_is_preferred(concretize_scope, test_repo):
"""Test that we can use a version, not declared in package recipe, as the
preferred version if that version appears in an external spec.
"""
if spack.config.get("config:concretizer") == "original":
pytest.skip("Original concretizer does not enforce this constraint for preferences")
conf_str = """\
packages:
y:
version: ["2.7"]
externals:
- spec: y@2.7 # Not defined in y
prefix: /fake/nonexistent/path/
buildable: false
"""
update_packages_config(conf_str)
spec = Spec("x").concretized()
assert spec["y"].satisfies("@2.7")
assert spack.version.Version("2.7") not in spec["y"].package.versions
def test_requirement_is_successfully_applied(concretize_scope, test_repo):
"""If a simple requirement can be satisfied, make sure the
concretization succeeds and the requirement spec is applied.


@ -1669,22 +1669,21 @@ def clear_directive_functions():
@pytest.fixture
def mock_executable(tmp_path):
"""Factory to create a mock executable in a temporary directory that
output a custom string when run.
"""
shebang = "#!/bin/sh\n" if sys.platform != "win32" else "@ECHO OFF"
def _factory(name, output, subdir=("bin",)):
executable_dir = tmp_path.joinpath(*subdir)
executable_dir.mkdir(parents=True, exist_ok=True)
executable_path = executable_dir / name
if sys.platform == "win32":
f += ".bat"
t = jinja2.Template("{{ shebang }}{{ output }}\n")
f.write(t.render(shebang=shebang, output=output))
f.chmod(0o755)
return str(f)
executable_path = executable_dir / (name + ".bat")
executable_path.write_text(f"{ shebang }{ output }\n")
executable_path.chmod(0o755)
return executable_path
return _factory
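How a test consumes the rewritten fixture: the factory now returns a `pathlib.Path`, so callers take `.parent` for the directory and wrap it in `str()` where a string is required. A hedged usage sketch (the test name and monkeypatch usage are illustrative, not from this diff):

def test_uses_fake_gcc(mock_executable, monkeypatch):
    gcc_path = mock_executable("gcc", output='echo "8.4.0"')  # a pathlib.Path
    monkeypatch.setenv("PATH", str(gcc_path.parent))
    # ... run detection; configuration should now report gcc@8.4.0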


@ -4,7 +4,7 @@ lmod:
hash_length: 0
core_compilers:
- 'clang@12.0.0'
core_specs:
- 'mpich@3.0.1'


@ -0,0 +1,5 @@
enable:
- lmod
lmod:
core_compilers:
- 'clang@12.0.0'


@ -0,0 +1,5 @@
enable:
- lmod
lmod:
core_compilers:
- 'clang@=12.0.0'


@ -0,0 +1,17 @@
#!/usr/bin/env bash
#
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
_module_raw() { return 1; };
module() { return 1; };
ml() { return 1; };
export -f _module_raw;
export -f module;
export -f ml;
export MODULES_AUTO_HANDLING=1
export __MODULES_LMCONFLICT='bar&foo'
export NEW_VAR=new
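This script only matters through `EnvironmentModifications.from_sourcing_file`, which sources the file in a subshell and diffs the environment before and after. A sketch of that round trip (the relative path is illustrative):

from spack.util.environment import EnvironmentModifications

env = EnvironmentModifications.from_sourcing_file("data/sourceme_modules.sh")
mods = env.group_by_name()
# Modules/Lmod bookkeeping (MODULES_*, __MODULES_*, BASH_FUNC_*) is dropped by
# the exclusion list exercised in the tests below, while genuine changes such
# as NEW_VAR survive.
assert "NEW_VAR" in mods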


@ -400,7 +400,7 @@ def test_sanitize_literals(env, exclude, include):
({"SHLVL": "1"}, ["SH.*"], [], [], ["SHLVL"]),
# Check we can include using a regex
({"SHLVL": "1"}, ["SH.*"], ["SH.*"], ["SHLVL"], []),
# Check regex to exclude Environment Modules related vars
(
{"MODULES_LMALTNAME": "1", "MODULES_LMCONFLICT": "2"},
["MODULES_(.*)"],
@ -415,6 +415,13 @@ def test_sanitize_literals(env, exclude, include):
[],
["A_modquar", "b_modquar", "C_modshare"],
),
(
{"__MODULES_LMTAG": "1", "__MODULES_LMPREREQ": "2"},
["__MODULES_(.*)"],
[],
[],
["__MODULES_LMTAG", "__MODULES_LMPREREQ"],
),
],
)
def test_sanitize_regex(env, exclude, include, expected, deleted):
@ -489,3 +496,19 @@ def test_exclude_lmod_variables():
# Check that variables related to lmod are not in there
modifications = env.group_by_name()
assert not any(x.startswith("LMOD_") for x in modifications)
@pytest.mark.skipif(sys.platform == "win32", reason="Not supported on Windows (yet)")
@pytest.mark.regression("13504")
def test_exclude_modules_variables():
# Construct the list of environment modifications
file = os.path.join(datadir, "sourceme_modules.sh")
env = EnvironmentModifications.from_sourcing_file(file)
# Check that variables related to modules are not in there
modifications = env.group_by_name()
assert not any(x.startswith("MODULES_") for x in modifications)
assert not any(x.startswith("__MODULES_") for x in modifications)
assert not any(x.startswith("BASH_FUNC_ml") for x in modifications)
assert not any(x.startswith("BASH_FUNC_module") for x in modifications)
assert not any(x.startswith("BASH_FUNC__module_raw") for x in modifications)


@ -1399,17 +1399,24 @@ def test_print_install_test_log_skipped(install_mockery, mock_packages, capfd, r
assert out == ""
def test_print_install_test_log_failures(
tmpdir, install_mockery, mock_packages, ensure_debug, capfd
):
"""Confirm expected error on attempt to print missing test log file."""
"""Confirm expected outputs when there are test failures."""
name = "trivial-install-test-package"
s = spack.spec.Spec(name).concretized()
pkg = s.package
# Missing test log is an error
pkg.run_tests = True
pkg.tester.test_log_file = str(tmpdir.join("test-log.txt"))
pkg.tester.add_failure(AssertionError("test"), "test-failure")
spack.installer.print_install_test_log(pkg)
err = capfd.readouterr()[1]
assert "no test log file" in err
# Having test log results in path being output
fs.touch(pkg.tester.test_log_file)
spack.installer.print_install_test_log(pkg)
out = capfd.readouterr()[0]
assert "See test results at" in out


@ -8,6 +8,8 @@ import sys
import pytest
import spack.cmd.modules
import spack.config
import spack.error
import spack.modules.tcl
import spack.package_base
@ -187,3 +189,31 @@ def test_load_installed_package_not_in_repo(install_mockery, mock_fetch, monkeyp
assert module_path
spack.package_base.PackageBase.uninstall_by_spec(spec)
@pytest.mark.regression("37649")
def test_check_module_set_name(mutable_config):
"""Tests that modules set name are validated correctly and an error is reported if the
name we require does not exist or is reserved by the configuration."""
# Minimal modules.yaml config.
spack.config.set(
"modules",
{
"prefix_inspections": {"./bin": ["PATH"]},
# module sets
"first": {},
"second": {},
},
)
# Valid module set name
spack.cmd.modules.check_module_set_name("first")
# Invalid module set names
msg = "Valid module set names are"
with pytest.raises(spack.config.ConfigError, match=msg):
spack.cmd.modules.check_module_set_name("prefix_inspections")
with pytest.raises(spack.config.ConfigError, match=msg):
spack.cmd.modules.check_module_set_name("third")


@ -45,6 +45,18 @@ def provider(request):
@pytest.mark.usefixtures("config", "mock_packages")
class TestLmod(object):
@pytest.mark.regression("37788")
@pytest.mark.parametrize("modules_config", ["core_compilers", "core_compilers_at_equal"])
def test_layout_for_specs_compiled_with_core_compilers(
self, modules_config, module_configuration, factory
):
"""Tests that specs compiled with core compilers are in the 'Core' folder. Also tests that
we can use both ``compiler@version`` and ``compiler@=version`` to specify a core compiler.
"""
module_configuration(modules_config)
module, spec = factory("libelf%clang@12.0.0")
assert "Core" in module.layout.available_path_parts
def test_file_layout(self, compiler, provider, factory, module_configuration):
"""Tests the layout of files in the hierarchy is the one expected."""
module_configuration("complex_hierarchy")
@ -61,7 +73,7 @@ class TestLmod(object):
# is transformed to r"Core" if the compiler is listed among core
# compilers
# Check that specs listed as core_specs are transformed to "Core"
if compiler == "clang@=3.3" or spec_string == "mpich@3.0.1":
if compiler == "clang@=12.0.0" or spec_string == "mpich@3.0.1":
assert "Core" in layout.available_path_parts
else:
assert compiler.replace("@=", "/") in layout.available_path_parts
@ -155,6 +167,46 @@ class TestLmod(object):
assert len([x for x in content if 'append_path("SPACE", "qux", " ")' in x]) == 1
assert len([x for x in content if 'remove_path("SPACE", "qux", " ")' in x]) == 1
@pytest.mark.regression("11355")
def test_manpath_setup(self, modulefile_content, module_configuration):
"""Tests specific setup of MANPATH environment variable."""
module_configuration("autoload_direct")
# no manpath set by module
content = modulefile_content("mpileaks")
assert len([x for x in content if 'append_path("MANPATH", "", ":")' in x]) == 0
# manpath set by module with prepend_path
content = modulefile_content("module-manpath-prepend")
assert (
len([x for x in content if 'prepend_path("MANPATH", "/path/to/man", ":")' in x]) == 1
)
assert (
len([x for x in content if 'prepend_path("MANPATH", "/path/to/share/man", ":")' in x])
== 1
)
assert len([x for x in content if 'append_path("MANPATH", "", ":")' in x]) == 1
# manpath set by module with append_path
content = modulefile_content("module-manpath-append")
assert len([x for x in content if 'append_path("MANPATH", "/path/to/man", ":")' in x]) == 1
assert len([x for x in content if 'append_path("MANPATH", "", ":")' in x]) == 1
# manpath set by module with setenv
content = modulefile_content("module-manpath-setenv")
assert len([x for x in content if 'setenv("MANPATH", "/path/to/man")' in x]) == 1
assert len([x for x in content if 'append_path("MANPATH", "", ":")' in x]) == 0
@pytest.mark.regression("29578")
def test_setenv_raw_value(self, modulefile_content, module_configuration):
"""Tests that we can set environment variable value without formatting it."""
module_configuration("autoload_direct")
content = modulefile_content("module-setenv-raw")
assert len([x for x in content if 'setenv("FOO", "{{name}}, {name}, {{}}, {}")' in x]) == 1
def test_help_message(self, modulefile_content, module_configuration):
"""Tests the generation of module help message."""
@ -180,6 +232,18 @@ class TestLmod(object):
)
assert help_msg in "".join(content)
content = modulefile_content("module-long-help target=core2")
help_msg = (
"help([[Name : module-long-help]])"
"help([[Version: 1.0]])"
"help([[Target : core2]])"
"help()"
"help([[Package to test long description message generated in modulefile."
"Message too long is wrapped over multiple lines.]])"
)
assert help_msg in "".join(content)
def test_exclude(self, modulefile_content, module_configuration):
"""Tests excluding the generation of selected modules."""
module_configuration("exclude")


@ -29,7 +29,7 @@ class TestTcl(object):
module_configuration("autoload_direct")
content = modulefile_content(mpich_spec_string)
assert "module-whatis {mpich @3.0.4}" in content
def test_autoload_direct(self, modulefile_content, module_configuration):
"""Tests the automatic loading of direct dependencies."""
@ -37,6 +37,11 @@ class TestTcl(object):
module_configuration("autoload_direct")
content = modulefile_content(mpileaks_spec_string)
assert (
len([x for x in content if "if {![info exists ::env(LMOD_VERSION_MAJOR)]} {" in x])
== 1
)
assert len([x for x in content if "depends-on " in x]) == 2
assert len([x for x in content if "module load " in x]) == 2
# dtbuild1 has
@ -46,6 +51,11 @@ class TestTcl(object):
# Just make sure the 'build' dependency is not there
content = modulefile_content("dtbuild1")
assert (
len([x for x in content if "if {![info exists ::env(LMOD_VERSION_MAJOR)]} {" in x])
== 1
)
assert len([x for x in content if "depends-on " in x]) == 2
assert len([x for x in content if "module load " in x]) == 2
# The configuration file sets the verbose keyword to False
@ -58,6 +68,11 @@ class TestTcl(object):
module_configuration("autoload_all")
content = modulefile_content(mpileaks_spec_string)
assert (
len([x for x in content if "if {![info exists ::env(LMOD_VERSION_MAJOR)]} {" in x])
== 1
)
assert len([x for x in content if "depends-on " in x]) == 5
assert len([x for x in content if "module load " in x]) == 5
# dtbuild1 has
@ -67,6 +82,11 @@ class TestTcl(object):
# Just make sure the 'build' dependency is not there
content = modulefile_content("dtbuild1")
assert (
len([x for x in content if "if {![info exists ::env(LMOD_VERSION_MAJOR)]} {" in x])
== 1
)
assert len([x for x in content if "depends-on " in x]) == 2
assert len([x for x in content if "module load " in x]) == 2
def test_prerequisites_direct(self, modulefile_content, module_configuration):
@ -92,17 +112,18 @@ class TestTcl(object):
content = modulefile_content("mpileaks platform=test target=x86_64")
assert len([x for x in content if x.startswith("prepend-path CMAKE_PREFIX_PATH")]) == 0
assert len([x for x in content if "setenv FOO {foo}" in x]) == 1
assert len([x for x in content if "setenv OMPI_MCA_mpi_leave_pinned {1}" in x]) == 1
assert len([x for x in content if "setenv OMPI_MCA_MPI_LEAVE_PINNED {1}" in x]) == 0
assert len([x for x in content if "unsetenv BAR" in x]) == 1
assert len([x for x in content if "setenv MPILEAKS_ROOT" in x]) == 1
content = modulefile_content("libdwarf platform=test target=core2")
assert len([x for x in content if x.startswith("prepend-path CMAKE_PREFIX_PATH")]) == 0
assert len([x for x in content if "setenv FOO {foo}" in x]) == 0
assert len([x for x in content if "unsetenv BAR" in x]) == 0
assert len([x for x in content if "depends-on foo/bar" in x]) == 1
assert len([x for x in content if "module load foo/bar" in x]) == 1
assert len([x for x in content if "setenv LIBDWARF_ROOT" in x]) == 1
@ -112,14 +133,63 @@ class TestTcl(object):
module_configuration("module_path_separator")
content = modulefile_content("module-path-separator")
assert len([x for x in content if "append-path --delim {:} COLON {foo}" in x]) == 1
assert len([x for x in content if "prepend-path --delim {:} COLON {foo}" in x]) == 1
assert len([x for x in content if "remove-path --delim {:} COLON {foo}" in x]) == 1
assert len([x for x in content if "append-path --delim {;} SEMICOLON {bar}" in x]) == 1
assert len([x for x in content if "prepend-path --delim {;} SEMICOLON {bar}" in x]) == 1
assert len([x for x in content if "remove-path --delim {;} SEMICOLON {bar}" in x]) == 1
assert len([x for x in content if "append-path --delim { } SPACE {qux}" in x]) == 1
assert len([x for x in content if "remove-path --delim { } SPACE {qux}" in x]) == 1
@pytest.mark.regression("11355")
def test_manpath_setup(self, modulefile_content, module_configuration):
"""Tests specific setup of MANPATH environment variable."""
module_configuration("autoload_direct")
# no manpath set by module
content = modulefile_content("mpileaks")
assert len([x for x in content if "append-path --delim {:} MANPATH {}" in x]) == 0
# manpath set by module with prepend-path
content = modulefile_content("module-manpath-prepend")
assert (
len([x for x in content if "prepend-path --delim {:} MANPATH {/path/to/man}" in x])
== 1
)
assert (
len(
[
x
for x in content
if "prepend-path --delim {:} MANPATH {/path/to/share/man}" in x
]
)
== 1
)
assert len([x for x in content if "append-path --delim {:} MANPATH {}" in x]) == 1
# manpath set by module with append-path
content = modulefile_content("module-manpath-append")
assert (
len([x for x in content if "append-path --delim {:} MANPATH {/path/to/man}" in x]) == 1
)
assert len([x for x in content if "append-path --delim {:} MANPATH {}" in x]) == 1
# manpath set by module with setenv
content = modulefile_content("module-manpath-setenv")
assert len([x for x in content if "setenv MANPATH {/path/to/man}" in x]) == 1
assert len([x for x in content if "append-path --delim {:} MANPATH {}" in x]) == 0
@pytest.mark.regression("29578")
def test_setenv_raw_value(self, modulefile_content, module_configuration):
"""Tests that we can set environment variable value without formatting it."""
module_configuration("autoload_direct")
content = modulefile_content("module-setenv-raw")
assert len([x for x in content if "setenv FOO {{{name}}, {name}, {{}}, {}}" in x]) == 1
def test_help_message(self, modulefile_content, module_configuration):
"""Tests the generation of module help message."""
@ -129,11 +199,11 @@ class TestTcl(object):
help_msg = (
"proc ModulesHelp { } {"
" puts stderr {Name : mpileaks}"
" puts stderr {Version: 2.3}"
" puts stderr {Target : core2}"
" puts stderr {}"
" puts stderr {Mpileaks is a mock package that passes audits}"
"}"
)
assert help_msg in "".join(content)
@ -142,9 +212,23 @@ class TestTcl(object):
help_msg = (
"proc ModulesHelp { } {"
" puts stderr {Name : libdwarf}"
" puts stderr {Version: 20130729}"
" puts stderr {Target : core2}"
"}"
)
assert help_msg in "".join(content)
content = modulefile_content("module-long-help target=core2")
help_msg = (
"proc ModulesHelp { } {"
" puts stderr {Name : module-long-help}"
" puts stderr {Version: 1.0}"
" puts stderr {Target : core2}"
" puts stderr {}"
" puts stderr {Package to test long description message generated in modulefile.}"
" puts stderr {Message too long is wrapped over multiple lines.}"
"}"
)
assert help_msg in "".join(content)
@ -299,14 +383,14 @@ class TestTcl(object):
content = modulefile_content("mpileaks")
assert len([x for x in content if "setenv FOOBAR" in x]) == 1
assert len([x for x in content if "setenv FOOBAR {mpileaks}" in x]) == 1
spec = spack.spec.Spec("mpileaks")
spec.concretize()
content = modulefile_content(str(spec["callpath"]))
assert len([x for x in content if "setenv FOOBAR" in x]) == 1
assert len([x for x in content if "setenv FOOBAR {callpath}" in x]) == 1
def test_override_config(self, module_configuration, factory):
"""Tests overriding some sections of the configuration file."""
@ -347,7 +431,7 @@ class TestTcl(object):
assert 'puts stderr "sentence from package"' in content
short_description = "module-whatis {This package updates the context for Tcl modulefiles.}"
assert short_description in content
@pytest.mark.regression("4400")
@ -394,10 +478,16 @@ class TestTcl(object):
# Test the mpileaks that should have the autoloaded dependencies
content = modulefile_content("mpileaks ^mpich2")
assert len([x for x in content if "depends-on " in x]) == 2
assert len([x for x in content if "module load " in x]) == 2
# Test the mpileaks that should NOT have the autoloaded dependencies
content = modulefile_content("mpileaks ^mpich")
assert (
len([x for x in content if "if {![info exists ::env(LMOD_VERSION_MAJOR)]} {" in x])
== 0
)
assert len([x for x in content if "depends-on " in x]) == 0
assert len([x for x in content if "module load " in x]) == 0
def test_modules_no_arch(self, factory, module_configuration):


@ -62,7 +62,7 @@ def source_file(tmpdir, is_relocatable):
src = tmpdir.join("relocatable.c")
shutil.copy(template_src, str(src))
else:
template_dirs = (os.path.join(spack.paths.test_path, "data", "templates"),)
env = spack.tengine.make_environment(template_dirs)
template = env.get_template("non_relocatable.c")
text = template.render({"prefix": spack.store.layout.root})
@ -246,12 +246,12 @@ def test_set_elf_rpaths(mock_patchelf):
# the call made to patchelf itself
patchelf = mock_patchelf("echo $@")
rpaths = ["/usr/lib", "/usr/lib64", "/opt/local/lib"]
output = spack.relocate._set_elf_rpaths(str(patchelf), rpaths)
# Assert that the arguments of the call to patchelf are as expected
assert "--force-rpath" in output
assert "--set-rpath " + ":".join(rpaths) in output
assert str(patchelf) in output
@skip_unless_linux
@ -261,7 +261,7 @@ def test_set_elf_rpaths_warning(mock_patchelf):
rpaths = ["/usr/lib", "/usr/lib64", "/opt/local/lib"]
# To avoid using capfd in order to check if the warning was triggered
# here we just check that output is not set
output = spack.relocate._set_elf_rpaths(str(patchelf), rpaths)
assert output is None


@ -71,7 +71,7 @@ class TestTengineEnvironment(object):
"""Tests the template retrieval mechanism hooked into config files"""
# Check the directories are correct
template_dirs = spack.config.get("config:template_dirs")
template_dirs = tuple([canonicalize_path(x) for x in template_dirs])
assert len(template_dirs) == 3
env = tengine.make_environment(template_dirs)


@ -12,6 +12,7 @@ from llnl.util.filesystem import join_path, mkdirp, touch
import spack.install_test
import spack.spec
from spack.install_test import TestStatus
from spack.util.executable import which
@ -20,7 +21,7 @@ def _true(*args, **kwargs):
return True
def ensure_results(filename, expected, present=True):
assert os.path.exists(filename)
with open(filename, "r") as fd:
lines = fd.readlines()
@ -29,7 +30,10 @@ def ensure_results(filename, expected):
if expected in line:
have = True
break
if present:
assert have, f"Expected '{expected}' in the file"
else:
assert not have, f"Expected '{expected}' NOT to be in the file"
def test_test_log_name(mock_packages, config):
@ -78,8 +82,8 @@ def test_write_test_result(mock_packages, mock_test_stage):
assert spec.name in msg
def test_test_not_installed(mock_packages, install_mockery, mock_test_stage):
"""Attempt to perform stand-alone test for not_installed package."""
spec = spack.spec.Spec("trivial-smoke-test").concretized()
test_suite = spack.install_test.TestSuite([spec])
@ -91,10 +95,7 @@ def test_test_uninstalled(mock_packages, install_mockery, mock_test_stage):
@pytest.mark.parametrize(
"arguments,status,msg",
[({}, TestStatus.SKIPPED, "Skipped"), ({"externals": True}, TestStatus.NO_TESTS, "No tests")],
)
def test_test_external(
mock_packages, install_mockery, mock_test_stage, monkeypatch, arguments, status, msg
@ -156,6 +157,7 @@ def test_test_spec_passes(mock_packages, install_mockery, mock_test_stage, monke
ensure_results(test_suite.results_file, "PASSED")
ensure_results(test_suite.log_file_for_spec(spec), "simple stand-alone")
ensure_results(test_suite.log_file_for_spec(spec), "standalone-ifc", present=False)
def test_get_test_suite():
@ -212,8 +214,10 @@ def test_test_functions_pkgless(mock_packages, install_mockery, ensure_debug, ca
spec = spack.spec.Spec("simple-standalone-test").concretized()
fns = spack.install_test.test_functions(spec.package, add_virtuals=True)
out = capsys.readouterr()
assert len(fns) == 1, "Expected only one test function"
assert "does not appear to have a package file" in out[1]
assert len(fns) == 2, "Expected two test functions"
for f in fns:
assert f[1].__name__ in ["test_echo", "test_skip"]
assert "virtual does not appear to have a package file" in out[1]
# TODO: This test should go away when compilers as dependencies is supported
@ -301,7 +305,7 @@ def test_test_part_fail(tmpdir, install_mockery_mutable_config, mock_fetch, mock
for part_name, status in pkg.tester.test_parts.items():
assert part_name.endswith(name)
assert status == TestStatus.FAILED
def test_test_part_pass(install_mockery_mutable_config, mock_fetch, mock_test_stage):
@ -317,7 +321,7 @@ def test_test_part_pass(install_mockery_mutable_config, mock_fetch, mock_test_st
for part_name, status in pkg.tester.test_parts.items():
assert part_name.endswith(name)
assert status == TestStatus.PASSED
def test_test_part_skip(install_mockery_mutable_config, mock_fetch, mock_test_stage):
@ -331,7 +335,7 @@ def test_test_part_skip(install_mockery_mutable_config, mock_fetch, mock_test_st
for part_name, status in pkg.tester.test_parts.items():
assert part_name.endswith(name)
assert status == TestStatus.SKIPPED
def test_test_part_missing_exe_fail_fast(
@ -354,7 +358,7 @@ def test_test_part_missing_exe_fail_fast(
assert len(test_parts) == 1
for part_name, status in test_parts.items():
assert part_name.endswith(name)
assert status == TestStatus.FAILED
def test_test_part_missing_exe(
@ -375,7 +379,90 @@ def test_test_part_missing_exe(
assert len(test_parts) == 1
for part_name, status in test_parts.items():
assert part_name.endswith(name)
assert status == TestStatus.FAILED
# TODO (embedded test parts): Update this once embedded test part tracking
# TODO (embedded test parts): properly handles the nested context managers.
@pytest.mark.parametrize(
"current,substatuses,expected",
[
(TestStatus.PASSED, [TestStatus.PASSED, TestStatus.PASSED], TestStatus.PASSED),
(TestStatus.FAILED, [TestStatus.PASSED, TestStatus.PASSED], TestStatus.FAILED),
(TestStatus.SKIPPED, [TestStatus.PASSED, TestStatus.PASSED], TestStatus.SKIPPED),
(TestStatus.NO_TESTS, [TestStatus.PASSED, TestStatus.PASSED], TestStatus.NO_TESTS),
(TestStatus.PASSED, [TestStatus.PASSED, TestStatus.SKIPPED], TestStatus.PASSED),
(TestStatus.PASSED, [TestStatus.PASSED, TestStatus.FAILED], TestStatus.FAILED),
(TestStatus.PASSED, [TestStatus.SKIPPED, TestStatus.SKIPPED], TestStatus.SKIPPED),
],
)
def test_embedded_test_part_status(
install_mockery_mutable_config, mock_fetch, mock_test_stage, current, substatuses, expected
):
"""Check to ensure the status of the enclosing test part reflects summary of embedded parts."""
s = spack.spec.Spec("trivial-smoke-test").concretized()
pkg = s.package
base_name = "test_example"
part_name = f"{pkg.__class__.__name__}::{base_name}"
pkg.tester.test_parts[part_name] = current
for i, status in enumerate(substatuses):
pkg.tester.test_parts[f"{part_name}_{i}"] = status
pkg.tester.status(base_name, current)
assert pkg.tester.test_parts[part_name] == expected
@pytest.mark.parametrize(
"statuses,expected",
[
([TestStatus.PASSED, TestStatus.PASSED], TestStatus.PASSED),
([TestStatus.PASSED, TestStatus.SKIPPED], TestStatus.PASSED),
([TestStatus.PASSED, TestStatus.FAILED], TestStatus.FAILED),
([TestStatus.SKIPPED, TestStatus.SKIPPED], TestStatus.SKIPPED),
([], TestStatus.NO_TESTS),
],
)
def test_write_tested_status(
tmpdir, install_mockery_mutable_config, mock_fetch, mock_test_stage, statuses, expected
):
"""Check to ensure the status of the enclosing test part reflects summary of embedded parts."""
s = spack.spec.Spec("trivial-smoke-test").concretized()
pkg = s.package
for i, status in enumerate(statuses):
pkg.tester.test_parts[f"test_{i}"] = status
pkg.tester.counts[status] += 1
pkg.tester.tested_file = tmpdir.join("test-log.txt")
pkg.tester.write_tested_status()
with open(pkg.tester.tested_file, "r") as f:
status = int(f.read().strip("\n"))
assert TestStatus(status) == expected
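The parametrized table encodes a simple reduction: any FAILED part fails the whole set, an all-SKIPPED set is SKIPPED, an empty set is NO_TESTS, and anything else PASSED. A standalone restatement (the `roll_up` helper is hypothetical, not Spack's code):

def roll_up(statuses):
    # Mirrors the expectations in the parametrized table above
    if not statuses:
        return "NO_TESTS"
    if any(s == "FAILED" for s in statuses):
        return "FAILED"
    if all(s == "SKIPPED" for s in statuses):
        return "SKIPPED"
    return "PASSED"

assert roll_up(["PASSED", "SKIPPED"]) == "PASSED"
assert roll_up(["SKIPPED", "SKIPPED"]) == "SKIPPED"
assert roll_up([]) == "NO_TESTS"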
@pytest.mark.regression("37840")
def test_write_tested_status_no_repeats(
tmpdir, install_mockery_mutable_config, mock_fetch, mock_test_stage
):
"""Emulate re-running the same stand-alone tests a second time."""
s = spack.spec.Spec("trivial-smoke-test").concretized()
pkg = s.package
statuses = [TestStatus.PASSED, TestStatus.PASSED]
for i, status in enumerate(statuses):
pkg.tester.test_parts[f"test_{i}"] = status
pkg.tester.counts[status] += 1
pkg.tester.tested_file = tmpdir.join("test-log.txt")
pkg.tester.write_tested_status()
pkg.tester.write_tested_status()
# The test should NOT result in a ValueError: invalid literal for int()
# with base 10: '2\n2' (i.e., the results being appended instead of
# written to the file).
with open(pkg.tester.tested_file, "r") as f:
status = int(f.read().strip("\n"))
assert TestStatus(status) == TestStatus.PASSED
def test_check_special_outputs(tmpdir):


@ -244,6 +244,7 @@ def check_ast_roundtrip(code1, filename="internal", mode="exec"):
assert ast.dump(ast1) == ast.dump(ast2), error_msg
@pytest.mark.xfail(reason="https://github.com/spack/spack/pull/38424")
def test_core_lib_files():
"""Roundtrip source files from the Python core libs."""
test_directories = [


@ -17,6 +17,7 @@ from llnl.util.filesystem import working_dir
import spack.package_base
import spack.spec
from spack.version import (
SEMVER_REGEX,
GitVersion,
StandardVersion,
Version,
@ -935,7 +936,7 @@ def test_inclusion_upperbound():
@pytest.mark.skipif(sys.platform == "win32", reason="Not supported on Windows (yet)")
def test_git_version_repo_attached_after_serialization(
mock_git_version_info, mock_packages, config, monkeypatch
):
"""Test that a GitVersion instance can be serialized and deserialized
without losing its repository reference.
@ -954,7 +955,9 @@ def test_git_version_repo_attached_after_serialization(
@pytest.mark.skipif(sys.platform == "win32", reason="Not supported on Windows (yet)")
def test_resolved_git_version_is_shown_in_str(
mock_git_version_info, mock_packages, config, monkeypatch
):
"""Test that a GitVersion from a commit without a user supplied version is printed
as <hash>=<version>, and not just <hash>."""
repo_path, _, commits = mock_git_version_info
@ -968,9 +971,31 @@ def test_resolved_git_version_is_shown_in_str(mock_git_version_info, mock_packag
assert str(spec.version) == f"{commit}=1.0-git.1"
def test_unresolvable_git_versions_error(config, mock_packages):
"""Test that VersionLookupError is raised when a git prop is not set on a package."""
with pytest.raises(VersionLookupError):
# The package exists, but does not have a git property set. When dereferencing
# the version, we should get VersionLookupError, not a generic AttributeError.
spack.spec.Spec(f"git-test-commit@{'a' * 40}").version.ref_version
@pytest.mark.parametrize(
"tag,expected",
[
("v100.2.3", "100.2.3"),
("v1.2.3", "1.2.3"),
("v1.2.3-pre.release+build.1", "1.2.3-pre.release+build.1"),
("v1.2.3+build.1", "1.2.3+build.1"),
("v1.2.3+build_1", None),
("v1.2.3-pre.release", "1.2.3-pre.release"),
("v1.2.3-pre_release", None),
("1.2.3", "1.2.3"),
("1.2.3.", None),
],
)
def test_semver_regex(tag, expected):
result = SEMVER_REGEX.search(tag)
if expected is None:
assert result is None
else:
assert result.group() == expected


@ -552,3 +552,8 @@ def traverse_tree(specs, cover="nodes", deptype="all", key=id, depth_first=True)
return traverse_breadth_first_tree_nodes(None, edges)
return traverse_edges(specs, order="pre", cover=cover, deptype=deptype, key=key, depth=True)
def by_dag_hash(s: "spack.spec.Spec") -> str:
"""Used very often as a key function for traversals."""
return s.dag_hash()
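Because the traversal functions above take an arbitrary `key` callable, this helper lets callers deduplicate specs by content rather than by Python object identity (`key=id`). A hedged sketch of the difference (assuming `specs` is a list of concrete specs):

# Two copies of the same concrete spec loaded separately have different id()s
# but identical dag_hash()es, so keying on the hash visits each node once:
unique = {}
for s in specs:
    unique.setdefault(by_dag_hash(s), s)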


@ -340,13 +340,20 @@ class NameValueModifier:
class SetEnv(NameValueModifier):
__slots__ = ("force",)
__slots__ = ("force", "raw")
def __init__(
self,
name: str,
value: str,
*,
trace: Optional[Trace] = None,
force: bool = False,
raw: bool = False,
):
super().__init__(name, value, trace=trace)
self.force = force
self.raw = raw
def execute(self, env: MutableMapping[str, str]):
tty.debug(f"SetEnv: {self.name}={str(self.value)}", level=3)
@ -490,15 +497,16 @@ class EnvironmentModifications:
return Trace(filename=filename, lineno=lineno, context=current_context)
@system_env_normalize
def set(self, name: str, value: str, *, force: bool = False, raw: bool = False):
"""Stores a request to set an environment variable.
Args:
name: name of the environment variable
value: value of the environment variable
force: if True, audit will not consider this modification a warning
raw: if True, the value string is stored verbatim and is not formatted
"""
item = SetEnv(name, value, trace=self._trace(), force=force, raw=raw)
self.env_modifications.append(item)
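What `raw=True` buys callers: the stored value skips the formatting step when modifications are rendered, which is what the `module-setenv-raw` module tests earlier in this diff depend on. A brief sketch (assuming `EnvironmentModifications` from this module):

env = EnvironmentModifications()
env.set("FOO", "{name}, {{}}", raw=True)  # braces survive verbatim
env.set("BAR", "plain-value")             # default path, formatted as before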
@system_env_normalize
@ -757,16 +765,21 @@ class EnvironmentModifications:
"PS1",
"PS2",
"ENV",
# Environment Modules or Lmod
"LOADEDMODULES",
"_LMFILES_",
"BASH_FUNC_module()",
"MODULEPATH",
"MODULES_(.*)",
r"(\w*)_mod(quar|share)",
# Lmod configuration
r"LMOD_(.*)",
"MODULERCFILE",
"BASH_FUNC_ml()",
"BASH_FUNC_module()",
# Environment Modules-specific configuration
"MODULESHOME",
"BASH_FUNC__module_raw()",
r"MODULES_(.*)",
r"__MODULES_(.*)",
r"(\w*)_mod(quar|share)",
# Lmod-specific configuration
r"LMOD_(.*)",
]
)


@ -4,6 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import errno
import math
import os
import shutil
@ -151,18 +152,17 @@ class FileCache(object):
return WriteTransaction(self._get_lock(key), acquire=WriteContextManager)
def mtime(self, key) -> float:
"""Return modification time of cache file, or -inf if it does not exist.
Time is in units returned by os.stat in the mtime field, which is
platform-dependent.
"""
if not self.init_entry(key):
return -math.inf
else:
return os.stat(self.cache_path(key)).st_mtime
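Why `-inf` is the right sentinel: staleness checks compare a source file's mtime against the cache's, and a source whose timestamps were normalized to 0 must still count as newer than a missing cache entry, which `0 > 0` never did. Illustrative comparison:

import math

source_mtime = 0.0         # e.g. a file from a tarball with normalized timestamps
missing_cache = -math.inf  # the new default
assert source_mtime > missing_cache  # missing cache is correctly treated as stale
assert not (source_mtime > 0)        # with the old sentinel of 0, it never was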
def remove(self, key):
file = self.cache_path(key)


@ -40,11 +40,16 @@ COMMIT_VERSION = re.compile(r"^[a-f0-9]{40}$")
SEGMENT_REGEX = re.compile(r"(?:(?P<num>[0-9]+)|(?P<str>[a-zA-Z]+))(?P<sep>[_.-]*)")
# regular expression for semantic versioning
_VERSION_CORE = r"\d+\.\d+\.\d+"
_IDENT = r"[0-9A-Za-z-]+"
_SEPARATED_IDENT = rf"{_IDENT}(?:\.{_IDENT})*"
_PRERELEASE = rf"\-{_SEPARATED_IDENT}"
_BUILD = rf"\+{_SEPARATED_IDENT}"
_SEMVER = rf"{_VERSION_CORE}(?:{_PRERELEASE})?(?:{_BUILD})?"
# clamp on the end, so versions like v1.2.3-rc1 will match
# without the leading 'v'.
SEMVER_REGEX = re.compile(rf"{_SEMVER}$")
# Infinity-like versions. The order in the list implies the comparison rules
infinity_versions = ["stable", "trunk", "head", "master", "main", "develop"]
@ -1319,11 +1324,10 @@ class CommitLookup(object):
commit_to_version[tag_commit] = v
break
else:
# try to parse tag to compare versions spack does not know
match = SEMVER_REGEX.search(tag)
if match:
semver = match.groupdict()["semver"]
commit_to_version[tag_commit] = semver
commit_to_version[tag_commit] = match.group()
ancestor_commits = []
for tag_commit in commit_to_version:


@ -141,12 +141,29 @@ ignore_missing_imports = true
ignore_errors = true
ignore_missing_imports = true
# Spack imports a number of external packages, and they *may* require Python 3.8 or
# higher in recent versions. This can cause mypy to fail because we check for 3.7
# compatibility. We could restrict mypy to run for the oldest supported version (3.7),
# but that means most developers won't be able to run mypy, which means it'll fail
# more in CI. Instead, we exclude these imported packages from mypy checking.
[[tool.mypy.overrides]]
module = [
'IPython',
'altgraph',
'attr',
'boto3',
'botocore',
'distro',
'jinja2',
'jsonschema',
'macholib',
'markupsafe',
'numpy',
'pyrsistent',
'pytest',
'ruamel.yaml',
'six',
]
follow_imports = 'skip'
follow_imports_for_stubs = true


@ -225,7 +225,7 @@ spack:
- - $compiler
- - $target
mirrors: { "mirror": "s3://spack-binaries/develop/aws-ahug-aarch64" }
mirrors: { "mirror": "s3://spack-binaries/releases/v0.20/aws-ahug-aarch64" }
ci:
pipeline-gen:


@ -222,7 +222,7 @@ spack:
- - $compiler
- - $target
mirrors: { "mirror": "s3://spack-binaries/develop/aws-ahug" }
mirrors: { "mirror": "s3://spack-binaries/releases/v0.20/aws-ahug" }
ci:
pipeline-gen:


@ -132,7 +132,7 @@ spack:
- - $target
mirrors: { "mirror": "s3://spack-binaries/develop/aws-isc-aarch64" }
mirrors: { "mirror": "s3://spack-binaries/releases/v0.20/aws-isc-aarch64" }
ci:
pipeline-gen:


@ -143,7 +143,7 @@ spack:
- - $target
mirrors: { "mirror": "s3://spack-binaries/develop/aws-isc" }
mirrors: { "mirror": "s3://spack-binaries/releases/v0.20/aws-isc" }
ci:
pipeline-gen:


@ -21,7 +21,7 @@ spack:
- - $default_specs
- - $arch
mirrors: { "mirror": "s3://spack-binaries/develop/build_systems" }
mirrors: { "mirror": "s3://spack-binaries/releases/v0.20/build_systems" }
cdash:
build-group: Build Systems


@ -6,9 +6,9 @@ spack:
mesa:
require: "+glx +osmesa +opengl ~opengles +llvm"
libosmesa:
require: "mesa +osmesa"
libglx:
require: "mesa +glx"
ospray:
require: "@2.8.0 +denoiser +mpi"
llvm:
@ -64,7 +64,7 @@ spack:
- [$sdk_base_spec]
- [$^visit_specs]
mirrors: { "mirror": "s3://spack-binaries/develop/data-vis-sdk" }
mirrors: { "mirror": "s3://spack-binaries/releases/v0.20/data-vis-sdk" }
ci:
pipeline-gen:


@ -21,7 +21,7 @@ spack:
- readline
mirrors:
mirror: s3://spack-binaries/releases/v0.20/deprecated
gitlab-ci:
broken-tests-packages:
- gptune


@ -24,7 +24,7 @@ spack:
- - $easy_specs
- - $arch
mirrors: { "mirror": "s3://spack-binaries/develop/e4s-mac" }
mirrors: { "mirror": "s3://spack-binaries/releases/v0.20/e4s-mac" }
ci:
pipeline-gen:


@@ -263,12 +263,16 @@ spack:
# SKIPPED
# - flecsi # dependency pfunit marks oneapi as an unsupported compiler
mirrors: { "mirror": "s3://spack-binaries/develop/e4s-oneapi" }
mirrors: { "mirror": "s3://spack-binaries/releases/v0.20/e4s-oneapi" }
ci:
pipeline-gen:
- build-job:
image: ecpe4s/ubuntu20.04-runner-x86_64-oneapi:2023-01-01
before_script:
- - . /bootstrap/runner/view/lmod/lmod/init/bash
- module use /opt/intel/oneapi/modulefiles
- module load compiler
cdash:
build-group: E4S OneAPI


@@ -207,7 +207,7 @@ spack:
# bricks: VSBrick-7pt.py-Scalar-8x8x8-1:30:3: error: 'vfloat512' was not declared in this scope
mirrors: { "mirror": "s3://spack-binaries/develop/e4s-power" }
mirrors: { "mirror": "s3://spack-binaries/releases/v0.20/e4s-power" }
ci:
pipeline-gen:


@@ -232,7 +232,7 @@ spack:
# CUDA failures
#- parsec +cuda # parsec/mca/device/cuda/transfer.c:168: multiple definition of `parsec_CUDA_d2h_max_flows';
mirrors: { "mirror": "s3://spack-binaries/develop/e4s" }
mirrors: { "mirror": "s3://spack-binaries/releases/v0.20/e4s" }
ci:
pipeline-gen:


@@ -51,7 +51,7 @@ spack:
# FAILURES
# - kokkos +wrapper +cuda cuda_arch=80 ^cuda@12.0.0 # https://github.com/spack/spack/issues/35378
mirrors: { "mirror": "s3://spack-binaries/develop/gpu-tests" }
mirrors: { "mirror": "s3://spack-binaries/releases/v0.20/gpu-tests" }
ci:
pipeline-gen:


@@ -77,7 +77,7 @@ spack:
- xgboost
mirrors:
mirror: s3://spack-binaries/develop/ml-linux-x86_64-cpu
mirror: s3://spack-binaries/releases/v0.20/ml-linux-x86_64-cpu
ci:
pipeline-gen:


@@ -80,7 +80,7 @@ spack:
- xgboost
mirrors:
mirror: s3://spack-binaries/develop/ml-linux-x86_64-cuda
mirror: s3://spack-binaries/releases/v0.20/ml-linux-x86_64-cuda
ci:
pipeline-gen:


@@ -83,7 +83,7 @@ spack:
- xgboost
mirrors:
mirror: s3://spack-binaries/develop/ml-linux-x86_64-rocm
mirror: s3://spack-binaries/releases/v0.20/ml-linux-x86_64-rocm
ci:
pipeline-gen:


@@ -38,7 +38,7 @@ spack:
- - $compiler
- - $target
mirrors: { "mirror": "s3://spack-binaries/develop/radiuss-aws-aarch64" }
mirrors: { "mirror": "s3://spack-binaries/releases/v0.20/radiuss-aws-aarch64" }
ci:
pipeline-gen:


@@ -44,7 +44,7 @@ spack:
- - $compiler
- - $target
mirrors: { "mirror": "s3://spack-binaries/develop/radiuss-aws" }
mirrors: { "mirror": "s3://spack-binaries/releases/v0.20/radiuss-aws" }
ci:
pipeline-gen:


@@ -41,7 +41,7 @@ spack:
- zfp
mirrors:
mirror: "s3://spack-binaries/develop/radiuss"
mirror: "s3://spack-binaries/releases/v0.20/radiuss"
specs:
- matrix:


@@ -50,7 +50,7 @@ spack:
- $gcc_spack_built_packages
mirrors:
mirror: s3://spack-binaries/develop/tutorial
mirror: s3://spack-binaries/releases/v0.20/tutorial
ci:
pipeline-gen:
- build-job:


@@ -1060,7 +1060,7 @@ _spack_external_list() {
}
_spack_external_read_cray_manifest() {
SPACK_COMPREPLY="-h --help --file --directory --dry-run --fail-on-error"
SPACK_COMPREPLY="-h --help --file --directory --ignore-default-dir --dry-run --fail-on-error"
}
_spack_fetch() {


@@ -37,7 +37,7 @@ RUN find -L {{ paths.view }}/* -type f -exec readlink -f '{}' \; | \
# Modifications to the environment that are necessary to run
RUN cd {{ paths.environment }} && \
spack env activate --sh -d . >> /etc/profile.d/z10_spack_environment.sh
spack env activate --sh -d . > activate.sh
{% if extra_instructions.build %}
{{ extra_instructions.build }}
@@ -53,7 +53,13 @@ COPY --from=builder {{ paths.environment }} {{ paths.environment }}
COPY --from=builder {{ paths.store }} {{ paths.store }}
COPY --from=builder {{ paths.hidden_view }} {{ paths.hidden_view }}
COPY --from=builder {{ paths.view }} {{ paths.view }}
COPY --from=builder /etc/profile.d/z10_spack_environment.sh /etc/profile.d/z10_spack_environment.sh
RUN { \
echo '#!/bin/sh' \
&& echo '.' {{ paths.environment }}/activate.sh \
&& echo 'exec "$@"'; \
} > /entrypoint.sh \
&& chmod a+x /entrypoint.sh
{% block final_stage %}
@@ -70,6 +76,6 @@ RUN {% if os_package_update %}{{ os_packages_final.update }} \
{% for label, value in labels.items() %}
LABEL "{{ label }}"="{{ value }}"
{% endfor %}
ENTRYPOINT ["/bin/bash", "--rcfile", "/etc/profile", "-l", "-c", "$*", "--" ]
ENTRYPOINT [ "/entrypoint.sh" ]
CMD [ "/bin/bash" ]
{% endif %}
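The net effect of the template change above: instead of relying on bash profile files, the image gets a small /entrypoint.sh that sources the environment activation script and then execs the container command. A sketch in Python of what the generated script amounts to; "/opt/spack-environment" is an assumed value for the rendered {{ paths.environment }}:

# Sketch only, not spack's code: reconstruct the entrypoint script that the
# RUN block above generates inside the image.
import os

env_dir = "/opt/spack-environment"  # assumption: a typical rendered path
entrypoint = "\n".join([
    "#!/bin/sh",
    ". %s/activate.sh" % env_dir,  # source the activation script written at build time
    'exec "$@"',                   # replace the shell with the container CMD
]) + "\n"

with open("entrypoint.sh", "w") as fh:  # lands at /entrypoint.sh in the image
    fh.write(entrypoint)
os.chmod("entrypoint.sh", 0o755)        # the chmod a+x step

Because the script execs "$@", the CMD (here /bin/bash) becomes the container's main process with the spack environment already applied, and signals reach it directly.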


@@ -1,7 +1,7 @@
{% extends "container/bootstrap-base.dockerfile" %}
{% block install_os_packages %}
RUN yum update -y \
&& yum install -y \
RUN dnf update -y \
&& dnf install -y \
bzip2 \
curl \
file \
@@ -23,6 +23,6 @@ RUN yum update -y \
unzip \
zstd \
&& pip3 install boto3 \
&& rm -rf /var/cache/yum \
&& yum clean all
&& rm -rf /var/cache/dnf \
&& dnf clean all
{% endblock %}


@@ -1,9 +1,9 @@
{% extends "container/bootstrap-base.dockerfile" %}
{% block install_os_packages %}
RUN yum update -y \
&& yum install -y epel-release \
&& yum update -y \
&& yum --enablerepo epel install -y \
RUN dnf update -y \
&& dnf install -y epel-release \
&& dnf update -y \
&& dnf --enablerepo epel install -y \
bzip2 \
curl-minimal \
file \
@@ -25,6 +25,6 @@ RUN yum update -y \
unzip \
zstd \
&& pip3 install boto3 \
&& rm -rf /var/cache/yum \
&& yum clean all
&& rm -rf /var/cache/dnf \
&& dnf clean all
{% endblock %}


@@ -1,13 +1,13 @@
{% extends "container/bootstrap-base.dockerfile" %}
{% block install_os_packages %}
RUN yum update -y \
RUN dnf update -y \
# See https://fedoraproject.org/wiki/EPEL#Quickstart for powertools
&& yum install -y dnf-plugins-core \
&& dnf install -y dnf-plugins-core \
&& dnf config-manager --set-enabled powertools \
&& yum install -y epel-release \
&& yum update -y \
&& yum --enablerepo epel groupinstall -y "Development Tools" \
&& yum --enablerepo epel install -y \
&& dnf install -y epel-release \
&& dnf update -y \
&& dnf --enablerepo epel groupinstall -y "Development Tools" \
&& dnf --enablerepo epel install -y \
curl \
findutils \
gcc-c++ \
@@ -24,6 +24,6 @@ RUN yum update -y \
python38-setuptools \
unzip \
&& pip3 install boto3 \
&& rm -rf /var/cache/yum \
&& yum clean all
&& rm -rf /var/cache/dnf \
&& dnf clean all
{% endblock %}


@@ -1,7 +1,7 @@
{% extends "container/bootstrap-base.dockerfile" %}
{% block install_os_packages %}
RUN yum update -y \
&& yum install -y \
RUN dnf update -y \
&& dnf install -y \
bzip2 \
curl \
file \
@@ -24,6 +24,6 @@ RUN yum update -y \
zstd \
xz \
&& pip3 install boto3 \
&& rm -rf /var/cache/yum \
&& yum clean all
&& rm -rf /var/cache/dnf \
&& dnf clean all
{% endblock %}


@@ -1,7 +1,7 @@
{% extends "container/bootstrap-base.dockerfile" %}
{% block install_os_packages %}
RUN yum update -y \
&& yum install -y \
RUN dnf update -y \
&& dnf install -y \
bzip2 \
curl \
file \
@@ -24,6 +24,6 @@ RUN yum update -y \
xz \
zstd \
&& pip3 install boto3 \
&& rm -rf /var/cache/yum \
&& yum clean all
&& rm -rf /var/cache/dnf \
&& dnf clean all
{% endblock %}


@@ -1,7 +1,7 @@
{% extends "container/bootstrap-base.dockerfile" %}
{% block install_os_packages %}
RUN yum update -y \
&& yum install -y \
RUN dnf update -y \
&& dnf install -y \
bzip2 \
curl \
file \
@@ -24,6 +24,6 @@ RUN yum update -y \
xz \
zstd \
&& pip3 install boto3 \
&& rm -rf /var/cache/yum \
&& yum clean all
&& rm -rf /var/cache/dnf \
&& dnf clean all
{% endblock %}


@@ -1,9 +1,9 @@
{% extends "container/bootstrap-base.dockerfile" %}
{% block install_os_packages %}
RUN yum update -y \
&& yum install -y epel-release \
&& yum update -y \
&& yum --enablerepo epel install -y \
RUN dnf update -y \
&& dnf install -y epel-release \
&& dnf update -y \
&& dnf --enablerepo epel install -y \
bzip2 \
curl-minimal \
file \
@@ -26,6 +26,6 @@ RUN yum update -y \
xz \
zstd \
&& pip3 install boto3 \
&& rm -rf /var/cache/yum \
&& yum clean all
&& rm -rf /var/cache/dnf \
&& dnf clean all
{% endblock %}


@@ -84,6 +84,10 @@ setenv("{{ cmd.name }}", "{{ cmd.value }}")
unsetenv("{{ cmd.name }}")
{% endif %}
{% endfor %}
{# Make sure system man pages are enabled by appending trailing delimiter to MANPATH #}
{% if has_manpath_modifications %}
append_path("MANPATH", "", ":")
{% endif %}
{% endblock %}
{% block footer %}
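What the trailing delimiter achieves, sketched briefly (the sample path is made up): appending an empty element leaves MANPATH ending in ":", and man-db treats an empty component as a placeholder for its built-in default path, so module-provided man pages do not hide the system ones.

# Illustration of the effect of append_path("MANPATH", "", ":").
manpath = "/opt/view/share/man"      # made-up module-managed man directory
manpath = ":".join([manpath, ""])    # appending the empty element
print(manpath)                       # -> "/opt/view/share/man:"
print(manpath.split(":"))            # -> ['/opt/view/share/man', '']
# man-db expands that empty trailing component to its default search list.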


@@ -11,24 +11,32 @@
{% block header %}
{% if short_description %}
module-whatis "{{ short_description }}"
module-whatis {{ '{' }}{{ short_description }}{{ '}' }}
{% endif %}
proc ModulesHelp { } {
puts stderr "Name : {{ spec.name }}"
puts stderr "Version: {{ spec.version }}"
puts stderr "Target : {{ spec.target }}"
puts stderr {{ '{' }}Name : {{ spec.name }}{{ '}' }}
puts stderr {{ '{' }}Version: {{ spec.version }}{{ '}' }}
puts stderr {{ '{' }}Target : {{ spec.target }}{{ '}' }}
{% if long_description %}
puts stderr ""
{{ long_description| textwrap(72)| quote()| prepend_to_line(' puts stderr ')| join() }}
puts stderr {}
{{ long_description| textwrap(72)| curly_quote()| prepend_to_line(' puts stderr ')| join() }}
{% endif %}
}
{% endblock %}
{% block autoloads %}
{% if autoload|length > 0 %}
if {![info exists ::env(LMOD_VERSION_MAJOR)]} {
{% for module in autoload %}
module load {{ module }}
module load {{ module }}
{% endfor %}
} else {
{% for module in autoload %}
depends-on {{ module }}
{% endfor %}
}
{% endif %}
{% endblock %}
{# #}
{% block prerequisite %}
@@ -46,18 +54,22 @@ conflict {{ name }}
{% block environment %}
{% for command_name, cmd in environment_modifications %}
{% if command_name == 'PrependPath' %}
prepend-path --delim "{{ cmd.separator }}" {{ cmd.name }} "{{ cmd.value }}"
prepend-path --delim {{ '{' }}{{ cmd.separator }}{{ '}' }} {{ cmd.name }} {{ '{' }}{{ cmd.value }}{{ '}' }}
{% elif command_name in ('AppendPath', 'AppendFlagsEnv') %}
append-path --delim "{{ cmd.separator }}" {{ cmd.name }} "{{ cmd.value }}"
append-path --delim {{ '{' }}{{ cmd.separator }}{{ '}' }} {{ cmd.name }} {{ '{' }}{{ cmd.value }}{{ '}' }}
{% elif command_name in ('RemovePath', 'RemoveFlagsEnv') %}
remove-path --delim "{{ cmd.separator }}" {{ cmd.name }} "{{ cmd.value }}"
remove-path --delim {{ '{' }}{{ cmd.separator }}{{ '}' }} {{ cmd.name }} {{ '{' }}{{ cmd.value }}{{ '}' }}
{% elif command_name == 'SetEnv' %}
setenv {{ cmd.name }} "{{ cmd.value }}"
setenv {{ cmd.name }} {{ '{' }}{{ cmd.value }}{{ '}' }}
{% elif command_name == 'UnsetEnv' %}
unsetenv {{ cmd.name }}
{% endif %}
{# #}
{% endfor %}
{# Make sure system man pages are enabled by appending trailing delimiter to MANPATH #}
{% if has_manpath_modifications %}
append-path --delim {{ '{' }}:{{ '}' }} MANPATH {{ '{' }}{{ '}' }}
{% endif %}
{% endblock %}
{% block footer %}
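The Tcl template now quotes values with curly braces via a curly_quote() filter instead of quote(). Inside braces Tcl performs no substitution, so characters such as $, [ and ] survive verbatim; the one constraint is that any braces inside the quoted text must themselves be balanced. A hypothetical Python sketch of such a filter (the real spack filter may differ):

# Hypothetical sketch of a curly_quote-style Jinja filter; not spack's
# actual implementation.
def curly_quote(lines):
    """Wrap each line in { } so Tcl keeps the content verbatim."""
    out = []
    for line in lines:
        # Tcl requires braces inside a braced word to be balanced,
        # otherwise the parser mis-detects where the word ends.
        if line.count("{") != line.count("}"):
            raise ValueError("unbalanced curly braces: %r" % line)
        out.append("{%s}" % line)
    return out

print(curly_quote(["Name : zlib", "uses $PATH and [llength]"]))
# -> ['{Name : zlib}', '{uses $PATH and [llength]}']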
