Merge remote-tracking branch 'origin/master' into provenance
Commit: f12b4b902a
.github/ISSUE_TEMPLATE/bug_report.md (vendored, 50 changes)
@@ -1,36 +1,54 @@
---
name: Bug report
about: Create a report to help us improve
about: Report unexpected or incorrect behaviour
title: ''
labels: bug
assignees: ''

---

**Describe the bug**
## Describe the bug

A clear and concise description of what the bug is.
<!--
A clear and concise description of what the bug is.

If you have a problem with a specific package or NixOS,
you probably want to file an issue at https://github.com/NixOS/nixpkgs/issues.
If you have a problem with a specific package or NixOS,
you probably want to file an issue at https://github.com/NixOS/nixpkgs/issues.
-->

**Steps To Reproduce**
## Steps To Reproduce

1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
4. See error
<!--
Example:

**Expected behavior**
1. Clone this repository: ...
2. Run `nix-... ...`
3. Observe unexpected behaviour
-->

A clear and concise description of what you expected to happen.
## Expected behavior

**`nix-env --version` output**
<!-- A clear and concise description of what you expected to happen. -->

**Additional context**
## Metadata

Add any other context about the problem here.
<!-- Please insert the output of running `nix-env --version` below this line -->

**Priorities**
## Additional context

<!-- Add any other context about the problem here. -->

## Checklist

<!-- make sure this issue is not redundant or obsolete -->

- [ ] checked [latest Nix manual] \([source])
- [ ] checked [open bug issues and pull requests] for possible duplicates

[latest Nix manual]: https://nixos.org/manual/nix/unstable/
[source]: https://github.com/NixOS/nix/tree/master/doc/manual/source
[open bug issues and pull requests]: https://github.com/NixOS/nix/labels/bug

---

Add :+1: to [issues you find important](https://github.com/NixOS/nix/issues?q=is%3Aissue+is%3Aopen+sort%3Areactions-%2B1-desc).
.github/ISSUE_TEMPLATE/feature_request.md (vendored, 35 changes)
@@ -1,24 +1,39 @@
---
name: Feature request
about: Suggest an idea for this project
about: Suggest a new feature
title: ''
labels: feature
assignees: ''

---

**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
## Is your feature request related to a problem?

**Describe the solution you'd like**
A clear and concise description of what you want to happen.
<!-- A clear and concise description of what the problem is. Ex. I'm always frustrated when [...] -->

**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.
## Proposed solution

**Additional context**
Add any other context or screenshots about the feature request here.
<!-- A clear and concise description of what you want to happen. -->

**Priorities**
## Alternative solutions

<!-- A clear and concise description of any alternative solutions or features you've considered. -->

## Additional context

<!-- Add any other context or screenshots about the feature request here. -->

## Checklist

<!-- make sure this issue is not redundant or obsolete -->

- [ ] checked [latest Nix manual] \([source])
- [ ] checked [open feature issues and pull requests] for possible duplicates

[latest Nix manual]: https://nixos.org/manual/nix/unstable/
[source]: https://github.com/NixOS/nix/tree/master/doc/manual/source
[open feature issues and pull requests]: https://github.com/NixOS/nix/labels/feature

---

Add :+1: to [issues you find important](https://github.com/NixOS/nix/issues?q=is%3Aissue+is%3Aopen+sort%3Areactions-%2B1-desc).
.github/ISSUE_TEMPLATE/installer.md (vendored, 17 changes)
@@ -23,14 +23,25 @@ assignees: ''
<details><summary>Output</summary>

```log
<!-- paste console output inside the below code block -->

<!-- paste console output here and remove this comment -->
```log

```

</details>

## Priorities
## Checklist

<!-- make sure this issue is not redundant or obsolete -->

- [ ] checked [latest Nix manual] \([source])
- [ ] checked [open installer issues and pull requests] for possible duplicates

[latest Nix manual]: https://nixos.org/manual/nix/unstable/
[source]: https://github.com/NixOS/nix/tree/master/doc/manual/source
[open installer issues and pull requests]: https://github.com/NixOS/nix/labels/installer

---

Add :+1: to [issues you find important](https://github.com/NixOS/nix/issues?q=is%3Aissue+is%3Aopen+sort%3Areactions-%2B1-desc).
@@ -26,6 +26,6 @@ assignees: ''
[source]: https://github.com/NixOS/nix/tree/master/doc/manual/source
[open documentation issues and pull requests]: https://github.com/NixOS/nix/labels/documentation

## Priorities
---

Add :+1: to [issues you find important](https://github.com/NixOS/nix/issues?q=is%3Aissue+is%3Aopen+sort%3Areactions-%2B1-desc).
.github/PULL_REQUEST_TEMPLATE.md (vendored, 8 changes)
@@ -17,10 +17,12 @@ so you understand the process and the expectations.
-->

# Motivation
## Motivation

<!-- Briefly explain what the change is about and why it is desirable. -->

# Context
## Context

<!-- Provide context. Reference open issues if available. -->

<!-- Non-trivial change: Briefly outline the implementation strategy. -->
@@ -29,7 +31,7 @@ so you understand the process and the expectations.

<!-- Large change: Provide instructions to reviewers how to read the diff. -->

# Priorities and Process
---

Add :+1: to [pull requests you find important](https://github.com/NixOS/nix/pulls?q=is%3Aopen+sort%3Areactions-%2B1-desc).
.github/workflows/ci.yml (vendored, 12 changes)
@@ -128,7 +128,7 @@ jobs:
- run: exec bash -c "nix-channel --update && nix-env -iA nixpkgs.hello && hello"

docker_push_image:
needs: [check_secrets, tests]
needs: [check_secrets, tests, vm_tests]
permissions:
contents: read
packages: write
@@ -194,7 +194,13 @@ jobs:
- uses: actions/checkout@v4
- uses: DeterminateSystems/nix-installer-action@main
- uses: DeterminateSystems/magic-nix-cache-action@main
- run: nix build -L .#hydraJobs.tests.githubFlakes .#hydraJobs.tests.tarballFlakes .#hydraJobs.tests.functional_user
- run: |
nix build -L \
.#hydraJobs.tests.functional_user \
.#hydraJobs.tests.githubFlakes \
.#hydraJobs.tests.nix-docker \
.#hydraJobs.tests.tarballFlakes \
;

flake_regressions:
needs: vm_tests
@@ -214,4 +220,4 @@ jobs:
path: flake-regressions/tests
- uses: DeterminateSystems/nix-installer-action@main
- uses: DeterminateSystems/magic-nix-cache-action@main
- run: nix build --out-link ./new-nix && PATH=$(pwd)/new-nix/bin:$PATH scripts/flake-regressions.sh
- run: nix build -L --out-link ./new-nix && PATH=$(pwd)/new-nix/bin:$PATH MAX_FLAKES=25 flake-regressions/eval-all.sh
.gitignore (vendored, 3 changes)
@@ -102,9 +102,6 @@ perl/Makefile.config
/tests/functional/restricted-innocent
/tests/functional/shell
/tests/functional/shell.drv
/tests/functional/config.nix
/tests/functional/ca/config.nix
/tests/functional/dyn-drv/config.nix
/tests/functional/repl-result-out
/tests/functional/debugger-test-out
/tests/functional/test-libstoreconsumer/test-libstoreconsumer
Makefile (128 changes)
@@ -1,128 +0,0 @@
# External build directory support

include mk/build-dir.mk

-include $(buildprefix)Makefile.config
clean-files += $(buildprefix)Makefile.config

# List makefiles

include mk/platform.mk

ifeq ($(ENABLE_BUILD), yes)
makefiles = \
mk/precompiled-headers.mk \
local.mk \
src/libutil/local.mk \
src/libstore/local.mk \
src/libfetchers/local.mk \
src/libmain/local.mk \
src/libexpr/local.mk \
src/libflake/local.mk \
src/libcmd/local.mk \
src/nix/local.mk \
src/libutil-c/local.mk \
src/libstore-c/local.mk \
src/libexpr-c/local.mk

ifdef HOST_UNIX
makefiles += \
scripts/local.mk \
maintainers/local.mk \
misc/bash/local.mk \
misc/fish/local.mk \
misc/zsh/local.mk \
misc/systemd/local.mk \
misc/launchd/local.mk \
misc/upstart/local.mk
endif
endif

ifeq ($(ENABLE_UNIT_TESTS), yes)
makefiles += \
src/libutil-tests/local.mk \
src/libutil-test-support/local.mk \
src/libstore-tests/local.mk \
src/libstore-test-support/local.mk \
src/libfetchers-tests/local.mk \
src/libexpr-tests/local.mk \
src/libexpr-test-support/local.mk \
src/libflake-tests/local.mk
endif

ifeq ($(ENABLE_FUNCTIONAL_TESTS), yes)
ifdef HOST_UNIX
makefiles += \
tests/functional/local.mk \
tests/functional/flakes/local.mk \
tests/functional/ca/local.mk \
tests/functional/git-hashing/local.mk \
tests/functional/dyn-drv/local.mk \
tests/functional/local-overlay-store/local.mk \
tests/functional/test-libstoreconsumer/local.mk \
tests/functional/plugins/local.mk
endif
endif

# Some makefiles require access to built programs and must be included late.
makefiles-late =

ifeq ($(ENABLE_DOC_GEN), yes)
makefiles-late += doc/manual/local.mk
endif

# Miscellaneous global Flags

OPTIMIZE = 1

ifeq ($(OPTIMIZE), 1)
GLOBAL_CXXFLAGS += -O3 $(CXXLTO)
GLOBAL_LDFLAGS += $(CXXLTO)
else
GLOBAL_CXXFLAGS += -O0 -U_FORTIFY_SOURCE
unexport NIX_HARDENING_ENABLE
endif

ifdef HOST_WINDOWS
# Windows DLLs are stricter about symbol visibility than Unix shared
# objects --- see https://gcc.gnu.org/wiki/Visibility for details.
# This is a temporary sledgehammer to export everything like on Unix,
# and not detail with this yet.
#
# TODO do not do this, and instead do fine-grained export annotations.
GLOBAL_LDFLAGS += -Wl,--export-all-symbols
endif

GLOBAL_CXXFLAGS += -g -Wall -Wdeprecated-copy -Wignored-qualifiers -Wimplicit-fallthrough -Werror=unused-result -Werror=suggest-override -include $(buildprefix)config.h -std=c++2a -I src

# Include the main lib, causing rules to be defined

include mk/lib.mk

# Fallback stub rules for better UX when things are disabled
#
# These must be defined after `mk/lib.mk`. Otherwise the first rule
# incorrectly becomes the default target.

ifneq ($(ENABLE_UNIT_TESTS), yes)
.PHONY: check
check:
@echo "Unit tests are disabled. Configure without '--disable-unit-tests', or avoid calling 'make check'."
@exit 1
endif

ifneq ($(ENABLE_FUNCTIONAL_TESTS), yes)
.PHONY: installcheck
installcheck:
@echo "Functional tests are disabled. Configure without '--disable-functional-tests', or avoid calling 'make installcheck'."
@exit 1
endif

# Documentation fallback stub rules.

ifneq ($(ENABLE_DOC_GEN), yes)
.PHONY: manual-html manpages
manual-html manpages:
@echo "Generated docs are disabled. Configure without '--disable-doc-gen', or avoid calling 'make manpages' and 'make manual-html'."
@exit 1
endif
@@ -1,54 +0,0 @@
AR = @AR@
BDW_GC_LIBS = @BDW_GC_LIBS@
BOOST_LDFLAGS = @BOOST_LDFLAGS@
BUILD_SHARED_LIBS = @BUILD_SHARED_LIBS@
CC = @CC@
CFLAGS = @CFLAGS@
CXX = @CXX@
CXXFLAGS = @CXXFLAGS@
CXXLTO = @CXXLTO@
EDITLINE_LIBS = @EDITLINE_LIBS@
ENABLE_BUILD = @ENABLE_BUILD@
ENABLE_DOC_GEN = @ENABLE_DOC_GEN@
ENABLE_FUNCTIONAL_TESTS = @ENABLE_FUNCTIONAL_TESTS@
ENABLE_S3 = @ENABLE_S3@
ENABLE_UNIT_TESTS = @ENABLE_UNIT_TESTS@
GTEST_LIBS = @GTEST_LIBS@
HAVE_LIBCPUID = @HAVE_LIBCPUID@
HAVE_SECCOMP = @HAVE_SECCOMP@
HOST_OS = @host_os@
INSTALL_UNIT_TESTS = @INSTALL_UNIT_TESTS@
LDFLAGS = @LDFLAGS@
LIBARCHIVE_LIBS = @LIBARCHIVE_LIBS@
LIBBROTLI_LIBS = @LIBBROTLI_LIBS@
LIBCURL_LIBS = @LIBCURL_LIBS@
LIBGIT2_LIBS = @LIBGIT2_LIBS@
LIBSECCOMP_LIBS = @LIBSECCOMP_LIBS@
LOWDOWN_LIBS = @LOWDOWN_LIBS@
OPENSSL_LIBS = @OPENSSL_LIBS@
PACKAGE_NAME = @PACKAGE_NAME@
PACKAGE_VERSION = @PACKAGE_VERSION@
SHELL = @bash@
SODIUM_LIBS = @SODIUM_LIBS@
SQLITE3_LIBS = @SQLITE3_LIBS@
bash = @bash@
bindir = @bindir@
checkbindir = @checkbindir@
checklibdir = @checklibdir@
datadir = @datadir@
datarootdir = @datarootdir@
docdir = @docdir@
embedded_sandbox_shell = @embedded_sandbox_shell@
exec_prefix = @exec_prefix@
includedir = @includedir@
libdir = @libdir@
libexecdir = @libexecdir@
localstatedir = @localstatedir@
lsof = @lsof@
mandir = @mandir@
pkglibdir = $(libdir)/$(PACKAGE_NAME)
prefix = @prefix@
sandbox_shell = @sandbox_shell@
storedir = @storedir@
sysconfdir = @sysconfdir@
system = @system@
build-utils-meson/libatomic/meson.build (new file, 8 changes)
@@ -0,0 +1,8 @@
# Check if -latomic is needed
# This is needed for std::atomic on some platforms
# We did not manage to test this reliably on all platforms, so we hardcode
# it for now.
if host_machine.cpu_family() == 'arm'
deps_other += cxx.find_library('atomic')
endif
build-utils-meson/windows-version/meson.build (new file, 6 changes)
@@ -0,0 +1,6 @@
if host_machine.system() == 'windows'
# https://learn.microsoft.com/en-us/cpp/porting/modifying-winver-and-win32-winnt?view=msvc-170
# #define _WIN32_WINNT_WIN8 0x0602
# We currently don't use any API which requires higher than this.
add_project_arguments([ '-D_WIN32_WINNT=0x0602' ], language: 'cpp')
endif
@@ -1,527 +0,0 @@
|
||||
#!/bin/sh
|
||||
# install - install a program, script, or datafile
|
||||
|
||||
scriptversion=2011-11-20.07; # UTC
|
||||
|
||||
# This originates from X11R5 (mit/util/scripts/install.sh), which was
|
||||
# later released in X11R6 (xc/config/util/install.sh) with the
|
||||
# following copyright and license.
|
||||
#
|
||||
# Copyright (C) 1994 X Consortium
|
||||
#
|
||||
# Permission is hereby granted, free of charge, to any person obtaining a copy
|
||||
# of this software and associated documentation files (the "Software"), to
|
||||
# deal in the Software without restriction, including without limitation the
|
||||
# rights to use, copy, modify, merge, publish, distribute, sublicense, and/or
|
||||
# sell copies of the Software, and to permit persons to whom the Software is
|
||||
# furnished to do so, subject to the following conditions:
|
||||
#
|
||||
# The above copyright notice and this permission notice shall be included in
|
||||
# all copies or substantial portions of the Software.
|
||||
#
|
||||
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
|
||||
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
|
||||
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
|
||||
# X CONSORTIUM BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN
|
||||
# AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNEC-
|
||||
# TION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
|
||||
#
|
||||
# Except as contained in this notice, the name of the X Consortium shall not
|
||||
# be used in advertising or otherwise to promote the sale, use or other deal-
|
||||
# ings in this Software without prior written authorization from the X Consor-
|
||||
# tium.
|
||||
#
|
||||
#
|
||||
# FSF changes to this file are in the public domain.
|
||||
#
|
||||
# Calling this script install-sh is preferred over install.sh, to prevent
|
||||
# 'make' implicit rules from creating a file called install from it
|
||||
# when there is no Makefile.
|
||||
#
|
||||
# This script is compatible with the BSD install script, but was written
|
||||
# from scratch.
|
||||
|
||||
nl='
|
||||
'
|
||||
IFS=" "" $nl"
|
||||
|
||||
# set DOITPROG to echo to test this script
|
||||
|
||||
# Don't use :- since 4.3BSD and earlier shells don't like it.
|
||||
doit=${DOITPROG-}
|
||||
if test -z "$doit"; then
|
||||
doit_exec=exec
|
||||
else
|
||||
doit_exec=$doit
|
||||
fi
|
||||
|
||||
# Put in absolute file names if you don't have them in your path;
|
||||
# or use environment vars.
|
||||
|
||||
chgrpprog=${CHGRPPROG-chgrp}
|
||||
chmodprog=${CHMODPROG-chmod}
|
||||
chownprog=${CHOWNPROG-chown}
|
||||
cmpprog=${CMPPROG-cmp}
|
||||
cpprog=${CPPROG-cp}
|
||||
mkdirprog=${MKDIRPROG-mkdir}
|
||||
mvprog=${MVPROG-mv}
|
||||
rmprog=${RMPROG-rm}
|
||||
stripprog=${STRIPPROG-strip}
|
||||
|
||||
posix_glob='?'
|
||||
initialize_posix_glob='
|
||||
test "$posix_glob" != "?" || {
|
||||
if (set -f) 2>/dev/null; then
|
||||
posix_glob=
|
||||
else
|
||||
posix_glob=:
|
||||
fi
|
||||
}
|
||||
'
|
||||
|
||||
posix_mkdir=
|
||||
|
||||
# Desired mode of installed file.
|
||||
mode=0755
|
||||
|
||||
chgrpcmd=
|
||||
chmodcmd=$chmodprog
|
||||
chowncmd=
|
||||
mvcmd=$mvprog
|
||||
rmcmd="$rmprog -f"
|
||||
stripcmd=
|
||||
|
||||
src=
|
||||
dst=
|
||||
dir_arg=
|
||||
dst_arg=
|
||||
|
||||
copy_on_change=false
|
||||
no_target_directory=
|
||||
|
||||
usage="\
|
||||
Usage: $0 [OPTION]... [-T] SRCFILE DSTFILE
|
||||
or: $0 [OPTION]... SRCFILES... DIRECTORY
|
||||
or: $0 [OPTION]... -t DIRECTORY SRCFILES...
|
||||
or: $0 [OPTION]... -d DIRECTORIES...
|
||||
|
||||
In the 1st form, copy SRCFILE to DSTFILE.
|
||||
In the 2nd and 3rd, copy all SRCFILES to DIRECTORY.
|
||||
In the 4th, create DIRECTORIES.
|
||||
|
||||
Options:
|
||||
--help display this help and exit.
|
||||
--version display version info and exit.
|
||||
|
||||
-c (ignored)
|
||||
-C install only if different (preserve the last data modification time)
|
||||
-d create directories instead of installing files.
|
||||
-g GROUP $chgrpprog installed files to GROUP.
|
||||
-m MODE $chmodprog installed files to MODE.
|
||||
-o USER $chownprog installed files to USER.
|
||||
-s $stripprog installed files.
|
||||
-t DIRECTORY install into DIRECTORY.
|
||||
-T report an error if DSTFILE is a directory.
|
||||
|
||||
Environment variables override the default commands:
|
||||
CHGRPPROG CHMODPROG CHOWNPROG CMPPROG CPPROG MKDIRPROG MVPROG
|
||||
RMPROG STRIPPROG
|
||||
"
|
||||
|
||||
while test $# -ne 0; do
|
||||
case $1 in
|
||||
-c) ;;
|
||||
|
||||
-C) copy_on_change=true;;
|
||||
|
||||
-d) dir_arg=true;;
|
||||
|
||||
-g) chgrpcmd="$chgrpprog $2"
|
||||
shift;;
|
||||
|
||||
--help) echo "$usage"; exit $?;;
|
||||
|
||||
-m) mode=$2
|
||||
case $mode in
|
||||
*' '* | *' '* | *'
|
||||
'* | *'*'* | *'?'* | *'['*)
|
||||
echo "$0: invalid mode: $mode" >&2
|
||||
exit 1;;
|
||||
esac
|
||||
shift;;
|
||||
|
||||
-o) chowncmd="$chownprog $2"
|
||||
shift;;
|
||||
|
||||
-s) stripcmd=$stripprog;;
|
||||
|
||||
-t) dst_arg=$2
|
||||
# Protect names problematic for 'test' and other utilities.
|
||||
case $dst_arg in
|
||||
-* | [=\(\)!]) dst_arg=./$dst_arg;;
|
||||
esac
|
||||
shift;;
|
||||
|
||||
-T) no_target_directory=true;;
|
||||
|
||||
--version) echo "$0 $scriptversion"; exit $?;;
|
||||
|
||||
--) shift
|
||||
break;;
|
||||
|
||||
-*) echo "$0: invalid option: $1" >&2
|
||||
exit 1;;
|
||||
|
||||
*) break;;
|
||||
esac
|
||||
shift
|
||||
done
|
||||
|
||||
if test $# -ne 0 && test -z "$dir_arg$dst_arg"; then
|
||||
# When -d is used, all remaining arguments are directories to create.
|
||||
# When -t is used, the destination is already specified.
|
||||
# Otherwise, the last argument is the destination. Remove it from $@.
|
||||
for arg
|
||||
do
|
||||
if test -n "$dst_arg"; then
|
||||
# $@ is not empty: it contains at least $arg.
|
||||
set fnord "$@" "$dst_arg"
|
||||
shift # fnord
|
||||
fi
|
||||
shift # arg
|
||||
dst_arg=$arg
|
||||
# Protect names problematic for 'test' and other utilities.
|
||||
case $dst_arg in
|
||||
-* | [=\(\)!]) dst_arg=./$dst_arg;;
|
||||
esac
|
||||
done
|
||||
fi
|
||||
|
||||
if test $# -eq 0; then
|
||||
if test -z "$dir_arg"; then
|
||||
echo "$0: no input file specified." >&2
|
||||
exit 1
|
||||
fi
|
||||
# It's OK to call 'install-sh -d' without argument.
|
||||
# This can happen when creating conditional directories.
|
||||
exit 0
|
||||
fi
|
||||
|
||||
if test -z "$dir_arg"; then
|
||||
do_exit='(exit $ret); exit $ret'
|
||||
trap "ret=129; $do_exit" 1
|
||||
trap "ret=130; $do_exit" 2
|
||||
trap "ret=141; $do_exit" 13
|
||||
trap "ret=143; $do_exit" 15
|
||||
|
||||
# Set umask so as not to create temps with too-generous modes.
|
||||
# However, 'strip' requires both read and write access to temps.
|
||||
case $mode in
|
||||
# Optimize common cases.
|
||||
*644) cp_umask=133;;
|
||||
*755) cp_umask=22;;
|
||||
|
||||
*[0-7])
|
||||
if test -z "$stripcmd"; then
|
||||
u_plus_rw=
|
||||
else
|
||||
u_plus_rw='% 200'
|
||||
fi
|
||||
cp_umask=`expr '(' 777 - $mode % 1000 ')' $u_plus_rw`;;
|
||||
*)
|
||||
if test -z "$stripcmd"; then
|
||||
u_plus_rw=
|
||||
else
|
||||
u_plus_rw=,u+rw
|
||||
fi
|
||||
cp_umask=$mode$u_plus_rw;;
|
||||
esac
|
||||
fi
|
||||
|
||||
for src
|
||||
do
|
||||
# Protect names problematic for 'test' and other utilities.
|
||||
case $src in
|
||||
-* | [=\(\)!]) src=./$src;;
|
||||
esac
|
||||
|
||||
if test -n "$dir_arg"; then
|
||||
dst=$src
|
||||
dstdir=$dst
|
||||
test -d "$dstdir"
|
||||
dstdir_status=$?
|
||||
else
|
||||
|
||||
# Waiting for this to be detected by the "$cpprog $src $dsttmp" command
|
||||
# might cause directories to be created, which would be especially bad
|
||||
# if $src (and thus $dsttmp) contains '*'.
|
||||
if test ! -f "$src" && test ! -d "$src"; then
|
||||
echo "$0: $src does not exist." >&2
|
||||
exit 1
|
||||
fi
|
||||
|
||||
if test -z "$dst_arg"; then
|
||||
echo "$0: no destination specified." >&2
|
||||
exit 1
|
||||
fi
|
||||
dst=$dst_arg
|
||||
|
||||
# If destination is a directory, append the input filename; won't work
|
||||
# if double slashes aren't ignored.
|
||||
if test -d "$dst"; then
|
||||
if test -n "$no_target_directory"; then
|
||||
echo "$0: $dst_arg: Is a directory" >&2
|
||||
exit 1
|
||||
fi
|
||||
dstdir=$dst
|
||||
dst=$dstdir/`basename "$src"`
|
||||
dstdir_status=0
|
||||
else
|
||||
# Prefer dirname, but fall back on a substitute if dirname fails.
|
||||
dstdir=`
|
||||
(dirname "$dst") 2>/dev/null ||
|
||||
expr X"$dst" : 'X\(.*[^/]\)//*[^/][^/]*/*$' \| \
|
||||
X"$dst" : 'X\(//\)[^/]' \| \
|
||||
X"$dst" : 'X\(//\)$' \| \
|
||||
X"$dst" : 'X\(/\)' \| . 2>/dev/null ||
|
||||
echo X"$dst" |
|
||||
sed '/^X\(.*[^/]\)\/\/*[^/][^/]*\/*$/{
|
||||
s//\1/
|
||||
q
|
||||
}
|
||||
/^X\(\/\/\)[^/].*/{
|
||||
s//\1/
|
||||
q
|
||||
}
|
||||
/^X\(\/\/\)$/{
|
||||
s//\1/
|
||||
q
|
||||
}
|
||||
/^X\(\/\).*/{
|
||||
s//\1/
|
||||
q
|
||||
}
|
||||
s/.*/./; q'
|
||||
`
|
||||
|
||||
test -d "$dstdir"
|
||||
dstdir_status=$?
|
||||
fi
|
||||
fi
|
||||
|
||||
obsolete_mkdir_used=false
|
||||
|
||||
if test $dstdir_status != 0; then
|
||||
case $posix_mkdir in
|
||||
'')
|
||||
# Create intermediate dirs using mode 755 as modified by the umask.
|
||||
# This is like FreeBSD 'install' as of 1997-10-28.
|
||||
umask=`umask`
|
||||
case $stripcmd.$umask in
|
||||
# Optimize common cases.
|
||||
*[2367][2367]) mkdir_umask=$umask;;
|
||||
.*0[02][02] | .[02][02] | .[02]) mkdir_umask=22;;
|
||||
|
||||
*[0-7])
|
||||
mkdir_umask=`expr $umask + 22 \
|
||||
- $umask % 100 % 40 + $umask % 20 \
|
||||
- $umask % 10 % 4 + $umask % 2
|
||||
`;;
|
||||
*) mkdir_umask=$umask,go-w;;
|
||||
esac
|
||||
|
||||
# With -d, create the new directory with the user-specified mode.
|
||||
# Otherwise, rely on $mkdir_umask.
|
||||
if test -n "$dir_arg"; then
|
||||
mkdir_mode=-m$mode
|
||||
else
|
||||
mkdir_mode=
|
||||
fi
|
||||
|
||||
posix_mkdir=false
|
||||
case $umask in
|
||||
*[123567][0-7][0-7])
|
||||
# POSIX mkdir -p sets u+wx bits regardless of umask, which
|
||||
# is incompatible with FreeBSD 'install' when (umask & 300) != 0.
|
||||
;;
|
||||
*)
|
||||
tmpdir=${TMPDIR-/tmp}/ins$RANDOM-$$
|
||||
trap 'ret=$?; rmdir "$tmpdir/d" "$tmpdir" 2>/dev/null; exit $ret' 0
|
||||
|
||||
if (umask $mkdir_umask &&
|
||||
exec $mkdirprog $mkdir_mode -p -- "$tmpdir/d") >/dev/null 2>&1
|
||||
then
|
||||
if test -z "$dir_arg" || {
|
||||
# Check for POSIX incompatibilities with -m.
|
||||
# HP-UX 11.23 and IRIX 6.5 mkdir -m -p sets group- or
|
||||
# other-writable bit of parent directory when it shouldn't.
|
||||
# FreeBSD 6.1 mkdir -m -p sets mode of existing directory.
|
||||
ls_ld_tmpdir=`ls -ld "$tmpdir"`
|
||||
case $ls_ld_tmpdir in
|
||||
d????-?r-*) different_mode=700;;
|
||||
d????-?--*) different_mode=755;;
|
||||
*) false;;
|
||||
esac &&
|
||||
$mkdirprog -m$different_mode -p -- "$tmpdir" && {
|
||||
ls_ld_tmpdir_1=`ls -ld "$tmpdir"`
|
||||
test "$ls_ld_tmpdir" = "$ls_ld_tmpdir_1"
|
||||
}
|
||||
}
|
||||
then posix_mkdir=:
|
||||
fi
|
||||
rmdir "$tmpdir/d" "$tmpdir"
|
||||
else
|
||||
# Remove any dirs left behind by ancient mkdir implementations.
|
||||
rmdir ./$mkdir_mode ./-p ./-- 2>/dev/null
|
||||
fi
|
||||
trap '' 0;;
|
||||
esac;;
|
||||
esac
|
||||
|
||||
if
|
||||
$posix_mkdir && (
|
||||
umask $mkdir_umask &&
|
||||
$doit_exec $mkdirprog $mkdir_mode -p -- "$dstdir"
|
||||
)
|
||||
then :
|
||||
else
|
||||
|
||||
# The umask is ridiculous, or mkdir does not conform to POSIX,
|
||||
# or it failed possibly due to a race condition. Create the
|
||||
# directory the slow way, step by step, checking for races as we go.
|
||||
|
||||
case $dstdir in
|
||||
/*) prefix='/';;
|
||||
[-=\(\)!]*) prefix='./';;
|
||||
*) prefix='';;
|
||||
esac
|
||||
|
||||
eval "$initialize_posix_glob"
|
||||
|
||||
oIFS=$IFS
|
||||
IFS=/
|
||||
$posix_glob set -f
|
||||
set fnord $dstdir
|
||||
shift
|
||||
$posix_glob set +f
|
||||
IFS=$oIFS
|
||||
|
||||
prefixes=
|
||||
|
||||
for d
|
||||
do
|
||||
test X"$d" = X && continue
|
||||
|
||||
prefix=$prefix$d
|
||||
if test -d "$prefix"; then
|
||||
prefixes=
|
||||
else
|
||||
if $posix_mkdir; then
|
||||
(umask=$mkdir_umask &&
|
||||
$doit_exec $mkdirprog $mkdir_mode -p -- "$dstdir") && break
|
||||
# Don't fail if two instances are running concurrently.
|
||||
test -d "$prefix" || exit 1
|
||||
else
|
||||
case $prefix in
|
||||
*\'*) qprefix=`echo "$prefix" | sed "s/'/'\\\\\\\\''/g"`;;
|
||||
*) qprefix=$prefix;;
|
||||
esac
|
||||
prefixes="$prefixes '$qprefix'"
|
||||
fi
|
||||
fi
|
||||
prefix=$prefix/
|
||||
done
|
||||
|
||||
if test -n "$prefixes"; then
|
||||
# Don't fail if two instances are running concurrently.
|
||||
(umask $mkdir_umask &&
|
||||
eval "\$doit_exec \$mkdirprog $prefixes") ||
|
||||
test -d "$dstdir" || exit 1
|
||||
obsolete_mkdir_used=true
|
||||
fi
|
||||
fi
|
||||
fi
|
||||
|
||||
if test -n "$dir_arg"; then
|
||||
{ test -z "$chowncmd" || $doit $chowncmd "$dst"; } &&
|
||||
{ test -z "$chgrpcmd" || $doit $chgrpcmd "$dst"; } &&
|
||||
{ test "$obsolete_mkdir_used$chowncmd$chgrpcmd" = false ||
|
||||
test -z "$chmodcmd" || $doit $chmodcmd $mode "$dst"; } || exit 1
|
||||
else
|
||||
|
||||
# Make a couple of temp file names in the proper directory.
|
||||
dsttmp=$dstdir/_inst.$$_
|
||||
rmtmp=$dstdir/_rm.$$_
|
||||
|
||||
# Trap to clean up those temp files at exit.
|
||||
trap 'ret=$?; rm -f "$dsttmp" "$rmtmp" && exit $ret' 0
|
||||
|
||||
# Copy the file name to the temp name.
|
||||
(umask $cp_umask && $doit_exec $cpprog "$src" "$dsttmp") &&
|
||||
|
||||
# and set any options; do chmod last to preserve setuid bits.
|
||||
#
|
||||
# If any of these fail, we abort the whole thing. If we want to
|
||||
# ignore errors from any of these, just make sure not to ignore
|
||||
# errors from the above "$doit $cpprog $src $dsttmp" command.
|
||||
#
|
||||
{ test -z "$chowncmd" || $doit $chowncmd "$dsttmp"; } &&
|
||||
{ test -z "$chgrpcmd" || $doit $chgrpcmd "$dsttmp"; } &&
|
||||
{ test -z "$stripcmd" || $doit $stripcmd "$dsttmp"; } &&
|
||||
{ test -z "$chmodcmd" || $doit $chmodcmd $mode "$dsttmp"; } &&
|
||||
|
||||
# If -C, don't bother to copy if it wouldn't change the file.
|
||||
if $copy_on_change &&
|
||||
old=`LC_ALL=C ls -dlL "$dst" 2>/dev/null` &&
|
||||
new=`LC_ALL=C ls -dlL "$dsttmp" 2>/dev/null` &&
|
||||
|
||||
eval "$initialize_posix_glob" &&
|
||||
$posix_glob set -f &&
|
||||
set X $old && old=:$2:$4:$5:$6 &&
|
||||
set X $new && new=:$2:$4:$5:$6 &&
|
||||
$posix_glob set +f &&
|
||||
|
||||
test "$old" = "$new" &&
|
||||
$cmpprog "$dst" "$dsttmp" >/dev/null 2>&1
|
||||
then
|
||||
rm -f "$dsttmp"
|
||||
else
|
||||
# Rename the file to the real destination.
|
||||
$doit $mvcmd -f "$dsttmp" "$dst" 2>/dev/null ||
|
||||
|
||||
# The rename failed, perhaps because mv can't rename something else
|
||||
# to itself, or perhaps because mv is so ancient that it does not
|
||||
# support -f.
|
||||
{
|
||||
# Now remove or move aside any old file at destination location.
|
||||
# We try this two ways since rm can't unlink itself on some
|
||||
# systems and the destination file might be busy for other
|
||||
# reasons. In this case, the final cleanup might fail but the new
|
||||
# file should still install successfully.
|
||||
{
|
||||
test ! -f "$dst" ||
|
||||
$doit $rmcmd -f "$dst" 2>/dev/null ||
|
||||
{ $doit $mvcmd -f "$dst" "$rmtmp" 2>/dev/null &&
|
||||
{ $doit $rmcmd -f "$rmtmp" 2>/dev/null; :; }
|
||||
} ||
|
||||
{ echo "$0: cannot unlink or rename $dst" >&2
|
||||
(exit 1); exit 1
|
||||
}
|
||||
} &&
|
||||
|
||||
# Now rename the file to the real destination.
|
||||
$doit $mvcmd "$dsttmp" "$dst"
|
||||
}
|
||||
fi || exit 1
|
||||
|
||||
trap '' 0
|
||||
fi
|
||||
done
|
||||
|
||||
# Local variables:
|
||||
# eval: (add-hook 'write-file-hooks 'time-stamp)
|
||||
# time-stamp-start: "scriptversion="
|
||||
# time-stamp-format: "%:y-%02m-%02d.%02H"
|
||||
# time-stamp-time-zone: "UTC"
|
||||
# time-stamp-end: "; # UTC"
|
||||
# End:
|
configure.ac (451 changes)
@@ -1,451 +0,0 @@
|
||||
AC_INIT([nix],[m4_esyscmd(bash -c "echo -n $(cat ./.version)$VERSION_SUFFIX")])
|
||||
AC_CONFIG_MACRO_DIRS([m4])
|
||||
AC_CONFIG_SRCDIR(README.md)
|
||||
AC_CONFIG_AUX_DIR(config)
|
||||
|
||||
AC_PROG_SED
|
||||
|
||||
# Construct a Nix system name (like "i686-linux"):
|
||||
# https://www.gnu.org/software/autoconf/manual/html_node/Canonicalizing.html#index-AC_005fCANONICAL_005fHOST-1
|
||||
# The inital value is produced by the `config/config.guess` script:
|
||||
# upstream: https://git.savannah.gnu.org/cgit/config.git/tree/config.guess
|
||||
# It has the following form, which is not documented anywhere:
|
||||
# <cpu>-<vendor>-<os>[<version>][-<abi>]
|
||||
# If `./configure` is passed any of the `--host`, `--build`, `--target` options, the value comes from `config/config.sub` instead:
|
||||
# upstream: https://git.savannah.gnu.org/cgit/config.git/tree/config.sub
|
||||
AC_CANONICAL_HOST
|
||||
AC_MSG_CHECKING([for the canonical Nix system name])
|
||||
|
||||
AC_ARG_WITH(system, AS_HELP_STRING([--with-system=SYSTEM],[Platform identifier (e.g., `i686-linux').]),
|
||||
[system=$withval],
|
||||
[case "$host_cpu" in
|
||||
i*86)
|
||||
machine_name="i686";;
|
||||
amd64)
|
||||
machine_name="x86_64";;
|
||||
armv6|armv7)
|
||||
machine_name="${host_cpu}l";;
|
||||
*)
|
||||
machine_name="$host_cpu";;
|
||||
esac
|
||||
|
||||
case "$host_os" in
|
||||
linux-gnu*|linux-musl*)
|
||||
# For backward compatibility, strip the `-gnu' part.
|
||||
system="$machine_name-linux";;
|
||||
*)
|
||||
# Strip the version number from names such as `gnu0.3',
|
||||
# `darwin10.2.0', etc.
|
||||
system="$machine_name-`echo $host_os | "$SED" -e's/@<:@0-9.@:>@*$//g'`";;
|
||||
esac])
|
||||
|
||||
AC_MSG_RESULT($system)
|
||||
AC_SUBST(system)
|
||||
AC_DEFINE_UNQUOTED(SYSTEM, ["$system"], [platform identifier ('cpu-os')])
|
||||
|
||||
|
||||
# State should be stored in /nix/var, unless the user overrides it explicitly.
|
||||
test "$localstatedir" = '${prefix}/var' && localstatedir=/nix/var
|
||||
|
||||
# Assign a default value to C{,XX}FLAGS as the default configure script sets them
|
||||
# to -O2 otherwise, which we don't want to have hardcoded
|
||||
CFLAGS=${CFLAGS-""}
|
||||
CXXFLAGS=${CXXFLAGS-""}
|
||||
|
||||
AC_PROG_CC
|
||||
AC_PROG_CXX
|
||||
AC_PROG_CPP
|
||||
|
||||
AC_CHECK_TOOL([AR], [ar])
|
||||
|
||||
# Use 64-bit file system calls so that we can support files > 2 GiB.
|
||||
AC_SYS_LARGEFILE
|
||||
|
||||
|
||||
# Solaris-specific stuff.
|
||||
case "$host_os" in
|
||||
solaris*)
|
||||
# Solaris requires -lsocket -lnsl for network functions
|
||||
LDFLAGS="-lsocket -lnsl $LDFLAGS"
|
||||
;;
|
||||
esac
|
||||
|
||||
|
||||
ENSURE_NO_GCC_BUG_80431
|
||||
|
||||
|
||||
# Check for pubsetbuf.
|
||||
AC_MSG_CHECKING([for pubsetbuf])
|
||||
AC_LANG_PUSH(C++)
|
||||
AC_COMPILE_IFELSE([AC_LANG_PROGRAM([[#include <iostream>
|
||||
using namespace std;
|
||||
static char buf[1024];]],
|
||||
[[cerr.rdbuf()->pubsetbuf(buf, sizeof(buf));]])],
|
||||
[AC_MSG_RESULT(yes) AC_DEFINE(HAVE_PUBSETBUF, 1, [Whether pubsetbuf is available.])],
|
||||
AC_MSG_RESULT(no))
|
||||
AC_LANG_POP(C++)
|
||||
|
||||
|
||||
AC_CHECK_FUNCS([statvfs pipe2 close_range])
|
||||
|
||||
|
||||
# Check for lutimes, optionally used for changing the mtime of
|
||||
# symlinks.
|
||||
AC_CHECK_FUNCS([lutimes])
|
||||
|
||||
|
||||
# Check whether the store optimiser can optimise symlinks.
|
||||
AC_MSG_CHECKING([whether it is possible to create a link to a symlink])
|
||||
ln -s bla tmp_link
|
||||
if ln tmp_link tmp_link2 2> /dev/null; then
|
||||
AC_MSG_RESULT(yes)
|
||||
AC_DEFINE(CAN_LINK_SYMLINK, 1, [Whether link() works on symlinks.])
|
||||
else
|
||||
AC_MSG_RESULT(no)
|
||||
fi
|
||||
rm -f tmp_link tmp_link2
|
||||
|
||||
|
||||
# Check for <locale>.
|
||||
AC_LANG_PUSH(C++)
|
||||
AC_CHECK_HEADERS([locale])
|
||||
AC_LANG_POP(C++)
|
||||
|
||||
|
||||
AC_DEFUN([NEED_PROG],
|
||||
[
|
||||
AC_PATH_PROG($1, $2)
|
||||
if test -z "$$1"; then
|
||||
AC_MSG_ERROR([$2 is required])
|
||||
fi
|
||||
])
|
||||
|
||||
NEED_PROG(bash, bash)
|
||||
AC_PATH_PROG(flex, flex, false)
|
||||
AC_PATH_PROG(bison, bison, false)
|
||||
AC_PATH_PROG(dot, dot)
|
||||
AC_PATH_PROG(lsof, lsof, lsof)
|
||||
|
||||
|
||||
AC_SUBST(coreutils, [$(dirname $(type -p cat))])
|
||||
|
||||
|
||||
AC_ARG_WITH(store-dir, AS_HELP_STRING([--with-store-dir=PATH],[path of the Nix store (defaults to /nix/store)]),
|
||||
storedir=$withval, storedir='/nix/store')
|
||||
AC_SUBST(storedir)
|
||||
|
||||
|
||||
# Running the functional tests without building Nix is useful for testing
|
||||
# different pre-built versions of Nix against each other.
|
||||
AC_ARG_ENABLE(build, AS_HELP_STRING([--disable-build],[Do not build nix]),
|
||||
ENABLE_BUILD=$enableval, ENABLE_BUILD=yes)
|
||||
AC_SUBST(ENABLE_BUILD)
|
||||
|
||||
# Building without unit tests is useful for bootstrapping with a smaller footprint
|
||||
# or running the tests in a separate derivation. Otherwise, we do compile and
|
||||
# run them.
|
||||
|
||||
AC_ARG_ENABLE(unit-tests, AS_HELP_STRING([--disable-unit-tests],[Do not build the tests]),
|
||||
ENABLE_UNIT_TESTS=$enableval, ENABLE_UNIT_TESTS=$ENABLE_BUILD)
|
||||
AC_SUBST(ENABLE_UNIT_TESTS)
|
||||
|
||||
AS_IF(
|
||||
[test "$ENABLE_BUILD" == "no" && test "$ENABLE_UNIT_TESTS" == "yes"],
|
||||
[AC_MSG_ERROR([Cannot enable unit tests when building overall is disabled. Please do not pass '--enable-unit-tests' or do not pass '--disable-build'.])])
|
||||
|
||||
AC_ARG_ENABLE(functional-tests, AS_HELP_STRING([--disable-functional-tests],[Do not build the tests]),
|
||||
ENABLE_FUNCTIONAL_TESTS=$enableval, ENABLE_FUNCTIONAL_TESTS=yes)
|
||||
AC_SUBST(ENABLE_FUNCTIONAL_TESTS)
|
||||
|
||||
# documentation generation switch
|
||||
AC_ARG_ENABLE(doc-gen, AS_HELP_STRING([--disable-doc-gen],[disable documentation generation]),
|
||||
ENABLE_DOC_GEN=$enableval, ENABLE_DOC_GEN=$ENABLE_BUILD)
|
||||
AC_SUBST(ENABLE_DOC_GEN)
|
||||
|
||||
AS_IF(
|
||||
[test "$ENABLE_BUILD" == "no" && test "$ENABLE_DOC_GEN" == "yes"],
|
||||
[AC_MSG_ERROR([Cannot enable generated docs when building overall is disabled. Please do not pass '--enable-doc-gen' or do not pass '--disable-build'.])])
|
||||
|
||||
AS_IF(
|
||||
[test "$ENABLE_FUNCTIONAL_TESTS" == "yes" || test "$ENABLE_DOC_GEN" == "yes"],
|
||||
[NEED_PROG(jq, jq)])
|
||||
|
||||
AS_IF(
|
||||
[test "$ENABLE_DOC_GEN" == "yes"],
|
||||
[NEED_PROG(man, man)])
|
||||
|
||||
AS_IF([test "$ENABLE_BUILD" == "yes"],[
|
||||
|
||||
# Look for boost, a required dependency.
|
||||
# Note that AX_BOOST_BASE only exports *CPP* BOOST_CPPFLAGS, no CXX flags,
|
||||
# and CPPFLAGS are not passed to the C++ compiler automatically.
|
||||
# Thus we append the returned CPPFLAGS to the CXXFLAGS here.
|
||||
AX_BOOST_BASE([1.66], [CXXFLAGS="$BOOST_CPPFLAGS $CXXFLAGS"], [AC_MSG_ERROR([Nix requires boost.])])
|
||||
# For unknown reasons, setting this directly in the ACTION-IF-FOUND above
|
||||
# ends up with LDFLAGS being empty, so we set it afterwards.
|
||||
LDFLAGS="$BOOST_LDFLAGS $LDFLAGS"
|
||||
|
||||
# On some platforms, new-style atomics need a helper library
|
||||
AC_MSG_CHECKING(whether -latomic is needed)
|
||||
AC_LINK_IFELSE([AC_LANG_SOURCE([[
|
||||
#include <stdint.h>
|
||||
uint64_t v;
|
||||
int main() {
|
||||
return (int)__atomic_load_n(&v, __ATOMIC_ACQUIRE);
|
||||
}]])], GCC_ATOMIC_BUILTINS_NEED_LIBATOMIC=no, GCC_ATOMIC_BUILTINS_NEED_LIBATOMIC=yes)
|
||||
AC_MSG_RESULT($GCC_ATOMIC_BUILTINS_NEED_LIBATOMIC)
|
||||
if test "x$GCC_ATOMIC_BUILTINS_NEED_LIBATOMIC" = xyes; then
|
||||
LDFLAGS="-latomic $LDFLAGS"
|
||||
fi
|
||||
|
||||
AC_ARG_ENABLE(install-unit-tests, AS_HELP_STRING([--enable-install-unit-tests],[Install the unit tests for running later (default no)]),
|
||||
INSTALL_UNIT_TESTS=$enableval, INSTALL_UNIT_TESTS=no)
|
||||
AC_SUBST(INSTALL_UNIT_TESTS)
|
||||
|
||||
AC_ARG_WITH(check-bin-dir, AS_HELP_STRING([--with-check-bin-dir=PATH],[path to install unit tests for running later (defaults to $libexecdir/nix)]),
|
||||
checkbindir=$withval, checkbindir=$libexecdir/nix)
|
||||
AC_SUBST(checkbindir)
|
||||
|
||||
AC_ARG_WITH(check-lib-dir, AS_HELP_STRING([--with-check-lib-dir=PATH],[path to install unit tests for running later (defaults to $libdir)]),
|
||||
checklibdir=$withval, checklibdir=$libdir)
|
||||
AC_SUBST(checklibdir)
|
||||
|
||||
# LTO is currently broken with clang for unknown reasons; ld segfaults in the llvm plugin
|
||||
AC_ARG_ENABLE(lto, AS_HELP_STRING([--enable-lto],[Enable LTO (only supported with GCC) [default=no]]),
|
||||
lto=$enableval, lto=no)
|
||||
if test "$lto" = yes; then
|
||||
if $CXX --version | grep -q GCC; then
|
||||
AC_SUBST(CXXLTO, [-flto=jobserver])
|
||||
else
|
||||
echo "error: LTO is only supported with GCC at the moment" >&2
|
||||
exit 1
|
||||
fi
|
||||
else
|
||||
AC_SUBST(CXXLTO, [""])
|
||||
fi
|
||||
|
||||
PKG_PROG_PKG_CONFIG
|
||||
|
||||
AC_ARG_ENABLE(shared, AS_HELP_STRING([--enable-shared],[Build shared libraries for Nix [default=yes]]),
|
||||
shared=$enableval, shared=yes)
|
||||
if test "$shared" = yes; then
|
||||
AC_SUBST(BUILD_SHARED_LIBS, 1, [Whether to build shared libraries.])
|
||||
else
|
||||
AC_SUBST(BUILD_SHARED_LIBS, 0, [Whether to build shared libraries.])
|
||||
PKG_CONFIG="$PKG_CONFIG --static"
|
||||
fi
|
||||
|
||||
# Look for OpenSSL, a required dependency. FIXME: this is only (maybe)
|
||||
# used by S3BinaryCacheStore.
|
||||
PKG_CHECK_MODULES([OPENSSL], [libcrypto >= 1.1.1], [CXXFLAGS="$OPENSSL_CFLAGS $CXXFLAGS"])
|
||||
|
||||
|
||||
# Look for libarchive.
|
||||
PKG_CHECK_MODULES([LIBARCHIVE], [libarchive >= 3.1.2], [CXXFLAGS="$LIBARCHIVE_CFLAGS $CXXFLAGS"])
|
||||
# Workaround until https://github.com/libarchive/libarchive/issues/1446 is fixed
|
||||
if test "$shared" != yes; then
|
||||
LIBARCHIVE_LIBS+=' -lz'
|
||||
fi
|
||||
|
||||
# Look for SQLite, a required dependency.
|
||||
PKG_CHECK_MODULES([SQLITE3], [sqlite3 >= 3.6.19], [CXXFLAGS="$SQLITE3_CFLAGS $CXXFLAGS"])
|
||||
|
||||
# Look for libcurl, a required dependency.
|
||||
PKG_CHECK_MODULES([LIBCURL], [libcurl], [CXXFLAGS="$LIBCURL_CFLAGS $CXXFLAGS"])
|
||||
|
||||
# Look for editline or readline, a required dependency.
|
||||
# The the libeditline.pc file was added only in libeditline >= 1.15.2,
|
||||
# see https://github.com/troglobit/editline/commit/0a8f2ef4203c3a4a4726b9dd1336869cd0da8607,
|
||||
# Older versions are no longer supported.
|
||||
AC_ARG_WITH(
|
||||
[readline-flavor],
|
||||
AS_HELP_STRING([--with-readline-flavor],[Which library to use for nice line editting with the Nix language REPL" [default=editline]]),
|
||||
[readline_flavor=$withval],
|
||||
[readline_flavor=editline])
|
||||
AS_CASE(["$readline_flavor"],
|
||||
[editline], [
|
||||
readline_flavor_pc=libeditline
|
||||
],
|
||||
[readline], [
|
||||
readline_flavor_pc=readline
|
||||
AC_DEFINE([USE_READLINE], [1], [Use readline instead of editline])
|
||||
],
|
||||
[AC_MSG_ERROR([bad value "$readline_flavor" for --with-readline-flavor, must be one of: editline, readline])])
|
||||
PKG_CHECK_MODULES([EDITLINE], [$readline_flavor_pc], [CXXFLAGS="$EDITLINE_CFLAGS $CXXFLAGS"])
|
||||
|
||||
# Look for libsodium.
|
||||
PKG_CHECK_MODULES([SODIUM], [libsodium], [CXXFLAGS="$SODIUM_CFLAGS $CXXFLAGS"])
|
||||
|
||||
# Look for libbrotli{enc,dec}.
|
||||
PKG_CHECK_MODULES([LIBBROTLI], [libbrotlienc libbrotlidec], [CXXFLAGS="$LIBBROTLI_CFLAGS $CXXFLAGS"])
|
||||
|
||||
# Look for libcpuid.
|
||||
have_libcpuid=
|
||||
if test "$machine_name" = "x86_64"; then
|
||||
AC_ARG_ENABLE([cpuid],
|
||||
AS_HELP_STRING([--disable-cpuid], [Do not determine microarchitecture levels with libcpuid (relevant to x86_64 only)]))
|
||||
if test "x$enable_cpuid" != "xno"; then
|
||||
PKG_CHECK_MODULES([LIBCPUID], [libcpuid],
|
||||
[CXXFLAGS="$LIBCPUID_CFLAGS $CXXFLAGS"
|
||||
have_libcpuid=1
|
||||
AC_DEFINE([HAVE_LIBCPUID], [1], [Use libcpuid])]
|
||||
)
|
||||
fi
|
||||
fi
|
||||
AC_SUBST(HAVE_LIBCPUID, [$have_libcpuid])
|
||||
|
||||
|
||||
# Look for libseccomp, required for Linux sandboxing.
|
||||
case "$host_os" in
|
||||
linux*)
|
||||
AC_ARG_ENABLE([seccomp-sandboxing],
|
||||
AS_HELP_STRING([--disable-seccomp-sandboxing],[Don't build support for seccomp sandboxing (only recommended if your arch doesn't support libseccomp yet!)
|
||||
]))
|
||||
if test "x$enable_seccomp_sandboxing" != "xno"; then
|
||||
PKG_CHECK_MODULES([LIBSECCOMP], [libseccomp],
|
||||
[CXXFLAGS="$LIBSECCOMP_CFLAGS $CXXFLAGS" CFLAGS="$LIBSECCOMP_CFLAGS $CFLAGS"])
|
||||
have_seccomp=1
|
||||
AC_DEFINE([HAVE_SECCOMP], [1], [Whether seccomp is available and should be used for sandboxing.])
|
||||
AC_COMPILE_IFELSE([
|
||||
AC_LANG_SOURCE([[
|
||||
#include <seccomp.h>
|
||||
#ifndef __SNR_fchmodat2
|
||||
# error "Missing support for fchmodat2"
|
||||
#endif
|
||||
]])
|
||||
], [], [
|
||||
echo "libseccomp is missing __SNR_fchmodat2. Please provide libseccomp 2.5.5 or later"
|
||||
exit 1
|
||||
])
|
||||
else
|
||||
have_seccomp=
|
||||
fi
|
||||
;;
|
||||
*)
|
||||
have_seccomp=
|
||||
;;
|
||||
esac
|
||||
AC_SUBST(HAVE_SECCOMP, [$have_seccomp])
|
||||
|
||||
# Optional dependencies for better normalizing file system data
|
||||
AC_CHECK_HEADERS([sys/xattr.h])
|
||||
AS_IF([test "$ac_cv_header_sys_xattr_h" = "yes"],[
|
||||
AC_CHECK_FUNCS([llistxattr lremovexattr])
|
||||
AS_IF([test "$ac_cv_func_llistxattr" = "yes" && test "$ac_cv_func_lremovexattr" = "yes"],[
|
||||
AC_DEFINE([HAVE_ACL_SUPPORT], [1], [Define if we can manipulate file system Access Control Lists])
|
||||
])
|
||||
])
|
||||
|
||||
# Look for aws-cpp-sdk-s3.
|
||||
AC_LANG_PUSH(C++)
|
||||
AC_CHECK_HEADERS([aws/s3/S3Client.h],
|
||||
[AC_DEFINE([ENABLE_S3], [1], [Whether to enable S3 support via aws-sdk-cpp.]) enable_s3=1],
|
||||
[AC_DEFINE([ENABLE_S3], [0], [Whether to enable S3 support via aws-sdk-cpp.]) enable_s3=])
|
||||
AC_SUBST(ENABLE_S3, [$enable_s3])
|
||||
AC_LANG_POP(C++)
|
||||
|
||||
|
||||
# Whether to use the Boehm garbage collector.
|
||||
AC_ARG_ENABLE(gc, AS_HELP_STRING([--enable-gc],[enable garbage collection in the Nix expression evaluator (requires Boehm GC) [default=yes]]),
|
||||
gc=$enableval, gc=yes)
|
||||
if test "$gc" = yes; then
|
||||
PKG_CHECK_MODULES([BDW_GC], [bdw-gc])
|
||||
CXXFLAGS="$BDW_GC_CFLAGS $CXXFLAGS"
|
||||
AC_DEFINE(HAVE_BOEHMGC, 1, [Whether to use the Boehm garbage collector.])
|
||||
|
||||
# See `fixupBoehmStackPointer`, for the integration between Boehm GC
|
||||
# and Boost coroutines.
|
||||
old_CFLAGS="$CFLAGS"
|
||||
# Temporary set `-pthread` just for the next check
|
||||
CFLAGS="$CFLAGS -pthread"
|
||||
AC_CHECK_FUNCS([pthread_attr_get_np pthread_getattr_np])
|
||||
CFLAGS="$old_CFLAGS"
|
||||
fi
|
||||
|
||||
AS_IF([test "$ENABLE_UNIT_TESTS" == "yes"],[
|
||||
|
||||
# Look for gtest.
|
||||
PKG_CHECK_MODULES([GTEST], [gtest_main gmock_main])
|
||||
|
||||
# Look for rapidcheck.
|
||||
PKG_CHECK_MODULES([RAPIDCHECK], [rapidcheck rapidcheck_gtest])
|
||||
|
||||
])
|
||||
|
||||
# Look for nlohmann/json.
|
||||
PKG_CHECK_MODULES([NLOHMANN_JSON], [nlohmann_json >= 3.9])
|
||||
|
||||
|
||||
# Look for lowdown library.
|
||||
AC_ARG_ENABLE([markdown], AS_HELP_STRING([--enable-markdown], [Enable Markdown rendering in the Nix binary (requires lowdown) [default=auto]]),
|
||||
enable_markdown=$enableval, enable_markdown=auto)
|
||||
AS_CASE(["$enable_markdown"],
|
||||
[yes | auto], [
|
||||
PKG_CHECK_MODULES([LOWDOWN], [lowdown >= 0.9.0], [
|
||||
CXXFLAGS="$LOWDOWN_CFLAGS $CXXFLAGS"
|
||||
have_lowdown=1
|
||||
AC_DEFINE(HAVE_LOWDOWN, 1, [Whether lowdown is available and should be used for Markdown rendering.])
|
||||
], [
|
||||
AS_IF([test "x$enable_markdown" == "xyes"], [AC_MSG_ERROR([--enable-markdown was specified, but lowdown was not found.])])
|
||||
])
|
||||
],
|
||||
[no], [have_lowdown=],
|
||||
[AC_MSG_ERROR([bad value "$enable_markdown" for --enable-markdown, must be one of: yes, no, auto])])
|
||||
|
||||
|
||||
# Look for libgit2.
|
||||
PKG_CHECK_MODULES([LIBGIT2], [libgit2])
|
||||
|
||||
|
||||
# Look for toml11, a required dependency.
|
||||
AC_LANG_PUSH(C++)
|
||||
AC_CHECK_HEADER([toml.hpp], [], [AC_MSG_ERROR([toml11 is not found.])])
|
||||
AC_LANG_POP(C++)
|
||||
|
||||
# Setuid installations.
|
||||
AC_CHECK_FUNCS([setresuid setreuid lchown])
|
||||
|
||||
|
||||
# Nice to have, but not essential.
|
||||
AC_CHECK_FUNCS([strsignal posix_fallocate sysconf])
|
||||
|
||||
|
||||
AC_ARG_WITH(sandbox-shell, AS_HELP_STRING([--with-sandbox-shell=PATH],[path of a statically-linked shell to use as /bin/sh in sandboxes]),
|
||||
sandbox_shell=$withval)
|
||||
AC_SUBST(sandbox_shell)
|
||||
if test ${cross_compiling:-no} = no && ! test -z ${sandbox_shell+x}; then
|
||||
AC_MSG_CHECKING([whether sandbox-shell has the standalone feature])
|
||||
# busybox shell sometimes allows executing other busybox applets,
|
||||
# even if they are not in the path, breaking our sandbox
|
||||
if PATH= $sandbox_shell -c "busybox" 2>&1 | grep -qv "not found"; then
|
||||
AC_MSG_RESULT(enabled)
|
||||
AC_MSG_ERROR([Please disable busybox FEATURE_SH_STANDALONE])
|
||||
else
|
||||
AC_MSG_RESULT(disabled)
|
||||
fi
|
||||
fi
|
||||
|
||||
AC_ARG_ENABLE(embedded-sandbox-shell, AS_HELP_STRING([--enable-embedded-sandbox-shell],[include the sandbox shell in the Nix binary [default=no]]),
|
||||
embedded_sandbox_shell=$enableval, embedded_sandbox_shell=no)
|
||||
AC_SUBST(embedded_sandbox_shell)
|
||||
if test "$embedded_sandbox_shell" = yes; then
|
||||
AC_DEFINE(HAVE_EMBEDDED_SANDBOX_SHELL, 1, [Include the sandbox shell in the Nix binary.])
|
||||
fi
|
||||
|
||||
])
|
||||
|
||||
|
||||
# Expand all variables in config.status.
|
||||
test "$prefix" = NONE && prefix=$ac_default_prefix
|
||||
test "$exec_prefix" = NONE && exec_prefix='${prefix}'
|
||||
for name in $ac_subst_vars; do
|
||||
declare $name="$(eval echo "${!name}")"
|
||||
declare $name="$(eval echo "${!name}")"
|
||||
declare $name="$(eval echo "${!name}")"
|
||||
done
|
||||
|
||||
rm -f Makefile.config
|
||||
|
||||
AC_CONFIG_HEADERS([config.h])
|
||||
AC_CONFIG_FILES([])
|
||||
AC_OUTPUT
|
@@ -1,236 +0,0 @@
|
||||
# The version of Nix used to generate the doc. Can also be
|
||||
# `$(nix_INSTALL_PATH)` or just `nix` (to grap ambient from the `PATH`),
|
||||
# if one prefers.
|
||||
doc_nix = $(nix_PATH)
|
||||
|
||||
MANUAL_SRCS := \
|
||||
$(call rwildcard, $(d)/source, *.md) \
|
||||
$(call rwildcard, $(d)/source, */*.md)
|
||||
|
||||
man-pages := $(foreach n, \
|
||||
nix-env.1 nix-store.1 \
|
||||
nix-build.1 nix-shell.1 nix-instantiate.1 \
|
||||
nix-collect-garbage.1 \
|
||||
nix-prefetch-url.1 nix-channel.1 \
|
||||
nix-hash.1 nix-copy-closure.1 \
|
||||
nix.conf.5 nix-daemon.8 \
|
||||
nix-profiles.5 \
|
||||
, $(d)/$(n))
|
||||
|
||||
# man pages for subcommands
|
||||
# convert from `$(d)/source/command-ref/nix-{1}/{2}.md` to `$(d)/nix-{1}-{2}.1`
|
||||
# FIXME: unify with how nix3-cli man pages are generated
|
||||
man-pages += $(foreach subcommand, \
|
||||
$(filter-out %opt-common.md %env-common.md, $(wildcard $(d)/source/command-ref/nix-*/*.md)), \
|
||||
$(d)/$(subst /,-,$(subst $(d)/source/command-ref/,,$(subst .md,.1,$(subcommand)))))
|
||||
|
||||
clean-files += $(d)/*.1 $(d)/*.5 $(d)/*.8
|
||||
|
||||
# Provide a dummy environment for nix, so that it will not access files outside the macOS sandbox.
|
||||
# Set cores to 0 because otherwise `nix config show` resolves the cores based on the current machine
|
||||
dummy-env = env -i \
|
||||
HOME=/dummy \
|
||||
NIX_CONF_DIR=/dummy \
|
||||
NIX_SSL_CERT_FILE=/dummy/no-ca-bundle.crt \
|
||||
NIX_STATE_DIR=/dummy \
|
||||
NIX_CONFIG='cores = 0'
|
||||
|
||||
nix-eval = $(dummy-env) $(doc_nix) eval --experimental-features nix-command -I nix=doc/manual --store dummy:// --impure --raw
|
||||
|
||||
# re-implement mdBook's include directive to make it usable for terminal output and for proper @docroot@ substitution
|
||||
define process-includes
|
||||
while read -r line; do \
|
||||
set -euo pipefail; \
|
||||
filename="$$(dirname $(1))/$$(sed 's/{{#include \(.*\)}}/\1/'<<< $$line)"; \
|
||||
test -f "$$filename" || ( echo "#include-d file '$$filename' does not exist." >&2; exit 1; ); \
|
||||
matchline="$$(sed 's|/|\\/|g' <<< $$line)"; \
|
||||
sed -i "/$$matchline/r $$filename" $(2); \
|
||||
sed -i "s/$$matchline//" $(2); \
|
||||
done < <(grep '{{#include' $(1))
|
||||
endef
|
||||
|
||||
$(d)/nix-env-%.1: $(d)/source/command-ref/nix-env/%.md
|
||||
@printf "Title: %s\n\n" "$(subst nix-env-,nix-env --,$$(basename "$@" .1))" > $^.tmp
|
||||
$(render-subcommand)
|
||||
|
||||
$(d)/nix-store-%.1: $(d)/source/command-ref/nix-store/%.md
|
||||
@printf -- 'Title: %s\n\n' "$(subst nix-store-,nix-store --,$$(basename "$@" .1))" > $^.tmp
|
||||
$(render-subcommand)
|
||||
|
||||
# FIXME: there surely is some more deduplication to be achieved here with even darker Make magic
|
||||
define render-subcommand
|
||||
@cat $^ >> $^.tmp
|
||||
@$(call process-includes,$^,$^.tmp)
|
||||
$(trace-gen) lowdown -sT man --nroff-nolinks -M section=1 $^.tmp -o $@
|
||||
@# fix up `lowdown`'s automatic escaping of `--`
|
||||
@# https://github.com/kristapsdz/lowdown/blob/edca6ce6d5336efb147321a43c47a698de41bb7c/entity.c#L202
|
||||
@sed -i 's/\e\[u2013\]/--/' $@
|
||||
@rm $^.tmp
|
||||
endef
|
||||
|
||||
|
||||
$(d)/%.1: $(d)/source/command-ref/%.md
|
||||
@printf "Title: %s\n\n" "$$(basename $@ .1)" > $^.tmp
|
||||
@cat $^ >> $^.tmp
|
||||
@$(call process-includes,$^,$^.tmp)
|
||||
$(trace-gen) lowdown -sT man --nroff-nolinks -M section=1 $^.tmp -o $@
|
||||
@rm $^.tmp
|
||||
|
||||
$(d)/%.8: $(d)/source/command-ref/%.md
|
||||
@printf "Title: %s\n\n" "$$(basename $@ .8)" > $^.tmp
|
||||
@cat $^ >> $^.tmp
|
||||
$(trace-gen) lowdown -sT man --nroff-nolinks -M section=8 $^.tmp -o $@
|
||||
@rm $^.tmp
|
||||
|
||||
$(d)/nix.conf.5: $(d)/source/command-ref/conf-file.md
|
||||
@printf "Title: %s\n\n" "$$(basename $@ .5)" > $^.tmp
|
||||
@cat $^ >> $^.tmp
|
||||
@$(call process-includes,$^,$^.tmp)
|
||||
$(trace-gen) lowdown -sT man --nroff-nolinks -M section=5 $^.tmp -o $@
|
||||
@rm $^.tmp
|
||||
|
||||
$(d)/nix-profiles.5: $(d)/source/command-ref/files/profiles.md
|
||||
@printf "Title: %s\n\n" "$$(basename $@ .5)" > $^.tmp
|
||||
@cat $^ >> $^.tmp
|
||||
$(trace-gen) lowdown -sT man --nroff-nolinks -M section=5 $^.tmp -o $@
|
||||
@rm $^.tmp
|
||||
|
||||
$(d)/source/SUMMARY.md: $(d)/source/SUMMARY.md.in $(d)/source/SUMMARY-rl-next.md $(d)/source/store/types $(d)/source/command-ref/new-cli $(d)/source/development/experimental-feature-descriptions.md
|
||||
@cp $< $@
|
||||
@$(call process-includes,$@,$@)
|
||||
|
||||
$(d)/source/store/types: $(d)/nix.json $(d)/utils.nix $(d)/generate-store-info.nix $(d)/generate-store-types.nix $(d)/source/store/types/index.md.in $(doc_nix)
|
||||
@# FIXME: build out of tree!
|
||||
@rm -rf $@.tmp
|
||||
$(trace-gen) $(nix-eval) --write-to $@.tmp --expr 'import doc/manual/generate-store-types.nix (builtins.fromJSON (builtins.readFile $<)).stores'
|
||||
@# do not destroy existing contents
|
||||
@mv $@.tmp/* $@/
|
||||
|
||||
$(d)/source/command-ref/new-cli: $(d)/nix.json $(d)/utils.nix $(d)/generate-manpage.nix $(d)/generate-settings.nix $(d)/generate-store-info.nix $(doc_nix)
|
||||
@rm -rf $@ $@.tmp
|
||||
$(trace-gen) $(nix-eval) --write-to $@.tmp --expr 'import doc/manual/generate-manpage.nix true (builtins.readFile $<)'
|
||||
@mv $@.tmp $@
|
||||
|
||||
$(d)/source/command-ref/conf-file.md: $(d)/conf-file.json $(d)/utils.nix $(d)/generate-settings.nix $(d)/source/command-ref/conf-file-prefix.md $(d)/source/command-ref/experimental-features-shortlist.md $(doc_nix)
|
||||
@cat doc/manual/source/command-ref/conf-file-prefix.md > $@.tmp
|
||||
$(trace-gen) $(nix-eval) --expr 'import doc/manual/generate-settings.nix { prefix = "conf"; } (builtins.fromJSON (builtins.readFile $<))' >> $@.tmp;
|
||||
@mv $@.tmp $@
|
||||
|
||||
$(d)/nix.json: $(doc_nix)
|
||||
$(trace-gen) $(dummy-env) $(doc_nix) __dump-cli > $@.tmp
|
||||
@mv $@.tmp $@
|
||||
|
||||
$(d)/conf-file.json: $(doc_nix)
|
||||
$(trace-gen) $(dummy-env) $(doc_nix) config show --json --experimental-features nix-command > $@.tmp
|
||||
@mv $@.tmp $@
|
||||
|
||||
$(d)/source/development/experimental-feature-descriptions.md: $(d)/xp-features.json $(d)/utils.nix $(d)/generate-xp-features.nix $(doc_nix)
|
||||
@rm -rf $@ $@.tmp
|
||||
$(trace-gen) $(nix-eval) --write-to $@.tmp --expr 'import doc/manual/generate-xp-features.nix (builtins.fromJSON (builtins.readFile $<))'
|
||||
@mv $@.tmp $@
|
||||
|
||||
$(d)/source/command-ref/experimental-features-shortlist.md: $(d)/xp-features.json $(d)/utils.nix $(d)/generate-xp-features-shortlist.nix $(doc_nix)
|
||||
@rm -rf $@ $@.tmp
|
||||
$(trace-gen) $(nix-eval) --write-to $@.tmp --expr 'import doc/manual/generate-xp-features-shortlist.nix (builtins.fromJSON (builtins.readFile $<))'
|
||||
@mv $@.tmp $@
|
||||
|
||||
$(d)/xp-features.json: $(doc_nix)
|
||||
$(trace-gen) $(dummy-env) $(doc_nix) __dump-xp-features > $@.tmp
|
||||
@mv $@.tmp $@
|
||||
|
||||
$(d)/source/language/builtins.md: $(d)/language.json $(d)/generate-builtins.nix $(d)/source/language/builtins-prefix.md $(doc_nix)
|
||||
@cat doc/manual/source/language/builtins-prefix.md > $@.tmp
|
||||
$(trace-gen) $(nix-eval) --expr 'import doc/manual/generate-builtins.nix (builtins.fromJSON (builtins.readFile $<))' >> $@.tmp;
|
||||
@cat doc/manual/source/language/builtins-suffix.md >> $@.tmp
|
||||
@mv $@.tmp $@
|
||||
|
||||
$(d)/language.json: $(doc_nix)
|
||||
$(trace-gen) $(dummy-env) $(doc_nix) __dump-language > $@.tmp
|
||||
@mv $@.tmp $@
|
||||
|
||||
# Generate "Upcoming release" notes (or clear it and remove from menu)
|
||||
$(d)/source/release-notes/rl-next.md: $(d)/rl-next $(d)/rl-next/*
|
||||
@if type -p changelog-d > /dev/null; then \
|
||||
echo " GEN " $@; \
|
||||
changelog-d doc/manual/rl-next > $@; \
|
||||
else \
|
||||
echo " NULL " $@; \
|
||||
true > $@; \
|
||||
fi
|
||||
|
||||
$(d)/source/SUMMARY-rl-next.md: $(d)/source/release-notes/rl-next.md
|
||||
$(trace-gen) true
|
||||
@if [ -s $< ]; then \
|
||||
echo ' - [Upcoming release](release-notes/rl-next.md)' > $@; \
|
||||
else \
|
||||
true > $@; \
|
||||
fi
|
||||
|
||||
# Generate the HTML manual.
|
||||
.PHONY: manual-html
|
||||
manual-html: $(docdir)/manual/index.html
|
||||
|
||||
# Open the built HTML manual in the default browser.
|
||||
manual-html-open: $(docdir)/manual/index.html
|
||||
@echo " OPEN " $<; \
|
||||
xdg-open $< \
|
||||
|| open $< \
|
||||
|| { \
|
||||
echo "Could not open the manual in a browser. Please open '$<'" >&2; \
|
||||
false; \
|
||||
}
|
||||
install: $(docdir)/manual/index.html
|
||||
|
||||
# Generate 'nix' manpages.
|
||||
.PHONY: manpages
|
||||
manpages: $(mandir)/man1/nix3-manpages
|
||||
install: $(mandir)/man1/nix3-manpages
|
||||
man: doc/manual/generated/man1/nix3-manpages
|
||||
all: doc/manual/generated/man1/nix3-manpages
|
||||
|
||||
# FIXME: unify with how the other man pages are generated.
|
||||
# this one works differently and does not use any of the amenities provided by `/mk/lib.mk`.
|
||||
$(mandir)/man1/nix3-manpages: doc/manual/generated/man1/nix3-manpages
|
||||
@mkdir -p $(DESTDIR)$$(dirname $@)
|
||||
$(trace-install) install -m 0644 $$(dirname $<)/* $(DESTDIR)$$(dirname $@)
|
||||
|
||||
doc/manual/generated/man1/nix3-manpages: $(d)/source/command-ref/new-cli
|
||||
@mkdir -p $(DESTDIR)$$(dirname $@)
|
||||
$(trace-gen) for i in doc/manual/source/command-ref/new-cli/*.md; do \
|
||||
name=$$(basename $$i .md); \
|
||||
tmpFile=$$(mktemp); \
|
||||
if [[ $$name = SUMMARY ]]; then continue; fi; \
|
||||
printf "Title: %s\n\n" "$$name" > $$tmpFile; \
|
||||
cat $$i >> $$tmpFile; \
|
||||
lowdown -sT man --nroff-nolinks -M section=1 $$tmpFile -o $(DESTDIR)$$(dirname $@)/$$name.1; \
|
||||
rm $$tmpFile; \
|
||||
done
|
||||
@touch $@
|
||||
|
||||
# the `! -name 'documentation.md'` filter excludes the one place where
|
||||
# `@docroot@` is to be preserved for documenting the mechanism
|
||||
# FIXME: maybe contributing guides should live right next to the code
|
||||
# instead of in the manual
|
||||
$(docdir)/manual/index.html: $(MANUAL_SRCS) $(d)/book.toml $(d)/anchors.jq $(d)/custom.css $(d)/source/SUMMARY.md $(d)/source/store/types $(d)/source/command-ref/new-cli $(d)/source/development/experimental-feature-descriptions.md $(d)/source/command-ref/conf-file.md $(d)/source/language/builtins.md $(d)/source/release-notes/rl-next.md $(d)/source/figures $(d)/source/favicon.png $(d)/source/favicon.svg
|
||||
$(trace-gen) \
|
||||
tmp="$$(mktemp -d)"; \
|
||||
cp -r doc/manual "$$tmp"; \
|
||||
find "$$tmp" -name '*.md' | while read -r file; do \
|
||||
$(call process-includes,$$file,$$file); \
|
||||
done; \
|
||||
find "$$tmp" -name '*.md' ! -name 'documentation.md' | while read -r file; do \
|
||||
docroot="$$(realpath --relative-to="$$(dirname "$$file")" $$tmp/manual/source)"; \
|
||||
sed -i "s,@docroot@,$$docroot,g" "$$file"; \
|
||||
done; \
|
||||
set -euo pipefail; \
|
||||
( \
|
||||
cd "$$tmp/manual"; \
|
||||
RUST_LOG=warn \
|
||||
MDBOOK_SUBSTITUTE_SEARCH=$(d)/source \
|
||||
mdbook build -d $(DESTDIR)$(docdir)/manual.tmp 2>&1 \
|
||||
| { grep -Fv "because fragment resolution isn't implemented" || :; } \
|
||||
); \
|
||||
rm -rf "$$tmp/manual"
|
||||
@rm -rf $(DESTDIR)$(docdir)/manual
|
||||
@mv $(DESTDIR)$(docdir)/manual.tmp/html $(DESTDIR)$(docdir)/manual
|
||||
@rm -rf $(DESTDIR)$(docdir)/manual.tmp
|
@ -1,14 +0,0 @@
|
||||
---
|
||||
synopsis: Use envvars NIX_CACHE_HOME, NIX_CONFIG_HOME, NIX_DATA_HOME, NIX_STATE_HOME if defined
|
||||
prs: [11351]
|
||||
---
|
||||
|
||||
Added new environment variables:
|
||||
|
||||
- `NIX_CACHE_HOME`
|
||||
- `NIX_CONFIG_HOME`
|
||||
- `NIX_DATA_HOME`
|
||||
- `NIX_STATE_HOME`
|
||||
|
||||
Each, if defined, takes precedence over the corresponding [XDG environment variable](@docroot@/command-ref/env-common.md#xdg-base-directories).
|
||||
This provides more fine-grained control over where Nix looks for files, and allows having a stand-alone Nix environment which only uses files in a specific directory and doesn't interfere with the user environment.
|
@ -1,21 +0,0 @@
|
||||
---
|
||||
synopsis: Define integer overflow in the Nix language as an error
|
||||
issues: [10968]
|
||||
prs: [11188]
|
||||
---
|
||||
|
||||
Previously, integer overflow in the Nix language invoked C++ level signed overflow, which was undefined behaviour, but *usually* manifested as wrapping around on overflow.
|
||||
|
||||
Since prior to its public release, Lix had C++ signed overflow defined to crash the process, and nobody noticed that this had accidentally removed overflow from the Nix language for three months, until it was caught by someone fiddling around.
|
||||
Given the significant body of actual Nix code that has been evaluated by Lix in that time, it does not appear that nixpkgs or much of importance depends on integer overflow, so it appears safe to turn into an error.
|
||||
|
||||
Some other overflows were fixed:
|
||||
- `builtins.fromJSON` of values greater than the maximum representable value in a signed 64-bit integer will generate an error.
|
||||
- `nixConfig` in flakes will no longer accept negative values for configuration options.
|
||||
|
||||
Integer overflow now looks like the following:
|
||||
|
||||
```
|
||||
$ nix eval --expr '9223372036854775807 + 1'
|
||||
error: integer overflow in adding 9223372036854775807 + 1
|
||||
```
|
@ -1,22 +0,0 @@
|
||||
---
|
||||
synopsis: |-
|
||||
The `build-hook` setting's default is less useful when using `libnixstore` as a library
|
||||
prs:
|
||||
- 11178
|
||||
---
|
||||
|
||||
*This is an obscure issue that only affects usage of the `libnixstore` library outside of the Nix executable.*
|
||||
|
||||
As part of the ongoing [rewrite of the build system](https://github.com/NixOS/nix/issues/2503) to use [Meson](https://mesonbuild.com/), we are also switching to packaging individual Nix components separately (and building them in separate derivations).
|
||||
This means that when building `libnixstore` we do not know where the Nix binaries will be installed --- `libnixstore` doesn't know about downstream consumers like the Nix binaries at all.
|
||||
|
||||
*This is also unrelated to the _post_-`build-hook`, which is often used for pushing to a cache.*
|
||||
|
||||
This has a small adverse effect on remote building --- the `build-remote` executable specified by the [`build-hook`](@docroot@/command-ref/conf-file.md#conf-build-hook) setting will no longer be found at the (presumed) installation location, but will instead be looked up on the `PATH`.
This means that other applications linking `libnixstore` that wish to use remote building must arrange for the `nix` command to be on the `PATH` (or manually override `build-hook`) in order for that to work.
|
||||
|
||||
Long term we don't envision this being a downside, because we plan to [get rid of `build-remote` and the build hook setting entirely](https://github.com/NixOS/nix/issues/1221).
|
||||
There should simply be no need to have an extra, intermediate layer of remote-procedure-calling when we want to connect to a remote builder.
|
||||
The build hook protocol did in principle support custom ways of remote building, but that can also be accomplished with a custom service for the ssh or daemon/ssh-ng protocols, or with a custom [store type](@docroot@/store/types/index.md) i.e. `Store` subclass. <!-- we normally don't mention classes, but consider that this release note is about a library use case -->
|
||||
|
||||
The Perl bindings no longer expose `getBinDir` either, since the underlying C++ libraries those bindings wrap no longer know the location of installed binaries as described above.
|
@ -1,14 +0,0 @@
|
||||
---
|
||||
synopsis: wrap filesystem exceptions more correctly
|
||||
issues: []
|
||||
prs: [11378]
|
||||
---
|
||||
|
||||
|
||||
With the switch to `std::filesystem` in different places, Nix started to throw `std::filesystem::filesystem_error` in many places instead of its own exceptions.
|
||||
|
||||
This led to error traces no longer being generated, for example when listing a non-existing directory, and could also lead to crashes inside the Nix REPL.

This version catches these types of exception correctly and wraps them into Nix's own exception type.
|
||||
|
||||
Author: [**@Mic92**](https://github.com/Mic92)
|
@ -1,9 +0,0 @@
|
||||
---
|
||||
synopsis: Add setting `fsync-store-paths`
|
||||
issues: [1218]
|
||||
prs: [7126]
|
||||
---
|
||||
|
||||
Nix now has a setting `fsync-store-paths` that ensures that new store paths are durably written to disk before they are registered as "valid" in Nix's database. This can prevent Nix store corruption if the system crashes or there is a power loss. This setting defaults to `false`.
|
||||
|
||||
Author: [**@squalus**](https://github.com/squalus)
|
18
doc/manual/rl-next/nix-copy-flags.md
Normal file
@ -0,0 +1,18 @@
|
||||
---
|
||||
synopsis: "`nix copy` supports `--profile` and `--out-link`"
|
||||
prs: [11657]
|
||||
---
|
||||
|
||||
The `nix copy` command now has flags `--profile` and `--out-link`, similar to `nix build`. `--profile` makes a profile point to the
top-level store path, while `--out-link` creates symlinks to the top-level store paths.
|
||||
|
||||
For example, when updating the local NixOS system profile from a NixOS system closure on a remote machine, instead of
|
||||
```
|
||||
# nix copy --from ssh://server $path
|
||||
# nix build --profile /nix/var/nix/profiles/system $path
|
||||
```
|
||||
you can now do
|
||||
```
|
||||
# nix copy --from ssh://server --profile /nix/var/nix/profiles/system $path
|
||||
```
|
||||
The advantage is that this avoids a time window where *path* is not a garbage collector root, and so could be deleted by a concurrent `nix store gc` process.
|
@ -1,25 +0,0 @@
|
||||
---
|
||||
synopsis: Show package descriptions with `nix flake show`
|
||||
issues: [10977]
|
||||
prs: [10980]
|
||||
---
|
||||
|
||||
`nix flake show` will now display a package's `meta.description` if it exists. If the description does not fit in the terminal, it will be truncated to fit the terminal width. If the terminal width is unknown, the description will be capped at 80 characters.
|
||||
|
||||
```
|
||||
$ nix flake show
|
||||
└───packages
|
||||
└───x86_64-linux
|
||||
├───builderImage: package 'docker-image-ara-builder-image.tar.gz' - 'Docker image hosting the nix build environment'
|
||||
└───runnerImage: package 'docker-image-gitlab-runner.tar.gz' - 'Docker image hosting the gitlab-runner executable'
|
||||
```
|
||||
|
||||
In a narrower terminal:
|
||||
|
||||
```
|
||||
$ nix flake show
|
||||
└───packages
|
||||
└───x86_64-linux
|
||||
├───builderImage: package 'docker-image-ara-builder-image.tar.gz' - 'Docker image hosting the nix b...
|
||||
└───runnerImage: package 'docker-image-gitlab-runner.tar.gz' - 'Docker image hosting the gitlab-run...
|
||||
```
|
@ -1,17 +0,0 @@
|
||||
---
|
||||
synopsis: Removing the default argument passed to the `nix fmt` formatter
|
||||
issues: []
|
||||
prs: [11438]
|
||||
---
|
||||
|
||||
The underlying formatter no longer receives the "." default argument when `nix fmt` is called with no arguments.
|
||||
|
||||
This change was necessary as the formatter wasn't able to distinguish between
a user wanting to format the current folder with `nix fmt .` and the generic
`nix fmt`.
|
||||
|
||||
The default behaviour is now the responsibility of the formatter itself, and
|
||||
allows tools such as treefmt to format the whole tree instead of only the
|
||||
current directory and below.
|
||||
|
||||
Author: [**@zimbatm**](https://github.com/zimbatm)
|
@ -1,8 +0,0 @@
|
||||
---
|
||||
synopsis: Flakes are no longer substituted
|
||||
prs: [10612]
|
||||
---
|
||||
|
||||
Nix will no longer attempt to substitute the source code of flakes from a binary cache. This functionality was broken because it could lead to different evaluation results depending on whether the flake was available in the binary cache, or even depending on whether the flake was already in the local store.
|
||||
|
||||
Author: [**@edolstra**](https://github.com/edolstra)
|
@ -1,8 +0,0 @@
|
||||
---
|
||||
synopsis: "`<nix/fetchurl.nix>` uses TLS verification"
|
||||
prs: [11585]
|
||||
---
|
||||
|
||||
Previously `<nix/fetchurl.nix>` did not do TLS verification. This was because the Nix sandbox in the past did not have access to TLS certificates, and Nix checks the hash of the fetched file anyway. However, this can expose authentication data from `netrc` and URLs to man-in-the-middle attackers. In addition, Nix now in some cases (such as when using impure derivations) does *not* check the hash. Therefore we have now enabled TLS verification. This means that downloads by `<nix/fetchurl.nix>` will now fail if you're fetching from an HTTPS server that does not have a valid certificate.
|
||||
|
||||
`<nix/fetchurl.nix>` is also known as the builtin derivation builder `builtin:fetchurl`. It's not to be confused with the evaluation-time function `builtins.fetchurl`, which was not affected by this issue.
|
@ -121,6 +121,7 @@
|
||||
- [Development](development/index.md)
|
||||
- [Building](development/building.md)
|
||||
- [Testing](development/testing.md)
|
||||
- [Debugging](development/debugging.md)
|
||||
- [Documentation](development/documentation.md)
|
||||
- [CLI guideline](development/cli-guideline.md)
|
||||
- [JSON guideline](development/json-guideline.md)
|
||||
@ -129,6 +130,7 @@
|
||||
- [Contributing](development/contributing.md)
|
||||
- [Releases](release-notes/index.md)
|
||||
{{#include ./SUMMARY-rl-next.md}}
|
||||
- [Release 2.25 (2024-11-07)](release-notes/rl-2.25.md)
|
||||
- [Release 2.24 (2024-07-31)](release-notes/rl-2.24.md)
|
||||
- [Release 2.23 (2024-06-03)](release-notes/rl-2.23.md)
|
||||
- [Release 2.22 (2024-04-23)](release-notes/rl-2.22.md)
|
||||
|
@ -1,35 +1,57 @@
|
||||
# Remote Builds
|
||||
|
||||
Nix supports remote builds, where a local Nix installation can forward
|
||||
Nix builds to other machines. This allows multiple builds to be
|
||||
performed in parallel and allows Nix to perform multi-platform builds in
|
||||
a semi-transparent way. For instance, if you perform a build for a
|
||||
`x86_64-darwin` on an `i686-linux` machine, Nix can automatically
|
||||
forward the build to a `x86_64-darwin` machine, if available.
|
||||
A local Nix installation can forward Nix builds to other machines;
this allows multiple builds to be performed in parallel.
|
||||
|
||||
To forward a build to a remote machine, it’s required that the remote
|
||||
machine is accessible via SSH and that it has Nix installed. You can
|
||||
test whether connecting to the remote Nix instance works, e.g.
|
||||
Remote builds also allow Nix to perform multi-platform builds in a
|
||||
semi-transparent way. For example, if you perform a build for a
|
||||
`x86_64-darwin` on an `i686-linux` machine, Nix can automatically
|
||||
forward the build to a `x86_64-darwin` machine, if one is available.
|
||||
|
||||
## Requirements
|
||||
|
||||
For a local machine to forward a build to a remote machine, the remote machine must:
|
||||
|
||||
- Have Nix installed
|
||||
- Be running an SSH server, e.g. `sshd`
|
||||
- Be accessible via SSH from the local machine over the network
|
||||
- Have the local machine's public SSH key in `/etc/ssh/authorized_keys.d/<username>`
|
||||
- Have the username of the SSH user in the `trusted-users` setting in `nix.conf`
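
For example, the last requirement might look like this on the remote machine (a sketch; `alice` stands in for the actual SSH username):

```console
$ grep trusted-users /etc/nix/nix.conf
trusted-users = alice
```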
|
||||
|
||||
## Testing
|
||||
|
||||
To test connecting to a remote Nix instance (in this case `mac`), run:
|
||||
|
||||
```console
|
||||
$ nix store info --store ssh://mac
|
||||
nix store info --store ssh://username@mac
|
||||
```
|
||||
|
||||
will try to connect to the machine named `mac`. It is possible to
|
||||
specify an SSH identity file as part of the remote store URI, e.g.
|
||||
To specify an SSH identity file as part of the remote store URI, add a
query parameter, e.g.
|
||||
|
||||
```console
|
||||
$ nix store info --store ssh://mac?ssh-key=/home/alice/my-key
|
||||
nix store info --store ssh://username@mac?ssh-key=/home/alice/my-key
|
||||
```
|
||||
|
||||
Since builds should be non-interactive, the key should not have a
|
||||
passphrase. Alternatively, you can load identities ahead of time into
|
||||
`ssh-agent` or `gpg-agent`.
|
||||
|
||||
In a multi-user installation (default), builds are executed by the Nix
|
||||
Daemon. The Nix Daemon cannot prompt for a passphrase via the terminal
|
||||
or `ssh-agent`, so the SSH key must not have a passphrase.
|
||||
|
||||
In addition, the Nix Daemon's user (typically root) needs to have SSH
|
||||
access to the remote builder.
|
||||
|
||||
Access can be verified by running `sudo su`, and then validating SSH
|
||||
access, e.g. by running `ssh mac`. SSH identity files for root users
|
||||
are usually stored in `/root/.ssh/` (Linux) or `/var/root/.ssh` (macOS).
|
||||
|
||||
If you get the error
|
||||
|
||||
```console
|
||||
bash: nix-store: command not found
|
||||
bash: nix: command not found
|
||||
error: cannot connect to 'mac'
|
||||
```
|
||||
|
||||
@ -40,15 +62,28 @@ The [list of remote build machines](@docroot@/command-ref/conf-file.md#conf-buil
|
||||
For example, the following command allows you to build a derivation for `x86_64-darwin` on a Linux machine:
|
||||
|
||||
```console
|
||||
$ uname
|
||||
uname
|
||||
```
|
||||
|
||||
```console
|
||||
Linux
|
||||
```
|
||||
|
||||
$ nix build --impure \
|
||||
--expr '(with import <nixpkgs> { system = "x86_64-darwin"; }; runCommand "foo" {} "uname > $out")' \
|
||||
--builders 'ssh://mac x86_64-darwin'
|
||||
```console
|
||||
nix build --impure \
|
||||
--expr '(with import <nixpkgs> { system = "x86_64-darwin"; }; runCommand "foo" {} "uname > $out")' \
|
||||
--builders 'ssh://mac x86_64-darwin'
|
||||
```
|
||||
|
||||
```console
|
||||
[1/0/1 built, 0.0 MiB DL] building foo on ssh://mac
|
||||
```
|
||||
|
||||
$ cat ./result
|
||||
```console
|
||||
cat ./result
|
||||
```
|
||||
|
||||
```console
|
||||
Darwin
|
||||
```
|
||||
|
||||
@ -62,6 +97,8 @@ Remote build machines can also be configured in [`nix.conf`](@docroot@/command-r
|
||||
|
||||
builders = ssh://mac x86_64-darwin ; ssh://beastie x86_64-freebsd
|
||||
|
||||
After making changes to `nix.conf`, restart the Nix daemon for changes to take effect.
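
On systemd-based Linux distributions this typically means restarting the `nix-daemon` service (a sketch; on macOS the daemon is managed by `launchd` instead):

```console
$ sudo systemctl restart nix-daemon
```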
|
||||
|
||||
Finally, remote build machines can be configured in a separate configuration
|
||||
file included in `builders` via the syntax `@/path/to/file`. For example,
|
||||
|
||||
|
@ -36,7 +36,7 @@ Instead, it looks in a few locations, and acts on all profiles it finds there:
|
||||
>
|
||||
> Not stable; subject to change
|
||||
>
|
||||
> Do not rely on this functionality; it just exists for migration purposes and is may change in the future.
|
||||
> Do not rely on this functionality; it just exists for migration purposes and may change in the future.
|
||||
> These deprecated paths remain a private implementation detail of Nix.
|
||||
|
||||
`$NIX_STATE_DIR/profiles` and `$NIX_STATE_DIR/profiles/per-user`.
|
||||
|
@ -35,20 +35,20 @@ To build Nix itself in this shell:
|
||||
|
||||
```console
|
||||
[nix-shell]$ mesonFlags+=" --prefix=$(pwd)/outputs/out"
|
||||
[nix-shell]$ dontAddPrefix=1 mesonConfigurePhase
|
||||
[nix-shell]$ ninjaBuildPhase
|
||||
[nix-shell]$ dontAddPrefix=1 configurePhase
|
||||
[nix-shell]$ buildPhase
|
||||
```
|
||||
|
||||
To test it:
|
||||
|
||||
```console
|
||||
[nix-shell]$ mesonCheckPhase
|
||||
[nix-shell]$ checkPhase
|
||||
```
|
||||
|
||||
To install it in `$(pwd)/outputs`:
|
||||
|
||||
```console
|
||||
[nix-shell]$ ninjaInstallPhase
|
||||
[nix-shell]$ installPhase
|
||||
[nix-shell]$ ./outputs/out/bin/nix --version
|
||||
nix (Nix) 2.12
|
||||
```
|
||||
@ -90,20 +90,20 @@ $ nix develop .#native-clangStdenvPackages
|
||||
To build Nix itself in this shell:
|
||||
|
||||
```console
|
||||
[nix-shell]$ mesonConfigurePhase
|
||||
[nix-shell]$ ninjaBuildPhase
|
||||
[nix-shell]$ configurePhase
|
||||
[nix-shell]$ buildPhase
|
||||
```
|
||||
|
||||
To test it:
|
||||
|
||||
```console
|
||||
[nix-shell]$ mesonCheckPhase
|
||||
[nix-shell]$ checkPhase
|
||||
```
|
||||
|
||||
To install it in `$(pwd)/outputs`:
|
||||
|
||||
```console
|
||||
[nix-shell]$ ninjaInstallPhase
|
||||
[nix-shell]$ installPhase
|
||||
[nix-shell]$ nix --version
|
||||
nix (Nix) 2.12
|
||||
```
|
||||
@ -167,7 +167,7 @@ It is useful to perform multiple cross and native builds on the same source tree
|
||||
for example to ensure that better support for one platform doesn't break the build for another.
|
||||
Meson thankfully makes this very easy by confining all build products to the build directory --- one simply shares the source directory between multiple build directories, each of which contains the build of Nix for a different platform.
|
||||
|
||||
Nixpkgs's `mesonConfigurePhase` always chooses `build` in the current directory as the name and location of the build.
|
||||
Nixpkgs's `configurePhase` always chooses `build` in the current directory as the name and location of the build.
|
||||
This makes having multiple build directories slightly more inconvenient.
|
||||
The good news is that Meson/Ninja seem to cope well with relocating the build directory after it is created.
|
||||
|
||||
@ -176,13 +176,13 @@ Here's how to do that
|
||||
1. Configure as usual
|
||||
|
||||
```bash
|
||||
mesonConfigurePhase
|
||||
configurePhase
|
||||
```
|
||||
|
||||
2. Rename the build directory
|
||||
|
||||
```bash
|
||||
cd .. # since `mesonConfigurePhase` cd'd inside
|
||||
cd .. # since `configurePhase` cd'd inside
|
||||
mv build build-linux # or whatever name we want
|
||||
cd build-linux
|
||||
```
|
||||
@ -190,7 +190,7 @@ Here's how to do that
|
||||
3. Build as usual
|
||||
|
||||
```bash
|
||||
ninjaBuildPhase
|
||||
buildPhase
|
||||
```
|
||||
|
||||
> **N.B.**
|
||||
|
62
doc/manual/source/development/debugging.md
Normal file
@ -0,0 +1,62 @@
|
||||
# Debugging Nix
|
||||
|
||||
This section shows how to build and debug Nix with debug symbols enabled.
|
||||
|
||||
## Building Nix with Debug Symbols
|
||||
|
||||
In the development shell, set the `mesonBuildType` environment variable to `debug` or `debugoptimized` before configuring the build:
|
||||
|
||||
```console
|
||||
[nix-shell]$ export mesonBuildType=debugoptimized
|
||||
```
|
||||
|
||||
Then, proceed to build Nix as described in [Building Nix](./building.md).
|
||||
This will build Nix with debug symbols, which are essential for effective debugging.
|
||||
|
||||
## Debugging the Nix Binary
|
||||
|
||||
Obtain your preferred debugger within the development shell:
|
||||
|
||||
```console
|
||||
[nix-shell]$ nix-shell -p gdb
|
||||
```
|
||||
|
||||
On macOS, use `lldb`:
|
||||
|
||||
```console
|
||||
[nix-shell]$ nix-shell -p lldb
|
||||
```
|
||||
|
||||
### Launching the Debugger
|
||||
|
||||
To debug the Nix binary, run:
|
||||
|
||||
```console
|
||||
[nix-shell]$ gdb --args ../outputs/out/bin/nix
|
||||
```
|
||||
|
||||
On macOS, use `lldb`:
|
||||
|
||||
```console
|
||||
[nix-shell]$ lldb -- ../outputs/out/bin/nix
|
||||
```
|
||||
|
||||
### Using the Debugger
|
||||
|
||||
Inside the debugger, you can set breakpoints, run the program, and inspect variables.
|
||||
|
||||
```gdb
|
||||
(gdb) break main
|
||||
(gdb) run <arguments>
|
||||
```
|
||||
|
||||
Refer to the [GDB Documentation](https://www.gnu.org/software/gdb/documentation/) for comprehensive usage instructions.
|
||||
|
||||
On macOS, use `lldb`:
|
||||
|
||||
```lldb
|
||||
(lldb) breakpoint set --name main
|
||||
(lldb) process launch -- <arguments>
|
||||
```
|
||||
|
||||
Refer to the [LLDB Tutorial](https://lldb.llvm.org/use/tutorial.html) for comprehensive usage instructions.
|
@ -203,7 +203,7 @@ $ xdg-open ./result/share/doc/nix/internal-api/html/index.html
|
||||
or inside `nix-shell` or `nix develop`:
|
||||
|
||||
```console
|
||||
$ mesonConfigurePhase
|
||||
$ configurePhase
|
||||
$ ninja src/internal-api-docs/html
|
||||
$ xdg-open src/internal-api-docs/html/index.html
|
||||
```
|
||||
@ -224,7 +224,7 @@ $ xdg-open ./result/share/doc/nix/external-api/html/index.html
|
||||
or inside `nix-shell` or `nix develop`:
|
||||
|
||||
```
|
||||
$ mesonConfigurePhase
|
||||
$ configurePhase
|
||||
$ ninja src/external-api-docs/html
|
||||
$ xdg-open src/external-api-docs/html/index.html
|
||||
```
|
||||
|
@ -29,7 +29,7 @@ The unit tests are defined using the [googletest] and [rapidcheck] frameworks.
|
||||
> ```
|
||||
> src
|
||||
> ├── libexpr
|
||||
> │ ├── local.mk
|
||||
> │ ├── meson.build
|
||||
> │ ├── value/context.hh
|
||||
> │ ├── value/context.cc
|
||||
> │ …
|
||||
@ -37,25 +37,24 @@ The unit tests are defined using the [googletest] and [rapidcheck] frameworks.
|
||||
> ├── tests
|
||||
> │ │
|
||||
> │ …
|
||||
> │ └── unit
|
||||
> │ ├── libutil
|
||||
> │ │ ├── local.mk
|
||||
> │ │ …
|
||||
> │ │ └── data
|
||||
> │ │ ├── git/tree.txt
|
||||
> │ │ …
|
||||
> │ │
|
||||
> │ ├── libexpr-support
|
||||
> │ │ ├── local.mk
|
||||
> │ │ └── tests
|
||||
> │ │ ├── value/context.hh
|
||||
> │ │ ├── value/context.cc
|
||||
> │ │ …
|
||||
> │ │
|
||||
> │ ├── libexpr
|
||||
> │ … ├── local.mk
|
||||
> │ ├── value/context.cc
|
||||
> │ …
|
||||
> │ ├── libutil-tests
|
||||
> │ │ ├── meson.build
|
||||
> │ │ …
|
||||
> │ │ └── data
|
||||
> │ │ ├── git/tree.txt
|
||||
> │ │ …
|
||||
> │ │
|
||||
> │ ├── libexpr-test-support
|
||||
> │ │ ├── meson.build
|
||||
> │ │ └── tests
|
||||
> │ │ ├── value/context.hh
|
||||
> │ │ ├── value/context.cc
|
||||
> │ │ …
|
||||
> │ │
|
||||
> │ ├── libexpr-tests
|
||||
> │ … ├── meson.build
|
||||
> │ ├── value/context.cc
|
||||
> │ …
|
||||
> …
|
||||
> ```
|
||||
|
||||
@ -128,7 +127,7 @@ On other platforms they wouldn't be run at all.
|
||||
|
||||
## Functional tests
|
||||
|
||||
The functional tests reside under the `tests/functional` directory and are listed in `tests/functional/local.mk`.
|
||||
The functional tests reside under the `tests/functional` directory and are listed in `tests/functional/meson.build`.
|
||||
Each test is a bash script.
|
||||
|
||||
Functional tests are run during `installCheck` in the `nix` package build, as well as separately from the build, in VM tests.
|
||||
@ -138,7 +137,7 @@ Functional tests are run during `installCheck` in the `nix` package build, as we
|
||||
The whole test suite (functional and unit tests) can be run with:
|
||||
|
||||
```shell-session
|
||||
$ mesonCheckPhase
|
||||
$ checkPhase
|
||||
```
|
||||
|
||||
### Grouping tests
|
||||
|
@ -57,3 +57,21 @@ $ nix build ./\#hydraJobs.dockerImage.x86_64-linux
|
||||
$ docker load -i ./result/image.tar.gz
|
||||
$ docker run -ti nix:2.5pre20211105
|
||||
```
|
||||
|
||||
# Docker image with non-root Nix
|
||||
|
||||
If you would like to run Nix in a container under a user other than `root`,
|
||||
you can build an image with a non-root single-user installation of Nix
|
||||
by specifying the `uid`, `gid`, `uname`, and `gname` arguments to `docker.nix`:
|
||||
|
||||
```console
|
||||
$ nix build --file docker.nix \
|
||||
--arg uid 1000 \
|
||||
--arg gid 1000 \
|
||||
--argstr uname user \
|
||||
--argstr gname user \
|
||||
--argstr name nix-user \
|
||||
--out-link nix-user.tar.gz
|
||||
$ docker load -i nix-user.tar.gz
|
||||
$ docker run -ti nix-user
|
||||
```
|
||||
|
@ -102,7 +102,7 @@ The `+` operator is overloaded to also work on strings and paths.
|
||||
>
|
||||
> *string* `+` *string*
|
||||
|
||||
Concatenate two [strings][string] and merge their string contexts.
|
||||
Concatenate two [strings][string] and merge their [string contexts](./string-context.md).
|
||||
|
||||
[String concatenation]: #string-concatenation
|
||||
|
||||
@ -128,7 +128,7 @@ The result is a path.
|
||||
|
||||
> **Note**
|
||||
>
|
||||
> The string must not have a string context that refers to a [store path].
|
||||
> The string must not have a [string context](./string-context.md) that refers to a [store path].
|
||||
|
||||
[Path and string concatenation]: #path-and-string-concatenation
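
As an illustrative sketch (not from the original text), both forms of concatenation can be tried with `nix eval`; the paths shown are arbitrary:

```console
$ nix eval --expr '"foo" + "bar"'
"foobar"

$ nix eval --expr '/var/lib + "/nix"'
/var/lib/nix
```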
|
||||
|
||||
|
@ -75,3 +75,7 @@
|
||||
(experimental) can be found by any program that follows the [XDG Base Directory Specification](https://specifications.freedesktop.org/basedir-spec/basedir-spec-latest.html).
|
||||
|
||||
- A new command `nix store add` has been added. It replaces `nix store add-file` and `nix store add-path` which are now deprecated.
|
||||
|
||||
- A new option [`always-allow-substitutes`](@docroot@/command-ref/conf-file.md#conf-always-allow-substitutes) has been added.
|
||||
|
||||
When set to `true`, Nix will always try to substitute a derivation, even if it has the [`allowSubstitutes`]{#adv-attr-allowSubstitutes} attribute set to `false`.
|
||||
|
144
doc/manual/source/release-notes/rl-2.25.md
Normal file
@ -0,0 +1,144 @@
|
||||
# Release 2.25.0 (2024-11-07)
|
||||
|
||||
- New environment variables to override XDG locations [#11351](https://github.com/NixOS/nix/pull/11351)
|
||||
|
||||
Added new environment variables:
|
||||
|
||||
- `NIX_CACHE_HOME`
|
||||
- `NIX_CONFIG_HOME`
|
||||
- `NIX_DATA_HOME`
|
||||
- `NIX_STATE_HOME`
|
||||
|
||||
Each, if defined, takes precedence over the corresponding [XDG environment variable](@docroot@/command-ref/env-common.md#xdg-base-directories).
|
||||
This provides more fine-grained control over where Nix looks for files. It allows having a stand-alone Nix environment that only uses files in a specific directory and that doesn't interfere with the user environment.
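
  For example, a minimal sketch of such a stand-alone setup (the directory layout is arbitrary):

  ```console
  $ export NIX_CONFIG_HOME=$HOME/nix-sandbox/config
  $ export NIX_CACHE_HOME=$HOME/nix-sandbox/cache
  $ export NIX_DATA_HOME=$HOME/nix-sandbox/data
  $ export NIX_STATE_HOME=$HOME/nix-sandbox/state
  ```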
|
||||
|
||||
- Define integer overflow in the Nix language as an error [#10968](https://github.com/NixOS/nix/issues/10968) [#11188](https://github.com/NixOS/nix/pull/11188)
|
||||
|
||||
Previously, integer overflow in the Nix language invoked C++ level signed overflow, which manifested as wrapping around on overflow. It now looks like this:
|
||||
|
||||
```
|
||||
$ nix eval --expr '9223372036854775807 + 1'
|
||||
error: integer overflow in adding 9223372036854775807 + 1
|
||||
```
|
||||
|
||||
Some other overflows were fixed:
|
||||
- `builtins.fromJSON` of values greater than the maximum representable value in a signed 64-bit integer will generate an error.
|
||||
- `nixConfig` in flakes will no longer accept negative values for configuration options.
|
||||
|
||||
- The `build-hook` setting no longer has a useful default when using `libnixstore` as a library [#11178](https://github.com/NixOS/nix/pull/11178)
|
||||
|
||||
*This is an obscure issue that only affects usage of the `libnixstore` library outside of the Nix executable. It is unrelated to the `post-build-hook` setting, which is often used for pushing to a cache.*
|
||||
|
||||
As part of the ongoing [rewrite of the build system](https://github.com/NixOS/nix/issues/2503) to use [Meson](https://mesonbuild.com/), we are also switching to packaging individual Nix components separately (and building them in separate derivations).
|
||||
This means that when building `libnixstore` we do not know where the Nix binaries will be installed --- `libnixstore` doesn't know about downstream consumers like the Nix binaries at all.
|
||||
|
||||
This has a small adverse effect on remote building --- the `build-remote` executable specified by the [`build-hook`](@docroot@/command-ref/conf-file.md#conf-build-hook) setting will no longer be found at the (presumed) installation location, but will instead be looked up on the `PATH`.
This means that other applications linking `libnixstore` that wish to use remote building must arrange for the `nix` command to be on the `PATH` (or manually override `build-hook`) in order for that to work.
|
||||
|
||||
Long term we don't envision this being a downside, because we plan to [get rid of `build-remote` and the build hook setting entirely](https://github.com/NixOS/nix/issues/1221).
|
||||
There should simply be no need to have an extra, intermediate layer of remote-procedure-calling when we want to connect to a remote builder.
|
||||
The build hook protocol did in principle support custom ways of remote building, but that can also be accomplished with a custom service for the ssh or daemon/ssh-ng protocols, or with a custom [store type](@docroot@/store/types/index.md) i.e. `Store` subclass. <!-- we normally don't mention classes, but consider that this release note is about a library use case -->
|
||||
|
||||
The Perl bindings no longer expose `getBinDir` either, since the underlying C++ libraries those bindings wrap no longer know the location of installed binaries as described above.
|
||||
|
||||
- Wrap filesystem exceptions more correctly [#11378](https://github.com/NixOS/nix/pull/11378)
|
||||
|
||||
With the switch to `std::filesystem` in different places, Nix started to throw `std::filesystem::filesystem_error` in many places instead of its own exceptions.
|
||||
As a result, Nix no longer generated error traces when (for example) listing a non-existing directory. It could also lead to crashes inside the Nix REPL.
|
||||
|
||||
This version catches these types of exception correctly and wraps them into Nix's own exception type.
|
||||
|
||||
Author: [**@Mic92**](https://github.com/Mic92)
|
||||
|
||||
- Add setting `fsync-store-paths` [#1218](https://github.com/NixOS/nix/issues/1218) [#7126](https://github.com/NixOS/nix/pull/7126)
|
||||
|
||||
Nix now has a setting `fsync-store-paths` that ensures that new store paths are durably written to disk before they are registered as "valid" in Nix's database. This can prevent Nix store corruption if the system crashes or there is a power loss. This setting defaults to `false`.
|
||||
|
||||
Author: [**@squalus**](https://github.com/squalus)
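
  As a sketch, the setting can also be enabled for a single invocation via the generic `--option` flag (`nixpkgs#hello` is just an example installable):

  ```console
  $ nix build nixpkgs#hello --option fsync-store-paths true
  ```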
|
||||
|
||||
- Removing the default argument passed to the `nix fmt` formatter [#11438](https://github.com/NixOS/nix/pull/11438)
|
||||
|
||||
The underlying formatter no longer receives the "." default argument when `nix fmt` is called with no arguments.
|
||||
|
||||
This change was necessary as the formatter wasn't able to distinguish between
a user wanting to format the current folder with `nix fmt .` and the generic
`nix fmt`.
|
||||
|
||||
The default behavior is now the responsibility of the formatter itself, and
|
||||
allows tools such as `treefmt` to format the whole tree instead of only the
|
||||
current directory and below.
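
  For illustration (a sketch; the exact behaviour depends on the configured formatter):

  ```console
  $ nix fmt      # the formatter now chooses the default scope, e.g. the whole tree
  $ nix fmt .    # explicitly format only the current directory and below
  ```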
|
||||
|
||||
Author: [**@zimbatm**](https://github.com/zimbatm)
|
||||
|
||||
- `<nix/fetchurl.nix>` uses TLS verification [#11585](https://github.com/NixOS/nix/pull/11585)
|
||||
|
||||
Previously `<nix/fetchurl.nix>` did not do TLS verification. This was because the Nix sandbox in the past did not have access to TLS certificates, and Nix checks the hash of the fetched file anyway. However, this can expose authentication data from `netrc` and URLs to man-in-the-middle attackers. In addition, Nix now in some cases (such as when using impure derivations) does *not* check the hash. Therefore we have now enabled TLS verification. This means that downloads by `<nix/fetchurl.nix>` will now fail if you're fetching from an HTTPS server that does not have a valid certificate.
|
||||
|
||||
`<nix/fetchurl.nix>` is also known as the builtin derivation builder `builtin:fetchurl`. It's not to be confused with the evaluation-time function `builtins.fetchurl`, which was not affected by this issue.
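
  As an illustrative sketch (not meant to run as-is: the URL and hash are placeholders, and argument names may vary between Nix versions):

  ```console
  $ nix-build --expr 'import <nix/fetchurl.nix> {
      url  = "https://example.org/source.tar.gz";  # placeholder URL
      hash = "sha256-...";                          # placeholder hash
    }'
  ```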
|
||||
|
||||
|
||||
# Contributors
|
||||
|
||||
This release was made possible by the following 58 contributors:
|
||||
|
||||
- 1444 [**(@0x5a4)**](https://github.com/0x5a4)
|
||||
- Adrian Hesketh [**(@a-h)**](https://github.com/a-h)
|
||||
- Aleksana [**(@Aleksanaa)**](https://github.com/Aleksanaa)
|
||||
- Alyssa Ross [**(@alyssais)**](https://github.com/alyssais)
|
||||
- Andrew Marshall [**(@amarshall)**](https://github.com/amarshall)
|
||||
- Artemis Tosini [**(@artemist)**](https://github.com/artemist)
|
||||
- Artturin [**(@Artturin)**](https://github.com/Artturin)
|
||||
- Bjørn Forsman [**(@bjornfor)**](https://github.com/bjornfor)
|
||||
- Brian McGee [**(@brianmcgee)**](https://github.com/brianmcgee)
|
||||
- Brian McKenna [**(@puffnfresh)**](https://github.com/puffnfresh)
|
||||
- Bryan Honof [**(@bryanhonof)**](https://github.com/bryanhonof)
|
||||
- Cole Helbling [**(@cole-h)**](https://github.com/cole-h)
|
||||
- Eelco Dolstra [**(@edolstra)**](https://github.com/edolstra)
|
||||
- Eman Resu [**(@llakala)**](https://github.com/llakala)
|
||||
- Emery Hemingway [**(@ehmry)**](https://github.com/ehmry)
|
||||
- Emil Petersen [**(@leetemil)**](https://github.com/leetemil)
|
||||
- Emily [**(@emilazy)**](https://github.com/emilazy)
|
||||
- Geoffrey Thomas [**(@geofft)**](https://github.com/geofft)
|
||||
- Gerg-L [**(@Gerg-L)**](https://github.com/Gerg-L)
|
||||
- Ivan Tkachev
|
||||
- Jacek Galowicz [**(@tfc)**](https://github.com/tfc)
|
||||
- Jan Hrcek [**(@jhrcek)**](https://github.com/jhrcek)
|
||||
- Jason Yundt [**(@Jayman2000)**](https://github.com/Jayman2000)
|
||||
- Jeremy Kerfs [**(@jkerfs)**](https://github.com/jkerfs)
|
||||
- Jeremy Kolb [**(@kjeremy)**](https://github.com/kjeremy)
|
||||
- John Ericson [**(@Ericson2314)**](https://github.com/Ericson2314)
|
||||
- Jonas Chevalier [**(@zimbatm)**](https://github.com/zimbatm)
|
||||
- Jordan Justen [**(@jljusten)**](https://github.com/jljusten)
|
||||
- Josh Heinrichs [**(@joshheinrichs-shopify)**](https://github.com/joshheinrichs-shopify)
|
||||
- Jörg Thalheim [**(@Mic92)**](https://github.com/Mic92)
|
||||
- Kevin Cox [**(@kevincox)**](https://github.com/kevincox)
|
||||
- Michael Gallagher [**(@mjgallag)**](https://github.com/mjgallag)
|
||||
- Michael [**(@michaelvanstraten)**](https://github.com/michaelvanstraten)
|
||||
- Nikodem Rabuliński [**(@nrabulinski)**](https://github.com/nrabulinski)
|
||||
- Noam Yorav-Raphael [**(@noamraph)**](https://github.com/noamraph)
|
||||
- Onni Hakala [**(@onnimonni)**](https://github.com/onnimonni)
|
||||
- Parker Hoyes [**(@parkerhoyes)**](https://github.com/parkerhoyes)
|
||||
- Philipp Otterbein
|
||||
- Pol Dellaiera [**(@drupol)**](https://github.com/drupol)
|
||||
- Robert Hensing [**(@roberth)**](https://github.com/roberth)
|
||||
- Ryan Hendrickson [**(@rhendric)**](https://github.com/rhendric)
|
||||
- Sandro [**(@SuperSandro2000)**](https://github.com/SuperSandro2000)
|
||||
- Seggy Umboh [**(@secobarbital)**](https://github.com/secobarbital)
|
||||
- Sergei Zimmerman [**(@xokdvium)**](https://github.com/xokdvium)
|
||||
- Shivaraj B H [**(@shivaraj-bh)**](https://github.com/shivaraj-bh)
|
||||
- Siddhant Kumar [**(@siddhantk232)**](https://github.com/siddhantk232)
|
||||
- Tim [**(@Jaculabilis)**](https://github.com/Jaculabilis)
|
||||
- Tom Bereknyei
|
||||
- Travis A. Everett [**(@abathur)**](https://github.com/abathur)
|
||||
- Valentin Gagarin [**(@fricklerhandwerk)**](https://github.com/fricklerhandwerk)
|
||||
- Vinayak Kaushik [**(@VinayakKaushikDH)**](https://github.com/VinayakKaushikDH)
|
||||
- Yann Hamdaoui [**(@yannham)**](https://github.com/yannham)
|
||||
- Yuriy Taraday [**(@YorikSar)**](https://github.com/YorikSar)
|
||||
- bryango [**(@bryango)**](https://github.com/bryango)
|
||||
- emhamm [**(@emhamm)**](https://github.com/emhamm)
|
||||
- jade [**(@lf-)**](https://github.com/lf-)
|
||||
- kenji [**(@a-kenji)**](https://github.com/a-kenji)
|
||||
- pennae [**(@pennae)**](https://github.com/pennae)
|
||||
- puckipedia [**(@puckipedia)**](https://github.com/puckipedia)
|
||||
- squalus [**(@squalus)**](https://github.com/squalus)
|
||||
- tomberek [**(@tomberek)**](https://github.com/tomberek)
|
@ -1,13 +1,13 @@
|
||||
# Content-Addressing File System Objects
|
||||
|
||||
For many operations, Nix needs to calculate [a content addresses](@docroot@/glossary.md#gloss-content-address) of [a file system object][file system object].
|
||||
For many operations, Nix needs to calculate [a content address](@docroot@/glossary.md#gloss-content-address) of [a file system object][file system object] (FSO).
|
||||
Usually this is needed as part of
|
||||
[content addressing store objects](../store-object/content-address.md),
|
||||
since store objects always have a root file system object.
|
||||
But some command-line utilities also just work on "raw" file system objects, not part of any store object.
|
||||
|
||||
Every content addressing scheme Nix uses ultimately involves feeding data into a [hash function](https://en.wikipedia.org/wiki/Hash_function), and getting back an opaque fixed-size digest which is deemed a content address.
|
||||
The various *methods* of content addressing thus differ in how abstract data (in this case, a file system object and its descendents) are fed into the hash function.
|
||||
The various *methods* of content addressing thus differ in how abstract data (in this case, a file system object and its descendants) are fed into the hash function.
|
||||
|
||||
## Serialising File System Objects { #serial }
|
||||
|
||||
@ -25,7 +25,7 @@ For example, Unix commands like `sha256sum` or `sha1sum` will produce hashes for
|
||||
|
||||
### Nix Archive (NAR) { #serial-nix-archive }
|
||||
|
||||
For the other cases of [file system objects][file system object], especially directories with arbitrary descendents, we need a more complex serialisation format.
|
||||
For the other cases of [file system objects][file system object], especially directories with arbitrary descendants, we need a more complex serialisation format.
|
||||
Examples of such serialisations are the ZIP and TAR file formats.
|
||||
However, for our purposes these formats have two problems:
|
||||
|
||||
|
50
docker.nix
@ -9,6 +9,10 @@
|
||||
, maxLayers ? 100
|
||||
, nixConf ? {}
|
||||
, flake-registry ? null
|
||||
, uid ? 0
|
||||
, gid ? 0
|
||||
, uname ? "root"
|
||||
, gname ? "root"
|
||||
}:
|
||||
let
|
||||
defaultPkgs = with pkgs; [
|
||||
@ -50,6 +54,15 @@ let
|
||||
description = "Unprivileged account (don't use!)";
|
||||
};
|
||||
|
||||
} // lib.optionalAttrs (uid != 0) {
|
||||
"${uname}" = {
|
||||
uid = uid;
|
||||
shell = "${pkgs.bashInteractive}/bin/bash";
|
||||
home = "/home/${uname}";
|
||||
gid = gid;
|
||||
groups = [ "${gname}" ];
|
||||
description = "Nix user";
|
||||
};
|
||||
} // lib.listToAttrs (
|
||||
map
|
||||
(
|
||||
@ -70,6 +83,8 @@ let
|
||||
root.gid = 0;
|
||||
nixbld.gid = 30000;
|
||||
nobody.gid = 65534;
|
||||
} // lib.optionalAttrs (gid != 0) {
|
||||
"${gname}".gid = gid;
|
||||
};
|
||||
|
||||
userToPasswd = (
|
||||
@ -150,6 +165,8 @@ let
|
||||
in
|
||||
"${n} = ${vStr}") (defaultNixConf // nixConf))) + "\n";
|
||||
|
||||
userHome = if uid == 0 then "/root" else "/home/${uname}";
|
||||
|
||||
baseSystem =
|
||||
let
|
||||
nixpkgs = pkgs.path;
|
||||
@ -237,26 +254,26 @@ let
|
||||
mkdir -p $out/etc/nix
|
||||
cat $nixConfContentsPath > $out/etc/nix/nix.conf
|
||||
|
||||
mkdir -p $out/root
|
||||
mkdir -p $out/nix/var/nix/profiles/per-user/root
|
||||
mkdir -p $out${userHome}
|
||||
mkdir -p $out/nix/var/nix/profiles/per-user/${uname}
|
||||
|
||||
ln -s ${profile} $out/nix/var/nix/profiles/default-1-link
|
||||
ln -s $out/nix/var/nix/profiles/default-1-link $out/nix/var/nix/profiles/default
|
||||
ln -s /nix/var/nix/profiles/default $out/root/.nix-profile
|
||||
ln -s /nix/var/nix/profiles/default $out${userHome}/.nix-profile
|
||||
|
||||
ln -s ${channel} $out/nix/var/nix/profiles/per-user/root/channels-1-link
|
||||
ln -s $out/nix/var/nix/profiles/per-user/root/channels-1-link $out/nix/var/nix/profiles/per-user/root/channels
|
||||
ln -s ${channel} $out/nix/var/nix/profiles/per-user/${uname}/channels-1-link
|
||||
ln -s $out/nix/var/nix/profiles/per-user/${uname}/channels-1-link $out/nix/var/nix/profiles/per-user/${uname}/channels
|
||||
|
||||
mkdir -p $out/root/.nix-defexpr
|
||||
ln -s $out/nix/var/nix/profiles/per-user/root/channels $out/root/.nix-defexpr/channels
|
||||
echo "${channelURL} ${channelName}" > $out/root/.nix-channels
|
||||
mkdir -p $out${userHome}/.nix-defexpr
|
||||
ln -s $out/nix/var/nix/profiles/per-user/${uname}/channels $out${userHome}/.nix-defexpr/channels
|
||||
echo "${channelURL} ${channelName}" > $out${userHome}/.nix-channels
|
||||
|
||||
mkdir -p $out/bin $out/usr/bin
|
||||
ln -s ${pkgs.coreutils}/bin/env $out/usr/bin/env
|
||||
ln -s ${pkgs.bashInteractive}/bin/bash $out/bin/sh
|
||||
|
||||
'' + (lib.optionalString (flake-registry-path != null) ''
|
||||
nixCacheDir="/root/.cache/nix"
|
||||
nixCacheDir="${userHome}/.cache/nix"
|
||||
mkdir -p $out$nixCacheDir
|
||||
globalFlakeRegistryPath="$nixCacheDir/flake-registry.json"
|
||||
ln -s ${flake-registry-path} $out$globalFlakeRegistryPath
|
||||
@ -268,7 +285,7 @@ let
|
||||
in
|
||||
pkgs.dockerTools.buildLayeredImageWithNixDb {
|
||||
|
||||
inherit name tag maxLayers;
|
||||
inherit name tag maxLayers uid gid uname gname;
|
||||
|
||||
contents = [ baseSystem ];
|
||||
|
||||
@ -279,25 +296,28 @@ pkgs.dockerTools.buildLayeredImageWithNixDb {
|
||||
fakeRootCommands = ''
|
||||
chmod 1777 tmp
|
||||
chmod 1777 var/tmp
|
||||
chown -R ${toString uid}:${toString gid} .${userHome}
|
||||
chown -R ${toString uid}:${toString gid} nix
|
||||
'';
|
||||
|
||||
config = {
|
||||
Cmd = [ "/root/.nix-profile/bin/bash" ];
|
||||
Cmd = [ "${userHome}/.nix-profile/bin/bash" ];
|
||||
User = "${toString uid}:${toString gid}";
|
||||
Env = [
|
||||
"USER=root"
|
||||
"USER=${uname}"
|
||||
"PATH=${lib.concatStringsSep ":" [
|
||||
"/root/.nix-profile/bin"
|
||||
"${userHome}/.nix-profile/bin"
|
||||
"/nix/var/nix/profiles/default/bin"
|
||||
"/nix/var/nix/profiles/default/sbin"
|
||||
]}"
|
||||
"MANPATH=${lib.concatStringsSep ":" [
|
||||
"/root/.nix-profile/share/man"
|
||||
"${userHome}/.nix-profile/share/man"
|
||||
"/nix/var/nix/profiles/default/share/man"
|
||||
]}"
|
||||
"SSL_CERT_FILE=/nix/var/nix/profiles/default/etc/ssl/certs/ca-bundle.crt"
|
||||
"GIT_SSL_CAINFO=/nix/var/nix/profiles/default/etc/ssl/certs/ca-bundle.crt"
|
||||
"NIX_SSL_CERT_FILE=/nix/var/nix/profiles/default/etc/ssl/certs/ca-bundle.crt"
|
||||
"NIX_PATH=/nix/var/nix/profiles/per-user/root/channels:/root/.nix-defexpr/channels"
|
||||
"NIX_PATH=/nix/var/nix/profiles/per-user/${uname}/channels:${userHome}/.nix-defexpr/channels"
|
||||
];
|
||||
};
|
||||
|
||||
|
131
flake.nix
@ -137,7 +137,7 @@
|
||||
pkgs = final;
|
||||
});
|
||||
|
||||
nix = final.nixComponents.nix;
|
||||
nix = final.nixComponents.nix-cli;
|
||||
|
||||
# See https://github.com/NixOS/nixpkgs/pull/214409
|
||||
# Remove when fixed in this flake's nixpkgs
|
||||
@ -189,7 +189,6 @@
|
||||
# system, we should reenable this.
|
||||
#perlBindings = self.hydraJobs.perlBindings.${system};
|
||||
}
|
||||
/*
|
||||
# Add "passthru" tests
|
||||
// flatMapAttrs ({
|
||||
"" = nixpkgsFor.${system}.native;
|
||||
@ -211,7 +210,6 @@
|
||||
"${nixpkgsPrefix}nix-functional-tests" = nixpkgs.nixComponents.nix-functional-tests;
|
||||
}
|
||||
)
|
||||
*/
|
||||
// devFlake.checks.${system} or {}
|
||||
);
|
||||
|
||||
@ -220,7 +218,9 @@
|
||||
# for which we don't apply the full build matrix such as cross or static.
|
||||
inherit (nixpkgsFor.${system}.native)
|
||||
changelog-d;
|
||||
default = self.packages.${system}.nix-ng;
|
||||
default = self.packages.${system}.nix;
|
||||
# TODO probably should be `nix-cli`
|
||||
nix = self.packages.${system}.nix-everything;
|
||||
nix-manual = nixpkgsFor.${system}.native.nixComponents.nix-manual;
|
||||
nix-internal-api-docs = nixpkgsFor.${system}.native.nixComponents.nix-internal-api-docs;
|
||||
nix-external-api-docs = nixpkgsFor.${system}.native.nixComponents.nix-external-api-docs;
|
||||
@ -228,7 +228,6 @@
|
||||
# We need to flatten recursive attribute sets of derivations to pass `flake check`.
|
||||
// flatMapAttrs
|
||||
{ # Components we'll iterate over in the upcoming lambda
|
||||
"nix" = { };
|
||||
"nix-util" = { };
|
||||
"nix-util-c" = { };
|
||||
"nix-util-test-support" = { };
|
||||
@ -257,10 +256,11 @@
|
||||
|
||||
"nix-cli" = { };
|
||||
|
||||
"nix-everything" = { };
|
||||
|
||||
"nix-functional-tests" = { supportsCross = false; };
|
||||
|
||||
"nix-perl-bindings" = { supportsCross = false; };
|
||||
"nix-ng" = { };
|
||||
}
|
||||
(pkgName: { supportsCross ? true }: {
|
||||
# These attributes go right into `packages.<system>`.
|
||||
@ -294,109 +294,24 @@
|
||||
});
|
||||
|
||||
devShells = let
|
||||
makeShell = pkgs: stdenv: (pkgs.nix.override { inherit stdenv; forDevShell = true; }).overrideAttrs (attrs:
|
||||
let
|
||||
buildCanExecuteHost = stdenv.buildPlatform.canExecute stdenv.hostPlatform;
|
||||
modular = devFlake.getSystem stdenv.buildPlatform.system;
|
||||
transformFlag = prefix: flag:
|
||||
assert builtins.isString flag;
|
||||
let
|
||||
rest = builtins.substring 2 (builtins.stringLength flag) flag;
|
||||
in
|
||||
"-D${prefix}:${rest}";
|
||||
havePerl = stdenv.buildPlatform == stdenv.hostPlatform && stdenv.hostPlatform.isUnix;
|
||||
ignoreCrossFile = flags: builtins.filter (flag: !(lib.strings.hasInfix "cross-file" flag)) flags;
|
||||
in {
|
||||
pname = "shell-for-" + attrs.pname;
|
||||
|
||||
# Remove the version suffix to avoid unnecessary attempts to substitute in nix develop
|
||||
version = lib.fileContents ./.version;
|
||||
name = attrs.pname;
|
||||
|
||||
installFlags = "sysconfdir=$(out)/etc";
|
||||
shellHook = ''
|
||||
PATH=$prefix/bin:$PATH
|
||||
unset PYTHONPATH
|
||||
export MANPATH=$out/share/man:$MANPATH
|
||||
|
||||
# Make bash completion work.
|
||||
XDG_DATA_DIRS+=:$out/share
|
||||
'';
|
||||
|
||||
# We use this shell with the local checkout, not unpackPhase.
|
||||
src = null;
|
||||
|
||||
env = {
|
||||
# Needed for Meson to find Boost.
|
||||
# https://github.com/NixOS/nixpkgs/issues/86131.
|
||||
BOOST_INCLUDEDIR = "${lib.getDev pkgs.nixDependencies.boost}/include";
|
||||
BOOST_LIBRARYDIR = "${lib.getLib pkgs.nixDependencies.boost}/lib";
|
||||
# For `make format`, to work without installing pre-commit
|
||||
_NIX_PRE_COMMIT_HOOKS_CONFIG =
|
||||
"${(pkgs.formats.yaml { }).generate "pre-commit-config.yaml" modular.pre-commit.settings.rawConfig}";
|
||||
};
|
||||
|
||||
mesonFlags =
|
||||
map (transformFlag "libutil") (ignoreCrossFile pkgs.nixComponents.nix-util.mesonFlags)
|
||||
++ map (transformFlag "libstore") (ignoreCrossFile pkgs.nixComponents.nix-store.mesonFlags)
|
||||
++ map (transformFlag "libfetchers") (ignoreCrossFile pkgs.nixComponents.nix-fetchers.mesonFlags)
|
||||
++ lib.optionals havePerl (map (transformFlag "perl") (ignoreCrossFile pkgs.nixComponents.nix-perl-bindings.mesonFlags))
|
||||
++ map (transformFlag "libexpr") (ignoreCrossFile pkgs.nixComponents.nix-expr.mesonFlags)
|
||||
++ map (transformFlag "libcmd") (ignoreCrossFile pkgs.nixComponents.nix-cmd.mesonFlags)
|
||||
;
|
||||
|
||||
nativeBuildInputs = attrs.nativeBuildInputs or []
|
||||
++ pkgs.nixComponents.nix-util.nativeBuildInputs
|
||||
++ pkgs.nixComponents.nix-store.nativeBuildInputs
|
||||
++ pkgs.nixComponents.nix-fetchers.nativeBuildInputs
|
||||
++ lib.optionals havePerl pkgs.nixComponents.nix-perl-bindings.nativeBuildInputs
|
||||
++ lib.optionals buildCanExecuteHost pkgs.nixComponents.nix-manual.externalNativeBuildInputs
|
||||
++ pkgs.nixComponents.nix-internal-api-docs.nativeBuildInputs
|
||||
++ pkgs.nixComponents.nix-external-api-docs.nativeBuildInputs
|
||||
++ pkgs.nixComponents.nix-functional-tests.externalNativeBuildInputs
|
||||
++ lib.optional
|
||||
(!buildCanExecuteHost
|
||||
# Hack around https://github.com/nixos/nixpkgs/commit/bf7ad8cfbfa102a90463433e2c5027573b462479
|
||||
&& !(stdenv.hostPlatform.isWindows && stdenv.buildPlatform.isDarwin)
|
||||
&& stdenv.hostPlatform.emulatorAvailable pkgs.buildPackages
|
||||
&& lib.meta.availableOn stdenv.buildPlatform (stdenv.hostPlatform.emulator pkgs.buildPackages))
|
||||
pkgs.buildPackages.mesonEmulatorHook
|
||||
++ [
|
||||
pkgs.buildPackages.cmake
|
||||
pkgs.buildPackages.shellcheck
|
||||
pkgs.buildPackages.changelog-d
|
||||
modular.pre-commit.settings.package
|
||||
(pkgs.writeScriptBin "pre-commit-hooks-install"
|
||||
modular.pre-commit.settings.installationScript)
|
||||
]
|
||||
# TODO: Remove the darwin check once
|
||||
# https://github.com/NixOS/nixpkgs/pull/291814 is available
|
||||
++ lib.optional (stdenv.cc.isClang && !stdenv.buildPlatform.isDarwin) pkgs.buildPackages.bear
|
||||
++ lib.optional (stdenv.cc.isClang && stdenv.hostPlatform == stdenv.buildPlatform) (lib.hiPrio pkgs.buildPackages.clang-tools);
|
||||
|
||||
buildInputs = attrs.buildInputs or []
|
||||
++ [
|
||||
pkgs.gtest
|
||||
pkgs.rapidcheck
|
||||
]
|
||||
++ lib.optional havePerl pkgs.perl
|
||||
;
|
||||
});
|
||||
in
|
||||
makeShell = import ./packaging/dev-shell.nix { inherit lib devFlake; };
|
||||
prefixAttrs = prefix: lib.concatMapAttrs (k: v: { "${prefix}-${k}" = v; });
|
||||
in
|
||||
forAllSystems (system:
|
||||
let
|
||||
makeShells = prefix: pkgs:
|
||||
lib.mapAttrs'
|
||||
(k: v: lib.nameValuePair "${prefix}-${k}" v)
|
||||
(forAllStdenvs (stdenvName: makeShell pkgs pkgs.${stdenvName}));
|
||||
in
|
||||
(makeShells "native" nixpkgsFor.${system}.native) //
|
||||
(lib.optionalAttrs (!nixpkgsFor.${system}.native.stdenv.isDarwin)
|
||||
(makeShells "static" nixpkgsFor.${system}.static) //
|
||||
(forAllCrossSystems (crossSystem: let pkgs = nixpkgsFor.${system}.cross.${crossSystem}; in makeShell pkgs pkgs.stdenv))) //
|
||||
{
|
||||
default = self.devShells.${system}.native-stdenvPackages;
|
||||
}
|
||||
prefixAttrs "native" (forAllStdenvs (stdenvName: makeShell {
|
||||
pkgs = nixpkgsFor.${system}.stdenvs."${stdenvName}Packages";
|
||||
})) //
|
||||
lib.optionalAttrs (!nixpkgsFor.${system}.native.stdenv.isDarwin) (
|
||||
prefixAttrs "static" (forAllStdenvs (stdenvName: makeShell {
|
||||
pkgs = nixpkgsFor.${system}.stdenvs."${stdenvName}Packages".pkgsStatic;
|
||||
})) //
|
||||
prefixAttrs "cross" (forAllCrossSystems (crossSystem: makeShell {
|
||||
pkgs = nixpkgsFor.${system}.cross.${crossSystem};
|
||||
}))
|
||||
) //
|
||||
{
|
||||
default = self.devShells.${system}.native-stdenvPackages;
|
||||
}
|
||||
);
|
||||
};
|
||||
}
|
||||
|
15
local.mk
@ -1,15 +0,0 @@
|
||||
GLOBAL_CXXFLAGS += -Wno-deprecated-declarations -Werror=switch
|
||||
# Allow switch-enum to be overridden for files that do not support it, usually because of dependency headers.
|
||||
ERROR_SWITCH_ENUM = -Werror=switch-enum
|
||||
|
||||
$(foreach i, config.h $(wildcard src/lib*/*.hh) $(filter-out %_internal.h, $(wildcard src/lib*c/*.h)), \
|
||||
$(eval $(call install-file-in, $(i), $(includedir)/nix, 0644)))
|
||||
|
||||
ifdef HOST_UNIX
|
||||
$(foreach i, $(wildcard src/lib*/unix/*.hh), \
|
||||
$(eval $(call install-file-in, $(i), $(includedir)/nix, 0644)))
|
||||
endif
|
||||
|
||||
$(GCH): src/libutil/util.hh config.h
|
||||
|
||||
GCH_CXXFLAGS = $(INCLUDE_libutil)
|
@ -1,951 +0,0 @@
|
||||
# ===========================================================================
|
||||
# https://www.gnu.org/software/autoconf-archive/ax_cxx_compile_stdcxx.html
|
||||
# ===========================================================================
|
||||
#
|
||||
# SYNOPSIS
|
||||
#
|
||||
# AX_CXX_COMPILE_STDCXX(VERSION, [ext|noext], [mandatory|optional])
|
||||
#
|
||||
# DESCRIPTION
|
||||
#
|
||||
# Check for baseline language coverage in the compiler for the specified
|
||||
# version of the C++ standard. If necessary, add switches to CXX and
|
||||
# CXXCPP to enable support. VERSION may be '11' (for the C++11 standard)
|
||||
# or '14' (for the C++14 standard).
|
||||
#
|
||||
# The second argument, if specified, indicates whether you insist on an
|
||||
# extended mode (e.g. -std=gnu++11) or a strict conformance mode (e.g.
|
||||
# -std=c++11). If neither is specified, you get whatever works, with
|
||||
# preference for an extended mode.
|
||||
#
|
||||
# The third argument, if specified 'mandatory' or if left unspecified,
|
||||
# indicates that baseline support for the specified C++ standard is
|
||||
# required and that the macro should error out if no mode with that
|
||||
# support is found. If specified 'optional', then configuration proceeds
|
||||
# regardless, after defining HAVE_CXX${VERSION} if and only if a
|
||||
# supporting mode is found.
|
||||
#
|
||||
# LICENSE
|
||||
#
|
||||
# Copyright (c) 2008 Benjamin Kosnik <bkoz@redhat.com>
|
||||
# Copyright (c) 2012 Zack Weinberg <zackw@panix.com>
|
||||
# Copyright (c) 2013 Roy Stogner <roystgnr@ices.utexas.edu>
|
||||
# Copyright (c) 2014, 2015 Google Inc.; contributed by Alexey Sokolov <sokolov@google.com>
|
||||
# Copyright (c) 2015 Paul Norman <penorman@mac.com>
|
||||
# Copyright (c) 2015 Moritz Klammler <moritz@klammler.eu>
|
||||
# Copyright (c) 2016, 2018 Krzesimir Nowak <qdlacz@gmail.com>
|
||||
# Copyright (c) 2019 Enji Cooper <yaneurabeya@gmail.com>
|
||||
#
|
||||
# Copying and distribution of this file, with or without modification, are
|
||||
# permitted in any medium without royalty provided the copyright notice
|
||||
# and this notice are preserved. This file is offered as-is, without any
|
||||
# warranty.
|
||||
|
||||
#serial 11
|
||||
|
||||
dnl This macro is based on the code from the AX_CXX_COMPILE_STDCXX_11 macro
|
||||
dnl (serial version number 13).
|
||||
|
||||
AC_DEFUN([AX_CXX_COMPILE_STDCXX], [dnl
|
||||
m4_if([$1], [11], [ax_cxx_compile_alternatives="11 0x"],
|
||||
[$1], [14], [ax_cxx_compile_alternatives="14 1y"],
|
||||
[$1], [17], [ax_cxx_compile_alternatives="17 1z"],
|
||||
[m4_fatal([invalid first argument `$1' to AX_CXX_COMPILE_STDCXX])])dnl
|
||||
m4_if([$2], [], [],
|
||||
[$2], [ext], [],
|
||||
[$2], [noext], [],
|
||||
[m4_fatal([invalid second argument `$2' to AX_CXX_COMPILE_STDCXX])])dnl
|
||||
m4_if([$3], [], [ax_cxx_compile_cxx$1_required=true],
|
||||
[$3], [mandatory], [ax_cxx_compile_cxx$1_required=true],
|
||||
[$3], [optional], [ax_cxx_compile_cxx$1_required=false],
|
||||
[m4_fatal([invalid third argument `$3' to AX_CXX_COMPILE_STDCXX])])
|
||||
AC_LANG_PUSH([C++])dnl
|
||||
ac_success=no
|
||||
|
||||
m4_if([$2], [noext], [], [dnl
|
||||
if test x$ac_success = xno; then
|
||||
for alternative in ${ax_cxx_compile_alternatives}; do
|
||||
switch="-std=gnu++${alternative}"
|
||||
cachevar=AS_TR_SH([ax_cv_cxx_compile_cxx$1_$switch])
|
||||
AC_CACHE_CHECK(whether $CXX supports C++$1 features with $switch,
|
||||
$cachevar,
|
||||
[ac_save_CXX="$CXX"
|
||||
CXX="$CXX $switch"
|
||||
AC_COMPILE_IFELSE([AC_LANG_SOURCE([_AX_CXX_COMPILE_STDCXX_testbody_$1])],
|
||||
[eval $cachevar=yes],
|
||||
[eval $cachevar=no])
|
||||
CXX="$ac_save_CXX"])
|
||||
if eval test x\$$cachevar = xyes; then
|
||||
CXX="$CXX $switch"
|
||||
if test -n "$CXXCPP" ; then
|
||||
CXXCPP="$CXXCPP $switch"
|
||||
fi
|
||||
ac_success=yes
|
||||
break
|
||||
fi
|
||||
done
|
||||
fi])
|
||||
|
||||
m4_if([$2], [ext], [], [dnl
|
||||
if test x$ac_success = xno; then
|
||||
dnl HP's aCC needs +std=c++11 according to:
|
||||
dnl http://h21007.www2.hp.com/portal/download/files/unprot/aCxx/PDF_Release_Notes/769149-001.pdf
|
||||
dnl Cray's crayCC needs "-h std=c++11"
|
||||
for alternative in ${ax_cxx_compile_alternatives}; do
|
||||
for switch in -std=c++${alternative} +std=c++${alternative} "-h std=c++${alternative}"; do
|
||||
cachevar=AS_TR_SH([ax_cv_cxx_compile_cxx$1_$switch])
|
||||
AC_CACHE_CHECK(whether $CXX supports C++$1 features with $switch,
|
||||
$cachevar,
|
||||
[ac_save_CXX="$CXX"
|
||||
CXX="$CXX $switch"
|
||||
AC_COMPILE_IFELSE([AC_LANG_SOURCE([_AX_CXX_COMPILE_STDCXX_testbody_$1])],
|
||||
[eval $cachevar=yes],
|
||||
[eval $cachevar=no])
|
||||
CXX="$ac_save_CXX"])
|
||||
if eval test x\$$cachevar = xyes; then
|
||||
CXX="$CXX $switch"
|
||||
if test -n "$CXXCPP" ; then
|
||||
CXXCPP="$CXXCPP $switch"
|
||||
fi
|
||||
ac_success=yes
|
||||
break
|
||||
fi
|
||||
done
|
||||
if test x$ac_success = xyes; then
|
||||
break
|
||||
fi
|
||||
done
|
||||
fi])
|
||||
AC_LANG_POP([C++])
|
||||
if test x$ax_cxx_compile_cxx$1_required = xtrue; then
|
||||
if test x$ac_success = xno; then
|
||||
AC_MSG_ERROR([*** A compiler with support for C++$1 language features is required.])
|
||||
fi
|
||||
fi
|
||||
if test x$ac_success = xno; then
|
||||
HAVE_CXX$1=0
|
||||
AC_MSG_NOTICE([No compiler with C++$1 support was found])
|
||||
else
|
||||
HAVE_CXX$1=1
|
||||
AC_DEFINE(HAVE_CXX$1,1,
|
||||
[define if the compiler supports basic C++$1 syntax])
|
||||
fi
|
||||
AC_SUBST(HAVE_CXX$1)
|
||||
])
|
||||
|
||||
|
||||
dnl Test body for checking C++11 support
|
||||
|
||||
m4_define([_AX_CXX_COMPILE_STDCXX_testbody_11],
|
||||
_AX_CXX_COMPILE_STDCXX_testbody_new_in_11
|
||||
)
|
||||
|
||||
|
||||
dnl Test body for checking C++14 support
|
||||
|
||||
m4_define([_AX_CXX_COMPILE_STDCXX_testbody_14],
|
||||
_AX_CXX_COMPILE_STDCXX_testbody_new_in_11
|
||||
_AX_CXX_COMPILE_STDCXX_testbody_new_in_14
|
||||
)
|
||||
|
||||
m4_define([_AX_CXX_COMPILE_STDCXX_testbody_17],
|
||||
_AX_CXX_COMPILE_STDCXX_testbody_new_in_11
|
||||
_AX_CXX_COMPILE_STDCXX_testbody_new_in_14
|
||||
_AX_CXX_COMPILE_STDCXX_testbody_new_in_17
|
||||
)
|
||||
|
||||
dnl Tests for new features in C++11
|
||||
|
||||
m4_define([_AX_CXX_COMPILE_STDCXX_testbody_new_in_11], [[
|
||||
|
||||
// If the compiler admits that it is not ready for C++11, why torture it?
|
||||
// Hopefully, this will speed up the test.
|
||||
|
||||
#ifndef __cplusplus
|
||||
|
||||
#error "This is not a C++ compiler"
|
||||
|
||||
#elif __cplusplus < 201103L
|
||||
|
||||
#error "This is not a C++11 compiler"
|
||||
|
||||
#else
|
||||
|
||||
namespace cxx11
|
||||
{
|
||||
|
||||
namespace test_static_assert
|
||||
{
|
||||
|
||||
template <typename T>
|
||||
struct check
|
||||
{
|
||||
static_assert(sizeof(int) <= sizeof(T), "not big enough");
|
||||
};
|
||||
|
||||
}
|
||||
|
||||
namespace test_final_override
|
||||
{
|
||||
|
||||
struct Base
|
||||
{
|
||||
virtual ~Base() {}
|
||||
virtual void f() {}
|
||||
};
|
||||
|
||||
struct Derived : public Base
|
||||
{
|
||||
virtual ~Derived() override {}
|
||||
virtual void f() override {}
|
||||
};
|
||||
|
||||
}
|
||||
|
||||
namespace test_double_right_angle_brackets
|
||||
{
|
||||
|
||||
template < typename T >
|
||||
struct check {};
|
||||
|
||||
typedef check<void> single_type;
|
||||
typedef check<check<void>> double_type;
|
||||
typedef check<check<check<void>>> triple_type;
|
||||
typedef check<check<check<check<void>>>> quadruple_type;
|
||||
|
||||
}
|
||||
|
||||
namespace test_decltype
|
||||
{
|
||||
|
||||
int
|
||||
f()
|
||||
{
|
||||
int a = 1;
|
||||
decltype(a) b = 2;
|
||||
return a + b;
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
namespace test_type_deduction
|
||||
{
|
||||
|
||||
template < typename T1, typename T2 >
|
||||
struct is_same
|
||||
{
|
||||
static const bool value = false;
|
||||
};
|
||||
|
||||
template < typename T >
|
||||
struct is_same<T, T>
|
||||
{
|
||||
static const bool value = true;
|
||||
};
|
||||
|
||||
template < typename T1, typename T2 >
|
||||
auto
|
||||
add(T1 a1, T2 a2) -> decltype(a1 + a2)
|
||||
{
|
||||
return a1 + a2;
|
||||
}
|
||||
|
||||
int
|
||||
test(const int c, volatile int v)
|
||||
{
|
||||
static_assert(is_same<int, decltype(0)>::value == true, "");
|
||||
static_assert(is_same<int, decltype(c)>::value == false, "");
|
||||
static_assert(is_same<int, decltype(v)>::value == false, "");
|
||||
auto ac = c;
|
||||
auto av = v;
|
||||
auto sumi = ac + av + 'x';
|
||||
auto sumf = ac + av + 1.0;
|
||||
static_assert(is_same<int, decltype(ac)>::value == true, "");
|
||||
static_assert(is_same<int, decltype(av)>::value == true, "");
|
||||
static_assert(is_same<int, decltype(sumi)>::value == true, "");
|
||||
static_assert(is_same<int, decltype(sumf)>::value == false, "");
|
||||
static_assert(is_same<int, decltype(add(c, v))>::value == true, "");
|
||||
return (sumf > 0.0) ? sumi : add(c, v);
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
namespace test_noexcept
|
||||
{
|
||||
|
||||
int f() { return 0; }
|
||||
int g() noexcept { return 0; }
|
||||
|
||||
static_assert(noexcept(f()) == false, "");
|
||||
static_assert(noexcept(g()) == true, "");
|
||||
|
||||
}
|
||||
|
||||
namespace test_constexpr
|
||||
{
|
||||
|
||||
template < typename CharT >
|
||||
unsigned long constexpr
|
||||
strlen_c_r(const CharT *const s, const unsigned long acc) noexcept
|
||||
{
|
||||
return *s ? strlen_c_r(s + 1, acc + 1) : acc;
|
||||
}
|
||||
|
||||
template < typename CharT >
|
||||
unsigned long constexpr
|
||||
strlen_c(const CharT *const s) noexcept
|
||||
{
|
||||
return strlen_c_r(s, 0UL);
|
||||
}
|
||||
|
||||
static_assert(strlen_c("") == 0UL, "");
|
||||
static_assert(strlen_c("1") == 1UL, "");
|
||||
static_assert(strlen_c("example") == 7UL, "");
|
||||
static_assert(strlen_c("another\0example") == 7UL, "");
|
||||
|
||||
}
|
||||
|
||||
namespace test_rvalue_references
|
||||
{
|
||||
|
||||
template < int N >
|
||||
struct answer
|
||||
{
|
||||
static constexpr int value = N;
|
||||
};
|
||||
|
||||
answer<1> f(int&) { return answer<1>(); }
|
||||
answer<2> f(const int&) { return answer<2>(); }
|
||||
answer<3> f(int&&) { return answer<3>(); }
|
||||
|
||||
void
|
||||
test()
|
||||
{
|
||||
int i = 0;
|
||||
const int c = 0;
|
||||
static_assert(decltype(f(i))::value == 1, "");
|
||||
static_assert(decltype(f(c))::value == 2, "");
|
||||
static_assert(decltype(f(0))::value == 3, "");
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
namespace test_uniform_initialization
|
||||
{
|
||||
|
||||
struct test
|
||||
{
|
||||
static const int zero {};
|
||||
static const int one {1};
|
||||
};
|
||||
|
||||
static_assert(test::zero == 0, "");
|
||||
static_assert(test::one == 1, "");
|
||||
|
||||
}
|
||||
|
||||
namespace test_lambdas
|
||||
{
|
||||
|
||||
void
|
||||
test1()
|
||||
{
|
||||
auto lambda1 = [](){};
|
||||
auto lambda2 = lambda1;
|
||||
lambda1();
|
||||
lambda2();
|
||||
}
|
||||
|
||||
int
|
||||
test2()
|
||||
{
|
||||
auto a = [](int i, int j){ return i + j; }(1, 2);
|
||||
auto b = []() -> int { return '0'; }();
|
||||
auto c = [=](){ return a + b; }();
|
||||
auto d = [&](){ return c; }();
|
||||
auto e = [a, &b](int x) mutable {
|
||||
const auto identity = [](int y){ return y; };
|
||||
for (auto i = 0; i < a; ++i)
|
||||
a += b--;
|
||||
return x + identity(a + b);
|
||||
}(0);
|
||||
return a + b + c + d + e;
|
||||
}
|
||||
|
||||
int
|
||||
test3()
|
||||
{
|
||||
const auto nullary = [](){ return 0; };
|
||||
const auto unary = [](int x){ return x; };
|
||||
using nullary_t = decltype(nullary);
|
||||
using unary_t = decltype(unary);
|
||||
const auto higher1st = [](nullary_t f){ return f(); };
|
||||
const auto higher2nd = [unary](nullary_t f1){
|
||||
return [unary, f1](unary_t f2){ return f2(unary(f1())); };
|
||||
};
|
||||
return higher1st(nullary) + higher2nd(nullary)(unary);
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
namespace test_variadic_templates
|
||||
{
|
||||
|
||||
template <int...>
|
||||
struct sum;
|
||||
|
||||
template <int N0, int... N1toN>
|
||||
struct sum<N0, N1toN...>
|
||||
{
|
||||
static constexpr auto value = N0 + sum<N1toN...>::value;
|
||||
};
|
||||
|
||||
template <>
|
||||
struct sum<>
|
||||
{
|
||||
static constexpr auto value = 0;
|
||||
};
|
||||
|
||||
static_assert(sum<>::value == 0, "");
|
||||
static_assert(sum<1>::value == 1, "");
|
||||
static_assert(sum<23>::value == 23, "");
|
||||
static_assert(sum<1, 2>::value == 3, "");
|
||||
static_assert(sum<5, 5, 11>::value == 21, "");
|
||||
static_assert(sum<2, 3, 5, 7, 11, 13>::value == 41, "");
|
||||
|
||||
}
|
||||
|
||||
// http://stackoverflow.com/questions/13728184/template-aliases-and-sfinae
|
||||
// Clang 3.1 fails with headers of libstd++ 4.8.3 when using std::function
|
||||
// because of this.
|
||||
namespace test_template_alias_sfinae
|
||||
{
|
||||
|
||||
struct foo {};
|
||||
|
||||
template<typename T>
|
||||
using member = typename T::member_type;
|
||||
|
||||
template<typename T>
|
||||
void func(...) {}
|
||||
|
||||
template<typename T>
|
||||
void func(member<T>*) {}
|
||||
|
||||
void test();
|
||||
|
||||
void test() { func<foo>(0); }
|
||||
|
||||
}
|
||||
|
||||
} // namespace cxx11
|
||||
|
||||
#endif // __cplusplus >= 201103L
|
||||
|
||||
]])
|
||||
|
||||
|
||||
dnl Tests for new features in C++14
|
||||
|
||||
m4_define([_AX_CXX_COMPILE_STDCXX_testbody_new_in_14], [[
|
||||
|
||||
// If the compiler admits that it is not ready for C++14, why torture it?
|
||||
// Hopefully, this will speed up the test.
|
||||
|
||||
#ifndef __cplusplus
|
||||
|
||||
#error "This is not a C++ compiler"
|
||||
|
||||
#elif __cplusplus < 201402L
|
||||
|
||||
#error "This is not a C++14 compiler"
|
||||
|
||||
#else
|
||||
|
||||
namespace cxx14
|
||||
{
|
||||
|
||||
namespace test_polymorphic_lambdas
|
||||
{
|
||||
|
||||
int
|
||||
test()
|
||||
{
|
||||
const auto lambda = [](auto&&... args){
|
||||
const auto istiny = [](auto x){
|
||||
return (sizeof(x) == 1UL) ? 1 : 0;
|
||||
};
|
||||
const int aretiny[] = { istiny(args)... };
|
||||
return aretiny[0];
|
||||
};
|
||||
return lambda(1, 1L, 1.0f, '1');
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
namespace test_binary_literals
|
||||
{
|
||||
|
||||
constexpr auto ivii = 0b0000000000101010;
|
||||
static_assert(ivii == 42, "wrong value");
|
||||
|
||||
}
|
||||
|
||||
namespace test_generalized_constexpr
|
||||
{
|
||||
|
||||
template < typename CharT >
|
||||
constexpr unsigned long
|
||||
strlen_c(const CharT *const s) noexcept
|
||||
{
|
||||
auto length = 0UL;
|
||||
for (auto p = s; *p; ++p)
|
||||
++length;
|
||||
return length;
|
||||
}
|
||||
|
||||
static_assert(strlen_c("") == 0UL, "");
|
||||
static_assert(strlen_c("x") == 1UL, "");
|
||||
static_assert(strlen_c("test") == 4UL, "");
|
||||
static_assert(strlen_c("another\0test") == 7UL, "");
|
||||
|
||||
}
|
||||
|
||||
namespace test_lambda_init_capture
|
||||
{
|
||||
|
||||
int
|
||||
test()
|
||||
{
|
||||
auto x = 0;
|
||||
const auto lambda1 = [a = x](int b){ return a + b; };
|
||||
const auto lambda2 = [a = lambda1(x)](){ return a; };
|
||||
return lambda2();
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
namespace test_digit_separators
|
||||
{
|
||||
|
||||
constexpr auto ten_million = 100'000'000;
|
||||
static_assert(ten_million == 100000000, "");
|
||||
|
||||
}
|
||||
|
||||
namespace test_return_type_deduction
|
||||
{
|
||||
|
||||
auto f(int& x) { return x; }
|
||||
decltype(auto) g(int& x) { return x; }
|
||||
|
||||
template < typename T1, typename T2 >
|
||||
struct is_same
|
||||
{
|
||||
static constexpr auto value = false;
|
||||
};
|
||||
|
||||
template < typename T >
|
||||
struct is_same<T, T>
|
||||
{
|
||||
static constexpr auto value = true;
|
||||
};
|
||||
|
||||
int
|
||||
test()
|
||||
{
|
||||
auto x = 0;
|
||||
static_assert(is_same<int, decltype(f(x))>::value, "");
|
||||
static_assert(is_same<int&, decltype(g(x))>::value, "");
|
||||
return x;
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
} // namespace cxx14
|
||||
|
||||
#endif // __cplusplus >= 201402L
|
||||
|
||||
]])
|
||||
|
||||
|
||||
dnl Tests for new features in C++17
|
||||
|
||||
m4_define([_AX_CXX_COMPILE_STDCXX_testbody_new_in_17], [[
|
||||
|
||||
// If the compiler admits that it is not ready for C++17, why torture it?
|
||||
// Hopefully, this will speed up the test.
|
||||
|
||||
#ifndef __cplusplus
|
||||
|
||||
#error "This is not a C++ compiler"
|
||||
|
||||
#elif __cplusplus < 201703L
|
||||
|
||||
#error "This is not a C++17 compiler"
|
||||
|
||||
#else
|
||||
|
||||
#include <initializer_list>
|
||||
#include <utility>
|
||||
#include <type_traits>
|
||||
|
||||
namespace cxx17
|
||||
{
|
||||
|
||||
namespace test_constexpr_lambdas
|
||||
{
|
||||
|
||||
constexpr int foo = [](){return 42;}();
|
||||
|
||||
}
|
||||
|
||||
namespace test::nested_namespace::definitions
|
||||
{
|
||||
|
||||
}
|
||||
|
||||
namespace test_fold_expression
|
||||
{
|
||||
|
||||
template<typename... Args>
|
||||
int multiply(Args... args)
|
||||
{
|
||||
return (args * ... * 1);
|
||||
}
|
||||
|
||||
template<typename... Args>
|
||||
bool all(Args... args)
|
||||
{
|
||||
return (args && ...);
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
namespace test_extended_static_assert
|
||||
{
|
||||
|
||||
static_assert (true);
|
||||
|
||||
}
|
||||
|
||||
namespace test_auto_brace_init_list
|
||||
{
|
||||
|
||||
auto foo = {5};
|
||||
auto bar {5};
|
||||
|
||||
static_assert(std::is_same<std::initializer_list<int>, decltype(foo)>::value);
|
||||
static_assert(std::is_same<int, decltype(bar)>::value);
|
||||
}
|
||||
|
||||
namespace test_typename_in_template_template_parameter
|
||||
{
|
||||
|
||||
template<template<typename> typename X> struct D;
|
||||
|
||||
}
|
||||
|
||||
namespace test_fallthrough_nodiscard_maybe_unused_attributes
|
||||
{
|
||||
|
||||
int f1()
|
||||
{
|
||||
return 42;
|
||||
}
|
||||
|
||||
[[nodiscard]] int f2()
|
||||
{
|
||||
[[maybe_unused]] auto unused = f1();
|
||||
|
||||
switch (f1())
|
||||
{
|
||||
case 17:
|
||||
f1();
|
||||
[[fallthrough]];
|
||||
case 42:
|
||||
f1();
|
||||
}
|
||||
return f1();
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
namespace test_extended_aggregate_initialization
|
||||
{
|
||||
|
||||
struct base1
|
||||
{
|
||||
int b1, b2 = 42;
|
||||
};
|
||||
|
||||
struct base2
|
||||
{
|
||||
base2() {
|
||||
b3 = 42;
|
||||
}
|
||||
int b3;
|
||||
};
|
||||
|
||||
struct derived : base1, base2
|
||||
{
|
||||
int d;
|
||||
};
|
||||
|
||||
derived d1 {{1, 2}, {}, 4}; // full initialization
|
||||
derived d2 {{}, {}, 4}; // value-initialized bases
|
||||
|
||||
}
|
||||
|
||||
namespace test_general_range_based_for_loop
|
||||
{
|
||||
|
||||
struct iter
|
||||
{
|
||||
int i;
|
||||
|
||||
int& operator* ()
|
||||
{
|
||||
return i;
|
||||
}
|
||||
|
||||
const int& operator* () const
|
||||
{
|
||||
return i;
|
||||
}
|
||||
|
||||
iter& operator++()
|
||||
{
|
||||
++i;
|
||||
return *this;
|
||||
}
|
||||
};
|
||||
|
||||
struct sentinel
|
||||
{
|
||||
int i;
|
||||
};
|
||||
|
||||
bool operator== (const iter& i, const sentinel& s)
|
||||
{
|
||||
return i.i == s.i;
|
||||
}
|
||||
|
||||
bool operator!= (const iter& i, const sentinel& s)
|
||||
{
|
||||
return !(i == s);
|
||||
}
|
||||
|
||||
struct range
|
||||
{
|
||||
iter begin() const
|
||||
{
|
||||
return {0};
|
||||
}
|
||||
|
||||
sentinel end() const
|
||||
{
|
||||
return {5};
|
||||
}
|
||||
};
|
||||
|
||||
void f()
|
||||
{
|
||||
range r {};
|
||||
|
||||
for (auto i : r)
|
||||
{
|
||||
[[maybe_unused]] auto v = i;
|
||||
}
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
namespace test_lambda_capture_asterisk_this_by_value
|
||||
{
|
||||
|
||||
struct t
|
||||
{
|
||||
int i;
|
||||
int foo()
|
||||
{
|
||||
return [*this]()
|
||||
{
|
||||
return i;
|
||||
}();
|
||||
}
|
||||
};
|
||||
|
||||
}
|
||||
|
||||
namespace test_enum_class_construction
|
||||
{
|
||||
|
||||
enum class byte : unsigned char
|
||||
{};
|
||||
|
||||
byte foo {42};
|
||||
|
||||
}
|
||||
|
||||
namespace test_constexpr_if
|
||||
{
|
||||
|
||||
template <bool cond>
|
||||
int f ()
|
||||
{
|
||||
if constexpr(cond)
|
||||
{
|
||||
return 13;
|
||||
}
|
||||
else
|
||||
{
|
||||
return 42;
|
||||
}
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
namespace test_selection_statement_with_initializer
|
||||
{
|
||||
|
||||
int f()
|
||||
{
|
||||
return 13;
|
||||
}
|
||||
|
||||
int f2()
|
||||
{
|
||||
if (auto i = f(); i > 0)
|
||||
{
|
||||
return 3;
|
||||
}
|
||||
|
||||
switch (auto i = f(); i + 4)
|
||||
{
|
||||
case 17:
|
||||
return 2;
|
||||
|
||||
default:
|
||||
return 1;
|
||||
}
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
namespace test_template_argument_deduction_for_class_templates
|
||||
{
|
||||
|
||||
template <typename T1, typename T2>
|
||||
struct pair
|
||||
{
|
||||
pair (T1 p1, T2 p2)
|
||||
: m1 {p1},
|
||||
m2 {p2}
|
||||
{}
|
||||
|
||||
T1 m1;
|
||||
T2 m2;
|
||||
};
|
||||
|
||||
void f()
|
||||
{
|
||||
[[maybe_unused]] auto p = pair{13, 42u};
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
namespace test_non_type_auto_template_parameters
|
||||
{
|
||||
|
||||
template <auto n>
|
||||
struct B
|
||||
{};
|
||||
|
||||
B<5> b1;
|
||||
B<'a'> b2;
|
||||
|
||||
}
|
||||
|
||||
namespace test_structured_bindings
|
||||
{
|
||||
|
||||
int arr[2] = { 1, 2 };
|
||||
std::pair<int, int> pr = { 1, 2 };
|
||||
|
||||
auto f1() -> int(&)[2]
|
||||
{
|
||||
return arr;
|
||||
}
|
||||
|
||||
auto f2() -> std::pair<int, int>&
|
||||
{
|
||||
return pr;
|
||||
}
|
||||
|
||||
struct S
|
||||
{
|
||||
int x1 : 2;
|
||||
volatile double y1;
|
||||
};
|
||||
|
||||
S f3()
|
||||
{
|
||||
return {};
|
||||
}
|
||||
|
||||
auto [ x1, y1 ] = f1();
|
||||
auto& [ xr1, yr1 ] = f1();
|
||||
auto [ x2, y2 ] = f2();
|
||||
auto& [ xr2, yr2 ] = f2();
|
||||
const auto [ x3, y3 ] = f3();
|
||||
|
||||
}
|
||||
|
||||
namespace test_exception_spec_type_system
|
||||
{
|
||||
|
||||
struct Good {};
|
||||
struct Bad {};
|
||||
|
||||
void g1() noexcept;
|
||||
void g2();
|
||||
|
||||
template<typename T>
|
||||
Bad
|
||||
f(T*, T*);
|
||||
|
||||
template<typename T1, typename T2>
|
||||
Good
|
||||
f(T1*, T2*);
|
||||
|
||||
static_assert (std::is_same_v<Good, decltype(f(g1, g2))>);
|
||||
|
||||
}
|
||||
|
||||
namespace test_inline_variables
|
||||
{
|
||||
|
||||
template<class T> void f(T)
|
||||
{}
|
||||
|
||||
template<class T> inline T g(T)
|
||||
{
|
||||
return T{};
|
||||
}
|
||||
|
||||
template<> inline void f<>(int)
|
||||
{}
|
||||
|
||||
template<> int g<>(int)
|
||||
{
|
||||
return 5;
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
} // namespace cxx17
|
||||
|
||||
#endif // __cplusplus < 201703L
|
||||
|
||||
]])
|
@ -1,35 +0,0 @@
|
||||
# =============================================================================
|
||||
# https://www.gnu.org/software/autoconf-archive/ax_cxx_compile_stdcxx_17.html
|
||||
# =============================================================================
|
||||
#
|
||||
# SYNOPSIS
|
||||
#
|
||||
# AX_CXX_COMPILE_STDCXX_17([ext|noext], [mandatory|optional])
|
||||
#
|
||||
# DESCRIPTION
|
||||
#
|
||||
# Check for baseline language coverage in the compiler for the C++17
|
||||
# standard; if necessary, add switches to CXX and CXXCPP to enable
|
||||
# support.
|
||||
#
|
||||
# This macro is a convenience alias for calling the AX_CXX_COMPILE_STDCXX
|
||||
# macro with the version set to C++17. The two optional arguments are
|
||||
# forwarded literally as the second and third argument respectively.
|
||||
# Please see the documentation for the AX_CXX_COMPILE_STDCXX macro for
|
||||
# more information. If you want to use this macro, you also need to
|
||||
# download the ax_cxx_compile_stdcxx.m4 file.
|
||||
#
|
||||
# LICENSE
|
||||
#
|
||||
# Copyright (c) 2015 Moritz Klammler <moritz@klammler.eu>
|
||||
# Copyright (c) 2016 Krzesimir Nowak <qdlacz@gmail.com>
|
||||
#
|
||||
# Copying and distribution of this file, with or without modification, are
|
||||
# permitted in any medium without royalty provided the copyright notice
|
||||
# and this notice are preserved. This file is offered as-is, without any
|
||||
# warranty.
|
||||
|
||||
#serial 2
|
||||
|
||||
AX_REQUIRE_DEFINED([AX_CXX_COMPILE_STDCXX])
|
||||
AC_DEFUN([AX_CXX_COMPILE_STDCXX_17], [AX_CXX_COMPILE_STDCXX([17], [$1], [$2])])
|
@ -59,6 +59,9 @@ Team meetings are generally open to anyone interested.
|
||||
We can make exceptions to discuss sensitive issues, such as security incidents or people matters.
|
||||
Contact any team member to get a calendar invite for reminders and updates.
|
||||
|
||||
> [!IMPORTANT]
|
||||
> [Handling security reports](./security-reports.md) always takes priority.
|
||||
|
||||
## Project board protocol
|
||||
|
||||
The team uses a [GitHub project board](https://github.com/orgs/NixOS/projects/19/views/1) for tracking its work.
|
||||
|
@ -48,5 +48,55 @@
|
||||
"delroth@gmail.com": "delroth",
|
||||
"enno@nerdworks.de": "elohmeier",
|
||||
"mjbauer95@gmail.com": "matthewbauer",
|
||||
"MostAwesomeDude@gmail.com": "MostAwesomeDude"
|
||||
"MostAwesomeDude@gmail.com": "MostAwesomeDude",
|
||||
"145775305+xokdvium@users.noreply.github.com": "xokdvium",
|
||||
"bryanhonof@gmail.com": "bryanhonof",
|
||||
"50352631+michaelvanstraten@users.noreply.github.com": "michaelvanstraten",
|
||||
"bjorn.forsman@gmail.com": "bjornfor",
|
||||
"pol.dellaiera@protonmail.com": "drupol",
|
||||
"tim.vanbaak@gmail.com": "Jaculabilis",
|
||||
"leetemil@users.noreply.github.com": "leetemil",
|
||||
"a-h@users.noreply.github.com": "a-h",
|
||||
"me@artem.ist": "artemist",
|
||||
"puck@puckipedia.com": "puckipedia",
|
||||
"marian.hammer@meetwise.com": "emhamm",
|
||||
"78693624+llakala@users.noreply.github.com": "llakala",
|
||||
"itkachev@hyperad.tech": null,
|
||||
"geofft@ldpreload.com": "geofft",
|
||||
"onni@flaky.build": "onnimonni",
|
||||
"jacek@galowicz.de": "tfc",
|
||||
"potterbein@blockstream.com": null,
|
||||
"49699333+dependabot[bot]@users.noreply.github.com": "dependabot[bot]",
|
||||
"112626461+VinayakKaushikDH@users.noreply.github.com": "VinayakKaushikDH",
|
||||
"kevincox@kevincox.ca": "kevincox",
|
||||
"yann.hamdaoui@tweag.io": "yannham",
|
||||
"GregLeyda@proton.me": "Gerg-L",
|
||||
"jljusten@gmail.com": "jljusten",
|
||||
"josh.heinrichs@shopify.com": "joshheinrichs-shopify",
|
||||
"jason@jasonyundt.email": "Jayman2000",
|
||||
"noamraph@gmail.com": "noamraph",
|
||||
"nikodem@rabulinski.com": "nrabulinski",
|
||||
"78693624+quatquatt@users.noreply.github.com": "llakala",
|
||||
"yuriy.taraday@tweag.io": "YorikSar",
|
||||
"travis.a.everett@gmail.com": "abathur",
|
||||
"Artturin@artturin.com": "Artturin",
|
||||
"zimbatm@zimbatm.com": "zimbatm",
|
||||
"contact@parkerhoyes.com": "parkerhoyes",
|
||||
"kjeremy@gmail.com": "kjeremy",
|
||||
"jkerfs@users.noreply.github.com": "jkerfs",
|
||||
"sandro.jaeckel@gmail.com": "SuperSandro2000",
|
||||
"hi@alyssa.is": "alyssais",
|
||||
"2716069+jhrcek@users.noreply.github.com": "jhrcek",
|
||||
"seggy.umboh@coupa.com": "secobarbital",
|
||||
"hello@emily.moe": "emilazy",
|
||||
"ehmry@posteo.net": "ehmry",
|
||||
"me@aleksana.moe": "Aleksanaa",
|
||||
"tom@floxdev.com": null,
|
||||
"sbh69840@gmail.com": "shivaraj-bh",
|
||||
"mjgallag@gmail.com": "mjgallag",
|
||||
"bryango@users.noreply.github.com": "bryango",
|
||||
"aks.kenji@protonmail.com": "a-kenji",
|
||||
"54070204+0x5a4@users.noreply.github.com": "0x5a4",
|
||||
"brian@bmcgee.ie": "brianmcgee",
|
||||
"squalus@squalus.net": "squalus"
|
||||
}
|
@ -41,5 +41,50 @@
|
||||
"winterqt": "Winter",
|
||||
"GoldsteinE": "Max \u201cGoldstein\u201d Siling",
|
||||
"pennae": null,
|
||||
"MostAwesomeDude": "Corbin Simpson"
|
||||
"MostAwesomeDude": "Corbin Simpson",
|
||||
"VinayakKaushikDH": "Vinayak Kaushik",
|
||||
"leetemil": "Emil Petersen",
|
||||
"michaelvanstraten": "Michael",
|
||||
"parkerhoyes": "Parker Hoyes",
|
||||
"a-h": "Adrian Hesketh",
|
||||
"a-kenji": "kenji",
|
||||
"geofft": "Geoffrey Thomas",
|
||||
"bryango": null,
|
||||
"tfc": "Jacek Galowicz",
|
||||
"brianmcgee": "Brian McGee",
|
||||
"Gerg-L": null,
|
||||
"secobarbital": "Seggy Umboh",
|
||||
"bjornfor": "Bj\u00f8rn Forsman",
|
||||
"dependabot[bot]": null,
|
||||
"xokdvium": "Sergei Zimmerman",
|
||||
"kevincox": "Kevin Cox",
|
||||
"Jayman2000": "Jason Yundt",
|
||||
"Artturin": "Artturin",
|
||||
"0x5a4": "1444",
|
||||
"llakala": "Eman Resu",
|
||||
"nrabulinski": "Nikodem Rabuli\u0144ski",
|
||||
"shivaraj-bh": "Shivaraj B H",
|
||||
"yannham": "Yann Hamdaoui",
|
||||
"jkerfs": "Jeremy Kerfs",
|
||||
"drupol": "Pol Dellaiera",
|
||||
"onnimonni": "Onni Hakala",
|
||||
"joshheinrichs-shopify": "Josh Heinrichs",
|
||||
"puckipedia": null,
|
||||
"abathur": "Travis A. Everett",
|
||||
"alyssais": "Alyssa Ross",
|
||||
"noamraph": "Noam Yorav-Raphael",
|
||||
"squalus": null,
|
||||
"emhamm": null,
|
||||
"mjgallag": "Michael Gallagher",
|
||||
"jljusten": "Jordan Justen",
|
||||
"ehmry": "Emery Hemingway",
|
||||
"jhrcek": "Jan Hrcek",
|
||||
"Jaculabilis": "Tim",
|
||||
"bryanhonof": "Bryan Honof",
|
||||
"zimbatm": "Jonas Chevalier",
|
||||
"SuperSandro2000": "Sandro",
|
||||
"Aleksanaa": "Aleksana",
|
||||
"YorikSar": "Yuriy Taraday",
|
||||
"kjeremy": "Jeremy Kolb",
|
||||
"artemist": "Artemis Tosini"
|
||||
}
|
@ -7,7 +7,7 @@
|
||||
|
||||
perSystem = { config, pkgs, ... }: {
|
||||
|
||||
# https://flake.parts/options/pre-commit-hooks-nix.html#options
|
||||
# https://flake.parts/options/git-hooks-nix#options
|
||||
pre-commit.settings = {
|
||||
hooks = {
|
||||
clang-format = {
|
||||
@ -28,8 +28,6 @@
|
||||
''^src/build-remote/build-remote\.cc$''
|
||||
''^src/libcmd/built-path\.cc$''
|
||||
''^src/libcmd/built-path\.hh$''
|
||||
''^src/libcmd/command\.cc$''
|
||||
''^src/libcmd/command\.hh$''
|
||||
''^src/libcmd/common-eval-args\.cc$''
|
||||
''^src/libcmd/common-eval-args\.hh$''
|
||||
''^src/libcmd/editor-for\.cc$''
|
||||
@ -501,7 +499,6 @@
|
||||
''^scripts/install-nix-from-closure\.sh$''
|
||||
''^scripts/install-systemd-multi-user\.sh$''
|
||||
''^src/nix/get-env\.sh$''
|
||||
''^tests/functional/build\.sh$''
|
||||
''^tests/functional/ca/build-dry\.sh$''
|
||||
''^tests/functional/ca/build-with-garbage-path\.sh$''
|
||||
''^tests/functional/ca/common\.sh$''
|
||||
@ -517,7 +514,6 @@
|
||||
''^tests/functional/ca/selfref-gc\.sh$''
|
||||
''^tests/functional/ca/why-depends\.sh$''
|
||||
''^tests/functional/characterisation-test-infra\.sh$''
|
||||
''^tests/functional/check\.sh$''
|
||||
''^tests/functional/common/vars-and-functions\.sh$''
|
||||
''^tests/functional/completions\.sh$''
|
||||
''^tests/functional/compute-levels\.sh$''
|
||||
@ -534,7 +530,6 @@
|
||||
''^tests/functional/dyn-drv/old-daemon-error-hack\.sh$''
|
||||
''^tests/functional/dyn-drv/recursive-mod-json\.sh$''
|
||||
''^tests/functional/eval-store\.sh$''
|
||||
''^tests/functional/eval\.sh$''
|
||||
''^tests/functional/export-graph\.sh$''
|
||||
''^tests/functional/export\.sh$''
|
||||
''^tests/functional/extra-sandbox-profile\.sh$''
|
||||
@ -544,15 +539,12 @@
|
||||
''^tests/functional/fetchGitSubmodules\.sh$''
|
||||
''^tests/functional/fetchGitVerification\.sh$''
|
||||
''^tests/functional/fetchMercurial\.sh$''
|
||||
''^tests/functional/fetchurl\.sh$''
|
||||
''^tests/functional/fixed\.builder1\.sh$''
|
||||
''^tests/functional/fixed\.builder2\.sh$''
|
||||
''^tests/functional/fixed\.sh$''
|
||||
''^tests/functional/flakes/absolute-paths\.sh$''
|
||||
''^tests/functional/flakes/check\.sh$''
|
||||
''^tests/functional/flakes/common\.sh$''
|
||||
''^tests/functional/flakes/config\.sh$''
|
||||
''^tests/functional/flakes/develop\.sh$''
|
||||
''^tests/functional/flakes/flakes\.sh$''
|
||||
''^tests/functional/flakes/follow-paths\.sh$''
|
||||
''^tests/functional/flakes/prefetch\.sh$''
|
||||
@ -565,16 +557,12 @@
|
||||
''^tests/functional/gc-concurrent\.sh$''
|
||||
''^tests/functional/gc-concurrent2\.builder\.sh$''
|
||||
''^tests/functional/gc-non-blocking\.sh$''
|
||||
''^tests/functional/gc\.sh$''
|
||||
''^tests/functional/git-hashing/common\.sh$''
|
||||
''^tests/functional/git-hashing/simple\.sh$''
|
||||
''^tests/functional/hash-convert\.sh$''
|
||||
''^tests/functional/help\.sh$''
|
||||
''^tests/functional/impure-derivations\.sh$''
|
||||
''^tests/functional/impure-env\.sh$''
|
||||
''^tests/functional/impure-eval\.sh$''
|
||||
''^tests/functional/install-darwin\.sh$''
|
||||
''^tests/functional/lang\.sh$''
|
||||
''^tests/functional/legacy-ssh-store\.sh$''
|
||||
''^tests/functional/linux-sandbox\.sh$''
|
||||
''^tests/functional/local-overlay-store/add-lower-inner\.sh$''
|
||||
@ -603,7 +591,6 @@
|
||||
''^tests/functional/logging\.sh$''
|
||||
''^tests/functional/misc\.sh$''
|
||||
''^tests/functional/multiple-outputs\.sh$''
|
||||
''^tests/functional/nar-access\.sh$''
|
||||
''^tests/functional/nested-sandboxing\.sh$''
|
||||
''^tests/functional/nested-sandboxing/command\.sh$''
|
||||
''^tests/functional/nix-build\.sh$''
|
||||
@ -624,7 +611,6 @@
|
||||
''^tests/functional/path-from-hash-part\.sh$''
|
||||
''^tests/functional/path-info\.sh$''
|
||||
''^tests/functional/placeholders\.sh$''
|
||||
''^tests/functional/plugins\.sh$''
|
||||
''^tests/functional/post-hook\.sh$''
|
||||
''^tests/functional/pure-eval\.sh$''
|
||||
''^tests/functional/push-to-store-old\.sh$''
|
||||
@ -639,7 +625,6 @@
|
||||
''^tests/functional/search\.sh$''
|
||||
''^tests/functional/secure-drv-outputs\.sh$''
|
||||
''^tests/functional/selfref-gc\.sh$''
|
||||
''^tests/functional/shell\.sh$''
|
||||
''^tests/functional/shell\.shebang\.sh$''
|
||||
''^tests/functional/simple\.builder\.sh$''
|
||||
''^tests/functional/supplementary-groups\.sh$''
|
||||
@ -649,7 +634,6 @@
|
||||
''^tests/functional/user-envs\.builder\.sh$''
|
||||
''^tests/functional/user-envs\.sh$''
|
||||
''^tests/functional/why-depends\.sh$''
|
||||
''^tests/functional/zstd\.sh$''
|
||||
''^src/libutil-tests/data/git/check-data\.sh$''
|
||||
];
|
||||
};
|
||||
|
@ -1,8 +0,0 @@
|
||||
|
||||
.PHONY: format
|
||||
print-top-help += echo ' format: Format source code'
|
||||
|
||||
# This uses the cached .pre-commit-hooks.yaml file
|
||||
fmt_script := $(d)/format.sh
|
||||
format:
|
||||
@$(fmt_script)
|
@ -4,3 +4,4 @@
|
||||
- https://github.com/NixOS/nixos-homepage/
|
||||
- https://github.com/orgs/NixOS/teams/nix-team
|
||||
- Matrix room
|
||||
- Team member should subscribe to notifications for the [Nix development category on Discourse](https://discourse.nixos.org/c/dev/nix/50)
|
||||
|
@ -15,7 +15,7 @@ release:
|
||||
|
||||
* (Optionally) Updated `fallback-paths.nix` in Nixpkgs
|
||||
|
||||
* An updated manual on https://nixos.org/manual/nix/stable/
|
||||
* An updated manual on https://nix.dev/manual/nix/latest/
|
||||
|
||||
## Creating a new release from the `master` branch
|
||||
|
||||
@ -194,8 +194,58 @@ release:
|
||||
|
||||
* Bump the version number of the release branch as above (e.g. to
|
||||
`2.12.2`).
|
||||
|
||||
|
||||
## Recovering from mistakes
|
||||
|
||||
`upload-release.pl` should be idempotent. For instance, a wrong `IS_LATEST` value can be fixed by re-running the script on the actual latest release.
|
||||
|
||||
## Security releases
|
||||
|
||||
> See also the instructions for [handling security reports](./security-reports.md).
|
||||
|
||||
Once a security fix is ready for merging:
|
||||
|
||||
1. Summarize *all* past communication in the report.
|
||||
|
||||
1. Request a CVE in the [GitHub security advisory](https://github.com/NixOS/nix/security/advisories) for the security fix.
|
||||
|
||||
1. Notify all collaborators on the advisory with a timeline for the release.
|
||||
|
||||
1. Merge the fix. Publish the advisory.
|
||||
|
||||
1. [Make point releases](#creating-point-releases) for all affected versions.
|
||||
|
||||
1. Update the affected Nix releases in Nixpkgs to the patched version.
|
||||
|
||||
For each Nix release, change the `version = ` strings and run
|
||||
|
||||
```shell-session
|
||||
nix-build -A nixVersions.nix_<major>_<minor>
|
||||
```
|
||||
|
||||
to get the correct hash for the `hash =` field.
|
||||
|
||||
1. Once the release is built by Hydra, update fallback paths.
|
||||
|
||||
For the Nix release `${version}` shipped with Nixpkgs, run:
|
||||
|
||||
```shell-session
|
||||
curl https://releases.nixos.org/nix/nix-${version}/fallback-paths.nix > nixos/modules/installer/tools/nix-fallback-paths.nix
|
||||
```
|
||||
|
||||
Starting with Nixpkgs 24.11, there is an automatic check that fallback paths with Nix binaries match the Nix release shipped with Nixpkgs.
|
||||
|
||||
1. Backport the updates to the two most recent stable releases of Nixpkgs.
|
||||
|
||||
Add `backport release-<version>` labels, which will trigger GitHub Actions to attempt automatic backports.
|
||||
|
||||
1. Once the pull request against `master` lands on `nixpkgs-unstable`, post a Discourse announcement with
|
||||
|
||||
- Links to the CVE and GitHub security advisory
|
||||
- A description of the vulnerability and its fix
|
||||
- Credits to the reporters of the vulnerability and contributors of the fix
|
||||
- A list of affected and patched Nix releases
|
||||
- Instructions for updating
|
||||
- A link to the [pull request tracker](https://nixpk.gs/pr-tracker.html) to follow when the patched Nix versions will appear on the various release channels
|
||||
|
||||
Check [past announcements](https://discourse.nixos.org/search?expanded=true&q=Security%20fix%20in%3Atitle%20order%3Alatest_topic) for reference.
|
||||
|
31
maintainers/security-reports.md
Normal file
@ -0,0 +1,31 @@
|
||||
# Handling security reports
|
||||
|
||||
Reports are expected to be submitted following the [security policy](https://github.com/NixOS/nix/security/policy), but may also reach maintainers through various other channels.
|
||||
|
||||
In case a vulnerability is reported:
|
||||
|
||||
1. [Create a GitHub security advisory](https://github.com/NixOS/nix/security/advisories/new)
|
||||
|
||||
> [!IMPORTANT]
|
||||
> Add the reporter as a collaborator so they get notified of all activities.
|
||||
|
||||
In addition to the details in the advisory template, the initial report should:
|
||||
|
||||
- Include sufficient details of the vulnerability to allow it to be understood and reproduced.
|
||||
- Redact any personal data.
|
||||
- Set a deadline (if applicable).
|
||||
- Provide proof of concept code (if available).
|
||||
- Reference any further reading material that may be appropriate.
|
||||
|
||||
1. Establish a private communication channel (e.g. a Matrix room) with the reporter and all Nix maintainers.
|
||||
|
||||
1. Communicate with the reporter which team members are assigned and when they are available.
|
||||
|
||||
1. Consider which immediate preliminary measures should be taken before working on a fix.
|
||||
|
||||
1. Prioritize fixing the security issue over ongoing work.
|
||||
|
||||
1. Keep everyone involved up to date on progress and the estimated timeline for releasing the fix.
|
||||
|
||||
> See also the instructions for [security releases](./release-process.md#security-releases).
|
||||
|
30
meson.build
@ -22,10 +22,12 @@ subproject('libcmd')
|
||||
subproject('nix')
|
||||
|
||||
# Docs
|
||||
subproject('internal-api-docs')
|
||||
subproject('external-api-docs')
|
||||
if not meson.is_cross_build()
|
||||
subproject('nix-manual')
|
||||
if get_option('doc-gen')
|
||||
subproject('internal-api-docs')
|
||||
subproject('external-api-docs')
|
||||
if not meson.is_cross_build()
|
||||
subproject('nix-manual')
|
||||
endif
|
||||
endif
|
||||
|
||||
# External C wrapper libraries
|
||||
@ -35,17 +37,19 @@ subproject('libexpr-c')
|
||||
subproject('libmain-c')
|
||||
|
||||
# Language Bindings
|
||||
if not meson.is_cross_build()
|
||||
if get_option('bindings') and not meson.is_cross_build()
|
||||
subproject('perl')
|
||||
endif
|
||||
|
||||
# Testing
|
||||
subproject('libutil-test-support')
|
||||
subproject('libutil-tests')
|
||||
subproject('libstore-test-support')
|
||||
subproject('libstore-tests')
|
||||
subproject('libfetchers-tests')
|
||||
subproject('libexpr-test-support')
|
||||
subproject('libexpr-tests')
|
||||
subproject('libflake-tests')
|
||||
if get_option('unit-tests')
|
||||
subproject('libutil-test-support')
|
||||
subproject('libutil-tests')
|
||||
subproject('libstore-test-support')
|
||||
subproject('libstore-tests')
|
||||
subproject('libfetchers-tests')
|
||||
subproject('libexpr-test-support')
|
||||
subproject('libexpr-tests')
|
||||
subproject('libflake-tests')
|
||||
endif
|
||||
subproject('nix-functional-tests')
|
||||
|
13
meson.options
Normal file
@ -0,0 +1,13 @@
|
||||
# vim: filetype=meson
|
||||
|
||||
option('doc-gen', type : 'boolean', value : false,
|
||||
description : 'Generate documentation',
|
||||
)
|
||||
|
||||
option('unit-tests', type : 'boolean', value : true,
|
||||
description : 'Build unit tests',
|
||||
)
|
||||
|
||||
option('bindings', type : 'boolean', value : true,
|
||||
description : 'Build language bindings (e.g. Perl)',
|
||||
)
|
@ -1 +0,0 @@
|
||||
$(eval $(call install-file-as, $(d)/completion.sh, $(datarootdir)/bash-completion/completions/nix, 0644))
|
@ -1 +0,0 @@
|
||||
$(eval $(call install-file-as, $(d)/completion.fish, $(datarootdir)/fish/vendor_completions.d/nix.fish, 0644))
|
@ -1,5 +0,0 @@
|
||||
ifdef HOST_DARWIN
|
||||
|
||||
$(eval $(call install-data-in, $(d)/org.nixos.nix-daemon.plist, $(prefix)/Library/LaunchDaemons))
|
||||
|
||||
endif
|
@ -1,8 +0,0 @@
|
||||
ifdef HOST_LINUX
|
||||
|
||||
$(foreach n, nix-daemon.socket nix-daemon.service, $(eval $(call install-file-in, $(d)/$(n), $(prefix)/lib/systemd/system, 0644)))
|
||||
$(foreach n, nix-daemon.conf, $(eval $(call install-file-in, $(d)/$(n), $(prefix)/lib/tmpfiles.d, 0644)))
|
||||
|
||||
clean-files += $(d)/nix-daemon.socket $(d)/nix-daemon.service $(d)/nix-daemon.conf
|
||||
|
||||
endif
|
@ -1,7 +0,0 @@
|
||||
ifdef HOST_LINUX
|
||||
|
||||
$(foreach n, nix-daemon.conf, $(eval $(call install-file-in, $(d)/$(n), $(sysconfdir)/init, 0644)))
|
||||
|
||||
clean-files += $(d)/nix-daemon.conf
|
||||
|
||||
endif
|
@ -1,2 +0,0 @@
|
||||
$(eval $(call install-file-as, $(d)/completion.zsh, $(datarootdir)/zsh/site-functions/_nix, 0644))
|
||||
$(eval $(call install-file-as, $(d)/run-help-nix, $(datarootdir)/zsh/site-functions/run-help-nix, 0644))
|
@ -1,10 +0,0 @@
|
||||
# Initialise support for build directories.
|
||||
builddir ?=
|
||||
|
||||
ifdef builddir
|
||||
buildprefix = $(builddir)/
|
||||
buildprefixrel = $(builddir)
|
||||
else
|
||||
buildprefix =
|
||||
buildprefixrel = .
|
||||
endif
|
11
mk/clean.mk
@ -1,11 +0,0 @@
|
||||
clean-files :=
|
||||
|
||||
clean:
|
||||
$(suppress) rm -fv -- $(clean-files)
|
||||
|
||||
dryclean:
|
||||
@for i in $(clean-files); do if [ -e $$i ]; then echo $$i; fi; done | sort
|
||||
|
||||
print-top-help += \
|
||||
echo " clean: Delete generated files"; \
|
||||
echo " dryclean: Show what files would be deleted by 'make clean'";
|
@ -1,23 +0,0 @@
|
||||
# shellcheck shell=bash
|
||||
|
||||
# Remove overall test dir (at most one of the two should match) and
|
||||
# remove file extension.
|
||||
|
||||
test_name=$(echo -n "${test?must be defined by caller (test runner)}" | sed \
|
||||
-e "s|^src/[^/]*-test/data/||" \
|
||||
-e "s|^tests/functional/||" \
|
||||
-e "s|\.sh$||" \
|
||||
)
|
||||
|
||||
# shellcheck disable=SC2016
|
||||
TESTS_ENVIRONMENT=(
|
||||
"TEST_NAME=$test_name"
|
||||
'NIX_REMOTE='
|
||||
'PS4=+(${BASH_SOURCE[0]-$0}:$LINENO) '
|
||||
)
|
||||
|
||||
read -r -a bash <<< "${BASH:-/usr/bin/env bash}"
|
||||
|
||||
run () {
|
||||
cd "$(dirname "$1")" && env "${TESTS_ENVIRONMENT[@]}" "${bash[@]}" -x -e -u -o pipefail "$(basename "$1")"
|
||||
}
|
@ -1,11 +0,0 @@
|
||||
compile-commands-json-files :=
|
||||
|
||||
define write-compile-commands
|
||||
_srcs := $$(sort $$(foreach src, $$($(1)_SOURCES), $$(src)))
|
||||
|
||||
$(1)_COMPILE_COMMANDS_JSON := $$(addprefix $(buildprefix), $$(addsuffix .compile_commands.json, $$(basename $$(_srcs))))
|
||||
|
||||
compile-commands-json-files += $$($(1)_COMPILE_COMMANDS_JSON)
|
||||
|
||||
clean-files += $$($(1)_COMPILE_COMMANDS_JSON)
|
||||
endef
|
@ -1,5 +0,0 @@
|
||||
%.gen.hh: %
|
||||
@echo 'R"__NIX_STR(' >> $@.tmp
|
||||
$(trace-gen) cat $< >> $@.tmp
|
||||
@echo ')__NIX_STR"' >> $@.tmp
|
||||
@mv $@.tmp $@
|
@ -1,10 +0,0 @@
|
||||
#!/usr/bin/env bash
|
||||
|
||||
set -eu -o pipefail
|
||||
|
||||
test=$1
|
||||
|
||||
dir="$(dirname "${BASH_SOURCE[0]}")"
|
||||
source "$dir/common-test.sh"
|
||||
|
||||
run "$test"
|
@ -1,14 +0,0 @@
|
||||
# Utility function for recursively finding files, e.g.
|
||||
# ‘$(call rwildcard, path/to/dir, *.c *.h)’.
|
||||
rwildcard=$(foreach d,$(wildcard $1*),$(call rwildcard,$d/,$2) $(filter $(subst *,%,$2),$d))
|
||||
|
||||
# Given a file name, produce the corresponding dependency file
|
||||
# (e.g. ‘foo/bar.o’ becomes ‘foo/.bar.o.dep’).
|
||||
filename-to-dep = $(dir $1).$(notdir $1).dep
|
||||
|
||||
# Return the full path to a program by looking it up in $PATH, or the
|
||||
# empty string if not found.
|
||||
find-program = $(shell for i in $$(IFS=: ; echo $$PATH); do p=$$i/$(strip $1); if [ -e $$p ]; then echo $$p; break; fi; done)
|
||||
|
||||
# Ensure that the given string ends in a single slash.
|
||||
add-trailing-slash = $(patsubst %/,%,$(1))/
|
@ -1,11 +0,0 @@
|
||||
# Default installation paths.
|
||||
prefix ?= /usr/local
|
||||
libdir ?= $(prefix)/lib
|
||||
bindir ?= $(prefix)/bin
|
||||
libexecdir ?= $(prefix)/libexec
|
||||
datadir ?= $(prefix)/share
|
||||
localstatedir ?= $(prefix)/var
|
||||
sysconfdir ?= $(prefix)/etc
|
||||
mandir ?= $(prefix)/share/man
|
||||
|
||||
DESTDIR ?=
|
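The defaults in this removed file followed the usual GNU installation-path conventions and could be overridden on the `make` command line. A sketch (the paths are illustrative, not taken from this commit):

```shell-session
make install prefix=/opt/nix DESTDIR=/tmp/nix-staging
```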
@ -1,62 +0,0 @@
|
||||
# Add a rule for creating $(1) as a directory. This template may be
|
||||
# called multiple times for the same directory.
|
||||
define create-dir
|
||||
_i := $$(call add-trailing-slash, $(DESTDIR)$$(strip $(1)))
|
||||
ifndef $$(_i)_SEEN
|
||||
$$(_i)_SEEN = 1
|
||||
$$(_i):
|
||||
$$(trace-mkdir) install -d "$$@"
|
||||
endif
|
||||
endef
|
||||
|
||||
|
||||
# Add a rule for installing file $(1) as file $(2) with mode $(3).
|
||||
# The directory containing $(2) will be created automatically.
|
||||
define install-file-as
|
||||
|
||||
_i := $(DESTDIR)$$(strip $(2))
|
||||
|
||||
install: $$(_i)
|
||||
|
||||
$$(_i): $(1) | $$(dir $$(_i))
|
||||
$$(trace-install) install -m $(3) $(1) "$$@"
|
||||
|
||||
$$(eval $$(call create-dir, $$(dir $(2))))
|
||||
|
||||
endef
|
||||
|
||||
|
||||
# Add a rule for installing file $(1) in directory $(2) with mode
|
||||
# $(3). The directory will be created automatically.
|
||||
define install-file-in
|
||||
$$(eval $$(call install-file-as,$(1),$(2)/$$(notdir $(1)),$(3)))
|
||||
endef
|
||||
|
||||
|
||||
define install-program-in
|
||||
$$(eval $$(call install-file-in,$(1),$(2),0755))
|
||||
endef
|
||||
|
||||
|
||||
define install-data-in
|
||||
$$(eval $$(call install-file-in,$(1),$(2),0644))
|
||||
endef
|
||||
|
||||
|
||||
# Install a symlink from $(2) to $(1). Note that $(1) need not exist.
|
||||
define install-symlink
|
||||
|
||||
_i := $(DESTDIR)$$(strip $(2))
|
||||
|
||||
install: $$(_i)
|
||||
|
||||
$$(_i): | $$(dir $$(_i))
|
||||
$$(trace-install) ln -sfn $(1) "$$@"
|
||||
|
||||
$$(eval $$(call create-dir, $$(dir $(2))))
|
||||
|
||||
endef
|
||||
|
||||
|
||||
print-top-help += \
|
||||
echo " install: Install into \$$(prefix) (currently set to '$(prefix)')";
|
159
mk/lib.mk
@ -1,159 +0,0 @@
|
||||
default: all
|
||||
|
||||
|
||||
# Get rid of default suffixes. FIXME: is this a good idea?
|
||||
.SUFFIXES:
|
||||
|
||||
|
||||
# Initialise some variables.
|
||||
bin-scripts :=
|
||||
noinst-scripts :=
|
||||
man-pages :=
|
||||
install-tests :=
|
||||
install-tests-groups :=
|
||||
|
||||
include mk/platform.mk
|
||||
|
||||
# Hack to define a literal space.
|
||||
space :=
|
||||
space +=
|
||||
|
||||
|
||||
# Hack to define a literal newline.
|
||||
define newline
|
||||
|
||||
|
||||
endef
|
||||
|
||||
|
||||
# Pass -fPIC if we're building dynamic libraries.
|
||||
BUILD_SHARED_LIBS ?= 1
|
||||
|
||||
ifeq ($(BUILD_SHARED_LIBS), 1)
|
||||
ifdef HOST_CYGWIN
|
||||
GLOBAL_CFLAGS += -U__STRICT_ANSI__ -D_GNU_SOURCE
|
||||
GLOBAL_CXXFLAGS += -U__STRICT_ANSI__ -D_GNU_SOURCE
|
||||
else
|
||||
GLOBAL_CFLAGS += -fPIC
|
||||
GLOBAL_CXXFLAGS += -fPIC
|
||||
endif
|
||||
ifndef HOST_DARWIN
|
||||
ifndef HOST_SOLARIS
|
||||
ifndef HOST_FREEBSD
|
||||
GLOBAL_LDFLAGS += -Wl,--no-copy-dt-needed-entries
|
||||
endif
|
||||
endif
|
||||
endif
|
||||
SET_RPATH_TO_LIBS ?= 1
|
||||
endif
|
||||
|
||||
# Pass -g if we want debug info.
|
||||
BUILD_DEBUG ?= 1
|
||||
|
||||
ifeq ($(BUILD_DEBUG), 1)
|
||||
GLOBAL_CFLAGS += -g
|
||||
GLOBAL_CXXFLAGS += -g
|
||||
endif
|
||||
|
||||
|
||||
include mk/build-dir.mk
|
||||
include mk/install-dirs.mk
|
||||
include mk/functions.mk
|
||||
include mk/tracing.mk
|
||||
include mk/clean.mk
|
||||
include mk/install.mk
|
||||
include mk/libraries.mk
|
||||
include mk/programs.mk
|
||||
include mk/patterns.mk
|
||||
include mk/templates.mk
|
||||
include mk/cxx-big-literal.mk
|
||||
include mk/tests.mk
|
||||
include mk/compilation-database.mk
|
||||
|
||||
|
||||
# Include all sub-Makefiles.
|
||||
define include-sub-makefile
|
||||
d := $$(patsubst %/,%,$$(dir $(1)))
|
||||
include $(1)
|
||||
endef
|
||||
|
||||
$(foreach mf, $(makefiles), $(eval $(call include-sub-makefile,$(mf))))
|
||||
|
||||
|
||||
# Instantiate stuff.
|
||||
$(foreach lib, $(libraries), $(eval $(call build-library,$(lib))))
|
||||
$(foreach prog, $(programs), $(eval $(call build-program,$(prog))))
|
||||
$(foreach script, $(bin-scripts), $(eval $(call install-program-in,$(script),$(bindir))))
|
||||
$(foreach script, $(bin-scripts), $(eval programs-list += $(script)))
|
||||
$(foreach script, $(noinst-scripts), $(eval programs-list += $(script)))
|
||||
$(foreach template, $(template-files), $(eval $(call instantiate-template,$(template))))
|
||||
$(foreach test, $(install-tests), \
|
||||
$(eval $(call run-test,$(test))) \
|
||||
$(eval installcheck: $(test).test))
|
||||
$(foreach test-group, $(install-tests-groups), \
|
||||
$(eval $(call run-test-group,$(test-group))) \
|
||||
$(eval installcheck: $(test-group).test-group) \
|
||||
$(foreach test, $($(test-group)-tests), \
|
||||
$(eval $(call run-test,$(test))) \
|
||||
$(eval $(test-group).test-group: $(test).test)))
|
||||
|
||||
# Compilation database.
|
||||
$(foreach lib, $(libraries), $(eval $(call write-compile-commands,$(lib))))
|
||||
$(foreach prog, $(programs), $(eval $(call write-compile-commands,$(prog))))
|
||||
|
||||
compile_commands.json: $(compile-commands-json-files)
|
||||
@jq --slurp '.' $^ >$@
|
||||
|
||||
# Include makefiles requiring built programs.
|
||||
$(foreach mf, $(makefiles-late), $(eval $(call include-sub-makefile,$(mf))))
|
||||
|
||||
|
||||
$(foreach file, $(man-pages), $(eval $(call install-data-in, $(file), $(mandir)/man$(patsubst .%,%,$(suffix $(file))))))
|
||||
|
||||
|
||||
.PHONY: default all man help
|
||||
|
||||
all: $(programs-list) $(libs-list) $(man-pages)
|
||||
|
||||
man: $(man-pages)
|
||||
|
||||
|
||||
help:
|
||||
@echo "The following targets are available:"
|
||||
@echo ""
|
||||
@echo " default: Build default targets"
|
||||
ifdef man-pages
|
||||
@echo " man: Generate manual pages"
|
||||
endif
|
||||
@$(print-top-help)
|
||||
ifdef programs-list
|
||||
@echo ""
|
||||
@echo "The following programs can be built:"
|
||||
@echo ""
|
||||
@for i in $(programs-list); do echo " $$i"; done
|
||||
endif
|
||||
ifdef libs-list
|
||||
@echo ""
|
||||
@echo "The following libraries can be built:"
|
||||
@echo ""
|
||||
@for i in $(libs-list); do echo " $$i"; done
|
||||
endif
|
||||
ifdef install-tests-groups
|
||||
@echo ""
|
||||
@echo "The following groups of functional tests can be run:"
|
||||
@echo ""
|
||||
@for i in $(install-tests-groups); do echo " $$i.test-group"; done
|
||||
@echo ""
|
||||
@echo "(installcheck includes tests in test groups too.)"
|
||||
endif
|
||||
@echo ""
|
||||
@echo "The following variables control the build:"
|
||||
@echo ""
|
||||
@echo " BUILD_SHARED_LIBS ($(BUILD_SHARED_LIBS)): Whether to build shared libraries"
|
||||
@echo " BUILD_DEBUG ($(BUILD_DEBUG)): Whether to include debug symbols"
|
||||
@echo " CC ($(CC)): C compiler to be used"
|
||||
@echo " CFLAGS: Flags for the C compiler"
|
||||
@echo " CXX ($(CXX)): C++ compiler to be used"
|
||||
@echo " CXXFLAGS: Flags for the C++ compiler"
|
||||
@echo " CPPFLAGS: C preprocessor flags, used for both CC and CXX"
|
||||
@$(print-var-help)
|
171
mk/libraries.mk
@ -1,171 +0,0 @@
|
||||
libs-list :=
|
||||
|
||||
ifdef HOST_DARWIN
|
||||
SO_EXT = dylib
|
||||
else
|
||||
ifdef HOST_WINDOWS
|
||||
SO_EXT = dll
|
||||
else
|
||||
SO_EXT = so
|
||||
endif
|
||||
endif
|
||||
|
||||
ifdef HOST_UNIX
|
||||
THREAD_LDFLAGS = -pthread
|
||||
else
|
||||
THREAD_LDFLAGS =
|
||||
endif
|
||||
|
||||
# Build a library with symbolic name $(1). The library is defined by
|
||||
# various variables prefixed by ‘$(1)_’:
|
||||
#
|
||||
# - $(1)_NAME: the name of the library (e.g. ‘libfoo’); defaults to
|
||||
# $(1).
|
||||
#
|
||||
# - $(1)_DIR: the directory where the (non-installed) library will be
|
||||
# placed.
|
||||
#
|
||||
# - $(1)_SOURCES: the source files of the library.
|
||||
#
|
||||
# - $(1)_CFLAGS: additional C compiler flags.
|
||||
#
|
||||
# - $(1)_CXXFLAGS: additional C++ compiler flags.
|
||||
#
|
||||
# - $(1)_ORDER_AFTER: a set of targets on which the object files of
|
||||
# this libraries will have an order-only dependency.
|
||||
#
|
||||
# - $(1)_LIBS: the symbolic names of other libraries on which this
|
||||
# library depends.
|
||||
#
|
||||
# - $(1)_ALLOW_UNDEFINED: if set, the library is allowed to have
|
||||
# undefined symbols. Has no effect for static libraries.
|
||||
#
|
||||
# - $(1)_LDFLAGS: additional linker flags.
|
||||
#
|
||||
# - $(1)_LDFLAGS_PROPAGATED: additional linker flags, also propagated
|
||||
# to the linking of programs/libraries that use this library.
|
||||
#
|
||||
# - $(1)_FORCE_INSTALL: if defined, the library will be installed even
|
||||
# if it's not needed (i.e. dynamically linked) by a program.
|
||||
#
|
||||
# - $(1)_INSTALL_DIR: the directory where the library will be
|
||||
# installed. Defaults to $(libdir).
|
||||
#
|
||||
# - $(1)_EXCLUDE_FROM_LIBRARY_LIST: if defined, the library will not
|
||||
# be automatically marked as a dependency of the top-level all
|
||||
# target andwill not be listed in the make help output. This is
|
||||
# useful for libraries built solely for testing, for example.
|
||||
#
|
||||
# - BUILD_SHARED_LIBS: if equal to ‘1’, a dynamic library will be
|
||||
# built, otherwise a static library.
|
||||
define build-library
|
||||
$(1)_NAME ?= $(1)
|
||||
_d := $(buildprefix)$$(strip $$($(1)_DIR))
|
||||
_srcs := $$(sort $$(foreach src, $$($(1)_SOURCES), $$(src)))
|
||||
$(1)_OBJS := $$(addprefix $(buildprefix), $$(addsuffix .o, $$(basename $$(_srcs))))
|
||||
_libs := $$(foreach lib, $$($(1)_LIBS), $$($$(lib)_PATH))
|
||||
|
||||
ifdef HOST_WINDOWS
|
||||
$(1)_INSTALL_DIR ?= $$(bindir)
|
||||
else
|
||||
$(1)_INSTALL_DIR ?= $$(libdir)
|
||||
endif
|
||||
|
||||
$(1)_LDFLAGS_USE :=
|
||||
$(1)_LDFLAGS_USE_INSTALLED :=
|
||||
$(1)_LIB_CLOSURE := $(1)
|
||||
|
||||
$$(eval $$(call create-dir, $$(_d)))
|
||||
|
||||
ifeq ($(BUILD_SHARED_LIBS), 1)
|
||||
|
||||
ifdef $(1)_ALLOW_UNDEFINED
|
||||
ifdef HOST_DARWIN
|
||||
$(1)_LDFLAGS += -undefined suppress -flat_namespace
|
||||
endif
|
||||
else
|
||||
ifndef HOST_DARWIN
|
||||
ifndef HOST_WINDOWS
|
||||
$(1)_LDFLAGS += -Wl,-z,defs
|
||||
endif
|
||||
endif
|
||||
endif
|
||||
|
||||
ifndef HOST_DARWIN
|
||||
$(1)_LDFLAGS += -Wl,-soname=$$($(1)_NAME).$(SO_EXT)
|
||||
endif
|
||||
|
||||
$(1)_PATH := $$(_d)/$$($(1)_NAME).$(SO_EXT)
|
||||
|
||||
$$($(1)_PATH): $$($(1)_OBJS) $$(_libs) | $$(_d)/
|
||||
+$$(trace-ld) $(CXX) -o $$(abspath $$@) -shared $$(LDFLAGS) $$(GLOBAL_LDFLAGS) $$($(1)_OBJS) $$($(1)_LDFLAGS) $$($(1)_LDFLAGS_PROPAGATED) $$(foreach lib, $$($(1)_LIBS), $$($$(lib)_LDFLAGS_USE)) $$($(1)_LDFLAGS_UNINSTALLED)
|
||||
|
||||
ifndef HOST_DARWIN
|
||||
$(1)_LDFLAGS_USE += -Wl,-rpath,$$(abspath $$(_d))
|
||||
endif
|
||||
$(1)_LDFLAGS_USE += -L$$(_d) -l$$(patsubst lib%,%,$$(strip $$($(1)_NAME)))
|
||||
|
||||
$(1)_INSTALL_PATH := $(DESTDIR)$$($(1)_INSTALL_DIR)/$$($(1)_NAME).$(SO_EXT)
|
||||
|
||||
_libs_final := $$(foreach lib, $$($(1)_LIBS), $$($$(lib)_INSTALL_PATH))
|
||||
|
||||
$$(eval $$(call create-dir, $$($(1)_INSTALL_DIR)))
|
||||
|
||||
$$($(1)_INSTALL_PATH): $$($(1)_OBJS) $$(_libs_final) | $(DESTDIR)$$($(1)_INSTALL_DIR)/
|
||||
+$$(trace-ld) $(CXX) -o $$@ -shared $$(LDFLAGS) $$(GLOBAL_LDFLAGS) $$($(1)_OBJS) $$($(1)_LDFLAGS) $$($(1)_LDFLAGS_PROPAGATED) $$(foreach lib, $$($(1)_LIBS), $$($$(lib)_LDFLAGS_USE_INSTALLED))
|
||||
|
||||
$(1)_LDFLAGS_USE_INSTALLED += -L$$(DESTDIR)$$($(1)_INSTALL_DIR) -l$$(patsubst lib%,%,$$(strip $$($(1)_NAME)))
|
||||
ifndef HOST_DARWIN
|
||||
ifeq ($(SET_RPATH_TO_LIBS), 1)
|
||||
$(1)_LDFLAGS_USE_INSTALLED += -Wl,-rpath,$$($(1)_INSTALL_DIR)
|
||||
else
|
||||
$(1)_LDFLAGS_USE_INSTALLED += -Wl,-rpath-link,$$($(1)_INSTALL_DIR)
|
||||
endif
|
||||
endif
|
||||
|
||||
ifdef $(1)_FORCE_INSTALL
|
||||
install: $$($(1)_INSTALL_PATH)
|
||||
endif
|
||||
|
||||
else
|
||||
|
||||
$(1)_PATH := $$(_d)/$$($(1)_NAME).a
|
||||
|
||||
$$($(1)_PATH): $$($(1)_OBJS) | $$(_d)/
|
||||
$$(trace-ld) $(LD) $$(ifndef $(HOST_DARWIN),-U) -r -o $$(_d)/$$($(1)_NAME).o $$^
|
||||
$$(trace-ar) $(AR) crs $$@ $$(_d)/$$($(1)_NAME).o
|
||||
|
||||
$(1)_LDFLAGS_USE += $$($(1)_PATH) $$($(1)_LDFLAGS) $$(foreach lib, $$($(1)_LIBS), $$($$(lib)_LDFLAGS_USE))
|
||||
|
||||
$(1)_INSTALL_PATH := $$(libdir)/$$($(1)_NAME).a
|
||||
|
||||
$(1)_LIB_CLOSURE += $$($(1)_LIBS)
|
||||
|
||||
endif
|
||||
|
||||
$(1)_LDFLAGS_USE += $$($(1)_LDFLAGS_PROPAGATED)
|
||||
$(1)_LDFLAGS_USE_INSTALLED += $$($(1)_LDFLAGS_PROPAGATED)
|
||||
|
||||
# Propagate CFLAGS and CXXFLAGS to the individual object files.
|
||||
$$(foreach obj, $$($(1)_OBJS), $$(eval $$(obj)_CFLAGS=$$($(1)_CFLAGS)))
|
||||
$$(foreach obj, $$($(1)_OBJS), $$(eval $$(obj)_CXXFLAGS=$$($(1)_CXXFLAGS)))
|
||||
|
||||
# Make each object file depend on the common dependencies.
|
||||
$$(foreach obj, $$($(1)_OBJS), $$(eval $$(obj): $$($(1)_COMMON_DEPS) $$(GLOBAL_COMMON_DEPS)))
|
||||
|
||||
# Make each object file have order-only dependencies on the common
|
||||
# order-only dependencies. This includes the order-only dependencies
|
||||
# of libraries we're depending on.
|
||||
$(1)_ORDER_AFTER_CLOSED = $$($(1)_ORDER_AFTER) $$(foreach lib, $$($(1)_LIBS), $$($$(lib)_ORDER_AFTER_CLOSED))
|
||||
|
||||
$$(foreach obj, $$($(1)_OBJS), $$(eval $$(obj): | $$($(1)_ORDER_AFTER_CLOSED) $$(GLOBAL_ORDER_AFTER)))
|
||||
|
||||
# Include .dep files, if they exist.
|
||||
$(1)_DEPS := $$(foreach fn, $$($(1)_OBJS), $$(call filename-to-dep, $$(fn)))
|
||||
-include $$($(1)_DEPS)
|
||||
|
||||
ifndef $(1)_EXCLUDE_FROM_LIBRARY_LIST
|
||||
libs-list += $$($(1)_PATH)
|
||||
endif
|
||||
clean-files += $$(_d)/*.a $$(_d)/*.$(SO_EXT) $$(_d)/*.o $$(_d)/.*.dep $$($(1)_DEPS) $$($(1)_OBJS)
|
||||
endef
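
# Hypothetical usage sketch (not part of the original makefiles): how a
# library might be declared against the variables documented above. The
# names, paths and flags are invented for illustration only.
libexample_NAME := libexample
libexample_DIR := src/libexample
libexample_SOURCES := src/libexample/example.cc src/libexample/util.cc
libexample_CXXFLAGS += -Isrc/libexample/include
libexample_LIBS := libdep                     # symbolic name of another library built the same way
libexample_LDFLAGS_PROPAGATED += -pthread

# Instantiate the template; BUILD_SHARED_LIBS decides whether this
# yields libexample.so or libexample.a.
$(eval $(call build-library,libexample))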
|
@ -1,41 +0,0 @@

# These are the complete command lines we use to compile C and C++ files.
# - $< is the source file.
# - $1 is the object file to create.
CC_CMD=$(CC) -o $1 -c $< $(CPPFLAGS) $(GLOBAL_CFLAGS) $(CFLAGS) $($1_CFLAGS) -MMD -MF $(call filename-to-dep,$1) -MP
CXX_CMD=$(CXX) -o $1 -c $< $(CPPFLAGS) $(GLOBAL_CXXFLAGS_PCH) $(GLOBAL_CXXFLAGS) $(CXXFLAGS) $($1_CXXFLAGS) $(ERROR_SWITCH_ENUM) -MMD -MF $(call filename-to-dep,$1) -MP

# We use COMPILE_COMMANDS_JSON_CMD to turn a compilation command (like CC_CMD
# or CXX_CMD above) into a compile_commands.json file. We rely on bash's native
# word splitting to define the positional arguments.
# - $< is the source file being compiled.
COMPILE_COMMANDS_JSON_CMD=jq --null-input '{ directory: $$ENV.PWD, file: "$<", arguments: $$ARGS.positional }' --args --


$(buildprefix)%.o: %.cc
	@mkdir -p "$(dir $@)"
	$(trace-cxx) $(call CXX_CMD,$@)

$(buildprefix)%.o: %.cpp
	@mkdir -p "$(dir $@)"
	$(trace-cxx) $(call CXX_CMD,$@)

$(buildprefix)%.o: %.c
	@mkdir -p "$(dir $@)"
	$(trace-cc) $(call CC_CMD,$@)

# In the following we need to replace the .compile_commands.json extension in $@ with .o
# to make the object file. This is needed because CC_CMD and CXX_CMD do further expansions
# based on the object file name (i.e. *_CXXFLAGS and filename-to-dep).

$(buildprefix)%.compile_commands.json: %.cc
	@mkdir -p "$(dir $@)"
	$(trace-jq) $(COMPILE_COMMANDS_JSON_CMD) $(call CXX_CMD,$(@:.compile_commands.json=.o)) > $@

$(buildprefix)%.compile_commands.json: %.cpp
	@mkdir -p "$(dir $@)"
	$(trace-jq) $(COMPILE_COMMANDS_JSON_CMD) $(call CXX_CMD,$(@:.compile_commands.json=.o)) > $@

$(buildprefix)%.compile_commands.json: %.c
	@mkdir -p "$(dir $@)"
	$(trace-jq) $(COMPILE_COMMANDS_JSON_CMD) $(call CC_CMD,$(@:.compile_commands.json=.o)) > $@
|
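# Hypothetical sketch (an assumption, not part of the original makefiles):
# the per-source fragments produced by the rules above are single JSON
# objects, so a full compile_commands.json could be assembled by slurping
# them with jq. SOURCES is a made-up stand-in for the project's source list.
compile_commands.json: $(addsuffix .compile_commands.json, $(basename $(SOURCES)))
	@jq --slurp . $^ > $@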
@ -1,36 +0,0 @@
ifdef HOST_OS
  HOST_KERNEL = $(firstword $(subst -, ,$(HOST_OS)))
  ifeq ($(patsubst mingw%,,$(HOST_KERNEL)),)
    HOST_MINGW = 1
    HOST_WINDOWS = 1
  endif
  ifeq ($(HOST_KERNEL), cygwin)
    HOST_CYGWIN = 1
    HOST_WINDOWS = 1
    HOST_UNIX = 1
  endif
  ifeq ($(patsubst darwin%,,$(HOST_KERNEL)),)
    HOST_DARWIN = 1
    HOST_UNIX = 1
  endif
  ifeq ($(patsubst freebsd%,,$(HOST_KERNEL)),)
    HOST_FREEBSD = 1
    HOST_UNIX = 1
  endif
  ifeq ($(patsubst netbsd%,,$(HOST_KERNEL)),)
    HOST_NETBSD = 1
    HOST_UNIX = 1
  endif
  ifeq ($(HOST_KERNEL), linux)
    HOST_LINUX = 1
    HOST_UNIX = 1
  endif
  ifeq ($(patsubst solaris%,,$(HOST_KERNEL)),)
    HOST_SOLARIS = 1
    HOST_UNIX = 1
  endif
  ifeq ($(HOST_KERNEL), gnu)
    HOST_HURD = 1
    HOST_UNIX = 1
  endif
endif
|
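# Hypothetical sketch: the HOST_* switches defined above are intended to be
# consumed by the other makefiles, for example to add platform-specific
# linker flags. The flag below is illustrative only.
ifdef HOST_DARWIN
  GLOBAL_LDFLAGS += -Wl,-headerpad_max_install_names
endif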
@ -1,21 +0,0 @@
|
||||
PRECOMPILE_HEADERS ?= 0
|
||||
|
||||
print-var-help += \
|
||||
echo " PRECOMPILE_HEADERS ($(PRECOMPILE_HEADERS)): Whether to use precompiled headers to speed up the build";
|
||||
|
||||
GCH = $(buildprefix)precompiled-headers.h.gch
|
||||
|
||||
$(GCH): precompiled-headers.h
|
||||
@rm -f $@
|
||||
@mkdir -p "$(dir $@)"
|
||||
$(trace-gen) $(CXX) -c -x c++-header -o $@ $< $(GLOBAL_CXXFLAGS) $(GCH_CXXFLAGS)
|
||||
|
||||
clean-files += $(GCH)
|
||||
|
||||
ifeq ($(PRECOMPILE_HEADERS), 1)
|
||||
|
||||
GLOBAL_CXXFLAGS_PCH += -include $(buildprefix)precompiled-headers.h -Winvalid-pch
|
||||
|
||||
GLOBAL_ORDER_AFTER += $(GCH)
|
||||
|
||||
endif
|
@ -1,98 +0,0 @@
programs-list :=

ifdef HOST_WINDOWS
  EXE_EXT = .exe
else
  EXE_EXT =
endif

# Build a program with symbolic name $(1). The program is defined by
# various variables prefixed by ‘$(1)_’:
#
# - $(1)_NAME: the name of the program (e.g. ‘foo’); defaults to
# $(1).
#
# - $(1)_DIR: the directory where the (non-installed) program will be
# placed.
#
# - $(1)_SOURCES: the source files of the program.
#
# - $(1)_CFLAGS: additional C compiler flags.
#
# - $(1)_CXXFLAGS: additional C++ compiler flags.
#
# - $(1)_ORDER_AFTER: a set of targets on which the object files of
# this program will have an order-only dependency.
#
# - $(1)_LIBS: the symbolic names of libraries on which this program
# depends.
#
# - $(1)_LDFLAGS: additional linker flags.
#
# - $(1)_INSTALL_DIR: the directory where the program will be
# installed; defaults to $(bindir).
define build-program
|
||||
$(1)_NAME ?= $(1)
|
||||
_d := $(buildprefix)$$($(1)_DIR)
|
||||
_srcs := $$(sort $$(foreach src, $$($(1)_SOURCES), $$(src)))
|
||||
$(1)_OBJS := $$(addprefix $(buildprefix), $$(addsuffix .o, $$(basename $$(_srcs))))
|
||||
_libs := $$(foreach lib, $$($(1)_LIBS), $$(foreach lib2, $$($$(lib)_LIB_CLOSURE), $$($$(lib2)_PATH)))
|
||||
$(1)_PATH := $$(_d)/$$($(1)_NAME)$(EXE_EXT)
|
||||
|
||||
$$(eval $$(call create-dir, $$(_d)))
|
||||
|
||||
$$($(1)_PATH): $$($(1)_OBJS) $$(_libs) | $$(_d)/
|
||||
+$$(trace-ld) $(CXX) -o $$@ $$(LDFLAGS) $$(GLOBAL_LDFLAGS) $$($(1)_OBJS) $$($(1)_LDFLAGS) $$(foreach lib, $$($(1)_LIBS), $$($$(lib)_LDFLAGS_USE))
|
||||
|
||||
$(1)_INSTALL_DIR ?= $$(bindir)
|
||||
|
||||
ifdef $(1)_INSTALL_DIR
|
||||
|
||||
$(1)_INSTALL_PATH := $$($(1)_INSTALL_DIR)/$$($(1)_NAME)$(EXE_EXT)
|
||||
|
||||
$$(eval $$(call create-dir, $$($(1)_INSTALL_DIR)))
|
||||
|
||||
install: $(DESTDIR)$$($(1)_INSTALL_PATH)
|
||||
|
||||
ifeq ($(BUILD_SHARED_LIBS), 1)
|
||||
|
||||
_libs_final := $$(foreach lib, $$($(1)_LIBS), $$($$(lib)_INSTALL_PATH))
|
||||
|
||||
$(DESTDIR)$$($(1)_INSTALL_PATH): $$($(1)_OBJS) $$(_libs_final) | $(DESTDIR)$$($(1)_INSTALL_DIR)/
|
||||
+$$(trace-ld) $(CXX) -o $$@ $$(LDFLAGS) $$(GLOBAL_LDFLAGS) $$($(1)_OBJS) $$($(1)_LDFLAGS) $$(foreach lib, $$($(1)_LIBS), $$($$(lib)_LDFLAGS_USE_INSTALLED))
|
||||
|
||||
else
|
||||
|
||||
$(DESTDIR)$$($(1)_INSTALL_PATH): $$($(1)_PATH) | $(DESTDIR)$$($(1)_INSTALL_DIR)/
|
||||
+$$(trace-install) install -t $(DESTDIR)$$($(1)_INSTALL_DIR) $$<
|
||||
|
||||
endif
|
||||
endif
|
||||
|
||||
# Propagate CFLAGS and CXXFLAGS to the individual object files.
|
||||
$$(foreach obj, $$($(1)_OBJS), $$(eval $$(obj)_CFLAGS=$$($(1)_CFLAGS)))
|
||||
$$(foreach obj, $$($(1)_OBJS), $$(eval $$(obj)_CXXFLAGS=$$($(1)_CXXFLAGS)))
|
||||
|
||||
# Make each object file depend on the common dependencies.
|
||||
$$(foreach obj, $$($(1)_OBJS), $$(eval $$(obj): $$($(1)_COMMON_DEPS) $$(GLOBAL_COMMON_DEPS)))
|
||||
|
||||
# Make each object file have order-only dependencies on the common
|
||||
# order-only dependencies. This includes the order-only dependencies
|
||||
# of libraries we're depending on.
|
||||
$(1)_ORDER_AFTER_CLOSED = $$($(1)_ORDER_AFTER) $$(foreach lib, $$($(1)_LIBS), $$($$(lib)_ORDER_AFTER_CLOSED))
|
||||
|
||||
$$(foreach obj, $$($(1)_OBJS), $$(eval $$(obj): | $$($(1)_ORDER_AFTER_CLOSED) $$(GLOBAL_ORDER_AFTER)))
|
||||
|
||||
# Include .dep files, if they exist.
|
||||
$(1)_DEPS := $$(foreach fn, $$($(1)_OBJS), $$(call filename-to-dep, $$(fn)))
|
||||
-include $$($(1)_DEPS)
|
||||
|
||||
programs-list += $$($(1)_PATH)
|
||||
clean-files += $$($(1)_PATH) $$(_d)/*.o $$(_d)/.*.dep $$($(1)_DEPS) $$($(1)_OBJS)
|
||||
|
||||
# Phony target to run this program (typically as a dependency of 'check').
|
||||
.PHONY: $(1)_RUN
|
||||
$(1)_RUN: $$($(1)_PATH)
|
||||
$(trace-test) $$($(1)_ENV) $$($(1)_PATH)
|
||||
|
||||
endef
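
# Hypothetical usage sketch (names invented for illustration): a program
# declared against the variables documented above, linked against the
# libexample library from the earlier sketch.
exampletool_DIR := src/exampletool
exampletool_SOURCES := src/exampletool/main.cc
exampletool_LIBS := libexample
exampletool_LDFLAGS += -pthread

$(eval $(call build-program,exampletool))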
|
@ -1,38 +0,0 @@
|
||||
#!/usr/bin/env bash
|
||||
|
||||
set -eu -o pipefail
|
||||
|
||||
red=""
|
||||
green=""
|
||||
yellow=""
|
||||
normal=""
|
||||
|
||||
test=$1
|
||||
|
||||
dir="$(dirname "${BASH_SOURCE[0]}")"
|
||||
source "$dir/common-test.sh"
|
||||
|
||||
post_run_msg="ran test $test..."
|
||||
if [ -t 1 ]; then
|
||||
red="[31;1m"
|
||||
green="[32;1m"
|
||||
yellow="[33;1m"
|
||||
normal="[m"
|
||||
fi
|
||||
|
||||
run_test () {
|
||||
log="$(run "$test" 2>&1)" && status=0 || status=$?
|
||||
}
|
||||
|
||||
run_test
|
||||
|
||||
if [[ "$status" = 0 ]]; then
|
||||
echo "$post_run_msg [${green}PASS$normal]"
|
||||
elif [[ "$status" = 77 ]]; then
|
||||
echo "$post_run_msg [${yellow}SKIP$normal]"
|
||||
else
|
||||
echo "$post_run_msg [${red}FAIL$normal]"
|
||||
# shellcheck disable=SC2001
|
||||
echo "$log" | sed 's/^/ /'
|
||||
exit "$status"
|
||||
fi
|
@ -1,19 +0,0 @@
|
||||
template-files :=
|
||||
|
||||
# Create the file $(1) from $(1).in by running config.status (which
|
||||
# substitutes all ‘@var@’ variables set by the configure script).
|
||||
define instantiate-template
|
||||
|
||||
clean-files += $(1)
|
||||
|
||||
endef
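
# Hypothetical usage sketch: a file generated from a config.status template
# would be registered like this (the path is invented for illustration).
$(eval $(call instantiate-template, doc/example.conf))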
|
||||
|
||||
ifneq ($(MAKECMDGOALS), clean)
|
||||
|
||||
$(buildprefix)%.h: %.h.in $(buildprefix)config.status
|
||||
$(trace-gen) rm -f $@ && cd $(buildprefixrel) && ./config.status --quiet --header=$(@:$(buildprefix)%=%)
|
||||
|
||||
$(buildprefix)%: %.in $(buildprefix)config.status
|
||||
$(trace-gen) rm -f $@ && cd $(buildprefixrel) && ./config.status --quiet --file=$(@:$(buildprefix)%=%)
|
||||
|
||||
endif
|
30
mk/tests.mk
@ -1,30 +0,0 @@
|
||||
# Run program $1 as part of ‘make installcheck’.
|
||||
|
||||
test-deps =
|
||||
|
||||
define run-bash
|
||||
|
||||
.PHONY: $1
|
||||
$1: $2
|
||||
@env BASH=$(bash) $(bash) $3 < /dev/null
|
||||
|
||||
endef
|
||||
|
||||
define run-test
|
||||
|
||||
$(eval $(call run-bash,$1.test,$1 $(test-deps),mk/run-test.sh $1))
|
||||
$(eval $(call run-bash,$1.test-debug,$1 $(test-deps),mk/debug-test.sh $1))
|
||||
|
||||
endef
|
||||
|
||||
define run-test-group
|
||||
|
||||
.PHONY: $1.test-group
|
||||
|
||||
endef
|
||||
|
||||
.PHONY: check installcheck
|
||||
|
||||
print-top-help += \
|
||||
echo " check: Run unit tests"; \
|
||||
echo " installcheck: Run functional tests";
|
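# Hypothetical usage sketch for the run-test helper defined above: registering
# a test script creates both a "<name>.test" and a "<name>.test-debug" phony
# target (the script path is invented for illustration).
$(eval $(call run-test,tests/functional/example.sh))

# The test can then be run with, e.g., "make tests/functional/example.sh.test".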
@ -1,18 +0,0 @@
|
||||
V ?= 0
|
||||
|
||||
ifeq ($(V), 0)
|
||||
|
||||
trace-gen = @echo " GEN " $@;
|
||||
trace-cc = @echo " CC " $@;
|
||||
trace-cxx = @echo " CXX " $@;
|
||||
trace-ld = @echo " LD " $@;
|
||||
trace-ar = @echo " AR " $@;
|
||||
trace-install = @echo " INST " $@;
|
||||
trace-mkdir = @echo " MKDIR " $@;
|
||||
trace-test = @echo " TEST " $@;
|
||||
trace-sh = @echo " SH " $@;
|
||||
trace-jq = @echo " JQ " $@;
|
||||
|
||||
suppress = @
|
||||
|
||||
endif
|
362
package.nix
@ -1,362 +0,0 @@
|
||||
{ lib
|
||||
, stdenv
|
||||
, releaseTools
|
||||
, autoconf-archive
|
||||
, autoreconfHook
|
||||
, aws-sdk-cpp
|
||||
, boehmgc
|
||||
, nlohmann_json
|
||||
, bison
|
||||
, boost
|
||||
, brotli
|
||||
, bzip2
|
||||
, curl
|
||||
, editline
|
||||
, readline
|
||||
, flex
|
||||
, git
|
||||
, gtest
|
||||
, jq
|
||||
, libarchive
|
||||
, libcpuid
|
||||
, libgit2
|
||||
, libseccomp
|
||||
, libsodium
|
||||
, man
|
||||
, lowdown
|
||||
, mdbook
|
||||
, mdbook-linkcheck
|
||||
, mercurial
|
||||
, openssh
|
||||
, openssl
|
||||
, pkg-config
|
||||
, rapidcheck
|
||||
, sqlite
|
||||
, toml11
|
||||
, unixtools
|
||||
, xz
|
||||
|
||||
, busybox-sandbox-shell ? null
|
||||
|
||||
# Configuration Options
|
||||
#:
|
||||
# This probably seems like too many degrees of freedom, but it
|
||||
# faithfully reflects how the underlying configure + make build system
|
||||
# works. The top-level flake.nix will choose useful combinations of these
|
||||
# options to run in CI.
|
||||
|
||||
, pname ? "nix"
|
||||
|
||||
, version
|
||||
, versionSuffix
|
||||
|
||||
# Whether to build Nix. Useful to skip for tasks like testing existing pre-built versions of Nix
|
||||
, doBuild ? true
|
||||
|
||||
# Run the unit tests as part of the build. See `installUnitTests` for an
|
||||
# alternative to this.
|
||||
, doCheck ? __forDefaults.canRunInstalled
|
||||
|
||||
# Run the functional tests as part of the build.
|
||||
, doInstallCheck ? test-client != null || __forDefaults.canRunInstalled
|
||||
|
||||
# Check test coverage of Nix. Probably want to use with at least
|
||||
# one of `doCheck` or `doInstallCheck` enabled.
|
||||
, withCoverageChecks ? false
|
||||
|
||||
# Whether to build the regular manual
|
||||
, enableManual ? __forDefaults.canRunInstalled
|
||||
|
||||
# Whether to use garbage collection for the Nix language evaluator.
|
||||
#
|
||||
# If it is disabled, we just leak memory, but this is not as bad as it
|
||||
# sounds so long as evaluation just takes places within short-lived
|
||||
# processes. (When the process exits, the memory is reclaimed; it is
|
||||
# only leaked *within* the process.)
|
||||
#
|
||||
# Temporarily disabled on Windows because the `GC_throw_bad_alloc`
|
||||
# symbol is missing during linking.
|
||||
, enableGC ? !stdenv.hostPlatform.isWindows
|
||||
|
||||
# Whether to enable Markdown rendering in the Nix binary.
|
||||
, enableMarkdown ? !stdenv.hostPlatform.isWindows
|
||||
|
||||
# Which interactive line editor library to use for Nix's repl.
|
||||
#
|
||||
# Currently supported choices are:
|
||||
#
|
||||
# - editline (default)
|
||||
# - readline
|
||||
, readlineFlavor ? if stdenv.hostPlatform.isWindows then "readline" else "editline"
|
||||
|
||||
# Whether to install unit tests. This is useful when cross compiling
|
||||
# since we cannot run them natively during the build, but can do so
|
||||
# later.
|
||||
, installUnitTests ? doBuild && !__forDefaults.canExecuteHost
|
||||
|
||||
# For running the functional tests against a pre-built Nix. Probably
|
||||
# want to use in conjunction with `doBuild = false;`.
|
||||
, test-daemon ? null
|
||||
, test-client ? null
|
||||
|
||||
# Avoid setting things that would interfere with a functioning devShell
|
||||
, forDevShell ? false
|
||||
|
||||
# Not a real argument, just the only way to approximate let-binding some
|
||||
# stuff for argument defaults.
|
||||
, __forDefaults ? {
|
||||
canExecuteHost = stdenv.buildPlatform.canExecute stdenv.hostPlatform;
|
||||
canRunInstalled = doBuild && __forDefaults.canExecuteHost;
|
||||
}
|
||||
}:
|
||||
|
||||
let
|
||||
inherit (lib) fileset;
|
||||
|
||||
# selected attributes with defaults, will be used to define some
|
||||
# things which should instead be gotten via `finalAttrs` in order to
|
||||
# work with overriding.
|
||||
attrs = {
|
||||
inherit doBuild doCheck doInstallCheck;
|
||||
};
|
||||
|
||||
mkDerivation =
|
||||
if withCoverageChecks
|
||||
then
|
||||
# TODO support `finalAttrs` args function in
|
||||
# `releaseTools.coverageAnalysis`.
|
||||
argsFun:
|
||||
releaseTools.coverageAnalysis (let args = argsFun args; in args)
|
||||
else stdenv.mkDerivation;
|
||||
in
|
||||
|
||||
mkDerivation (finalAttrs: let
|
||||
|
||||
inherit (finalAttrs)
|
||||
doCheck
|
||||
doInstallCheck
|
||||
;
|
||||
|
||||
doBuild = !finalAttrs.dontBuild;
|
||||
|
||||
# Either running the unit tests during the build, or installing them
|
||||
# to be run later, requires the unit tests to be built.
|
||||
buildUnitTests = doCheck || installUnitTests;
|
||||
|
||||
in {
|
||||
inherit pname version;
|
||||
|
||||
src =
|
||||
let
|
||||
baseFiles = fileset.fileFilter (f: f.name != ".gitignore") ./.;
|
||||
in
|
||||
fileset.toSource {
|
||||
root = ./.;
|
||||
fileset = fileset.intersection baseFiles (fileset.unions ([
|
||||
# For configure
|
||||
./.version
|
||||
./configure.ac
|
||||
./m4
|
||||
# TODO: do we really need README.md? It doesn't seem used in the build.
|
||||
./README.md
|
||||
# This could be put behind a conditional
|
||||
./maintainers/local.mk
|
||||
# For make, regardless of what we are building
|
||||
./local.mk
|
||||
./Makefile
|
||||
./Makefile.config.in
|
||||
./mk
|
||||
(fileset.fileFilter (f: lib.strings.hasPrefix "nix-profile" f.name) ./scripts)
|
||||
] ++ lib.optionals doBuild [
|
||||
./doc
|
||||
./misc
|
||||
./precompiled-headers.h
|
||||
(fileset.difference ./src ./src/perl)
|
||||
./COPYING
|
||||
./scripts/local.mk
|
||||
] ++ lib.optionals enableManual [
|
||||
./doc/manual
|
||||
] ++ lib.optionals doInstallCheck [
|
||||
./tests/functional
|
||||
]));
|
||||
};
|
||||
|
||||
VERSION_SUFFIX = versionSuffix;
|
||||
|
||||
outputs = [ "out" ]
|
||||
++ lib.optional doBuild "dev"
|
||||
# If we are doing just build or just docs, the one thing will use
|
||||
# "out". We only need additional outputs if we are doing both.
|
||||
++ lib.optional (doBuild && enableManual) "doc"
|
||||
++ lib.optional installUnitTests "check"
|
||||
++ lib.optional doCheck "testresults"
|
||||
;
|
||||
|
||||
nativeBuildInputs = [
|
||||
autoconf-archive
|
||||
autoreconfHook
|
||||
pkg-config
|
||||
] ++ lib.optionals doBuild [
|
||||
bison
|
||||
flex
|
||||
] ++ lib.optionals enableManual [
|
||||
(lib.getBin lowdown)
|
||||
mdbook
|
||||
mdbook-linkcheck
|
||||
] ++ lib.optionals doInstallCheck [
|
||||
git
|
||||
mercurial
|
||||
openssh
|
||||
] ++ lib.optionals (doInstallCheck || enableManual) [
|
||||
jq # Also for custom mdBook preprocessor.
|
||||
] ++ lib.optionals enableManual [
|
||||
man
|
||||
] ++ lib.optional stdenv.hostPlatform.isStatic unixtools.hexdump
|
||||
;
|
||||
|
||||
buildInputs = lib.optionals doBuild (
|
||||
[
|
||||
brotli
|
||||
bzip2
|
||||
curl
|
||||
libarchive
|
||||
libgit2
|
||||
libsodium
|
||||
openssl
|
||||
sqlite
|
||||
toml11
|
||||
xz
|
||||
({ inherit readline editline; }.${readlineFlavor})
|
||||
] ++ lib.optionals enableMarkdown [
|
||||
lowdown
|
||||
] ++ lib.optionals buildUnitTests [
|
||||
gtest
|
||||
rapidcheck
|
||||
] ++ lib.optional stdenv.isLinux libseccomp
|
||||
++ lib.optional stdenv.hostPlatform.isx86_64 libcpuid
|
||||
# There have been issues building these dependencies
|
||||
++ lib.optional (stdenv.hostPlatform == stdenv.buildPlatform && (stdenv.isLinux || stdenv.isDarwin))
|
||||
aws-sdk-cpp
|
||||
);
|
||||
|
||||
propagatedBuildInputs = lib.optionals doBuild ([
|
||||
boost
|
||||
nlohmann_json
|
||||
] ++ lib.optional enableGC boehmgc
|
||||
);
|
||||
|
||||
dontBuild = !attrs.doBuild;
|
||||
doCheck = attrs.doCheck;
|
||||
|
||||
configureFlags = [
|
||||
(lib.enableFeature doBuild "build")
|
||||
(lib.enableFeature buildUnitTests "unit-tests")
|
||||
(lib.enableFeature doInstallCheck "functional-tests")
|
||||
(lib.enableFeature enableManual "doc-gen")
|
||||
(lib.enableFeature enableGC "gc")
|
||||
(lib.enableFeature enableMarkdown "markdown")
|
||||
(lib.enableFeature installUnitTests "install-unit-tests")
|
||||
(lib.withFeatureAs true "readline-flavor" readlineFlavor)
|
||||
] ++ lib.optionals (!forDevShell) [
|
||||
"--sysconfdir=/etc"
|
||||
] ++ lib.optionals installUnitTests [
|
||||
"--with-check-bin-dir=${builtins.placeholder "check"}/bin"
|
||||
"--with-check-lib-dir=${builtins.placeholder "check"}/lib"
|
||||
] ++ lib.optionals (doBuild) [
|
||||
"--with-boost=${boost}/lib"
|
||||
] ++ lib.optionals (doBuild && stdenv.isLinux) [
|
||||
"--with-sandbox-shell=${busybox-sandbox-shell}/bin/busybox"
|
||||
] ++ lib.optional (doBuild && stdenv.isLinux && !(stdenv.hostPlatform.isStatic && stdenv.system == "aarch64-linux"))
|
||||
"LDFLAGS=-fuse-ld=gold"
|
||||
++ lib.optional (doBuild && stdenv.hostPlatform.isStatic) "--enable-embedded-sandbox-shell"
|
||||
;
|
||||
|
||||
enableParallelBuilding = true;
|
||||
|
||||
makeFlags = "profiledir=$(out)/etc/profile.d PRECOMPILE_HEADERS=1";
|
||||
|
||||
preCheck = ''
|
||||
mkdir $testresults
|
||||
'';
|
||||
|
||||
installTargets = lib.optional doBuild "install";
|
||||
|
||||
installFlags = "sysconfdir=$(out)/etc";
|
||||
|
||||
# In this case we are probably just running tests, and so there isn't
|
||||
# anything to install; we just make an empty directory to signify tests
|
||||
# succeeded.
|
||||
installPhase = if finalAttrs.installTargets != [] then null else ''
|
||||
mkdir -p $out
|
||||
'';
|
||||
|
||||
postInstall = lib.optionalString doBuild (
|
||||
lib.optionalString stdenv.hostPlatform.isStatic ''
|
||||
mkdir -p $out/nix-support
|
||||
echo "file binary-dist $out/bin/nix" >> $out/nix-support/hydra-build-products
|
||||
''
|
||||
) + lib.optionalString enableManual ''
|
||||
mkdir -p ''${!outputDoc}/nix-support
|
||||
echo "doc manual ''${!outputDoc}/share/doc/nix/manual" >> ''${!outputDoc}/nix-support/hydra-build-products
|
||||
'';
|
||||
|
||||
# So the check output gets links for DLLs in the out output.
|
||||
preFixup = lib.optionalString (stdenv.hostPlatform.isWindows && builtins.elem "check" finalAttrs.outputs) ''
|
||||
ln -s "$check/lib/"*.dll "$check/bin"
|
||||
ln -s "$out/bin/"*.dll "$check/bin"
|
||||
'';
|
||||
|
||||
doInstallCheck = attrs.doInstallCheck;
|
||||
|
||||
installCheckFlags = "sysconfdir=$(out)/etc";
|
||||
# Work around buggy detection in stdenv.
|
||||
installCheckTarget = "installcheck";
|
||||
|
||||
# Work around weird bug where it doesn't think there is a Makefile.
|
||||
installCheckPhase = if (!doBuild && doInstallCheck) then ''
|
||||
runHook preInstallCheck
|
||||
mkdir -p src/nix-channel
|
||||
make installcheck -j$NIX_BUILD_CORES -l$NIX_BUILD_CORES
|
||||
'' else null;
|
||||
|
||||
# Needed for tests if we are not doing a build, but testing existing
|
||||
# built Nix.
|
||||
preInstallCheck =
|
||||
lib.optionalString (! doBuild) ''
|
||||
mkdir -p src/nix-channel
|
||||
'';
|
||||
|
||||
separateDebugInfo = !stdenv.hostPlatform.isStatic;
|
||||
|
||||
# TODO Always true after https://github.com/NixOS/nixpkgs/issues/318564
|
||||
strictDeps = !withCoverageChecks;
|
||||
|
||||
hardeningDisable = lib.optional stdenv.hostPlatform.isStatic "pie";
|
||||
|
||||
meta = {
|
||||
platforms = lib.platforms.unix ++ lib.platforms.windows;
|
||||
mainProgram = "nix";
|
||||
broken = !(lib.all (a: a) [
|
||||
# We cannot run or install unit tests if we don't build them or
|
||||
# Nix proper (which they depend on).
|
||||
(installUnitTests -> doBuild)
|
||||
(doCheck -> doBuild)
|
||||
# The build process for the manual currently requires extracting
|
||||
# data from the Nix executable we are trying to document.
|
||||
(enableManual -> doBuild)
|
||||
]);
|
||||
};
|
||||
|
||||
} // lib.optionalAttrs withCoverageChecks {
|
||||
lcovFilter = [ "*/boost/*" "*-tab.*" ];
|
||||
|
||||
hardeningDisable = ["fortify"];
|
||||
|
||||
NIX_CFLAGS_COMPILE = "-DCOVERAGE=1";
|
||||
|
||||
dontInstall = false;
|
||||
} // lib.optionalAttrs (test-daemon != null) {
|
||||
NIX_DAEMON_PACKAGE = test-daemon;
|
||||
} // lib.optionalAttrs (test-client != null) {
|
||||
NIX_CLIENT_PACKAGE = test-client;
|
||||
})
|
@ -25,11 +25,6 @@ in
|
||||
version = baseVersion + versionSuffix;
|
||||
inherit versionSuffix;
|
||||
|
||||
nix = callPackage ../package.nix {
|
||||
version = fineVersion;
|
||||
versionSuffix = fineVersionSuffix;
|
||||
};
|
||||
|
||||
nix-util = callPackage ../src/libutil/package.nix { };
|
||||
nix-util-c = callPackage ../src/libutil-c/package.nix { };
|
||||
nix-util-test-support = callPackage ../src/libutil-test-support/package.nix { };
|
||||
@ -66,6 +61,5 @@ in
|
||||
|
||||
nix-perl-bindings = callPackage ../src/perl/package.nix { };
|
||||
|
||||
# Will replace `nix` once the old build system is gone.
|
||||
nix-ng = callPackage ../packaging/everything.nix { };
|
||||
nix-everything = callPackage ../packaging/everything.nix { };
|
||||
}
|
||||
|
@ -70,6 +70,9 @@ let
|
||||
pkgs.buildPackages.meson
|
||||
pkgs.buildPackages.ninja
|
||||
] ++ prevAttrs.nativeBuildInputs or [];
|
||||
mesonCheckFlags = prevAttrs.mesonCheckFlags or [] ++ [
|
||||
"--print-errorlogs"
|
||||
];
|
||||
};
|
||||
|
||||
mesonBuildLayer = finalAttrs: prevAttrs:
|
||||
|
128
packaging/dev-shell.nix
Normal file
@ -0,0 +1,128 @@
|
||||
{ lib, devFlake }:
|
||||
|
||||
{ pkgs }:
|
||||
|
||||
pkgs.nixComponents.nix-util.overrideAttrs (attrs:
|
||||
|
||||
let
|
||||
stdenv = pkgs.nixDependencies.stdenv;
|
||||
buildCanExecuteHost = stdenv.buildPlatform.canExecute stdenv.hostPlatform;
|
||||
modular = devFlake.getSystem stdenv.buildPlatform.system;
|
||||
transformFlag = prefix: flag:
|
||||
assert builtins.isString flag;
|
||||
let
|
||||
rest = builtins.substring 2 (builtins.stringLength flag) flag;
|
||||
in
|
||||
"-D${prefix}:${rest}";
|
||||
havePerl = stdenv.buildPlatform == stdenv.hostPlatform && stdenv.hostPlatform.isUnix;
|
||||
ignoreCrossFile = flags: builtins.filter (flag: !(lib.strings.hasInfix "cross-file" flag)) flags;
|
||||
in {
|
||||
pname = "shell-for-" + attrs.pname;
|
||||
|
||||
# Remove the version suffix to avoid unnecessary attempts to substitute in nix develop
|
||||
version = lib.fileContents ../.version;
|
||||
name = attrs.pname;
|
||||
|
||||
installFlags = "sysconfdir=$(out)/etc";
|
||||
shellHook = ''
|
||||
PATH=$prefix/bin:$PATH
|
||||
unset PYTHONPATH
|
||||
export MANPATH=$out/share/man:$MANPATH
|
||||
|
||||
# Make bash completion work.
|
||||
XDG_DATA_DIRS+=:$out/share
|
||||
|
||||
# Make the default phases do the right thing.
|
||||
# FIXME: this wouldn't be needed if the ninja package set buildPhase() instead of $buildPhase.
|
||||
# FIXME: mesonConfigurePhase shouldn't cd to the build directory. It would be better to pass '-C <dir>' to ninja.
|
||||
|
||||
cdToBuildDir() {
|
||||
if [[ ! -e build.ninja ]]; then
|
||||
cd build
|
||||
fi
|
||||
}
|
||||
|
||||
configurePhase() {
|
||||
mesonConfigurePhase
|
||||
}
|
||||
|
||||
buildPhase() {
|
||||
cdToBuildDir
|
||||
ninjaBuildPhase
|
||||
}
|
||||
|
||||
checkPhase() {
|
||||
cdToBuildDir
|
||||
mesonCheckPhase
|
||||
}
|
||||
|
||||
installPhase() {
|
||||
cdToBuildDir
|
||||
ninjaInstallPhase
|
||||
}
|
||||
'';
|
||||
|
||||
# We use this shell with the local checkout, not unpackPhase.
|
||||
src = null;
|
||||
|
||||
env = {
|
||||
# Needed for Meson to find Boost.
|
||||
# https://github.com/NixOS/nixpkgs/issues/86131.
|
||||
BOOST_INCLUDEDIR = "${lib.getDev pkgs.nixDependencies.boost}/include";
|
||||
BOOST_LIBRARYDIR = "${lib.getLib pkgs.nixDependencies.boost}/lib";
|
||||
# For `make format`, to work without installing pre-commit
|
||||
_NIX_PRE_COMMIT_HOOKS_CONFIG =
|
||||
"${(pkgs.formats.yaml { }).generate "pre-commit-config.yaml" modular.pre-commit.settings.rawConfig}";
|
||||
};
|
||||
|
||||
mesonFlags =
|
||||
map (transformFlag "libutil") (ignoreCrossFile pkgs.nixComponents.nix-util.mesonFlags)
|
||||
++ map (transformFlag "libstore") (ignoreCrossFile pkgs.nixComponents.nix-store.mesonFlags)
|
||||
++ map (transformFlag "libfetchers") (ignoreCrossFile pkgs.nixComponents.nix-fetchers.mesonFlags)
|
||||
++ lib.optionals havePerl (map (transformFlag "perl") (ignoreCrossFile pkgs.nixComponents.nix-perl-bindings.mesonFlags))
|
||||
++ map (transformFlag "libexpr") (ignoreCrossFile pkgs.nixComponents.nix-expr.mesonFlags)
|
||||
++ map (transformFlag "libcmd") (ignoreCrossFile pkgs.nixComponents.nix-cmd.mesonFlags)
|
||||
;
|
||||
|
||||
nativeBuildInputs = attrs.nativeBuildInputs or []
|
||||
++ pkgs.nixComponents.nix-util.nativeBuildInputs
|
||||
++ pkgs.nixComponents.nix-store.nativeBuildInputs
|
||||
++ pkgs.nixComponents.nix-fetchers.nativeBuildInputs
|
||||
++ pkgs.nixComponents.nix-expr.nativeBuildInputs
|
||||
++ lib.optionals havePerl pkgs.nixComponents.nix-perl-bindings.nativeBuildInputs
|
||||
++ lib.optionals buildCanExecuteHost pkgs.nixComponents.nix-manual.externalNativeBuildInputs
|
||||
++ pkgs.nixComponents.nix-internal-api-docs.nativeBuildInputs
|
||||
++ pkgs.nixComponents.nix-external-api-docs.nativeBuildInputs
|
||||
++ pkgs.nixComponents.nix-functional-tests.externalNativeBuildInputs
|
||||
++ lib.optional
|
||||
(!buildCanExecuteHost
|
||||
# Hack around https://github.com/nixos/nixpkgs/commit/bf7ad8cfbfa102a90463433e2c5027573b462479
|
||||
&& !(stdenv.hostPlatform.isWindows && stdenv.buildPlatform.isDarwin)
|
||||
&& stdenv.hostPlatform.emulatorAvailable pkgs.buildPackages
|
||||
&& lib.meta.availableOn stdenv.buildPlatform (stdenv.hostPlatform.emulator pkgs.buildPackages))
|
||||
pkgs.buildPackages.mesonEmulatorHook
|
||||
++ [
|
||||
pkgs.buildPackages.cmake
|
||||
pkgs.buildPackages.shellcheck
|
||||
pkgs.buildPackages.changelog-d
|
||||
modular.pre-commit.settings.package
|
||||
(pkgs.writeScriptBin "pre-commit-hooks-install"
|
||||
modular.pre-commit.settings.installationScript)
|
||||
]
|
||||
# TODO: Remove the darwin check once
|
||||
# https://github.com/NixOS/nixpkgs/pull/291814 is available
|
||||
++ lib.optional (stdenv.cc.isClang && !stdenv.buildPlatform.isDarwin) pkgs.buildPackages.bear
|
||||
++ lib.optional (stdenv.cc.isClang && stdenv.hostPlatform == stdenv.buildPlatform) (lib.hiPrio pkgs.buildPackages.clang-tools);
|
||||
|
||||
buildInputs = attrs.buildInputs or []
|
||||
++ pkgs.nixComponents.nix-util.buildInputs
|
||||
++ pkgs.nixComponents.nix-store.buildInputs
|
||||
++ pkgs.nixComponents.nix-store-tests.externalBuildInputs
|
||||
++ pkgs.nixComponents.nix-fetchers.buildInputs
|
||||
++ pkgs.nixComponents.nix-expr.buildInputs
|
||||
++ pkgs.nixComponents.nix-expr.externalPropagatedBuildInputs
|
||||
++ pkgs.nixComponents.nix-cmd.buildInputs
|
||||
++ lib.optionals havePerl pkgs.nixComponents.nix-perl-bindings.externalBuildInputs
|
||||
++ lib.optional havePerl pkgs.perl
|
||||
;
|
||||
})
|
@ -5,12 +5,10 @@
|
||||
|
||||
nix-util,
|
||||
nix-util-c,
|
||||
nix-util-test-support,
|
||||
nix-util-tests,
|
||||
|
||||
nix-store,
|
||||
nix-store-c,
|
||||
nix-store-test-support,
|
||||
nix-store-tests,
|
||||
|
||||
nix-fetchers,
|
||||
@ -18,7 +16,6 @@
|
||||
|
||||
nix-expr,
|
||||
nix-expr-c,
|
||||
nix-expr-test-support,
|
||||
nix-expr-tests,
|
||||
|
||||
nix-flake,
|
||||
@ -38,45 +35,80 @@
|
||||
nix-external-api-docs,
|
||||
|
||||
nix-perl-bindings,
|
||||
|
||||
testers,
|
||||
runCommand,
|
||||
}:
|
||||
|
||||
let
|
||||
dev = stdenv.mkDerivation (finalAttrs: {
|
||||
name = "nix-${nix-cli.version}-dev";
|
||||
pname = "nix";
|
||||
version = nix-cli.version;
|
||||
dontUnpack = true;
|
||||
dontBuild = true;
|
||||
libs = map lib.getDev [
|
||||
nix-cmd
|
||||
nix-expr
|
||||
nix-expr-c
|
||||
nix-fetchers
|
||||
nix-flake
|
||||
nix-main
|
||||
nix-main-c
|
||||
nix-store
|
||||
nix-store-c
|
||||
nix-util
|
||||
nix-util-c
|
||||
nix-perl-bindings
|
||||
];
|
||||
installPhase = ''
|
||||
mkdir -p $out/nix-support
|
||||
echo $libs >> $out/nix-support/propagated-build-inputs
|
||||
'';
|
||||
passthru = {
|
||||
tests = {
|
||||
pkg-config =
|
||||
testers.hasPkgConfigModules {
|
||||
package = finalAttrs.finalPackage;
|
||||
};
|
||||
};
|
||||
|
||||
# If we were to fully emulate output selection here, we'd confuse the Nix CLIs,
|
||||
# because they rely on `drvPath`.
|
||||
dev = finalAttrs.finalPackage.out;
|
||||
|
||||
libs = throw "`nix.dev.libs` is not meant to be used; use `nix.libs` instead.";
|
||||
};
|
||||
meta = {
|
||||
pkgConfigModules = [
|
||||
"nix-cmd"
|
||||
"nix-expr"
|
||||
"nix-expr-c"
|
||||
"nix-fetchers"
|
||||
"nix-flake"
|
||||
"nix-main"
|
||||
"nix-main-c"
|
||||
"nix-store"
|
||||
"nix-store-c"
|
||||
"nix-util"
|
||||
"nix-util-c"
|
||||
];
|
||||
};
|
||||
});
|
||||
devdoc = buildEnv {
|
||||
name = "nix-${nix-cli.version}-devdoc";
|
||||
paths = [
|
||||
nix-internal-api-docs
|
||||
nix-external-api-docs
|
||||
];
|
||||
};
|
||||
|
||||
in
|
||||
(buildEnv {
|
||||
name = "nix-${nix-cli.version}";
|
||||
paths = [
|
||||
nix-util
|
||||
nix-util-c
|
||||
nix-util-test-support
|
||||
nix-util-tests
|
||||
|
||||
nix-store
|
||||
nix-store-c
|
||||
nix-store-test-support
|
||||
nix-store-tests
|
||||
|
||||
nix-fetchers
|
||||
nix-fetchers-tests
|
||||
|
||||
nix-expr
|
||||
nix-expr-c
|
||||
nix-expr-test-support
|
||||
nix-expr-tests
|
||||
|
||||
nix-flake
|
||||
nix-flake-tests
|
||||
|
||||
nix-main
|
||||
nix-main-c
|
||||
|
||||
nix-cmd
|
||||
|
||||
nix-cli
|
||||
|
||||
nix-manual
|
||||
nix-internal-api-docs
|
||||
nix-external-api-docs
|
||||
|
||||
] ++ lib.optionals (stdenv.buildPlatform.canExecute stdenv.hostPlatform) [
|
||||
nix-perl-bindings
|
||||
nix-manual.man
|
||||
];
|
||||
|
||||
meta.mainProgram = "nix";
|
||||
@ -85,16 +117,31 @@
|
||||
doInstallCheck = true;
|
||||
|
||||
checkInputs = [
|
||||
# Actually run the unit tests too
|
||||
# Make sure the unit tests have passed
|
||||
nix-util-tests.tests.run
|
||||
nix-store-tests.tests.run
|
||||
nix-expr-tests.tests.run
|
||||
nix-fetchers-tests.tests.run
|
||||
nix-flake-tests.tests.run
|
||||
];
|
||||
|
||||
# dev bundle is ok
|
||||
# (checkInputs must be empty paths??)
|
||||
(runCommand "check-pkg-config" { checked = dev.tests.pkg-config; } "mkdir $out")
|
||||
] ++
|
||||
(if stdenv.buildPlatform.canExecute stdenv.hostPlatform
|
||||
then [
|
||||
# TODO: add perl.tests
|
||||
nix-perl-bindings
|
||||
]
|
||||
else [
|
||||
nix-perl-bindings
|
||||
]);
|
||||
installCheckInputs = [
|
||||
nix-functional-tests
|
||||
];
|
||||
passthru = prevAttrs.passthru // {
|
||||
inherit (nix-cli) version;
|
||||
|
||||
/**
|
||||
These are the libraries that are part of the Nix project. They are used
|
||||
by the Nix CLI and other tools.
|
||||
@ -126,5 +173,26 @@
|
||||
nix-main-c
|
||||
;
|
||||
};
|
||||
|
||||
tests = prevAttrs.passthru.tests or {} // {
|
||||
# TODO: create a proper fixpoint and:
|
||||
# pkg-config =
|
||||
# testers.hasPkgConfigModules {
|
||||
# package = finalPackage;
|
||||
# };
|
||||
};
|
||||
|
||||
/**
|
||||
A derivation referencing the `dev` outputs of the Nix libraries.
|
||||
*/
|
||||
inherit dev;
|
||||
inherit devdoc;
|
||||
doc = nix-manual;
|
||||
outputs = [ "out" "dev" "devdoc" "doc" ];
|
||||
all = lib.attrValues (lib.genAttrs finalAttrs.passthru.outputs (outName: finalAttrs.finalPackage.${outName}));
|
||||
};
|
||||
meta = prevAttrs.meta // {
|
||||
description = "The Nix package manager";
|
||||
pkgConfigModules = dev.meta.pkgConfigModules;
|
||||
};
|
||||
})
|
||||
|
@ -16,32 +16,23 @@ let
|
||||
inherit tarballs;
|
||||
};
|
||||
|
||||
testNixVersions = pkgs: client: daemon:
|
||||
pkgs.nixComponents.callPackage ../package.nix {
|
||||
testNixVersions = pkgs: daemon:
|
||||
pkgs.nixComponents.nix-functional-tests.override {
|
||||
pname =
|
||||
"nix-tests"
|
||||
+ lib.optionalString
|
||||
(lib.versionAtLeast daemon.version "2.4pre20211005" &&
|
||||
lib.versionAtLeast client.version "2.4pre20211005")
|
||||
"-${client.version}-against-${daemon.version}";
|
||||
lib.versionAtLeast pkgs.nix.version "2.4pre20211005")
|
||||
"-${pkgs.nix.version}-against-${daemon.version}";
|
||||
|
||||
test-client = client;
|
||||
test-daemon = daemon;
|
||||
|
||||
doBuild = false;
|
||||
|
||||
# This could be more accurate, but a shorter version will match the
|
||||
# fine version with rev. This functionality is already covered in
|
||||
# the normal test, so it's fine.
|
||||
version = pkgs.nixComponents.version;
|
||||
versionSuffix = pkgs.nixComponents.versionSuffix;
|
||||
};
|
||||
|
||||
# Technically we could just return `pkgs.nixComponents`, but for Hydra it's
|
||||
# convention to transpose it, and to transpose it efficiently, we need to
|
||||
# enumerate them manually, so that we don't evaluate unnecessary package sets.
|
||||
forAllPackages = lib.genAttrs [
|
||||
"nix"
|
||||
"nix-everything"
|
||||
"nix-util"
|
||||
"nix-util-c"
|
||||
"nix-util-test-support"
|
||||
@ -63,7 +54,6 @@ let
|
||||
"nix-cmd"
|
||||
"nix-cli"
|
||||
"nix-functional-tests"
|
||||
"nix-ng"
|
||||
];
|
||||
in
|
||||
{
|
||||
@ -71,7 +61,9 @@ in
|
||||
build = forAllPackages (pkgName:
|
||||
forAllSystems (system: nixpkgsFor.${system}.native.nixComponents.${pkgName}));
|
||||
|
||||
shellInputs = forAllSystems (system: self.devShells.${system}.default.inputDerivation);
|
||||
shellInputs = removeAttrs
|
||||
(forAllSystems (system: self.devShells.${system}.default.inputDerivation))
|
||||
[ "i686-linux" ];
|
||||
|
||||
buildStatic = forAllPackages (pkgName:
|
||||
lib.genAttrs linux64BitSystems (system: nixpkgsFor.${system}.static.nixComponents.${pkgName}));
|
||||
@ -82,20 +74,28 @@ in
|
||||
(forAllCrossSystems (crossSystem:
|
||||
lib.genAttrs [ "x86_64-linux" ] (system: nixpkgsFor.${system}.cross.${crossSystem}.nixComponents.${pkgName}))));
|
||||
|
||||
buildNoGc = forAllSystems (system:
|
||||
self.packages.${system}.nix.override { enableGC = false; }
|
||||
);
|
||||
buildNoGc = let
|
||||
components = forAllSystems (system:
|
||||
nixpkgsFor.${system}.native.nixComponents.overrideScope (self: super: {
|
||||
nix-expr = super.nix-expr.override { enableGC = false; };
|
||||
})
|
||||
);
|
||||
in forAllPackages (pkgName: forAllSystems (system: components.${system}.${pkgName}));
|
||||
|
||||
buildNoTests = forAllSystems (system: nixpkgsFor.${system}.native.nixComponents.nix-cli);
|
||||
|
||||
# Toggles some settings for better coverage. Windows needs these
|
||||
# library combinations, and Debian builds Nix with GNU readline too.
|
||||
buildReadlineNoMarkdown = forAllSystems (system:
|
||||
self.packages.${system}.nix.override {
|
||||
enableMarkdown = false;
|
||||
readlineFlavor = "readline";
|
||||
}
|
||||
);
|
||||
buildReadlineNoMarkdown = let
|
||||
components = forAllSystems (system:
|
||||
nixpkgsFor.${system}.native.nixComponents.overrideScope (self: super: {
|
||||
nix-cmd = super.nix-cmd.override {
|
||||
enableMarkdown = false;
|
||||
readlineFlavor = "readline";
|
||||
};
|
||||
})
|
||||
);
|
||||
in forAllPackages (pkgName: forAllSystems (system: components.${system}.${pkgName}));
|
||||
|
||||
# Perl bindings for various platforms.
|
||||
perlBindings = forAllSystems (system: nixpkgsFor.${system}.native.nixComponents.nix-perl-bindings);
|
||||
@ -140,11 +140,11 @@ in
|
||||
# docker image with Nix inside
|
||||
dockerImage = lib.genAttrs linux64BitSystems (system: self.packages.${system}.dockerImage);
|
||||
|
||||
# Line coverage analysis.
|
||||
coverage = nixpkgsFor.x86_64-linux.native.nix.override {
|
||||
pname = "nix-coverage";
|
||||
withCoverageChecks = true;
|
||||
};
|
||||
# # Line coverage analysis.
|
||||
# coverage = nixpkgsFor.x86_64-linux.native.nix.override {
|
||||
# pname = "nix-coverage";
|
||||
# withCoverageChecks = true;
|
||||
# };
|
||||
|
||||
# Nix's manual
|
||||
manual = nixpkgsFor.x86_64-linux.native.nixComponents.nix-manual;
|
||||
@ -181,7 +181,7 @@ in
|
||||
import (nixpkgs + "/lib/tests/test-with-nix.nix")
|
||||
{
|
||||
lib = nixpkgsFor.${system}.native.lib;
|
||||
nix = self.packages.${system}.nix;
|
||||
nix = self.packages.${system}.nix-cli;
|
||||
pkgs = nixpkgsFor.${system}.native;
|
||||
}
|
||||
);
|
||||
@ -196,15 +196,15 @@ in
|
||||
let pkgs = nixpkgsFor.${system}.native; in
|
||||
pkgs.runCommand "install-tests"
|
||||
{
|
||||
againstSelf = testNixVersions pkgs pkgs.nix pkgs.pkgs.nix;
|
||||
againstSelf = testNixVersions pkgs pkgs.nix;
|
||||
againstCurrentLatest =
|
||||
# FIXME: temporarily disable this on macOS because of #3605.
|
||||
if system == "x86_64-linux"
|
||||
then testNixVersions pkgs pkgs.nix pkgs.nixVersions.latest
|
||||
then testNixVersions pkgs pkgs.nixVersions.latest
|
||||
else null;
|
||||
# Disabled because the latest stable version doesn't handle
|
||||
# `NIX_DAEMON_SOCKET_PATH` which is required for the tests to work
|
||||
# againstLatestStable = testNixVersions pkgs pkgs.nix pkgs.nixStable;
|
||||
# againstLatestStable = testNixVersions pkgs pkgs.nixStable;
|
||||
} "touch $out");
|
||||
|
||||
installerTests = import ../tests/installer {
|
||||
|
@ -1,27 +0,0 @@
|
||||
#! /usr/bin/env bash
|
||||
|
||||
set -e
|
||||
|
||||
echo "Nix version:"
|
||||
nix --version
|
||||
|
||||
cd flake-regressions
|
||||
|
||||
status=0
|
||||
|
||||
flakes=$(find tests -mindepth 3 -maxdepth 3 -type d -not -path '*/.*' | sort | head -n25)
|
||||
|
||||
echo "Running flake tests..."
|
||||
|
||||
for flake in $flakes; do
|
||||
|
||||
if ! REGENERATE=0 ./eval-flake.sh "$flake"; then
|
||||
status=1
|
||||
echo "❌ $flake"
|
||||
else
|
||||
echo "✅ $flake"
|
||||
fi
|
||||
|
||||
done
|
||||
|
||||
exit "$status"
|
@ -96,6 +96,9 @@ poly_configure_nix_daemon_service() {
|
||||
if [ -e /run/systemd/system ]; then
|
||||
task "Setting up the nix-daemon systemd service"
|
||||
|
||||
_sudo "to create parent of the nix-daemon tmpfiles config" \
|
||||
mkdir -p "$(dirname "$TMPFILES_DEST")"
|
||||
|
||||
_sudo "to create the nix-daemon tmpfiles config" \
|
||||
ln -sfn "/nix/var/nix/profiles/default$TMPFILES_SRC" "$TMPFILES_DEST"
|
||||
|
||||
|
@ -1,13 +0,0 @@
|
||||
nix_noinst_scripts := \
|
||||
$(d)/nix-profile.sh
|
||||
|
||||
noinst-scripts += $(nix_noinst_scripts)
|
||||
|
||||
profiledir = $(sysconfdir)/profile.d
|
||||
|
||||
$(eval $(call install-file-as, $(d)/nix-profile.sh, $(profiledir)/nix.sh, 0644))
|
||||
$(eval $(call install-file-as, $(d)/nix-profile.fish, $(profiledir)/nix.fish, 0644))
|
||||
$(eval $(call install-file-as, $(d)/nix-profile-daemon.sh, $(profiledir)/nix-daemon.sh, 0644))
|
||||
$(eval $(call install-file-as, $(d)/nix-profile-daemon.fish, $(profiledir)/nix-daemon.fish, 0644))
|
||||
|
||||
clean-files += $(nix_noinst_scripts)
|
@ -41,7 +41,7 @@ INPUT = \
|
||||
@src@/src/libutil-c \
|
||||
@src@/src/libexpr-c \
|
||||
@src@/src/libstore-c \
|
||||
@src@/doc/external-api/README.md
|
||||
@src@/src/external-api-docs/README.md
|
||||
|
||||
FILE_PATTERNS = nix_api_*.h *.md
|
||||
|
||||
@ -55,6 +55,8 @@ EXCLUDE_PATTERNS = *_internal.h
|
||||
GENERATE_TREEVIEW = YES
|
||||
OPTIMIZE_OUTPUT_FOR_C = YES
|
||||
|
||||
USE_MDFILE_AS_MAINPAGE = doc/external-api/README.md
|
||||
USE_MDFILE_AS_MAINPAGE = @src@/src/external-api-docs/README.md
|
||||
|
||||
WARN_IF_UNDOCUMENTED = NO
|
||||
WARN_IF_INCOMPLETE_DOC = NO
|
||||
QUIET = YES
|
||||
|
@ -43,8 +43,8 @@ INPUT = \
|
||||
@src@/libexpr/flake \
|
||||
@src@/libexpr-tests \
|
||||
@src@/libexpr-tests/value \
|
||||
@src@/libexpr-test-support/test \
|
||||
@src@/libexpr-test-support/test/value \
|
||||
@src@/libexpr-test-support/tests \
|
||||
@src@/libexpr-test-support/tests/value \
|
||||
@src@/libexpr/value \
|
||||
@src@/libfetchers \
|
||||
@src@/libmain \
|
||||
@ -52,10 +52,11 @@ INPUT = \
|
||||
@src@/libstore/build \
|
||||
@src@/libstore/builtins \
|
||||
@src@/libstore-tests \
|
||||
@src@/libstore-test-support/test \
|
||||
@src@/libstore-test-support/tests \
|
||||
@src@/libutil \
|
||||
@src@/libutil/args \
|
||||
@src@/libutil-tests \
|
||||
@src@/libutil-test-support/test \
|
||||
@src@/libutil-test-support/tests \
|
||||
@src@/nix \
|
||||
@src@/nix-env \
|
||||
@src@/nix-store
|
||||
@ -83,7 +84,9 @@ EXPAND_ONLY_PREDEF = YES
|
||||
# RECURSIVE has no effect here.
|
||||
# This tag requires that the tag SEARCH_INCLUDES is set to YES.
|
||||
|
||||
INCLUDE_PATH =
|
||||
INCLUDE_PATH = \
|
||||
@BUILD_ROOT@/src/libexpr/libnixexpr.so.p \
|
||||
@BUILD_ROOT@/src/nix/nix.p \
|
||||
|
||||
# If the MACRO_EXPANSION and EXPAND_ONLY_PREDEF tags are set to YES then this
|
||||
# tag can be used to specify a list of macro names that should be expanded. The
|
||||
@ -96,7 +99,18 @@ EXPAND_AS_DEFINED = \
|
||||
DECLARE_COMMON_SERIALISER \
|
||||
DECLARE_WORKER_SERIALISER \
|
||||
DECLARE_SERVE_SERIALISER \
|
||||
LENGTH_PREFIXED_PROTO_HELPER
|
||||
LENGTH_PREFIXED_PROTO_HELPER \
|
||||
LENGTH_PREFIXED_PROTO_HELPER_X \
|
||||
WORKER_USE_LENGTH_PREFIX_SERIALISER \
|
||||
WORKER_USE_LENGTH_PREFIX_SERIALISER_COMMA \
|
||||
SERVE_USE_LENGTH_PREFIX_SERIALISER \
|
||||
SERVE_USE_LENGTH_PREFIX_SERIALISER_COMMA \
|
||||
COMMON_METHODS \
|
||||
JSON_IMPL \
|
||||
MakeBinOp
|
||||
|
||||
PREDEFINED = DOXYGEN_SKIP
|
||||
|
||||
WARN_IF_UNDOCUMENTED = NO
|
||||
WARN_IF_INCOMPLETE_DOC = NO
|
||||
QUIET = YES
|
||||
|
@ -12,6 +12,7 @@ doxygen_cfg = configure_file(
|
||||
configuration : {
|
||||
'PROJECT_NUMBER': meson.project_version(),
|
||||
'OUTPUT_DIRECTORY' : meson.current_build_dir(),
|
||||
'BUILD_ROOT' : meson.build_root(),
|
||||
'src' : fs.parent(fs.parent(meson.project_source_root())) / 'src',
|
||||
},
|
||||
)
|
||||
|
@ -1,3 +1,4 @@
|
||||
#include <algorithm>
|
||||
#include <nlohmann/json.hpp>
|
||||
|
||||
#include "command.hh"
|
||||
@ -9,8 +10,7 @@
|
||||
#include "profiles.hh"
|
||||
#include "repl.hh"
|
||||
#include "strings.hh"
|
||||
|
||||
extern char * * environ __attribute__((weak));
|
||||
#include "environment-variables.hh"
|
||||
|
||||
namespace nix {
|
||||
|
||||
@ -23,7 +23,8 @@ nix::Commands RegisterCommand::getCommandsFor(const std::vector<std::string> & p
|
||||
if (name.size() == prefix.size() + 1) {
|
||||
bool equal = true;
|
||||
for (size_t i = 0; i < prefix.size(); ++i)
|
||||
if (name[i] != prefix[i]) equal = false;
|
||||
if (name[i] != prefix[i])
|
||||
equal = false;
|
||||
if (equal)
|
||||
res.insert_or_assign(name[prefix.size()], command);
|
||||
}
|
||||
@ -42,16 +43,16 @@ void NixMultiCommand::run()
|
||||
std::set<std::string> subCommandTextLines;
|
||||
for (auto & [name, _] : commands)
|
||||
subCommandTextLines.insert(fmt("- `%s`", name));
|
||||
std::string markdownError = fmt("`nix %s` requires a sub-command. Available sub-commands:\n\n%s\n",
|
||||
commandName, concatStringsSep("\n", subCommandTextLines));
|
||||
std::string markdownError =
|
||||
fmt("`nix %s` requires a sub-command. Available sub-commands:\n\n%s\n",
|
||||
commandName,
|
||||
concatStringsSep("\n", subCommandTextLines));
|
||||
throw UsageError(renderMarkdownToTerminal(markdownError));
|
||||
}
|
||||
command->second->run();
|
||||
}
|
||||
|
||||
StoreCommand::StoreCommand()
|
||||
{
|
||||
}
|
||||
StoreCommand::StoreCommand() {}
|
||||
|
||||
ref<Store> StoreCommand::getStore()
|
||||
{
|
||||
@ -126,10 +127,8 @@ ref<Store> EvalCommand::getEvalStore()
|
||||
ref<EvalState> EvalCommand::getEvalState()
|
||||
{
|
||||
if (!evalState) {
|
||||
evalState =
|
||||
std::allocate_shared<EvalState>(
|
||||
traceable_allocator<EvalState>(),
|
||||
lookupPath, getEvalStore(), fetchSettings, evalSettings, getStore());
|
||||
evalState = std::allocate_shared<EvalState>(
|
||||
traceable_allocator<EvalState>(), lookupPath, getEvalStore(), fetchSettings, evalSettings, getStore());
|
||||
|
||||
evalState->repair = repair;
|
||||
|
||||
@ -144,7 +143,8 @@ MixOperateOnOptions::MixOperateOnOptions()
|
||||
{
|
||||
addFlag({
|
||||
.longName = "derivation",
|
||||
.description = "Operate on the [store derivation](@docroot@/glossary.md#gloss-store-derivation) rather than its outputs.",
|
||||
.description =
|
||||
"Operate on the [store derivation](@docroot@/glossary.md#gloss-store-derivation) rather than its outputs.",
|
||||
.category = installablesCategory,
|
||||
.handler = {&operateOn, OperateOn::Derivation},
|
||||
});
|
||||
@ -179,30 +179,34 @@ BuiltPathsCommand::BuiltPathsCommand(bool recursive)
|
||||
|
||||
void BuiltPathsCommand::run(ref<Store> store, Installables && installables)
|
||||
{
|
||||
BuiltPaths paths;
|
||||
BuiltPaths rootPaths, allPaths;
|
||||
|
||||
if (all) {
|
||||
if (installables.size())
|
||||
throw UsageError("'--all' does not expect arguments");
|
||||
// XXX: Only uses opaque paths, ignores all the realisations
|
||||
for (auto & p : store->queryAllValidPaths())
|
||||
paths.emplace_back(BuiltPath::Opaque{p});
|
||||
rootPaths.emplace_back(BuiltPath::Opaque{p});
|
||||
allPaths = rootPaths;
|
||||
} else {
|
||||
paths = Installable::toBuiltPaths(getEvalStore(), store, realiseMode, operateOn, installables);
|
||||
rootPaths = Installable::toBuiltPaths(getEvalStore(), store, realiseMode, operateOn, installables);
|
||||
allPaths = rootPaths;
|
||||
|
||||
if (recursive) {
|
||||
// XXX: This only computes the store path closure, ignoring
|
||||
// intermediate realisations
|
||||
StorePathSet pathsRoots, pathsClosure;
|
||||
for (auto & root : paths) {
|
||||
for (auto & root : rootPaths) {
|
||||
auto rootFromThis = root.outPaths();
|
||||
pathsRoots.insert(rootFromThis.begin(), rootFromThis.end());
|
||||
}
|
||||
store->computeFSClosure(pathsRoots, pathsClosure);
|
||||
for (auto & path : pathsClosure)
|
||||
paths.emplace_back(BuiltPath::Opaque{path});
|
||||
allPaths.emplace_back(BuiltPath::Opaque{path});
|
||||
}
|
||||
}
|
||||
|
||||
run(store, std::move(paths));
|
||||
run(store, std::move(allPaths), std::move(rootPaths));
|
||||
}
|
||||
|
||||
StorePathsCommand::StorePathsCommand(bool recursive)
|
||||
@ -210,10 +214,10 @@ StorePathsCommand::StorePathsCommand(bool recursive)
|
||||
{
|
||||
}
|
||||
|
||||
void StorePathsCommand::run(ref<Store> store, BuiltPaths && paths)
|
||||
void StorePathsCommand::run(ref<Store> store, BuiltPaths && allPaths, BuiltPaths && rootPaths)
|
||||
{
|
||||
StorePathSet storePaths;
|
||||
for (auto & builtPath : paths)
|
||||
for (auto & builtPath : allPaths)
|
||||
for (auto & p : builtPath.outPaths())
|
||||
storePaths.insert(p);
|
||||
|
||||
@ -233,46 +237,48 @@ void StorePathCommand::run(ref<Store> store, StorePaths && storePaths)
|
||||
|
||||
MixProfile::MixProfile()
|
||||
{
|
||||
addFlag({
|
||||
.longName = "profile",
|
||||
.description = "The profile to operate on.",
|
||||
.labels = {"path"},
|
||||
.handler = {&profile},
|
||||
.completer = completePath
|
||||
});
|
||||
addFlag(
|
||||
{.longName = "profile",
|
||||
.description = "The profile to operate on.",
|
||||
.labels = {"path"},
|
||||
.handler = {&profile},
|
||||
.completer = completePath});
|
||||
}
|
||||
|
||||
void MixProfile::updateProfile(const StorePath & storePath)
|
||||
{
|
||||
if (!profile) return;
|
||||
auto store = getStore().dynamic_pointer_cast<LocalFSStore>();
|
||||
if (!store) throw Error("'--profile' is not supported for this Nix store");
|
||||
if (!profile)
|
||||
return;
|
||||
auto store = getDstStore().dynamic_pointer_cast<LocalFSStore>();
|
||||
if (!store)
|
||||
throw Error("'--profile' is not supported for this Nix store");
|
||||
auto profile2 = absPath(*profile);
|
||||
switchLink(profile2,
|
||||
createGeneration(*store, profile2, storePath));
|
||||
switchLink(profile2, createGeneration(*store, profile2, storePath));
|
||||
}
|
||||
|
||||
void MixProfile::updateProfile(const BuiltPaths & buildables)
|
||||
{
|
||||
if (!profile) return;
|
||||
if (!profile)
|
||||
return;
|
||||
|
||||
StorePaths result;
|
||||
|
||||
for (auto & buildable : buildables) {
|
||||
std::visit(overloaded {
|
||||
[&](const BuiltPath::Opaque & bo) {
|
||||
result.push_back(bo.path);
|
||||
std::visit(
|
||||
overloaded{
|
||||
[&](const BuiltPath::Opaque & bo) { result.push_back(bo.path); },
|
||||
[&](const BuiltPath::Built & bfd) {
|
||||
for (auto & output : bfd.outputs) {
|
||||
result.push_back(output.second);
|
||||
}
|
||||
},
|
||||
},
|
||||
[&](const BuiltPath::Built & bfd) {
|
||||
for (auto & output : bfd.outputs) {
|
||||
result.push_back(output.second);
|
||||
}
|
||||
},
|
||||
}, buildable.raw());
|
||||
buildable.raw());
|
||||
}
|
||||
|
||||
if (result.size() != 1)
|
||||
throw UsageError("'--profile' requires that the arguments produce a single store path, but there are %d", result.size());
throw UsageError(
"'--profile' requires that the arguments produce a single store path, but there are %d", result.size());

updateProfile(result[0]);
}
@ -282,50 +288,111 @@ MixDefaultProfile::MixDefaultProfile()
profile = getDefaultProfile();
}

MixEnvironment::MixEnvironment() : ignoreEnvironment(false)
MixEnvironment::MixEnvironment()
: ignoreEnvironment(false)
{
addFlag({
.longName = "ignore-environment",
.longName = "ignore-env",
.aliases = {"ignore-environment"},
.shortName = 'i',
.description = "Clear the entire environment (except those specified with `--keep`).",
.description = "Clear the entire environment, except for those specified with `--keep-env-var`.",
.category = environmentVariablesCategory,
.handler = {&ignoreEnvironment, true},
});

addFlag({
.longName = "keep",
.longName = "keep-env-var",
.aliases = {"keep"},
.shortName = 'k',
.description = "Keep the environment variable *name*.",
.description = "Keep the environment variable *name*, when using `--ignore-env`.",
.category = environmentVariablesCategory,
.labels = {"name"},
.handler = {[&](std::string s) { keep.insert(s); }},
.handler = {[&](std::string s) { keepVars.insert(s); }},
});

addFlag({
.longName = "unset",
.longName = "unset-env-var",
.aliases = {"unset"},
.shortName = 'u',
.description = "Unset the environment variable *name*.",
.category = environmentVariablesCategory,
.labels = {"name"},
.handler = {[&](std::string s) { unset.insert(s); }},
.handler = {[&](std::string name) {
if (setVars.contains(name))
throw UsageError("Cannot unset environment variable '%s' that is set with '%s'", name, "--set-env-var");

unsetVars.insert(name);
}},
});

addFlag({
.longName = "set-env-var",
.shortName = 's',
.description = "Sets an environment variable *name* with *value*.",
.category = environmentVariablesCategory,
.labels = {"name", "value"},
.handler = {[&](std::string name, std::string value) {
if (unsetVars.contains(name))
throw UsageError(
"Cannot set environment variable '%s' that is unset with '%s'", name, "--unset-env-var");

if (setVars.contains(name))
throw UsageError(
"Duplicate definition of environment variable '%s' with '%s' is ambiguous", name, "--set-env-var");

setVars.insert_or_assign(name, value);
}},
});
}

void MixEnvironment::setEnviron() {
if (ignoreEnvironment) {
if (!unset.empty())
throw UsageError("--unset does not make sense with --ignore-environment");
void MixEnvironment::setEnviron()
{
if (ignoreEnvironment && !unsetVars.empty())
throw UsageError("--unset-env-var does not make sense with --ignore-env");

for (const auto & var : keep) {
auto val = getenv(var.c_str());
if (val) stringsEnv.emplace_back(fmt("%s=%s", var.c_str(), val));
}
if (!ignoreEnvironment && !keepVars.empty())
throw UsageError("--keep-env-var does not make sense without --ignore-env");

vectorEnv = stringsToCharPtrs(stringsEnv);
environ = vectorEnv.data();
} else {
if (!keep.empty())
throw UsageError("--keep does not make sense without --ignore-environment");
auto env = getEnv();

for (const auto & var : unset)
unsetenv(var.c_str());
if (ignoreEnvironment)
std::erase_if(env, [&](const auto & var) { return !keepVars.contains(var.first); });

for (const auto & [name, value] : setVars)
env[name] = value;

if (!unsetVars.empty())
std::erase_if(env, [&](const auto & var) { return unsetVars.contains(var.first); });

replaceEnv(env);

return;
}
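The rewritten setEnviron() above applies the renamed flags in a fixed order: with `--ignore-env`, everything outside `--keep-env-var` is dropped first, then `--set-env-var` assignments are applied, and finally `--unset-env-var` removals. A minimal standalone sketch of that ordering, using plain standard containers in place of libutil's getEnv()/replaceEnv(); the function name rewriteEnv is invented for illustration and is not part of this change:

#include <map>
#include <set>
#include <string>

// Sketch only: mirrors the ordering used by MixEnvironment::setEnviron() above.
std::map<std::string, std::string> rewriteEnv(
    std::map<std::string, std::string> env,              // current environment
    bool ignoreEnvironment,                               // --ignore-env
    const std::set<std::string> & keepVars,               // --keep-env-var
    const std::map<std::string, std::string> & setVars,   // --set-env-var
    const std::set<std::string> & unsetVars)              // --unset-env-var
{
    if (ignoreEnvironment)
        std::erase_if(env, [&](const auto & var) { return !keepVars.contains(var.first); });
    for (const auto & [name, value] : setVars)
        env[name] = value;
    for (const auto & name : unsetVars)
        env.erase(name);
    return env;
}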

void createOutLinks(const std::filesystem::path & outLink, const BuiltPaths & buildables, LocalFSStore & store)
{
for (const auto & [_i, buildable] : enumerate(buildables)) {
auto i = _i;
std::visit(
overloaded{
[&](const BuiltPath::Opaque & bo) {
auto symlink = outLink;
if (i)
symlink += fmt("-%d", i);
store.addPermRoot(bo.path, absPath(symlink.string()));
},
[&](const BuiltPath::Built & bfd) {
for (auto & output : bfd.outputs) {
auto symlink = outLink;
if (i)
symlink += fmt("-%d", i);
if (output.first != "out")
symlink += fmt("-%s", output.first);
store.addPermRoot(output.second, absPath(symlink.string()));
}
},
},
buildable.raw());
}
}
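createOutLinks() above derives one symlink name per buildable and per output: the i-th buildable (for i > 0) gets a "-<i>" suffix, and any output other than "out" additionally gets a "-<output>" suffix. A small illustrative helper showing just that naming rule; outLinkName is a hypothetical name, not part of the Nix API:

#include <string>

// Illustration only: the out-link suffix rule used by createOutLinks() above.
std::string outLinkName(const std::string & base, size_t i, const std::string & outputName)
{
    std::string name = base;
    if (i)
        name += "-" + std::to_string(i);
    if (outputName != "out")
        name += "-" + outputName;
    return name;
}

// e.g. outLinkName("result", 0, "out") == "result"
//      outLinkName("result", 2, "dev") == "result-2-dev"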

@ -13,18 +13,20 @@ namespace nix {

extern std::string programPath;

extern char * * savedArgv;
extern char ** savedArgv;

class EvalState;
struct Pos;
class Store;
class LocalFSStore;

static constexpr Command::Category catHelp = -1;
static constexpr Command::Category catSecondary = 100;
static constexpr Command::Category catUtility = 101;
static constexpr Command::Category catNixInstallation = 102;

static constexpr auto installablesCategory = "Options that change the interpretation of [installables](@docroot@/command-ref/new-cli/nix.md#installables)";
static constexpr auto installablesCategory =
"Options that change the interpretation of [installables](@docroot@/command-ref/new-cli/nix.md#installables)";

struct NixMultiCommand : MultiCommand, virtual Command
{
@ -45,7 +47,20 @@ struct StoreCommand : virtual Command
{
StoreCommand();
void run() override;

/**
* Return the default Nix store.
*/
ref<Store> getStore();

/**
* Return the destination Nix store.
*/
virtual ref<Store> getDstStore()
{
return getStore();
}

virtual ref<Store> createStore();
/**
* Main entry point, with a `Store` provided
@ -68,7 +83,7 @@ struct CopyCommand : virtual StoreCommand

ref<Store> createStore() override;

ref<Store> getDstStore();
ref<Store> getDstStore() override;
};

/**
@ -112,7 +127,9 @@ struct MixFlakeOptions : virtual Args, EvalCommand
* arguments) so that the completions for these flags can use them.
*/
virtual std::vector<FlakeRef> getFlakeRefsForCompletion()
{ return {}; }
{
return {};
}
};

struct SourceExprCommand : virtual Args, MixFlakeOptions
@ -122,11 +139,9 @@ struct SourceExprCommand : virtual Args, MixFlakeOptions

SourceExprCommand();

Installables parseInstallables(
ref<Store> store, std::vector<std::string> ss);
Installables parseInstallables(ref<Store> store, std::vector<std::string> ss);

ref<Installable> parseInstallable(
ref<Store> store, const std::string & installable);
ref<Installable> parseInstallable(ref<Store> store, const std::string & installable);

virtual Strings getDefaultFlakeAttrPaths();

@ -238,7 +253,7 @@ public:

BuiltPathsCommand(bool recursive = false);

virtual void run(ref<Store> store, BuiltPaths && paths) = 0;
virtual void run(ref<Store> store, BuiltPaths && allPaths, BuiltPaths && rootPaths) = 0;

void run(ref<Store> store, Installables && installables) override;

@ -251,7 +266,7 @@ struct StorePathsCommand : public BuiltPathsCommand

virtual void run(ref<Store> store, StorePaths && storePaths) = 0;

void run(ref<Store> store, BuiltPaths && paths) override;
void run(ref<Store> store, BuiltPaths && allPaths, BuiltPaths && rootPaths) override;
};

/**
@ -272,10 +287,10 @@ struct RegisterCommand
typedef std::map<std::vector<std::string>, std::function<ref<Command>()>> Commands;
static Commands * commands;

RegisterCommand(std::vector<std::string> && name,
std::function<ref<Command>()> command)
RegisterCommand(std::vector<std::string> && name, std::function<ref<Command>()> command)
{
if (!commands) commands = new Commands;
if (!commands)
commands = new Commands;
commands->emplace(name, command);
}

@ -285,13 +300,13 @@ struct RegisterCommand
template<class T>
static RegisterCommand registerCommand(const std::string & name)
{
return RegisterCommand({name}, [](){ return make_ref<T>(); });
return RegisterCommand({name}, []() { return make_ref<T>(); });
}

template<class T>
static RegisterCommand registerCommand2(std::vector<std::string> && name)
{
return RegisterCommand(std::move(name), [](){ return make_ref<T>(); });
return RegisterCommand(std::move(name), []() { return make_ref<T>(); });
}
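For context, registerCommand and registerCommand2 are how individual subcommands add themselves to the global registry via a static initializer. A hypothetical registration following that pattern might look like the sketch below; CmdFrobnicate is an invented example (not a real Nix command), and it assumes this command.hh header is on the include path:

#include "command.hh"

using namespace nix;

// Hypothetical example command, only to illustrate registerCommand<T>() above.
struct CmdFrobnicate : Command
{
    std::string description() override
    {
        return "frobnicate the given thing";
    }

    void run() override
    {
        // real commands do their work here
    }
};

// Static initializer adds the command to the registry at program startup.
static auto rCmdFrobnicate = registerCommand<CmdFrobnicate>("frobnicate");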

struct MixProfile : virtual StoreCommand
@ -313,19 +328,21 @@ struct MixDefaultProfile : MixProfile
MixDefaultProfile();
};

struct MixEnvironment : virtual Args {
struct MixEnvironment : virtual Args
{

StringSet keep, unset;
Strings stringsEnv;
std::vector<char*> vectorEnv;
StringSet keepVars;
StringSet unsetVars;
std::map<std::string, std::string> setVars;
bool ignoreEnvironment;

MixEnvironment();

/***
* Modify global environ based on `ignoreEnvironment`, `keep`, and
* `unset`. It's expected that exec will be called before this class
* goes out of scope, otherwise `environ` will become invalid.
* Modify global environ based on `ignoreEnvironment`, `keep`,
* `unset`, and `added`. It's expected that exec will be called
* before this class goes out of scope, otherwise `environ` will
* become invalid.
*/
void setEnviron();
};
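The comment above implies a specific usage pattern: the rewritten environment is owned by the MixEnvironment object, so callers are expected to exec immediately after setEnviron(). A hedged sketch of that pattern follows; the helper name and arguments are invented for illustration and error handling is minimal:

#include <unistd.h>
#include "command.hh" // assumed include for MixEnvironment / SysError

// Sketch only: exec after rewriting the environment, as the comment above expects.
[[noreturn]] void execWithRewrittenEnv(nix::MixEnvironment & cmd, const char * program, char * const * args)
{
    cmd.setEnviron();      // applies --ignore-env / --keep-env-var / --set-env-var / --unset-env-var
    execvp(program, args); // replaces the process image; returns only on error
    throw nix::SysError("unable to execute '%s'", program);
}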

@ -349,9 +366,12 @@ void completeFlakeRefWithFragment(
std::string showVersions(const std::set<std::string> & versions);

void printClosureDiff(
ref<Store> store,
const StorePath & beforePath,
const StorePath & afterPath,
std::string_view indent);
ref<Store> store, const StorePath & beforePath, const StorePath & afterPath, std::string_view indent);

/**
* Create symlinks prefixed by `outLink` to the store paths in
* `buildables`.
*/
void createOutLinks(const std::filesystem::path & outLink, const BuiltPaths & buildables, LocalFSStore & store);

}

@ -29,13 +29,13 @@ EvalSettings evalSettings {
{
{
"flake",
[](ref<Store> store, std::string_view rest) {
[](EvalState & state, std::string_view rest) {
experimentalFeatureSettings.require(Xp::Flakes);
// FIXME `parseFlakeRef` should take a `std::string_view`.
auto flakeRef = parseFlakeRef(fetchSettings, std::string { rest }, {}, true, false);
debug("fetching flake search path element '%s''", rest);
auto storePath = flakeRef.resolve(store).fetchTree(store).first;
return store->toRealPath(storePath);
auto storePath = flakeRef.resolve(state.store).fetchTree(state.store).first;
return state.rootPath(state.store->toRealPath(storePath));
},
},
},
@ -32,16 +32,6 @@ InstallableDerivedPath InstallableDerivedPath::parse(
// store path.
[&](const ExtendedOutputsSpec::Default &) -> DerivedPath {
auto storePath = store->followLinksToStorePath(prefix);
// Remove this prior to stabilizing the new CLI.
if (storePath.isDerivation()) {
auto oldDerivedPath = DerivedPath::Built {
.drvPath = makeConstantStorePathRef(storePath),
.outputs = OutputsSpec::All { },
};
warn(
"The interpretation of store paths arguments ending in `.drv` recently changed. If this command is now failing try again with '%s'",
oldDerivedPath.to_string(*store));
};
return DerivedPath::Opaque {
.path = std::move(storePath),
};

@ -26,7 +26,7 @@ struct ExtraPathInfoFlake : ExtraPathInfoValue
Flake flake;

ExtraPathInfoFlake(Value && v, Flake && f)
: ExtraPathInfoValue(std::move(v)), flake(f)
: ExtraPathInfoValue(std::move(v)), flake(std::move(f))
{ }
};

@ -59,7 +59,7 @@ struct ExtraPathInfoValue : ExtraPathInfo
Value value;

ExtraPathInfoValue(Value && v)
: value(v)
: value(std::move(v))
{ }

virtual ~ExtraPathInfoValue() = default;
@ -857,6 +857,7 @@ std::vector<FlakeRef> RawInstallablesCommand::getFlakeRefsForCompletion()
{
applyDefaultInstallables(rawInstallables);
std::vector<FlakeRef> res;
res.reserve(rawInstallables.size());
for (auto i : rawInstallables)
res.push_back(parseFlakeRefWithFragment(
fetchSettings,
@ -917,4 +918,12 @@ void BuiltPathsCommand::applyDefaultInstallables(std::vector<std::string> & rawI
rawInstallables.push_back(".");
}

BuiltPaths toBuiltPaths(const std::vector<BuiltPathWithResult> & builtPathsWithResult)
{
BuiltPaths res;
for (auto & i : builtPathsWithResult)
res.push_back(i.path);
return res;
}

}

@ -86,6 +86,8 @@ struct BuiltPathWithResult
std::optional<BuildResult> result;
};

BuiltPaths toBuiltPaths(const std::vector<BuiltPathWithResult> & builtPathsWithResult);

/**
* Shorthand, for less typing and helping us keep the choice of
* collection in sync.
@ -1,15 +0,0 @@
libraries += libcmd

libcmd_NAME = libnixcmd

libcmd_DIR := $(d)

libcmd_SOURCES := $(wildcard $(d)/*.cc)

libcmd_CXXFLAGS += $(INCLUDE_libutil) $(INCLUDE_libstore) $(INCLUDE_libfetchers) $(INCLUDE_libexpr) $(INCLUDE_libflake) $(INCLUDE_libmain)

libcmd_LDFLAGS = $(EDITLINE_LIBS) $(LOWDOWN_LIBS) $(THREAD_LDFLAGS)

libcmd_LIBS = libutil libstore libfetchers libflake libexpr libmain

$(eval $(call install-file-in, $(buildprefix)$(d)/nix-cmd.pc, $(libdir)/pkgconfig, 0644))

@ -1,9 +0,0 @@
prefix=@prefix@
libdir=@libdir@
includedir=@includedir@

Name: Nix
Description: Nix Package Manager
Version: @PACKAGE_VERSION@
Libs: -L${libdir} -lnixcmd
Cflags: -I${includedir}/nix -std=c++2a

@ -1,25 +0,0 @@
libraries += libexprc

libexprc_NAME = libnixexprc

libexprc_DIR := $(d)

libexprc_SOURCES := \
$(wildcard $(d)/*.cc) \

# Not just for this library itself, but also for downstream libraries using this library

INCLUDE_libexprc := -I $(d)
libexprc_CXXFLAGS += $(INCLUDE_libutil) $(INCLUDE_libutilc) \
$(INCLUDE_libfetchers) \
$(INCLUDE_libstore) $(INCLUDE_libstorec) \
$(INCLUDE_libexpr) $(INCLUDE_libexprc)

libexprc_LIBS = libutil libutilc libstore libstorec libfetchers libexpr

libexprc_LDFLAGS += $(THREAD_LDFLAGS)

$(eval $(call install-file-in, $(d)/nix-expr-c.pc, $(libdir)/pkgconfig, 0644))

libexprc_FORCE_INSTALL := 1
Some files were not shown because too many files have changed in this diff.