With curlOpts it is impossible to pass arguments that contain spaces;
curlOptsList supports that. Passing a list to curlOpts has been
deprecated. This commit is fully backwards compatible.
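As a hedged sketch only (the URL and hash below are placeholders, not
taken from this commit), passing an option whose value contains spaces
via curlOptsList, which a space-separated curlOpts string cannot
express:

  fetchurl {
    url = "https://example.org/archive.tar.gz";                       # placeholder
    sha256 = "0000000000000000000000000000000000000000000000000000";  # placeholder
    # each list element is passed to curl as a single argument,
    # so the embedded spaces survive
    curlOptsList = [ "--user-agent" "example agent 1.0" ];
  }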
We can use cacert to validate the server's SSL certificate when
downloading. Normally this isn't done because we already have the hash
of the output, but in the hash = "" case we don't.
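A hedged sketch of that hash = "" case (the URL is a placeholder): with
no output hash to check the download against, validating the server's
certificate against cacert is the only remaining safeguard:

  fetchurl {
    url = "https://example.org/latest.tar.gz";  # placeholder
    hash = "";                                  # no known hash; rely on certificate validation
  }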
hashed-mirrors are content-addressed, so if $outputHash is available
from a hashed mirror, the changes from ‘postFetch’ have already been
applied to it. Running postFetch in that case would apply the change
/again/, which we don’t want.
It would be nice to be able to track Nix requests. It's not trustworthy,
but can be helpful for stats and routing in HTTP logs.
Since `fetchurl` is used so widely, we should "magically" get a UA on
`fetchzip`, `fetchFromGitHub`, and other related fetchers.
Since `fetchurl` is only used for fixed-output derivations, this should
cause no mass rebuild.
User-Agent example: curl/7.57.0 Nixpkgs/18.03
Fixes #27406.
Commit 5d4efb2c81 added an assertion to `stopNest'
which requires it to be correctly paired with `startNest'. `fetchurl' calls
`stopNest' but never calls `startNest', so those `stopNest' calls are removed.
Otherwise, if the upstream mirror changes (rather than deletes) a
file, then tarballs.nixos.org won't be used even if it has a copy of
the original file, and so we'll get a hash mismatch.
This function downloads and unpacks a file in one fixed-output
derivation. This is primarily useful for dynamically generated zip
files, such as GitHub's /archive URLs, where the unpacked content of
the zip file doesn't change, but the zip file itself may (e.g. due to
minor changes in the compression algorithm, or changes in timestamps).
Fetchzip is implemented by extending fetchurl with a "postFetch" hook
that is executed after the file has been downloaded. This hook can
thus perform arbitrary checks or transformations on the downloaded
file.
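A hedged usage sketch (owner, repository, and hash are placeholders):
the hash refers to the unpacked contents, not to the zip file itself:

  fetchzip {
    url = "https://github.com/some-owner/some-repo/archive/v1.0.zip";  # placeholder /archive URL
    # hash of the *unpacked* tree, so it stays stable even if the
    # server regenerates the zip file with different timestamps
    sha256 = "0000000000000000000000000000000000000000000000000000";   # placeholder
  }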
fetchurl instantiations, instead of passing the mirrors to fetchurl
instantiations via environment variables. This makes the resulting
store derivations (.drv files) much smaller, which in turn makes
nix-env/nix-instantiate faster (4.8 -> 4.2 seconds on nix-env -qa
--out-path).
svn path=/nixpkgs/trunk/; revision=12695
The lists of mirrors for mirror:// sites can now be configured through
environment variables, e.g.
NIX_MIRRORS_gnu="ftp://ftp.nluug.nl/pub/gnu/ ftp://ftp.gnu.org/pub/gnu/"
or
NIX_MIRRORS_sourceforge="http://surfnet.dl.sourceforge.net/sourceforge/"
svn path=/nixpkgs/trunk/; revision=9302
fetchurl now supports mirror:// URLs. Instead of

  fetchurl {
    url = http://heanet.dl.sourceforge.net/sourceforge/zapping/zapping-0.9.6.tar.bz2;
    md5 = "8306775c6a11de4d72345b5eee970ea6";
  };

you can write

  fetchurl {
    url = mirror://sourceforge/zapping/zapping-0.9.6.tar.bz2;
    md5 = "8306775c6a11de4d72345b5eee970ea6";
  };
which causes fetchurl to try the SourceForge mirrors listed in the
`sourceforge' attribute in build-support/fetchurl/mirrors.nix.
(They're currently tried in sequence, and the lists of mirrors are
not configurable yet.)
The syntax for mirror URLs is mirror://site/path/to/file, where
`site' is currently one of `sourceforge', `gnu' (mirrors of
ftp://ftp.gnu.org/pub/gnu) and `kernel' (mirrors of
http://www.all.kernel.org/pub/).
svn path=/nixpkgs/trunk/; revision=9197
When fetching a file with hash HASH of type TYPE, we first try to
download <base-url>/<type>/<hash>, where <base-url> is one of a list
of mirrors. For instance, given

  src = fetchurl {
    url = http://releases.mozilla.org/pub/mozilla.org/firefox/releases/2.0.0.6/source/firefox-2.0.0.6-source.tar.bz2;
    sha1 = "eb72f55e4a8bf08e8c6ef227c0ade3d068ba1082";
  };

and the mirror list [http://nix.cs.uu.nl/dist/tarballs], we first
try to download

  http://nix.cs.uu.nl/dist/tarballs/sha1/eb72f55e4a8bf08e8c6ef227c0ade3d068ba1082

and if that fails, we use the original URL.
The list of mirrors is not yet user-configurable.
* `fetchurl' now also accepts an argument `urls' instead of `url' for
a list of alternative download locations, which fetchurl will try in
sequence.
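For example (a hedged sketch: the tarball path and hash are
placeholders, and the hosts are the GNU mirrors mentioned elsewhere in
this log):

  src = fetchurl {
    urls = [
      ftp://ftp.gnu.org/pub/gnu/foo/foo-1.0.tar.gz
      ftp://ftp.nluug.nl/pub/gnu/foo/foo-1.0.tar.gz
    ];
    md5 = "00000000000000000000000000000000";  # placeholder
  };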
svn path=/nixpkgs/trunk/; revision=9190
* Make builders unexecutable by removing the hash-bang line and
execute permission.
* Convert calls to `derivation' to `mkDerivation'.
* Remove `system' and `stdenv' attributes from calls to
`mkDerivation'. These transformations were all done automatically,
so it is quite possible I broke stuff.
* Put the `mkDerivation' function in stdenv/generic.
svn path=/nixpkgs/trunk/; revision=874
* The generic builder allows builders for typical Autoconf-style
packages to be much shorter, e.g.,

  . $stdenv/setup
  genericBuild
The generic builder does lots of stuff automatically:
- Unpacks source archives specified by $src or $srcs (it knows about
gzip, bzip2, tar, zip, and unpacked source trees).
- Determines the source tree.
- Applies patches specified by $patches.
- Fixes libtool not to search for libraries in /lib etc.
- Runs `configure'.
- Runs `make'.
- Runs `make install'.
- Strips debug information from static libraries.
- Writes nested log information (in the format accepted by
`log2xml').
There are also lots of hooks and variables to customise the generic
builder; see `stdenv/generic/docs.txt' and the sketch at the end of
this entry.
* Adapted the base packages (i.e., the ones used by stdenv) to use the
generic builder.
* We now use `curl' instead of `wget' to download files in `fetchurl'.
* Neither `curl' nor `wget' are part of stdenv. We shouldn't
encourage people to download stuff in builders (impure!).
* Updated some packages.
* `buildinputs' is now `buildInputs' (but the old name also works).
* `findInputs' in the setup script now prevents inputs from being
processed multiple times (which could happen, e.g., if an input was
a propagated input of several other inputs; this caused the size of
variables like $PATH to blow up exponentially in the worst case).
* Patched GNU Make to write nested log information in the format
accepted by `log2xml'. Also, prior to writing the build command,
Make now writes a line `building X' to indicate what is being
built. This is unfortunately often obscured by the gigantic tool
invocations in many Makefiles. The actual build commands are marked
`unimportant' so that they don't clutter pages generated by
`log2html'.
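To illustrate how `mkDerivation' and the generic builder fit together
(the sketch referenced above; a hedged sketch only, with the package
name, URL, hash, and patch as placeholders rather than a real package):

  stdenv.mkDerivation {
    name = "foo-1.0";                             # placeholder package
    builder = ./builder.sh;                       # just `. $stdenv/setup' and `genericBuild'
    src = fetchurl {
      url = http://example.org/foo-1.0.tar.gz;    # placeholder URL
      md5 = "00000000000000000000000000000000";   # placeholder hash
    };
    patches = [ ./fix-build.patch ];              # hypothetical patch, applied by the generic builder
    buildInputs = [ ];                            # extra build inputs, if any
  }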
svn path=/nixpkgs/trunk/; revision=845