musl and darwin support UTF-8 locales without any extras. As a result,
unzip can unpack UTF-8 filenames there as-is. But on glibc, when no
locale archive is present, filenames get mangled:
deps/αβ -> deps/#U03b1#U03b2
This makes `fetchzip` fixed-output derivations unstable.
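A minimal sketch of the idea behind the fix, assuming a glibcLocales-style
package can supply a UTF-8 locale (the attribute names, path and locale below
are illustrative assumptions, not necessarily what this commit does):

    {
      nativeBuildInputs = [ unzip ];

      # assumed: point glibc at a locale archive that contains UTF-8 locales
      LOCALE_ARCHIVE = "${glibcLocalesUtf8}/lib/locale/locale-archive";

      postFetch = ''
        # Without this, glibc falls back to the C locale and unzip mangles
        # non-ASCII names as shown above.
        export LC_ALL=en_US.UTF-8

        # ... existing unpack commands ...
      '';
    }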
Tested that this change causes a hash mismatch for `coq.src`, whose hash
was generated on a system that mangles UTF-8 symbols:
$ nix build -f. coq.src --rebuild -L
source> trying https://github.com/coq/coq/archive/V8.15.2.zip
source> % Total % Received % Xferd Average Speed Time Time Time Current
source> Dload Upload Total Spent Left Speed
source> 0 0 0 0 0 0 0 0 --:--:-- 0:00:01 --:--:-- 0
source> 100 8945k 100 8945k 0 0 1513k 0 0:00:05 0:00:05 --:--:-- 1989k
source> unpacking source archive /build/V8.15.2.zip
error: hash mismatch in fixed-output derivation '/nix/store/hrnyykm7wgw8vxisgq7hc2bg5gr0y6s8-source.drv':
specified: sha256-h81nFqkuvZkMR7YLHy7laTq5yOhjMW+w6rYzncxvyD4=
got: sha256-DTspmwyD3Evl1CUmvUy2MonbLGUezvsHN3prmP9eK2I=
Note: this means that some existing caches of fixed-output derivations
become incorrect. It should not break tarballs already cached on
cache.nixos.org, so the impact should not be widespread.
fetchzip downloads the file from the specified URL, renames it to the
basename of that URL, and then relies on unzip to do the unpacking.
The first consequence is that the URL must end with the proper
extension; otherwise unpacking fails. This is not always the case, and
input-fonts works around it by appending a “&.zip” query parameter
(which is obviously a hack and is not guaranteed to work with every
URL).
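For illustration, the hack looks roughly like this (URL heavily shortened,
hash shown as a placeholder):

    fetchzip {
      # the trailing "&.zip" only exists to give the URL's basename a
      # .zip extension that unzip/unpackPhase can recognize
      url = "https://input.djr.com/build/?fontSelection=fourStyleFamily&accept=I+do&email=&.zip";
      sha256 = "...";
    }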
The second consequence is that the basename of the URL must be a valid
filename. I tried to build a custom configuration of input-fonts and
got an error from mv saying the filename is too long:
> trying https://input.djr.com/build/?fontSelection=fourStyleFamily&regular=InputMonoNarrow-Regular&italic=InputMonoNarrow-Italic&bold=InputMonoNarrow-Bold&boldItalic=InputMonoNarrow-BoldItalic&a=0&g=0&i=topserif&l=serifs_round&zero=0&asterisk=height&braces=straight&preset=default&line-height=1.2&accept=I+do&email=&.zip
> % Total % Received % Xferd Average Speed Time Time Time Current
> Dload Upload Total Spent Left Speed
> 100 406k 100 406k 0 0 230k 0 0:00:01 0:00:01 --:--:-- 230k
> mv: failed to access '/build/?fontSelection=fourStyleFamily&regular=InputMonoNarrow-Regular&italic=InputMonoNarrow-Italic&bold=InputMonoNarrow-Bold&boldItalic=InputMonoNarrow-BoldItalic&a=0&g=0&i=topserif&l=serifs_round&zero=0&asterisk=height&braces=straight&preset=default&line-height=1.2&accept=I+do&email=&.zip': File name too long
We could use the “name” parameter as the filename (that’s how fetchurl
uses it). However, a previous attempt to do so (fc01353703) was
reverted (24b5eb61eb) because of the regression it introduced: many
fetchzip invocations use names without an extension (and the default
name is just “source”).
This commit adds an optional “extension” parameter. If it is set,
fetchzip renames the downloaded file to “download.${extension}”,
effectively solving both problems above without introducing a massive
regression.
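With the new parameter, a download like the input-fonts one can be written
without touching the URL (URL heavily shortened, hash shown as a placeholder):

    fetchzip {
      url = "https://input.djr.com/build/?fontSelection=fourStyleFamily&accept=I+do&email=";
      extension = "zip";   # the file is saved as download.zip before unpacking
      sha256 = "...";
    }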
This is a no-op for all existing packages.
Tested by updating my NixOS setup, plus the extra input-fonts
configuration mentioned above, plus tons of unstable Emacs packages
after a nix-collect-garbage (3 GB downloaded), with this patch applied.
4a5c49363a added some more commands after `extraPostFetch` but
concatenated them without a separating newline. This means that, since
that commit,
fetchzip { ..., extraPostFetch = ''rm -f "$out"/some-file''; }
now actually runs the following shell command
rm -f "$out"/some-filechmod -R a-w "$out"
thus deleting "$out". This is very unfortunate, especially since it
actually happens on master for all `fetchFromBitbucket` derivations.
But since the results are fixed-output, users building with the Hydra
cache enabled are not hitting this yet for derivations that have not
been updated recently.
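The fix is presumably just to join the pieces with an explicit newline; a
sketch of the intent, where unpackCommands stands in for the existing part of
postFetch:

    postFetch = unpackCommands
      + extraPostFetch
      # keep the user-supplied hook and our own chmod on separate lines
      # instead of gluing them into one shell command
      + "\n"
      + ''
        chmod -R a-w "$out"
      '';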
This is a workaround for NixOS/nix#4295, which caused single-user Linux
Nix installations using sandboxed builds to start failing to build
fetchzip derivations after 4a5c49363a.
In short: removing write permissions for the entire directory is great,
except we then can't rename(2) it to the final Nix store path out of the
sandbox, because we don't have write permission on the directory and
thus cannot update the ".." directory entry.
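A sketch of the resulting workaround (the exact commands are an assumption):
strip write permission from the fetched tree but leave it on $out itself, so
the directory can still be renamed out of the sandbox:

    postFetch = ''
      # ... unpack into $out ...

      chmod -R a-w "$out"   # make the fetched tree read-only
      chmod u+w "$out"      # but keep $out itself writable, otherwise
                            # rename(2) to the final store path fails
                            # (NixOS/nix#4295)
    '';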
This script is not needed anymore since "nix-prefetch-url --unpack
<url>" and "nix-prefetch-url -A foo.src" (where "foo.src" is a
fetchzip / fetchFromGitHub call) work fine.
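For example, prefetching a GitHub archive for a fetchzip / fetchFromGitHub
call (the URL is illustrative); the printed hash is the one the fetcher
expects, since it is computed over the unpacked tree:

    $ nix-prefetch-url --unpack https://github.com/NixOS/patchelf/archive/0.8.tar.gz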
Also fix the hash in goPackages.inflect, the only user of the fetcher ATM.
Closes #12002 (different `inflect` fix), fixes #12012.
Using fetchzip-derived functions is likely more efficient than fetchhg,
and it's lighter on dependencies (the hash is the same as with fetchhg in this case).
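Illustration of the kind of swap meant here (owner, repo, rev and hashes are
placeholders):

    # before: clone via Mercurial
    src = fetchhg {
      url = "https://bitbucket.org/owner/repo";
      rev = "1a2b3c4d5e6f";
      sha256 = "...";
    };

    # after: fetchzip-derived; downloads a single archive instead of
    # pulling in a Mercurial client
    src = fetchFromBitbucket {
      owner = "owner";
      repo = "repo";
      rev = "1a2b3c4d5e6f";
      sha256 = "...";  # identical to the fetchhg hash in this particular case
    };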
This resolves a regression introduced in fc01353703, where providing a
name without a proper extension breaks existing uses of fetchzip (they
now fail to unpack). Of particular note, that commit broke all uses of
fetchFromGitHub, because it uses a name of the form "${repo}-${rev}-src".
Fixes #5954
This fixes:
* Passing stripRoot.
* Archives containing a single file.
* Archives where the root folder has the same name as one of its children.
Fixes #5851
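The first item refers to fetchzip's stripRoot option, which controls whether
the single top-level directory of the archive is stripped; passing it looks
like this (values are placeholders):

    fetchzip {
      url = "https://example.org/files.zip";
      stripRoot = false;  # the archive has several top-level entries,
                          # so don't expect (and remove) a single root folder
      sha256 = "...";
    }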
If $src refers to a directory, then always copy it. Previously, we
checked the extension first, so if the directory had an extension like
.tar, unpackPhase would fail.
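Roughly, the dispatch becomes (a sketch, not the literal stdenv code):

    unpackFile() {
        local src="$1"
        if [ -d "$src" ]; then
            # A directory is always copied as-is, even if its name ends in
            # something like .tar that would otherwise match an archive case.
            cp -pr --reflink=auto -- "$src" "$(basename "$src")"
        else
            case "$src" in
                *.tar | *.tar.gz | *.tgz | *.tar.bz2 | *.tar.xz)
                    tar xf "$src" ;;
                *.zip)
                    unzip -qq "$src" ;;
                *)
                    echo "unknown source type: $src" >&2; return 1 ;;
            esac
        fi
    }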
This function downloads and unpacks a file in one fixed-output
derivation. This is primarily useful for dynamically generated zip
files, such as GitHub's /archive URLs, where the unpacked content of
the zip file doesn't change, but the zip file itself may (e.g. due to
minor changes in the compression algorithm, or changes in timestamps).
Fetchzip is implemented by extending fetchurl with a "postFetch" hook
that is executed after the file has been downloaded. This hook can
thus perform arbitrary checks or transformations on the downloaded
file.
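A typical use, for a dynamically generated GitHub /archive tarball (hash shown
as a placeholder):

    { fetchzip }:

    fetchzip {
      # GitHub regenerates these archives on demand; the bytes may change
      # between downloads, but the unpacked tree stays the same.
      url = "https://github.com/NixOS/patchelf/archive/0.8.tar.gz";
      sha256 = "...";
    }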