Mirror of https://github.com/NixOS/nixpkgs.git
Commit graph

85 commits

Author SHA1 Message Date
aleksana 571c71e6f7 treewide: migrate packages to pkgs/by-name, take 1
We are migrating packages that meet the requirements below (a hedged before/after sketch follows this entry):

1. using `callPackage`
2. called path is a directory
3. overriding set is empty (`{ }`)
4. not containing path expressions other than relative paths (to make
nixpkgs-vet happy)
5. not referenced by nix files outside of the directory, other than
`pkgs/top-level/all-packages.nix`
6. not referencing nix files outside of the directory
7. not referencing `default.nix` (since it's changed to `package.nix`)
8. `outPath` doesn't change after migration

The tool is here: https://github.com/Aleksanaa/by-name-migrate.
2024-11-09 20:04:51 +08:00
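To make the criteria above concrete, here is a minimal, hypothetical sketch of the kind of entry that qualifies; the attribute name `foo` and the source path are invented for illustration and do not come from the commit.

```
# Before (pkgs/top-level/all-packages.nix): called via callPackage, the called
# path is a directory, and the override set is empty -- criteria 1-3 above.
foo = callPackage ../applications/misc/foo { };

# After: the attribute is removed from all-packages.nix and the expression
# lives in pkgs/by-name/fo/foo/package.nix, where it is picked up
# automatically; its outPath must not change (criterion 8).
```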
R. Ryantm 86d74ecd79 llama-cpp: 3829 -> 3887 2024-10-05 23:09:18 +00:00
R. Ryantm 5000f4864f llama-cpp: 3772 -> 3829 2024-09-28 06:09:03 +00:00
Artturin e0464e4788 treewide: replace stdenv.is with stdenv.hostPlatform.is
In preparation for the deprecation of `stdenv.isX`.

These shorthands are not conducive to cross-compilation because they
hide the platforms.

Darwin might get cross-compilation support, and the continued usage of `stdenv.isDarwin` would get in the way of that.

One example of why this is bad, and of how it especially affects compiler packages: https://www.github.com/NixOS/nixpkgs/pull/343059

There are too many files to go through manually, but a treewide change should
get users thinking when they see a `hostPlatform.isX` in a place where it
doesn't make sense. (A hedged before/after sketch in Nix follows this entry.)

```
fd --type f "\.nix" | xargs sd --fixed-strings "stdenv.is" "stdenv.hostPlatform.is"
fd --type f "\.nix" | xargs sd --fixed-strings "stdenv'.is" "stdenv'.hostPlatform.is"
fd --type f "\.nix" | xargs sd --fixed-strings "clangStdenv.is" "clangStdenv.hostPlatform.is"
fd --type f "\.nix" | xargs sd --fixed-strings "gccStdenv.is" "gccStdenv.hostPlatform.is"
fd --type f "\.nix" | xargs sd --fixed-strings "stdenvNoCC.is" "stdenvNoCC.hostPlatform.is"
fd --type f "\.nix" | xargs sd --fixed-strings "inherit (stdenv) is" "inherit (stdenv.hostPlatform) is"
fd --type f "\.nix" | xargs sd --fixed-strings "buildStdenv.is" "buildStdenv.hostPlatform.is"
fd --type f "\.nix" | xargs sd --fixed-strings "effectiveStdenv.is" "effectiveStdenv.hostPlatform.is"
fd --type f "\.nix" | xargs sd --fixed-strings "originalStdenv.is" "originalStdenv.hostPlatform.is"
```
2024-09-25 00:04:37 +03:00
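As a hedged before/after illustration of the rewrite performed by the commands above (the `buildInputs` line and the Accelerate framework are an invented example, not taken from the commit):

```
# Before: the shorthand hides which of the platforms is meant.
buildInputs = lib.optionals stdenv.isDarwin [ darwin.apple_sdk.frameworks.Accelerate ];

# After: the host platform (the platform the package runs on) is explicit.
buildInputs = lib.optionals stdenv.hostPlatform.isDarwin [ darwin.apple_sdk.frameworks.Accelerate ];
```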
R. Ryantm ab0db3f40d llama-cpp: 3672 -> 3772 2024-09-17 07:23:54 +00:00
Maxime Brunet c6ab3b3ae8 llm-ls: fix build with rust 1.80 and on darwin 2024-09-16 09:44:33 +02:00
nicoo 2641d97cbf pkgs/by-name: Convert hashes to SRI format
Reproduction script (a hedged example of the resulting change follows this entry):
	# Bulk rewrite
	./maintainers/scripts/sha-to-sri.py pkgs/by-name
	# Revert some packages which will need manual intervention
	for n in amdvlk azure-cli cargo-profiler corefonts flatito fluxcd gist perf_data_converter protoc-gen-js solana-cli swt verible; do
		git checkout -- "pkgs/by-name/${n:0:2}/${n}"
	done
2024-09-15 11:24:31 +02:00
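As a hedged sketch of the kind of change such a conversion produces (the URL and both digests are placeholders, and the rename from `sha256` to `hash` is an assumption about the script's behaviour, not something stated in the commit):

```
# Before: bare base32 digest (placeholder value).
src = fetchurl {
  url = "https://example.org/foo-1.0.tar.gz";
  sha256 = "0000000000000000000000000000000000000000000000000000";
};

# After: the same digest expressed in SRI format (placeholder value).
src = fetchurl {
  url = "https://example.org/foo-1.0.tar.gz";
  hash = "sha256-AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA=";
};
```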
R. Ryantm 9dca8ad0d2 llama-cpp: 3645 -> 3672 2024-09-06 04:58:14 +00:00
R. Ryantm 0261b36199 llama-cpp: 3620 -> 3645 2024-08-30 13:52:39 +00:00
R. Ryantm 43dd0ac922 llama-cpp: 3565 -> 3620 2024-08-24 21:41:17 +00:00
Atemu 118ab33e1d llama-cpp: update description
Closes https://github.com/NixOS/nixpkgs/pull/334451
2024-08-18 01:38:13 +02:00
R. Ryantm ccc5699686 llama-cpp: 3499 -> 3565 2024-08-10 20:56:53 +00:00
Peder Bergebakken Sundt ec91af6409
Merge pull request #314132 from newAM/cleanup-xtask-binaries
treewide: cleanup xtask binaries
2024-08-03 22:26:17 +02:00
R. Ryantm 540982b4da llama-cpp: 3423 -> 3499 2024-08-01 08:30:05 +00:00
Philip Taron 402c2115f8
Merge pull request #328522 from r-ryantm/auto-update/llama-cpp
llama-cpp: 3403 -> 3423
2024-07-31 14:01:06 -07:00
Sandro 64a4b158bd
Merge pull request #326131 from nwhirschfeld/lldap-cli 2024-07-28 22:40:10 +02:00
Saksham Mittal a3748d8201
llama-cpp: add shaderc dependency for Vulkan backend 2024-07-27 18:45:59 +05:30
R. Ryantm 8ed9b3f67b llama-cpp: 3403 -> 3423 2024-07-19 20:45:08 +00:00
Redyf 4e8851bf08 llama-cpp: 3328 -> 3403 2024-07-16 17:19:53 -03:00
Niclas Hirschfeld 63f25148f4 lldap-cli: init at 0-unstable-2024-02-24 2024-07-10 17:16:17 +02:00
R. Ryantm a231b6ea37 llama-cpp: 3260 -> 3328 2024-07-07 14:03:24 +00:00
Someone d5491008d9
Merge pull request #323056 from SomeoneSerge/fix/cudaPackages/outputSpecified
cudaPackages: make getOutput work again
2024-07-03 19:19:51 +00:00
Someone Serge 82018339bd treewide: cuda: use propagatedBuildInputs, lib.getOutput 2024-07-02 01:47:19 +00:00
Lan Tian 134743c02a
llama-cpp: 3091 -> 3260 2024-06-29 02:38:46 -07:00
Jeremy Schlatter 4a2b827c71
treewide: use cmakeCudaArchitecturesString 2024-06-23 16:51:31 -07:00
R. Ryantm aee3455afd llama-cpp: 3089 -> 3091 2024-06-09 23:44:27 +00:00
Jono Chang b5331032eb llama-cpp: 3070 -> 3089
Diff: https://github.com/ggerganov/llama.cpp/compare/b3070..b3089

Changelog: https://github.com/ggerganov/llama.cpp/releases/tag/b3089
2024-06-06 08:18:00 +10:00
R. Ryantm 4d1bc27756 llama-cpp: 3015 -> 3070 2024-06-03 04:34:54 +00:00
OTABI Tomoya 9d73dc1ae5
Merge pull request #315258 from r-ryantm/auto-update/llama-cpp
llama-cpp: 2953 -> 3015
2024-06-02 10:25:13 +09:00
Peder Bergebakken Sundt aaa74081c2
Merge pull request #313525 from maxstrid/llama-cpp-rpc
llama-cpp: Add rpc and remove mpi support
2024-06-01 21:05:36 +02:00
R. Ryantm fb013003f0 llama-cpp: 2953 -> 3015 2024-05-28 05:50:51 +00:00
R. Ryantm 01d3250b56 llm-ls: 0.5.2 -> 0.5.3 2024-05-24 14:49:45 +00:00
Alex Martens dc9f57c229 llm-ls: set buildAndTestSubdir 2024-05-23 17:15:37 -07:00
Maxwell Henderson 6467f8b017 llama-cpp: Add rpc and remove mpi support
llama-cpp no longer supports MPI, and RPC is the recommended alternative (a hedged override sketch follows this entry).
See: https://github.com/ggerganov/llama.cpp/pull/7395

Signed-off-by: Maxwell Henderson <mxwhenderson@gmail.com>
2024-05-21 17:43:47 -07:00
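A hedged sketch of how a user might opt into the new backend; the `rpcSupport` flag name is inferred from the commit title and is an assumption, not confirmed by this log:

```
# Hypothetical: enable the RPC backend via an override (flag name assumed).
llama-cpp-rpc = llama-cpp.override { rpcSupport = true; };
```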
R. Ryantm f26e8f439c llama-cpp: 2901 -> 2953 2024-05-21 14:41:37 +00:00
R. Ryantm 12e0c85d77 llama-cpp: 2843 -> 2901 2024-05-16 10:09:34 +00:00
R. Ryantm e94a2d47cd llama-cpp: 2781 -> 2843 2024-05-11 03:35:19 +00:00
R. Ryantm ab83e3fd1e llama-cpp: 2746 -> 2781 2024-05-03 10:06:47 +00:00
Enno Richter 6aed0cc958 llama-cpp: set build_number/build_commit for version info 2024-04-30 10:21:56 +02:00
R. Ryantm 27a673ef3e llama-cpp: 2700 -> 2746 2024-04-26 18:14:07 +00:00
Roman Zakirzyanov aedebb76de llm-ls: 0.4.0 -> 0.5.2 2024-04-25 17:11:07 +03:00
R. Ryantm fdcc5233d8 llama-cpp: 2674 -> 2700 2024-04-21 10:39:42 +00:00
R. Ryantm 1897af2d37 llama-cpp: 2636 -> 2674 2024-04-14 23:18:05 +00:00
R. Ryantm ef16276bb6 llama-cpp: 2589 -> 2636 2024-04-09 21:33:46 +00:00
R. Ryantm 90891cd009 llama-cpp: 2568 -> 2589 2024-04-04 15:32:57 +00:00
Jonathan Ringer 65c4c21a2b llama-cpp: use pkgs.autoAddDriverRunpath 2024-03-31 10:15:47 -07:00
Joseph Stahl a06a03ed7c
llama-cpp: update from b2481 to b2568 2024-03-28 20:56:55 -04:00
Joseph Stahl e1ef3aaacc
llama-cpp: embed (don't pre-compile) metal shaders
Port of https://github.com/ggerganov/llama.cpp/pull/6118, although compiling shaders with Xcode is disabled, as it requires disabling the sandbox (and only works on macOS anyway)
2024-03-26 14:01:29 -04:00
Joseph Stahl 7aa588cc96
llama-cpp: rename cuBLAS to CUDA
Matches change from upstream 280345968d
2024-03-26 13:54:30 -04:00
Christian Kögler 2af438f836
llama-cpp: fix blasSupport (#298567)
* llama-cpp: fix blasSupport

* llama-cpp: switch from openblas to blas
2024-03-25 18:55:45 +01:00