Commit graph

  • 0f828ea441 Do not limit API updates/second oobabooga 2023-12-04 20:45:06 -08:00
  • af261e5dd4 Merge pull request #4815 from oobabooga/dev oobabooga 2023-12-05 01:30:57 -03:00
  • 9edb193def Optimize HF text generation (#4814) oobabooga 2023-12-05 00:00:40 -03:00
  • 1ccbcb967e Merge pull request #4811 from oobabooga/dev oobabooga 2023-12-04 21:29:45 -03:00
  • ac9f154bcc Bump exllamav2 from 0.0.8 to 0.0.10 & Fix code change (#4782) 俞航 2023-12-05 00:15:05 +00:00
  • 131a5212ce UI: update context upper limit to 200000 oobabooga 2023-12-04 15:48:34 -08:00
  • f7145544f9 Update README oobabooga 2023-12-04 15:44:44 -08:00
  • 8e1f86a866 Merge remote-tracking branch 'refs/remotes/origin/dev' into dev oobabooga 2023-12-04 15:41:56 -08:00
  • be88b072e9 Update --loader flag description oobabooga 2023-12-04 15:41:25 -08:00
  • 801ba87c68 Update accelerate requirement from ==0.24.* to ==0.25.* (#4810) dependabot[bot] 2023-12-04 20:36:01 -03:00
  • 7fc9033b2e Recommend ExLlama_HF and ExLlamav2_HF oobabooga 2023-12-04 15:28:46 -08:00
  • e4e35f357b Merge pull request #4807 from oobabooga/dev oobabooga 2023-12-04 12:28:34 -03:00
  • 3f993280e4 Minor changes oobabooga 2023-12-04 07:27:44 -08:00
  • 0931ed501b Minor changes oobabooga 2023-12-04 07:25:18 -08:00
  • 427a165597 Bump TTS version in coqui_tts oobabooga 2023-12-04 07:21:56 -08:00
  • 0bfd5090be Import accelerate very early to make Intel GPU happy (#4704) Song Fuchang 2023-12-04 09:51:18 +08:00
  • 2e83844f35 Bump safetensors from 0.4.0 to 0.4.1 (#4750) dependabot[bot] 2023-12-03 22:50:10 -03:00
  • 06cc9a85f7 README: minor typo fix (#4793) Ikko Eltociear Ashimine 2023-12-04 10:46:34 +09:00
  • 7c0a17962d Gallery improvements (#4789) Lounger 2023-12-04 02:45:50 +01:00
  • 96df4f10b9 Merge pull request #4777 from oobabooga/dev oobabooga 2023-12-01 00:00:17 -03:00
  • 77d6ccf12b Add a LOADER debug message while loading models oobabooga 2023-11-30 12:00:32 -08:00
  • 1c90e02243 Update Colab-TextGen-GPU.ipynb oobabooga 2023-11-30 11:55:18 -08:00
  • 092a2c3516 Fix a bug in llama.cpp get_logits() function oobabooga 2023-11-30 11:21:40 -08:00
  • 6d3a9b8689 Merge pull request #4773 from oobabooga/dev oobabooga 2023-11-30 02:31:37 -03:00
  • 000b77a17d Minor docker changes oobabooga 2023-11-29 21:27:23 -08:00
  • 88620c6b39 feature/docker_improvements (#4768) Callum 2023-11-30 05:20:23 +00:00
  • 2698d7c9fd Fix llama.cpp model unloading oobabooga 2023-11-29 15:19:48 -08:00
  • fa89d305e3 Merge remote-tracking branch 'refs/remotes/origin/dev' into dev oobabooga 2023-11-29 15:13:17 -08:00
  • 9940ed9c77 Sort the loaders oobabooga 2023-11-29 15:13:03 -08:00
  • 78fd7f6aa8 Fixed naming for sentence-transformers library (#4764) Manu Kashyap 2023-11-29 20:45:03 +05:30
  • a7670c31ca Sort oobabooga 2023-11-28 18:43:33 -08:00
  • 6e51bae2e0 Sort the loaders menu oobabooga 2023-11-28 18:41:11 -08:00
  • f4b956b47c Detect yi instruction template oobabooga 2023-11-25 06:33:24 -08:00
  • 68059d7c23 llama.cpp: minor log change & lint oobabooga 2023-11-25 06:33:37 -08:00
  • 1b05832f9a Add direnv artifacts to gitignore (#4737) Denis Iskandarov 2023-11-27 22:43:42 +04:00
  • b5b3d18773 reasonable cli args for docker container (#4727) xr4dsh 2023-11-27 19:43:01 +01:00
  • 9f7ae6bb2e fix detection of stopping strings when HTML escaping is used (#4728) tsukanov-as 2023-11-27 21:42:08 +03:00
  • d06ce7b75c add openhermes mistral support (#4730) Eve 2023-11-27 18:41:06 +00:00
  • b6d16a35b1 Minor API fix oobabooga 2023-11-21 17:56:28 -08:00
  • 51add248c8 Merge pull request #4702 from oobabooga/dev oobabooga 2023-11-21 21:18:27 -03:00
  • cb0dbffccc Merge branch 'main' into dev oobabooga 2023-11-21 16:12:45 -08:00
  • 8d811a4d58 one-click: move on instead of crashing if extension fails to install oobabooga 2023-11-21 16:00:48 -08:00
  • 0589ff5b12 Bump llama-cpp-python to 0.2.19 & add min_p and typical_p parameters to llama.cpp loader (#4701) oobabooga 2023-11-21 20:59:39 -03:00
  • 2769a1fa25 Hide deprecated args from Session tab oobabooga 2023-11-21 15:15:16 -08:00
  • 0047d9f5e0 Do not install coqui_tts requirements by default oobabooga 2023-11-21 15:12:54 -08:00
  • fb124ab6e2 Bump to flash-attention 2.3.4 + switch to Github Actions wheels on Windows (#4700) oobabooga 2023-11-21 20:06:56 -03:00
  • e9cdaa2ada Bump to flash-attention 2.3.4 + switch to Github Actions wheels on Windows (#4700) oobabooga 2023-11-21 20:06:56 -03:00
  • b81d6ad8a4 Detect Orca 2 template (#4697) oobabooga 2023-11-21 15:26:42 -03:00
  • 360eeb9ff1 Merge pull request #4686 from oobabooga/dev oobabooga 2023-11-21 08:38:50 -03:00
  • 54a4eb60a3 Remove --no-dependencies from TTS installation command oobabooga 2023-11-21 08:30:50 -03:00
  • efdd99623c Merge pull request #4683 from oobabooga/dev oobabooga 2023-11-21 00:36:58 -03:00
  • b02dc4dc0d Add --no-dependencies to TTS installation command oobabooga 2023-11-20 19:02:12 -08:00
  • 55f2a3643b Update multimodal API example oobabooga 2023-11-20 18:41:09 -08:00
  • 829c6d4f78 Add "remove_trailing_dots" option to XTTSv2 oobabooga 2023-11-20 18:33:29 -08:00
  • 8dc9ec3491 add XTTSv2 (coqui_tts extension) (#4673) kanttouchthis 2023-11-21 02:37:52 +01:00
  • ff24648510 Credit llama-cpp-python in the README oobabooga 2023-11-20 12:13:15 -08:00
  • be78d79811 Revert accidental noavx2 changes oobabooga 2023-11-20 11:48:04 -08:00
  • 4b84e45116 Use +cpuavx2 instead of +cpuavx oobabooga 2023-11-20 11:46:03 -08:00
  • d7f1bc102b Fix "Illegal instruction" bug in llama.cpp CPU only version (#4677) oobabooga 2023-11-20 16:36:38 -03:00
  • 5e70263e25 docker: install xformers with specific cuda version, matching the docker image. (#4670) drew9781 2023-11-19 18:43:15 -06:00
  • f11092ac2a Merge pull request #4664 from oobabooga/dev oobabooga 2023-11-19 15:12:55 -03:00
  • f0d66cf817 Add missing file oobabooga 2023-11-19 10:12:13 -08:00
  • 22e7a22d1e Merge pull request #4662 from oobabooga/dev oobabooga 2023-11-19 14:23:19 -03:00
  • a2e6d00128 Use convert_ids_to_tokens instead of decode in logits endpoint oobabooga 2023-11-19 09:22:08 -08:00
  • d1bba48a83 Merge pull request #4660 from oobabooga/dev oobabooga 2023-11-19 13:32:08 -03:00
  • 8cf05c1b31 Fix disappearing character gallery oobabooga 2023-11-19 08:31:01 -08:00
  • 9da7bb203d Minor LoRA bug fix oobabooga 2023-11-19 07:59:29 -08:00
  • 78af3b0a00 Update docs/What Works.md oobabooga 2023-11-19 07:57:16 -08:00
  • a6f1e1bcc5 Fix PEFT LoRA unloading oobabooga 2023-11-19 07:55:25 -08:00
  • a290d17386 Add hover cursor to bot pfp oobabooga 2023-11-19 06:53:41 -08:00
  • ab94f0d9bf Minor style change oobabooga 2023-11-18 21:11:04 -08:00
  • 5fcee696ea New feature: enlarge character pictures on click (#4654) oobabooga 2023-11-19 02:05:17 -03:00
  • cb836dd49c fix: use shared chat-instruct_command with api (#4653) Jordan Tucker 2023-11-18 22:19:10 -06:00
  • 771e62e476 Add /v1/internal/lora endpoints (#4652) oobabooga 2023-11-19 00:35:22 -03:00
  • ef6feedeb2 Add --nowebui flag for pure API mode (#4651) oobabooga 2023-11-18 23:38:39 -03:00
  • 0fa1af296c Add /v1/internal/logits endpoint (#4650) oobabooga 2023-11-18 23:19:31 -03:00
  • 8f4f4daf8b Add --admin-key flag for API (#4649) oobabooga 2023-11-18 22:33:27 -03:00
  • af76fbedb8 Openai embedding fix to support jina-embeddings-v2 (#4642) wizd 2023-11-19 07:24:29 +08:00
  • baab894759 fix: use system message in chat-instruct mode (#4648) Jordan Tucker 2023-11-18 17:20:13 -06:00
  • 47d9e2618b Refresh the Preset menu after saving a preset oobabooga 2023-11-18 14:03:42 -08:00
  • 83b64e7fc1 New feature: "random preset" button (#4647) oobabooga 2023-11-18 18:31:41 -03:00
  • d1a58da52f Update ancient Docker instructions oobabooga 2023-11-17 19:52:30 -08:00
  • e0ca49ed9c Bump llama-cpp-python to 0.2.18 (2nd attempt) (#4637) oobabooga 2023-11-18 00:31:27 -03:00
  • 3146124ec0 Merge pull request #4632 from oobabooga/dev oobabooga 2023-11-17 10:18:31 -03:00
  • 9d6f79db74 Revert "Bump llama-cpp-python to 0.2.18 (#4611)" oobabooga 2023-11-17 05:14:25 -08:00
  • e0a7cc5e0f Simplify CORS code oobabooga 2023-11-16 20:11:55 -08:00
  • 13dc3b61da Update README oobabooga 2023-11-16 19:57:55 -08:00
  • 8b66d83aa9 Set use_fast=True by default, create --no_use_fast flag oobabooga 2023-11-16 19:45:05 -08:00
  • f889302d24 Merge pull request #4628 from oobabooga/dev oobabooga 2023-11-16 23:47:07 -03:00
  • b2ce8dc7ee Update a message oobabooga 2023-11-16 18:46:26 -08:00
  • 0ee8d2b66b Merge pull request #4627 from oobabooga/dev oobabooga 2023-11-16 23:41:18 -03:00
  • 780b00e1cf Minor bug fix oobabooga 2023-11-16 18:39:39 -08:00
  • c0233bb9d3 Minor message change oobabooga 2023-11-16 18:36:28 -08:00
  • 94b7177174 Update docs/07 - Extensions oobabooga 2023-11-16 18:24:46 -08:00
  • 6525707a7f Fix "send instruction template to..." buttons (closes #4625) oobabooga 2023-11-16 18:16:42 -08:00
  • 510a01ef46 Lint oobabooga 2023-11-16 18:03:06 -08:00
  • 923c8e25fb Bump llama-cpp-python to 0.2.18 (#4611) oobabooga 2023-11-16 22:55:14 -03:00
  • 61f429563e Bump AutoAWQ to 0.1.7 (#4620) Casper 2023-11-16 21:08:08 +01:00
  • e7d460d932 Make sure that API requirements are installed oobabooga 2023-11-16 10:08:41 -08:00
  • cbf2b47476 Strip trailing "\" characters in CMD_FLAGS.txt oobabooga 2023-11-16 09:33:36 -08:00