Commit graph

  • 40bbd53640 Add custom prompt format for SD API pictures (#1964) Minecrafter20 2023-06-27 15:49:18 -05:00
  • cb029cf65f Get SD samplers from API (#2889) missionfloyd 2023-06-27 14:31:54 -06:00
  • d7a7f7896b Add SD checkpoint selection in sd_api_pictures (#2872) GuizzyQC 2023-06-27 16:29:27 -04:00
  • 7611978f7b Add Community section to README oobabooga 2023-06-27 13:56:14 -03:00
  • 22d455b072 Add LoRA support to ExLlama_HF oobabooga 2023-06-26 00:10:13 -03:00
  • b7c627f9a0 Set UI defaults oobabooga 2023-06-25 22:55:43 -03:00
  • c52290de50 ExLlama with long context (#2875) oobabooga 2023-06-25 22:49:26 -03:00
  • 9290c6236f Keep ExLlama_HF if already selected oobabooga 2023-06-25 19:06:28 -03:00
  • 75fd763f99 Fix chat saving issue (closes #2863) oobabooga 2023-06-25 18:14:57 -03:00
  • 21c189112c Several Training Enhancements (#2868) FartyPants 2023-06-25 14:34:46 -04:00
  • 95212edf1f Update training.py oobabooga 2023-06-25 12:13:15 -03:00
  • 1f5ea451c9 Merge remote-tracking branch 'refs/remotes/origin/main' oobabooga 2023-06-25 02:14:19 -03:00
  • f31281a8de Fix loading instruction templates containing literal '\n' oobabooga 2023-06-25 02:13:26 -03:00
  • 68ae5d8262 more models: +orca_mini (#2859) matatonic 2023-06-25 00:54:53 -04:00
  • f0fcd1f697 Sort some imports oobabooga 2023-06-25 01:44:36 -03:00
  • 365b672531 Minor change to prevent future bugs oobabooga 2023-06-25 01:38:54 -03:00
  • e6e5f546b8 Reorganize Chat settings tab oobabooga 2023-06-25 01:10:20 -03:00
  • b45baeea41 extensions/openai: Major docs update, fix #2852 (critical bug), minor improvements (#2849) matatonic 2023-06-24 21:50:04 -04:00
  • ebfcfa41f2 Update ExLlama.md oobabooga 2023-06-24 20:25:34 -03:00
  • bef67af23c Use pre-compiled python module for ExLlama (#2770) jllllll 2023-06-24 18:24:17 -05:00
  • a70a2ac3be Update ExLlama.md oobabooga 2023-06-24 20:23:01 -03:00
  • b071eb0d4b Clean up the presets (#2854) oobabooga 2023-06-24 18:41:17 -03:00
  • cec5fb0ef6 Failed attempt at evaluating exllama_hf perplexity oobabooga 2023-06-24 12:02:25 -03:00
  • e356f69b36 Make stop_everything work with non-streamed generation (#2848) 快乐的我531 2023-06-24 22:19:16 +08:00
  • ec482f3dae Apply input extensions after yielding *Is typing...* oobabooga 2023-06-24 11:07:11 -03:00
  • 3e80f2aceb Apply the output extensions only once oobabooga 2023-06-24 10:59:07 -03:00
  • 77baf43f6d Add CORS support to the API (#2718) rizerphe 2023-06-24 16:16:06 +03:00
  • 8c36c19218 8k size only for minotaur-15B (#2815) matatonic 2023-06-24 09:14:19 -04:00
  • 38897fbd8a fix: added model parameter check (#2829) Roman 2023-06-24 14:09:34 +01:00
  • eac8450ef7 Move special character check to start script (#92) jllllll 2023-06-24 08:06:35 -05:00
  • 51a388fa34 Organize chat history/character import menu (#2845) missionfloyd 2023-06-24 06:55:02 -06:00
  • 8bb3bb39b3 Implement stopping string search in string space (#2847) oobabooga 2023-06-24 09:43:00 -03:00
  • 0f9088f730 Update README oobabooga 2023-06-23 12:24:43 -03:00
  • 3ae9af01aa Add --no_use_cuda_fp16 param for AutoGPTQ oobabooga 2023-06-23 12:22:56 -03:00
  • 5646690769 Fix some models not loading on exllama_hf (#2835) Panchovix 2023-06-23 10:31:02 -04:00
  • 383c50f05b Replace old presets with the results of Preset Arena (#2830) oobabooga 2023-06-23 01:48:29 -03:00
  • aa1f1ef46a Fix printing, take two. (#2810) missionfloyd 2023-06-22 13:06:49 -06:00
  • b4a38c24b7 Fix Multi-GPU not working on exllama_hf (#2803) Panchovix 2023-06-22 15:05:25 -04:00
  • d94ea31d54 more models. +minotaur 8k (#2806) matatonic 2023-06-21 20:05:08 -04:00
  • 04cae3e5db Remove bitsandbytes compatibility workaround (#91) jllllll 2023-06-21 13:40:41 -05:00
  • 580c1ee748 Implement a demo HF wrapper for exllama to utilize existing HF transformers decoding. (#2777) LarryVRH 2023-06-22 02:31:42 +08:00
  • a06acd6d09 Update bitsandbytes to 0.39.1 (#2799) jllllll 2023-06-21 13:04:45 -05:00
  • 89fb6f9236 Fixed the ZeroDivisionError when downloading a model (#2797) Gaurav Bhagchandani 2023-06-21 11:31:50 -04:00
  • 90be1d9fe1 More models (match more) & templates (starchat-beta, tulu) (#2790) matatonic 2023-06-21 11:30:44 -04:00
  • 2661c9899a Format chat for printing (#2793) missionfloyd 2023-06-21 07:39:58 -06:00
  • 5dfe0bec06 Remove old/useless code oobabooga 2023-06-20 23:36:56 -03:00
  • faa92eee8d Add spaces oobabooga 2023-06-20 23:25:58 -03:00
  • b22c7199c9 Download optimizations (#2786) Peter Sofronas 2023-06-20 22:14:18 -04:00
  • 447569e31a Add a download progress bar to the web UI. (#2472) Morgan Schweers 2023-06-20 18:59:14 -07:00
  • d1da22d7ee Fix -y from previous commit (#90) jllllll 2023-06-20 20:48:59 -05:00
  • 80a615c3ae Add space oobabooga 2023-06-20 22:48:45 -03:00
  • a2116e8b2b use uninstall -y oobabooga 2023-06-20 21:24:01 -03:00
  • c0a1baa46e Minor changes oobabooga 2023-06-20 20:23:21 -03:00
  • 5cbc0b28f2 Workaround for Peft not updating their package version on the git repo (#88) jllllll 2023-06-20 18:21:10 -05:00
  • 0d0d849478 Update Dockerfile to resolve superbooga requirement error (#2401) ramblingcoder 2023-06-20 16:31:28 -05:00
  • 9bb2fc8cd7 Install Pytorch through pip instead of Conda (#84) jllllll 2023-06-20 14:39:23 -05:00
  • 7625c6de89 fix usage of self in classmethod (#2781) EugeoSynthesisThirtyTwo 2023-06-20 21:18:42 +02:00
  • c40932eb39 Added Falcon LoRA training support (#2684) MikoAL 2023-06-20 12:03:44 +08:00
  • c623e142ac Bump llama-cpp-python oobabooga 2023-06-20 00:49:38 -03:00
  • ce86f726e9 Added saving of training logs to training_log.json (#2769) FartyPants 2023-06-19 23:47:36 -04:00
  • 017884132f Merge remote-tracking branch 'refs/remotes/origin/main' oobabooga 2023-06-20 00:46:29 -03:00
  • e1cd6cc410 Minor style change oobabooga 2023-06-20 00:46:18 -03:00
  • 59e7ecb198 llama.cpp: implement ban_eos_token via logits_processor (#2765) Cebtenzzre 2023-06-19 20:31:19 -04:00
  • 0d9d70ec7e Update docs oobabooga 2023-06-19 12:52:23 -03:00
  • f6a602861e Update docs oobabooga 2023-06-19 12:51:16 -03:00
  • 5d4b4d15a5 Update Using-LoRAs.md oobabooga 2023-06-19 12:43:57 -03:00
  • eb30f4441f Add ExLlama+LoRA support (#2756) oobabooga 2023-06-19 12:31:24 -03:00
  • a1cac88c19 Update README.md oobabooga 2023-06-19 01:28:23 -03:00
  • 5f418f6171 Fix a memory leak (credits for the fix: Ph0rk0z) oobabooga 2023-06-19 01:19:28 -03:00
  • def3b69002 Fix loading condition for universal llama tokenizer (#2753) ThisIsPIRI 2023-06-18 21:14:06 +00:00
  • 490a1795f0 Bump peft commit oobabooga 2023-06-18 16:42:11 -03:00
  • 09c781b16f Add modules/block_requests.py oobabooga 2023-06-18 16:31:14 -03:00
  • 687fd2604a Improve code/ul styles in chat mode oobabooga 2023-06-18 15:48:16 -03:00
  • e8588d7077 Merge remote-tracking branch 'refs/remotes/origin/main' oobabooga 2023-06-18 15:23:38 -03:00
  • 44f28830d1 Chat CSS: fix ul, li, pre styles + remove redefinitions oobabooga 2023-06-18 01:56:43 -03:00
  • 3cae1221d4 Update exllama.py - Respect model dir parameter (#2744) Forkoz 2023-06-18 16:26:30 +00:00
  • 5b4c0155f6 Move a button oobabooga 2023-06-18 01:56:43 -03:00
  • 0686a2e75f Improve instruct colors in dark mode oobabooga 2023-06-18 01:38:32 -03:00
  • c5641b65d3 Handle leading spaces properly in ExLlama oobabooga 2023-06-17 19:32:04 -03:00
  • 1e97aaac95 extensions/openai: docs update, model loader, minor fixes (#2557) matatonic 2023-06-17 18:15:24 -04:00
  • 2220b78e7a models/config.yaml: +alpacino, +alpasta, +hippogriff, +gpt4all-snoozy, +lazarus, +based, -airoboros 4k (#2580) matatonic 2023-06-17 18:14:25 -04:00
  • b1d05cbbf6 Install exllama (#83) jllllll 2023-06-17 17:10:36 -05:00
  • 657049d7d0 Fix cmd_macos.sh (#82) jllllll 2023-06-17 17:09:42 -05:00
  • b2483e28d1 Check for special characters in path on Windows (#81) jllllll 2023-06-17 17:09:22 -05:00
  • 05a743d6ad Make llama.cpp use tfs parameter oobabooga 2023-06-17 19:08:25 -03:00
  • e19cbea719 Add a variable to modules/shared.py oobabooga 2023-06-17 19:02:29 -03:00
  • cbd63eeeff Fix repeated tokens with exllama oobabooga 2023-06-17 19:02:08 -03:00
  • 766c760cd7 Use gen_begin_reuse in exllama oobabooga 2023-06-17 18:00:10 -03:00
  • 239b11c94b Minor bug fixes oobabooga 2023-06-17 17:57:56 -03:00
  • d8d29edf54 Install wheel using pip3 (#2719) Bhavika Tekwani 2023-06-16 22:46:40 -04:00
  • a1ca1c04a1 Update ExLlama.md (#2729) Jonathan Yankovich 2023-06-16 21:46:25 -05:00
  • b27f83c0e9 Make exllama stoppable oobabooga 2023-06-16 22:03:23 -03:00
  • 7f06d551a3 Fix streaming callback oobabooga 2023-06-16 21:44:56 -03:00
  • 1e400218e9 Fix a typo oobabooga 2023-06-16 21:01:57 -03:00
  • 5f392122fd Add gpu_split param to ExLlama oobabooga 2023-06-16 20:49:36 -03:00
  • cb9be5db1c Update ExLlama.md oobabooga 2023-06-16 20:40:12 -03:00
  • 83be8eacf0 Minor fix oobabooga 2023-06-16 20:38:32 -03:00
  • 9f40032d32 Add ExLlama support (#2444) oobabooga 2023-06-16 20:35:38 -03:00
  • dea43685b0 Add some clarifications oobabooga 2023-06-16 19:07:16 -03:00
  • 7ef6a50e84 Reorganize model loading UI completely (#2720) oobabooga 2023-06-16 19:00:37 -03:00