Commit graph

  • c9d814592e Increase maximum temperature value to 5 oobabooga 2024-01-04 17:28:15 -08:00
  • 3bb4b0504e Close the menu on second click. (#5110) Guanghua Lu 2024-01-05 00:52:11 +08:00
  • e4d724eb3f Fix cache_folder bug introduced in 37eff915d6 oobabooga 2024-01-04 07:49:40 -08:00
  • 37eff915d6 Use --disk-cache-dir for all caches Alberto Cano 2024-01-04 04:27:26 +01:00
  • c54d1daaaa Merge pull request #5163 from oobabooga/dev oobabooga 2024-01-03 22:57:00 -03:00
  • 7965f6045e Fix loading latest history for file names with dots (#5162) Lounger 2024-01-04 02:39:41 +01:00
  • 894e1a0700 Docker: added build args for non AVX2 CPU (#5154) Adam Florizone 2024-01-03 18:43:02 -05:00
  • b80e6365d0 Fix various bugs for LoRA training (#5161) AstrisCantCode 2024-01-04 00:42:20 +01:00
  • f6a204d7c9 Bump llama-cpp-python to 0.2.26 oobabooga 2024-01-03 11:06:36 -08:00
  • 3a6cba9021 Add top_k=1 to Debug-deterministic preset oobabooga 2024-01-02 15:54:56 -08:00
  • 3f28925a8d Merge pull request #5152 from oobabooga/dev oobabooga 2024-01-02 13:22:14 -03:00
  • 7cce88c403 Remove an unnecessary exception oobabooga 2024-01-02 07:20:59 -08:00
  • 90c7e84b01 UI: improve chat style margin for last bot message oobabooga 2024-01-01 19:50:13 -08:00
  • a4b4708560 Decrease "Show controls" button opacity oobabooga 2024-01-01 19:08:30 -08:00
  • 94afa0f9cf Minor style changes oobabooga 2024-01-01 16:00:22 -08:00
  • 3e3a66e721 Merge pull request #5132 from oobabooga/dev oobabooga 2023-12-31 02:32:25 -03:00
  • cbf6f9e695 Update some UI messages oobabooga 2023-12-30 21:31:17 -08:00
  • 2aad91f3c9 Remove deprecated command-line flags (#5131) oobabooga 2023-12-31 02:07:48 -03:00
  • 485b85ee76 Superboogav2 Quick Fixes (#5089) TheInvisibleMage 2023-12-31 16:03:23 +11:00
  • 2734ce3e4c Remove RWKV loader (#5130) oobabooga 2023-12-31 02:01:40 -03:00
  • 0e54a09bcb Remove exllamav1 loaders (#5128) oobabooga 2023-12-31 01:57:06 -03:00
  • 8e397915c9 Remove --sdp-attention, --xformers flags (#5126) oobabooga 2023-12-31 01:36:51 -03:00
  • b7dd1f9542 Specify utf-8 encoding for model metadata file open (#5125) B611 2023-12-31 05:34:32 +01:00
  • 20a2eaaf95 Add .vs to .gitignore oobabooga 2023-12-27 12:58:07 -08:00
  • a4079e879e CSS: don't change --chat-height when outside the chat tab oobabooga 2023-12-27 11:51:55 -08:00
  • c419206ce1 Lint the JS/CSS oobabooga 2023-12-27 09:59:23 -08:00
  • 3fd7073808 Merge pull request #5100 from oobabooga/dev oobabooga 2023-12-27 13:23:28 -03:00
  • 648c2d1cc2 Update settings-template.yaml oobabooga 2023-12-25 15:25:16 -08:00
  • c21e3d6300 Merge pull request #5044 from TheLounger/style_improvements oobabooga 2023-12-25 20:00:50 -03:00
  • 2ad6c526b8 Check if extensions block exists before changing it oobabooga 2023-12-25 14:43:12 -08:00
  • 63553b41ed Improve some paddings oobabooga 2023-12-25 14:25:31 -08:00
  • abd227594c Fix a border radius oobabooga 2023-12-25 14:17:00 -08:00
  • 8d0359a6d8 Rename some CSS variables oobabooga 2023-12-25 14:10:07 -08:00
  • 5466ae59a7 Prevent input/chat area overlap with new --my-delta variable oobabooga 2023-12-25 14:07:31 -08:00
  • 19d13743a6 Merge pull request #5078 from oobabooga/dev oobabooga 2023-12-25 17:23:01 -03:00
  • 02d063fb9f Fix extra space after 18ca35faaa oobabooga 2023-12-25 08:38:17 -08:00
  • ae927950a8 Remove instruct style border radius oobabooga 2023-12-25 08:35:33 -08:00
  • 18ca35faaa Space between chat tab and extensions block oobabooga 2023-12-25 08:34:02 -08:00
  • 73ba7a8921 Change height -> min-height for .chat oobabooga 2023-12-25 08:32:02 -08:00
  • 29b0f14d5a Bump llama-cpp-python to 0.2.25 (#5077) oobabooga 2023-12-25 12:36:32 -03:00
  • af876095e2 Merge pull request #5073 from oobabooga/dev oobabooga 2023-12-25 02:58:45 -03:00
  • c06f630bcc Increase max_updates_second maximum value oobabooga 2023-12-24 13:29:47 -08:00
  • 92d5e64a82 Bump AutoAWQ to 0.1.8 (#5061) Casper 2023-12-24 18:27:34 +01:00
  • 4aeebfc571 Merge branch 'dev' into TheLounger-style_improvements oobabooga 2023-12-24 09:24:55 -08:00
  • d76b00c211 Pin lm_eval package version oobabooga 2023-12-24 09:22:31 -08:00
  • 8c60495878 UI: add "Maximum UI updates/second" parameter oobabooga 2023-12-24 09:17:40 -08:00
  • 1b8b61b928 Fix output_ids decoding for Qwen/Qwen-7B-Chat (#5045) zhangningboo 2023-12-23 10:11:02 +08:00
  • dbe438564e Support for sending images into OpenAI chat API (#4827) kabachuha 2023-12-23 04:45:53 +03:00
  • 8956f3ebe2 Synthia instruction templates (#5041) Stefan Daniel Schwarz 2023-12-23 02:19:43 +01:00
  • afc91edcb2 Reset the model_name after unloading the model (#5051) Yiximail 2023-12-23 09:18:24 +08:00
  • 554a8f910b Attempt at shrinking chat area when input box grows Lounger 2023-12-22 04:51:20 +01:00
  • 4b25acf58f Merge pull request #5039 from oobabooga/dev oobabooga 2023-12-21 20:22:48 -03:00
  • 588b37c032 Add slight padding to top of message container Lounger 2023-12-21 22:04:41 +01:00
  • 568541aa31 Remove bottom padding on chat tab Lounger 2023-12-21 21:48:34 +01:00
  • c1b99f45cb Make --help output instant oobabooga 2023-12-21 09:30:56 -08:00
  • 0dd759c44f Claim more vertical space Lounger 2023-12-21 05:42:06 +01:00
  • 6fbd64db72 Set borders for all chat styles Lounger 2023-12-21 05:00:56 +01:00
  • 2706149c65 Organize the CMD arguments by group (#5027) oobabooga 2023-12-21 00:33:55 -03:00
  • c727a70572 Remove redundancy from modules/loaders.py oobabooga 2023-12-20 19:18:07 -08:00
  • e3e053ab99 UI: Expand chat vertically and handle header wrapping Lounger 2023-12-21 03:42:23 +01:00
  • a098c7eee3 Merge branch 'dev' into style_improvements Lounger 2023-12-20 23:09:15 +01:00
  • 11288d11d4 Merge pull request #5022 from oobabooga/dev oobabooga 2023-12-20 15:56:04 -03:00
  • 6efbe3009f Let exllama v1 models load safetensor loras (#4854) luna 2023-12-20 13:29:19 -03:00
  • bcba200790 Fix EOS being ignored in ExLlamav2 after previous commit oobabooga 2023-12-20 07:54:06 -08:00
  • f0f6d9bdf9 Add HQQ back & update version oobabooga 2023-12-20 07:36:33 -08:00
  • b15f510154 Optimize ExLlamav2 (non-HF) loader oobabooga 2023-12-20 07:31:42 -08:00
  • 489f4a23bf Merge pull request #5012 from oobabooga/dev oobabooga 2023-12-20 02:59:30 -03:00
  • 258c695ead Add rich requirement oobabooga 2023-12-19 21:58:36 -08:00
  • c1f78dbd0f Merge pull request #5011 from oobabooga/dev oobabooga 2023-12-20 02:38:25 -03:00
  • fadb295d4d Lint oobabooga 2023-12-19 21:36:57 -08:00
  • 2289e9031e Remove HQQ from requirements (after https://github.com/oobabooga/text-generation-webui/issues/4993) oobabooga 2023-12-19 21:33:49 -08:00
  • fb8ee9f7ff Add a specific error if HQQ is missing oobabooga 2023-12-19 21:32:58 -08:00
  • 366c93a008 Hide a warning oobabooga 2023-12-19 21:03:20 -08:00
  • 9992f7d8c0 Improve several log messages oobabooga 2023-12-19 20:54:32 -08:00
  • 23818dc098 Better logger oobabooga 2023-12-19 20:38:33 -08:00
  • 95600073bc Add an informative error when extension requirements are missing oobabooga 2023-12-19 20:20:45 -08:00
  • f9accd38e0 UI: Update chat instruct styles Lounger 2023-12-20 02:54:08 +01:00
  • d8279dc710 Replace character name placeholders in chat context (closes #5007) oobabooga 2023-12-19 17:31:46 -08:00
  • ff3e845b04 UI: Header boy is dropping shadows Lounger 2023-12-20 01:24:34 +01:00
  • 40d5bf6c35 Set margin on other tabs too Lounger 2023-12-19 23:42:13 +01:00
  • f42074b6c1 UI: Remove header margin on chat tab Lounger 2023-12-19 23:27:11 +01:00
  • 5b791cae4a Merge pull request #5005 from oobabooga/dev oobabooga 2023-12-19 18:21:09 -03:00
  • e83e6cedbe Organize the model menu oobabooga 2023-12-19 13:18:26 -08:00
  • f4ae0075e8 Fix conversion from old template format to jinja2 oobabooga 2023-12-19 13:16:52 -08:00
  • de138b8ba6 Add llama-cpp-python wheels with tensor cores support (#5003) oobabooga 2023-12-19 17:30:53 -03:00
  • 71eb744b1c Merge pull request #5002 from oobabooga/dev oobabooga 2023-12-19 15:24:40 -03:00
  • 0a299d5959 Bump llama-cpp-python to 0.2.24 (#5001) oobabooga 2023-12-19 15:22:21 -03:00
  • 83cf1a6b67 Fix Yi space issue (closes #4996) oobabooga 2023-12-19 07:54:19 -08:00
  • 781367bdc3 Merge pull request #4988 from oobabooga/dev oobabooga 2023-12-18 23:42:16 -03:00
  • 9847809a7a Add a warning about ppl evaluation without --no_use_fast oobabooga 2023-12-18 18:09:24 -08:00
  • f6d701624c UI: mention that QuIP# does not work on Windows oobabooga 2023-12-18 18:05:02 -08:00
  • a23a004434 Update the example template oobabooga 2023-12-18 17:47:35 -08:00
  • 3d10c574e7 Fix custom system messages in instruction templates oobabooga 2023-12-18 17:45:06 -08:00
  • 9e48e50428 Update optimum requirement from ==1.15.* to ==1.16.* (#4986) dependabot[bot] 2023-12-18 21:43:29 -03:00
  • 9fa3883630 Add ROCm wheels for exllamav2 (#4973) 俞航 2023-12-19 08:40:38 +08:00
  • 674be9a09a Add HQQ quant loader (#4888) Water 2023-12-18 19:23:16 -05:00
  • b28020a9e4 Merge pull request #4980 from oobabooga/dev oobabooga 2023-12-18 10:11:32 -03:00
  • 64a57d9dc2 Remove duplicate instruction templates oobabooga 2023-12-17 21:39:47 -08:00
  • 1f9e25e76a UI: update "Saved instruction templates" dropdown after loading template oobabooga 2023-12-17 21:19:06 -08:00
  • da1c8d77ea Merge remote-tracking branch 'refs/remotes/origin/dev' into dev oobabooga 2023-12-17 21:05:10 -08:00