Commit graph

  • 7a3717b824 Allow uploading characters oobabooga 2023-01-25 15:45:25 -03:00
  • 6388c7fbc0 Set queue size to 1 to prevent gradio undefined behavior oobabooga 2023-01-25 14:37:41 -03:00
  • ec69c190ba Keep the character's greeting/example dialogue when "clear history" is clicked oobabooga 2023-01-25 10:52:35 -03:00
  • ebed1dea56 Generate 8 tokens at a time in streaming mode instead of just 1 oobabooga 2023-01-25 10:38:26 -03:00
  • 651eb50dd1 Merge branch 'main' of github.com:oobabooga/text-generation-webui oobabooga 2023-01-25 10:20:03 -03:00
  • 0b2a6b2819 Add file oobabooga 2023-01-25 10:19:50 -03:00
  • 3b8f0021cc Stop generating at \nYou: in chat mode oobabooga 2023-01-25 10:17:55 -03:00
  • befabbb862 Update README.md oobabooga 2023-01-24 23:01:43 -03:00
  • 7b98971df1 Update README.md oobabooga 2023-01-24 18:39:29 -03:00
  • 86b237eb0b Update README.md oobabooga 2023-01-24 09:29:34 -03:00
  • c2d1f04305 Update README.md oobabooga 2023-01-23 23:34:20 -03:00
  • 54e77acac4 Rename to "Generation parameters preset" for clarity oobabooga 2023-01-23 20:49:44 -03:00
  • 6c40f7eeb4 New NovelAI/KoboldAI preset selection oobabooga 2023-01-23 20:44:27 -03:00
  • ce4756fb88 Allow uploading chat history in official pygmalion web ui format oobabooga 2023-01-23 15:29:01 -03:00
  • 8325e23923 Fix bug in loading chat history as text file oobabooga 2023-01-23 14:28:02 -03:00
  • 059d47edb5 Submit with enter instead of shift+enter in chat mode oobabooga 2023-01-23 14:04:01 -03:00
  • 22845ba445 Update README oobabooga 2023-01-23 13:41:50 -03:00
  • 4820379139 Add debug preset (deterministic, should always give the same responses) oobabooga 2023-01-23 13:36:01 -03:00
  • 5b60691367 Update README oobabooga 2023-01-23 10:05:25 -03:00
  • 085d5cbcb9 Update README oobabooga 2023-01-23 10:03:19 -03:00
  • 8a68930220 Update README oobabooga 2023-01-23 10:02:35 -03:00
  • 947b50e8ea Allow uploading chat history as simple text files oobabooga 2023-01-23 09:45:10 -03:00
  • ebf720585b Mention time and it/s in terminal with streaming off oobabooga 2023-01-22 20:07:19 -03:00
  • d87310ad61 Send last input to the input box when "Remove last" is clicked oobabooga 2023-01-22 19:40:22 -03:00
  • 3c5454c6f8 Update README oobabooga 2023-01-22 19:38:07 -03:00
  • 8f37383dc6 Update README oobabooga 2023-01-22 19:29:24 -03:00
  • 9592e7e618 Update README oobabooga 2023-01-22 19:28:03 -03:00
  • c410fe5ab8 Add simplified colab notebook oobabooga 2023-01-22 19:27:30 -03:00
  • 41806fe40d Update chat screenshot oobabooga 2023-01-22 17:28:51 -03:00
  • d0ea6d5f86 Make the maximum history size in prompt unlimited by default oobabooga 2023-01-22 17:17:35 -03:00
  • 00f3b0996b Warn the user that chat mode becomes a lot slower with text streaming oobabooga 2023-01-22 16:19:11 -03:00
  • c5cc3a3075 Fix bug in "remove last" button oobabooga 2023-01-22 13:10:36 -03:00
  • a410cf1345 Mention that "Chat history size" means "Chat history size in prompt" oobabooga 2023-01-22 03:15:35 -03:00
  • b3e1a874bc Fix bug in loading history oobabooga 2023-01-22 02:32:54 -03:00
  • 62b533f344 Add "regenerate" button to the chat oobabooga 2023-01-22 02:19:58 -03:00
  • 5b50db9dee Mention pygmalion support oobabooga 2023-01-22 01:30:55 -03:00
  • 6b06fca2f1 Mention pygmalion support oobabooga 2023-01-22 01:28:52 -03:00
  • 94ecbc6dff Export history as nicely formatted json oobabooga 2023-01-22 01:24:16 -03:00
  • deacb96c34 Change the pygmalion default context oobabooga 2023-01-22 00:49:59 -03:00
  • 23f94f559a Improve the chat prompt design oobabooga 2023-01-22 00:35:42 -03:00
  • 139e2f0ab4 Redesign the upload/download chat history buttons oobabooga 2023-01-22 00:22:50 -03:00
  • 434d4b128c Add refresh buttons for the model/preset/character menus oobabooga 2023-01-22 00:02:46 -03:00
  • bc664ecf3b Update the installation instructions for low attention span people oobabooga 2023-01-21 22:54:35 -03:00
  • 1e5e56fa2e Better recognize the 4chan model (for #19) oobabooga 2023-01-21 22:13:01 -03:00
  • aadf4e899a Improve example dialogue handling oobabooga 2023-01-21 15:04:13 -03:00
  • f9dbe7e08e Update README oobabooga 2023-01-21 03:05:55 -03:00
  • 27e2d932b0 Don't export include the example dialogue in the export json oobabooga 2023-01-21 02:55:13 -03:00
  • 990ee54ddd Move the example dialogue to the chat history, and keep it hidden. oobabooga 2023-01-21 02:48:06 -03:00
  • 3f2c1e7170 Merge pull request #16 from 81300/model-download oobabooga 2023-01-21 00:43:35 -03:00
  • 1e541d4882 Update download-model.py oobabooga 2023-01-21 00:43:00 -03:00
  • 18ef72d7c0 Update download-model.py oobabooga 2023-01-21 00:38:23 -03:00
  • d7299df01f Rename parameters oobabooga 2023-01-21 00:33:41 -03:00
  • 86a2832f3b Merge pull request #17 from Silver267/main oobabooga 2023-01-21 00:26:19 -03:00
  • 5df03bf0fd Merge branch 'main' into main oobabooga 2023-01-21 00:25:34 -03:00
  • faaafe7c0e Better parameter naming oobabooga 2023-01-20 23:45:16 -03:00
  • f4634e4c32 Update. Silver267 2023-01-20 17:05:43 -05:00
  • fffd49e64e Add --branch option to the model download script 81300 2023-01-20 22:51:56 +02:00
  • c0f2367b54 Minor fix oobabooga 2023-01-20 17:09:25 -03:00
  • 185587a33e Add a history size parameter to the chat oobabooga 2023-01-20 17:03:09 -03:00
  • 4067cecf67 Bump bitsandbytes version oobabooga 2023-01-20 12:51:49 -03:00
  • 8f3deec759 Prevent the history from being altered by the html script oobabooga 2023-01-20 01:59:51 -03:00
  • 78d5a999e6 Improve prompt formatation oobabooga 2023-01-20 01:54:38 -03:00
  • 70ff685736 Encode the input string correctly oobabooga 2023-01-20 00:45:02 -03:00
  • 83584ae2d7 Clearer installation instructions oobabooga 2023-01-20 00:20:35 -03:00
  • b66d18d5a0 Allow presets/characters with '.' in their names oobabooga 2023-01-19 21:56:33 -03:00
  • c4f7a874d5 Fix the regex... oobabooga 2023-01-19 21:16:11 -03:00
  • 8d4170826f Update README oobabooga 2023-01-19 21:08:26 -03:00
  • 11c3214981 Fix some regexes oobabooga 2023-01-19 19:59:34 -03:00
  • e61138bdad Minor fixes oobabooga 2023-01-19 19:04:54 -03:00
  • 2181fca709 Better defaults for chat oobabooga 2023-01-19 18:58:45 -03:00
  • cd7b07239f Add Colab guide oobabooga 2023-01-19 17:58:04 -03:00
  • 83808171d3 Add --share option for Colab oobabooga 2023-01-19 17:31:29 -03:00
  • b054367be2 Update README oobabooga 2023-01-19 16:54:58 -03:00
  • 8d788874d7 Add support for characters oobabooga 2023-01-19 16:46:46 -03:00
  • 3121f4788e Fix uploading chat log in --chat mode oobabooga 2023-01-19 15:05:42 -03:00
  • 849e4c7f90 Better way of finding the generated reply in the output string oobabooga 2023-01-19 14:57:01 -03:00
  • d03b0ad7a8 Implement saving/loading chat logs (#9) oobabooga 2023-01-19 14:03:47 -03:00
  • 39bfea5a22 Add a progress bar oobabooga 2023-01-19 12:20:57 -03:00
  • 5390fc87c8 add auto-devices when disk is used oobabooga 2023-01-19 12:11:44 -03:00
  • 759da435e3 Release 8-bit models memory oobabooga 2023-01-19 12:01:58 -03:00
  • f9faad4cfa Add low VRAM guide oobabooga 2023-01-19 11:25:17 -03:00
  • 7ace04864a Implement sending layers to disk with --disk (#10) oobabooga 2023-01-19 11:09:24 -03:00
  • 1ce95ee817 Mention text streaming oobabooga 2023-01-19 10:46:41 -03:00
  • 93fa9bbe01 Clean up the streaming implementation oobabooga 2023-01-19 10:43:05 -03:00
  • c90310e40e Small simplification oobabooga 2023-01-19 00:41:57 -03:00
  • 99536ef5bf Add no-stream option oobabooga 2023-01-18 23:56:42 -03:00
  • 116299b3ad Manual eos_token implementation oobabooga 2023-01-18 22:57:39 -03:00
  • 3cb30bed0a Add a "stop" button oobabooga 2023-01-18 22:44:47 -03:00
  • 8f27d33034 Fix another bug oobabooga 2023-01-18 22:08:23 -03:00
  • 6c7f187586 Minor change oobabooga 2023-01-18 21:59:23 -03:00
  • b3cba0b330 Bug oobabooga 2023-01-18 21:54:44 -03:00
  • df2e910421 Stop generating in chat mode when \nYou: is generated oobabooga 2023-01-18 21:51:18 -03:00
  • 022960a087 This is the correct way of sampling 1 token at a time oobabooga 2023-01-18 21:37:21 -03:00
  • 0f01a3b1fa Implement text streaming (#10) oobabooga 2023-01-18 19:06:50 -03:00
  • ca13acdfa0 Ensure that the chat prompt will always contain < 2048 tokens oobabooga 2023-01-17 20:16:23 -03:00
  • 6456777b09 Clean things up oobabooga 2023-01-16 16:35:45 -03:00
  • 3a99b2b030 Change a truncation parameter oobabooga 2023-01-16 13:53:30 -03:00
  • 54bf55372b Truncate prompts to 2048 characters oobabooga 2023-01-16 13:43:23 -03:00
  • 99d24bdbfe Update README.md oobabooga 2023-01-16 11:23:45 -03:00
  • ed1d2c0d38 Update README.md oobabooga 2023-01-16 11:19:23 -03:00