Commit graph

  • edc0262889 Minor file uploading fixes oobabooga 2023-02-17 10:27:41 -03:00
  • 243244eeec Attempt at fixing greyed out files on iphone oobabooga 2023-02-17 10:17:15 -03:00
  • a226f4cddb No change, so reverting oobabooga 2023-02-17 09:27:17 -03:00
  • 40cb9f63f6 Try making Colab happy (tensorflow warnings) oobabooga 2023-02-17 09:23:11 -03:00
  • 71c2764516 Fix the API docs in chat mode oobabooga 2023-02-17 01:56:51 -03:00
  • 33ad21c4f2 Make the profile pictures a bit larger oobabooga 2023-02-17 00:35:17 -03:00
  • c4e87c109e Include the bot's image as base64 oobabooga 2023-02-17 00:24:27 -03:00
  • cb226247e8 Make it possible to disable the TTS from within the interface oobabooga 2023-02-16 23:38:27 -03:00
  • fd8070b960 Give some default options in the download script oobabooga 2023-02-16 23:04:13 -03:00
  • aeddf902ec Make the refresh button prettier oobabooga 2023-02-16 21:55:20 -03:00
  • 21512e2790 Make the Stop button work more reliably oobabooga 2023-02-16 21:21:45 -03:00
  • 348acdf626 Mention deepspeed in the README oobabooga 2023-02-16 17:29:48 -03:00
  • bde4cd402a Change the default TTS voice oobabooga 2023-02-16 16:07:38 -03:00
  • 5fb99371ba Add .gitignore oobabooga 2023-02-16 13:35:54 -03:00
  • 08805b3374 Force "You" in impersonate too oobabooga 2023-02-16 13:24:13 -03:00
  • d7db04403f Fix --chat chatbox height oobabooga 2023-02-16 12:45:05 -03:00
  • 589069e105 Don't regenerate if no message has been sent oobabooga 2023-02-16 12:32:35 -03:00
  • 6160a03984 Merge branch 'main' of github.com:oobabooga/text-generation-webui oobabooga 2023-02-16 12:16:26 -03:00
  • 405dfbf57c Force your name to be "You" for pygmalion (properly) oobabooga 2023-02-16 12:16:12 -03:00
  • 20484f26f3 Trying to make character bias more consistent oobabooga 2023-02-15 23:38:52 -03:00
  • 7bd2ae05bf Force your name to be "You" for pygmalion oobabooga 2023-02-15 21:32:53 -03:00
  • 3746d72853 More style fixes oobabooga 2023-02-15 21:13:12 -03:00
  • 6f213b8c14 Style fix oobabooga 2023-02-15 20:58:17 -03:00
  • ccf10db60f Move stuff into tabs in chat mode oobabooga 2023-02-15 20:55:32 -03:00
  • a55e8836f6 Bump gradio version oobabooga 2023-02-15 20:20:56 -03:00
  • 0e89ff4b13 Clear the persistent history after clicking on "Clear history" oobabooga 2023-02-15 16:49:52 -03:00
  • 05b53e4626 Update README oobabooga 2023-02-15 14:43:34 -03:00
  • ed73d00bd5 Update README oobabooga 2023-02-15 14:43:13 -03:00
  • 30fcb26737 Update README oobabooga 2023-02-15 14:42:41 -03:00
  • b3bcd2881d Implement regenerate/impersonate the proper way (fixes #78) oobabooga 2023-02-15 14:39:26 -03:00
  • 5ee9283cae Mention BLIP oobabooga 2023-02-15 13:53:38 -03:00
  • 8d3b3959e7 Document --picture option oobabooga 2023-02-15 13:50:18 -03:00
  • 2eea0f4edb Minor change oobabooga 2023-02-15 12:58:11 -03:00
  • 3c31fa7079 Simplifications oobabooga 2023-02-15 12:46:11 -03:00
  • 80fbc584f7 Readability oobabooga 2023-02-15 11:38:44 -03:00
  • b397bea387 Make chat history persistent oobabooga 2023-02-15 11:30:38 -03:00
  • 7be372829d Set chat prompt size in tokens oobabooga 2023-02-15 10:18:50 -03:00
  • 1622059179 Move BLIP to the CPU oobabooga 2023-02-15 00:03:19 -03:00
  • d4d90a8000 Merge pull request #76 from SillyLossy/main oobabooga 2023-02-14 23:57:44 -03:00
  • 8c3ef58e00 Use BLIP directly + some simplifications oobabooga 2023-02-14 23:55:46 -03:00
  • a7d98f494a Use BLIP to send a picture to model SillyLossy 2023-02-15 01:38:21 +02:00
  • 79d3a524f2 Add a file oobabooga 2023-02-14 15:18:05 -03:00
  • f6bf74dcd5 Add Silero TTS extension oobabooga 2023-02-14 15:06:06 -03:00
  • 01e5772302 Update README.md oobabooga 2023-02-14 13:06:26 -03:00
  • d910d435cd Consider the softprompt in the maximum prompt length calculation oobabooga 2023-02-14 12:06:47 -03:00
  • 8b3bb512ef Minor bug fix (soft prompt was being loaded twice) oobabooga 2023-02-13 23:34:04 -03:00
  • 56bbc996a4 Minor CSS change for readability oobabooga 2023-02-13 23:01:14 -03:00
  • 210c918199 Update README.md oobabooga 2023-02-13 21:49:19 -03:00
  • 2fe9d7f372 Merge branch 'main' of github.com:oobabooga/text-generation-webui oobabooga 2023-02-13 18:48:46 -03:00
  • 7739a29524 Some simplifications oobabooga 2023-02-13 18:48:32 -03:00
  • b7ddcab53a Update README.md oobabooga 2023-02-13 15:52:49 -03:00
  • 3277b751f5 Add softprompt support (for real this time) oobabooga 2023-02-13 15:25:16 -03:00
  • aa1177ff15 Send last internal reply to input rather than visible oobabooga 2023-02-13 03:29:23 -03:00
  • 61aed97439 Slightly increase a margin oobabooga 2023-02-12 17:38:54 -03:00
  • 2c3abcf57a Add support for rosey/chip/joi instruct models oobabooga 2023-02-12 09:46:34 -03:00
  • 7ef7bba6e6 Add progress bar for model loading oobabooga 2023-02-12 09:36:27 -03:00
  • 939e9d00a2 Update README.md oobabooga 2023-02-12 00:47:03 -03:00
  • bf9dd8f8ee Add --text-only option to the download script oobabooga 2023-02-12 00:42:56 -03:00
  • 42cc307409 Update README.md oobabooga 2023-02-12 00:34:55 -03:00
  • 66862203fc Only download safetensors if both pytorch and safetensors are present oobabooga 2023-02-12 00:06:22 -03:00
  • 5d3f15b915 Use the CPU if no GPU is detected oobabooga 2023-02-11 23:17:06 -03:00
  • 337290777b Rename example extension to "softprompt" oobabooga 2023-02-11 17:17:10 -03:00
  • b3c4657c47 Remove commas from preset files oobabooga 2023-02-11 14:54:29 -03:00
  • 144857acfe Update README oobabooga 2023-02-11 14:49:11 -03:00
  • 0dd1409f24 Add penalty_alpha parameter (contrastive search) oobabooga 2023-02-11 14:48:12 -03:00
  • 8aafb55693 1-click installer now also works for AMD GPUs oobabooga 2023-02-11 14:24:47 -03:00
  • 7eed553337 Merge branch 'main' of github.com:oobabooga/text-generation-webui oobabooga 2023-02-11 08:00:29 -03:00
  • 2ed0386d87 Fix replace last reply in --chat mode (for #69) oobabooga 2023-02-11 07:59:54 -03:00
  • 1e97cb9570 Merge pull request #68 from Spencer-Dawson/patch-1 oobabooga 2023-02-11 07:56:30 -03:00
  • 1176d64b13 Update README.md oobabooga 2023-02-11 07:56:12 -03:00
  • c5324d653b re-added missed README changes Spencer-Dawson 2023-02-11 00:13:06 -07:00
  • cf89ef1c74 Update README.md oobabooga 2023-02-10 21:46:29 -03:00
  • 8782ac1911 Update README.md oobabooga 2023-02-10 17:10:27 -03:00
  • 7d7cc37560 Add Linux 1-click installer oobabooga 2023-02-10 17:09:53 -03:00
  • 316e07f06a auto-assign gpu memory with --auto-devices alone oobabooga 2023-02-10 16:36:06 -03:00
  • 76d3d7ddb3 Reorder the imports here too oobabooga 2023-02-10 15:57:55 -03:00
  • 219366342b Sort imports according to PEP8 (based on #67) oobabooga 2023-02-10 15:40:03 -03:00
  • 96d56d4f3c Turn the example script into a soft prompt script oobabooga 2023-02-10 15:24:26 -03:00
  • e0b164feab Merge pull request #66 from 81300/bf16 oobabooga 2023-02-09 15:13:17 -03:00
  • 20dbef9623 Extend bfloat16 support 81300 2023-02-09 20:00:03 +02:00
  • 991de5ed40 Update README.md oobabooga 2023-02-09 14:36:47 -03:00
  • 04d3d0aee6 Add 1-click windows installer (for #45) oobabooga 2023-02-09 13:27:30 -03:00
  • a21620fc59 Update README oobabooga 2023-02-08 01:17:50 -03:00
  • cadd100405 min_length has to be 0 when streaming is on oobabooga 2023-02-08 00:23:35 -03:00
  • 6be571cff7 Better variable names oobabooga 2023-02-08 00:19:20 -03:00
  • fc0493d885 Add credits oobabooga 2023-02-08 00:09:41 -03:00
  • 58b07cca81 length_penalty can be negative (apparently) oobabooga 2023-02-07 23:33:02 -03:00
  • d038963193 Rename a variable (for #59) oobabooga 2023-02-07 23:26:02 -03:00
  • 7e4c25691d Repetition penalty has to be < 5 oobabooga 2023-02-07 23:23:39 -03:00
  • 1c30e1b49a Add even more sliders oobabooga 2023-02-07 23:11:04 -03:00
  • 24dc705eca Add lots of sliders oobabooga 2023-02-07 22:08:21 -03:00
  • 53af062fa5 Update README.md oobabooga 2023-02-05 23:14:25 -03:00
  • a2519ede90 Merge pull request #57 from martinnj/fix/tokenize-dialogue-regex-backref oobabooga 2023-02-05 14:09:19 -03:00
  • 06a4664805 Fix a regex issue in tokenize_dialogue. Martin J 2023-02-05 07:42:57 +01:00
  • 2fe235738e Reorganize chat buttons oobabooga 2023-02-04 22:53:42 -03:00
  • 2207d44986 Windows doesn't like : in filenames oobabooga 2023-02-04 20:07:39 -03:00
  • b4fc8dfa8f Add safetensors version oobabooga 2023-02-04 18:58:17 -03:00
  • 65266f3349 Fix loading official colab chat logs oobabooga 2023-02-03 22:43:02 -03:00
  • 54a6a74b7d Merge branch 'main' of github.com:oobabooga/text-generation-webui oobabooga 2023-02-03 20:07:38 -03:00
  • 3dbebe30b1 Remove deepspeed requirement (only works on Linux for now) oobabooga 2023-02-03 20:07:13 -03:00