From 0ff38c994e152901863ec67ead0e56a7b03710f1 Mon Sep 17 00:00:00 2001
From: oobabooga <112222186+oobabooga@users.noreply.github.com>
Date: Thu, 11 May 2023 09:58:58 -0300
Subject: [PATCH] Update README.md

---
 README.md | 13 ++++++-------
 1 file changed, 6 insertions(+), 7 deletions(-)

diff --git a/README.md b/README.md
index 1b71b33..40fb62a 100644
--- a/README.md
+++ b/README.md
@@ -15,24 +15,23 @@ Its goal is to become the [AUTOMATIC1111/stable-diffusion-webui](https://github.
 * Chat mode for conversation and role-playing
 * Instruct mode compatible with various formats, including Alpaca, Vicuna, Open Assistant, Dolly, Koala, ChatGLM, MOSS, RWKV-Raven, Galactica, StableLM, WizardLM, Baize, MPT, and INCITE
 * [Multimodal pipelines, including LLaVA and MiniGPT-4](https://github.com/oobabooga/text-generation-webui/tree/main/extensions/multimodal)
-* Nice HTML output for GPT-4chan
 * Markdown output for [GALACTICA](https://github.com/paperswithcode/galai), including LaTeX rendering
+* Nice HTML output for GPT-4chan
 * [Custom chat characters](docs/Chat-mode.md)
 * Advanced chat features (send images, get audio responses with TTS)
 * Very efficient text streaming
 * Parameter presets
+* [LLaMA model](docs/LLaMA-model.md)
+* [4-bit GPTQ mode](docs/GPTQ-models-(4-bit-mode).md)
+* [LoRA (loading and training)](docs/Using-LoRAs.md)
+* [llama.cpp](docs/llama.cpp-models.md)
+* [RWKV model](docs/RWKV-model.md)
 * 8-bit mode
 * Layers splitting across GPU(s), CPU, and disk
 * CPU mode
 * [FlexGen](docs/FlexGen.md)
 * [DeepSpeed ZeRO-3](docs/DeepSpeed.md)
 * API [with](https://github.com/oobabooga/text-generation-webui/blob/main/api-example-stream.py) streaming and [without](https://github.com/oobabooga/text-generation-webui/blob/main/api-example.py) streaming
-* [LLaMA model](docs/LLaMA-model.md)
-* [4-bit GPTQ mode](docs/GPTQ-models-(4-bit-mode).md)
-* [llama.cpp](docs/llama.cpp-models.md)
-* [RWKV model](docs/RWKV-model.md)
-* [LoRA (loading and training)](docs/Using-LoRAs.md)
-* Softprompts
 * [Extensions](docs/Extensions.md) - see the [user extensions list](https://github.com/oobabooga/text-generation-webui-extensions)
 
 ## Installation