Commit 1520b93 by ruslanmv (1 parent: 6ec7105)

First commit
Files changed (1): README.md (+3 -3)
README.md CHANGED

```diff
@@ -7,20 +7,20 @@ sdk: docker
 app_port: 23333
 ---
 
-## HF-LLM-API
+## HF-LLM-API-COLLECTION
 Huggingface LLM Inference API in OpenAI message format.
 
 Project link: https://github.com/ruslanmv/hf-llm-api-collection
 
 ## Features
 
-- Available Models (2024/01/22): [#5](https://github.com/Hansimov/hf-llm-api/issues/5)
+- Available Models (2024/01/22): [#5](https://github.com/ruslanmv/hf-llm-api-collection/issues/5)
   - `mistral-7b`, `mixtral-8x7b`, `nous-mixtral-8x7b`
 - Adaptive prompt templates for different models
 - Support OpenAI API format
 - Enable api endpoint via official `openai-python` package
 - Support both stream and no-stream response
-- Support API Key via both HTTP auth header and env varible [#4](https://github.com/Hansimov/hf-llm-api/issues/4)
+- Support API Key via both HTTP auth header and env varible [#4](https://github.com/ruslanmv/hf-llm-api-collection/issues/4)
 - Docker deployment
 
 ## Run API service
```
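The feature list in the diff above advertises an OpenAI-compatible API that can be consumed with the official `openai-python` package. As a rough illustration, here is a minimal client sketch; the `/v1` base path on the Space's `app_port` 23333, the placeholder API key, and the local host are assumptions for this example, and `mixtral-8x7b` is simply one of the models listed above.

```python
# Minimal sketch of calling the service with the official openai-python package.
# Assumptions: the API is exposed under /v1 on app_port 23333, and the key below
# is a placeholder (the README also mentions supplying the key via an env variable).
from openai import OpenAI

client = OpenAI(
    base_url="http://127.0.0.1:23333/v1",  # assumed local endpoint for the Space
    api_key="sk-placeholder",              # sent as the HTTP auth header
)

# Non-streaming chat completion against one of the listed models.
response = client.chat.completions.create(
    model="mixtral-8x7b",
    messages=[{"role": "user", "content": "Hello, who are you?"}],
    stream=False,
)
print(response.choices[0].message.content)
```

For the streaming mode mentioned in the features, the usual `openai-python` pattern is to pass `stream=True` and iterate over the returned chunks instead of reading a single response object.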