update top_k
README.md CHANGED

@@ -175,7 +175,7 @@ We recommend using [SGLang](https://docs.sglang.ai/) to serve MiniMax-M2. SGLang
 We recommend using [vLLM](https://docs.vllm.ai/en/stable/) to serve MiniMax-M2. vLLM provides efficient day-0 support of MiniMax-M2 model, check https://docs.vllm.ai/projects/recipes/en/latest/MiniMax/MiniMax-M2.html for latest deployment guide. We also provide our [vLLM Deployment Guide](https://huggingface.co/MiniMaxAI/MiniMax-M2/blob/main/docs/vllm_deploy_guide.md).
 
 ### Inference Parameters
 
-We recommend using the following parameters for best performance: `temperature=1.0`, `top_p = 0.95`, `top_k =
+We recommend using the following parameters for best performance: `temperature=1.0`, `top_p = 0.95`, `top_k = 40`.
 
 ## Tool Calling Guide
 
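The recommended parameters from the updated line can be supplied per request when the model is served behind vLLM's OpenAI-compatible endpoint. A minimal sketch of such a request body follows; the model id and message content are assumptions for illustration, and the payload is only constructed here, not sent:

```python
import json

# Recommended MiniMax-M2 sampling parameters, placed in the body of a
# /v1/chat/completions request to a vLLM OpenAI-compatible server.
payload = {
    "model": "MiniMaxAI/MiniMax-M2",  # assumed model id
    "messages": [{"role": "user", "content": "Hello"}],  # placeholder prompt
    "temperature": 1.0,
    "top_p": 0.95,
    "top_k": 40,  # vLLM accepts top_k as an extra sampling parameter
}

# Serialize the body as it would be posted to the server.
body = json.dumps(payload)
```

Note that `top_k` is not part of the upstream OpenAI API; vLLM honors it as an additional sampling parameter in the request body.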