Update README.md
README.md CHANGED
@@ -149,6 +149,30 @@ InverseCoder is a series of code LLMs instruction-tuned by generating data from
| 7B | [codellama/CodeLlama-7b-Python-hf](https://huggingface.co/codellama/CodeLlama-7b-Python-hf) | [wyt2000/InverseCoder-CL-7B](https://huggingface.co/wyt2000/InverseCoder-CL-7B) | [wyt2000/InverseCoder-CL-7B-Evol-Instruct-90K](https://huggingface.co/datasets/wyt2000/InverseCoder-CL-7B-Evol-Instruct-90K) |
| 6.7B | [deepseek-ai/deepseek-coder-6.7b-base](https://huggingface.co/deepseek-ai/deepseek-coder-6.7b-base) | [wyt2000/InverseCoder-DS-6.7B](https://huggingface.co/wyt2000/InverseCoder-DS-6.7B) | [wyt2000/InverseCoder-DS-6.7B-Evol-Instruct-90K](https://huggingface.co/datasets/wyt2000/InverseCoder-DS-6.7B-Evol-Instruct-90K) |

+## Usage
+
+Similar to [Magicoder-S-DS-6.7B](https://huggingface.co/ise-uiuc/Magicoder-S-DS-6.7B/), use the code below to get started with the model. Make sure you have installed the [transformers](https://huggingface.co/docs/transformers/index) library.
+
+```python
+from transformers import pipeline
+import torch
+INVERSECODER_PROMPT = """You are an exceptionally intelligent coding assistant that consistently delivers accurate and reliable responses to user instructions.
+@@ Instruction
+{instruction}
+@@ Response
+"""
+instruction = "<Your code instruction here>"
+prompt = INVERSECODER_PROMPT.format(instruction=instruction)
+generator = pipeline(
+    model="wyt2000/InverseCoder-CL-7B",
+    task="text-generation",
+    torch_dtype=torch.bfloat16,
+    device_map="auto",
+)
+result = generator(prompt, max_length=1024, num_return_sequences=1, temperature=0.0)
+print(result[0]["generated_text"])
+```
+
## Paper
**Arxiv:** <https://arxiv.org/abs/2407.05700>

@@ -164,4 +188,11 @@ Please cite the paper if you use the models or datasets from InverseCoder.
primaryClass={cs.CL},
url={https://arxiv.org/abs/2407.05700},
}
-```
+```
+
+## Acknowledgements
+
+* [Magicoder](https://github.com/ise-uiuc/magicoder): Training code, original dataset and data decontamination
+* [DeepSeek-Coder](https://github.com/deepseek-ai/DeepSeek-Coder): Base model for InverseCoder-DS
+* [CodeLlama](https://ai.meta.com/research/publications/code-llama-open-foundation-models-for-code/): Base model for InverseCoder-CL
+* [AutoMathText](https://github.com/yifanzhang-pro/AutoMathText): Self-evaluation and Data Selection method
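The Evol-Instruct-90K datasets referenced in the model table above are hosted on the Hugging Face Hub, so they can be inspected with the [datasets](https://huggingface.co/docs/datasets/index) library. The sketch below assumes only the dataset name shown in the table; split and column names are printed rather than assumed, since the README does not spell them out.

```python
from datasets import load_dataset

# Load one of the instruction-tuning datasets from the table above; the name is
# copied verbatim from the README, everything else is printed rather than assumed.
ds = load_dataset("wyt2000/InverseCoder-DS-6.7B-Evol-Instruct-90K")
print(ds)                              # available splits and their columns

first_split = next(iter(ds.values()))  # whichever split is listed first
print(first_split[0])                  # one raw record
```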
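In the Usage snippet, `max_length=1024` counts the prompt tokens as well as the completion, and the `text-generation` pipeline echoes the prompt back at the start of `generated_text`. A minimal variant that returns only the model's response is sketched below, reusing the same prompt template and model name; `return_full_text=False` and `max_new_tokens` are standard pipeline/generation arguments rather than anything this README prescribes, and the example instruction is made up.

```python
from transformers import pipeline
import torch

# Same prompt template as in the Usage section above.
INVERSECODER_PROMPT = """You are an exceptionally intelligent coding assistant that consistently delivers accurate and reliable responses to user instructions.
@@ Instruction
{instruction}
@@ Response
"""

generator = pipeline(
    model="wyt2000/InverseCoder-CL-7B",
    task="text-generation",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# A made-up instruction, just to show the call shape.
instruction = "Write a Python function that checks whether a string is a palindrome."
prompt = INVERSECODER_PROMPT.format(instruction=instruction)

# return_full_text=False drops the echoed prompt; max_new_tokens bounds only the
# completion length, unlike max_length, which also counts the prompt tokens.
result = generator(prompt, max_new_tokens=512, return_full_text=False)
print(result[0]["generated_text"].strip())
```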