Add/update the quantized ONNX model files and README.md for Transformers.js v3 (#2)
- Add/update the quantized ONNX model files and README.md for Transformers.js v3 (abe478e48edf77a832fe63487f980572bedea256)
Co-authored-by: Yuichiro Tachibana <whitphx@users.noreply.huggingface.co>
README.md CHANGED

````diff
@@ -7,7 +7,7 @@ tags:
 
 # text-davinci-003 Tokenizer
 
-A 🤗-compatible version of the **text-davinci-003 tokenizer** (adapted from [openai/tiktoken](https://github.com/openai/tiktoken)). This means it can be used with Hugging Face libraries including [Transformers](https://github.com/huggingface/transformers), [Tokenizers](https://github.com/huggingface/tokenizers), and [Transformers.js](https://github.com/
+A 🤗-compatible version of the **text-davinci-003 tokenizer** (adapted from [openai/tiktoken](https://github.com/openai/tiktoken)). This means it can be used with Hugging Face libraries including [Transformers](https://github.com/huggingface/transformers), [Tokenizers](https://github.com/huggingface/tokenizers), and [Transformers.js](https://github.com/huggingface/transformers.js).
 
 ## Example usage:
 
@@ -21,8 +21,8 @@ assert tokenizer.encode('hello world') == [31373, 995]
 
 ### Transformers.js
 
 ```js
-import { AutoTokenizer } from '@
+import { AutoTokenizer } from '@huggingface/transformers';
 
 const tokenizer = await AutoTokenizer.from_pretrained('Xenova/text-davinci-003');
 const tokens = tokenizer.encode('hello world'); // [31373, 995]
-```
+```
````
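For reference, a minimal runnable version of the updated Transformers.js example, assembled from the added lines in the diff above. The trailing `console.log` is an illustrative addition; everything else is taken verbatim from the README:

```js
// Transformers.js v3 import path, as introduced by this change.
import { AutoTokenizer } from '@huggingface/transformers';

// Load the 🤗-compatible text-davinci-003 tokenizer from the Hub.
const tokenizer = await AutoTokenizer.from_pretrained('Xenova/text-davinci-003');

// Encode a string into token ids.
const tokens = tokenizer.encode('hello world'); // [31373, 995]
console.log(tokens);
```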

