update requirements for transformers version

Files changed:
- README.md (+2 -0)
- config.json (+1 -1)
README.md
CHANGED

@@ -80,6 +80,8 @@ MiniCPM 4.1 can be used with following frameworks: Huggingface Transformers, SGL
 
 
 ### Inference with Transformers
+MiniCPM4.1-8B requires `transformers>=4.56`.
+
 ```python
 from transformers import AutoModelForCausalLM, AutoTokenizer
 import torch
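The README change above pins a minimum library version (`transformers>=4.56`). As a side note, here is a minimal sketch of how such a floor could be checked at runtime; the helper name and the segment-wise comparison are illustrative, not part of the commit (production code would more likely use `packaging.version.parse`):

```python
def meets_minimum(installed: str, minimum: str = "4.56") -> bool:
    """Compare dotted version strings numerically, segment by segment."""
    def numeric(v: str) -> list[int]:
        # Keep only purely numeric segments ("4.56.0.dev0" -> [4, 56, 0]).
        return [int(p) for p in v.split(".") if p.isdigit()]
    return numeric(installed) >= numeric(minimum)

# The commit's config.json records 4.56.1, which satisfies the >=4.56 floor.
print(meets_minimum("4.56.1"))   # True
print(meets_minimum("4.55.2"))   # False
```

List comparison in Python is lexicographic over the numeric segments, so `[4, 56, 1] >= [4, 56]` holds while `[4, 55, 2]` fails at the second segment.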
config.json
CHANGED

@@ -30,7 +30,7 @@
     "original_max_position_embeddings": 65536
   },
   "torch_dtype": "bfloat16",
-  "transformers_version": "4.
+  "transformers_version": "4.56.1",
   "use_cache": true,
   "vocab_size": 73448,
   "rope_theta": 10000.0,
