iva-codeint-swift-small is a GPT-2 model (small version, 239.4M parameters) trained from scratch for the text-to-code task, tailored to the Swift language used in native mobile (iOS) development.
## Usage
```python
from transformers import pipeline

# Load the model as a text-generation pipeline and complete a Swift prompt.
pipe = pipeline("text-generation", model="mvasiliniuc/iva-codeint-swift-small")
outputs = pipe("func triggerNSNotification")
```
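The pipeline accepts the standard generation parameters; the values below are illustrative choices, not settings from the model card:

```python
# Illustrative generation settings: cap the completion length and
# sample at a low temperature so the generated Swift stays focused.
outputs = pipe(
    "func triggerNSNotification",
    max_new_tokens=64,
    do_sample=True,
    temperature=0.2,
)
print(outputs[0]["generated_text"])
```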
## Inference
```python
import pprint
import requests

API_URL = "https://api-inference.huggingface.co/models/mvasiliniuc/iva-codeint-swift-small"
headers = {"Authorization": "Bearer <key>"}

def query(payload):
    # Send the prompt to the hosted Inference API and return the parsed JSON.
    response = requests.post(API_URL, headers=headers, json=payload)
    return response.json()

output = query({
    "inputs": """
/*
A function that gets the current device operating system.
*/
"""
})
pprint.pprint(output, compact=True)
```
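While a hosted model is still loading, the Inference API answers with an error payload that carries an `estimated_time` field instead of generations. The retry loop below is a sketch of one way to handle that, reusing the `query` helper above; it is not part of the original card.

```python
import time

def query_with_retry(payload, max_attempts=5):
    # Retry while the hosted model is loading; the API signals this
    # with an "estimated_time" field in the JSON response.
    result = query(payload)
    for _ in range(max_attempts - 1):
        if isinstance(result, dict) and "estimated_time" in result:
            time.sleep(result["estimated_time"])
            result = query(payload)
        else:
            break
    return result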
## Training
| Config | Value | 
|---|---|
| seq length | 1024 | 
| weight decay | 0.1 | 
| learning rate | 0.0005 | 
| max eval steps | -1 | 
| shuffle buffer | 10000 | 
| max train steps | 150000 | 
| mixed precision | fp16 | 
| num warmup steps | 2000 | 
| train batch size | 5 | 
| valid batch size | 5 | 
| lr scheduler type | cosine | 
| save checkpoint steps | 15000 | 
| gradient checkpointing | false | 
| gradient accumulation steps | 1 | 
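The card does not include the training script; the snippet below is a minimal sketch of how the optimizer and learning-rate schedule implied by the table could be wired up with standard `transformers` utilities. Everything beyond the table values (the config, the optimizer choice) is an assumption.

```python
from torch.optim import AdamW
from transformers import GPT2Config, GPT2LMHeadModel, get_scheduler

# Hypothetical setup: a GPT-2 initialised from scratch, with hyperparameters
# taken from the table above. The exact config of the 239.4M-parameter
# variant is not given in the card, so GPT2Config() is a placeholder.
model = GPT2LMHeadModel(GPT2Config())
optimizer = AdamW(model.parameters(), lr=5e-4, weight_decay=0.1)
lr_scheduler = get_scheduler(
    "cosine",                      # lr scheduler type
    optimizer=optimizer,
    num_warmup_steps=2_000,        # num warmup steps
    num_training_steps=150_000,    # max train steps
)
```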
## Resources
Resources used for research: