AnujithM committed (verified)
Commit e37e26f · Parent(s): b018b91

Update README.md

Files changed (1):
  1. README.md +11 -14
README.md CHANGED
@@ -4,23 +4,20 @@ emoji: 🌦️
 colorFrom: blue
 colorTo: green
 sdk: gradio
-sdk_version: 4.44.0
+sdk_version: 4.44.1
 app_file: app.py
 pinned: false
 ---
 
-# ClimaMind — K2-Think + Live Climate Data (Gradio on Hugging Face Spaces)
+# ClimaMind — K2-Think + Live Climate Data (Gradio on Spaces)
 
-## Setup
-1) Create a new Space with SDK = **Gradio**.
-2) Upload `app.py` and `requirements.txt` (this README is optional).
-3) In **Settings → Variables / secrets**, set:
-- `PROVIDER` = `hf_model` (recommended), `local`, or `stub`
-- `MODEL_ID` = `MBZUAI-IFM/K2-Think-SFT` (default) or `LLM360/K2-Think`
-- `HF_TOKEN` = your HF token (Read + Inference)
-4) If choosing `local`, switch the Space hardware to **GPU**.
+This Space hosts your hackathon demo:
+- Live weather/UV from Open-Meteo + PM2.5 from OpenAQ (keyless).
+- Reasoning via K2-Think (HF Inference API by default); strict JSON output → Answer, Why-trace, Risk.
+- Gradio UI with a shareable link.
 
-## Notes
-- Uses Open-Meteo + OpenAQ (keyless).
-- If the model returns non-JSON, you’ll see a friendly fallback.
-- If rate-limited, temporarily set `PROVIDER=stub` for the demo.
+## Configure (Settings → Variables / secrets)
+- `PROVIDER` = `hf_model` (recommended), `local`, or `stub`
+- `MODEL_ID` = `MBZUAI-IFM/K2-Think-SFT` (default) or `LLM360/K2-Think`
+- `HF_TOKEN` = your HF token (Read + Inference)
+- Optional: `HF_HUB_DISABLE_TELEMETRY` = `1`
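
For reference, a minimal sketch of how `app.py` might read the variables configured above and handle the strict-JSON / friendly-fallback behaviour the README describes. This is an illustration under assumptions, not the actual app code: the names `STUB_ANSWER`, `call_model`, and `parse_strict_json`, the JSON key names, and the fallback text are all hypothetical, and only the `hf_model` and `stub` provider paths are shown.

```python
# Illustrative sketch only (not the Space's real app.py).
# Assumes huggingface_hub is installed and the Space variables above are set.
import json
import os

from huggingface_hub import InferenceClient

PROVIDER = os.getenv("PROVIDER", "hf_model")                 # "hf_model", "local", or "stub"
MODEL_ID = os.getenv("MODEL_ID", "MBZUAI-IFM/K2-Think-SFT")
HF_TOKEN = os.getenv("HF_TOKEN")                             # needed for the hf_model path

# Canned output used when PROVIDER=stub (e.g. during rate limiting).
STUB_ANSWER = {"answer": "Stub response (PROVIDER=stub).", "why_trace": [], "risk": "n/a"}


def call_model(prompt: str) -> str:
    """Return raw model text; only the hf_model and stub paths are sketched here."""
    if PROVIDER == "stub":
        return json.dumps(STUB_ANSWER)
    client = InferenceClient(model=MODEL_ID, token=HF_TOKEN)
    return client.text_generation(prompt, max_new_tokens=512)


def parse_strict_json(raw: str) -> dict:
    """Parse the model's JSON answer; fall back to a friendly message if it is not valid JSON."""
    try:
        return json.loads(raw)
    except (json.JSONDecodeError, TypeError):
        return {"answer": "The model returned non-JSON output; please retry.",
                "why_trace": [], "risk": "n/a"}
```

The `local` provider path (GPU hardware) and the Open-Meteo / OpenAQ fetches are omitted from this sketch.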