---
title: Medilang Tech Backend
emoji: 🩺
colorFrom: blue
colorTo: green
sdk: docker
pinned: false
license: mit
---

Carehelp (medicare-backend)

A FastAPI backend that provides AI-powered medical assistance tailored for Cameroon, with multilingual support, conversation history, emergency moderation and triage, and pluggable AI backends: Hugging Face Inference (default) or local providers (Ollama, LM Studio). OpenAI is no longer required.

Quick start

1) Requirements
- Python 3.11+
- Create a .env file (see variables below)

2) Install
pip install -r requirements.txt

3) Run
uvicorn main:app --reload

Environment variables (.env)

- APP_NAME=Carehelp
- ENVIRONMENT=development
- PORT=8000
- CORS_ALLOW_ORIGINS=*
- JWT_SECRET=change_this_secret
- JWT_ALGORITHM=HS256
- ACCESS_TOKEN_EXPIRE_MINUTES=43200
- AI_PROVIDER=hf            # hf | ollama | lmstudio (default: hf)

# Hugging Face
- HF_API_TOKEN=             # required for private models or higher rate limits
- HF_TEXT_MODEL=meta-llama/Meta-Llama-3-8B-Instruct
- HF_ASR_MODEL=distil-whisper/distil-large-v3
- HF_VISION_CAPTION_MODEL=Salesforce/blip-image-captioning-large

# Local: Ollama
- OLLAMA_BASE_URL=http://localhost:11434
- OLLAMA_MODEL=llama3.1:8b
- OLLAMA_VISION_MODEL=llava:latest

# Local: LM Studio (OpenAI-compatible)
- LMSTUDIO_BASE_URL=http://localhost:1234/v1
- LMSTUDIO_MODEL=local-model

# Data / RAG
- PATIENT_DATA_PATH=../patient_records.json
- CAMEROON_DATA_CSV=clinical_summaries.csv  # primary RAG context (see Notes)

# Translation (Hugging Face)
- HF_TRANSLATION_MODEL=facebook/nllb-200-distilled-600M

# Supabase (recommended)

- SUPABASE_URL=https://your-project-ref.supabase.co
- SUPABASE_ANON_KEY=eyJhbGciOi...
- SUPABASE_SERVICE_ROLE_KEY=eyJhbGciOi...  # keep secret; server-side only
- SUPABASE_DB_PASSWORD=your-postgres-password  # from Project Settings → Database → Connection

Database setup on Supabase (step-by-step)

1) Create your Supabase project
   - Go to `https://supabase.com` → New Project.
   - Set a strong database password (you will use it as SUPABASE_DB_PASSWORD).
   - Wait for the project to be provisioned.

2) Configure environment variables
   - In Supabase: Project Settings → API, copy `Project URL` and `anon` and `service_role` keys.
   - In your backend `.env`, set SUPABASE_URL, SUPABASE_ANON_KEY, SUPABASE_SERVICE_ROLE_KEY, SUPABASE_DB_PASSWORD (see above).

3) Full SQL schema and secure RLS policies
   - Open Supabase SQL Editor and run this script. It uses Supabase Auth users and keeps a separate `users` profile table keyed by `auth.users.id` (UUID):

```sql
-- Enable UUID extension if not enabled
create extension if not exists "uuid-ossp";

-- Profiles table linked to Supabase Auth users
create table if not exists public.users (
  id uuid primary key references auth.users(id) on delete cascade,
  email text unique,
  preferred_language text default 'fr',
  created_at timestamptz default now()
);

-- Conversations table
create table if not exists public.conversations (
  id bigint generated always as identity primary key,
  user_id uuid references public.users(id) on delete set null,
  started_at timestamptz default now(),
  context text default ''
);

-- Messages table
create table if not exists public.messages (
  id bigint generated always as identity primary key,
  conversation_id bigint references public.conversations(id) on delete cascade,
  message_type text default 'text',
  content text not null,
  role text default 'user',
  timestamp timestamptz default now()
);

create index if not exists idx_messages_conversation_id on public.messages(conversation_id);

-- Row Level Security
alter table public.users enable row level security;
alter table public.conversations enable row level security;
alter table public.messages enable row level security;

-- Helper function to check if the auth user matches profile id
create or replace function public.is_me(profile_id uuid)
returns boolean language sql stable as $$
  select auth.uid() = profile_id
$$;

-- Users policies: each user can see and update own profile
create policy "users_select_own" on public.users for select
using (is_me(id));

create policy "users_insert_self" on public.users for insert
with check (is_me(id));

create policy "users_update_own" on public.users for update
using (is_me(id)) with check (is_me(id));

-- Conversations policies: owner-only access
create policy "conversations_owner_select" on public.conversations for select
using (is_me(user_id));

create policy "conversations_owner_insert" on public.conversations for insert
with check (is_me(user_id));

create policy "conversations_owner_update" on public.conversations for update
using (is_me(user_id)) with check (is_me(user_id));

create policy "conversations_owner_delete" on public.conversations for delete
using (is_me(user_id));

-- Messages policies: access restricted by parent conversation ownership
create policy "messages_owner_select" on public.messages for select
using (exists (select 1 from public.conversations c where c.id = messages.conversation_id and is_me(c.user_id)));

create policy "messages_owner_insert" on public.messages for insert
with check (exists (select 1 from public.conversations c where c.id = conversation_id and is_me(c.user_id)));

create policy "messages_owner_update" on public.messages for update
using (exists (select 1 from public.conversations c where c.id = messages.conversation_id and is_me(c.user_id)))
with check (exists (select 1 from public.conversations c where c.id = conversation_id and is_me(c.user_id)));

create policy "messages_owner_delete" on public.messages for delete
using (exists (select 1 from public.conversations c where c.id = messages.conversation_id and is_me(c.user_id)));

-- Optional: service role bypass via RPC or using the service key (server-side)
-- Keep service key server-side only.
```

4) Development-only alternative: permissive RLS policies
   - The script in step 3 already enables RLS with owner-only policies; prefer those for anything user-facing.
   - If you need to move quickly before wiring up Supabase Auth, you can temporarily apply permissive policies instead (never ship these to production):

```sql
-- WARNING: Example only. Adjust for your auth model.
alter table public.users enable row level security;
alter table public.conversations enable row level security;
alter table public.messages enable row level security;

-- If you use Supabase Auth, link auth.uid() to users.id via a mapping table or store auth.uid in users table.
-- For now, allow all for development:
create policy "allow all users read" on public.users for select using (true);
create policy "allow all users read" on public.conversations for select using (true);
create policy "allow all users read" on public.messages for select using (true);
create policy "allow all users insert" on public.users for insert with check (true);
create policy "allow all users insert" on public.conversations for insert with check (true);
create policy "allow all users insert" on public.messages for insert with check (true);
```

5) Install the Supabase Python client in your backend

```bash
pip install supabase
```

6) Initialize the Supabase client in your backend
   - Create a helper, for example `app/utils/supabase_client.py`:

```python
from supabase import create_client, Client
import os

SUPABASE_URL = os.getenv("SUPABASE_URL")
SUPABASE_ANON_KEY = os.getenv("SUPABASE_ANON_KEY")

def get_supabase_client() -> Client:
    if not SUPABASE_URL or not SUPABASE_ANON_KEY:
        raise RuntimeError("Supabase credentials are not configured")
    return create_client(SUPABASE_URL, SUPABASE_ANON_KEY)
```

7) Replace 501 stubs with Supabase queries
   - Users: use Supabase Auth for registration/login; store profile in `public.users` keyed by auth.user.id.
   - Conversations: insert into `conversations`, then insert messages into `messages`.
   - Example (create conversation and add first message):

```python
from app.utils.supabase_client import get_supabase_client

sb = get_supabase_client()

# Create conversation
conv = sb.table("conversations").insert({"user_id": user_id, "context": ""}).execute()
conv_id = conv.data[0]["id"]

# Add a message
sb.table("messages").insert({
    "conversation_id": conv_id,
    "content": user_text,
    "role": "user",
    "message_type": "text",
}).execute()

# Fetch last N messages
history = (
    sb.table("messages")
      .select("role, content")
      .eq("conversation_id", conv_id)
      .order("timestamp", desc=True)
      .limit(10)
      .execute()
).data
```
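
Note that the query above returns rows newest-first because of `desc=True`. A tiny sketch for putting them back into chronological order before building a model prompt (`to_chronological` is a hypothetical helper, not part of the codebase):

```python
def to_chronological(history_desc: list[dict]) -> list[dict]:
    """Rows come back newest-first from the timestamp-descending query;
    reverse them so the model reads the conversation oldest-first."""
    return list(reversed(history_desc))

# Example: rows as returned by the .order("timestamp", desc=True) query
rows = [
    {"role": "assistant", "content": "How long have you had the fever?"},
    {"role": "user", "content": "I have a fever."},
]
messages = to_chronological(rows)  # oldest message first
```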

8) Authentication
   - Using Supabase Auth: the backend now uses `sign_in_with_password` and returns the Supabase access token.
   - Protect endpoints by sending the `Authorization: Bearer <access_token>` header. RLS policies enforce per-user data access.

9) Environment and deployment
   - Set SUPABASE_* variables in your production environment.
   - Ensure outbound access to `your-project-ref.supabase.co` is allowed from your server.
   - Do not expose `SERVICE_ROLE` to the browser or mobile apps; keep it server-side only.

Security

- Use strong JWT secrets and rotate periodically.
- Do not store raw medical data without consent. The example schema stores message content; consider encrypting sensitive fields at rest.

API

- Auth: register /api/users/register, login /api/users/login (currently 501 until Supabase integration)
- Chat: /api/chat (currently 501 until Supabase integration)
- Transcribe: /api/transcribe
- Analyze image: /api/analyze-image
- Translate: /api/translate
- Health: /health
- Single gateway endpoint: POST /gateway with body {"action": "chat|transcribe|analyze-image|translate", "payload": { ... }}
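
The gateway's routing can be sketched as a simple action-to-handler map (the handlers below are hypothetical stand-ins; the real logic lives behind the individual endpoints listed above):

```python
# Hypothetical handlers standing in for the real endpoint logic.
def handle_chat(payload: dict) -> dict:
    return {"reply": f"echo: {payload.get('message', '')}"}

def handle_translate(payload: dict) -> dict:
    return {"translation": payload.get("text", "")}

ACTIONS = {
    "chat": handle_chat,
    "translate": handle_translate,
    # "transcribe" and "analyze-image" would be registered the same way
}

def gateway(body: dict) -> dict:
    """Dispatch {"action": ..., "payload": ...} to the matching handler."""
    handler = ACTIONS.get(body.get("action"))
    if handler is None:
        return {"error": f"unknown action: {body.get('action')}"}
    return handler(body.get("payload") or {})
```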

Docker

docker build -t carehelp-backend .
docker run -p 8000:8000 --env-file .env carehelp-backend

Providers and local setup

- Hugging Face (default)
  - Set `AI_PROVIDER=hf` and `HF_API_TOKEN` if needed.
  - Models used:
    - Text: `HF_TEXT_MODEL` (default: Meta-Llama 3 8B Instruct)
    - ASR: `HF_ASR_MODEL` (default: Distil-Whisper large v3)
    - Vision caption: `HF_VISION_CAPTION_MODEL` (BLIP captioning)

- Ollama (local)
  - Install: https://ollama.com
  - Start server: `ollama serve` (default `http://localhost:11434`)
  - Pull models: `ollama pull llama3.1:8b` and `ollama pull llava:latest`
  - Set `.env`: `AI_PROVIDER=ollama`, optionally customize `OLLAMA_*` vars.

- LM Studio (local)
  - Install: https://lmstudio.ai
  - Start the server (OpenAI-compatible) and note its base URL (default `http://localhost:1234/v1`).
  - Set `.env`: `AI_PROVIDER=lmstudio`, `LMSTUDIO_BASE_URL=...`, `LMSTUDIO_MODEL=<model name in LM Studio>`.

Deploy

- Render/Heroku: deploy the container or repo, set environment variables, and expose port 8000. Ensure the SUPABASE_* variables (and HF_API_TOKEN, if used) are configured.
- AWS (ECS/Fargate or EC2): use the Dockerfile, attach secrets, and security groups to allow DB egress.

Notes

- Responses include a medical disclaimer and emergency triage to redirect urgent cases.
- RAG: primary context is `clinical_summaries.csv` (configurable via `CAMEROON_DATA_CSV`); falls back to `patient_records.json` if CSV is missing.
- Tests patch the OpenAI-style clients in `app/ai_services.py` and remain compatible; at runtime the backend uses HF, Ollama, or LM Studio depending on `AI_PROVIDER`.