Removes the optional `request` parameter from the `respond` function in `app.py`, simplifying the function signature and improving code clarity. This change streamlines the function so it focuses on handling the interaction history without needing to manage sessions.
app.py CHANGED

@@ -193,7 +193,7 @@ def _history_preview(history: list[tuple[str, str]] | None, max_turns: int = 3,
     return _preview_text(joined, max_chars)
 
 
-def respond(message, history: list[tuple[str, str]], request: gr.Request | None = None):
+def respond(message, history: list[tuple[str, str]]):
     """Stream assistant reply via Gemini using OpenAI-compatible API.
 
     Yields partial text chunks so the UI shows a live stream.
@@ -248,8 +248,6 @@ def respond(message, history: list[tuple[str, str]], request: gr.Request | None
             "has_images": bool(image_parts),
            "history_preview": _history_preview(history),
         },
-        # Group all the turns of a single Gradio session
-        thread_id=getattr(request, "session_hash", None) if request is not None else None,
     )
     pipeline.post()
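The removed pattern can be sketched in isolation. This is a minimal sketch, not the app's actual code: `FakeRequest` is a hypothetical stand-in for `gr.Request`, which exposes a `session_hash` attribute, and the two helper functions contrast the thread-id logic before and after the change.

```python
class FakeRequest:
    """Hypothetical stand-in for gr.Request, which carries a session_hash."""
    def __init__(self, session_hash):
        self.session_hash = session_hash


def thread_id_before(request=None):
    # Old behavior: group all turns of one Gradio session under one thread id,
    # falling back to None when no request object is available.
    return getattr(request, "session_hash", None) if request is not None else None


def thread_id_after():
    # New behavior: no session grouping; no thread id is derived at all.
    return None


print(thread_id_before(FakeRequest("abc123")))  # → abc123
print(thread_id_before())                       # → None
```

After the change, each call to `respond` is logged without a `thread_id`, so the pipeline no longer groups turns by Gradio session.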