title: PromptTune
emoji: 🎵
colorFrom: indigo
colorTo: green
sdk: gradio
sdk_version: 5.48.0
app_file: app/gradio_interface.py
pinned: false
license: mit
short_description: MLOps for Prompt Engineering and Continuous Improvement.
Intelligent Prompt Optimizer (IPO-Meta)
This project demonstrates a zero-GPU MLOps pipeline using LLM orchestration to automatically improve the system prompt based on continuous user feedback.
Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
🎵 PromptTune
MLOps Toolkit for Interactive Prompt Engineering and Optimization
Introduction
promptTune is a modular MLOps toolkit designed for experimenting with, optimizing, and managing LLM prompts. It provides a streamlined interface for rewriting prompts, collecting feedback, and iteratively improving prompt performance, all while maintaining robust, auditable records of prompt changes and user interactions.
Features
- 🤖 LLM Orchestration & Rewriting: Dynamically leverages a Meta-LLM via the OpenRouter API to transform vague user inputs into highly structured, actionable system prompts, ensuring high-quality responses from the final Task-LLM (see the sketch below this list).
- ♻️ Continuous Prompt Learning: Implements a zero-GPU, feedback-driven loop in which a sufficient number of negative user ratings (Rating: 0) automatically triggers the optimization workflow.
- ⚙️ MLOps Deployment Pipeline: Uses scheduled GitHub Actions to execute the core Python script, automatically versioning, committing, and deploying the newly refined system prompt configuration back to the main branch.
- 💾 Versioned Configuration Management: Maintains a single source of truth for the active system prompt (master_prompt.json), ensuring reproducibility and enabling future rollbacks.
- 💻 Gradio Interface & Data Collection: Provides a simple, Python-native web interface for user interaction and securely logs all raw feedback to inform the next nightly deployment cycle.
- Observability Log: Includes a dedicated status file (status_log.txt) that tracks the exact date and time of the last successful prompt deployment, offering a clear audit trail.
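To make the orchestration step concrete, here is a minimal sketch of what a Meta-LLM rewrite call against OpenRouter's OpenAI-compatible chat-completions endpoint could look like. The function name `rewrite_prompt`, the rewrite instruction, and the model id are placeholders rather than the project's actual implementation; the real logic lives in `app/core_logic.py`.

```python
# Minimal sketch of the Meta-LLM rewrite call. The model id and the rewrite
# instruction are placeholders; the project's real logic is in app/core_logic.py.
import os
import requests

OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def rewrite_prompt(vague_input: str) -> str:
    """Ask the Meta-LLM to turn a vague request into a structured system prompt."""
    response = requests.post(
        OPENROUTER_URL,
        headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
        json={
            "model": "mistralai/mistral-7b-instruct",  # placeholder model id
            "messages": [
                {
                    "role": "system",
                    "content": "Rewrite the user's request as a precise, structured system prompt.",
                },
                {"role": "user", "content": vague_input},
            ],
        },
        timeout=60,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(rewrite_prompt("help me write better emails"))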
Installation
Clone the repository:

```bash
git clone https://github.com/your-username/promptTune.git
cd promptTune
```

Set up a Python environment:

```bash
python3 -m venv venv
source venv/bin/activate
```

Install dependencies:

```bash
pip install -r requirements.txt
```

Configure environment variables:

- Create a `.env` file in the project root and add your OpenRouter (or other OpenAI-compatible) API key:

```
OPENROUTER_API_KEY=your_api_key_here
```
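To confirm the key is picked up before launching the app, a quick check like the one below can help. It assumes the project loads `.env` with `python-dotenv`, which is a common pattern but not confirmed by this README; adjust to match whatever `requirements.txt` actually installs.

```python
# Quick sanity check that the API key from .env is visible to Python.
import os
from dotenv import load_dotenv  # assumes python-dotenv is installed

load_dotenv()  # reads .env from the current working directory
assert os.getenv("OPENROUTER_API_KEY"), "OPENROUTER_API_KEY is not set"
print("OpenRouter key loaded.")
```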
⚡ Usage
1. Run the Gradio Web App
```bash
python -m app.gradio_interface
```
- Interact: Enter prompts, view responses, and provide feedback via the web UI.
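For orientation only, a stripped-down version of such an interface might look like the sketch below. The placeholder `answer` function and the feedback-record fields (`timestamp`, `prompt`, `response`, `rating`) are assumptions; the actual UI and logging format are defined in `app/gradio_interface.py` and `app/core_logic.py`.

```python
# Simplified sketch of a prompt/response UI that logs thumbs-up/down feedback.
# The record fields below are illustrative; the real schema is whatever
# app/gradio_interface.py writes to data/feedback_log.json.
import json
from datetime import datetime, timezone
from pathlib import Path

import gradio as gr

FEEDBACK_LOG = Path("data/feedback_log.json")

def answer(prompt: str) -> str:
    # Placeholder for the Task-LLM call implemented in app/core_logic.py.
    return f"(model response to: {prompt})"

def log_feedback(prompt: str, response: str, rating: int) -> str:
    # Append one record per interaction; rating 0 marks negative feedback.
    records = json.loads(FEEDBACK_LOG.read_text()) if FEEDBACK_LOG.exists() else []
    records.append({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "prompt": prompt,
        "response": response,
        "rating": rating,
    })
    FEEDBACK_LOG.parent.mkdir(parents=True, exist_ok=True)
    FEEDBACK_LOG.write_text(json.dumps(records, indent=2))
    return "Feedback recorded."

with gr.Blocks() as demo:
    prompt_box = gr.Textbox(label="Your prompt")
    response_box = gr.Textbox(label="Model response")
    status = gr.Markdown()
    gr.Button("Submit").click(answer, prompt_box, response_box)
    gr.Button("Helpful").click(
        lambda p, r: log_feedback(p, r, 1), [prompt_box, response_box], status
    )
    gr.Button("Not helpful").click(
        lambda p, r: log_feedback(p, r, 0), [prompt_box, response_box], status
    )

if __name__ == "__main__":
    demo.launch()
```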
2. Optimize Prompts via Script
```bash
python scripts/optimize_prompt.py
```
- This script reviews feedback logs and updates the master prompt for improved results.
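As a rough illustration of that loop, the sketch below reads the feedback log, checks for enough negative ratings, asks the Meta-LLM for a revised prompt, and writes the result back. The threshold, the JSON field names (`rating`, `prompt`, `system_prompt`), the model id, and the status-file path are assumptions; see `scripts/optimize_prompt.py` for the real behavior.

```python
# Rough sketch of a feedback-driven optimization pass. Field names, the trigger
# threshold, the model id, and the status-file location are assumed for
# illustration only; scripts/optimize_prompt.py defines the real logic.
import json
import os
from datetime import datetime, timezone
from pathlib import Path

import requests

NEGATIVE_THRESHOLD = 3  # assumed number of 0-ratings needed to trigger a rewrite

feedback = json.loads(Path("data/feedback_log.json").read_text())
negatives = [r for r in feedback if r.get("rating") == 0]

if len(negatives) >= NEGATIVE_THRESHOLD:
    prompt_path = Path("data/master_prompt.json")
    config = json.loads(prompt_path.read_text())
    failures = "\n".join(f"- {r.get('prompt', '')}" for r in negatives)

    resp = requests.post(
        "https://openrouter.ai/api/v1/chat/completions",
        headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
        json={
            "model": "mistralai/mistral-7b-instruct",  # placeholder model id
            "messages": [
                {"role": "system",
                 "content": "You refine system prompts based on failed interactions."},
                {"role": "user",
                 "content": (
                     f"Current system prompt:\n{config.get('system_prompt', '')}\n\n"
                     f"Inputs that received negative feedback:\n{failures}\n\n"
                     "Return an improved system prompt only."
                 )},
            ],
        },
        timeout=120,
    )
    resp.raise_for_status()
    config["system_prompt"] = resp.json()["choices"][0]["message"]["content"]
    prompt_path.write_text(json.dumps(config, indent=2))

    # Record the deployment time in the observability log (path assumed).
    Path("status_log.txt").write_text(
        f"Last prompt deployment: {datetime.now(timezone.utc).isoformat()}\n"
    )
```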
3. Project Structure
```
promptTune/
├── app/
│   ├── __init__.py
│   ├── core_logic.py
│   └── gradio_interface.py
├── data/
│   ├── feedback_log.json
│   └── master_prompt.json
└── scripts/
    └── optimize_prompt.py
```
🤝 Contributing
We welcome contributions! To get started:
- Fork the repository.
- Create a branch for your feature or fix (`git checkout -b feature-name`).
- Commit your changes.
- Submit a pull request with a clear description.
Please ensure all code is well-documented and tested.
License
This project is licensed under the MIT License.
Maintained by Manisankarrr
π GitHub Repo: https://github.com/Manisankarrr/promptTune