Update README.md

README.md

@@ -31,6 +31,13 @@ EmbeddedLLM is an open-source company dedicated to advancing the field of Large
- New features not yet available in the upstream
- Optimized for AMD GPUs with ROCm support

3. **[EmbeddedLLM/embeddedllm](https://github.com/EmbeddedLLM/embeddedllm)**
   - **Description**: An AI PC embedded LLM engine that unifies and provides a stable way to run LLMs fast on CPU, iGPU, and GPU. It supports launching an OpenAI-API-compatible API server powered by our engine (a usage sketch follows this list).
   - **Key Features**:
     - Supported hardware: CPU (ONNX), AMD iGPU (ONNX-DirectML), Intel iGPU (IPEX-LLM, OpenVINO), Intel XPU (IPEX-LLM, OpenVINO), Nvidia GPU (ONNX-CUDA).
     - Provides a prebuilt, ready-to-run Windows 11 executable.
     - Vision Language Model support (CPU).
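Because the engine can launch an OpenAI-API-compatible server, a standard OpenAI client should be able to talk to it once it is running on the local machine. The sketch below uses the official `openai` Python package; the base URL, port, API key, and model name are placeholders rather than values taken from this README, so substitute whatever the engine actually reports when launched.

```python
# Minimal sketch: query a locally running OpenAI-compatible server with the
# official openai Python client (pip install openai).
# base_url, api_key, and model are placeholders, not values documented here.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",    # assumed local endpoint of the engine's server
    api_key="not-needed-for-local-server",  # local servers typically ignore the key
)

response = client.chat.completions.create(
    model="local-model",  # placeholder; use the model name the engine loads
    messages=[{"role": "user", "content": "Say hello from the AI PC."}],
)
print(response.choices[0].message.content)
```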
**Join Us**

We invite you to explore our repositories and models, contribute to our projects, and join us in pushing the boundaries of what's possible with LLMs.