---
title: README
emoji: π
colorFrom: green
colorTo: red
sdk: static
pinned: false
---

Zhipu AI (Z.ai)

We build the **[ChatGLM](https://arxiv.org/pdf/2406.12793)** family of LLMs, develop LLMs as **Agents**, and release related LLM training & inference techniques:

* **[GLM-4.6 & GLM-4.5](https://github.com/zai-org/GLM-4.5)**, **[GLM-4.5V & GLM-4.1V](https://github.com/zai-org/GLM-V)**, **[GLM-4](https://github.com/zai-org/GLM-4)**, **[CodeGeeX](https://github.com/zai-org/CodeGeeX4)**, **[CogVLM (VisualGLM)](https://github.com/zai-org/CogVLM2)**, **[WebGLM](https://github.com/zai-org/WebGLM)**, **[GLM-130B](https://github.com/zai-org/GLM-130B)**, **[CogView](https://github.com/zai-org/CogView)**, **[CogVideo & CogVideoX](https://github.com/zai-org/CogVideo)**.
* **[CogAgent](https://github.com/zai-org/CogVLM)**, **[AutoWebGLM](https://github.com/zai-org/AutoWebGLM)**, **[AgentTuning](https://github.com/zai-org/AgentTuning)**, **[APAR](https://arxiv.org/abs/2401.06761)**.

We also work on **LLM evaluations**: **[LVBench](https://github.com/zai-org/LVBench)**, **[MotionBench](https://github.com/zai-org/MotionBench)**, **[ComplexFuncBench](https://github.com/zai-org/ComplexFuncBench)**, **[AgentBench](https://github.com/THUDM/AgentBench)**, **[AlignBench](https://github.com/THUDM/AlignBench)**, **[LongBench](https://github.com/THUDM/LongBench)**, **[NaturalCodeBench](https://github.com/THUDM/NaturalCodeBench)**.

We started with **social networks and graphs**, and we still love them: **[GraphMAE](https://github.com/THUDM/GraphMAE)**, **[GPT-GNN](https://github.com/acbull/GPT-GNN)**, **[GCC](https://github.com/THUDM/CogDL)**, **[SelfKG](https://github.com/THUDM/SelfKG)**, **[CogDL](https://github.com/THUDM/CogDL)**, and **[AMiner](https://www.aminer.cn/)**.