Jun 18, 2024 · We introduce ChatGLM, an evolving family of large language models that we have been developing over time. This report primarily focuses on the GLM-4 language ...
Jul 30, 2024 · This paper introduces a family of large language models called ChatGLM, ranging from the 130-billion parameter GLM-130B to the smaller GLM-4.
Jun 19, 2024 · ChatGLM: A Family of Large Language Models from GLM-130B to GLM-4 All Tools. GLM-4 closely rivals GPT-4 on MMLU, MATH, GPQA, ...
Jun 18, 2024 · TL;DR: The document discusses the development and impressive capabilities of the ChatGLM family of large language models, particularly the GLM-4 ...
Dec 4, 2023 · Eight months have witnessed numerous challenges. Engineering: how to train 100B-scale models from scratch? Hygon DCU, NVIDIA A100, ...
GLM-4-9B is the open-source version of the latest generation of pre-trained models in the GLM-4 series launched by Zhipu AI.
Jun 18, 2024 · This report primarily focuses on the GLM-4 language series, which includes GLM-4, GLM-4-Air, and GLM-4-9B. They represent our most capable ...
Jun 19, 2024 · We published a tech report about GLM's Family! ChatGLM: A Family of Large Language Models from GLM-130B to GLM-4 All Tools.
GLM-130B is an open bilingual (English & Chinese) bidirectional dense model with 130 billion parameters, pre-trained using the algorithm of General Language ...
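The General Language Model pretraining mentioned above is based on autoregressive blank infilling: a contiguous span of the input is replaced by a mask token, and the model learns to regenerate that span token by token. A minimal sketch of how one such training pair could be constructed (a toy illustration, not the authors' code; the `[MASK]` and `[sop]` token names follow the GLM paper's convention):

```python
def blank_infill(tokens, start, length):
    """Build a (corrupted input, autoregressive target) pair for one masked span.

    The corrupted input has the span replaced by a single [MASK] token;
    the target is the span itself, preceded by a start-of-piece token [sop]
    that the model conditions on to begin autoregressive generation.
    """
    span = tokens[start:start + length]
    corrupted = tokens[:start] + ["[MASK]"] + tokens[start + length:]
    target = ["[sop]"] + span  # model predicts the span token by token
    return corrupted, target

tokens = ["GLM", "is", "a", "bilingual", "dense", "model"]
corrupted, target = blank_infill(tokens, start=2, length=2)
print(corrupted)  # ['GLM', 'is', '[MASK]', 'dense', 'model']
print(target)     # ['[sop]', 'a', 'bilingual']
```

In actual pretraining, multiple spans are sampled per sequence and the model attends bidirectionally to the corrupted context while generating each span left to right; this sketch shows only the data-construction step for a single span.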