The official GitHub page for the survey paper "A Survey of Large Language Models".
Use PEFT or full-parameter training to fine-tune 400+ LLMs or 100+ MLLMs. (LLM: Qwen2.5, Llama3.2, GLM4, Internlm2.5, Yi1.5, Mistral, Baichuan2, DeepSeek, Gemma2, ...; MLLM: Qwen2-VL, Qwen2-Audio, Llama3.2-Vision, Llava, InternVL2, MiniCPM-V-2.6, GLM4v, Xcomposer2.5, Yi-VL, DeepSeek-VL, Phi3.5-Vision, ...)
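For flavor, here is a minimal LoRA-style PEFT fine-tuning sketch using the Hugging Face `peft` library rather than this framework's own CLI; the model name and hyperparameters are illustrative assumptions, not the framework's defaults:

```python
# Minimal LoRA (PEFT) sketch with Hugging Face transformers + peft.
# Model choice and hyperparameters are assumptions for illustration.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

model_name = "Qwen/Qwen2.5-0.5B"  # hypothetical pick from the supported list
model = AutoModelForCausalLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

# LoRA trains small low-rank adapter matrices instead of all weights,
# which is what makes fine-tuning hundreds of large models tractable.
config = LoraConfig(r=8, lora_alpha=16, target_modules=["q_proj", "v_proj"],
                    lora_dropout=0.05, task_type="CAUSAL_LM")
model = get_peft_model(model, config)
model.print_trainable_parameters()  # typically well under 1% of all parameters
```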
Open Source Pre-training Model Framework in PyTorch & Pre-trained Model Zoo
Making data higher-quality, juicier, and more digestible for any large models! 🍎 🍋 🌽 ➡️ ➡️ 🍸 🍹 🍷
Papers about pretraining and self-supervised learning on Graph Neural Networks (GNN).
Awesome resources for in-context learning and prompt engineering: mastering LLMs such as ChatGPT, GPT-3, and FlanT5, with regularly updated, cutting-edge content.
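As a toy illustration of in-context learning: a few labeled examples placed directly in the prompt let a model perform a task with no fine-tuning. The review strings below are invented:

```python
# Few-shot prompt: the model infers the task from in-context examples,
# with no gradient updates. Example reviews here are made up.
examples = [
    ("The movie was a delight.", "positive"),
    ("I want my money back.", "negative"),
]
query = "The plot dragged, but the acting was superb."
prompt = "\n\n".join(f"Review: {text}\nSentiment: {label}"
                     for text, label in examples)
prompt += f"\n\nReview: {query}\nSentiment:"
print(prompt)  # send this string to any LLM completion endpoint
```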
Code for TKDE paper "Self-supervised learning on graphs: Contrastive, generative, or predictive"
An open-source knowledgeable large language model framework.
Awesome list for research on CLIP (Contrastive Language-Image Pre-Training).
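For reference, a minimal PyTorch sketch of the symmetric contrastive (InfoNCE) objective CLIP pre-trains with; the encoders and projection heads are omitted, and the temperature value is an assumption:

```python
# CLIP-style symmetric contrastive loss over a batch of paired
# image/text embeddings (encoders omitted; temperature is illustrative).
import torch
import torch.nn.functional as F

def clip_loss(image_emb, text_emb, temperature=0.07):
    # Normalize so the dot product is cosine similarity.
    image_emb = F.normalize(image_emb, dim=-1)
    text_emb = F.normalize(text_emb, dim=-1)
    logits = image_emb @ text_emb.t() / temperature  # (N, N) similarity matrix
    targets = torch.arange(logits.size(0), device=logits.device)
    # Matched pairs sit on the diagonal; contrast each image against all
    # texts and each text against all images, then average both directions.
    return (F.cross_entropy(logits, targets) +
            F.cross_entropy(logits.t(), targets)) / 2
```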
Oscar (Object-Semantics Aligned Pre-training for Vision-Language Tasks) and its follow-up VinVL.
Tencent Pre-training framework in PyTorch & Pre-trained Model Zoo
Implementation of BERT ("Pre-training of Deep Bidirectional Transformers for Language Understanding"); also supports pre-training TextCNN.
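A simplified sketch of the masked-language-modeling input preparation this kind of pre-training relies on; real BERT also replaces some selected tokens with random tokens or leaves them unchanged, while this version masks only:

```python
# Simplified BERT-style MLM masking: ~15% of tokens become [MASK] and
# are the only positions the loss is computed on.
import torch

def mask_tokens(input_ids, mask_token_id, mlm_prob=0.15, ignore_index=-100):
    """Return (masked_inputs, labels) for masked language modeling."""
    labels = input_ids.clone()
    masked = torch.rand(input_ids.shape) < mlm_prob
    labels[~masked] = ignore_index          # loss only on masked positions
    masked_inputs = input_ids.clone()
    masked_inputs[masked] = mask_token_id   # replace selected tokens with [MASK]
    return masked_inputs, labels
```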
A curated list of Large (Language) Models and Foundation Models (LLM, LM, FM) for Time Series, Spatiotemporal, and Event Data.
Unified Training of Universal Time Series Forecasting Transformers
Research code for ECCV 2020 paper "UNITER: UNiversal Image-TExt Representation Learning"
Code for ICLR 2020 paper "VL-BERT: Pre-training of Generic Visual-Linguistic Representations".
Large Language Model-enhanced Recommender System Papers
[ICLR 2024] Sheared LLaMA: Accelerating Language Model Pre-training via Structured Pruning
[NeurIPS 2020] "Graph Contrastive Learning with Augmentations" by Yuning You, Tianlong Chen, Yongduo Sui, Ting Chen, Zhangyang Wang, Yang Shen
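As a rough sketch of the augmentations such graph contrastive methods pair with a contrastive loss, here are two common view generators (node feature masking and edge dropping); the tensor layout follows PyTorch Geometric conventions and is an assumption here, not the paper's exact code:

```python
# Two simple graph augmentations: each produces one "view" of a graph,
# and two views of the same graph form a positive pair for the
# contrastive loss, with other graphs in the batch as negatives.
import torch

def drop_node_features(x, drop_prob=0.2):
    """Zero out features of randomly chosen nodes (x: [num_nodes, num_feats])."""
    keep = (torch.rand(x.size(0)) > drop_prob).float().unsqueeze(1)
    return x * keep

def drop_edges(edge_index, drop_prob=0.2):
    """Randomly remove a fraction of edges (edge_index: [2, num_edges])."""
    keep = torch.rand(edge_index.size(1)) > drop_prob
    return edge_index[:, keep]
```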
Code for KDD'20 "Generative Pre-Training of Graph Neural Networks"