- University of Toronto
- Toronto
- 00:28 (UTC -12:00)
- https://www.linkedin.com/in/rex-ma-20a455113/
- @RexMa9
Highlights: Pro
Stars
Cramming the training of a (BERT-type) language model into limited compute.
Fast & Simple repository for pre-training and fine-tuning T5-style models
Implementation of MEGABYTE, Predicting Million-byte Sequences with Multiscale Transformers, in PyTorch
A framework for few-shot evaluation of language models.
A collection of awesome bio-foundation models, including protein, RNA, DNA, gene, single-cell, and so on.
LucaOne's code, including model code and pre-training code.
A high-throughput and memory-efficient inference and serving engine for LLMs
Contrastive learning harmonizing protein language models and natural language models
Implementation of Alpha Fold 3 from the paper: "Accurate structure prediction of biomolecular interactions with AlphaFold3" in PyTorch
GaLore: Memory-Efficient LLM Training by Gradient Low-Rank Projection
Finetune Llama 3.1, Mistral, Phi & Gemma LLMs 2-5x faster with 80% less memory
DNABERT_S: Learning Species-Aware DNA Embedding with Genome Foundation Models
ICLR'24 | BioBridge: Bridging Biomedical Foundation Models via Knowledge Graphs
Code for 'LLM2Vec: Large Language Models Are Secretly Powerful Text Encoders'
Integrate Any Omics: Towards genome-wide data integration for patient stratification
Polygraph evaluates and compares groups of nucleic acid sequences based on their sequence and functional content for effective design of regulatory elements.
Gibbs sampling for generating protein sequences
Generating and scoring novel enzyme sequences with a variety of models and metrics
A Protein Large Language Model for Multi-Task Protein Language Processing
BiomedGPT: A Unified and Generalist Biomedical Generative Pre-trained Transformer for Vision, Language, and Multimodal Tasks
[ACL 2024] ProtLLM: An Interleaved Protein-Language LLM with Protein-as-Word Pre-Training
Implementation of 💍 Ring Attention, from Liu et al. at Berkeley AI, in PyTorch
A friendly neighborhood repository with diverse experiments and adventures in the world of LLMs
The official PyTorch implementation of Google's Gemma models