flash-algo · GitHub

flash-algo

Research on efficient algorithms

Hi there 👋

We aim to develop more efficient algorithms to apply to Transformers. 🤗

Pinned

  1. flash-sparse-attention (Public)

     Trainable fast and memory-efficient sparse attention

     Python · 485 stars · 46 forks
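The repository's actual API is not shown on this page, but the core idea behind sparse attention is that each query attends to only a subset of keys instead of all of them, which cuts both compute and memory. A minimal NumPy sketch of one common variant, top-k sparse attention (function names and the `keep` parameter are illustrative, not taken from the repo):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def topk_sparse_attention(q, k, v, keep=2):
    """Each query attends only to its `keep` highest-scoring keys.

    q, k, v: arrays of shape (T, d). Returns an array of shape (T, d).
    """
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)                      # (T, T) attention logits
    # Per-row threshold: the keep-th largest score for each query.
    thresh = np.sort(scores, axis=-1)[:, -keep][:, None]
    # Mask everything below the threshold to -inf so softmax zeroes it out.
    masked = np.where(scores >= thresh, scores, -np.inf)
    return softmax(masked) @ v
```

With `keep` equal to the sequence length this reduces to ordinary dense attention; smaller values trade a little accuracy for sparsity. A trainable variant (as the repo description suggests) would typically learn which positions to keep rather than using a fixed top-k rule.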

Repositories

Showing 4 of 4 repositories

People

This organization has no public members. You must be a member to see who’s a part of this organization.

