saforem2 (Sam Foreman) · GitHub

:octocat: making rocks think

Organizations

@argonne-lcf @nftqcd

saforem2/README.md

Sam Foreman

πŸ‘‹ Hi, I'm Sam.

I'm generally interested in large-scale distributed training and AI for science¹.

GitHub Streak · Profile Stats · WakaTime (stats badges)

🐍 GitHub contribution grid snake animation
Note

WakaTime Dashboard, Jan 21 to Jan 28

Tip

Total Coding Time (7d): 518h 19m

πŸ“ˆ Trends

Period         Duration
Yesterday      0m
Last 7 Days    518h 19m
Last 30 Days   733h 4m
All Time       733h 4m

πŸ’» Languages

Language Time Percentage
md 243h 26m 🟦🟦🟦🟦⬜⬜⬜⬜⬜⬜ 47.0%
qmd 137h 15m 🟦🟦⬜⬜⬜⬜⬜⬜⬜⬜ 26.5%
text 120h 17m 🟦🟦⬜⬜⬜⬜⬜⬜⬜⬜ 23.2%
lua 17h 1m ⬜⬜⬜⬜⬜⬜⬜⬜⬜⬜ 3.3%
Other 19m ⬜⬜⬜⬜⬜⬜⬜⬜⬜⬜ 0.1%

πŸ”₯ Projects

Project Time Percentage
ezpz 243h 46m 🟩🟩🟩🟩⬜⬜⬜⬜⬜⬜ 47.0%
nvim 137h 15m 🟩🟩⬜⬜⬜⬜⬜⬜⬜⬜ 26.5%
personal_site_CLEAN 137h 15m 🟩🟩⬜⬜⬜⬜⬜⬜⬜⬜ 26.5%
unknown 1m ⬜⬜⬜⬜⬜⬜⬜⬜⬜⬜ 0.0%

Footnotes

  1. Mostly trying to get supercomputers to talk to each other. ↩

Pinned

  1. l2hmc-qcd (Public)

    Application of the L2HMC algorithm to simulations in lattice QCD.

    Jupyter Notebook · 68 stars · 9 forks

  2. ezpz (Public)

    Train across all your devices, ezpz πŸ‹

    Python · 26 stars · 7 forks

  3. deepspeedai/DeepSpeed (Public)

    DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.

    Python · 41.5k stars · 4.7k forks

  4. huggingface/nanotron (Public)

    Minimalistic large language model 3D-parallelism training

    Python · 2.5k stars · 275 forks

  5. argonne-lcf/Megatron-DeepSpeed (Public)

    Forked from deepspeedai/Megatron-DeepSpeed

    Ongoing research training transformer language models at scale, including: BERT & GPT-2

    Python · 17 stars · 18 forks

  6. personal_site (Public)

    My personal website

    Python · 11 stars
