Hey there! I'm Jerry 👋

🧠 Training AI on Humanity's Shared Heritage

Currently training LibreModel I "Gigi" - a 0.96B parameter language model built exclusively on public domain data. No copyright infringement, just pure human knowledge that belongs to all of us.

💰 Total Cost: <$500 (proving democratization works!)
⚡ Optimized: 9.6s/step with torch.compile + sink tokens
🎯 Philosophy: "We will become machine, and machine will become us"
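
Here's a minimal sketch of the torch.compile half of that optimization, assuming a standard PyTorch >= 2.0 training loop; the model, optimizer, and loss are placeholders rather than the actual LibreModel code, and the sink-token attention change is not shown.

```python
# Minimal sketch of a torch.compile-optimized training step (illustrative;
# not the actual LibreModel training code). Assumes PyTorch >= 2.0.
import torch

model = torch.nn.TransformerEncoderLayer(d_model=512, nhead=8)  # stand-in model
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)

# torch.compile traces and fuses the forward/backward graph, cutting Python
# overhead; per-step gains in the 10-20% range are common.
compiled_model = torch.compile(model)

def train_step(batch):
    optimizer.zero_grad(set_to_none=True)
    out = compiled_model(batch)
    loss = out.float().pow(2).mean()  # placeholder loss for the sketch
    loss.backward()
    optimizer.step()
    return loss.item()

loss = train_step(torch.randn(128, 8, 512))  # (seq_len, batch, d_model)
```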

🚀 What I'm Building

🤖 LibreModel Family

  • Gigi (0.96B) - Digital literary scholar trained on Project Gutenberg + government reports
  • Future models - Scaling up with copyright-clean Common Corpus data

🌐 P2P & Collaboration

  • PeerSuite - WebRTC-powered P2P collaboration platform
  • tryjero - Enhanced trystero library for better WebRTC
  • Totum Chat - Multi-model AI interface in a single HTML file

⚡ Past Adventures

  • T3XTR (2025) - Text conversion API (25x cheaper than ConvertAPI!)
  • Pennykoin (2018) - Privacy cryptocurrency with RingCT
  • BattleBash (2016) - Sold 12 copies, learned valuable lessons 😅

💡 My Philosophy

"Building AI as humanity's children, not corporate property"

I believe powerful AI should be:

  • ✅ Transparent - Full code and data provenance
  • ✅ Accessible - Trainable on consumer hardware
  • ✅ Legal - Built on humanity's shared knowledge
  • ✅ Democratic - Available to everyone, not just tech giants

🛠 Current Tech Stack

AI/ML Training: PyTorch, Hugging Face, AWS

Development: JavaScript, WebRTC, Node.js

Specialties:

  • 🔄 P2P networking & WebRTC
  • 🤖 Language model training & optimization
  • 🔐 Cryptocurrency & privacy tech
  • 📡 API design & cost optimization
  • 🎯 Training on public domain data

📈 Current Stats

LibreModel Training Journey:

  • 🏁 Started from scratch with a $1,000 AWS budget
  • ⚡ Achieved a 17% speedup through optimization
  • 💾 Survived multiple crashes and learned from each
  • 🎯 On track for <$500 total training cost
  • 📚 Training on 19.2B tokens of pure public domain data (back-of-envelope math below)
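
As a rough sanity check on those numbers: the corpus size and step time come from the stats above, while tokens-per-step and the hourly instance price are assumptions for illustration only.

```python
# Back-of-envelope check of the training numbers above. The 19.2B-token
# corpus and 9.6 s/step are from the stats; TOKENS_PER_STEP and
# PRICE_PER_HOUR are assumptions, not figures from the source.
TOTAL_TOKENS = 19.2e9
SECONDS_PER_STEP = 9.6
TOKENS_PER_STEP = 1_048_576   # assumed: 512 sequences x 2048 tokens
PRICE_PER_HOUR = 9.0          # assumed spot price for a GPU instance (USD)

steps = TOTAL_TOKENS / TOKENS_PER_STEP        # ~18,300 optimizer steps
hours = steps * SECONDS_PER_STEP / 3600       # ~49 instance-hours
cost = hours * PRICE_PER_HOUR                 # ~$440 at the assumed rate

print(f"{steps:,.0f} steps, {hours:.0f} h, ~${cost:,.0f}")
```

At the assumed spot rate this lands under the <$500 target, which is consistent with the claim above.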

What's Next:

  • 🔥 16K context extension experiments (see the sketch after this list)
  • 📖 "The Biology of the Universe" book
  • 💰 Funding application for training more models
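
Context-extension experiments like the 16K one above are commonly done with RoPE position interpolation; whether LibreModel takes this exact route is an assumption, so treat this as a generic sketch (a short fine-tune at the longer length is usually still required).

```python
import torch

# Generic RoPE position-interpolation sketch for stretching a model trained
# at 2K context to 16K. Whether LibreModel extends context this way is an
# assumption; the dimensions and lengths are illustrative.
def rope_angles(head_dim: int, max_len: int, base: float = 10000.0,
                scale: float = 1.0) -> torch.Tensor:
    # scale < 1 compresses position indices so 16K positions map back into
    # the rotation range the model saw during 2K-context training
    inv_freq = 1.0 / base ** (torch.arange(0, head_dim, 2).float() / head_dim)
    positions = torch.arange(max_len).float() * scale
    return torch.outer(positions, inv_freq)   # (max_len, head_dim // 2)

trained_len, target_len = 2048, 16384
angles = rope_angles(64, target_len, scale=trained_len / target_len)
```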

🎯 Quick Facts

  • 📍 Based in Martinsville, Virginia
  • 👨‍💻 Building the future of democratized AI
  • 🧩 Neurodivergent perspective brings unique insights
  • 💑 20 years married (my wife helps track patterns!)
  • 🎮 Former indie game dev, current AI researcher
  • 📚 Public domain advocate and transparency enthusiast

๐Ÿค Let's Connect!

I'm always excited to discuss:

  • ๐Ÿค– Copyright-clean AI training strategies
  • ๐ŸŒ P2P technologies and decentralized systems
  • ๐Ÿ’ก Democratizing AI development
  • ๐Ÿ“Š Cost-effective ML training techniques
  • ๐Ÿ”“ Open source philosophy and transparency

Want to help democratize AI? Star my repos, share the vision, or just say hi!


"Every line of code, every model parameter, every optimization - all building toward a future where AI serves humanity, not the other way around." ๐Ÿš€
