Groq Chat is a lightning-fast, browser-based chat interface for language models powered by Groq's LPU (Language Processing Unit). Experience ChatGPT-like conversations using Meta's LLAMA 3.1 series, with enhanced privacy and productivity features.
- 🧠 Powered by Groq's LPU for ultra-fast language model inference
- 🤖 Access to Meta's LLAMA 3.1 series models (405B, 70B, 8B parameters)
- 🔒 Privacy-focused: Runs entirely in your browser, no server-side data storage
- 🌐 RAG support for web page URLs: Attach and crawl web pages for context-aware conversations (see the sketch after this list)
- 🎙️ Speech-to-text functionality for voice interactions
- 📝 Edit messages and branch conversations
- 💾 Save and manage conversation history locally
- 🗑️ Delete conversations as needed
- 🔗 No login required - just bring your Groq API key
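
For a sense of how the URL attachment feature can work, here is a minimal sketch of fetching a page, stripping its markup, and turning the visible text into context for the model. This is illustrative only, not the app's actual code: the helper name, the character limit, and the assumption that the page is fetchable from the browser (CORS-permissive or routed through a proxy) are all hypothetical.

```typescript
// Illustrative sketch of turning an attached URL into chat context.
// Assumes the page can be fetched from the browser (CORS-permissive or via a proxy);
// the helper name and the 8,000-character limit are hypothetical, not the app's real code.
async function urlToContext(url: string, maxChars = 8000): Promise<string> {
  const html = await (await fetch(url)).text();

  // Parse the HTML and keep only the visible text
  const doc = new DOMParser().parseFromString(html, "text/html");
  doc.querySelectorAll("script, style, noscript").forEach((el) => el.remove());
  const text = (doc.body.textContent ?? "").replace(/\s+/g, " ").trim();

  // Truncate so the context fits comfortably in the model's window
  return `Context from ${url}:\n${text.slice(0, maxChars)}`;
}

// The extracted text can then be sent ahead of the user's question, e.g.:
// messages = [{ role: "system", content: await urlToContext(attachedUrl) }, ...history]
```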
Visit https://groqchat-three.vercel.app/ to use Groq Chat online.
To run Groq Chat locally:
- Clone the repository: `git clone https://github.com/yourusername/groq-chat.git`, then `cd groq-chat`
- Install dependencies: `npm install`
- Run the development server: `npm run dev`
- Open http://localhost:3000 in your browser.
- Enter your Groq API key when prompted.
- Start chatting with the language model of your choice (a minimal API call sketch follows this list).
- Use the URL attachment feature to add context from web pages.
- Use speech-to-text for voice input.
- Edit, branch, save, or delete conversations as needed.
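
Under the hood, chatting amounts to sending your API key and message history to Groq's OpenAI-compatible chat completions endpoint. The sketch below is a minimal illustration rather than the app's actual code; the model ID is an assumption, so check Groq's console for the current Llama 3.1 identifiers.

```typescript
// Minimal illustrative sketch of a Groq chat completion call.
// NOTE: not the app's actual code; the model ID is an assumption based on
// Groq's OpenAI-compatible API (check console.groq.com for current IDs).
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

async function chatWithGroq(apiKey: string, messages: ChatMessage[]): Promise<string> {
  const response = await fetch("https://api.groq.com/openai/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`, // the key stays in the browser; nothing is proxied
    },
    body: JSON.stringify({
      model: "llama-3.1-70b-versatile", // assumed model ID; any Llama 3.1 variant works the same way
      messages,
    }),
  });

  if (!response.ok) {
    throw new Error(`Groq API error: ${response.status} ${await response.text()}`);
  }

  const data = await response.json();
  // OpenAI-compatible responses put the reply in choices[0].message.content
  return data.choices[0].message.content as string;
}
```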
- Next.js
- Vercel for deployment
- IndexedDB for local storage (see the sketch after this list)
- Groq API for language model inference
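
Because everything runs client-side, conversation history is persisted with IndexedDB rather than on a server. The following is a minimal sketch of that pattern; the database name, store name, and record shape are hypothetical, not the app's real schema.

```typescript
// Minimal sketch of persisting a conversation in IndexedDB.
// The database/store names ("groq-chat", "conversations") and the record shape
// are hypothetical, shown only to illustrate the local-storage approach.
interface Conversation {
  id: string;
  title: string;
  messages: { role: string; content: string }[];
  updatedAt: number;
}

function openDatabase(): Promise<IDBDatabase> {
  return new Promise((resolve, reject) => {
    const request = indexedDB.open("groq-chat", 1);
    request.onupgradeneeded = () => {
      // Create the object store on first run, keyed by conversation id
      request.result.createObjectStore("conversations", { keyPath: "id" });
    };
    request.onsuccess = () => resolve(request.result);
    request.onerror = () => reject(request.error);
  });
}

async function saveConversation(conversation: Conversation): Promise<void> {
  const db = await openDatabase();
  return new Promise((resolve, reject) => {
    const tx = db.transaction("conversations", "readwrite");
    tx.objectStore("conversations").put(conversation); // upsert by id
    tx.oncomplete = () => resolve();
    tx.onerror = () => reject(tx.error);
  });
}
```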
- Multi-modality support (when available from Groq)
- Custom JavaScript macros for enhanced functionality
- File attachment support (PDF, documents)
- Auto-formatting options
This is an open-source project. Contributions, issues, and feature requests are welcome!
- Groq for their incredible LPU technology
- Meta for the LLAMA 3.1 series models
- Vercel for their excellent hosting and deployment services
Built with ❤️ by Unclecode (Follow me on X).