LLM Writer Workshop simulates a creative writing workshop. It uses selectable AI models, including OpenAI's GPT-4 and GPT-3.5 Turbo; Anthropic's Claude Haiku, Sonnet, and Opus; Google's Gemini; and Mistral's Small, Medium, and Large models. The project is built with a Python/Flask backend and a TypeScript/React frontend.
- Editor: Provides fine-grained technical feedback on word choice, grammar, and sentence structure.
- Agent: Reviews your work with an eye toward genre and overall quality. Enthusiastic but critical. You may find the agent offering to clean your car or bring you breakfast—anything for a client!
- Writer: A fellow writer. Reviews your work for plot, characters, and imagery. Your fellow writer may relate your work to their own (made-up) publications.
- Publisher: Part of the workshop is submitting your work as-is to your workshop publisher. The publisher is very busy reviewing manuscripts and only has time for a quick accept/reject decision, with concise, decisive feedback tied to that call.
Your workshop teammates may make up names for themselves, their publishing house, their agency, or their published writing.
- Clone this repository and cd into it
- Install Docker if you have not already: https://docs.docker.com/engine/install/
- Ensure you have your API keys set (see below)
Then:
docker-compose up --build
Connect to http://localhost:3000 in your browser.
You will need API keys to run this project. Here are instructions for creating these API keys:
OpenAI: https://help.openai.com/en/articles/4936850-where-do-i-find-my-openai-api-key
Claude: https://docs.anthropic.com/claude/reference/getting-started-with-the-api
Gemini: https://ai.google.dev/tutorials/get_started_web
Mistral: https://docs.mistral.ai/
Copy the .env.example in the "service" directory to a new file named .env in the same directory, and replace the placeholders with your actual API keys:
cp ./service/.env.example ./service/.env
# Or use your favorite editor here
vim ./service/.env
If you are building manually (see below), you can instead set the keys as environment variables:
export OPENAI_API_KEY=<your_openai_api_key>
export ANTHROPIC_API_KEY=<your_anthropic_api_key>
export GEMINI_API_KEY=<your_gemini_api_key>
export MISTRAL_API_KEY=<your_mistral_api_key>
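If you go the environment-variable route, a quick sanity check like the one sketched below can confirm the keys are visible to the process before you start the service. This helper is hypothetical (it is not part of the repository); it only checks for the variable names used above.

import os

REQUIRED_KEYS = [
    "OPENAI_API_KEY",
    "ANTHROPIC_API_KEY",
    "GEMINI_API_KEY",
    "MISTRAL_API_KEY",
]

def check_api_keys() -> None:
    # Collect any keys that are unset or empty in the current environment.
    missing = [key for key in REQUIRED_KEYS if not os.environ.get(key)]
    if missing:
        raise RuntimeError(f"Missing API keys: {', '.join(missing)}")

if __name__ == "__main__":
    check_api_keys()
    print("All API keys found.")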
This project uses Poetry for dependency management. To install the project dependencies, first install Poetry:
pip install poetry
Then run:
cd service
poetry install
For the UI, install the npm dependencies:
cd ui
npm install
Perform the following steps, then visit http://localhost:3000 in your browser.
cd service && poetry run python app.py
In a separate terminal:
cd ui
npm start
The different personas in the workshop (editor, agent, writer, and publisher) are created through system prompts, which are defined in service/service/config/config.toml. You can edit this file to use your own prompts instead of the defaults.
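As an illustration of how such a file can be consumed, the sketch below reads a persona prompt with Python's built-in tomllib (Python 3.11+). The table and key names ([personas.editor], system_prompt) are assumptions made for the example, not the repository's actual schema; check the shipped config.toml for the real structure.

import tomllib

# Hypothetical layout: a [personas.editor] table with a system_prompt key.
with open("service/service/config/config.toml", "rb") as f:
    config = tomllib.load(f)

editor_prompt = config["personas"]["editor"]["system_prompt"]
print(editor_prompt)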
I made this because I wanted AI to help me write, not write for me. I wanted the virtual experience of getting feedback from multiple points of view within the same workshop session. And I wanted the ability to receive different points of view—by selecting different models—even for the same persona.
- Streaming to UI
- Different prompts selectable through the UI
- Disable model selection based on token count of writing
- Disable model selection based on usage limitations (e.g., gemini-pro-1.5 allows only 2 requests/minute)
- Support for local models
This project is licensed under the terms of the MIT license.