entl/evolyte-backend
FastAPI Solar Forecasting Backend

This project provides the main backend services for a solar energy forecasting platform, built with FastAPI and PostgreSQL.

🔍 Logging and Monitoring

All logs generated by the backend are structured and shipped to Elasticsearch, allowing for centralised storage and analysis. These logs are visualised using Kibana, enabling real-time monitoring, filtering, and diagnostics of backend behaviour and API activity.

To access Kibana (if running locally or in Docker):

  • Open: http://localhost:5601
  • To filter logs by backend service in Kibana, use the following query in the Discover tab:
container.labels.family: "backend-api"

Make sure the Elasticsearch and Kibana services are running and properly configured in your Docker or deployment environment. An example Docker Compose setup for Elasticsearch and Kibana can be found in the docker-elk repository.
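The README does not show how the logs are structured before shipping. As a rough illustration only (not the project's actual logging setup), a JSON formatter built on Python's standard `logging` module might look like the sketch below; the `container.labels.family` field name is an assumption mirroring the Kibana query above:

```python
import json
import logging


class JsonFormatter(logging.Formatter):
    """Render each log record as a single JSON line, which a log shipper
    feeding Elasticsearch can index without extra parsing."""

    def format(self, record: logging.LogRecord) -> str:
        payload = {
            "timestamp": self.formatTime(record),
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
            # Label mirroring the Kibana filter shown above (assumed name).
            "container.labels.family": "backend-api",
        }
        return json.dumps(payload)


handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
logger = logging.getLogger("evolyte.backend")
logger.addHandler(handler)
logger.setLevel(logging.INFO)
logger.info("forecast request received")
```

In practice a dedicated structured-logging library would typically replace this hand-rolled formatter, but the shipped field-per-key shape is the same.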


Setup Instructions

1. Clone the repository

git clone https://github.com/entl/evolyte-backend
cd evolyte-backend

2. Create a virtual environment and activate it

python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

3. Install required dependencies

pip install -r requirements.txt

4. Create a .env file

Create a .env file in the project root directory and add the following environment variables:

PG_DATABASE_HOSTNAME
PG_DATABASE_PORT
PG_DATABASE_PASSWORD
PG_DATABASE_NAME
PG_DATABASE_USERNAME

JWT_SECRET_KEY
JWT_ALGORITHM
JWT_TOKEN_EXPIRE_MINUTES

ML_API_URL

CORS_ORIGINS (list of allowed origins)

Note:

  • Update PG_DATABASE_PASSWORD, PG_DATABASE_NAME, and other variables if needed.
  • Make sure your PostgreSQL server is running.
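As a minimal sketch of how these variables might be read at startup (the project likely uses a settings library such as pydantic instead; the default values below are illustrative only, and the expiry variable is assumed to be read as `JWT_TOKEN_EXPIRE_MINUTES`):

```python
import os
from dataclasses import dataclass


@dataclass
class Settings:
    """Illustrative loader for the environment variables listed above."""

    pg_hostname: str
    pg_port: int
    jwt_secret_key: str
    jwt_algorithm: str
    jwt_token_expire_minutes: int

    @classmethod
    def from_env(cls) -> "Settings":
        # Defaults here are placeholders; real deployments should set
        # every variable explicitly in .env.
        return cls(
            pg_hostname=os.getenv("PG_DATABASE_HOSTNAME", "localhost"),
            pg_port=int(os.getenv("PG_DATABASE_PORT", "5432")),
            jwt_secret_key=os.getenv("JWT_SECRET_KEY", ""),
            jwt_algorithm=os.getenv("JWT_ALGORITHM", "HS256"),
            jwt_token_expire_minutes=int(
                os.getenv("JWT_TOKEN_EXPIRE_MINUTES", "30")
            ),
        )


settings = Settings.from_env()
```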

5. Apply database migrations

Run Alembic migrations to upgrade the database schema:

alembic upgrade head

6. Populate the database

After running the migrations, populate the solar_panels table by importing the provided CSV file into your PostgreSQL database.

The solar_panels.csv file is located in the project root directory.

Example command using psql:

psql -h <HOST> -U <USERNAME> -d <DATABASE> -c "\COPY solar_panels(column1, column2, ...) FROM './solar_panels.csv' DELIMITER ',' CSV HEADER;"
  • Replace <HOST>, <USERNAME>, and <DATABASE> with your database details.
  • Make sure the columns match your database schema.

Alternatively, you can use your preferred PostgreSQL client (like DBeaver, pgAdmin, etc.) to import the file.

7. Run the FastAPI backend service

uvicorn src.main:app --reload --port=8001

8. Access the API documentation

Once the server is running, the interactive API documentation is available at http://localhost:8001/docs (Swagger UI) and http://localhost:8001/redoc (ReDoc).
