This code downloads solar forecasts and saves them to a PostgreSQL database. It fetches solar generation estimates for embedded solar farms and processes the data for analysis. We currently collect:
- UK: Forecasts can be retrieved from NESO. Generation data can be retrieved from PVLive.
- NL: Generation values from Ned NL, both national and regional. National forecast values from Ned NL too.
- DE: Generation values from ENTSOE for several TSOs.
Here are the different sources of data, and which methods can be used to save the results:

| Source | Country | CSV | Data Platform | DB (Legacy) | Site DB (Legacy) |
|---|---|---|---|---|---|
| PVLive | 🇬🇧 | ✅ | ✅ | | |
| NESO forecast | 🇬🇧 | ✅ | ✅ | | |
| Ned-nl | 🇳🇱 | ✅ | ✅ | | |
| Ned-nl forecast | 🇳🇱 | ✅ | ✅ | | |
| Germany (ENTSOE) | 🇩🇪 | ✅ | ✅ | | |
To run the application you will need:

- Docker
- Docker Compose
- Clone the repository:

```bash
git clone https://github.com/openclimatefix/neso-solar-consumer.git
cd neso-solar-consumer
```

- Copy the example environment file:

```bash
cp .example.env .env
```

- Start the application:

```bash
docker compose up -d
```

The above command will:
- Start a PostgreSQL database container
- Build and start the NESO Solar Consumer application
- Configure all necessary networking between containers
To stop the application:

```bash
docker compose down
```

To view logs:

```bash
docker compose logs -f
```

Note: The PostgreSQL data is persisted in a Docker volume. To completely reset the database, use:

```bash
docker compose down -v
```
The package provides three main functionalities:
- Data Fetching: Retrieves solar forecast data from the NESO API
- Data Formatting: Processes the data into standardized forecast objects
- Data Storage: Saves the formatted forecasts to a PostgreSQL database
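The three stages above can be sketched as a simple fetch → format → save pipeline. The sketch below is illustrative only: the function names, the stub data, and the `Forecast` dataclass are assumptions for demonstration, not the package's actual API (the real consumer calls the NESO API and writes to PostgreSQL).

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical standardized forecast object: UTC timestamps, power in megawatts.
@dataclass
class Forecast:
    target_time: datetime
    power_mw: float

def fetch_data():
    # Stub for the API-fetching stage; the real consumer queries NESO.
    return [{"time": "2025-01-01T12:00:00Z", "mw": 350.0}]

def format_forecast(raw):
    # Convert raw rows into standardized Forecast objects.
    return [
        Forecast(
            target_time=datetime.fromisoformat(r["time"].replace("Z", "+00:00")),
            power_mw=float(r["mw"]),
        )
        for r in raw
    ]

def save_forecasts(forecasts):
    # Stand-in for the database-saving stage: just print each forecast.
    for f in forecasts:
        print(f.target_time.isoformat(), f.power_mw)

save_forecasts(format_forecast(fetch_data()))
```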
- `fetch_data.py`: Handles API data retrieval
- `format_forecast.py`: Converts raw data into forecast objects
- `save_forecast.py`: Manages database operations
- `app.py`: Orchestrates the entire pipeline
- `DB_URL=postgresql://postgres:postgres@localhost:5432/neso_solar`: Database configuration
- `COUNTRY="gb"`: Country code for fetching data. Currently, other options are `["nl"]`
- `SAVE_METHOD="db"`: Ways to store the data. Currently, other options are `["csv", "site-db"]`
- `CSV_DIR=None`: Directory to save CSV files if `SAVE_METHOD="csv"`.
- `UK_PVLIVE_REGIME=in-day`: For UK PVLive, the regime. Can be `"in-day"` or `"day-after"`
- `UK_PVLIVE_N_GSPS=342`: For UK PVLive, the number of GSPs we pull data for.
- `UK_PVLIVE_BACKFILL_HOURS=2`: For UK PVLive, the number of backfill hours we pull when `regime="in-day"`
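Putting the variables above together, a `.env` file for the default UK database setup might look like this (a sketch using the defaults listed above; adjust the values for your deployment):

```env
DB_URL=postgresql://postgres:postgres@localhost:5432/neso_solar
COUNTRY="gb"
SAVE_METHOD="db"
UK_PVLIVE_REGIME=in-day
UK_PVLIVE_N_GSPS=342
UK_PVLIVE_BACKFILL_HOURS=2
```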
- Set up the development environment:

```bash
pip install ".[dev]"
```

- Run tests:

```bash
pytest
```

- Format code:

```bash
black .
```

- Run linter:

```bash
ruff check .
```

The test suite includes unit tests and integration tests:
```bash
# Run all tests
pytest

# Run specific test file
pytest tests/test_fetch_data.py

# Run with coverage
pytest --cov=neso_solar_consumer
```

This repository has two main CI workflows: `branch-ci` and `merged-ci`.
- `branch-ci` is triggered on all pushes to any branch except `main`, and on any pull request that is opened, reopened or updated. It runs the test suite, lints the project, and builds and pushes a dev image.
- `merged-ci` is triggered on any pull request merged into `main`. It bumps the git tag, and builds and pushes a container with that tag.
Q: What format is the data stored in? A: The data is stored in PostgreSQL using SQLAlchemy models, with timestamps in UTC and power values in megawatts.
Q: How often should I run the consumer? A: This depends on your use case and the NESO API update frequency. The consumer can be scheduled using cron jobs or other scheduling tools.
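For cron-based scheduling, an entry might look like the following. This is a sketch only: the installation path, entry point, and log file are placeholders, not paths defined by this project.

```cron
# Hypothetical crontab entry: run the consumer every 30 minutes
*/30 * * * * cd /opt/neso-solar-consumer && python neso_solar_consumer/app.py >> /var/log/neso-consumer.log 2>&1
```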
This project is licensed under the MIT License - see the LICENSE file for details.
- PRs are welcome! See the Organisation Profile for details on contributing
- Find out about our other projects in the OCF Meta Repo
- Check out the OCF blog for updates
- Follow OCF on LinkedIn
Part of the Open Climate Fix community.
