Merge branch 'develop' into 'master'
Develop into master

See merge request iek-3/shared-code/fine!326
kknos committed Feb 13, 2024
2 parents a8f4a11 + 4666bc5 commit f4ec7bb
Showing 22 changed files with 322 additions and 160 deletions.
86 changes: 86 additions & 0 deletions .github/workflows/test.yml
@@ -0,0 +1,86 @@
on:
push:
branches:
- master
- develop
pull_request:
branches:
- master
- develop
# Allows the workflow to be triggered manually
workflow_dispatch:
branches:
- master
- develop
schedule:
# * is a special character in YAML so you have to quote this string
# Some Examples for cron syntax https://crontab.guru/examples.html
# Schedules the job daily at 00:00 UTC
- cron: '0 0 * * *'
# Weekly on Sunday at 00:00 UTC
# - cron: 0 0 * * 0

jobs:
TestFineSingle:
name: TestFineSingle (${{ matrix.os }})
runs-on: ${{ matrix.os }}
strategy:
fail-fast: false
matrix:
os: ["ubuntu-latest", "macos-latest", "windows-latest"]
steps:
- name: Checkout
uses: actions/checkout@v4
with:
repository: FZJ-IEK3-VSA/FINE
path: './fine'
- uses: conda-incubator/setup-miniconda@v3
with:
miniforge-version: latest
channels: conda-forge
activate-environment: test_env
- name: Run tests
shell: pwsh
run: |
ls
echo "LS Done"
mamba install fine pytest
echo "Installation done"
conda list
echo "libaries printed"
echo "start pytest"
pytest
echo "Pytest done"
TestFineDevLocal:
name: TestFineDevLocal (${{ matrix.os }})
runs-on: ${{ matrix.os }}
strategy:
fail-fast: false
matrix:
os: ["ubuntu-latest", "macos-latest", "windows-latest"]
steps:
- name: Checkout
uses: actions/checkout@v4
with:
repository: FZJ-IEK3-VSA/FINE
path: './fine'
- uses: conda-incubator/setup-miniconda@v3
with:
miniforge-version: latest
channels: conda-forge
activate-environment: test_env
- name: Run tests
shell: pwsh
run: |
ls
echo "LS Done"
cd fine
mamba env create --name fine_env --yes --file requirements_dev.yml
conda run --name fine_env pip install . --no-deps
echo "Installation done"
conda list --name fine_env
echo "libaries printed"
echo "start pytest"
conda run --name fine_env pytest
echo "Pytest done"
15 changes: 9 additions & 6 deletions .gitlab-ci.yml
@@ -30,18 +30,20 @@ variables:
policy: pull
before_script:
- micromamba install -n base -y --file=requirements_dev.yml
- python -m pip install .
rules:
# Switch from branch pipeline to merge pipeline once a merge request has
# been created on the branch.
- if: $CI_COMMIT_BRANCH && $CI_OPEN_MERGE_REQUESTS && $CI_PIPELINE_SOURCE == "push"
when: never
- if: $CI_COMMIT_BRANCH && $CI_OPEN_MERGE_REQUESTS
when: never
retry: 1


.test_docker_template_noupdate:
stage: test
image: jugit-registry.fz-juelich.de/iek-3/shared-code/fine/fine-dev:latest
before_script:
- python -m pip install -e .
rules:
@@ -68,7 +70,7 @@ variables:

.test_docker_template:
stage: test
image: jugit-registry.fz-juelich.de/iek-3/shared-code/fine/fine-dev:latest
variables:
CONDA_PKGS_DIRS: "$CI_PROJECT_DIR/.cache/pkgs"
cache:
@@ -83,6 +85,7 @@ variables:
policy: pull
before_script:
- micromamba install -n base -y --file=requirements_dev.yml
- python -m pip install .
rules:
# Do not run for pushes to master or develop and for merge requests to master
- if: $CI_COMMIT_BRANCH == "master"
@@ -114,7 +117,7 @@ variables:
test-code:
extends: .test_template
script:
- python -m pytest --cov=fine test/
- python -m pytest -n auto --cov=fine test/
rules:
# Run only for pushes to master or develop and for merge requests to master.
# Do not run for scheduled pushes to master (runs `test-code-push-cache` instead).
@@ -129,20 +132,20 @@ test-code-push-cache:
cache:
policy: push
script:
- python -m pytest --cov=fine test/
- python -m pytest -n auto --cov=fine test/
rules:
# Run only for scheduled pushes to master.
- if: '$CI_PIPELINE_SOURCE == "schedule" && $CI_COMMIT_BRANCH == "master"'

test-code-docker:
extends: .test_docker_template
script:
- python -m pytest --cov=fine test/
- python -m pytest -n auto --cov=fine test/

test-code-docker-noupdate:
extends: .test_docker_template_noupdate
script:
- python -m pytest --cov=fine test/
- python -m pytest -n auto --cov=fine test/

test-notebooks:
extends: .test_template
42 changes: 35 additions & 7 deletions README.md
@@ -10,18 +10,35 @@

# ETHOS.FINE - Framework for Integrated Energy System Assessment


The ETHOS.FINE Python package provides a framework for modeling, optimizing and assessing energy systems. With the provided framework, systems with multiple regions, commodities and time steps can be modeled. The target of the optimization is to minimize the total annual cost while considering technical and environmental constraints. Besides using the full temporal resolution, an interconnected typical-period storage formulation can be applied, which reduces the complexity and computational time of the model.
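
For orientation, the following is a minimal usage sketch of how such a model is typically set up. It is only an illustration: the class and argument names follow the patterns used in the FINE examples and are assumptions that may differ between package versions.

```python
# Minimal sketch of a one-region electricity system (assumed API, following the
# patterns in the FINE examples; names and defaults may differ between versions).
import pandas as pd
import fine as fn

esM = fn.EnergySystemModel(
    locations={"region1"},
    commodities={"electricity"},
    commodityUnitsDict={"electricity": "GW_el"},
    numberOfTimeSteps=8760,
    hoursPerTimeStep=1,
)

# Hypothetical flat profiles; real models use measured or simulated time series.
windProfile = pd.DataFrame(0.5, index=range(8760), columns=["region1"])
demand = pd.DataFrame(10.0, index=range(8760), columns=["region1"])

esM.add(fn.Source(esM=esM, name="Wind", commodity="electricity",
                  hasCapacityVariable=True, operationRateMax=windProfile,
                  investPerCapacity=1.1, interestRate=0.08, economicLifetime=20))
esM.add(fn.Sink(esM=esM, name="Demand", commodity="electricity",
                hasCapacityVariable=False, operationRateFix=demand))

# Minimizes the total annual cost; requires an installed MILP solver (see below).
esM.optimize(timeSeriesAggregation=False, solver="glpk")
```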

This readme provides information on the installation of the package. For further information, have a look at the [documentation](https://vsa-fine.readthedocs.io/en/latest/).

ETHOS.FINE is used to model a diverse set of optimization problems within the [Energy Transformation PatHway Optimization Suite (ETHOS) at IEK-3](https://www.fz-juelich.de/de/iek/iek-3/leistungen/model-services).

If you want to use ETHOS.FINE in a published work, please [**kindly cite the following publication**](https://www.sciencedirect.com/science/article/pii/S036054421830879X), which describes the first stages of the framework. The Python package that provides the time series aggregation module and its corresponding literature can be found [here](https://github.com/FZJ-IEK3-VSA/tsam).

## Features
* representation of an energy system by multiple locations, commodities and time steps
* complexity reducing storage formulation based on typical periods
## Content
<!-- TOC -->
* [Requirements](#requirements)
* [Python package manager](#python-package-manager)
* [Mixed Integer Linear Programming (MILP) solver](#mixed-integer-linear-programming-milp-solver)
* [Installation](#installation)
* [Installation via conda-forge](#installation-via-conda-forge)
* [Installation from a local folder](#installation-from-a-local-folder)
* [Installation for developers](#installation-for-developers)
* [Installation of an optimization solver](#installation-of-an-optimization-solver)
* [Gurobi installation](#gurobi-installation)
* [GLPK installation](#glpk-installation)
* [CBC](#cbc)
* [Examples](#examples)
* [License](#license)
* [About Us](#about-us-)
* [Contributions and Users](#contributions-and-users)
* [Acknowledgement](#acknowledgement)
<!-- TOC -->

## Documentation
A "Read the Docs" documentation of ETHOS.FINE can be found [here](https://vsa-fine.readthedocs.io/en/latest/).

## Requirements

@@ -46,7 +63,7 @@ mamba create -n fine -c conda-forge fine
### Installation from a local folder
Alternatively, you can first clone the content of this repository and perform the installation from there:

1. Clone the content of this repository
1. (Shallow) clone the content of this repository
```bash
git clone --depth 1 https://github.com/FZJ-IEK3-VSA/FINE.git
```
@@ -62,14 +79,25 @@ mamba env create -f requirements.yml
```bash
mamba activate fine
```
6. Install FINE with:
```bash
python -m pip install --no-deps .
```

### Installation for developers
If you want to work on the FINE codebase, you need to run
```bash
git clone https://github.com/FZJ-IEK3-VSA/FINE.git
```
to get the whole git history and then
```bash
mamba env create -f requirements_dev.yml
```
This installs additional dependencies such as `pytest`.

Finally, install FINE from the folder in editable mode, so that changes in the folder are reflected in the package installation:
```bash
python -m pip install --no-deps --editable .
```
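
As a quick sanity check (not part of the official instructions), you can verify that the editable install resolves to the cloned folder rather than to `site-packages`; a small sketch:

```python
import fine

# With an editable install, the imported package should resolve to the cloned
# repository (e.g. .../FINE/fine/__init__.py) rather than to site-packages.
print(fine.__file__)
```
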
Test your installation with the following command in the project root folder:
```
pytest
2 changes: 2 additions & 0 deletions docs/source/newsDoc.rst
@@ -2,6 +2,8 @@
FINE's News Feed
################

Since version 2.3.3, this news feed is no longer updated. Please refer to the `release page <https://github.com/FZJ-IEK3-VSA/FINE/releases>`_ for changelogs.

*********************
Release version 2.3.2
*********************
30 changes: 15 additions & 15 deletions fine/IOManagement/standardIO.py
@@ -95,7 +95,7 @@ def writeOptimizationOutputToExcel(
if not optSum.empty:
optSum.to_excel(
writer,
name[:-5]
sheet_name=name[:-5]
+ "OptSummary_"
+ esM.componentModelingDict[name].dimension,
)
@@ -123,7 +123,7 @@ def writeOptimizationOutputToExcel(
((dfTD1dim != 0) & (~dfTD1dim.isnull())).any(axis=1)
]
if not dfTD1dim.empty:
dfTD1dim.to_excel(writer, name[:-5] + "_TDoptVar_1dim")
dfTD1dim.to_excel(writer, sheet_name=name[:-5] + "_TDoptVar_1dim")
if dataTD2dim:
names = ["Variable", "Component", "LocationIn", "LocationOut"]
dfTD2dim = pd.concat(dataTD2dim, keys=indexTD2dim, names=names)
@@ -132,7 +132,7 @@
((dfTD2dim != 0) & (~dfTD2dim.isnull())).any(axis=1)
]
if not dfTD2dim.empty:
dfTD2dim.to_excel(writer, name[:-5] + "_TDoptVar_2dim")
dfTD2dim.to_excel(writer, sheet_name=name[:-5] + "_TDoptVar_2dim")
if dataTI:
if esM.componentModelingDict[name].dimension == "1dim":
names = ["Variable type", "Component"]
@@ -144,7 +144,7 @@
if not dfTI.empty:
dfTI.to_excel(
writer,
name[:-5]
sheet_name=name[:-5]
+ "_TIoptVar_"
+ esM.componentModelingDict[name].dimension,
)
@@ -154,7 +154,7 @@
periodsOrder = pd.DataFrame(
[esM.periodsOrder[_ip]], index=["periodsOrder"], columns=esM.periods
)
periodsOrder.to_excel(writer, "Misc")
periodsOrder.to_excel(writer, sheet_name="Misc")
if esM.segmentation:
ls = []
for i in esM.periodsOrder[_ip].tolist():
Expand All @@ -163,7 +163,7 @@ def writeOptimizationOutputToExcel(
columns={"Segment Duration": "timeStepsPerSegment"}
)
segmentDuration.index.name = "segmentNumber"
segmentDuration.to_excel(writer, "Misc", startrow=3)
segmentDuration.to_excel(writer, sheet_name="Misc", startrow=3)
utils.output("\tSaving file...", esM.verbose, 0)
writer.close()
utils.output("Done. (%.4f" % (time.time() - _t) + " sec)", esM.verbose, 0)
@@ -809,7 +809,7 @@ def plotLocations(
locationsShapeFileName,
indexColumn,
plotLocNames=False,
crs="epsg:3035",
crs="EPSG:3035",
faceColor="none",
edgeColor="black",
fig=None,
@@ -840,7 +840,7 @@
:type plotLocNames: boolean
:param crs: coordinate reference system
|br| * the default value is 'epsg:3035'
|br| * the default value is 'EPSG:3035'
:type crs: string
:param faceColor: face color of the plot
@@ -884,7 +884,7 @@
:type dpi: scalar > 0
"""

gdf = gpd.read_file(locationsShapeFileName).to_crs({"init": crs})
gdf = gpd.read_file(locationsShapeFileName).to_crs(crs)

if ax is None:
fig, ax = plt.subplots(1, 1, figsize=figsize, **kwargs)
@@ -919,7 +919,7 @@ def plotTransmission(
loc0,
loc1,
ip=0,
crs="epsg:3035",
crs="EPSG:3035",
variableName="capacityVariablesOptimum",
color="k",
loc=7,
@@ -961,7 +961,7 @@
:type ip: int
:param crs: coordinate reference system
|br| * the default value is 'epsg:3035'
|br| * the default value is 'EPSG:3035'
:type crs: string
:param variableName: parameter for plotting installed capacity ('_capacityVariablesOptimum') or operation
@@ -1024,7 +1024,7 @@
if capMax == 0:
return fig, ax
cap = cap / capMax
gdf = gpd.read_file(transmissionShapeFileName).to_crs({"init": crs})
gdf = gpd.read_file(transmissionShapeFileName).to_crs(crs)

if ax is None:
fig, ax = plt.subplots(1, 1, figsize=figsize, **kwargs)
@@ -1082,7 +1082,7 @@ def plotLocationalColorMap(
ip=0,
perArea=True,
areaFactor=1e3,
crs="epsg:3035",
crs="EPSG:3035",
variableName="capacityVariablesOptimum",
doSum=False,
cmap="viridis",
@@ -1128,7 +1128,7 @@
:type areaFactor: scalar > 0
:param crs: coordinate reference system
|br| * the default value is 'epsg:3035'
|br| * the default value is 'EPSG:3035'
:type crs: string
:param variableName: parameter for plotting installed capacity ('_capacityVariablesOptimum') or operation
@@ -1184,7 +1184,7 @@

if doSum:
data = data.sum(axis=1)
gdf = gpd.read_file(locationsShapeFileName).to_crs({"init": crs})
gdf = gpd.read_file(locationsShapeFileName).to_crs(crs)

# Make sure the data and gdf indices match
## 1. Sort the indices to obtain same order
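
The plotting helpers above now pass the coordinate reference system directly to `GeoDataFrame.to_crs` and use the upper-case `EPSG:3035` default; the old `{"init": crs}` dict relied on `init`-style proj syntax that is deprecated in recent pyproj/GeoPandas releases. A minimal sketch of the new call pattern (the shapefile path is hypothetical):

```python
import geopandas as gpd

# Old, deprecated form: gdf.to_crs({"init": "epsg:3035"})
# Supported form: pass the CRS string directly (or use the epsg keyword).
gdf = gpd.read_file("locations.shp")       # hypothetical shapefile
gdf_projected = gdf.to_crs("EPSG:3035")    # equivalently: gdf.to_crs(epsg=3035)
print(gdf_projected.crs)
```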