
Limited Area Mask breaking Hierarchical Graphs #123

Open
icedoom888 opened this issue Feb 6, 2025 · 0 comments · May be fixed by #102
Labels
bug Something isn't working

Comments

@icedoom888
Contributor

icedoom888 commented Feb 6, 2025

What happened?

The following lines in anemoi/training/train/forecaster.py break training when a hierarchical graph is used:

# Check if the model is a stretched grid
if graph_data["hidden"].node_type == "StretchedTriNodes":
    mask_name = config.graph.nodes.hidden.node_builder.mask_attr_name
    limited_area_mask = graph_data[config.graph.data][mask_name].squeeze().bool()
else:
    limited_area_mask = torch.ones((1,))

Hierarchical graphs have a graph structure defined as:

data: "data"
hidden:
  - "hidden_1"
  - "hidden_2"
  - "hidden_3"

Therefore graph_data["hidden"] cannot be accessed, as that key does not exist.
For hierarchical graphs, multiple masks can be computed, one per depth.
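One way the check could be made layout-aware is to first normalise the `hidden` entry, which is a single name for flat graphs but a list of names for hierarchical ones. The following is a minimal plain-Python sketch, not the project's actual fix: the helper names are hypothetical, and `graph_data` is modelled as nested dicts rather than the real graph object.

```python
def resolve_hidden_names(hidden):
    """Normalise the config's 'hidden' entry.

    A flat graph gives a single name ("hidden"); a hierarchical graph
    gives a list of names (e.g. ["hidden_1", "hidden_2", "hidden_3"]).
    """
    if isinstance(hidden, str):
        return [hidden]
    return list(hidden)


def is_stretched_grid(graph_data, hidden_names):
    """Return True only for a flat stretched-grid graph.

    Only a single hidden node set of type "StretchedTriNodes" carries a
    limited-area mask; hierarchical graphs never match, so the caller can
    safely fall back to the trivial all-ones mask.
    """
    return (
        len(hidden_names) == 1
        and hidden_names[0] in graph_data
        and graph_data[hidden_names[0]].get("node_type") == "StretchedTriNodes"
    )
```

With such a guard, the mask lookup in `forecaster.py` would only run when the single `hidden` node set actually exists, instead of unconditionally indexing `graph_data["hidden"]`.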

What are the steps to reproduce the bug?

Run any training config using a hierarchical graph architecture.

Version

training-0.3.3

Platform (OS and architecture)

Linux

Relevant log output

Accompanying data

No response

Organisation

MeteoSwiss

@icedoom888 icedoom888 added the bug Something isn't working label Feb 6, 2025
@icedoom888 icedoom888 linked a pull request Feb 6, 2025 that will close this issue