Merge pull request #3346 from GEOS-ESM/develop
mathomp4 authored Jan 17, 2025
2 parents c65548b + 16b7f41 commit c41f37c
Showing 32 changed files with 1,209 additions and 316 deletions.
2 changes: 1 addition & 1 deletion .circleci/config.yml
@@ -16,7 +16,7 @@ parameters:

# Anchors to prevent forgetting to update a version
os_version: &os_version ubuntu24
baselibs_version: &baselibs_version v7.27.0
baselibs_version: &baselibs_version v7.29.0
bcs_version: &bcs_version v11.6.0
tag_build_arg_name: &tag_build_arg_name maplversion

6 changes: 6 additions & 0 deletions .github/workflows/docs.yml
@@ -17,6 +17,9 @@ jobs:
steps:
- name: Checkout
uses: actions/checkout@v4
with:
fetch-depth: 0
filter: blob:none

- name: Build and Deploy Docs
uses: ./.github/actions/deploy-ford-docs
@@ -34,6 +37,9 @@ jobs:
steps:
- name: Checkout
uses: actions/checkout@v4
with:
fetch-depth: 0
filter: blob:none

- name: Build and Deploy Dev Docs
uses: ./.github/actions/deploy-ford-docs
1 change: 1 addition & 0 deletions .github/workflows/push-to-develop.yml
@@ -14,6 +14,7 @@ jobs:
uses: actions/checkout@v4
with:
fetch-depth: 0
filter: blob:none
- name: Run the action
uses: devops-infra/[email protected]
with:
1 change: 1 addition & 0 deletions .github/workflows/push-to-main.yml
@@ -14,6 +14,7 @@ jobs:
uses: actions/checkout@v4
with:
fetch-depth: 0
filter: blob:none
- name: Run the action
uses: devops-infra/[email protected]
with:
6 changes: 5 additions & 1 deletion .github/workflows/validate_yaml_files.yml
@@ -15,7 +15,11 @@ jobs:
validate-YAML:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Checkout repo
uses: actions/checkout@v4
with:
fetch-depth: 0
filter: blob:none
- id: yaml-lint
name: yaml-lint
uses: ibiqlik/action-yamllint@v3
8 changes: 6 additions & 2 deletions .github/workflows/workflow.yml
@@ -19,6 +19,9 @@ jobs:
steps:
- name: Checkout
uses: actions/checkout@v4
with:
fetch-depth: 0
filter: blob:none

- name: Build and Deploy Docs
uses: ./.github/actions/deploy-ford-docs
@@ -35,7 +38,7 @@
name: Build and Test MAPL GNU
runs-on: ubuntu-latest
container:
image: gmao/ubuntu24-geos-env-mkl:v7.27.0-openmpi_5.0.5-gcc_14.2.0
image: gmao/ubuntu24-geos-env-mkl:v7.29.0-openmpi_5.0.5-gcc_14.2.0
# Per https://github.com/actions/virtual-environments/issues/1445#issuecomment-713861495
# It seems like we might not need secrets on GitHub Actions which is good for forked
# pull requests
@@ -86,7 +89,7 @@ jobs:
name: Build and Test MAPL Intel
runs-on: ubuntu-latest
container:
image: gmao/ubuntu24-geos-env:v7.27.0-intelmpi_2021.13-ifort_2021.13
image: gmao/ubuntu24-geos-env:v7.29.0-intelmpi_2021.13-ifort_2021.13
# Per https://github.com/actions/virtual-environments/issues/1445#issuecomment-713861495
# It seems like we might not need secrets on GitHub Actions which is good for forked
# pull requests
@@ -102,6 +105,7 @@
uses: actions/checkout@v4
with:
fetch-depth: 1
filter: blob:none
- name: Set all directories as git safe
run: |
git config --global --add safe.directory '*'
32 changes: 32 additions & 0 deletions CHANGELOG.md
@@ -17,6 +17,38 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0

### Deprecated

## [2.52.0] - 2025-01-17

### Added

- Added subroutine to read nc4 tile file
- Added optional `start_date` and `start_time` to control the output window for each History collection. No output will be written before then. If not specified, these default to the beginning of the experiment.
- Added utility to prepare inputs for `ExtDataDriver.x` so that ExtData can simulate a real GEOS run
- Added loggers when writing or reading weight files
- Added new option to AGCM.rc `overwrite_checkpoint` to allow checkpoint files to be overwritten. By default, checkpoints will still not be overwritten
- The trajectory sampler netCDF output variable `location_index_in_iodafile` can be turned off via two new control variables: `use_NWP_1_file` and `restore_2_obs_vector`. When set to true, these options select only one obs file at each Epoch interval and rotate the output field index back to the location vector in the obs file before generating netCDF output.
- Support `splitfield: 1` in HISTORY.rc for trajectory sampler

### Changed

- Changed `MAPL_ESMFRegridder` to require the dstMaskValues to be added as grid attribute to use fixed masking, fixes UFS issue
- Increased formatting width of time index in ExtData2G diagnostic print
- Updated GitHub checkout action to use blobless clones
- Update CI to use Baselibs 7.29.0 by default
  - This provides ESMF 8.8.0
- Update `components.yaml`
  - `ESMA_env` v4.34.0
    - Update to MPT 2.30 at NAS
    - Update to Baselibs 7.29.0 (ESMF 8.8.0)
  - `ESMA_cmake` v3.56.0
    - Use `LOCATION` Python `FIND_STRATEGY`

### Fixed

- Free MPI communicators after reading and/or writing of restarts
- Fixed the behavior of `MAPL_MaxMin` in presence of NaN
- Fixed bug with return codes and macros in udunits2f

## [2.51.2] - 2024-12-19

### Changed
2 changes: 1 addition & 1 deletion CMakeLists.txt
@@ -8,7 +8,7 @@ endif ()

project (
MAPL
VERSION 2.51.2
VERSION 2.52.0
LANGUAGES Fortran CXX C) # Note - CXX is required for ESMF

# Set the possible values of build type for cmake-gui
58 changes: 58 additions & 0 deletions Tests/generate_extdatadriver_input.md
@@ -0,0 +1,58 @@
# Introduction
This is a simple utility to generate inputs for ExtDataDriver.x so that ExtData can simulate a real GEOS run. It requires three inputs that are passed in:
- A list of items ExtData needs to fill. This can be found by looking at the GEOS log
- The import spec of the GCM component, obtained by using the printspec option to GEOS
- A directory with all the needed YAML files; the name of the directory is passed in

This will generate the import and optionally the export (as well as History) definitions for ExtDataDriver.x to spare the human the tedious work.

# Example Inputs
To get the list of ExtData items, just grab all the lines that look like this into a file:
```
EXTDATA: INFO: ---- 00001: BC_AIRCRAFT
EXTDATA: INFO: ---- 00002: BC_ANTEBC1
EXTDATA: INFO: ---- 00003: BC_ANTEBC2
EXTDATA: INFO: ---- 00004: BC_AVIATION_CDS
EXTDATA: INFO: ---- 00005: BC_AVIATION_CRS
EXTDATA: INFO: ---- 00006: BC_AVIATION_LTO
EXTDATA: INFO: ---- 00007: BC_BIOFUEL
EXTDATA: INFO: ---- 00008: BC_BIOMASS
EXTDATA: INFO: ---- 00009: BC_SHIP
EXTDATA: INFO: ---- 00010: BRC_AIRCRAFT
EXTDATA: INFO: ---- 00011: BRC_ANTEBRC1
EXTDATA: INFO: ---- 00012: BRC_ANTEBRC2
EXTDATA: INFO: ---- 00013: BRC_AVIATION_CDS
EXTDATA: INFO: ---- 00014: BRC_AVIATION_CRS
EXTDATA: INFO: ---- 00015: BRC_AVIATION_LTO
EXTDATA: INFO: ---- 00016: BRC_BIOFUEL
EXTDATA: INFO: ---- 00017: BRC_BIOMASS
EXTDATA: INFO: ---- 00018: BRC_SHIP
EXTDATA: INFO: ---- 00019: BRC_TERPENE
```
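The utility's `get_vars_needed` routine simply takes the fifth whitespace-separated token of each such log line as the variable name. A minimal sketch of that parsing, using two lines copied from the excerpt above:

```python
# Sample lines copied from the GEOS log excerpt
lines = [
    "EXTDATA: INFO: ---- 00001: BC_AIRCRAFT",
    "EXTDATA: INFO: ---- 00002: BC_ANTEBC1",
]

# The variable name is the fifth whitespace-separated token on each line
vars_needed = [line.split()[4] for line in lines]
print(vars_needed)  # ['BC_AIRCRAFT', 'BC_ANTEBC1']
```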

To get the GCM component spec, run with `PRINTSPEC: 1` in the `CAP.rc` and copy lines out that look like this:
```
#IMPORT spec for GCM
#COMPONENT, SHORT_NAME, LONG_NAME, UNIT, DIMS, CONTAINER_TYPE
GENERIC: INFO: GCM, WSUB_CLIM, stdev in vertical velocity, m s-1, 3, esmf_field
GENERIC: INFO: GCM, MEGAN_ORVC, MEGAN_ORVC, kgC/m2/s, 2, esmf_field
GENERIC: INFO: GCM, CLM4_PFT_CROP, CLM4_PFT_CROP, 1, 2, esmf_field
GENERIC: INFO: GCM, CLM4_PFT_C4_GRSS, CLM4_PFT_C4_GRSS, 1, 2, esmf_field
GENERIC: INFO: GCM, CLM4_PFT_C3_NARC_GRSS, CLM4_PFT_C3_NARC_GRSS, 1, 2, esmf_field
GENERIC: INFO: GCM, CLM4_PFT_C3_ARCT_GRSS, CLM4_PFT_C3_ARCT_GRSS, 1, 2, esmf_field
GENERIC: INFO: GCM, CLM4_PFT_BDLF_DECD_BORL_SHRB, CLM4_PFT_BDLF_DECD_BORL_SHRB, 1, 2, esmf_field
GENERIC: INFO: GCM, CLM4_PFT_BDLF_DECD_TMPT_SHRB, CLM4_PFT_BDLF_DECD_TMPT_SHRB, 1, 2, esmf_field
GENERIC: INFO: GCM, CLM4_PFT_BDLF_EVGN_SHRB, CLM4_PFT_BDLF_EVGN_SHRB, 1, 2, esmf_field
GENERIC: INFO: GCM, CLM4_PFT_BDLF_DECD_BORL_TREE, CLM4_PFT_BDLF_DECD_BORL_TREE, 1, 2, esmf_field
GENERIC: INFO: GCM, CLM4_PFT_BDLF_DECD_TMPT_TREE, CLM4_PFT_BDLF_DECD_TMPT_TREE, 1, 2, esmf_field
GENERIC: INFO: GCM, CLM4_PFT_BDLF_DECD_TROP_TREE, CLM4_PFT_BDLF_DECD_TROP_TREE, 1, 2, esmf_field
GENERIC: INFO: GCM, CLM4_PFT_BDLF_EVGN_TMPT_TREE, CLM4_PFT_BDLF_EVGN_TMPT_TREE, 1, 2, esmf_field
GENERIC: INFO: GCM, CLM4_PFT_BDLF_EVGN_TROP_TREE, CLM4_PFT_BDLF_EVGN_TROP_TREE, 1, 2, esmf_field
GENERIC: INFO: GCM, CLM4_PFT_NDLF_DECD_BORL_TREE, CLM4_PFT_NDLF_DECD_BORL_TREE, 1, 2, esmf_field
GENERIC: INFO: GCM, CLM4_PFT_NDLF_EVGN_BORL_TREE, CLM4_PFT_NDLF_EVGN_BORL_TREE, 1, 2, esmf_field
GENERIC: INFO: GCM, CLM4_PFT_NDLF_EVGN_TMPT_TREE, CLM4_PFT_NDLF_EVGN_TMPT_TREE, 1, 2, esmf_field
```
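Each of these lines is comma-separated after the `GENERIC: INFO:` prefix, giving the component, short name, long name, units, dims, and container type; the dims column (`2` or `3`) is what the script maps to `xy`/`xyz`. A minimal sketch of parsing one line, matching the field indices the script uses:

```python
dims_dict = {"2": "xy", "3": "xyz"}  # same mapping the script uses

# One line copied from the printspec excerpt
line = "GENERIC: INFO: GCM, WSUB_CLIM, stdev in vertical velocity, m s-1, 3, esmf_field"

# Fields 1..5 after splitting on commas: short name, long name, units, dims, container type
values = [v.strip() for v in line.split(',')]
short_name, long_name, units, dims, item_type = values[1:6]
print(short_name, units, dims_dict[dims])  # WSUB_CLIM m s-1 xyz
```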

Finally, just grab the right YAML files for ExtData.

To run, you will of course need to do some further editing of the produced files and link in the actual data using the same convention `gcm_run.j` does.
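The script takes the three inputs as positional arguments plus an optional `-e` flag for exports. A minimal sketch of how the command line is parsed, mirroring the script's `argparse` setup (the file and directory names here are placeholders, not real inputs):

```python
import argparse

# Mirror of parse_args() in generate_extdatadriver_input.py
p = argparse.ArgumentParser(description='Generate input files for ExtDataDriver to simulate GEOS')
p.add_argument('extdata_provided', type=str, help='a list of items ExtData should fill')
p.add_argument('spec_def', type=str, help='the GEOS gcm import state from the printspec')
p.add_argument('extdata_dir', type=str, help='directory with all the yaml inputs for extdata')
p.add_argument('-e', '--export', action='store_true', help='also include exports for corresponding imports')

# Placeholder names standing in for the log excerpt, printspec excerpt,
# and ExtData yaml directory
args = vars(p.parse_args(['extdata_items.txt', 'printspec.txt', 'extdata_yamls', '-e']))
print(args['export'])  # True
```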
167 changes: 167 additions & 0 deletions Tests/generate_extdatadriver_input.py
@@ -0,0 +1,167 @@
#!/usr/bin/env python3
import argparse
import glob

import yaml

dims_dict = {"2": "xy", "3": "xyz"}

def get_dims(component_map, name):
    for comp in component_map:
        if name in component_map[comp]:
            dims = component_map[comp][name]["dims"]
            return dims

def get_vars_needed(input_file):

    output_list = []
    f = open(input_file, "r")
    lines = f.readlines()
    f.close()
    for line in lines:
        temp = line.split()
        output_list.append(temp[4])

    return output_list

def get_extdata_map(input_dir):

    input_files = glob.glob(input_dir + "/*.yaml")
    export_list = {}
    for input_file in input_files:
        f = open(input_file, 'r')
        extdata_def = yaml.safe_load(f)
        f.close()
        for short_name in extdata_def["Exports"]:
            export_list.update({short_name: extdata_def["Exports"][short_name]})
    return export_list

def get_block(cvs_file):

    temp = cvs_file[0].split(' ')
    state_type = temp[1][1:].strip()
    component = temp[4].strip()
    i = 2
    for line in cvs_file[2:]:
        if "spec for" in line:
            break
        i = i + 1
    return component, state_type, i


def get_component_map(cvs_file):

    i_start = 0
    i_end = 0
    n_lines = len(cvs_file)
    components = {}
    while i_start < n_lines - 1:

        comp_name, state_type, i_end = get_block(cvs_file[i_start:])
        comp_map = {}
        for i in range(i_end - 2):
            line = cvs_file[i_start + 2 + i]
            values = line.split(',')
            short_name = values[1].strip()
            long_name = values[2].strip()
            units = values[3].strip()
            dims = values[4].strip()
            item_type = values[5].strip()
            comp_map.update({short_name: {"long_name": long_name, "units": units, "item_type": item_type, "dims": dims}})
        components.update({comp_name + "_" + state_type: comp_map})
        i_start = i_start + i_end

    return components

def parse_args():
    p = argparse.ArgumentParser(description='Generate input files for ExtDataDriver to simulate GEOS')
    p.add_argument('extdata_provided', type=str, help='a list of items ExtData should fill', default=None)
    p.add_argument('spec_def', type=str, help='the GEOS gcm import state from the printspec', default=None)
    p.add_argument('extdata_dir', type=str, help='directory with all the yaml inputs for extdata', default=None)
    p.add_argument('-e', '--export', action='store_true', help='also include exports for corresponding imports')

    return vars(p.parse_args())

if __name__ == '__main__':

    args = parse_args()

    extdata_list = args['extdata_provided']
    do_exports = args['export']
    input_file = args['spec_def']
    f = open(input_file, "r")
    input_rc = f.readlines()
    f.close()

    extdata_directory = args['extdata_dir']
    extdata_def = get_extdata_map(extdata_directory)

    f_agcm = open("AGCM.rc", 'w')
    # component
    component_map = get_component_map(input_rc)

    vars_needed = get_vars_needed(extdata_list)

    nl = "\n"
    cm = " , "

    # Import state
    written = []
    f_agcm.write("IMPORT_STATE::" + nl)

    for item in vars_needed:
        if item in extdata_def:
            long_name = "NA"
            units = "NA"
            dims = get_dims(component_map, item)
            cdims = dims_dict[dims]

            if item not in written:
                f_agcm.write(item + cm + long_name + cm + units + cm + cdims + cm + "c" + nl)
                written.append(item)

    f_agcm.write("::" + nl)

    # Export state
    if do_exports:
        written = []
        f_agcm.write("EXPORT_STATE::" + nl)
        for item in vars_needed:
            if item in extdata_def:
                long_name = "NA"
                units = "NA"
                dims = get_dims(component_map, item)
                cdims = dims_dict[dims]

                if item not in written:
                    f_agcm.write(item + cm + long_name + cm + units + cm + cdims + cm + "c" + nl)
                    written.append(item)

        f_agcm.write("::" + nl)

    f_hist = open("HISTORY.rc", 'w')
    f_hist.write("GRID_LABELS:" + nl)
    f_hist.write("::" + nl)
    f_hist.write("COLLECTIONS: my_collection" + nl)
    f_hist.write("::" + nl)
    f_hist.write("my_collection.template: 'nc4'" + nl)
    f_hist.write("my_collection.format: 'CFIO'" + nl)
    f_hist.write("my_collection.frequency: '240000'" + nl)
    first = True
    written = []
    for item in vars_needed:
        if item in extdata_def:
            if item not in written:
                if first:
                    first = False
                    f_hist.write("my_collection.fields:'" + item + "' , 'Root'," + "\n")
                else:
                    f_hist.write("'" + item + "' , 'Root'," + "\n")
                written.append(item)
    f_hist.write("::")

    f_hist.close()
    f_agcm.close()
