File: docs/source/change_log.rst
.. note::

   Go to the `Releases <https://github.com/Dewberry/stormhub/releases>`__ page for a list of all releases.

Release v0.1.0
==============

**Tag:** v0.1.0

**Published at:** 2025-01-31T21:13:10Z

**Author:** github-actions[bot]

**Release Notes:**

Summary
^^^^^^^

This feature release adds routines for developing STAC catalogs using AORC data. Source code was ported from existing internal libraries and refactored for this initial release.

Release v0.1.0rc1
=================

**Release Notes:**

Initial code compilation from existing internal libraries, adapted for STAC catalogs.
File: docs/source/tech_summary.rst
Data Source
-----------

The Analysis Of Record for Calibration (AORC) dataset is available on the AWS `Registry of Open Data <https://registry.opendata.aws/noaa-nws-aorc/>`_ and provides the
source data for the precipitation and temperature data used in this module. (Other sources may be added but are not currently available.) This gridded, hourly data is available for the CONUS
beginning on 1972-02-01 and is updated regularly.
The primary inputs used for the data development workflow are as follows.

Watershed = The watershed which will be used for hydrologic modeling.

Transposition Area = A region containing the watershed that has been developed as a hydro-meteorologically homogeneous region.
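Both inputs are polygon geometries. As a minimal sketch of reading them with the standard library (assuming GeoJSON files, as in the repository's example-input-data; the paths and helper function are illustrative, not stormhub's API):

```python
import json

def load_geometry(path: str) -> dict:
    """Read a GeoJSON file and return its (first) geometry.

    Handles a bare geometry, a Feature, or a FeatureCollection; this
    file layout is an assumption, not stormhub's required format.
    """
    with open(path) as f:
        gj = json.load(f)
    if gj.get("type") == "FeatureCollection":
        gj = gj["features"][0]
    if gj.get("type") == "Feature":
        gj = gj["geometry"]
    return gj

# Hypothetical paths:
# watershed = load_geometry("watershed.geojson")
# transposition = load_geometry("transposition-domain.geojson")
```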
.. image:: ./images/2011-event.png

3. **Iterate over the period of record or desired date range**: To process multiple dates for a range (from start_date to end_date), there is an optional argument `check_every_n_hours`. If set to 1, the process will compute the storm-duration accumulation starting at every hour from the start_date
   to the end_date. For a 72-hour event, this would require processing 350,400 datasets (every hour for the period) for 40 years of record and would represent the most precise estimate to aid in identifying the start hour for the event. To save processing
   time and data, an alternate interval can be used. For example, selecting `check_every_n_hours` = 24 would result in 14,600 datasets processed for the same 40 year period.

   check_every_n_hours = 6 (This would check the totals every 6 hours, or 4 times a day)
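The trade-off between start-time precision and processing volume can be sketched with a quick calculation (the 8,760 hours-per-year figure matches the totals quoted above; the helper function is illustrative, not part of stormhub):

```python
# Number of storm-duration accumulations to compute for a period of record,
# given how often (in hours) a candidate start time is checked.
HOURS_PER_YEAR = 365 * 24  # 8,760 (ignoring leap days, as in the quoted totals)

def datasets_to_process(years_of_record: int, check_every_n_hours: int) -> int:
    """Count the candidate start times evaluated over the period of record."""
    return years_of_record * HOURS_PER_YEAR // check_every_n_hours

print(datasets_to_process(40, 1))   # 350400 -- hourly checks
print(datasets_to_process(40, 24))  # 14600  -- one check per day
print(datasets_to_process(40, 6))   # 58400  -- four checks per day
```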

4. **Top events and date declustering**: With the statistics in place, user settings can be used to create a STAC collection for the watershed / transposition region / storm duration using the following inputs.

   min_precip_threshold = 2 (Defaults to 1; this can be used to filter out events based on a minimum threshold)

   top_n_events = 440 (This will be the total # of events in the collection; 440 would represent the top 10 events for 44 years)

   To avoid double counting what is essentially the same storm (the N-hour accumulation for several consecutive start times may each rank as a top storm), results of the query are iterated and added to a list;
   a storm is skipped if it temporally overlaps a storm already in the list (the overlap is determined using the start time and duration of the top storm). As shown
   in these images, such records are considered a single storm and would be declustered, wherein the day with the greater mean precipitation would be included in the top storms collection and the other
   would be dropped.
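The declustering pass described above can be sketched as follows (a simplified illustration, not the library's implementation; the record layout and field names are hypothetical):

```python
from datetime import datetime, timedelta

def decluster(ranked_storms, duration_hours):
    """Keep the highest-ranked storm from each cluster of temporally
    overlapping candidates. `ranked_storms` is assumed sorted by mean
    precipitation, greatest first; each entry has a `start` datetime."""
    duration = timedelta(hours=duration_hours)
    kept = []
    for storm in ranked_storms:
        # Two windows overlap when each starts before the other ends.
        overlaps = any(
            storm["start"] < other["start"] + duration
            and other["start"] < storm["start"] + duration
            for other in kept
        )
        if not overlaps:
            kept.append(storm)
    return kept

candidates = [  # the first two start a day apart: the "same" 72-hour storm
    {"start": datetime(2011, 4, 25), "mean_precip": 5.1},
    {"start": datetime(2011, 4, 26), "mean_precip": 4.8},
    {"start": datetime(2011, 9, 4), "mean_precip": 3.9},
]
top = decluster(candidates, duration_hours=72)
# The 4/26 start overlaps the higher-ranked 4/25 storm and is dropped.
```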

5. The following additional arguments are available.

   .. code:: bash

      specific_dates  # Can be provided to resume processing in the event of a failure or other use cases

Results
-------

A Storm Catalog is created containing a copy of the watershed, the transposition domain, and the *valid transposition domain*, which is the space within the transposition domain wherein a
watershed can be transposed without encountering null space (i.e., part of the watershed extending outside of the transposition domain).

.. image:: ./images/catalog.png

STAC Collections will be added to the catalog for each storm duration requested. The collection will include relevant data including summary statistics, plots, and other assets to provide
context and metadata for the data.

.. image:: ./images/storm-collection.png

The collection is composed of STAC Items, which provide links to source data and derivative products. For example, a model-specific timeseries file may be required for hydrologic modeling.
These files can be created and added to the event item alongside metadata and other information. Assets may include additional data required for modeling (e.g., temperature data, also available via AORC).

.. image:: ./images/storm-item.png
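As a rough illustration of the shape of such an item: a STAC Item is a GeoJSON Feature with an ``assets`` map. The IDs, geometry, hrefs, and asset keys below are hypothetical, not stormhub's actual output schema:

```python
import json

# Hypothetical storm event item; field values are illustrative only.
storm_item = {
    "type": "Feature",
    "stac_version": "1.0.0",
    "id": "2011-04-25-72hr",
    "geometry": {
        "type": "Polygon",
        "coordinates": [[[-94.0, 38.0], [-93.0, 38.0], [-93.0, 39.0], [-94.0, 38.0]]],
    },
    "bbox": [-94.0, 38.0, -93.0, 39.0],
    "properties": {"datetime": "2011-04-25T00:00:00Z"},
    "links": [],
    "assets": {
        "precip-timeseries": {
            "href": "./2011-04-25-72hr.dss",  # hypothetical model-specific file
            "title": "Model-specific precipitation timeseries",
            "roles": ["data"],
        },
        "thumbnail": {"href": "./2011-04-25-72hr.png", "roles": ["thumbnail"]},
    },
}

print(storm_item["id"])  # 2011-04-25-72hr
```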

This feature was evaluated and used in pilot projects; it does not currently exist in this repository, but may be incorporated in the future.

Where possible, `NOAA Atlas-14 precipitation frequency estimates <https://hdsc.nws.noaa.gov/hdsc/pfds/pfds_gis.html>`_ may be considered to normalize the average accumulation for each storm.
File: docs/source/user_guide.rst
Getting Started
################

This section provides a high-level overview of using stormhub in production, including starting the stormhub server and creating objects.

Installation
------------
Note that it is highly recommended to create a Python `virtual environment
<https://docs.python.org/3/library/venv.html>`_ to install, test, and run
stormhub. It is also recommended to avoid using Windows Subsystem for Linux (WSL),
as issues can arise with the parallel processing within stormhub.
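For example, on Linux or macOS the recommended virtual environment can be created with Python's built-in venv module (the directory name ``.venv`` is a common convention, not a requirement):

```shell
# Create and activate an isolated environment before installing stormhub
python3 -m venv .venv
source .venv/bin/activate
```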

Starting the server
-------------------

For convenience, a local file server is provided. This server is not necessary for data
production, but is useful for visualizing and exploring the data.

**Start the stormhub file server:**

.. code-block:: bash

   stormhub-server <path-to-local-dir>

The local file server is useful for interacting with a STAC browser to view the data locally. This is not required.

Workflows
---------

A config file shown below includes the information required to create a new catalog.

.. code-block:: json

   {
      ...
   }

The following snippet provides an example of how to build and create a storm catalog. It requires an example watershed and transposition domain (examples available in the `repo <https://github.com/Dewberry/stormhub/tree/main/catalogs/example-input-data>`_).