Commit b4e4abc (parent cfc9560): Add core app functionality

23 files changed: +2811, -569 lines

.gcloudignore (new file, +4)

```
.git
.next
node_modules
data
```

.github/workflows/load-weather.yaml (new file, +15)

```yaml
on:
  schedule:
    - cron: '45 * * * *' # Hourly at 45 minutes past the hour

  workflow_dispatch:

jobs:
  load-weather:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Install dependencies
        run: pip3 install -r data/requirements.txt
      - name: Load weather data
        run: python3 data/load_weather.py
```

.gitignore (+7)

```diff
@@ -34,3 +34,10 @@ yarn-error.log*
 # typescript
 *.tsbuildinfo
 next-env.d.ts
+
+# CIFP files
+CIFP_*
+CIFP_*.zip
+
+# Data CSVs
+data/*.csv
```

README.md (+2, -37)

````diff
@@ -1,38 +1,3 @@
-This is a [Next.js](https://nextjs.org/) project bootstrapped with [`create-next-app`](https://github.com/vercel/next.js/tree/canary/packages/create-next-app).
+# In the Soup ☁️
 
-## Getting Started
-
-First, run the development server:
-
-```bash
-npm run dev
-# or
-yarn dev
-# or
-pnpm dev
-```
-
-Open [http://localhost:3000](http://localhost:3000) with your browser to see the result.
-
-You can start editing the page by modifying `pages/index.tsx`. The page auto-updates as you edit the file.
-
-[API routes](https://nextjs.org/docs/api-routes/introduction) can be accessed on [http://localhost:3000/api/hello](http://localhost:3000/api/hello). This endpoint can be edited in `pages/api/hello.ts`.
-
-The `pages/api` directory is mapped to `/api/*`. Files in this directory are treated as [API routes](https://nextjs.org/docs/api-routes/introduction) instead of React pages.
-
-This project uses [`next/font`](https://nextjs.org/docs/basic-features/font-optimization) to automatically optimize and load Inter, a custom Google Font.
-
-## Learn More
-
-To learn more about Next.js, take a look at the following resources:
-
-- [Next.js Documentation](https://nextjs.org/docs) - learn about Next.js features and API.
-- [Learn Next.js](https://nextjs.org/learn) - an interactive Next.js tutorial.
-
-You can check out [the Next.js GitHub repository](https://github.com/vercel/next.js/) - your feedback and contributions are welcome!
-
-## Deploy on Vercel
-
-The easiest way to deploy your Next.js app is to use the [Vercel Platform](https://vercel.com/new?utm_medium=default-template&filter=next.js&utm_source=create-next-app&utm_campaign=create-next-app-readme) from the creators of Next.js.
-
-Check out our [Next.js deployment documentation](https://nextjs.org/docs/deployment) for more details.
+Find nearby instrument approaches in IMC.
````

app.yaml (new file, +7)

```yaml
env: standard
runtime: nodejs18
service: default
handlers:
  - url: /.*
    secure: always
    script: auto
```

data/load_cifp.py (new file, +115)

```python
"""
Parse the FAA's CIFP file to extract airport and approach/FAF data.
"""

import json
import sys

import arinc424.record as a424
import google.auth
import pandas as pd
import pandas_gbq
from tqdm import tqdm

credentials, project = google.auth.default()


def dms_to_dd(dms):
    """Convert a DMS string to a decimal degree float."""
    if dms[0] == 'N' or dms[0] == 'E':
        sign = 1
    else:
        sign = -1

    dms = dms[1:].rjust(9, '0')

    d = int(dms[0:3])
    m = int(dms[3:5])
    s = int(dms[5:9]) / 100

    return sign * (d + m / 60 + s / 3600)


def parse_cifp(file_path):
    """Parse the CIFP file and return pandas DataFrames containing the airport and FAF data.

    @param file_path: The path to the CIFP file
    @return: A tuple containing a pandas DataFrame for the airport data and a pandas DataFrame for the FAF data
    """
    cifp = open(file_path, 'r').readlines()
    records = []

    print('Reading CIFP records...')

    for line in tqdm(cifp):
        record = a424.Record()
        record.read(line)
        records.append(record)

    print('Extracting relevant CIFP data...')

    apts = []
    fafs = []

    for record in tqdm(records):
        is_apt = False
        is_faf = False

        for f in record.fields:
            if f.name == 'Section Code' and f.value == 'PA':
                is_apt = True
                break

            if f.name == 'Waypoint Description Code' and f.value == 'E F':
                is_faf = True
                break

        if is_apt:
            apts.append(json.loads(record.json()))

        if is_faf:
            fafs.append(json.loads(record.json()))

    df_apt = pd.DataFrame(apts)
    df_apt = df_apt.apply(lambda x: x.str.strip())
    df_apt['Latitude'] = df_apt['Airport Reference Pt. Latitude'].apply(lambda x: dms_to_dd(x))
    df_apt['Longitude'] = df_apt['Airport Reference Pt. Longitude'].apply(lambda x: dms_to_dd(x))

    df_faf = pd.DataFrame(fafs)
    df_faf = df_faf.apply(lambda x: x.str.strip())

    return df_apt, df_faf


if __name__ == '__main__':
    file_path = sys.argv[1]

    df_apt, df_faf = parse_cifp(file_path)

    # Remove special characters from the column names
    df_apt.columns = df_apt.columns.str.replace(r'[^a-zA-Z0-9_ ]', '', regex=True)
    df_faf.columns = df_faf.columns.str.replace(r'[^a-zA-Z0-9_ ]', '', regex=True)

    # Condense multiple spaces in the column names
    df_apt.columns = df_apt.columns.str.replace(r' +', '_', regex=True)
    df_faf.columns = df_faf.columns.str.replace(r' +', '_', regex=True)

    # Set column types
    df_apt = df_apt.mask(df_apt == '')
    columns = ['Longest_Runway', 'Airport_Elevation', 'Transition_Altitude', 'Transition_Level']
    df_apt[columns] = df_apt[columns].apply(pd.to_numeric, errors='coerce')

    df_faf = df_faf.mask(df_faf == '')
    columns = ['RNP', 'Arc_Radius', 'Theta', 'Rho', 'Magnetic_Course', 'Route_Holding_Distance_or_Time', 'Altitude', 'Altitude_2', 'Speed_Limit', 'Transition_Altitude', 'Vertical_Angle']
    df_faf[columns] = df_faf[columns].apply(pd.to_numeric, errors='coerce')

    df_apt.to_csv('apt.csv', index=False)
    df_faf.to_csv('faf.csv', index=False)

    print('Uploading to BigQuery...')

    # Upload to BigQuery
    pandas_gbq.to_gbq(df_apt, 'aeronautical.airport', project, if_exists='replace', credentials=credentials)
    pandas_gbq.to_gbq(df_faf, 'aeronautical.faf', project, if_exists='replace', credentials=credentials)
```
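The `dms_to_dd` helper converts packed ARINC 424 coordinates (a hemisphere letter followed by degrees, minutes, and hundredths of seconds) into signed decimal degrees. A standalone sketch, lightly restructured for readability; the sample coordinates are illustrative, not taken from the CIFP:

```python
def dms_to_dd(dms):
    """Convert a packed DMS string (e.g. 'N39513881') to decimal degrees."""
    # Hemisphere letter determines the sign: N/E positive, S/W negative
    sign = 1 if dms[0] in ('N', 'E') else -1

    # Pad latitudes (8 digits) to the 9-digit longitude width
    dms = dms[1:].rjust(9, '0')

    degrees = int(dms[0:3])
    minutes = int(dms[3:5])
    seconds = int(dms[5:9]) / 100  # stored as hundredths of a second

    return sign * (degrees + minutes / 60 + seconds / 3600)

# Hypothetical reference points: 39°51'38.81" N and 122°15'00.00" W
print(round(dms_to_dd('N39513881'), 4))   # 39.8608
print(round(dms_to_dd('W122150000'), 4))  # -122.25
```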

data/load_weather.py (new file, +146)

```python
"""
Loads weather data from NOAA and uploads it to BigQuery.
"""

from datetime import datetime, timedelta
from ftplib import FTP
from io import StringIO

import google.auth
import pandas as pd
import pandas_gbq
from tqdm import tqdm

credentials, project = google.auth.default()


def get_weather_data():
    """Get weather data from NOAA.

    @return: Tuple of (nbh, nbs) where nbh is the weather data for the next 24 hours
        and nbs is the weather data for the next 72 hours (in 3hr increments).
    """
    print('Accessing NOAA FTP server...')

    ftp = FTP('ftp.ncep.noaa.gov')
    ftp.login()

    ftp.cwd('pub/data/nccf/com/blend/prod')

    # Get forecast days
    forecast_days = ftp.nlst()
    forecast_day = forecast_days[-1]
    ftp.cwd(forecast_day)

    # Get forecast hours
    forecast_hours = ftp.nlst()
    forecast_hour = forecast_hours[-1]

    print(f'Downloading NBH forecast {forecast_day} {forecast_hour}Z...')

    nbh_str = StringIO()
    ftp.retrlines('RETR {time}/text/blend_nbhtx.t{time}z'.format(time=forecast_hour), lambda line: nbh_str.write(line + '\n'))
    nbh = nbh_str.getvalue()
    nbh_str.close()

    print(f'Downloading NBS forecast {forecast_day} {forecast_hour}Z...')

    nbs_str = StringIO()
    ftp.retrlines('RETR {time}/text/blend_nbstx.t{time}z'.format(time=forecast_hour), lambda line: nbs_str.write(line + '\n'))
    nbs = nbs_str.getvalue()
    nbs_str.close()

    return nbh, nbs


def parse_weather_data(data, fmt):
    """Parse weather data for a specific location.

    @param data: The weather data to parse
    @param fmt: The format of the data. Either 'nbh' or 'nbs'
    @return: A pandas DataFrame containing the weather data
    """
    location = data.strip().split('\n')[0].split(' ')[0]

    forecast_date = None
    if fmt == 'nbh':
        forecast_date_str = data.strip().split('\n')[0].strip().split(' ')
        forecast_date_str = list(filter(len, forecast_date_str))[-3:]
        forecast_date_str = ' '.join(forecast_date_str)
        forecast_date = datetime.strptime(forecast_date_str, '%m/%d/%Y %H%M %Z')
        first_date = forecast_date + timedelta(hours=1)
    elif fmt == 'nbs':
        forecast_date_str = data.strip().split('\n')[0].strip().split(' ')
        forecast_date_str = list(filter(len, forecast_date_str))[-3:]
        forecast_date_str = ' '.join(forecast_date_str)
        utc_hour = int(forecast_date_str[-8:-6])
        forecast_date = datetime.strptime(forecast_date_str, '%m/%d/%Y %H%M %Z')
        first_date = forecast_date + timedelta(hours=6 - (utc_hour % 3))

    skip_lines = 1 if fmt == 'nbh' else 2

    lines = data.strip().split('\n')[skip_lines:]
    parsed_data = {}

    max_len = max([len(line) for line in lines])
    for line in lines:
        var_name = line[:5].strip()
        value_str = line[5:]

        # Split the values into a list of integers every 3 characters
        values = [int(value_str[i:i+3].strip()) if value_str[i:i+3].strip().lstrip('-').isdigit() else None for i in range(0, max_len - 5, 3)]

        parsed_data[var_name] = values

    df = pd.DataFrame(parsed_data)
    df['Location'] = location
    df['Forecast_Time'] = forecast_date

    date = first_date
    dates = []

    prev_hr = None
    for hr in df['UTC']:
        date = date.replace(hour=hr)

        if prev_hr is not None and hr < prev_hr:
            date += timedelta(days=1)

        dates.append(date)
        prev_hr = hr

    df['Time'] = dates

    return df


if __name__ == '__main__':
    nbh, nbs = get_weather_data()

    # Split the data into separate locations
    nbh = nbh.strip().split(' ' * 50)[1:]
    nbs = nbs.strip().split(' ' * 50)[1:]

    print('Parsing NBH forecast data...')

    df_nbh = pd.DataFrame()
    for forecast in tqdm(nbh):
        location_forecast = parse_weather_data(forecast, 'nbh')
        df_nbh = pd.concat([df_nbh, location_forecast])

    df_nbh.to_csv('wx_nbh.csv', index=False)

    print('Parsing NBS forecast data...')

    df_nbs = pd.DataFrame()
    for forecast in tqdm(nbs):
        location_forecast = parse_weather_data(forecast, 'nbs')
        df_nbs = pd.concat([df_nbs, location_forecast])

    df_nbs.to_csv('wx_nbs.csv', index=False)

    print('Uploading to BigQuery...')

    # Upload to BigQuery
    pandas_gbq.to_gbq(df_nbh, 'weather.nbh', project, if_exists='replace', credentials=credentials)
    pandas_gbq.to_gbq(df_nbs, 'weather.nbs', project, if_exists='replace', credentials=credentials)
```
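The NBM text products parsed above are fixed-width tables: roughly a 5-character variable name followed by right-aligned 3-character value columns. The per-row split inside `parse_weather_data` can be sketched in isolation; the sample rows below are illustrative, not real NBM output:

```python
def parse_row(line, name_width=5, col_width=3):
    """Split one fixed-width row into a variable name and integer values.

    Tokens that are not (possibly negative) integers become None.
    """
    name = line[:name_width].strip()
    raw = line[name_width:]
    values = []
    for i in range(0, len(raw), col_width):
        tok = raw[i:i + col_width].strip()
        values.append(int(tok) if tok.lstrip('-').isdigit() else None)
    return name, values

# Hypothetical temperature row for three forecast hours
print(parse_row('TMP   63 62 -4'))  # ('TMP', [63, 62, -4])
```

Blank or non-numeric columns come back as `None`, which is what lets the later `pd.to_numeric(..., errors='coerce')` step treat missing values uniformly.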

data/requirements.txt (new file, +5)

```
arinc424 @ git+https://github.com/andrewda/arinc424@996025203aebf0a3794d2e337fe536ff4f4845b0
google-auth==2.16.3
pandas==1.5.3
pandas-gbq==0.19.1
tqdm==4.64.1
```

next.config.js (+1)

```diff
@@ -1,6 +1,7 @@
 /** @type {import('next').NextConfig} */
 const nextConfig = {
   reactStrictMode: true,
+  distDir: 'build',
 }
 
 module.exports = nextConfig
```
