
Commit 18adc68
Update seed data and refactor its processing
Use Azure resources from the new development subscription. Make data restoration more robust so that reinitialising the seed data works even when the databases already contain data.
1 parent 0da8e17

File tree

2 files changed: +44 -34 lines changed

README.md

Lines changed: 2 additions & 2 deletions

```diff
@@ -194,7 +194,7 @@ Internally the script calls `start-dependencies.sh` and forwards any arguments (

 ## Loading single dump into development database

-The jore3-importer microservice imports data from JORE3 and transforms it into the JORE4 data model. Currently, existing database dumps of the transformed data can be found from Azure Blob container at `hsl-jore4-common / jore4storage / jore4-dump`.
+The jore3-importer microservice imports data from JORE3 and transforms it into the JORE4 data model. Currently, existing database dumps of the transformed data can be found from Azure Blob container at `rg-jore4-dev-001 / stjore4dev001 / jore4-dump`.

 To download a single dump file to your local workspace and import it into your local development database instance, run `./scripts/development.sh dump:import <azure_blob_filepath> <database_name>` and follow the instructions. _Warning!_ This will empty the target database and overwrite all the data in it!

@@ -204,7 +204,7 @@ If you just want to download a single dump file to your local workspace (but not

 To update database dump files (with the `.pgdump` extension), do the following:

-- Check the latest suitable dump files from the `jore4-dump` container under the `jore4storage` storage account in the `hsl-jore4-common` resource group. Make sure that the `docker-compose.custom.yml` file does not specify `jore4-hasura` and `jore4-tiamat` microservices with versions older than the versions the dump files were created with. If needed, restart dependencies and generate new GraphQL schema for new Hasura version and make necessary changes to achieve ui - hasura compatibility.
+- Check the latest suitable dump files from the `jore4-dump` container under the `stjore4dev001` storage account in the `rg-jore4-dev-001` resource group. As of 2025-05, dump files are organised into directories in Azure Blob storage. They should be accompanied by a README file that states which microservice versions the dumps were created with. Make sure that the `docker-compose.custom.yml` file does not specify `jore4-hasura` and `jore4-tiamat` microservices with versions older than the versions the dump files were created with. If needed, restart dependencies and generate new GraphQL schema for new Hasura version and make necessary changes to achieve ui - hasura compatibility.
 - Update the dump filenames in the `./scripts/development.sh` file. Remove the existing `.pgdump` files from your project directory. Then stop the dependencies, and run `./scripts/setup-dependencies-and-seed.sh`.
 - If everything goes right, after running the script and following the instructions you should now have your databases seeded with the new dumps.
```
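The new seed layout stores dumps under directories inside the blob container, while the script keeps only the final filename locally. A minimal sketch of that split, using `basename` the same way the updated script does (the path is one of the new seed dumps; the sketch itself is only illustrative):

```shell
#!/usr/bin/env bash
# Derive the local filename from an Azure Blob filepath that may contain
# a directory component (path taken from the new seed data).
az_blob_filepath="2025-04-03_test/2025-04-03-jore4-local-jore4e2e.pgdump"
az_blob_filename=$(basename "$az_blob_filepath")
echo "$az_blob_filename"   # prints 2025-04-03-jore4-local-jore4e2e.pgdump
```

So a call such as `./scripts/development.sh dump:import 2025-04-03_test/2025-04-03-jore4-local-jore4e2e.pgdump <database_name>` downloads the blob into the working directory as `2025-04-03-jore4-local-jore4e2e.pgdump` before restoring it.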

scripts/development.sh

Lines changed: 42 additions & 32 deletions

```diff
@@ -16,9 +16,9 @@ DOCKER_COMPOSE_BUNDLE_REF=${BUNDLE_REF:-main}
 # this project from others.
 export COMPOSE_PROJECT_NAME=jore4-ui

-DUMP_ROUTES_FILENAME="routes-12-2024.pgdump"
-DUMP_TIMETABLES_FILENAME="timetables-12-2024.pgdump"
-DUMP_STOPS_FILENAME="stopdb-12-2024.pgdump"
+DUMP_ROUTES_FILENAME="2025-04-03_test/2025-04-03-jore4-local-jore4e2e.pgdump"
+DUMP_TIMETABLES_FILENAME="2025-04-03_test/2025-04-03-jore4-local-timetablesdb-nodata.pgdump"
+DUMP_STOPS_FILENAME="2025-04-03_test/2025-04-03-jore4-local-stopdb.pgdump"

 DOCKER_TESTDB_IMAGE="jore4-testdb"
 DOCKER_IMAGES="jore4-auth jore4-hasura jore4-mbtiles jore4-mapmatchingdb jore4-mapmatching jore4-hastus jore4-tiamat jore4-timetablesapi"
@@ -208,58 +208,68 @@ start_dependencies() {
 }

 download_dump() {
-    echo "Downloading database dump for JORE4 network & routes from Azure Blob Storage..."
-
-    # Here is a breakdown of the dump name used below:
-    # - "jore4e2e" ~ The name of the database to which the data dump applies
-    # - "test-20240104" ~ The data originates from the Jore3 test database (not production) and specifically the snapshot taken on 4.1.2024.
-    # - "data-only" ~ The dump contains only data. It does not contain DDL, i.e. table and other schema element definitions.
-    # - "8a28ef5f" ~ The dump is based on the database migrations of the jore4-hasura image version starting with this hash.
-    # - "20240104" (2nd) ~ The day when the jore3-importer was run
-    if [ -z ${1+x} ]; then
-        read -p "Dump file name (default: jore4e2e-test-20240104-data-only-8a28ef5f-20240104.pgdump): " DUMP_FILENAME
-        DUMP_FILENAME="${DUMP_FILENAME:-jore4e2e-test-20240104-data-only-8a28ef5f-20240104.pgdump}"
+    local az_blob_filepath
+
+    if [ -z "$1" ]; then
+        read -rp "Dump file name: " az_blob_filepath
+        if [ -z "$az_blob_filepath" ]; then
+            echo "Error: empty Azure Blob container filepath given. Exiting..."
+            exit 1
+        fi
     else
-        DUMP_FILENAME=$1
+        az_blob_filepath="$1"
     fi

     login

-    # Check dump file
-    if [ ! -f "$1" ]; then
-        echo "Downloading dump file as $DUMP_FILENAME"
+    # Download the dump file, if it does not already exist.
+    if [ ! -f "$az_blob_filepath" ]; then
+        echo "Downloading dump file: $az_blob_filepath"
+
         az storage blob download \
-            --account-name "jore4storage" \
+            --subscription "HSLAZ-CORP-DEV-JORE4" \
+            --account-name "stjore4dev001" \
             --container-name "jore4-dump" \
-            --name "$DUMP_FILENAME" \
-            --file "$DUMP_FILENAME" \
+            --name "$az_blob_filepath" \
+            --file "$(basename "$az_blob_filepath")" \
             --auth-mode login
     fi
 }

 import_dump() {
-    if [[ -z ${1+x} || -z ${2+x} ]]; then
-        echo "File and target database need to be defined!"
+    local az_blob_filepath="$1"
+    local target_database="$2"
+
+    if [[ -z ${az_blob_filepath} || -z ${target_database} ]]; then
+        echo "Azure Blob container filepath and target database need to be defined!"
         echo "usage:"
-        echo "  development.sh dump:import file database"
+        echo "  development.sh dump:import <azure_blob_filepath> <database_name>"
         exit
     fi

-    echo "Importing JORE4 dump to $2 database"
+    # Extract the filename from the full path, which may include directory names.
+    local az_blob_filename
+    az_blob_filename=$(basename "$az_blob_filepath")

     # Download dump if it is missing
-    if [ ! -f "$1" ]; then
-        download_dump "$1"
+    if [ ! -f "$az_blob_filename" ]; then
+        download_dump "$az_blob_filepath"
     fi

-    docker exec -i testdb pg_restore -U dbadmin --dbname="$2" --format=c < "$1"
+    echo "Importing database dump from the file '$az_blob_filename' to the '${target_database}' database..."
+
+    docker exec -i testdb bash -c "
+        set -eux
+        dropdb --username=dbadmin --force $target_database
+        pg_restore --username=dbadmin --dbname=postgres --format=custom --create
+    " < "$az_blob_filename"
 }

 download_digitransit_key() {
     login

     echo "Downloading secret value to ui/.env.local"
-    { echo -n "NEXT_PUBLIC_DIGITRANSIT_API_KEY=" && az keyvault secret show --name "hsl-jore4-digitransit-api-key" --vault-name "hsl-jore4-dev-vault" --query "value"; } > ui/.env.local
+    { echo -n "NEXT_PUBLIC_DIGITRANSIT_API_KEY=" && az keyvault secret show --name "hsl-jore4-digitransit-api-key" --vault-name "kv-jore4-dev-001" --query "value"; } > ui/.env.local
 }

 setup_environment() {
@@ -362,14 +372,14 @@ print_usage() {
   dump:download [<azure_blob_filepath>]
     Downloads a JORE4 database dump from Azure Blob Storage. A full file path
     may be given as a parameter. The file path is used to refer to a file inside
-    the 'jore4-dump' container under the 'jore4storage' storage account in the
-    'hsl-jore4-common' resource group.
+    the 'jore4-dump' container under the 'stjore4dev001' storage account in the
+    'rg-jore4-dev-001' resource group.

   dump:import <azure_blob_filepath> <database_name>
     Imports a database dump from the given file to the specified database.
     The dump file must be given as a Azure Blob storage reference where a full
     file path needs to be given within the 'jore4-dump' container under the
-    'jore4storage' storage account in the 'hsl-jore4-common' resource group.
+    'stjore4dev001' storage account in the 'rg-jore4-dev-001' resource group.

   digitransit:fetch
     Download Digitransit map API key for JORE4 account.
```