
Commit 8b56882

Prepare repo to be public
1 parent a9e1036 commit 8b56882

File tree

7 files changed: +32 -23 lines changed


README.md

+14
@@ -0,0 +1,14 @@
+
+
+This repository contains a collection of tools for Iris maintainers
+to make it easier to manage Iris. Currently it includes tools for:
+
+- Publishing Iris data
+- Scanning container logs
+- Updating `iris-agent` containers without having to recreate them
+
+Note that some of these tools, which interact directly with the
+Iris containers running on the Iris server, must be executed on the
+Iris server itself.
+
+Also, the tool for updating `iris-agent` containers requires `gcloud`.

README.txt

-8
This file was deleted.

cache/README.txt

+5-4
@@ -1,5 +1,6 @@
-This directory contains files that may take a long time to generate,
-but all contents can and should be regenerated to meet your desired
-criteria.
+This directory serves as a cache directory for files that can take
+a long time to generate.
 
-You can use the contents here for quick reference.
+You can use the cached files here for quick reference but all of
+these files can and should be regenerated to meet your desired
+criteria.

conf/tables.conf

+4-4
@@ -84,10 +84,10 @@ readonly CLEANED_RESULTS_TABLE_EXPORT=$(cat <<EOF
 EOF
 )
 # Uploading.
-readonly GCP_PROJECT_ID="mlab-edgenet"
-readonly BQ_PUBLIC_DATASET="iris_test" # public dataset with tables in scamper1 format
-readonly BQ_PRIVATE_DATASET="iris_test_2" # private dataset to store temporary tables during conversion
-readonly BQ_TABLE="elena1" # table in scamper1 format
+readonly GCP_PROJECT_ID=""
+readonly BQ_PUBLIC_DATASET="" # public dataset with tables in scamper1 format
+readonly BQ_PRIVATE_DATASET="" # private dataset to store temporary tables during conversion
+readonly BQ_TABLE="" # table in scamper1 format
 readonly SCHEMA_RESULTS_JSON="${toplevel}/db/schema_results.json"
 readonly SCHEMA_SCAMPER1_JSON="${toplevel}/db/scamper1.json"
 readonly TABLE_CONVERSION_QUERY="${toplevel}/db/iris_to_mlab.sql"
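This commit blanks the deployment-specific settings so the repository can go public; a maintainer fills them back in locally before running the publishing tools. A hypothetical example with placeholder names (these are not the project's real values):

```shell
# Placeholder values only -- substitute your own GCP project, BigQuery
# datasets, and table name. None of these names refer to real resources.
readonly GCP_PROJECT_ID="my-gcp-project"
readonly BQ_PUBLIC_DATASET="iris_public"    # public dataset with tables in scamper1 format
readonly BQ_PRIVATE_DATASET="iris_staging"  # private dataset for temporary tables during conversion
readonly BQ_TABLE="scamper1_export"         # table in scamper1 format
```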

exported_tables/README.txt

+6-4
@@ -1,4 +1,6 @@
-This directory contains exported Iris tables from ClickHouse and
-serves as the staging area for uploading them to BigQuery. None
-of the exported tables in this directory should be added to the
-repo.
+When publishing Iris data, this directory will contain exported
+Iris tables from ClickHouse and will serve as the staging area for
+uploading them to BigQuery.
+
+None of the exported tables in this directory should be added to
+the repo.

tools/scan_logs.sh

+2-2
@@ -2,10 +2,10 @@
 
 #
 # This script must be executed on the Iris server, as it relies on
-# the `logcli`.
+# the `logcli` tool.
 #
 # It scans container logs of $CONTAINER_NAME for a specific $PATTERN
-# within from $START_DATE to $END_DATE.
+# from $START_DATE to $END_DATE.
 #
 # The time period is divided into 30-day intervals to comply with
 # Loki's configuration limits (i.e., longer periods are not supported).
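The 30-day chunking this header describes can be sketched as below. This is an illustrative sketch only, not the script's actual code: `split_into_intervals` is a hypothetical helper name, and GNU `date` (`date -d`) is assumed.

```shell
#!/usr/bin/env bash
# Sketch: split [start, end] into sub-ranges of at most 30 days so each
# log query stays within Loki's maximum query period.
# Assumes GNU date (the -d flag for relative dates).
split_into_intervals() {
  local start="$1" end="$2"
  local cur="${start}" next
  while [[ "$(date -d "${cur}" +%s)" -lt "$(date -d "${end}" +%s)" ]]; do
    next="$(date -d "${cur} +30 days" +%F)"
    # Clamp the final interval to the requested end date.
    if [[ "$(date -d "${next}" +%s)" -gt "$(date -d "${end}" +%s)" ]]; then
      next="${end}"
    fi
    echo "${cur} ${next}"
    cur="${next}"
  done
}

split_into_intervals 2024-01-01 2024-03-15
```

Each printed pair would then bound one query invocation, keeping every request under the 30-day limit.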

tools/update_agents.sh

+1-1
@@ -37,7 +37,7 @@ ZONES=(
 main() {
   for ((i=0; i<${#HOSTS[@]}; i++)); do
     echo "${HOSTS[${i}]}"
-    gcloud compute ssh --project mlab-edgenet --zone "${ZONES[${i}]}" "${HOSTS[${i}]}" --command="
+    gcloud compute ssh --project "${GCP_PROJECT_ID}" --zone "${ZONES[${i}]}" "${HOSTS[${i}]}" --command="
       readonly IMAGE_NAME=\"ghcr.io/dioptra-io/iris/iris-agent:production\"
       readonly CONTAINER_NAME=\"iris-agent\"
 