Test cmdline docker #408

Merged (1 commit, Feb 21, 2024)
18 changes: 4 additions & 14 deletions .github/workflows/system.yml
@@ -29,25 +29,15 @@ jobs:
git clone https://github.com/bats-core/bats-core.git
cd bats-core && ./install.sh $HOME

- name: Build
- name: Build and run
run: |
make testbin
export AWS_ACCESS_KEY_ID=user
export AWS_SECRET_ACCESS_KEY=pass
export AWS_REGION=us-east-1
aws configure set aws_access_key_id $AWS_ACCESS_KEY_ID --profile versity
aws configure set aws_secret_access_key $AWS_SECRET_ACCESS_KEY --profile versity
aws configure set aws_region $AWS_REGION --profile versity
mkdir /tmp/gw

- name: Run tests
run: |
export AWS_ACCESS_KEY_ID=user
export AWS_SECRET_ACCESS_KEY=pass
export WORKSPACE=$GITHUB_WORKSPACE
./tests/run.sh

- name: Run tests with static buckets
run: |
export AWS_ACCESS_KEY_ID=user
export AWS_SECRET_ACCESS_KEY=pass
export WORKSPACE=$GITHUB_WORKSPACE
./tests/run_static.sh
./tests/run_all.sh
2 changes: 1 addition & 1 deletion .gitignore
@@ -41,7 +41,7 @@ VERSION
dist/

# secrets file for local github-actions testing
.secrets
tests/.secrets

# env files for testing
.env*
59 changes: 59 additions & 0 deletions Dockerfile_test_bats
@@ -0,0 +1,59 @@
FROM --platform=linux/arm64 ubuntu:latest

RUN apt-get update && \
apt-get install -y --no-install-recommends \
git \
make \
wget \
curl \
unzip \
jq \
ca-certificates && \
update-ca-certificates && \
rm -rf /var/lib/apt/lists/*

# Set working directory
WORKDIR /tmp

RUN curl "https://awscli.amazonaws.com/awscli-exe-linux-aarch64.zip" -o "awscliv2.zip" && unzip awscliv2.zip && ./aws/install

# Download Go 1.21 (adjust the version and platform as needed)
RUN wget https://golang.org/dl/go1.21.7.linux-arm64.tar.gz

# Extract the downloaded archive
RUN tar -xvf go1.21.7.linux-arm64.tar.gz -C /usr/local

# Set Go environment variables
ENV PATH="/usr/local/go/bin:${PATH}"
ENV GOPATH="/go"
ENV GOBIN="$GOPATH/bin"

# Make the directory for Go packages
RUN mkdir -p "$GOPATH/src" "$GOPATH/bin" && chmod -R 777 "$GOPATH"

# Create tester user
RUN groupadd -r tester && useradd -r -g tester tester
RUN mkdir /home/tester && chown tester:tester /home/tester
ENV HOME=/home/tester

RUN git clone https://github.com/bats-core/bats-core.git && \
cd bats-core && \
./install.sh /home/tester

USER tester
COPY . /home/tester

WORKDIR /home/tester
RUN make

RUN . tests/.secrets && \
export AWS_ACCESS_KEY_ID AWS_SECRET_ACCESS_KEY AWS_REGION AWS_PROFILE && \
aws configure set aws_access_key_id $AWS_ACCESS_KEY_ID --profile $AWS_PROFILE && \
aws configure set aws_secret_access_key $AWS_SECRET_ACCESS_KEY --profile $AWS_PROFILE && \
aws configure set aws_region $AWS_REGION --profile $AWS_PROFILE

RUN mkdir /tmp/gw

ENV WORKSPACE=.

CMD ["tests/run_all.sh"]
3 changes: 1 addition & 2 deletions tests/.env.default
@@ -1,9 +1,8 @@
AWS_REGION=us-east-1
AWS_PROFILE=versity
AWS_ENDPOINT_URL=http://127.0.0.1:7070
VERSITY_EXE=./versitygw
BACKEND=posix
LOCAL_FOLDER=/tmp/gw
AWS_ENDPOINT_URL=http://127.0.0.1:7070
BUCKET_ONE_NAME=versity-gwtest-bucket-one
BUCKET_TWO_NAME=versity-gwtest-bucket-two
RECREATE_BUCKETS=true
3 changes: 1 addition & 2 deletions tests/.env.static
@@ -1,9 +1,8 @@
AWS_REGION=us-east-1
AWS_PROFILE=versity
AWS_ENDPOINT_URL=http://127.0.0.1:7070
VERSITY_EXE=./versitygw
BACKEND=posix
LOCAL_FOLDER=/tmp/gw
AWS_ENDPOINT_URL=http://127.0.0.1:7070
BUCKET_ONE_NAME=versity-gwtest-bucket-one-static
BUCKET_TWO_NAME=versity-gwtest-bucket-two-static
RECREATE_BUCKETS=false
3 changes: 1 addition & 2 deletions tests/.env.versitygw
@@ -1,9 +1,8 @@
AWS_REGION=us-east-1
AWS_PROFILE=versity
AWS_ENDPOINT_URL=http://127.0.0.1:7070
VERSITY_EXE=./versitygw
BACKEND=posix
LOCAL_FOLDER=/tmp/gw
AWS_ENDPOINT_URL=http://127.0.0.1:7070
BUCKET_ONE_NAME=versity-gwtest-bucket-one
BUCKET_TWO_NAME=versity-gwtest-bucket-two
RECREATE_BUCKETS=true
28 changes: 20 additions & 8 deletions tests/README.md
@@ -1,13 +1,25 @@
# Command-Line Tests

Instructions:
## Instructions - Running Locally

1. Build the `versitygw` binary.
2. Create a local AWS profile for connection to S3, and add the `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` values above to the profile.
3. Create an environment file (`.env`) similar to the ones in this folder, setting the `AWS_PROFILE` parameter to the name of the profile you created.
4. In the root repo folder, run with `VERSITYGW_TEST_ENV=<env file> tests/s3_bucket_tests.sh`.
5. If running/testing the GitHub workflow locally, create a `.secrets` file, and set the `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` parameters here to the values of your AWS S3 IAM account.
2. Install the aws command-line interface if unavailable on your machine. Instructions are [here](https://docs.aws.amazon.com/cli/latest/userguide/getting-started-install.html).
3. Install BATS. Instructions are [here](https://bats-core.readthedocs.io/en/stable/installation.html).
4. Create a `.secrets` file in the `tests` folder, and add the `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` values to the file. A sketch of this file appears after this list.
5. Create a local AWS profile for connection to S3, and add the `AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY`, and `AWS_REGION` values for your account to the profile. Example:
```
AWS_ACCESS_KEY_ID=<key_id>
AWS_SECRET_ACCESS_KEY=<secret_key>
export AWS_PROFILE=versity-test
export AWS_ACCESS_KEY_ID=<your account ID>
export AWS_SECRET_ACCESS_KEY=<your account key>
export AWS_REGION=<your account region>
aws configure set aws_access_key_id $AWS_ACCESS_KEY_ID --profile $AWS_PROFILE
aws configure set aws_secret_access_key $AWS_SECRET_ACCESS_KEY --profile $AWS_PROFILE
aws configure set aws_region $AWS_REGION --profile $AWS_PROFILE
```
6. To run the workflow locally, install **act** and run with `act -W .github/workflows/system.yml`.
6. Create an environment file (`.env`) similar to the ones in this folder, setting the `AWS_PROFILE` parameter to the name of the profile you created.
7. In the root repo folder, run with `VERSITYGW_TEST_ENV=<env file> tests/run_all.sh`.
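
A minimal sketch of the `tests/.secrets` file from step 4, with placeholder values (this layout is an assumption based on how `tests/setup.sh` sources the file and how the Docker build additionally reads `AWS_REGION` and `AWS_PROFILE` from it):
```
AWS_ACCESS_KEY_ID=<your account ID>
AWS_SECRET_ACCESS_KEY=<your account key>
AWS_REGION=<your account region>
AWS_PROFILE=<your profile name>
```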

## Instructions - Running With Docker

1. Create a `.secrets` file in the `tests` folder, and add the `AWS_PROFILE`, `AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY`, and `AWS_REGION` fields.
2. Build and run the `Dockerfile_test_bats` file; one way to do this is sketched below.
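
A sketch of one way to do step 2, assuming Docker is available (the `versitygw-test` image tag is arbitrary; the build stage sources `tests/.secrets`, so that file must exist first, and the Dockerfile pins `linux/arm64`, so non-ARM hosts may need emulation or a platform change):
```
# build the test image from the repo root using the dedicated Dockerfile
docker build -f Dockerfile_test_bats -t versitygw-test .
# run the baked-in suite (the image CMD is tests/run_all.sh)
docker run --rm versitygw-test
```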
4 changes: 4 additions & 0 deletions tests/posix_tests.sh
@@ -13,6 +13,10 @@ source ./tests/util_posix.sh

local object_name="test-object"

if [[ -e "$LOCAL_FOLDER"/"$BUCKET_ONE_NAME" ]]; then
rm -rf "${LOCAL_FOLDER:?}"/"${BUCKET_ONE_NAME:?}"
fi

mkdir "$LOCAL_FOLDER"/"$BUCKET_ONE_NAME"
local object="$BUCKET_ONE_NAME"/"$object_name"
touch "$LOCAL_FOLDER"/"$object"
12 changes: 10 additions & 2 deletions tests/run.sh
@@ -1,4 +1,12 @@
#!/bin/bash

VERSITYGW_TEST_ENV=$WORKSPACE/tests/.env.default "$HOME"/bin/bats ./tests/s3_bucket_tests.sh
VERSITYGW_TEST_ENV=$WORKSPACE/tests/.env.default "$HOME"/bin/bats ./tests/posix_tests.sh
export VERSITYGW_TEST_ENV=$WORKSPACE/tests/.env.default
# shellcheck source=./.env.default
source "$VERSITYGW_TEST_ENV"
export AWS_PROFILE BUCKET_ONE_NAME BUCKET_TWO_NAME AWS_ENDPOINT_URL
if ! "$HOME"/bin/bats ./tests/s3_bucket_tests.sh; then
exit 1
fi
if ! "$HOME"/bin/bats ./tests/posix_tests.sh; then
exit 1
fi
8 changes: 8 additions & 0 deletions tests/run_all.sh
@@ -0,0 +1,8 @@
#!/bin/bash

if ! ./tests/run.sh; then
exit 1
fi
if ! ./tests/run_static.sh; then
exit 1
fi
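
`run.sh` and `run_static.sh` locate their env files through `$WORKSPACE`, so a local invocation of the combined script might look like the sketch below (an assumption about local use; in CI the workflow exports `WORKSPACE=$GITHUB_WORKSPACE`, and the Docker image sets `WORKSPACE=.`):
```
# from the repository root, point WORKSPACE at the checkout and run both suites
export WORKSPACE="$PWD"
./tests/run_all.sh
```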
16 changes: 10 additions & 6 deletions tests/run_static.sh
@@ -3,10 +3,14 @@
export VERSITYGW_TEST_ENV=$WORKSPACE/tests/.env.static
# shellcheck source=./.env.static
source "$VERSITYGW_TEST_ENV"
export AWS_PROFILE AWS_REGION BUCKET_ONE_NAME BUCKET_TWO_NAME AWS_ENDPOINT_URL
aws configure set aws_access_key_id "$AWS_ACCESS_KEY_ID"
aws configure set aws_secret_access_key "$AWS_SECRET_ACCESS_KEY"
export AWS_PROFILE BUCKET_ONE_NAME BUCKET_TWO_NAME AWS_ENDPOINT_URL
result=0
./tests/setup_static.sh
"$HOME"/bin/bats ./tests/s3_bucket_tests.sh
"$HOME"/bin/bats ./tests/posix_tests.sh
./tests/teardown_static.sh
if ! "$HOME"/bin/bats ./tests/s3_bucket_tests.sh; then
result=1
fi
if ! "$HOME"/bin/bats ./tests/posix_tests.sh; then
result=1
fi
./tests/teardown_static.sh
exit $result
8 changes: 4 additions & 4 deletions tests/s3_bucket_tests.sh
@@ -313,8 +313,8 @@ source ./tests/util.sh
[[ $upload_result -eq 0 ]] || fail "Error performing multipart upload"

copy_file "s3://$BUCKET_ONE_NAME/$bucket_file" "$test_file_folder/$bucket_file-copy"
copy_data=$(<"$test_file_folder"/$bucket_file-copy)
[[ $bucket_file_data == "$copy_data" ]] || fail "Data doesn't match"
compare_files "$test_file_folder/$bucket_file-copy" "$test_file_folder"/$bucket_file || compare_result=$?
[[ $compare_result -eq 0 ]] || fail "Files do not match"

delete_bucket_or_contents "$BUCKET_ONE_NAME"
delete_test_files $bucket_file
@@ -436,8 +436,8 @@ source ./tests/util.sh
[[ $upload_result -eq 0 ]] || fail "Error performing multipart upload"

copy_file "s3://$BUCKET_ONE_NAME/$bucket_file-copy" "$test_file_folder/$bucket_file-copy"
copy_data=$(<"$test_file_folder"/$bucket_file-copy)
[[ $bucket_file_data == "$copy_data" ]] || fail "Data doesn't match"
compare_files "$test_file_folder"/$bucket_file-copy "$test_file_folder"/$bucket_file || compare_result=$?
[[ $compare_result -eq 0 ]] || fail "Data doesn't match"

delete_bucket_or_contents "$BUCKET_ONE_NAME"
delete_test_files $bucket_file
11 changes: 5 additions & 6 deletions tests/setup.sh
@@ -2,8 +2,8 @@

setup() {

if [ "$GITHUB_ACTIONS" != "true" ] && [ -r .secrets ]; then
source .secrets
if [ "$GITHUB_ACTIONS" != "true" ] && [ -r tests/.secrets ]; then
source tests/.secrets
else
echo "Warning: no secrets file found"
fi
@@ -31,9 +31,6 @@ setup() {
elif [ -z "$BACKEND" ]; then
echo "No backend parameter set (options: 'posix')"
return 1
elif [ -z "$AWS_REGION" ]; then
echo "No AWS region set"
return 1
elif [ -z "$AWS_PROFILE" ]; then
echo "No AWS profile set"
return 1
@@ -56,10 +53,12 @@
echo "RECREATE_BUCKETS must be 'true' or 'false'"
return 1
fi
key_len=${#AWS_ACCESS_KEY_ID}
secret_len=${#AWS_SECRET_ACCESS_KEY}
echo "$key_len $secret_len $VERSITY_EXE $BACKEND $LOCAL_FOLDER $AWS_ENDPOINT_URL $AWS_PROFILE $BUCKET_ONE_NAME $BUCKET_TWO_NAME"

ROOT_ACCESS_KEY="$AWS_ACCESS_KEY_ID" ROOT_SECRET_KEY="$AWS_SECRET_ACCESS_KEY" "$VERSITY_EXE" "$BACKEND" "$LOCAL_FOLDER" &

export AWS_REGION
export AWS_PROFILE
export AWS_ENDPOINT_URL
export LOCAL_FOLDER
16 changes: 16 additions & 0 deletions tests/util.sh
@@ -814,4 +814,20 @@ split_file() {
return 1
fi
return 0
}

# compare files
# input: two files
# return 0 for same data, 1 for different data, 2 for error
compare_files() {
if [ $# -ne 2 ]; then
echo "file comparison requires two files"
return 2
fi
file_one_md5=$(md5 -q "$1")
file_two_md5=$(md5 -q "$2")
if [[ $file_one_md5 == "$file_two_md5" ]]; then
return 0
fi
return 1
}
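
`md5 -q` is the BSD/macOS invocation; GNU-based systems, such as the Ubuntu image in `Dockerfile_test_bats`, ship `md5sum` instead. A portable sketch of the same check using `cmp` rather than checksums (`compare_files_portable` is a hypothetical name, not part of this change):
```
# compare two files byte-for-byte
# return 0 for same data, 1 for different or unreadable data, 2 for wrong argument count
compare_files_portable() {
  if [ $# -ne 2 ]; then
    echo "file comparison requires two files"
    return 2
  fi
  if cmp -s "$1" "$2"; then
    return 0
  fi
  return 1
}
```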