diff --git a/docs/modules/ROOT/pages/how-tos/testing/integration/creating.adoc b/docs/modules/ROOT/pages/how-tos/testing/integration/creating.adoc
index 54b84f06..36caa142 100644
--- a/docs/modules/ROOT/pages/how-tos/testing/integration/creating.adoc
+++ b/docs/modules/ROOT/pages/how-tos/testing/integration/creating.adoc
@@ -1 +1,179 @@
-= Creating a custom integration test
\ No newline at end of file
+= Creating a custom integration test

In {ProductName}, you can create your own integration tests to run on all components of a given application before they are deployed.

.Procedure

To create any custom test, complete the following steps:

. In your preferred IDE, create a Tekton pipeline in a `.yaml` file.

. Within that pipeline, create tasks, which define the actual steps of the test that {ProductName} executes against images before deploying them.

. Commit the `.yaml` file to a GitHub repository and add it as an integration test in {ProductName}.

.Procedure with example

To create a custom test that checks that your app serves the text “Hello world!”, complete the following steps:

. In your preferred IDE, create a new `.yaml` file with a name of your choosing.

. Define a new Tekton pipeline. The following example is the beginning of a pipeline for a test that checks that the app serves the text “Hello world!”; the task you add in the next step is a placeholder that you can later extend with the actual `curl` check.
+
Example pipeline file:
+
[source]
----
kind: Pipeline
apiVersion: tekton.dev/v1beta1
metadata:
  name: example-pipeline
spec:
  params:
    - description: 'Snapshot of the application'
      name: SNAPSHOT
      default: '{"components": [{"name":"test-app", "containerImage": "quay.io/example/repo:latest"}]}'
      type: string
  tasks:
----

. In the pipeline's `.spec.tasks` path, declare a new task.
+
Example task declaration:
+
[source]
----
tasks:
  - name: task-1
    description: Placeholder task that prints the Snapshot and outputs standard TEST_OUTPUT
    params:
      - name: SNAPSHOT
        value: $(params.SNAPSHOT)
    taskSpec:
      params:
        - name: SNAPSHOT
      results:
        - name: TEST_OUTPUT
          description: Test output
      steps:
        - image: registry.redhat.io/openshift4/ose-cli:latest
          env:
            - name: SNAPSHOT
              value: $(params.SNAPSHOT)
          script: |
            dnf -y install jq

            echo -e "Example test task for the Snapshot:\n ${SNAPSHOT}"
            # Run custom tests for the given Snapshot here
            # After the tests finish, record the overall result in the RESULT variable
            RESULT="SUCCESS"

            # Output the standardized TEST_OUTPUT result in JSON form
            TEST_OUTPUT=$(jq -rc --arg date $(date +%s) --arg RESULT "${RESULT}" --null-input \
              '{result: $RESULT, timestamp: $date, failures: 0, successes: 1, warnings: 0}')
            echo -n "${TEST_OUTPUT}" | tee $(results.TEST_OUTPUT.path)
----

. Save the `.yaml` file.

.. If you haven’t already, commit this file to a GitHub repository that {ProductName} can access.
+
Complete example file:
+
[source]
----
kind: Pipeline
apiVersion: tekton.dev/v1beta1
metadata:
  name: example-pipeline
spec:
  params:
    - description: 'Snapshot of the application'
      name: SNAPSHOT
      default: '{"components": [{"name":"test-app", "containerImage": "quay.io/example/repo:latest"}]}'
      type: string
    - description: 'Namespace where the application is running'
      name: NAMESPACE
      default: "default"
      type: string
    - description: 'Expected output'
      name: EXPECTED_OUTPUT
      default: "Hello World!"
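      # NOTE: NAMESPACE and EXPECTED_OUTPUT are defined for this example but are not consumed by
      # the placeholder task below; a curl-based check could read them to find the running app and
      # the text it is expected to serve (see the sketch at the end of this page).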
      type: string
  tasks:
    - name: task-1
      description: Placeholder task that prints the Snapshot and outputs standard TEST_OUTPUT
      params:
        - name: SNAPSHOT
          value: $(params.SNAPSHOT)
      taskSpec:
        params:
          - name: SNAPSHOT
        results:
          - name: TEST_OUTPUT
            description: Test output
        steps:
          - image: registry.redhat.io/openshift4/ose-cli:latest
            env:
              - name: SNAPSHOT
                value: $(params.SNAPSHOT)
            script: |
              dnf -y install jq
              echo -e "Example test task for the Snapshot:\n ${SNAPSHOT}"
              # Run custom tests for the given Snapshot here
              # After the tests finish, record the overall result in the RESULT variable
              RESULT="SUCCESS"

              # Output the standardized TEST_OUTPUT result in JSON form
              TEST_OUTPUT=$(jq -rc --arg date $(date +%s) --arg RESULT "${RESULT}" --null-input \
                '{result: $RESULT, timestamp: $date, failures: 0, successes: 1, warnings: 0}')
              echo -n "${TEST_OUTPUT}" | tee $(results.TEST_OUTPUT.path)
----

. Add your new custom test as an integration test in {ProductName}.

.. For additional instructions on adding an integration test, see Adding an integration test.

.Data injected into the PipelineRun of the integration test

When you create a custom integration test, {ProductName} automatically adds certain parameters and labels to the PipelineRun of the integration test. This section explains what those parameters and labels are and how they can help you.

Parameters:

* *`SNAPSHOT`*: contains the Snapshot of the whole application as a JSON string. This JSON string provides useful information about the test, such as which components {ProductName} is testing, and which git repository and commit {ProductName} is using to build those components. For more information, see link:https://github.com/konflux-ci/integration-examples/blob/main/examples/snapshot_json_string_example[an example Snapshot JSON string].

Labels:

* *`appstudio.openshift.io/application`*: contains the name of the application.

* *`appstudio.openshift.io/component`*: contains the name of the component.

* *`appstudio.openshift.io/snapshot`*: contains the name of the Snapshot.

* *`test.appstudio.openshift.io/optional`*: contains the optional flag, which specifies whether components must pass the integration test before deployment.

* *`test.appstudio.openshift.io/scenario`*: contains the name of the integration test (this label ends with "scenario" because each test is technically a custom resource called an `IntegrationTestScenario`).

.Verification

After adding the integration test to an application, trigger a new build of its components so that {ProductName} runs the integration test. To trigger a new build, make a commit to the GitHub repositories of your components.

NOTE: For information on other ways to trigger a new build, see xref:how-tos/testing/integration/rerunning.adoc[Retriggering Integration Tests].

When the new build is finished, complete the following steps in the {ProductName} console:

. Go to the *Integration tests* tab and select the highlighted name of your test.

. Go to the *Pipeline runs* tab of that test and select the most recent run.

. On the *Details* page, check whether the test succeeded for that component. Select the other tabs to view more details.

.. If you used the example script as is, switch to the *Logs* tab and verify that the test printed the Snapshot and the standardized `TEST_OUTPUT` result. If you replaced the placeholder with an actual `curl` check, verify that the logs show “Hello world!”.
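
.Example: replacing the placeholder script with a real check
The example task above only prints the Snapshot and always reports `SUCCESS`. The following is a minimal sketch of what a replacement for its `script:` block could look like if you want the task to actually verify the “Hello world!” response. It assumes that you also pass the `NAMESPACE` and `EXPECTED_OUTPUT` pipeline parameters into the task's `env` (the same way `SNAPSHOT` is passed), that `curl` is available in the task image, and that the component is reachable at an in-cluster URL; the `APP_URL` value is only an illustration, so adjust it for your environment.

[source]
----
dnf -y install jq

# Read the component image out of the injected SNAPSHOT JSON string
COMPONENT_IMAGE=$(echo "${SNAPSHOT}" | jq -r '.components[0].containerImage')
echo "Testing component image: ${COMPONENT_IMAGE}"

# Hypothetical in-cluster address of the deployed component; adjust for your setup
APP_URL="http://test-app.${NAMESPACE}.svc.cluster.local:8080"

# Fetch the served text and compare it with the expected output
RESPONSE=$(curl -s "${APP_URL}" || echo "request failed")
echo "Response from ${APP_URL}: ${RESPONSE}"
if [ "${RESPONSE}" = "${EXPECTED_OUTPUT}" ]; then
  RESULT="SUCCESS"
  SUCCESSES=1
  FAILURES=0
else
  RESULT="FAILURE"
  SUCCESSES=0
  FAILURES=1
fi

# Record the standardized TEST_OUTPUT result in JSON form;
# $(results.TEST_OUTPUT.path) is resolved by Tekton when the task runs
TEST_OUTPUT=$(jq -rc --arg date "$(date +%s)" --arg RESULT "${RESULT}" \
  --argjson successes "${SUCCESSES}" --argjson failures "${FAILURES}" --null-input \
  '{result: $RESULT, timestamp: $date, failures: $failures, successes: $successes, warnings: 0}')
echo -n "${TEST_OUTPUT}" | tee $(results.TEST_OUTPUT.path)
----

With a failing comparison, the task records `FAILURE` in `TEST_OUTPUT`, so the pipeline run reflects the real outcome of the check instead of always reporting success.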