From 48a1f6a8b1b35bf7d2d5cadf2e76945ff98713cc Mon Sep 17 00:00:00 2001
From: "gtrivedi@redhat.com"
Date: Tue, 14 May 2024 13:10:53 +0530
Subject: [PATCH 1/3] Moved content for integration tests

---
 .../how-tos/testing/integration/adding.adoc | 47 ++++++++++++++++++-
 1 file changed, 46 insertions(+), 1 deletion(-)

diff --git a/docs/modules/ROOT/pages/how-tos/testing/integration/adding.adoc b/docs/modules/ROOT/pages/how-tos/testing/integration/adding.adoc
index ff9fc0bc..0a4b67da 100644
--- a/docs/modules/ROOT/pages/how-tos/testing/integration/adding.adoc
+++ b/docs/modules/ROOT/pages/how-tos/testing/integration/adding.adoc
@@ -1 +1,46 @@
-= Adding an integration test
\ No newline at end of file
+= Adding an integration test
+
+In {ProductName}, you can add integration tests to verify that the individual components of your application integrate correctly, forming a complete and functional application. {ProductName} runs these integration tests on the container images of components before their deployment.
+
+.Prerequisites
+
+* You have created an application in {ProductName}.
+
+.Procedure
+
+Complete the following steps in the {ProductName} console:
+
+. Open an existing application and go to the *Integration tests* tab.
+
+. Select *Add integration test*.
+
+. In the *Integration test name* field, enter a name of your choosing.
+
+. In the *GitHub URL* field, enter the URL of the GitHub repository that contains the test you want to use.
+. Optional: If you want to use a branch, commit, or version other than the default, specify the branch name, commit SHA, or tag in the *Revisions* field.
+
+. In the *Path in repository* field, enter the path to the `.yaml` file that defines the test you want to use.
+. Optional: To allow the integration tests to fail without impacting the deployment and release process of your application, select *Mark as optional for release*.
+
++
+NOTE: By default, all integration test scenarios are mandatory and must pass.
+A failing integration test marks the application snapshot as failed, preventing its deployment and release. However, if you have selected *Mark as optional for release*, a failure in this test does not prevent the deployment and release of the application snapshot.
+
+. Select *Add integration test*.
+
+. Start a new build for any component you want to test.
+
+.. For components using the default build pipeline, go to the *Components* tab, select the three dots next to the name of the component, and select *Start new build*.
+
+.. For components with an upgraded build pipeline, make a commit to their GitHub repository.
+
+.Verification
+
+When the new build is finished:
+
+. Go to the *Integration tests* tab and select the highlighted name of your test.
+
+. Go to the *Pipeline runs* tab of that test and select the most recent run.
+
+. On the *Details* page, you can see if the test succeeded for that component. Navigate to the other tabs for more details.
+
+
From 245207976e6af79221b5f11d199e681e06d607cf Mon Sep 17 00:00:00 2001
From: "gtrivedi@redhat.com"
Date: Tue, 14 May 2024 13:14:22 +0530
Subject: [PATCH 2/3] Added content for Creating a custom integration test for upstream

---
 .../how-tos/testing/integration/creating.adoc | 181 +++++++++++++++++-
 1 file changed, 180 insertions(+), 1 deletion(-)

diff --git a/docs/modules/ROOT/pages/how-tos/testing/integration/creating.adoc b/docs/modules/ROOT/pages/how-tos/testing/integration/creating.adoc
index 54b84f06..e0ec633a 100644
--- a/docs/modules/ROOT/pages/how-tos/testing/integration/creating.adoc
+++ b/docs/modules/ROOT/pages/how-tos/testing/integration/creating.adoc
@@ -1 +1,180 @@
-= Creating a custom integration test
\ No newline at end of file
+= Creating a custom integration test
+
+In {ProductName}, you can create your own integration tests to run on all components of a given application before they are deployed.
+
+.Procedure
+
+To create any custom test, complete the following steps:
+
+. In your preferred IDE, create a Tekton pipeline in a `.yaml` file.
+
+. Within that pipeline, create tasks, which define the actual steps of the test that {ProductName} executes against images before deploying them.
+
+. Commit the `.yaml` file to a GitHub repository and add it as an integration test in {ProductName}.
+
+.Procedure with example
+
+To create a custom test that checks that your app serves the text “Hello world!”, complete the following steps:
+
+. In your preferred IDE, create a new `.yaml` file with a name of your choosing.
+
+. Define a new Tekton pipeline. The following example shows the beginning of a pipeline for a test that uses `curl` to check that the app serves the text “Hello world!”.
+
++
+Example pipeline file:
+
++
+[source]
+----
+kind: Pipeline
+apiVersion: tekton.dev/v1beta1
+metadata:
+  name: example-pipeline
+spec:
+  params:
+    - description: 'Snapshot of the application'
+      name: SNAPSHOT
+      default: '{"components": [{"name":"test-app", "containerImage": "quay.io/example/repo:latest"}]}'
+      type: string
+  tasks:
+----
+
+. In the `.spec.tasks` path of the pipeline, declare a new task.
+
++
+Example task declaration:
+
++
+[source]
+----
+tasks:
+  - name: task-1
+    description: Placeholder task that prints the Snapshot and outputs standard TEST_OUTPUT
+    params:
+      - name: SNAPSHOT
+        value: $(params.SNAPSHOT)
+    taskSpec:
+      params:
+        - name: SNAPSHOT
+      results:
+        - name: TEST_OUTPUT
+          description: Test output
+      steps:
+        - image: registry.redhat.io/openshift4/ose-cli:latest
+          env:
+            - name: SNAPSHOT
+              value: $(params.SNAPSHOT)
+          script: |
+            dnf -y install jq
+
+            echo -e "Example test task for the Snapshot:\n ${SNAPSHOT}"
+            # Run custom tests for the given Snapshot here
+            # After the tests finish, record the overall result in the RESULT variable
+            RESULT="SUCCESS"
+
+            # Output the standardized TEST_OUTPUT result in JSON form
+            TEST_OUTPUT=$(jq -rc --arg date $(date +%s) --arg RESULT "${RESULT}" --null-input \
+              '{result: $RESULT, timestamp: $date, failures: 0, successes: 1, warnings: 0}')
+            echo -n "${TEST_OUTPUT}" | tee $(results.TEST_OUTPUT.path)
+
+----
+
+. Save the `.yaml` file.
+
+.. If you haven’t already, commit this file to a GitHub repository that {ProductName} can access.
+
++
+Complete example file:
+
++
+[source]
+----
+kind: Pipeline
+apiVersion: tekton.dev/v1beta1
+metadata:
+  name: example-pipeline
+spec:
+  params:
+    - description: 'Snapshot of the application'
+      name: SNAPSHOT
+      default: '{"components": [{"name":"test-app", "containerImage": "quay.io/example/repo:latest"}]}'
+      type: string
+    - description: 'Namespace where the application is running'
+      name: NAMESPACE
+      default: "default"
+    - description: 'Expected output'
+      name: EXPECTED_OUTPUT
+      default: "Hello World!"
+      type: string
+  workspaces:
+    - name: cluster-credentials
+      optional: true
+  tasks:
+    - name: task-1
+      description: Placeholder task that prints the Snapshot and outputs standard TEST_OUTPUT
+      params:
+        - name: SNAPSHOT
+          value: $(params.SNAPSHOT)
+      taskSpec:
+        params:
+          - name: SNAPSHOT
+        results:
+          - name: TEST_OUTPUT
+            description: Test output
+        steps:
+          - image: registry.redhat.io/openshift4/ose-cli:latest
+            env:
+              - name: SNAPSHOT
+                value: $(params.SNAPSHOT)
+            script: |
+              dnf -y install jq
+              echo -e "Example test task for the Snapshot:\n ${SNAPSHOT}"
+              # Run custom tests for the given Snapshot here
+              # After the tests finish, record the overall result in the RESULT variable
+              RESULT="SUCCESS"
+
+              # Output the standardized TEST_OUTPUT result in JSON form
+              TEST_OUTPUT=$(jq -rc --arg date $(date +%s) --arg RESULT "${RESULT}" --null-input \
+                '{result: $RESULT, timestamp: $date, failures: 0, successes: 1, warnings: 0}')
+              echo -n "${TEST_OUTPUT}" | tee $(results.TEST_OUTPUT.path)
+----
+
+. Add your new custom test as an integration test in {ProductName}.
+
+.. For additional instructions on adding an integration test, see xref:how-tos/testing/integration/adding.adoc[Adding an integration test].
+
+.Data injected into the PipelineRun of the integration test
+
+When you create a custom integration test, {ProductName} automatically adds certain parameters, workspaces, and labels to the PipelineRun of the integration test. This section explains what they are and how they can help you.
+
+Parameters:
+
+* *`SNAPSHOT`*: contains the xref:../../glossary/index.adoc#_snapshot[snapshot] of the whole application as a JSON string. This JSON string provides useful information about the test, such as which components {ProductName} is testing, and what git repository and commit {ProductName} is using to build those components.
+For information about the snapshot JSON string, see link:https://github.com/konflux-ci/integration-examples/blob/main/examples/snapshot_json_string_example[an example snapshot JSON string].
+
+Labels:
+
+* *`appstudio.openshift.io/application`*: contains the name of the application.
+
+* *`appstudio.openshift.io/component`*: contains the name of the component.
+
+* *`appstudio.openshift.io/snapshot`*: contains the name of the snapshot.
+
+* *`test.appstudio.openshift.io/optional`*: contains the optional flag, which specifies whether components must pass the integration test before deployment.
+
+* *`test.appstudio.openshift.io/scenario`*: contains the name of the integration test (this label ends with "scenario," because each test is technically a custom resource called an `IntegrationTestScenario`).
+
+
+.Verification
+
+After adding the integration test to an application, you need to trigger a new build of its components to make {ProductName} run the integration test. Make a commit to the GitHub repositories of your components to trigger a new build.
+
+When the new build is finished, complete the following steps in the {ProductName} console:
+
+. Go to the *Integration tests* tab and select the highlighted name of your test.
+
+. Go to the *Pipeline runs* tab of that test and select the most recent run.
+
+. On the *Details* page, see if the test succeeded for that component. Select the other tabs to view more details.
+
+.. If you used our example script, switch to the *Logs* tab and verify that the test printed “Hello world!”.

From 9c5589de021932ac470de07676699e601aaee048 Mon Sep 17 00:00:00 2001
From: "gtrivedi@redhat.com"
Date: Tue, 14 May 2024 13:45:23 +0530
Subject: [PATCH 3/3] Revert "Added content for Creating a custom integration test for upstream"

This reverts commit f49b125b3a0a90529d40d2c7e7ddbfde4627d8a9.
---
 .../how-tos/testing/integration/adding.adoc   |   6 +-
 .../how-tos/testing/integration/creating.adoc | 181 +-----------------
 2 files changed, 2 insertions(+), 185 deletions(-)

diff --git a/docs/modules/ROOT/pages/how-tos/testing/integration/adding.adoc b/docs/modules/ROOT/pages/how-tos/testing/integration/adding.adoc
index 0a4b67da..60f1680c 100644
--- a/docs/modules/ROOT/pages/how-tos/testing/integration/adding.adoc
+++ b/docs/modules/ROOT/pages/how-tos/testing/integration/adding.adoc
@@ -27,11 +27,7 @@ NOTE: By default, all integration test scenarios are mandatory and must pass. A
 
 . Select *Add integration test*.
 
-. Start a new build for any component you want to test.
-
-.. For components using the default build pipeline, go to the *Components* tab, select the three dots next to the name of the component, and select *Start new build*.
-
-.. For components with an upgraded build pipeline, make a commit to their GitHub repository.
+. To start a new build of a component, either open a new pull request (PR) that targets the tracked branch of the component in the GitHub repository, or comment '/retest' on an existing PR.
 
 .Verification
 
diff --git a/docs/modules/ROOT/pages/how-tos/testing/integration/creating.adoc b/docs/modules/ROOT/pages/how-tos/testing/integration/creating.adoc
index e0ec633a..54b84f06 100644
--- a/docs/modules/ROOT/pages/how-tos/testing/integration/creating.adoc
+++ b/docs/modules/ROOT/pages/how-tos/testing/integration/creating.adoc
@@ -1,180 +1 @@
-= Creating a custom integration test
-
-In {ProductName}, you can create your own integration tests to run on all components of a given application before they are deployed.
-
-.Procedure
-
-To create any custom test, complete the following steps:
-
-. In your preferred IDE, create a Tekton pipeline in a `.yaml` file.
-
-. Within that pipeline, create tasks, which define the actual steps of the test that {ProductName} executes against images before deploying them.
-
-. Commit the `.yaml` file to a GitHub repository and add it as an integration test in {ProductName}.
-
-.Procedure with example
-
-To create a custom test that checks that your app serves the text “Hello world!”, complete the following steps:
-
-. In your preferred IDE, create a new `.yaml` file with a name of your choosing.
-
-. Define a new Tekton pipeline. The following example shows the beginning of a pipeline for a test that uses `curl` to check that the app serves the text “Hello world!”.
-
-+
-Example pipeline file:
-
-+
-[source]
-----
-kind: Pipeline
-apiVersion: tekton.dev/v1beta1
-metadata:
-  name: example-pipeline
-spec:
-  params:
-    - description: 'Snapshot of the application'
-      name: SNAPSHOT
-      default: '{"components": [{"name":"test-app", "containerImage": "quay.io/example/repo:latest"}]}'
-      type: string
-  tasks:
-----
-
-. In the `.spec.tasks` path of the pipeline, declare a new task.
-
-+
-Example task declaration:
-
-+
-[source]
-----
-tasks:
-  - name: task-1
-    description: Placeholder task that prints the Snapshot and outputs standard TEST_OUTPUT
-    params:
-      - name: SNAPSHOT
-        value: $(params.SNAPSHOT)
-    taskSpec:
-      params:
-        - name: SNAPSHOT
-      results:
-        - name: TEST_OUTPUT
-          description: Test output
-      steps:
-        - image: registry.redhat.io/openshift4/ose-cli:latest
-          env:
-            - name: SNAPSHOT
-              value: $(params.SNAPSHOT)
-          script: |
-            dnf -y install jq
-
-            echo -e "Example test task for the Snapshot:\n ${SNAPSHOT}"
-            # Run custom tests for the given Snapshot here
-            # After the tests finish, record the overall result in the RESULT variable
-            RESULT="SUCCESS"
-
-            # Output the standardized TEST_OUTPUT result in JSON form
-            TEST_OUTPUT=$(jq -rc --arg date $(date +%s) --arg RESULT "${RESULT}" --null-input \
-              '{result: $RESULT, timestamp: $date, failures: 0, successes: 1, warnings: 0}')
-            echo -n "${TEST_OUTPUT}" | tee $(results.TEST_OUTPUT.path)
-
-----
-
-. Save the `.yaml` file.
-
-.. If you haven’t already, commit this file to a GitHub repository that {ProductName} can access.
-
-+
-Complete example file:
-
-+
-[source]
-----
-kind: Pipeline
-apiVersion: tekton.dev/v1beta1
-metadata:
-  name: example-pipeline
-spec:
-  params:
-    - description: 'Snapshot of the application'
-      name: SNAPSHOT
-      default: '{"components": [{"name":"test-app", "containerImage": "quay.io/example/repo:latest"}]}'
-      type: string
-    - description: 'Namespace where the application is running'
-      name: NAMESPACE
-      default: "default"
-    - description: 'Expected output'
-      name: EXPECTED_OUTPUT
-      default: "Hello World!"
-      type: string
-  workspaces:
-    - name: cluster-credentials
-      optional: true
-  tasks:
-    - name: task-1
-      description: Placeholder task that prints the Snapshot and outputs standard TEST_OUTPUT
-      params:
-        - name: SNAPSHOT
-          value: $(params.SNAPSHOT)
-      taskSpec:
-        params:
-          - name: SNAPSHOT
-        results:
-          - name: TEST_OUTPUT
-            description: Test output
-        steps:
-          - image: registry.redhat.io/openshift4/ose-cli:latest
-            env:
-              - name: SNAPSHOT
-                value: $(params.SNAPSHOT)
-            script: |
-              dnf -y install jq
-              echo -e "Example test task for the Snapshot:\n ${SNAPSHOT}"
-              # Run custom tests for the given Snapshot here
-              # After the tests finish, record the overall result in the RESULT variable
-              RESULT="SUCCESS"
-
-              # Output the standardized TEST_OUTPUT result in JSON form
-              TEST_OUTPUT=$(jq -rc --arg date $(date +%s) --arg RESULT "${RESULT}" --null-input \
-                '{result: $RESULT, timestamp: $date, failures: 0, successes: 1, warnings: 0}')
-              echo -n "${TEST_OUTPUT}" | tee $(results.TEST_OUTPUT.path)
-----
-
-. Add your new custom test as an integration test in {ProductName}.
-
-.. For additional instructions on adding an integration test, see xref:how-tos/testing/integration/adding.adoc[Adding an integration test].
-
-.Data injected into the PipelineRun of the integration test
-
-When you create a custom integration test, {ProductName} automatically adds certain parameters, workspaces, and labels to the PipelineRun of the integration test. This section explains what they are and how they can help you.
-
-Parameters:
-
-* *`SNAPSHOT`*: contains the xref:../../glossary/index.adoc#_snapshot[snapshot] of the whole application as a JSON string. This JSON string provides useful information about the test, such as which components {ProductName} is testing, and what git repository and commit {ProductName} is using to build those components. For information about the snapshot JSON string, see link:https://github.com/konflux-ci/integration-examples/blob/main/examples/snapshot_json_string_example[an example snapshot JSON string].
-
-Labels:
-
-* *`appstudio.openshift.io/application`*: contains the name of the application.
-
-* *`appstudio.openshift.io/component`*: contains the name of the component.
-
-* *`appstudio.openshift.io/snapshot`*: contains the name of the snapshot.
-
-* *`test.appstudio.openshift.io/optional`*: contains the optional flag, which specifies whether components must pass the integration test before deployment.
-
-* *`test.appstudio.openshift.io/scenario`*: contains the name of the integration test (this label ends with "scenario," because each test is technically a custom resource called an `IntegrationTestScenario`).
-
-
-.Verification
-
-After adding the integration test to an application, you need to trigger a new build of its components to make {ProductName} run the integration test. Make a commit to the GitHub repositories of your components to trigger a new build.
-
-When the new build is finished, complete the following steps in the {ProductName} console:
-
-. Go to the *Integration tests* tab and select the highlighted name of your test.
-
-. Go to the *Pipeline runs* tab of that test and select the most recent run.
-
-. On the *Details* page, see if the test succeeded for that component. Select the other tabs to view more details.
-
-.. If you used our example script, switch to the *Logs* tab and verify that the test printed “Hello world!”.
+= Creating a custom integration test
\ No newline at end of file
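A note on the examples in these patches: the `SNAPSHOT` parameter that {ProductName} injects into the test PipelineRun arrives as a plain JSON string, which the example task scripts parse with `jq`. The same parsing can be sketched in Python; the component name `test-app` and the image below are the example defaults from the patches above, not real values.

```python
import json

# Example SNAPSHOT default taken from the pipelines in the patches above.
snapshot = ('{"components": [{"name":"test-app", '
            '"containerImage": "quay.io/example/repo:latest"}]}')

# Map each component name to its container image, as a test script
# would before running checks against each image.
images = {c["name"]: c["containerImage"]
          for c in json.loads(snapshot)["components"]}

print(images["test-app"])  # quay.io/example/repo:latest
```

A real test would loop over every entry in `images` and record a standardized `TEST_OUTPUT` result, as the placeholder tasks do.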