From e542c6739830c3f92306052b93e85b9058d2b5a4 Mon Sep 17 00:00:00 2001
From: SergioLangaritaBenitez
Date: Thu, 17 Oct 2024 09:55:28 +0200
Subject: [PATCH 1/5] fix typo Alterations

---
 docpage/docs/05.- Alterations/Decode.md | 2 +-
 docpage/docs/05.- Alterations/index.md  | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/docpage/docs/05.- Alterations/Decode.md b/docpage/docs/05.- Alterations/Decode.md
index f9a281e4..d7d0b0c3 100644
--- a/docpage/docs/05.- Alterations/Decode.md
+++ b/docpage/docs/05.- Alterations/Decode.md
@@ -5,7 +5,7 @@ sidebar_position: 2
 # Decode
 
 Alteration's Decode decodes the data flow from the chosen encoding. The user must ensure the input data is encoded using the selected encoding.
-Three encodes are available: `base64`, `base32` and `hex`. It is similar to the command `base64 -d` or `base32 -d`. For example, If the input data is a string in base64 with the value `aGVsbG8K` or in base32 with the value `NBSWY3DPBI======`. The output data is be the same in both cases, `hello`.
+Three encodes are available: `base64`, `base32` and `hex`. It is similar to the command `base64 -d` or `base32 -d`. For example, If the input data is a string in base64 with the value `aGVsbG8K` or in base32 with the value `NBSWY3DPBI======`. The output data is the same in both cases, `hello`.
 
 Here is the YAML example.
 
diff --git a/docpage/docs/05.- Alterations/index.md b/docpage/docs/05.- Alterations/index.md
index cd15956c..9a85f446 100644
--- a/docpage/docs/05.- Alterations/index.md
+++ b/docpage/docs/05.- Alterations/index.md
@@ -4,7 +4,7 @@ sidebar_position: 5
 
 # Alterations
 
-The subsection `alterations`, is located inside a Sources elements, and it changes the input data format. These alterations are applied as a descendent definition. These steps are helpful to be okay with the input Sources format and to re-use the Sources with no changes.
+The subsection `alterations`, is located inside a Source element and changes the input data format. These alterations are applied as a descendent definition. These steps are helpful to be okay with the input Sources format and to re-use the Sources with no changes.

From d7142c9c22ec98d82050e4d6ee67621aaac62298 Mon Sep 17 00:00:00 2001
From: SergioLangaritaBenitez
Date: Thu, 17 Oct 2024 09:58:49 +0200
Subject: [PATCH 2/5] improve Destination doc

---
 docpage/docs/04.- Destinations/OSCAR.md | 2 +-
 docpage/docs/04.- Destinations/index.md | 6 ++++++
 2 files changed, 7 insertions(+), 1 deletion(-)
 create mode 100644 docpage/docs/04.- Destinations/index.md

diff --git a/docpage/docs/04.- Destinations/OSCAR.md b/docpage/docs/04.- Destinations/OSCAR.md
index 782599ad..b0fc5fec 100644
--- a/docpage/docs/04.- Destinations/OSCAR.md
+++ b/docpage/docs/04.- Destinations/OSCAR.md
@@ -8,7 +8,7 @@ The OSCAR Destination invokes an OSCAR service asynchronously:
 - An identifier name of the process. It must be unique. Required.
 - Endpoint. Required.
 - Service in OSCAR. Required.
-- Token or user/password. The user/password will be first if both authentication processes are defined. Do not edit the OSCAR services. Required.
+- Token or user/password. The user/password has priority. Do not edit the OSCAR services. Required.
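+
+For illustration, an OSCAR Destination entry in the workflow file could look like the sketch below. The keys mirror the requirements listed above (name, endpoint, service, and token or user/password); the names and values are illustrative, so check a current DCNiOS example for the exact schema.
+
+```
+OSCAR:
+  - name: edgan3
+    endpoint: https://{oscar-endpoint}
+    service: edgan3
+    user: admin
+    password: pass
+    #token: <token>
+```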
 
 Destination is composed of this component:
diff --git a/docpage/docs/04.- Destinations/index.md b/docpage/docs/04.- Destinations/index.md
new file mode 100644
index 00000000..1602afc6
--- /dev/null
+++ b/docpage/docs/04.- Destinations/index.md
@@ -0,0 +1,6 @@
+---
+sidebar_position: 4
+---
+# Destinations
+
+The `Destinations` section defines the third-party elements where NiFi sends the data or events. Currently, only [OSCAR](/docs/Destinations/OSCAR) is available.

From 4a3fedbd026422c75460bebd2f6bab266dadc1a9 Mon Sep 17 00:00:00 2001
From: SergioLangaritaBenitez
Date: Thu, 17 Oct 2024 12:07:23 +0200
Subject: [PATCH 3/5] update doc

---
 docpage/docs/03.- Sources/{ => AWS}/S3.md  |  2 +-
 docpage/docs/03.- Sources/{ => AWS}/SQS.md |  0
 docpage/docs/03.- Sources/AWS/index.md     | 26 +++++++++++++++
 docpage/docs/03.- Sources/Kafka.md         | 27 ++++++++--------
 docpage/docs/03.- Sources/dcache.md        |  2 +-
 docpage/docs/03.- Sources/generic.md       | 27 ----------------
 docpage/docs/03.- Sources/index.md         | 13 ++++++++
 docpage/docs/Introduction.md               |  8 ++---
 docpage/docs/Users.md                      | 37 ++++++++++------------
 9 files changed, 76 insertions(+), 66 deletions(-)
 rename docpage/docs/03.- Sources/{ => AWS}/S3.md (68%)
 rename docpage/docs/03.- Sources/{ => AWS}/SQS.md (100%)
 create mode 100644 docpage/docs/03.- Sources/AWS/index.md
 delete mode 100644 docpage/docs/03.- Sources/generic.md
 create mode 100644 docpage/docs/03.- Sources/index.md

diff --git a/docpage/docs/03.- Sources/S3.md b/docpage/docs/03.- Sources/AWS/S3.md
similarity index 68%
rename from docpage/docs/03.- Sources/S3.md
rename to docpage/docs/03.- Sources/AWS/S3.md
index dba3a0ee..1576d237 100644
--- a/docpage/docs/03.- Sources/S3.md
+++ b/docpage/docs/03.- Sources/AWS/S3.md
@@ -3,7 +3,7 @@ sidebar_position: 3
 ---
 # S3
 
-The S3 Source captures an ObjectCreated event from an AWS S3 bucket. DCNiOS creates S3 bucket event redirections to SQS queue. Then, Apache NiFi captures the event and introduces it to the dataflow. The whole pipeline is created using DCNiOS. But, SQS queue is deleted with DCNiOS, but the Event Notification in the S3 section needs to be removed manually.
+The S3 Source captures an ObjectCreated event from an AWS S3 bucket. DCNiOS redirects the S3 bucket events to an SQS queue. Then, Apache NiFi captures the event and introduces it into the dataflow. The whole pipeline is created using DCNiOS. The SQS queue is deleted with DCNiOS, but the Event Notification in the S3 section needs to be removed manually.
 
 The S3 Source requires:
 - An identifier name of the process. It must be unique. Required.
diff --git a/docpage/docs/03.- Sources/SQS.md b/docpage/docs/03.- Sources/AWS/SQS.md
similarity index 100%
rename from docpage/docs/03.- Sources/SQS.md
rename to docpage/docs/03.- Sources/AWS/SQS.md
diff --git a/docpage/docs/03.- Sources/AWS/index.md b/docpage/docs/03.- Sources/AWS/index.md
new file mode 100644
index 00000000..bfb700dc
--- /dev/null
+++ b/docpage/docs/03.- Sources/AWS/index.md
@@ -0,0 +1,26 @@
+---
+sidebar_position: 5
+---
+# AWS
+
+
+**DCNiOS can use some AWS services**: this section clarifies the credential configuration they require.
+
+
+DCNiOS can use some AWS services as input. A valid pair of AWS Access Key and AWS Secret Key is necessary in all these cases. DCNiOS takes the AWS credentials from several places, following a precedence hierarchy. This design minimizes how often the credentials must be written in the configuration file.
+
+- Environment variables `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY`.
+- From the [AWS credentials file](https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-files.html) with the `default` section. Here is an example:
+  ``` bash
+  [default]
+  aws_access_key_id = AK<>
+  aws_secret_access_key = <>
+  ```
+- From the DCNiOS workflow file, using the arguments `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY`.
+
+
+
+`AWS_DEFAULT_REGION` is mandatory for any Source that uses AWS in the configuration file. These ProcessGroups use AWS credentials:
+- [SQS](/docs/Sources/AWS/SQS)
+- [S3](/docs/Sources/AWS/S3)
+
diff --git a/docpage/docs/03.- Sources/Kafka.md b/docpage/docs/03.- Sources/Kafka.md
index fe6e702f..b2eb8b2f 100644
--- a/docpage/docs/03.- Sources/Kafka.md
+++ b/docpage/docs/03.- Sources/Kafka.md
@@ -4,26 +4,27 @@ sidebar_position: 2
 
 # Kafka
 
-
 The Kafka Source allows us to consume a Kafka topic. It requires this information:
 - An identifier name of the process. It must be unique. Required.
 - Kafka bootstrap_servers, just the IP and the port without any protocol, as `<IP>:<PORT>`. Required.
 - The topic name that is going to be consumed. Required.
 - The group identifier indicates the consumer group. Required.
 - [IM](https://www.grycap.upv.es/im/index.php) serve a recipe that supports the SASL_SSL security protocol. So, the user `sasl_username` and password `sasl_password` must be set. These parameters are set at Kafka deployment time. Required.
-- In case the topics you are consuming follow a `key:value` pattern set the argument `separate_by_key` as true and select the demarcator with `message_demarcator`
+- If the topic consuming follows a `key:value` pattern, set the argument `separate_by_key` true and select the demarcator with `message_demarcator`.
 
-Also, it is necessary an SSL connection between NiFi and Kafka. This connection is made by a PKCS12 certificate and the password of the certificate.
+Also, an SSL connection between NiFi and Kafka is necessary. A PKCS12 certificate and the certificate's password make this connection.
 
 ```
-Kafka:
-- name: kafka
-  bootstrap_servers: <IP>:<PORT>
-  topic:
-  group_id: ""
-  sasl_username:
-  sasl_password:
-  ssl_context:
-    Truststore_Filename:
-    Truststore_Password:
+ Kafka:
+ - name: kafka
+   bootstrap_servers: <IP>:<PORT>
+   topic:
+   group_id: "1"
+   sasl_username:
+   sasl_password:
+   #separate_by_key: "false"
+   #message_demarcator: ";"
+   ssl_context:
+     Truststore_Filename:
+     Truststore_Password: ""
 ```
diff --git a/docpage/docs/03.- Sources/dcache.md b/docpage/docs/03.- Sources/dcache.md
index 6af32a4a..c7bcdb60 100644
--- a/docpage/docs/03.- Sources/dcache.md
+++ b/docpage/docs/03.- Sources/dcache.md
@@ -7,7 +7,7 @@ dCache is a Source that listens into a dCache instance. The following values mus
 - An identifier name of the process. It must be unique. Required.
 - Endpoint, user, and password of a dCache instance. Required.
 - Folder of dCache where keeps an active listening.Required.
-- Statefile is the name of the file that will store the state. `dcache` value is not recommended. It creates misbehavior. Required.
+- Statefile is the file name that will store the state. `dcache` value is not recommended. It creates misbehavior. Required.
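+
+For illustration, a dCache Source entry in the workflow file could look like the sketch below. The keys mirror the fields listed above (name, endpoint, user, password, folder, and statefile); the names and values are illustrative, so check a current DCNiOS example for the exact schema.
+
+```
+dCache:
+  - name: dcache
+    endpoint: https://{dcache-endpoint}
+    user: admin
+    password: pass
+    folder: /data/input
+    statefile: state
+```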
 The dCache Source only works when the NiFi cluster is deployed with the image `ghcr.io/grycap/nifi-sse:latest`, is composed of:
 - ExecuteProcess
diff --git a/docpage/docs/03.- Sources/generic.md b/docpage/docs/03.- Sources/generic.md
deleted file mode 100644
index 63ac01f4..00000000
--- a/docpage/docs/03.- Sources/generic.md
+++ /dev/null
@@ -1,27 +0,0 @@
----
-sidebar_position: 5
----
-# Generic
-
-The generic section creates a custom workflow by giving a ProcessGroup file (.json). The purpose of this component could be Source, Destination, Alteration, or even a data flow complete. The behavior is specific by the creator of the file '.json'
-
-which is comprised of:
-- An identifier name of the process. It must be unique. Required.
-- The path of your ProcessGroup (.json file).Required.
-- The variables that compose the workflow (as a list).
-
-
-```
-generic:
-  - name:
-    file:
-    variables:
-      key1: value1
-      key2: value2
-    components:
-    - name: InvokeOSCAR
-      seconds: 5
-```
-
-
-
diff --git a/docpage/docs/03.- Sources/index.md b/docpage/docs/03.- Sources/index.md
new file mode 100644
index 00000000..69038a0a
--- /dev/null
+++ b/docpage/docs/03.- Sources/index.md
@@ -0,0 +1,13 @@
+---
+sidebar_position: 3
+---
+# Sources
+
+`Sources` defines the third-party elements to which NiFi connects, waiting for events.
+
+
+Sources support:
+- [dCache](/docs/Sources/dcache)
+- [KAFKA](/docs/Sources/Kafka)
+- [S3](/docs/Sources/AWS/S3)
+- [SQS](/docs/Sources/AWS/SQS)
\ No newline at end of file
diff --git a/docpage/docs/Introduction.md b/docpage/docs/Introduction.md
index c13b4fb5..631d4807 100644
--- a/docpage/docs/Introduction.md
+++ b/docpage/docs/Introduction.md
@@ -9,12 +9,12 @@ DCNiOS is an open-source command-line tool that easily manages the creation of e
 
 ![DCNiOS images](/../static/img/dcnios-logo-hor.png)
 
-Apache NiFi Process Group is a group of Processors that compose a dataflow. DCNiOS uses predefined Process Groups that make simple actions like interacting with a third-party component (e.g., consuming from Kafka) or changing the data content (e.g.encoding the data in base64) to compose a complete dataflow.
+Apache NiFi Process Group is a group of Processors that compose a dataflow. DCNiOS uses predefined Process Groups that perform simple actions like interacting with third-party elements (e.g., consuming from Kafka) or changing the data content (e.g., encoding the data in base64) to compose a complete dataflow.
 
 In DCNiOS documentation, the Process Groups are split by purpose into three main groups: 'Sources', 'Destinations', and 'Alterations'.
-- 'Sources' interact with a third-party component as the input data receiver.
-- 'Destinations' interact with a third-party component as an output data sender.
-- 'Alterations' that do not interact with third-party components and change the format of the data flow.
+- 'Sources' interact with third-party elements as the input data receiver.
+- 'Destinations' interact with third-party elements as an output data sender.
+- 'Alterations' do not interact with third-party elements; they change the format of the data flow.
diff --git a/docpage/docs/Users.md b/docpage/docs/Users.md
index cb3dd82d..1e2d1cb2 100644
--- a/docpage/docs/Users.md
+++ b/docpage/docs/Users.md
@@ -4,7 +4,7 @@ sidebar_position: 2
 
 # Users Guide
 
-Here, you will find an explanation of the main concepts of DCNiOS, such as the DCNiOS commands, how to define a workflow, involved sections, and the commun sections for all the third-party connections.
+This page explains the main concepts of DCNiOS, such as the DCNiOS commands and workflow definition.
 
 ## Commands
@@ -40,11 +40,11 @@ python dcnios-cli.py changeSchedule --host={nifi-endpoint} \
 
 ## File workflow configuration structure (Yaml structure)
 
-Here, we will explain the workflow definition, the structure of the configuration file, and the information the user has to know about each third-party connection. DCNiOS deploys and configures all the definitions in Apache NiFi.
+Here, we explain the workflow definition, the structure of the configuration file, and the information the user needs to know about each third-party connection. DCNiOS deploys and configures all the definitions in Apache NiFi.
 
 ### Apache NiFi credentials:
 
-In this 'nifi' section, the Apache NiFi credentials will be defined. Inside this section will be defined the Sources that will be deployed and the conection between them.
+In this `nifi` section, set the Apache NiFi credentials. The workflow is defined inside this section.
 
 ```
 nifi:
@@ -59,25 +59,26 @@ nifi:
 Moreover, it is necessary to define the source and destination of data.
 
 Sources:
-- [dCache](https://www.dcache.org/)
-- [KAFKA](https://kafka.apache.org/)
-- [S3](https://aws.amazon.com/es/s3/)
+- [dCache](/docs/Sources/dcache)
+- [KAFKA](/docs/Sources/Kafka)
+- [S3](/docs/Sources/AWS/S3)
+- [SQS](/docs/Sources/AWS/SQS)
 
 Destinations:
-- [OSCAR](https://oscar.grycap.net/)
+- [OSCAR](/docs/Destinations/OSCAR)
 
-Alterations:
-- Merge
-- Encoded
-- Decoded
+The input data format from Sources can be changed using Alterations.
 
-#### Components Subsection
+Alterations:
+- [Merge](/docs/Alterations/Merge)
+- [Encode](/docs/Alterations/Encode)
+- [Decode](/docs/Alterations/Decode)
 
-The subsection `components`, inside Sources and Destinations, is employed to change the configuration of a single Processor of Apache NiFi. It is necessary to know the name of the component. Then, we can change the seconds between executions,
-the scheduled time, seconds between executions (ratio execution), and in which kind of node in Nifi is going to execute.the node execution can be changed.
+#### Components Subsection
 
+The subsection `components`, inside Sources and Destinations, changes the configuration of a single Processor of Apache NiFi. It is necessary to know the name of the component. The subsection `components` can change the seconds between executions (ratio execution) and select which kind of node execution (PRIMARY or ALL).
 
 ```
 components:
@@ -90,7 +91,7 @@ components:
 
 #### Alterations
 
-The subsection `alterations`, inside Sources, change the data format. These alterations are applied as a descendent definition. In this example, the input data is merged into one message. Then, the merge message is encoded.
+The subsection [Alterations](/docs/Alterations), is located inside [Sources](/docs/Sources), and it changes the data format. These alterations are applied as a descendent definition. In this example, the input data is merged into one message. Then, the merge message is encoded in base64 format.
 
 ```
 - action: Merge
@@ -102,9 +103,7 @@ The subsection `alterations`, inside Sources, change the data format. These alte
 
 ### Connections
 
-
-
-In the Connections section, the connections between sources and destinations are established by employing the `from` and `to` keys.
+The Connections section defines the links between Sources and Destinations.
 
 ```
 connection:
   - from: dcache
     to: edgan3
 ```
@@ -137,6 +136,4 @@ nifi:
 connection:
   - from: dcache
     to: edgan3
-
-
 ```

From ff89d6dec0e292ca18e0b2f332f4cdbf83bd558d Mon Sep 17 00:00:00 2001
From: SergioLangaritaBenitez
Date: Thu, 17 Oct 2024 12:20:00 +0200
Subject: [PATCH 4/5] use of logo colours in the pages

---
 docpage/src/css/custom.css | 28 ++++++++++++++--------------
 1 file changed, 14 insertions(+), 14 deletions(-)

diff --git a/docpage/src/css/custom.css b/docpage/src/css/custom.css
index 2bc6a4cf..bf7672cb 100644
--- a/docpage/src/css/custom.css
+++ b/docpage/src/css/custom.css
@@ -6,25 +6,25 @@
 
 /* You can override the default Infima variables here. */
 :root {
-  --ifm-color-primary: #2e8555;
-  --ifm-color-primary-dark: #29784c;
-  --ifm-color-primary-darker: #277148;
-  --ifm-color-primary-darkest: #205d3b;
-  --ifm-color-primary-light: #33925d;
-  --ifm-color-primary-lighter: #359962;
-  --ifm-color-primary-lightest: #3cad6e;
+  --ifm-color-primary: #7a7abc;
+  --ifm-color-primary-dark: #5454a9;
+  --ifm-color-primary-darker: #5050a7;
+  --ifm-color-primary-darkest: #2c2c95;
+  --ifm-color-primary-light: #7676ba;
+  --ifm-color-primary-lighter: #8f8fc7;
+  --ifm-color-primary-lightest: #9f9fcf;
   --ifm-code-font-size: 95%;
   --docusaurus-highlighted-code-line-bg: rgba(0, 0, 0, 0.1);
 }
 
 /* For readability concerns, you should choose a lighter palette in dark mode. */
 [data-theme='dark'] {
-  --ifm-color-primary: #25c2a0;
-  --ifm-color-primary-dark: #21af90;
-  --ifm-color-primary-darker: #1fa588;
-  --ifm-color-primary-darkest: #1a8870;
-  --ifm-color-primary-light: #29d5b0;
-  --ifm-color-primary-lighter: #32d8b4;
-  --ifm-color-primary-lightest: #4fddbf;
+  --ifm-color-primary: #7a7abc;
+  --ifm-color-primary-dark: #5454a9;
+  --ifm-color-primary-darker: #5050a7;
+  --ifm-color-primary-darkest: #2c2c95;
+  --ifm-color-primary-light: #7676ba;
+  --ifm-color-primary-lighter: #8f8fc7;
+  --ifm-color-primary-lightest: #9f9fcf;
   --docusaurus-highlighted-code-line-bg: rgba(0, 0, 0, 0.3);
 }

From 32d7c8cc4fac90988dbf551f3511334848d262cf Mon Sep 17 00:00:00 2001
From: SergioLangaritaBenitez
Date: Thu, 17 Oct 2024 16:29:18 +0200
Subject: [PATCH 5/5] update doc

---
 docpage/docs/03.- Sources/Kafka.md      |  4 ++--
 docpage/docs/03.- Sources/dcache.md     |  2 +-
 docpage/docs/04.- Destinations/OSCAR.md |  2 +-
 docpage/docs/05.- Alterations/Decode.md |  2 +-
 docpage/docs/05.- Alterations/index.md  |  2 +-
 docpage/docs/AWS.md                     | 26 -------------------------
 docpage/docs/Users.md                   |  7 ++++---
 7 files changed, 10 insertions(+), 35 deletions(-)
 delete mode 100644 docpage/docs/AWS.md

diff --git a/docpage/docs/03.- Sources/Kafka.md b/docpage/docs/03.- Sources/Kafka.md
index b2eb8b2f..fef02e85 100644
--- a/docpage/docs/03.- Sources/Kafka.md
+++ b/docpage/docs/03.- Sources/Kafka.md
@@ -10,9 +10,9 @@ The Kafka Source allows us to consume a Kafka topic. It requires this informatio
 - The topic name that is going to be consumed. Required.
 - The group identifier indicates the consumer group. Required.
 - [IM](https://www.grycap.upv.es/im/index.php) serve a recipe that supports the SASL_SSL security protocol. So, the user `sasl_username` and password `sasl_password` must be set. These parameters are set at Kafka deployment time. Required.
-- If the topic consuming follows a `key:value` pattern, set the argument `separate_by_key` true and select the demarcator with `message_demarcator`.
+- If the consumed topic follows a `key:value` pattern, set the argument `separate_by_key` to true and select the demarcator with `message_demarcator`.
 
-Also, an SSL connection between NiFi and Kafka is necessary. A PKCS12 certificate and the certificate's password make this connection.
+An SSL connection between NiFi and Kafka is necessary. A PKCS12 certificate and the certificate's password must be provided.
diff --git a/docpage/docs/03.- Sources/dcache.md b/docpage/docs/03.- Sources/dcache.md
index c7bcdb60..6be1e07d 100644
--- a/docpage/docs/03.- Sources/dcache.md
+++ b/docpage/docs/03.- Sources/dcache.md
@@ -7,7 +7,7 @@ dCache is a Source that listens into a dCache instance. The following values mus
 - An identifier name of the process. It must be unique. Required.
 - Endpoint, user, and password of a dCache instance. Required.
 - Folder of dCache where keeps an active listening.Required.
-- Statefile is the file name that will store the state. `dcache` value is not recommended. It creates misbehavior. Required.
+- Statefile is the file that will store the state. Please do not use `dcache` as its name, as it may cause problems. Required.
diff --git a/docpage/docs/04.- Destinations/OSCAR.md b/docpage/docs/04.- Destinations/OSCAR.md
index b0fc5fec..41d9ede3 100644
--- a/docpage/docs/04.- Destinations/OSCAR.md
+++ b/docpage/docs/04.- Destinations/OSCAR.md
@@ -8,7 +8,7 @@ The OSCAR Destination invokes an OSCAR service asynchronously:
 - An identifier name of the process. It must be unique. Required.
 - Endpoint. Required.
 - Service in OSCAR. Required.
-- Token or user/password. The user/password has priority. Do not edit the OSCAR services. Required.
+- Token or user/password. If both are defined, the user/password has priority over the token. Please do not edit the OSCAR services. Required.
 
 Destination is composed of this component:
diff --git a/docpage/docs/05.- Alterations/Decode.md b/docpage/docs/05.- Alterations/Decode.md
index d7d0b0c3..0bef27c4 100644
--- a/docpage/docs/05.- Alterations/Decode.md
+++ b/docpage/docs/05.- Alterations/Decode.md
@@ -5,7 +5,7 @@ sidebar_position: 2
 # Decode
 
 Alteration's Decode decodes the data flow from the chosen encoding. The user must ensure the input data is encoded using the selected encoding.
-Three encodes are available: `base64`, `base32` and `hex`. It is similar to the command `base64 -d` or `base32 -d`. For example, If the input data is a string in base64 with the value `aGVsbG8K` or in base32 with the value `NBSWY3DPBI======`. The output data is the same in both cases, `hello`.
+Three encodings are available: `base64`, `base32`, and `hex`. They behave like the commands `base64 -d` and `base32 -d`, plus a hex decoder. For example, if the input data is the base64 string `aGVsbG8K` or the base32 string `NBSWY3DPBI======`, the output data is the same in both cases: `hello`.
 
 Here is the YAML example.
 
diff --git a/docpage/docs/05.- Alterations/index.md b/docpage/docs/05.- Alterations/index.md
index 9a85f446..c12d9dc8 100644
--- a/docpage/docs/05.- Alterations/index.md
+++ b/docpage/docs/05.- Alterations/index.md
@@ -4,7 +4,7 @@ sidebar_position: 5
 
 # Alterations
 
-The subsection `alterations`, is located inside a Source element and changes the input data format. These alterations are applied as a descendent definition. These steps are helpful to be okay with the input Sources format and to re-use the Sources with no changes.
+The `alterations` subsection is located inside a Source element and changes the input data format. These alterations are applied in the order they are written (a descendent definition). These steps are helpful for adapting the input Source format and re-using the Sources without changes.
diff --git a/docpage/docs/AWS.md b/docpage/docs/AWS.md
deleted file mode 100644
index 909f1a7b..00000000
--- a/docpage/docs/AWS.md
+++ /dev/null
@@ -1,26 +0,0 @@
----
-sidebar_position: 5
----
-# AWS
-
-
-**DCNiOS can use some AWS services**: This section clarifies the configuration required to use the credentials!
-
-
-DCNiOS can use some AWS as input. A valid pair of AWS Access Key and AWS Secret Key is necessary in all those cases. DCNiOS takes the AWS credentials from several places, following a hierarchy. This implementation is made to minimize the times that the credentials are written in the configuration file.
-
-- Environment variables `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY`.
-- From [aws file credentials](https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-files.html) with the `default` section. Here is an example:
-  ``` bash
-  [default]
-  aws_access_key_id = AK<>
-  aws_secret_access_key = <>
-  ```
-- From the file of DCNiOS workflow file named `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY`.
-
-
-AWS_DEFAULT_REGION is mandatory in any Source that uses AWS in the configuration file. These ProcessGroups can employ AWS credentials:
-- [SQS](/docs/Sources/SQS)
-- [S3](/docs/Sources/S3)
-
-
diff --git a/docpage/docs/Users.md b/docpage/docs/Users.md
index 1e2d1cb2..0a69a4f2 100644
--- a/docpage/docs/Users.md
+++ b/docpage/docs/Users.md
@@ -78,12 +78,12 @@ Alterations:
 
 #### Components Subsection
 
-The subsection `components`, inside Sources and Destinations, changes the configuration of a single Processor of Apache NiFi. It is necessary to know the name of the component. The subsection `components` can change the seconds between executions (ratio execution) and select which kind of node execution (PRIMARY or ALL).
+The `components` subsection changes the behavior of an inner process. When you deploy an element, some processes run in the background. You can change the seconds between executions (execution rate) and select which node will perform the execution (PRIMARY or ALL). However, it is necessary to know the name of the process. For example, the OSCAR destination has the component InvokeOSCAR, which sends an HTTP call.
 
 ```
 components:
-- name: GetFile
+- name: InvokeOSCAR
   seconds: 2
   node: (ALL | PRIMARY)
 ```
@@ -91,7 +91,8 @@ components:
 
 #### Alterations
 
-The subsection [Alterations](/docs/Alterations), is located inside [Sources](/docs/Sources), and it changes the data format. These alterations are applied as a descendent definition. In this example, the input data is merged into one message. Then, the merge message is encoded in base64 format.
+[Alterations](/docs/Alterations), located inside [Sources](/docs/Sources), are employed to modify the format of the data. The alterations are applied in the specified order. In the following example, the input data is merged into one message. Then, the merged message is encoded in base64 format.
 
 ```
 - action: Merge