@@ -57,6 +57,7 @@ ScanPipe's own commands are listed under the ``[scanpipe]`` section::
add-input
add-pipeline
archive-project
+ batch-create
check-compliance
create-project
create-user
@@ -83,7 +84,8 @@ For example::
$ scanpipe create-project --help
usage: scanpipe create-project [--input-file INPUTS_FILES]
[--input-url INPUT_URLS] [--copy-codebase SOURCE_DIRECTORY]
- [--pipeline PIPELINES] [--execute] [--async]
+ [--pipeline PIPELINES] [--label LABELS] [--notes NOTES]
+ [--execute] [--async]
name

Create a ScanPipe project.
@@ -124,6 +126,10 @@ Optional arguments:
- ``--copy-codebase SOURCE_DIRECTORY`` Copy the content of the provided source directory
  into the :guilabel:`codebase/` work directory.

+ - ``--notes NOTES`` Optional notes about the project.
+
+ - ``--label LABELS`` Optional labels for the project.
+
- ``--execute`` Execute the pipelines right after project creation.

- ``--async`` Add the pipeline run to the tasks queue for execution by a worker instead
@@ -133,6 +139,90 @@ Optional arguments:
.. warning::
    Pipelines are added and executed in order.

+ .. _cli_batch_create:
+
+ `$ scanpipe batch-create [--input-directory INPUT_DIRECTORY] [--input-list FILENAME.csv]`
+ -----------------------------------------------------------------------------------------
+
+ Processes files from the specified ``INPUT_DIRECTORY`` or rows from ``FILENAME.csv``,
+ creating a project for each file or row.
+
+ - Use ``--input-directory`` to specify a local directory. Each file in the directory
+   will result in a project, uniquely named using the filename and a timestamp.
+
+ - Use ``--input-list`` to specify a ``FILENAME.csv``. Each row in the CSV will be used
+   to create a project based on the data provided.
+
+ Supports specifying pipelines and asynchronous execution.
+
+ Required arguments (one of):
+
+ - ``--input-directory`` The path to the directory containing the input files to process.
+   Ensure the directory exists and contains the files you want to use.
+
+ - ``--input-list`` Path to a CSV file with project names and input URLs.
+   The first column must contain project names, and the second column should list
+   comma-separated input URLs (e.g., download URL, PURL, or Docker reference).
+
+ **CSV content example**:
+
+ +----------------+------------------------------------+
+ | project_name   | input_urls                         |
+ +================+====================================+
+ | project-1      | https://url.com/file.ext           |
+ +----------------+------------------------------------+
+ | project-2      | pkg:deb/debian/[email protected]   |
+ +----------------+------------------------------------+
+
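An input list like the table above can be produced with a heredoc. This is a sketch: ``projects.csv`` is an arbitrary filename, the second row's PURL is an illustrative example (not from the table above), and the commented ``scanpipe`` call assumes ScanPipe is installed:

```shell
# Build a two-project input list matching the CSV layout described above.
cat > projects.csv <<'EOF'
project_name,input_urls
project-1,https://url.com/file.ext
project-2,"https://url.com/file.ext,pkg:pypi/[email protected]"
EOF

# Quoting the second column allows several comma-separated input URLs per row.
cat projects.csv

# Then, in an environment where ScanPipe is installed:
# scanpipe batch-create --input-list projects.csv
```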
+ Optional arguments:
+
+ - ``--project-name-suffix`` Optional custom suffix to append to project names.
+   If not provided, a timestamp (in the format ``[YYMMDD_HHMMSS]``) will be used.
+
+ - ``--pipeline PIPELINES`` Pipeline names to add to the project.
+
+ - ``--notes NOTES`` Optional notes about the project.
+
+ - ``--label LABELS`` Optional labels for the project.
+
+ - ``--execute`` Execute the pipelines right after project creation.
+
+ - ``--async`` Add the pipeline run to the tasks queue for execution by a worker instead
+   of running in the current thread.
+   Applies only when ``--execute`` is provided.
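A timestamp in the same ``YYMMDD_HHMMSS`` shape as the default suffix can be reproduced with ``date``, which helps when predicting generated project names (a sketch for illustration only; ScanPipe computes the timestamp itself, and the filename is made up):

```shell
# A timestamp in the YYMMDD_HHMMSS shape used by the default suffix,
# e.g. 241005_142501 for 2024-10-05 14:25:01.
suffix=$(date +%y%m%d_%H%M%S)

# A file named my-image.tar would yield a project name along these lines.
echo "my-image.tar-${suffix}"
```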
+
+ Example: Processing Multiple Docker Images
+ ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+ Assume multiple Docker images are available in a directory named ``local-data/`` on
+ the host machine.
+ To process these images with the ``analyze_docker_image`` pipeline using asynchronous
+ execution::
+
+     $ docker compose run --rm \
+         --volume local-data/:/input-data:ro \
+         web scanpipe batch-create input-data/ \
+         --pipeline analyze_docker_image \
+         --label "Docker" \
+         --execute --async
+
+ **Explanation**:
+
+ - ``local-data/``: A directory on the host machine containing the Docker images to
+   process.
+ - ``/input-data/``: The directory inside the container where ``local-data/`` is
+   mounted (read-only).
+ - ``--pipeline analyze_docker_image``: Specifies the ``analyze_docker_image``
+   pipeline for processing each Docker image.
+ - ``--label "Docker"``: Tags all the projects with the "Docker" label to enable
+   easy search and filtering.
+ - ``--execute``: Runs the pipeline immediately after creating a project for each
+   image.
+ - ``--async``: Adds the pipeline run to the worker queue for asynchronous execution.
+
+ Each Docker image in the ``local-data/`` directory will result in the creation of a
+ project with the specified pipeline (``analyze_docker_image``) executed by worker
+ services.
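Conceptually, ``--input-directory`` behaves like looping ``create-project`` over every file in the directory. The sketch below only echoes the per-file commands instead of invoking ScanPipe (the directory and file names are made up for the demo):

```shell
# Simulate batch-create's per-file behavior with plain shell.
mkdir -p demo-input
touch demo-input/image-a.tar demo-input/image-b.tar

suffix=$(date +%y%m%d_%H%M%S)
for file in demo-input/*; do
    # batch-create derives a unique project name from the filename and a timestamp.
    name="$(basename "$file")-${suffix}"
    # A real run would call: scanpipe create-project "$name" --input-file "$file" ...
    echo "create-project ${name} --input-file ${file}"
done
```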

`$ scanpipe list-pipeline [--verbosity {0,1,2,3}]`
--------------------------------------------------