
[Feedback] docs/components/pipelines/user-guides/core-functions/build-advanced-pipeline.md #11647

Open
samoshyn opened this issue Feb 14, 2025 · 1 comment

Comments

@samoshyn

It appears that the example pipeline currently uses hard-coded values (True/False) for the scaler parameters when calling the normalize_dataset component:

from typing import List

from kfp import dsl

# create_dataset and normalize_dataset are the components defined earlier in the doc example.

@dsl.pipeline(name='iris-training-pipeline')
def my_pipeline(
    standard_scaler: bool,
    min_max_scaler: bool,
    neighbors: List[int],
):
    create_dataset_task = create_dataset()

    normalize_dataset_task = normalize_dataset(
        input_iris_dataset=create_dataset_task.outputs['iris_dataset'],
        standard_scaler=True,   # hard-coded instead of the standard_scaler parameter
        min_max_scaler=False)   # hard-coded instead of the min_max_scaler parameter

Since the pipeline already defines standard_scaler and min_max_scaler as input arguments, it would be more flexible to pass these arguments directly to the task.
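A minimal sketch of the suggested change, assuming the create_dataset and normalize_dataset components defined earlier on that doc page, would simply forward the pipeline parameters to the task instead of hard-coding them:

from typing import List

from kfp import dsl


@dsl.pipeline(name='iris-training-pipeline')
def my_pipeline(
    standard_scaler: bool,
    min_max_scaler: bool,
    neighbors: List[int],
):
    create_dataset_task = create_dataset()

    # Pass the pipeline parameters through rather than hard-coding True/False,
    # so callers can choose the scaler when submitting a run.
    normalize_dataset_task = normalize_dataset(
        input_iris_dataset=create_dataset_task.outputs['iris_dataset'],
        standard_scaler=standard_scaler,
        min_max_scaler=min_max_scaler)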

@varodrig
Contributor

/transfer pipelines

@google-oss-prow google-oss-prow bot transferred this issue from kubeflow/website Feb 18, 2025