pipelines: Unable to download directory artifact from the UX link
What steps did you take:
I am using the off-the-shelf component to download a directory from GCS. It appears to run to completion, but when I attempt to download the artifact from MinIO via the UI link, the download fails.
here is the pipeline:
from kfp import components

# Load the reusable GCS download_dir component.
import_op = components.load_component_from_url(
    'https://raw.githubusercontent.com/kubeflow/pipelines/2dac60c/components/google-cloud/storage/download_dir/component.yaml')

def trainer(train_path: 'URI' = 'gs://my-fake-bucket/tf_records/*'):
    import_train_data_task = import_op(train_path)
What happened:
An error occurs when trying to download the artifact.
What did you expect to happen:
The files download successfully.
Environment:
How did you deploy Kubeflow Pipelines (KFP)? Hosted KFP
KFP version:
KFP SDK version:
Anything else you would like to add:
/kind bug
About this issue
- State: closed
- Created 4 years ago
- Reactions: 3
- Comments: 17 (14 by maintainers)
Commits related to this issue
- feat(frontend): change artifact link to download without extracting. Fixes #3667 — committed to Bobgy/pipelines by Bobgy 4 years ago
- feat(frontend): UX change to support downloading directory artifacts. Fixes #3667 (#4696) * feat(frontend): change artifact link to download without extracting. Fixes #3667 * update artifact preview... — committed to Jeffwan/pipelines by Bobgy 4 years ago
Can we let the user download the actual .tar.gz archive? I think this solves an important problem and makes every artifact downloadable. I just ran into a case where I could not download a TensorFlow SavedModel, because it is a directory format.
The algorithm can be as follows: If the archive contains a single file, return that file. Otherwise return the whole archive.