azure-pipelines-tasks: AzureFileCopyV2 failed when uploading to Azure Storage Static Website

Hi,

With the new release of Azure Static Websites, I can’t seem to find a way to upload to its default container, $web. The same settings work if I switch to a different container.

Upload to container: '$web' in storage account: '<account name>' with blob prefix: '' failed with error: 'AzCopy.exe exited with non-zero exit code while uploading files to blob storage.' For more info please refer to https://aka.ms/azurefilecopyreadme

Is there something extra I need to do? Or is it not supported yet? And if so, is it in the pipeline to be released soon?

Thanks!

About this issue

  • State: closed
  • Created 6 years ago
  • Comments: 56 (21 by maintainers)

Most upvoted comments

Had the same issue… Reverting back to version 2 works. Versions 3 and 4 do not work for me.

@onpaws AzureFileCopyV2 (using AzCopy v7.1.0) is already publicly available. When you add the Azure File Copy task in your build/release definition, you should see an option for selecting the version of the task. You can select version 2 to test the preview version.


This issue is occurring in versions 3.* and 4.* (preview).


We’ve updated the AzCopy version to v7.3.0. The task should now be able to handle upload to $web container. The fix will roll out in the next deployment cycle. So it will take a couple of weeks to reach all accounts. The fix will be available in Azure File Copy task v2.0.6 and above.

I got the same error today

Just downgrading the task version to 2.* solves this issue.
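For reference, a sketch of what that pin looks like in YAML, assuming the same input names as the v3 example further down this thread (the connection and account names are placeholders):

- task: AzureFileCopy@2   # '@2' pins the major version; minor/patch updates still roll out automatically
  inputs:
    SourcePath: '$(Build.ArtifactStagingDirectory)'
    azureSubscription: 'My ARM service connection'
    Destination: 'AzureBlob'
    storage: 'mystorageaccount'
    ContainerName: '$web'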

Deployment is a bit delayed. It is planned to complete by next week. I’ll post more updates when I have any.

@onpaws Support for $web for static websites will be available a few weeks after the AzCopy tool is updated. It will be a minor version update only, so it will also be version 2.x. I’ll let you know the exact version of the task you should see when we have made the changes.

Hi @alaconda, thanks for reporting this. The AzureFileCopyV2 task uses AzCopy v7.1.0 to copy files. This is the latest version of AzCopy available and it does not support Azure Static Websites. Refer to this issue.

A new AzCopy version will be released soon. We’ll update our task to use that latest version of AzCopy which should support static websites. Depending on when the new AzCopy version will be made available, it will take at least a few weeks for the updated AzureFileCopyV2 task to reach all accounts.

I have the same issue. I tried all versions but the first and I get the same error in all of them; this definitely shouldn’t be closed.

I also got this error, and I got it working by assigning the Storage Blob Data Contributor role to the service principal of the service connection, as @ChristosMonogios mentioned.

Static website support should be available in all accounts now. Please check if you are seeing Azure File Copy v2.0.7 in your builds/releases and whether you are able to deploy to $web container successfully.

works now with v2.0.7

Ran into this problem today. Assigning the “Storage Blob Data Contributor” role did not help. Dropping to version 3 seemed to do the trick.

- task: AzureFileCopy@3
  inputs:
    SourcePath: 'HelpDocumentation'
    azureSubscription: 'Subscription Name'
    Destination: 'AzureBlob'
    storage: 'storageaccountname'
    ContainerName: '$web'

@securityvoid the service principal associated with the ARM connection you’re using has to have access to the storage resource in Azure RBAC. The easiest way to get to the service principal is to navigate to your Azure DevOps project settings, select “Service Connections”, select the service connection you’re using and then click “Manage Service Principal”. That’ll show you its name and ID in Azure AD. Then you’ll need to go to the storage resource you want to grant it access to, and give the service principal the “Storage Blob Data Contributor” role on that resource.

You could possibly give it that role on the entire subscription, but I’d rather keep that kind of thing to a minimum…
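For anyone who prefers the CLI over portal clicks, a sketch of the equivalent one-time role assignment with the Azure CLI (run it as someone allowed to create role assignments; every ID and name below is a placeholder):

az role assignment create \
  --assignee "<service-principal-app-id>" \
  --role "Storage Blob Data Contributor" \
  --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"

Scoping to the single storage account keeps the grant as narrow as the comment above recommends.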

Assigning the “Storage Blob Data Contributor” role to the ARM service connection’s service principal as @adrien-constant so helpfully suggested worked, using version 4.171.3.

My YAML:

- task: AzureFileCopy@4
  displayName: 'Copy files to blob storage: $(storageName)'
  inputs:
    SourcePath: '$(build.sourcesDirectory)\archive\'
    Destination: AzureBlob
    storage: $(storageName)
    ContainerName: $web
    azureSubscription: 'ARM service connection goes here' # needs a role assignment before it'll work

Okay, I figured it out - yet I wish the documentation was a little bit more helpful.

So I was missing the publish artefacts step in my azure-pipelines YAML:

pool:
  vmImage: 'Ubuntu 16.04'

steps:
- task: NodeTool@0
  inputs:
    versionSpec: '8.x'
  displayName: 'Install Node.js'

- script: |
    cd srv
    npm install
    npm run build
  displayName: 'npm install and build'

# This was missing
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(System.DefaultWorkingDirectory)/srv/public'

After I added this step, my UI now properly displays the published artefact and I can download and/or explore it: http://prntscr.com/l4pemh

One more important thing I found out the hard way: you can only deploy to AzureBlob from a Windows hosted agent (I had mine set up with Ubuntu). The error thrown is something like

you have script build for Windows only such as powershell

I haven’t, but obviously the uploading job has 😃
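A sketch of the corresponding fix, assuming a hosted agent (the image name is just an example):

pool:
  vmImage: 'windows-2019'   # Azure File Copy runs PowerShell, so the job needs a Windows agent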

Hope this is useful to someone in the future.

Regards,

@jonathantower You can ignore that backtick. We add it because the command is executed through PowerShell, which resolves all strings starting with ‘$’. Without the escape, PowerShell would treat $web as a variable and expand it to a null/empty string. Hence, we escape the ‘$’. The actual command that is executed does not contain this backtick.
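A quick PowerShell illustration of why the escape matters (Write-Output is only standing in for the real command here):

# Unescaped, PowerShell expands $web as a variable, which is normally empty:
Write-Output "container: $web"     # prints "container: "
# A backtick keeps the '$' literal, which is why the logged command shows `$web:
Write-Output "container: `$web"    # prints "container: $web"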

You can add a variable named ‘system.debug’ with value ‘true’ in your release to get the debug logs. In those logs, you will find the AzCopy logs included and it will show the exact AzCopy command executed. If your upload to $web container is not working as expected, please send us these debug logs at RM_Customer_Queries [at] microsoft [dot] com and we will try to debug the issue.
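For YAML pipelines, the same switch is just a pipeline variable, e.g.:

variables:
  system.debug: 'true'   # verbose logs, including the exact AzCopy command line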

@nathanvv Patch versions of tasks are updated on almost every check-in of code. So the patch versions can easily jump multiple versions between 2 releases. The latest release hasn’t been rolled out to all accounts so you might see v2.0.4 in a few days. For v2.0.6 and above, it will take at least 2-3 weeks more for all accounts to get updated.

Just curious, looks like we are still at v2.0.1. Is there a release log where I could check the release date of v2.0.1? I’m mostly curious to get a rough idea of how long it would take for us to get this patch.


@Jaxwood One way is to run the task. If you look at the release logs of your task, you’ll find the version of the task near the top of the logs.

Another way is to get the data from the tasks API. Just enter the following URL in your browser (assuming you are logged in to your account): https://{your_account_name}.visualstudio.com/_apis/Distributedtask/tasks. This will give you a JSON of the task versions in your account. Search for “AzureFileCopy” (with quotes) in that JSON and you should find 2 search results (for V1 and V2 of the task). The version will be right after each search match.
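The same lookup from a shell, as a sketch (the PAT is any personal access token that can read build resources, and the jq filter assumes the usual count/value response shape):

curl -s -u ":<personal-access-token>" \
  "https://<your_account_name>.visualstudio.com/_apis/distributedtask/tasks" \
  | jq '.value[] | select(.name == "AzureFileCopy") | {name, version}'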

Hi @rajatagrawal-dev, I’m curious. For VSTS users who are interested in testing out the new release of AzureFileCopyV2, is there a way to opt in before public release? I’d be happy to help test. I’ve already opted into the other ‘preview’ VSTS features but am curious about VSTS tasks specifically. Thanks