GitVersion: LibGit2Sharp.LockedFileException: The index is locked. This might be due to a concurrent or crashed process
Wasn’t sure if this was an issue or whether we just shouldn’t be using msbuild /m to enable concurrent builds.
[11:45:48] [BuildSolution] 17>MSBUILD : warning : WARN [09/02/16 1:45:48:65] Could not determine assembly version: LibGit2Sharp.LockedFileException: The index is locked. This might be due to a concurrent or crashed process [Y:\Work\refs\heads\master\source\Octopus.E2ETests\Octopus.E2ETests.csproj]
[11:45:48] [BuildSolution] MSBUILD : warning : at LibGit2Sharp.Core.Ensure.HandleError(Int32 result) [Y:\Work\refs\heads\master\source\Octopus.E2ETests\Octopus.E2ETests.csproj]
[11:45:48] [BuildSolution] MSBUILD : warning : at LibGit2Sharp.Core.Proxy.git_checkout_tree(RepositoryHandle repo, ObjectId treeId, GitCheckoutOpts& opts) [Y:\Work\refs\heads\master\source\Octopus.E2ETests\Octopus.E2ETests.csproj]
[11:45:48] [BuildSolution] MSBUILD : warning : at LibGit2Sharp.Repository.CheckoutTree(Tree tree, IList`1 paths, IConvertableToGitCheckoutOpts opts) [Y:\Work\refs\heads\master\source\Octopus.E2ETests\Octopus.E2ETests.csproj]
[11:45:48] [BuildSolution] MSBUILD : warning : at LibGit2Sharp.Repository.Checkout(Tree tree, CheckoutOptions checkoutOptions, String refLogHeadSpec) [Y:\Work\refs\heads\master\source\Octopus.E2ETests\Octopus.E2ETests.csproj]
[11:45:48] [BuildSolution] MSBUILD : warning : at LibGit2Sharp.Repository.Checkout(Branch branch, CheckoutOptions options) [Y:\Work\refs\heads\master\source\Octopus.E2ETests\Octopus.E2ETests.csproj]
[11:45:48] [BuildSolution] MSBUILD : warning : at LibGit2Sharp.Repository.Checkout(String committishOrBranchSpec, CheckoutOptions options) [Y:\Work\refs\heads\master\source\Octopus.E2ETests\Octopus.E2ETests.csproj]
[11:45:48] [BuildSolution] MSBUILD : warning : at GitTools.Git.GitRepositoryHelper.EnsureLocalBranchExistsForCurrentBranch(Repository repo, String currentBranch) [Y:\Work\refs\heads\master\source\Octopus.E2ETests\Octopus.E2ETests.csproj]
[11:45:48] [BuildSolution] MSBUILD : warning : at GitTools.Git.GitRepositoryHelper.NormalizeGitDirectory(String gitDirectory, AuthenticationInfo authentication, Boolean noFetch, String currentBranch) [Y:\Work\refs\heads\master\source\Octopus.E2ETests\Octopus.E2ETests.csproj]
[11:45:48] [BuildSolution] MSBUILD : warning : at GitVersion.GitPreparer.Initialise(Boolean normaliseGitDirectory, String currentBranch) [Y:\Work\refs\heads\master\source\Octopus.E2ETests\Octopus.E2ETests.csproj]
[11:45:48] [BuildSolution] MSBUILD : warning : at GitVersion.ExecuteCore.ExecuteGitVersion(String targetUrl, String dynamicRepositoryLocation, Authentication authentication, String targetBranch, Boolean noFetch, String workingDirectory, String commitId, Config overrideConfig) [Y:\Work\refs\heads\master\source\Octopus.E2ETests\Octopus.E2ETests.csproj]
[11:45:48] [BuildSolution] MSBUILD : warning : at GitVersion.ExecuteCore.TryGetVersion(String directory, VersionVariables& versionVariables, Boolean noFetch, Authentication authentication) [Y:\Work\refs\heads\master\source\Octopus.E2ETests\Octopus.E2ETests.csproj]
This occurs on TeamCity 10.0.0, building ~30 projects. We’ve only just upgraded to GitVersionTask.3.6.3 from GitVersionTask.3.6.1, which had the gitversion_cache issue.
In the meantime I’ll try some more builds and see if it recurs frequently.
About this issue
- State: closed
- Created 8 years ago
- Reactions: 4
- Comments: 83 (54 by maintainers)
Commits related to this issue
- workaround for GitVersion file lock https://github.com/GitTools/GitVersion/issues/1031 — committed to nmklotas/GitLabApiClient by jetersen 4 years ago
- workaround for GitVersion file lock https://github.com/GitTools/GitVersion/issues/1031 — committed to MindaugasLaganeckas/GitLabApiClient by jetersen 4 years ago
One thing I’m trying as a workaround right now (Azure DevOps Pipeline) is to do this:
/p:Version="$(GitVersion.AssemblySemVer)" /p:FileVersion="$(GitVersion.AssemblySemFileVer)" /p:InformationalVersion="$(GitVersion.InformationalVersion)" /p:PackageVersion="$(GitVersion.NugetVersion)"
Since we don’t use AssemblyInfo.cs at all (.NET Core SDK), this seems to work. Whether it will fix the issue is unknown, since it’s wildly unpredictable when it happens.
Edit: added the PowerShell script I used.
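For illustration, a rough sketch of that kind of script (the solution name is a placeholder, and it assumes the GitVersion step has already exposed its output as GITVERSION_* environment variables):

```powershell
# Build once with GitVersionTask disabled, feeding the already-calculated numbers
# in as MSBuild properties so no project has to run GitVersion (and touch the
# git index) itself. Solution name and variable names are assumptions.
dotnet build MySolution.sln -c Release `
    /p:DisableGitVersionTask=true `
    "/p:Version=$env:GITVERSION_ASSEMBLYSEMVER" `
    "/p:FileVersion=$env:GITVERSION_ASSEMBLYSEMFILEVER" `
    "/p:InformationalVersion=$env:GITVERSION_INFORMATIONALVERSION" `
    "/p:PackageVersion=$env:GITVERSION_NUGETVERSION"
```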
🎉 This issue has been resolved in version 5.6.5 🎉 The release is available on:
Your GitReleaseManager bot 📦🚀
@asbjornu, we have been having the same issue in our CI/CD builds on our private GitLab instance.
Sounds great! 👍
It’s a viable choice if other options are too difficult. It’s much better than crashing and will still yield better performance with parallelization than a serial build would.
I don’t know either.
Good. We can instead harmonize this a bit in v6 and gather everything GitVersion-related into a .git/gitversion/ folder.

Awesome!
I think we should be able to surface this bug with enough projects in a single solution: a very simple repository with a bare-minimum solution of perhaps 100 projects, all using GitVersionTask, should be able to reproduce it consistently, I’d assume.
I’ve set up and invited you to GitVersion.TestCases so we can collaborate on the reproduction there. I’m thinking this one repository can have one branch per test case, starting with this parallelization problem.
My point exactly! I have a general distaste for static as well, since unless it’s always paired with readonly and immutable objects, it leads to global mutable state, meaning bugs and bad encapsulation. It’s also impossible to mock in tests and generally doesn’t play well with IoC.

So do I! 😃
Thanks for your answer. When I find time, I will try to look around the code in GitVersionCore and look for a possibility to wrap each single action that involves GitVersion.yml, caching or LibGit2Sharp in a file lock. Another approach would be to lock on startup and unlock on exit, but that is quick and dirty and reminds me of the time when you instantiated one SqlConnection for the whole application lifetime. 😃

Yes, it would become a breaking change if someone external is relying on exactly these created caches, but I’m not sure anyone is really working with the temporary cache files that are generated by GitVersion rather than by something else. I won’t change any structure. I think I now have all the info I need to proceed when I find time; I hope for the weekend or next week.

One question that remains open is the test case: it is quite difficult to create such an environment to test this. My project with its 10+ sub-projects that rely on GitVersionTask could serve as that kind of test, but it only fails every second or third time with a LibGit2Sharp.LockedFileException. What is your idea for a test case? Maybe a kind of stress test of a huge project with many sub-projects that someone can provide us?
Regarding your distaste for Tool and Utility: I can understand and agree with you that such a Tool, whatever static purpose it serves, can get lost quickly. But a library is not a library without such functions, and those functions cannot easily be tied to only one purpose. So when you have two classes that each serve completely their own purpose, but both have one method that shares a bit of the same logic, like Directory.Exists or File.Exists, then you should definitely extract them so they become Directory.Exists and File.Exists. My taste for Tools (or Utilities) comes from not wanting to collide with existing classes from System.*, while still stating their relation to them. So the name LockFileTools comes more or less out of habit.

When I think more about it: FileLocker implies a kind of encapsulation of these static functions, so even if I called it FileLocker, the class itself would not provide a file locker; each of its static functions would. Moreover, they do not provide a file locker, they lock a file without an instantiated context, so the name would be somewhat misleading. This would change if FileLocker were instantiable, required a file path and provided methods like “Lock”; then the class itself would be a file locker and could therefore be named FileLocker. Haha, but I do respect your programming style. ☺️

To your last words: I am looking forward to this library becoming a strong one. 😊
The workaround worked for me (disable parallel builds). There have been longer-term discussions about fixing this. If you want to see this issue fixed for GitVersionTask in the near term, it needs someone to contribute a fix and get involved. Or don’t, and wait on someone else to fix it. The long-term discussions I was aware of involve:
- Adding test coverage to detect this issue during CI builds. That way we can make that test/check active on a branch and verify the fix there also.
- Talk of using our own lock around git operations. Not sure if this had legs.
- Talk of raising this as a libgit2sharp issue. Not sure it’s a libgit2sharp problem though.
- Another idea of allowing GitVersionTask to read the version from somewhere else, which would be primed at the start of a build, i.e. by running gitversion.exe first during a CI build. This means GitVersionTask potentially wouldn’t need to access git at all. This sort of idea, though, likely won’t be explored further until after we have the new v6 CLI (separate issue).

I think someone commented above with some msbuild magic. It wasn’t in PR form or easily understandable; perhaps it’s a fix, I don’t know. If you want to see it fixed and have ideas for a fix, it would speed things up to have something in PR form that is testable.
Good point. I guess we need to limit the code that uses LibGit2Sharp and rely on the cache, if possible.
This is working for us. The only annoying thing is that the package version does not contain the full SemVer for commits ahead in the same way the GitVersion package does, i.e. 1.9.0-alpha0123 vs. 1.9.0-alpha.122.
Much like the others, we’re seeing this on Azure DevOps, using the .NET Core build step. We’re building a solution with 19 projects, of which around 16 have the GitVersion NuGet package.
Edit: using the “Automatic package versioning” option on the .NET Core task with the Pack command, we can set the Environment Variable option to GITVERSION_FullSemVer. This is working well.
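As a sketch, roughly the same thing done from a script rather than the task UI (project name is a placeholder; it assumes GITVERSION_FullSemVer is already set in the environment):

```powershell
# Pack with the package version taken from the GitVersion output exposed as an
# environment variable, instead of letting GitVersionTask compute it per project.
dotnet pack MyProject.csproj -c Release "/p:PackageVersion=$env:GITVERSION_FULLSEMVER"
```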
🙏 👏 🙏 👏 🙏
As long as the method chosen is cross-platform, any method will do.
In version 6 we are moving towards executing gitversion calculate once at the start of the build, and then doing all the various transformations, outputs, code generation, etc., as individual steps that may occur once per project, once per AssemblyInfo file, etc., afterwards.

Yes, absolutely.
That’s the plan. Each GitVersionTask instance doesn’t need to calculate its own version number, as the result will be the same each and every time.
Yes.
Just in case anyone needs GitVersion before the errors are corrected:
- Remove the assembly: AssemblyVersion, AssemblyFileVersion, and AssemblyInformationalVersion attributes from the AssemblyInfo.cs files.
- Put a GitVersion.yml inside the solution directory.
- Add gitversion.exe to the Path environment variable.
- Add the GitVersionTask NuGet package to all project files. It will be used by Visual Studio, etc.
- In the Execute Windows Batch Command frame, modify if necessary and/or replace with your own build commands:
  gitversion /updateassemblyinfo
  dotnet publish SolutionName.sln -c release -r win-x86 /p:DisableGitVersionTask=true

It works for build servers if we disable the MSBuild task via the property and inject the version number directly into each AssemblyInfo.cs file. Everywhere else it will just use GitVersionTask if not specified otherwise. Simple and reliable.
I like the sound of this, @rose-a. Would you be up for providing a pull request for such a reorganization?
This sounds a lot like what is proposed in #983. It’s been open almost 4 years, so if someone finally has the time and energy to tackle that, it would be awesome. 😅
Tangentially related: #1227
We’re experiencing this too on a local GitLab instance when building solutions which contain multiple projects that reference GitVersionTask.

My workaround:

Put the following in each project file (or a Directory.build.props file in the solution dir):

Then run gitversion before executing the build and write the output into environment variables. This way gitversion is only run once and the result is then reused in each project being built.

Edit: PowerShell Script
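A minimal sketch of what such a script could look like (the JSON handling, environment-variable naming, and solution name here are assumptions, not the original script):

```powershell
# Run GitVersion exactly once, parse its JSON output, and expose each value as a
# GITVERSION_* environment variable for the rest of the build to reuse.
$versionInfo = gitversion /output json | ConvertFrom-Json
foreach ($property in $versionInfo.PSObject.Properties) {
    $value = "$($property.Value)"
    if ($value) {
        Set-Item -Path "Env:GITVERSION_$($property.Name.ToUpperInvariant())" -Value $value
    }
}

# Then build with the per-project task disabled and the version fed in explicitly.
dotnet build MySolution.sln -c Release `
    /p:DisableGitVersionTask=true `
    "/p:Version=$env:GITVERSION_ASSEMBLYSEMVER" `
    "/p:InformationalVersion=$env:GITVERSION_INFORMATIONALVERSION"
```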
Edit 2: You also need to set the version properties (which is normally performed within the GetVersion task here) and disable other tasks like WriteVersionInfoToBuildLog.

We also have this issue.
A .NET Core 2.2.203 solution with a dozen or so projects.
Often it’s “build->fail, build->fail, build->fail, build->succeed”. Doesn’t ever seem to happen locally when building through VS. We’ve tried adding “-p:maxCpuCount=1” to the build command to see if we could force it to build without parallelization (thinking that might have something to do with it), but no dice.
Edit: this doesn’t just happen on build; it sometimes happens on “dotnet test” as well.
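For what it’s worth, maxCpuCount is normally passed as an MSBuild switch rather than a -p: property, so a sketch of forcing a single build node would be (solution name is a placeholder):

```powershell
# -maxcpucount:1 limits MSBuild to a single worker node, effectively disabling
# parallel project builds for this invocation; dotnet forwards it to MSBuild.
dotnet build MySolution.sln -maxcpucount:1
```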
We also have the issue when building on Azure DevOps. Locally we haven’t experienced any issues.
I’m happy with that.