sdk: Regression: `dotnet test` hangs for 15 minutes after the test run completes on macOS and Linux
Our builds on VSTS include `dotnet test`, which restores, builds, and tests our product. With dotnet CLI 2.1.100 everything went well, but after upgrading to 2.1.300 the dotnet tool does not exit until 15 minutes after the test run has completed, with no output to explain the delay.
Expected behavior
The dotnet tool exits immediately, as can be seen on this 2.1.100 run:

Actual behavior
The dotnet tool exits after waiting 15 minutes, as can be seen on this 2.1.300 run:

Steps to reproduce
Please contact me via my microsoft.com email address for links to the builds if you’d like to investigate further.
About this issue
- Original URL
- State: open
- Created 6 years ago
- Reactions: 11
- Comments: 48 (15 by maintainers)
Commits related to this issue
- Disable msbuild node reuse when running benchmarks This is to work around a Linux issue with .NET Core 2.1 https://github.com/dotnet/cli/issues/9397 (So far we haven't seen it hurt the Travis build,... — committed to jskeet/nodatime by jskeet 6 years ago
- Disable msbuild node reuse when running benchmarks This is to work around a Linux issue with .NET Core 2.1 https://github.com/dotnet/cli/issues/9397 (So far we haven't seen it hurt the Travis build,... — committed to nodatime/nodatime by jskeet 6 years ago
- try to disable node reuse (https://github.com/dotnet/sdk/issues/9452#issuecomment-417681156) — committed to aws/jsii by RomainMuller 3 years ago
- Disable nodereuse on linux tests trying https://github.com/dotnet/sdk/issues/9452#issuecomment-417681156 — committed to russcam/apm-agent-dotnet by russcam 3 years ago
- Run test assemblies in parallel (#1106) This commit separates out the common test components from Elastic.Apm.Tests into a new assembly, Elastic.Apm.Tests.Utilities. This allows assemblies containi... — committed to elastic/apm-agent-dotnet by russcam 3 years ago
I can confirm that this still happens on .NET Core 3.1
`-nodereuse:false` doesn't work either.
Hello guys, I tried all the options mentioned here, but nothing worked. I solved my problem (which happens only in the Azure DevOps integration) by turning off parallel test execution:
In my test project > Add folder `Properties` > Add `AssemblyInfo.cs` (see the sketch just below for what that file can contain).
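A minimal sketch of such an `AssemblyInfo.cs`, assuming the test project uses xUnit (as other comments in this thread do); the attribute is standard xUnit, not something specific to this issue:

```csharp
// AssemblyInfo.cs -- assumes an xUnit test project.
// Turns off parallel execution of test collections for the whole assembly,
// which is the "turn off parallel test execution" workaround described above.
using Xunit;

[assembly: CollectionBehavior(DisableTestParallelization = true)]
```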
This is the exact spot where my test run also hangs.
One interesting point I noticed is that the same code, without changes, runs in Visual Studio for Mac (the very first test method runs OK, and the others take 2+ minutes each; these are pure unit tests with mocking, so not even hitting the db). Not sure if that can help solve the issue. dotnet version: 3.1.201
I am using xUnit and was able to fix this issue by adding the following to the csproj: `<PackageReference Include="Microsoft.NET.Test.Sdk" Version="17.0.0" />`
FWIW, I'm having a similar situation on GitHub Actions with the setup .NET action, using xUnit + .NET 6.
I am still seeing this behavior on Azure DevOps with .NET Core 3.1.301 on a Windows Hosted Agent with VS2019
Running `dotnet test` within a PowerShell script causes it to hang at the end of the execution. Willing to share build and repro details.
E.g. it gets stuck here for 40+ minutes:
CC: @rainersigwald @peterhuene @livarcocc
I just want to chime in and say that I’m experiencing this issue as well. I haven’t been able to reproduce it reliably yet, but I’ll keep trying and once I do, I’ll post a link to a repo here.
The scenario I have is a unit test that is testing some MSBuild targets files. This involves calling `dotnet build` during the unit test; that is basically all it does. When running this via `dotnet test`, it sometimes hangs for 15 minutes after the tests complete (the test duration reported is only a few seconds), but sometimes it doesn't hang at all.
@Evengard Try the `--blame` and `--blame-hang` flags to help pinpoint the hanging tests. It creates a memory dump on hang timeout, which helped me find out that it was one of my hosted services that was still alive. https://learn.microsoft.com/en-us/dotnet/core/tools/dotnet-test
The filter flag is optional, but useful to limit the tests you want to run.
Seems like passing `-v:quiet` to `dotnet test` (not needed for `dotnet build`, only for `dotnet test`, at least in my experience) alleviates the issue somehow… No idea how, but well, it works. I guess the solution above was the same: just setting the verbosity to quiet.
Also, from here it seems like running the binary directly inside a container (e.g. as a Dockerfile entrypoint, or in my case directly with `kubectl exec` inside a running Kubernetes pod), not wrapping it in a bash call, seems to trigger the issue, whereas wrapping it in a bash call seems to alleviate it. Maybe bash somehow helps with these stdin/stdout stream deadlocks?
@ana-cozma, can you please explain what exactly you changed? The steps you included seem pretty standard to me, unless you have something special in the runsettings file.
Hi, I think a lot of these issues are due to a bug in the `Process` type; I opened an issue about it here: https://github.com/dotnet/runtime/issues/51277
In summary: it looks like most of these issues happen when the dotnet SDK is invoked from another dotnet process, i.e. via the `Process` class.
@reduckted You should be able to resolve that by calling `dotnet build -nodereuse:false` inside your test. What's happening for you is that the outer `dotnet test` sets the environment variable that MSBuild uses to decide whether to attach the console or not. That is propagated down through child processes to the test process and then to the test's child `dotnet build` process, which then starts with node reuse but without the safe disconnect-console node startup. Explicitly specifying "no node reuse" should cause the worker nodes to exit immediately after the build, allowing the wait-on-child-process code to complete quickly.
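For illustration, a minimal sketch of what that could look like inside such a test, assuming the test shells out to the CLI via `System.Diagnostics.Process` as described above; the helper name and project path are hypothetical, not from this thread:

```csharp
// Hypothetical helper (not from this thread): runs `dotnet build` from inside a test
// with MSBuild node reuse disabled, so the worker nodes exit as soon as the build
// finishes and the test's wait on the child process can complete quickly.
using System.Diagnostics;

static class BuildRunner
{
    public static int RunDotnetBuild(string projectPath)
    {
        var startInfo = new ProcessStartInfo
        {
            FileName = "dotnet",
            // "-nodereuse:false" is the workaround suggested above.
            Arguments = $"build \"{projectPath}\" -nodereuse:false",
            RedirectStandardOutput = true,
            UseShellExecute = false,
        };

        using var process = Process.Start(startInfo)!;
        process.StandardOutput.ReadToEnd(); // drain build output before waiting
        process.WaitForExit();
        return process.ExitCode;
    }
}
```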