runtime: Clean Enlistment Test Failures
After installing Windows 10 on a Parallels VM (on a Mac Pro), installing VS 2015, cloning, and then running `build` and `build-tests` from a VS 2015 Developer Command Prompt, I consistently get the following errors:
Verified fixed by dotnet/corefx#24052
~C:\git\corefx\Tools\tests.targets(483,5): error : One or more tests failed while running tests from ‘System.Globalization.Tests’ please check C:\git\corefx\bin/tests/System.Globalization.Tests/netcoreapp-Windows_NT-Debug-x64/testResults.xml for details! [C:\git\corefx\src\System.Globalization\tests\System.Globalization.Tests.csproj]~
Verified fixed by dotnet/corefx#24016
~C:\git\corefx\Tools\tests.targets(483,5): error : One or more tests failed while running tests from ‘System.Console.Tests’ please check C:\git\corefx\bin/tests/System.Console.Tests/netcoreapp-Windows_NT-Debug-x64/testResults.xml for details! [C:\git\corefx\src\System.Console\tests\System.Console.Tests.csproj]~
Verified fixed by dotnet/corefx#24147
~C:\git\corefx\Tools\tests.targets(483,5): error : One or more tests failed while running tests from ‘System.Net.NameResolution.Functional.Tests’ please check C:\git\corefx\bin/tests/System.Net.NameResolution.Functional.Tests/netcoreapp-Windows_NT-Debug-x64/testResults.xml for details! [C:\git\corefx\src\System.Net.NameResolution\tests\FunctionalTests\System.Net.NameResolution.Functional.Tests.csproj]~
Verified fixed by dotnet/corefx#24009 and dotnet/corefx#24097
~C:\git\corefx\Tools\tests.targets(483,5): error : One or more tests failed while running tests from ‘System.Drawing.Common.Tests’ please check C:\git\corefx\bin/tests/System.Drawing.Common.Tests/netcoreapp-Windows_NT-Debug-x64/testResults.xml for details! [C:\git\corefx\src\System.Drawing.Common\tests\System.Drawing.Common.Tests.csproj]~
About this issue
- Original URL
- State: closed
- Created 7 years ago
- Comments: 47 (47 by maintainers)
Commits related to this issue
- Remove invalid NameResolution tests (#24147) The Dns_GetHostEntryAsync_* tests are fundamentally invalid because it's possible for the broadcast address, 255.255.255.255, to have a DNS mapping via manu... — committed to dotnet/corefx by davidsh 7 years ago
- Sync (#5) * Microsoft.ServiceModel.Syndication skeleton project * Adding the existing classes of SyndicationFeed from .net fx * Added the needed references to get the code to compile * Chang... — committed to beniamin-airapetian/corefx by beniamin-airapetian 7 years ago
Based on this discussion, it sounds like these particular tests are invalid. The test was originally created because it was assumed that doing a DNS lookup against IPAddress.None (which is the broadcast address, 255.255.255.255) would always fail. Then it was tweaked to skip the test on macOS.
But in reality, since this IP address could have a DNS mapping on any platform via hard-coding it in the hosts file, this test should be removed. I will submit a PR to take care of it.
Interesting. We’ll need to study this test and replace “255.255.255.255” with something else that is appropriate for this particular test.
Also, adding the `255.255.255.255` entry is recommended as a workaround for all sorts of problems. For example, https://technet.microsoft.com/en-us/library/security/ms16-077.aspx suggests adding `255.255.255.255 wpad` as a workaround (a sketch of such an entry follows below). So this problem is not specific to Parallels: any machine with this sort of workaround applied will have the same problem.
Naively, I would expect Windows under Parallels on a Mac to behave the same way as under any other virtualization technology (with a Windows/Linux host). Ideally it should behave the same way as on physical hardware (I assume that is how a Windows VM on a Windows/Linux host behaves). I wonder why that is not the case here; maybe a Parallels/driver bug?
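For concreteness, here is a sketch of the kind of hosts-file entry being discussed; the `255.255.255.255 wpad` line is the one from the MS16-077 guidance mentioned above, and the path is the standard Windows hosts-file location:

```
# %SystemRoot%\System32\drivers\etc\hosts
# MS16-077-style workaround: map the broadcast address to the WPAD name.
# Any machine carrying an entry like this resolves 255.255.255.255
# successfully, which is exactly what invalidates the test's assumption.
255.255.255.255    wpad
```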
Given that we do not have unlimited Mac hardware, adding new CI legs to run Windows under Parallels on a Mac is very costly for us. It also adds complexity to the infrastructure, and we have much higher-priority gaps/bugs in infrastructure, so this would be very low priority from that perspective. The impact on CoreFX contributors seems fairly contained (“just” a 4-test difference, brought down to 2 now) compared to other problems/differences in our test bed affecting contributors.
Overall, being reactive to discovered differences seems the right thing to do from a cost/benefit point of view at this moment.
If there is a way to detect that we are running under Parallels virtualization on a Mac, we should be able to disable/modify the tests in such an environment. Is it possible to detect? (One possible approach is sketched below.) Otherwise we could just consider this kind of error in these particular tests “by design”.
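As a hedged sketch of what such detection could look like (not something the corefx test infrastructure actually does): on Windows, WMI reports the guest machine's manufacturer and model, and Parallels guests typically report a Parallels-branded string there. The exact strings are an assumption; verify them on a real Parallels VM before relying on this.

```csharp
using System;
using System.Management; // requires a reference to System.Management.dll

static class VirtualizationDetection
{
    // Returns true if WMI reports a Parallels-branded machine.
    // The "Parallels" substring match is an assumption about what the
    // hypervisor reports; confirm the real Manufacturer/Model values first.
    public static bool IsParallelsGuest()
    {
        using (var searcher = new ManagementObjectSearcher(
            "SELECT Manufacturer, Model FROM Win32_ComputerSystem"))
        {
            foreach (ManagementBaseObject system in searcher.Get())
            {
                string manufacturer = system["Manufacturer"] as string ?? string.Empty;
                string model = system["Model"] as string ?? string.Empty;
                if (manufacturer.IndexOf("Parallels", StringComparison.OrdinalIgnoreCase) >= 0 ||
                    model.IndexOf("Parallels", StringComparison.OrdinalIgnoreCase) >= 0)
                {
                    return true;
                }
            }
        }
        return false;
    }
}
```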
Ok, so the variable here is that the “hardware” is not a real machine but rather the Parallels hypervisor architecture on a Mac.
cc: @karelz You’ll need a Mac person to troubleshoot this, since it is a Mac-specific problem causing different behaviors to leak through into the Windows guest.
Yes, the same code, when run against the .NET Framework, should throw a SocketException, but it does not on a Windows VM on top of Parallels on a Mac. Below is the repro code used, same as the failing test.
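(The snippet itself did not survive in this export. Based on the surrounding discussion, a minimal sketch of the repro would look roughly like the following; the exact shape of the failing test may differ.)

```csharp
using System;
using System.Net;
using System.Net.Sockets;

class Repro
{
    static void Main()
    {
        // IPAddress.None is the broadcast address, 255.255.255.255.
        // Without a hosts-file mapping, this lookup is expected to fail with
        // a SocketException; on the Parallels guest it unexpectedly succeeds.
        try
        {
            IPHostEntry entry =
                Dns.GetHostEntryAsync(IPAddress.None.ToString()).GetAwaiter().GetResult();
            Console.WriteLine("Unexpectedly resolved to: " + entry.HostName);
        }
        catch (SocketException ex)
        {
            Console.WriteLine("SocketException as expected: " + ex.SocketErrorCode);
        }
    }
}
```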
@kingces95 Could you let me know what the error message from the System.Drawing.Common tests is, if you are still hitting issues there? I added some logging that will help me diagnose the issue.
Basically, yeah. Hopefully once we sort out your initial problems and get to a clean state, it will be more manageable. Usually the tests are expected to be 100% clean, unless you have changed something (and then the failure should be obvious). Also, it is possible to run individual test projects (see the example below), which cuts down on some of the verbosity and makes things more manageable, especially when you are only working on one library at a time.
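For example, running a single test project looked roughly like this in the corefx workflow of that era (hedged from memory; check the developer guide under Documentation/ in your clone for the authoritative commands):

```
:: From a VS 2015 Developer Command Prompt, after an initial full build:
cd C:\git\corefx\src\System.Console\tests
msbuild /t:BuildAndTest
```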