runtime: Clean Enlistment Test Failures

After installing Windows 10 on a Parallels VM (on a Mac Pro), installing VS 2015, cloning the repo, and then running build and build-tests from a VS 2015 Developer Command Prompt, I consistently get the following errors:

Verified fixed by dotnet/corefx#24052

~C:\git\corefx\Tools\tests.targets(483,5): error : One or more tests failed while running tests from ‘System.Globalization.Tests’ please check C:\git\corefx\bin/tests/System.Globalization.Tests/netcoreapp-Windows_NT-Debug-x64/testResults.xml for details! [C:\git\corefx\src\System.Globalization\tests\System.Globalization.Tests.csproj]~

Verified fixed by dotnet/corefx#24016

~C:\git\corefx\Tools\tests.targets(483,5): error : One or more tests failed while running tests from ‘System.Console.Tests’ please check C:\git\corefx\bin/tests/System.Console.Tests/netcoreapp-Windows_NT-Debug-x64/testResults.xml for details! [C:\git\corefx\src\System.Console\tests\System.Console.Tests.csproj]~

Verified fixed by dotnet/corefx#24147

~C:\git\corefx\Tools\tests.targets(483,5): error : One or more tests failed while running tests from ‘System.Net.NameResolution.Functional.Tests’ please check C:\git\corefx\bin/tests/System.Net.NameResolution.Functional.Tests/netcoreapp-Windows_NT-Debug-x64/testResults.xml for details! [C:\git\corefx\src\System.Net.NameResolution\tests\FunctionalTests\System.Net.NameResolution.Functional.Tests.csproj]~

Verified fixed by dotnet/corefx#24009 and dotnet/corefx#24097

~C:\git\corefx\Tools\tests.targets(483,5): error : One or more tests failed while running tests from ‘System.Drawing.Common.Tests’ please check C:\git\corefx\bin/tests/System.Drawing.Common.Tests/netcoreapp-Windows_NT-Debug-x64/testResults.xml for details! [C:\git\corefx\src\System.Drawing.Common\tests\System.Drawing.Common.Tests.csproj]~

The console log.

About this issue

  • State: closed
  • Created 7 years ago
  • Comments: 47 (47 by maintainers)

Most upvoted comments

We should probably split off the Networking test discussion and harden the test more against not-so-common environment settings.

Based on this discussion, it sounds like these particular tests are invalid. The test was originally created on the assumption that a DNS lookup against IPAddress.None (which is the broadcast address, 255.255.255.255) would always fail. It was later tweaked to special-case macOS.

public static IEnumerable<object[]> GetNoneAddresses()
{
    yield return new object[] { IPAddress.None };
}

[PlatformSpecific(~TestPlatforms.OSX)] // macOS will resolve IPAddress.None to broadcasthost and produce a valid listing
[Theory]
[MemberData(nameof(GetNoneAddresses))]
public async Task Dns_GetHostEntryAsync_NoneIPAddress_Fail(IPAddress address)
{
    string addressString = address.ToString();

    await Assert.ThrowsAnyAsync<SocketException>(() => Dns.GetHostEntryAsync(address));
    await Assert.ThrowsAnyAsync<SocketException>(() => Dns.GetHostEntryAsync(addressString));
}

[PlatformSpecific(TestPlatforms.OSX)] // macOS will resolve IPAddress.None to broadcasthost and produce a valid listing
[Theory]
[MemberData(nameof(GetNoneAddresses))]
public async Task Dns_GetHostEntryAsync_NoneIPAddress_Success(IPAddress address)
{
    IPHostEntry result = await Dns.GetHostEntryAsync(address);
    Assert.NotNull(result);
    Assert.NotNull(result.AddressList);
    Assert.Equal(1, result.AddressList.Length);
    Assert.Equal(address, result.AddressList[0]);
}

But in reality, since this IP address could have a DNS mapping on any platform via a hard-coded entry in the hosts file, this test should be removed. I will submit a PR to take care of it.

Also, adding the 255.255.255.255 entry is recommended as a workaround for all sorts of problems. For example, https://technet.microsoft.com/en-us/library/security/ms16-077.aspx suggests adding 255.255.255.255 wpad as a workaround. So this problem is not specific to Parallels. Any machine with this sort of workaround applied will have the same problem.
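
For reference, the workaround in question is just a one-line hosts-file entry (on Windows the file is %SystemRoot%\System32\drivers\etc\hosts), so any machine carrying something like the following will resolve 255.255.255.255:

# hypothetical hosts-file entry, following the MS16-077 WPAD mitigation guidance
255.255.255.255    wpad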

Interesting. We’ll need to study this test and replace “255.255.255.255” with something else that is appropriate for this particular test.
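
Purely as an illustration of what a replacement might look like, one option would be to assert against a name in the reserved .invalid TLD (RFC 6761), which is never supposed to resolve through DNS, instead of an address that hosts files commonly map. This is a rough, hypothetical sketch (assuming the same usings as the tests above: System.Net, System.Net.Sockets, System.Threading.Tasks, Xunit), not necessarily what the repo actually does:

[Theory]
[InlineData("nonexistent.invalid")] // hypothetical name; ".invalid" is reserved and should never resolve via DNS
public async Task Dns_GetHostEntryAsync_InvalidName_Fail(string hostName)
{
    // A local hosts-file entry could still map even this name in principle,
    // but unlike 255.255.255.255 no common workaround recommends doing so.
    await Assert.ThrowsAnyAsync<SocketException>(() => Dns.GetHostEntryAsync(hostName));
}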

@kingces95 It’d be great if we could add a Mac/Parallels run to the CI.

Naively, I would expect Windows under Parallels on a Mac to behave in a similar way to any other virtualization technology (with a Windows/Linux host). Ideally it should behave the same way as on physical hardware (I assume that’s how a Windows VM on a Windows/Linux host behaves). I wonder why that is not the case here - maybe a Parallels/driver bug?

Given that we do not have unlimited Mac hardware, adding new CI legs to run Windows on Mac/Parallels is very costly for us. It also adds complexity to the infrastructure - and we have much higher priority gaps/bugs in infrastructure, so this would be very low priority from that perspective. The impact on CoreFX contributors seems fairly contained (“just” a 4-test difference, brought down to 2 now) compared to other problems/differences in our test bed affecting contributors.

Overall, being reactive to discovered differences seems like the right thing to do from a cost/benefit point of view at this moment.

If there is a way to detect that we are running under Mac/Parallels virtualization, we should be able to disable/modify the tests in such an environment. Is it possible to detect? Otherwise we could just consider this kind of error in particular tests as “by design”.
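
If it helps, one possible detection route on Windows is the SMBIOS-backed manufacturer/model strings that WMI exposes; under Parallels these typically mention the hypervisor. A rough sketch, with the caveat that the exact strings Parallels reports are an assumption and would need verifying on an actual Parallels VM:

using System;
using System.Management; // requires a reference to System.Management (Windows only)

public static class VirtualizationProbe
{
    // Hypothetical helper: returns true if the computer-system strings
    // reported through WMI mention Parallels.
    public static bool LooksLikeParallels()
    {
        using (var searcher = new ManagementObjectSearcher(
            "SELECT Manufacturer, Model FROM Win32_ComputerSystem"))
        {
            foreach (ManagementObject system in searcher.Get())
            {
                string manufacturer = system["Manufacturer"] as string ?? string.Empty;
                string model = system["Model"] as string ?? string.Empty;

                if (manufacturer.IndexOf("Parallels", StringComparison.OrdinalIgnoreCase) >= 0 ||
                    model.IndexOf("Parallels", StringComparison.OrdinalIgnoreCase) >= 0)
                {
                    return true;
                }
            }
        }

        return false;
    }
}

A check like this could then drive a ConditionalFact/ConditionalTheory-style condition to skip or adjust the affected tests in that environment.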

Yes, the same code, when run against .NET Framework, should throw a SocketException, but it does not on a Windows VM on top of Parallels on a Mac.

Ok, so the variable here is that the “hardware” is not a real machine but rather the Parallels hypervisor architecture on a Mac.

cc: @karelz You’ll need a Mac person to troubleshoot this, since this is a Mac-specific problem causing different behaviors to leak through into the Windows container.

Or are you saying that the test is failing when run against .NET Framework in a Windows container on top of Parallels on a Mac?

Yes, the same code, when run against .NET Framework, should throw a SocketException, but it does not on a Windows VM on top of Parallels on a Mac. Below is the repro code used, the same as the failing test.

using System;
using System.Net;

public class Program
{
    public static void Main(string[] args)
    {
        // On a physical Windows machine this reverse lookup of 255.255.255.255 is
        // expected to throw a SocketException; on the Parallels VM it resolves instead.
        Console.WriteLine(Dns.GetHostEntryAsync(IPAddress.None).GetAwaiter().GetResult().AddressList[0]);
    }
}

@kingces95 Could you let me know what the error message from the System.Drawing.Common tests is, if you are still hitting issues there? I added some logging that will help me diagnose the issue.

Hm. So how are folks investigating test failures? Are they just looking at the raw console output and the raw xUnit XML output?

Basically, yeah. Hopefully once we sort out your initial problems and get to a clean state, it will be more manageable. Usually the tests are expected to be 100% clean unless you have changed something (and then the failure should be obvious). Also, it is possible to run individual test projects, which cuts down on some of the verbosity, especially when you are only working on one library at a time.