Update BuildTools, CoreClr, CoreSetup, ProjectNTfs, ProjectNTfsTestILC to preview2-02606-01, preview2-26306-01, preview2-26305-07, beta-26306-00, beta-26306-00, respectively (master) #27737
Conversation
Discarded CI Status: 12 pending, 2 passed
Force-pushed from 0622f06 to ef0357c
Ingestion of buildtools will fail because of my recent mapping changes. I will fix this in a separate PR.
Discarded CI Status: 3 failed, 1 pending, 10 passed
Force-pushed from ef0357c to 30b0cc2
Discarded CI Status: 3 failed, 2 pending, 8 passed
Force-pushed from 30b0cc2 to 6a63e67
The packaging build is hitting these failures (@ericstj), the netfx build is hitting this failure (@danmosemsft), and the Linux leg failed because every distro failed with a seg fault in the System.ComponentModel.Composition.Tests tests, e.g.
Discarded CI Status: 3 failed, 9 passed
Force-pushed from 6a63e67 to 04aebb6 ("…C to preview2-02606-01, preview2-26306-01, preview2-26305-07, beta-26306-00, beta-26306-00, respectively")
Couldn't update this pull request: Head commit author 'Jose Perez Rodriguez' is not 'dotnet-maestro-bot'
@stephentoub I'm not sure why a few tests are failing, nor whether this is caused by the ingestion of coreclr, coresetup, or some other dependency. Do you know if this is OK to merge?
Couldn't update this pull request: Head commit author 'Jose Perez Rodriguez' is not 'dotnet-maestro-bot'
Please hold off. Those component model test failures look like a real issue, as they're seg faulting consistently on all Linux distros. @danmosemsft and @Anipik are investigating, I believe.
I am trying to repro the failure on Linux; I will post as soon as I have an update.
Couldn't update this pull request: Head commit author 'Jose Perez Rodriguez' is not 'dotnet-maestro-bot'
@danmosemsft @stephentoub
Couldn't update this pull request: Head commit author 'Jose Perez Rodriguez' is not 'dotnet-maestro-bot'
Couldn't update this pull request: Head commit author 'Jose Perez Rodriguez' is not 'dotnet-maestro-bot'
Callstack for the failure:
@ruben-ayrapetyan it seems related to your change dotnet/coreclr#16747. Can you please take a look?
Ironically, dotnet/coreclr#16747 was to fix https://github.com/dotnet/coreclr/issues/15544, which was discovered by these same tests. They are handy.
cc: @jkotas
I have reverted the bad change in dotnet/coreclr#16790.
I guess the bot will not update this due to @joperezr's commit. I suggest we close it; the bot will make a new one, and he will need to apply your change to the new one.
@danmosemsft I applied the following patch to CoreFX version 2be66d4. For the patched CoreFX (based on 2be66d4) Debug Linux x64 build, I tried the following builds of CoreCLR:
Output for
Output for
Could it be that I can't reproduce the failure because I use a different base version of CoreFX?
@Anipik do you still have the machine on which this reproed?
No, I reset my desktop yesterday, but I can give reproducing it a try.
Yes please. @jkotas, can you think of what we're missing, why this isn't reproing? It seems exactly the same.
Submitted #27926 to see whether it still repros in CI.
@ruben-ayrapetyan #27926 has the repro now.
Yes, that was the problem. Perhaps we should keep the tests enabled, but just change the expected outcome on Unix.
Do we know what's unique about those tests? I assumed we already separately had tests that validated loading something other than an assembly.
They call GetAssemblyName on a locked file. I have pushed a commit to #27926 to see whether a simple test that does just that will reproduce the problem.
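For context, a minimal standalone sketch of that scenario (an approximation under assumptions, not the actual test in #27926) looks roughly like this:

```csharp
using System;
using System.IO;
using System.Reflection;

// Lock a copy of an assembly and then call AssemblyName.GetAssemblyName on it.
class LockedGetAssemblyNameRepro
{
    static void Main()
    {
        // Copy a managed assembly to a temp file so it can be locked freely.
        string source = typeof(LockedGetAssemblyNameRepro).Assembly.Location;
        string path = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName() + ".dll");
        File.Copy(source, path);

        // Hold the copy open with FileShare.None so no other handle may read it.
        using (var exclusive = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.None))
        {
            try
            {
                AssemblyName name = AssemblyName.GetAssemblyName(path);
                Console.WriteLine($"Succeeded: {name.FullName}");
            }
            catch (Exception e)
            {
                // On Windows this is expected to fail with a sharing violation;
                // what happens on Unix is exactly what the linked PRs change.
                Console.WriteLine(e.GetType().FullName);
            }
        }

        File.Delete(path);
    }
}
```

On Windows the exclusive handle makes the call fail with a sharing violation; the Unix behavior is what the CoreCLR changes discussed here altered.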
@danmosemsft @jkotas Locally, I get
and I don't get
As far as I understand, the CI in #27926 was started for the dotnet/coreclr@964fa686 version of CoreCLR. Please find details of the local experiment below. I tried the
This resulted in the following output on the tests:
To clarify, the local call stack for
The LLDB version is very similar to #27737 (comment).
Also, not sure if it really matters, the call to
With the change dotnet/coreclr#16747:
Did you already try setting a remote to https://github.com/jkotas/corefx and checking out the "test" branch? This should give you exactly what the CI built. I would do a
As for figuring out exactly what CoreCLR commit the bits have, the easiest way to know for sure is to determine which System.Private.CoreLib.dll is getting loaded, then look at that assembly in a tool like ILDASM or ILSpy for the CoreCLR commit hash in its assembly metadata, e.g. here is one from a different build:
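If inspecting the binary in ILDASM or ILSpy is inconvenient, roughly the same information can be read at runtime. Here is a small sketch (an illustration, not from this thread) that prints the location and informational version of the loaded System.Private.CoreLib, which embeds the commit hash:

```csharp
using System;
using System.Reflection;

// Print where System.Private.CoreLib was loaded from, plus its informational
// version attribute, which includes the commit the runtime was built from.
class PrintCoreLibCommit
{
    static void Main()
    {
        Assembly coreLib = typeof(object).Assembly;
        Console.WriteLine(coreLib.Location);

        var info = coreLib.GetCustomAttribute<AssemblyInformationalVersionAttribute>();
        // The exact format is an assumption; it typically looks something like
        // "4.6.26306.01 ... @Commit: <hash>".
        Console.WriteLine(info?.InformationalVersion);
    }
}
```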
@danmosemsft For now, I reproduced the
However,
According to #27737 (comment), this should indicate that the
Do I understand correctly that the CI run used version 2.1.0-preview2-26306-01?
The CI job has
which gives
So yes.
And from this we can get the commit most easily using myget:
So aea9340, which confirms what you see.
The commit hash dotnet/coreclr@aea9340 is before the merge of dotnet/coreclr#16747. Does it imply that the CI checked something different than expected?
This suggests that the change dotnet/coreclr#16747 didn't introduce the regression; the other way around, it fixed the failure that looked like a regression. Does the analysis look correct?
Logs above are unfortunately gone, but based on the history above, #27737 (comment) is the first one where Linux definitely failed - that one introduced CoreCLR preview2-26305-12 - and preview2-26306-01 is the last one before @stephentoub commented that it segfaulted, so it definitely contained the problem.
Looking at https://dotnet.myget.org/feed/dotnet-core/package/nuget/Microsoft.NETCore.Runtime.CoreCLR/2.1.0-preview2-26305-12 I see this corresponds to CoreCLR dotnet/coreclr@1affbe5. Similarly, 26306-01 corresponds to dotnet/coreclr@aea9340... and both were BEFORE your dotnet/coreclr#16747, which was dotnet/coreclr@383736b. So this seems to eliminate your change as the cause.
However, I do not understand why we did not see the segfault before. At this point I am inclined to recommend you put your change back in CoreCLR. As an extra check, in your PR in CoreCLR you can request that it run the CoreFX tests. cc @jkotas.
If we still believe there may be a negative interaction with the tests that were permanently disabled on Unix, I'd suggest we re-enable those (with whatever changes they need) before putting the change back.
Ah, yes, @ruben-ayrapetyan you may want to submit a revert of f31d24f in CoreFX before doing the change again in CoreCLR. Note that when you request CoreFX tests in a CoreCLR CI job, it uses HEAD of CoreFX, so it will immediately "see" your revert. Thanks for your patience...
And @ruben-ayrapetyan also apologies for apparently misattributing this issue.
It can't just be a revert; the tests will also need to be changed to expect a different exception type on Unix.
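To illustrate the shape of such a change (the test name, lock setup, and exception types below are assumptions, not the actual corefx fix), a re-enabled test could branch on the OS when asserting:

```csharp
using System;
using System.IO;
using System.Reflection;
using System.Runtime.InteropServices;
using Xunit;

public class GetAssemblyNameOnLockedFileTests
{
    [Fact]
    public void GetAssemblyName_LockedFile_ThrowsPlatformSpecificException()
    {
        // Lock a copy of the current test assembly for the duration of the test.
        string path = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName() + ".dll");
        File.Copy(typeof(GetAssemblyNameOnLockedFileTests).Assembly.Location, path);

        try
        {
            using (var exclusive = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.None))
            {
                if (RuntimeInformation.IsOSPlatform(OSPlatform.Windows))
                {
                    // Assumed: the sharing violation surfaces as FileLoadException on Windows.
                    Assert.Throws<FileLoadException>(() => AssemblyName.GetAssemblyName(path));
                }
                else
                {
                    // Placeholder expectation; the exact Unix behavior is precisely
                    // what the revert/re-land discussion above is about.
                    Assert.ThrowsAny<Exception>(() => AssemblyName.GetAssemblyName(path));
                }
            }
        }
        finally
        {
            File.Delete(path);
        }
    }
}
```

The Unix branch is deliberately loose until the expected behavior is settled.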
I understand what happened: the change that made System.ComponentModel.Composition.Tests crash was dotnet/coreclr#16752. The tests stopped failing on the file lock violation, proceeded further, and hit https://github.com/dotnet/coreclr/issues/15544, which was being fixed by dotnet/coreclr#16747. I have submitted the revert of the revert as dotnet/coreclr#16917.
@ruben-ayrapetyan Thanks for helping us trace this down, and apologies for the false alarm.
Wrt tests: dotnet/coreclr#16747 is adding a simple regression test to CoreCLR. I am going to take a look at whether it makes sense to re-enable and update the System.ComponentModel.Composition.Tests.
It's fine.
Thank you @danmosemsft, @jkotas.