aws-lambda-dotnet: System.AccessViolationException encountered in Lambda-based AspNetCore Kestrel Hosting
Hi there,
I’ve been working with a Lambda-based AspNetCore Kestrel-hosted website for almost a year now, and it appears I’ve encountered my first memory leak. It took me a few days to realize what was going on: because the issue was relatively intermittent and seemed to occur mostly when working with an asynchronous context, I had figured I was somehow running into a bad .NET Core 1.0 reference. After drilling through logs for hours on end, I finally noticed the following pattern:
REPORT RequestId: 3dc67903-dc6e-11e7-b70b-f31bf853f674 Duration: 202.36 ms Billed Duration: 300 ms Memory Size: 1024 MB Max Memory Used: 94 MB
REPORT RequestId: 3e0186b1-dc6e-11e7-bba9-cbb50eea6ba9 Duration: 5356.47 ms Billed Duration: 5400 ms Memory Size: 1024 MB Max Memory Used: 104 MB
REPORT RequestId: 44329b95-dc6e-11e7-90d7-2182089f4926 Duration: 11254.99 ms Billed Duration: 11300 ms Memory Size: 1024 MB Max Memory Used: 332 MB
REPORT RequestId: 4d5615ed-dc6e-11e7-9c98-716c7ef1723a Duration: 394.21 ms Billed Duration: 400 ms Memory Size: 1024 MB Max Memory Used: 333 MB
REPORT RequestId: 52da7074-dc6e-11e7-9119-a5d0b8a8fdb4 Duration: 334.18 ms Billed Duration: 400 ms Memory Size: 1024 MB Max Memory Used: 333 MB
REPORT RequestId: 53364c70-dc6e-11e7-896c-7b016224c80c Duration: 1023.85 ms Billed Duration: 1100 ms Memory Size: 1024 MB Max Memory Used: 347 MB
REPORT RequestId: 571a8ad8-dc6e-11e7-a845-cf16d636b90a Duration: 317.14 ms Billed Duration: 400 ms Memory Size: 1024 MB Max Memory Used: 348 MB
REPORT RequestId: 5773f563-dc6e-11e7-8dc6-a511bfb15ed3 Duration: 206.93 ms Billed Duration: 300 ms Memory Size: 1024 MB Max Memory Used: 354 MB
REPORT RequestId: 5bb8a443-dc6e-11e7-a659-19b60c381b76 Duration: 304.21 ms Billed Duration: 400 ms Memory Size: 1024 MB Max Memory Used: 361 MB
REPORT RequestId: 5c2ac769-dc6e-11e7-a798-67709e053235 Duration: 246.65 ms Billed Duration: 300 ms Memory Size: 1024 MB Max Memory Used: 364 MB
REPORT RequestId: 6152a9ff-dc6e-11e7-b081-af7f83c5a458 Duration: 248.99 ms Billed Duration: 300 ms Memory Size: 1024 MB Max Memory Used: 370 MB
REPORT RequestId: 63376f28-dc6e-11e7-abff-05587fca15ab Duration: 353.03 ms Billed Duration: 400 ms Memory Size: 1024 MB Max Memory Used: 382 MB
REPORT RequestId: 66371a86-dc6e-11e7-943d-476fdbc4a89e Duration: 302.32 ms Billed Duration: 400 ms Memory Size: 1024 MB Max Memory Used: 389 MB
REPORT RequestId: 67bad4cf-dc6e-11e7-98ce-c9bd53b06765 Duration: 280.72 ms Billed Duration: 300 ms Memory Size: 1024 MB Max Memory Used: 398 MB
REPORT RequestId: 6961f4ef-dc6e-11e7-9be0-854d80ba7e47 Duration: 287.53 ms Billed Duration: 300 ms Memory Size: 1024 MB Max Memory Used: 408 MB
REPORT RequestId: 6b865b1c-dc6e-11e7-a7a3-9fe091123029 Duration: 313.54 ms Billed Duration: 400 ms Memory Size: 1024 MB Max Memory Used: 417 MB
REPORT RequestId: 6cf15d65-dc6e-11e7-a30e-f75bdf934684 Duration: 265.60 ms Billed Duration: 300 ms Memory Size: 1024 MB Max Memory Used: 422 MB
REPORT RequestId: 6ede8798-dc6e-11e7-983d-637afb276ab6 Duration: 325.28 ms Billed Duration: 400 ms Memory Size: 1024 MB Max Memory Used: 426 MB
REPORT RequestId: 70c8f1cf-dc6e-11e7-b14e-abacaf2bcd85 Duration: 327.12 ms Billed Duration: 400 ms Memory Size: 1024 MB Max Memory Used: 431 MB
REPORT RequestId: 712bab41-dc6e-11e7-b9bd-a7abf148bb2f Duration: 142.82 ms Billed Duration: 200 ms Memory Size: 1024 MB Max Memory Used: 436 MB
REPORT RequestId: 72a7c46d-dc6e-11e7-88c8-03031d971a5e Duration: 291.63 ms Billed Duration: 300 ms Memory Size: 1024 MB Max Memory Used: 444 MB
REPORT RequestId: 73059b9e-dc6e-11e7-80f0-8fc1070e72d3 Duration: 179.62 ms Billed Duration: 200 ms Memory Size: 1024 MB Max Memory Used: 449 MB
REPORT RequestId: 781c18e8-dc6e-11e7-b23f-87f00aea05f3 Duration: 280.97 ms Billed Duration: 300 ms Memory Size: 1024 MB Max Memory Used: 457 MB
REPORT RequestId: 79789c37-dc6e-11e7-a931-3939b64a1005 Duration: 277.57 ms Billed Duration: 300 ms Memory Size: 1024 MB Max Memory Used: 485 MB
REPORT RequestId: 7b02223e-dc6e-11e7-8e1a-b763e3a90f96 Duration: 258.71 ms Billed Duration: 300 ms Memory Size: 1024 MB Max Memory Used: 487 MB
REPORT RequestId: 7c4fb0d0-dc6e-11e7-986e-2d7a10d25acb Duration: 274.12 ms Billed Duration: 300 ms Memory Size: 1024 MB Max Memory Used: 501 MB
REPORT RequestId: 7e197447-dc6e-11e7-b014-d103a19007c1 Duration: 268.75 ms Billed Duration: 300 ms Memory Size: 1024 MB Max Memory Used: 501 MB
REPORT RequestId: 7f4830b6-dc6e-11e7-9286-b53eee0eeb97 Duration: 265.90 ms Billed Duration: 300 ms Memory Size: 1024 MB Max Memory Used: 515 MB
REPORT RequestId: 80640011-dc6e-11e7-a4e9-7f5813bc1d9b Duration: 249.71 ms Billed Duration: 300 ms Memory Size: 1024 MB Max Memory Used: 526 MB
REPORT RequestId: 827e7c48-dc6e-11e7-9882-553249bb136f Duration: 280.02 ms Billed Duration: 300 ms Memory Size: 1024 MB Max Memory Used: 535 MB
REPORT RequestId: 834ffd3a-dc6e-11e7-a179-3f8683750833 Duration: 283.13 ms Billed Duration: 300 ms Memory Size: 1024 MB Max Memory Used: 546 MB
REPORT RequestId: 83adfb90-dc6e-11e7-a8b9-41d613e950a4 Duration: 190.46 ms Billed Duration: 200 ms Memory Size: 1024 MB Max Memory Used: 562 MB
REPORT RequestId: 8469f847-dc6e-11e7-a4aa-b57bba0768d9 Duration: 258.33 ms Billed Duration: 300 ms Memory Size: 1024 MB Max Memory Used: 573 MB
REPORT RequestId: 84b46f28-dc6e-11e7-834b-1b7516fa4c49 Duration: 157.16 ms Billed Duration: 200 ms Memory Size: 1024 MB Max Memory Used: 577 MB
REPORT RequestId: 8691e148-dc6e-11e7-9e55-5de525454cc7 Duration: 240.47 ms Billed Duration: 300 ms Memory Size: 1024 MB Max Memory Used: 602 MB
REPORT RequestId: 87a06b57-dc6e-11e7-819c-53693dcc9ff9 Duration: 262.19 ms Billed Duration: 300 ms Memory Size: 1024 MB Max Memory Used: 621 MB
REPORT RequestId: 88b6e413-dc6e-11e7-a578-0dbbbe0950ce Duration: 276.35 ms Billed Duration: 300 ms Memory Size: 1024 MB Max Memory Used: 635 MB
REPORT RequestId: 89b827fa-dc6e-11e7-bb16-69053db8f778 Duration: 259.09 ms Billed Duration: 300 ms Memory Size: 1024 MB Max Memory Used: 700 MB
REPORT RequestId: 8ad1d4f6-dc6e-11e7-8f57-1bf6944b48ee Duration: 230.31 ms Billed Duration: 300 ms Memory Size: 1024 MB Max Memory Used: 711 MB
REPORT RequestId: 8c3b2906-dc6e-11e7-8a23-73cb8c4a31ae Duration: 273.98 ms Billed Duration: 300 ms Memory Size: 1024 MB Max Memory Used: 729 MB
REPORT RequestId: 8d89efba-dc6e-11e7-b0af-fd0d10e41231 Duration: 230.58 ms Billed Duration: 300 ms Memory Size: 1024 MB Max Memory Used: 743 MB
REPORT RequestId: 8edfbca7-dc6e-11e7-b8bc-47811c280c05 Duration: 249.17 ms Billed Duration: 300 ms Memory Size: 1024 MB Max Memory Used: 757 MB
REPORT RequestId: 8f538ccd-dc6e-11e7-b026-f39c6b7800ea Duration: 50.67 ms Billed Duration: 100 ms Memory Size: 1024 MB Max Memory Used: 760 MB
REPORT RequestId: 8fd2f5fd-dc6e-11e7-9765-2d56e3bf64ea Duration: 234.21 ms Billed Duration: 300 ms Memory Size: 1024 MB Max Memory Used: 773 MB
REPORT RequestId: 90370e64-dc6e-11e7-9156-e5870857454b Duration: 175.30 ms Billed Duration: 200 ms Memory Size: 1024 MB Max Memory Used: 785 MB
REPORT RequestId: 90f7511c-dc6e-11e7-983d-637afb276ab6 Duration: 261.36 ms Billed Duration: 300 ms Memory Size: 1024 MB Max Memory Used: 841 MB
REPORT RequestId: 914c9d8e-dc6e-11e7-898b-1585b8f8e7d4 Duration: 320.56 ms Billed Duration: 400 ms Memory Size: 1024 MB Max Memory Used: 889 MB
REPORT RequestId: 921a269d-dc6e-11e7-accb-17367ab2cdd9 Duration: 254.26 ms Billed Duration: 300 ms Memory Size: 1024 MB Max Memory Used: 892 MB
REPORT RequestId: 938d17d0-dc6e-11e7-8436-01961fd413f3 Duration: 275.37 ms Billed Duration: 300 ms Memory Size: 1024 MB Max Memory Used: 897 MB
REPORT RequestId: 9549bbfd-dc6e-11e7-97dc-bf10238f2752 Duration: 246.73 ms Billed Duration: 300 ms Memory Size: 1024 MB Max Memory Used: 900 MB
REPORT RequestId: 968807ad-dc6e-11e7-a9e8-bf97d825ac8c Duration: 258.74 ms Billed Duration: 300 ms Memory Size: 1024 MB Max Memory Used: 904 MB
REPORT RequestId: 98707793-dc6e-11e7-832e-7703726b047a Duration: 224.04 ms Billed Duration: 300 ms Memory Size: 1024 MB Max Memory Used: 908 MB
REPORT RequestId: 99aa2fbb-dc6e-11e7-82a7-2f2cc7d38d3b Duration: 264.40 ms Billed Duration: 300 ms Memory Size: 1024 MB Max Memory Used: 912 MB
REPORT RequestId: 9ad62c85-dc6e-11e7-81b0-456cdc9dc8d7 Duration: 258.75 ms Billed Duration: 300 ms Memory Size: 1024 MB Max Memory Used: 917 MB
REPORT RequestId: 9b27cf02-dc6e-11e7-9694-dffb979f1940 Duration: 846.53 ms Billed Duration: 900 ms Memory Size: 1024 MB Max Memory Used: 937 MB
REPORT RequestId: 9cf318eb-dc6e-11e7-9580-05514ef1ee68 Duration: 17867.03 ms Billed Duration: 17900 ms Memory Size: 1024 MB Max Memory Used: 1024 MB
As you can see from the “Max Memory Used” values, memory usage progressively increases to a breaking point where the memory used equals the memory allocated. At that point, the Lambda crashes, returns a 502 error, and restarts the container as if it were brand new. I would normally start drilling in to find the source of the leak, but I was slightly concerned by the following post:
https://forums.aws.amazon.com/message.jspa?messageID=587945
According to the post above, there’s a point at which the Lambda will stabilize and stop consuming additional memory. I’m about to take advantage of the increased Lambda memory limit, raise my allocation to 3008 MB, and see whether there is a stabilization point. Is it normal for a Lambda-based Kestrel host to surpass a 1024 MB limit?
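(For anyone who wants to chart this trend themselves, here is a minimal sketch that pulls the “Max Memory Used” values out of exported CloudWatch REPORT lines; the log file name is illustrative.)

```csharp
using System;
using System.IO;
using System.Text.RegularExpressions;

// Extract the per-invocation "Max Memory Used" figure from Lambda REPORT
// lines so the growth trend can be charted. File name is illustrative.
var report = new Regex(@"Max Memory Used: (\d+) MB");
foreach (var line in File.ReadLines("lambda-reports.log"))
{
    var match = report.Match(line);
    if (match.Success)
        Console.WriteLine(match.Groups[1].Value); // MB, in invocation order
}
```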
My AspNetCore API has grown fairly large and resource-intensive. It is quite possible that all of this added functionality is genuinely increasing memory consumption, but this seems heavily in excess of the original memory usage I saw.
I was also experimenting with an in-memory caching mechanism for a short period about a month ago, so it’s possible some remnants of that functionality are causing the underlying issue as well.
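As a sketch of how such a cache can leak on Lambda (the key, value, and expiration are hypothetical, not my actual code): a warm container keeps singleton state alive across invocations, so entries added without any expiration accumulate until the container dies.

```csharp
using System;
using Microsoft.Extensions.Caching.Memory;

// Hypothetical sketch of an in-memory caching experiment. Entries without an
// expiration live as long as the cache itself, which on a warm Lambda
// container means across all subsequent invocations.
var cache = new MemoryCache(new MemoryCacheOptions());

cache.Set("lookup:some-id", "cached-value", new MemoryCacheEntryOptions
{
    // Bounding entry lifetime keeps leftover entries from surviving forever.
    AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5)
});
```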
About this issue
- State: closed
- Created 7 years ago
- Comments: 23 (6 by maintainers)
I believe there is something in your dependencies that isn’t compatible with .NET Core on Linux; this isn’t Lambda related.
Here is what I have done to try to figure this out. I deployed your test case, and after 4 executions Lambda reported that the process died and restarted it. Then, hoping that our upcoming .NET Core 2.0 runtime would surface more information, I ported the sample to .NET Core 2.0 and ran it on our internal .NET Core 2.0 Lambda runtime. The process suddenly died on the .NET Core 2.0 Lambda runtime just as it did on .NET Core 1.0.
The next step was to take my .NET Core 2.0 port of your code to an Amazon Linux EC2 instance with .NET Core 2.0 installed. I ran the values request in a loop, and the dotnet process suddenly died after a few runs. From there I got an “Aborted” error message from the dotnet runtime and an exit code of 134 (128 + SIGABRT), meaning the process was aborted. So now we have the failing behavior outside of Lambda.
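The loop was essentially the following (the URL is illustrative; the real test hit the sample’s default values endpoint):

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

class Repro
{
    // Hit the sample's values endpoint repeatedly until the dotnet process
    // aborts (exit code 134). Endpoint URL is illustrative.
    static async Task Main()
    {
        using (var client = new HttpClient())
        {
            for (var i = 0; ; i++)
            {
                var body = await client.GetStringAsync("http://localhost:5000/api/values");
                Console.WriteLine($"request {i}: {body.Length} bytes");
            }
        }
    }
}
```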
One other piece of interesting debugging information: in the .NET Core 2.0 version, I attached a handler to the AppDomain’s resolve events to log anything the runtime failed to resolve.
It fired with a notification about an unresolved assembly, Mono.Runtime. That seems very suspect to me. Can you think of anything in your dependencies that would cause Mono.Runtime to be loaded?
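The hook was essentially this (a minimal sketch; the logging format is mine):

```csharp
using System;

// AssemblyResolve fires whenever the runtime fails to locate an assembly,
// so logging here surfaces anything a dependency tries to load and misses.
AppDomain.CurrentDomain.AssemblyResolve += (sender, args) =>
{
    Console.WriteLine($"Failed to resolve: {args.Name} " +
        $"(requested by: {args.RequestingAssembly?.FullName ?? "unknown"})");
    return null; // let the normal resolution failure proceed
};
```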