eleventy: Crash when sourcing from large JSON file

Describe the bug I am trying to paginate over a fairly large JSON file (17.2 MB). Eleventy crashes when run. Any advice?
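
A common stopgap for this class of crash (general Node advice, not something confirmed in this thread) is to raise V8's default heap limit when invoking Eleventy, for example:

    NODE_OPTIONS=--max-old-space-size=4096 npx @11ty/eleventy

The 4096 (MB) figure is arbitrary; --max-old-space-size is a standard V8 flag, and NODE_OPTIONS is the standard way to pass it through to a CLI tool. It only buys headroom, though; splitting the data, as discussed in the comments below, is the more durable fix.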

Expected behavior Output of paginated HTML files in the build folder.

Error Message

<--- Last few GCs --->

[22797:0x1f43a50]     6587 ms: Mark-sweep 1259.2 (1457.7) -> 1258.7 (1457.7) MB, 105.9 / 0.0 ms  (average mu = 0.202, current mu = 0.049) allocation failure scavenge might not succeed
[22797:0x1f43a50]     6989 ms: Mark-sweep 1259.3 (1457.7) -> 1259.2 (1457.7) MB, 399.9 / 0.0 ms  (average mu = 0.067, current mu = 0.007) allocation failure scavenge might not succeed


<--- JS stacktrace --->

==== JS stack trace =========================================

    0: ExitFrame [pc: 0xc6ef995452b]
    1: StubFrame [pc: 0xc6ef99556f3]
Security context: 0x02ac1e2aee11 <JSObject>
    2: normalize [0x27b150b9a359] [path.js:~1111] [pc=0xc6ef9892c5f](this=0x34884c282389 <Object map = 0x3b1419dcab89>,path=0x02ec0cb71549 <String[32]: /home/will/Projects/myRedactedProjectName>)
    3: getLocalData [0x19ccc3cfda89] [/usr/local/lib/node_modules/@11ty/eleventy/src/TemplateData.js:~280] [pc=0xc6ef9a25130](this=0x2d6b...

FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
 1: 0x7f74b00dc46c node::Abort() [/lib/x86_64-linux-gnu/libnode.so.64]
 2: 0x7f74b00dc4b5  [/lib/x86_64-linux-gnu/libnode.so.64]
 3: 0x7f74b0308e6a v8::Utils::ReportOOMFailure(v8::internal::Isolate*, char const*, bool) [/lib/x86_64-linux-gnu/libnode.so.64]
 4: 0x7f74b03090e1 v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [/lib/x86_64-linux-gnu/libnode.so.64]
 5: 0x7f74b06a3c66  [/lib/x86_64-linux-gnu/libnode.so.64]
 6: 0x7f74b06b5043 v8::internal::Heap::PerformGarbageCollection(v8::internal::GarbageCollector, v8::GCCallbackFlags) [/lib/x86_64-linux-gnu/libnode.so.64]
 7: 0x7f74b06b5930 v8::internal::Heap::CollectGarbage(v8::internal::AllocationSpace, v8::internal::GarbageCollectionReason, v8::GCCallbackFlags) [/lib/x86_64-linux-gnu/libnode.so.64]
 8: 0x7f74b06b791d v8::internal::Heap::AllocateRawWithLigthRetry(int, v8::internal::AllocationSpace, v8::internal::AllocationAlignment) [/lib/x86_64-linux-gnu/libnode.so.64]
 9: 0x7f74b06b7975 v8::internal::Heap::AllocateRawWithRetryOrFail(int, v8::internal::AllocationSpace, v8::internal::AllocationAlignment) [/lib/x86_64-linux-gnu/libnode.so.64]
10: 0x7f74b0683dda v8::internal::Factory::NewFillerObject(int, bool, v8::internal::AllocationSpace) [/lib/x86_64-linux-gnu/libnode.so.64]
11: 0x7f74b090f31e v8::internal::Runtime_AllocateInNewSpace(int, v8::internal::Object**, v8::internal::Isolate*) [/lib/x86_64-linux-gnu/libnode.so.64]
12: 0xc6ef995452b 
Aborted (core dumped)

Environment:

  • OS and Version: Ubuntu Linux 20.04.1 LTS, 32 GB RAM
  • Eleventy Version: 0.11.1

About this issue

  • State: closed
  • Created 3 years ago
  • Comments: 19 (12 by maintainers)

Most upvoted comments

@willmartian I like https://github.com/pdehaan/gray-matter-liquidjs-test. I rewrote the liquidjs-tweets repo from scratch last night and pushed it this morning and I think that’s a much cleaner approach with support for some basic template front-matter and permalinks.

Re: developing: don’t care. Copy all the code and release it under your name. I’m probably done with both repos and they’ll just sit there and rot. No attribution needed. If your GitHub repo is current, I can submit a few PRs for some HTML issues I found in the original templates, or I can wait for you to update the site if you’re thinking of migrating to a different solution. Nothing big, just a few unclosed </div> tags, or Prettier complaining about closing </input> tags.

Yeah, I think that is definitely a next step. To be fair, this whole project might also be better suited for a framework like Preact rather than trying to generate so many static files. But hey, I like Eleventy. 😃

Oh, I am doing similar stuff with my Twitter takeout (European citizen here. Do you have the option to download everything Twitter knows about you in other parts of the world?). That pushes even closer to the 36k limit we’ve seen above.

I can feel your pain 😃

I’m still figuring this out as I go along…


First, I split tweets.json into smaller JSON files, using a template with pagination:
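
The template itself didn’t survive the export of this thread, but the idea can be sketched as an Eleventy JavaScript template. Everything here is illustrative: the file name, the chunk size of 500, and the assumption that the big archive is exposed as global data named tweets (e.g. via a loader in _data/):

    // split-tweets.11ty.js: writes the archive back out in small chunks
    module.exports = class {
      data() {
        return {
          pagination: {
            data: "tweets", // global data key (hypothetical; e.g. from _data/)
            size: 500       // items per output chunk
          },
          // one output file per page of 500 items
          permalink: (data) => `tweets/chunk-${data.pagination.pageNumber}.json`
        };
      }
      render(data) {
        // pagination.items holds the current page's slice of the data
        return JSON.stringify(data.pagination.items);
      }
    };

Each page serializes its slice of pagination.items to its own chunk-N.json, so the one huge file becomes many small ones that subsequent builds can consume chunk by chunk.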

Great workaround! I’ll have to check it out when you push the changes to GitHub. I wouldn’t have thought of using pagination that way. ❤️

Even more amazed at generating 170k files in around 3 minutes (only ~1.1 ms per file). 🏁 🔥 🔥