Description
I was trying to edit a JSON file (i.e. add or change a field) with jj. Memory usage spikes to at least 10× the file size before the change is processed, so editing a 1GB JSON file needs at least 11GB of allocated memory. The whole process is fast (on an i3 machine with 15GB RAM, editing the last index takes ~18 seconds), but it often crashes with out-of-memory errors.
I have seen references to jj caching large amounts of memory. Please let me know if this can be optimized.
FYI, fetching data is fast and fine: GETs only take about the same amount of memory as the size of the JSON file, and only for a short period.
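For anyone trying to reproduce this, here is a small sketch that generates a JSON test file of arbitrary size (the file name and record shape are my own choices, not from the report; scale `n_records` up to tens of millions to approach ~1GB):

```python
import json

def make_big_json(path, n_records):
    # Stream a JSON array of simple records to disk without
    # holding the whole document in memory.
    with open(path, "w") as f:
        f.write("[")
        for i in range(n_records):
            if i:
                f.write(",")
            f.write(json.dumps({"id": i, "name": "user%d" % i}))
        f.write("]")

# Small scale for a quick sanity check; increase for the OOM repro.
make_big_json("big.json", 1000)
```

Editing the last index of the resulting array with jj should then show the memory spike described above.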
------------------- Referenced issue -------------------
Hi pkoppstein, I'm looking into these issues. I believe jj is buffering too much data prior to processing, and low-memory systems suffer when dealing with large JSON files. I'll look ASAP and keep you posted. Thanks!
Originally posted by @tidwall in #9 (comment)