Memory usage with large list

Hey Erica,

Thanks for keeping an eye on this.

I have a large list open in Dynalist that is eating up 1 GB+ of RAM in its Chrome tab. I think the app needs to move the in-memory search indexes, etc. to something more scalable for large documents (either server-side storage or something like https://caniuse.com/#search=indexeddb), because keeping everything in memory is not going to scale.
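For illustration only, here is a minimal sketch of what an IndexedDB-backed item store could look like (the database name, store name, and item shape are made-up placeholders, not Dynalist’s actual schema):

```typescript
// Minimal sketch: persisting outline items in IndexedDB instead of keeping them all in RAM.
// "outline-db", "items", and OutlineItem are hypothetical names for illustration.
interface OutlineItem {
  id: string;
  text: string;
  parentId: string | null;
}

function openItemDb(): Promise<IDBDatabase> {
  return new Promise((resolve, reject) => {
    const req = indexedDB.open("outline-db", 1);
    req.onupgradeneeded = () => {
      // Create an object store keyed by item id on first open.
      req.result.createObjectStore("items", { keyPath: "id" });
    };
    req.onsuccess = () => resolve(req.result);
    req.onerror = () => reject(req.error);
  });
}

function putItem(db: IDBDatabase, item: OutlineItem): Promise<void> {
  return new Promise((resolve, reject) => {
    const tx = db.transaction("items", "readwrite");
    tx.objectStore("items").put(item);
    tx.oncomplete = () => resolve();
    tx.onerror = () => reject(tx.error);
  });
}

function getItem(db: IDBDatabase, id: string): Promise<OutlineItem | undefined> {
  return new Promise((resolve, reject) => {
    const req = db.transaction("items", "readonly").objectStore("items").get(id);
    req.onsuccess = () => resolve(req.result);
    req.onerror = () => reject(req.error);
  });
}
```

The catch, as the replies below point out, is that every lookup becomes asynchronous, which matters for features that fire on every keystroke.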

Would love to hear other suggestions.

thanks!

2 Likes

If things aren’t kept in memory, features like dynamically resolving internal link titles and global tags won’t work properly.

For example, should we load megabytes of data from our server or your local IndexedDB every time you want to do a global tag autocomplete or internal link lookup? I don’t think that’s realistic. Even if you don’t mind the performance hit, server-side storage is not the “scalable” solution either, as it dramatically increases our server load if every user just grabs data from our server and throws it away once they’re done with it.

I hope that’s understandable. It’s necessary to keep everything in memory unless you want to just give up on some nice and convenient features (jump to item, global tag pane, auto-complete tag, internal link lookup, etc.).

@Erica, is it only the current document that’s stored in memory, or all documents?

If all documents are kept in memory, then I should probably start archiving and deleting my old ones. :slight_smile:

Yes, all documents are stored in memory. The current document has extra stuff like the visual elements you see and other state, so if documents A and B have the same amount of data, A will take more space than B while it’s currently open. Therefore, a large list is a bigger issue if you frequently access it.

It really depends on how much RAM you have and how much Dynalist is taking for you. If it’s taking too much and you have things you can archive into OPML format, definitely do that.

thanks for the info. I don’t know enough about the technical architecture to recommend something, but I could really use a solution beyond creating separate, smaller documents. If you must keep stuff in memory, can the app be smarter about which items it keeps there? One of my large documents won’t even load on the iOS app because it exceeds the memory limitations for apps and just crashes or tries to reload constantly. It is unusable in that case. There has to be a way to move some functionality server-side (at the expense of latency) for larger documents/pro accounts.

2 Likes

If your to-do list has crippled a modern computing device’s memory, how are you as a human not overwhelmed by it? lol

But there’s no way 100,000 lines of text should cripple a modern computing device! It’s a clear sign of an architectural error.

3 Likes

Depends. Text editors like UltraEdit have specific architecture to deal with large files like that. A simple program like Notepad can choke on it.

Likewise an Excel spreadsheet with 100,000 rows can have your modern CPU and 8+GB RAM quite occupied, often with calculations not being instant anymore.

In the case of a service like Dynalist in the browser, you also have to keep in mind that the “program” itself, the “thing” that is Dynalist and does what Dynalist does, is JavaScript downloaded to the browser and parsed and executed by it, which puts an additional load on that computer on top of the 100,000-line document.

@erica is everything really kept in memory or do you write to local storage?

My impression is yes, for all the features that need global auto-completion (“Move to” and inserting internal links). Everything comes at a price. If they are not kept in memory, you’ll probably notice a lag when you start searching for something globally.

@Shida: correct me if I’m wrong.

That’s correct. If they’re not loaded into memory, global search (which includes internal links, move to, and go anywhere) would not work at all, since we wouldn’t have nearly enough time to load the data, especially over the internet.
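To make the trade-off concrete, here is a rough sketch (purely illustrative, not Dynalist’s actual internals) of the kind of in-memory index that keeps global tag autocomplete and link lookup instant:

```typescript
// Rough sketch of an in-memory index for instant global autocomplete.
// Class and field names are illustrative, not Dynalist's internals.
class GlobalIndex {
  // tag -> ids of items carrying that tag
  private tagToItems = new Map<string, Set<string>>();
  // item id -> title, for internal-link and "move to" lookups
  private titles = new Map<string, string>();

  addItem(id: string, title: string, tags: string[]): void {
    this.titles.set(id, title);
    for (const tag of tags) {
      let ids = this.tagToItems.get(tag);
      if (!ids) {
        ids = new Set<string>();
        this.tagToItems.set(tag, ids);
      }
      ids.add(id);
    }
  }

  // Synchronous prefix lookup: cheap on every keystroke when the data is already in RAM.
  completeTag(prefix: string): string[] {
    const matches: string[] = [];
    for (const tag of this.tagToItems.keys()) {
      if (tag.startsWith(prefix)) matches.push(tag);
    }
    return matches;
  }
}
```

If that map lived in IndexedDB or on a server instead, each keystroke would turn into an asynchronous query rather than an in-memory scan, which is exactly the lag being described.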

I would be concerned that Dynalist has fundamental limits on scalability. Constantly deleting information to manage memory once you reach the threshold is a problem. It defeats the purpose of having a single trusted system that can archive indefinitely. This is causing me to rethink my use of the product over the long term, which is a pity. I understand it is a tough tradeoff, but users having to manually archive etc. is a show stopper for me. Some segregation of the search space is probably needed, even if that means global tags will require server-side search.

1 Like

@Shida What do you think?

I’ve split up my Dynalist into multiple separate documents to try to alleviate the issue, but the iPad app still (apparently) runs out of memory, as it refreshes itself every ~5 seconds or so. The performance is OK on a desktop browser with lots of memory - but it’s basically unusable on an iOS device. Some more testing with large documents may provide some ideas on further optimizations.

Thank you again for a terrific product.

We were considering the ability to ‘archive’ documents, which would unload them completely so they are not kept in memory. The effects of archiving a document would be:

  • Not global-searchable
  • Won’t appear in [[ links
  • Won’t appear in move anywhere
  • Won’t be indexed for go to any item
  • Tags won’t count towards tag pane
  • Won’t instant-load when you open it, and will not load on the web app if no internet
  • Won’t be synced until you open it (must load into memory to sync)

I think for archive purposes, these limitations are fine. For the last one, we might be able to work around it by loading the document into memory in the background when there are changes to be synced, then unloading it.
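A rough sketch of that background-sync idea (the document type and the load/sync/unload helpers below are hypothetical stand-ins, not anything that exists in the codebase):

```typescript
// Sketch of syncing an archived document without keeping it resident in memory.
// ArchivedDoc and the three helpers are hypothetical stand-ins for illustration.
interface ArchivedDoc { id: string; }

async function loadDocument(docId: string): Promise<ArchivedDoc> {
  // Stand-in: fetch the document data and build its in-memory structures.
  return { id: docId };
}

async function applyPendingChanges(doc: ArchivedDoc): Promise<void> {
  // Stand-in: push local edits and pull remote ones for this document.
}

function unloadDocument(docId: string): void {
  // Stand-in: release the in-memory structures for this document.
}

async function syncArchivedDocument(docId: string): Promise<void> {
  const doc = await loadDocument(docId); // bring it into memory only for the sync
  try {
    await applyPendingChanges(doc);      // reconcile the outstanding changes
  } finally {
    unloadDocument(docId);               // then drop it from memory again
  }
}
```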

@Erica do you think this can work?

1 Like

I think that would work and I sure hope it does @Shida.

The biggest problem in my opinion is that iOS has pretty strict limitations on how much memory an app can use, whereas on desktop you can pretty much use as much as you want. If I understand it correctly, even if there’s plenty of memory left on an iOS device, the system will still only allow an app to use so much.

I recently watched a video called “Run Photoshop FAST with 1000+ Layers! Easy Trick” on YouTube. I don’t know anything about programming, but it might give you guys some ideas for how to improve performance with large files (he was able to decrease memory usage from 37.8 GB to 77.2 MB by technically only displaying 2 layers). I don’t know whether it will help or not, but if not, at least you learn a new trick for Photoshop. LOL. Happy New Year!

I’m surprised and, honestly, concerned that the only 2 options being considered are “don’t store all the data in memory” or “ignore the memory usage problem”. I’m sure there’s room to optimize how the data is stored / handled in memory, based on my observations:
I imported my WorkFlowy outline and split it into 2 documents. When viewing the larger of the two, Dynalist uses 800 MB of memory. The OPML export of my outline from WorkFlowy is 2.4 MB. The export from Dynalist of the same outline is 3.5 MB uncompressed in OPML and 1.9 MB in text. Even with 2x or 3x encoding overhead, the outline data itself does not come close to accounting for 800 MB of memory usage.

My WorkFlowy tab is currently using 345 MB, while retaining the ability to search the entire outline and navigate quickly. WorkFlowy did recently release a rewrite that made it so snappy, but even before that, it was still faster and used less memory than Dynalist.

Even though I prefer WorkFlowy’s “everything is one outline” approach (because it’s less mental overhead for me), I did attempt to convert to the Dynalist way of multiple documents to try to mitigate the performance issues. Unfortunately, this caused the items I moved to lose their date metadata, which was the last straw that sent me back to WorkFlowy for now.

I do hope you guys are able to solve this issue, because WorkFlowy could use some competition in terms of features.

4 Likes

@dan_smith_X1011 @Erica @Shida I did some quick experimentation. It seems to me that memory usage is much higher when I have documents open with a lot of images in them, i.e. the high memory usage is for the most part not from having all the text indexed (because that doesn’t change when I switch documents) but rather from all images being loaded even if the respective item is not opened / visible on screen.

Also see App consumes 160 MB of data with little usage
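If images really are the culprit, one common browser-side mitigation (a sketch of a general technique, not a claim about how Dynalist renders images today) is to defer fetching each image until its item actually scrolls into view:

```typescript
// Sketch: only fetch an image once its placeholder scrolls into view.
// Assumes each <img> stores its real URL in a data-src attribute (an illustrative convention).
const observer = new IntersectionObserver((entries, obs) => {
  for (const entry of entries) {
    if (!entry.isIntersecting) continue;
    const img = entry.target as HTMLImageElement;
    img.src = img.dataset.src ?? ""; // start the actual download now
    obs.unobserve(img);              // each image only needs this once
  }
});

document.querySelectorAll<HTMLImageElement>("img[data-src]")
  .forEach((img) => observer.observe(img));
```

The native loading="lazy" attribute on img elements gets part of the way there too, although it doesn’t release images that have already been fetched.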

It might be the images if he’s coming from WorkFlowy… WorkFlowy doesn’t support image markdown if I remember correctly.

@dan_smith_X1011: I think the 2-3x encoding overhead is reasonable, but the DOM elements and internal structures (parse tree for markdown, internal tag index, etc.) do take a lot of space. If the 2-3x encoding overhead were the majority of the usage, the 345 MB of memory used by WorkFlowy would be crazy as well (more than 100x the OPML size).

I want to add a quick “me too.” I imported my WorkFlowy list and Dynalist chugged; even navigating the non-list portions of the UI was a crawl. It got better when I closed the open lists, but WorkFlowy is fast even with everything open.

I too would like software with more features, such as Dynalist, but speed is everything.

“Move at the speed of thought”