Memory usage with large list

Steps to reproduce

Have a large list with thousands of nodes open in Safari.

Expected result

Low memory usage.

Actual result

Safari reloads the page every few minutes and displays a warning: "This webpage was reloaded because it was using significant memory."

Environment

Safari 11.0.1 on Mac OS X


Has this happened in previous versions of Safari? Safari 11.0.1 sounds really new, and if there was no problem before, it could be a memory issue with the browser. I do get many results when I search for "Safari memory issue".

We haven't changed anything regarding memory usage in a long time, so unless this problem has existed for ages, it shouldn't be caused by a recent change.

Hi @Imran_Akbar, just following up on this, is it still happening?

Hey Erica, yes, it still complains of running out of memory. On my laptop Dynalist at least loads, but on my iPad Pro it never loads (both in the app and when opening it in Safari).

Hi Imran,

Sorry for the late reply due to the holidays!

Is the situation any better at the moment?

I'm really sorry this is happening. There's a bigger thread about a similar issue: Dynalist using a ridiculous amount of RAM (Windows Desktop App)

We have plans to update Electron and see if that can fix it.

I wonder why it's happening in Safari, though; that makes it seem like it's a somewhat different issue.

Hey Erica,

Thanks for keeping an eye on this.

I have a large list open in Dynalist in Chrome that is eating up 1 GB+ of RAM in its tab. I think the app needs to move the in-memory search indexes, etc. to something more scalable for large documents, either server-side storage or something like IndexedDB (https://caniuse.com/#search=indexeddb), because the current approach is not going to scale.
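Just to make the idea concrete, here's a rough sketch of the kind of thing I mean, assuming a browser with IndexedDB support (the store and field names are invented for illustration, not Dynalist's actual schema):

```ts
// Sketch: keep a per-node search index in IndexedDB instead of one big
// in-memory structure. Store/field names here are hypothetical.

function openIndex(): Promise<IDBDatabase> {
  return new Promise((resolve, reject) => {
    const req = indexedDB.open("outline-index", 1);
    req.onupgradeneeded = () => {
      // One record per node: { id, docId, text, tags: string[] }
      const store = req.result.createObjectStore("nodes", { keyPath: "id" });
      store.createIndex("byTag", "tags", { multiEntry: true });
      store.createIndex("byDoc", "docId");
    };
    req.onsuccess = () => resolve(req.result);
    req.onerror = () => reject(req.error);
  });
}

// Look up every node carrying a tag without holding whole documents in RAM.
async function nodesWithTag(tag: string): Promise<unknown[]> {
  const db = await openIndex();
  return new Promise((resolve, reject) => {
    const req = db
      .transaction("nodes", "readonly")
      .objectStore("nodes")
      .index("byTag")
      .getAll(tag);
    req.onsuccess = () => resolve(req.result);
    req.onerror = () => reject(req.error);
  });
}
```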

Would love to hear other suggestions.

thanks!


If that data is not in memory, things like dynamically resolving internal link titles and global tags won't work properly.

For example, should we load megabytes of data from our server or your local IndexedDB every time you want to do a global tag autocomplete or internal link lookup? I don't think that's realistic. Even if you don't mind the performance hit, server-side storage is not the "scalable" solution either, as it dramatically increases our server load if every user just grabs data from our server and throws it away once they're done with it.

I hope that's understandable. It's necessary to keep everything in memory unless you want to just give up on some nice and convenient features (jump to item, global tag pane, tag auto-complete, internal link lookup, etc.).
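To make the tradeoff concrete, here's a toy illustration (not our actual code, and the endpoint name is made up): with the tag list already in memory, autocomplete is a synchronous filter on every keystroke; move it to IndexedDB or the server and every keystroke turns into an asynchronous round trip before any suggestion can be shown.

```ts
// In memory: autocomplete is a cheap synchronous filter per keystroke.
const tagIndex: string[] = ["#errand", "#email", "#exercise"]; // built once at load

function completeTag(prefix: string): string[] {
  return tagIndex.filter((tag) => tag.startsWith(prefix));
}

// Not in memory: each keystroke waits on I/O or the network before the
// suggestion list can render. The endpoint below is hypothetical.
async function completeTagRemote(prefix: string): Promise<string[]> {
  const res = await fetch(`/api/tags?prefix=${encodeURIComponent(prefix)}`);
  return res.json();
}
```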

@Erica, is it only the current document that's stored in memory, or all documents?

If all documents are kept in memory, then I should probably start archiving and deleting my old ones. 🙂

Yes, all documents are stored in memory. The current document has extra stuff like the visual elements you see and other state, so if documents A and B have the same amount of data, A will take more space than B while it's open. Therefore, a large list is a bigger issue if you frequently access it.

It really depends on how much RAM you have and how much Dynalist is taking for you. If it's taking too much and you have things you can archive into OPML format, definitely do that.

Thanks for the info. I don't know enough about the technical architecture to recommend something, but I could really use a solution beyond creating separate, smaller documents. If you must keep stuff in memory, can the app be smarter about which items it keeps there? One of my large documents won't even load in the iOS app because it exceeds the memory limit for apps and just crashes or tries to reload constantly; it is unusable in that case. There has to be a way to move some functionality server-side (at the expense of latency) for larger documents/pro accounts.


If your to-do list has crippled a modern computing device's memory, how are you as a human not overwhelmed by it? lol

But there's no way 100,000 lines of text should cripple a modern computing device! It's a clear sign of an architectural error.


Depends. Text editors like UltraEdit have specific architecture to deal with large files like that. A simple program like Notepad can choke on it.

Likewise, an Excel spreadsheet with 100,000 rows can keep your modern CPU and 8+ GB of RAM quite occupied, often with calculations no longer being instant.

In the case of a service like Dynalist in the browser, you also have to keep in mind that the "program" itself, the "thing" that is Dynalist and does what Dynalist does, is JavaScript downloaded to the browser, then parsed and executed by the browser, which puts an additional load on that computer with a 100,000-line document.

@erica is everything really kept in memory or do you write to local storage?

My impression is yes, for all the features that need global auto-completion ("Move to" and inserting internal links). Everything comes at a price: if they were not kept in memory, you'd probably notice a lag when you start searching for something globally.

@Shida: correct me if I'm wrong.

That's correct. If they're not loaded into memory, global search (which includes internal links, move to, and go anywhere) would not work at all, since we wouldn't be able to load the data quickly enough, especially over the internet.

I would be concerned that Dynalist has fundamental limits on scalability. Constantly deleting information to manage memory once you hit the threshold is a problem; it defeats the purpose of having a single trusted system that can archive indefinitely. This is causing me to rethink my use of the product over the long term, which is a pity. I understand it is a tough tradeoff, but users having to archive manually is, to me, a show stopper. Some segregation of the search space is probably needed, even if global tags then require a server-side search.


@Shida What do you think?

I've split up my Dynalist into multiple separate documents to try to alleviate the issue, but the iPad app still (apparently) runs out of memory, as it refreshes itself every ~5 seconds or so. The performance is OK in a desktop browser with lots of memory, but it's basically unusable on an iOS device. Some more testing with large documents may provide ideas for further optimizations.

Thank you again for a terrific product.

We were considering the ability to 'archive' documents, which would unload them completely so they are not kept in memory. The effects of archiving a document would be:

  • Not globally searchable
  • Won't appear in [[ links
  • Won't appear in move anywhere
  • Won't be indexed for go to any item
  • Tags won't count towards the tag pane
  • Won't instant-load when you open it, and won't load on the web app if there's no internet
  • Won't be synced until you open it (it must be loaded into memory to sync)

I think for archive purposes, these limitations are fine. For the last one, we might be able to work around it by loading the document into memory in the background when there are changes to be synced, then unloading it again.
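Roughly, that background sync could look like this (just a sketch to show the idea; the function names are placeholders, not our real internals):

```ts
// Sketch of "briefly load an archived document, sync it, unload it".
// Placeholder stubs so the example is self-contained; the real app would
// call its own loader, sync engine, and cache here instead.

interface ArchivedDoc {
  id: string;
  hasPendingChanges: boolean;
}

async function loadIntoMemory(docId: string): Promise<void> { /* fetch + parse */ }
async function syncDocument(docId: string): Promise<void> { /* push/pull edits */ }
function unloadFromMemory(docId: string): void { /* drop the parsed tree */ }

// When the server reports pending changes for an archived document,
// load it briefly, sync, and drop it so it never stays resident.
async function syncArchived(doc: ArchivedDoc): Promise<void> {
  if (!doc.hasPendingChanges) return;
  await loadIntoMemory(doc.id);
  try {
    await syncDocument(doc.id);
  } finally {
    unloadFromMemory(doc.id); // keep the memory footprint flat afterwards
  }
}
```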

@Erica do you think this can work?


I think that would work and I sure hope it does @Shida.

The biggest problem, in my opinion, is that iOS has pretty strict limitations on how much memory an app can use, whereas on desktop you can use pretty much as much as you want. If I understand it correctly, even if there's plenty of memory left on an iOS device, the system will still only allow the app to use that fixed amount.