Can browsers enforce any sort of limit on the amount of data that can be stored in JavaScript objects? If so, is there any way to detect that limit?
It appears that by default, Firefox does not:
var data; $("document").ready(function() { data = []; for(var i = 0; i < 100000000000; i++) { data.push(Math.random()); } });
That continues to consume more and more memory until my system runs out.
Since we can’t detect available memory, is there any other way to tell we are getting close to that limit?
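There is no standard, cross-browser API for this, but Chrome exposes coarse heap statistics through the non-standard `performance.memory` object; a feature-detected sketch (the 80% threshold is an arbitrary assumption):

```js
// Non-standard and Chrome-only: performance.memory reports coarse heap sizes.
// Always feature-detect; other browsers expose nothing comparable.
function getHeapUsage() {
  if (window.performance && performance.memory) {
    return {
      used: performance.memory.usedJSHeapSize,   // bytes used by JS objects
      total: performance.memory.totalJSHeapSize, // bytes currently allocated
      limit: performance.memory.jsHeapSizeLimit  // cap the browser enforces
    };
  }
  return null; // unknown: fall back to a conservative fixed budget
}

var usage = getHeapUsage();
if (usage && usage.used / usage.limit > 0.8) {
  // Within ~80% of the limit: stop prefetching and start evicting data.
}
```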
Update
The application I’m developing relies on very fast response times to be usable (it’s the core selling point). Unfortunately, it also has a very large data set (more than will fit into memory on weaker client machines). Performance can be greatly improved by preemptively loading data strategically (guessing what will be clicked). The fallback to loading the data from the server works when the guesses are incorrect, but the server round trip isn’t ideal. Making use of every bit of memory I can makes the application as performant as possible.
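As a rough illustration of that prefetch-with-fallback pattern, the cache can be capped and the least recently used entries evicted; a minimal sketch (`fetchFromServer` and the entry cap are hypothetical stand-ins):

```js
var MAX_ENTRIES = 10000; // tune this to the memory budget
var cache = new Map();   // a Map iterates in insertion order: oldest first

function remember(key, value) {
  if (cache.has(key)) cache.delete(key);     // re-insert to mark as fresh
  cache.set(key, value);
  if (cache.size > MAX_ENTRIES) {
    cache.delete(cache.keys().next().value); // evict least recently used
  }
}

function getData(key) {
  if (cache.has(key)) {
    var hit = cache.get(key);
    remember(key, hit);          // refresh recency
    return Promise.resolve(hit); // fast path: the guess was right
  }
  return fetchFromServer(key).then(function (value) { // fallback round trip
    remember(key, value);
    return value;
  });
}
```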
Right now I allow the user to “configure” their performance settings (a maximum-data setting), but users don’t want to manage that. Also, since it’s a web application, that setting has to be managed per device (a powerful desktop has a lot more memory than an old iPhone). It’s better if the app simply uses optimal settings for whatever is available on each system, but guessing too high can cause problems on the client machine too.
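One way to avoid per-device configuration is to derive a default budget from a device hint; a sketch using `navigator.deviceMemory` (non-standard, Chromium-only; the budget tiers are arbitrary assumptions):

```js
// navigator.deviceMemory reports approximate RAM in GiB (0.25–8) in
// Chromium-based browsers only; treat its absence as a weak device.
function defaultCacheBudgetMB() {
  var gb = navigator.deviceMemory || 1;
  if (gb >= 8) return 100; // roomy desktop
  if (gb >= 4) return 50;
  if (gb >= 2) return 20;  // typical phone
  return 10;               // old or low-end hardware
}
```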
Answer
While it might be possible on some browsers, the right approach is to decide what limit is acceptable for the typical customer and, optionally, provide a UI that lets users define their own limit.
Most heavy web apps get away with about 10MB of JavaScript heap. There does not seem to be an official guideline, but I would imagine consuming more than 100MB on desktop and 20MB on mobile is not really nice. For everything beyond that, look into client-side storage, e.g. the FileSystem API (and you can totally make it PERSISTENT).
UPDATE
The reasoning behind this answer is the following. It is next to never that a user runs only one application, much less a browser with only one tab open. Ultimately, consuming all available memory is never a good option, hence determining the exact upper boundary is not necessary.
How much memory a user is willing to allocate to a web app is guesswork. E.g., a highly interactive data-analytics tool is quite possible in JS and might need millions of data points. One option is to default to a lower resolution (say, daily instead of per-second measurements) or a smaller window (one day vs. a decade of seconds). But as the user keeps exploring the data set, more and more data will be needed, potentially crippling the underlying OS on the client side.
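Defaulting to a lower resolution can be as simple as bucketing the raw series before keeping it in memory; a sketch (`perSecondValues` and the bucket size are assumed inputs):

```js
// Downsample a numeric series by averaging fixed-size buckets,
// e.g. 86400 per-second samples per bucket for daily resolution.
function downsample(points, bucketSize) {
  var out = [];
  for (var i = 0; i < points.length; i += bucketSize) {
    var n = Math.min(bucketSize, points.length - i);
    var sum = 0;
    for (var j = 0; j < n; j++) sum += points[i + j];
    out.push(sum / n); // one averaged value per bucket
  }
  return out;
}

var daily = downsample(perSecondValues, 86400); // ~86,400× fewer points in RAM
```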
A good solution is to go with some reasonable initial assumption. Let’s open some popular web applications and go to DevTools → Profiles → heap snapshots to take a look:
- FB: 18.2 MB
- GMail: 33 MB
- Google+: 53.4 MB
- YouTube: 54 MB
- Bing Maps: 55 MB
Note: these numbers include DOM nodes and JS Objects on the heap.
It seems, then, that people have come to accept around 50MB of RAM for a useful web site. (Update 2022: nowadays it averages closer to 100MB.) Once you have built your DOM tree, fill your data structures with test data and see how much is OK to keep in RAM.
Using similar measurements with device emulation turned on in Chrome, one can see what the same sites consume on tablets and phones, by the way.
This is how I arrived at the 100 MB on desktop and 20 MB on mobile numbers. They seemed reasonable too. Of course, for the occasional heavy user it would be nice to have an option to bump the max heap up to 2 GB.
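If you would rather measure from code than from heap snapshots, a rough per-record estimate is possible in Chrome via the non-standard `performance.memory` (`makeTestRecord` is a hypothetical generator for your own data shape):

```js
// Very coarse estimate of heap bytes per record; GC activity between the two
// samples can skew the result, so treat it as an order of magnitude only.
function estimateBytesPerRecord(count) {
  if (!(window.performance && performance.memory)) return null; // Chrome only
  var before = performance.memory.usedJSHeapSize;
  var sample = [];
  for (var i = 0; i < count; i++) sample.push(makeTestRecord());
  var after = performance.memory.usedJSHeapSize;
  return (after - before) / count;
}

// e.g. maxEntries = budgetBytes / estimateBytesPerRecord(100000)
```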
Now, what do you do if pumping all this data from the server every time is too costly?
One option is the Application Cache. It does create mild version-management headaches, but it allows you to store around 5 MB of data. Rather than storing data in it, though, it is more useful for keeping app code and resources.
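Those version-management headaches come from AppCache updating in the background: the running page keeps using the old cache until the next load unless you react to the `updateready` event yourself, roughly like this:

```js
// AppCache fetches manifest updates in the background; swap in the new cache
// and offer a reload so the user is not stuck on stale code until next visit.
var appCache = window.applicationCache;
if (appCache) {
  appCache.addEventListener('updateready', function () {
    if (appCache.status === appCache.UPDATEREADY) {
      appCache.swapCache(); // switch to the freshly downloaded cache
      if (confirm('A new version of this app is available. Reload now?')) {
        location.reload();
      }
    }
  }, false);
}
```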
Beyond that we have three choices:
- SQLite – support was limited and it seems to be abandoned
- IndexedDB – a better option, but support is not universal yet (see caniuse.com)
- FileSystem API
Of them, the FileSystem API is the most supported and can use a sizable chunk of storage.
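Making it PERSISTENT means asking the user for a quota first; a sketch using the prefixed, Chrome-only API (the 50 MB figure is an arbitrary assumption):

```js
// Chrome-only, prefixed API: request a PERSISTENT quota, then open the
// sandboxed filesystem with whatever the browser actually granted.
var requestedBytes = 50 * 1024 * 1024; // 50 MB

navigator.webkitPersistentStorage.requestQuota(requestedBytes,
  function (grantedBytes) {
    window.webkitRequestFileSystem(window.PERSISTENT, grantedBytes,
      function (fs) {
        // fs.root is a DirectoryEntry; create and read files under it
        console.log('Persistent filesystem of', grantedBytes, 'bytes ready');
      },
      function (err) { console.error('requestFileSystem failed:', err); });
  },
  function (err) { console.error('Quota request denied:', err); });
```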