David Bruant (2013-07-27T15:52:09.000Z)
Hi,

There seems to be a consensus on bringing in WeakRefs in the language. I 
came around to the idea myself as some use cases seem to require it (as 
in: some use cases can't be achieved even with a ".dispose" convention 
like distributed acyclic garbage collection). However, I worry.

Recently, some memory leaks were found in Gaia (the Firefox OS front-end). 
The investigation led to a subtle condition involving nested function 
scopes under which references to some variables are kept alive 
unnecessarily [1].
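To illustrate the kind of leak in question, here is a hedged sketch (the names and sizes are illustrative, not taken from the actual Gaia code): some engines give all closures created in a scope one shared environment, so a variable used by only one inner function can stay alive as long as *any* closure from that scope is reachable.

```javascript
// Illustrative sketch of the closure-environment leak described in [1].
// `bigData` is only used by `log`, but an engine that gives all closures
// of `setup` a single shared environment may keep `bigData` alive for as
// long as `keepHandle` is reachable, even though `keepHandle` never
// touches it.
function setup() {
  var bigData = new Array(1e6).fill(0); // large allocation
  function log() {
    return bigData.length; // only this closure actually uses bigData
  }
  function keepHandle() {
    return "handle"; // never references bigData
  }
  log(); // used once, then dropped
  return keepHandle; // the escaping closure can pin bigData in some engines
}

var handle = setup();
// In principle `bigData` is now unreachable; whether it is actually
// collected depends on how the engine represents closure environments.
```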

One of these days the SpiderMonkey folks will certainly fix this bug, and 
that will create two worlds: a "before" world where some objects *never* 
get collected because of the bug, and an "after" world where those same 
objects do get collected.
If all current major JS engines share common limitations ([1] or any 
other), I worry that code using WeakRefs could implicitly (and 
mistakenly!) rely on these common limitations and break when one of them 
is fixed. We know the rest of the story; it involves browser competition 
and "don't break the web", and this time it would mean the spec requiring 
some missed optimization opportunities. Phrased differently, we could end 
up with memory leaks imposed by the spec...

I understand the necessity of WeakRefs for some use cases, but I worry.

David

[1] https://bugzilla.mozilla.org/show_bug.cgi?id=894971#c0