K. Gadd (2013-07-27T16:27:00.000Z)
If memory serves,
http://point.davidglasser.net/2013/06/27/surprising-javascript-memory-leak.html
was also complaining about a similar closure/scope leak in V8, where locals
that you wouldn't expect to be retained are retained by closures in some
cases.

Arguably those cases just need to be fixed. It's definitely non-obvious that
locals which are never reachable from a closure can still be retained by it,
and these cases are difficult to even identify or debug with current
debugging tools. JSIL has had huge memory leak issues caused by this in a
few cases. Of course, I don't know how difficult it actually is to fix this.
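
Roughly, the pattern looks like this (a minimal sketch; the names and sizes
are made up for illustration, not taken from the blog post):

    // `bigBuffer` is never referenced by the returned closure, but because
    // another closure in the same scope mentions it, an engine that gives
    // all closures in a scope one shared environment keeps it alive for as
    // long as `keepAlive` is reachable.
    function outer() {
      var bigBuffer = new Array(1000000).join('*'); // large local

      function unused() {
        // only this (never-called) function touches bigBuffer, which is
        // enough to pull it into the shared scope object
        return bigBuffer.length;
      }

      return function keepAlive() {
        return 42; // never touches bigBuffer, yet retains it anyway
      };
    }

    var retained = outer(); // bigBuffer can't be collected while this lives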

I agree that WeakRefs going in with these sorts of leaks remaining would be
a big problem, but it would be worse for developers if these leaks just
stuck around and kept WRs from ever getting into the language. Better to
put them in and have them serve as a strong motivator for fixing those
leaks for good, IMO.
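
To make the interaction concrete, here's a sketch against a hypothetical
makeWeakRef(obj) API with a get() method that returns the target or
undefined once it has been collected (the real API surface isn't settled;
this is purely illustrative):

    function watch(resource) {
      // drags `resource` into the shared scope, triggering the leak above
      var unused = function () { return resource; };
      var ref = makeWeakRef(resource);
      return function poll() {
        // Expected: eventually undefined once `resource` is otherwise
        // unreachable. With the scope leak, `poll` itself retains
        // `resource`, so get() never returns undefined -- and any code
        // that came to rely on that would break the day an engine fixes
        // the leak. That's exactly why shipping WeakRefs on top of these
        // leaks would be painful, and why they'd pressure engines to fix
        // them.
        return ref.get();
      };
    }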


On Sat, Jul 27, 2013 at 8:52 AM, David Bruant <bruant.d at gmail.com> wrote:

> Hi,
>
> There seems to be a consensus on bringing WeakRefs into the language. I
> came around to the idea myself, as some use cases seem to require them (as
> in: some use cases, such as distributed acyclic garbage collection, can't
> be achieved even with a ".dispose" convention). However, I worry.
>
> Recently, some memory leaks were found in Gaia (the Firefox OS front-end).
> The investigation led to a subtle nested-function-scope condition under
> which references to some variables are kept alive unnecessarily [1].
>
> One of these days the SpiderMonkey folks will certainly fix this bug, and
> that will create two worlds: a "before" world where some objects *never*
> get collected because of the bug, and an "after" world where some of the
> same objects do get collected.
> If all current major JS engines have common limitations ([1] or any
> other), I worry that code using WeakRefs could implicitly (and mistakenly!)
> rely on these common limitations and break if one of them is fixed. We
> know the rest of the story; it involves browser competition and "don't
> break the web", and this time it would mean the spec effectively requires
> engines to forgo those optimizations. Phrased differently, we could end up
> with memory leaks imposed by the spec...
>
> I understand the necessity of WeakRefs for some use cases, but I worry.
>
> David
>
> [1] https://bugzilla.mozilla.org/show_bug.cgi?id=894971#c0