TC39 bashing

# Axel Rauschmayer (12 years ago)

I’m seeing quite a bit of anti-TC39 sentiment out there and I don’t think it’s fair. Some examples (paraphrasing):

  • “TC39 doesn’t care about web developers and/or doesn’t understand web development.”
  • “TC39 ignores what the people want and designs ‘by committee’.”
  • “TC39 is moving too slowly, does too little.”

The following are counter-points to those opinions:

  • If you want to do good design, it is impossible to please everybody. Design by popular vote is worse than design by committee. That’s why we have representative democracies.

  • Evolving the language while remaining backward compatible is a hard problem. I like what TC39 has done so far. The main goal must be to have as clean a language as possible in the future. Doing so while being backward compatible means that the transition can be a little messy (several similar constructs existing in parallel etc.), but that is unavoidable. One doesn’t need to understand all the gory details as long as things are simple in practice.

  • TC39 has a lot of responsibility and must keep many parties happy. The payoff, however, is huge: I’m not aware of any other programming language that is as open and has as many different, yet highly compatible, implementations. Hence, moving at a deliberate pace is a good thing. Compare with how much progress Java has made over the years (in an environment that is much simpler than JavaScript’s). In that light ES.next’s progress looks quite good.

  • I find es-discuss quite open and appreciate it as a resource. I see TC39 members expend a lot of energy and patience in answering as many questions as possible. Every now and then a question won’t be answered. But that is understandable, as es-discuss is not a support hotline that has to cover 100% of the questions.

What could be improved:

  • Make it easier to search the mailing list archives. Might be a minor thing, but it would really help. I wonder how Brendan always finds those old threads that are relevant to a particular topic.

  • Possibly add an FAQ. This could be as simple as collecting all emails that have long-term explanatory value.

  • I like the idea of having a forum where people can suggest ideas and everyone can vote on them. One would need both up-votes and down-votes, as there is bound to be a lot of troll material. Such a forum can only ever have an advisory role. But it gives developers the opportunity to vent their feelings and it gives TC39 popular feedback (including ideas that might not have come up before). By bundling requests, traffic is reduced.

  • Some complaints are about evolving the standard library (including collection types). I’ve seen Brendan hint at a strategy for doing so, but I’d love to read more about it.

Axel

# Mikeal Rogers (12 years ago)

What is the goal of this?

If the goal is to get people to stop complaining, don't bother; people will always complain. So long as there is a TC-39 there will be people who strive to be armchair language designers and rail against any actual work.

The core problem is that people who work nearly full time on designing a language are necessarily out of touch with people using it, and the people using it are ill equipped to balance the priorities of all the parties involved in designing it.

I think a better strategy is for TC-39 to state definitively what it is not currently working on or what is of a very low priority. This would allow the community of people using JavaScript to tackle those problems more directly rather than just waiting. At some point in the future TC-39 can adopt or ratify behavior that has proved itself in the community. I know this process is alluded to often, but I don't think you understand how much momentum gets sucked out of the community when they are under the impression that new behavior will be handed down from TC-39 and that their work may fall into conflict or out of date.

The recent discussion about Object.isObject is a great example. If this isn't happening, please state so definitively so that we can rally around existing work (underscore) or build something new.

To be honest, creating better ways for developers to get directly involved in this process is a bad idea. You'll either be bombarded with opinions that haven't been well thought through or you'll gain a crowd of enthusiastic people who stick around long enough to forget their old priorities and come to the same compromises you already come to. It might be beneficial to invite a few people from the developer community to meetings and to rotate them out so that no one becomes truly entrenched in the process. Let's not delude ourselves into thinking that the only barrier to being a part of the current process is technical and can be solved with indexing or a bulletin board.

# John J Barton (12 years ago)

+1 to all this

# Angus Croll (12 years ago)

You have some valid points, Axel. Maybe the biggest problem is one of communication - JSFixed has received a lot of attention and many followers in just a few days - which suggests developers want more access to the process by which the language is developed. To that end I hope we are providing a service. Incidentally, the advisory forum with voting that you suggest is exactly what JSFixed is now providing.

Apologies if the JSFixed project has spawned or even encouraged anti-TC39 sentiment. Anton, Kit and I committed to keeping things civil and respectful on our side. In any case, it’s my hope that, as developer advocates, we can offer useful input to the TC39 process.

Angus

# Axel Rauschmayer (12 years ago)

What is the goal of this?

I wanted to give positive feedback on TC39’s work (I didn’t want negative feedback to be the only kind) and to mention some ideas for improvement.

To be honest, creating better ways for developers to get directly involved in this process is a bad idea. [...] Let's not delude ourselves into thinking that the only barrier to being a part of the current process is technical and can be solved with indexing or a bulletin board.

I agree that involving developers directly is a bad idea (that’s what I meant when I said that upsetting people is unavoidable and that design by popular vote is not a good approach). Most of the time, a forum would indicate what needs to be explained better – features that people misunderstand. It would also allow people to disagree with something without having to go to es-discuss and send an email.

+1 to the following idea:

It might be beneficial to invite a few people from the developer community to meetings and to rotate them out so that no one becomes truly entrenched in the process.

# Brendan Eich (12 years ago)

I have done this since 2006, inviting Alex Russell when he worked for Sitepen. I actually ran afoul of Ecma "invited expert" restrictions, but who cares? Alex now works for Google on Chrome and represents at TC39 meetings. He showed C++ (WebKit) code at TXJS last year :-P.

IIRC Yehuda Katz may be joining us on TC39 but I haven't heard confirmation.

There's a moral here somewhere. Or a punch-line. Or just a punch.

# David Bruant (12 years ago)

On 10/05/2012 04:44, Mikeal Rogers wrote:

The core problem is that people who work nearly full time on designing a language are necessarily out of touch with people using it, and the people using it are ill equipped to balance the priorities of all the parties involved in designing it.

I understand your point, but I'm afraid I disagree with your vision, which I think is too simplistic. Dave Herman's task.js is a library that has been suggested on the JSFixed thread about asynchronicity [1]. Dave Herman is part of TC39 and what he did seems to resonate well with the developer community. It seems like you're pitting the people who design the English language against the people who use it. The world is not that dichotomous. People who used ECMAScript 3 felt the lack of .bind. People who design the language added it in ECMAScript 5.

I think a better strategy is for TC-39 to state definitively what it is not currently working on or what is of a very low priority. This would allow the community of people using JavaScript to tackle those problems more directly rather than just waiting. At some point in the future TC-39 can adopt or ratify behavior that has proved itself in the community. I know this process is alluded to often, but I don't think you understand how much momentum gets sucked out of the community when they are under the impression that new behavior will be handed down from TC-39 and that their work may fall into conflict or out of date.

The recent discussion about Object.isObject is a great example. If this isn't happening, please state so definitively so that we can rally around existing work (underscore) or build something new.

This point is interesting. Among the changes that TC39 has to make to the language, I see 2 categories. One is adding new language capabilities (WeakMap, proxies, lexical |this| functions, binary types, proto operator, etc.); the other is new built-in functions (for Array, Math, Object, etc.). Maybe the latter category could be "(partially) delegated" to the developer community and ratified by TC39. I foresee that with modules, a new field for this second category is opening, and there will be some important discussions about which modules should be in the standard and which shouldn't. Maybe the responsibility for this part could also be shared with the developer community, since they are the ones who usually write these functions (out of need).

It might be beneficial to invite a few people from the developer community to meetings and to rotate them out so that no one becomes truly entrenched in the process.

Or the developer community could decide to gather by itself and then share feedback on es-discuss.

David

[1] JSFixed/JSFixed#1

# David Bruant (12 years ago)

On 10/05/2012 01:46, Axel Rauschmayer wrote:

I’m seeing quite a bit of anti-TC39 sentiment out there and I don’t think it’s fair. Some examples (paraphrasing):

  • “TC39 doesn’t care about web developers and/or doesn’t understand web development.”
  • “TC39 ignores what the people want and designs ‘by committee’.”
  • “TC39 is moving too slowly, does too little.”

The following are counter-points to those opinions:

  • If you want to do good design, it is impossible to please everybody. Design by popular vote is worse than design by committee. That’s why we have representative democracies.

nit: I don't think the metaphor is a good one. In Athens, people weren't elected by vote but randomly chosen. It's debatable, but to some extent Athenian democracy could be considered better than any of our current democracies (assuming they can be called that [1] [2]).

  • Evolving the language while remaining backward compatible is a hard problem. I like what TC39 has done so far. The main goal must be to have as clean a language as possible in the future. Doing so while being backward compatible means that the transition can be a little messy (several similar constructs existing in parallel etc.), but that is unavoidable. One doesn’t need to understand all the gory details as long as things are simple in practice.

  • TC39 has a lot of responsibility and must keep many parties happy. The payoff, however, is huge: I’m not aware of any other programming language that is as open and has as many different, yet highly compatible, implementations. Hence, moving at a deliberate pace is a good thing. Compare with how much progress Java has made over the years (in an environment that is much simpler than JavaScript’s). In that light ES.next’s progress looks quite good.

  • I find es-discuss quite open and appreciate it as a resource. I see TC39 members expend a lot of energy and patience in answering as many questions as possible. Every now and then a question won’t be answered. But that is understandable, as es-discuss is not a support hotline that has to cover 100% of the questions.

What could be improved:

  • Make it easier to search the mailing list archives. Might be a minor thing, but it would really help. I wonder how Brendan always finds those old threads that are relevant to a particular topic.

  • Possibly add an FAQ. This could be as simple as collecting all emails that have long-term explanatory value.

+1. I wish to see es-discuss discussions better documented on the wiki (or maybe the bug tracker?), but I do acknowledge that it's a lot of work and that TC39 members (who are the only ones with access to the wiki) don't necessarily have the time to do it.

Also, the proposal page [3] plays this role a bit. By the way, it could be worth adding "backward compatibility" as a 0th goal, as no proposal is worth considering if it breaks something on the web.

David

[1] youtu.be/FCQHlmVN8qQ [2] (French) www.tedxrepubliquesquare.com/etienne-chouard [3] harmony:harmony

# Allen Wirfs-Brock (12 years ago)

On May 9, 2012, at 7:44 PM, Mikeal Rogers wrote:

... I think a better strategy is for TC-39 to state definitively what it is not currently working on or what is of a very low priority. This would allow the community of people using JavaScript to tackle those problems more directly rather than just waiting. At some point in the future TC-39 can adopt or ratify behavior that has proved itself in the community. I know this process is alluded to often, but I don't think you understand how much momentum gets sucked out of the community when they are under the impression that new behavior will be handed down from TC-39 and that their work may fall into conflict or out of date.

The recent discussion about Object.isObject is a great example. If this isn't happening, please state so definitively so that we can rally around existing work (underscore) or build something new.

I actually don't see why, for functionality like this, you care so much about what TC39 is doing. If you need something right now that is implementable using the current language, just build it, either in your individual apps or in libraries that you promote. If the functionality doesn't require new language syntax or semantics, you don't need us. For such functionality, the most powerful input into the TC39 process is wide adoption across many applications and libraries. After proving that level of utility, a feature is ripe for standardization in order to ensure universal availability and common semantics.
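
To make Allen's point concrete, here is a minimal sketch (not TC39's spec text; the name isObject and its exact semantics are only illustrative) of the kind of helper a library can ship today without waiting for the committee. It is roughly what underscore already does:

```js
// Hypothetical userland helper along the lines Allen describes. The name and
// exact semantics are illustrative, not the proposed Object.isObject.
function isObject(value) {
  // typeof null === 'object' is the classic trap, so exclude null explicitly.
  return value !== null &&
    (typeof value === 'object' || typeof value === 'function');
}

isObject({});             // true
isObject(function () {}); // true
isObject(null);           // false
isObject('str');          // false
```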

WRT Object.isObject, you've seen the debate. It's in the current ES6 draft. It may or may not stay there. It could die at the next meeting or anytime before final publication, hopefully in Dec. 2013. The same applies to any feature. Even if you could get an irrevocable decision today (and you can't), what meaningful difference does it make to you as a developer shipping software in 2012?

# Mikeal Rogers (12 years ago)

On May 10, 2012, at 8:55 AM, Allen Wirfs-Brock wrote:

On May 9, 2012, at 7:44 PM, Mikeal Rogers wrote:

... I think a better strategy is for TC-39 to state definitively what it is not currently working on or what is of a very low priority. This would allow the community of people using JavaScript to tackle those problems more directly rather than just waiting. At some point in the future TC-39 can adopt or ratify behavior that has proved itself in the community. I know this process is alluded to often, but I don't think you understand how much momentum gets sucked out of the community when they are under the impression that new behavior will be handed down from TC-39 and that their work may fall into conflict or out of date.

The recent discussion about Object.isObject is a great example. If this isn't happening, please state so definitively so that we can rally around existing work (underscore) or build something new.

I actually don't see why, for functionality like this, you care so much about what TC39 is doing. If you need something right now that is implementable using the current language, just build it, either in your individual apps or in libraries that you promote. If the functionality doesn't require new language syntax or semantics, you don't need us. For such functionality, the most powerful input into the TC39 process is wide adoption across many applications and libraries. After proving that level of utility, a feature is ripe for standardization in order to ensure universal availability and common semantics.

WRT Object.isObject, you've seen the debate. It's in the current ES6 draft. It may or may not stay there. It could die at the next meeting or anytime before final publication, hopefully in Dec. 2013. The same applies to any feature. Even if you could get an irrevocable decision today (and you can't), what meaningful difference does it make to you as a developer shipping software in 2012?

Let me clarify.

People obviously have solutions for this in pure js. What isn't happening is any sort of movement around a common API or implementation while TC-39 is considering it.

I'm not trying to say that TC-39 should stop its work on this, or any other, API. I'm just trying to make it clear that there would be a positive effect in the community if you were to publicly state work you are no longer undertaking or considering for the next version.

# Mikeal Rogers (12 years ago)

On May 10, 2012, at 1:41 AM, David Bruant wrote:

On 10/05/2012 04:44, Mikeal Rogers wrote:

The core problem is that people who work nearly full time on designing a language are necessarily out of touch with people using it, and the people using it are ill equipped to balance the priorities of all the parties involved in designing it.

I understand your point, but I'm afraid I disagree with your vision, which I think is too simplistic. Dave Herman's task.js is a library that has been suggested on the JSFixed thread about asynchronicity [1]. Dave Herman is part of TC39 and what he did seems to resonate well with the developer community. It seems like you're pitting the people who design the English language against the people who use it. The world is not that dichotomous. People who used ECMAScript 3 felt the lack of .bind. People who design the language added it in ECMAScript 5.

I think a better strategy is for TC-39 to state definitively what it is not currently working on or what is of a very low priority. This would allow the community of people using JavaScript to tackle those problems more directly rather than just waiting. At some point in the future TC-39 can adopt or ratify behavior that has proved itself in the community. I know this process is alluded to often, but I don't think you understand how much momentum gets sucked out of the community when they are under the impression that new behavior will be handed down from TC-39 and that their work may fall into conflict or out of date.

The recent discussion about Object.isObject is a great example. If this isn't happening, please state so definitively so that we can rally around existing work (underscore) or build something new.

This point is interesting. Among the changes that TC39 has to make to the language, I see 2 categories. One is adding new language capabilities (WeakMap, proxies, lexical |this| functions, binary types, proto operator, etc.); the other is new built-in functions (for Array, Math, Object, etc.). Maybe the latter category could be "(partially) delegated" to the developer community and ratified by TC39. I foresee that with modules, a new field for this second category is opening, and there will be some important discussions about which modules should be in the standard and which shouldn't. Maybe the responsibility for this part could also be shared with the developer community, since they are the ones who usually write these functions (out of need).

There is a difference to TC-39 but not much of a difference to the community. The web developer motto is "suck it up": if something is broken we write a workaround or monkey patch it; if something is impossible to do one way we find another way.

People in the community don't ask for "WeakMap" until there is a proposal. Instead they come up with other crazy ways to accomplish their end goals without needing such a type, to varying degrees of success (we're still finding leaks in FreeList in node.js http.js that might have been prevented had we had WeakMap a few years ago).
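
As a rough illustration of the kind of workaround Mikeal means (a generic sketch, not node's actual FreeList code), compare tracking per-object metadata in a plain array with doing the same thing in an ES6 WeakMap:

```js
// Pre-WeakMap workaround: a parallel structure holds strong references,
// so entries leak unless someone remembers to clean them up.
var tracked = [];
function tagLegacy(obj, meta) {
  tracked.push({ obj: obj, meta: meta }); // pins obj and meta forever
}

// With a real WeakMap, the entry is collectable once obj is unreachable.
var metadata = new WeakMap();
function tag(obj, meta) {
  metadata.set(obj, meta); // does not keep obj alive
}
```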

It might be beneficial to invite a few people from the developer community to meetings and to rotate them out so that no one becomes truly entrenched in the process.

Or the developer community could decide to gather by itself and then share feedback on es-discuss.

That's hard. JavaScript is a diverse community, which also means that it is fragmented. I think the node.js community has a good core group of leadership with somewhat consistent ideas about what the community wants from the language. There are leaders in the web developer community that I believe could speak well to their supporters (Jeremy Ashkenas, Thomas Fuchs), but there isn't a place where the js community really comes together outside of JSConf, and the range of ideas is as diverse as the community.

Maybe it would be a good idea to carve out some time at developer events that bring these groups together to discuss what people want out of the language and invite a leader from that group to a TC-39 meeting some time after. I can carve out some time at NodeConf and I can talk to the BackboneConf guys about doing the same.

# Anton Kovalyov (12 years ago)

Since David Bruant mentioned JSFixed here let me just say that our intent is not to fight with TC39 (that'd be silly) but to cooperate with it. I try hard to keep people from pitchforking; instead I want them to share their pain points.

I personally[1] think that TC39 does a pretty good job, but there are big problems with the feedback loop. Half of the issues I see on JSFixed have already been harmonized[2], but people are not aware of it. The wiki is too confusing and searching mailing list archives is not the best way to spend your afternoon. Somehow I know more about what's coming to Python even though, nowadays, I spend less than 5% of my time actually using Python.

Also, how does one share that their life sucks because of some particular quirk in JavaScript? TC39 members always tell us that they need this kind of feedback but there's no meaningful way to submit it. For example, in the thread about typeof I really wanted to +1 Crockford's comment and say: "Yes! I trip on typeof null all the time". Now, I'm not one to get easily intimidated by authoritative figures, but I ended up not sending anything because I don't have a solution—only a problem shared with other devs. Wouldn't it be nice if I could +1/star that thread and maybe add a paragraph describing how this issue makes my life harder, without spamming the inboxes of all es-discuss subscribers?[3]
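
For readers outside the thread, the typeof quirk Anton is referring to is simply:

```js
typeof null;           // 'object'  (a historical accident kept for web compatibility)
typeof undefined;      // 'undefined'
typeof function () {}; // 'function'
```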

In addition to that, it'd be really nice to have a down-to-earth explanation of harmonized changes, with code samples that resemble real-world use cases. FWIW, Rick and David are doing a great job writing code samples, but not everyone follows them on Twitter/GitHub. And it's also really hard to go back and find those gists/repos.

As for the conferences, I think Brendan and others are already doing a good job speaking about what's coming in ES6 and what's in the works. I don't see how group discussions will help there—especially since our community likes smaller conferences. But I might be wrong.

Anton

[1] As in "not speaking on behalf of JSFixed". [2] Kudos to Rick Waldron and David Herman for educating people about that. [3] Implementation details: GitHub is actually pretty terrible for that, as I learned from JSFixed, because of all +1 comments. Google Code's star feature is much better.

# Brendan Eich (12 years ago)

David Bruant wrote:

the other is new built-in functions (for Array, Math, Object, etc.). Maybe the latter category could be "(partially) delegated" to the developer community and ratified by TC39.

I constantly endorse this, averring that TC39 will screw up any library design exercise of necessary scale, like NPM's most popular, or smaller gems such as underscore.

The way to get these right is github.com, not Ecma TC39!

I foresee that with modules, a new field for this second category is opening, and there will be some important discussions about which modules should be in the standard and which shouldn't. Maybe the responsibility for this part could also be shared with the developer community, since they are the ones who usually write these functions (out of need).

It's shared only in the sense that TC39 writes down de-jure specs that browser vendors and others treat as normative. We have to do right, and spec stuff people want. By spec'ing what people actually use, based mostly on the cream that we skim off of github, we avoid all the risks of trying to design API sets in too narrow a setting.

Your point about overlap, no dichotomies, is good too. We need cross-training among developers and TC39ers where feasible. TC39ers I know are all writing more JS over time. I'm trying to kick the C++ habit myself ;-).

# Brandon Benvie (12 years ago)

On a small tangent, it turns out you can provide a nearly perfect shim for WeakMap given ES5. That is: O(1) lookup time, keys hold references to values strongly and not vice versa, being a WeakMap key or value doesn't prevent an object from being GC'd, weakmaps themselves can be GC'd while objects/keys in them still exist, key->value mappings are non-forgeable and non-interceptable, and all of this is done [almost] non-observably. This gist by Gozala demonstrates the concept in action ( gist.github.com/1269991 ). I built on the technique he used, as well as some ideas from the SES WeakMap impl by Mark Miller ( code.google.com/p/es-lab/source/browse/trunk/src/ses/WeakMap.js ), and redid my original not-so-good es6 collection shim with one that is very close in functionality and interface to the real thing: Benvie/ES6-Harmony-Collections-Shim. Compatibility goes back to IE9 (relies on defineProperty and getOwnPropertyNames).
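
For readers who don't want to dig through the gist, here is a stripped-down sketch of the underlying idea: a hidden, per-map property on the key object guards the value behind a secret token. It is illustrative only, not Brandon's shim, and it omits the getOwnPropertyNames patching and other hardening described later in the thread.

```js
// Concept sketch only: the key holds the value (not vice versa), guarded so
// that only this particular map can read it back.
function makeWeakMapShim() {
  var name = '__wm' + Math.random().toString(36).slice(2); // per-map hidden name
  var token = {};                                          // unforgeable guard
  return {
    set: function (key, value) {
      Object.defineProperty(key, name, {
        value: function (t) { if (t === token) return value; },
        configurable: true,
        enumerable: false,
        writable: true
      });
    },
    get: function (key) {
      return typeof key[name] === 'function' ? key[name](token) : undefined;
    }
  };
}

var wm = makeWeakMapShim();
var key = {};
wm.set(key, 42);
wm.get(key); // 42
```

As the rest of the thread points out, this gives O(1) lookup and the right reference direction, but not the GC behavior of a real WeakMap.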

# Allen Wirfs-Brock (12 years ago)

Near perfect except it can't have the required GC behavior. It is impossible to emulate the required GC behavior outside of the garbage collector. That's why WeakMap exists.

# Brandon Benvie (12 years ago)

Where does it fail on the GC semantics? The only references created are key -> closure on key -> value. The spec says that for any key that is reachable, so is the value it maps to, with that being the only connection created. That is upheld here.

# Brandon Benvie (12 years ago)

Ahh yes I see. The circular reference case. Well, almost....

# Brendan Eich (12 years ago)

Any tangle involving keys as values of other entries whose keys point to values (that are keys, etc.).

# Allen Wirfs-Brock (12 years ago)

On May 10, 2012, at 7:24 PM, Brandon Benvie wrote:

Ahh yes I see. The circular reference case. Well, almost....

And that's not the only leak. You create a new valueOf method for each WeakDictionary a given object is used to key. These methods are threaded off of the key object, starting with the most recently created one, and each valueOf indirectly captures the value associated with that object key in the corresponding WeakDictionary. If there are no references to a WeakDictionary, it will get garbage collected. However, the corresponding valueOf method for each object that was used as a key for that WeakDictionary will stay behind. So will the value that was associated with that key in the dictionary.

This isn't a bad solution if you need a weak-keyed map where you know that each object is only going to key a single such map.
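
A hedged illustration of the leak shape Allen describes (simplified, not Gozala's exact code): once the shim hangs a closure off the key, the captured value lives as long as the key does, even after every reference to the map itself is gone.

```js
// What a valueOf-style shim effectively does on set(key, value):
var SECRET = {};
var key = {};
var value = { big: new Array(1000000) };

Object.defineProperty(key, 'valueOf', {
  value: function (token) { if (token === SECRET) return value; },
  configurable: true,
  writable: true
});

// Even if the WeakDictionary itself becomes unreachable and is collected, the
// patched valueOf (and therefore `value`) stays alive for as long as `key` does.
// A real WeakMap entry would be collectable once the map is unreachable.
```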

# Brandon Benvie (12 years ago)

Yeah, I modified Gozala's method a bit. The value isn't kept in valueOf, but rather in a null-proto object, which allows different WeakMaps not to step on one another.

Maybe my understanding was flawed, but I thought this wasn't an issue for these single-purpose objects/closures in terms of garbage collection. A set of objects (mostly null-proto objects that only ever have one or no references to other objects, and only one reference to them from the local system) will essentially act like one "unit", and ultimately the liveness of all of them maps back to the single key object that is the entry point for them. The only other object that ever gets a reference to any of them is the weakmap instance, which never keeps any references. It simply has all the keys to unlock the gates to get to the mapped value when requested.

Would the garbage collection for this be more complex than I imagine, or would it be pretty easy to link them in the way I supposed? If my understanding is right, the remaining issue is the circular-reference one, in terms of key -> value -> key -> value keeping each other alive. I had kind of supposed that this itself would fall under the same rules, though, because it's determinable whether that kind of set is reachable from the roots or not.

I'm also curious what significance there is, if any, in the fact that the actual organization of the data "map" isn't just weak by declaration/fiat, but is weak in the sense that the weakmap itself actually has no references to any data at all aside from one string key and one hash key that proves identity, and can only do its job by being given references to objects and seeing if the keys work. Perhaps this is immaterial, or I may just not understand the underlying reality of how the data is organized.

re: Mark, there are a few useful tweaks for performance as well as security. The first is for performance: the double-array [keys] and [values] structure outlined on the wiki and used in your implementation loses out by having to do an O(n) indexOf lookup. The structure I ended up using starts with the same idea of a single global HIDDEN_NAME entry point, but from there the next level works by assigning each WeakMap its own random name, so it's just another property lookup. It's probably helpful to show a condensed definition of what is happening.

weakmap.get(obj) turns into:

var unlocker = weakmap[globalUID][internalUID].value; obj[globalUID][unlockerUID].value

Going in reverse order, starting from .value at the end, here's what happens:

  • The 'value' is the sole property on a "lockbox" and is shaped as Object.preventExtensions(Object.create(null, { value: { value: undefined, writable: true } })). This is the slot for the value for a single weakmap.

  • The closure containing the lockbox is a "locker", created from new Function('h','l', '"use strict"; return function(k){ if (k === h) return l; }')(hashkey, lockbox). The locker should only have references to one or two other objects: the hashkey and potentially an object set as the value.

  • The "unlocker" creates the lockers using its private hashkey Object.create(null). An unlocker also has its own UID in order for it to address the lockers it makes.

  • The lockers, named by unlockerUID, are defined on the obj[globalUID] "perObjectStorage", which is also Object.create(null) and created on demand. In the SES implementation, this is the area where the [key] and [value] arrays live instead.

  • The weakmaps themselves are stored in the same manner, so the first step is unwrapping the weakmap's locker from the shell public interface. The "internalUID" would be the UID of the locker for unwrapping weakmaps.

  • "globalUID" is generated inside a closured and has a dedicated function responsible for retrieving existing or defining new perObjectStorages. This is also where Object.getOwnPropertyNames is shimmed.

getOwnPropertyNames is shimmed such that it first checks for hasOwn(obj, globalUID), so that in most cases there's no performance penalty added for looping through the property array to splice out things to hide. If the property is found, then a splice(indexOf) is done. This is the only function that is monkey patched. This is also the only place indexOf is used. All other lookups are property lookups or variable records in closures.
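
Here is an illustrative reconstruction of the lookup path the bullets above describe. It is not the shim's actual source; the names (globalUID, hashkey, lockbox, locker) follow the post, the extra indirection for unwrapping the weakmap shell and the getOwnPropertyNames patch are omitted, and the details are assumptions.

```js
// Per-object hidden storage, created on demand behind a single global name.
var globalUID = '__storage' + Math.random().toString(36).slice(2);

function getStorage(obj) {
  if (!Object.prototype.hasOwnProperty.call(obj, globalUID)) {
    Object.defineProperty(obj, globalUID, { value: Object.create(null) });
  }
  return obj[globalUID];
}

// A locker only yields its lockbox when handed the map's private hashkey.
function makeLocker(hashkey, lockbox) {
  return new Function('h', 'l',
    '"use strict"; return function (k) { if (k === h) return l; };')(hashkey, lockbox);
}

function WeakMapSketch() {
  var hashkey = Object.create(null);                      // private, unforgeable
  var uid = '__wm' + Math.random().toString(36).slice(2); // this map's name
  this.set = function (key, value) {
    var lockbox = Object.preventExtensions(
      Object.create(null, { value: { value: value, writable: true } }));
    getStorage(key)[uid] = makeLocker(hashkey, lockbox);
  };
  this.get = function (key) {
    var locker = getStorage(key)[uid];
    var lockbox = locker && locker(hashkey);
    return lockbox && lockbox.value;
  };
}
```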

With SES the goal is different than mine, as I'm mostly interested in the various benefits provided by using weakmaps without too much focus on security. But unless my understanding of the SES WeakMap shim is incomplete, it is inherently insecure in that it relies on preventing unauthorized access to the store. If one is able to get a real getOwnPropertyNames through some means, then all bets are off. With the scheme in use above, the weakmap has the only unforgeable key to the data, so even the object itself couldn't unlock it, and even if someone were able to expose the hidden properties by importing GOPN from another frame and get all the lockers, they'd be unable to open them. They'd ruin the garbage collection by introducing references into an otherwise peaceful system, but it wouldn't be a security breach.

I've tried to minimize any incidental unintended storage, so a number of the generated functions use the new Function route to minimize implicit scope data.

# Allen Wirfs-Brock (12 years ago)

On May 11, 2012, at 2:39 AM, Brandon Benvie wrote:

Yeah, I modified Gozala's method a bit. The value isn't kept in valueOf, but rather in a null-proto object, which allows different WeakMaps not to step on one another.

Maybe my understanding was flawed, but I thought this wasn't an issue for these single-purpose objects/closures in terms of garbage collection. A set of objects (mostly null-proto objects that only ever have one or no references to other objects, and only one reference to them from the local system) will essentially act like one "unit", and ultimately the liveness of all of them maps back to the single key object that is the entry point for them. The only other object that ever gets a reference to any of them is the weakmap instance, which never keeps any references. It simply has all the keys to unlock the gates to get to the mapped value when requested.

Would the garbage collection for this be more complex than I imagine, or would it be pretty easy to link them in the way I supposed? If my understanding is right, the remaining issue is the circular-reference one, in terms of key -> value -> key -> value keeping each other alive. I had kind of supposed that this itself would fall under the same rules, though, because it's determinable whether that kind of set is reachable from the roots or not.

I'm also curious what significance there is, if any, in the fact that the actual organization of the data "map" isn't just weak by declaration/fiat, but is weak in the sense that the weakmap itself actually has no references to any data at all aside from one string key and one hash key that proves identity, and can only do its job by being given references to objects and seeing if the keys work. Perhaps this is immaterial, or I may just not understand the underlying reality of how the data is organized.

Abstractly, a weak map is just a way to identify a specific unidirectional relationship between pairs of objects (let's call them the "source" and "target"). Given a specific such relationship and a source object, it should be easy to produce the target object. There are two pretty obvious ways to represent this abstraction. You can have an object (corresponding to the relationship) to which is attached a table of key/value pairs where the keys are source objects and the values are target objects. Or you can attach to each object that is used as a source a table of key/value pairs where the keys are relationship objects and the values are target objects. There may be tons of specific implementation variations and details for either approach, but fundamentally that is what it comes down to.

Using normal garbage collector reachability rules, either of the approaches will be leaky. When a source/target table is used, the source objects will not get garbage collected as long as the relationship object is reachable, even when the only references to a source object are from within such a table. When a relationship/target table is used, the relationship objects will not get GC'ed as long as one of their source objects is reachable, even when the only reference to the relationship object is from such a table. So depending upon the approach you will either leak source objects or relationship objects. In both cases, the leaky keys also lead to retention of otherwise GC'able target objects. The only way around this is for the GC to know about such tables and to treat them specially. The "Weak" in the name of the abstraction is simply an indication of the requirement for this special treatment. Arguably "NonLeakyMap" would be an even more descriptive name.
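
A small sketch of the first representation Allen describes (the relationship object holds a source-to-target table) makes the leak easy to see; this is illustrative code, not any particular engine's implementation:

```js
// Relationship object holding a source -> target table, with ordinary
// (strong) references. Every source key is pinned as long as the map lives.
function RelationshipMap() {
  this.sources = [];
  this.targets = [];
}
RelationshipMap.prototype.set = function (source, target) {
  this.sources.push(source);
  this.targets.push(target);
};
RelationshipMap.prototype.get = function (source) {
  var i = this.sources.indexOf(source);
  return i === -1 ? undefined : this.targets[i];
};

var rel = new RelationshipMap();
rel.set({ id: 1 }, { data: 'x' });
// The { id: 1 } key is unreachable from anywhere else, yet it cannot be
// collected while `rel` lives; that is exactly the leak a GC-aware WeakMap
// is designed to avoid.
```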

# Grant Husbands (12 years ago)

Brandon Benvie wrote:

Yeah, I modified Gozala's method a bit. The value isn't kept in valueOf, but rather in a null-proto object, which allows different WeakMaps not to step on one another.

I could easily be misunderstanding things, so I'll be brief. Under the standard WeakMap, if I have a WeakMap W, an Object A and an Object B, and I add A=>B to the WeakMap, the behaviour is that both W and A being reachable (during GC) will make B reachable, but neither of them being reachable alone will make B reachable.

In the case of at least some of these shims, it seems that adding A=>B to any weakmap and not undoing it later means that either B or W will be reachable for as long as A is, whether or not they are otherwise reachable.

, Grant Husbands.

# Grant Husbands (12 years ago)

Oops, obvious typo:

Grant Husbands wrote:

In the case of at least some of these shims, it seems that adding A=>B to any weakmap and not undoing it later means that either B or W will

"any weakmap" should be "W".

G.

# Aymeric Vitte (12 years ago)

On 10/05/2012 20:04, Anton Kovalyov wrote:

it'd be really nice to have a down-to-earth explanation of harmonized changes with code samples that resemble real-world use cases.

Yes.

TC39 cannot ask the opinion of every developer or community representative, and developers cannot spend their time following TC39.

But I am not sure that TC39 really realizes how far they are from the "advanced" developers, to say nothing of the "usual" webmasters...

A side effect is that a lot of wrong things are advised or said on well-known forums and projects, so things keep being developed badly, as has long been the tendency in the web world, and correct code is reserved for a very specific elite. This makes the lives of both language designers and developers hard and makes JS less and less accessible.

So the truth should come from TC39, with the help of developers, and maybe a significant effort should be made to make things more understandable with simple real-world use cases, starting with official simplified-version annexes to the strawman proposals/ES specs that could include, for example, this kind of attempt: es5/es5.github.com#14

A performance annex would be good too (i.e., what to use or not use if you care about performance).