New private names proposal

# Allen Wirfs-Brock (14 years ago)

On the wiki, I've posted a new Private Names strawman: strawman:private_names. It replaces Dave and Sam's original Names strawman and was developed in consultation with them. This is a significant revision to the original proposal, but it builds upon some of the same base ideas.

In reading this strawman it's important to understand that its concept of "private" is quite different from what you may be used to from C++ or Java. In those languages, "private" is an attribute of a member (field or method) of a class. It means that the member is only accessible to other members of the same class (ignoring what can be done via reflection). This model is not a particularly good match for the JavaScript object model, where the structure of an object is much more dynamic and method functions can be dynamically associated with or disassociated from an object and shared by many different kinds of objects.

In this proposal, "private" is an attribute of a property name, rather than of an actual property. Any code that has access to a "private name" can use that name to create or access a property of any object. It is accessibility to the name that is controlled rather than accessibility to the property. This seems to fit more naturally with JavaScript's dynamic object construction patterns and without really changing the fundamental JavaScript object model it enables JavaScript programmers to create a number of different information hiding abstractions.

Please read the proposal and let's start the discussion.

# Kris Zyp (14 years ago)

This sounds great, but doesn't this kind of violate referential transparency? The following function has always worked as expected:

function foo() {
    var obj = {"bar": "hello"}; // assuming quoted names are strings
    alert(obj.bar);
}
foo();

until it is put in a context (or even just concatenated?) with a "private bar;" declaration. Changing the behavior of property identifiers seems like an awkwardly complicating addition to ECMAScript.

Couldn't the goals of this be achieved by having a Name constructor (albeit less convenient syntax, since you have to use obj[name]; perhaps that is what you are addressing), or by having "private name" create a new Name (to be used like obj[name])? Kris

# David Herman (14 years ago)

This sounds great, but doesn't this kind of violate referential transparency?

That's a loaded criticism. JS doesn't have referential transparency in any meaningful sense. But it does generalize the meaning of the dot-operator to be sensitive to scoping operators, that's true.

Couldn't the goals of this be achieved by having a Name constructor (albeit less convenient syntax, since you have to use obj[name]; perhaps that is what you are addressing), or by having "private name" create a new Name (to be used like obj[name])?

You can do the same thing with the current proposal (except that private names in Allen's strawman are primitive values and not objects). You can write a library that produces new private name values, and you can use the bracket operator to get and set properties with that name.
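In today's terms, the library route Dave describes can be sketched as follows. This is an illustrative sketch only: the gensym helper is invented, and Symbol (a much later addition to the language) stands in for the proposal's primitive private name values.

```javascript
// Hypothetical gensym library: mints fresh, unforgeable property keys.
// Symbol is used here as a stand-in for the proposed name values.
function gensym(debugName) {
  return Symbol(debugName);
}

function Point(x, y) {
  // These keys are only reachable by code that holds _x and _y.
  var _x = gensym("x"), _y = gensym("y");
  this[_x] = x;
  this[_y] = y;
  this.getX = function () { return this[_x]; };
  this.getY = function () { return this[_y]; };
}

var p = new Point(1, 2);
console.log(p.getX()); // 1
```

Code outside the constructor cannot name the x/y properties directly, since it never sees the generated keys; only the accessor closures can reach them.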

But your "albeit less convenient syntax" is the crux of why I like the proposal as is. I would much rather write (and I suspect many programmers would as well):

function Point(x, y) {
    private x, y;
    this.x = x;
    this.y = y;
    ...
}

than

function Point(x, y) {
    var _x = gensym(), _y = gensym();
    this[_x] = x;
    this[_y] = y;
}
# Kris Kowal (14 years ago)

On Thu, Dec 16, 2010 at 1:53 PM, David Herman <dherman at mozilla.com> wrote:

function Point(x, y) {
    private x, y;
    this.x = x;
    this.y = y;
    ...
}

than

function Point(x, y) {
    var _x = gensym(), _y = gensym();
    this[_x] = x;
    this[_y] = y;
}

I tend to disagree with most developers, so take it with a grain of salt that I find the latter form, with all the implied abilities, easier to understand.

Kris Kowal

# Mark S. Miller (14 years ago)

On Thu, Dec 16, 2010 at 1:58 PM, Kris Kowal <kris.kowal at cixar.com> wrote:

On Thu, Dec 16, 2010 at 1:53 PM, David Herman <dherman at mozilla.com> wrote:

function Point(x, y) {
    private x, y;
    this.x = x;
    this.y = y;
    ...
}

than

function Point(x, y) {
    var _x = gensym(), _y = gensym();
    this[_x] = x;
    this[_y] = y;
}

I tend to disagree with most developers, so take it with a grain of salt that I find the latter form, with all the implied abilities, easier to understand.

I do too. While terseness clearly contributes to understandability, regularity and simplicity do too. When these conflict, we should be very careful about sacrificing regularity.

Currently in JS, x['foo'] and x.foo are precisely identical in all contexts. This regularity helps understandability. The terseness difference above is not an adequate reason to sacrifice it.

# Chuck Jazdzewski (14 years ago)

Currently in JS, x['foo'] and x.foo are precisely identical in all contexts.

This is true only when the string is a legal identifier, and false otherwise. The [] syntax is a superset of the '.' syntax. The proposal preserves this through the use of the "#." expression, e.g.,

            private x;
            this.x = 1;

is equivalent to

            private x;
            this[#.x] = 1;

which means it is still regular. What it doesn't preserve is that the property name is always representable as a string. The property name type becomes a sum type, String+Name. With proxies it becomes String+Name+ProxiedName, where the ProxiedName is used to hide the Name value.

Chuck.

On Thu, Dec 16, 2010 at 1:58 PM, Kris Kowal <kris.kowal at cixar.com<mailto:kris.kowal at cixar.com>> wrote:

On Thu, Dec 16, 2010 at 1:53 PM, David Herman <dherman at mozilla.com<mailto:dherman at mozilla.com>> wrote:

function Point(x, y) {
    private x, y;
    this.x = x;
    this.y = y;
    ...
}

than

function Point(x, y) {
    var _x = gensym(), _y = gensym();
    this[_x] = x;
    this[_y] = y;
}

I tend to disagree with most developers, so take it with a grain of salt that I find the latter form, with all the implied abilities, easier to understand.

I do too. While terseness clearly contributes to understandability, regularity and simplicity do too. When these conflict, we should be very careful about sacrificing regularity.

Currently in JS, x['foo'] and x.foo are precisely identical in all contexts. This regularity helps understandability. The terseness difference above is not an adequate reason to sacrifice it.

Kris Kowal

# Brendan Eich (14 years ago)

On Dec 16, 2010, at 2:19 PM, Mark S. Miller wrote:

Currently in JS, x['foo'] and x.foo are precisely identical in all contexts. This regularity helps understandability. The terseness difference above is not an adequate reason to sacrifice it.

Aren't you proposing the same syntax x[i] where i is a soft field map, to make exactly the same sacrifice?

strawman:names_vs_soft_fields

# Mark S. Miller (14 years ago)

On Thu, Dec 16, 2010 at 3:23 PM, Brendan Eich <brendan at mozilla.com> wrote:

On Dec 16, 2010, at 2:19 PM, Mark S. Miller wrote:

Currently in JS, x['foo'] and x.foo are precisely identical in all contexts. This regularity helps understandability. The terseness difference above is not an adequate reason to sacrifice it.

Aren't you proposing the same syntax x[i] where i is a soft field map, to make exactly the same sacrifice?

strawman:names_vs_soft_fields

I am not proposing these syntactic extensions. Neither am I avoiding them on that page, since the point of that page is to compare semantics, not syntax. The first paragraph (!) of that page clearly states:

"This translation does not imply endorsement of all elements of the names proposal as translated to soft fields, such as the proposed syntactic extensions."

The two issues are orthogonal. Whichever of Names or Soft Fields wins, we can have an orthogonal argument about whether the winner should use this syntactic shorthand. Conversely, whatever the outcome of the syntax argument in this thread, they would apply equally well to either semantics.

# David Herman (14 years ago)

Without new syntax, isn't soft fields just a library on top of weak maps?

# Mark S. Miller (14 years ago)

On Thu, Dec 16, 2010 at 3:51 PM, David Herman <dherman at mozilla.com> wrote:

Without new syntax, isn't soft fields just a library on top of weak maps?

Semantically, yes. However, as a library, they cannot benefit from the extraordinary efforts of all JavaScript engines to optimize inherited property lookup. Nor from the GC benefits that follow from the transposed representation explained at <strawman:inherited_explicit_soft_fields#a_transposed_representation>.

OTOH, if soft fields are built in and implemented in this transposed manner, both benefits easily follow.
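The library version conceded above to be semantically equivalent can be sketched in a few lines. The SoftField name and API shape below follow the flavor of the strawman's description, but the details are illustrative, not normative:

```javascript
// A minimal soft-field library over WeakMap: semantically a side
// table keyed by object identity, holding no reference that would
// prevent the object from being collected.
function SoftField() {
  const table = new WeakMap();
  return {
    get: function (obj) { return table.get(obj); },
    set: function (obj, val) { table.set(obj, val); }
  };
}

// Soft fields work even on frozen objects, since nothing is
// stored in the object itself.
const x = SoftField();
const pt = Object.freeze({});
x.set(pt, 42);
console.log(x.get(pt)); // 42
```

What the library version cannot do, per the point above, is share the engine's optimized inherited-property lookup paths; every access goes through an ordinary method call on the side table.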

# Mark S. Miller (14 years ago)

On Thu, Dec 16, 2010 at 3:23 PM, Brendan Eich <brendan at mozilla.com> wrote:

On Dec 16, 2010, at 2:19 PM, Mark S. Miller wrote:

Currently in JS, x['foo'] and x.foo are precisely identical in all contexts. This regularity helps understandability. The terseness difference above is not an adequate reason to sacrifice it.

Aren't you proposing the same syntax x[i] where i is a soft field map, to make exactly the same sacrifice?

strawman:names_vs_soft_fields

Btw, near the end of <strawman:inherited_explicit_soft_fields>, I also say:

"I (MarkM) do not like the sugar proposed for Names, as I think it encourages confusion between literal text and lexical lookup. However, this issue seems to be orthogonal to the soft fields vs. Names debate."

# Brendan Eich (14 years ago)

On Dec 16, 2010, at 3:57 PM, Mark S. Miller wrote:

On Thu, Dec 16, 2010 at 3:51 PM, David Herman <dherman at mozilla.com> wrote:

Without new syntax, isn't soft fields just a library on top of weak maps?

Semantically, yes. However, as a library, they cannot benefit from the extraordinary efforts of all JavaScript engines to optimize inherited property lookup. Nor from the GC benefits that follow from the transposed representation explained at strawman:inherited_explicit_soft_fields#a_transposed_representation. OTOH, if soft fields are built in and implemented in this transposed manner, both benefits easily follow.

Have you talked to any JS engine implementors about any of this? If so, more details would be great.

Private name property name lookup is not transposed into a get along the prototype chain followed by a method call, and calling a method is inherently costlier than getting a property, in all the JS engines I've studied. Private names use exactly the same optimized lookup paths today's engines sport, just with a different type of identifier.

In strawman:inherited_explicit_soft_fields#a_transposed_representation you write:

"This representation parallels the implementation techniques and resulting performance benefits expected for names but without the semantic problems (leaking via proxy traps and inability to associate soft state with frozen objects)."

but the first point in the parenthetical aside was addressed by Allen's writeup at

strawman:private_names

(note: not strawman:names) as follows:

[strawman:private_names#proxies]

"The Proxy object could be replaced with an alternative implementation that added an additional handler layer that would wrapper all private name values passed through its traps. The wrappers would be opaque encapsulations of each private name value and provide a method that could be used to test whether the encapsulated private name was === to an argument value. This would permit handlers to process known private name values but would prevent exposing arbitrary private name values to the handlers. If there is sufficient concern about proxies exposing private name values in this manner, such wrapping of private names could be built into the primitive trap invocation mechanism."
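In concrete terms, the wrapping scheme quoted above might look something like the following sketch. Everything here is invented for illustration: the wrapPrivateName helper is hypothetical, and Symbol again stands in for a primitive private name value.

```javascript
// Illustrative only: an opaque wrapper around a private name value,
// exposing nothing but an equality test, as the strawman text suggests.
// A handler can check a wrapped name against names it already knows,
// but cannot extract the underlying value.
function wrapPrivateName(name) {
  return {
    equals: function (candidate) { return candidate === name; }
  };
}

const secret = Symbol("secret"); // stand-in for a private name value
const wrapped = wrapPrivateName(secret);
console.log(wrapped.equals(secret)); // true
```

A proxy handler given only `wrapped` can recognize names it was explicitly handed, but learns nothing about arbitrary private names passing through its traps.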

Please update your page accordingly.

The other parenthetical point, the inability to associate soft state with frozen objects, is a feature of private names, not a bug. If it's an important use case then we can certainly use soft fields (given weak maps) to solve it.

This does not mean soft fields and private names are mutually exclusive and locked in some "there can be only one!" Highlander contest.

Really, it's starting to feel like "Survivor" or "American Idol" around here. The apples to oranges death-matching has to stop.

# Mark S. Miller (14 years ago)

On Thu, Dec 16, 2010 at 4:27 PM, Brendan Eich <brendan at mozilla.com> wrote:

On Dec 16, 2010, at 3:57 PM, Mark S. Miller wrote:

On Thu, Dec 16, 2010 at 3:51 PM, David Herman <dherman at mozilla.com> wrote:

Without new syntax, isn't soft fields just a library on top of weak maps?

Semantically, yes. However, as a library, they cannot benefit from the extraordinary efforts of all JavaScript engines to optimize inherited property lookup. Nor from the GC benefits that follow from the transposed representation explained at <strawman:inherited_explicit_soft_fields#a_transposed_representation>. OTOH, if soft fields are built in and implemented in this transposed manner, both benefits easily follow.

Have you talked to any JS engine implementors about any of this? If so, more details would be great.

Private name property name lookup is not transposed into a get along the prototype chain followed by a method call, and calling a method is inherently costlier than getting a property, in all the JS engines I've studied. Private names use exactly the same optimized lookup paths today's engines sport, just with a different type of identifier.

In strawman:inherited_explicit_soft_fields#a_transposed_representation you write:

"This representation parallels the implementation techniques and resulting performance benefits expected for names (strawman:names) but without the semantic problems (leaking via proxy traps and inability to associate soft state with frozen objects)."

but the first point in the parenthetical aside was addressed by Allen's writeup at

strawman:private_names

(note: not strawman:names) as follows:

[strawman:private_names#proxies]

"The Proxy object could be replaced with an alternative implementation that added an additional handler layer that would wrapper all private name values passed through its traps. The wrappers would be opaque encapsulations of each private name value and provide a method that could be used to test whether the encapsulated private name was === to an argument value. This would permit handlers to process known private name values but would prevent exposing arbitrary private name values to the handlers.

If there is sufficient concern about proxies exposing private name values in this manner, such wrapping of private names could be built into the primitive trap invocation mechanism." Please update your page accordingly.

The other parenthetical point, the inability to associate soft state with frozen objects, is a feature of private names, not a bug. If it's an important use case then we can certainly use soft fields (given weak maps) to solve it.

This does not mean soft fields and private names are mutually exclusive and locked in some "there can be only one!" Highlander contest.

I'll address this last point first, since this seems to be the core issue. The question I am raising is: given soft fields, why do we need private names? I translated your examples to show why I see no need for the latter. If we don't need them, we shouldn't add them.

# Brendan Eich (14 years ago)

On Dec 16, 2010, at 4:44 PM, Mark S. Miller wrote:

This does not mean soft fields and private names are mutually exclusive and locked in some "there can be only one!" Highlander contest.

I'll address this last point first, since this seems to be the core issue. The question I am raising is: given soft fields, why do we need private names? I translated your examples to show why I see no need for the latter. If we don't need them, we shouldn't add them.

This reasoning is backwards. If we decide the use-cases motivating private names, including usability: both lexical bindings consulted on the right of dot, and as first-class generated values in expressions, are worth supporting, then we will support the private names proposal -- however specified.

(If we do decide to support these use-cases with anything like the proposed observable syntax and semantics, then I'm strongly in favor of the specification approach in Allen's writeup, and strongly against the frankly obscure and tortured specification on top of soft fields that you've used.)

If we decide to support soft fields as a library on top of weak maps, or (it's already in harmony:proposals) just assuming weak maps, then soft field use-cases have their day in the sun via library code (standardized or not). But this has nothing to do with private names as proposed!

Really, it's starting to feel like "Survivor" or "American Idol" around here. The apples to oranges death-matching has to stop.

I want to change the channel. Reasoning backwards from assumed conclusions and overcomplicated translations is not the way to proceed here. We need to agree on use-cases and address usability, not start with an executable spec and stretch it to cover (grudgingly) other applications.

I also doubt that soft fields beat private names from any optimizing implementor's point of view (I wear that hat still myself, but I'll defer to full-timers).

We really need to consider what users will see, how they'll use the proposed facility, what ends it serves in its untranslated form, and go from there. Anything else is overcomplicated and courts over-specification.

# Mark S. Miller (14 years ago)

At this point, I'll just mention that this response seems overly emotional, filled with name calling, and seemingly disconnected from the case I'm actually making. (For example, I never said that soft fields would be more or as efficient as names. Merely that, if primitive, they could benefit from the same representation you are considering for names. As a library, they cannot be close to that efficient.) I'll wait till I see a calmer and more reasoned response to what I'm actually saying, rather than (in the pejorative sense) straw men.

# David Herman (14 years ago)

I'll address this last point first, since this seems to be the core issue. The question I am raising is: given soft fields, why do we need private names?

I didn't see that asked as a question; I saw it asserted, but not opened for discussion.

And I disagree. Now, I happen to think it's not worth blessing libraries simply because they could be optimized, but I do not see soft fields as supplanting private names -- partly because of usability, but especially because I happen to like weak maps very much, and very much hope for a world where ES a) makes it easy to write (any number of) soft field libraries and b) makes it easy to store encapsulated data in objects.

# Brendan Eich (14 years ago)

On Dec 16, 2010, at 5:01 PM, Mark S. Miller wrote:

At this point, I'll just mention that this response seems overly emotional,

On the contrary.

filled with name calling,

No.

Calling a specification "obscure and tortured" is not name-calling in any personal sense. However, if you think I've called you a name, I can only back off and apologize.

I invite others to join in with comments on the two proposals, so this doesn't degenerate into something personal -- I do not believe that I've done anything to make it so, and I assume you don't want what you decry here, either.

and seemingly disconnected from the case I'm actually making.

You're explicitly trying to eliminate one of two proposals, inherited soft fields and private names, which address two different use-cases in different ways, by extending your favored proposal to cover the one, which you don't like anyway. You've been clear enough about this (thanks for pointing out that disclaimer at the top of strawman:inherited_explicit_soft_fields).

I am calling this backward reasoning, which assumes a conclusion not in evidence: that soft fields are a good specification (or implementation) of private names.

Furthermore, I am clearly arguing for proceeding forward from use-cases and usable designs addressing them, toward specifications and (of course; we do prototype harmony proposals) implementation.

If it turns out that two solutions for different use-cases have specification and implementation in common, then we can unify. This must be done a posteriori or we will overcomplicate and over-specify.

# Mark S. Miller (14 years ago)

On Thu, Dec 16, 2010 at 5:03 PM, David Herman <dherman at mozilla.com> wrote:

I'll address this last point first, since this seems to be the core issue. The question I am raising is: given soft fields, why do we need private names?

I didn't see that asked as a question; I saw it asserted, but not opened for discussion.

Ok, I open it for discussion. Given soft fields, why do we need private names?

And I disagree.

I'm sorry, but what do you disagree with? That I am raising the question? Below, you seem to be saying that given names, why do we need soft fields? How is that not a "there can be only one!" Highlander contest? If that's not what you're saying, then what?

Now, I happen to think it's not worth blessing libraries simply because they could be optimized, but I do not see soft fields as supplanting private names -- especially because of usability -- but especially because I happen to like weak maps very much, and very much hope for a world where ES a) makes it easy to write (any number of) soft field libraries and b) makes it easy to store encapsulated data in objects.

How do names make this easier than soft fields?

# Douglas Crockford (14 years ago)

On 11:59 AM, Brendan Eich wrote:

Really, it's starting to feel like "Survivor" or "American Idol" around here. The apples to oranges death-matching has to stop.

I don't mind a good deathmatch as long as it ends in death.

We will soon be at the point where we need to start culling the strawmen so we can focus on the stuff that will eventually go to standard. So we will have to reach consensus on the stuff that goes forward, essentially voting the other strawmen off the island.

But I agree about the apples and oranges part. The arguments all around need to be better targeted.

# David Herman (14 years ago)

Ok, I open it for discussion. Given soft fields, why do we need private names?

I believe that the syntax is a big part of the private names proposal. It's key to the usability: in my view, the proposal adds 1) a new abstraction to the object model for private property keys and 2) a new declarative abstraction to the surface language for creating these properties.

And I disagree.

I'm sorry, but what do you disagree with? That I am raising the question?

No, I disagree that the two are in direct competition with one another.

Below, you seem to be saying that given names, why do we need soft fields? How is that not a "there can be only one!" Highlander contest? If that's not what you're saying, then what?

It's not what I'm saying. I'm saying that private names are independently useful, and weak maps are independently useful, and given weak maps we don't need soft fields. It might have been less confusing if I had left out the latter point. Just to break it down as clearly as possible:

  • I don't believe soft fields obviate the utility of private names.
  • I don't believe private names obviate the utility of soft fields.
  • Given that soft fields are expressible via weak maps, I don't believe they are worth adding to the standard.

In fairness, I think the apples-to-apples comparison you can make between the two proposals is the object model. On that score, I think the private names approach is simpler: it just starts where it wants to end up (private names are in the object, with an encapsulated key), whereas the soft fields approach takes a circuitous route to get there (soft fields are semantically a side table, specified via reference implementation, but optimizable by storing in the object).

I happen to like weak maps very much, and very much hope for a world where ES a) makes it easy to write (any number of) soft field libraries and b) makes it easy to store encapsulated data in objects.

How do names make this easier than soft fields?

The syntax (see above).

# Mark S. Miller (14 years ago)

On Thu, Dec 16, 2010 at 5:24 PM, David Herman <dherman at mozilla.com> wrote:

Ok, I open it for discussion. Given soft fields, why do we need private names?

I believe that the syntax is a big part of the private names proposal. It's key to the usability: in my view, the proposal adds 1) a new abstraction to the object model for private property keys and 2) a new declarative abstraction to the surface language for creating these properties.

As shown on <strawman:inherited_explicit_soft_fields>, the syntax you like applies equally well to both proposals. The fact that I don't like this syntax is not an argument (to one who does like the syntax) that we should do names. Were names adopted, I would still not like this syntax and would still argue against it. Were the syntax adopted, I would still argue that the syntax should be used to make soft fields more convenient, rather than make names more convenient. The arguments really are orthogonal.

And I disagree.

I'm sorry, but what do you disagree with? That I am raising the question?

No, I disagree that the two are in direct competition with one another.

Below, you seem to be saying that given names, why do we need soft fields? How is that not a "there can be only one!" Highlander contest? If that's not what you're saying, then what?

It's not what I'm saying. I'm saying that private names are independently useful, and weak maps are independently useful, and given weak maps we don't need soft fields. It might have been less confusing if I had left out the latter point. Just to break it down as clearly as possible:

  • I don't believe soft fields obviate the utility of private names.
  • I don't believe private names obviate the utility of soft fields.
  • Given that soft fields are expressible via weak maps, I don't believe they are worth adding to the standard.

In fairness, I think the apples-to-apples comparison you can make between the two proposals is the object model. On that score, I think the private names approach is simpler: it just starts where it wants to end up (private names are in the object, with an encapsulated key), whereas the soft fields approach takes a circuitous route to get there (soft fields are semantically a side table, specified via reference implementation, but optimizable by storing in the object).

I happen to like weak maps very much, and very much hope for a world where ES a) makes it easy to write (any number of) soft field libraries and b) makes it easy to store encapsulated data in objects.

How do names make this easier than soft fields?

The syntax (see above).

Ok, how do names with your syntax make this easier than soft fields with your syntax?

# David-Sarah Hopwood (14 years ago)

On 2010-12-17 01:24, David Herman wrote:

Mark Miller wrote:

Ok, I open it for discussion. Given soft fields, why do we need private names?

I believe that the syntax is a big part of the private names proposal. It's key to the usability: in my view, the proposal adds 1) a new abstraction to the object model for private property keys and 2) a new declarative abstraction to the surface language for creating these properties.

I don't like the private names syntax. I think it obscures more than it helps usability, and losing the x["id"] === x.id equivalence is a significant loss.

As Mark points out, though, that syntax can be supported with either proposal. The private names proposal is more entangled with syntactic changes, but that's a bug, not a feature.

In fairness, I think the apples-to-apples comparison you can make between the two proposals is the object model. On that score, I think the private names approach is simpler: it just starts where it wants to end up (private names are in the object, with an encapsulated key), whereas the soft fields approach takes a circuitous route to get there (soft fields are semantically a side table, specified via reference implementation, but optimizable by storing in the object).

The private names approach is not simpler. It's strictly more complicated for the same functionality. You can see that just by comparing the two proposals: in strawman:inherited_explicit_soft_fields the specification consists entirely of the given code for the SoftField abstraction. In practice you'd also add a bit of non-normative rationale concerning how soft fields can be efficiently implemented, but that's it.

In strawman:private_names (even excluding the syntactic changes, to give a fairer comparison), we can see a very significant amount of additional language mechanism, including:

  • a new primitive type, with behaviour distinct from any other type. This requires changes, not just to 'typeof' as the strawman page acknowledges, but to every other abstract operation in the spec that can take an arbitrary value. (Defining these values to be objects would simplify this to some extent, but if you look at how much verbiage each [Class] of objects takes to specify in ES5, possibly not by much.)

  • quite extensive changes to the behaviour of property lookup and EnvironmentRecords. (The strawman is quite naive in suggesting that only 11.2.1 step 6 needs to be changed here.)

  • changes to [[Put]] (for arrays and other objects) and to object literal initialization; also checking of all uses of [[DefineOwnProperty]] that can bypass [[Put]].

  • changes to a large number of APIs on Object.prototype and Object, the 'in' operator, JSON.stringify, and probably others.

None of these additional mechanisms and spec changes are needed in the soft field approach.

In addition, the proposal acknowledges that it only provides weak encapsulation, because of reflective operations accessing private properties. It justifies this in terms of the utility of "monkey patching", but this seems like a weak argument; it is not at all clear that monkey patching of private properties is needed. Scripts that did that would necessarily be violating abstraction boundaries and depending on implementation details of the code they are patching, which tends to create forward-compatibility problems. (This is sometimes true of scripts that monkey-patch public properties. I'm not a fan of monkey patching in general, but I think it is particularly problematic for private properties.)

There is some handwaving about the possibility of sandboxing environments being able to work around this deficiency, but the details have not been thought through; in practice I suspect this would be difficult and error-prone.

In general, I disagree with the premise that the best way to specify a language feature is to "start where it wants to end up", i.e. to directly specify the programmer's view of it. Of course the programmer's view needs to be considered in the design, but as far as specification is concerned, if a high-level feature cannot be specified by a fairly simple desugaring to lower-level features, then it's probably not a good feature.

# Brendan Eich (14 years ago)

On Dec 16, 2010, at 9:11 PM, David-Sarah Hopwood wrote:

On 2010-12-17 01:24, David Herman wrote:

Mark Miller wrote:

Ok, I open it for discussion. Given soft fields, why do we need private names?

I believe that the syntax is a big part of the private names proposal. It's key to the usability: in my view, the proposal adds 1) a new abstraction to the object model for private property keys and 2) a new declarative abstraction to the surface language for creating these properties.

I don't like the private names syntax. I think it obscures more than it helps usability, and losing the x["id"] === x.id equivalence is a significant loss.

As Chuck Jazdzewski pointed out, this equivalence does not hold for id not an IdentifierName.

The new equivalence under private names would be x[#.id] === x.id.

As Mark points out, though, that syntax can be supported with either proposal. The private names proposal is more entangled with syntactic changes, but that's a bug, not a feature.

No, that is a usability feature.

The inherited soft fields approach is more entangled with its reference implementation, which is not the efficient route VM implementors can swallow.

# Erik Corry (14 years ago)

On Dec 17, 2010 2:14 AM, "Douglas Crockford" <douglas at crockford.com> wrote:

On 11:59 AM, Brendan Eich wrote:

Really, it's starting to feel like "Survivor" or "American Idol" around here. The apples to oranges death-matching has to stop.

I don't mind a good deathmatch as long as it ends in death.

We will soon be at the point where we need to start culling the strawmen so we can focus on the stuff that will eventually go to standard. So we will have to reach consensus on the stuff that goes forward, essentially voting the other strawmen off the island.

Hear hear

But I agree about the apples and oranges part. The arguments all around need to be better targeted.

# David Herman (14 years ago)

Let me make a gentle plea for not creating unnecessary controversy. Take a step back: we all seem to agree we would like to provide a more convenient and performant way to create private fields in objects. In terms of observable behavior in the runtime model, there aren't that many differences between your proposed soft fields and the original names proposal or Allen's recent revisions. There are a handful of points where we have different ideas about what the desired behavior should be, and those are worth discussing.

But let's please not battle over specification mechanism, especially not in this phase of design. We shouldn't jeopardize the process over whether it's better to conceptualize the feature as storing private fields in a side table or internally. Can we try to stay on track with the Harmony process, where we recognize that we have common goals and try to move forward from there and discuss individual features as objectively as possible, rather than engaging in winner-take-all wars?

Remember, the real enemy is the Romans:

http://www.youtube.com/watch?v=gb_qHP7VaZE
# Mark S. Miller (14 years ago)

On Fri, Dec 17, 2010 at 10:06 AM, David Herman <dherman at mozilla.com> wrote:

Let me make a gentle plea for not creating unnecessary controversy. Take a step back: we all seem to agree we would like to provide a more convenient and performant way to create private fields in objects.

Yes. We agree on more here than we disagree on. It is easy to lose sight of this because we have more reason to talk about the areas where we disagree. So to make the agreement explicit, yes, "we would like to provide a more convenient and performant way to create private fields in objects". Exactly. Thanks.

In terms of observable behavior in the runtime model, there aren't that many differences between your proposed soft fields and the original names proposal or Allen's recent revisions. There are a handful of points where we have different ideas about what the desired behavior should be, and those are worth discussing.

Yes. There aren't many, but they are important.

But let's please not battle over specification mechanism, especially not in this phase of design.

The specification mechanism is not at all central to the controversy from my side. Were the semantics of soft fields equivalently specified using your specification mechanism, I'd be perfectly happy. If that was the main source of disagreement from your side, perhaps we are closer to agreement than we thought.

This has come up before verbally, and my position has always been clear and consistent. For the record: Different spec mechanisms serve different purposes. In the end, it would be good to have multiple forms of specifications that we're confident are equivalent. For various reasons, I prefer to focus on specification by (approximate) self-hosting or desugaring wherever possible -- it helps me distinguish between new fundamentals vs new conveniences. For standards process reasons, I understand that only one form of spec should be normative. I have always made clear that the self-hosted specification need not be the normative one, as long as we're confident of exact equivalence.

We shouldn't jeopardize the process over whether it's better to conceptualize the feature as storing private fields in a side table or internally.

Conceptualize for which purpose? For casual use, I think we're again in agreement. The reason I called them "soft fields" is to emphasize that they can be conceptualized as storing private fields internally. For implementation, we're in agreement again. That's the point of the transposed representation. For formal semantics, I feel strongly that the side table is the necessary right conceptualization, for all the reasons David-Sarah so well captured in his message.

Can we try to stay on track with the Harmony process, where we recognize that we have common goals and try to move forward from there and discuss individual features as objectively as possible, rather than engaging in winner-take-all wars?

The issue was never winner-take-all. The issue is all the many problems with the current names proposal, and my desire to see us avoid these problems. If we can agree on something as simple, secure, usable, and functional as soft fields, and with so little contagion across the rest of the spec, I'd be very happy. I don't care whether this is seen as a successor to current soft fields or current names. That was never the point.

Regarding the Harmony process, we are staying on track by objecting that the current names strawman has many problems that some of us consider fatal. With these fatal issues unresolved, it would violate the process to let the strawman falsely claim consensus and move forward to "proposal". If you see raising such objections as getting off track, then I'm not sure what you see as the Harmony process.

Remember, the real enemy is the Romans:

http://www.youtube.com/watch?v=gb_qHP7VaZE

;)

Thanks for reminding us of our many points of agreement and helping us find a way forward.

# Brendan Eich (14 years ago)

On Dec 16, 2010, at 9:11 PM, David-Sarah Hopwood wrote:

I don't like the private names syntax. I think it obscures more than it helps usability, and losing the x["id"] === x.id equivalence is a significant loss.

Again, this equivalence has never held in JS for all possible characters in a string. But let's agree that it must hold where "id" happens to contain a lexically valid identifier in ES1-5.

It's still not holy writ, unchangeable. The private names proposal intentionally changes the equivalence:

x[#.id] === x.id

given either const id="id" or private id.

Is there a significant loss? For some people, could be. For others learning Harmony fresh (assuming private names makes it) the cognitive load may be higher but it's not a beginner topic. ES5, never mind Harmony, has advanced features that should not be introduced until students are prepared.

For users who can make good use of private names but find x[name] cumbersome and error-prone compared to x.name, the 'private name;' declaration may be useful. It may be that such users aren't numerous enough for this syntax to be worth adding. We'll have a hard time proving this one way or another. More below.

As Mark points out, though, that syntax can be supported with either proposal.

No one ever disagreed on this point.

In fairness, I think the apples-to-apples comparison you can make between the two proposals is the object model. On that score, I think the private names approach is simpler: it just starts where it wants to end up (private names are in the object, with an encapsulated key), whereas the soft fields approach takes a circuitous route to get there (soft fields are semantically a side table, specified via reference implementation, but optimizable by storing in the object).

The private names approach is not simpler. It's strictly more complicated for the same functionality.

Dave clearly meant "simpler" by analogy to property names today. The "circuitous route" via transposition in inherited soft fields is not simple and it does not correspond to property lookup today.

If you could forget all you know about soft fields and weak maps for a minute, and try to imagine what a JS programmer who knows about today's objects and properties has to digest to understand what is going on, you might see what I mean.

Without "private x" syntax, the JS programmer has to grok something new in any event, as we all agree: x[name] where name is not converted to a string is a new thing under the sun, whether name is a private name, or the expression transposes as name.get(x) and name is a soft field.

You can see that just by comparing the two proposals: in strawman:inherited_explicit_soft_fields the specification consists entirely of the given code for the SoftField abstraction. In practice you'd also add a bit of non-normative rationale concerning how soft fields can be efficiently implemented, but that's it.
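[Editor's note] An approximate reconstruction of the inherited SoftField abstraction referred to above (not the normative strawman text) can be written against today's WeakMap: `get` walks the prototype chain so that soft fields inherit the way ordinary properties do, which is the "transposed" lookup discussed later in this thread.

```javascript
// Inherited soft field: set() records a value on one object; get() searches
// the receiver and then its prototype chain, mirroring property lookup.
function SoftField() {
  const map = new WeakMap();
  return {
    set(obj, value) { map.set(obj, value); },
    get(obj) {
      for (let o = obj; o !== null; o = Object.getPrototypeOf(o)) {
        if (map.has(o)) return map.get(o);
      }
      return undefined;
    },
  };
}

const f = SoftField();
const parent = {};
const child = Object.create(parent);
f.set(parent, 42);
console.log(f.get(child)); // 42 -- inherited through the prototype chain
```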

There's a higher point of order here: users need conceptually simple and usable features, and the spec serves users. Yes, the spec should not be overcomplicated, all else equal. Desugaring (not arbitrary compilation) to kernel semantics is preferable, if it suffices for usability.

But (to channel TRON) we fight for the users: they need to be considered first, and throughout. Your entire exposition here about "simpler" vs. "more complicated" is about the spec, not the users.

In strawman:private_names (even excluding the syntactic changes, to give a fairer comparison), we can see a very significant amount of additional language mechanism, including:

  • a new primitive type, with behaviour distinct from any other type. This requires changes, not just to 'typeof' as the strawman page acknowledges, but to every other abstract operation in the spec that can take an arbitrary value. (Defining these values to be objects would simplify this to some extent, but if you look at how much verbiage each [[Class]] of objects takes to specify in ES5, possibly not by much.)

I've advocated a new object subtype and Allen's writeup mentions the idea. It's strictly easier to spec (by a lot). [[Class]] is not enumerated much, only checked against one or another string value. But this is a minor point.

  • quite extensive changes to the behaviour of property lookup and EnvironmentRecords. (The strawman is quite naive in suggesting that only 11.2.1 step 6 needs to be changed here.)

Did you read the whole proposal? It includes much more than 11.2.1 in proposing that ToPropertyName replace ToString: Object.prototype.hasOwnProperty (ES5 15.2.4.5), Object.prototype.propertyIsEnumerable (ES5 15.2.4.7) and the in operator (ES5 11.8.7) are all extended to accept private name values in addition to string values as property names. Where they currently call ToString on property names they will instead call ToPropertyName. The JSON.stringify algorithm (ES5 15.12.3) will be modified such that it does not process enumerable properties that have private name values as their property names.

All the Object reflection functions defined in ES5 section 15.2.3 that accept property names as arguments or return property names are extended to accept or produce private name values in addition to string values as property names. A private name value may appear as a property name in the collection of property descriptors passed to Object.create and Object.defineProperties. If an object has private named properties then their private name values will appear in the arrays returned by Object.getOwnPropertyNames and Object.keys (if the corresponding properties are enumerable).

  • changes to [[Put]] (for arrays and other objects) and to object literal initialization; also checking of all uses of [[DefineOwnProperty]] that can bypass [[Put]].

The proposal discusses [[Put]] and [[DefineOwnProperty]]. It's certainly not complete, but you write here as if it failed to mention these internal methods. No fair.

  • changes to a large number of APIs on Object.prototype and Object, the 'in' operator, JSON.stringify, and probably others.

See above.

None of these additional mechanisms and spec changes are needed in the soft field approach.

And that gets back to the higher point of order. Fight for the users, not for the programs.

In addition, the proposal acknowledges that it only provides weak encapsulation, because of reflective operations accessing private properties. It justifies this in terms of the utility of "monkey patching", but this seems like a weak argument; it is not at all clear that monkey patching of private properties is needed. Scripts that did that would necessarily be violating abstraction boundaries and depending on implementation details of the code they are patching, which tends to create forward-compatibility problems. (This is sometimes true of scripts that monkey-patch public properties. I'm not a fan of monkey patching in general, but I think it is particularly problematic for private properties.)

Users cannot monkey-patch Object.prototype now without fear of collision, and although Prototype.js does it still, monkey-patching other prototypes requires too much care as well. With private names, especially with the lexically bound x declared by 'private x', users can monkey-patch without fear of collision.

Never mind what you're a fan of. Could you say what is particularly problematic about private names, given that it resolves the collision problem?

There is some handwaving about the possibility of sandboxing environments being able to work around this deficiency, but the details have not been thought through; in practice I suspect this would be difficult and error-prone.

Nice suspicion without effort!

The sandboxing idea is simple to explain: if your code has no access to a private name N, then it can't reflect on it or gain unwanted access to it via a proxy. The idea is to wrap any such N returned via reflection APIs and passed through proxy traps with a wrapper capable only of being asked whether it is the same name as N. The reflecting or trapping code would have to possess N to be able to use the reflected or passed-in name.
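[Editor's note] A rough sketch of that wrapping idea, modeling private names as Symbols purely for illustration (the era's proposal had no Symbol): the wrapper can only answer an identity question, so code that doesn't already hold N learns nothing from it.

```javascript
// Wrap a name so the only question it answers is "are you this name?"
function wrapName(name) {
  return { sameAs: (candidate) => candidate === name };
}

const secret = Symbol("secret");
const wrapper = wrapName(secret);

console.log(wrapper.sameAs(secret));           // true: a holder of N can match it
console.log(wrapper.sameAs(Symbol("secret"))); // false: a fresh name won't do
// The wrapper never exposes 'secret' itself.
```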

I believe this restores equivalence between the two semantic models. If we made private names not leak, and separated the 'private x' syntax, then we would have a more apples-to-apples setting for evaluating the semantic models. I think we should do that, in order to make progress and avoid rehashing fights about name leaks and new syntax.

In general, I disagree with the premise that the best way to specify a language feature is to "start where it wants to end up", i.e. to directly specify the programmer's view of it. Of course the programmer's view needs to be considered in the design, but as far as specification is concerned, if a high-level feature cannot be specified by a fairly simple desugaring to lower-level features, then it's probably not a good feature.

This is too extreme, since taken as you word it, it would have banned function expressions from being added to ES3. Mapping function expressions to function declarations does not entail a "fairly simple desugaring".

# David-Sarah Hopwood (14 years ago)

On 2010-12-17 06:44, Brendan Eich wrote:

On Dec 16, 2010, at 9:11 PM, David-Sarah Hopwood wrote:

On 2010-12-17 01:24, David Herman wrote:

Mark Miller wrote:

Ok, I open it for discussion. Given soft fields, why do we need private names?

I believe that the syntax is a big part of the private names proposal. It's key to the usability: in my view, the proposal adds 1) a new abstraction to the object model for private property keys and 2) a new declarative abstraction to the surface language for creating these properties.

I don't like the private names syntax. I think it obscures more than it helps usability, and losing the x["id"] === x.id equivalence is a significant loss.

As Chuck Jazdzewski pointed out, this equivalence does not hold for id not an IdentifierName.

Of course not, because the syntax 'x.id' is not valid for id not an IdentifierName (in either ES5 or ES5 + private_names). Whenever we state semantic equivalences, we mean them to hold only for syntactically valid terms. So 'x' necessarily has to be something that can precede both '[_]' and '.', i.e. MemberExpression or CallExpression. Similarly, 'id' necessarily has to be something that can both occur within quotes and follow '.', i.e. it must be an IdentifierName.

(This has nothing to do with any difference between soft fields and private names. Anyway, for future reference, I rarely state the productions that syntactic variables range over when they can be unambiguously inferred as above.)

The new equivalence under private names would be x[#.id] === x.id.

... which is strictly weaker, more complex, and less explanatory. Let's simplify things by taking the '===' operator out of the picture. For ES5 we have x["id"] ≡ x.id in any context, and for ES5 + private_names we have x[#.id] ≡ x.id, where ≡ means substitutability.

x["id"] ≡ x.id relates the meaning of existing basic constructs: string literals, '[]', and '.'. In particular, it defines the semantics of . in terms of string literals and [], so that . need not be considered as being in the "core" or "kernel" of the language [*].

In the case of x[#.id] ≡ x.id, '#.id' is a new construct that is being added as part of the private names proposal. Furthermore, the meaning of '#.id' is context-dependent; it depends on whether a 'private id' declaration is in scope. So, what if we want to understand '.' in terms of existing constructs? Unfortunately, '#.id' must be primitive; there is nothing else that it can desugar to because 'private id' does not introduce an ordinary variable (unlike 'const id_ = SoftField()', say). Rather it introduces an element in an entirely new lexically scoped namespace alongside ordinary variables. This is fundamentally more complex than "id", which is just a stringification of the identifier.

[*] Yes, I know that the ES5 spec doesn't take a "kernel language approach" to defining ECMAScript. That doesn't mean it can't be understood that way, and it's very useful to be able to do so.

As Mark points out, though, that syntax can be supported with either proposal. The private names proposal is more entangled with syntactic changes, but that's a bug, not a feature.

No, that is a usability feature.

You're misunderstanding me.

The syntax could be considered a usability feature. Some people like it, others don't.

The fact that the proposal is entangled with that syntax, so that it is difficult to see its semantic consequences separate from the syntax, cannot possibly be considered a feature of the proposal, at the meta level of the language design process. At that level it's clearly undesirable -- as the course of the discussion has amply demonstrated!

The inherited soft fields approach is more entangled with its reference implementation, which is not the efficient route VM implementors can swallow.

I think you're being rather patronising to VM implementors (including yourself!) if you think that they're incapable of understanding how a feature specified in this way is intended to be implemented. Of course they can.

Specifying it in this way has very concrete benefits to VM implementors:

  • A test suite can directly compare this very simple reference implementation with the optimized implementation, to check that they give the same results in cases of interest. (It's still necessary to identify which cases are needed to give adequate test coverage, but that's no more difficult than in any other approach.)

  • Disputes about the validity of any proposed optimization can be resolved by asking what the reference implementation would do in that case.

  • The specification is concise and localised, without being cryptic. This kind of conciseness aids understanding.

Of course this style of specification also has potential disadvantages, the main one being a risk of overspecification. However, to argue against a particular proposal in this style, you need to say why that proposal overspecifies, not just handwave that it must. I have looked at the proposed definition of SoftField carefully and cannot see any overspecification in it. The only mildly tricky part is the treatment of the 'undefined' value, and it's fairly easy to see that that is handled as intended.

# Lasse Reichstein (14 years ago)

On Thu, 16 Dec 2010 23:19:12 +0100, Mark S. Miller <erights at google.com>

wrote:

On Thu, Dec 16, 2010 at 1:58 PM, Kris Kowal <kris.kowal at cixar.com> wrote:

On Thu, Dec 16, 2010 at 1:53 PM, David Herman <dherman at mozilla.com>
wrote:

function Point(x, y) { private x, y; this.x = x; this.y = y; ... }

than

function Point(x, y) { var _x = gensym(), _y = gensym(); this[_x] = x; this[_y] = y; }

I tend to disagree with most developers, so take it with a grain of salt that I find the latter form, with all the implied abilities, easier to understand.

I do too. While terseness clearly contributes to understandability, regularity and simplicity do too. When these conflict, we should be very careful about sacrificing regularity.

While I dislike the "private" syntax just as much, it does have the advantage of being statically detectable as using a private name, both "this.foo" in the scope of "private foo", and "this[#.foo]". The "gensym" syntax requires runtime checks to recognize that _x is a non-string property name.

Currently in JS, x['foo'] and x.foo are precisely identical in all contexts. This regularity helps understandability. The terseness difference above is not an adequate reason to sacrifice it.

Agree. I would prefer something like x.#foo to make it obvious that it's not the same as x.foo (also so you can write both in the same scope), and use "var bar = #foo /* or just foo */; x[#bar]" for computed private name lookup. I.e. effectively introducing ".#" and "[#" as alternatives to just "." or "[".

I'm not sure whether it's better to use an operator to reify the private name ("#foo") or to have it directly denotable ("foo") in an expression context. Both can be used. In the latter case, x.#foo and x[#foo] would be equivalent.

# David-Sarah Hopwood (14 years ago)

On 2010-12-21 08:49, Lasse Reichstein wrote:

On Thu, 16 Dec 2010 23:19:12 +0100, Mark S. Miller <erights at google.com> wrote:

On Thu, Dec 16, 2010 at 1:58 PM, Kris Kowal <kris.kowal at cixar.com> wrote:

On Thu, Dec 16, 2010 at 1:53 PM, David Herman <dherman at mozilla.com> wrote:

[...]

than

function Point(x, y) { var _x = gensym(), _y = gensym(); this[_x] = x; this[_y] = y; }

I tend to disagree with most developers, so take it with a grain of salt that I find the latter form, with all the implied abilities, easier to understand.

I do too. While terseness clearly contributes to understandability, regularity and simplicity do too. When these conflict, we should be very careful about sacrificing regularity.

While I dislike the "private" syntax just as much, it does have the advantage of being statically detectable as using a private name, both "this.foo" in the scope of "private foo", and "this[#.foo]".

That's not correct in general, since '#.foo' is first-class. (The specific case "expr[#.foo]" is more easily optimizable without type inference, but that's a case in which the #. syntax need not have been used.)

The "gensym" syntax requires runtime checks to recognize that _x is a non-string property name.

Any expr[p] lookup needs a check for whether p is a string, when that cannot be determined by type inference. The check that it is a private name or soft field when it is not a string is on the infrequent path, so will not significantly affect performance.

Currently in JS, x['foo'] and x.foo are precisely identical in all contexts. This regularity helps understandability. The terseness difference above is not an adequate reason to sacrifice it.

Agree. I would prefer something like x.#foo to make it obvious that it's not the same as x.foo (also so you can write both in the same scope), and use "var bar = #foo /* or just foo */; x[#bar]" for computed private name lookup. I.e. effectively introducing ".#", "[#" as alternatives to just "." or "[".

If we're going to add an operator specifically for private lookup, we only need one, for example:

function Point(x, y) { var _x = SoftField(), _y = SoftField(); this.#_x = x; this.#_y = y; }

(i.e. 'MemberExpression .# PrimaryExpression' or alternatively 'MemberExpression [ # Expression ]')
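[Editor's note] A hedged sketch of what that hypothetical '.#' operator might desugar to under soft fields, using WeakMaps directly (the names xField/yField are invented for illustration): 'this.#x = v' becomes a transposed set, and 'p.#x' a transposed get.

```javascript
// Each soft field is a side table shared by all instances.
const xField = new WeakMap(), yField = new WeakMap();

function Point(x, y) {
  // 'this.#x = x' would desugar to a transposed set on the field's table:
  xField.set(this, x);
  yField.set(this, y);
}

const p = new Point(1, 2);
console.log(xField.get(p)); // 1 -- 'p.#x' would desugar to xField.get(p)
console.log(yField.get(p)); // 2
```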

# Brendan Eich (14 years ago)

On Dec 20, 2010, at 11:05 PM, David-Sarah Hopwood wrote:

The new equivalence under private names would be x[#.id] === x.id.

... which is strictly weaker, more complex, and less explanatory.

So is a transposed get from an inherited soft field. Soft fields change the way square brackets work in JS, for Pete's sake!

Talk about more complex and less explanatory. Yes, if you know about weak maps and soft fields, then it follows -- that is a bit too circular, too much assuming the conclusion.

Either way (soft fields vs. private names), something changes from the old x["id"] / x.id equivalence.

So, what if we want to understand '.' in terms of existing constructs? Unfortunately, '#.id' must be primitive; there is nothing else that it can desugar to because 'private id' does not introduce an ordinary variable (unlike 'const id_ = SoftField()', say).

SoftField(), #.id -- something new in either case. And what's this "const id_"? A gensym?

It's tiresome to argue by special pleading that one extension or transformation (including generated symbols) is "more complex, and less explanatory", while another is less so, when the judgment is completely subjective. And the absolutism about how it's always better in every instance to use strong encapsulation is, well, absolutist (i.e., wrong).

We should debate strong vs. weak encapsulation, for sure, and in the other thread you started (thanks for that). But without absolutes based on preferences or judgment calls about trade-offs and economics. People differ on abstraction leak costs and make different trade-offs in programming all the time.

Rather it introduces an element in an entirely new lexically scoped namespace alongside ordinary variables. This is fundamentally more complex than "id", which is just a stringification of the identifier.

I agree that "private x" adds complexity to the spec. It adds something to solve a use-case not satisfied by the existing language. There's (again) a trade-off, since with this new syntax, the use-cases for private names become more usably expressible.

The fact that the proposal is entangled with that syntax, so that it is difficult to see its semantic consequences separate from the syntax, cannot possibly be considered a feature of the proposal, at the meta level of the language design process.

Didn't I already agree that it's a good idea to separate "private x" from the semantics, since we have a conflict over semantics?

So let's do that (my plea to everyone, not just you). Let's separate "private x" syntax, since I now know of a use-case courtesy Mark, and it's a good one (a frozen AST being extended sparsely via soft fields) that wants that "private x" and the sweet dot operator syntax, but on top of soft fields not private property names that require unfrozen objects.

The inherited soft fields approach is more entangled with its reference implementation, which is not the efficient route VM implementors can swallow.

I think you're being rather patronising to VM implementors (including yourself!) if you think that they're incapable of understanding

I wrote "can swallow" not "can understand". "Swallow" and "understand" have pretty different connotations.

Mapping from soft fields to something more efficient that VM implementors will implement is non-trivial. Requiring all implementors (the primary audience of ECMA-262) to do this mapping, each on his or her own, is a bad idea. The spec should use formalisms that are not at odds with common implementation. But let's wait to hear from more implementors on this point.

In the mean time, how about we quit fencing over matters of taste or trade-offs turned into false absolutes, and try to get ahead on semantics: the issues that remain even after separating syntax are the abstraction leaks.

With inherited soft fields, the ability to "extend" frozen objects with private fields is an abstraction leak (and a feature, I agree).

With inherited soft fields, the transposed get or set magic that changes how square brackets work in JS is a leak on the inside of the abstraction. If you don't like x[#.id] / x.id supplanting x["id"] / x.id, it seems to me you have to count some similar demerits against this change.

With private names as proposed in full, the #.id syntax, which can reflect a private name as an expression result (including the typeof-type or built-in class of a private name), is definitely a new complexity that makes an overt, observable difference between soft fields and private names. No such operator exists for soft fields.

The weak encapsulation design points are likewise "leaky" for private names, where no such leaks arise with soft fields: reflection and proxies can learn private names, they "leak" in the real ocap sense that secure subsets will have to plug.

To make progress, we could try to agree on strong encapsulation only. TC39 works by consensus, meaning general agreement, so we may not achieve consensus on strong encapsulation, but if we could, then I think almost all our semantic quarrels go away, since nothing other than frozen objects being "extensible" via soft fields, but not via private names, would be observable. Perhaps we could even agree that this was a feature of soft fields and be done.

If we somehow all agreed in committee (meaning, without you :-P) on strong encapsulation, then private names wouldn't reflect as values, period. I raise this even though it looks like it won't get consensus, to give it a fair and clear try.

If we can't get consensus in favor of only strong encapsulation, we might try for consensus in favor of weak encapsulation, with secure subset languages having a solid and demonstrated way to restore strong encapsulation at relatively low cost. But that would have to be solid and demonstrated.

I hope this helps. I'm not looking to debate to the death over all of private names vs. all of soft fields, since I'm pretty sure neither will have total victory. We do not want to end up with nothing, if there is a "something" we all could agree on that would materially help developers.

In this light, I'm still sympathetic to weak encapsulation. Mainstream languages do not lock all escape hatches: java.lang.reflect, for example, discloses private members. Languages that try to lock all escape hatches fail or breed extensions, often wildly unsafe. In particular, only closures in JS make leak-proof encapsulations, and no one (I hope!) wants to change this (debuggers do not count).

I also see the ocap purity of soft fields, and I like Mark's AST-decorated-sparsely soft fields use-case. But we already have weak maps in harmony:proposals, so one can write such code now, just at some loss of convenience: without square brackets or (even better) dots for convenient soft-field access expressions.

This makes me think we want usable syntax for soft fields, as for anything like private property names. So to close, again I'd like to urge consensus building by splitting out the syntactic proposals where we can. Or even just mentally separating them for now.

# Oliver Hunt (14 years ago)

Just giving my feedback to this (so it's recorded somewhere other than my head).

I find the apparent necessity of conflating syntax and semantics irksome; I'd much rather there be two distinct discussions, one for the syntax and the other for the semantics of soft fields, private names, gremlins, etc. (or whatever this general concept ends up being called)

That said I don't really like the private names syntax, mostly for reasons others have brought up already, but one case I don't recall seeing is something that I think will lead to surprise on the part of developers. Say I have a piece of code:

```js
function MyAwesomeThing() { /* ... */ }

MyAwesomeThing.prototype.myCoolFunction = function() {
  if (!this._myCachedHotness)
    this._myCachedHotness = doExpensiveThing(this);
  return this._myCachedHotness;
};
```

I see this nifty private names feature, and say "cool! now i can make my cache super secret!" and do:

```js
MyAwesomeThing.prototype.myCoolFunction = function() {
  private cachedHotness;
  if (!this.cachedHotness)
    this.cachedHotness = doExpensiveThing(this);
  return this.cachedHotness;
};
```

I would expect this to work. That's what the syntax makes me think. But it won't work because 'cachedHotness' is going to be different on every call (at least to my reading).

I am not trying to argue that making the above work is impossible -- you just need to use a few closures to get everything into the right place. But it is contrary to what I might expect or want.
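The closure fix alluded to above can be sketched as follows: generate the name once, in a scope that outlives the method, rather than on every call. In this sketch a Symbol (the mechanism ES2015 eventually shipped) stands in for a gensym'd private name, and `doExpensiveThing` is a hypothetical costly computation.

```javascript
// Generate the key ONCE, outside the method, so every call sees the
// same name. (Symbol here stands in for a gensym'd private name.)
const cachedHotness = Symbol("cachedHotness");

function MyAwesomeThing() {}

MyAwesomeThing.prototype.myCoolFunction = function () {
  if (!this[cachedHotness]) {
    this[cachedHotness] = doExpensiveThing(this);
  }
  return this[cachedHotness];
};

let calls = 0;
function doExpensiveThing(obj) { // hypothetical: any costly computation
  calls += 1;
  return { hot: true };
}

const t = new MyAwesomeThing();
const first = t.myCoolFunction();
const second = t.myCoolFunction();
// first === second, and doExpensiveThing ran only once
```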

wrt. proxies, I still think that we should just allow all non-object property names to transfer through uncoerced. It would solve the problem of communicating private names to a proxy and allow efficient array-like implementations.

# Brendan Eich (14 years ago)

On Dec 21, 2010, at 4:01 PM, Oliver Hunt wrote:

Just giving my feedback to this (so it's recorded somewhere other than my head).

I find the apparent necessity of conflating syntax and semantics irksome, i'd much rather that there be two distinct discussions one of syntax and the other for semantics of soft-fields, private names, gremlins, etc (or whatever this general concept ends up being called)

That said I don't really like the private names syntax, mostly for reasons others have brought up already, but one case I don't recall seeing is something that I think will lead to surprise on the part of developers. Say I have a piece of code:

function MyAwesomeThing() { .... }

MyAwesomeThing.prototype.myCoolFunction = function() { if (!this._myCachedHotness) this._myCachedHotness = doExpensiveThing(this) return this._myCachedHotness; }

I see this nifty private names feature, and say "cool! now i can make my cache super secret!" and do:

MyAwesomeThing.prototype.myCoolFunction = function() { private cachedHotness; if (!this.cachedHotness) this.cachedHotness = doExpensiveThing(this) return this.cachedHotness; }

I would expect this to work. That's what the syntax makes me think. But it won't work because 'cachedHotness' is going to be different on every call (at least to my reading).

Why does your expectation differ here compared to the following:

```js
MyAwesomeThing.prototype.myCoolFunction = function() {
  var cachedHotness = gensym();
  if (!this[cachedHotness])
    this[cachedHotness] = doExpensiveThing(this);
  return this[cachedHotness];
};
```

Is it because |private cachedHotness;| does not "look generative"?

I am not trying to argue that making the above work is impossible -- you just need to use a few closures to get everything into the right place. But it is contrary to what I might expect or want.

wrt. proxies, I still think that we should just allow all non-object property names to transfer through uncoerced. It would solve the problem of communicating private names to a proxy and allow efficient array-like implementations.

You mean non-string property names, right?

But what is an array index, then? uint32 is not a type in the language. Would proxy[3.14] really pass a double through?

Array elements are named by a weird uint32 index name, with consequences on 'length' (but only up to 2^32 - 1 for length). I don't think passing the property name through uncoerced helps, unless you assume a normalizing layer above all name-based operations that specializes to index-names per Array's uint32 magic weirdness.

# Oliver Hunt (14 years ago)

On Dec 21, 2010, at 4:25 PM, Brendan Eich wrote:

On Dec 21, 2010, at 4:01 PM, Oliver Hunt wrote:

Just giving my feedback to this (so it's recorded somewhere other than my head).

I find the apparent necessity of conflating syntax and semantics irksome, i'd much rather that there be two distinct discussions one of syntax and the other for semantics of soft-fields, private names, gremlins, etc (or whatever this general concept ends up being called)

That said I don't really like the private names syntax, mostly for reasons others have brought up already, but one case I don't recall seeing is something that I think will lead to surprise on the part of developers. Say I have a piece of code:

function MyAwesomeThing() { .... }

MyAwesomeThing.prototype.myCoolFunction = function() { if (!this._myCachedHotness) this._myCachedHotness = doExpensiveThing(this) return this._myCachedHotness; }

I see this nifty private names feature, and say "cool! now i can make my cache super secret!" and do:

MyAwesomeThing.prototype.myCoolFunction = function() { private cachedHotness; if (!this.cachedHotness) this.cachedHotness = doExpensiveThing(this) return this.cachedHotness; }

I would expect this to work. That's what the syntax makes me think. But it won't work because 'cachedHotness' is going to be different on every call (at least to my reading).

Why does your expectation differ here compared to the following:

MyAwesomeThing.prototype.myCoolFunction = function() { var cachedHotness = gensym(); if (!this[cachedHotness]) this[cachedHotness] = doExpensiveThing(this) return this[cachedHotness]; }

Is it because |private cachedHotness;| does not "look generative"?

Yes. The fact that we know how they are implemented behind the scenes may make the non-obvious appear obvious.

I am not trying to argue that making the above work is impossible -- you just need to use a few closures to get everything into the right place. But it is contrary to what I might expect or want.

wrt. proxies, I still think that we should just allow all non-object property names to transfer through uncoerced. It would solve the problem of communicating private names to a proxy and allow efficient array-like implementations.

You mean non-string property names, right?

Yes

But what is an array index, then? uint32 is not a type in the language. Would proxy[3.14] really pass a double through?

Yes, I would expect no coercion of any non-object. The reason for disallowing objects is safety afaik, those arguments don't apply to non-objects.

Array elements are named by a weird uint32 index name, with consequences on 'length' (but only up to 2^32 - 1 for length). I don't think passing the property name through uncoerced helps, unless you assume a normalizing layer above all name-based operations that specializes to index-names per Array's uint32 magic weirdness.

And people are welcome to implement those semantics if they so desire. I just see no reason to artificially limit behaviour. Especially when the alternative is to say that the argument to the trap "will be a string, except in this case that you don't expect". I think it's better to be consistent and simply allow all non-objects through. This allows better perf for some use cases, and to my mind simpler semantics in the face of things like private names.

# Brendan Eich (14 years ago)

On Dec 21, 2010, at 4:51 PM, Oliver Hunt wrote:

But what is an array index, then? uint32 is not a type in the language. Would proxy[3.14] really pass a double through? Yes, I would expect no coercion of any non-object. The reason for disallowing objects is safety afaik, those arguments don't apply to non-objects.

Array elements are named by a weird uint32 index name, with consequences on 'length' (but only up to 2^32 - 1 for length). I don't think passing the property name through uncoerced helps, unless you assume a normalizing layer above all name-based operations that specializes to index-names per Array's uint32 magic weirdness.

And people are welcome to implement those semantics if they so desire.

If engines do not agree on whether 0xffffffff as a property name goes through a proxy get trap as a number and not a string, we have a problem.

Not all engines optimize 0xffffffff to the same (uint32) value; some keep it as a string since it doesn't fit in an int32.

I just see no reason to artificially limit behaviour.

The spec must prescribe exactly what is coerced and what is not, or we lose interoperation.

Engines that choose to optimize id to a union of string with int32, e.g., might need to change. Some engines use tagged words still, so 31-bit signed int, not int32.

It's not clear every implementor will agree. It's also not obvious why the spec must dictate implementation here if the performance has been tuned already for non-proxy classes, and the results were whatever they were (different among implementations, probably; non-standard, definitely). Why should proxies cause retuning in some value-neutral way that assumes a certain dynamic frequency of id types?

This looks like over-specification.

# Oliver Hunt (14 years ago)

On Dec 21, 2010, at 5:00 PM, Brendan Eich wrote:

On Dec 21, 2010, at 4:51 PM, Oliver Hunt wrote:

But what is an array index, then? uint32 is not a type in the language. Would proxy[3.14] really pass a double through? Yes, I would expect no coercion of any non-object. The reason for disallowing objects is safety afaik, those arguments don't apply to non-objects.

Array elements are named by a weird uint32 index name, with consequences on 'length' (but only up to 2^32 - 1 for length). I don't think passing the property name through uncoerced helps, unless you assume a normalizing layer above all name-based operations that specializes to index-names per Array's uint32 magic weirdness.

And people are welcome to implement those semantics if they so desire.

If engines do not agree on whether 0xffffffff as a property name goes through a proxy get trap as a number and not a string, we have a problem.

Not all engines optimize 0xffffffff to the same (uint32) value; some keep it as a string since it doesn't fit in an int32.

What does that have to do with anything? That's an internal implementation detail, not something that is directly observable from JS (you can detect it indirectly through timing, etc.)

I just see no reason to artificially limit behaviour.

The spec must prescribe exactly what is coerced and what is not, or we lose interoperation.

Okay, this and the prior comment indicate that you're missing what I am saying.

I am not suggesting that we expose the internal optimisations for avoiding int->string->int.

I am being very precise: anything that is not an object is passed through with no coercion of any kind.

e.g., assuming the get trap is:

```js
function getTrap(property) { log(property + ": " + typeof property); }

myProxy[0]         // => 0: number
myProxy[1.5]       // => 1.5: number
myProxy["0"]       // => 0: string
myProxy[true]      // => true: boolean
myProxy[undefined] // => undefined: undefined
myProxy[null]      // => null: object (questionable -- null is classed as an
                   //    object, but I doubt many people actually think of it that way)
myProxy[somePrivateNameThingy] // => ???: ???? (I have no idea what typeof
                               //    privatename or String(privateName) are expected to do)
```
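For contrast, the coercion rule that eventually shipped in ES2015 stringifies every non-symbol key before the trap sees it, so a modern get trap observes only strings and symbols. A small sketch with the standard Proxy API:

```javascript
// Every numeric, boolean, null, or undefined key arrives in the trap
// already coerced to a string; only symbols pass through as-is.
const seen = [];
const p = new Proxy({}, {
  get(target, property) {
    seen.push(typeof property);
    return undefined;
  }
});

p[0];           // key "0"         -> "string"
p[1.5];         // key "1.5"       -> "string"
p[true];        // key "true"      -> "string"
p[undefined];   // key "undefined" -> "string"
p[Symbol("s")]; //                 -> "symbol"

console.log(seen); // ["string", "string", "string", "string", "symbol"]
```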

# David Flanagan (14 years ago)

On 12/21/2010 04:25 PM, Brendan Eich wrote:

Why does your expectation differ here compared to the following:

MyAwesomeThing.prototype.myCoolFunction = function() { var cachedHotness = gensym(); if (!this[cachedHotness]) this[cachedHotness] = doExpensiveThing(this) return this[cachedHotness]; }

Is it because |private cachedHotness;| does not "look generative"?

I agree with Oliver: the private keyword is going to cause confusion. It looks like it is declaring something, not generating something. A small step toward making the proposed syntax less Java-like (and therefore less likely to cause confusion) might be:

use private cachedHotness;

A use directive feels vaguely more comfortable here to me. It makes it clearer to the programmer that some kind of magic is going on.

But I confess that I haven't actually read Allen's proposal, so take this with a grain of salt.

# Brendan Eich (14 years ago)

On Dec 21, 2010, at 5:09 PM, Oliver Hunt wrote:

On Dec 21, 2010, at 5:00 PM, Brendan Eich wrote:

On Dec 21, 2010, at 4:51 PM, Oliver Hunt wrote:

But what is an array index, then? uint32 is not a type in the language. Would proxy[3.14] really pass a double through? Yes, I would expect no coercion of any non-object. The reason for disallowing objects is safety afaik, those arguments don't apply to non-objects.

Array elements are named by a weird uint32 index name, with consequences on 'length' (but only up to 2^32 - 1 for length). I don't think passing the property name through uncoerced helps, unless you assume a normalizing layer above all name-based operations that specializes to index-names per Array's uint32 magic weirdness.

And people are welcome to implement those semantics if they so desire.

If engines do not agree on whether 0xffffffff as a property name goes through a proxy get trap as a number and not a string, we have a problem.

Not all engines optimize 0xffffffff to the same (uint32) value; some keep it as a string since it doesn't fit in an int32.

What does that have to do with anything? That's an internal implementation detail, not something that is directly observable from JS (you can detect it indirectly through timing, etc.)

It matters to implementors what they might have to convert back to if the object is a proxy.

This started because you said "It would solve the problem of communicating private names to a proxy and allow efficient array-like implementations." Two issues:

  1. Some people do not want to communicate private names to proxies. Others do, but there's no problem in principle, whether private names are a new typeof-type or just a built-in object [[Class]]. Your proposal here does nothing for the people who object, and doesn't really matter for those in favor of leaking private names via proxy handler trap name parameters.

  2. "allow efficient array-like implementations" -- not really. Arrays must equate "42" and 42, and must update length for indexes in [0, 0xfffffffe] (closed range notation). Other properties must be string-equated. While it helps to get 42 instead of "42", it does not help to get 3.14 instead of "3.14" if you are implementing an array-like.

You want arbitrary values as identifiers to flow through to the proxy handler trap's name param without coercion. But many engines currently do coerce, and to internal (not in-language) types.

So either implementations have to use a generic Value type for all property names above the per-[[Class]] implementation layer, and coerce only under that layer; or else keep their optimized above-the-[[Class]]-layer property-name type encodings (using internal types, etc.) and undo the coercion in the Proxy (object and function proxy) implementations, to implement what you want.

I don't believe all implementations will re-layer to let values pass through uncoerced. This leaves undoing the coercion in proxy trap-calling code, which is not only more expensive for "3.14" -> 3.14, it is ambiguous: the original type was lost due to coercion, so you're really talking about requiring all engines to layer things so property names are never coerced (apart from string intern'ing).

I don't see that flying with all implementors, and I don't see it buying array-like proxies much. It still looks like overspecification.

To let private names through, we need only enlarge the type of names in ECMA-262 from string to (string | private name) as the proposal says. To let array indexes through, we might do something similar, but it would not let 3.14 through as a double.
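The uint32 index magic described above is easy to observe directly in any engine today: arrays equate the string "42" and the number 42 as the same index property, while a non-index number like 3.14 is just the string "3.14". A sketch:

```javascript
// Arrays equate the string "42" and the number 42 as the same index,
// and 'length' tracks array indexes in [0, 2^32 - 2].
const a = [];
a[42] = "x";
console.log(a["42"]);   // "x" -- the same property
console.log(a.length);  // 43

// A non-index numeric key like 3.14 is just the string "3.14":
a[3.14] = "pi";
console.log(a.length);  // still 43 -- not an array index
console.log(a["3.14"]); // "pi"
```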

# Brendan Eich (14 years ago)

On Dec 21, 2010, at 5:13 PM, David Flanagan wrote:

On 12/21/2010 04:25 PM, Brendan Eich wrote:

Why does your expectation differ here compared to the following:

MyAwesomeThing.prototype.myCoolFunction = function() { var cachedHotness = gensym(); if (!this[cachedHotness]) this[cachedHotness] = doExpensiveThing(this) return this[cachedHotness]; }

Is it because |private cachedHotness;| does not "look generative"?

I agree with Oliver: the private keyword is going to cause confusion. It looks like it is declaring something, not generating something.

It's a fair objection for sure.

Mark made a different objection in strawman:inherited_explicit_soft_fields#can_we_subsume_names: "I (MarkM) do not like the sugar proposed for Names, as I think it encourages confusion between literal text and lexical lookup."

That is, without private_names, the "p" in the expression |o.p| is literal text, not lexically bound as it could be via a prior |private p|. This too is a fair objection, not an absolute as Mark's words convey, rather an informed opinion given with pretty-clear reason backing it.

So (must remember to say this), thanks to Oliver and Mark for these objections (and to David for amplifying).

A small step toward making the proposed syntax less Java-like (and therefore less likely to cause confusion) might be:

use private cachedHotness;

We have that kind of syntax reserved for pragmas:

strawman:pragmas

It seems even less generative, since pragmas are typically compile-time (or compile-and-runtime, but at least compile-time).

A use directive feels vaguely more comfortable here to me. It makes it clearer to the programmer that some kind of magic is going on.

But I confess that I haven't actually read Allen's proposal, so take this with a grain of salt.

What's generative in JS already? Mutable object "literals":

```js
function f() {} // inside the outer myCoolFunction
var o = {p: 1};
var a = [1, 2];
var r = /hi/;   // thanks to ES5 recognizing reality over ES3
```

That suggests the obvious:

private cachedHotness = gensym();

but then the gensym() initializer, which does indeed scream "new generated value here on every evaluation!", becomes both proper notice and (over time) deadwood, boilerplate. Plus, what's gensym calling? Maybe that built-in is shadowed.

We could try shortening and using a keyword for unambiguous and novel generativity-here notice to readers:

private cachedHotness = new;

I admit, it looks weird. Lose the =, try for special-form syntax like a function declaration uses:

private cachedHotness {};

Does this "look generative"?

How about a "more special" special form:

new private cachedHotness;

Not generative-looking enough?

Discussion list fodder, glad we have es-discuss for this stuff.

# Allen Wirfs-Brock (14 years ago)

On Dec 21, 2010, at 5:57 PM, Brendan Eich wrote:

A use directive feels vaguely more comfortable here to me. It makes it clearer to the programmer that some kind of magic is going on.

But I confess that I haven't actually read Allen's proposal, so take this with a grain of salt.

What's generative in JS already? Mutable object "literals":

function f() {} // inside the outer myCoolFunction var o = {p:1}; var a = [1,2]; var r = /hi/; // thanks to ES5 recognizing reality over ES3

That suggests the obvious:

private cachedHotness = gensym();

but then the gensym() initializer, which does indeed scream "new generated value here on every evaluation!", becomes both proper notice and (over time) deadwood, boilerplate. Plus, what's gensym calling? Maybe that built-in is shadowed.

We could try shortening and using a keyword for unambiguous and novel generativity-here notice to readers:

private cachedHotness = new;

I admit, it looks weird. Lose the =, try for special-form syntax like a function declaration uses:

private cachedHotness {};

Does this "look generative"?

How about a "more special" special form:

new private cachedHotness;

Not generative-looking enough?

Discussion list fodder, glad we have es-discuss for this stuff.

/be

This is a useful line of discussion. Up to this point there has been quite a bit of "I don't like the syntax in the private names proposal" but not a lot of suggested alternatives.

My request to anyone who wants some sort of private object state access is: propose the syntax you want to use.

In doing so, you will have to decide what constraints you are going to impose upon yourself. In the private names proposal, a constraint we imposed was that we would only use keywords that were already identified as future reserved words in the ES5 spec. That pretty much left "private" and "protected" as usable keywords. After some private conversations I concluded that "private" was probably going to be a better choice than "protected".

How much of the discomfort about the proposed syntax comes from implications associated with the word "private"? What if we allowed expansion of the set of available keywords? Would any of the following be more comfortable:

```js
property cachedHotness;
field cachedHotness;
name cachedHotness;
```

Alternatively, if we eliminated the ability to use dotted access syntax for private properties/fields, we wouldn't need a declarator at all. We could get by with something like:

```js
const cachedHotness = gensym(); // this was new Name() in the original Names proposal
const obj = {};
obj[cachedHotness] = foo.hotness;
```

This still requires the ToPrivateName extension of the [ ] semantics described in the "Accessing Private Names as Values" section of the proposal. It eliminates the ability to say obj.cachedHotness and have it mean anything related to the value that was gensym'ed. obj.cachedHotness would always mean obj["cachedHotness"] exactly as it does now. It would also eliminate any scoping issues related to private names.
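This declaration-free alternative is, modulo spelling, what ES2015 symbols later provided: Symbol() plays the role of gensym()/new Name(), and ordinary bracket access does the rest. A sketch (names invented for illustration):

```javascript
// Symbol() as the gensym: a fresh, unforgeable property key.
const cachedHotness = Symbol("cachedHotness");
const foo = { hotness: 9000 };

const obj = {};
obj[cachedHotness] = foo.hotness;

// Dotted access is untouched: obj.cachedHotness still means
// obj["cachedHotness"], which is a different (absent) property.
console.log(obj[cachedHotness]); // 9000
console.log(obj.cachedHotness);  // undefined
// The symbol key is also invisible to ordinary enumeration:
console.log(Object.keys(obj));   // []
```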

However, a consequence of eliminating the private declaration is that we also eliminate the obvious way to extend object literals to support private properties:

```js
const obj = {
  cachedHotness: foo.hotness // not the same as obj[cachedHotness] = foo.hotness; in the above example
};
```

This seems quite undesirable as I think quite a few of us favor declarative object construction using object literals over imperative object construction.

There are a lot of trade-offs like this that went into the private names proposal. I'd be happy to see other complete alternative proposals. I'm also happy to talk about the rationale behind each feature of the proposal and to consider alternatives. However, there are a number of inter-related design decisions, and you can't necessarily change just one of them without impacting the others.

# Allen Wirfs-Brock (14 years ago)

On Dec 21, 2010, at 4:01 PM, Oliver Hunt wrote:

function MyAwesomeThing() { .... }

MyAwesomeThing.prototype.myCoolFunction = function() { if (!this._myCachedHotness) this._myCachedHotness = doExpensiveThing(this) return this._myCachedHotness; }

I see this nifty private names feature, and say "cool! now i can make my cache super secret!" and do:

MyAwesomeThing.prototype.myCoolFunction = function() { private cachedHotness; if (!this.cachedHotness) this.cachedHotness = doExpensiveThing(this) return this.cachedHotness; }

I would expect this to work. That's what the syntax makes me think. But it won't work because 'cachedHotness' is going to be different on every call (at least to my reading).

I am not trying to argue that making the above work is impossible -- you just need to use a few closures to get everything into the right place. But it is contrary to what I might expect or want.

I don't think there is any new issue here we don't already have with things like closure captured object state or prototype construction such as:

```js
function awesomeFactory(a, b) {
  const awesomeProto = { ... }; // oops, better move this outside of awesomeFactory!!
  return Object.create(awesomeProto, {a: {value: a}, b: {value: b}});
}
```

Personally, I think function own (aka static) declarations are a good solution to this problem and could apply equally well to private name declarations. However, that's not something that I want to put on the table at this time.
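For comparison, the closure-capture fix for the awesomeFactory example is simply to hoist the shared prototype out of the factory, and the same move would apply to a generated private name. A sketch, with an invented `describe` method filling in the elided prototype body:

```javascript
// Hoisted outside the factory so every instance shares ONE prototype
// (and, analogously, one generated private name).
const awesomeProto = {
  describe() { return this.a + "/" + this.b; }
};

function awesomeFactory(a, b) {
  return Object.create(awesomeProto, {
    a: { value: a },
    b: { value: b }
  });
}

const x = awesomeFactory(1, 2);
const y = awesomeFactory(3, 4);
// Both instances share the same prototype object:
console.log(Object.getPrototypeOf(x) === Object.getPrototypeOf(y)); // true
console.log(x.describe()); // "1/2"
```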

# Allen Wirfs-Brock (14 years ago)

On Dec 21, 2010, at 2:12 PM, Brendan Eich wrote:

I also see the ocap purity of soft fields, and I like Mark's AST-decorated-sparsely soft fields use-case. But we already have weak maps in harmony:proposals, so one can write such code now, just at some loss of convenience: without square brackets or (even better) dots for convenient soft-field access expressions.

(I'm not sure where Mark's original AST use case is, or I would have also quoted it.)

To me, this use case sounds like a form of aspect oriented programming. I'm not at all sure that AOP support is something we want to add as an additional requirement to our designs. As Brendan points out, if you really want to do this, you can use weak maps whose inclusion I strongly support for exactly this sort of use case.

However, why would you bother freezing your AST nodes in the first place? JavaScript has a great mechanism for "soft fields" -- it's called properties. You can even make your base properties non-configurable if you want to. But why make them non-extensible in this situation? My sense is that this is a great fear of an important segment of the JavaScript usage community: that people will start arbitrarily freezing or otherwise locking down objects, resulting in systems that are much less "elastic" than they are today.

# Brendan Eich (14 years ago)

On Dec 21, 2010, at 8:17 PM, Allen Wirfs-Brock wrote:

However, why would you bother freezing your AST nodes in the first place? JavaScript has a great mechanism for "soft fields" -- it's called properties. You can even make your base properties non-configurable if you want to. But why make them non-extensible in this situation?

For safer parallelization: see, e.g., www.ics.uci.edu/~franz/Site/pubs-pdf/ICS-TR-07-12.pdf, or its successor: www.springerlink.com/content/u16t58805r879263 (trace-trees of linear SSA, not ASTs, but no matter).

The pattern is pretty common in compilers these days (rustc, the self-hosted Rust compiler, has an immutable AST; McPeak & Wilkerson's Elsa/Oink tools from UCB, which we at Mozilla used at the start for our C++ static analysis work over four years ago, had an immutable AST). It's not AOP.

We're looking into data parallel extensions to JS, not anywhere near ready to propose for standardization, but plausible now given the ability to freeze.

My sense is that this is a great fear of an important segment of the JavaScript usage community: that people will start arbitrarily freezing or otherwise locking down objects, resulting in systems that are much less "elastic" than they are today.

This is not a relevant fear in my view. It's also kind of silly given all the open source JS libraries. If someone did over-freeze, you could stop using their library, or fork and fix it. Libraries that suck tend to die fast.

Mark did bring up freezing primordials recently, and I know that causes some "Dr. Freeze" fear (even on this list the other year, from Arv, IIRC). Nevertheless, it's simply not credible that we on TC39 will agree to freeze primordials in any ECMA-262 edition I can foresee.

Sometimes fear is an appropriate reaction. The lamb fears the wolf. When some overwhelming force threatens you, be afraid. But there is no Freeze Force both willing and able to take over the JS world.

We don't need to be afraid of well-used immutability for safety and parallelization. Such filter-pipeline architectures do need weak maps or better to associate filter-specific fields with shared immutable data.

This can of course be done explicitly, but the implicit "private x" or (to a lesser extent) the transposed square-bracket access of implicit soft fields looks strictly easier to use, albeit at the price that dherman pointed out: implicit side-table access using property syntax is confusing; it makes for a syntax vs. mental-model conflict.
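The filter-pipeline pattern described above — per-filter fields hung off shared immutable nodes — can be written explicitly today with one weak map per pass; the debate is only over the convenience syntax on top. A sketch with invented pass names:

```javascript
// Shared, immutable AST node: no filter may mutate it.
const node = Object.freeze({ kind: "BinaryExpr", op: "+" });

// Each analysis pass keeps its own side table, keyed by node.
const typeOf = new WeakMap(); // hypothetical pass 1: type inference
const costOf = new WeakMap(); // hypothetical pass 2: cost model

typeOf.set(node, "number");
costOf.set(node, 3);

console.log(typeOf.get(node)); // "number"
console.log(costOf.get(node)); // 3
// The node itself is untouched by either pass:
console.log(Object.isFrozen(node)); // true
console.log(Object.keys(node));     // ["kind", "op"]
```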

# Alex Russell (14 years ago)

On Dec 21, 2010, at 9:38 PM, Brendan Eich wrote:

On Dec 21, 2010, at 8:17 PM, Allen Wirfs-Brock wrote:

However, why would you bother freezing your AST nodes in the first place? JavaScript has a great mechanism for "soft fields" -- it's called properties. You can even make your base properties non-configurable if you want to. But why make them non-extensible in this situation?

For safer parallelization: see, e.g., www.ics.uci.edu/~franz/Site/pubs-pdf/ICS-TR-07-12.pdf, or its successor: www.springerlink.com/content/u16t58805r879263 (trace-trees of linear SSA, not ASTs, but no matter).

The pattern is pretty common in compilers these days (rustc, the self-hosted Rust compiler, has an immutable AST; McPeak & Wilkerson's Elsa/Oink tools from UCB, which we at Mozilla used at the start for our C++ static analysis work over four years ago, had an immutable AST). It's not AOP.

We're looking into data parallel extensions to JS, not anywhere near ready to propose for standardization, but plausible now given the ability to freeze.

My sense is that this is a great fear of an important segment of the JavaScript usage community: that people will start arbitrarily freezing or otherwise locking down objects, resulting in systems that are much less "elastic" than they are today.

This is not a relevant fear in my view. It's also kind of silly given all the open source JS libraries. If someone did over-freeze, you could stop using their library, or fork and fix it. Libraries that suck tend to die fast.

That's...an interesting reading of recent history.

Mark did bring up freezing primordials recently, and I know that causes some "Dr. Freeze" fear (even on this list the other year, from Arv, IIRC).

And from me right this minute.

Nevertheless, it's simply not credible that we on TC39 will agree to freeze primordials in any ECMA-262 edition I can foresee.

Sometimes fear is an appropriate reaction. The lamb fears the wolf. When some overwhelming force threatens you, be afraid. But there is no Freeze Force both willing and able to take over the JS world.

We don't need to be afraid of well-used immutability for safety and parallelization. Such filter-pipeline architectures do need weak maps or better to associate filter-specific fields with shared immutable data.

So long as the application of freezing is restricted to the uses at hand and doesn't find its way into the drinking water (ice-nine style).

This can of course be done explicitly, but the implicit "private x" or (to a lesser extent) the transposed square-bracket access of implicit soft fields, look strictly easier to use, albeit at the price that dherman pointed out: implicit side table access using property syntax is confusing, it makes for a syntax vs. mental model conflict.

/be



-- Alex Russell slightlyoff at google.com slightlyoff at chromium.org alex at dojotoolkit.org BE03 E88D EABB 2116 CC49 8259 CF78 E242 59C3 9723

# Brendan Eich (14 years ago)

On Dec 21, 2010, at 10:03 PM, Alex Russell wrote:

This is not a relevant fear in my view. It's also kind of silly given all the open source JS libraries. If someone did over-freeze, you could stop using their library, or fork and fix it. Libraries that suck tend to die fast.

That's...an interesting reading of recent history.

What recent history? Please cite some specifics.

ES5 isn't even implemented in final versions of shipping browsers, so overuse of its Object.freeze can't be a historical fact yet.

Mark did bring up freezing primordials recently, and I know that causes some "Dr. Freeze" fear (even on this list the other year, from Arv, IIRC).

And from me right this minute.

What are you afraid of?

Nevertheless, it's simply not credible that we on TC39 will agree to freeze primordials in any ECMA-262 edition I can foresee.

Sometimes fear is an appropriate reaction. The lamb fears the wolf. When some overwhelming force threatens you, be afraid. But there is no Freeze Force both willing and able to take over the JS world.

We don't need to be afraid of well-used immutability for safety and parallelization. Such filter-pipeline architectures do need weak maps or better to associate filter-specific fields with shared immutable data.

So long as the application of freezing is restricted to the uses at hand and doesn't find its way into the drinking water (ice-nine style).

I've used that metaphor, it is apt when transitively freezing a graph, while developing your freeze-based code.

As a runaway that freezes the web, forever? C'mon. It's not even plausible as a worm vector, let alone a standardization mistake that developers reject. I'm not sure what we are talking about at this point (I hope not "Cat's Cradle").

# Alex Russell (14 years ago)

On Dec 21, 2010, at 10:14 PM, Brendan Eich wrote:


As a runaway that freezes the web, forever? C'mon. It's not even plausible as a worm vector, let alone a standardization mistake that developers reject. I'm not sure what we are talking about at this point (I hope not "Cat's Cradle").

I fear APIs that freeze, only take frozen objects or only have versions that do, or are so mutability-hostile that they warp our use of the language toward frozen-by-default constructs. Those are the sorts of things that spread it.


# Brendan Eich (14 years ago)

On Dec 21, 2010, at 10:17 PM, Alex Russell wrote:

On Dec 21, 2010, at 10:14 PM, Brendan Eich wrote:

I fear APIs that freeze, only take frozen objects or only have versions that do, or are so mutability-hostile that they warp our use of the language toward frozen-by-default constructs. Those are the sorts of things that spread it.

Spread it how, pray tell?

Putting Object.create/freeze/etc. in ES5 (let's leave aside the API design errors that people debate) does not create a wolf in the fold, or a runaway water crystallization threat. No coercive entity forces developers to use any of the freezy bits here.

You think TC39 will make APIs that only work with frozen objects? Or HTML5 or Web Apps in the w3c? There's no evidence for this fear, no obvious way these APIs could be deployed in a mixed-browser version market, and plenty of evidence that developers -- both web and browser -- would reject such attempts. I know a bunch of us at Mozilla would.

I think we've gone way off the technical beam here. There are valid uses for immutability, in building systems that have safety and parallelization properties. That's a fact. We're not going to reject freeze from JS just out of fear that someone might become Dr. Freeze. If that happens, call Batman. Better: be Batman.
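For concreteness: the freeze primitive that shipped in ES5 is both shallow and strictly opt-in, which bounds how far any over-freezer's ice can spread. A minimal runnable illustration:

```javascript
// Object.freeze, as specified in ES5, is shallow and opt-in: it affects
// only the object it is applied to, not objects reachable from it.
var config = Object.freeze({ retries: 3, endpoints: { api: "/v1" } });

try { config.retries = 99; } catch (e) {} // ignored (throws in strict mode)

config.endpoints.api = "/v2"; // the nested object was never frozen

// config.retries is still 3; config.endpoints.api is now "/v2"
// Object.isFrozen(config) is true; Object.isFrozen(config.endpoints) is false
```

Nothing propagates: callers who never invoke `Object.freeze` are unaffected by a library that does.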

# David Flanagan (14 years ago)

Brendan wrote:

use private cachedHotness; We have that kind of syntax reserved for pragmas:

I know, and that is why I threw it out there. To me, this kind of messing around with the meaning of identifiers feels like a compile-time thing, not a runtime thing... Allen's proposal treats "private" like "var" and "const", and it makes me uneasy because it seems as if there is something deeply non-parallel there. More below, but for now, I withdraw the "use private" suggestion.

On 12/21/2010 07:33 PM, Allen Wirfs-Brock wrote:

My request to anyone who wants some sort of private object state access is: propose the syntax you want to use.

Okay. I've read the strawman now, and I'm ready to make a sketchy proposal:

What we do currently for weak encapsulation (where currently weak means advisory only) is prefix our identifiers with underscores. I would like it if the private names syntax just made something like this work to give us a stronger form of weak encapsulation. The strawman already uses the # character, so let's stick with that. I propose that if you prefix an identifier with #, you get a private name:

 var counter = {
    #count: 0,
    next: function() { return this.#count++; },
    reset: function() { this.#count = 0; }
 };

This is just what we might write today, but with # instead of _.

In order for this to work you have to abandon the idea of scoped private identifiers. I say: make all private identifiers scoped to the compilation unit. Names wouldn't be private within a file or <script> or eval'ed string, but they would be private across files. Given that this encapsulation can be defeated with Object.getOwnPropertyNames anyway, and given that programmers who really need within-file privacy can skip this convenient syntax and create their own Name objects, I think this ought to be good enough. (Though it does have important consequences for script concatenation.)
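The advisory nature of the underscore convention is easy to demonstrate: the "private" property is an ordinary property, fully visible to reflection and writable by anyone. A trivial sketch:

```javascript
// Underscore-prefixed "private" state is just a normal property.
var counter = {
  _count: 0,
  next: function () { return this._count++; }
};

Object.getOwnPropertyNames(counter); // ["_count", "next"]
counter._count = 1000;               // outside code can read and write it freely
```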

The second part of my proposal is a little more shaky. The # token, by itself, would evaluate to an opaque value such that:

  o.#foo === o[# + "foo"]

This means that to share private names (all of them) across compilation unit boundaries, you'd share the # value. (I'm not sure whether I'd propose that Allen's #.foo syntax for converting just a single private identifier to a Name object be retained or discarded.)

Note that this addresses the concern Oliver raised about accidentally generating new private names on each invocation of a constructor. With this proposal, #foo means the same thing everywhere and on every invocation within a compilation unit.

Also note that the use of the # prefix removes the need for the whole "Private Declarations Exist in a Parallel Environment" section of the strawman.

# Kam Kasravi (14 years ago)

Caja is a good example of looking at the ramifications of freezing objects especially related to the DOM. MarkM (and others) could comment on its impacts. For YUI 2.8 - there were some problems when Yahoo cajoled the library, but I believe less than expected. We had some app breakage for 3rd party apps that were cajoled for open social as well.

Would extending the descriptor in Object.defineProperty(obj, prop, descriptor) to provide a meta-value for a private field be less confusing to the user?

  • where the descriptor could be extended to have a field of visible or private?

{value: 'foo', private: true}

though it doesn't make much sense to have

{value:'foo',private:true,enumerable:true,writable:true}

in which case an exception could be thrown about incompatible descriptor values...
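Note that the `private: true` descriptor field here is hypothetical; ES5's ToPropertyDescriptor silently ignores unrecognized fields. The nearest shipped knob is enumerability, which hides a property from enumeration but, unlike real privacy, not from reflection or direct access:

```javascript
var obj = {};

// Hypothetical: { value: "foo", private: true } -- not in ES5.
// What ES5 actually offers is non-enumerability, which is weaker:
Object.defineProperty(obj, "secret", {
  value: "foo",
  enumerable: false, // hidden from for-in and Object.keys
  writable: true
});

Object.keys(obj);                // []         -- hidden from enumeration
Object.getOwnPropertyNames(obj); // ["secret"] -- still reflectable
obj.secret;                      // "foo"      -- still reachable by name
```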

# Brendan Eich (14 years ago)

[Resending as you did to the right group, although I filter both to the same folder and prune dups. Whee! /be]

On Dec 21, 2010, at 10:22 PM, David-Sarah Hopwood wrote:

On 2010-12-21 22:12, Brendan Eich wrote:

On Dec 20, 2010, at 11:05 PM, David-Sarah Hopwood wrote:

Please retain all relevant attribution lines.

Oops, sorry about that.

But really, lighten up. I don't think anyone mixed up your words for mine.

Brendan Eich wrote:

The new equivalence under private names would be x[#.id] === x.id.

You said "under private names" here, but it should actually be "under the syntax proposed for private names".

Didn't I propose separating "private x" syntax two messages ago (but after I wrote the cited bit you snip here), and repeat that request last message? Argh.

It applies to that syntax with either the soft fields or private names semantics.

No. #.id does not apply to soft fields.

... which is strictly weaker, more complex, and less explanatory.

So is a transposed get from an inherited soft field. Soft fields change the way square brackets work in JS, for Pete's sake!

They do not.

Ok, then I'm arguing with someone else on that point. But I still believe apples-to-apples comparison means mapping the same syntax to the two different semantics.

If the "private x" syntax, without #.id, is supported on soft fields, there's still a change to evaluating identifiers on the right of dot and the left of colon-in-object-initialisers, which has some complexity cost. This particular lexical binding and lookup cost doesn't change whether you use soft field side tables or private-named object properties for the value storage. At the very bottom of your reply, you return to this point (cited below).

So, what if we want to understand '.' in terms of existing constructs? Unfortunately, '#.id' must be primitive; there is nothing else that it can desugar to because 'private id' does not introduce an ordinary variable (unlike 'const id_ = SoftField()', say).

SoftField(), #.id -- something new in either case.

<sarcasm> Oh, OK, it obviously doesn't matter what we add to the language, it's all the same. Library abstractions, new syntax, major changes in semantics, who cares? Something new is something new. Let's just roll a bunch of dice and pick proposals at random. </sarcasm>

This is sad. You can do better, read what I wrote, model it fairly and respond directly.

Sheesh. A library class, specified in terms of existing language constructs, is not the same as a new primitive construct, and does not have the same consequences for language complexity.

No, we still disagree. "Executable specifications" have complexity and bugs. Weak maps (however harmonious) are complex too. Users who understand properties and objects do not generally know about "ephemeron tables". It still seems to me (and others who've sounded off) that you are discounting some complexity as a starting point for your favored approach, and ignoring the further complexity inherent in building library code as spec.

And what's this "const id_"? A gensym?

A possible convention for naming variables holding private names.

Ok. That wasn't clear.

It doesn't matter, you're picking on details.

I'm trying to understand what you wrote!

We are all detail-oriented here, no need to take offense all of a sudden. You yourself always wield a fine-toothed comb on details ;-).

It's tiresome to argue by special pleading that one extension or transformation (including generated symbols) is "more complex, and less explanatory", while another is less so, when the judgment is completely subjective. And the absolutism about how it's always better in every instance to use strong encapsulation is, well, absolutist (i.e., wrong).

I gave clear technical arguments in that post. If you want to disagree with them, disagree with specific arguments, rather than painting me as an absolutist. (I'm not.)

Here's a quote: "As you can probably tell, I'm not much impressed by this counterargument. It's a viewpoint that favours short-termism and code that works by accident, rather than code that reliably works by design."

How do you expect anyone to respond? By endorsing bugs or programming based on partial knowledge and incomplete understanding? Yet the real world doesn't leave us the option to be perfect very often. This is what I mean by absolutism.

Weak encapsulation is defensible. It's not inherently wrong in all settings. When prototyping, weak or even no encapsulation is often the right thing, but you have to be careful with prototypes that get pressed into products too quickly (I should know). JS is used to prototype all the time.

Ok, of course prototypers need not use any of these facilities. But (it shouldn't surprise you to hear) I think prototype builders would benefit from very convenient private field support of some kind. Dave said so in his reply to Mark recently. We've all been burned by JS's softness and late error reporting.

So rather than argue for strong encapsulation by setting up a straw man counterargument you then are not much impressed by, decrying short-termism, etc., I think it would be much more productive to try on others' hats and model their concerns, including for usable syntax.

Does this make sense? I'm really not looking for a very long reply, and I'm not trying for some rhetorical advantage. But boy, it sure seems like you are.

We should debate strong vs. weak encapsulation, for sure, and in the other thread you started (thanks for that). But without absolutes based on preferences or judgment calls about trade-offs and economics.

Tell you what, I'll debate based on the things I think are important, and you debate based on the things you think are important. Agreed?

Debate takes two people agreeing on proposition and rules. I'm not sure we can debate profitably, but I'm giving it another try here.

To be clear, it's not the syntax itself, but the parallel namespace introduced by 'private x' that I find problematic in terms of both specification complexity, and conceptual complexity for programmers.

We did strive to make this namespace static.

You pointed out in a separate reply that #.id can promote private names from this static parallel namespace into runtime, but not in any scoped (dynamic) fashion. Just a value reflection, usable in brackets. No parallel runtime lexical and after-dot/before-colon scope chains.

It adds something to solve a use-case not satisfied by the existing language. There's (again) a trade-off, since with this new syntax, the use-cases for private names become more usably expressible.

It isn't at all clear that there aren't alternative syntaxes that would achieve the usability benefit while not being subject to the criticisms that have been made of the current syntax proposal. Lasse Reichstein posted some possibilities (.# or [#]). The syntax design space has been barely explored in the discussion so far.

I agree we need to explore more. Allen posted recently arguing for alternatives, but not pot-shots or concern-trolling -- coherent, worked out alternatives.

I'm not keen on adding # as a sigil for private names, but this is mostly because such things are ugly, Perlish line noise. Under the "explicit is better than implicit" philosophy, and in particular the desire to eliminate even a static (compile-time only) parallel namespace, maybe.

Usability experiments are tricky. Too many confounders, too little time to explore all the paths. We might be able to take at most two or three of the syntax strawmen and test them on users.

Better if we can find something that wins general acclamation and keeps it. Clearly, we're not there yet with "private x" and sigil-free this.x.

The fact that the proposal is entangled with that syntax, so that it is difficult to see its semantic consequences separate from the syntax, cannot possibly be considered a feature of the proposal, at the meta level of the language design process.

Didn't I already agree that it's a good idea to separate "private x" from the semantics, since we have a conflict over semantics?

It's clear how to do that for the soft field semantics, which are defined as a library abstraction.

How do the proponents of private names propose to do that? (This is a technical question, not a rhetorical one.)

Change ToString to ToPropertyName where needed, follow your nose. Allen did list multiple sections (you never responded on that point).

There is probably a more principled way to do it, but again: just because soft fields can be a library on weak maps does not make that approach globally simpler, compared to widening property names from string to be (string | private name).

We no doubt could spend a long time fighting over how to measure complexity and score the two approaches, but touching more places in the spec is only part of the cost of private names.

So let's do that (my plea to everyone, not just you). Let's separate "private x" syntax, since I now know of a use-case courtesy Mark, and it's a good one (a frozen AST being extended sparsely via soft fields) that wants that "private x" and the sweet dot operator syntax, but on top of soft fields not private property names that require unfrozen objects.

I can't parse that sentence; please clarify.

See followups in the thread.

[snipping VM implementor stuff, we agree that we need VM implementors to speak up.]

There is a separate discussion to be had about whether the form of executable specification MarkM has used (not to be confused with the semantics) is the best form to use for any final spec. Personally, I like this form of specification: I think it is clear, concise (which aids holding the full specification of a feature in short-term memory), easy to reason about relative to other approaches, useful for prototyping, and useful for testing.

I don't mind at all that the correspondence with the implementation is less direct than it would be in a more operational style; implementors often need to handle less direct mappings than this, and I don't expect a language specification to be a literal description of how a language is implemented in general (excluding naive reference implementations).

Once again, you've argued about what you like, with subjective statements such as "I don't mind".

I'm not sure how to respond, except to reiterate that the practical downsides of the executable spec approach remain over-specification and a mismatch with the (secondary) user and (primary) implementor audiences of the spec.

With inherited soft fields, the ability to "extend" frozen objects with private fields is an abstraction leak (and a feature, I agree).

How is it an abstraction leak? The abstraction is designed to allow this; it's not an accident (I'm fairly sure, without mind-reading MarkM).

If I give you an object but I don't want you "adding" fields to it, what do I do? Freezing works with private names, but it does not with soft fields.
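This asymmetry can be made concrete with a weak map standing in for a soft field. This is a minimal sketch, not the strawman's specification; `SoftField` and `cachedHotness` are illustrative names only:

```javascript
// Minimal soft-field sketch: a WeakMap side table keyed by object identity,
// associating state with an object without touching the object itself.
function SoftField() {
  var table = new WeakMap();
  return {
    get: function (obj) { return table.get(obj); },
    set: function (obj, value) { table.set(obj, value); }
  };
}

var node = Object.freeze({ kind: "Literal", value: 42 });

// A private-named property would be an ordinary property, so it cannot
// be added to the frozen object...
try {
  Object.defineProperty(node, "cached", { value: true });
} catch (e) {
  // TypeError: node is not extensible
}

// ...but a soft field attaches state to it anyway.
var cachedHotness = SoftField();
cachedHotness.set(node, 0.97);
cachedHotness.get(node); // 0.97
```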

With private names, the inability to "extend" frozen objects with private fields is a significant limitation.

Can you try taking a different point of view and exploring it, for a change? :-/

With inherited soft fields, the transposed get or set magic that changes how square brackets work in JS is a leak on the inside of the abstraction.

"that changes how square brackets work in JS" is out of place here; we are discussing issues that remain after separating syntax, to the extent possible. In the inherited soft field proposal by itself, the syntax is 'field.get(obj)' and there is no magic. That is the proposal that I favour, absent some better proposal for the syntax.

I must have been arguing with Mark. Any proposal where x[n] becomes n.get(x) not only leaks inside the abstraction, but outside -- you can't write x.n or x.m for any m. Only square-bracketed indexing would work.

But it's finally clear you aren't in favor of this bracket syntax, so please ignore.

If you don't like x[#.id] / x.id supplanting x["id"] / x.id, it seems to me you have to count some similar demerits against this change.

If we compare both proposals with the additional syntax, they are equally "magical"; the only difference is whether the magic is built-in or the result of a desugaring.

Agreed. Disagree on desugaring to weak maps, not counting weak map complexity, or costs in executable spec approach, globally winning.

The weak encapsulation design points are likewise "leaky" for private names, where no such leaks arise with soft fields: reflection and proxies can learn private names, they "leak" in the real ocap sense that secure subsets will have to plug.

As I said earlier, designers of secure subsets would prefer that this leak not exist in the first place, rather than having to plug it. Regardless of your statements above, this is not an absolutist position; the onus is on proponents of weak encapsulation to say why it is useful to have the leak (by technical argument, not just some vague philosophical position against strong encapsulation).

We're arguing about the default in a mostly-compatible next edition of JavaScript. JS has only closures for strong encapsulation. You can't make a "technical argument" or proof that strong encapsulation must be presumed the default, so how can you put the onus on proponents of weak encapsulation to make any such bogus argument?

This whole paragraph, "onus" in particular, is absolutism on parade.

Programmers use no-encapsulation and weak encapsulation in JS every day. On a purely "sociology of programming languages" or pedagogical level, you're selling some powerful hopium if you think this will change overnight to strong encapsulation by default. And hey, there's Alex Russell worrying about Mr. Freeze.

To be fair, I think that trends and tendencies do matter. The language is being used at larger scale, where integrity and other properties need automated enforcement (at the programmer's discretion).

But there's no onus reversal or technical standard of proof that will help us make design decisions. I know you want strong encapsulation. Others want weak. Now what?

[snipping stuff where we agree -- which was nice, btw!]

Right. The syntactic and semantic proposals for soft fields are already separated; it is only private names proposal that does not separate syntax and semantics. It would be helpful to fix that, although I'm not sure it is possible to completely separate the syntax and semantics of the parallel namespace for private names.

To compare apples to apples, we'd want to do that. I don't see why we couldn't, with enough effort. Not tonight, though.

# David Flanagan (14 years ago)

On 12/21/2010 11:58 PM, Brendan Eich wrote:

I'm not keen on adding # as a sigil for private names, but this is mostly because such things are ugly, Perlish line noise. Under the "explicit is better than implicit" philosophy, and in particular the desire to eliminate even a static (compile-time only) parallel namespace, maybe.

Ruby uses @, and while that still looks like Perl line noise, it doesn't in practice seem to get in the way all that much: there just aren't that many of them in most Ruby code, in my experience. (Part of that is because of Ruby's metaprogramming methods attr_reader and attr_accessor that define @ fields and corresponding getter and setter methods.)

And speaking of Ruby and attr_reader, could the need for new syntax be reduced or eliminated with a sufficiently clever metaprogramming API? For example, and using an ungainly ES5-style function name:

// Create and return a private name, and define a getter "foo" for it
var name = Object.definePrivateProperty(o, "foo");
o.foo                // => anyone can query the private property
o[name] = new_value; // Setting the property requires the name object

(I don't know how this would handle private fields in object literals, though)

# David Herman (14 years ago)

In order for this to work you have to abandon the idea of scoped private identifiers. I say: make all private identifiers scoped to the compilation unit.

This is the part of your suggestion that I don't like: it makes private identifiers too blunt a tool. You end up sharing with more code than you want, and when you refactor by pulling code out of one compilation unit and into another, you share less than you want. Lexical scope is a tried-and-true mechanism for controlling sharing, and it works better than compilation units. Moreover, we don't even have a clear notion of compilation unit in the language.

But your idea suggests yet another alternative worth adding to our growing pantheon. We could allow for the scoping of private names, but always require them to be prefixed by the sigil. This way there's no possibility of mixing up public and private names. So to use an earlier example from this thread (originally suggested to me by Allen):

function Point(x, y) {
    private #x, #y;
    this.#x = x;
    this.#y = y;
}

For your counter example, the original names proposal allowed for object literals to declare private properties, and the private name was scoped to the entire literal. So you'd write:

var counter = {
   private #count: 0,
   next: function() { return this.#count++; },
   reset: function() { this.#count = 0; }
};

This is of course still strictly noisier than the original private names proposal, but it does have the advantage of never capturing public names by private declarations. It doesn't, however, address your concern about generativity. Personally, I like the generativity; I think it matches up with the use cases. But I acknowledge that it might be subtle.
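A historical footnote: the "unique property key" half of this design eventually shipped in ES2015 as Symbol. A runnable approximation of the counter, with the key lexically scoped rather than sigil-marked:

```javascript
// A lexically scoped Symbol approximates a name object: it is a unique,
// non-string property key, and only code that can see `count` can use it.
const count = Symbol("count");

const counter = {
  [count]: 0,
  next() { return this[count]++; },
  reset() { this[count] = 0; }
};

counter.next(); // 0
counter.next(); // 1
Object.keys(counter); // ["next", "reset"] -- the symbol key is not listed
// Though, much as with private names, reflection can still discover it:
Object.getOwnPropertySymbols(counter); // [Symbol(count)]
```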

# David Flanagan (14 years ago)

On 12/22/2010 01:02 AM, David Herman wrote:

In order for this to work you have to abandon the idea of scoped private identifiers. I say: make all private identifiers scoped to the compilation unit.

This is the part of your suggestion that I don't like: it makes private identifiers too blunt a tool. You end up sharing with more code than you want, and when you refactor by pulling code out of one compilation unit and into another, you share less than you want. Lexical scope is a tried-and-true mechanism for controlling sharing, and it works better than compilation units. Moreover, we don't even have a clear notion of compilation unit in the language.

It is blunt, and I'm not thrilled with it. But it might satisfy the most common use cases without preventing authors from using private names "manually" when more control is required...

But your idea suggests yet another alternative worth adding to our growing pantheon. We could allow for the scoping of private names, but always require them to be prefixed by the sigil. This way there's no possibility of mixing up public and private names. So to use an earlier example from this thread (originally suggested to me by Allen):

I do think that the conceptual clarity of requiring the sigil everywhere might well outweigh the ugliness of the sigil.

 function Point(x, y) {
     private #x, #y;
     this.#x = x;
     this.#y = y;
 }

I keep seeing this basic constructor example. But isn't this the case that Oliver raised of private being over-generative? Those private names have been generated and discarded, and those two fields can never be read again...

For your counter example, the original names proposal allowed for object literals to declare private properties, and the private name was scoped to the entire literal. So you'd write:

var counter = {
   private #count: 0,
   next: function() { return this.#count++; },
   reset: function() { this.#count = 0; }
};

I like private as a keyword in object literals: it doesn't seem any more confusing than get and set in literals. I don't like seeing it in functions though: there it looks like a kind of var and const analog. I suppose that allowing it only within object literals would eliminate too many important use cases, however.

This is of course still strictly noisier than the original private names proposal, but it does have the advantage of never capturing public names by private declarations. It doesn't, however, address your concern about generativity. Personally, I like the generativity; I think it matches up with the use cases. But I acknowledge that it might be subtle. Dave

Is there any syntax from the old ES4 namespace stuff that could be applied here? Instead of declaring individual private identifers, could we declare a private namespace identifier, and then use that namespace identifier as a prefix for private properties within the namespace? No sigil would be required then....

At this point, I'm just thinking out loud, so I'll call it a night.

# David Herman (14 years ago)

On Dec 22, 2010, at 2:00 AM, David Flanagan wrote:

On 12/22/2010 01:02 AM, David Herman wrote:

function Point(x, y) {
    private #x, #y;
    this.#x = x;
    this.#y = y;
}

I keep seeing this basic constructor example. But isn't this the case that Oliver raised of private being over-generative? Those private names have been generated and discarded, and those two fields can never be read again...

Oops, I left out the ellipses:

function Point(x, y) {
    private #x, #y;
    this.#x = x;
    this.#y = y;
    ...
}

Of course, if you wanted to extend the scope further, you could lift it out of the constructor.

As for the complaint of it being over-generative, that's mitigated in this case by the sigil. For example, if you wrote:

function Point(x, y) {
    private #x, #y;
    this.#x = x;
    this.#y = y;
}
Point.prototype = {
    ... #x ... #y ...
};

you'd get a compile-time error since #x and #y aren't in scope. Unless, of course, they are already in scope as another private, although I'd expect this kind of thing to be a bit rarer than variable scope errors since I would guess private names wouldn't be nested and repurposed as often as variables -- that's just a guess; it's hard to be sure.

Also, you can only take the "but if you do it wrong, it doesn't work" arguments so far. After all, the generativity is by design. The question is whether that design will be too surprising and confusing. We shouldn't make JS too complicated or baroque, but we shouldn't nix an idea based on assuming too little of programmers. IOW, I think the "too complicated" criticism should be used with competent programmers in mind.

Anyway, I'm also just thinking out loud. :)

I like private as a keyword in object literals: it doesn't seem any more confusing than get and set in literals. I don't like seeing it in functions though: there it looks like a kind of var and const analog.

Isn't this less the case when what follows the keyword isn't an ordinary identifier, i.e., has the sigil?

Is there any syntax from the old ES4 namespace stuff that could be applied here?

An interesting thought, but I'm skeptical -- ES4 namespaces are pretty dis-Harmonious, and for good reason: there be dragons. :)

# Peter van der Zee (14 years ago)

On Wed, Dec 22, 2010 at 3:26 PM, David Herman <dherman at mozilla.com> wrote:

As for the complaint of it being over-generative, that's mitigated in this case by the sigil. For example, if you wrote:

function Point(x, y) {
    private #x, #y;
    this.#x = x;
    this.#y = y;
}
Point.prototype = {
    ... #x ... #y ...
};

you'd get a compile-time error since #x and #y aren't in scope. Unless, of course, they are already in scope as another private, although I'd expect this kind of thing to be a bit rarer than variable scope errors since I would guess private names wouldn't be nested and repurposed as often as variables -- that's just a guess; it's hard to be sure.

What about adding an attribute to properties that somehow identifies which classes (in the prototype chain, for protected) have access to the object? I'll leave the "somehow" up in the air, but you could introduce a [[Private]] attribute which, if not undefined, says which context must be set (and for protected, either directly or through the prototypal chain of the current context) to gain access to this property. And if that context is not found, some error is thrown. Maybe it would be [[EncapsulationType]] :: {private, protected, public} and [[EncapsulationContext]] :: <?>. You could also add a simple API to check for these (isPrivate, isProtected, isPublic, hasEncapsulatedProperty, etc.) depending on how it would affect "in" and enumeration.

Pro's:

  • meaning of private will be more to what people expect
  • minimal "magic" going on, trying to access a private property out of scope should result in a proper error
  • possibly less impact on the spec (although I'm not sure there...)
  • no need to introduce a new type/class to denote private properties

Con's:

  • what should be the value of [[Private]]
  • there are probably still complications in the specification
  • "weak encapsulation"

Unknowns:

  • what about inheritance and prototype?
  • in, enumeration, etc?

I don't really see weak encapsulation as an issue. If you really want unreachable variables you can use a closure. In that regard, it seems to me like the private keyword wouldn't really add anything new.

Syntactical specs aside, it would be something like this:

    function F(n){
        private x = n; // private this.x = n; // this.#x = n; // .. or whatever
    }
    F.prototype.set = function(n){ this.x = n; };
    F.prototype.get = function(){ return this.x; };

    // the method does not carry the "private" burden
    F.prototype.get.call({x:13}); // 13, this is fine because the property
                                  // being accessed was never defined to be private

    // simple access of private x
    var f = new F(10);
    f.set(20);  // ok
    f.get();    // 20
    log(f.x);   // error, accessing private property x

    // global function that doesn't inherit
    function fail(){ return this.x; }
    fail.call(f); // error, fail is not allowed to access private variable x

    // G, child of F
    function G(){}
    G.prototype = new F(16); // (G.prototype.x becomes 16? or remains undefined?)
    G.prototype.test = function(){ this.x = 4; };

    var g = new G();
    g.test();     // is ok? or need to make "protected" distinction?
    fail.call(g); // fail, accessing private x

Again, not proposing syntax. Not stepping into that minefield ;)

I personally don't like adding more magic to the language. Adding fields that do or don't exist depending on context, especially when we have to create a new type or class for it, doesn't seem to be in line with the language. Having it throw an error on bad access seems better to me, and not something that's unexpected. That way the field will still always exist when it would in current contexts. It would simply not be read/write-able from all contexts that properties are currently.

It would not be strong encapsulation (as demonstrated) but we already have a truly strong method of encapsulation (closures). Is it really worth it to drastically change the specification just to add the "classic" notion of private? In all the (recent) threads, I've not yet seen another reason to include it.
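The closure technique referred to above can be sketched in plain ES5; the names here are illustrative only, not from any proposal:

```javascript
// Strong encapsulation via a closure: `count` is a local variable,
// not a property, so there is no name outside code could use to
// reach it. The only access path is through the returned methods.
function makeCounter() {
  var count = 0;
  return {
    increment: function () { count += 1; return count; },
    value: function () { return count; }
  };
}

var c = makeCounter();
c.increment();
c.increment();
// c.count is undefined: nothing was stored on the object itself.
```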

And I'd rather introduce a new attribute than a new global magic auto-type, to handle the "being private" part of this strawman.

# Kyle Simpson (14 years ago)

What about adding an attribute to properties that somehow identify which classes (in the prototype chain for protected) have access to the object? I'll leave the "somehow" up in the air, but you could introduce a [[Private]] attribute which, if not undefined, says which context must be set (and for protected, either directly or through the prototypal chain of the current context) to gain access to this property. And if that context is not found, some error is thrown. Maybe it would be [[EncapsulationType]] :: {private, protected, public} and [[EncapsulationContext]] :: <?>. You could also add a simple api to check for these (isPrivate, isProtected, isPublic, hasEncapsulatedProperty, etc) depending on how it would affect "in" and enumeration.

I’m assuming (perhaps incorrectly) that this suggestion is to model the flag of the private vs. non-private as a “property descriptor” that can be set by Object.defineProperty(). Am I correct?

If so, I think that makes a lot of sense. I would like private to work that way.

Of course, the setting of private would probably have to be one-way, like configurable is, so that such a property could be made un-private by another context.
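The one-way behavior being compared to here can be seen with the existing `configurable` attribute in ES5; this is a sketch of that existing mechanism, not of the proposed [[Private]] attribute:

```javascript
var obj = {};
Object.defineProperty(obj, "secret", {
  value: 42,
  configurable: false // once false, it cannot be flipped back to true
});

var rejected = false;
try {
  // Re-configuring a non-configurable property is rejected:
  // Object.defineProperty throws a TypeError here.
  Object.defineProperty(obj, "secret", { configurable: true });
} catch (e) {
  rejected = true;
}
```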

BTW, pardon (and ignore) me if I just stepped on an ant-bed and confused the whole topic. I’ve been following this thread silently and mostly felt like it was much more complicated than I could understand. Peter’s post was the first one that seemed to make sense. :)

# Kevin Smith (14 years ago)
# Brendan Eich (14 years ago)

On Dec 22, 2010, at 6:26 AM, David Herman wrote:

On Dec 22, 2010, at 2:00 AM, David Flanagan wrote:

On 12/22/2010 01:02 AM, David Herman wrote:

function Point(x, y) {
    private #x, #y;
    this.#x = x;
    this.#y = y;
}

I keep seeing this basic constructor example. But isn't this the case that Oliver raised of private being over generative? Those private names have been generated and discarded, and those two fields can never be read again...

Oops, I left out the ellipses:

function Point(x, y) {
    private #x, #y;
    this.#x = x;
    this.#y = y;
    ...
}

This is for instance-private instance variables.

If you want class-private instance variables, use

    const Point = (function () {
        private #x, #y;
        return function Point(x, y) {
            this.#x = x;
            this.#y = y;
            ...
        };
    })();

(Allen mentioned function-own, aka static, as a way to avoid the module pattern nesting and syn-tax, but that's a separate proposal.)

I'm still sympathetic to Oliver's objection that declaration-style "private #x, #y" does not "look generative" enough. Agree that the sigil addresses Mark's concern about confusing literal identifiers with lexically bound names, at a Perlish price.

David F. mentioned script concatenation. It happens freely on the web, and it is already biting us because of premature "use strict" usage where parts of the concatenation violate strict mode and most browsers don't check yet (bugzilla.mozilla.org/show_bug.cgi?id=579119).

To me this is the nail in the coffin for "compilation unit private name scope". I'm with dherman: lexical scope with a declaration for bindings, but it is not clear how to make the declaration look more generative. It seems important for similar things to look alike, and different-in-generativity/etc. things to look different (somehow).

# Allen Wirfs-Brock (14 years ago)

I think there are some interesting ideas to explore in both D. Flanagan's proposal and D. Herman's variations upon it. However, they both seem to be ignoring the second primary use case that I identified: conflict-free extension of built-in or third party objects. While naming conventions or a sigil may seem to many a satisfactory way to implement "weak encapsulation", I don't think they work for the extension case.

# Allen Wirfs-Brock (14 years ago)

On Dec 22, 2010, at 9:47 AM, Brendan Eich wrote:

I'm still sympathetic to Oliver's objection that declaration-style "private #x, #y" does not "look generative" enough. Agree that the sigil addresses Mark's concern about confusing literal identifiers with lexically bound names, at a Perlish price.

David F. mentioned script concatenation. It happens freely on the web, and it is already biting us because of premature "use strict" usage where parts of the concatenation violate strict mode and most browsers don't check yet (bugzilla.mozilla.org/show_bug.cgi?id=579119).

To me this is the nail in the coffin for "compilation unit private name scope". I'm with dherman: lexical scope with a declaration for bindings, but it is not clear how to make the declaration look more generative. It seems important for similar things to look alike, and different-in-generativity/etc. things to look different (somehow).

Consider

    function f() {
        var captured; // this generates a new long-lived data store
        return {get value() {return captured}, set value(n) {captured=n}};
    }

or

    function g() {
        function inner() {}
        return inner;
    }

I don't see why private foo; is any more or less generative than: var captured; or function inner() {};

They are all declarative forms and all implicitly generate new runtime entities each time they are evaluated.

# Brendan Eich (14 years ago)

On Dec 22, 2010, at 10:07 AM, Allen Wirfs-Brock wrote:

I don't see why private foo; is any more or less generative than: var captured; or function inner() {};

They are all declarative forms and all implicitly generate new runtime entities each time they are evaluated.

The function case looks different enough, even if it is special in no way it shares with object and array initialisers or regexps. All four of functions, object initialisers, array initialisers, and regexp literals evaluate to fresh mutable objects, and that's just something (or things) to know about the language.

For var captured; vs. private foo; there's a strange difference. These look quite alike but the former allocates storage for one value and creates a binding to that store. The latter does both of those things and generates a new private name.

I agree that users could learn this and fold it into their knowledge of how the language works. It would be another "thing to know". It still seems slightly off because var x; (no initializer) uses undefined. The private name generation is novel, but not conveyed by the syntax.

# Brendan Eich (14 years ago)

On Dec 21, 2010, at 11:58 PM, Brendan Eich wrote:

... which is strictly weaker, more complex, and less explanatory.

So is a transposed get from an inherited soft field. Soft fields change the way square brackets work in JS, for Pete's sake!

They do not.

Ok, then I'm arguing with someone else on that point.

Many of us were wondering where my (shared) memory of a square-brackets change for soft fields came from, and were re-reading the numerous wiki pages. Finally we found what I had recollected in writing the above:

strawman:inherited_explicit_soft_fields#can_we_subsume_names

There, after Mark's demurral ("I (MarkM) do not like the sugar proposed for Names, ..."), is this:

"If we wish to adopt the same sugar for soft fields instead, then

    private key;
    ... base[key] ...

could expand to

    const key = SoftField();
    ... key.get(base) ...

If there are remaining benefits of Names not addressed above, here would be a good place to list them. If we can come to consensus that soft fields do subsume Names, then “Name” becomes a possible choice for the name of the “SoftField” constructor."

This is clearly pitting soft fields against names in full, including a change to JS's square bracket syntax as sugar.

The hedging via "If" and the demurral do not remove this from the soft fields "side" of the death match I've been decrying, indeed they add to it. This is making a case for dropping names in full in favor of soft fields in (mostly -- no dot operator or object literal support) comparable fullness.

For the record, and in case there's a "next time": I don't think it's good form to chop up proposals made on the wiki (inherited soft fields, explicit soft fields, inherited explicit soft fields), put arguments about orthogonal syntax issues (the demurral even says "orthogonal"), and then use only some of the pieces to refute someone's argument based on the entirety of the wiki'ed work and the clear thrust of that work: to get rid of private names with something that is not a complete replacement.

It doesn't really matter what one correspondent among many wants (IOW, it's not "all about you" [or "me"]). The argument is about a shared resource, the wiki at , and the strawman proposals on it that are advancing a particular idea (soft fields, with variations and syntax), and by doing so are trying to get rid of a different proposal (private names).

In arguing about this, I have this bait-and-switch sense that I'm being told A+B, then when I argue in reply against B, I'm told "no, no! only A!". (Cheat sheet: A is soft fields, B is transposed square bracket syntax for them.)

Of course we should all separate syntax (we agree now; mea culpa for not doing my part earlier). But that didn't happen, even on the soft fields side. And the wiki'ed result, particularly the bit quoted above, is what everyone including me on the "private names" "side" (I want to have no side) who was reading the wiki indeed reacted to.

I'm not saying this was any one person's malicious "trick". But it's clear now what happened; the wiki and list record and diagram how it played out. It leaves a bad taste.

# Brendan Eich (14 years ago)

On Dec 22, 2010, at 8:50 AM, Kevin Smith wrote:

From my perspective as a JS programmer, overloading the dot seems confusing. The gains in elegance don't appear to me to be worth it. However, overloading [] might be more acceptable:

[] gets no respect, I tell ya! ;-)

let x = new PrivateName(); // or perhaps: private x;

function Point() { this[x] = 100; }

function createPoint() { return { [x]: 100 }; }

This is an interesting idea, one I've heard about from Pythonistas who want property names to be evaluated expressions, not implicitly quoted literals if identifier-names, in object initialisers. It would save some amount of eval and Function use.

It conflicts with the original MetaProperties syntax at strawman:object_initialiser_extensions (grammar) and strawman:obj_initialiser_meta (examples), which went like so:

    var fancyObj = {
        [proto: fancyProto, sealed]
        prop1: value1,
        ...
    };

but now uses <> instead of []. So no longer a strawman conflict, but I fear the angle brackets are going to cause us some grammatical and nesting-in-HTML pain.

# David Flanagan (14 years ago)

On 12/22/2010 09:57 AM, Allen Wirfs-Brock wrote:

I think there are some interesting ideas to explore in both D. Flanagan's proposal and D. Herman's variations upon it. However, they both seem to be ignoring the second primary use case that I identified: conflict-free extension of built-in or third party objects. While naming conventions or a sigil may seem to many a satisfactory way to implement "weak encapsulation", I don't think they work for the extension case.

Having to use # or [] to identify a private extension method would make those extensions too ugly for common use, I suppose. I can't help thinking that what you're trying to propose is a kind of poor-man's private namespace. I don't know what the problems with ES4 namespaces were nor how the current proposal avoids them.

In a subsequent message, Allen also wrote:

I don't see why private foo; is any more or less generative than: var captured; or function inner() {};

They are all declarative forms and all implicitly generate new runtime entities each time they are evaluated.

I've now realized that I don't actually object so much to the generative nature of private. What bugs me is that it essentially declares a meta-identifier that is then used as if it were a regular identifier. It is the meta-mismatch that I have a problem with.

If "private foo" declared a meta identifier <foo>, and I could then write o.<foo>, that would make sense to me. But of course, that syntax is more cumbersome than just using square brackets.

It feels to me as if the private declaration is behaving like a macro.

Are there precedents for this kind of meta-identifier in other languages?

# David Herman (14 years ago)

I think there are some interesting ideas to explore in both D. Flanagan's proposal and D. Herman's variations upon it. However, they both seem to be ignoring the second primary use case that I identified: conflict-free extension of built-in or third party objects. While naming conventions or a sigil may seem to many a satisfactory way to implement "weak encapsulation", I don't think they work for the extension case.

I guess the intended spirit of my admittedly not-fully-specified idea last night was that, other than the required '#' sigil, there's no major difference from the private names strawman on the wiki. In particular, you could still reify a private name in an expression context to get a value.

Let's just say, for the sake of concreteness, that the syntax would be:

PrimaryExpression ::= ... | '#' Identifier

So you could do, for example:

function gensym() {
    private #x;
    return #x;
}

The exact lexical syntax isn't so much the point as just trying to avoid the ambiguity between public and private identifiers when used after dot or before colon by using a distinct lexical syntax for private identifiers. This way you don't have to know what's in scope to know whether an identifier is private.
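A rough analogue of this gensym can be written with Symbol (which arrived later, in ES2015, and is not part of this proposal; the sketch is offered purely as an analogy):

```javascript
// Each call yields a fresh, unforgeable property key, analogous
// to `private #x; return #x;` in the sketch above.
function gensym() {
  return Symbol("x");
}

var k1 = gensym();
var k2 = gensym();       // k1 !== k2: generative, like `private`
var obj = {};
obj[k1] = "private-ish";
// The property is invisible to for-in and Object.keys, and is
// reachable only through the key value k1.
```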

# David Flanagan (14 years ago)

More musings: the current proposal allows this form where the generation of the private name is explicit:

private x = new Name();

What if the silently generative form were not allowed? That would make the mapping of identifiers more explicit.

And if so, could we replace = with a token that indicates mapping?

private x <=> new Name();

What about mapping public identifiers to other public identifiers?

private cos <=> "cosine"  // Now I can write Math.cosine()

If that is allowed then the private keyword no longer makes sense.

How about something like the let statement:

    var privateX = new Name(), privateY = new Name();
    for (x,y) use (privateX, privateY) {
        // Identifier mapping scoped to this block
    }

Or flip the for and use clauses around if that looks too much like a loop:

use(new Name(), new Name()) for (x, y) {}

# Mark S. Miller (14 years ago)

On Tue, Dec 21, 2010 at 2:44 PM, Allen Wirfs-Brock <allen at wirfs-brock.com> wrote:

Please don't totally disengage from the syntax discussion. Most programmers understanding of the language starts with the concrete (syntax) and then proceeds to the abstract (semantics). Syntax design can have a big impact on the usability of the underlying semantics

Ok, I am not sure what I think of the following idea, but it's a bit different in flavor and so may stimulate other thoughts. I will express the expansion in terms of the natural expansion for a soft fields underpinning. One could do an equally natural expansion for private names. "==>" means "expands to". Actual expansions would be a bit more complex to preserve the left-to-right order of evaluation of the original.

The basic idea is, since we're considering a sigil anyway, and since .# and [# would both treat the thing to their right as something to be evaluated, why not turn the sigil into an infix operator instead? Then it can be used as "."-like or "[]"-like without extra notation or being too closely confused with "." or "[]" themselves. Finally, given the meaning of the sigil-turned operator, "@" seemed to read better to me than "#". YMMV.

  expr1 @ expr2

==> expr2.get(expr1)

  expr1 @ expr2 = expr3;

==> expr2.set(expr1, expr3);

  const obj = {@expr1: expr2, ...};

==> const obj = {...}; expr1.set(obj, expr2);

Perhaps the expression on the right need not be a Name/SoftField. It could be anything that responds to "get" and "set".
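The expansion above can be prototyped in ordinary JavaScript. Here is a minimal SoftField sketch using a WeakMap as the side table (WeakMap postdates this thread, and the `@` forms are written in their expanded `get`/`set` shapes):

```javascript
// A soft field is just a get/set pair over a side table keyed by
// object identity; no property is ever added to the object itself.
function SoftField() {
  var table = new WeakMap();
  return {
    get: function (base) { return table.get(base); },
    set: function (base, value) { table.set(base, value); }
  };
}

var field = SoftField();
var obj = Object.freeze({});  // works even on a frozen object
field.set(obj, "hidden");     // expansion of: obj @ field = "hidden"
var v = field.get(obj);       // expansion of: obj @ field
```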

# David Herman (14 years ago)

First of all, I think you may not be reading the current "private names" proposal. Allen wanted to change the name so he created a new page:

http://wiki.ecmascript.org/doku.php?id=strawman:private_names

Part of what you're reacting against is in fact what he changed (more below). But let me answer your question to provide some background.

It feels to me as if the private declaration is behaving like a macro.

Are there precedents for this kind of meta-identifier in other languages?

It might be illuminating to know that this whole line of exploration started when I was reading the documentation for PLT Scheme (now called Racket), which has a library for creating generative, lexically scoped names for use with their object system. You can write:

(define-local-member-name foo)

and then use `foo' as a property in a class, and only the code in the scope of the `define-local-member-name' can access that property name.

In other words, that's where I got the idea for the original private names proposal. And yes, internally, the implementation defines `foo' as a macro that expands into the gensym'ed name.

What also might be illuminating is the fact that this is part of the Scheme way of doing things, which is never to have more than one lexical environment. So absolutely everything in Scheme shares a single "namespace" -- `define-local-member-name' and macros and keywords and variables and any fancy new binding forms you can invent (via macros, of course!).

I probably followed this approach without even thinking explicitly about it, because I'm so used to the Scheme Way. :) So my original proposal worked very much the same as `define-local-member-name'.

But Allen convinced me that there are drawbacks to having the single namespace, especially since '.' and ':' are a separate syntactic space from variable references. He reworked the proposal so that `private' binds in a separate namespace from variables, so there's no conflict between the two. Take a look at his proposal and see what you think.

# David Herman (14 years ago)

On Dec 22, 2010, at 7:10 AM, Peter van der Zee wrote:

What about adding an attribute to properties that somehow identify which classes (in the prototype chain for protected) have access to the object? I'll leave the "somehow" up in the air, but you could introduce a [[Private]] attribute which, if not undefined, says which context must be set (and for protected, either directly or through the prototypal chain of the current context) to gain access to this property. And if that context is not found, some error is thrown. Maybe it would be [[EncapsulationType]] :: {private, protected, public} and [[EncapsulationContext]] :: <?>. You could also add a simple api to check for these (isPrivate, isProtected, isPublic, hasEncapsulatedProperty, etc) depending on how it would affect "in" and enumeration.

IMO, this is too class-oriented for JS. We should allow the creation of private members of arbitrary objects, not just those that inherit from new constructors. I think it also doesn't address the use case of adding new operations to existing classes like Object or Array without danger of name conflicts.

Pro's:

  • meaning of private will be more to what people expect

I find this a little hard to believe. It's tricky to make claims about what people will expect. It's true this feels somewhat analogous to Java, but there's a wide diversity of JS programmers. And a lot of them don't want us to just "make it like Java" and do their best to remind us of this fairly regularly. ;)

  • minimal "magic" going on, trying to access a private property out of scope should result in a proper error
  • possibly less impact on the spec (although I'm not sure there...)
  • no need to introduce a new type/class to denote private properties

This last point confuses me -- it sounds like you have to introduce a class to denote private properties, because they're associated with a class. Or are you referring to the SoftField type?

# Mark S. Miller (14 years ago)

On Wed, Dec 22, 2010 at 11:56 AM, Mark S. Miller <erights at google.com> wrote:

On Tue, Dec 21, 2010 at 2:44 PM, Allen Wirfs-Brock <allen at wirfs-brock.com> wrote:

Please don't totally disengage from the syntax discussion. Most programmers understanding of the language starts with the concrete (syntax) and then proceeds to the abstract (semantics). Syntax design can have a big impact on the usability of the underlying semantics

Ok, I am not sure what I think of the following idea, but it's a bit different in flavor and so may stimulate other thoughts. I will express the expansion in terms of the natural expansion for a soft fields underpinning. One could do an equally natural expansion for private names. "==>" means "expands to". Actual expansions would be a bit more complex to preserve the left-to-right order of evaluation of the original.

The basic idea is, since we're considering a sigil anyway, and since .# and [# would both treat the thing to their right as something to be evaluated, why not turn the sigil into an infix operator instead? Then it can be used as "."-like or "[]"-like without extra notation or being too closely confused with "." or "[]" themselves. Finally, given the meaning of the sigil-turned operator, "@" seemed to read better to me than "#". YMMV.

  expr1 @ expr2

==> expr2.get(expr1)

  expr1 @ expr2 = expr3;

==> expr2.set(expr1, expr3);

  const obj = {@expr1: expr2, ...};

==> const obj = {...}; expr1.set(obj, expr2);

Perhaps the expression on the right need not be a Name/SoftField. It could be anything that responds to "get" and "set".

Redoing the "class private" example at the end of strawman:private_names#using_private_identifiers, strawman:names_vs_soft_fields#using_private_identifiers

    const key = SoftField(); // or, obviously, Name(), depending...

    function Thing() {
        this @ key = "class private value";
        this.hasKey = function(x) { return x @ key === this @ key; };
        this.getThingKey = function(x) { return x @ key; };
    }

    var thing1 = new Thing;
    var thing2 = new Thing;

    print("key" in thing1);       // false
    print(thing1.hasKey(thing1)); // true
    print(thing1.hasKey(thing2)); // true

# Peter van der Zee (14 years ago)

On Dec 22, 2010, at 7:10 AM, Peter van der Zee wrote:

What about adding an attribute to properties that somehow identify which classes (in the prototype chain for protected) have access to the object? I'll leave the "somehow" up in the air, but you could introduce a [[Private]] attribute which, if not undefined, says which context must be set (and for protected, either directly or through the prototypal chain of the current context) to gain access to this property. And if that context is not found, some error is thrown. Maybe it would be [[EncapsulationType]] :: {private, protected, public} and [[EncapsulationContext]] :: <?>. You could also add a simple api to check for these (isPrivate, isProtected, isPublic, hasEncapsulatedProperty, etc) depending on how it would affect "in" and enumeration.

IMO, this is too class-oriented for JS. We should allow the creation of private members of arbitrary objects, not just those that inherit from new constructors. I think it also doesn't address the use case of adding new operations to existing classes like Object or Array without danger of name conflicts.

Ok. Indeed it doesn't address adding private properties to any object nor extending existing classes, although I think that might be fixable. And you're right, it doesn't address conflicts.

Pro's:

  • meaning of private will be more to what people expect

I find this a little hard to believe. It's tricky to make claims about what people will expect. It's true this feels somewhat analogous to Java, but there's a wide diversity of JS programmers. And a lot of them don't want us to just "make it like Java" and do their best to remind us of this fairly regularly. ;)

Ok, fair enough.

  • possibly less impact on the spec (although I'm not sure there...)
  • no need to introduce a new type/class to denote private properties

This last point confuses me -- it sounds like you have to introduce a class to denote private properties, because they're associated with a class. Or are you referring to the SoftField type?

The proposal at strawman:private_names lists three changes right at the top. 1 is a new type. To me, this seems like a rather big impact on the language for introducing something that's already possible through closures.

# Allen Wirfs-Brock (14 years ago)

On Dec 22, 2010, at 11:12 AM, David Flanagan wrote:

I've now realized that I don't actually object so much to the generative nature of private. What bugs me is that it essentially declares a meta-identifier that is then used as if it were a regular identifier. It is the meta-mismatch that I have a problem with.

JavaScript already has such "meta identifiers". But they can't be used as "regular identifiers" (which I'll interpret as meaning expression contexts); instead they only occur after a dot or on the left-hand side of a colon in an object literal. The private names proposal creates the exact same kind of "meta identifier" for use in the exact same contexts (and adds a new #. context). It simply extends the scoping and binding of such meta identifiers.

More explicitly in JavaScript:

    var x = new Object;
    var obj = {x:x}; // same identifier, two different meanings
    obj.obj = obj;

If "private foo" declared a meta identifier <foo>, and I could then write o.<foo>, that would make sense to me. But of course, that syntax is more cumbersome than just using square brackets.

It feels to me as if the private declaration is behaving like a macro.

Are there precedents for this kind of meta-identifier in other languages?

For example, member names in C structs scope to the type that defines them.

# Brendan Eich (14 years ago)

On Dec 22, 2010, at 12:45 PM, Peter van der Zee wrote:

IMO, this is too class-oriented for JS. We should allow the creation of private members of arbitrary objects, not just those that inherit from new constructors. I think it also doesn't address the use case of adding new operations to existing classes like Object or Array without danger of name conflicts.

Ok. Indeed it doesn't address adding private properties to any object nor extending existing classes, although I think that might be fixable. And you're right, it doesn't address conflicts.

Hold this thought ;-).

  • possibly less impact on the spec (although I'm not sure there...)
  • no need to introduce a new type/class to denote private properties

This last point confuses me -- it sounds like you have to introduce a class to denote private properties, because they're associated with a class. Or are you referring to the SoftField type?

The proposal at strawman:private_names lists three changes right at the top. 1 is a new type. To me, this seems like a rather big impact on the language for introducing something that's already possible through closures.

Wait, closures can't be used to avoid name collisions when extending existing objects (that held thought).

The new type would be an internal, spec-only thing, were it not for #.id -- that requires typeof #.id == "private name" or some such. It all follows, but it's a hornet's nest in my view. An object subtype (internal [[Class]] property with new value "PrivateName", e.g.) would be less bitey. Allen's strawman raises this possibility, so it's not really the "hill to die on" -- it's not a big deal over which to shoot down the whole proposal. But it does draw fire that we might prefer held for bigger targets, I agree.

# David-Sarah Hopwood (14 years ago)

On 2010-12-22 07:57, Brendan Eich wrote:

On Dec 21, 2010, at 10:22 PM, David-Sarah Hopwood wrote:

On 2010-12-21 22:12, Brendan Eich wrote:

It's tiresome to argue by special pleading that one extension or transformation (including generated symbols) is "more complex, and less explanatory", while another is less so, when the judgment is completely subjective. And the absolutism about how it's always better in every instance to use strong encapsulation is, well, absolutist (i.e., wrong).

I gave clear technical arguments in that post. If you want to disagree with them, disagree with specific arguments, rather than painting me as an absolutist. (I'm not.)

Here's a quote: "As you can probably tell, I'm not much impressed by this counterargument. It's a viewpoint that favours short-termism and code that works by accident, rather than code that reliably works by design."

How do you expect anyone to respond? By endorsing bugs or programming based on partial knowledge and incomplete understanding? Yet the real world doesn't leave us the option to be perfect very often. This is what I mean by absolutism.

That isn't what absolutism generally means, so you could have been clearer.

What I said, paraphrasing, is that weak encapsulation favours code that doesn't work reliably in cases where the encapsulation is bypassed. Also, that if the encapsulation is never bypassed then it didn't need to be weak. What's wrong with this argument? Calling it "absolutist" is just throwing around insults, as far as I'm concerned.

When prototyping, weak or even no encapsulation is often the right thing, but you have to be careful with prototypes that get pressed into products too quickly (I should know). JS is used to prototype all the time.

OK, let's consider prototyping. In the soft fields proposal, a programmer could temporarily set a variable that would otherwise have held a soft field to a string. All accesses via that variable will work, but so will encapsulation-breaking accesses via the string name. Then before we release the code, we can put back the soft field (requiring only minimal code changes) and remove any remaining encapsulation-breaking accesses. Does this address the issue?
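That workflow can be sketched in running code, with a WeakMap standing in for a soft field and hypothetical get/set helpers in place of the proposed sugar (all names here are illustrative, not part of any proposal):

```javascript
// Sketch: a "field" that is a plain string during prototyping and a
// WeakMap-backed soft field at release time, accessed uniformly.
function get(base, key) {
  return typeof key === "string" ? base[key] : key.get(base);
}
function set(base, key, value) {
  if (typeof key === "string") base[key] = value;
  else key.set(base, value);
}

let key = "x";               // prototyping: an ordinary, inspectable property
const obj = {};
set(obj, key, 1);
console.log(get(obj, key));  // 1, and obj.x is visible to any inspector

key = new WeakMap();         // release: flip to an encapsulated side table
const obj2 = {};
set(obj2, key, 2);
console.log(get(obj2, key)); // 2, with nothing stored on obj2 itself
```

Flipping `key` between a string and a field object is the only change needed between the prototype and the release version; call sites are untouched.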

So rather than argue for strong encapsulation by setting up a straw man counterargument you then are not much impressed by, decrying short-termism, etc., I think it would be much more productive to try on others' hats and model their concerns, including for usable syntax.

Weak vs strong encapsulation is mostly independent of syntax. At least, all of the syntaxes that have been proposed so far can provide either strong or weak encapsulation, depending on the semantics.

There is a separate discussion to be had about whether the form of executable specification MarkM has used (not to be confused with the semantics) is the best form to use for any final spec. Personally, I like this form of specification: I think it is clear, concise (which aids holding the full specification of a feature in short-term memory), easy to reason about relative to other approaches, useful for prototyping, and useful for testing.

I don't mind at all that the correspondence with the implementation is less direct than it would be in a more operational style; implementors often need to handle less direct mappings than this, and I don't expect a language specification to be a literal description of how a language is implemented in general (excluding naive reference implementations).

Once again, you've argued about what you like, with subjective statements such as "I don't mind".

Yes, I try very hard not to misrepresent opinions as facts.

With inherited soft fields, the ability to "extend" frozen objects with private fields is an abstraction leak (and a feature, I agree).

How is it an abstraction leak? The abstraction is designed to allow this; it's not an accident (I'm fairly sure, without mind-reading MarkM).

If I give you an object but I don't want you "adding" fields to it, what do I do? Freezing works with private names, but it does not with soft fields.

What's your intended goal in preventing "adding" fields to the object?

If the goal is security or encapsulation, then freezing the object is sufficient. If I add the field in a side table, that does not affect your use of the object. I could do the same thing with aWeakMap.set(obj, value).
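The side-table idea is directly expressible with a plain WeakMap (variable names here are illustrative):

```javascript
// Sketch: associating new state with a frozen object via a side table.
// The object itself is never mutated, so freezing is not violated.
const sideTable = new WeakMap();

const obj = Object.freeze({ x: 1 });

// `obj.y = 2` would throw in strict mode; record y externally instead:
sideTable.set(obj, { y: 2 });

console.log(sideTable.get(obj).y);            // 2
console.log(Object.getOwnPropertyNames(obj)); // still only ["x"]
```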

If the goal is concurrency-safety, then we probably need to have a concurrency model in mind before discussing this in detail. However, adding fields in a side table does not affect the concurrency-safety of your code that does not have access to the table or those fields. It might affect the concurrency-safety of my code that does have that access; so I shouldn't add new fields and rely on my view of the object to be concurrency-safe just because the object is frozen. This doesn't seem like an onerous or impractical restriction.

With private names, the inability to "extend" frozen objects with private fields is a significant limitation.

Can you try taking a different point of view and exploring it, for a change? :-/

That statement you quoted is a technical argument. If you disagree, please say why you think that the inability to extend frozen objects is not a significant limitation. (It's an inability to do something, so it's a limitation, and it has plausible use cases that one might expect to be supported by this feature, so it's significant.)

It's not up to me to enumerate all of the possible ways in which people might disagree with me.

If you don't like x[#.id] / x.id supplanting x["id"] / x.id, it seems to me you have to count some similar demerits against this change.

If we compare both proposals with the additional syntax, they are equally "magical"; the only difference is whether the magic is built-in or the result of a desugaring.

Agreed. Disagree on desugaring to weak maps, not counting weak map complexity, or costs in executable spec approach, globally winning.

Weak map complexity is a sunk cost when considering the additional global cost of each of these proposals, so it should be discounted. The cost of the executable spec approach indeed should not be discounted.

The weak encapsulation design points are likewise "leaky" for private names, where no such leaks arise with soft fields: reflection and proxies can learn private names, they "leak" in the real ocap sense that secure subsets will have to plug.

As I said earlier, designers of secure subsets would prefer that this leak not exist in the first place, rather than having to plug it. Regardless of your statements above, this is not an absolutist position; the onus is on proponents of weak encapsulation to say why it is useful to have the leak (by technical argument, not just some vague philosophical position against strong encapsulation).

We're arguing about the default in a mostly-compatible next edition of JavaScript. JS has only closures for strong encapsulation. You can't make a "technical argument" or proof

I never claimed to make any proof. Statements about the desirability of language properties are not amenable to proof.

that strong encapsulation must be presumed the default,

I never claimed that it must be presumed the default. I presented arguments in favour of it.

so how can you put the onus on proponents of weak encapsulation to make any such bogus argument?

Because very few technical arguments have so far been made in favour of weak encapsulation. I made one that you dismissed as a strawman (it wasn't), and you made one about prototyping above.

Programmers use no-encapsulation and weak encapsulation in JS every day.

They would continue to be able to do that. No-one is suggesting that the ability to create unencapsulated objects with only public properties should be removed. No-one is suggesting that programmers be forbidden from putting _ in their public property names as a convention to mark weak encapsulation.

The question is, given that we are proposing to add a new encapsulation mechanism (at least, a feature with keywords and semantics that strongly suggest it is intended to be usable for encapsulation, even if it also has other uses), whether the encapsulation provided by that mechanism should be strong or weak.

To be fair, I think that trends and tendencies do matter. The language is being used at larger scale, where integrity and other properties need automated enforcement (at the programmer's discretion).

It should be clear that the programmer of an encapsulated abstraction always has discretion over the visibility of its state. For a strongly encapsulated abstraction, programmers of code outside the scope of the abstraction cannot have any discretion over that visibility (given a correct language implementation and excluding debugging, etc.), by definition.

But there's no onus reversal or technical standard of proof that will help us make design decisions. I know you want strong encapsulation. Others want weak. Now what?

I've also stated clearly why I want strong encapsulation, for both security and software engineering reasons. To be honest, I do not know why people want weak encapsulation. They have not told us. Perhaps their actual concerns can be addressed by a mechanism that provides strong encapsulation according to the definition I gave.

# David-Sarah Hopwood (14 years ago)

On 2010-12-22 18:59, Brendan Eich wrote:

On Dec 21, 2010, at 11:58 PM, Brendan Eich wrote:

... which is strictly weaker, more complex, and less explanatory.

So is a transposed get from an inherited soft field. Soft fields change the way square brackets work in JS, for Pete's sake!

They do not.

Ok, then I'm arguing with someone else on that point.

Many of us were wondering where my recollection of a (shared) square-brackets change for soft fields came from, and re-reading the numerous wiki pages. Finally we found what I had recollected in writing the above:

strawman:inherited_explicit_soft_fields#can_we_subsume_names

There, after Mark's demurral ("I (MarkM) do not like the sugar proposed for Names, ..."), is this:

"If we wish to adopt the same sugar for soft fields instead, then

private key; ... base[key] ...

could expand to

const key = SoftField(); ... key.get(base) ....

If there are remaining benefits of Names not addressed above, here would be a good place to list them. If we can come to consensus that soft fields do subsume Names, then “Name” becomes a possible choice for the name of the “SoftField” constructor." -----

This is clearly pitting soft fields against names in full, including a change to JS's square bracket syntax as sugar.

The hedging via "If" and the demurral do not remove this from the soft fields "side" of the death match I've been decrying, indeed they add to it. This is making a case for dropping names in full in favor of soft fields in (mostly -- no dot operator or object literal support) comparable fullness.

For the record, and in case there's a "next time": I don't think it's good form to chop up proposals made on the wiki (inherited soft fields, explicit soft fields, inherited explicit soft fields), mix in arguments about orthogonal syntax issues (the demurral even says "orthogonal"), and then use only some of the pieces to refute someone's argument based on the entirety of the wiki'ed work and the clear thrust of that work: to get rid of private names with something that is not a complete replacement.

It doesn't really matter what one correspondent among many wants (IOW, it's not "all about you" [or "me"]). The argument is about a shared resource, the wiki, and the strawman proposals on it that are advancing a particular idea (soft fields, with variations and syntax), and by doing so are trying to get rid of a different proposal (private names).

In arguing about this, I have this bait-and-switch sense that I'm being told A+B, then when I argue in reply against B, I'm told "no, no! only A!". (Cheat sheet: A is soft fields, B is transposed square bracket syntax for them.)

This criticism is baseless and without merit.

In order to compare the two semantic proposals, strawman:inherited_explicit_soft_fields#can_we_subsume_names

considers what they would look like with the same syntax. In that case, soft fields are semantically simpler.

This should not in any way preclude also criticising the syntax.

If your criticisms of soft fields plus the change to [] depended on the fact that the syntax change was layered on soft fields, then you might have a point. But in fact those criticisms apply to the syntax change regardless of which proposal it is layered on.

There was and is no "bait and switch".

I'm not saying this was any one person's malicious "trick". But it's clear now what happened; the wiki and list record and diagram how it played out. It leaves a bad taste.

You have willfully assumed bad faith, despite clear explanations. That certainly does leave a bad taste.

# Brendan Eich (14 years ago)

On Dec 22, 2010, at 2:56 PM, David-Sarah Hopwood wrote:

What I said, paraphrasing, is that weak encapsulation favours code that doesn't work reliably in cases where the encapsulation is bypassed. Also, that if the encapsulation is never bypassed then it didn't need to be weak. What's wrong with this argument?

The reliability point is fine but not absolute. Sometimes reliability is not primary.

You may disagree, but every developer who has had to meet a deadline knows this: missing the deadline meant nothing further would be developed at all, while hitting it in a hurry, say without strong encapsulation, meant there was time to work on stronger encapsulation and other means to greater reliability -- but later, after the deadline. At the deadline, the demo went off even though reliability bugs were lurking. They did not bite.

This is a common story from successful startups that I've been part of or advised.

The second part, asserting that if the encapsulation was never bypassed then it didn't need to be weak, as if that implies it might as well have been strong, assumes that strong is worth its costs vs. the (not needed, by the hypothesis) benefits.

But that's not obviously true, because strong encapsulation does have costs as well as benefits. It's often worth it, but not always. It may cost more than weak in the short run, but not the long run. It may cost more than it benefits in any time frame.

Yet your argument tries to say strong encapsulation is absolutely always worth it, since either it was needed for reliability, or else it wouldn't have hurt. This completely avoids the economic trade-offs -- the costs over time. Strong can hurt if it is unnecessary.

To be utterly concrete in the current debate: I'm prototyping something in a browser-based same-origin system that already uses plain old JS objects with properties. The system also has an inspector written in JS. Oh, and the system uses a framework where objects can be cloned using ES5's new meta-object APIs called by a clone method.

Now I want to make a private member of my objects, not for security but just to save my code from myself, who tends to use .x as a property name too much, and my colleague MonkeyBob, who likes to monkey-patch.

With private names, I just need to add the *** line to my constructor:

```js
function MyConstructor(x, ...) {
  private x; // ***
  this.x = x;
  ... // closures defined that use x
}
```

and I'm better off. Even if ES5's Object.getOwnPropertyNames is used by the inspector run by my other colleague ReliableFred, who needs to see x even though it is private. Fred won't do anything wrong with the value of x, but if he can't see it, he can't debug his code, which uses my code (the bug could be anywhere, or multi-factorial).

With soft fields, one has to write strictly more code:

```js
function MyConstructor(x, ...) {
  const mySoftField = SoftField();
  mySoftField.set(this, x);
  ... // closures defined that use mySoftField
}
```

And what's worse, I've broken ReliableFred's benign inspector use-case. And I've also broken clone. My boss finds out and fires me, we miss the demo deadline and fail to get funding, the company fails. Even the strong encapsulation angels cry tears of blood.

[big snip]

I've also stated clearly why I want strong encapsulation, for both security and software engineering reasons. To be honest, I do not know why people want weak encapsulation. They have not told us.

Yes, they have. In the context of this thread, Allen took the trouble to write this section:

strawman:private_names#private_name_properties_support_only_weak_encapsulation

Quoting: "Private names are instead intended as a simple extensions of the classic JavaScript object model that enables straight-forward encapsulation in non-hostile environments. The design preserves the ability to manipulate all properties of an objects at a meta level using reflection and the ability to perform “monkey patching” when it is necessary."

# Brendan Eich (14 years ago)

On Dec 22, 2010, at 3:49 PM, David-Sarah Hopwood wrote:

In arguing about this, I have this bait-and-switch sense that I'm being told A+B, then when I argue in reply against B, I'm told "no, no! only A!". (Cheat sheet: A is soft fields, B is transposed square bracket syntax for them.)

This criticism is baseless and without merit.

In order to compare the two semantic proposals, strawman:inherited_explicit_soft_fields#can_we_subsume_names considers what they would look like with the same syntax.

Wrong. That section has

private key; ... base[key] ...

and thus assumes "private key" creates a private-name value bound to key that can be used in brackets. That is not how private names as proposed by Allen works, nor how the earlier names proposal worked.

Private names, and names before it, proposes lexical bindings for private names which can be used only after dot in a member expression and before colon in an object initialiser's property initialiser. A private-declared name cannot be used in brackets to get at the property -- not without #. to reflect it into runtime.

Clearly, strawman:inherited_explicit_soft_fields#can_we_subsume_names gets this wrong. Either the names proposal was misunderstood, or the square-bracket-only syntax was a compromise. It simply is not and never was "the same syntax".

In that case, soft fields are semantically simpler.

I reject all your premises, so it is pointless to argue about conclusions that depend on them.

First, the simpler semantics for a different, inferior syntax does not win over more complex semantics for a simpler and more usable syntax. Users of the language are the audience in most need of simplicity, not implementors or spec writers. The spec is not the ultimate good to optimize in this way.

Second, the soft fields semantic model is not simpler when you count everything it depends on, and where it shifts complexity (implementors and users).

Finally, I disagree that an executable spec, aka self-hosted library code as spec, wins.

But see below -- at this point, it's clear we should not be arguing about soft fields vs. private names as if they are alternatives or in any way substitutable.

I'm not saying this was any one person's malicious "trick". But it's clear now what happened; the wiki and list record and diagram how it played out. It leaves a bad taste.

You have willfully assumed bad faith, despite clear explanations. That certainly does leave a bad taste.

No, I explicitly disclaimed bad faith in the cited first line above ("I'm not saying this was any one person's malicious 'trick'.").

Putting up a bunch of wiki pages with the intent of knocking down a different proposal is aggressive. I don't think that's "bad faith" and I never said so. Crock cheered it on. In accusing me of assuming bad faith, you are just missing the target completely.

But any such (premature or just wrong) contest between proposals has to compare apples to apples. We can't even agree on the syntax apples, never mind how to compare the semantic model apples.

Even with the best faith (which I presume MarkM has, and he knows this), trying to compare different semantic models, tracking multiple wiki pages, while setting up an elimination-contest context, is "tricky", as in, a lot can go wrong. And (see above, where you repeat the falsehood that Mark's section can_we_subsume_names uses "the same syntax" as private names or names) things did go wrong.

I now think it is a mistake to try to build consensus by separating syntax from semantics, mapping one chopped-down, feature-stripped syntax to both private names and soft fields, and essentially joining in the campaign to make there "be only one" (among two proposals). Here's why:

Private names have nothing to do with soft fields. They are an independent proposal.

Soft fields do not need more than weak maps, which are already harmonious.

If you don't like private names, fine. Maybe they won't make it. Or with feedback from this list and the community, they might evolve to something that gets into a future edition, but soft fields don't bear on the odds at all.

Trying to replace private names with (inevitably different) syntax mapped to soft fields is not going to please fans of either proposal.

# David-Sarah Hopwood (14 years ago)

On 2010-12-23 00:40, Brendan Eich wrote:

On Dec 22, 2010, at 2:56 PM, David-Sarah Hopwood wrote:

What I said, paraphrasing, is that weak encapsulation favours code that doesn't work reliably in cases where the encapsulation is bypassed. Also, that if the encapsulation is never bypassed then it didn't need to be weak. What's wrong with this argument?

The reliability point is fine but not absolute. Sometimes reliability is not primary.

You may disagree, but every developer knows this who has had to meet a deadline, where missing the deadline meant nothing further would be developed at all, while hitting the deadline in a hurry, say without strong encapsulation, meant there was time to work on stronger encapsulation and other means to achieve the end of greater reliability -- but later, after the deadline. At the deadline, the demo went off even though reliability bugs were lurking. They did not bite.

How precisely would weak encapsulation (specifically, a mechanism that is weak because of the reflection and proxy trapping loopholes) help them to meet their deadline?

(I don't find your inspector example compelling, for reasons given below.)

The second part, asserting that if the encapsulation was never bypassed then it didn't need to be weak, as if that implies it might as well have been strong, assumes that strong is worth its costs vs. the (not needed, by the hypothesis) benefits.

But that's not obviously true, because strong encapsulation does have costs as well as benefits.

What costs are you talking about?

  • Not specification complexity, because the proposal that has the simplest spec complexity so far (soft fields, either with or without syntax changes) provides strong encapsulation.

  • Not runtime performance, because the strength of encapsulation makes no difference to that.

  • Not syntactic convenience, because there exist both strong-encapsulation and weak-encapsulation proposals with the same syntax.

  • Not implementation complexity, because that's roughly similar.

So, what costs? It is not an axiom that proposals with any given desirable property have greater cost (in any dimension) than proposals without that property.

Yet your argument tries to say strong encapsulation is absolutely always worth it, since either it was needed for reliability, or else it wouldn't have hurt. This completely avoids the economic trade-offs -- the costs over time. Strong can hurt if it is unnecessary.

How precisely can it hurt, relative to using the same mechanism with loopholes?

To be utterly concrete in the current debate: I'm prototyping something in a browser-based same-origin system that already uses plain old JS objects with properties. The system also has an inspector written in JS.

[snip example in which the only problem is that the inspector doesn't show private fields because it is using getOwnPropertyNames]

Inspectors can bypass encapsulation regardless of the language spec. Specifically, an inspector that supports Harmony can see that there is a declaration of a private variable x, and show that field on any objects that are being inspected. It can also display the side table showing the value of x for all objects that have that field.

Disadvantages: slightly greater implementation complexity in the inspector, and lack of compatibility with existing inspectors that don't explicitly support Harmony.

Note that inspectors for JS existed prior to the addition of getOwnPropertyNames, so that is merely a convenience and a way to avoid implementation dependencies in the inspector.
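A soft-field-aware inspector of the kind described could be sketched like this, assuming a hypothetical registry that records declared fields (`fieldRegistry`, `declareField`, and `inspect` are all illustrative names, not proposed API):

```javascript
// Sketch: an inspector that knows which soft fields have been declared
// can show them alongside ordinary own properties.
const fieldRegistry = [];                 // hypothetical: [{ name, table }]

function declareField(name) {
  const table = new WeakMap();
  fieldRegistry.push({ name, table });
  return table;
}

function inspect(obj) {
  const view = {};
  for (const prop of Object.getOwnPropertyNames(obj)) view[prop] = obj[prop];
  for (const { name, table } of fieldRegistry) {
    if (table.has(obj)) view["[soft] " + name] = table.get(obj);
  }
  return view;
}

const xField = declareField("x");
const o = { pub: 1 };
xField.set(o, 42);
console.log(inspect(o)); // shows pub: 1 and "[soft] x": 42
```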

With soft fields, one has to write strictly more code:

Nope, see above.

I've also stated clearly why I want strong encapsulation, for both security and software engineering reasons. To be honest, I do not know why people want weak encapsulation. They have not told us.

Yes, they have. In the context of this thread, Allen took the trouble to write this section:

strawman:private_names#private_name_properties_support_only_weak_encapsulation

Quoting: "Private names are instead intended as a simple extensions of the classic JavaScript object model that enables straight-forward encapsulation in non-hostile environments. The design preserves the ability to manipulate all properties of an objects at a meta level using reflection and the ability to perform “monkey patching” when it is necessary."

Strong encapsulation does not interfere with the ability to add new monkey-patched properties (actually fields). What it does prevent, by definition, is the ability to modify or read existing private fields to which the accessor does not have the relevant field object. What I was looking for was not mere assertion that this is sometimes necessary to be able to do that, but an explanation of why.

As for "the ability to manipulate all properties of objects at a meta level using reflection", strictly speaking that is still possible in the soft fields proposal because soft fields are not properties. This is not mere semantics; these fields are associated with the object, but it is quite intentional that the object model views them as being stored on a side table. Note that other methods of associating private state with an object, such as closing over variables, do not allow that state to be accessed by reflection on the object either.

# Brendan Eich (14 years ago)

On Dec 22, 2010, at 6:39 PM, David-Sarah Hopwood wrote:

Inspectors can bypass encapsulation regardless of the language spec.

The Inspector is written in ES5. How does it bypass soft field strong encapsulation?

As for "the ability to manipulate all properties of objects at a meta level using reflection", strictly speaking that is still possible in the soft fields proposal because soft fields are not properties. This is not mere semantics; these fields are associated with the object, but it is quite intentional that the object model views them as being stored on a side table.

The side table is in a closure environment only, not available to the inspector, which uses getOwnPropertyNames:

```js
function MyConstructor(x, ...) {
  const mySoftField = SoftField();
  mySoftField.set(this, x);
  ... // closures defined that use mySoftField
}
```

Note that other methods of associating private state with an object, such as closing over variables, do not allow that state to be accessed by reflection on the object either.

That's right, and that is exactly Allen's point in writing the rationale for weak encapsulation that he wrote, and my point in using the example ReliableFred relies upon: an inspector hosted in the browser written in ES5.

You wrote too long a reply again, with lots of extra words claiming to rebut me, but you got this fundamental part of the example completely wrong, and inverted the rationale for weak encapsulation.

We do not want to require a deoptimizing native code hosted debugger or inspector to peek in closures. Even if you have one, finding the soft field requires the user to know where to look. With private names, there's no mystery: you look on the object itself, using getOwnPropertyNames.

Please reply in <500 words.

# David-Sarah Hopwood (14 years ago)

On 2010-12-23 01:11, Brendan Eich wrote:

On Dec 22, 2010, at 3:49 PM, David-Sarah Hopwood wrote:

In arguing about this, I have this bait-and-switch sense that I'm being told A+B, then when I argue in reply against B, I'm told "no, no! only A!". (Cheat sheet: A is soft fields, B is transposed square bracket syntax for them.)

This criticism is baseless and without merit.

In order to compare the two semantic proposals, strawman:inherited_explicit_soft_fields#can_we_subsume_names considers what they would look like with the same syntax.

Wrong. That section has

private key; ... base[key] ...

and thus assumes "private key" creates a private-name value bound to key that can be used in brackets. That is not how private names as proposed by Allen works, nor how the earlier names proposal worked.

That section is clear that it is talking about the syntax proposed in strawman:names. (Adapting it to the private_names syntax is trivial, though.)

The "Name objects as property names" section of that page gives an example in which 'var name = new Name' creates an object that can be used via 'obj[name]'. The "Binding private names" section says that in the scope of a 'private x' declaration, "x is also bound as a plain variable to the Name value."

Therefore, 'private key;' binds the plain variable 'key' to a Name value which can be used as 'base[key]'. Your interpretation of the names proposal is wrong and Mark's was correct.
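For a present-day reader, an ES6 Symbol has roughly the shape Name values were proposed to have, and can stand in for one to illustrate the bracket usage Mark assumed (this is an analogy only; Symbols postdate this discussion):

```javascript
// Sketch: a Symbol playing the role of the Name value bound by
// `private key;` in the names strawman.
const key = Symbol("key");
const base = {};

base[key] = "private value";     // usable in brackets, as base[key]

console.log(base[key]);          // "private value"
console.log(Object.keys(base));  // [] -- not listed among string-named keys
```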

As far as I can see, MarkM has not (at least, not on the wiki) proposed any new syntax in this discussion that had not already been proposed in one of Allen's proposals.

Private names, and names before it, proposes lexical bindings for private names which can be used only after dot in a member expression and before colon in an object initialiser's property initialiser. A private-declared name cannot be used in brackets to get at the property -- not without #. to reflect it into runtime.

Clearly, strawman:inherited_explicit_soft_fields#can_we_subsume_names gets this wrong.

You apparently missed the statement "x is also bound as a plain variable to the Name value." in the names proposal, which would explain your confusion on this point.

Either the names proposal was misunderstood, or the square-bracket-only syntax was a compromise. It simply is not and never was "the same syntax".

In that case, soft fields are semantically simpler.

I reject all your premises, so it is pointless to argue about conclusions that depend on them.

Do you still reject them after being shown that the syntax in MarkM's proposal is in fact the same syntax?

First, the simpler semantics for a different, inferior syntax does not win over more complex semantics for a simpler and more usable syntax. Users of the language are the audience in most need of simplicity, not implementors or spec writers. The spec is not the ultimate good to optimize in this way.

This argument clearly fails, because the syntax that you're criticising as inferior is actually the syntax defined in the names proposal.

There is no obstacle whatsoever to the soft fields semantics being used with any of the syntaxes that have been proposed so far.

Second, the soft fields semantic model is not simpler when you count everything it depends on, and where it shifts complexity (implementors and users).

OK, there's an interesting point here, which is the extent to which reliance on existing language constructs ("existing" in the sense of not added as part of the feature under consideration), should be counted toward a new feature's complexity, relative to reliance on new constructs added together with the feature.

I think that use of new constructs ought to be charged more in complexity cost than use of existing constructs, all else being equal. This is an opinion, but I would have thought it's a rather uncontroversial one.

In any case, I don't find WeakMap (or other constructs used by the SoftField executable specification) particularly complex. YMMV.
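For concreteness, a minimal explicit SoftField can be sketched over WeakMap in a few lines; this ignores the prototype-inheritance behaviour of the full strawman and uses illustrative names:

```javascript
// Sketch: an explicit (non-inherited) soft field as a thin wrapper
// around a per-field WeakMap side table.
function SoftField() {
  const table = new WeakMap();
  return {
    get(base) { return table.get(base); },
    set(base, value) { table.set(base, value); },
    has(base) { return table.has(base); }
  };
}

const field = SoftField();
const obj = Object.freeze({});
field.set(obj, "hidden");     // works even on a frozen object
console.log(field.get(obj));  // "hidden"
```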

The soft fields model does not shift complexity onto users because their perception of complexity depends mainly on the syntax, which is the same. The differences in semantics are unlikely to be noticed in most situations.

The actual implementation complexity is no greater for soft fields. The soft fields specification has a less direct correspondence to the implementation, and we disagree on the significance of that.

Finally, I disagree that an executable spec, aka self-hosted library code as spec, wins.

But see below -- at this point, it's clear we should not be arguing about soft fields vs. private names as if they are alternatives or in any way substitutable.

We're not going to be able to agree on adding both, so they are alternatives.

I'm not saying this was any one person's malicious "trick". But it's clear now what happened; the wiki and list record and diagram how it played out. It leaves a bad taste.

You have willfully assumed bad faith, despite clear explanations. That certainly does leave a bad taste.

No, I explicitly disclaimed bad faith in the cited first line above ("I'm not saying this was any one person's malicious 'trick'.").

Putting up a bunch of wiki pages with the intent of knocking down a different proposal is aggressive.

I think you're seeing "aggression" where there is only straightforward technical disagreement, combined with your own misunderstanding of one of the proposals.

# David-Sarah Hopwood (14 years ago)

On 2010-12-23 02:48, Brendan Eich wrote:

On Dec 22, 2010, at 6:39 PM, David-Sarah Hopwood wrote:

Inspectors can bypass encapsulation regardless of the language spec.

The Inspector is written in ES5. How does it bypass soft field strong encapsulation?

I meant, obviously, that inspectors in general can bypass encapsulation.

It is not clear to me that a usable inspector can be written purely in ES5 using the reflection API. Doesn't an inspector have to be able to read variables in any scope? Or maybe you mean by "inspector" something less ambitious than I'm thinking of (but then it's not clear that it needs to be able to read private fields, since it also can't read closed-over variables).

As for "the ability to manipulate all properties of objects at a meta level using reflection", strictly speaking that is still possible in the soft fields proposal because soft fields are not properties. This is not mere semantics; these fields are associated with the object, but it is quite intentional that the object model views them as being stored on a side table.

The side table is in a closure environment only, not available to the inspector, which uses getOwnPropertyNames:

function MyConstructor(x, ...) {
  const mySoftField = SoftField();
  mySoftField.set(this, x);
  ... // closures defined that use mySoftField
}

OK, you're assuming that the inspector can't read state from closures. So why does it matter that it can't read private fields, given that the programmer would probably have used closures if they were not using private fields?

Note that other methods of associating private state with an object, such as closing over variables, do not allow that state to be accessed by reflection on the object either.

That's right, and that is exactly Allen's point in writing the rationale for weak encapsulation that he wrote, and my point in using the example ReliableFred relies upon: an inspector hosted in the browser written in ES5.

The constraint that the inspector be written in ES5 seems to be a purely artificial one. All of the commonly used browsers have debugger extensions.

Please reply in <500 words.

No, I'm not going to play your word-counting game.

# Brendan Eich (14 years ago)

On Dec 22, 2010, at 7:34 PM, David-Sarah Hopwood wrote:

As far as I can see, MarkM has not (at least, not on the wiki) proposed any new syntax in this discussion that had not already been proposed in one of Allen's proposals.

Wrong again. Allen did not write the original strawman:names proposal -- from the top of private_names:

"Original proposal by Dave Herman and Sam Tobin-Hochstadt is here."

Follow that link and read strawman:names#binding_private_names to see only examples using x.key, etc. -- no square brackets.

Mark's example predates private_names and so may have worked in the old names proposal, but only via square brackets. Not via dot -- so again not "the same syntax" as what even strawman:names proposed.

Never mind the private names proposal that supersedes names -- not faulting Mark for lacking clairvoyance here -- I'm faulting you for twisting "the same syntax" from its obvious meaning of "all the same syntax" to "the subset that uses square brackets".

You seem to have a problem owning up to mistakes. I've counted four so far, and each time I point them out you either deny with lots of words, or ignore. This makes it hard to justify continuing our little exchange.

# Brendan Eich (14 years ago)

On Dec 22, 2010, at 7:49 PM, David-Sarah Hopwood wrote:

On 2010-12-23 02:48, Brendan Eich wrote:

On Dec 22, 2010, at 6:39 PM, David-Sarah Hopwood wrote:

Inspectors can bypass encapsulation regardless of the language spec.

The Inspector is written in ES5. How does it bypass soft field strong encapsulation?

I meant, obviously, that inspectors in general can bypass encapsulation.

I gave an example where weak encapsulation wins and you want to generalize it to include native-code-hosted inspectors. Nope.

OK, you're assuming that the inspector can't read state from closures.

It's an object inspector.

So why does it matter that it can't read private fields, given that the programmer would probably have used closures if they were not using private fields?

We starving startup programmers would probably have done what you wish to change the example? Nope.

The constraint that the inspector be written in ES5 seems to be a purely artificial one. All of the commonly used browsers have debugger extensions.

Nope, our little startup (mine, MonkeyBob's, and ReliableFred's -- plus the boss) is writing a cross-browser framework and app. No native code, let alone deoptimizing magic VM-ported code for each top JS VM.

Please reply in <500 words.

No, I'm not going to play your word-counting game.

Game over.
# David-Sarah Hopwood (14 years ago)

On 2010-12-23 05:14, Brendan Eich wrote:

On Dec 22, 2010, at 7:49 PM, David-Sarah Hopwood wrote:

The constraint that the inspector be written in ES5 seems to be a purely artificial one. All of the commonly used browsers have debugger extensions.

Nope, our little startup (mine, MonkeyBob's, and ReliableFred's -- plus the boss) is writing a cross-browser framework and app. No native code, let alone deoptimizing magic VM-ported code for each top JS VM.

You don't need the debugger to be part of your framework and app, in order to use it for development.

(There, concise enough this time?)

# Brendan Eich (14 years ago)

On Dec 22, 2010, at 9:31 PM, David-Sarah Hopwood wrote:

On 2010-12-23 05:14, Brendan Eich wrote:

On Dec 22, 2010, at 7:49 PM, David-Sarah Hopwood wrote:

The constraint that the inspector be written in ES5 seems to be a purely artificial one. All of the commonly used browsers have debugger extensions.

Nope, our little startup (mine, MonkeyBob's, and ReliableFred's -- plus the boss) is writing a cross-browser framework and app. No native code, let alone deoptimizing magic VM-ported code for each top JS VM.

You don't need the debugger to be part of your framework and app, in order to use it for development.

Nope, the inspector (not a debugger) is a higher-level tool customized to our framework and app, written in cross-browser JS, using jQuery and whatever's hawt.

Why you keep changing terms of my example is beyond me. Not your startup!

(There, concise enough this time?)

Yes!

# David-Sarah Hopwood (14 years ago)

On 2010-12-23 05:08, Brendan Eich wrote:

On Dec 22, 2010, at 7:34 PM, David-Sarah Hopwood wrote:

As far as I can see, MarkM has not (at least, not on the wiki) proposed any new syntax in this discussion that had not already been proposed in one of Allen's proposals.

Wrong again. Allen did not write the original strawman:names proposal.

Fine, one of Allen's, or Dave Herman and Sam Tobin-Hochstadt's, proposals. Mea culpa. Does it affect my argument at all? No.

Follow that link and read strawman:names#binding_private_names to see only examples using x.key, etc. -- no square brackets.

What does the lack of an example have to do with anything?

Read what it says: in the scope of 'private x', "x is also bound as a plain variable to the Name value."

Combined with the previous example:

var name = new Name;
...
obj[name] = "secret";
print(obj[name]); // secret

it's clear that the square bracket syntax is valid in the scope of a private declaration. That is what MarkM's desugaring faithfully emulates.

Perhaps that is not what the authors of the names proposal intended. If so, how was MarkM supposed to know that?

Mark's example predates private_names and so may have worked in the old names proposal,

It explicitly says that it does; there's no "may" here.

but only via square brackets. Not via dot -- so again not "the same syntax" as what even strawman:names proposed.

That page doesn't explicitly spell out the desugaring of '.', but MarkM did so later. There's clearly no conflict with the soft field semantics, which is the important thing, anyway.

Never mind the private names proposal that supersedes names -- not faulting Mark for lacking clairvoyance here -- I'm faulting you for twisting "the same syntax" from its obvious meaning of "all the same syntax" to "the subset that uses square brackets".

Only if you're determined to misinterpret it can strawman:inherited_explicit_soft_fields#can_we_subsume_names be mistaken for a complete proposal of how to desugar the names syntax. It is obviously a partial outline.

You seem to have a problem owning up to mistakes.

I have a problem owning up to mistakes?

secure.wikimedia.org/wikipedia/en/wiki/Psychological_projection

# Dave Herman (14 years ago)

MarkM's desugaring doesn't look correct to me at all. Given that names can always be looked up in objects, regardless of whether they are bound with 'private', it is not amenable to simulation via local desugaring. You'd have to change the way square brackets are treated universally. Did you see my message about this earlier in the thread?
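Dave's point can be made concrete: since any `obj[expr]` site might receive a soft-field key at runtime, a faithful rewrite would have to insert a dynamic test at every lookup, not just at sites lexically inside a `private` declaration. A hypothetical sketch (the names `SoftFieldKey`, `softGet`, and `softSet` are illustrative, not from any proposal):

```javascript
// What a universal, non-local rewrite of obj[expr] would have to do
// under the soft-field reading: test the key's type at every lookup.
class SoftFieldKey {
  constructor() { this._table = new WeakMap(); }
  get(obj) { return this._table.get(obj); }
  set(obj, value) { this._table.set(obj, value); }
}

function softGet(obj, key) {
  return key instanceof SoftFieldKey ? key.get(obj) : obj[key];
}

function softSet(obj, key, value) {
  if (key instanceof SoftFieldKey) key.set(obj, value);
  else obj[key] = value;
}

const k = new SoftFieldKey();
const o = { color: "red" };
softSet(o, k, "hidden");
console.log(softGet(o, "color")); // "red"
console.log(softGet(o, k));       // "hidden" -- no own property added to o
```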

# Mark S. Miller (14 years ago)

I would like to encourage everyone to stop arguing about whether my old syntax at <strawman:inherited_explicit_soft_fields#can_we_subsume_names> was or was not a faithful adaptation of the old names syntax at <strawman:names>. Names has moved beyond that old syntax and I am now concerned with the new one. My apologies for not updating my old page since then.

At the top of <strawman:names_vs_soft_fields> I show a soft fields desugaring for the currently proposed names syntax. Because [] is problematic for the reasons Dave and Brendan have explained, I delay the discussion of [] till <strawman:names_vs_soft_fields#accessing_private_identifiers_as_soft_field_values>, where it parallels the discussion of [] in the private names proposal. There, I examine that issue a bit, mentioning that we could use the "kludge" on my earlier page, but also showing alternatives that some here will find less satisfying. Be sure to read to the bottom of that section.

In any case, I'd like to withdraw my unqualified use of "orthogonal" on this thread, lest it be misunderstood as a claim that the syntax issues for private names and for soft fields are precisely identical. Please everyone, read the page where I go through parallel examples. I discuss some pros and cons each way as I go. I wrote this listing of differences before this thread started. I stand by my overall sense that the syntactic issues are still independent enough from the semantics that, with minor adjustments, any viable syntax proposal could be applied to either semantics. I still think the syntax and semantics at stake here are best discussed as separate questions, unless of course one of these non-orthogonalities actually turns out to be important.

I would also like to encourage the continued exploration of alternative syntaxes, such as the sigil and @ approaches previously mentioned.

Brendan, I still do not understand why you think it is illegitimate to consider private names and soft fields as alternatives. Do you really think we should provide syntactic support for both?

# Brendan Eich (14 years ago)

On Dec 22, 2010, at 11:34 PM, Mark S. Miller wrote:

Brendan, I still do not understand why you think it is illegitimate to consider private names and soft fields as alternatives. Do you really think we should provide syntactic support for both?

The discussion here, including Dave's point about transposed get or set for [] being conceptually mismatched to the current [] meaning, and David-Sarah's reply about why you can't stop a third party from using your frozen object identity as the key in a weak map, have convinced me that even the frozen AST example doesn't need syntax, so much as weak maps and whatever soft fields make sense on top of them as library code.

That leaves the private names proposal the lone bearer of new syntax.

Cheers,
--Dr. Freeze

:-}

# Mark S. Miller (14 years ago)

On Wed, Dec 22, 2010 at 11:30 PM, Dave Herman <dherman at mozilla.com> wrote:

MarkM's desugaring doesn't look correct to me at all. Given that names can always be looked up in objects, regardless of whether they are bound with 'private', it is not amenable to simulation via local desugaring. You'd have to change the way square brackets are treated universally. Did you see my message about this earlier in the thread?

I agree. I have not revisited the [] issue specifically in light of the new names syntax except for the section where I refer to the previous [] discussion as a kludge. (Just noting again that my "kludge" admission there predates this thread.) Depending on what pair of syntax and semantics we desire, if we do want to use [] with soft fields, then I agree -- you cannot do so by desugaring. Instead, I would change the [[Get]] and [[Put]] operations to test if their argument is a SoftField, in a way precisely analogous to how Names would change these to check whether the argument is a Name. This would make SoftFields necessarily built-in rather than equivalent to a library, just as Names are. I consider this a demerit but not fatal -- I prefer proposals that can be explained as equivalent to a library. YMMV. Nevertheless, if we decide this use with square brackets is important, I would not object to making this change to [[Get]] and [[Put]].

My current preference is that, rather than extend the use of [] for either proposal, that we adopt some alternate syntax, such as sigils or @, that preserves the analogy with public properties but maintains a distinction between the two. This is not a deeply held or thought through position. I look forward to an exploration of possible syntaxes. As several have suggested, both publicly and privately (thanks), I no longer recuse myself from syntax. But I will strive to keep these discussions separate until someone shows a compelling coupling between the two.

# Mark S. Miller (14 years ago)

On Wed, Dec 22, 2010 at 11:44 PM, Brendan Eich <brendan at mozilla.com> wrote:

On Dec 22, 2010, at 11:34 PM, Mark S. Miller wrote:

Brendan, I still do not understand why you think it is illegitimate to consider private names and soft fields as alternatives. Do you really think we should provide syntactic support for both?

The discussion here, including Dave's point about transposed get or set for [] being conceptually mismatched to the current [] meaning, and David-Sarah's reply about why you can't stop a third party from using your frozen object identity as the key in a weak map, have convinced me that even the frozen AST example doesn't need syntax, so much as weak maps and whatever soft fields make sense on top of them as library code.

I do not understand this reply. Could you expand?

# Dave Herman (14 years ago)

[having trouble with my phone. Trying again]

This doesn't have anything to do with new revisions of the names proposal. Every version, including the original, extended [[Get]] and [[Set]] and hence effectively overloaded the square bracket notation.

# Brendan Eich (14 years ago)

On Dec 22, 2010, at 11:58 PM, Mark S. Miller wrote:

On Wed, Dec 22, 2010 at 11:44 PM, Brendan Eich <brendan at mozilla.com> wrote: On Dec 22, 2010, at 11:34 PM, Mark S. Miller wrote:

Brendan, I still do not understand why you think it is illegitimate to consider private names and soft fields as alternatives. Do you really think we should provide syntactic support for both?

The discussion here, including Dave's point about transposed get or set for [] being conceptually mismatched to the current [] meaning, and David-Sarah's reply about why you can't stop a third party from using your frozen object identity as the key in a weak map, have convinced me that even the frozen AST example doesn't need syntax, so much as weak maps and whatever soft fields make sense on top of them as library code.

I do not understand this reply. Could you expand?

Dave wrote:

"[O]verloading that syntax to mean lookup in a side table is what seems like a drastic break from the intuitive model of objects. I have nothing against side tables as a programming idiom; it's when you make them look like they aren't side tables that they become confusing. Especially when you can do counter-intuitive things like add new properties to a frozen object. Of course, there are clearly use cases where you want to associate new data with a frozen object, and convenience would be helpful. I'm just not convinced that making it look like ordinary object lookup is the right programmer-interface."

David-Sarah wrote:

"What's your intended goal in preventing 'adding' fields to [a frozen] object?

If the goal is security or encapsulation, then freezing the object is sufficient. If I add the field in a side table, that does not affect your use of the object. I could do the same thing with aWeakMap.set(obj, value)."

To answer David-Sarah's question, my goal in preventing "adding" fields to a frozen object depends on the syntax used to add those soft fields in exactly this way: aWeakMap.set(obj, value) is no problem, but (per Dave's words) obj[aSoftField] = value succeeding in spite of obj being frozen is a problem, conceptually for teachers, learners, documenters, hackers -- everyone building a mental model of "objects and properties" including freezing.

It seems you agree enough to be exploring @ instead of ., which could desugar to transposed .get or .set. So perhaps more new syntax will help, rather than less new syntax and too much overloading of old.
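The contrast Brendan draws here between aWeakMap.set(obj, value) and obj[aSoftField] = value can be shown concretely. A minimal sketch using WeakMap (the names below are illustrative):

```javascript
"use strict";
const node = Object.freeze({ op: "+", left: 1, right: 2 });

// Ordinary property addition on a frozen object fails (throws in strict mode):
let threw = false;
try { node.cached = 3; } catch (e) { threw = true; }

// A weak-map side table happily associates new state with the same frozen object:
const cache = new WeakMap();
cache.set(node, 3);
console.log(threw, cache.get(node)); // true 3
```

Nobody's mental model of freezing is disturbed by the second form, because it does not look like property assignment.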

# Kevin Smith (14 years ago)

If I might ask a side-question: what's the value in making an object non-extensible in ES5? I understand the value of making properties non-configurable or non-writable, but I don't yet see a reason to prevent extensions.

# Mark S. Miller (14 years ago)

On Thu, Dec 23, 2010 at 5:53 AM, Kevin Smith <khs4473 at gmail.com> wrote:

If I might ask a side-question: what's the value in making an object non-extensible in ES5? I understand the value of making properties non-configurable or non-writable, but I don't yet see a reason to prevent extensions.

Hi Kevin, Allen also asked about this. Quoting from <esdiscuss/2010-December/012342>:

Allen asked:

Even if this style did become the norm, I don't see why you would argue in support of mechanisms that allow extension of frozen objects. Isn't the whole point of freezing to prevent any extensions?

I responded:

# Brendan Eich (14 years ago)

On Dec 23, 2010, at 5:53 AM, Kevin Smith wrote:

If I might ask a side-question: what's the value in making an object non-extensible in ES5? I understand the value of making properties non-configurable or non-writable, but I don't yet see a reason to prevent extensions.

Mark's answer brought up shared built-ins (primordials) but I don't think the case of shared mutable primordials in a browser frame is necessary to argue for Object.preventExtensions(). An extensible object O can go wrong as follows:

Object detection patterns can be confused by party A extending O.P, party B detecting ('P' in O) and wrongly assuming O has the P that B expects. In general, an extensible object can be extended to spoof another object (accidentally or on purpose), allowing the extended object to pass where it should not pass.

ES5 strict mode error throwing helps here. This isn't just about security. In developing any moderate amount of JS (e.g., Narcissus, in our experience), it's too easy to have bugs where the wrong object is extended. Of course, I'm not saying you can prevent extensions of all objects in such a codebase. Clearly, some need to be extensible (at least for a while).

But the preventExtensions tool has its uses, and without it one has to insert manual "type tests". Immutable ASTs and IR forms in compilers are a use-case for freezing objects, as mentioned earlier (with some research pointers); these also motivate weak maps.
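The object-detection hazard Brendan describes is easy to demonstrate. A small illustrative sketch (the `api` object and property names are made up for the example):

```javascript
"use strict";
// Party A's object, locked against accidental extension:
const api = { send: function (msg) { return "sent:" + msg; } };
Object.preventExtensions(api);

// A later typo'd extension fails loudly in strict mode
// instead of silently creating a stray property:
let threw = false;
try { api.sendAsycn = function () {}; } catch (e) { threw = true; }

// So party B's object-detection test cannot be confused by the stray name:
console.log(threw, 'sendAsycn' in api); // true false
```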

# Mark S. Miller (14 years ago)

On Thu, Dec 23, 2010 at 12:18 AM, Brendan Eich <brendan at mozilla.com> wrote:

On Dec 22, 2010, at 11:58 PM, Mark S. Miller wrote:

On Wed, Dec 22, 2010 at 11:44 PM, Brendan Eich <brendan at mozilla.com>wrote:

On Dec 22, 2010, at 11:34 PM, Mark S. Miller wrote:

Brendan, I still do not understand why you think it is illegitimate to consider private names and soft fields as alternatives. Do you really think we should provide syntactic support for both?

The discussion here, including Dave's point about transposed get or set for [] being conceptually mismatched to the current [] meaning, and David-Sarah's reply about why you can't stop a third party from using your frozen object identity as the key in a weak map, have convinced me that even the frozen AST example doesn't need syntax, so much as weak maps and whatever soft fields make sense on top of them as library code.

I do not understand this reply. Could you expand?

Dave wrote:

"[O]verloading that syntax to mean lookup in a side table is what seems like a drastic break from the intuitive model of objects. I have nothing against side tables as a programming idiom; it's when you make them look like they aren't side tables that they become confusing. Especially when you can do counter-intuitive things like add new properties to a frozen object. Of course, there are clearly use cases where you want to associate new data with a frozen object, and convenience would be helpful. I'm just not convinced that making it look like ordinary object lookup is the right programmer-interface."

David-Sarah wrote:

"What's your intended goal in preventing 'adding' fields to [a frozen] object?

If the goal is security or encapsulation, then freezing the object is sufficient. If I add the field in a side table, that does not affect your use of the object. I could do the same thing with aWeakMap.set(obj, value)."

To answer David-Sarah's question, my goal in preventing "adding" fields to a frozen object depends on the syntax used to add those soft fields in exactly this way: aWeakMap.set(obj, value) is no problem, but (per Dave's words) obj[aSoftField] = value succeeding in spite of obj being frozen is a problem, conceptually for teachers, learners, documenters, hackers -- everyone building a mental model of "objects and properties" including freezing.

Hi Brendan, thanks for this. For the first time I understand why some find it desirable to not be able to extend a frozen object, even safely with a side table semantics. I think the issue you raise here can be addressed by re-explaining "non-extensibility" as suppressing the addition of public properties. I do not know how satisfying you would find this shift, but it makes sense to me. I would also appreciate reaction from others, thanks.

It seems you agree enough to be exploring @ instead of ., which could desugar to transposed .get or .set. So perhaps more new syntax will help, rather than less new syntax and too much overloading of old.

Rather than more or less, I was suggesting different. I would hate to see @ added to support soft fields in addition to "private" and/or "#" added to support names. That exceeds my sense of the syntax budget we should be willing to pay. But if it helps brainstorming not to constrain this budget early, let's continue to try all syntax proposals on both semantics and see what the pros, cons, and non-orthogonalities are. We can winnow later if you like, but please no later than May.

# David-Sarah Hopwood (14 years ago)

On 2010-12-23 13:53, Kevin Smith wrote:

If I might ask a side-question: what's the value in making an object non-extensible in ES5? I understand the value of making properties non-configurable or non-writable, but I don't yet see a reason to prevent extensions.

Suppose that the object inherits properties from a parent on the prototype chain. Then "extending" the object could override those properties, even if they are non-configurable or non-writable on the parent. So making an object non-extensible is necessary in order to make inherited properties effectively non-configurable and/or non-writable.
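A sketch of the shadowing hazard described above, using plain ES5 reflection (the property names are illustrative):

```javascript
"use strict";
const parent = {};
Object.defineProperty(parent, "version", {
  value: 1, writable: false, configurable: false
});

// An extensible child can still shadow the inherited non-writable property:
const child = Object.create(parent);
Object.defineProperty(child, "version", { value: 2 });
console.log(child.version); // 2

// A non-extensible child cannot acquire the shadowing own property:
const locked = Object.create(parent);
Object.preventExtensions(locked);
let threw = false;
try {
  Object.defineProperty(locked, "version", { value: 2 });
} catch (e) { threw = true; }
console.log(threw, locked.version); // true 1
```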

# Brendan Eich (14 years ago)

On Dec 23, 2010, at 10:17 AM, Mark S. Miller wrote:

It seems you agree enough to be exploring @ instead of ., which could desugar to transposed .get or .set. So perhaps more new syntax will help, rather than less new syntax and too much overloading of old.

Rather than more or less, I was suggesting different.

More + new = different, but it's also more -- adding @ in addition to dot, or @ as sigil usable after dot and left-square-bracket. We're not taking away syntax, so the budget ceiling must rise just for @.

I would hate to see @ added to support soft fields in addition to "private" and/or "#" added to support names.

I agree, but I'm content to let soft fields and other weak map libraries get enough usage to warrant new syntax. The frozen AST use-case can use .get and .set explicitly in the interim. That's why I wrote that only private names (as currently proposed) is burdened (or blessed, or both) with new syntax.

That exceeds my sense of the syntax budget we should be willing to pay. But if it helps brainstorming not to constrain this budget early, let's continue to try all syntax proposals on both semantics and see what the pros, cons, and non-orthogonalities are.

As I wrote to David-Sarah, I'm now convinced we should not try mapping syntax strawmen to both semantics. We don't even have agreement on the syntax requirements, on #.id to get private names into runtime expressions, reflection, and proxies.

Plus, we have no user testbed (yet -- Narcissus is being beefed up to prototype Harmony proposals and it can run in-browser via Zaphod -- more on this in a bit).

Given weak maps in harmony, and lack of experience using them enough to motivate syntactic sugar, I'm not in favor of adding syntax for soft fields -- yet. Private names as proposed come with syntax as an essential part of the proposal.

After all this discussion it is clear to me that we should not compare apples to oranges or prematurely standardize only one kind of fruit. We're likely to end up with a Meyer lemon by mistake.

# Mark S. Miller (14 years ago)

On Thu, Dec 23, 2010 at 11:49 AM, Brendan Eich <brendan at mozilla.com> wrote:

On Dec 23, 2010, at 10:17 AM, Mark S. Miller wrote:

It seems you agree enough to be exploring @ instead of ., which could desugar to transposed .get or .set. So perhaps more new syntax will help, rather than less new syntax and too much overloading of old.

Rather than more or less, I was suggesting different.

More + new = different, but it's also more -- adding @ in addition to dot, or @ as sigil usable after dot and left-square-bracket. We're not taking away syntax, so the budget ceiling must rise just for @.

I would hate to see @ added to support soft fields in addition to "private" and/or "#" added to support names.

I agree, but I'm content to let soft fields and other weak map libraries get enough usage to warrant new syntax. The frozen AST use-case can use .get and .set explicitly in the interim. That's why I wrote that only private names (as currently proposed) is burdened (or blessed, or both) with new syntax.

That exceeds my sense of the syntax budget we should be willing to pay. But if it helps brainstorming not to constrain this budget early, let's continue to try all syntax proposals on both semantics and see what the pros, cons, and non-orthogonalities are.

As I wrote to David-Sarah, I'm now convinced we should not try mapping syntax strawmen to both semantics. We don't even have agreement on the syntax requirements, on #.id to get private names into runtime expressions, reflection, and proxies.

Plus, we have no user testbed (yet -- Narcissus is being beefed up to prototype Harmony proposals and it can run in-browser via Zaphod -- more on this in a bit).

GIven weak maps in harmony, and lack of experience using them enough to motivate syntactic sugar, I'm not in favor of adding syntax for soft fields -- yet. Private names as proposed come with syntax as an essential part of the proposal.

After all this discussion it is clear to me that we should not compare apples to oranges

You've said this "apples to oranges" thing many times. I just don't get it. My comparisons at <strawman:names_vs_soft_fields> show that these two semantics address extremely overlapping use cases. For both to be in the language, with one group (including myself) saying "use soft fields for these use cases" and another group saying the opposite, is to create conflicting conventions and the horrors of Perl's TIMTOWTDI philosophy.

Do you agree at least that for the use case shown by the <strawman:private_names#conflict-free_object_extension_using_private_names> clone example, we should all recommend soft fields, so that these extensions will not needlessly break when they encounter frozen prototypes?

or prematurely standardize only one kind of fruit.

I don't get this either. Certainly, both are equally premature at this point. Are you saying that neither should be in ES6?

And please let's also agree not to prematurely standardize both kinds of fruit.

We're likely to end up with a Meyer lemon by mistake.

I will try to resist the temptation to expand on these colorful metaphors ;).

# thaddee yann tyl (14 years ago)

One last syntax proposal, since I agree with Brendan Eich about the Perlishness of the number sign. "private ::secret;" :

a = {};
k = {a: a};
a['k'] = k;
function aa(o) {
  private ::a;
  k::a = o;    // or: private c.a; c.a = o;
  a.a = a.k.a; // or: a['a'] = a['k']['a'];
  a.a = k::a;  // here, on the other hand, k.a is the private stuff.
  return ::a;
}
let a2 = aa(a);
print( a[a2] === a ); // true

We should still be able to use "obj.secret" as a way to access properties, even in the scope of "private ::secret;", and the syntax seems vaguely familiar to C++ and Ruby programmers (Ruby's syntax for symbols uses only ":", but we can't use it here, to avoid confusion with labels).


# Brendan Eich (14 years ago)

On Dec 23, 2010, at 12:11 PM, Mark S. Miller wrote:

You've said this "apples to oranges" thing many times. I just don't get it.

You've read the recent messages where it became clear only [], not the . operator, was ever mooted for soft fields on the wiki. And how [] can't be a local transformation, as private lexical bindings only on the right of . or the left of : in an initialiser can be. And how private names are intended for weak encapsulation use-cases, as Allen wrote explicitly in the private_names strawman, including being able to find private names via Object.getOwnPropertyNames, e.g., to implement a clone method (so they can't possibly be implemented as soft fields).

Soft fields were never equivalent to private names, or substitutable for them, even in the old strawman:names days. Anyway, strawman:private_names supersedes names, and it is more clearly an orange to the soft field apple.

My comparisons at strawman:names_vs_soft_fields show that these two semantics address extremely overlapping use cases.

Not really. The frozen object extension use-case is a good one for abstractions based on weak maps. I've said that just in the previous reply but I don't see the need for syntax this soon, and (to repeat) it ought to look different from property access syntax.

The "private x" declaration has no analogue in soft fields and it addresses use-cases that are inherently about syntax. Love those use-cases or hate them, they are not served by soft fields.

The reflection, etc. of private names as values also has no analogue in soft fields but does arise in use cases including object inspectors, clone methods, etc.

For both to be in the language, with one group (including myself) saying "use soft fields for these use cases" and another group saying the opposite, is to create conflicting conventions and the horrors of Perl's TIMTOWTDI philosophy.

The use-cases are different and soft fields don't obviously need syntax -- and shouldn't have syntax that looks like property access. Private names are intentionally extending property access syntax because they are a new kind of property name, not a side table. These are different things.

Do you agree at least that for the use case shown by the strawman:private_names#conflict-free_object_extension_using_private_names clone example, we should all recommend soft fields, so that these extensions will not needlessly break when they encounter frozen prototypes?

I'm skeptical we'll ever have frozen prototypes, but anyway, people can use weak maps if they need to, once weak maps are implemented.

But for extending shared objects, as Prototype does? Unlikely. It'll keep doing what it has done. If it could use private names, though, while it wouldn't work with frozen prototypes, it also wouldn't need a bunch of invasive source code changes to call .get/.set or use @. A few well-placed private declarations in an outer module scope, some concatenative programming of Prototype and its client script, and away go the conflict risks.

This is all speculative, until we get weak maps out there, so it's not important to agree too far in advance.

or prematurely standardize only one kind of fruit.

I don't get this either. Certainly, both are equally premature at this point. Are you saying that neither should be in ES6?

Weak maps are harmonious. I'm saying we should not shotgun-marry some untested syntax for soft fields without lots of users clamoring for syntax instead of .get and .set.

Private names need more time to be developed and user-tested, in any scenario. It's not important to triage them out of ES6 or into ES7 at this point. They're a strawman proposal; they will be implemented in a testable setting.

And please let's also agree not to prematurely standardize both kinds of fruit.

Weak maps are already harmonious. Do you mean syntax for soft fields?

# David Herman (14 years ago)

You've said this "apples to oranges" thing many times. I just don't get it. My comparisons at strawman:names_vs_soft_fields show that these two semantics address extremely overlapping use cases. For both to be in the language, with one group (including myself) saying "use soft fields for these use cases" and another group saying the opposite, is to create conflicting conventions and the horrors of Perl's TIMTOWTDI philosophy.

Respectfully, I disagree pretty emphatically with this. JavaScript has always been a hybrid-paradigm language, and it is used to good effect in many different styles. I think it's problematic for TC39 to assume too didactic a role. We should support paradigms that can be used effectively, especially if they've been proven in practice, but we should not mandate any one particular style. Language design of course has unavoidable effects on programmer style, and this shouldn't be ignored; but this power shouldn't be taken as free license to impose discipline.

As I see it, I actually have different scenarios where I'd decide to use either weak maps or private names. For example, in Narcissus, we create proxies for the VM that hide information from the user program, and we really want them to be airtight -- in this case, I'd prefer to use a weak map to store the private data (currently, the implementation is leaky). On the flip side, when writing cooperative/collaborative code, I'd often prefer the private names approach, because the model of internal-but-hidden properties is simpler and lighter-weight. In these cases, I am not as concerned about the airtightness of the abstraction, and indeed I find some of the reflective use cases we've discussed (e.g., user-land introspection tools, general-purpose cloning operations) pretty compelling.

And ultimately, I just don't buy that TMTOWTDI is a horror. In fact, in the JS community, this aspect of the language is celebrated. We have a lively, vibrant community of programmers that experiments with idioms and shares them liberally. This is a success story of open source software!

Do you agree at least that for the use case shown by the strawman:private_names#conflict-free_object_extension_using_private_names clone example, we should all recommend soft fields, so that these extensions will not needlessly break when they encounter frozen prototypes?

IIUC, I'm afraid I don't agree. I think there will continue to be many programming scenarios where freezing is uncommon. The page says "best practice in many environments will be to freeze the primordials early." Of course, "many" is a flexible word, but it's a big bet that this will become accepted best practice in general. I'm not very good at betting on the future of the web (though I have plenty of experience suggesting that no one should listen to my predictions).

or prematurely standardize only one kind of fruit.

I don't get this either. Certainly, both are equally premature at this point. Are you saying that neither should be in ES6?

And please let's also agree not to prematurely standardize both kinds of fruit.

Nobody's trying to rush both to standardization. But weak maps are harmonious, and soft fields are a very lightweight abstraction on top -- even if SoftField weren't in the standard, programmers would have the ability to use weak maps as soft fields. Purely as a library, I'm skeptical of the value of standardizing SoftField, but I'm open to exploring syntactic conveniences for them. I don't, however, think that they are a good fit for the existing object syntaxes (bracket, dot, colon-in-literals), for reasons I've stated before: the impedance mismatch between object property tables vs. side tables.
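
The "lightweight abstraction on top of weak maps" point can be sketched in a few lines. This is only an illustrative shape, assuming a get/set/has API; the SoftField name and its signatures are not part of any standardized design:

```javascript
// A minimal sketch of soft fields as a library over weak maps.
// "SoftField" and its get/set/has API are illustrative assumptions.
class SoftField {
  constructor() {
    this.table = new WeakMap(); // side table keyed by object identity
  }
  get(obj) { return this.table.get(obj); }
  set(obj, value) { this.table.set(obj, value); }
  has(obj) { return this.table.has(obj); }
}

const secretField = new SoftField();
const point = { x: 1, y: 2 };
secretField.set(point, "hidden");

// The association lives in the side table, not on the object itself,
// so ordinary reflection on the object does not reveal it.
console.log(secretField.get(point));            // "hidden"
console.log(Object.getOwnPropertyNames(point)); // ["x", "y"]
```

Note the impedance mismatch Dave mentions: access goes through explicit .get/.set calls, not through property syntax.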

Separately from that, I believe that private names are a nice, conservative extension of the existing object model that allows for convenient and lightweight encapsulation.

All we've asked is that we not assume prima facie that we must pick a winner and stop all work on the other. That said, I don't think we should do much design work on the list or in committee meetings. The "champions" model has worked well (for example, for the proxies spec). I think Allen and others should continue working on private names, and Mark and others should continue working on soft fields. This conversation has raised helpful feedback and ideas, so now it's time for people to go back to the drawing board and do some more independent design work.

# David-Sarah Hopwood (14 years ago)

On 2010-12-23 21:02, Brendan Eich wrote:

On Dec 23, 2010, at 12:11 PM, Mark S. Miller wrote:

You've said this "apples to oranges" thing many times. I just don't get it.

You've read the recent messages where it became clear only [], not the . operator, was ever mooted for soft fields on the wiki.

That's false; the examples at strawman:names_vs_soft_fields show otherwise.

And how [] can't be a local transformation, [...]

Indeed it can't, but I don't see the relevance of that to the '"apples to oranges" thing'. We don't know whether [] will be changed at all. (In the proposal to add a @ or .# operator, it isn't.)

# Allen Wirfs-Brock (14 years ago)

On Dec 23, 2010, at 12:11 PM, Mark S. Miller wrote:

You've said this "apples to oranges" thing many times. I just don't get it. My comparisons at strawman:names_vs_soft_fields show that these two semantics address extremely overlapping use cases. For both to be in the language, with one group (including myself) saying "use soft fields for these use cases" and another group saying the opposite, is to create conflicting conventions and the horrors of Perl's TIMTOWTDI philosophy.

Do you agree at least that for the use case shown by the strawman:private_names#conflict-free_object_extension_using_private_names clone example, we should all recommend soft fields, so that these extensions will not needlessly break when they encounter frozen prototypes?

I'll echo Brendan's and D. Herman's no. But let me take my own crack at trying to explain what I think the difference is.

I believe that your "camp" wants to think of soft fields, stored in a side-table, as extensions of an object. My "camp" thinks of such side-tables as a means of recording information about an object without actually extending the object.

Object oriented programming is largely about identifying abstractions and providing their implementations. Each abstraction has a set of public characteristics. The naming of an abstraction allows us to intellectually chunk a set of characteristics as a single named concept. This makes it easier to conceptualize complex systems. One of the possible characteristics of such abstractions is whether or not the implementation of the abstraction (the class or object, depending upon the language) may be extended or modified. In ES5, I think of the [[Extensible]] internal property as the mechanism for implementing this characteristic.

There are situations where it is convenient to extend an object (and hence the abstraction it implements). There are also situations where it is necessary to record information about an object without extending the object or the abstraction. And there are situations where either approach might reasonably be used.

The clone example in my proposal explicitly addresses how you might use "private names" to extend an extensible object in a way that avoids naming conflicts. It does not apply to frozen objects because such objects are simply not extensible. A similar example would be adding the "array extra" methods to Array.prototype in an ES3 implementation that also included the [[Extensible]] internal property. As long as [[Extensible]] was true you could add those methods. But if [[Extensible]] is false you can't. It isn't simply a matter of finding a different way to extend the Array.prototype abstraction (perhaps adding analogous functions to the Array constructor). Array.prototype simply cannot be extended if [[Extensible]] is false. Whatever you do, you will not be extending the array abstraction. Instead you will be creating a new parallel mechanism that operates in conjunction with it. Extending Array.prototype with a map function lets you integrate map into the array abstraction; you don't have to think about it independently. If Array.prototype is frozen you can't do this. You have to think about an independent map function.
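
The [[Extensible]] behavior described here can be observed directly in any ES5 engine. A minimal sketch (the proto object and its method names are made up for illustration):

```javascript
// While an object is extensible, we can extend the abstraction in place.
const proto = {};
proto.map = function () { return "mapped"; }; // ok: proto is extensible

// Flipping [[Extensible]] to false forecloses further extension.
Object.preventExtensions(proto);

let rejected = false;
try {
  // Defining a new property on a non-extensible object throws TypeError.
  Object.defineProperty(proto, "filter", { value: function () {} });
} catch (e) {
  rejected = true;
}
// The earlier extension survives; the later one was refused.
console.log(proto.map(), rejected); // "mapped" true
```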

Whether or not to support an extensible abstraction is a design decision for the original creator of the abstraction. The designer of an AST framework might make its node abstraction extensible so that clients could directly decorate the AST. That decision becomes part of the abstraction. Given that decision, a client of the framework might choose to extend the node abstraction, or it might choose to create new abstractions (perhaps via side tables) to record information about specific nodes. If the framework designer chose to make the node abstraction non-extensible (for reasons of parallelism or whatever), the client has no choice in the matter. They must accomplish their goal without extending the node abstraction.

Weak map based side tables are great tools for associating supplemental information with an object. But that information is not part of the abstraction represented by the mapped objects; it is part of some new abstraction. By treating such weak maps as "soft fields" you seem to be trying to merge the two abstractions in a manner that imposes inconsistencies. Specifically, the merged abstraction is both frozen and extensible, yet frozen is the JavaScript object level manifestation of a non-extensible abstraction.

I apologize if this explanation is still unclear. I think we are indeed conceptualizing things quite differently, which is why you see an isomorphism where I see a dichotomy. Hopefully, we can continue to try to develop a shared understanding.

# David Herman (14 years ago)

On Dec 23, 2010, at 4:27 PM, David-Sarah Hopwood wrote:

We don't know whether [] will be changed at all. (In the proposal to add a @ or .# operator, it isn't.)

Hm, this looks like a pretty serious misunderstanding of the private names proposal. In every variant of the proposal, the object model is changed so that private name values are allowable property keys. This means that in every variant of the private names proposal, [] can't be defined via a local transformation. This has nothing to do with the @ or .# operators. For example:

var x = gensym(); // x is a new name value
... obj[x] ...    // the property of obj with private name x

In other words, the core of the private names proposal -- generalizing object property lookup to allow name values in addition to strings -- is a semantic change, not a syntactic one. (IOW, if you wanted to dress it up to look like a syntactic change, you would have to do a global transformation of [] expressions -- in short, you'd have to write a compiler.)

This is a central part of the private names proposal. It can't be eliminated from the proposal. So if you want to compare apples to apples, you have to add this feature to the soft fields proposal. As I've said, when you make this apples-to-apples comparison, I believe the soft fields version is problematic because it makes side-table lookup appear as though it's property lookup.
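
(With hindsight, ES6 Symbols realized a weaker form of this idea, and they make the point above concrete. The analogy is imperfect: Symbols differ in detail from the 2010 proposal, and this sketch is an illustration, not the proposal itself.)

```javascript
// Symbols stand in here for private name values: first-class,
// non-string property keys, like the result of gensym() above.
const x = Symbol("x");
const obj = {};
obj[x] = 42;         // same [] syntax, non-string key
obj["x"] = "public"; // a distinct, string-keyed property

// [] cannot be rewritten locally into side-table lookups, because at
// the call site nothing says whether the key is a string or a name.
console.log(obj[x], obj.x); // 42 "public"

// A side table, by contrast, is reached only through explicit calls,
// never through property syntax:
const side = new WeakMap();
side.set(obj, "hidden");
console.log(side.get(obj)); // "hidden"
```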

I think I've made this point several times now; it'd be great to get some sort of reply.

# David-Sarah Hopwood (14 years ago)

On 2010-12-23 23:51, Allen Wirfs-Brock wrote:

I believe that your "camp" wants to think of soft fields, stored in a side-table, as extensions of an object. My "camp" thinks of such side-tables as a means of recording information about an object without actually extending the object.

These are obviously alternative views of the same thing -- as MarkM and I have made clear throughout. It really doesn't matter whether you view the object as having been "extended" or not, if that is semantically unobservable.

(And I don't like people trying to tell me what "camp" I'm in, thank you.)

# Oliver Hunt (14 years ago)

As a question how do soft fields/private names interact with an object that has had preventExtensions called on it?

Are they entirely independent of normal property rules?

# David-Sarah Hopwood (14 years ago)

On 2010-12-23 23:55, David Herman wrote:

On Dec 23, 2010, at 4:27 PM, David-Sarah Hopwood wrote:

We don't know whether [] will be changed at all. (In the proposal to add a @ or .# operator, it isn't.)

Hm, this looks like a pretty serious misunderstanding of the private names proposal.

I was not referring to the private names proposal, but to the more recent suggestions from various people to add a @ or .# operator instead of changing []. (I should not have referred to those suggestions as a proposal. Careless editing, sorry.)

In every variant of the proposal, the object model is changed so that private name values are allowable property keys. This means that in every variant of the private names proposal, [] can't be defined via a local transformation. This has nothing to do with the @ or .# operators.

Changes to [] are not needed if @ or .# is added (or if [# ] is added).

# David Herman (14 years ago)

On Dec 23, 2010, at 5:03 PM, David-Sarah Hopwood wrote:

On 2010-12-23 23:55, David Herman wrote:

On Dec 23, 2010, at 4:27 PM, David-Sarah Hopwood wrote:

We don't know whether [] will be changed at all. (In the proposal to add a @ or .# operator, it isn't.)

Hm, this looks like a pretty serious misunderstanding of the private names proposal.

I was not referring to the private names proposal, but to the more recent suggestions from various people to add a @ or .# operator instead of changing []. (I should not have referred to those suggestions as a proposal. Careless editing, sorry.)

a) I don't recall seeing people suggesting adding a .# operator instead of changing '[]', but rather instead of changing '.'. To wit, the difference is between:

private #x;
... obj.#x ...

and

private x;
... obj.x ...

In both versions, it's also possible to do:

var x = gensym();
... obj[x] ...

But this is irrelevant, since:

b) You're shifting the terms of the debate anyway. You can't decide for yourself what you want others to propose so you can argue with your favorite strawman. All along, Allen, Brendan, and I have been talking about a proposal wherein private names are first-class values that are usable as property names. This is not separable from the proposal.

# David-Sarah Hopwood (14 years ago)

On 2010-12-24 00:02, Oliver Hunt wrote:

As a question how do soft fields/private names interact with an object that has had preventExtensions called on it?

For soft fields: there is no interaction, a new soft field can be added to an object on which preventExtensions has been called.

For private names: new names are prevented from being added.

This is a useful feature of soft fields. There is no loss of security or encapsulation as a result, for the same reason that there is none for adding a soft field to a frozen object. (Freezing is equivalent to preventing extensions, marking all properties as non-configurable, and marking all data properties as non-writable.)
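
A quick sketch of the difference described above, assuming weak maps as the soft-field store (the annotations table and node object are hypothetical):

```javascript
// A soft field can still be associated with a frozen (hence
// non-extensible) object, because the object itself is never touched.
const annotations = new WeakMap(); // hypothetical soft-field side table
const node = Object.freeze({ kind: "Literal", value: 1 });

annotations.set(node, { visited: true }); // succeeds

// Adding an ordinary property -- and, per the proposal, a
// private-named one -- is refused once the object is non-extensible.
let refused = false;
try {
  Object.defineProperty(node, "extra", { value: 1 }); // throws TypeError
} catch (e) {
  refused = true;
}
console.log(annotations.get(node).visited, refused); // true true
```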

# Allen Wirfs-Brock (14 years ago)

I just spent significant time trying to clarify why it does matter, at least to some of us. In addition, I started with a quote from MarkM concerning an observable semantic difference.

Finally, I don't recall mentioning you in any way, nor directing that message to you other than as a cc via the es-discuss list.

# Brendan Eich (14 years ago)

On Dec 23, 2010, at 3:27 PM, David-Sarah Hopwood wrote:

On 2010-12-23 21:02, Brendan Eich wrote:

On Dec 23, 2010, at 12:11 PM, Mark S. Miller wrote:

You've said this "apples to oranges" thing many times. I just don't get it.

You've read the recent messages where it became clear only [], not the . operator, was ever mooted for soft fields on the wiki.

That's false; the examples at strawman:names_vs_soft_fields show otherwise.

You're right, I missed that. Thanks for pointing it out, but brace yourself for some push-back.

The longstanding wiki page (created 08/14) that I was referring to is:

strawman:inherited_explicit_soft_fields#can_we_subsume_names

The one you cite is a recent clone (started 12/12) of Allen's examples, with translations where possible to soft fields.

Since the new page is a clone of Allen's private_names strawman, of course it clones the "private x" examples and shows . and :-in-literal being used.

It's not clear how this new page helps eliminate private_names as a proposal. No one has denied that soft fields can do some of what private names do, but not all, and with observable differences for #.id, reflection, proxies, implementor-friendliness, etc. as we have been discussing.

Note in particular the place in this new page where Mark does not create a polymorphic property access function:

"Enabling the use of [ ] with soft fields requires the kludge explained at can we subsume names. If we can agree that this polymorphism does not need such direct support, ...."

We cannot agree, that's the point! Orange, not apple, wanted and proposed by private names. Polymorphism wanted. Difference!

So even with . as well as [] thanks to this recent page, we still have observable let's say encapsulation differences between the proposals. These are more important than the [] issue, and were more important before 12/12 when my statement you fault as mistaken was in fact correct (going back to Allen's private_names on 12/08 and in most respects going all the way back to the creation of strawman:names) and did count for something in making names / private_names an orange, not a soft-field-implementable apple.

Moreover, since you are citing a recently added page, and (below) also adducing mere es-discuss sketching of novelties such as @ as somehow moving the proposals forward, even though @ has not yet been proposed in the wiki, I argue that fair play requires you to keep current in all respects: we proponents of both weak maps etc. and private names have argued recently that soft fields should not have syntax looking anything like property access.

Shifting the terms of the debate mid-conversation (across recent weeks, with new pages alongside older ones, and new messages in the list) cuts both ways.

Our rejection of property syntax for soft fields makes this whole "map one (subset, in the case of private names) syntax to two (subset, in the case of private names) semantics" argument obsolete, at least when it comes to property access syntax. So, can we move past this?

And how [] can't be a local transformation, [...]

Indeed it can't, but I don't see the relevance of that to the '"apples to oranges" thing'. We don't know whether [] will be changed at all. (In the proposal to add a @ or .# operator, it isn't.)

Way to change the terms of the debate! The wiki proposals put in a bogus elimination contest are the comparable goods here, not some new twist on the mailing list thread.

Specifically, private names does not map a[b] to b.get(a), and no names proposal ever did any such thing. Yet such a transposed .get or .set mapping is observable (by my reading of the several wiki pages Mark has written; please correct me if I'm wrong) in the soft fields proposals, especially the new one you cite which was cloned recently from private_names (if it is indeed a serious syntax for soft fields proposal -- is it? I can't tell).

The proposal to add @ or .# is not anything "ever mooted on the wiki" (unless MarkM just created a new page! He can move fast for a frosty kind of Dr. Freeze villain :-P). It is fine discussion fodder here, but it is not relevant to the apples-to-oranges contest being proposed between soft fields and names on the wiki, for many months now.

Dave recently repeated the plea for de-escalating the elimination contest pitting soft fields against private names. That is especially important in light of the moving targets on the wiki and here in the list. Otherwise we'll have various parties reacting to stale information, confusion over what is mooted vs. seriously proposed, and other such confusion. We've already had too much of this.

# David-Sarah Hopwood (14 years ago)

On 2010-12-24 00:11, David Herman wrote:

On Dec 23, 2010, at 5:03 PM, David-Sarah Hopwood wrote:

On 2010-12-23 23:55, David Herman wrote:

On Dec 23, 2010, at 4:27 PM, David-Sarah Hopwood wrote:

We don't know whether [] will be changed at all. (In the proposal to add a @ or .# operator, it isn't.)

Hm, this looks like a pretty serious misunderstanding of the private names proposal.

I was not referring to the private names proposal, but to the more recent suggestions from various people to add a @ or .# operator instead of changing []. (I should not have referred to those suggestions as a proposal. Careless editing, sorry.)

a) I don't recall seeing people suggesting adding a .# operator instead of changing '[]', but rather instead of changing '.'.

Lasse Reichstein did so:

Mark Miller wrote:

Currently in JS, x['foo'] and x.foo are precisely identical in all contexts. This regularity helps understandability. The terseness difference above is not an adequate reason to sacrifice it.

Agree. I would prefer something like x.#foo to make it obvious that it's not the same as x.foo (also so you can write both in the same scope), and use "var bar = #foo /* or just foo */; x[#bar]" for computed private name lookup. I.e. effectively introducing ".#", "[#" as alternatives to just "." or "[".

MarkM responded with a similar proposal, using a single operator:

The basic idea is, since we're considering a sigil anyway, and since .# and [# would both treat the thing to their right as something to be evaluated, why not turn the sigil into an infix operator instead? Then it can be used as "."-like or "[]"-like without extra notation or being too closely confused with "." or "[]" themselves. [...]

b) You're shifting the terms of the debate anyway. You can't decide for yourself what you want others to propose so you can argue with your favorite strawman.

As shown above, I haven't.

# David-Sarah Hopwood (14 years ago)

On 2010-12-24 00:39, Brendan Eich wrote:

On Dec 23, 2010, at 3:27 PM, David-Sarah Hopwood wrote:

On 2010-12-23 21:02, Brendan Eich wrote:

On Dec 23, 2010, at 12:11 PM, Mark S. Miller wrote:

You've said this "apples to oranges" thing many times. I just don't get it.

You've read the recent messages where it became clear only [], not the . operator, was ever mooted for soft fields on the wiki.

That's false; the examples at strawman:names_vs_soft_fields show otherwise.

You're right, I missed that. Thanks for pointing it out, but brace yourself for some push-back.

The longstanding wiki page (created 08/14) that I was referring to is:

strawman:inherited_explicit_soft_fields#can_we_subsume_names

The one you cite is a recent clone (started 12/12) of Allen's examples, with translations where possible to soft fields.

Since the new page is a clone of Allen's private_names strawman, of course it clones the "private x" examples and shows . and :-in-literal being used.

It's not clear how this new page helps eliminate private_names as a proposal.

What it does is adapt the private_names syntax to inherited explicit soft fields, exactly as it claims to do. That removes a lot (not all, since some is associated with the syntax) of the specification complexity from that proposal. Because of the soft field semantics, the resulting mechanism provides strong rather than weak encapsulation.

Note in particular the place in this new page where Mark does not create a polymorphic property access function:

"Enabling the use of [ ] with soft fields requires the kludge explained at can we subsume names. If we can agree that this polymorphism does not need such direct support, ...."

We cannot agree, that's the point! Orange, not apple, wanted and proposed by private names. Polymorphism wanted. Difference!

It is not "comparing apples and oranges" to suggest that a specific subfeature might not be worth its complexity. The phrase "comparing apples and oranges" specifically refers to comparing things that are so different as to be incomparable.

Note that the polymorphism referred to (being able to look up either a private name or a string property) is also achieved by the @ or .# operator approach, but without losing the x["id"] ≡ x.id equivalence, and while being more explicit that this is a new kind of lookup.

So even with . as well as [] thanks to this recent page, we still have observable let's say encapsulation differences between the proposals.

Of course, that's the main reason why I favour the soft fields semantics, because it provides strong encapsulation.

Moreover, since you are citing a recently added page, and (below) also adducing mere es-discuss sketching of novelties such as @ as somehow moving the proposals forward, even though @ has not yet been proposed in the wiki, I argue that fair play requires you to keep current in all respects: we proponents of both weak maps etc. and private names have argued recently that soft fields should not have syntax looking anything like property access.

Yes, I know. I don't know why you are determined to paint me as having some kind of ideological dispute with the proponents of private names, as opposed to merely having strong technical objections to that proposal.

Shifting the terms of the debate mid-conversation (across recent weeks, with new pages alongside older ones, and new messages in the list) cuts both ways.

Our rejection of property syntax for soft fields makes this whole "map one (subset, in the case of private names) syntax to two (subset, in the case of private names) semantics" argument obsolete, at least when it comes to property access syntax. So, can we move past this?

Yes, please! (I barely have any idea what you're talking about when you refer to "shifting the terms of the debate". Isn't that just adapting to the current context of discussion?)

# Mark S. Miller (14 years ago)

On Thu, Dec 23, 2010 at 1:06 PM, David Herman <dherman at mozilla.com> wrote:

All we've asked is that we not assume prima facie that we must pick a winner and stop all work on the other. That said, I don't think we should do much design work on the list or in committee meetings. The "champions" model has worked well (for example, for the proxies spec). I think Allen and others should continue working on private names, and Mark and others should continue working on soft fields. This conversation has raised helpful feedback and ideas, so now it's time for people to go back to the drawing board and do some more independent design work.

+1.

I feel like we've made important progress on this thread: We broke through an impasse of mutual inability to understand each other, are now in a position of a fair degree of mutual understanding, and at a remaining impasse only at making progress from understanding towards agreement. I have had some good aha's in getting here, and I hope others have too, but now I feel like we're arguing about the nature of our argument rather than the subject matter. I do not feel I am learning anything new. I think reverting to off-list design work before another round of on-list discussion is a fine thing, and I do like the champion model. So I fully endorse your paragraph above.

That said, once we do resume these on-list or in-meeting discussions, I see much right and nothing wrong with comparing the proposals and seeing how much of the use-case ground we actually care about we can cover with how little mechanism. Questions of the form "If A can cover this subset of the use cases motivating B, do we need B?" are perfectly legitimate. Indeed, asking such questions vigorously is our only hope of avoiding a kitchen sink language. We have seen the usability of other languages be ruined by undisciplined growth.

That does not mean that we need to ask these questions so early as to suppress exploration and brainstorming. But we are the gatekeepers between "strawman" and "proposal". We need to ask these questions before admitting designs across this threshold.

# Brendan Eich (14 years ago)

On Dec 23, 2010, at 5:17 PM, David-Sarah Hopwood wrote:

On 2010-12-24 00:39, Brendan Eich wrote:

Since the new page is a clone of Allen's private_names strawman, of course it clones the "private x" examples and shows . and :-in-literal being used.

It's not clear how this new page helps eliminate private_names as a proposal.

What it does is adapt the private_names syntax to inherited explicit soft fields, exactly as it claims to do. That removes a lot (not all, since some is associated with the syntax) of the specification complexity from that proposal.

We don't agree on specification complexity.

Because of the soft field semantics, the resulting mechanism provides strong rather than weak encapsulation.

We don't agree on strong always winning.

Therefore "it's not clear how this new page helps eliminate private_names."

If you and MarkM were the dynamic duo of TC39, or 2/3rds of a troika, it would matter. But that's not how the committee cookie crumbles. Also, it's not obvious from all the comments on es-discuss that everyone is on board for strong encapsulation absolutism (and I do mean that, not as an insult).

Really, there's a deeper disagreement here than over syntax mapped to a subset of semantics.

Note in particular the place in this new page where Mark does not create a polymorphic property access function:

"Enabling the use of [ ] with soft fields requires the kludge explained at can we subsume names. If we can agree that this polymorphism does not need such direct support, ...."

We cannot agree, that's the point! Orange, not apple, wanted and proposed by private names. Polymorphism wanted. Difference!

It is not "comparing apples and oranges" to suggest that a specific subfeature might not be worth its complexity. The phrase "comparing apples and oranges" specifically refers to comparing things that are so different as to be incomparable.

We seem to have much DNA in common with other critters. So too with apples and oranges. The point stands: soft fields and private names are not equivalent, observationally and otherwise.

Note that the polymorphism referred to (being able to look up either a private name or a string property) is also achieved by the @ or .# operator approach,

Irrelevant on TC39's January agenda, which is where the elimination-contest was aimed.

but without losing the x["id"] ≡ x.id equivalence, and while being more explicit that this is a new kind of lookup.

Great, the debate continues in es-discuss.
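For readers following along, the invariant David-Sarah wants preserved is easy to state concretely. The `@` and `.#` forms below are hypothetical syntax from the thread, not valid JavaScript; only the equivalence check runs:

```javascript
// Today's invariant: for any string key, bracket access and dot access
// are interchangeable.
const x = { id: 42 };
x["id"] === x.id;   // true

// Under the private-names strawman, a `private id;` declaration in scope
// would rebind x.id (but not x["id"]) to a private name, silently
// breaking this equivalence. A distinct operator, e.g. x@id or x.#id
// (hypothetical syntax), would leave x.id and x["id"] untouched and
// make the new kind of lookup visually explicit.
```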

BTW, sigils suck (Allen's refactoring point).

This settles nothing, but I agree with Dave et al. on the champions model trumping design-by-committee or design-by-mailing-list.

So even with . as well as [] thanks to this recent page, we still have observable let's say encapsulation differences between the proposals.

Of course, that's the main reason why I favour the soft fields semantics, because it provides strong encapsulation.

Guess what? It's not all about you. :-|
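The "strong encapsulation" claim has a concrete shape. A soft field, modeled below with a WeakMap held in a closure (a sketch of the idea, not the strawman's exact mechanism), is unreachable without the map itself and invisible to reflection on the object:

```javascript
// Sketch: a soft field as a closure-held WeakMap. Nothing short of
// possessing `count` can read or write the field.
const Counter = (() => {
  const count = new WeakMap();   // the "soft field"
  return class Counter {
    constructor() { count.set(this, 0); }
    increment()   { count.set(this, count.get(this) + 1); }
    value()       { return count.get(this); }
  };
})();

const c = new Counter();
c.increment();
c.value();                            // 1

// The field never appears on the object, so reflection cannot find it:
Object.keys(c).length;                // 0
Object.getOwnPropertySymbols(c).length; // 0
```

Private names, by contrast, create real (if hidden-by-default) properties, which is precisely the weak-versus-strong encapsulation difference the two sides keep circling.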

Moreover, since you are citing a recently added page, and (below) also adducing mere es-discuss sketching of novelties such as @ as somehow moving the proposals forward, even though @ has not yet been proposed in the wiki, I argue that fair play requires you to keep current in all respects: we proponents of both weak maps etc. and private names have argued recently that soft fields should not have syntax looking anything like property access.

Yes, I know. I don't know why you are determined to paint me as having some kind of ideological dispute with the proponents of private names, as opposed to merely having strong technical objections to that proposal.

I never said "ideology", you just did. I did say you were selective in your application of recency or currency in framing the debate. Respond to that, please.

Shifting the terms of the debate mid-conversation (across recent weeks, with new pages alongside older ones, and new messages in the list) cuts both ways.

Our rejection of property syntax for soft fields makes this whole "map one (subset, in the case of private names) syntax to two (subset, in the case of private names) semantics" argument obsolete, at least when it comes to property access syntax. So, can we move past this?

Yes, please! (I barely have any idea what you're talking about when you refer to "shifting the terms of the debate". Isn't that just adapting to the current context of discussion?)

No, as you allowed in replying to Dave Herman about your misuse of "proposal". The wiki.ecmascript.org process feeds into TC39. The es-discuss chatter does not, except by members of the committee championing ideas from the list (which I support, obviously, where the ideas are worth considering). The ideas from the list need to turn into wiki strawman proposals to get further.

# Brendan Eich (14 years ago)

On Dec 23, 2010, at 5:20 PM, Mark S. Miller wrote:

On Thu, Dec 23, 2010 at 1:06 PM, David Herman <dherman at mozilla.com> wrote:

All we've asked is that we not assume prima facie that we must pick a winner and stop all work on the other. That said, I don't think we should do much design work on the list or in committee meetings. The "champions" model has worked well (for example, for the proxies spec). I think Allen and others should continue working on private names, and Mark and others should continue working on soft fields. This conversation has raised helpful feedback and ideas, so now it's time for people to go back to the drawing board and do some more independent design work.

+1.

I feel like we've made important progress on this thread: We broke through an impasse of mutual inability to understand each other, are now in a position of a fair degree of mutual understanding, and at a remaining impasse only at making progress from understanding towards agreement. I have had some good aha's in getting here, and I hope others have too,

Agreed. It felt painful because it was painful. Mistakes were made but to err is human. The only way forward is "up".

but now I feel like we're arguing about the nature of our argument rather than the subject matter. I do not feel I am learning anything new. I think reverting to off-list design work before another round of on-list discussion is a fine thing, and I do like the champion model. So I fully endorse your paragraph above.

+∞

That said, once we do resume these on-list or in-meeting discussions, I see much right and nothing wrong with comparing the proposals and seeing how much of the use-case ground we actually care about we can cover with how little mechanism. Questions of the form "If A can cover this subset of the use cases motivating B, do we need B?" are perfectly legitimate. Indeed, asking such questions vigorously is our only hope of avoiding a kitchen-sink language. We have seen the usability of other languages ruined by undisciplined growth.

Agreed, post-hoc or, as I put it a while ago, a posteriori.

That does not mean that we need to ask these questions so early as to suppress exploration and brainstorming. But we are the gatekeepers between "strawman" and "proposal". We need to ask these questions before admitting designs across this threshold.

Fully agree.

# Dave Herman (14 years ago)

Thanks, Mark. This seems like a good place to leave this for now. I'm not going to continue responding on the thread with David-Sarah, because I really need to get off the computer and join the family for the holidays over the next few days, and I think it's past the point of diminishing returns.

But I agree with what you say. My current feeling is that it's neither obvious that one proposal obviates the other nor that it's worth supporting both. But we can get back to honing proposals, then discuss them both and ultimately compare.

Happy holidays to all. Catch you next week,