[Harmony proxies] Opinion on the open issue (How to deal with inconsistent data returned by handler traps?)

# David Bruant (15 years ago)

The proxy proposal has an open issue. It starts with "How to deal with inconsistent data returned by handler traps?" (actually, the issue also applies to inputs, since I can provide garbage as Object.defineProperty arguments). First of all, I think there might be a false assumption in the question, namely in the word "inconsistent". Inconsistent with what? From what I understand, the answer would be "with our current knowledge, use and understanding of objects". But should proxies be consistent with that?

The first sub-question is "what to do if the return value of getOwnPropertyNames or keys contains duplicate property names: silently filter them out, leave them, or throw?". So I started to wonder "why would Object.getOwnPropertyNames return duplicates?" and thought of a case where an object would have, for the same key, not one but several values. Long story short: with Object.defineProperty, you add a value; with get/set, you play with the last value; and with delete, you delete the last value. I have implemented it and invite you to run davidbruant.github.com/PropStackObjects on Firefox 4 and look at the source code DavidBruant/PropStackObjects/blob/gh-pages/index.html With this "several values per property name", one could expect Object.getOwnPropertyNames to return as many duplicates of a key as there are values for that key on the object. I have implemented the thing, but as you will notice in Firebug/WebConsole, since there is some data massaging on the output of the trap, duplicate keys disappear.
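
To make the idea concrete, here is a simplified sketch written against the old Proxy.create handler API prototyped in Firefox 4 (trap names as in the harmony:proxies proposal); it is only an illustration, not the actual PropStackObjects code:

```js
// A simplified sketch of the "stack of values per property name" idea,
// written against the old Harmony Proxy.create handler API (Firefox 4
// prototype). Illustration only, not the actual PropStackObjects code.
function makePropStackObject() {
  var stacks = Object.create(null); // property name -> array (stack) of values

  var handler = {
    // fundamental traps
    defineProperty: function (name, desc) {
      if (!stacks[name]) stacks[name] = [];
      stacks[name].push(desc.value);            // defineProperty pushes a value
    },
    delete: function (name) {
      if (stacks[name]) {
        stacks[name].pop();                     // delete pops only the top value
        if (stacks[name].length === 0) delete stacks[name];
      }
      return true;
    },
    getOwnPropertyDescriptor: function (name) {
      if (!stacks[name]) return undefined;
      return { value: stacks[name][stacks[name].length - 1],
               writable: true, enumerable: true, configurable: true };
    },
    // One occurrence of the name per stacked value: this is where duplicates
    // would show up, if the engine did not massage the trap's return value.
    getOwnPropertyNames: function () {
      var names = [];
      Object.keys(stacks).forEach(function (name) {
        stacks[name].forEach(function () { names.push(name); });
      });
      return names;
    },
    // derived traps: get and set only touch the top of the stack
    get: function (receiver, name) {
      return stacks[name] ? stacks[name][stacks[name].length - 1] : undefined;
    },
    set: function (receiver, name, value) {
      if (!stacks[name]) stacks[name] = [];
      stacks[name][Math.max(stacks[name].length - 1, 0)] = value;
      return true;
    }
  };

  return Proxy.create(handler);
}
```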

My point is that it might be too restrictive to consider proxies as things "that behave like objects with some differences". The ability to have functions (arbitrary code) define behavior offers library authors the potential to consider objects differently than we used to. In my example, I am breaking the idea that a property name is bound to at most one value. In my example, it's a stack. And through the handler code, I am offering the guarantee that get and set only act on the last value. The way I see it, proxies offer the possibility of a dialog between library authors and library users, and for this dialog there is no need to learn a new language. This dialog happens with the "Object vocabulary" (with library-author-defined semantics). And in my opinion, it would be a mistake to constrain this dialog to what we know of current objects.

While implementing this first example, I realized that the property descriptor was somewhat inappropriate, because I didn't want, for instance, a non-configurable value that could "block" the stack. Actually, besides the value, nothing really mattered to me. And then I thought that through the property descriptor, I could carry more meaning than with the current attributes. Since the property descriptor is an object (and that was a genius idea!), I could add an 'index' property attribute in my descriptor to explicitly tell where in the stack I want the value to be inserted. My property descriptors would look like {value:"myValue", index:3}. Hence, a second experiment: davidbruant.github.com/PropStackObjects/index2.html code: DavidBruant/PropStackObjects/blob/gh-pages/index2.html
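
Hypothetically, the defineProperty trap of such a handler could interpret the custom 'index' attribute along these lines (this assumes custom descriptor attributes reach the trap untouched, which, as noted next, FF4 does not currently do):

```js
// Hypothetical variant of the defineProperty trap from the previous sketch,
// honoring a custom 'index' attribute. Assumes custom descriptor attributes
// reach the trap untouched, which FF4 does not currently do.
defineProperty: function (name, desc) {
  if (!stacks[name]) stacks[name] = [];
  if (typeof desc.index === 'number') {
    stacks[name].splice(desc.index, 0, desc.value); // insert at a chosen depth
  } else {
    stacks[name].push(desc.value);                  // default: push on top
  }
}
// usage (with a proxy created as in the previous sketch):
// Object.defineProperty(stackObj, 'x', { value: 'myValue', index: 3 });
```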

Unfortunately, currently, on FF4, the property descriptor is rewritten, but from my comments in the code you can see what results I would expect. The potential of having my library-specific property descriptor format is there, though.

During these experiments, the imposed data massaging was frustrating, because the dialog I was trying to establish between the library code and the calling code was restricted by the engine. It would be the same with throwing. I know that what I am proposing breaks the usual contract between what objects are and what people expect objects to be. But I do not see why I, as a library writer, couldn't decide to write another contract with my library users, one where I am allowed to duplicate keys in Object.getOwnPropertyNames, or one where I decide the format and semantics of a property descriptor, and so on. To answer the concern about derived traps' default behavior expecting a particular format: if I'm planning on "creating a new contract", I would override them and make sure I respect the invariants and properties my library users expect in all circumstances.

I think that through proxies, with "uncontrolled trapping", we will likely see new usages and new forms of objects (Array could have emerged from that: DavidBruant/ProxyArray). Some of these new forms could even be included in ECMAScript one day, why not?

If data validation is expected anyway for security or debugging purposes, maybe another type of proxy could be invented in which each trap call is controlled (input and output). Maybe this could be done through Proxy.safeCreate() or something like that. It would return a "safe proxy". Validation/normalizing methods could also be provided by Proxy in order to help implementors validate their inputs.
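
To sketch the idea, here is a hypothetical user-land approximation of what such a Proxy.safeCreate could do (Proxy.safeCreate does not exist; this only illustrates where validation hooks could live):

```js
// Hypothetical user-land approximation of a "safe" proxy factory: wrap a
// handler so that selected trap outputs are validated before the engine or
// client code sees them. Proxy.safeCreate itself does not exist.
function safeCreate(handler, proto) {
  var safeHandler = Object.create(handler);
  safeHandler.getOwnPropertyNames = function () {
    var names = handler.getOwnPropertyNames();
    var seen = Object.create(null), result = [];
    names.forEach(function (n) {
      n = String(n);                                    // normalize to strings
      if (!seen[n]) { seen[n] = true; result.push(n); } // drop duplicates
    });
    return result;
  };
  // ...similar wrappers could validate defineProperty input,
  // getOwnPropertyDescriptor output, etc.
  return Proxy.create(safeHandler, proto);
}
```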

Of course, all said here is to be discussed, so let's discuss.

# Tom Van Cutsem (15 years ago)

David,

This open issue is indeed one that merits further discussion here.

Some thoughts and comments:

  • I think we did come to an agreement that custom property attributes were seen as very useful for communicating new kinds of meta-data about objects, and that they ought to be preserved. See also this bug: bugzilla.mozilla.org/show_bug.cgi?id=601379

That would solve one of the issues you raise.

  • The place where argument/return value validation is most critical is where fundamental traps are called by the runtime itself, not by client code, because of a missing derived trap. The question here is: how should the default implementation of a derived trap deal with inconsistent data returned by the fundamental trap on which it relies?

  • If proxy handlers are allowed to free-wheel and return whatever they like, this will no doubt break some invariants of client code. I don't think something like Proxy.safeCreate is a particularly good idea, since: a) it further increases the complexity of an already complex API b) it puts the responsibility for ensuring safety on the wrong party: if safety is required, it's not a good idea to rely on library authors to use Proxy.safeCreate instead of Proxy.create. The library author can always "forget" (unintentionally or intentionally).

I believe Allen raised the idea at one of the last TC39 meetings that a safe subset could replace Proxy.create, Object.keys, etc. with safe variants that check for consistency. I recall that MarkM objected on the grounds that this approach could be prohibitively expensive for some checks, such as filtering out duplicates.

Cheers, Tom

2011/3/16 David Bruant <bruant at enseirb-matmeca.fr>

# David Bruant (15 years ago)

Le 16/03/2011 10:25, Tom Van Cutsem a écrit :

David,

This open issue is indeed one that merits further discussion here.

Some thoughts and comments:

  • I think we did come to an agreement that custom property attributes were seen as very useful for communicating new kinds of meta-data about objects, and that they ought to be preserved. See also this bug: bugzilla.mozilla.org/show_bug.cgi?id=601379 That would solve one of the issues you raise.

Ok. I'll follow on the bug. However, from what I understand, it (and the current Object.defineProperty semantics) does not fully let the user define his/her attribute names and semantics. In the proxy semantics strawman, Object.defineProperty step 3: "Let desc be the result of calling ToPropertyDescriptor with Attributes as the argument." The /ToPropertyDescriptor/ call normalizes 'configurable', 'enumerable', 'writable' to booleans and throws if 'get' or 'set' isn't callable. This prevents people from passing differently typed values.
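
For illustration, with a plain object (the descriptor a proxy's defineProperty trap receives goes through the same normalization under the cited strawman and the FF4 prototype):

```js
// Plain-object illustration of the ToPropertyDescriptor coercion referred to
// above; under the cited strawman (and the FF4 prototype), the descriptor a
// proxy's defineProperty trap receives has gone through the same normalization.
var o = {};
Object.defineProperty(o, 'x', { value: 1, configurable: 'yes', enumerable: 0 });
var d = Object.getOwnPropertyDescriptor(o, 'x');
d.configurable; // true  ('yes' was coerced by ToBoolean)
d.enumerable;   // false (0 was coerced by ToBoolean)

// ...and differently typed 'get'/'set' are rejected outright:
Object.defineProperty(o, 'y', { get: 42 }); // throws a TypeError
```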

I think we need the other communication side too. [[GetOwnProperty]] calls /ToCompletePropertyDescriptor/ which also calls /ToPropertyDescriptor/. So if I want to use a different type for the usual attribute names (which, in my opinion, are common names), I cannot. Moreover, the property descriptors returned by Object.getOwnPropertyDescriptor are forced to have an 'enumerable' and 'configurable' property (and others depending on data or accessor descriptor). If I want to decide on my own attributes and semantics and do not care about the "forced" ones, then I am paying a performance/semantics overhead for no obvious reason. This can be annoying if I want to return a proxy as the property descriptor or if I want to iterate over property attributes through a for-in loop or an Object.getOwnPropertyNames + Array method (forEach, every, reduce...). What is your position on this point? Is there also a bug number for this one?

  • The place where argument/return value validation is most critical is where fundamental traps are called by the runtime itself, not by client code, because of a missing derived trap. The question here is: how should the default implementation of a derived trap deal with inconsistent data returned by the fundamental trap on which it relies?

The way I see it, default implementations of derived traps are just a convenience. In my opinion, they should be the exact reflection of their related ECMAScript object method/internal method and not do more data validation/formatting than these do. As a convenience, proxy programmers should be aware of limitations and conventions in default implementation expectations. We can document them, it won't be hard at all. And if programmers decide not to respect the conventions in what they return from their fundamental traps, then they are exposing themselves to inconsistencies. Either they accept it and consider that a feature and not a bug, or they can always implement derived traps if they aren't satisfied with how the default derived trap behaves.

No matter what is decided for derived traps' default implementations (current spec, current spec + data validation/formatting, other spec), there are going to be people whose use cases this doesn't fit, so they will have to re-implement them. So I would be in favor of having default implementations which are consistent with ES objects' behavior and perform no further checks (which will also improve performance).
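
To make the concern concrete, a default derived keys trap has roughly the following shape when expressed in terms of the fundamental traps (in the spirit of the harmony:proxies default handler, not its exact text):

```js
// Rough shape of a default derived 'keys' trap built on the fundamental
// traps (in the spirit of the harmony:proxies default handler, not its
// exact text). Whatever getOwnPropertyNames returns flows straight through
// to Object.keys unless extra validation is added here.
keys: function () {
  var self = this;
  return this.getOwnPropertyNames().filter(function (name) {
    var desc = self.getOwnPropertyDescriptor(name);
    return desc !== undefined && desc.enumerable;
  });
}
```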

I would like to point out that internal data validation/massaging is always surprising from the programming/debugging point of view. A programmer doesn't always know the restrictions ("I returned duplicates in the get(Own)PropertyNames trap and they've been removed! How so?", etc.)

# Brendan Eich (15 years ago)

On Mar 15, 2011, at 6:43 PM, David Bruant wrote:

Unfortunately, currently, on FF4, the property descriptor is rewritten, but with my comments on the code, you can see what results I would expect. But the potential of having my library-specific property descriptor format is here.

Thanks for raising this issue.

The proxies implementation in Firefox 4 is a prototype of a draft standard, of course. Between bugs in the spec, future not-quite-bug changes to the spec (improvements to go beyond the stated goals of proxies), and bugs in our code, we reserve the right to change it at any time. Caveat jsdevelopers.

This is fine and not a problem, IMHO. Our internal uses for security membranes will of course track our changes to proxies. But I wanted to write to assure you that we will make changes that TC39 thinks are good for future-proofing, as well as intentional changes and of course bug fixes.

# David Bruant (15 years ago)

Le 19/03/2011 01:11, Brendan Eich a écrit :

On Mar 15, 2011, at 6:43 PM, David Bruant wrote:

Unfortunately, currently, on FF4, the property descriptor is rewritten, but with my comments on the code, you can see what results I would expect. But the potential of having my library-specific property descriptor format is here.

Thanks for raising this issue.

The proxies implementation in Firefox 4 is a prototype of a draft standard, of course. Between bugs in the spec, future not-quite-bug changes to the spec (improvements to go beyond the stated goals of proxies), and bugs in our code, we reserve the right to change it at any time. Caveat jsdevelopers.

Of course, that's the reason why there is a "non-standard" banner on top of the doc (developer.mozilla.org/en/JavaScript/Reference/Global_Objects/Proxy). Should something also be added saying that the API is subject to change from version to version?

This is fine and not a problem, IMHO. Our internal uses for security membranes will of course track our changes to proxies.

For the moment, with all the proxy-related discussions, I only see one thing that could change the way you use proxies, and that is adding the proxy as an argument of every trap: strawman:handler_access_to_proxy I don't know how large your proxy-based code base is or how exactly proxies are used in it, but this sounds like a difficult issue to handle.

But I wanted to write to assure you that we will make changes that TC39 thinks are good for future-proofing, as well as intentional changes and of course bug fixes.

Thanks. In several places I talked about FF4, but it was more to relate my experience and give concrete examples of spec consequences. Nonetheless, I knew that the implementation may currently reflect neither the spec proposal nor the latest discussions and decisions that may have happened.

# Tom Van Cutsem (15 years ago)

2011/3/18 David Bruant <bruant at enseirb-matmeca.fr>

Le 16/03/2011 10:25, Tom Van Cutsem a écrit :

David,

This open issue is indeed one that merits further discussion here.

Some thoughts and comments:

  • I think we did come to an agreement that custom property attributes were seen as very useful for communicating new kinds of meta-data about objects, and that they ought to be preserved. See also this bug: bugzilla.mozilla.org/show_bug.cgi?id=601379 That would solve one of the issues you raise.

Ok. I'll follow on the bug. However, from what I understand, it (and the current Object.defineProperty semantics) does not fully let the user define his/her attribute names and semantics. In the proxy semantics strawman, Object.defineProperty step 3: "Let desc be the result of calling ToPropertyDescriptor with Attributes as the argument." The /ToPropertyDescriptor/ call normalizes 'configurable', 'enumerable', 'writable' to booleans and throws if 'get' or 'set' isn't callable. This prevents people from passing differently typed values.

Right, this is following the current ES5 semantics of Object.defineProperty.

I think we need the other communication side too. [[GetOwnProperty]] calls /ToCompletePropertyDescriptor/ which also calls /ToPropertyDescriptor/. So if I want to use a different type for the usual attribute names (which, in my opinion, are common names), I cannot. Moreover, the property descriptors returned by Object.getOwnPropertyDescriptor are forced to have an 'enumerable' and 'configurable' property (and others depending on data or accessor descriptor). If I want to decide on my own attributes and semantics and do not care about the "forced" ones, then I am paying a performance/semantics overhead for no obvious reason. This can be annoying if I want to return a proxy as the property descriptor or if I want to iterate over property attributes through a for-in loop or an Object.getOwnPropertyNames + Array method (forEach, every, reduce...). What is your position on this point? Is there also a bug number for this one?

There is no bug report for this as far as I can tell. I have no strong opinion on the coercion of standard property attributes. Coercing is closest to the current ES5 behavior, though. Also, I'm not convinced that programmers should be able to redefine the meaning of the standard property attributes. The standard attributes have a well-defined meaning, and they are an implicit part of the Object.defineProperty/getOwnPropertyDescriptor/... API. Won't redefining their semantics be needlessly confusing?

  • The place where argument/return value validation is most critical is where fundamental traps are called by the runtime itself, not by client code, because of a missing derived trap. The question here is: how should the default implementation of a derived trap deal with inconsistent data returned by the fundamental trap on which it relies?

The way I see it, default implementations of derived traps are just a convenience. In my opinion, they should be the exact reflection of their related ECMAScript object method/internal method and not do more data validation/formatting than these do. As a convenience, proxy programmers should be aware of limitations and conventions in default implementation expectations. We can document them, it won't be hard at all. And if programmers decide not to respect the conventions in what they return from their fundamental traps, then they are exposing themselves to inconsistencies. Either they accept it and consider that a feature and not a bug, or they can always implement derived traps if they aren't satisfied with how the default derived trap behaves.

No matter what is decided for derived traps' default implementations (current spec, current spec + data validation/formatting, other spec), there are going to be people whose use cases this doesn't fit, so they will have to re-implement them. So I would be in favor of having default implementations which are consistent with ES objects' behavior and perform no further checks (which will also improve performance).

I would like to point out that internal data validation/massaging is always surprising from the programming/debugging point of view. A programmer doesn't always know the restrictions ("I returned duplicates in the get(Own)PropertyNames trap and they've been removed! How so?", etc.)

I fully agree. On the other hand, it may be equally surprising for a programmer to find out that Object.getOwnPropertyNames(obj) may contain duplicate property names. It's not clear to me which side is more important. However, there will probably be much more "client code" using proxies than code creating proxies.

# David Bruant (15 years ago)

Le 22/03/2011 13:12, Tom Van Cutsem a écrit :

2011/3/18 David Bruant <bruant at enseirb-matmeca.fr>

Le 16/03/2011 10:25, Tom Van Cutsem a écrit :

David,

This open issue is indeed one that merits further discussion here.

Some thoughts and comments:

  • I think we did come to an agreement that custom property attributes were seen as very useful for communicating new kinds of meta-data about objects, and that they ought to be preserved. See also this bug: bugzilla.mozilla.org/show_bug.cgi?id=601379 That would solve one of the issues you raise.

Ok. I'll follow on the bug. However, from what I understand, it (and the current Object.defineProperty semantics) does not fully let the user define his/her attribute names and semantics. In the proxy semantics strawman, Object.defineProperty step 3: "Let desc be the result of calling ToPropertyDescriptor with Attributes as the argument." The /ToPropertyDescriptor/ call normalizes 'configurable', 'enumerable', 'writable' to booleans and throws if 'get' or 'set' isn't callable. This prevents people from passing differently typed values.

Right, this is following the current ES5 semantics of Object.defineProperty.

I think we need the other communication side too. [[GetOwnProperty]] calls /ToCompletePropertyDescriptor/ which also calls /ToPropertyDescriptor/. So if I want to use a different type for the usual attribute names (which, in my opinion, are common names), I cannot. Moreover, the property descriptors returned by Object.getOwnPropertyDescriptor are forced to have an 'enumerable' and 'configurable' property (and others depending on data or accessor descriptor). If I want to decide on my own attributes and semantics and do not care about the "forced" ones, then I am paying a performance/semantics overhead for no obvious reason. This can be annoying if I want to return a proxy as the property descriptor or if I want to iterate over property attributes through a for-in loop or an Object.getOwnPropertyNames + Array method (forEach, every, reduce...). What is your position on this point? Is there also a bug number for this one?

There is no bug report for this as far as I can tell. I have no strong opinion on the coercion of standard property attributes. Coercing is closest to the current ES5 behavior, though. Also, I'm not convinced that programmers should be able to redefine the meaning of the standard property attributes. The standard attributes have a well-defined meaning, and they are an implicit part of the Object.defineProperty/getOwnPropertyDescriptor/... API.

I wouldn't argue if we had: Object.defineDataProperty(o, name, enumerable, configurable, writable, value) and Object.defineAccessorProperty(...). Or one Object.defineProperty with ten arguments. But this is not the case. I think that part of the reason why the committee has agreed on an object rather than another solution is to have the liberty of providing different flavors (there are currently two, data and accessor, and maybe a third will show up eventually) and different attributes one day or another. (Can someone from TC39 provide insights on the property attribute API design?) Based on this rationale, I think that since proxies have the potential to entirely redefine their internal consistency, it could make sense to completely forget about the usual semantics of attributes. This actually could be an interesting way to drive innovation, and maybe relevant property attribute semantics patterns made by library authors could eventually be added to the spec.

Won't redefining their semantics be needlessly confusing?

I cannot answer that question. It will depend on people. For the comparison, I would ask: How confusing would it be if a host object redefined property attribute semantics? I'm going to provide a response below.

  • The place where argument/return value validation is most critical is where fundamental traps are called by the runtime itself, not by client code, because of a missing derived trap. The question here is: how should the default implementation of a derived trap deal with inconsistent data returned by the fundamental trap on which it relies?

The way I see it, default implementations of derived traps are just a convenience. In my opinion, they should be the exact reflection of their related ECMAScript object method/internal method and not do more data validation/formatting than these do. As a convenience, proxy programmers should be aware of limitations and conventions in default implementation expectations. We can document them, it won't be hard at all. And if programmers decide not to respect the conventions in what they return from their fundamental traps, then they are exposing themselves to inconsistencies. Either they accept it and consider that a feature and not a bug, or they can always implement derived traps if they aren't satisfied with how the default derived trap behaves.

No matter what is decided for derived traps' default implementations (current spec, current spec + data validation/formatting, other spec), there are going to be people whose use cases this doesn't fit, so they will have to re-implement them. So I would be in favor of having default implementations which are consistent with ES objects' behavior and perform no further checks (which will also improve performance).

I would like to point out that internal data validation/massaging is always surprising from the programming/debugging point of view. A programmer doesn't always know the restrictions ("I returned duplicates in the get(Own)PropertyNames trap and they've been removed! How so?", etc.)

I fully agree. On the other hand, it may be equally surprising for a programmer to find out that Object.getOwnPropertyNames(obj) may contain duplicate property names. It's not clear to me which side is more important. However, there will probably be much more "client code" using proxies than code creating proxies.

About the surprising aspect, I am going to take the example of a host object: NodeLists. NodeLists (returned by document.getElementsByTagName()) are live objects. It means that once you have created it, it stays in sync with the DOM tree without the programmer having to refresh the object in any way. This behavior cannot be provided by any regular object. So programmers could write something equivalent to:

var divs = document.getElementsByTagName('div'); for (var i = 0; i < divs.length; i++) { document.body.appendChild(divs[i].cloneNode(true)); }

Because of the live property, divs.length changes and this turns out to be an infinite loop. This is surprising to the programmer since live objects do not exist in ES3/5. What solves the surprise is that the programmer searches, discovers the live property of NodeLists and fixes the bug. There is a surprise, it may be annoying, frustrating, but it doesn't last long.

My opinion is that no matter what library you're loading and what crazy sort of proxies you're loading, these things will come with some documentation explaining differences with regular objects ("under such condition, we will return duplicate property names after an Object.getOwnPropertyNames call"). If the documentation is lacking, you still have the source code. One way or another, you can find the cause of your surprise. When you choose a proxy library, you know it won't behave like regular objects. This is for the case where you purposefully import a library/module of some sort. If you haven't loaded the module on purpose, well, I think that being surprised by object behaviors may be the least of your concerns :-)

I think I do not really understand what is inherently wrong with "surprising" the programmer who has chosen to run your code. The programmer who has chosen your library did it with

Proxies offer the potential to break invariants. Even elementary things like:

o.a = 2; b = o.a; // b === 1 (or undefined or "whatever")

Even though there is normalization of property attributes, you, as a proxy writer, have the Turing-complete freedom to deny the usual semantics of these attributes by defining your own internal consistency. As soon as you write "defineProperty: function(name, desc) {}" (emphasis on the /function/ keyword) in your handler, no one can ever have any control over you to make sure you properly extract "enumerable" and "configurable" and give them proper semantics. Forcing these properties to be part of a property descriptor without being able to enforce that they are going to be used as they are in regular objects sounds like a half-enforcement. And you will certainly agree that the second part of the enforcement is just unreasonably complicated (if not impossible).
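
A minimal sketch of that half-enforcement, again using the old Proxy.create handler API (an illustration only):

```js
// Sketch (old Proxy.create API): the engine can normalize the descriptor it
// hands to defineProperty, but nothing forces the handler to honor it.
var props = Object.create(null);
var looseHandler = {
  defineProperty: function (name, desc) {
    props[name] = desc;                        // stores configurable: false...
  },
  getOwnPropertyDescriptor: function (name) {
    return props[name];
  },
  getOwnPropertyNames: function () {
    return Object.keys(props);
  },
  delete: function (name) {
    return delete props[name];                 // ...but deletes it anyway
  }
};
var loose = Proxy.create(looseHandler);
Object.defineProperty(loose, 'x', { value: 1, configurable: false });
delete loose.x;                                // true
Object.getOwnPropertyDescriptor(loose, 'x');   // undefined
```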

Proxies will break things. Proxies will be surprising. Proxies will be confusing. It's part of what they are. It's built-in, or rather "to-be-built-in". But I think it won't last long, like with DOM live objects (or any other tricky host object). Libraries which do not show enough consistency ("internal consistency", not "consistency with regular objects") won't be widely used. For the comparison, it's not that difficult to understand with NodeLists that they are in sync with the DOM tree. This is one "consistent enough" internal consistency. My opinion is that the Proxy API should encourage rich internal consistencies as much as possible. And trying to keep consistency with what we know of current regular objects interferes with that vision. Once again, that's just my opinion, I don't mean to impose it in any way. (I do not have any power to impose it anyway :-p ).

# Brendan Eich (15 years ago)

On Mar 22, 2011, at 11:04 AM, David Bruant wrote:

I wouldn't argue if we had: Object.defineDataProperty(o, name, enumerable, configurable, writable, value) and Object.defineAccessorProperty(...). Or one Object.defineProperty with ten arguments. But this is not the case. I think that part of the reason why the committee has agreed on an object rather than another solution is to have the liberty of providing different flavors (there are currently two, data and accessor, and maybe a third will show up eventually) and different attributes one day or another. (Can someone from TC39 provide insights on the property attribute API design?)

This has already been discussed, and indeed written up as a rationale document:

www.mail-archive.com/[email protected]/msg02308.html, es3.1:rationale_for_es3_1_static_object_methods.pdf

Based on this rationale, I think that since proxies have the potential to entirely redefine their internal consistency, it could make sense to completely forget about the usual semantics of attributes. This actually could be an interesting way to drive innovation, and maybe relevant property attribute semantics patterns made by library authors could eventually be added to the spec.

Maybe, but should we be strict and explicit, or loose and implicit, in our interpretation and handling of the currently-spec'ed attributes?

I think we will do better with strict and explicit, even as we allow libraries (such as Mark and Tom's traitsjs.org library) to extend property descriptors.

Won't redefining their semantics be needlessly confusing? I cannot answer that question. It will depend on people. For the comparison, I would ask: How confusing would it be if a host object redefined property attribute semantics? I'm going to provide a response below.

In the last 15 years, host objects have been strange enough that the phrase "host object" is a malediction in 43 languages ;-). Let's not add loose/implicit opportunities for strangeness.

Sure, let's hope library authors add new and different extensions, and may the least-strange or most-beautiful win, and indeed feed into future standard editions.

About the surprising aspect, I am going to take the example of a host object: NodeLists. NodeLists (returned by document.getElementsByTagName()) are live objects. It means that once you have created it, it stays in sync with the DOM tree without the programmer having to refresh the object in any way. This behavior cannot be provided by any regular object. So programmers could write something equivalent to:

var divs = document.getElementsByTagName('div'); for (var i = 0; i < divs.length; i++) { document.body.appendChild(divs[i].cloneNode(true)); }

That's right, NodeLists are live and therefore must be implemented by Proxies in a self-hosted DOM. Coming soon at

andreasgal/dom.js

Because of the live property, divs.length changes and this turns out to be an infinite loop. This is surprising to the programmer since live objects do not exist in ES3/5.

Some would call it a botch!

What solves the surprise is that the programmer searches, discovers the live property of NodeLists and fixes the bug. There is a surprise, it may be annoying, frustrating, but it doesn't last long.

It lasts long with some of us :-/.

My opinion is that no matter what library you're loading and what crazy sort of proxies you're loading, these things will come with some documentation explaining differences with regular objects ("under such condition, we will return duplicate property names after an Object.getOwnPropertyNames call"). If the documentation is lacking, you still have the source code. One way or another, you can find the cause of your surprise.

And you can then choose to use a different and less surprising library.

Just pointing out that docs and open source don't make all strangeness palatable.

When you choose a proxy library, you know it won't behave like regular objects. This is for the case where you purposefully import a library/module of some sort.

You may expect some strangeness from proxies, that is, some difference in behavior from regular objects, but you won't know what strangeness in particular (since even with proxies, never mind "host objects", there can be an endless and progressively stranger choice of strangeness).

If you haven't loaded the module on purpose, well, I think that being surprised by object behaviors may be the least of your concerns :-)

Ok, but that's your opinion. If others disagree, they won't use the library. Over time a winnowing process will sort wheat from chaff.

But this is not an argument for our core Harmony Proxies spec embracing strangeness by being loose or having implicit conversions, IMHO.

I think I do not really understand what is inherently wrong with "surprising" the programmer who has chosen to run your code. The programmer who has chosen your library did it with

(Your sentence ended prematurely there.)

My opinion is that the Proxy API should encourage rich internal consistencies as much as possible. And trying to keep consistency with what we know of current regular objects interferes with that vision. Once again, that's just my opinion, I don't mean to impose it in any way. (I do not have any power to impose it anyway :-p ).

I think this is a bad argument. Proxies (and of course host objects implemented in C++ or C) can do all sorts of strange things. This does not argue either that (a) proxies should be used to implement strange objects; (b) the semantics of proxies should be stranger than necessary by including implicit conversions.

# David Bruant (14 years ago)

Le 22/03/2011 23:09, Brendan Eich a écrit :

On Mar 22, 2011, at 11:04 AM, David Bruant wrote:

I wouldn't argue if we had: Object.defineDataProperty(o, name, enumerable, configurable, writable, value) and Object.defineAccessorProperty(...). Or one Object.defineProperty with ten arguments. But this is not the case. I think that part of the reason why the committee has agreed on an object rather than another solution is to have the liberty of providing different flavors (there are currently two, data and accessor, and maybe a third will show up eventually) and different attributes one day or another. (Can someone from TC39 provide insights on the property attribute API design?)

This has already been discussed, and indeed written up as a rationale document:

www.mail-archive.com/[email protected]/msg02308.html, es3.1:rationale_for_es3_1_static_object_methods.pdf

Thank you, that's a very interesting document and thread. After reading it, I feel that the potential of adding other attributes wasn't part of the design decision. I was wrong. Still, the potential is there.

Based on this rationale, I think that since proxies have the potential to entirely redefine their internal consistency, it could make sense to completely forget about the usual semantics of attributes. This actually could be an interesting way to drive innovation, and maybe relevant property attribute semantics patterns made by library authors could eventually be added to the spec.

Maybe, but should we be strict and explicit, or loose and implicit, in our interpretation and handling of the currently-spec'ed attributes?

In the spec, for regular (non-proxy) objects, the interpretation and handling of currently-spec'ed attributes is and should be strict and explicit, I agree. For proxy-based objects, our ("we" means "spec writers") interpretation and handling of currently-spec'ed attributes is nonexistent, because as soon as we hand the power of a function to a proxy author (for the 'defineProperty' and 'getOwnPropertyDescriptor' traps), we allow them to choose whether they want to honor "configurable" or "enumerable" or not. For instance, we have no way to enforce consistency between calls to Object.defineProperty (the "enumerable" attribute value) and Object.keys. I mean, we could enforce consistency here, but if we did, then Object.keys wouldn't be a trap anymore, or at least its return value wouldn't mean anything; Object.keys would just be the function we know. In a proxy-based object, properties created as non-configurable can be deleted. Once again, you have no way to enforce that the property isn't deletable. My point is: you may apply any sort of changes to property descriptors, but that doesn't impose anything on how the trap code will interpret them. It's not because the spec defines the semantics of some attributes that proxy authors will follow the spec.
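
A small sketch of that defineProperty/Object.keys gap, once more with the old Proxy.create handler API (an illustration, not code from this thread):

```js
// Sketch (old Proxy.create API): nothing ties the 'keys' trap to the
// 'enumerable' attribute that the defineProperty trap received.
var inconsistentHandler = {
  defineProperty: function (name, desc) { /* ignores desc.enumerable */ },
  getOwnPropertyDescriptor: function (name) { return undefined; },
  getOwnPropertyNames: function () { return []; },
  delete: function (name) { return true; },
  keys: function () { return ['whatever', 'the', 'handler', 'likes']; }
};
var q = Proxy.create(inconsistentHandler);
Object.defineProperty(q, 'hidden', { value: 1, enumerable: false });
Object.keys(q); // ['whatever', 'the', 'handler', 'likes']
```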

There is also a potential for future backward incompatibility. Say I define my own attribute for my proxy-based objects, "deletable", which I use to separate the concerns of deletion and configuration. If one day the ECMAScript standard standardizes that attribute and provides a different default value, then all the people who relied on my default value will see their code broken, since from the trap I won't be able to tell the difference between an attribute defined or omitted by the user and an attribute added by the engine before my trap code is called. So either you take the risk of breaking a part of the web each time you define a new attribute, or you don't standardize it. Providing full freedom over attributes would allow both library authors and ECMAScript specifiers not to fight each other when a new attribute keyword gets defined in the spec.

I was about to say that since property descriptors are objects, proxies could be used instead to fool the engine: pretending to accept the attribute-defining ("enumerable", "configurable") calls while denying them within the trap. It turns out the descriptor passed as an argument and the object seen within the trap call aren't the same object (that's consistent with ES5 and I understand this decision), so my idea falls apart.

I think we will do better with strict and explicit, even as we allow libraries (such as Mark and Tom's traitsjs.org library) to extend property descriptors.

Won't redefining their semantics be needlessly confusing? I cannot answer that question. It will depend on people. For the comparison, I would ask: How confusing would it be if a host object redefined property attribute semantics? I'm going to provide a response below.

In the last 15 years, host objects have been strange enough that the phrase "host object" is a malediction in 43 languages ;-). Let's not add loose/implicit opportunities for strangeness.

Sure, let's hope library authors add new and different extensions, and may the least-strange or most-beautiful win, and indeed feed into future standard editions.

About the surprising aspect, I am going to take the example of a host object: NodeLists. NodeLists (returned by document.getElementsByTagName()) are live objects. It means that once you have created it, it stays in sync with the DOM tree without the programmer having to refresh the object in any way. This behavior cannot be provided by any regular object. So programmers could write something equivalent to:

var divs = document.getElementsByTagName('div'); for (var i = 0; i < divs.length; i++) { document.body.appendChild(divs[i].cloneNode(true)); }

That's right, NodeLists are live and therefore must be implemented by Proxies in a self-hosted DOM. Coming soon at

andreasgal/dom.js

Awesome! Will be following that closely!

If you haven't loaded the module on purpose, well, I think that being surprised by object behaviors may be the least of your concerns :-)

Ok, but that's your opinion. If others disagree, they won't use the library. Over time a winnowing process will sort wheat from chaff.

But this is not an argument for our core Harmony Proxies spec embracing strangeness by being loose or having implicit conversions, IMHO.

I'm sorry, I don't understand. What "implicit conversions" are you talking about? Just to make sure we are on the same page, the discussion we were having was about turning property descriptor objects passed as the Object.defineProperty argument into a "classical" property descriptor (data or accessor, depending on what is already there) inside the defineProperty trap. This is an implicit conversion, and this is what I am against. So, what are the "implicit conversions" you are against?

I think I do not really understand what is inherently wrong with "surprising" the programmer who has chosen to run your code. The programmer who has chosen your library did it with

(Your sentence ended prematurely there.) ... and I don't remember what I was going for :-s Sorry.

My opinion is that the Proxy API should encourage rich internal consistencies as much as possible. And trying to keep consistency with what we know of current regular objects interferes with that vision. Once again, that's just my opinion, I don't mean to impose it in any way. (I do not have any power to impose it anyway :-p ).

I think this is a bad argument. Proxies (and of course host objects implemented in C++ or C) can do all sorts of strange things. This does not argue either that (a) proxies should be used to implement strange objects; (b) the semantics of proxies should be stranger than necessary by including implicit conversions.

Once again, I do not understand what "implicit conversions" you are talking about.

Overall, one of the points I am raising is the definition of what an object is. The ES5 definition says "An object is a collection of properties and has a single prototype object. (...)" and each property is an "association between a name and a value that is a part of an object." In my "PropStackObject" experiment (davidbruant.github.com/PropStackObjects (view source)), I have created a new object paradigm where a property is an association between a name and a stack of values (in a nutshell, defineProperty and delete respectively push and pop values, while get and set only manipulate the value on top of the stack). When, in the defineProperty trap, I see a "configurable" attribute with "false" as its value, since I have changed the object paradigm, I have to diverge from the spec semantics. That's where I said earlier that for proxy-based objects, the interpretation and handling of currently spec-ed attributes is nonexistent. As I have a new paradigm, I have to decide what "configurable" means. Does the whole stack become non-configurable, or just the last value? No matter how strict and explicit the spec is, no one else but me has to make that decision. My preference would be to not consider "configurable" at all for my "object paradigm".