Proposal to fix super and new inconsistency, future-proofing broader uses of new operator
An interesting proposal, but I'm not yet sold. Here are some of the issues I see:
- By "newing" a function an ES programmer is expressing a clear intent to use the body the body of the function as part of the instantiation process. It would be counter intuitive to not execute the body of a constructor function when performing a new:
function C() {console.log("C constructed")};
...
//code somewhere else
C.prototype.constructor = function () {}
...
//back in the original script
new C; //whoa why wasn't the constructor called
A class declaration is similar. It defines (either explicitly or implicitly) the body of the class constructor. It would be equally counter intuitive to not use that body when newing a class.
- Some "classes" may not wish to expose an instantiation capability via their instances. ES6 GeneratorFunctions are a good example of this. See the lower right part of the diagram at people.mozilla.org/~jorendorff/es6-draft.html#sec-15.19.3 . Each GeneratorFunction has an associated prototype that is used by all of that GeneratorFunction's instances. However, the prototype does not have a 'constructor' property. This means that passing someone a generator instance doesn't give them the capability to instantiate additional instances of the same GeneratorFunction. Whether class instances should expose the capability to create additional instances of the same class is a design decision that situationally might go either way. Exposing that capability via the instance 'constructor' property is a fine default but isn't always the desirable alternative. Removing the prototype's 'constructor' property seems like a reasonable way to circumvent the default. (BTW, a class whose prototype object does not have a 'constructor' property is similar, in concept, to a private constructor in Java; see the sketch below.)
The second case is much more common: one redefines .prototype of a function, but does not define .constructor there (there was no real need). I would propose guarding against this case - whenever the .prototype of a function is changed, new would use the old, legacy semantics. Constructor functions with non-changed .prototypes, as well as classes (which have .prototype non-writable), would work fine with the new, cleaner semantics.
- I think the "second case" compatibility issue is very significant. Your fix basically requires that normal functions have a distinct [[Construct]] internal method that ignores the 'constructor' property (just like the current spec.) while class objects would have to have a different [[Construct]] that dispatches via the 'construct' property. So far we haven't had to make class objects (ie, functions) a different kind of exotic object from regular functions. That means that class declarations are essentially just sugar and there is no real difference between an abstraction created using a class declaration and one composed out of more primitive function declarations and property accesses. I'd prefer not to lose that equivalence.
- I'm reluctant to put additional property probes/access in the fast path of the 'new' operator. Object instantiation should be fast and every extra property access or invocation costs something. Maybe they can be optimized away, but maybe not...
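A rough sketch of the earlier point about withholding the default instantiation capability by removing the prototype's 'constructor' property (my illustration in plain ES5 terms, not from the original post; Point is a made-up name):

function Point(x, y) { this.x = x; this.y = y; }
var p = new Point(1, 2);
var q = new p.constructor(3, 4); // by default, any instance leaks the ability to make more Points
delete Point.prototype.constructor; // withhold that capability - conceptually a "private constructor"
var r = new Point(5, 6);
r.constructor === Object; // true - only the inherited dummy from Object.prototype remains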
Allen Wirfs-Brock wrote:
An interesting proposal, but I'm not yet sold. Here are some of the issues I see:
- By "newing" a function an ES programmer is expressing a clear intent to use the body the body of the function as part of the instantiation process. It would be counter intuitive to not execute the body of a constructor function when performing a new:
function C() {console.log("C constructed")};
...
//code somewhere else
C.prototype.constructor = function () {}
BTW this is the "first case" from my original post. How often does this example happen in the real world (without changing C.prototype before this assignment)? Because if C.prototype is changed, the legacy semantics is used.
...
//back in the original script
new C; //whoa why wasn't the constructor called
Well, new (class extends C); // whoa why wasn't C called
This happens in the current state as well, though it is a little more hidden.
I am convinced that, except in very special cases, new C and new (class extends C) should always produce nearly* the same instances, behaviourally. That is not the case, though.
* they should only differ in meta issues, like whether their __proto__s are equal or which inherits from which one, what they return for 'foo instanceof Bar' in some edge cases, etc.
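A small sketch of the divergence being described (the subclass line follows the draft semantics discussed in this thread, where super dispatches via the prototype's 'constructor' property; it is illustrative, not spec output):

function C() { this.madeBy = "C"; }
C.prototype.constructor = function () { this.madeBy = "impostor"; };
new C().madeBy; // "C" - new runs the body of C itself
// a default subclass constructor doing super(...args) dispatches via
// C.prototype.constructor, so under the semantics described above:
// (new (class extends C)).madeBy // "impostor" - C's body never runs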
A class declaration is similar. It defines (either explicitly or implicitly) the body of the class constructor. It would be equally counter intuitive to not use that body when newing a class.
It is different here (well, that is a matter of PoV; I have a different one). In a class, you do not define the body of any constructor function; you define the body of the 'constructor' method. It is "just a convenience" that this method is also reachable by using the class itself (that its .prototype is set up accordingly can be seen as just a result of this "convenience" - if the class were represented by a different object, that object would have its .prototype set).
(Well, I understand it is not "just a convenience": the constructor should represent the class because of the "SuperClass.apply(this, arguments)" legacy subclassing pattern. Thus I see it as a workaround, not as a defining feature; other than this, the class could be fine with any other object, provided it has .prototype and @@create, and new(...args) would run Foo[@@create]().constructor(...args).)
There is no constructor function (if you look at the syntax). There is a class declaration, where one of the methods can be called 'constructor' and is used to initialize an instance of the class. If you do not define it, the one up the proto chain will be used. So this really behaves as if it were a normal method (in the default case, when you do not rewrite TheClass.prototype.constructor).
With the presented proposal, where new(...args) does (sans errors) Foo[@@create]().constructor(...args), there is even little point in actually defining the default constructor that just calls super (the only reason why it is really needed is that something must be reachable by using the class itself, and using the constructor method in all cases is consistent behaviour, plus the legacy workaround).
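For instance (a sketch under the proposed semantics; Base and Derived are made-up names):

class Base { constructor(x) { this.x = x; } }
class Derived extends Base { } // defines no constructor of its own
// under the proposal, new Derived(1) is roughly Derived[@@create]().constructor(1);
// the 'constructor' lookup walks Derived.prototype -> Base.prototype, so Base's
// constructor method initializes the instance without any explicitly written
// "constructor(...args) { super(...args) }" boilerplate.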
- Some "classes" may not wish to expose an instantiation capability via their instances. ES6 GeneratorFunctions are a good example this. See lower right part of the diagram at people.mozilla.org/~jorendorff/es6-draft.html#sec-15.19.3 . Each GeneratorFunction has an associated prototype that is used by all of that GeneratorFunction's instances. However, the prototype does not have a 'constructor' property. This means that passing someone a generator instances doesn't give them the capability to instantiate additional instances of the same GeneratorFunction. Whether class instances should
Well, we are talking about two things here.
First, that GeneratorFunctions' prototypes (and thus their instances) do not have .constructor. The diagram shows these instances have Object.prototype in their proto chain. Thus, as for the first thing - whether they can be safely instantiated with the presented proposal (which is the question that must be answered) - the answer is yes: GF[@@create]().constructor(...args) @@creates the appropriate object and calls the empty .constructor(...args) inherited from Object.prototype.
The second thing is 'some "classes" may not wish to expose an instantiation capability via their instances' and using GeneratorFunctions as an example of this.
This thing is, as I pointed out in the Issues down in my original post: with the present proposal, no class in general exposes an instantiation capability, because in general 'constructor' is a plain method. Yes, in cases where the constructor method is also used as a "newable" (legacy constructor functions and classes), the default value of .constructor exposes an instantiation capability, though whenever you change it, it stops exposing it.
So, the fact that GeneratorFunctions do not define .constructor in their .prototype is fine.
Or, IOW, I never proposed that .constructor is mandatory in each .prototype. It just has to be somewhere on the proto chain (and the dummy one from Object.prototype is just enough).
expose the capability to create additional instances of the same class is a design decision that situationally might go either way. Exposing that capability via the instance 'constructor' property is a fine default but isn't always the desirable alternative. Removing the
Or did I understand this badly, and you want to have the possibility to expose it and to use .constructor as a channel for it?
Even the present situation is strange with this: by default, constructor functions and classes do expose an instantiation capability via 'new this.constructor'; once you redefine / delete it, it's gone.
It is good to realize that once @@create entered the scene, 'new this.constructor' is an even more brittle channel for instantiation capability. Instantiation must not only have the correct initialization (constructor) part, it also needs the correct @@creation part (not that it was good before, when TheClass.prototype = ...; was common).
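A small sketch of how brittle that channel already is (Shape is a made-up name):

function Shape(color) { this.color = color; }
// common pattern: the whole prototype is replaced and 'constructor' is not restored
Shape.prototype = {
  clone: function () { return new this.constructor(this.color); }
};
var s = new Shape("red");
s.constructor === Object; // true - only the dummy from Object.prototype is found
s.clone() instanceof Shape; // false - the 'new this.constructor' channel is silently broken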
prototype's 'constructor' property seems like a reasonable way to circumvent the default. (BTW, a class whose prototype object does not have a 'constructor' property is similar, in concept, to a private constructor in Java).
The second case is much more common: one redefines .prototype of a function, but does not define .constructor there (there was no real need). I would propose guarding against this case - whenever the .prototype of a function is changed, new would use the old, legacy semantics. Constructor functions with non-changed .prototypes, as well as classes (which have .prototype non-writable), would work fine with the new, cleaner semantics.
- I think the "second case" compatibility issue is very significant.
Same here.
Your fix basically requires that normal functions have a distinct
... after their 'prototype' property was compromised ...
[[Construct]] internal method that ignores the 'constructor' property (just like the current spec.) while class objects would have to have a
All other objects. Looking into the future, the presented semantics aims to be the default for any "newable", with the old one fixing the backward-compatibility issue for constructor functions with a compromised .prototype (silently leaving those with the default, unchanged .prototype but a changed .prototype.constructor to fail, on the assumption that no one tampers with .constructor other than redefining it after having changed .prototype first).
different [[Construct]] that dispatches via the 'construct' property. So far we haven't had to make class objects (ie, functions) a different kind of exotic object from regular functions. That means that class declarations are essentially just sugar and there is no real difference between an abstraction created using a class declaration and one composed out of more primitive function declarations and property accesses. I'd prefer not to lose that equivalence.
You don't lose the equivalence. The class is still only sugar. The proposal says that "whenever the .prototype of a function is changed, the new would use old, legacy semantics". I probably formulated it so that it could be misunderstood; I meant "whenever the 'prototype' property of a function is changed, the new would use the old, legacy semantics". Up to that point it uses the newly proposed one.
A class has its .prototype defined once and set non-writable and non-configurable. So it cannot be changed; thus there is no way the legacy [[Construct]] can be triggered, and the newly proposed one is in charge all the time.
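For reference, a quick check of that (assuming the class 'prototype' attributes as described above):

class Foo { }
Object.getOwnPropertyDescriptor(Foo, "prototype").writable; // false
Object.getOwnPropertyDescriptor(Foo, "prototype").configurable; // false
Foo.prototype = {}; // has no effect (throws a TypeError in strict mode code)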
- I'm reluctant to put additional property probes/access in the fast path of the 'new' operator. Object instantiation should be fast and every extra property access or invocation costs something. Maybe they can be optimized away, but maybe not...
I believe there are ways to make it faster (guard every .constructor change in all objects that were used as .prototype in at least one 'new', for example). I think method invocation happens more often than 'new', so if method invocation is fast, the new from this proposal is fast as well - it is @@creating a bare instance and then just invoking the 'constructor' method on it.
[ reposting, since there was no reply to original posting ] [ thread starting here: esdiscuss/2013-August/033089 ]
PROBLEM
In the present state of the spec, there is a little inconsistency between the behaviour of new and super. What these operations roughly do is:

new Foo(...args) is Foo.call(Foo[@@create](), ...args)
super(...args) inside a constructor is __superclassproto__.constructor.call(this, ...args)
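Expressed as a rough helper-function sketch (Symbol.create stands in for @@create here; this is only a paraphrase of the description above, not spec text):

// roughly what new Foo(...args) does today, per the description above
function legacyNew(Foo, ...args) {
  var obj = Foo[Symbol.create](); // allocation via @@create
  Foo.apply(obj, args); // initialization runs Foo's own body
  return obj;
}
// roughly what super(...args) inside a constructor does
function superCall(superProto, self, ...args) {
  return superProto.constructor.apply(self, args); // dispatch via the 'constructor' property
}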
This is an elegant and consistent solution - super behaves as in any other method - calling the superclass's version of itself. Since Foo.prototype.constructor is set to Foo by default, no inconsistency is observed in the default case - super(...args) in a subclass of Foo and new Foo call the same function. But if the constructor is changed (or deleted / not defined), an inconsistency appears - new still calls Foo, but super calls a different function (or fails if there is no .constructor in the proto chain).

My gut feeling is that new Class and super in SubClass should do the same thing. Also, if this (IMO) bug begins to be exploited, to "have different initialization of own instance versus subclass one", there is no way back.

SOLUTION
There is an elegant solution for this, by redefining new to do roughly:

new Foo(...args) is Foo[@@create]().constructor(...args)
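Or, as the same kind of helper-function sketch (again, Symbol.create stands in for @@create; only a paraphrase of the line above):

function proposedNew(Foo, ...args) {
  var obj = Foo[Symbol.create](); // allocate a bare instance
  obj.constructor(...args); // initialize it via the ordinary 'constructor' method
  return obj;
}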
Compared to the previous semantics, this is much cleaner and more understandable, and on par with the super philosophy of "treat constructor as just another method". For default cases, this works identically to the former definition.
For the class keyword, if you change the constructor method of an existing class, this semantics nicely implements your intent - to change the way class Foo is initialized (in both new and super).

The remaining scenario is a changed .constructor of a constructor function. Here, it can be changed directly (you change .constructor of the existing .prototype) or indirectly (you change .prototype of the constructor function). Both would break the existing web. The silent assumption of this proposal is that the former case (changing .constructor of the default .prototype) is rare, if it ever appears, though I did not search for this.

The second case is much more common: one redefines .prototype of a function, but does not define .constructor there (there was no real need). I would propose guarding against this case - whenever the .prototype of a function is changed, new would use the old, legacy semantics. Constructor functions with non-changed .prototypes, as well as classes (which have .prototype non-writable), would work fine with the new, cleaner semantics.
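A sketch of the pattern this guard is aimed at (Widget is a made-up name):

function Widget(id) { this.id = id; }
// the whole prototype is replaced; nobody bothers to restore 'constructor'
Widget.prototype = {
  render: function () { return "<div>" + this.id + "</div>"; }
};
// Widget.prototype.constructor is now the inherited Object, not Widget, so under
// the proposed guard such a function would keep the old, legacy 'new' semantics.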
FUTURE PROOFED BROADER NEW

This change decouples the need for the Foo in new Foo to be callable - so the new semantics of new allows any object having @@create defined to be usable inside new. The initialization of the instance is nothing more than just calling the constructor method with appropriate args, so the new instance is responsible for initializing itself, no matter who was its creator/allocator.
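A sketch of such a broader "newable" (again, Symbol.create stands in for @@create; nothing here is current spec behaviour):

var Pair = {
  prototype: { constructor: function (a, b) { this.a = a; this.b = b; } }
};
Pair[Symbol.create] = function () { return Object.create(Pair.prototype); };
// new Pair(1, 2) would then roughly mean Pair[Symbol.create]().constructor(1, 2),
// even though Pair itself is not callable at all.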
ISSUES

- I do not know whether changing .prototype.constructor without changing .prototype itself on legacy constructor functions is really rare, or whether it has its legitimate use and is spread.
- With the presented proposal, no class in general exposes an instantiation capability via its instances (new this.constructor(...args)).