custom proto arg for Proxy.createFunction?

# David Herman (15 years ago)

I've been working on a prototype implementation of the binary data spec in pure JS (implemented via typed arrays) and I've been bitten by the lack of a standard mechanism for subclassing Function.

I'm using proxies for the implementation, and Proxy.createFunction doesn't let me specify a custom prototype. Now, I can understand that this preserves the existing property of the language that the only callable things are either regexps or descendants of Function. But we could extend the proxy API to allow custom functions with user-specified prototypes and still preserve this property:

Proxy.createFunction(handler, callTrap[, constructTrap[, proto]])

The proto argument would default to the original value of Function.prototype [1]. But if the user provides a prototype, the library could enforce that proto instanceof Function [2]. This way we would preserve the invariant that for any callable value v, either v is a regexp or (typeof v === "function" && v instanceof Function).
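For concreteness, here is a minimal sketch of how the extended call might look, assuming the trailing proto argument proposed above (it is not part of the existing harmony:proxies API; the handler is left empty for brevity):

// FnSub delegates to Function.prototype, so the proposed enforcement check would accept it
var FnSub = Object.create(Function.prototype);
var f = Proxy.createFunction({ /* traps */ }, function () {}, function () {}, FnSub);
// intended result: typeof f === "function", f instanceof Function,
// and Object.getPrototypeOf(f) === FnSub rather than Function.prototype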

Maciej has also suggested a Function.create(...) API for more lightweight creation of function subtypes. This would strengthen the argument for allowing Proxy.createFunction to specify a prototype, since Proxy.createFunction() ought to be able to do anything Function.create() can do.

I'm curious to know if there are reasons I've missed why Proxy.createFunction() doesn't support a custom prototype. It seems to me like a nice additional expressiveness without any great loss of language invariants. But I may have missed something.

Thanks, Dave

[1] In the face of multiple globals, this would be the Function associated with the same global as Proxy. (BTW, I intend to work with Andreas on speccing out multiple globals.)

[2] Again, the Function associated with the same global as Proxy.

# David Herman (15 years ago)

PS Correction: it's actually a non-standard extension that regexps are callable in SpiderMonkey. So the invariant is that the only callable objects are either descendants of Function or host objects. This doesn't change my overall point, though.

# David Bruant (15 years ago)

Le 23/02/2011 23:26, David Herman a écrit :

I've been working on a prototype implementation of the binary data spec in pure JS (implemented via typed arrays) and I've been bitten by the lack of a standard mechanism for subclassing Function.

I'm using proxies for the implementation, and Proxy.createFunction doesn't let me specify a custom prototype. Now, I can understand that this preserves the existing property of the language that the only callable things are either regexps or descendants of Function. But we could extend the proxy API to allow custom functions with user-specified prototypes and still preserve this property:

Proxy.createFunction(handler, callTrap[, constructTrap[, proto]])

The proto argument would default to the original value of Function.prototype [1]. But if the user provides a prototype, the library could enforce that proto instanceof Function [2]. This way we would preserve the invariant that for any callable value v, either v is a regexp or (typeof v === "function" && v instanceof Function).

With your optional argument, I see a second solution that could be consistent. The prototype chain could contain the provided prototype then Function.prototype ("obj --> proto --> Function.prototype --> null", as opposed to your proposition, which is "obj --> proto --> null"). Hence, there would be no need to enforce anything: instanceof would naturally find Function.prototype in the chain and function proxies would still be functions.

Maciej has also suggested a Function.create(...) API for more lightweight creation of function subtypes. This would strengthen the argument for allowing Proxy.createFunction to specify a prototype, since Proxy.createFunction() ought to be able to do anything Function.create() can do.

I'm actually starting to think that this would be a good idea and could be applied to other things. Here (perfectionkills.com/how-ecmascript-5-still-does-not-allow-to-subclass-an-array) is a complaint that arrays cannot be subclassed. A decent solution for that could be an Array.create(proto) method which would create an array with the prototype chain "myArray --> proto --> Array.prototype --> null". It could solve all the problems at once.
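A minimal sketch of what such a hypothetical Array.create could look like from the caller's side (no such method exists in ES5; the names are illustrative only):

var proto = Object.create(Array.prototype);
proto.last = function () { return this[this.length - 1]; };

var a = Array.create(proto);   // hypothetical: a real array whose [[Prototype]] is proto
a.push(1, 2, 3);
// a.length === 3, a.last() === 3, a instanceof Array === true
// chain: a --> proto --> Array.prototype --> Object.prototype --> null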

I'm curious to know if there are reasons I've missed why Proxy.createFunction() doesn't support a custom prototype. It seems to me like a nice additional expressiveness without any great loss of language invariants. But I may have missed something.

With your solution, by removing Function.prototype from the chain, proxy functions couldn't be .call()-ed or .apply()-ed.

Thanks, Dave

[1] In the face of multiple globals, this would be the Function associated with the same global as Proxy. (BTW, I intend to work with Andreas on speccing out multiple globals.)

[2] Again, the Function associated with the same global as Proxy.


# David Herman (15 years ago)

With your optional argument, I see a second solution that could be consistent. The prototype chain could contain the provided prototype then Function.prototype ("obj --> proto --> Function.prototype --> null" as opposed to your proposition which is: "obj --> proto --> null" ). Hence, there would be no need to enforce anything: instanceof would naturally find Function.prototype in the chain and function proxies would still be functions.

I don't quite see how that works; prototype inheritance is inherently singly-linked. If the user specifies a prototype object, it already has its own single prototype -- and we certainly aren't going to spec something that mutates its existing prototype.

I suppose you could have some sort of multiple inheritance chain semantics that says "follow this chain to the end, then when you're done, here's another chain." But that seems kind of crazy.

You could spec something a bit more ad hoc, just for the purposes of inheritance: that no matter what object the user provides, `instanceof' always says true for Function.prototype as well. I don't have a concrete argument against it, but it still looks funny to me.

The simple instanceof check seems simpler and reasonably intuitive.

With your solution, by removing Function.prototype from the chain, proxy functions couldn't be .call()-ed or .apply()-ed

That's already true, even without my extension:

js> var proxy = Proxy.createFunction({}, function(){}, function(){})
js> proxy.call(this)   
typein:11: TypeError: getPropertyDescriptor is not a function

Now, you can still do:

js> Function.prototype.call.call(proxy, this)

But that would continue to be true with my extension as well.

# Brendan Eich (15 years ago)

On Feb 23, 2011, at 2:26 PM, David Herman wrote:

Maciej has also suggested a Function.create(...) API for more lightweight creation of function subtypes.

Maciej's proposal was first raised here:

esdiscuss/2009-March/008954

and recorded as part of a strawman here:

strawman:name_property_of_functions

and is only for the ability (beyond the reach of the Function constructor) to specify the intrinsic name of the created function object.

Adding an optional leading (not trailing) proto parameter is plausible if Function.create throws for an actual proto parameter that does not delegate to (the same Function) Function.prototype.
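A rough sketch of that enforcement, assuming a hypothetical Function.create with an optional leading proto argument (the signature is illustrative, not taken from the strawman):

function checkFunctionCreateProto(proto) {
  // accept undefined (default to Function.prototype) or anything that delegates
  // to Function.prototype; reject everything else
  if (proto !== undefined && !(proto instanceof Function)) {
    throw new TypeError("proto must delegate to Function.prototype");
  }
}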

Oops, my name is by that strawman. I'd better make time for it soon.

# David Bruant (15 years ago)

Le 23/02/2011 23:54, David Herman a écrit :

With your optional argument, I see a second solution that could be consistent. The prototype chain could contain the provided prototype then Function.prototype ("obj --> proto --> Function.prototype --> null", as opposed to your proposition, which is "obj --> proto --> null"). Hence, there would be no need to enforce anything: instanceof would naturally find Function.prototype in the chain and function proxies would still be functions.

I don't quite see how that works; prototype inheritance is inherently singly-linked. If the user specifies a prototype object, it already has its own single prototype -- and we certainly aren't going to spec something that mutates its existing prototype.

I suppose you could have some sort of multiple inheritance chain semantics that says "follow this chain to the end, then when you're done, here's another chain." But that seems kind of crazy.

You could spec something a bit more ad hoc, just for the purposes of inheritance: that no matter what object the user provides, `instanceof' always says true for Function.prototype as well. I don't have a concrete argument against it, but it still looks funny to me.

I'm sorry, I was being a little bit crazy. I thought for a minute that prototype-chain appending was possible, and I was wrong.

The simple instanceof check seems simpler and reasonably intuitive.

By "hardcoding" p instanceof Function === true, you're breaking your user expectation of finding the Function.prototype methods ("call", "apply" or any further addition) in the object. In current function proxies, they are "available", I have "fixed" your example below (and explained why I quote both "available" and "fixed").

With your solution, by removing Function.prototype from the chain, proxy functions couldn't be .call()-ed or .apply()-ed.

That's already true, even without my extension:

js> var proxy = Proxy.createFunction({}, function(){}, function(){})
js> proxy.call(this)   
typein:11: TypeError: getPropertyDescriptor is not a function

The bug is in your code: when you do "proxy.call(this)", the "proxy.call" property access is trapped by the "get" trap. Since you do not define it (your handler object is empty), the default get trap is called (see here: harmony:proxies#trap_defaults). In this default get trap, the first thing that is done is a call to this.getPropertyDescriptor ("this" refers to the handler object, which is empty in your example). And here the error is thrown by the engine, since handler.getPropertyDescriptor isn't a function because it's undefined.

You need to provide (at least) all fundamental traps in your proxies to make them work in all situations. (developer.mozilla.org/en/JavaScript/Reference/Global_Objects/Proxy#Common_mistakes_and_misunderstanding)

The following code is ugly, but works:

var proxy = Proxy.createFunction({
  getOwnPropertyDescriptor: function(name) {},
  getPropertyDescriptor: function(name) {},
  getOwnPropertyNames: function() {},
  getPropertyNames: function() {},
  defineProperty: function(name, desc) {},
  delete: function(name) {},
  fix: function() {},
  get: function(rec, name) {
    if (name == 'call')
      return Object.getPrototypeOf(proxy).call;
  }
}, function(){ alert('called'); }, function(){});

proxy.call(); // alerts 'called'

In this example, the other traps (fundamental or derived) aren't actually used, so stub methods are enough; but if your proxy is meant to be used in the wild, it's safer to implement them properly.

This code is ugly because my get trap assumes that the proxy object is reachable from the lexical scope, which might not be the case (I could have used the "rec" argument, but that's not very clean). There is an ongoing discussion about making the proxy object available as an argument in each trap. This way, the get trap could be rewritten as:

get: function (rec, name, proxy) {
  // Object.getPrototypeOf(proxy) is Function.prototype by definition here
  return Object.getPrototypeOf(proxy)[name];
}

This is actually one more argument for making that happen.


# David Herman (15 years ago)

The simple instanceof check seems simpler and reasonably intuitive.

By "hardcoding" p instanceof Function === true

I'm not proposing this -- it's already true. Try it out in SpiderMonkey:

js> (Proxy.createFunction({}, function(){}, function(){})) instanceof Function
true

you're breaking your user expectation of finding the Function.prototype methods ("call", "apply" or any further addition) in the object.

I think I haven't made myself clear. Certainly, users generally expect invariants like:

if (x instanceof y) and (key in y.prototype) then (key in x)
if (y === Object.getPrototypeOf(x)) and (key in y) then (key in x)

But proxies already break these invariants. It's already possible to write a proxy that doesn't have a property that its prototype has. You can do this both with Proxy.create(), which lets you provide a prototype but act as though you don't have properties of the prototype, and with Proxy.createFunction(), which insists that your proxy is an instanceof Function but again lets you act as though you don't have the properties of Function.prototype.
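To make that concrete, here is a small sketch against the old Harmony proxy API used in this thread (only a few traps are filled in; the handler deliberately ignores its prototype):

var parent = { greet: function () { return "hi"; } };
var p = Proxy.create({
  getPropertyDescriptor: function (name) { return undefined; },  // "no such property, anywhere"
  getOwnPropertyDescriptor: function (name) { return undefined; },
  get: function (receiver, name) { return undefined; },
  has: function (name) { return false; }
}, parent);
// Object.getPrototypeOf(p) === parent, yet p.greet is undefined and ("greet" in p) is false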

None of this has anything to do with my suggested extension, which is allowing a user-specified prototype to Proxy.createFunction.

With your solution, by removing Function.prototype from the chain, proxy functions couldn't be .call()-ed or .apply()-ed.

That's already true, even without my extension:

js> var proxy = Proxy.createFunction({}, function(){}, function(){})
js> proxy.call(this)
typein:11: TypeError: getPropertyDescriptor is not a function

The bug is in your code:

No, there's no bug. The point of my code was to demonstrate that it's already possible to break the above invariant.

If you want to promote as best practices that people should strive to uphold the invariant, even though it isn't enforced by the language, that's fine. But allowing user-specified prototypes for Proxy.createFunction doesn't change this. It's still up to the proxy writer to ensure that they delegate to the prototype, regardless of whether the prototype is Function.prototype or a descendant of Function.prototype.

When you do "proxy.call(this)", the "proxy.call" getter is trapped as the "get" trap. Since you do not define it (your handler objectis empty), the default get trap is called (see here : harmony:proxies#trap_defaults). In this default get trap, the first thing that is done is a call to this.getPropertyDescriptor ("this" refers to the handler object which is empty in your example). And here, the error is thrown by the engine since handler.getPropertyDescriptor isn't a function because it's undefined.

Well, of course! That's my point. Proxies do not force you to respect prototype delegation.

This code is ugly, because my get handler methods assumes that the proxy object is reachable from the lexical scope which could not be the case (I could have used the "rec" argument, but that's not very clean). There is an ongoing discussion to make the proxy object available as an argument in each trap. This way, the get trap could be rewritten as: get: function (rec, name, proxy){ return Object.getPrototypeOf(proxy)[name]; // Object.getPrototypeOf(proxy) is Function.prototype by definition here } This is actually one more argument in the direction of making this happen.

That seems like a non sequitur; in what you've written, I don't see any argument against allowing Proxy.createFunction() to accept a user-specified prototype.

Again: the whole design of the proxy system already requires you to implement the delegation manually if you want to respect the prototype chain. But you don't have to respect the prototype delegation chain. That's true for both Proxy.create() and Proxy.createFunction().

All I'm proposing is allowing you to specify a descendant of Function.prototype as your prototype link, rather than Function.prototype itself. Either way, it's up to the programmer to decide whether they want to respect prototype delegation in their traps.
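Conversely, here is a sketch (same old API, fundamental traps omitted; the proto object and its "describe" method are purely illustrative) of a function proxy whose writer does choose to respect delegation to a custom descendant of Function.prototype:

var proto = Object.create(Function.prototype);
proto.describe = function () { return "binary data constructor"; };

var handler = {
  // delegate property lookups by hand to the chosen proto,
  // which in turn delegates to Function.prototype
  get: function (receiver, name) { return proto[name]; }
};

var f = Proxy.createFunction(handler, function () { return 42; });
// f.describe() works via the trap, and f.call(null) reaches Function.prototype.call --
// but only because the trap was written to delegate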

# David Bruant (15 years ago)

Le 24/02/2011 02:32, David Herman a écrit :

The simple instanceof check seems simpler and reasonably intuitive. By "hardcoding" p instanceof Function === true I'm not proposing this -- it's already true. Try it out in SpiderMonkey:

js> (Proxy.createFunction({}, function(){}, function(){})) instanceof Function
true

you're breaking your user expectation of finding the Function.prototype methods ("call", "apply" or any further addition) in the object. I think I haven't made myself clear. Certainly, users generally expect invariants like:

if (x instanceof y) and (key in y.prototype) then (key in x)
if (y === Object.getPrototypeOf(x)) and (key in y) then (key in x)

But proxies already break these invariants. It's already possible to write a proxy that doesn't have a property that its prototype has. You can do this both with Proxy.create(), which lets you provide a prototype but act as though you don't have properties of the prototype, and with Proxy.createFunction(), which insists that your proxy is an instanceof Function but again lets you act as though you don't have the properties of Function.prototype.

None of this has anything to do with my suggested extension, which is allowing a user-specified prototype to Proxy.createFunction.

With your solution, by removing Function.prototype from the chain, proxy functions couldn't be .call()-ed or .apply()-ed That's already true, even without my extension:

js> var proxy = Proxy.createFunction({}, function(){}, function(){})
js> proxy.call(this)
typein:11: TypeError: getPropertyDescriptor is not a function

The bug is in your code:

No, there's no bug. The point of my code was to demonstrate that it's already possible to break the above invariant.

If you want to promote as best practices that people should strive to uphold the invariant, even though it isn't enforced by the language, that's fine. But allowing user-specified prototypes for Proxy.createFunction doesn't change this. It's still up to the proxy writer to ensure that they delegate to the prototype, regardless of whether the prototype is Function.prototype or a descendant of Function.prototype.

It's currently true that it's up to the writer, but that is likely to become only half true: previously (starting here: esdiscuss/2011-January/012603) we've been discussing the idea that getPropertyNames and getPropertyDescriptor become derived traps whose default behavior would be to climb the prototype chain. This way, by default, prototype delegation occurs, so it wouldn't be up to the proxy writer to ensure it. But the proxy writer can still decide to override getPropertyNames and getPropertyDescriptor (or any other relevant trap) and mess with prototype delegation, I agree.
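A rough sketch of that discussed default (not the actual strawman text): a derived getPropertyDescriptor that first asks the handler for an own property and then climbs the proxy's prototype chain:

function defaultGetPropertyDescriptor(handler, proxy, name) {
  var desc = handler.getOwnPropertyDescriptor(name);
  var obj = Object.getPrototypeOf(proxy);
  while (desc === undefined && obj !== null) {
    desc = Object.getOwnPropertyDescriptor(obj, name);
    obj = Object.getPrototypeOf(obj);
  }
  return desc;
}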

When you do "proxy.call(this)", the "proxy.call" getter is trapped as the "get" trap. Since you do not define it (your handler objectis empty), the default get trap is called (see here : harmony:proxies#trap_defaults). In this default get trap, the first thing that is done is a call to this.getPropertyDescriptor ("this" refers to the handler object which is empty in your example). And here, the error is thrown by the engine since handler.getPropertyDescriptor isn't a function because it's undefined. Well, of course! That's my point. Proxies do not force you to respect prototype delegation.

Ok, I thought you made a mistake.

This code is ugly, because my get handler methods assumes that the proxy object is reachable from the lexical scope which could not be the case (I could have used the "rec" argument, but that's not very clean). There is an ongoing discussion to make the proxy object available as an argument in each trap. This way, the get trap could be rewritten as: get: function (rec, name, proxy){ return Object.getPrototypeOf(proxy)[name]; // Object.getPrototypeOf(proxy) is Function.prototype by definition here } This is actually one more argument in the direction of making this happen. That seems like a non sequitur; in what you've written, I don't see any argument against allowing Proxy.createFunction() to accept a user-specified prototype.

Again: the whole design of the proxy system already requires you to implement the delegation manually if you want to respect the prototype chain. But you don't have to respect the prototype delegation chain. That's true for both Proxy.create() and Proxy.createFunction().

All I'm proposing is allowing you to specify a descendant of Function.prototype as your prototype link, rather than Function.prototype itself.

What do you call a "descendant of Function.prototype". Does it mean that your proto argument MUST have Function.prototype in its chain too (and Proxy.createFunction throws an error if it isn't the case for instance)?

I was puzzled by your first message when you said "the library could enforce that proto instanceof Function". I see four ways to do that:

  1. Hardcode it. This may break some promise. I agree the delegation promise can be broken by proxies. However, the following stands even with current proxies: p instanceof C <=> there exists n such that (Object.getPrototypeOf)^n(p) === C.prototype (please forgive the "to the power of n" notation; see the chain-walking sketch after this list). This would be broken if the instanceof result was hardcoded, so this isn't a solution (I know you said that's not your proposal; I'm just enumerating).

  2. Change the proto prototype chain. We have a proto object with a prototype chain that looks like: proto --> p1 --> p2 --> ... --> pn --> null

I understand your suggestion as: when

var p = Proxy.createFunction(h, c, cstr, proto);

then the prototype chain of p would be: p --> proto --> p1 --> p2 --> ... --> pn --> Function.prototype --> Object.prototype --> null (the Object.prototype is specified in ES5 15.3.4).

However, this would break some invariants:

var o = Object.create(proto); // o instanceof Function === false
var p = Proxy.createFunction(h, c, cstr, proto);
// p instanceof Function === true, which is what we want
// But we now also have:
// o instanceof Function === true
// which is an unwanted side effect of changing proto's prototype chain.

There is no mechanism in ES5 to change the prototype chain of an object, certainly because it would artificially break inheritance invariants.

  3. Clone the prototype chain of proto and append Function.prototype to the fresh clone's prototype chain. But this causes an object identity problem (certainly among many others):

function C(){}
C.prototype = proto;
var p = Proxy.create(a, b, c, proto);
// p instanceof C === false, because we're dealing with a copy of proto

  4. Make sure that proto contains Function.prototype in its chain and throw an error if it isn't the case.

Apologies, I jumped a bit too quickly on the first solution when you talked about "enforcing" the instanceof result. As shown, none of the first three solutions are really solutions, since they break a language invariant (one respected even by proxies in their current form). So, is the 4th solution what you suggest? Or is it something else?
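As promised above, a small helper making the option-1 invariant concrete: for ordinary (non-proxy) objects, p instanceof C holds exactly when walking p's prototype chain reaches C.prototype:

function isInstanceByChainWalk(p, C) {
  for (var o = Object.getPrototypeOf(p); o !== null; o = Object.getPrototypeOf(o)) {
    if (o === C.prototype) return true;
  }
  return false;
}
// e.g. isInstanceByChainWalk([], Array) === true, matching ([] instanceof Array)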

Either way, it's up to the programmer to decide whether they want to respect prototype delegation in their traps.

Indeed. Actually, since it's currently up to the programmer to re-implement delegation, what are you expecting from choosing your prototype object? Can't you just use it in your handler object? I'm just talking about the current state of things (current spec state and current FF4 implementation). For the longer term, if, as discussed, getPropertyNames and getPropertyDescriptor become derived traps with the suggested default implementation, it completely makes sense to add an additional argument, with the still-pending issue of how to enforce the result of instanceof (I'm actually a fan of the 4th solution).

# David Bruant (15 years ago)

Ok... hmm... well... I misread your initial post. You wanted to enforce the condition on the argument proto object, not the object returned by Proxy.createFunction. My apologies, here's my fixed response:

Le 24/02/2011 11:42, David Bruant a écrit :

Le 24/02/2011 02:32, David Herman a écrit :

The simple instanceof check seems simpler and reasonably intuitive. By "hardcoding" p instanceof Function === true I'm not proposing this -- it's already true. Try it out in SpiderMonkey:

js> (Proxy.createFunction({}, function(){}, function(){})) instanceof Function
true

you're breaking your user expectation of finding the Function.prototype methods ("call", "apply" or any further addition) in the object. I think I haven't made myself clear. Certainly, users generally expect invariants like:

if (x instanceof y) and (key in y.prototype) then (key in x)
if (y === Object.getPrototypeOf(x)) and (key in y) then (key in x)

But proxies already break these invariants. It's already possible to write a proxy that doesn't have a property that its prototype has. You can do this both with Proxy.create(), which lets you provide a prototype but act as though you don't have properties of the prototype, and with Proxy.createFunction(), which insists that your proxy is an instanceof Function but again lets you act as though you don't have the properties of Function.prototype.

None of this has anything to do with my suggested extension, which is allowing a user-specified prototype to Proxy.createFunction.

With your solution, by removing Function.prototype from the chain, proxy functions couldn't be .call()-ed or .apply()-ed That's already true, even without my extension:

js> var proxy = Proxy.createFunction({}, function(){}, function(){})
js> proxy.call(this)
typein:11: TypeError: getPropertyDescriptor is not a function

The bug is in your code:

No, there's no bug. The point of my code was to demonstrate that it's already possible to break the above invariant.

If you want to promote as best practices that people should strive to uphold the invariant, even though it isn't enforced by the language, that's fine. But allowing user-specified prototypes for Proxy.createFunction doesn't change this. It's still up to the proxy writer to ensure that they delegate to the prototype, regardless of whether the prototype is Function.prototype or a descendant of Function.prototype. This is currently true that it's up to the writer, but is likely to become half-false: Previously (starting here: esdiscuss/2011-January/012603) we've been discussing the idea that getPropertyNames and getPropertyDescriptor become derived traps and that their default behavior would be to climb the prototype chain. This way, by default, the prototype delegation occurs, so it wouldn't be up to the proxy writer to ensure prototype delegation. But the proxy writer can decide to override getPropertyNames and getPropertyDescriptor (or any trap that would) anyway and then mess with prototype delegation, I agree.

When you do "proxy.call(this)", the "proxy.call" getter is trapped as the "get" trap. Since you do not define it (your handler objectis empty), the default get trap is called (see here : harmony:proxies#trap_defaults). In this default get trap, the first thing that is done is a call to this.getPropertyDescriptor ("this" refers to the handler object which is empty in your example). And here, the error is thrown by the engine since handler.getPropertyDescriptor isn't a function because it's undefined. Well, of course! That's my point. Proxies do not force you to respect prototype delegation. Ok, I thought you made a mistake.

This code is ugly, because my get handler methods assumes that the proxy object is reachable from the lexical scope which could not be the case (I could have used the "rec" argument, but that's not very clean). There is an ongoing discussion to make the proxy object available as an argument in each trap. This way, the get trap could be rewritten as: get: function (rec, name, proxy){ return Object.getPrototypeOf(proxy)[name]; // Object.getPrototypeOf(proxy) is Function.prototype by definition here } This is actually one more argument in the direction of making this happen. That seems like a non sequitur; in what you've written, I don't see any argument against allowing Proxy.createFunction() to accept a user-specified prototype.

Again: the whole design of the proxy system already requires you to implement the delegation manually if you want to respect the prototype chain. But you don't have to respect the prototype delegation chain. That's true for both Proxy.create() and Proxy.createFunction().

All I'm proposing is allowing you to specify a descendant of Function.prototype as your prototype link, rather than Function.prototype itself. (...)

  4. Make sure that proto contains Function.prototype in its chain and throw an error if it isn't the case

That's what you're suggesting from the beginning if I understand well.

# Dave Herman (15 years ago)

David Bruant <bruant at enseirb-matmeca.fr> wrote:

Ok... hmm... well... I misread your initial post. You wanted to enforce the condition on the argument proto object, not the object returned by Proxy.createFunction.

That's right.

That's what you're suggesting from the beginning if I understand well.

Exactly. Sorry that was unclear.

(I'm actually a fan of the 4th solution).

Cool!

I hope Tom & Mark can also weigh in on this when they get a chance.

# Lasse Reichstein (15 years ago)

On Wed, 23 Feb 2011 23:41:00 +0100, David Bruant
<bruant at enseirb-matmeca.fr> wrote:

Le 23/02/2011 23:26, David Herman a écrit :

I've been working on a prototype implementation of the binary data spec
in pure JS (implemented via typed arrays) and I've been bitten by the
lack of a standard mechanism for subclassing Function.

I'm using proxies for the implementation, and Proxy.createFunction
doesn't let me specify a custom prototype. Now, I can understand that
this preserves the existing property of the language that the only
callable things are either regexps or descendants of Function. But we
could extend the proxy API to allow custom functions with
user-specified prototypes and still preserve this property:

Proxy.createFunction(handler, callTrap[, constructTrap[, proto]])

The proto argument would default to the original value of
Function.prototype [1]. But if the user provides a prototype, the
library could enforce that proto instanceof Function [2]. This way we
would preserve the invariant that for any callable value v, either v is
a regexp or (typeof v === "function" && v instanceof Function).

As long as proto is writable, that can be changed later anyway. If we get rid of writable proto, then it would be an invariant.

I'm not sure why it's important that all Callable objects need to have Function.prototype in their prototype chain, though. If it's in order to detect that an object can be called, we could have Object.isCallable(o) instead.
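A sketch of the intent, under the assumption that such an Object.isCallable existed (today one approximates it with a typeof test, which is exactly the kind of check the invariant is meant to keep reliable):

// approximation available today; a real Object.isCallable would instead report
// whether the object has a [[Call]] internal method, host objects included
function isCallable(x) { return typeof x === "function"; }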

With your optional argument, I see a second solution that could be consistent. The prototype chain could contain the provided prototype then Function.prototype ("obj --> proto --> Function.prototype --> null" as opposed to your proposition which is: "obj --> proto --> null" ).

As I read it, that's not what's proposed. Remember that any object has a prototype chain, so the supplied "proto" could already have the prototype chain proto->Function.prototype->Object.prototype->null.

At least, Function.prototype will have Object.prototype as prototype, so it shouldn't be followed by null.

Hence, there would be no need to enforce anything: instanceof would naturally find Function.prototype in the chain and function proxies would still be functions.

There is no good way to add Function.prototype to the prototype chain of "proto" (that would need to change proto's prototype chain, which could cause other uses of it to break), and also no good way to add "proto" to
the prototype chain of Function.prototype.

I think the best you can do is to require Function.prototoype to be in the prototype chain of "proto", if you want to ensure that all callables are
Functions.

Maciej has also suggested a Function.create(...) API for more
lightweight creation of function subtypes. This would strengthen the
argument for allowing Proxy.createFunction to specify a prototype,
since Proxy.createFunction() ought to be able to do anything
Function.create() can do. I'm actually starting to think that this would be a good idea and could be applied to other things. Here (perfectionkills.com/how-ecmascript-5-still-does-not-allow-to-subclass-an-array) is a complain of Arrays which cannot be subclassed. A decent solution for that could be to have a Array.create(proto) method which would create an array with the prototype chain "myArray --> proto --> Array.prototype --> null". It could solve all the problems at a time.

Agree. For consistency, there should also be Date.create (I've seen that being wanted, the solution ended up being proto assignment),
String.create, Number.create, Boolean.create, RegExp.create and ... whatever [[Class]] values I'm forgetting.

Prototype inheritance only inherits properties, so the non-property
behavior of an object must live on the top object. Inheriting from an Array won't
make you an array, so you need a way to make an Array inherit from something
else to make your own extended Arrays, or Dates.

I'm curious to know if there are reasons I've missed why
Proxy.createFunction() doesn't support a custom prototype. It seems to
me like a nice additional expressiveness without any great loss of
language invariants. But I may have missed something. With your solution, by removing Function.prototype from the chain, proxy functions couldn't be .call()-ed or .apply()-ed

Well, you could do: Function.prototype.call.call(funcProxy, null, arg1, arg2) if you want to be absolutely certain.

# Brendan Eich (15 years ago)

On Feb 25, 2011, at 5:41 AM, Lasse Reichstein wrote:

As long as proto is writable, that can be changed later anyway. If we get rid of writable proto, then it would be an invariant.

We will work to get rid of writable proto and then proto altogether, but that will take time and require browser vendor cooperation.

Anyway, proto is non-standard so let's turn a blind eye to it.

I'm not sure why it's important that all Callable objects need to have Function.prototype in their prototype chain, though.

The core language invariants, per ES5, include:

typeof x == "function => x() <=> x.call(undefined)

provided x is a native object. If x refers to a host object, .call may not be Function.prototype.call or a workalike.

It turns out in "JS in reality", with multiple global objects and also (independently) due to host objects, this invariant varies.

So perhaps it is not "important", but spec invariants often are. Consider

typeof x == typeof y && x == y <=> x === y

(That one, I believe all engines uphold.)

We hope to extend the spec to cover "reality" over time, while reforming "reality" to uphold spec invariants if we can. This is one reason we do not spec all of "reality" -- we want to turn a blind spec-eye while working with implementors to deprecate and reform.

# Tom Van Cutsem (15 years ago)

I like your proposed extension.

IIRC, the main invariant that we wanted to uphold is that callable objects are always instanceof Function, which continues to hold under your proposal. Again, repeating what you said: even without this extension, function proxies are already powerful enough to act as if they inherited from a custom prototype P, except that they can never claim to be "instanceof P". I don't see any reason for disallowing this if P inherits from Function.prototype. Mark may remember more details though.

2011/2/24 Dave Herman <dherman at mozilla.com>

# Boris Zbarsky (15 years ago)

On 2/25/11 9:05 AM, Brendan Eich wrote:

typeof x == typeof y&& x == y<=> x === y

(That one, I believe all engines uphold.)

Though Spidermonkey used to not have this for x and y objects (equality hooks, etc). Has that changed?

# Brendan Eich (15 years ago)

On Feb 25, 2011, at 9:00 AM, Boris Zbarsky wrote:

On 2/25/11 9:05 AM, Brendan Eich wrote:

typeof x == typeof y&& x == y<=> x === y

(That one, I believe all engines uphold.)

Though Spidermonkey used to not have this for x and y objects (equality hooks, etc). Has that changed?

Those were added for E4X:

js> x = <x>hi</x> <x>hi</x>

js> y = <x>hi</x> <x>hi</x>

js> x == y

true js> x === y

false js> typeof x "xml" js> typeof y "xml"

Again, turn a blind eye (the other one; uh oh).

These may also have been used for proxies or wrappers, and IIRC IE had some COM interface that allowed equality among proxies, but I'm not sure of the details. Firefox 4 security membranes do not cheat.

As long as == and === exist and engines have rich-enough embedding APIs, shenanigans will happen. Invariants need socialization out to the remote wastelands of browser DOMs, COM-like bridge layers, etc. Fight the good fight.