Fwd: undefined being treated as a missing optional argument
Just to start, it means a method like:
Array.prototype.fill = function(value=null) { this.forEach((v,i)=>this[i]=value)}
wouldn't do the obvious thing when called as:
someArray.fill(undefined)
I think it is debatable what is obvious in this case. The wiki says that this should fill the array with 'undefined' (I believe the spec draft aims to say the same, but I can't tell where this is established). But I think many JavaScript developers would expect this to fill the array with 'null'.
Some arguments in favor of treating undefined the same as not-present:
- Existing uses of default value patterns in JavaScript use undefined as the sentinel for not present, so a semantics for default values in ES6 that behaves differently will not be usable as a shorthand for any of these existing APIs without changing semantics.
- The fact that JavaScript (at least for user objects) currently doesn't differentiate between missing arguments and undefined arguments is a nice simplifying rule in the language that is easy to understand.
- The example above, of wanting to have an API that allows explicitly passing in undefined to override a default value, seems outside of the common case (out of curiosity - are there any realistic examples of this?). If truly desired, it is easy to not use default values. But the common case seems more likely to be to emulate what is done today - and avoid having any undefined values flow into the API call.
Luke
On Apr 11, 2012, at 7:01 PM, Luke Hoban wrote:
Just to start, it means a method like:
Array.prototype.fill = function(value=null) { this.forEach((v,i)=>this[i]=value)}
wouldn't do the obvious thing when called as:
someArray.fill(undefined)
I think it is debatable what is obvious in this case. The wiki says that this should fill the array with 'undefined' (I believe the spec draft aims to say the same, but I can't tell where this is established).
The same runtime semantics routines are used for both default parameter initialization and for initializing array destructurings in let/const declarations (and hence the same semantics in both places). For formal parameters it starts in the Runtime Semantics of 13.1 and (in theory) for individual formal parameters of the form (id=expr) will eventually reach Keyed Binding Initialisation for the production SingleNameBinding : BindingIdentifier Initialiser in 12.2.4. It's "in theory" because it looks like the current draft is missing a couple of intermediate steps in 13.1.
But I think many JavaScript developers would expect this to fill the array with 'null'.
Some arguments in favor of treating undefined the same as not-present:
- Existing uses of default value patterns in JavaScript use undefined as the sentinel for not present, so a semantics for default values in ES6 that behaves differently will not be usable as a shorthand for any of these existing APIs without changing semantics.
Undefined is not always such a sentinel. Some examples of built-ins that treat undefined differently from a missing parameter:
function(a){return a}.bind(null) vs function(a){return a}.bind(null, undefined)
new Array() vs new Array(undefined)
[].reduce(function(){}) vs [].reduce(function(){},undefined)
new String() vs new String(undefined)
Number() vs Number(undefined)
new Date(2012,3) vs new Date(2012,3,undefined)
etc.
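For concreteness, here is roughly how a few of those pairs behave in ES5 (my annotations, easily checked in a console):

new Array().length                  // 0
new Array(undefined).length         // 1 (a one-element array holding undefined)
Number()                            // 0
Number(undefined)                   // NaN
[].reduce(function(){})             // throws TypeError: no initial value
[].reduce(function(){}, undefined)  // returns undefined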
I suggest that the appropriate way to think about the current behavior, in the context of ES6, is that function f(a) {...} is equivalent to function f(a=undefined) {...}
In other words, there is a default initialization expression, if one is not specified. So, f() and f(undefined) appear to produce the same result. But I see why somebody calling a function defined as function(a={ }){...} explicitly as f(undefined) would expect to trigger the default value initializer.
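A quick illustration of that equivalence in today's semantics (a hypothetical f):

function f(a) { return a; }
f();            // undefined
f(undefined);   // undefined, indistinguishable from f() inside the body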
- The fact that JavaScript (at least for user objects) currently doesn't differentiate between missing arguments and undefined arguments is a nice simplifying rule in the language that is easy to understand.
It does differentiate, at least in regard to the arguments object. Consider:
function f(a,b,c) {
  console.log(arguments.length);
  a = "modified";
  console.log(arguments[0]);
}
f();
f(undefined);
Which for ES5 produces:
0
undefined
1
modified
- The example above, of wanting to have an API that allows explicitly passing in undefined to override a default value, seems outside of the common case (out of curiosity - are there any realistic examples of this?). If truly desired, it is easy to not use default values. But the common case seems more likely to be to emulate what is done today - and avoid having any undefined values flow into the API call.
Why is the example outside of the common case? It is a straightforward function to fill every element of an array with a common value. Undefined is certainly something that can be stored in arrays, so why wouldn't there be situations where you would want to pass undefined? Particularly if the fill function was written by an unreformed Java programmer who used a peculiar default fill value.
I agree that there is some confusion among some JS programmers about the current missing argument default value rules. However, I don't think what you are suggesting is going to reduce that confusion. I think it will increase it.
Allen
Luke
function(a){return a}.bind(null)
This is also a shortcut to making a dense array of a given size.
Array.apply(null, Array(5)) // [undefined, undefined, undefined, undefined, undefined]
Array(5).map(...) // won't do anything with a sparse array
Array.apply(null, Array(5)).map(Function.prototype.call.bind(Number)) // [0,1,2,3,4]
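Why the apply trick works (my annotation): Array(5) is sparse, with length 5 but no own elements, yet apply reads indices 0 through 4 and gets undefined for each hole, so Array is called with five explicit undefined arguments and returns a dense array:

Array(5).hasOwnProperty(0)                     // false, index 0 is a hole
Array.apply(null, Array(5)).hasOwnProperty(0)  // true, an own undefined element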
The wiki says that this should fill the array with 'undefined' (I believe the spec draft aims to say the same, but I can't tell where this is established).
The same runtime semantics routines are used for both default parameter initialization and for initializing array destructurings in let/const declarations (and hence the same semantics in both places). For formal parameters it starts in the Runtime Semantics of 13.1 and (in theory) for individual formal parameters of the form (id=expr) will eventually reach Keyed Binding Initialisation for the production SingleNameBinding : BindingIdentifier Initialiser in 12.2.4. It's "in theory" because it looks like the current draft is missing a couple of intermediate steps in 13.1.
Got it. That's where I was looking, but didn't see the place where (id=expr) was explicitly handled and passed into Keyed Binding Initialisation. Sounds like that is the step that is missing in current drafts.
- Existing uses of default value patterns in JavaScript use undefined as the sentinel for not present, so a semantics for default values in ES6 that behaves differently will not be usable as a shorthand for any of these existing APIs without changing semantics.
Undefined is not always such a sentinel. Some examples of built-ins that treat undefined differently from a missing parameter: ...
The examples cited are arguably cases where the built-in behaviour is unintuitive to many JavaScript developers, because it doesn't match their expectation with user code functions and most other built-ins. I don't view any of the cases listed as validation that we would want that behaviour by default, just that there are a non-zero number of cases where it exists today.
I suggest that the appropriate way to think about the current behavior, in the context of ES6, is that function f(a) {...} is equivalent to function f(a=undefined) {...} In other words, there is a default initialization expression, if one is not specified. So, f() and f(undefined) appear to produce the same result.
This is a good way of explaining the proposed semantics, but...
But I see why somebody calling a function defined as function(a={ }){...} explicitly as f(undefined) would expect to trigger the default value initializer.
Right. This is exactly the sort of thing I'm worried about, and seems like the practical common case for default values.
- The fact that JavaScript (at least for user objects) currently doesn't differentiate between missing arguments and undefined arguments is a nice simplifying rule in the language that is easy to understand.
It does differentiate, at least in regard to the arguments object.
True. But this is uncommon enough as to not be something most developers deal with. Default values aim to be a much more commonly used tool.
- The example above, of wanting to have an API that allows explicitly passing in undefined to override a default value, seems outside of the common case (out of curiosity - are there any realistic examples of this?). If truly desired, it is easy to not use default values. But the common case seems more likely to be to emulate what is done today - and avoid having any undefined values flow into the API call.
Why is the example outside of the common case? It is a straightforward function to fill every element of an array with a common value. Undefined is certainly something that can be stored in arrays, so why wouldn't there be situations where you would want to pass undefined? Particularly if the fill function was written by an unreformed Java programmer who used a peculiar default fill value.
The last point was why I considered it outside of the common case. It seems unusual to intentionally want to fill with null by default, but still allow overriding with an undefined fill. Not impossible, but I would expect this to be rare enough that I don't mind making it the one case where default values can't be used.
I agree that there is some confusion among some JS programmers about the current missing argument default value rules. However, I don't think what you are suggesting is going to reduce that confusion. I think it will increase it.
At the end of the day - I see value in enabling the patterns developers are using today to be refactorable into default values. I worry that the current proposed semantics are too far away from what is used today to make that practical.
Of course, there is enough inconsistency in what is used currently anyway - so this may be a difficult goal to achieve fully. But I suspect that treating undefined the same as not present at least keeps things close enough the common forms below could reasonably consider migrating to default values.
// All fairly common..
if(!param) { param = 3; }
if(param == null) { param = 3; }
if(typeof param == 'undefined') { param = 3; }
param = param || 3;
var param = arguments[1] || 3;
// Not sure I've ever seen this... which seems to be the proposed default value semantics
if(arguments.length < f.length) { param = 3; }
Luke
The examples cited are arguably cases where the built-in behaviour is unintuitive to many JavaScript developers, because it doesn't match their expectation with user code functions and most other built-ins. I don't view any of the cases listed as validation that we would want that behaviour by default, just that there are a non-zero number of cases where it exists today.
I suggest that the appropriate way to think about the current behavior, in the context of ES6, is that function f(a) {...} is equivalent to function f(a=undefined) {...} In other words, there is a default initialization expression, if one is not specified. So, f() and f(undefined) appear to produce the same result.
This is a good way of explaining the proposed semantics, but...
But I see why somebody calling a function defined as function(a={ }){...} explicitly as f(undefined) would expect to trigger the default value initializer.
Right. This is exactly the sort of thing I'm worried about, and seems like the practical common case for default values.
- The fact that JavaScript (at least for user objects) currently doesn't differentiate between missing arguments and undefined arguments is a nice simplifying rule in the language that is easy to understand.
It does differentiate, at least in regard to the arguments object.
True. But this is uncommon enough as to not be something most developers deal with. Default values aim to be a much more commonly used tool.
- The example above, of wanting to have an API that allows explicitly passing in undefined to override a default value, seems outside of the common case (out of curiosity - are there any realistic examples of this?). If truly desired, it is easy to not use default values. But the common case seems more likely to be to emulate what is done today - and avoid having any undefined values flow into the API call.
Why is the example outside of the common case? It is a straightforward function to fill every element of an array with a common value. Undefined is certainly something that can be stored in arrays, so why wouldn't there be situations where you would want to pass undefined? Particularly if the fill function was written by an unreformed Java programmer who used a peculiar default fill value.
The last point was why I considered it outside of the common case. It seems unusual to intentionally want to fill with null by default, but still allow overriding with an undefined fill. Not impossible, but I would expect this to be rare enough that I don't mind making it the one case where default values can't be used.
I agree that there is some confusion among some JS programmers about the current missing argument default value rules. However, I don't think what you are suggesting is going to reduce that confusion. I think it will increase it.
At the end of the day - I see value in enabling the patterns developers are using today to be refactorable into default values. I worry that the current proposed semantics are too far away from what is used today to make that practical.
Of course, there is enough inconsistency in what is used currently anyway
- so this may be a difficult goal to achieve fully. But I suspect that treating undefined the same as not present at least keeps things close enough the common forms below could reasonably consider migrating to default values.
// All fairly common..
if(!param) { param = 3; }
if(param == null) { param = 3; }
if(typeof param == 'undefined') { param = 3; }
param = param || 3;
var param = arguments[1] || 3;
// Not sure I've ever seen this... which seems to be the proposed default value semantics
if(arguments.length < f.length) { param = 3; }
At first the answer to this didn't really matter to me, because how often does someone pass undefined to a function like foo(undefined). I know I don't, though I'm sure it happens occasionally. Then I thought about it and realized that it happens in my code all the time, just not like that. A much more common case is a pass through of an argument to another function.
function fadeIn(duration=200){...}
function fadeOut(duration=200){...}
function fadeToggle(duration){
if(visible){
fadeOut(duration);
}else{
fadeIn(duration);
}
}
Here, the argument duration is always passed through fadeToggle to fadeIn or fadeOut. Someone writing fadeIn would always expect to have a default of 200. fadeToggle does not care about the duration so much as it wants to pass it on and use the defaults of the functions it calls. If passing undefined does not trigger the default, it would have to be rewritten like:
function fadeToggle(duration){
var hasDuration = typeof duration != "undefined";
if(visible){
if(hasDuration){
fadeOut(duration);
}else{
fadeOut();
}
}else{
if(hasDuration){
fadeIn(duration);
}else{
fadeIn();
}
}
}
Given this, I would probably just stick to putting
duration = duration || 200;
at the top of the function.
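(Worth noting what || costs there, though: any falsy duration is also replaced. For example:

fadeToggle(0);   // 0 || 200, so duration becomes 200, not 0

That may or may not be acceptable for a given API.)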
What about functions that take two or more independent optional arguments, where sometimes you want to omit the first argument while providing the second?
For example, in the ECMAScript Internationalization API:
Intl.Collator = function(localeList, options) { ... }
currently fills in default values for both localeList and options if they're undefined or not provided. The localeList parameter comes first because most applications will want to specify it, while the options are more optional. Some rare applications however may be OK with the browser's default locale and still need to specify options. In that case, the current spec of the Internationalization API lets them use
new Intl.Collator(undefined, {usage: "search"});
It seems the current proposal for optional arguments would not let us use
Intl.Collator = function(localeList = defaultLocaleList, options = {}) { ... }
because undefined as the argument value would prevent the default value from being used, and there's no good way for the caller to say "please use the default value for this non-trailing argument".
Norbert
This is covered on the wiki too.
Fwiw, arguments.length is currently the only way of properly detecting the number of explicitly passed arguments of any type. I would hate for that behavior to change in the case of explicitly passing undefined.
Default values of course do need to be set in the arguments object so its length will depend on that. Can we maybe set an extra property on the arguments object that tells us how many arguments were explicitly passed, counting any type? I don't see how we could otherwise figure that out, especially not after default values clobber this count.
If you really need to know how many arguments were passed just use rest parameters.
We should definitely not add more API to the arguments object.
On Fri, Apr 13, 2012 at 5:19 PM, Erik Arvidsson <erik.arvidsson at gmail.com> wrote:
If you really need to know how many arguments were passed just use rest parameters.
We should definitely not add more API to the arguments object.
I retract my suggestion. Russell was right.
On Apr 12, 2012, at 7:31 PM, Luke Hoban wrote:
...
This is a good way of explaining the proposed semantics, but...
But I see why somebody calling a function defined as function(a={ }){...} explicitly as f(undefined) would expect to trigger the default value initializer.
Right. This is exactly the sort of thing I'm worried about, and seems like the practical common case for default values.
Oops, I meant "I don't see why...". Some of my negations don't seem to be getting from my head to my fingers as I type...
Restating, when I type f(undefined) I'm thinking something quite different from f()
On Fri, Apr 13, 2012 at 12:26 PM, Allen Wirfs-Brock <allen at wirfs-brock.com>wrote:
On Apr 12, 2012, at 7:31 PM, Luke Hoban wrote:
...
This is a good way of explaining the proposed semantics, but...
But I see why somebody calling a function defined as function(a={ }){...} explicitly as f(undefined) would expect to trigger the default value initializer.
Right. This is exactly the sort of thing I'm worried about, and seems like the practical common case for default values.
Oops, I meant "I don't see why...". Some of my negations don't seem to be getting from my head to my fingers as I type...
Restating, when I type f(undefined) I'm thinking something quite different from f()
Yes, but as I said (and as Erik pointed out, this is in the wiki), it is a lot more likely that someone would pass f(foo) or f(obj.foo) where foo might be undefined. Expecting undefined as a possible valid argument (as opposed to a missing argument) seems like a very rare case, and probably a code smell. I think it would be very unintuitive to the majority of JavaScript developers, and it greatly undermines the usefulness of default parameters for the sake of a minority use case. In those cases where undefined is an acceptable parameter, just don't use default parameters.
On Apr 12, 2012, at 8:38 PM, Russell Leggett wrote:
At first the answer to this didn't really matter to me, because how often does someone pass undefined to a function like foo(undefined). I know I don't, though I'm sure it happens occasionally. Then I thought about it and realized that it happens in my code all the time, just not like that. A much more common case is a pass through of an argument to another function.
function fadeIn(duration=200){...}
function fadeOut(duration=200){...}
function fadeToggle(duration){
  if(visible){
    fadeOut(duration);
  }else{
    fadeIn(duration);
  }
}
Here, the argument duration is always passed through fadeToggle to fadeIn or fadeOut. Someone writing fadeIn would always expect to have a default of 200. fadeToggle does not care about the duration so much as it wants to pass it on and use the defaults of the functions it calls. If passing undefined does not trigger the default, it would have to be rewritten like:
function fadeToggle(duration){
  var hasDuration = typeof duration != "undefined";
  if(visible){
    if(hasDuration){
      fadeOut(duration);
    }else{
      fadeOut();
    }
  }else{
    if(hasDuration){
      fadeIn(duration);
    }else{
      fadeIn();
    }
  }
}
I'd write it:
function fadeToggle(...args){
if(visible){
fadeOut(...args);
}else{
fadeIn(...args);
}
}
If you don't care about the actual argument values and are just passing them on, that's how you should do it.
I'd write it:
function fadeToggle(...args){
  if(visible){
    fadeOut(...args);
  }else{
    fadeIn(...args);
  }
}
If you don't care about the actual argument values and are just passing them on, that's how you should do it.
Ok, sure, but in an equally likely case where only one of the arguments is a pass-through, should you still use ...args for one thing just to distinguish? What about other cases, like objects with optional properties: fadeIn(config.duration)?
I just don't understand why someone who wanted to be able to distinguish between missing and undefined would really need the extra convenience of default parameters. What are they going to default to - undefined? Meanwhile, the rest of the people that probably could use it all the time to replace their foo = foo || "default" code have to keep it around. I understand what you're saying in principle. There is a certain amount of purity and correctness to it, I just don't think it's practical.
On Apr 12, 2012, at 11:25 PM, Norbert Lindenberg wrote:
What about functions that take two or more independent optional arguments, where sometimes you want to omit the first argument while providing the second?
arguably, this is where you should be using an options argument rather than multiple positional parameters with default values. Certainly for the "or more" case.
For example, in the ECMAScript Internationalization API:
Intl.Collator = function(localeList, options) { ... }
currently fills in default values for both localeList and options if they're undefined or not provided. The localeList parameter comes first because most applications will want to specify it, while the options are more optional. Some rare applications however may be OK with the browser's default locale and still need to specify options. In that case, the current spec of the Internationalization API lets them use
new Intl.Collator(undefined, {usage: "search"});
I suggest that this is better written as:
new Intl.Collator(Intl.LocalList(), {usage:"search"});
It would probably be even better if Intl provided more explicit access to the default locale:
new Intl.Collator(Intl.defaultLocale, {usage:"search"});
If it is indeed a rare usage then it is better to be explicit in the code rather than depending upon the code reader to ponder the intended meaning of a seldom used explicit undefined.
It seems the current proposal for optional arguments would not let us use
Intl.Collator = function(localeList = defaultLocaleList, options = {}) { ... }
because undefined as the argument value would prevent the default value from being used, and there's no good way for the caller to say "please use the default value for this non-trailing argument".
You would still declare the formals as above. But in addition, the body should contain a:
localeList = localeList || defaultLocaleList;
or perhaps:
localeList = localeList === undefined ? defaultLocaleList : localeList;
If you don't want other falsy arguments such as null to also trigger use of the default.
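Putting those pieces together, a sketch of the combined formals-plus-body-guard pattern (defaultLocaleList is assumed to be defined elsewhere):

Intl.Collator = function(localeList = defaultLocaleList, options = {}) {
  // catches an explicitly passed undefined as well as a missing argument
  localeList = localeList === undefined ? defaultLocaleList : localeList;
  ...
};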
Allen Wirfs-Brock wrote:
On Apr 12, 2012, at 11:25 PM, Norbert Lindenberg wrote:
What about functions that take two or more independent optional arguments, where sometimes you want to omit the first argument while providing the second?
arguably, this is where you should be using an options argument rather than multiple positional parameters with default values. Certainly for the "or more" case.
Ok, so use an options argument and then add delegation:
function add({x = 0, y = 0}) { return x + y; }
function inc({opt_y}) { return add({x:1, y:opt_y}); }
assertEq(1, inc({}));
The author surely wants y to default to 0 and for the assertEq to succeed.
In general, delegation (depth D) plus optionality (degree N parameters) makes a (2N)^D worst-case combinatorial explosion.
This is IMHO a strong argument for a sentinel in-language to mean "missing actual".
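To make the explosion concrete, here is a sketch (hypothetical target/wrapper names) of delegating just two optional positional parameters when only a syntactically absent argument triggers defaulting:

function target(x = 0, y = 0) { return x + y; }
function wrapper(x, y) {   // x and y may each be "missing"
  if (x !== undefined && y !== undefined) return target(x, y);
  if (x !== undefined) return target(x);
  if (y !== undefined) return target(undefined, y);  // stuck: no way to omit x yet pass y
  return target();
}

With undefined triggering defaults, the whole wrapper collapses to a single call: return target(x, y);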
It’s an interesting case of “traditional lenient style” versus “new, more strict style”. Both approaches have merit (pro leniency: familiarity, compatibility). With arity checking, a first step towards the latter style has been made, I think it makes sense to continue in that direction.
On Apr 13, 2012, at 9:50 AM, Russell Leggett wrote:
I'd write it:
function fadeToggle(...args){
  if(visible){
    fadeOut(...args);
  }else{
    fadeIn(...args);
  }
}
If you don't care about the actual argument values and are just passing them on, that's how you should do it.
Ok, sure, but in an equally likely case where only one of the arguments is a pass-through, should you still use ...args for one thing just to distinguish? What about other cases, like objects with optional properties: fadeIn(config.duration)?
One of the nice things about ES6 is that we have destructuring and default/rest arguments. The defaulting semantics of destructuring binding is currently exactly the same (they use the same semantic rules) as formal parameter binding. That means you can almost[*] always replace:
function f(<formals>) { ... }
with
function f(...args) { let [<formals>] = args; ... }
If you want to treat some argument one way in certain circumstances and another way in other circumstances all you have to do is do one or more manual destructurings such as above.
[*] It's "almost" because slightly different shadowing rules are used when processing default value expressions in the formal parameters position. Such differences can be avoided if you are careful in your name selection.
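A concrete sketch of that manual destructuring (hypothetical f), including how the body can still distinguish a missing argument from an explicit undefined, whichever defaulting rule wins:

function f(...args) {
  let [a = 1, b = 2] = args;
  let aWasPassed = "0" in args;   // false for f(), true for f(undefined)
  ...
}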
I just don't understand why someone who wanted to be able to distinguish between missing and undefined would really need the extra convenience of default parameters. What are they going to default to - undefined? Meanwhile, the rest of the people that probably could use it all the time to replace their foo = foo || "default" code have to keep it around. I understand what you're saying in principle. There is a certain amount of purity and correctness to it, I just don't think it's practical.
function f(foo="default") {}
is not semantically the same as
function f(foo) { foo = foo || "default"; }
consider
f(null), f(false), f(0), f(NaN), f("")
On Apr 13, 2012, at 10:10 AM, Brendan Eich wrote:
Allen Wirfs-Brock wrote:
On Apr 12, 2012, at 11:25 PM, Norbert Lindenberg wrote:
What about functions that take two or more independent optional arguments, where sometimes you want to omit the first argument while providing the second?
arguably, this is where you should be using an options argument rather than multiple positional parameters with default values. Certainly for the "or more" case.
Ok, so use an options argument and then add delegation:
function add({x = 0, y = 0}) { return x + y; }
function inc({opt_y}) { return add({x:1, y:opt_y}); }
assertEq(1, inc({}));
The author surely wants y to default to 0 and for the assertEq to succeed.
In general, delegation (depth D) plus optionality (degree N parameters) makes a (2N)^D worst-case combinatorial explosion.
This is IMHO a strong argument for a sentinel in-language to mean "missing actual".
That sentinel could simply be an empty argument position:
new Intl.Collator( , {usage: "search"});
An implementation level sentinel value would probably still be needed but it would never be directly exposed to user level code.
I was aghast the first time somebody suggested this a while ago. It reminded me too much of OS/360 JCL:
//U99999A JOB (*),,CLASS=A
However, if there is a real need for non-trailing defaulting positional parameters then that may indeed be the best way to do it. If so, I would define it so that something like:
function f(...args) {}
f(,,3);
produced a sparse args array object. This would actually align the semantics of formal parameter processing and array destructuring.
Spreading an array into an argument list would also have to translate array holes into the internal missing-value sentinel.
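A sketch of what that would look like from the callee's side (the f(,,3) call syntax is of course hypothetical, so everything here is commented out):

// function f(...args) { return [args.length, "0" in args, "2" in args]; }
// f(,,3) would yield [3, false, true], positions 0 and 1 being holes in args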
On Apr 13, 2012, at 10:53 AM, Allen Wirfs-Brock wrote:
On Apr 13, 2012, at 10:10 AM, Brendan Eich wrote:
In general, delegation (depth D) plus optionality (degree N parameters) makes a (2N)^D worst-case combinatorial explosion.
This is IMHO a strong argument for a sentinel in-language to mean "missing actual".
That sentinel could simply be an empty argument position:
new Intl.Collator( , {usage: "search"});
That's not enough. It doesn't allow you to make a dynamic decision as to whether or not to pass that argument, which still leaves you in general with the combinatorial explosion.
I'm very much in favor of defaults treating undefined exactly the same as no argument.
On Apr 13, 2012, at 9:38 AM, Russell Leggett wrote:
Yes, but as I said (and as Erik pointed out, this is in the wiki), it is a lot more likely that someone would pass f(foo) or f(obj.foo) where foo might be undefined.
Bingo.
Expecting undefined as a possible valid argument (as opposed to a missing argument) seems like a very rare case, and probably a code smell.
Amen.
I think it would be very unintuitive to the majority of JavaScript developers, and greatly undermines the usefulness of default parameters for the sake of a minority use case.
Preach it.
In those cases where undefined is an acceptable parameter, just don't use default parameters.
+9001, as rwaldron likes to say (big fan of ISO?)
What happens if i have:
function foo(a=1, b=2) { log(a, b, arguments.length); }
foo();
foo(undefined);
foo(3);
foo(undefined, undefined);
foo(undefined, 3);
foo(3, undefined);
Default values are for when arguments are not passed, it does not make logical sense to say that they're the value given just because someone has passed undefined. It also makes the behavioural definition more complex.
Making default values overridden by undefined makes them useless in a large portion of cases, but not doing so makes arguments.length a liar and invalidates other cases.
On Apr 13, 2012, at 11:22 AM, David Herman wrote:
On Apr 13, 2012, at 10:53 AM, Allen Wirfs-Brock wrote:
On Apr 13, 2012, at 10:10 AM, Brendan Eich wrote:
In general, delegation (depth D) plus optionality (degree N parameters) makes a (2N)^D worst-case combinatorial explosion.
This is IMHO a strong argument for a sentinel in-language to mean "missing actual".
That sentinel could simply be an empty argument position:
new Intl.Collator( , {usage: "search"});
That's not enough. It doesn't allow you to make a dynamic decision as to whether or not to pass that argument, which still leaves you in general with the combinatorial explosion.
sure you can
Intl.Collator = function(...args) {
  let [locale=defaultLocale, options={}] = args;
  let localeDefaulted = !args.hasOwnProperty("0");
  ...
}
On Apr 13, 2012, at 11:35 AM, Oliver Hunt wrote:
What happens if i have:
function foo(a=1, b=2) { log(a, b, arguments.length); }
foo();
1, 2, 2
foo(undefined);
1, 2, 1
foo(3);
3, 2, 1
foo(undefined, undefined);
1, 2, 2
foo(undefined, 3);
1, 3, 2
foo(3, undefined);
3, 2, 2
Default values are for when arguments are not passed,
Stated without evidence!
it does not make logical sense to say that they're the value given just because someone has passed undefined.
It makes perfect sense. But hey, I had the same reaction as you at first, so I sympathize. Default values conceptually represent "no value provided," and the natural way to think of this from a language designer or implementer's perspective is to think about whether the syntax of the call included an argument at the particular position. But this is not the programmer's perspective. From the programmer's perspective, it's "do I have a value to provide for this argument?" And in reality, the answer to that question is often a dynamic one, not a static one. You often have to make the decision at runtime whether to provide a value for a given argument. When you do, the semantics based on arguments.length has terrible consequences: you either have to write a combinatorial explosion of separate calls (code bloat -- unacceptable), or you just end up reimplementing the default logic by conditionally producing the default value (abstraction violation -- unacceptable).
So after thinking more about it, I came to the conclusion that:
- default values are for when no value of the expected input type was provided
- undefined is almost universally used in JS as a sentinel value for "I have no value of the expected input type," because it's not statically typed so it's not possible to express option types
- whenever you need to make conditional decisions about whether you have a value for various arguments, the arguments.length semantics has unacceptable engineering consequences
- in the rare cases where you want to treat undefined as an acceptable value of the expected input type, just don't use default values
It also makes the behavioural definition more complex.
The behavioral definition is trivial. If the value of an argument with a given default is undefined, you use the default.
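That rule admits a one-line desugaring (my sketch, using 1 as a stand-in for the written default expression):

function f(a) {
  if (a === undefined) a = 1;   // i.e. what f(a = 1) means
  return a;
}
f();            // 1
f(undefined);   // 1
f(null);        // null, since null is a provided value and triggers no defaulting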
On Apr 13, 2012, at 11:26 AM, David Herman wrote:
On Apr 13, 2012, at 9:38 AM, Russell Leggett wrote:
Yes, but as I said (and as Erik pointed out, this is in the wiki), it is a lot more likely that someone would pass f(foo) or f(obj.foo) where foo might be undefined.
Bingo.
How does that equate to expecting the parameter to default? If foo or obj.foo being uninitialized is unintentional, it is better to propagate the undefined rather than treating it as f(). The latter will tend to hide latent logic errors.
Expecting undefined as a possible valid argument (as opposed to a missing argument) seems like a very rare case, and probably a code smell.
Amen.
Both sides of this debate seem a little smelly. But pretending that undefined isn't a real value seems more smelly to me.
On Apr 13, 2012, at 11:46 AM, Allen Wirfs-Brock wrote:
On Apr 13, 2012, at 11:22 AM, David Herman wrote:
On Apr 13, 2012, at 10:53 AM, Allen Wirfs-Brock wrote:
That sentinel could simply be a empty argument position:
new Intl.Collator( , {usage: "search"});
That's not enough. It doesn't allow you to make a dynamic decision as to whether or not to pass that argument, which still leaves you in general with the combinatorial explosion.
sure you can
Intl.Collator = function(...args) {
  let [locale=defaultLocale, options={}] = args;
  let localeDefaulted = !args.hasOwnProperty("0");
  ...
}
I'm talking about the caller code. If the caller needs to make a dynamic decision about whether to provide argument 0, it has to write:
let locale = foo.getLocaleIfTheresOneAvailable();
locale === undefined ? new Intl.Collator(, {usage: "search"})
: new Intl.Collator(locale, {usage: "search"})
That's bad enough with only one. Now when you scale that up to multiple optional arguments you get combinatorial explosion. The semantics that always defaults undefined just allows you to write
new Intl.Collator(foo.getLocaleIfTheresOneAvailable(), {usage: "search"})
On Apr 13, 2012, at 11:51 AM, Allen Wirfs-Brock wrote:
Both sides of this debate seem a little smelly. But pretending that undefined isn't a real value seems more smelly to me.
There's no need to think of it as pretending it's not a real value. Think of it as saying that the undefined value is the idiomatic way in JS to represent "no value for the expected type." The smelly thing then is to create API's that both use defaults and accept undefined as a valid input.
Put differently: defaulting based on whether the argument happened to be provided is reporting on what the syntax of the call looked like. But an API doesn't and shouldn't care about the syntax of the call. It just wants a protocol whereby the caller can indicate whether or not they have a value for a given argument. The undefined value is an idiomatic way to do that in JS, and it composes well with other expression forms that already produce undefined when they don't have a value. That puts some burden on programmers not to treat undefined as a normal value, but that's already a cost programmers live with in JS. And that cost is outweighed by the benefit of a more expressive call syntax.
On Apr 13, 2012, at 11:48 AM, David Herman wrote:
On Apr 13, 2012, at 11:35 AM, Oliver Hunt wrote:
What happens if i have:
function foo(a=1, b=2) { log(a, b, arguments.length); }
foo();
1, 2, 2
Oops: 1, 2, 0
On Apr 13, 2012, at 11:48 AM, David Herman <dherman at mozilla.com> wrote:
On Apr 13, 2012, at 11:35 AM, Oliver Hunt wrote:
What happens if i have:
function foo(a=1, b=2) { log(a, b, arguments.length); }
foo();
1, 2, 2
foo(undefined);
1, 2, 1
Uh what? I pass no arguments and arguments.length is 2, and i pass one argument and arguments.length is 1?
Sorry, that behaviour makes no sense at all.
foo(3);
3, 2, 1
foo(undefined, undefined);
1, 2, 2
So i've passed 2 arguments, but they both get replaced?
foo(undefined, 3);
1, 3, 2
So if you have default values we have to emit code to check for undefined in each case?
is foo(a=5, b) valid? The above examples imply that it should be.
foo(3, undefined);
3, 2, 2
Default values are for when arguments are not passed,
Stated without evidence!
C++, python, ...
it does not make logical sense to say that they're the value given just because someone has passed undefined.
It makes perfect sense. But hey, I had the same reaction as you at first, so I sympathize. Default values conceptually represent "no value provided," and the natural way to think of this from a language designer or implementer's perspective is to think about whether the syntax of the call included an argument at the particular position. But this is not the programmer's perspective. From the programmer's perspective, it's "do I have a value to provide for this argument?" And in reality, the answer to that question is often a dynamic one, not a static one. You often have to make the decision at runtime whether to provide a value for a given argument. When you do, the semantics based on arguments.length has terrible consequences: you either have to write a combinatorial explosion of separate calls (code bloat -- unacceptable), or you just end up reimplementing the default logic by conditionally producing the default value (abstraction violation -- unacceptable).
So after thinking more about it, I came to the conclusion that:
- default values are for when no value of the expected input type was provided
- undefined is almost universally used in JS as a sentinel value for "I have no value of the expected input type," because it's not statically typed so it's not possible to express option types
- whenever you need to make conditional decisions about whether you have a value for various arguments, the arguments.length semantics has unacceptable engineering consequences
- in the rare cases where you want to treat undefined as an acceptable value of the expected input type, just don't use default values
This makes it impossible for me to distinguish between passing undefined as an argument and not passing an argument. If i've written:
foo(undefined)
There's a very clear expectation that i have selected undefined as the value i want to be passed. Yet you're saying this should be interpreted as foo() (only with a shorter arguments.length).
I'm sympathetic toward undefined as a sentinel for "no value of the expected type," whereas null means "we have a value of the expected type, but that value represents 'nothing.'" Not sure if anyone else sees it that way, though, and admittedly it's based on vague hand-wavey arguments.
David Herman wrote:
Think of it as saying that the undefined value is the idiomatic way in JS to represent "no value for the expected type." The smelly thing then is to create API's that both use defaults and accept undefined as a valid input.
FTR, I'm on board with Arv, Dave, Russell, and probably others on this. I agree that we must not impose the combinatorial explosion. Alternatively, we must not require copy/paste of parameter default values across delegation graphs.
That leaves using undefined as the in-language defaulting sentinel as the "least worst" option. Teaching programmers to avoid giving a maybe-undefined-on-purpose parameter a default value (other than undefined!) is doable and I would be surprised if people wrote
function foo(a = default_a) {...}
and wanted
foo(undefined)
to bind default_a to a.
(Again, if default_a is the value |undefined| then everything "just works", but writing
function foo(a = undefined) {...}
is a silly way to write
function foo(a) {...}
IMHO.)
On Fri, Apr 13, 2012 at 3:05 PM, Oliver Hunt <oliver at apple.com> wrote:
On Apr 13, 2012, at 11:48 AM, David Herman <dherman at mozilla.com> wrote:
On Apr 13, 2012, at 11:35 AM, Oliver Hunt wrote:
What happens if i have:
function foo(a=1, b=2) { log(a, b, arguments.length); }
foo();
1, 2, 2
foo(undefined);
1, 2, 1
Uh what? I pass no arguments and arguments.length is 2, and i pass one argument and arguments.length is 1?
Sorry, that behaviour makes no sense at all.
foo(3);
3, 2, 1
foo(undefined, undefined);
1, 2, 2
So i've passed 2 arguments, but they both get replaced?
foo(undefined, 3);
1, 3, 2
So if you have default values we have to emit code to check for undefined in each case?
is foo(a=5, b) valid? The above examples imply that it should be.
foo(3, undefined);
3, 2, 2
Default values are for when arguments are not passed,
Stated without evidence!
C++, python, ...
it does not make logical sense to say that they're the value given just because someone has passed undefined.
It makes perfect sense. But hey, I had the same reaction as you at first, so I sympathize. Default values conceptually represent "no value provided," and the natural way to think of this from a language designer or implementer's perspective is to think about whether the syntax of the call included an argument at the particular position. But this is not the programmer's perspective. From the programmer's perspective, it's "do I have a value to provide for this argument?" And in reality, the answer to that question is often a dynamic one, not a static one. You often have to make the decision at runtime whether to provide a value for a given argument. When you do, the semantics based on arguments.length has terrible consequences: you either have to write a combinatorial explosion of separate calls (code bloat -- unacceptable), or you just end up reimplementing the default logic by conditionally producing the default value (abstraction violation -- unacceptable).
So after thinking more about it, I came to the conclusion that:
- default values are for when no value of the expected input type was provided
- undefined is almost universally used in JS as a sentinel value for "I have no value of the expected input type," because it's not statically typed so it's not possible to express option types
- whenever you need to make conditional decisions about whether you have a value for various arguments, the arguments.length semantics has unacceptable engineering consequences
- in the rare cases where you want to treat undefined as an acceptable value of the expected input type, just don't use default values
This makes it impossible for me to distinguish between passing undefined as an argument and not passing an argument. If i've written:
foo(undefined)
There's a very clear expectation that i have selected undefined as the value i want to be passed. Yet you're saying this should be interpreted as foo() (only with a shorter arguments.length).
Can I just say that it bothers me when we start thinking about undefined as an actual thing? We have something for that. It's null. Undefined is the lack of definition. If I have an object foo and I start asking for things it doesn't have, foo.nope, foo.sorry, foo.i_dont_think_so - those are all undefined. foo is not an object with infinite slots all filled with the value undefined. Those things are just not defined. So yes, quite frankly, I think logically, foo(undefined) really should be the same as foo(), and trying to twist your function to be something else is an abuse of undefined. I recognize that there is a challenge reconciling this with arguments.length and rest parameters, but let's not make it worse than it has to be.
On Fri, Apr 13, 2012 at 12:00 PM, David Herman <dherman at mozilla.com> wrote:
On Apr 13, 2012, at 11:51 AM, Allen Wirfs-Brock wrote:
Both sides of this debate seem a little smelly. But pretending that undefined isn't a real value seems more smelly to me.
There's no need to think of it as pretending it's not a real value. Think of it as saying that the undefined value is the idiomatic way in JS to represent "no value for the expected type." The smelly thing then is to create API's that both use defaults and accept undefined as a valid input.
Put differently: defaulting based on whether the argument happened to be provided is reporting on what the syntax of the call looked like. But an API doesn't and shouldn't care about the syntax of the call. It just wants a protocol whereby the caller can indicate whether or not they have a value for a given argument. The undefined value is an idiomatic way to do that in JS, and it composes well with other expression forms that already produce undefined when they don't have a value. That puts some burden on programmers not to treat undefined as a normal value, but that's already a cost programmers live with in JS. And that cost is outweighed by the benefit of a more expressive call syntax.
I'm with Dave.
Even in Lisp, where you can ask for an explicit indicator of whether a particular optional or keyword argument was passed or not, delegation is very annoying because of this problem. You end up having to add conditional code to call the same function with slightly different signatures, just like several people have provided in this thread already.
You really do need an explicit "I'm not really passing this value" value, even if it seems smelly or overly meta. Using undefined for this is the right thing to do, given existing JS semantics, and the fact that JS also has null which is approximately the same thing for most semantic purposes.
The story for JS is
undefined is "no value"; null is "no object"
It's definitely a just-so story, and some listeners want their money back, but we're stuck with two. For defaulting, undefined is the only choice of an in-language sentinel you can express. No one wants to add yet another bottom type and singleton value.
Argh, I've caught Allen's dropped-negative disease:
Brendan Eich wrote:
I would be surprised if people wrote
function foo(a = default_a) {...}
and wanted
foo(undefined)
to bind default_a to a.
s/wanted/not want/
or else s/bind default_a to a/bind undefined to a/.
Will proofread thrice before sending henceforth.
On Apr 13, 2012, at 12:05 PM, Oliver Hunt wrote:
On Apr 13, 2012, at 11:48 AM, David Herman <dherman at mozilla.com> wrote:
On Apr 13, 2012, at 11:35 AM, Oliver Hunt wrote:
foo(undefined);
1, 2, 1
Uh what? I pass no arguments and arguments.length is 2, and i pass one argument and arguments.length is 1?
See my follow-up, answer to the first one was wrong; arguments.length is of course 0 in the first case.
foo(undefined, undefined);
1, 2, 2
So i've passed 2 arguments, but they both get replaced?
Yes.
foo(undefined, 3);
1, 3, 2
So if you have default values we have to emit code to check for undefined in each case?
is foo(a=5, b) valid? The above examples imply that it should be.
I'm somewhat ambivalent about it. Sure, it's true that "undefined always gets replaced by default" means it's possible to get a defaulted argument followed by a non-defaulted argument. But we also don't have to encourage writing positional arguments like that; they can use options objects for that.
foo(3, undefined);
3, 2, 2
Default values are for when arguments are not passed,
Stated without evidence!
C++, python, ...
That's precedent, but not an actual argument about why it should be done one way or another. (Also, the decision is utterly different in a statically typed language, so C++ isn't relevant.)
This makes it impossible for me to distinguish between passing undefined as an argument and not passing an argument.
That's right. In that case, you should not use the x=expr default syntax. But this is an uncommon case. The semantics you advocate is optimizing for the wrong cases.
If i've written:
foo(undefined)
There's a very clear expectation that i have selected undefined as the value i want to be passed.
You can just as easily say that if you've written:
foo(getOptionalValue())
there's a very clear expectation that if getOptionalValue() returns undefined, you don't want the value to be passed.
Yet you're saying this should be interpreted as foo() (only with a shorter arguments.length).
Longer arguments.length, but yes.
On Apr 13, 2012, at 12:16 PM, Brendan Eich wrote:
The story for JS is
undefined is "no value"; null is "no object"
Yet there are many places in the language where undefined is not the same as no value. For example:
let mapped1 = [,,"foo",,].map(v=>v+1);
let mapped2 = [undefined,undefined,"foo",undefined,undefined].map(v=>v+1);
produce quite different values for mapped1 and mapped2.
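For the record, under ES5 semantics those produce (my annotation):

// mapped1: [ , , "foo1", , ]              map skips holes, so the holes survive
// mapped2: [NaN, NaN, "foo1", NaN, NaN]   undefined + 1 is NaN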
obj.foo = undefined;
does not mean delete obj.foo;
new Array(undefined, undefined, undefined, undefined)
does not mean new Array()
Additionally, function (x = "default") {return x} does not mean the same thing as function (x) { x = x || "default"; return x} under either of the proposed default parameter value semantics.
Making f() and f(undefined) mean the same thing (but only sometimes; you have to look at the actual implementation of f to know for sure) seems to add just another internal inconsistency that makes it harder to form a general conceptual model of the language.
On Apr 13, 2012, at 12:16 PM, Brendan Eich wrote:
...
No one wants to add yet another bottom type and singleton value.
Permitting "holes" in argument lists, e.g. Intl.Collator( , {usage:"search"}), doesn't require either at the user language level or the specification level. Whether a sentinel value would be needed to represent holes in argument lists at the implementation level is entirely an implementation issue, just like it is for sparse arrays.
Allen Wirfs-Brock wrote:
On Apr 13, 2012, at 12:16 PM, Brendan Eich wrote:
No one wants to add yet another bottom type and singleton value.
Permitting "holes" in argument lists, e.g. Intl.Collator( , {usage:"search"}), doesn't require either at the user language level or the specification level. Whether a sentinel value would be needed to represent holes in argument lists at the implementation level is entirely an implementation issue, just like it is for sparse arrays.
I don't think it's that easy. Your example showing hasOwnProperty testing points to one problem. Without a very convenient is-this-value-a-hole predicate, people will push back against the need to test for holes in rest parameters or (shudder) arguments objects.
The big-picture objection is that holes in arrays do not justify holes in argument lists. Arguably holes in arrays are a botch, or at least over-exposed (e.g. by array extras and generic built-ins). Why propagate one botch into a new domain?
Allen Wirfs-Brock wrote:
Making f() and f(undefined) mean the something (but only sometimes, you have to look at the actual implementation of f to know for sure) seems to be add just another internal inconsistency that makes it harder to form a general conceptual model of the language.
This may be, but parameter default values where delegation makes a case explosion are bad too. I'd take the "internal inconsistency" hit. Tab Atkins cited Lisp (Common Lisp, I take it) experience. Sam has similar testimony from Racket (née PLT-Scheme).
On Fri, Apr 13, 2012 at 5:42 PM, Brendan Eich <brendan at mozilla.org> wrote:
Sam has similar testimony from Racket (née PLT-Scheme).
Very much so -- I've encountered places where Racket's semantics for keyword arguments (which is similar to what Allen proposes for ES6) ends up causing the entire set of keyword arguments to be propagated back up the chain and provided in full at every call site.
On Apr 13, 2012, at 3:42 PM, Sam Tobin-Hochstadt wrote:
On Fri, Apr 13, 2012 at 5:42 PM, Brendan Eich <brendan at mozilla.org> wrote:
Sam has similar testimony from Racket (née PLT-Scheme).
Very much so -- I've encountered places where Racket's semantics for keyword arguments (which is similar to what Allen proposes for ES6) ends up causing the entire set of keyword arguments to be propagated back up the chain and provided in full at every call site.
I'm not so sure that a comparison to keyword arguments is valid. The ES equivalent to keywords is an options object. I think that default argument expressions are probably only going to be useful when there are one or two of them in a trailing position. For that usage, any combinatorial case explosion is going to be at most a small pop.
There are two cases, both seen in the wild in JS:
- Delegation of non-keyword parameters:
function add(x = 0, y = 0) { return x + y; }
function inc(opt_y) { return add(1, opt_y); }
assertEq(1, inc());
- Delegation of keyword parameters:
function add({x = 0, y = 0}) { return x + y; }
function inc({opt_y}) { return add({x:1, y:opt_y}); }
assertEq(1, inc({}));
In either case, combinatorial explosion is a problem without a sentinel that is passed for missing-actual/not-found-property and that triggers defaulting.
Tab and Sam testify about 2 but it's a transformation of 1.
The not-found-property aspect makes the case for undefined as the sentinel, IMHO. Not a new masquerades-as-undefined magic value.
See below:
I'm not sure who told Cameron that this is likely to change from the current spec, but regardless I don't think it is a good idea.
Just to start, it means a method like:
Array.prototype.fill = function(value=null) { this.forEach((v,i)=>this[i]=value)}
wouldn't do the obvious thing when called as:
someArray.fill(undefined)
Allen
Begin forwarded message: