Wouldn't being bolder on new constructs save the "breaking the web" costs in the future?
You are describing path dependency, which afflicts us all, in various evolving systems including JS.
We cannot keep resetting ES6 to include and think through and make consistent things such as nil. Sorry, that trades off badly against developers' time -- they need the good stuff coming already in top engines, which Allen is drafting.
Of course we have strawman: space on the wiki, and I'm going to work on existential operators and possibly nil there, for inclusion in Harmony (whatever edition that might be).
And since we're only human, along the dependent paths we walk, missteps happen. If we can back up and take a better path in time for ES6, we will.
Another thing we try to do: make incremental progress edition by edition, based on experience ("pave the cowpaths", lessons from nearby languages, championed designs that are prototyped early, e.g. proxies), but not close important doors. This is called future-proofing.
Your desire to make class object identity !== class constructor function identity could be viewed as future-proofing, but we have to pick some identity for ES6. And again, apart from the need to separate constructor invocation via () from invocation via new, we don't have strong motivation to future-proof.
It's easy to wrap up your own designs along neater lines that themselves have lots of dependencies (nil in language is controversial, from my twitter survey -- it arguably hides errors). Mature language design, really all successful language design however radical, shies away from running ahead along one path too far, at least in the "main line". That would be ECMA-262, so ES6 ;-).
Brendan Eich wrote:
You are describing path dependency, which afflicts us all, in various evolving systems including JS.
We cannot keep resetting ES6 to include and think through and make consistent things such as nil. Sorry, that trades off badly against developers' time -- they need the good stuff coming already in top engines, which Allen is drafting.
If you would explain this better, I'd be glad - it is too dense for me to understand (especially "include and think through" and "they need the good stuff ..."). Thanks.
And since we're only human, along the dependent paths we walk, missteps happen. If we can back up and take a better path in time for ES6, we will.
I understand there is a legacy, but I am deeply convinced there is hardly better path through which fix(es) can be made than through new construct. Because old ones must work as expected.
Another thing we try to do: make incremental progress edition by edition, based on experience ("pave the cowpaths", lessons from nearby languages, championed designs that are prototyped early, e.g. proxies), but not close important doors. This is called future-proofing.
What are those "champions" and "championed" you use in meeting notes often? Thanks again. :-|
Your desire to make class object identity !== class constructor function identity could be viewed as future-proofing, but we have to pick some
Exactly (and with nil as well, though there it is a bit weaker; the whole OP was about "fix&future-proof" combo).
identity for ES6. And again, apart from the need to separate constructor
"some [non-tied] identity" I'd say, when the proposal is about making those identities separate. That's why I proposed (as a general first step) the most general thing - a plain Object. Being the most general, it is the most future-proof.
invocation via () from invocation via new, we don't have strong motivation to future-proof.
Hm.
What exactly is "future-proof"? How does it relate to "enough space for new cowpaths" and "progress"? These things are really mixing in my head, probably because they really are somewhat intertwined.
If future-proofing is really only about "just don't make this impossible, eventually" - that is, if it is about preventing unnecessary constraining (e.g. when existing things are fixed / specced better) - then a specific known use case is probably good enough as a criterion for whether to future-proof.
But I think this is not the case for future-proofing that is combined with additions (not updates). Additions involve those "new spaces for cowpaths". There, you cannot (I think) use some specific use case as a criterion, because you don't know where and how the cows will walk.
So I wonder: is it more a future-proofing, or an opening of the gates? I am saying that the class !== constructor proposal is not just future-proofing; it's "open the gates to using plain objects as classes" combined with the future-proofing "any (normal) object can, eventually, be used with new / extends - the user chooses".
I'd like to ask my question from the original post again, sorry. If there is "future-proofing" combined with a "new space for cowpaths" possibility, and it can be brought in with the new feature, is it not worth considering, given that it cannot be done later, when the new feature will be stuck with already-given, probably legacy-inherited, semantics?
However I look at it, it is more than future-proofing. It is future-proofing--model-evolution--only-to-be-efficiently-done-now.
I was asking about such combos. Or do I see it wrongly? I am really deeply convinced that once class produces the legacy class === constructor pattern, no [[Constructor]] freeing can ever happen, because there will already be code that uses class and exploits that fact; the same goes for the existential operator not returning nil - later it cannot be switched.
It's easy to wrap up your own designs along neater lines that themselves
Yes.
have lots of dependencies (nil in language is controversial, from my twitter survey -- it arguably hides errors). Mature language design,
Yes, it hides. :-/
really all successful language design however radical, shies away from running ahead along one path too far, at least in the "main line". That would be ECMA-262, so ES6 ;-).
Would changing the [[Construct]] semantics for class (at least half a step) to clearly follow .prototype.constructor be "running too far ahead along one path"? It gives free new/super consistency and free default constructors (and insight into the class/constructor-separated worldview).
I don't consider even the real class/constructor separation too far ahead; it is a small, well-placed design change (in a new construct).
Herby Vojčík wrote:
Brendan Eich wrote:
You are describing path dependency, which afflicts us all, in various evolving systems including JS.
We cannot keep resetting ES6 to include and think through and make consistent things such as nil. Sorry, that trades off badly against developers' time -- they need the good stuff coming already in top engines, which Allen is drafting.
If you would explain this better, I'd be glad - it is too dense for me to understand (especially "include and think through" and "they need the good stuff ..."). Thanks.
Adding some new feature to JS requires not just evaluating the design of the thing itself, but how it interacts with the rest of the language.
This is hard! It's not just a question of direct combinatorial complexity, but subtle indirect interactions, human factors pitfalls, things that often require implementation and user testing to find.
And since we're only human, along the dependent paths we walk, missteps happen. If we can back up and take a better path in time for ES6, we will.
I understand there is a legacy, but I am deeply convinced there is hardly better path through which fix(es) can be made than through new construct. Because old ones must work as expected.
Yes, can't break the web.
Making the new too different from the old, while keeping the old working, is certainly possible. But we do not want to make a big fork in the road. As pundit-of-malapropisms Yogi Berra said, "When you come to a fork in the road, take it".
If we fork JS too much with new runtime semantics, and developers "take the fork", then transpilers become full compilers and the cognitive load on all developers grows quite a bit.
JS does not need to fork in order to grow to what it should become. This is a vague statement, I grant you. But Harmony's means to its ends is
- Minimize the additional semantic state needed beyond ES5.
We agreed to this in July 2008 at Oslo, and I still think it holds. One reason for this agreement, above and beyond trying to avoid adding too much combinatorial and deep human-user-testing-required complexity: not trying to make JS into "a different language".
Again, that's vague. Languages evolve; change is a constant over deep time. But consider ES4's attempts to add optional (gradual/hybrid) type annotations that could be statically checked. That was one of the things the Harmony ends-and-means agreements precluded, intentionally.
So we are intentionally limiting how much JS might grow in its kernel semantics. I think that's a good thing.
Another thing we try to do: make incremental progress edition by edition, based on experience ("pave the cowpaths", lessons from nearby languages, championed designs that are prototyped early, e.g. proxies), but not close important doors. This is called future-proofing.
What are those "champions" and "championed" you use in meeting notes often? Thanks again. :-|
Proxies: tomvc and markm
Modules: samth and dherman
Max-Min Classes: awb, after dherman, after markm & others before
We avoid design-by-committee via the champions model, where one or two people design a given extension. TC39 curates championed extensions based on use-case-based demand, PLT theory and best practices, and the quality of the work itself.
Would changing the [[Construct]] semantics for class (at least half a step) to clearly follow .prototype.constructor be "running too far ahead along one path"? It gives free new/super consistency and free default constructors (and insight into the class/constructor-separated worldview).
Allen seems to have addressed the present-tense use-cases here, without making class identity other than constructor identity.
[repost; first version a few days ago disappeared somewhere]
Hello,
recently I came across two issues of a very similar pattern, involving the semantics of new language constructs. In both cases it seems that being bolder about the changes that new constructs bring (that is, not taking too-small steps) may save the cost of "can't make progress here, because it would be a breaking change for the web".
(just 7 more paragraphs, pls read on :-) ):
The first one was in the thread "Good-bye constructor functions?", where the discussed topics were the new constructs class and super. As they stand now, they are buggy (internally inconsistent), and while thinking about possible solutions I discovered that with class there is little need to couple a class to its constructor, so I proposed leaving this box and opening the space of "what can appear as Foo in new Foo" to any [[Construct]]-bearing entity (that is, so that class produces a plain object with [[Construct]], instead of the legacy coupling of its identity with the identity of the constructor).

(Let's pretend that we can overcome the technical details and the proposal may actually work; I showed in the post that it may very well be so. I want to discuss the higher pattern of defensiveness vs. later compatibility problems here.)
One reason I brought up why it would be fine to consider this now (not later) is: if class, as a new language construct, behaves the same as legacy constructor functions (tightly coupling the identity of class and constructor), people using ES6 accept that "this is how class works". If later we wanted to "liberate" the space of objects usable as a "constructor" (in new) by making class return non-coupled class objects, it would not be possible, "because it would break existing code".

OTOH, if class returned a class object decoupled from its constructor, it might impose some tax on refactoring existing class-like constructs into the class keyword, but people would adopt the fact that "the space of constructors has widened", and no backward-compatibility problem would appear, as it would with splitting it into two steps.

The second one was the reified-nil discussion involving pattern destructuring and the existential operator (again, let's pretend the technicalities can be solved). There, the issue was that the semantics of the existential operator (and consequently of refutable destructuring, since they are coupled) would be simplified by involving a nil object in the equation, by returning it from the existential operator or inside refutable destructuring. One possibility was to include the nil object head-on in the language (as part of the {undefined, null} ==-equivalence group), thus making it first-class and making things like
foo = (bar = p?.q).r
work fine.

Another, defensive, possibility is to use nil behind the scenes but change it to undefined whenever it becomes visible, with the proposition to include first-class nil in ES7. This brings little cost now, but in ES7 I am afraid of the same effect as in the previous case; notably, there will already be code that uses the new constructs (refutable destructuring, or the existential operator if included in ES6), but that code would be broken if ES7 changed the semantics to first-class nil. Again, this would not be a problem if the new constructs brought the new semantics right away.
Am I missing something, or is there something to this pattern? That is: new constructs, when bolder with their semantics, can a) save backward-compatibility cost compared to more granular progress; and b) be used to piggyback new semantics fairly cheaply (since it is a new construct in this version, it is more tolerated that it brings its new semantics with it)?