Module Interop
On Thu, Mar 21, 2013 at 12:25 PM, Kevin Smith <khs4473 at gmail.com> wrote:
The problem is that the
options.metadata === "node"
test is hand-waving. In a mixed environment where a module may be an ES6 module or a legacy Node module, how is the loader supposed to know how to link it? Ideally, everything will "just work", so that legacy modules can be used transparently alongside ES6 modules.
Some alternatives to parsing the whole file:
- look for a special comment at the beginning
- look at the module name
import "npm/moment" as moment;
- look at package metadata etc. that is available to your custom module loader
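Any of these signals could feed a custom loader's decision. As a rough sketch of the second option (the loader API and the "npm/" prefix convention are assumptions for illustration, not part of the proposal):

```javascript
// Hypothetical convention: names under an "npm/" prefix are treated as
// legacy Node modules; everything else is linked as ES6.
function isLegacyName(name) {
  return name.startsWith("npm/");
}

console.log(isLegacyName("npm/moment")); // true
console.log(isLegacyName("app/main"));   // false
```

A custom loader could consult such a check in its resolve step and record the result as metadata for the link hook.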
I think that's probably enough and we can get by without another hack, but what do you think?
On Mar 21, 2013, at 12:25 PM, Kevin Smith <khs4473 at gmail.com> wrote:
Ideally, everything will "just work", so that legacy modules can be used transparently alongside ES6 modules.
I disagree with this premise. It shouldn't be ES6's responsibility to auto-detect historical non-ES6 systems. If you want to build such auto-detection into a custom loader, by all means do so. But there are plenty of lightweight ways for programmers to "annotate" their code -- filename conventions, directory structure, etc -- that don't require particularly sophisticated logic to be implemented in a custom loader, and I don't see this as worth the complexity.
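As a hedged sketch of such a lightweight annotation (the "legacy/" directory convention here is invented for illustration):

```javascript
// Hypothetical directory convention: anything under a "legacy/" path
// segment is tagged as a non-ES6 module, so the loader's hooks can
// pick the right linking path without inspecting the source.
function metadataFor(address) {
  return address.split("/").includes("legacy") ? "node" : "es6";
}

console.log(metadataFor("legacy/moment.js")); // "node"
console.log(metadataFor("src/app.js"));       // "es6"
```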
- look for a special comment at the beginning
For backward compat, you'd have to put the comment in new ES6 modules. That's a spank-belt worse than "use strict" ; )
- look at the module name
import "npm/moment" as moment;
This is a good option, but it's not transparent. If you upgrade "moment" to ES6 modules, you have to change the line above. Not bad, though.
- look at package metadata etc. that is available to your custom module loader
Again, for backward compat the flag would have to go in new ES6 modules. We want the tax to be paid by legacy modules, not new modules.
All good ideas. #2 has the most promise, I think.
Ideally, everything will "just work", so that legacy modules can be used transparently alongside ES6 modules.
I disagree with this premise. It shouldn't be ES6's responsibility to auto-detect historical non-ES6 systems.
Of course, I tend to agree with your disagreement : ) But in any case I think it's useful to look at what it would take to make things "just work", even if only to reject the goal.
Referencing the presentation at meetings:meeting_mar_12_2013.
The module loader proposal outlined in the presentation is looking pretty solid. However, there is a weakness which I would like to point out.
The primary change in this proposal is a "link" hook, which takes source code and can return one of three different things:
In the presentation, there is an example of loading require-based Node modules using the link hook. If the module is a Node module, then the source code is eval'ed and a Module object is created based on the result of that execution:
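The presentation's example is not reproduced in the thread; a minimal stand-in, with a plain object in place of the proposed Module constructor and invented helper names, might look like:

```javascript
// Hedged sketch only: `Module` and the real loader API are replaced with
// plain objects and functions for illustration.
function evalNodeModule(source) {
  const module = { exports: {} };
  // CommonJS-style evaluation: wrap the source in a function scope.
  new Function("module", "exports", source)(module, module.exports);
  return module.exports;
}

function link(source, options) {
  if (options.metadata === "node") {
    // Stand-in for `return new Module(exports)` in the proposal.
    return { exports: evalNodeModule(source) };
  }
  // Returning undefined means: link this source as a normal ES6 module.
}

const m = link('exports.answer = 42;', { metadata: "node" });
console.log(m.exports.answer); // 42
```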
The problem is that the
options.metadata === "node"
test is hand-waving. In a mixed environment where a module may be an ES6 module or a legacy Node module, how is the loader supposed to know how to link it? Ideally, everything will "just work", so that legacy modules can be used transparently alongside ES6 modules.

A reasonable heuristic is to use the legacy module linking algorithm only if there are no import or export declarations in the source code. However, obtaining that information requires parsing the source. Even using a C++ parser, it seems wasteful to parse every module's source code twice.
There are different ways to address this, but we can note two things:
There will never be a need to override the "link" behavior for an actual ES6 module, that is, a module whose source code contains import or export declarations. In that case, all of the relevant linking information is already contained directly within the source code.
Regardless of whether linking is overridden or not for a given source, that source must be valid ES6 code.
Taking these two points together, it is sufficient for our interoperability needs to redefine the "link" hook as follows:
A diagram: docs.google.com/drawings/d/1WfMhFq_kcyA1kS47V-M8odteA63IjLHAbNkqRynfIDg/edit?usp=sharing
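The redefinition itself lives in the linked diagram rather than the thread. One reading consistent with the two observations above, sketched with invented function names, is that the loader parses first and only consults the custom hook when the source contains no import or export declarations:

```javascript
// Hedged sketch of the redefined flow; all names here are invented.
// parseCheck() stands in for the engine's (C++) parse of the source;
// the regex is only a crude placeholder for that parse.
function parseCheck(source) {
  return /(^|[\r\n])\s*(import|export)[\s{"']/.test(source);
}

function loadAndLink(source, customLink) {
  if (parseCheck(source)) {
    return { kind: "es6" };          // ES6 declarations found: never override
  }
  const result = customLink(source); // legacy path: give the hook a chance
  return result !== undefined ? result : { kind: "es6" };
}

const out = loadAndLink('exports.x = 1;', src => ({ kind: "node", src }));
console.log(out.kind); // "node"
```

Under this reading, only sources that fail the check pay for a second parse inside the custom hook, matching the cost claim below.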
Yes, this approach is a little hacky, but so is the requirement: that we must support transparent interoperability with a legacy module system. Note that with this approach, we will still end up parsing code for legacy modules twice, but (a) it will be parsed by the C++ engine and (b) the cost will be paid only for legacy code.
Thoughts? Holes?
Thanks for your time,