Two interoperable implementations rule
On 7/11/08 3:01 PM, Maciej Stachowiak wrote:
Since there is precedent within ECMA, then I definitely think we should take a formal vote on adopting this rule for TC39, in particular that we must have two interoperable implementations for any of our specs before it progresses outside our committee.
This proposal is in the spirit of what we have intended for ES4 all along. Formalizing it seems to me to be a good idea. Will keep us honest anyway ;-)
There are also some details to be worked out:
- Is "two interoperable implementations" at feature granularity, or whole spec granularity? In particular, is it ok to cite two implementations for one feature, but two other implementations for another?
Very good point. I've always thought that "whole feature" was good enough, but if our goal is defining a language for interoperability on the web then "whole spec" seems to be the right choice.
- How is interoperability to be demonstrated? Do we accept good-faith claims of support, or do we need a test suite?
I'd say that good faith is good enough. It's easy enough for us to check each other's work. And the blogosphere will not be kind to liars.
One more detail:
- What constitutes a qualifying implementation? Does Rhino, EJScript (mbedthis.com), or Flash Player qualify? Or must it be one of the four leading browsers?
Given the nature of programming languages and the high stakes of Web standards, I would personally prefer whole-spec granularity (different implementations having different mixes of features does not prove real interoperability), and a test suite rather than just bare claims of support.
Again, it will be hard to get away with cheating. But, on the other hand, an unofficial test suite (such as SpiderMonkey's) would make it easier for implementors to be honest.
To be clear, I propose this rule not to block ES3.1, but to make it successful. The WebKit project will accept patches for any feature of 3.1 that has been reconciled with 4, and we will likely devote Apple resources to implementing such features as well, so SquirrelFish will likely be a candidate for one of the interoperable implementations. Mozilla also has an extensive test suite for ECMAScript 3rd edition, which could be a good starting point for an ES3.1 test suite.
I also note that the strong version of the interoperable implementations rule will be an even higher hurdle for ES4.
Yes, and one we've felt we should jump over all along. It would be good for us to agree that standards produced by TC39 have all passed such a test. As a consequence I suspect that our specs will have a high rate of success.
On Sat, 12 Jul 2008 00:01:26 +0200, Maciej Stachowiak <mjs at apple.com>
wrote:
To be clear, I propose this rule not to block ES3.1, but to make it successful. The WebKit project will accept patches for any feature of 3.1 that has been reconciled with 4, and we will likely devote Apple resources to implementing such features as well, so SquirrelFish will likely be a candidate for one of the interoperable implementations. Mozilla also has an extensive test suite for ECMAScript 3rd edition, which could be a good starting point for an ES3.1 test suite.
I also note that the strong version of the interoperable implementations rule will be an even higher hurdle for ES4.
Any comments?
From experience with other specifications, requiring two interoperable
implementations does sound like a good idea. Especially if they are Web
browsers so compatibility can be tested. It also means that once the
specification ships authors can actually use it.
On Jul 11, 2008, at 3:01 PM, Maciej Stachowiak wrote:
On Jul 10, 2008, at 6:29 AM, OpenStrat at aol.com wrote:
In a message dated 7/10/2008 3:03:12 A.M. Eastern Daylight Time, mjs at apple.com writes:

I do not believe that ECMA has the "two interoperable implementations" rule that the IETF and W3C have, but since ECMAScript is a standard of equal importance to the Web, I think we should adopt this rule for any future edition of ECMAScript. Such a rule is needed precisely to avoid such casual breakage relative to Web reality. Can we make that a binding TC39 resolution?

While it is true that no such rule exists in Ecma, it has been used in work I am familiar with (optical storage) within TC 31. Early work on MO storage resulted in TC 31 agreeing that at least two implementations must demonstrate interoperability before approval of the standard. This meant that both disk manufacturers and drive manufacturers had to work together to demonstrate that the products resulting from the standard would work together. The committee always followed this rule without question, and the CC and GA of Ecma did not interfere with its implementation.

We can add this subject to discussion at Oslo, but this is a question that I would put to an internal vote of TC 31 since it has wider impact than may be represented in Oslo.

Since there is precedent within ECMA, then I definitely think we should take a formal vote on adopting this rule for TC39, in particular that we must have two interoperable implementations for any of our specs before it progresses outside our committee.

There are also some details to be worked out:

- Is "two interoperable implementations" at feature granularity, or whole spec granularity? In particular, is it ok to cite two implementations for one feature, but two other implementations for another?
- How is interoperability to be demonstrated? Do we accept good-faith claims of support, or do we need a test suite?

Given the nature of programming languages and the high stakes of Web standards, I would personally prefer whole-spec granularity (different implementations having different mixes of features does not prove real interoperability), and a test suite rather than just bare claims of support.

To be clear, I propose this rule not to block ES3.1, but to make it successful. The WebKit project will accept patches for any feature of 3.1 that has been reconciled with 4, and we will likely devote Apple resources to implementing such features as well, so SquirrelFish will likely be a candidate for one of the interoperable implementations. Mozilla also has an extensive test suite for ECMAScript 3rd edition, which could be a good starting point for an ES3.1 test suite.

I also note that the strong version of the interoperable implementations rule will be an even higher hurdle for ES4.

Any comments?
You don't need another huzzah from me.
The hurdle is certainly higher for ES4, although it may be less high
given its reference implementation, which could pass the tests.
Should a reference implementation, even if slow, count?
Of course tests are never complete, but we need not pretend they are
to have confidence in interoperation. I am interested in real
programmers banging on "draft" implementations, which will produce
bug reports beyond what tests find, and lead to more tests being
developed.
Should a reference implementation, even if slow, count?
My own opinion on this is "no."
Since, for the most part, a reference implementation doesn't face the
performance and maintainability challenges that shipping software
faces, I don't think it fleshes out the same issues that a real-world
implementation would.
Geoff
On Sat, 12 Jul 2008 00:54:05 +0200, Brendan Eich <brendan at mozilla.org>
wrote:
The hurdle is certainly higher for ES4, although it may be less high given its reference implementation, which could pass the tests. Should a reference implementation, even if slow, count?
FWIW, a reference implementation misses a real important point of having
implementations before finalizing, testing Web compatibility.
(Removed ES3 as I'm not subscribed.)
On Jul 11, 2008, at 3:49 PM, Jeff Dyer wrote:
On 7/11/08 3:01 PM, Maciej Stachowiak wrote:
- How is interoperability to be demonstrated? Do we accept good-faith claims of support, or do we need a test suite?
I'd say that good faith is good enough. It's easy enough for us to
check each other's work. And the blogosphere will not be kind to liars.
I'm less concerned about cheating than about honest mistakes, which
may nonetheless affect interoperability, Web compatibility, or
practical implementability of the spec.
For the WebKit project, we always make our best effort to correctly
implement Web standards, and even make our own test cases as we go.
However, once an independently developed test suite appears it always
finds mistakes in our implementation. I think we are not unusual in
this regard.
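To make the point about independently developed tests concrete, here is a minimal sketch (illustrative only, not taken from any suite mentioned in this thread) of the kind of small self-checking cases such a test suite would contain; the helper name is made up:

```javascript
// A tiny self-checking conformance test, in the spirit of the ES3 test
// suites discussed in this thread. assertEq is illustrative only.
function assertEq(actual, expected, message) {
  if (actual !== expected) {
    throw new Error(message + ": expected " + expected + ", got " + actual);
  }
}

// A few observable ES3 behaviors that independent engines must agree on.
assertEq(typeof null, "object", "typeof null");
assertEq([1, 2, 3].join("-"), "1-2-3", "Array.prototype.join");
assertEq(parseInt("08", 10), 8, "parseInt with explicit radix");
assertEq(NaN === NaN, false, "NaN never compares equal to itself");

console.log("all tests passed");
```

An independently written suite of such cases is exactly what catches the honest mistakes described above: two engines can each pass their own in-house tests yet disagree on a behavior neither thought to check.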
One more detail:
- What constitutes a qualifying implementation? Does Rhino, EJScript (mbedthis.com), or Flash Player qualify? Or must it be one of the four leading browsers?
That is a good point to raise. I think limiting to browser-hosted
implementations might be too extreme. On the other hand, if a spec
qualifies based solely on non-browser-hosted implementations, then we
have not done much to verify that the standard is compatible with the
real-world Web. I think a reasonable middle ground would be to require
at least one of the implementations to be browser-hosted. For these
purposes, I would count an implementation that works as a browser
extension replacing the scripting engine so long as it actually gets
wide testing, so for example ScreamingMonkey could qualify.
It should also be required that the two implementations are
independent (so a single vendor presenting two implementations would
not qualify). This may be tricky to define, since many possible
candidate implementations are open source and developed
collaboratively by community contributors and overlapping sets of
vendors. For example, would Rhino and SpiderMonkey count as
sufficiently independent implementations?
Given the nature of programming languages and the high stakes of Web standards, I would personally prefer whole-spec granularity (different implementations having different mixes of features does not prove real interoperability), and a test suite rather than just bare claims of support.

Again, it will be hard to get away with cheating. But, on the other hand, an unofficial test suite (such as SpiderMonkey's) would make it easier for implementors to be honest.
Again, I am less worried about cheating than mistakes. If we had a
semi-official test suite, it would not be normative, only the spec is
normative. It would only be a tool for verifying interoperability has
been achieved to a reasonable degree. The concern is less about
deliberate deception than about having at least the minimal evidence
needed to make a fact-based claim.
- Maciej
On Jul 11, 2008, at 4:06 PM, Geoffrey Garen wrote:
Should a reference implementation, even if slow, count?
My own opinion on this is "no."
Since, for the most part, a reference implementation doesn't face the performance and maintainability challenges that shipping software faces, I don't think it fleshes out the same issues that a real-world implementation would.
I happen to agree, but this means there's more than a shared test
suite in answer to Maciej's second question:
- How is interoperability to be demonstrated? Do we accept good-faith claims of support, or do we need a test suite?
If only a test suite were enough, then the RI would have to count.
The chicken-and-egg problems with prototype implementations and draft
specs suggest that we need all of tests, users banging on prototypes
and causing new (reduced) tests to be written, and of course specs
(ideally testable, which is the primary reason for the RI).
It will take nice judgment along with hard work to reach the point
where we believe the specs should be standardized. It's clear some
vendors won't want to risk implementing and shipping something that
has not yet been standardized. I don't want to over-formalize at this
point, but I'm happy to exclude the RI in the "Two interoperable
implementations" rule.
A few thoughts on the general topic and various points that have been raised:
Overall, I think this is a good idea. My personal opinion is that standardization should follow proven utility, not the other way around. However, it's difficult to get meaningful use of proposed web standards until there is ubiquitous implementation. If we, as a community, can find a way to meaningfully work together to advance web technology it will be a very good thing.
Realistically, I think it has to be real browser-based implementations. However, Maciej's at-least-one-browser-implementation suggestion may be good enough. My perception is that we have far more unresolved "will it break the web" arguments than we do about the actual utility of features. Let's just demonstrate it on the web, one way or another. BTW, I think this puts us (Microsoft) at a disadvantage because we have self-imposed restrictions that currently make it much harder for us to publicly demonstrate (or even discuss) browser changes or enhancements than it would be for any sort of standalone implementation we did. We'll have to learn how to deal with it.
Conversely, I don't think a "reference implementation" really makes the cut, even if it is hosted in a browser. However, there isn't necessarily a sharp line between a reference implementation and a simplistic or naïve "production" implementation so maybe the browser hosted requirement is as close as we can come to pinning that down.
I'm ambivalent on the single-feature or entire-specification question. For a spec on the order of what is being proposed as ES3.1, I don't think an entire-spec requirement would be unreasonably burdensome. For more complex feature sets I'm less sure. A related question is what does it take for a feature to even get into a proposed specification? You can't require complete implementation of a spec that is not yet complete. Is a public, browser-based implementation a prerequisite for even getting to the state of a feature proposal? I could argue both sides of that one.
Feature interactions are often a source of unanticipated problems. That argues for testing an entire specification.
Is there a time limit in which implementations must occur? How do we actually reach the point of "shipping" a revised specification?
An implementation without some way to validate it seems of limited use. Should we also expect those who propose features to provide a validation suite?
How fine a granularity do we push this to? Some of the sorts of clarifying changes to existing specification language, or changes to specification models that are needed to accommodate new features, may not even be directly testable if their intent is no net change to the existing feature set.
Don't take this as a "vote" yet, as we certainly need to have some internal discussion on the topic. However, I don't see how such a requirement would be in any way inconsistent with how Microsoft currently thinks about the process of web evolution.
On Jul 11, 2008, at 5:03 PM, Allen Wirfs-Brock wrote:
A few thoughts on the general topic and various points that have been raised:

Overall, I think this is a good idea. My personal opinion is that standardization should follow proven utility, not the other way around. However, it's difficult to get meaningful use of proposed web standards until there is ubiquitous implementation. If we, as a community, can find a way to meaningfully work together to advance web technology it will be a very good thing.

Realistically, I think it has to be real browser-based implementations. However, Maciej's at-least-one-browser-implementation suggestion may be good enough. My perception is that we have far more unresolved "will it break the web" arguments than we do about the actual utility of features. Let's just demonstrate it on the web, one way or another. BTW, I think this puts us (Microsoft) at a disadvantage because we have self-imposed restrictions that currently make it much harder for us to publicly demonstrate (or even discuss) browser changes or enhancements than it would be for any sort of standalone implementation we did. We'll have to learn how to deal with it.
There are a few purposes for requiring interoperable implementations
to advance the specification:
- Demonstrate practical implementability - that the spec can be implemented correctly in a production-quality implementation without truly excessive implementation effort.
- Show that the spec can be implemented without compromising Web compatibility, through implementor review and widespread testing.
- Help flush out performance, security and usability issues through implementation and use of the resulting implementations.
- Show that the spec language is unambiguous enough that multiple independent implementations are compatible.
A reference implementation, even if browser-hosted, would have limited
utility for purposes 1 and 3 (since by definition it does not aim to
be production-quality or high-performance). It may still help with
goal #2 a lot if browser-hosted.
Conversely, I don't think a "reference implementation" really makes the cut, even if it is hosted in a browser. However, there isn't necessarily a sharp line between a reference implementation and a simplistic or naïve "production" implementation, so maybe the browser-hosted requirement is as close as we can come to pinning that down.

I'm ambivalent on the single-feature or entire-specification question. For a spec on the order of what is being proposed as ES3.1, I don't think an entire-spec requirement would be unreasonably burdensome. For more complex feature sets I'm less sure. A related question is what does it take for a feature to even get into a proposed specification? You can't require complete implementation of a spec that is not yet complete. Is a public, browser-based implementation a prerequisite for even getting to the state of a feature proposal? I could argue both sides of that one.
I think it would be overkill to require an implementation (or even a
test suite) to even make a proposal. We should expect that at some
point in the standards development process, there will be a cycle of
feedback among implementations and the spec, where implementors
identify problems and propose changes, and the spec is adapted as a
result.
The IETF and W3C, the two other major standards organizations that
operate in the area of Web and Internet standards, have a two
implementation rule combined with a sequence of maturity levels. Here
is a summary of their levels:
IETF www.ietf.org/rfc/rfc2026.txt:
- Internet-Draft -- Initial published version, for comment and review (typically there are multiple iterations as an Internet-Draft; the minimum comment period is 2 weeks).
- Proposed Standard -- Has resolved known design choices, is believed to be well-understood, has received significant community review. But further experience might result in a change or even retraction of the specification before it advances. This is the stage at which wide implementation is encouraged. Minimum 6 month duration.
- Draft Standard -- Two independent and interoperable implementations from different code bases have been developed. Well-understood and known to be quite stable. Minimum 4 month duration.
- Internet Standard -- Characterized by a high degree of technical maturity and by a generally held belief that the specified protocol or service provides significant benefit to the Internet community. (Not all implementations reach this stage.)
(There is also a minimum 2 week Last Call period between any state
transition.)
W3C <www.w3.org/2005/10/Process-20051014/tr.html#maturity-levels>:
- Working Draft -- A signal to the community to begin reviewing the document. (Typically multiple iterations of this step.)
- Last Call Working Draft -- Final Working Draft that, if no serious issues are found, proceeds to Candidate Recommendation. All technical requirements are believed to be satisfied. Minimum 3 week duration.
- Candidate Recommendation -- The Director calls for implementations. Features may be identified as "at risk", in which case they are dropped if interoperable implementations are not produced, instead of sending the spec back to Working Draft. Implementation and test suite work occurs. A minimum duration must be stated before entering CR.
- Proposed Recommendation -- Two interoperable implementations are required, and the spec should be considered mature and widely reviewed, with significant experience. After this final review period, the Director and the W3C Membership vote whether to advance to a full Recommendation. Minimum 4 week duration.
- Recommendation -- Final stage. The spec is believed to have significant support and the ideas in it are appropriate for the web.
I would say the drafts of ES3.1 and ES4 published so far are
equivalent in maturity to an IETF Internet-Draft or a W3C Working
Draft. I would suggest that before seeking final ECMA approval, we
should reach a maturity level equivalent to IETF Draft Standard or W3C
Proposed Recommendation. This would also imply that we need a stage
equivalent to Proposed Standard / Candidate Recommendation - where the
spec is considered largely complete and stable, but we are focusing on
finishing implementations and developing tests. (Of course, this does
not mean that implementation work cannot start early, as is being done
for ES4.)
Feature interactions are often a source of unanticipated problems.
That argues for testing an entire specification.
I agree.
Is there a time limit in which implementations must occur? How do
we actually reach the point of "shipping" a revised specification.
If we do something analogous to W3C or IETF process, there is no time
limit but the spec can't "ship" in final form until the
implementations are there. However, if it sits in this state for a
long time, features may be dropped or we go back to the drawing board
more generally.
An implementation, without some way to validate it seems of limited
use. Should we also expect those who propose features to also
provide a validation suite?
That would be a pretty high bar to pass to even make a suggestion, but
I do think it should be strongly encouraged.
How fine a granularity do we push this to. Some of the sorts of
clarify changes to existing specification language or changes to
specification models that are needed to accommodate new features may
not even be directly testable if their intent is no net change to
the existing feature set.
Anything in the spec that has no observable effect on behavior does
not need to be tested. Only normative conformance criteria need to be
tested.
Don't take this a "vote" yet, as we certainly need to have some
internal discussion on the topic. However, I don't see how such a
requirement would be in any way inconsistent with how Microsoft
currently thinks about the process of web evolution.
Cool, I look forward to hearing more from you on this.
- Maciej
Maciej Stachowiak wrote:
The WebKit project will accept patches for any feature of 3.1 that has been reconciled with 4, and we will likely devote Apple resources to implementing such features as well, so SquirrelFish will likely be a candidate for one of the interoperable implementations. Mozilla also has an extensive test suite for ECMAScript 3rd edition, which could be a good starting point for an ES3.1 test suite.
Not being familiar with the webkit code base or process for accepting patches, can you point me to where I can find out more?
- Sam Ruby
On Jul 14, 2008, at 8:12 AM, Sam Ruby wrote:
Maciej Stachowiak wrote:
The WebKit project will accept patches for any feature of 3.1 that has been reconciled with 4, and we will likely devote Apple resources to implementing such features as well, so SquirrelFish will likely be a candidate for one of the interoperable implementations. Mozilla also has an extensive test suite for ECMAScript 3rd edition, which could be a good starting point for an ES3.1 test suite.

Not being familiar with the webkit code base or process for accepting patches, can you point me to where I can find out more?
Sure!
Here are basic instructions on how to check out, build and debug the code (applicable to Windows and Mac; the Gtk and Qt ports have their build processes documented elsewhere):
webkit.org/building/checkout.html, webkit.org/building/build.html, webkit.org/building/run.html
Here's an overview of the process for contributing: webkit.org/coding/contributing.html
And here is contact info: webkit.org/contact.html
These links and a lot more info are all on the front page of webkit.org
- Maciej
On Fri, Jul 11, 2008 at 7:10 PM, Maciej Stachowiak <mjs at apple.com> wrote:
This may be tricky to define, since many possible candidate implementations are open source and developed collaboratively by community contributors and overlapping sets of vendors. For example, would Rhino and SpiderMonkey count as sufficiently independent implementations?
Similarly, if we end up with, f.e., both WebKit and Spidermonkey using decNumber as our internal implementation of Decimal, does that count as two interoperable implementations? It seems like we'd be at risk of mostly testing that code against itself, so I would hope that we look for such reuse cases when we're making sure that we actually have usefully-distinct implementations of features to validate the spec.
Mike
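For readers following the Decimal subthread: the motivation for a Decimal type is that ECMAScript's IEEE 754 binary doubles cannot represent most decimal fractions exactly, which is the behavior any conforming implementation (decNumber-based or otherwise) must get right. A quick illustration in current JavaScript:

```javascript
// IEEE 754 binary doubles cannot represent 0.1 or 0.2 exactly, so the
// familiar identity fails in every current ECMAScript engine:
var sum = 0.1 + 0.2;
console.log(sum === 0.3);  // false
console.log(sum);          // 0.30000000000000004

// A Decimal type as discussed in this thread would make such sums exact.
// (No API from the actual proposal is shown here; decNumber is one
// possible underlying implementation, as the thread notes.)
```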
On Fri, Jul 11, 2008 at 7:10 PM, Maciej Stachowiak <mjs at apple.com>
wrote:
This may be tricky to define, since many possible candidate implementations are open source and developed collaboratively by community contributors and overlapping sets of vendors. For example, would Rhino and SpiderMonkey count as sufficiently independent implementations?
Similarly, if we end up with, f.e., both WebKit and Spidermonkey using decNumber as our internal implementation of Decimal, does that count as two interoperable implementations? It seems like we'd be at risk of mostly testing that code against itself, so I would hope that we look for such reuse cases when we're making sure that we actually have usefully-distinct implementations of features to validate the spec.
<chuckle> Isn't that the idea of open-source software -- that by using the same software you get the same results (fdlibm, for example)? But a good point ... the 'new' issue here is that the same code in the different environments ends up with the same calls to the underlying library and yields the same results.
(The decNumber code is quite stable, for example -- averaging fewer than one detected bug/year since its first release in 2001, is used in numerous IBM, SAP, and other vendors' products, and is part of the verification suite for power.org, PowerPC, and IBM mainframe hardware.)
Mike
Unless stated otherwise above: IBM United Kingdom Limited - Registered in England and Wales with number 741598. Registered office: PO Box 41, North Harbour, Portsmouth, Hampshire PO6 3AU
On Mon, Jul 14, 2008 at 4:37 PM, Mike Cowlishaw <MFC at uk.ibm.com> wrote:
(The decNumber code is quite stable, for example -- averaging fewer than one detected bug/year since its first release in 2001, is used in numerous IBM, SAP, and other vendors' products, and is part of the verification suite for power.org, PowerPC, and IBM mainframe hardware.)
I have no doubt; it's more whether the spec is sufficiently detailed and clear that someone can work from it and produce an interoperable implementation without using the same software impl. Otherwise the spec can just include the decNumber source in an appendix, I guess. :)
Mike
On Jul 14, 2008, at 1:46 PM, Mike Shaver wrote:
On Mon, Jul 14, 2008 at 4:37 PM, Mike Cowlishaw <MFC at uk.ibm.com> wrote:

(The decNumber code is quite stable, for example -- averaging fewer than one detected bug/year since its first release in 2001, is used in numerous IBM, SAP, and other vendors' products, and is part of the verification suite for power.org, PowerPC, and IBM mainframe hardware.)

I have no doubt; it's more whether the spec is sufficiently detailed and clear that someone can work from it and produce an interoperable implementation without using the same software impl. Otherwise the spec can just include the decNumber source in an appendix, I guess. :)
I'd agree with the point of concern here. The risk is not bugs in
decNumber but that the spec might not match what it does, or may not
be sufficiently detailed to allow an independent interoperable
implementation. However, if decNumber implements something specified
in an independent standard (there's an IEEE standard for decimal
floating point, isn't there?), then I don't think this should count
against two implementations both using decNumber. For example, both
Gecko and WebKit use ICU but I would still count them as independent
implementations of HTML and CSS, since the shared component is only
used to implement the underlying Unicode standard, not the HTML and
CSS standards themselves.
- Maciej
Maciej Stachowiak <mjs at apple.com> wrote on 14/07/2008 22:32:02:
On Jul 14, 2008, at 1:46 PM, Mike Shaver wrote:
On Mon, Jul 14, 2008 at 4:37 PM, Mike Cowlishaw <MFC at uk.ibm.com> wrote:
(The decNumber code is quite stable, for example -- averaging fewer than one detected bug/year since its first release in 2001, is used in numerous IBM, SAP, and other vendors' products, and is part of the verification suite for power.org, PowerPC, and IBM mainframe hardware.)
I have no doubt; it's more whether the spec is sufficiently detailed and clear that someone can work from it and produce an interoperable implementation without using the same software impl. Otherwise the spec can just include the decNumber source in an appendix, I guess. :)
I'd agree with the point of concern here. The risk is not bugs in decNumber but that the spec might not match what it does, or may not be sufficiently detailed to allow an independent interoperable implementation. However, if decNumber implements something specified in an independent standard (there's an IEEE standard for decimal floating point, isn't there?), then I don't think this should count against two implementations both using decNumber. For example, both Gecko and WebKit use ICU but I would still count them as independent implementations of HTML and CSS, since the shared component is only used to implement the underlying Unicode standard, not the HTML and CSS standards themselves.
Yes, that's a good example (and decNumber is in fact a component of ICU). You can really think of it as providing an alternative to the hardware instructions on machines which don't have decimal floating-point hardware yet. And yes, the relevant standard is the new IEEE 754. decNumber is very platform-independent; I test it on both big-endian and little-endian machines, and Nelson Beebe has it running on more than 20 flavors of Unix. It is also in GCC, where it is used for implementing the new C decimal floating-point datatypes.
For lots more specifications and details of implementation, hardware architecture documents, etc., see: www2.hursley.ibm.com/decimal/#links
Mike
On Mon, Jul 14, 2008 at 8:45 AM, Mike Shaver <mike.shaver at gmail.com> wrote:
On Fri, Jul 11, 2008 at 7:10 PM, Maciej Stachowiak <mjs at apple.com> wrote:
[...] For example, would Rhino and SpiderMonkey count as sufficiently independent implementations?
Similarly, if we end up with, f.e., both WebKit and Spidermonkey using decNumber as our internal implementation of Decimal, does that count as two interoperable implementations? It seems like we'd be at risk of mostly testing that code against itself, so I would hope that we look for such reuse cases when we're making sure that we actually have usefully-distinct implementations of features to validate the spec.
Adding decimal to Rhino would presumably build on the BigDecimal class already present in Java. Is Java's BigDecimal class sufficiently conformant to the relevant IEEE spec to support a conformant implementation of the decimal proposed for EcmaScript? And is the implementation sufficiently independent of the implementation the IBM guys might add to WebKit or Spidermonkey to count as a cross check on the spec? IBM guys, would you be interested in contributing such a decimal implementation to Rhino?
Mark S. Miller wrote:
On Mon, Jul 14, 2008 at 8:45 AM, Mike Shaver <mike.shaver at gmail.com> wrote:
On Fri, Jul 11, 2008 at 7:10 PM, Maciej Stachowiak <mjs at apple.com> wrote:
[...] For example, would Rhino and SpiderMonkey count as sufficiently independent implementations?
Similarly, if we end up with, for example, both WebKit and SpiderMonkey using decNumber as our internal implementation of Decimal, does that count as two interoperable implementations? It seems like we'd be at risk of mostly testing that code against itself, so I would hope that we look for such reuse cases when we're making sure that we actually have usefully-distinct implementations of features to validate the spec.
Adding decimal to Rhino would presumably build on the BigDecimal class already present in Java. Is Java's BigDecimal class sufficiently conformant to the relevant IEEE spec to support a conformant implementation of the decimal proposed for EcmaScript? And is the implementation sufficiently independent of the implementation the IBM guys might add to WebKit or Spidermonkey to count as a cross check on the spec? IBM guys, would you be interested in contributing such a decimal implementation to Rhino?
Let me answer that in three parts.
- Would I be willing to participate in developing a compatible Decimal implementation for Rhino? Absolutely! (Though probably not until next month at the earliest.)
- Is the Rhino implementation sufficiently different from the one I just started working on based on SpiderMonkey? I would think so. Note that I changed the word from "independent" to "different" in this question.
- Would having the same people who are involved in writing the spec provide two different implementations give the needed assurance that the specification is ready for standardization? That I am not so clear on.
Note: we are focusing on Decimal here, though I sensed that the original question was meant to be much more general. Decimal may not be the best example, as we are talking about implementing a thin interface over a mature and time-tested specification. Even the interface itself is based on a lot of prior art and experience.
- Sam Ruby
Mark wrote:
Adding decimal to Rhino would presumably build on the BigDecimal class already present in Java. Is Java's BigDecimal class sufficiently conformant to the relevant IEEE spec to support a conformant implementation of the decimal proposed for EcmaScript?
It follows the same arithmetic rules, but is not a complete implementation of IEEE 754 (in particular, it does not have Infinity and NaNs, or malleable exponent range limits), and hence its overflow/exception cases are different. Sun was talking of either extending it or providing a wrapper class which used it to provide exact IEEE 754 conformance, but I do not know the status of that work item.
And is the implementation sufficiently independent of the implementation the IBM guys might add to WebKit or Spidermonkey to count as a cross check on the spec?
Well, I wrote most of the Java 5 extensions to BigDecimal (including the MathContext support, etc.), so one might argue that it's not particularly independent. However, it is a completely different implementation, historically.
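[Editor's note: a short sketch of the conformance gap Mike describes. BigDecimal shares the IEEE 754 decimal128 rounding rules via MathContext.DECIMAL128, but has no Infinity or NaN, so its exception behavior diverges; the class name is illustrative.]

```java
import java.math.BigDecimal;
import java.math.MathContext;

public class BigDecimalGaps {
    public static void main(String[] args) {
        // Same rounding rules as IEEE 754 decimal128: 34 significant
        // digits, round-half-even.
        BigDecimal third =
                BigDecimal.ONE.divide(new BigDecimal("3"), MathContext.DECIMAL128);
        System.out.println(third.precision());  // 34

        // But no Infinity or NaN: where IEEE 754 arithmetic would yield
        // +Infinity, BigDecimal throws, so overflow/exception cases differ.
        try {
            BigDecimal.ONE.divide(BigDecimal.ZERO);
        } catch (ArithmeticException e) {
            System.out.println("division by zero throws instead of yielding Infinity");
        }

        // Non-finite doubles cannot even be converted.
        try {
            new BigDecimal(Double.POSITIVE_INFINITY);
        } catch (NumberFormatException e) {
            System.out.println("Infinity is not representable as a BigDecimal");
        }
    }
}
```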
IBM guys, would you be interested in contributing such a decimal implementation to Rhino?
BigDecimal is 'owned' by Sun, so not really something we could contribute.
Mike
On Jul 10, 2008, at 6:29 AM, OpenStrat at aol.com wrote:
Since there is precedent within ECMA, I definitely think we should take a formal vote on adopting this rule for TC39; in particular, that we must have two interoperable implementations for any of our specs before they progress outside our committee.
There are also some details to be worked out:
- Is "two interoperable implementations" at feature granularity, or whole spec granularity? In particular, is it ok to cite two implementations for one feature, but two other implementations for another?
- How is interoperability to be demonstrated? Do we accept good-faith claims of support, or do we need a test suite?
Given the nature of programming languages and the high stakes of Web standards, I would personally prefer whole-spec granularity (different implementations having different mixes of features does not prove real interoperability), and a test suite rather than just bare claims of support.
To be clear, I propose this rule not to block ES3.1, but to make it successful. The WebKit project will accept patches for any feature of 3.1 that has been reconciled with 4, and we will likely devote Apple resources to implementing such features as well, so SquirrelFish will likely be a candidate for one of the interoperable implementations. Mozilla also has an extensive test suite for ECMAScript 3rd edition, which could be a good starting point for an ES3.1 test suite.
I also note that the strong version of the interoperable implementations rule will be an even higher hurdle for ES4.
Any comments?
- Maciej