JSON Duplicate Keys

# Douglas Crockford (12 years ago)

The JSON RFC says

 The names within an object SHOULD be unique.

Sadly, there are people in the JavaScript community who interpreted SHOULD to mean DON'T HAVE TO. In a perfect world, we would change the SHOULD to MUST, but we can't. Interoperability and security concerns demand that we specify what happens when keys are duplicated. So we may end up with something like this:

 The names within an object SHOULD be unique. If a key is duplicated,
 a parser SHOULD reject. If it does not reject, it MUST take only the
 last of the duplicated key pairs.

Does anyone see a problem with this?
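
For concreteness, here is the behavior that the last clause would codify; as far as I know this matches what ES5 JSON.parse already does in shipping engines (a minimal sketch):

    // Duplicate keys are accepted; the last pair silently wins.
    var obj = JSON.parse('{"a": 1, "a": 2}');
    obj.a; // 2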

# Anne van Kesteren (12 years ago)

On Thu, Jun 6, 2013 at 12:29 PM, Douglas Crockford <douglas at crockford.com> wrote:

The JSON RFC says

The names within an object SHOULD be unique.

Sadly, there are people in the JavaScript community who interpreted SHOULD to mean DON'T HAVE TO. In a perfect world, we would change the SHOULD to MUST, but we can't. Interoperability and security concerns demand that we specify what happens when keys are duplicated. So we may end up with something like this:

The names within an object SHOULD be unique. If a key is duplicated,
a parser SHOULD reject. If it does not reject, it MUST take only the
last of the duplicated key pairs.

Does anyone see a problem with this?

SHOULD reject seems too strong given that we know perfectly well we cannot do that. MAY seems much more reasonable or maybe MUST either reject or take the last of the duplicated key pairs (decision up to the parser API specification).

-- annevankesteren.nl

# Douglas Crockford (12 years ago)

On 6/6/2013 4:34 AM, Anne van Kesteren wrote:

On Thu, Jun 6, 2013 at 12:29 PM, Douglas Crockford <douglas at crockford.com> wrote:

The JSON RFC says

 The names within an object SHOULD be unique.

Sadly, there are people in the JavaScript community who interpreted SHOULD to mean DON'T HAVE TO. In a perfect world, we would change the SHOULD to MUST, but we can't. Interoperability and security concerns demand that we specify what happens when keys are duplicated. So we may end up with something like this:

 The names within an object SHOULD be unique. If a key is duplicated,
 a parser SHOULD reject. If it does not reject, it MUST take only the
 last of the duplicated key pairs.

Does anyone see a problem with this?

SHOULD reject seems too strong given that we know perfectly well we cannot do that. MAY seems much more reasonable or maybe MUST either reject or take the last of the duplicated key pairs (decision up to the parser API specification).

The current RFC already says SHOULD. We don't have to weaken that. But we should be explicit about what a parser should do if it decides to accept anyway. The sentence

 If a key is duplicated, a parser SHOULD reject.

is not a change. It is implied by the first statement. The thing that is a change is the third statement

 If it does not reject, it MUST take only the last of the duplicated key pairs.

# François REMY (12 years ago)

The sentence

 If a key is duplicated, a parser SHOULD reject.

is not a change. It is implied by the first statement.

I do not agree. A 'should' authoring requirement is not meant to make the execution engine abort.

For example, the CSS flexbox spec says that an author should not use the CSS 'order' property to convey any semantic meaning, but that doesn't mean the browser will reject the 'order' declaration if it detects that the way it's used is not semantically valid. A 'should' is nothing more than advice to the author, not a requirement per se.

# gaz Heyes (12 years ago)

On 6 June 2013 12:29, Douglas Crockford <douglas at crockford.com> wrote:

The JSON RFC says

The names within an object SHOULD be unique.

Sadly, there are people in the JavaScript community who interpreted SHOULD to mean DON'T HAVE TO. In a perfect world, we would change the SHOULD to MUST, but we can't. Interoperability and security concerns demand that we specify what happens when keys are duplicated. So we may end up with something like this:

The names within an object SHOULD be unique. If a key is duplicated,
a parser SHOULD reject. If it does not reject, it MUST take only the
last of the duplicated key pairs.

Does anyone see a problem with this?

That doesn't make sense: "should" should be "must", because if the parser takes the last key then you can overwrite someone else's data.
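
A contrived sketch of that overwrite concern (the "role" key and its values are hypothetical, but the last-wins behavior matches current JSON.parse implementations):

    // Anyone who can append a duplicate key silently replaces the earlier value.
    var doc = JSON.parse('{"role": "user", "role": "admin"}');
    doc.role; // "admin" -- the first value is discarded without any error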

# Douglas Crockford (12 years ago)

On 6/6/2013 5:00 AM, gaz Heyes wrote:

On 6 June 2013 12:29, Douglas Crockford <douglas at crockford.com> wrote:

The JSON RFC says

    The names within an object SHOULD be unique.

Sadly, there are people in the JavaScript community who
interpreted SHOULD to mean DON'T HAVE TO. In a perfect world, we
would change the SHOULD to MUST, but we can't. Interoperability
and security concerns demand that we specify what happens when
keys are duplicated. So we may end up with something like this:

    The names within an object SHOULD be unique. If a key is duplicated,
    a parser SHOULD reject. If it does not reject, it MUST take only the
    last of the duplicated key pairs.

Does anyone see a problem with this?

That doesn't make sense: "should" should be "must", because if the parser takes the last key then you can overwrite someone else's data.

You are exactly right. But it is too late to change to MUST unless TC39 chooses to break the programs of the people who are doing what they shouldn't, which is something TC39 said it will not do.

# gaz Heyes (12 years ago)

On 6 June 2013 13:19, Douglas Crockford <douglas at crockford.com> wrote:

You are exactly right. But it is too late to change to MUST unless TC39 chooses to break the programs of the people who are doing what they shouldn't, which is something TC39 said it will not do.

Meh politics. I have no interest in such matters. Breaking broken programs shouldn't be a concern.

# Jeremy Darling (12 years ago)

Copying in from another post I sent. Instead of amending SHOULD to allow parsers to throw new errors, why not have them emit collections when duplication is found?

Duplicate keys should never have been allowed in the spec in the first place (almost all key/value stores don't allow them), or, if they were allowed, then when decomposed to JS objects they should have created arrays of values, much like most frameworks do for duplicate keys in HTTP headers.

Thus, and typing this makes my skin crawl and my head hurt:

{ "myKey": "Value 1", "myKey": 2 }

Becomes, upon parse:

{ "myKey": ["Value 1", 2] }

All values have now been accounted for, nothing has been created or lost, and code that didn't take multiple keys into account but did type checking on values will fail or adapt gracefully.
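
To be clear, JSON.parse today offers no hook for this, so the object below is written by hand to show the hypothetical merged output and how consuming code would have to adapt (a sketch, not an existing API):

    // Hypothetical result of a parser that merged duplicate keys into an array.
    var parsed = { myKey: ["Value 1", 2] };

    // Code that expected a single string value now has to detect the array case.
    var first = Array.isArray(parsed.myKey) ? parsed.myKey[0] : parsed.myKey;
    first.toUpperCase(); // "VALUE 1"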

# Rick Waldron (12 years ago)

On Thu, Jun 6, 2013 at 9:31 AM, Jeremy Darling <jeremy.darling at gmail.com> wrote:

Copying in from another post I sent. Instead of amending SHOULD to allow parsers to throw new errors why not have them emit collections when duplication is found?

Duplicate Keys should never have been allowed in the spec in the first place (most all key/value stores don't allow it) or if they were allowed when decomposed to JS objects they should have created arrays of values much like most frameworks do for duplicate keys in the HTTP headers.

Thus, and typing this makes my skin crawl and my head hurt:

{ "myKey": "Value 1", "myKey": 2 }

Becomes, upon parse:

{ "myKey": ["Value 1", 2] }

All values have now been accounted for. nothing has been created or lost, and code that didn't take into account multiple keys but did type checking on values will fail or adapt gracefully.

This breaks extant data value invariants. Ignoring the original question and addressing only this example, let's say the expected value was a string (your example shows a string and a number):

var parsed = JSON.parse(data);
parsed.myKey.toUpperCase(); // TypeError

This is web breaking.

# Rick Waldron (12 years ago)

On Thu, Jun 6, 2013 at 7:29 AM, Douglas Crockford <douglas at crockford.com> wrote:

The JSON RFC says

The names within an object SHOULD be unique.

Sadly, there are people in the JavaScript community who interpreted SHOULD to mean DON'T HAVE TO. In a perfect world, we would change the SHOULD to MUST, but we can't. Interoperability and security concerns demand that we specify what happens when keys are duplicated. So we may end up with something like this:

The names within an object SHOULD be unique. If a key is duplicated,
a parser SHOULD reject. If it does not reject, it MUST take only the
last of the duplicated key pairs.

+1

As far as I can tell, this is the de facto browser behavior.

# Allen Wirfs-Brock (12 years ago)

On Jun 6, 2013, at 4:34 AM, Anne van Kesteren wrote:

On Thu, Jun 6, 2013 at 12:29 PM, Douglas Crockford <douglas at crockford.com> wrote:

The JSON RFC says

The names within an object SHOULD be unique.

Sadly, there are people in the JavaScript community who interpreted SHOULD to mean DON'T HAVE TO. In a perfect world, we would change the SHOULD to MUST, but we can't. Interoperability and security concerns demand that we specify what happens when keys are duplicated. So we may end up with something like this:

The names within an object SHOULD be unique. If a key is duplicated, a parser SHOULD reject. If it does not reject, it MUST take only the last of the duplicated key pairs.

Does anyone see a problem with this?

SHOULD reject seems too strong given that we know perfectly well we cannot do that. MAY seems much more reasonable or maybe MUST either reject or take the last of the duplicated key pairs (decision up to the parser API specification).

Something that should be pointed out is that one reason JSON needs to accept duplicate field names is that some people currently use them to add comments to JSON datasets:

{ "//": "This is a comment about my data", "//": "that takes more than one line", "field1" : 1, "field2" : 2 }

The above is valid according to the existing RFC and is accepted as valid JSON by the existing ES JSON.parse spec. We don't want to invalidate existing JSON datasets that use this technique, so duplicate keys MUST be accepted.
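
For reference, this is what an ES5 JSON.parse actually produces for the comment trick above (a small sketch; only the last duplicate survives, which is fine for throwaway comments):

    var data = JSON.parse(
      '{ "//": "This is a comment about my data", "//": "that takes more than one line", "field1": 1, "field2": 2 }'
    );
    data["//"];   // "that takes more than one line" -- the last duplicate wins
    data.field1;  // 1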

What I would say, as a replacement for the current text is:

      The names within an object SHOULD be unique.  If a key is duplicated, a parser MUST take  <<use?? interpret??>> only the last of the duplicated key pairs.

I would be willing to lose the first sentence (containing the SHOULD) entirely as it doesn't add any real normative value.

# Allen Wirfs-Brock (12 years ago)

On Jun 6, 2013, at 4:51 AM, François REMY wrote:

The sentence

If a key is duplicated, a parser SHOULD reject.

is not a change. It is implied by the first statement.

I do not agree. A 'should' authoring requirement is not meant to make the execution engine abort.

For example, the CSS flexbox spec says that an author should not use the CSS 'order' property to convey any semantic meaning, but that doesn't mean the browser will reject the 'order' declaration if it detects that the way it's used is not semantically valid. A 'should' is nothing more than advice to the author, not a requirement per se.

+1

# Allen Wirfs-Brock (12 years ago)

On Jun 6, 2013, at 6:02 AM, gaz Heyes wrote:

On 6 June 2013 13:19, Douglas Crockford <douglas at crockford.com> wrote:

You are exactly right. But it is too late to change to MUST unless TC39 chooses to break the programs of the people who are doing what they shouldn't, which is something TC39 said it will not do.

Meh politics. I have no interest in such matters. Breaking broken programs shouldn't be a concern.

There is nothing broken about them. People have simply been doing what RFC 4627 allows them to do.

Invalidating existing JSON datasets (many of which may not be actively maintained) is a real concern. Who does it benefit to make them invalid?

# gaz Heyes (12 years ago)

On 6 June 2013 16:13, Allen Wirfs-Brock <allen at wirfs-brock.com> wrote:

Something that should be pointed out is that one reason JSON needs to accept duplicate field names is that some people currently use them to add comments to JSON datasets:

{ "//": "This is a comment about my data", "//": "that takes more than one line", "field1" : 1, "field2" : 2 }

That is totally stupid. IMO the JSON spec should support comments like // and /**/ and also support single quotes for string values. It's JavaScript for christ sake. It makes no sense to invent and support this crappy syntax.

# Allen Wirfs-Brock (12 years ago)

On Jun 6, 2013, at 6:46 AM, Rick Waldron wrote:

On Thu, Jun 6, 2013 at 7:29 AM, Douglas Crockford <douglas at crockford.com> wrote:

The JSON RFC says

The names within an object SHOULD be unique.

Sadly, there are people in the JavaScript community who interpreted SHOULD to mean DON'T HAVE TO. In a perfect world, we would change the SHOULD to MUST, but we can't. Interoperability and security concerns demand that we specify what happens when keys are duplicated. So we may end up with something like this:

The names within an object SHOULD be unique. If a key is duplicated,
a parser SHOULD reject. If it does not reject, it MUST take only the
last of the duplicated key pairs.

+1

As far as I can tell, this is the de facto browser behavior.

I'm not sure what you mean by this. The ES5 spec for JSON.parse requires (i.e., MUST) that duplicate keys be accepted and that the value associated with the last duplicated key wins. A valid implementation of JSON.parse MUST NOT reject such JSON strings.

# Allen Wirfs-Brock (12 years ago)

On Jun 6, 2013, at 8:21 AM, gaz Heyes wrote:

On 6 June 2013 16:13, Allen Wirfs-Brock <allen at wirfs-brock.com> wrote:

Something that should be pointed out is that one reason JSON needs to accept duplicate field names is that some people currently use them to add comments to JSON datasets:

{ "//": "This is a comment about my data", "//": "that takes more than one line", "field1" : 1, "field2" : 2 }

That is totally stupid. IMO the JSON spec should support comments like // and /**/ and also support single quotes for string values. It's JavaScript for christ sake. It makes no sense to invent and support this crappy syntax.

No, it's not JavaScript! It's a file format for storing and interchanging data. Preserving existing data is a key requirement.

# Anne van Kesteren (12 years ago)

On Thu, Jun 6, 2013 at 4:29 PM, Allen Wirfs-Brock <allen at wirfs-brock.com> wrote:

No, it's not JavaScript! It's a file format for storing and interchanging data. Preserving existing data is a key requirement.

Nevertheless, enhancing it with sensible comment syntax would be great. But as you said, there's a different list for that.

-- annevankesteren.nl

# gaz Heyes (12 years ago)

On 6 June 2013 16:29, Allen Wirfs-Brock <allen at wirfs-brock.com> wrote:

No, it's not JavaScript! It's a file format for storing and interchanging data. Preserving existing data is a key requirement.

Its syntax is based on JavaScript but crippled for no reason other than to follow an outdated specification. It's so frustrating to be at this end of a keyboard and see awful syntax being promoted just to follow the RFC, which is clearly flawed. It's so much more logical to change and fix the problems in the specification so that in the future code will work correctly and JSON parsers will be much easier to write and more consistent. "Don't break the web" shouldn't ever become "Keep the web broken"; code will never evolve if the core is rotten.

# Mark S. Miller (12 years ago)

Everyone, please keep in mind that JSON does not and never has claimed to be the best or last data format. No doubt many better ones have been and will be created, some as variants of JSON, some not. If you'd like to create a "fixed JSON" or "enhanced JSON" or whatever, feel free -- whether it treats comments, duplicated keys, or \u2028 differently or whatever. But please don't confuse any of these variants with JSON itself. As a data format, JSON's greatest value is its stability, and the inability for anyone, including us, to version it.

# Benoit Marchant (12 years ago)

What's really problematic is that there is no way to specify a version in the format itself so that it could be evolved without breaking existing data sets. This has been solved before for serialization, in Cocoa for example; we need that for JSON.

Benoit

# Mark Miller (12 years ago)

On Thu, Jun 6, 2013 at 8:58 AM, Benoit Marchant <marchant at mac.com> wrote:

What's really problematic is that there is no way to specify a version in the format itself so that it could be evolved without breaking existing data sets.

Exactly!

This has been solved before for serialization, in Cocoa for example; we need that for JSON.

You can "solve" that by inventing such a data format and calling it something else.

# Allen Wirfs-Brock (12 years ago)

On Jun 6, 2013, at 8:45 AM, Anne van Kesteren wrote:

On Thu, Jun 6, 2013 at 4:29 PM, Allen Wirfs-Brock <allen at wirfs-brock.com> wrote:

No, it's not JavaScript! It's a file format for storing and interchanging data. Preserving existing data is a key requirement.

Nevertheless, enhancing it with sensible comment syntax would be great. But as you said, there's a different list for that.

From an ECMAScript perspective, the way to get a JSON comment syntax in place would be to first add it to JSON.parse in the ES spec and get it deployed in browsers. Starting by adding it to the JSON RFC is just going to suggest to people that they can use comments in their JSON datasets when in fact the most widely deployed browser-based parsers won't accept them. Of course, getting support in parsers hosted by other languages is a whole different matter.

# Alexandre Morgaut (12 years ago)

On 6 June 2013, at 15:45, Rick Waldron wrote:

On Thu, Jun 6, 2013 at 9:31 AM, Jeremy Darling <jeremy.darling at gmail.com> wrote:

Copying in from another post I sent. Instead of amending SHOULD to allow parsers to throw new errors why not have them emit collections when duplication is found?

Duplicate Keys should never have been allowed in the spec in the first place (most all key/value stores don't allow it) or if they were allowed when decomposed to JS objects they should have created arrays of values much like most frameworks do for duplicate keys in the HTTP headers.

Thus, and typing this makes my skin crawl and my head hurt:

{ "myKey": "Value 1", "myKey": 2 }

Becomes, upon parse:

{ "myKey": ["Value 1", 2] }

All values have now been accounted for. nothing has been created or lost, and code that didn't take into account multiple keys but did type checking on values will fail or adapt gracefully.

This breaks extant data value invariants. Ignoring the original question and addressing only this example, let's say the expected value was a string (your example shows a string and a number)

var parsed = JSON.parse(data);
parsed.myKey.toUpperCase(); // TypeError

This is web breaking.

Rick

I agree with Rick and would add that if myKey can legitimately be an array, with any type of elements, the receiver has no way to know what the original message was.

In some cases, such as setting an email "To" field, a JSON message can provide either a string for a single email address or an array of addresses. This is a very simple example which might be handled by updating regex-based checks (see the sketch after this list), but:

  • this may break existing implementations
  • some more complex examples may not be detectable
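
A minimal sketch of that ambiguity (the "to" field and the addresses are hypothetical): under an array-merging parser, two very different payloads would produce the same object, so the receiver cannot tell them apart.

    // '{"to": "alice@example.com", "to": "bob@example.com"}'  -- a duplicated key
    // '{"to": ["alice@example.com", "bob@example.com"]}'      -- a legitimate array
    // Both would hypothetically parse to:
    var merged = { to: ["alice@example.com", "bob@example.com"] };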

Alexandre Morgaut Wakanda Community Manager Email : Alexandre.Morgaut at 4d.com

Web : www.4D.com

4D SAS 60, rue d'Alsace 92110 Clichy - France Standard : +33 1 40 87 92 00

# Mathias Bynens (12 years ago)

On 6 Jun 2013, at 15:02, gaz Heyes <gazheyes at gmail.com> wrote:

Meh politics. I have no interest in such matters. Breaking broken programs shouldn't be a concern.

Welcome to the Web :)

# Alexandre Morgaut (12 years ago)

On 6 June 2013, at 13:42, Douglas Crockford wrote:

The current RFC already says SHOULD. We don't have to weaken that. But we should be explicit about what a parser should do if it decides to accept anyway. The sentence

If a key is duplicated, a parser SHOULD reject.

is not a change. It is implied by the first statement. The thing that is a change is the third statement

If it does not reject, it MUST take only the last of the duplicated key pairs.

+1

Still, as throwing errors would also be Web breaking, I would then add to the ECMAScript JSON API one of the following:

  1. an Array status property which may be called "lastParseErrors"
  • it would be empty if there was no error
  • it would be filled with Error instances if some error occurred

ex:

var obj = JSON.parse('{"myKey": "Value 1", "myKey": 2}');
var errors = JSON.lastParseErrors.length && JSON.lastParseErrors;
if (errors) { /* handle the errors */ }

The Error message could in this case be: -> The "myKey" key is duplicated in this object: "$"

where "$" represent the root object as defined in JSONPath (goessner.net/articles/JsonPath) (if there was a "$" property in the root object, its path would be "$.$") A JSONPath would in my opinion be much more usable than "line" + "column" or even more "index", as JSON is not initially meant to be multiline

ECMAScript might then define a native "JSONError" or "ParseError" whose instances would have the matching name property value.

  1. Instead of a "lastParseErrors", a "lastReport" property could available

The advantage would be that it could also handle warnings from stringify() to alert the developer when some values were automatically dropped (like for undefined or function values) or replaced by null (like for Infinity or NaN values). The NativeError type might then be "StringifyError" or "CastError". An alternative would be to consider those warnings as errors and expose either a generic "lastErrors" or an additional specific "lastStringifyErrors".

  3. Another approach could be to let the parse() and stringify() methods accept an additional errorCallback parameter (a minimal sketch follows at the end of this message)

It could be nice, but it would force the developer to set values for the other optional parameters first even when they aren't needed.

I would have expected such Errors to be thrown as exceptions in strict mode but it's too late...
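
To make the third option concrete, here is a sketch of what such a signature could look like as a user-land wrapper today; parseWithErrors and its callback are hypothetical, and note that the wrapper cannot actually observe duplicate keys because JSON.parse keeps the last pair without reporting anything:

    // Hypothetical wrapper illustrating the proposed errorCallback parameter.
    function parseWithErrors(text, reviver, errorCallback) {
      try {
        return JSON.parse(text, reviver);
      } catch (e) {
        // Only real syntax errors reach this point; duplicate keys do not.
        if (errorCallback) errorCallback(e);
        return undefined;
      }
    }

    var obj = parseWithErrors('{"myKey": "Value 1", "myKey": 2}', null, function (err) {
      // Never invoked for the duplicate "myKey"; JSON.parse accepts it silently.
      console.log(err.message);
    });
    // obj is { myKey: 2 }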

Alexandre Morgaut Wakanda Community Manager

4D SAS 60, rue d'Alsace 92110 Clichy France

Standard : +33 1 40 87 92 00 Email : Alexandre.Morgaut at 4d.com Web : www.4D.com

# Alexandre Morgaut (12 years ago)

On 7 June 2013, at 10:58, Alexandre Morgaut wrote:

  3. Another approach could be to let the parse() and stringify() methods accept an additional errorCallback parameter

It could be nice but it would force the developer to set values for the other optional parameters first even if he doesn't need them

Well, of course, the errorCallback might be set via something like JSON.onerror, but a specific callback may easily overwrite a previously defined global one, and integrating the whole W3C EventTarget interface into ECMAScript might not be easily acceptable.

Alexandre Morgaut Wakanda Community Manager

4D SAS 60, rue d'Alsace 92110 Clichy France

Standard : +33 1 40 87 92 00 Email : Alexandre.Morgaut at 4d.com Web : www.4D.com

# Brendan Eich (12 years ago)

Allen Wirfs-Brock wrote:

The ES5 spec for JSON.parse requires (i.e., MUST) that duplicate keys be accepted and that the value associated with the last duplicated key wins. A valid implementation of JSON.parse MUST NOT reject such JSON strings.

IETF has SHOULD as well as MUST, though. Normative specs can say what must happen, but also what should happen in a clean-slate or ideal-world setting. The Internet evolved with Postel's Law falling out of the process.

Previously you wrote:

I would be willing to lose the first sentence (containing the SHOULD) entirely as it doesn't add any real normative value.

But normative RFCs/internet-drafts that use SHOULD rather than MUST still have value.

From tools.ietf.org/html/rfc2119

1. MUST

   This word, or the terms "REQUIRED" or "SHALL", mean that the definition is an absolute requirement of the specification.

2. MUST NOT

   This phrase, or the phrase "SHALL NOT", mean that the definition is an absolute prohibition of the specification.

3. SHOULD

   This word, or the adjective "RECOMMENDED", mean that there may exist valid reasons in particular circumstances to ignore a particular item, but the full implications must be understood and carefully weighed before choosing a different course.

4. SHOULD NOT

   This phrase, or the phrase "NOT RECOMMENDED", mean that there may exist valid reasons in particular circumstances when the particular behavior is acceptable or even useful, but the full implications should be understood and the case carefully weighed before implementing any behavior described with this label.

5. MAY

   This word, or the adjective "OPTIONAL", mean that an item is truly optional. One vendor may choose to include the item because a particular marketplace requires it or because the vendor feels that it enhances the product, while another vendor may omit the same item. An implementation which does not include a particular option MUST be prepared to interoperate with another implementation which does include the option, though perhaps with reduced functionality. In the same vein, an implementation which does include a particular option MUST be prepared to interoperate with another implementation which does not include the option (except, of course, for the feature the option provides).

6. Guidance in the use of these Imperatives

   Imperatives of the type defined in this memo must be used with care and sparingly. In particular, they MUST only be used where it is actually required for interoperation or to limit behavior which has potential for causing harm (e.g., limiting retransmissions). For example, they must not be used to try to impose a particular method on implementors where the method is not required for interoperability.

# Brendan Eich (12 years ago)

Mark S. Miller wrote:

Everyone, please keep in mind that JSON does not and never has claimed to be the best or last data format. No doubt many better ones have been and will be created, some as variants of JSON, some not. If you'd like to create a "fixed JSON" or "enhanced JSON" or whatever, feel free -- whether it treats comments, duplicated keys, or \u2028 differently or whatever. But please don't confuse any of these variants with JSON itself. As a data format, JSON's greatest value is its stability, and the inability for anyone, including us, to version it.

+∞

"Get off my lawn!" comment (I will tag in and tag Doug out of the grumpy old men smackdown ring): you kids stop fiddling with JSON. It needs "fixing" like it needs a hole in the head.

# Brian Kardell (12 years ago)

(Snipping out everything as this is a holistic response to the whole thread)

....

Re: MUST / last value: Agree, codify the de facto standard. That is what we do almost universally.

  • it doesn't even actually "break" anything that already works - it just means those parsers aren't strictly conforming - they already aren't in keeping with the norm.

.....

Re: Should allow comments / single quotes: Perhaps if we went back in time and were discussing creating a new format, you might even convince Doug of some of this. What we wouldn't know is how that would affect adoption/development of compatible parsers, bugs, etc., and that is important given the next comment...

.....

Re: The Web can't evolve if you can't...AND create something else, but don't call it JSON.. AND inventing crappy syntax

JSON is maybe the single best proof that the Web can evolve without breaking anything... It started out competing with XML which had every advantage imaginable: it was a REC standard, had support from every big company, hundreds of libraries, had built in/native support in just about every language and was "technically superior". JSON had simplicity and dev contributions. It competed in the real world and won hearts and minds against all odds. Now we have a standard of agreement - you can't break it. The Web is an enterprise of unparalleled scale, JSON is a data interchange format, so even more so than something that exists only in browsers...Anything that breaks potentially affects people's (not developers) lives in real ways and has to be done with great care. You can, however, follow the same model and beat it. In this case, if you have minor changes, it is even easy to make something that can transform to existing JSON format... If you do, we can easily make it native/standard - and you can expect that eventually someone will have similar criticisms and try to compete - that's a good thing.

#extendthewebforward :)

# Kevin Smith (12 years ago)

+∞

"Get off my lawn!" comment (I will tag in and tag Doug out of the grumpy old men smackdown ring): you kids stop fiddling with JSON. It needs "fixing" like it needs a hole in the head.

Comment syntax sure would be nice though : P

# Jason Orendorff (12 years ago)

On Thu, Jun 6, 2013 at 10:21 AM, gaz Heyes <gazheyes at gmail.com> wrote:

That is totally stupid. IMO the JSON spec should support comments like // and /**/ and also support single quotes for string values. It's JavaScript for christ sake. It makes no sense to invent and support this crappy syntax.

Please use respectful language on this list.

# Jason Orendorff (12 years ago)

On Thu, Jun 6, 2013 at 10:54 AM, gaz Heyes <gazheyes at gmail.com> wrote:

Its syntax is based on JavaScript but crippled for no reason other than to follow an outdated specification. It's so frustrating to be at this end of a keyboard and see awful syntax being promoted just to follow the RFC, which is clearly flawed. It's so much more logical to change and fix the problems in the specification so that in the future code will work correctly and JSON parsers will be much easier to write and more consistent.

Is there a canonical explanation of Don't Break the Web out there? It would be nice to have something to point each new person to when they express this frustration.

# gaz Heyes (12 years ago)

On 7 June 2013 15:19, Jason Orendorff <jason.orendorff at gmail.com> wrote:

On Thu, Jun 6, 2013 at 10:21 AM, gaz Heyes <gazheyes at gmail.com> wrote:

That is totally stupid. IMO the JSON spec should support comments like // and /**/ and also support single quotes for string values. It's JavaScript for christ sake. It makes no sense to invent and support this crappy syntax.

Please use respectful language on this list.

Good sir that is totally stupid. IMO the JSON spec should support comments like // and /**/ and also support single quotes for string values. It's JavaScript for heavens sake. It makes no sense to invent and support this rather than ideal syntax. Thank you sir.

# David Bruant (12 years ago)

On 07/06/2013 07:31, Jason Orendorff wrote:

On Thu, Jun 6, 2013 at 10:54 AM, gaz Heyes <gazheyes at gmail.com> wrote:

Its syntax is based on JavaScript but crippled for no reason other than to follow an outdated specification. It's so frustrating to be at this end of a keyboard and see awful syntax being promoted just to follow the RFC, which is clearly flawed. It's so much more logical to change and fix the problems in the specification so that in the future code will work correctly and JSON parsers will be much easier to write and more consistent.

Is there a canonical explanation of Don't Break the Web out there? It would be nice to have something to point each new person to when they express this frustration.

I usually point people to DavidBruant/ECMAScript-regrets#foreword. I'll happily accept feedback (or pull requests if anyone feels like it) to improve this paragraph.

I tried to explain at length (and in French) in the web magazine "Le train de 13h37" the relationship between webdevs, standards and web browsers and how technologies evolve. The article is CC BY-NC-SA licensed (it's not clearly identified on the page, but it is in the downloadable PDF version). I'll be happy to help if someone feels like translating it to English and maybe adding a specific part about "don't break the web".

# Mathias Bynens (12 years ago)

On 7 Jun 2013, at 18:43, David Bruant <bruant.d at gmail.com> wrote:

On 07/06/2013 07:31, Jason Orendorff wrote:

Is there a canonical explanation of Don't Break the Web out there? It would be nice to have something to point each new person to when they express this frustration.

I usually point people to DavidBruant/ECMAScript-regrets#foreword. I'll happily accept feedback (or pull requests if anyone feels like it) to improve this paragraph.

There’s also the HTML design principles, which really apply to spec development for the Web in general: www.w3.org/TR/html-design-principles The “don’t break the web” section is titled “Support existing content”: www.w3.org/TR/html-design-principles/#support

# David Bruant (12 years ago)

On 07/06/2013 06:41, Kevin Smith wrote:

+∞

"Get off my lawn!" comment (I will tag in and tag Doug out of the
grumpy old men smackdown ring): you kids stop fiddling with JSON.
It needs "fixing" like it needs a hole in the head.

Comment syntax sure would be nice though : P

As others suggested, create a different format that looks like JSON and has comments, and just add yet another build step to your build process. Very much like what happens with Sass (comments are removed when compilation to CSS occurs).

# Paul Hoffman (12 years ago)

Just a note to emphasize what was said a few days ago: the revision of the JSON RFC is being discussed in the IETF right now. This topic is certainly one of the many that the JSON WG is discussing. If you want to participate in the conversation in a way that will affect the new RFC, you should probably be doing so in the JSON WG. Info at www.ietf.org/mailman/listinfo/json

# Yehuda Katz (12 years ago)

At two TC39 meetings, members of TC39 expressed deep concern about the prospect of incompatible changes to JSON by the IETF. It seems as though the IETF would like to consider one or more incompatible changes to JSON as part of this standardization process. There is extremely little support on TC39 for such changes.

We could keep an eye on the IETF list for the introduction of incompatible changes and keep popping up to express this sentiment, but I believe that it would be better if any proposed incompatible changes were raised here before there was serious consideration in the IETF.

I do not believe that "you should have been paying attention" will be sufficient to gain consensus on TC39 for incompatible changes.

Yehuda Katz (ph) 718.877.1325

# Paul Hoffman (12 years ago)

At two TC39 meetings, members of TC39 expressed deep concern about the prospect of incompatible changes to JSON by the IETF.

Those concerns have not been expressed directly to the IETF's JSON Working Group. I say this as one of the two co-chairs of the JSON WG. If TC39 wants to express "deep concern", they certainly know where to do so. Them doing so sooner rather than later would be helpful all around.

I would note that some of the possibly-incompatible changes to RFC 4627 that are being discussed relate to places where the RFC is self-contradictory or blatantly unclear. In such cases, leaving the RFC alone might just as easily lead to incompatible implementations as clarifications would. That is going to have to be determined by the IETF's consensus process.

No one can force anyone here to follow the official WG discussion for the successor to RFC 4627, of course. However, given that some people on this list are JSON experts, if you don't want to participate in the evolution of the RFC, I would be interested in hearing (possibly off-list) why that is. Part of the job of the chairs is to make sure experts feel welcome in the IETF process.

# Rick Waldron (12 years ago)

On Sun, Jun 9, 2013 at 7:19 PM, Paul Hoffman <paul.hoffman at gmail.com> wrote:

At two TC39 meetings, members of TC39 expressed deep concern about the prospect of incompatible changes to JSON by the IETF.

Those concerns have not been expressed directly to the IETF's JSON Working Group. I say this as one of the two co-chairs of the JSON WG. If TC39 wants to express "deep concern", they certainly know where to do so. Them doing so sooner rather than later would be helpful all around.

I would note that some of the possibly-incompatible changes to RFC 4627 that are being discussed relate to places where the RFC is self-contradictory or blatantly unclear. In such cases, leaving the RFC alone might just as easily lead to incompatible implementations as clarifications would. That is going to have to be determined by the IETF's consensus process.

No one can force anyone here to follow the official WG discussion for the successor to RFC 4627, of course. However, given that some people on this list are JSON experts, if you don't want to participate in the evolution of the RFC, I would be interested in hearing (possibly off-list) why that is. Part of the job of the chairs is to make sure experts feel welcome in the IETF process.

Paul,

Here are the notes from the first TC39 meeting at which this topic was discussed:

rwldrn/tc39-notes/blob/master/es6/2013-03/mar-12.md#49

# Paul Hoffman (12 years ago)

Thanks, but that doesn't match what Yehuda said. If anything, it shows that there is as widespread disagreement within TC39 as to what is a "breaking" change with respect to duplicate names in objects as there so far is in the JSON WG.

If there are other public minutes that relate to the TC39-IETF relationship, I'd certainly appreciate them. The ECMA-IETF discussions all happened before Matt Miller and I were made co-chairs of the WG.

# Brendan Eich (12 years ago)

Paul Hoffman wrote:

At two TC39 meetings, members of TC39 expressed deep concern about the prospect of incompatible changes to JSON by the IETF.

Those concerns have not been expressed directly to the IETF's JSON Working Group.

That's surprising -- am I the only one on TC39 who thought Doug Crockford had spoken to you against making incompatible changes?

# Brendan Eich (12 years ago)

Paul Hoffman wrote:

Thanks, but that doesn't match what Yehuda said. If anything, it shows that there is as widespread disagreement within TC39 as to what is a "breaking" change with respect to duplicate names in objects as there so far is in the JSON WG.

The minutes Rick cited show no disagreement other than Doug (I missed this part of the March meeting). The "FTR: Majority opposition, no consensus" note, I believe, means that everyone but Doug agreed that an incompatible change was a bad idea, to be opposed.

Is anything else unclear? Am I missing some other notes section? I do not see "widespread disagreement within TC39 as to what is a "breaking" change with respect to duplicate names in objects" in any of the "JSON, IETF changes" meeting notes text (cited below).

/be

4.9 JSON, IETF changes (rwldrn/tc39-notes/blob/master/es6/2013-03/mar-12.md#49-json-ietf-changes)

(Presented by DC Crockford)

Currently, JSON is an informational RFC; the IETF version will be an Internet Standard, and there is a minor correction that affects ECMAScript.

The use of "should" in 15.12.2

AR: What is the motivation of the change?

DC: The change involves the mistake of using "should" with respect to the multiple same-named keys error. Multiple same-named keys are invalid and /must/ throw an error (vs. "should" throw an error)

LH: This is a breaking change

DH: The worst being the use case of multiple, same-named keys as comments

DC: That's stupid

YK: That's based on your recommendation to use a keyed entry as a comment, so people naturally used the same key, knowing they'd be ignored.

DC: I would certainly never recommend that practice

YK: It was a side-effect

AR: Which key is used now?

AWB: The last one wins.

AR: Is that the root of the security vector?

DC: Not in ES, but in other encodings

AR: Order matters, unescaped content that follows...

DC: The current spec says "[they should not]", but will say "[they must now]"

YK: Let's define an ordering and make it cryptographically secure.

DC: (recapping to Mark Miller, who just arrived)

MM: You can't do that. (laughs)

MM: You can't change "should" to "must"

YK: Agreed, you cannot change JSON, there are too many JSON documents in existence.

MM: Agreed.

AR: It's possible to ignore this change?

DC: Yes

DH: Then why are we creating a dead letter?

MM: ES has a grammatical specification for validating and parsing JSON. Anything that is not conformant JSON would not parse. This change loses that property.

DC: Or we don't change the spec

MM: The way that you properly reject our favorite fixes, I think you should apply to your favorite fixes

DC: I'll consider that

AR: There is considerable opposition to this change

DC: Two choices...

  1. Make it an error
  2. Continue to take the last one

DC: Decoders have license to do what they want with non-conformant material. Encoders /must/ be conformant to new changes.

MM: Our current encoder conforms...

AWB: I don't think it does... reviver/replacer

MM: No, can only apply objects instead of the original objects.

AR: Did not realize the production/consumption distinction of this change.

WH: Supports this change. ECMAScript is already conformant because it never generates duplicate keys.

MM: With this change ECMAScript would have two unappealing choices: A. No longer be a validating parser (i.e. a parser that doesn't allow any optional syntax or extensions, even though extensions are permitted by the JSON spec). B. Do a breaking change by throwing errors when seeing duplicates when parsing.

Conclusion/Resolution
  • Revisit this, after DC has made a final decision.
  • FTR: Majority opposition, no consensus.
# Sam Tobin-Hochstadt (12 years ago)

On Sun, Jun 9, 2013 at 10:25 PM, Brendan Eich <brendan at mozilla.com> wrote:

Paul Hoffman wrote:

Thanks, but that doesn't match what Yehuda said. If anything, it shows that there is as widespread disagreement within TC39 as to what is a "breaking" change with respect to duplicate names in objects as there so far is in the JSON WG.

The minutes Rick cited show no disagreement other than Doug (I missed this part of the March meeting). The "FTR: Majority opposition, no consensus" note, I believe, means that everyone but Doug agreed that an incompatible change was a bad idea, to be opposed.

Is anything else unclear? Am I missing some other notes section? I do not see "widespread disagreement within TC39 as to what is a "breaking" change with respect to duplicate names in objects" in any of the "JSON, IETF changes" meeting notes text (cited below).

A quick skim of the IETF list also indicates that people are discussing how characters in strings are treated, and I feel confident in predicting that any change to the 16-bit number interpretation that ECMAScript uses would be thought of as a breaking change by TC39.

# Paul Hoffman (12 years ago)

The term "incompatible change" is being used too loosely here. RFC 4627 and the ES spec are currently different. If we simply republished RFC 4627, it would be an incompatible change from ES. If we publish a new RFC that says exactly what ES says, it will be an incompatible change from the earlier RFC.

Developers have read both documents and dealt with the incompatibilities however they felt appropriate.

# Brendan Eich (12 years ago)

Paul Hoffman wrote:

The term "incompatible change" is being used too loosely here. RFC 4627 and the ES spec are currently different. If we simply republished RFC 4627, it would be an incompatible change from ES. If we publish a new RFC that says exactly what ES says, it will be an incompatible change from the earlier RFC.

Right, that's a good point -- something has to give.

In case it wasn't obvious, most TC39 folks generally believe the "something" is the de jure spec when de facto practice (built on another, later spec -- or built on no spec at all) is in conflict. Not all TC39ers, and not all the time -- if something is so broken by design as to be unused in practice outside of tests, we may try to break backward compatibility. I observe that such cases are few.

Developers have read both documents and dealt with the incompatibilities however they felt appropriate.

Developers also are constrained in the market to interoperate, most of the time (we hope; I lived through the IE nuclear winter of near-95% market share and fought back with Firefox; we're living through the "mobile Web == iOS WebKit" era, with an end in sight finally).

So what actually works, the "intersection semantics" among popular implementations, is what we ought to spec. That's something most TC39ers agree on, most of the time, too.

This says to me that changing the RFC is the best course. People reading will, I'm sure, let me know if I'm missing anything!