Proposal: `maxDepth` on objects
Are there other use cases for this? This reason doesn't really seem compelling.
(Not TC39, but I wouldn't expect this to be considered without a way stronger use case.)
What would it report on `obj` with `const a = {}; const obj = { a };`? What about with `const obj = { get a() { return Math.random() > 0.5 ? obj : {}; } };`?
This seems like something you could do yourself as a function.
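For example, a rough userland sketch (it naively recurses into own enumerable properties, and does not guard against getters or cyclic references) could look like:

function maxDepth(value) {
    // primitives and null contribute no depth of their own
    if (value === null || typeof value !== "object") {
        return 0;
    }
    const keys = Object.keys(value);
    if (keys.length === 0) {
        return 0;
    }
    // an object is one level deeper than its deepest own property
    return 1 + Math.max(...keys.map((key) => maxDepth(value[key])));
}

maxDepth({});                                               // 0
maxDepth({keyOne: true});                                   // 1
maxDepth({keyOne: {anotherKey: false}, keyTwo: false});     // 2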
Wouldn't calculating the depth open up objects for abuse? Assuming the max depth is calculated with a BFS, it seems like you end up with a potentially expensive BFS in the standard. Something safer may be 'hasDepth(int n)' or a 'depthLessThan(int n)'.
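For instance, a hypothetical `depthLessThan(value, n)` helper can stop descending as soon as the limit is exhausted, so the cost is bounded by n rather than by the full size of the object (a sketch, not a proposed API):

// hypothetical depthLessThan: true when the value nests strictly fewer
// than n levels deep; it stops descending once the limit is used up
function depthLessThan(value, n) {
    if (value === null || typeof value !== "object") {
        return n > 0;               // primitives count as depth 0
    }
    const keys = Object.keys(value);
    if (keys.length === 0) {
        return n > 0;               // {} counts as depth 0
    }
    // a non-empty object is one level deeper than its deepest property,
    // so every property value must itself be shallower than n - 1
    return n > 1 && keys.every((key) => depthLessThan(value[key], n - 1));
}

depthLessThan({keyOne: {anotherKey: false}}, 3);            // true  (depth 2)
depthLessThan({keyOne: {anotherKey: false}}, 2);            // false (depth 2)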
Calculating this on plain objects could be an O(1) operation:
Empty objects are initialized with maxDepth 0, and it is set to 1 when a primitive property is added. If an object property is added, the maxDepth is set to `1 + maxDepth(newObjectProperty)` instead of it being calculated every time `.maxDepth` is accessed (much like how Array#length works).
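A rough userland sketch of that bookkeeping (a hypothetical `trackDepth` wrapper built on a Proxy, not a real engine feature) might look like the following; note that it only updates the cached depth of the wrapped object itself on assignment, and never lowers it on delete or overwrite:

// hypothetical trackDepth wrapper: keeps a cached depth that is updated
// on each property assignment, so reading .maxDepth is O(1)
function trackDepth(target = {}) {
    let cachedDepth = 0;
    return new Proxy(target, {
        set(obj, key, value, receiver) {
            const isObject = value !== null && typeof value === "object";
            // assumes nested objects are themselves trackDepth wrappers;
            // plain nested objects would be undercounted as depth 1
            const valueDepth = isObject
                ? 1 + (typeof value.maxDepth === "number" ? value.maxDepth : 0)
                : 1;
            cachedDepth = Math.max(cachedDepth, valueDepth);
            return Reflect.set(obj, key, value, receiver);
        },
        get(obj, key, receiver) {
            if (key === "maxDepth") {
                return cachedDepth;
            }
            return Reflect.get(obj, key, receiver);
        }
    });
}

const inner = trackDepth();
inner.leaf = true;                  // inner.maxDepth === 1
const outer = trackDepth();
outer.child = inner;                // outer.maxDepth === 2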
On Mon, Oct 22, 2018 at 2:42 AM Rob Ede <robjtede at icloud.com> wrote:
Calculating this on plain objects could be an O(1) operation:
Empty objects are initialized with maxDepth 0, and it is set to 1 when a primitive property is added. If an object property is added, the maxDepth is set to `1 + maxDepth(newObjectProperty)` instead of it being calculated every time `.maxDepth` is accessed (much like how Array#length works).
That's not enough. Any time you add/change/remove a property that would alter the depth of an object, you also need to update the depth of every parent object containing it.
This isn't "actually" O(1) - it's O(1) on access, but only because it's amortized the cost over every mutation instead. We don't generally consider that trade-off worthwhile, particularly for things like this that would have fairly specialized/limited use-cases in the first place.
(Plus, it still doesn't answer what the depth is of an object with a cyclic reference.)
Ahh yes, the updating aspect is tricky (call it an early-morning oversight), and not one I think has a practical solution if spreading the calculation out over the life of the object is not viable. Maybe my Array comparison was a bit too optimistic about the usefulness of this feature.
You are correct. A parent object should not have to care about its child objects. I’m not aware that JS engines even keep any kind of reverse lookup table that would accommodate this right now.
On reference cycles, would it not make sense to have the depth of an object with cyclic references be Infinity? That would be useful information and easy to compute, given that garbage collectors already detect cycles.
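A userland sketch of that behaviour might track visited objects on the current path with a Set and report Infinity when it meets one again:

// rough sketch: returns Infinity as soon as an object on the current
// traversal path is revisited; shared non-cyclic references stay finite
// because objects are removed from the path on the way back out
function maxDepth(value, path = new Set()) {
    if (value === null || typeof value !== "object") {
        return 0;
    }
    if (path.has(value)) {
        return Infinity;
    }
    path.add(value);
    const keys = Object.keys(value);
    const depth = keys.length === 0
        ? 0
        : 1 + Math.max(...keys.map((key) => maxDepth(value[key], path)));
    path.delete(value);
    return depth;
}

const obj = {};
obj.self = obj;
maxDepth(obj);                      // Infinity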
(To be clear, I’m not a huge fan of this proposal, but it is fun to think about how this could be implemented in userland and whether it could benefit from being a built-in.)
a practical solution to your nested-validation-problem is to use a recursive tree-walker that keeps track of depth. here's a real-world example that limits the depth (to 3) for auto-generating swagger-data from nested-schemas using technique [1].
local.dbFieldRandomCreate = function (options) {
/*
 * this function will create a random dbField from options.schemaP
 */
    var depth, ii, max, min, schemaP, value;
    depth = Number.isFinite(options.depth)
        ? options.depth
        : 3;
    ...
    // 5.4. Validation keywords for objects
    default:
        if (depth <= 0) {
            break;
        }
        // recurse dbRowRandomCreate
        value = local.dbRowRandomCreate({
            depth: depth - 1,
            modeNotRandom: options.modeNotRandom,
            prefix: ['schema<' + JSON.stringify(schemaP) + '>'],
            schema: schemaP
        });
        break;
hi TJ, a practical solution to circular-recursion is to have an Array/Set that records all unique objects the tree-walker has traversed, and to check it before recursing into an object again. here's a real-world example of swagger-validation guarding itself against circular schema-definitions using technique [2].
local.swaggerValidateDataSchema = function (options) {
/*
 * this function will validate options.data against the swagger options.schema
 * http://json-schema.org/draft-04/json-schema-validation.html#rfc.section.5
 */
    var $ref,
        circularList,
        ...
    circularList = [];
    while (true) {
        ...
        // dereference schema.$ref
        $ref = schema && schema.$ref;
        if (!$ref) {
            break;
        }
        test = circularList.indexOf($ref) < 0;
        local.throwSwaggerError(!test && {
            data: data,
            errorType: 'schemaDereferenceCircular',
            prefix: options.prefix,
            schema: schema
        });
        circularList.push($ref);
        ...
[1] maxDepth guard in auto-generating swagger-data from nested-schemas kaizhu256/node-swgg/blob/2018.9.8/lib.swgg.js#L2377
[2] circular-schema guard in swagger-validation kaizhu256/node-swgg/blob/2018.9.8/lib.swgg.js#L3940
kai zhu kaizhu256 at gmail.com
I’d love to see some sort of `maxDepth` property on objects. For `{}`, it would return 0. In the case of `{keyOne: true}`, it would return 1. For `{keyOne: {anotherKey: false}, keyTwo: false}`, it would return 2.

My particular use case is validating a JSON payload sent by a user to prevent abuse. I don’t want to force a particular structure or set of keys, but I do want to make sure the level of nesting doesn’t get silly.
The code to implement this is fairly short, but looks a bit messy. It’d be really nice to have a property instead.
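For reference, the userland check described here can be a short sketch along these lines (hypothetical `validateNesting` helper, with `userInput` standing in for the request body; parsed JSON cannot contain cycles, so no cycle guard is needed):

// reject a parsed JSON payload whose nesting exceeds the chosen limit
function validateNesting(payload, limit, depth = 0) {
    if (depth > limit) {
        throw new RangeError("payload nesting exceeds " + limit + " levels");
    }
    if (payload !== null && typeof payload === "object") {
        // covers both objects and arrays produced by JSON.parse
        Object.values(payload).forEach((value) => {
            validateNesting(value, limit, depth + 1);
        });
    }
}

// userInput: hypothetical request body string
validateNesting(JSON.parse(userInput), 5);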