njs: incorrect order of keys in Object.getOwnPropertyNames()

[[OwnPropertyKeys]]

njs:

>> Object.getOwnPropertyNames([1,2,3].reduce((a,x,i,s) => { a[s.length - i] = x; a['s' + (s.length - i)] = x; return a; }, {}))
[
 '3',
 's3',
 '2',
 's2',
 '1',
 's1'
]

node:

> Object.getOwnPropertyNames([1,2,3].reduce((a,x,i,s) => { a[s.length - i] = x; a['s' + (s.length - i)] = x; return a; }, {}))
[ '1', '2', '3', 's3', 's2', 's1' ]

Also, while Object.keys doesn’t guarantee the order shown, that order is widely relied on in test262 (https://github.com/tc39/test262/issues/2226). For example: /language/computed-property-names/object/method/number.js
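
For reference, this is the ordering ES2015+ prescribes for [[OwnPropertyKeys]]: integer-like keys in ascending numeric order, then the remaining string keys in creation order, then symbols. A small illustration (the object below is made up for this note, run under Node.js):

// integer-like keys come first, ascending; other string keys keep creation order
const obj = {};
obj.b = 1;
obj[2] = 2;
obj.a = 3;
obj[1] = 4;
obj[Symbol('s')] = 5;

console.log(Object.getOwnPropertyNames(obj)); // [ '1', '2', 'b', 'a' ]
console.log(Reflect.ownKeys(obj));            // [ '1', '2', 'b', 'a', Symbol(s) ]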

About this issue

  • State: closed
  • Created 5 years ago
  • Comments: 28 (16 by maintainers)

Most upvoted comments

This is totally unexpected, and it may be a serious problem for njs adoption in the wider community.

To me it’s totally unexpected that anyone would rely on the order of hash map keys. It’s just wrong.

> var y = { a: 'a', b: 'b', c: 'c', d: 'd', e: 'e', 1: 'f', g: 'g', h: 'h', 2: 'i', j: 'j', k: 'k' }
undefined
> Object.keys(y)
[
  '1', '2', 'a', 'b',
  'c', 'd', 'e', 'g',
  'h', 'j', 'k'
]

Actually, I just don’t want to pay the computational price that a proper ES7+ implementation requires. Consider JSON.parse’ing a 1M+ string: how many CPU cycles and how much memory will it waste… So, to me the current implementation is perfect, even though it is incompatible with the latest standard.

That doesn’t make any sense - especially since ECMAScript is a living standard and the only spec that matters is https://tc39.es/ecma262/ - but also because most people aren’t writing code to target a decade-old standard that nobody sticks to anymore.

@jirutka

I plan to rewrite the underlying structure for Object instances. With the new structure the order will be preserved.
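
A minimal sketch, in JavaScript rather than njs’s C internals, of how an insertion-order-preserving table can still produce the spec order: keep slots in insertion order and sort only the integer-like keys at enumeration time. The class and method names here are made up for illustration, not njs’s actual design.

// PropertyTable is a hypothetical illustration, not njs's actual data structure.
class PropertyTable {
  constructor() {
    this.slots = new Map(); // a Map preserves insertion order
  }
  set(key, value) {
    this.slots.set(String(key), value);
  }
  ownKeys() {
    const keys = [...this.slots.keys()];
    // a key counts as an array index if it round-trips through ToUint32
    const isIndex = (k) => String(Number(k) >>> 0) === k && (Number(k) >>> 0) !== 0xffffffff;
    const indices = keys.filter(isIndex).sort((a, b) => a - b); // ascending numeric order
    const strings = keys.filter((k) => !isIndex(k));            // insertion order preserved
    return [...indices, ...strings];
  }
}

const t = new PropertyTable();
['3', 's3', '2', 's2', '1', 's1'].forEach((k, i) => t.set(k, i));
console.log(t.ownKeys()); // [ '1', '2', '3', 's3', 's2', 's1' ], matching the node output above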

Objects in JS are simply not hashes or hash maps, even if you can implement them that way.

Yes, it is correct to assume that if the keys are numeric. If they are not, then insertion order applies, and the correct assumption would be that they match the order of appearance in the original JSON string. More importantly, if you then did let z = await x.json(), y and z must have the same key ordering - i.e., it must be deterministic.
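
A sketch of that determinism point, using JSON.parse on a made-up string instead of a fetch Response (the x.json() call above), run under Node.js:

// The same input string always yields the same key order: integer-like keys
// sorted ascending, then the remaining keys in source (insertion) order.
const text = '{"b": 1, "a": 2, "10": 3, "2": 4}';
const y = JSON.parse(text);
const z = JSON.parse(text);

console.log(Object.keys(y)); // [ '2', '10', 'b', 'a' ]
console.log(Object.keys(z)); // identical array: the ordering is deterministic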

@ljharb

Note that Map and Set in JS are ordered as well.

Yeah, but without the integer footgun 😃
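
A short illustration of that “integer footgun” (made-up keys, run under Node.js): a Map keeps pure insertion order even for integer keys, while a plain object hoists and sorts them.

const m = new Map([[3, 'a'], [1, 'b'], ['x', 'c']]);
console.log([...m.keys()]);  // [ 3, 1, 'x' ], exactly the insertion order

const o = { 3: 'a', 1: 'b', x: 'c' };
console.log(Object.keys(o)); // [ '1', '3', 'x' ], integer-like keys sorted to the front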

Why is anyone JSON-parsing a 1MB string, if you’re talking about what “feels” wrong? This project is named “njs” - not “n whatever drsm wishes js was”.

This project is a lightweight, server-side implementation of an ECMAScript subset. Some of the most common usage patterns are:

  • converting a JSON request body to an object.
  • making requests to external subsystems using JSON as the transport encoding.

And in JSON, “An object is an unordered set of name/value pairs.”

So, a “proper” ordering of object keys looks useless to me.

Yes, but your expectations are irrelevant. JavaScript dictates that one can, thus, one can.

Why is anyone JSON-parsing a 1MB string, if you’re talking about what “feels” wrong?

This project is named “njs” - not “n whatever drsm wishes js was”.

njs implements ECMAScript, not the Java language.

https://262.ecma-international.org/5.1/#sec-15.2.3.14 https://262.ecma-international.org/5.1/#sec-12.6.4

The mechanics and order of enumerating the properties (step 6.a in the first algorithm, step 7.a in the second) is not specified.

And, in my opinion, we should stay at the ES5.1 level there.

So, Web Reality forces the browsers to waste computational resources instead of forcing developers to fix their code. It’s sad, but the reason is to make end users happy, OK.

But why should it be extended to other ECMAScript implementations? I see no reason.