nodejs-datastore: decodeIntegerValue throws an error, but it's not caught.

Environment details

  • OS: Linux (Pop!_OS 19.10 “eoan”, Ubuntu-based)
  • Node.js version: v10.15.2, v12.14.0
  • npm version: 6.4.1, 6.13.4; yarn version: 1.19.1
  • @google-cloud/datastore version: ^5.0.2 (5.0.3, 5.0.4)

Steps to reproduce

Add an integer value into the datastore that won’t fit within the bounds of Number.MAX_SAFE_INTEGER.
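
For example, a minimal sketch (the kind and property names are illustrative, chosen to match the query below):

const { Datastore } = require('@google-cloud/datastore');
const datastore = new Datastore();

// 8446744073709552000 is above Number.MAX_SAFE_INTEGER (2 ** 53 - 1), so it is
// already an imprecise float here, yet the save succeeds without complaint.
datastore.save({
  key: datastore.key(['sesionState']),
  data: { event: '', value: 8446744073709552000 },
}).catch(console.error);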

Try to request the data back from the datastore:

const { Datastore } = require('@google-cloud/datastore');

const datastore = new Datastore({...});
// 'event' comes from the surrounding application code.
const q = datastore.createQuery('sesionState').filter('event', '=', event || '');
datastore.runQuery(q);

The application will crash due to an uncaught exception:
Error: We attempted to return all of the numeric values, but value 8446744073709552000 is out of bounds of 'Number.MAX_SAFE_INTEGER'.
To prevent this error, please consider passing 'options.wrapNumbers=true' or
'options.wrapNumbers' as
{
  integerTypeCastFunction: provide <your_custom_function>
  properties: optionally specify property name(s) to be cutom casted}

Also, note the misspelling of ‘custom’ here ----^

    at decodeIntegerValue (/home/jpate/Documents/escaperoom/leaderboard/node_modules/@google-cloud/datastore/build/src/entity.js:388:13)
    at Object.decodeValueProto (/home/jpate/Documents/escaperoom/leaderboard/node_modules/@google-cloud/datastore/build/src/entity.js:462:23)
    at Object.entityFromEntityProto (/home/jpate/Documents/escaperoom/leaderboard/node_modules/@google-cloud/datastore/build/src/entity.js:607:45)
    at Object.decodeValueProto (/home/jpate/Documents/escaperoom/leaderboard/node_modules/@google-cloud/datastore/build/src/entity.js:465:31)
    at value.values.map (/home/jpate/Documents/escaperoom/leaderboard/node_modules/@google-cloud/datastore/build/src/entity.js:446:57)
    at Array.map (<anonymous>)
    at Object.decodeValueProto (/home/jpate/Documents/escaperoom/leaderboard/node_modules/@google-cloud/datastore/build/src/entity.js:446:37)
    at Object.entityFromEntityProto (/home/jpate/Documents/escaperoom/leaderboard/node_modules/@google-cloud/datastore/build/src/entity.js:607:45)
    at Object.decodeValueProto (/home/jpate/Documents/escaperoom/leaderboard/node_modules/@google-cloud/datastore/build/src/entity.js:465:31)
    at Object.entityFromEntityProto (/home/jpate/Documents/escaperoom/leaderboard/node_modules/@google-cloud/datastore/build/src/entity.js:607:45)
    at results.map.result (/home/jpate/Documents/escaperoom/leaderboard/node_modules/@google-cloud/datastore/build/src/entity.js:797:32)
    at Array.map (<anonymous>)
    at Object.formatArray (/home/jpate/Documents/escaperoom/leaderboard/node_modules/@google-cloud/datastore/build/src/entity.js:796:24)
    at Immediate.request_ [as _onImmediate] (/home/jpate/Documents/escaperoom/leaderboard/node_modules/@google-cloud/datastore/build/src/request.js:230:45)
    at runCallback (timers.js:706:11)
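
For completeness, the workaround named in the error message does avoid the crash. A sketch of both forms it suggests (the BigInt cast and the 'value' property name are illustrative assumptions):

// Simplest form: oversized integers come back wrapped instead of throwing.
datastore.runQuery(q, { wrapNumbers: true }).then(([entities]) => {
  // ...
});

// Or supply a custom cast, optionally restricted to named properties.
datastore.runQuery(q, {
  wrapNumbers: {
    integerTypeCastFunction: BigInt, // assumption: cast raw values to BigInt
    properties: ['value'],           // assumption: only this property is cast
  },
});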

It looks like there’s nothing that actually catches errors thrown from entity.formatArray(...).
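
Because the throw happens inside an internal setImmediate callback (the Immediate.request_ frame in the trace above), it reportedly bypasses the promise returned by runQuery. A sketch of why the obvious call-site mitigation doesn’t help:

datastore.runQuery(q)
  .then(([entities]) => {
    // never reached when a row contains an oversized integer
  })
  .catch((err) => {
    // In the affected versions this handler never fires either; the error
    // escapes as an uncaught exception and crashes the process.
  });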

About this issue

  • Original URL
  • State: closed
  • Created 4 years ago
  • Comments: 16 (4 by maintainers)

Most upvoted comments

+1 on @jamie-pate’s last comment, which bit me recently: it’s highly unexpected that an upsert writes without complaint while reading the same data back throws. It seems like either both operations should fail (or both should require an opt-in flag), or neither should.

Also, the originally stored number appears to have been inserted as a JavaScript float, as a member of a plain JavaScript object. In this closed, JavaScript-only system the precision loss had already happened long before the data was passed to the library for storage in the first place, so I’m not sure what the value is of validating application data on deserialization. In this case the value is larger than the data type can store without loss of precision, but that precision was already lost; checking the magnitude alone is not an accurate measure of data loss. The code should really check whether the value is oversized, then convert it to a number and back and compare the result. Only then can you say that the conversion actually lost any precision!
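
A sketch of that round-trip check (assuming the stored int64 arrives as a decimal string, as it does at the protobuf layer; the function name is hypothetical):

// Returns true only if converting the stored 64-bit integer to a JS number
// actually changes its value, rather than flagging every oversized value.
function losesPrecision(int64String) {
  const asNumber = Number(int64String);
  return BigInt(int64String) !== BigInt(asNumber);
}

losesPrecision('9007199254740993');    // true:  2 ** 53 + 1 rounds to 2 ** 53
losesPrecision('8446744073709551616'); // false: oversized, but round-trips exactly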

I’m also not sure why this check exists while decoding but not while encoding the data, or why the library decided this value was an int in the first place when the initial value was stored.
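
On the integer classification: a float above 2 ** 53 still satisfies Number.isInteger, so a value-kind check along those lines (an assumption about the encoder, illustrated with plain JavaScript) would classify it as an integer even though the precision was already gone:

Number.isInteger(8446744073709552000);     // true  — looks like an integer
Number.isSafeInteger(8446744073709552000); // false — but already imprecise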