TypeScript: Type inference for rest parameters does not work

TypeScript Version: 4.1.0-dev.20201012

Search Terms: type inference, rest parameters

Consider the following function declaration:

declare function f<T>(...list: T[]): void;

Expected behavior: The function f() can receive an arbitrary number of arguments of any type. For the given list of arguments, the type T should be inferred as the “least common denominator” of the argument types.

Actual behavior: TS fails to infer that common type by itself, but happily accepts the same call when the common type is provided manually:

f("a", "b", 0); // TS thinks this call is illegal
f<string|number>("a", "b", 0); // yet it is happy to accept the same call with manual typing

The other case that works is when the arguments contain the any type:

let a:any;
f("a", "b", 0, a);

Code

declare function f<T>(...list: T[]): void;

const ar = ["a", "b", 0];

f(...ar);
f<string|number>("a", "b", 0);

f("a", "b", 0);

let a:any;
f("a", "b", 0, a);
Output
"use strict";
const ar = ["a", "b", 0];
f(...ar);
f("a", "b", 0);
f("a", "b", 0);
let a;
f("a", "b", 0, a);

Compiler Options
{
  "compilerOptions": {
    "noImplicitAny": true,
    "strictNullChecks": true,
    "strictFunctionTypes": true,
    "strictPropertyInitialization": true,
    "strictBindCallApply": true,
    "noImplicitThis": true,
    "noImplicitReturns": true,
    "alwaysStrict": true,
    "esModuleInterop": true,
    "declaration": true,
    "experimentalDecorators": true,
    "emitDecoratorMetadata": true,
    "moduleResolution": 2,
    "target": "ES2017",
    "jsx": "React",
    "module": "ESNext"
  }
}

Playground Link: Provided

About this issue

  • Original URL
  • State: closed
  • Created 4 years ago
  • Reactions: 1
  • Comments: 19 (7 by maintainers)

Most upvoted comments

Thank you for the explanation and your time, and with all due respect to the six years of TypeScript history, but…

The system you outlined above is self-inconsistent and contradicts the JavaScript semantics of rest parameters (a five-year-old feature). Consider the second example I gave, with three classes. In that example the types of the parameters are compatible, and I would even be happy if the compiler deduced the parameter type T to be A|null. But it returns C|null. How is that possible?

Now, examples with two parameters are not very helpful, because they illustrate quite a different pattern, where two entities are at play: the compiler obtains the value for T from the first argument, tries to apply it to the second argument, and that fails. Clear and simple. My examples contain a single entity (an array) only. And from the runtime behaviour (i.e. the JavaScript run-time rules), this should work in the very same way as it works for arrays. You don’t expect us, users, to write

const array:(string|number)[] = ["a", 1];

do you?

Why then when the very same array is created implicitly TypeScript says it’s impossible?

@ezsh your classes are empty and that’s the only reason you see that behavior. Add members to B / C and you will see the behavior you expect. This is addressed in the FAQ.
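A sketch of the empty-class point above (the class names and members here are illustrative reconstructions, not the exact code from the thread): empty classes are structurally identical in TypeScript, so the compiler is free to pick any of them as the inference result, and adding distinct members restores the expected base-class inference.

```typescript
class A { }
class B extends A { }
class C extends A { }

// While B and C are empty, they are structurally identical, so they are
// mutually assignable -- which is why an inference result like C | null
// is possible where one might expect A | null.
const whileEmpty: B = new C();

// Once the subclasses gain distinct members, they stop being interchangeable,
// and the best common type of a mixed array is the shared base class A.
class B2 extends A { b = 1; }
class C2 extends A { c = 2; }
const mixed: (A | null)[] = [new B2(), new C2(), null];
```

This is the same structural-typing behavior the FAQ entry describes: type identity in TypeScript is decided by shape, not by the class declaration itself.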

@Arian94 personal snipes aren’t appreciated here

TS has many issues and their developers are so full of themselves in a silly way. So don’t be too positive about it.

Generic inference proceeds along the following algorithm:

  • Collect all the candidates
  • Choose a type from amongst those candidates

This is the intended behavior because calls like this

declare function find<T>(haystack: T[], needle: T): boolean;
find([1, 2, 3], "four");

should fail; inferring number | string as the type argument here is actively working against the intent of the author. We know this is the intended behavior because TS used to do the other thing, and we got nothing but complaints about it, so we changed it.

Now if you write

declare function find<T>(haystack: T[], needle: T): boolean;
const sn: Array<string | number> = [1, 2, 3];
find(sn, "four");

This is a valid call, because the candidate string | number is a valid supertype of all the candidates. This is the intended behavior because these types share a common root.

When you write something like this:

declare function f<T>(...list: T[]): void;

const ar = ["a", "b", 0];

f(...ar);

There’s only one candidate presented here: string | number. There’s no basis by which you could reject this; you could as easily have written

declare function f<T>(...list: T[]): void;

const ar = ["a" as string | number];

f(...ar);

which clearly must pass by your proposed logic of “they should behave the same”, but the type of ar is the same in both cases.

So anyway, this is the intended behavior; you haven’t found a six-year-old bug in TypeScript. You’re free to disagree with that design but it’s not a bug.

I can’t understand why you keep referring to the find example when it works and fails in a different way compared to the case I report.

This is begging the question. Your assertion has been that a function f<T>(...list: T[]) should act as a single inference candidate site, so that a call like f(1, "foo") would infer string | number, because it’s “the same” as the array case. My assertion is that it is “the same” as function f<T>(arg0?: T, arg1?: T, arg2?: T, arg3?: T, ...) and should be processed that way. Whether or not these examples are truly isomorphic is just a question of where we park certain inconsistencies that arise from a) accepting heterogeneous arrays and b) wanting to reject bad find calls.
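The equivalence claimed above can be sketched as follows (the function names restArgs and optionalArgs are illustrative stand-ins, given trivial bodies so the sketch is runnable; the error comments reflect how current TypeScript behaves on mixed arguments):

```typescript
function restArgs<T>(...list: T[]): T[] {
  return list;
}
function optionalArgs<T>(arg0?: T, arg1?: T): (T | undefined)[] {
  return [arg0, arg1];
}

// Both forms collect one inference candidate per argument, so both reject
// mixed arguments the same way:
// restArgs(1, "foo");      // error: '"foo"' is not assignable to 'number'
// optionalArgs(1, "foo");  // same error

// A pre-typed array contributes a single union candidate and is accepted:
const sn: (string | number)[] = [1, "foo"];
const copy = restArgs(...sn);
```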

I still believe you are correcting the error in the wrong place.

We’re just going to have to agree to disagree here. This is basically the same as #27859 so you can add your use case there for tracking purposes, but I don’t think we’re going to completely rethink inference based on this one use case.
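Given that this behavior is by design, the practical ways to make the original call check are to supply the union explicitly or to type the array up front. A minimal sketch, using the issue’s own f (given a trivial body here so the sketch is runnable; the original declares it without one):

```typescript
function f<T>(...list: T[]): T[] {
  return list;
}

// Option 1: supply the type argument explicitly.
const viaTypeArg = f<string | number>("a", "b", 0);

// Option 2: annotate the array with the union element type and spread it,
// so inference sees a single string | number candidate.
const ar: (string | number)[] = ["a", "b", 0];
const viaSpread = f(...ar);
```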

There will be no snipes if you don’t hide yourself.

Thank you, I already know that the current behaviour is intended. I believe it is incorrect because f("a", "b", 0) and f<string|number>("a", "b", 0) are equivalent but the compiler treats them differently.

This issue is about another problem, so it is not a re-creation. BTW, closing issues without explaining is also not the right way to communicate.