
Generic types can't have some types inferred if they are derived from another generic which doesn't necessarily use them. #44525

Closed
@SephReed

Description


Bug Report

πŸ”Ž Search Terms

Generics, infer, derived, unused, doesn't work

⏯ Playground Link

Playground link with relevant code

πŸ’» Code

type A<T, Y, Z> = () => true;  // this does not work
// type A<T, Y, Z> = () => T;  // this works

type InferT<I> = I extends A<infer T, any, any> ? T : never;

// typeof directInfer is "true"
let directInfer: InferT<A<true, null, null>>; // this works either way


type B<T> = A<T, null, null>;
const test: B<true> = () => true;

// typeof indirectInfer is "unknown"
// it should be "true", and it is if you uncomment the alternative definition of A at the top
let indirectInfer: InferT<typeof test>;

πŸ™ Actual behavior

  • If you declare a generic type A that does not use all of its type parameters, then annotate a value with an instantiation of A, you can still use infer to pull out every type argument from it.

  • If you then declare a generic type B that is A with some of its arguments pre-filled, and run the same inference over a value typed with B, inference no longer works for any parameter that A's definition never uses (see the sketch after this list).

  • If you change A so that it uses all of its type parameters, inferring through B works again.
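A minimal sketch of why the second bullet happens (reusing the playground's A; the helper type names are illustrative, not from the original report): because A ignores all three of its type parameters, every instantiation expands to the same structural type, so once the alias is expanded there is nothing left for infer T to match against.

// Both instantiations expand to plain "() => true".
type Expanded1 = A<true, null, null>;
type Expanded2 = A<string, number, Date>;

// Resolves to true: the two instantiations are mutually assignable,
// i.e. structurally indistinguishable.
type Identical = Expanded1 extends Expanded2
  ? (Expanded2 extends Expanded1 ? true : false)
  : false;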

πŸ™‚ Expected behavior

It shouldn't matter whether the base generic A actually uses a type parameter in its definition: if the parameter can be inferred one way, it should be inferable the other way too. A possible workaround is sketched below.
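One workaround, sketched under the assumption that A's definition can be changed (the A2/B2/__params names are illustrative, not from the original report): record the otherwise-unused parameters in an optional phantom property so they survive structurally and stay inferable even through the derived alias.

// Phantom property keeps T, Y, Z visible in the expanded type.
type A2<T, Y, Z> = (() => true) & { __params?: [T, Y, Z] };

type InferT2<I> = I extends A2<infer T, any, any> ? T : never;

type B2<T> = A2<T, null, null>;
// The phantom property is optional, so a plain function still satisfies B2<true>.
const test2: B2<true> = () => true;

// Resolves to "true": T is recorded in __params, so it survives the expansion of B2.
let indirectInfer2: InferT2<typeof test2>;

Keeping the property optional means call sites don't change; only the type-level plumbing does.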
