@mattpocock
Last active September 24, 2020 06:25

Intro

Static analysis of state machines has enormous potential. In this RFC, I'd like to talk about using a CLI tool to generate perfect TypeScript types by analysing the XState machines in your code.

TypeScript support in XState is currently imperfect because of XState's innate complexity. The goals are to:

  1. Get perfect typing of any MachineOptions passed into Machine, interpret, useMachine, etc. This includes typing events, services, actions, guards and activities based on their usage in the machine.
  2. Get autocomplete on the matches function of interpreted machines, so that calls like state.matches('even.deep.nested.states') can be type-checked.
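
To make these two goals concrete, here is a hypothetical machine and component; the action name, event names and state paths are made up, and the strict checking described in the comments is what the codegen would add rather than what XState's current types enforce:

import { Machine } from "xstate";
import { useMachine } from "@xstate/react";

// A hypothetical machine, used only to illustrate the two goals above.
const trafficLightMachine = Machine({
  id: "trafficLight",
  initial: "green",
  states: {
    green: { on: { TIMER: "yellow" } },
    yellow: { on: { TIMER: "red" } },
    red: {
      initial: "walkSign",
      states: { walkSign: {}, dontWalk: {} },
      on: { TIMER: "green" },
    },
  },
});

const TrafficLight = () => {
  const [state] = useMachine(trafficLightMachine, {
    actions: {
      // Goal 1: `event` here would be narrowed to the events that can
      // actually trigger this action, and a misspelled or undeclared action
      // name would become a type error.
      notifyChange: (context, event) => {},
    },
  });

  // Goal 2: this string would be autocompleted and checked against the
  // machine's real state paths.
  return state.matches("red.walkSign") ? "Walk" : "Don't walk";
};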

This CLI could, in the future, be extended to achieve some other stretch goals:

  1. Autocomplete for states within state machines
  2. Analysis of state machine paths so that invalid paths error in the IDE, not at runtime
  3. Compile only the features of XState you use to save on bundle size.

Running a CLI gives us many, many more options heading into the future.

Why post the RFC?

We want to get your thoughts on how best to implement the types the codegen tool generates, and on whether there are alternative approaches we could take, such as VS Code IntelliSense. We are open to any and all ideas.

Codegen approaches

1. Generating sibling files with useMachine exports

Example Output

This is how the xstate-codegen npm package works currently.

  1. Run xstate-codegen "**/*.machine.ts" to watch your files.
  2. You create a file with a .machine.ts extension.
  3. The CLI creates a sibling file, with a .machine.typed.ts extension.
  4. This file exports a custom function for interpreting the machine, depending on the configuration passed to the CLI. For now, it only creates a useMachine hook for React users, but this could work for other contexts too (see the hypothetical sketch below).

Try it out in this npm package: https://www.npmjs.com/package/xstate-codegen
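
As a rough, hypothetical sketch, the generated sibling file might contain something like the following (the context, event and return types are placeholders; the real output is derived from the analysed machine by xstate-codegen):

// trafficLightMachine.machine.typed.ts (hypothetical sketch)
import { useMachine } from "@xstate/react";
import { MachineOptions, State, StateMachine } from "xstate";

interface LightContext {
  elapsed: number;
}

type LightEvent = { type: "TIMER" };

// The generated hook wraps the ordinary useMachine; the value it adds is
// purely in the types, which are derived from the machine definition.
export function useLightMachine(
  machine: StateMachine<LightContext, any, LightEvent>,
  options?: Partial<MachineOptions<LightContext, LightEvent>>
) {
  return useMachine(machine, options as any) as [
    State<LightContext, LightEvent>,
    (event: LightEvent) => void
  ];
}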

Constraints

The user must use the codegen-generated functions and still pass in the machine itself. This is quite a clunky API:

import { useLightMachine } from "./trafficLightMachine.machine.typed";
import { lightMachine } from "./trafficLightMachine.machine";

const [state, send] = useLightMachine(lightMachine, {
  // options
});

Pros

Good typings, without overloads.

Cons

Clunky API compared with other approaches.

Users would need to configure per project which outputs they'd need - useMachine for React projects, a custom interpret function for others. The tool should work with minimal config.

@Andarist pointed out this could work by scanning the package.json, which would remove this con.
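
For illustration, a hypothetical sketch of that detection step, assuming the tool simply inspects the project's dependencies (the exact heuristic is made up):

import { readFileSync } from "fs";

// Decide which wrapper to generate by looking at the project's dependencies.
const pkg = JSON.parse(readFileSync("package.json", "utf8"));
const deps = { ...pkg.dependencies, ...pkg.devDependencies };

const target = deps["@xstate/react"] ? "useMachine" : "interpret";
console.log(`Generating a typed ${target} wrapper`);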

2. Global declaration file which overloads XState's exports

Example Output

  1. Run xstate-codegen "**/*.machine.ts" to watch your files.
  2. The CLI creates a single file, xstate-codegen-env.d.ts at the root of your project.
  3. For each machine in your project, it creates a <MachineName>StateMachine type which can be imported and assigned to your machine:
import { Machine, TrafficLightStateMachine } from "xstate";

const trafficLightMachine: TrafficLightStateMachine = Machine({
  // machine config
});
  4. The global .d.ts overrides interpret, useMachine, etc. so that you can use the machine as you would usually:
import { useMachine } from "@xstate/react";

const [state, send] = useMachine(trafficLightMachine, {
  // options
});
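
As a hypothetical, heavily simplified sketch, the generated TrafficLightStateMachine type could look something like this; the real generated type would encode the machine's concrete states, events, actions, guards and services, and the generated declaration file would expose it from the "xstate" module so it can be imported as shown above:

import { StateMachine } from "xstate";

// Placeholder context and event types; the codegen would derive these
// from the machine definition.
interface TrafficLightContext {
  elapsed: number;
}

type TrafficLightEvent = { type: "TIMER" } | { type: "POWER_OUTAGE" };

// Machine-specific type that the user assigns to their machine definition.
export type TrafficLightStateMachine = StateMachine<
  TrafficLightContext,
  any,
  TrafficLightEvent
>;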

User Constraints

The user must import the generated type and assign it to their machine.

This could also be automated on file save via a codemod.

Pros

Typings are tied to the machine itself, which is much better than tying them to the functions that consume it (useMachine, interpret, etc.). Users get the generated types anywhere they use the machine, instead of only in codegen-generated functions.

This approach asks the least of users: a single import change and you're good to go.

Cons

This approach currently has the weakest typing. This is because the typing of useMachine, interpret, etc. is done via overloads, which renders the typing of machine options next to useless: if the user passes in an incorrect options object, the types fall back to the previous overload and no error is shown. This is not currently type-safe (see the illustration below).
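
A minimal, self-contained illustration of the fallback (the signatures below are not XState's real types; they exist only to show how TypeScript resolves the call):

// A strict "generated" overload and a loose fallback overload.
interface StrictOptions {
  actions: { notifyDone: () => void };
}

declare function useMachine(
  machine: { _isGenerated: true },
  options: StrictOptions
): void;
declare function useMachine(
  machine: unknown,
  options?: Record<string, unknown>
): void;

declare const trafficLightMachine: { _isGenerated: true };

// The typo below should be an error against the strict overload, but
// TypeScript silently falls back to the loose overload, so nothing is reported.
useMachine(trafficLightMachine, {
  actions: { notifyDne: () => {} },
});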

There is more exploratory work to be done on the above to see if any internal changes within XState could be made to facilitate this. I have some ideas which I haven't yet tried.

3. Global declaration file without XState Overloads

Example Output

This works the same as 2, but instead of using useMachine to consume your machine, you would use useCompiledMachine.

import { useCompiledMachine } from "@xstate/react";

const [state, send] = useCompiledMachine(trafficLightMachine, {
  // options
});

Instead of interpret, you would use interpretCompiled or similar.

import { interpretCompiled } from "xstate";

const service = interpretCompiled(trafficLightMachine, {
  // options
});

XState would require an update to alias interpretCompiled to interpret.
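
A minimal sketch of what that alias could look like inside XState (purely hypothetical; interpretCompiled does not exist today):

import { interpret } from "xstate";

// Same runtime behaviour as interpret; a separate name exists only so the
// generated declaration file can attach stricter types to it.
export const interpretCompiled = interpret;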

Pros

Same pros as approach 2, with the added benefit of strong typings without any overloads needed.

Cons

Requires more steps from users to make it work.

4. @xstate/compiled folder which re-implements xstate

This has no example output as of yet.

This approach would entirely re-implement the typings of XState in a node_modules folder, @xstate/compiled. This would allow for maximum customisation of the types by the codegen tool.

It could require the same import syntax as 2, depending on how the internal types are handled:

import { Machine, TrafficLightStateMachine } from "@xstate/compiled";

const trafficLightMachine: TrafficLightStateMachine = Machine({
  // machine config
});

To use useMachine, users would import from @xstate/compiled/react:

import { useMachine } from "@xstate/compiled/react";

const [state, send] = useMachine(trafficLightMachine, {
  // options
});
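
As a hypothetical sketch, the generated typings inside node_modules/@xstate/compiled could look something like this; a real output would contain declarations for every analysed machine, and the context and event types here are placeholders:

// node_modules/@xstate/compiled/index.d.ts (hypothetical sketch)
import { MachineConfig, StateMachine } from "xstate";

interface TrafficLightContext {
  elapsed: number;
}

type TrafficLightEvent = { type: "TIMER" } | { type: "POWER_OUTAGE" };

export type TrafficLightStateMachine = StateMachine<
  TrafficLightContext,
  any,
  TrafficLightEvent
>;

// Because this package owns its own typings, the Machine signature can be as
// specific as the codegen tool wants, without coexisting with xstate's overloads.
export declare function Machine(
  config: MachineConfig<TrafficLightContext, any, TrafficLightEvent>
): TrafficLightStateMachine;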

Pros

As much control as we like over the typings, without the potential danger of using overloads.

A small change for users to make, which can also be opted into gradually if required.

Cons

Potentially a more complex implementation for the codegen tool, but this is not necessarily a con for the end user.

@mattpocock (Author)

I've been trying to implement 2a by adding a distinguishing _isGenerated property to the StateNode in XState core. This is working for interpret, but not for useMachine. It feels a little hacky, too - and I'd need someone from core to take a look to see if I'm on the right track.

@mattpocock (Author)

Ignore the above; I found a way through, on a branch here:

https://github.com/mattpocock/xstate/tree/added-_generated-type

@Andarist commented Aug 6, 2020

Minor comments

Quoting the RFC: "Users would need to configure per project which outputs they'd need - useMachine for React projects, a custom interpret function for others. The tool should work with minimal config."

This could be inferred from package.json.


I'm somewhat surprised that this works:

const lightMachine: LightMachineStateMachine = Machine(/* ... */)

Like, if I remove the explicit type from this declaration and do this:

type A = LightMachineStateMachine extends typeof lightMachine ? true : false // true 
type B = typeof lightMachine extends LightMachineStateMachine ? true : false // true

Hooow? Those types are not the same after all, so how can one be both a supertype and a subtype of the other? Unless I've done something really stupid here and don't understand this at all.

2c variant

An @xstate/compiled package that could just re-export the runtime, but would require fewer changes to the code because you could just search & replace xstate -> @xstate/compiled. This would also decouple us altogether from the types contained in the current XState packages, so we wouldn't have problems with overloads etc.
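
A minimal sketch of that variant, assuming the package forwards xstate's runtime unchanged while the codegen supplies the declaration file:

// node_modules/@xstate/compiled/index.ts (hypothetical): re-export the runtime
// unchanged, so migrating is a pure search & replace of the import path.
export * from "xstate";

// The accompanying index.d.ts would be generated per project by the codegen
// tool and would carry the strict, machine-specific types.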

Higher-level comment

Right now this RFC focuses on typing .options correctly, and even if that were the only thing we could achieve, it would already be a great improvement. I feel, though, that we could maybe do even more. Some things I'm missing here:

  • typing for "inline" onEntry (just an example): it's not that this is currently unimplemented by the codegen (that's totally fine), but that the technique used doesn't allow strict typing for it. To really achieve this we'd have to provide very strict types for every state node in the config, so I'm starting to believe we can't just rely on the built-in Machine/createMachine APIs, as they don't allow such granular control.
  • there is a long-standing issue with generics sometimes not being inferred correctly, like here. The goal would be for assign to work without an explicit TContext parameter being fed to it (see the sketch after this list). I think this might not be possible right now with this codegen, for reasons similar to the point above: it still relies on the built-in Machine/createMachine types, so inference for things like this stays unaffected.
  • IntelliSense for targets, .options[category] keys and similar. This would again require more granular control over the config type (pretty much the same root cause as in the previous points). A note on this one: we probably shouldn't restrict string values to the ones matching the correct signature, but rather allow all defined guards (for example) and just recompile the underlying types so the options object can start reporting a problem. This would provide better usability than restricting possible values to only those that match.
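
A small sketch of the assign inference issue mentioned above, using XState's current v4 API (the machine itself is made up):

import { assign, Machine } from "xstate";

interface CounterContext {
  count: number;
}

const counterMachine = Machine<CounterContext>({
  initial: "active",
  context: { count: 0 },
  states: {
    active: {
      on: {
        // Without the explicit <CounterContext> type parameter, `ctx` in the
        // assigner is not inferred from the surrounding machine; ideally the
        // codegen (or XState itself) would make the annotation unnecessary.
        INC: {
          actions: assign<CounterContext>({ count: (ctx) => ctx.count + 1 }),
        },
      },
    },
  },
});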

IntelliSense

You have mentioned that we could provide this using other means, such as a VS Code extension. And I agree: this is possible, and maybe it would even be preferable? Types would be a little less complex if we could offload this to the IDE integration, and maybe thanks to that TS performance would be better. I'm not sure how best to benchmark this.

My main motivation for proposing TS-based IntelliSense is that it would work in a wider range of scenarios.
