- Lenz Weber-Tronic
- Dominik Dorfmeister
- Fredrik Hoglund
- Jerel Miller
- Alessia Bellisario
- Andrew Clark
- Dan Abramov
- Mark Erikson
- Lenz: open questions around the React `use` RFC and the `cache()` proposal, and how our data fetching libraries are supposed to integrate with those
- Andrew: `use` and `cache` are different things, let's talk about `use` first
  - can think of it purely as a replacement for `throw promise`, can just migrate to that internally
  - something about internal call stacks, and can maybe throw a promise internally and pass to `use()`
  - `use` has a capability that throwing a promise doesn't: users can conditionally suspend. (current lib APIs do some `disable: true` options)
  - Suspense compat: not about throwing a promise, but about taking advantage of Suspense fallbacks vs `{isLoading}`. Loading states still work, but for max compat, encourage users to take advantage of `startTransition` and Suspense boundaries
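
A rough sketch of the "conditionally suspend" point above, assuming the `use()` API from the RFC plus a hypothetical `getUserPromise` helper that returns a cached promise (not a real library API):

```tsx
import { use } from "react";

// Hypothetical helper that returns a cached promise for a user record.
declare function getUserPromise(userId: string): Promise<{ name: string }>;

function UserGreeting({ userId, enabled }: { userId: string; enabled: boolean }) {
  // Unlike `throw promise`, `use()` can be called conditionally:
  // when `enabled` is false this component never suspends
  // (roughly what a `disable: true` / `skip` option does today).
  const user = enabled ? use(getUserPromise(userId)) : null;
  return <p>{user ? `Hello, ${user.name}` : "Not loaded"}</p>;
}
```
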
- Lenz: bigger question is all the capabilities that are in the hooks right now: `useQuery` can be skipped, `useQueries` can do a bunch of requests
  - There might be more updates over time (revalidation, live streaming, etc) - promises don't handle all those cases
- Andrew: don't expect the libraries to get rid of the hooks, keep those. But, you could return a promise from the hook. Can `setState(newPromise)`, and let the user decide if they want to wait for it.
- Lenz: hand-wave-y wish for being able to remove the hooks and be UI-agnostic, just return a promise or observable
- Andrew: we've got a vague proposal but haven't written it up yet
- Dominik: we already do pretty much this. There's a "query observable" internally, and all the UI adapters just subscribe to that (such as `useSyncExternalStore`).
  - Andrew may have previously said that having users call `use(returnedPromise)` feels clunky - did you change your mind on that?
- Dan: recapping his understanding of the issues people are describing, including the possible issues of letting/making users pass around the promise and deciding where to unwrap it
- Andrew: if we see that all users are going `use(useQuery())`, maybe you should just call it yourself internally. Most folks don't do `const promise = asyncFn(); await promise` in other JS code.
  - Might be nice to have conditional rendering return a promise instead of what we're doing now
  - "Vend a promise" === return a promise instead of the value itself
- Fredrik: `skip`/`disable` actually avoids fetching entirely
- Andrew: could return a "lazy thenable instead of a real promise"
- Mark: Brian Vaughn has been using a "sync thenable" in Replay
- Andrew: we like the simplicity of the thenable contract
- Dominik: if you give `use()` a new promise, does it Suspend again?
  - Andrew: we've tried to make it easier to use even if the promise identity changes (note: we should update the RFC, we've thought of some new tricks we can use to simplify handling). The hard requirement is that the I/O is cached, not so much the thenable
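
A sketch of the "lazy thenable" idea: something that satisfies the thenable contract but doesn't start the request until it's actually unwrapped, so a skipped query never fetches (illustrative names, not a real API):

```ts
// A "lazy thenable": conforms to the thenable contract but defers kicking off
// the fetch until .then() is first called, so a disabled/skipped query
// never hits the network.
function createLazyThenable<T>(start: () => Promise<T>): PromiseLike<T> {
  let underlying: Promise<T> | undefined;
  return {
    then(onFulfilled, onRejected) {
      // The first unwrap (e.g. by use() or await) starts the real work.
      underlying ??= start();
      return underlying.then(onFulfilled, onRejected);
    },
  };
}

// Nothing is fetched here yet - only when a component actually unwraps `lazyUser`.
const lazyUser = createLazyThenable(() =>
  fetch("/api/user/1").then((res) => res.json())
);
```
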
- Dominik: in RQ, we go into a full loading state on first request, on refetch we keep the existing data and refetch in the background. How do background updates start to come into play here? Wouldn't we suspend again?
- Andrew: those kinds of updates should be wrapped in `startTransition()`. Otherwise, React will show a fallback. `startTransition()` is an opt-in that says "wait until you can do all the rendering".
- Dan: some tension here that we need to figure out. We're trying to build React features that bring "don't know the result until render happens" into the component paradigm. `setState()` updates immediately, so you have to model the async behavior outside of React. With the Suspense model, the whole thing is one big state update. (A hypothetical Suspense/Transition-based router is conceptually one big state update. It starts, and doesn't complete until later.) If you wrap your head around this mindset, it changes how you build features. Rendering is an async process that might need to do more work, and you really have to rely on React's primitives for those problems. Loading -> `<Suspense>`, updates -> `setState()` in `startTransition` to do things in the background, refetch -> React will keep asking for the old thing in render until the new thing is ready (instead of the existing "keep returning the old data until the new data is available" library implementations). The tension is that libs already have existing ways to do this, and also other UI libs like Solid/Vue/etc don't have an "async UI primitive". So, there's overlap in these features - how much do you want to lean on what React provides? How deeply do you want to integrate with React? Not sure how much we can overcome that tension.
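
A sketch of the `startTransition()` behavior Andrew describes: wrapping the refetch keeps the existing UI on screen instead of re-showing the Suspense fallback (hypothetical `fetchTodos` again):

```tsx
import { use, useState, useTransition, Suspense } from "react";

declare function fetchTodos(filter: string): Promise<string[]>;

function TodoList({ todosPromise }: { todosPromise: Promise<string[]> }) {
  const todos = use(todosPromise);
  return <ul>{todos.map((t) => <li key={t}>{t}</li>)}</ul>;
}

function FilteredTodos() {
  const [todosPromise, setTodosPromise] = useState(() => fetchTodos("all"));
  const [isPending, startTransition] = useTransition();

  function refetch(filter: string) {
    // Without the transition, swapping in a new (unresolved) promise would
    // re-suspend and show the fallback. Inside a transition, React keeps
    // showing the previous todos until the new ones are ready.
    startTransition(() => {
      setTodosPromise(fetchTodos(filter));
    });
  }

  return (
    <div style={{ opacity: isPending ? 0.5 : 1 }}>
      <button onClick={() => refetch("done")}>Show done</button>
      <Suspense fallback={<p>Loading…</p>}>
        <TodoList todosPromise={todosPromise} />
      </Suspense>
    </div>
  );
}
```
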
- Lenz: currently all our hacky implementation stuff is in the React hooks - the underlying UI-agnostic logic is a lot simpler.
- Still concerned about the one-time promise resolution question. What about SSR?
- Dan: what are some examples you're concerned with?
- Lenz: use cases - changing data locally, mutating to the server, refetching it and having it update a bunch of places in the UI
- Andrew: so not "server-based" so much as "request/response"-based. Agree that we're not serving the "fine-grained subscription" / "websocket" / "live updates" use case. We might make `Observable` a type you can pass to `use()` - would suspend until the first value comes back, regular state updates later. Also, what about multiple subscriptions with a dependency graph? That's what makes request/response so nice, simpler to handle. Acknowledged that it's not something we handle first-class yet.
- Lenz: yeah, we look at `use()` and think our problems are solved, and then realize we need most of the same workarounds
- Dan: our model is more naturally suited for denormalized caching, not entity caching. We're currently looking for something more like what React Query does with caching responses, so you don't need observables to subscribe to a store. A mutation invalidates the whole cache, just refetch the whole cache entry. That's why we want to have a built-in `cache()` API with our own version of `fetch()`. (We won't patch it by default, but we'll encourage frameworks to do that, and have a scoped version that integrates with the cache.) It would be global across the app, have your own async operations that reuse the cache, React's own `fetch()` would be integrated so you don't have to keep re-wrapping it in your own code. Caching would be scoped to subtrees. Kinda like `useMemo()` in a way.
- Lenz: RTKQ isn't normalized. But, one mutation can affect 30-some other components that all need new data, so I don't know where `startTransition()` would be needed.
- Dan: how does Next do it? It's like "re-render the whole thing"
- Andrew: kinda like routing, find the nearest route and re-render those.
- Andrew: maybe should skip more cache discussion since we're short on time. Yeah, we're kind of encroaching on data lib features here. No built-in invalidation other than "refresh the whole thing".
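
A rough sketch of the `cache()` idea Dan describes - the API was a proposal at the time of this discussion, so treat the details as subject to change:

```tsx
import { cache } from "react";

// Wrapping an async function with cache() dedupes calls with the same
// arguments within the cache's scope, so repeated reads share one request.
const getUser = cache(async (id: string) => {
  const res = await fetch(`/api/users/${id}`);
  return (await res.json()) as { name: string };
});

// Two components asking for the same user share a single fetch; there's no
// fine-grained invalidation - a refresh refetches the whole cache entry.
async function UserName({ id }: { id: string }) {
  const user = await getUser(id);
  return <span>{user.name}</span>;
}

async function UserAvatar({ id }: { id: string }) {
  const user = await getUser(id);
  return <img alt={user.name} src={`/api/users/${id}/avatar`} />;
}
```
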
- Fredrik: other tension worth addressing here. Who calls `startTransition()`? Case: a mutation could cause an update, app dev could call `startTransition()`, but I think we want to avoid calling that in libraries b/c apps might not be ready for that yet. Going back to the background fetch question: what about just wanting to swap new data in for old? How does `use()` check for new data vs old data?
  - Andrew: if you give us a new promise with a new value, we should use that
- Dan: other tension here is not wanting to use `startTransition()`. It's going in a circle - how much do you want to buy in to the new React ecosystem? Projects that want to be UI-agnostic have to work with the lowest common denominator. Similarly, some libs want to retain differentiation vs what React can do. But, you should be willing to call `startTransition()` in a library. That is the feature. Long-term perspective is we do want all first-class data fetching approaches to work within React's paradigm. We've built apps with this approach, we know it works, but it's diverged from the ecosystem's approach. So, do you take a bet on this approach or not? The Next.js Router (Next 13 `app` directory) is a really good example of this overall.
- Fredrik: still don't think `startTransition()` is necessarily right for a data lib, more for routing.
- Andrew: agreed it's awkward (saw this with Relay). Trap to avoid: compromising users with a brand new app who do want to use idiomatic concurrent features. This may mean having two APIs: existing hooks, new concurrent API (Relay did this). Avoid trying to shove everything into one API.
- Dan: I think Relay does use `startTransition()` inside?
- Andrew: yes, for "consistency updates"
- Dominik: yeah, we could call `startTransition()`. Not happy with our own `suspense` flag and how it works with TS. We should probably have a separate specific `useSuspenseQuery`-type hook to let people opt in to that behavior.
- Jerel: Apollo is doing that sort of `useSuspenseQuery` hook
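
A minimal sketch of what a dedicated `useSuspenseQuery`-style hook could look like - calling `use()` internally so users don't write `use(useQuery())` themselves. Illustrative only, not Apollo's or React Query's actual implementation; `getOrCreateQueryPromise` is a hypothetical cache helper:

```tsx
import { use } from "react";

// Hypothetical query client helper that returns a cached promise per key.
declare function getOrCreateQueryPromise<T>(
  key: string,
  fetcher: () => Promise<T>
): Promise<T>;

// Opt-in Suspense hook: always suspends until data is available, so `data`
// can be typed as T rather than T | undefined (the TS concern above).
function useSuspenseQuery<T>(key: string, fetcher: () => Promise<T>): { data: T } {
  const promise = getOrCreateQueryPromise(key, fetcher);
  const data = use(promise);
  return { data };
}
```
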
- Mark: what about error handling?
  - Andrew: use an `<ErrorBoundary>`. We might special-case network errors.
  - Andrew: we'll ship `use()` one release after stable React Server Components, because RSCs will serialize promises to the client. Cache and observables and stuff would be sometime after that.
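
For reference, a minimal error boundary sketch of the kind being suggested - a rejected promise unwrapped by `use()` surfaces as a render error and is caught by the nearest boundary (`TodoList` here stands in for any suspending component):

```tsx
import { Component, ReactElement, ReactNode, Suspense } from "react";

declare function TodoList(props: { todosPromise: Promise<string[]> }): ReactElement;

class ErrorBoundary extends Component<
  { fallback: ReactNode; children: ReactNode },
  { hasError: boolean }
> {
  state = { hasError: false };
  static getDerivedStateFromError() {
    return { hasError: true };
  }
  render() {
    // Errors from suspended data land here; loading states go to <Suspense>.
    return this.state.hasError ? this.props.fallback : this.props.children;
  }
}

function TodosSection({ todosPromise }: { todosPromise: Promise<string[]> }) {
  return (
    <ErrorBoundary fallback={<p>Something went wrong loading todos.</p>}>
      <Suspense fallback={<p>Loading…</p>}>
        <TodoList todosPromise={todosPromise} />
      </Suspense>
    </ErrorBoundary>
  );
}
```
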
- Fredrik: question on `startTransition()`, can't remember where I first read this bit. Think I saw a comment that it would be better for concurrent features to be used in userland first and not in libraries. However, one important question: how set are you on adding the `promise.status`-type fields? We might want to do that too.
  - Andrew: we're happy with that idea, yeah. I'm about as sure as I can be despite it not being shipped yet. We intentionally did it to make it an ecosystem convention.
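
A sketch of the `promise.status`-type convention under discussion: tagging settled promises with `status` / `value` / `reason` so already-resolved data can be read synchronously instead of suspending again. Treat the field names as a proposed convention, not a stable API:

```ts
type TrackedPromise<T> = Promise<T> & {
  status?: "pending" | "fulfilled" | "rejected";
  value?: T;
  reason?: unknown;
};

// Attach status fields as the promise settles, so a consumer (or use())
// can unwrap already-resolved data without waiting another tick.
function track<T>(promise: Promise<T>): TrackedPromise<T> {
  const tracked = promise as TrackedPromise<T>;
  tracked.status = "pending";
  promise.then(
    (value) => {
      tracked.status = "fulfilled";
      tracked.value = value;
    },
    (reason) => {
      tracked.status = "rejected";
      tracked.reason = reason;
    }
  );
  return tracked;
}
```
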
- Lenz: can we set up some kind of ongoing communications channel, similar to the React 18 WG?
- Mark: yeah, WG discussion forum format was good
- Andrew/Lenz: yep, RFC threads are painful
- Dan: ecosystem is changing. Might make sense to start with a very limited Suspense-based feature set, get that working first, then expand. This other world is different - don't try to map everything 1:1 and hack around things with effects and refs. Start with just "get data in a component with Suspense and loading indicators show up, cache stays forever". Then do invalidation. Then, etc, etc.
- Andrew: we do like feedback. We're still thinking about future things even as we work on the earlier pieces.
- Lenz: when we think about starting without options, there's things like a "defer" annotation in GraphQL. No idea how to do that right now.
- Andrew: we have that, but it's UI-based. You mark a Suspense boundary as deferred.
- Lenz: that could result in streaming updates even for a "simple" query.
- Andrew: RSCs do out-of-order streaming, so each component can pop in as soon as it resolves.
- Mark: lots of unclear answers around things like waterfall fetches and priorities
- Andrew: generally: initiate ASAP to avoid waterfalls, unwrap as late as possible.
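
A sketch of "initiate ASAP, unwrap as late as possible" - kick off both requests before unwrapping either so the second fetch doesn't wait on the first. Assumes the hypothetical fetchers cache their promises per argument (the "I/O is cached" requirement above):

```tsx
import { use } from "react";

declare function fetchUser(id: string): Promise<{ name: string }>;
declare function fetchPosts(id: string): Promise<string[]>;

function Profile({ userId }: { userId: string }) {
  // Start both requests first...
  const userPromise = fetchUser(userId);
  const postsPromise = fetchPosts(userId);

  // ...then unwrap. Writing use(fetchUser(...)) before calling fetchPosts(...)
  // would delay the posts request until the user request resolved - a waterfall.
  const user = use(userPromise);
  const posts = use(postsPromise);

  return (
    <section>
      <h2>{user.name}</h2>
      <ul>{posts.map((p) => <li key={p}>{p}</li>)}</ul>
    </section>
  );
}
```
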
- Andrew: hopefully GraphQL providers get better at streaming.
(missed about 20+ minutes of discussion after Andrew, Dan, and I left; I rejoined later)
- Fredrik: 3 cases around when to call `startTransition()` in-app vs in-lib
- Dominik: we only suspend when there's no stale data yet. Why would anyone want to show a fallback instead of the existing data?
- Jerel: maybe case when there's an explicit "Refetch" button? By default you probably want to show the existing UI
- Alessia: appreciated questioning around "websocket-y/subscriptions/live queries". Was hoping this would come up and we didn't go deep on what the React team has in mind - Andrew just acknowledged it's not addressed and this model is really about single-fetch request/response caching.
- Alessia: I've worked on "defer" support, glad it's being encapsulated in a separate hook. We thought about the core `useQuery` hook being Suspenseful, but glad we're doing it separately. Having a limited API surface will hopefully make getting started with this easier. Interesting to see that "websocket"-style support is off to the side.
- Lenz: wrote down and underlined "observable support for `use` may be coming". But also interesting to see that hooks stay around.
- Mark: yeah, helpful to know they're aiming for a denormalized cache concept.
- Jerel: interesting to see mis-match of "who's encroaching on whose caching approach"
- Lenz: the old "cache" implementation is so very limited.
- Mark: there's a `ReactCache.js` file in the React repo. It's really "memoizing", not "caching" per se. Memoizes on arguments with a `WeakMap` and a tree of cache nodes, pulled from Context to be per-subtree.
- Fredrik: Sebastian talked about "Cache boundaries" as components, setting invalidation keys per subtree
- Lenz: see reactwg/react-18#25 and the linked sandboxes. APIs may change, but the idea has been around
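
An illustrative memoize-by-arguments sketch along the lines Mark describes - a tree of cache nodes, one level per argument, with object arguments held via `WeakMap`. Not React's actual implementation:

```ts
// Each cache node covers one argument position: primitive args go in a Map,
// object args in a WeakMap, and the leaf node stores the memoized result.
type CacheNode = {
  primitives: Map<unknown, CacheNode>;
  objects: WeakMap<object, CacheNode>;
  hasResult: boolean;
  result?: unknown;
};

const createNode = (): CacheNode => ({
  primitives: new Map(),
  objects: new WeakMap(),
  hasResult: false,
});

function memoizeByArgs<Args extends unknown[], Result>(
  fn: (...args: Args) => Result
): (...args: Args) => Result {
  const root = createNode();
  return (...args: Args): Result => {
    // Walk (and lazily build) one cache-node level per argument.
    let node = root;
    for (const arg of args) {
      let next: CacheNode | undefined;
      if (typeof arg === "object" && arg !== null) {
        next = node.objects.get(arg);
        if (!next) {
          next = createNode();
          node.objects.set(arg, next);
        }
      } else {
        next = node.primitives.get(arg);
        if (!next) {
          next = createNode();
          node.primitives.set(arg, next);
        }
      }
      node = next;
    }
    if (!node.hasResult) {
      node.result = fn(...args);
      node.hasResult = true;
    }
    return node.result as Result;
  };
}
```
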
- Mark: what about needing to access something like a Redux store or API client in a Suspense async function? In Replay, we're having to call `getSomeDataSuspense(replayClient)`, and that means calling `const replayClient = useContext(ReplayClientContext)` first. I wouldn't want to have to do that with the Redux store.
  - Lenz: they seem to be moving away from Context some. Maybe due to doing things on a server and using "global server contexts" in Node?
- Fredrik: concerned about those `asyncLocalStorage`-style contexts losing data somehow.
- Lenz: I think 90% of context usage is from people who are not working in big apps.
- Lenz/Mark: we should definitely try to have further discussions and collaborate