Can't share the complete code because the app's closed source and still in stealth mode, but here's how I'm using React Router and Redux in a large app with server rendering and code splitting on routes.
- A wildcard Express route configures a Redux store for each request and makes an `addReducers()` callback available to the `getComponents()` method of each React Router route. Each route is responsible for adding any Redux reducers it needs when it's loaded. (This isn't really necessary on the server, but it's what makes code splitting possible on the client later.) This is almost exactly the same approach shown in @gaearon's example here.
- After the store is configured and routes are loaded, we run React Router's `matchRoute()`.
- If a route is matched, we loop through all its components and look for static `loadAsyncData()` methods, executing any we find and passing in the current `location`, `params`, and our Redux store's `dispatch()` method.
- Once all the `loadAsyncData()` methods have fulfilled their promises, we render the components to HTML, serialize the state of the Redux store to JSON, and inject both into the page. (There's a rough sketch of this server-side flow right after this list.)
- The main webpack entry point on the client loads the root routes, runs `matchRoute()` to load dynamic routes (which in turn add their reducers via our `addReducers()` callback), and then configures a Redux store using the server-sent state as the initial state. (Sketched below as well.)
- Components that need to load async data before rendering execute their `loadAsyncData()` methods in `componentWillMount()`. For the initial render this results in zero additional requests and zero DOM changes, since the server has already provided and rendered the initial state. (See the component sketch below.)
With this architecture, it's absolutely essential that the app be entirely API-based. The server and the client have to be able to use the same code to make the same API calls, resulting in the same state, or things would quickly become unmanageable. We chose to build an API based on the JSON API spec, and it's working very well for us.
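The shared API layer is plain JS that runs unchanged in both environments. Something along these lines (the `isomorphic-fetch` dependency, URL, and helper name are illustrative):

```js
import 'isomorphic-fetch'; // provides a fetch() that works on both server and client

const API_ROOT = 'https://api.example.com'; // illustrative

// Both the server-side loadAsyncData() calls and the client-side ones go
// through helpers like this, so the same request produces the same state.
export function fetchJSON(path) {
  return fetch(API_ROOT + path, {
    headers: { Accept: 'application/vnd.api+json' } // the JSON API media type
  }).then(response => {
    if (!response.ok) {
      throw new Error(`API request failed with status ${response.status}`);
    }

    return response.json();
  });
}
```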
As always, build tooling is a challenge. We're writing ES2015 code (with some extras like destructuring) on both server and client, which means we must transpile. Rather than building everything with webpack, we use a two-path build process. Client-side code (including shared universal code that will run on the client) is built with webpack + Babel. Server-side code (including shared universal code that will run on the server) is transpiled by Babel, but doesn't go through webpack.

During development we use `babel-register` for on-the-fly server-side transpilation and `webpack --watch` for on-the-fly client-side rebuilds. This has worked out really well for us. It allows us to rely on webpack for code splitting and other optimizations that make sense in browser JS, without forcing us to also use webpack for server JS, where it makes less sense.
For CSS, we're using PostCSS, but because we don't want our component CSS to be dependent on JS, we're not building it or injecting it via webpack. We wrote a pretty simple little stylesheet loader component using `react-side-effect` that allows us to load stylesheets on demand on both the server and the client in a declarative way, using markup that looks like this:

```jsx
<CSS path="/path/to/stylesheet.css">
  ... component markup (not rendered until CSS is loaded) ...
</CSS>
```

On the server, this results in static `<link>` elements being added to the markup of the initial response. On the client, this results in a `<link>` element being dynamically added to the DOM if the referenced stylesheet hasn't already been loaded.
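Here's a stripped-down sketch of the idea, assuming `react-side-effect`'s `withSideEffect()` higher-order component. It leaves out the "don't render children until the stylesheet has loaded" part, and the details are illustrative rather than our exact implementation:

```jsx
import React from 'react';
import withSideEffect from 'react-side-effect';

// Render the children; the real component also defers them until the
// stylesheet has loaded, which is omitted here.
function CSS({ children }) {
  return <div>{children}</div>;
}

// Collect the `path` prop from every mounted <CSS> instance.
function reducePropsToState(propsList) {
  return propsList.map(props => props.path);
}

// On the client: add a <link> for any stylesheet that isn't already loaded.
function handleStateChangeOnClient(paths) {
  paths.forEach(path => {
    if (!document.querySelector(`link[href="${path}"]`)) {
      const link = document.createElement('link');
      link.rel = 'stylesheet';
      link.href = path;
      document.head.appendChild(link);
    }
  });
}

// On the server: after renderToString(), CSS.rewind() returns the collected
// paths so the wildcard route can emit static <link> elements.
export default withSideEffect(reducePropsToState, handleStateChangeOnClient)(CSS);
```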
The end result of all of this is that our pages can be rendered entirely on the server (with or without JS enabled on the client), entirely on the client, or we can render the initial pageview on the server and subsequent pageviews on the client. If we get a sudden flood of traffic, we can even flip a switch and turn off server rendering temporarily to reduce server load while we scale up.
Thanks to code splitting, our main JS bundle is reasonably sized, and thanks to server-side rendering, we can serve a complete, working page long before slower clients have even started to download the JS.