Routes can define data dependencies and the elements they render as asynchronous functions that are guaranteed to resolve before the next location renders. The suspense is palpable!
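To make the idea concrete, here's a toy sketch of the pattern (the route shape and `navigate` helper are illustrative stand-ins, not React Location's actual internals): each route pairs an element with an async loader, and navigation only commits once the matched loader has resolved.

```javascript
// Hypothetical mini-router sketch of "resolve data before rendering".
const routes = [
  {
    path: "/posts",
    element: "PostsPage", // stand-in for a React element
    loader: () => Promise.resolve({ posts: ["a", "b"] }),
  },
];

function matchRoute(path) {
  return routes.find((r) => r.path === path);
}

async function navigate(path) {
  const match = matchRoute(path);
  const data = await match.loader(); // guaranteed to resolve first...
  return { element: match.element, data }; // ...then render the new location
}

navigate("/posts").then((next) => console.log(next.element, next.data));
```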
Parallelized Loaders
"πΆ Don't go chasing waterfalls", not even the data-fetching ones!
Navigating to a deep link that matches 3 loaders right out of the gate? By default, loaders and async elements for matched routes are all parallelized for efficiency. You can opt into fetching serially, too 🤷‍♂️
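Here's a rough sketch of why that default matters for deep links. Three matched routes, each with its own loader: running them with `Promise.all` takes roughly the slowest loader's time instead of the sum of all three. (Names and timings are illustrative.)

```javascript
// Resolve `value` after `ms` milliseconds — a fake loader.
const wait = (ms, value) => new Promise((res) => setTimeout(() => res(value), ms));

const matchedLoaders = [
  () => wait(50, "layout"),
  () => wait(50, "team"),
  () => wait(50, "member"),
];

// Parallel (the default): ~50ms total.
function loadParallel(loaders) {
  return Promise.all(loaders.map((l) => l()));
}

// Serial (opt-in): ~150ms total.
async function loadSerial(loaders) {
  const out = [];
  for (const l of loaders) out.push(await l());
  return out;
}

loadParallel(matchedLoaders).then((d) => console.log(d)); // ["layout", "team", "member"]
```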
Placeholder spinners are the worst! To get rid of them, just turn on prefetching for individual <Link />s, all of them, or even prefetch manually with `useLoadRoute`. No more spinners. Basic Caching. Auto garbage collection. Win win win.
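The core trick behind prefetching can be sketched in a few lines (hypothetical helper, not React Location's real implementation): kick off the loader on hover, cache the in-flight promise, and reuse it on click, so there's nothing left to load by the time navigation happens.

```javascript
// Prefetch sketch: cache the loader's promise so hover + click share one fetch.
const cache = new Map();
let loaderCalls = 0;

function loadRoute(path, loader) {
  if (!cache.has(path)) {
    loaderCalls += 1;
    cache.set(path, loader()); // store the promise itself, not the result
  }
  return cache.get(path);
}

// onMouseEnter → prefetch starts the fetch early
loadRoute("/posts", () => Promise.resolve(["a", "b"]));
// onClick → same cached promise; the loader is not called again
loadRoute("/posts", () => Promise.resolve(["a", "b"]));

console.log(loaderCalls); // 1
```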
Optional Built-in/External Caching
Out of the box, active and pending route data is cached while you navigate, so only changing UI triggers data fetching. BUT WAIT, you can also use RL's basic maxAge caching to keep data around longer, OR you can integrate RL with tools like #ReactQuery!
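The maxAge idea boils down to a timestamp check: cached route data is reused while fresh and refetched once it's older than the allowed age. A minimal sketch (the cache shape below is illustrative, not RL's actual internals):

```javascript
// maxAge-style cache sketch: fresh entries are reused, stale ones refetched.
function createCache() {
  const store = new Map();
  return {
    get(key, maxAge, now = Date.now()) {
      const entry = store.get(key);
      if (entry && now - entry.updatedAt <= maxAge) return entry.data; // fresh
      return undefined; // stale or missing → caller refetches
    },
    set(key, data, now = Date.now()) {
      store.set(key, { data, updatedAt: now });
    },
  };
}

const cache = createCache();
cache.set("/posts", ["a", "b"], 1000);
console.log(cache.get("/posts", 5000, 2000)); // within maxAge → ["a", "b"]
console.log(cache.get("/posts", 5000, 9000)); // past maxAge → undefined
```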
1st-class Integrated Search Params API
Search params are pure 🔥-power when building apps, so RL doesn't mess around. With 1st-class search param APIs everywhere, you'll *want* to put more state in the URL, and your users will thank you for it every time they share or bookmark.
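The gist of a first-class search-param API: treat the search string as a plain object, update it functionally, and serialize it back to the URL. (React Location exposes hooks for this; the helpers below are illustrative stand-ins built on the standard `URLSearchParams`.)

```javascript
// Parse "a=1&b=2" into a plain object.
function parseSearch(searchString) {
  return Object.fromEntries(new URLSearchParams(searchString));
}

// Serialize a plain object back into a search string.
function stringifySearch(search) {
  return new URLSearchParams(search).toString();
}

// Functional updates keep unrelated params intact.
function updateSearch(searchString, updater) {
  return stringifySearch(updater(parseSearch(searchString)));
}

const next = updateSearch("page=1&sort=asc", (prev) => ({ ...prev, page: "2" }));
console.log(next); // "page=2&sort=asc"
```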
Code-Splitting
Table stakes here, but we have to mention it. Code splitting is of course supported, and the coolest part is that any code splitting you do in RL doesn't change anything about parallelization, prefetching, etc. Ahh tsss... split it. Split it real good!
Sometimes loaders can be a little 🐢, so pending states are a must. Time them precisely for that "suspense"-y feel: fast loads skip the spinner entirely, and only genuinely slow ones fall back to the pending state. Heck, you can even force a pending state to show for a minimum amount of time!
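The timing logic behind that can be expressed as a pure function (option names here are illustrative, not RL's actual config keys): only show the spinner if loading exceeds a threshold, and once shown, keep it visible for a minimum duration to avoid a jarring flash.

```javascript
// Pending-state timing sketch: decide if/how long to show a spinner.
function pendingPlan(loadTime, { showAfter, minShow }) {
  if (loadTime <= showAfter) {
    // Load finished before the threshold → never show a spinner.
    return { showSpinner: false, totalTime: loadTime };
  }
  // Spinner appears at `showAfter` and must stay up at least `minShow`.
  const spinnerTime = Math.max(loadTime - showAfter, minShow);
  return { showSpinner: true, totalTime: showAfter + spinnerTime };
}

console.log(pendingPlan(80, { showAfter: 100, minShow: 300 }));
// fast load → { showSpinner: false, totalTime: 80 }
console.log(pendingPlan(150, { showAfter: 100, minShow: 300 }));
// slow load → spinner held for the full minShow: { showSpinner: true, totalTime: 400 }
```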
Nested Layouts
Again, table stakes here, but we have to mention it. If you don't know the benefits of nested layouts yet... first of all, what router have you been using for the last 7 years? But seriously, they're fantastic.
Batched Navigation/Render
Putting state in the URL and navigating asynchronously can get a little 🌶️ for the URL and rerendering in large apps. "Routing's a batch."
RL batches both URL updates and rerenders to your app to squeeze out every last bit of performance it can.
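The batching trick can be sketched with a microtask (this is a generic illustration of the technique, not RL's actual implementation): several updates in the same tick are merged, and the render callback fires once.

```javascript
// Batching sketch: coalesce same-tick updates into a single render.
function createBatcher(render) {
  let scheduled = false;
  let state = {};
  return function update(patch) {
    Object.assign(state, patch); // merge the update immediately...
    if (!scheduled) {
      scheduled = true;
      queueMicrotask(() => {
        // ...but render only once per tick.
        scheduled = false;
        render({ ...state });
      });
    }
  };
}

let renders = 0;
const update = createBatcher(() => { renders += 1; });

// Three updates in one tick...
update({ page: 2 });
update({ sort: "desc" });
update({ filter: "open" });

queueMicrotask(() => console.log(renders)); // ...one render
```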
There's a handful of other awesome features like:
- Optional route filtering/ranking
- JSX-style routes for the "clean code"-ers
And a few more are on the way, mainly completing the SSR story + examples
@theefer@OliverJAsh Iβll respond more at my keyboard, but the simple solution to this is to adopt invalidation patterns over manual data normalization. Just invalidate all of the queries. They already know how to fetch the exact data they need.
@theefer @OliverJAsh Regardless of React Query or not, normalized caches on the front-end come with the upside of fetching less data and reusing the responses of mutations as much as possible. When you like a tweet, you just go find that tweet in your store and update it
...
@theefer @OliverJAsh This works fine if that action doesn't have any side effects on the rest of the system. Imagine TweetDeck showing only liked tweets, or a list that is sorted by time of liking. This now means you either have to replicate all of the logic on your server
Lots of talk about React suspense today. I'm very excited about it.
However, I feel like the patterns I've seen in #ReactQuery (and friends like SWR) have alone been more transformative for my own dev process and users, and have even prepped me for suspense in a lot of ways.
Of course, Suspense isn't out yet, and the ecosystem will likely experience a smash of collective innovation around it when it comes out. That's the exciting part. But what I feel like I'm starting to realize after playing around with it is that in the end,
The biggest thing that suspense is going to do for me is just one thing: avoiding that flash of loading/placeholder for newly mounted components. Suspense doesn't fetch, it doesn't code-split, it doesn't preload for you, and it doesn't cache for you.