⚡️ #GraphQL & The "Dataloader" Pattern ⚡️

If you're just starting out with GraphQL or running into issues with data loading during execution, you've probably heard things like "Make sure you're using Dataloader" or "Just use batch loading!" What is this all about? Thread:
The principal unit of execution in a GraphQL API is the *resolver*. The resolver is a great pattern because it is a stand-alone function with a single purpose: providing the data for a single GraphQL field.
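As a minimal sketch, assuming a hypothetical `Post.author` field and a `db.findUser` data-access helper, a resolver can be as small as this:

```typescript
// A resolver is just a function: given the parent object, the field
// arguments, and some shared context, it returns the value for one field.
const Post = {
  // Resolves the hypothetical `Post.author` field.
  author: (
    post: { authorId: string },
    _args: unknown,
    context: { db: { findUser(id: string): Promise<unknown> } },
  ) => context.db.findUser(post.authorId),
};
```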
The basic way servers execute queries is serially, where each field is executed one after the other. A GraphQL server walks the provided query in a depth-first manner and executes every resolver function along the way.
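To make the rest of the thread concrete, here's a hypothetical query against an imagined posts/comments/users schema, with the rough order a serial, depth-first executor would visit the resolvers:

```typescript
// Hypothetical query. A serial executor resolves it depth-first, roughly:
//   posts
//     -> posts[0].title -> posts[0].author -> posts[0].author.name
//     -> posts[0].comments -> posts[0].comments[0].body
//     -> posts[0].comments[0].author -> ...
//     -> posts[1].title -> ... and so on, one resolver at a time.
const query = `
  {
    posts {
      title
      author { name }
      comments {
        body
        author { name }
      }
    }
  }
`;
```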
While the resolver concept brings many great things, it can get tricky when it comes to loading the data a query requires. Imagine we were designing an endpoint to return the data above. Ideally, we'd load that data with just a handful of batched calls, for example 4 SQL queries:
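For the hypothetical posts/comments/users query above, that ideal plan could look roughly like this (table and column names are made up), with one batched query per kind of data:

```typescript
// Roughly the ideal plan: one query per layer of data, batching every
// id we already know about instead of fetching rows one at a time.
const idealQueries = [
  `SELECT * FROM posts ORDER BY created_at DESC LIMIT 10`,
  `SELECT * FROM users WHERE id IN (/* author ids from those posts */)`,
  `SELECT * FROM comments WHERE post_id IN (/* ids of those posts */)`,
  `SELECT * FROM users WHERE id IN (/* author ids from those comments */)`,
];
```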
A naive GraphQL implementation will usually look more like this. Notice how every single child resolver makes its own call for a user at the bottom: the resolvers share no common context that would let them load all of this data at once. If there were 50 nodes we'd see **53 queries**. Bad!
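A sketch of what that naive version tends to look like, using a hypothetical `db.query` helper: every `author` resolver independently fetches its own user, one query per node.

```typescript
interface Db {
  query(sql: string, params?: unknown[]): Promise<any[]>;
}

// Naive resolvers: each child resolver runs its own user query.
// With many posts and comments, that's one SELECT per author node
// on top of the posts and comments queries: the N+1 problem.
const naiveResolvers = {
  Query: {
    posts: (_: unknown, __: unknown, { db }: { db: Db }) =>
      db.query(`SELECT * FROM posts LIMIT 10`),
  },
  Post: {
    comments: (post: { id: string }, _: unknown, { db }: { db: Db }) =>
      db.query(`SELECT * FROM comments WHERE post_id = ?`, [post.id]),
    author: (post: { authorId: string }, _: unknown, { db }: { db: Db }) =>
      db.query(`SELECT * FROM users WHERE id = ?`, [post.authorId]),
  },
  Comment: {
    author: (comment: { authorId: string }, _: unknown, { db }: { db: Db }) =>
      db.query(`SELECT * FROM users WHERE id = ?`, [comment.authorId]),
  },
};
```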
You'll often see this called the N+1 problem, but that's only one class of problem. GraphQL makes it hard to optimize data loading in general, for example reusing data that was previously loaded!
The most common solution is to change the expectations we have of resolvers. Instead of requiring them to always return a value right away, we can let them say that they will *eventually* have a value, for example by returning a **Promise** (this could be any other future-result construct).
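In JavaScript terms that simply means a resolver may return a Promise instead of a plain value; a minimal sketch, with a hypothetical `findUserLater` lookup:

```typescript
interface User {
  id: string;
  name: string;
}

// Hypothetical async lookup; any future-returning data source works here.
declare function findUserLater(id: string): Promise<User>;

const Post = {
  // The resolver no longer returns a user right away: it returns a
  // Promise that will eventually resolve to one.
  author: (post: { authorId: string }): Promise<User> =>
    findUserLater(post.authorId),
};
```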
The next step to solve this problem is to introduce a new concept: **Loaders**. Loaders are quite simple: they take in things to be loaded through a `load` method, which gives us back a **promise**, and they load everything efficiently once we tell them we're ready.
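In JavaScript the reference implementation of this idea is the `dataloader` package. Here's a minimal sketch of a user loader, assuming a hypothetical `db.findUsersByIds` helper that fetches many users in one round trip:

```typescript
import DataLoader from "dataloader";

interface User {
  id: string;
  name: string;
}

// Hypothetical data-access helper: one round trip for many users.
declare const db: { findUsersByIds(ids: readonly string[]): Promise<User[]> };

// The loader collects every id passed to `load`, then calls this batch
// function once with all of them when it's time to actually load.
const userLoader = new DataLoader<string, User>(async (ids) => {
  const users = await db.findUsersByIds(ids);
  // DataLoader expects one result per key, in the same order as `ids`.
  const byId = new Map(users.map((u) => [u.id, u] as const));
  return ids.map((id) => byId.get(id) ?? new Error(`No user ${id}`));
});

// Each call returns a promise immediately; nothing has been fetched yet.
const a = userLoader.load("1");
const b = userLoader.load("2");
```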
Depending on the language and implementation, the GraphQL server can tell loaders when it's time to batch-load everything and then carry on with execution. Notice how the execution is no longer depth-first, and how "lazy" the loading becomes.
Because of this lazy approach, we get much closer to the ideal loading scenario we wanted: resolvers now always go through loaders to get data, and the loaders have all the information they need to make smart choices about data loading 🤘
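Tying it back to the hypothetical schema: with the `userLoader` from the previous snippet placed on the per-request context, both `author` resolvers just ask the loader, and all the requested users come back in a single batched query.

```typescript
// Both resolvers go through the same loader. When the pending loads are
// flushed, every requested user id is fetched in one batched query
// instead of one query per author node.
const resolvers = {
  Post: {
    author: (
      post: { authorId: string },
      _: unknown,
      ctx: { userLoader: typeof userLoader },
    ) => ctx.userLoader.load(post.authorId),
  },
  Comment: {
    author: (
      comment: { authorId: string },
      _: unknown,
      ctx: { userLoader: typeof userLoader },
    ) => ctx.userLoader.load(comment.authorId),
  },
};
```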
You can read the much more detailed version right here! medium.com/@__xuorig__/th…