1/19 Here’s thread 3 on the stylized rendering in "Dungeons of #Hinterberg". We'll look at our different “pipeline” stages, & this'll get a little bit icky. :)

If a mix of action-adventure and Persona-style social sim is your jam, follow us at @MicrobirdGames. #gamedev #indiedev
2/19 Thread 1 (quick recap coming up!) covered the basics of our deferred rendering and lighting system:
Thread 2 (not important for this one) covered how we do outlines:
3/19 To recap: we use deferred rendering in Unity's built-in pipeline with a custom gbuffer that stores material IDs, essentially creating a second material system that runs on the GPU. The lighting pass also gives us a texture with the light intensity each pixel received.
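For anyone who skipped thread 1, here's a minimal sketch of what the C# side of that can look like: a table of material properties uploaded to a StructuredBuffer, which shaders index by the material ID written into the gbuffer. The struct fields, component and buffer names here are illustrative, not our exact layout.

```csharp
using UnityEngine;

// Sketch of a GPU-side material table: each entry is indexed by the material ID
// the deferred shaders write into the gbuffer. Field layout must match the
// StructuredBuffer<MaterialProps> declared on the shader side.
[System.Serializable]
public struct MaterialProps
{
    public Vector4 outlineColor;      // rgb = color, a = width
    public Vector4 screentoneParams;  // e.g. density, angle, blend, enabled
    public float emissionBoost;
    public Vector3 pad;               // keep 16-byte alignment
}

public class MaterialTableUploader : MonoBehaviour
{
    public MaterialProps[] table;     // authored per material "slot"
    ComputeBuffer buffer;

    void OnEnable()
    {
        if (table == null || table.Length == 0)
            return;
        // 2 * 16 bytes (Vector4) + 4 (float) + 12 (Vector3) = 48 bytes per entry
        buffer = new ComputeBuffer(table.Length, 48);
        buffer.SetData(table);
        // Bound globally: deferred shaders, post-FX and VFX can all read
        // _MaterialTable[materialId].
        Shader.SetGlobalBuffer("_MaterialTable", buffer);
    }

    void OnDisable()
    {
        buffer?.Release();
    }
}
```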
4/19 So our post-FX know a lot about each pixel - its material properties and ID, and how much light it received. We can have effects like our screentones (turned up here) include/exclude specific materials or use different colors per material.
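On the C# side a built-in post effect is just an OnRenderImage blit; all the interesting per-pixel decisions happen in the shader, which samples the material-ID gbuffer and the light-intensity texture bound as globals. A hedged sketch, with made-up property names:

```csharp
using UnityEngine;

// Sketch of a post-effect driver. The screentone shader does the real work:
// it reads the material-ID gbuffer and the per-pixel light intensity texture
// (bound as global textures elsewhere) to include/exclude materials or pick
// per-material tone colors.
[RequireComponent(typeof(Camera))]
public class ScreentonePostFx : MonoBehaviour
{
    public Material screentoneMaterial;           // reads _MaterialIdBuffer / _LightIntensity (illustrative names)
    [Range(0f, 1f)] public float strength = 0.5f;

    void OnRenderImage(RenderTexture src, RenderTexture dst)
    {
        screentoneMaterial.SetFloat("_Strength", strength);
        Graphics.Blit(src, dst, screentoneMaterial);
    }
}
```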
5/19 But what if regular VFX - impact flashes, particle systems - had access to that information as well? What if we could have a few stages in our pipeline where the gbuffer from the previous stage is stored and any shader afterwards could access this info from the pixels below?
6/19 We could do things like drawing the enemy's gooey skin in two passes (second pass is offset along normals) and "simulating" the liquid fx with simple screen-space displacement that is applied before outlines are extracted…
7/19 Or we could take an otherwise pretty lackluster fire effect and turn it into something menacing by looking for the material ID of the enemy's face and giving those pixels different albedo / emission / section IDs.
8/19 Or have a deferred decal system that is only applied over specific materials!

And I can’t wait to use that stuff for our water shader. :)
9/19 Conceptually, these are the stages we wanted in our pipeline (a rough code sketch of how they map onto stacked cameras follows the list):
1.) Deferred Geometry
2.) Deferred Overlays 1
[Outline Extraction]
3.) Deferred Overlays 2
[Lighting Pass]
4.) Forward Transparent 1
[Apply Outlines]
5.) Forward Transparent 2
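To make the list above concrete, here's a minimal sketch of mapping the stages onto a stack of built-in cameras, ordered by depth, with later stages rendering on top of what's already there. The enum, component and exact settings are illustrative; the real setup involves more (render targets, culling, the grabs described next).

```csharp
using UnityEngine;

// Sketch: one camera per stage, drawn in order via Camera.depth, later stages
// rendering on top of what earlier stages produced.
public enum PipelineStage
{
    DeferredGeometry    = 0,
    DeferredOverlays1   = 1,  // outline extraction runs after this stage
    DeferredOverlays2   = 2,  // lighting pass runs after this stage
    ForwardTransparent1 = 3,  // outlines are composited after this stage
    ForwardTransparent2 = 4
}

[RequireComponent(typeof(Camera))]
public class PipelineStageCamera : MonoBehaviour
{
    public PipelineStage stage;

    void Awake()
    {
        var cam = GetComponent<Camera>();
        cam.depth = (int)stage;  // render order within the stack
        if (stage != PipelineStage.DeferredGeometry)
            cam.clearFlags = CameraClearFlags.Nothing;  // keep earlier stages' output
        cam.renderingPath = stage <= PipelineStage.DeferredOverlays2
            ? RenderingPath.DeferredShading
            : RenderingPath.Forward;
    }
}
```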
10/19 Between each of these stages we want to grab a copy of the gbuffer (after a deferred stage) or forward image buffer (after lighting) and hand it to the next stage. In built-in, we can do all of that with stacked cameras and CommandBuffers, but…
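A minimal sketch of the "grab a copy of the gbuffer" part, assuming one CommandBuffer per deferred stage camera (texture names and which gbuffer targets get copied are illustrative):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Sketch: after a deferred stage has written its gbuffer, copy the targets the
// next stage cares about into a persistent RenderTexture and expose it as a
// global, so shaders in the following stage can read "the pixels below".
[RequireComponent(typeof(Camera))]
public class GrabGBufferAfterStage : MonoBehaviour
{
    CommandBuffer cb;
    RenderTexture copy;

    void OnEnable()
    {
        var cam = GetComponent<Camera>();
        copy = new RenderTexture(cam.pixelWidth, cam.pixelHeight, 0, RenderTextureFormat.ARGB32);

        cb = new CommandBuffer { name = "Grab gbuffer for next stage" };
        cb.Blit(BuiltinRenderTextureType.GBuffer0, copy);   // albedo target here; copy whichever
        cb.SetGlobalTexture("_PrevStageGBuffer0", copy);    // targets (e.g. material IDs) you need

        cam.AddCommandBuffer(CameraEvent.AfterGBuffer, cb);
    }

    void OnDisable()
    {
        GetComponent<Camera>().RemoveCommandBuffer(CameraEvent.AfterGBuffer, cb);
        cb.Release();
        copy.Release();
    }
}
```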
11/19 ... how do we make sure Unity doesn't re-render every light for every deferred stage? Or re-render shadow maps? How do we exclude objects from being picked up by several cameras, if we need the layer system for other stuff (physics)?
12/19 In an ideal world, our game would probably be a good use case for a custom SRP. But we're a tiny studio, and rewriting the entire rendering pipeline is not within our means. And frankly, I have no idea whether the API and its docs are even stable enough by now to do something like that.
13/19 Plus, right now we can adapt 3rd-party shaders/post-FX (like @AmplifyCreates' AO) to our system pretty easily. A custom SRP would throw that out the window. So we need to stay on built-in.
14/19 So here's where things get icky:
How do we make sure each camera only picks up materials for its pipeline stage? Well, there's a SubShader tag called "RenderPipeline" that excludes subshaders from being drawn if they don't match a globally set pipeline name. docs.unity3d.com/ScriptReferenc…
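One way to drive that per stage, as a sketch: set the global pipeline name right before each stage camera renders. This assumes Shader.globalRenderPipeline still behaves this way in your Unity version, and the tag values are made up.

```csharp
using UnityEngine;

// Sketch: per-stage subshader filtering via the "RenderPipeline" tag.
// Shader side (sketch):
//   SubShader { Tags { "RenderPipeline" = "DeferredOverlays1" } ... }
[RequireComponent(typeof(Camera))]
public class StageShaderFilter : MonoBehaviour
{
    public string stageName = "DeferredOverlays1";

    void OnPreCull()
    {
        // Subshaders whose "RenderPipeline" tag doesn't match this string are
        // skipped while this camera renders.
        Shader.globalRenderPipeline = stageName;
    }
}
```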
15/19 And how do we keep Unity from re-rendering each light and its shadow maps for every deferred camera in the stack?
We, uh, hook into OnPreCull and turn off lights or shadows depending on which stage we're rendering.

Yeah, if you have a better solution, I want to hear it! :)
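In code, the hack is roughly this (which lights get toggled, and how, is illustrative):

```csharp
using UnityEngine;

// Sketch of the OnPreCull hack: lights (and their shadow map updates) are only
// active for the camera that actually runs the lighting pass; every other stage
// camera in the stack sees them switched off.
[RequireComponent(typeof(Camera))]
public class StageLightCulling : MonoBehaviour
{
    public bool runsLightingPass;   // true only on the camera that does the lighting
    public Light[] managedLights;   // the scene lights we control

    void OnPreCull()
    {
        foreach (var light in managedLights)
        {
            // Either drop the light entirely for this stage...
            light.enabled = runsLightingPass;
            // ...or keep it but skip its shadow map, e.g.:
            // light.shadows = runsLightingPass ? LightShadows.Soft : LightShadows.None;
        }
    }
}
```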
16/19 The best one I can think of is to get rid of Unity's lighting entirely and keep a list of lights on the GPU, and do the whole lighting in a single pass. That might entail rewriting the shadow mapping too. Either way, nothing I can fit into our schedule in the near future.
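A sketch of what "a list of lights on the GPU" could look like on the C# side. Struct layout, names and the brute-force gathering are all illustrative; a real version would cull lights, handle directional lights, shadows, etc.

```csharp
using UnityEngine;

// Sketch: gather light data once per frame and upload it to a StructuredBuffer,
// so a single lighting pass can loop over all lights instead of Unity rendering
// each light per deferred camera.
public struct GpuLight
{
    public Vector4 positionRange;   // xyz = position, w = range
    public Vector4 colorIntensity;  // rgb = color, a = intensity
}

public class LightListUploader : MonoBehaviour
{
    ComputeBuffer buffer;

    void LateUpdate()
    {
        var lights = FindObjectsOfType<Light>();
        Shader.SetGlobalInt("_SceneLightCount", lights.Length);
        if (lights.Length == 0)
            return;

        var data = new GpuLight[lights.Length];
        for (int i = 0; i < lights.Length; i++)
        {
            data[i].positionRange = lights[i].transform.position;   // Vector3 -> Vector4, w = 0
            data[i].positionRange.w = lights[i].range;
            data[i].colorIntensity = lights[i].color;                // Color -> Vector4
            data[i].colorIntensity.w = lights[i].intensity;
        }

        if (buffer == null || buffer.count < lights.Length)
        {
            buffer?.Release();
            buffer = new ComputeBuffer(lights.Length, 32);           // 2 * 16 bytes per entry
            Shader.SetGlobalBuffer("_SceneLights", buffer);
        }
        buffer.SetData(data);
    }

    void OnDestroy() => buffer?.Release();
}
```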
17/19 For now, our solution is more costly than it should be, but the unoptimized game runs at ~60 FPS on my 6-year-old desktop with a GTX 970. I expect we'll have to do better if we want to handle Switch, though.
18/19 To end on a slightly sad note: can you imagine what we could pull off if Unity had kept investing in built-in over the last few years? Studios like ours don't need to squeeze out every last bit of performance; we want the flexibility to render weird things. Built-in was perfect for that.
19/19 Anyway, if you enjoyed this, follow us @MicrobirdGames and/or subscribe to our newsletter at dungeonsofhinterberg.com (we respect your inbox - only major updates!)
#MadeWithUnity
