I implemented a few different approaches for reconstructing normals from a depth texture. No separate script dependencies, and it works for Unity's built-in render pipeline and URP, either as a post process or on world objects.
The lack of script dependencies comes from @_kzr's latest version of his inverse projection setup, which oddly still uses a script to set up an inverse view matrix Unity already provides by default. github.com/keijiro/DepthI…
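For reference, the script-free part boils down to reconstructing a view-space position from the depth texture using only values Unity already exposes to shaders. Here's a rough sketch of that idea for the built-in pipeline (my own illustration, not the code from the repo; viewSpacePosAtUV is a made-up name, and it assumes UnityCG.cginc is included):

// Reconstructs a view-space position from the camera depth texture using
// only built-in shader values (unity_CameraInvProjection, _ProjectionParams),
// so no C# script has to pass a matrix in.
UNITY_DECLARE_DEPTH_TEXTURE(_CameraDepthTexture);

float3 viewSpacePosAtUV(float2 uv)
{
    // View-space ray from the camera through this pixel to the far plane.
    float3 farPlaneRay = mul(unity_CameraInvProjection,
                             float4(uv * 2.0 - 1.0, 1.0, 1.0) * _ProjectionParams.z).xyz;

    // Raw, non-linear depth from the depth texture, converted to a 0..1
    // linear distance and used to scale the far plane ray.
    float rawDepth = SAMPLE_DEPTH_TEXTURE(_CameraDepthTexture, uv);
    return farPlaneRay * Linear01Depth(rawDepth);
}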
"Improved" method based on János Turánszki's Improved Normal Reconstruction with some minor optimizations. Works surprisingly well for getting rid of depth disparity artifacts, but adds others on interior edges. wickedengine.net/2019/09/22/imp…
"Accurate" method based on Yuwen Wu's Accurate Normal Reconstruction, altered (based on comments in that article) to work with Unity's existing non-linear depth texture. Only minorly more expensive than János's "improved" method. atyuwen.github.io/posts/normal-r…
It should be noted that all 4 options have different benefits and different problem cases.
If anyone is wondering why I didn't include a ddx/ddy based approach, that's because they're terrible and you should feel bad for suggesting it.
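For completeness, the rejected approach looks roughly like this: one depth sample and the hardware quad derivatives of the reconstructed position. It's cheap, but ddx/ddy straddle depth discontinuities and the result is hard-faceted, which is why it isn't one of the four options:

// The ddx/ddy approach: let the hardware 2x2 quad derivatives do the work.
// Cheap, but faceted, and wrong wherever the quad crosses a depth edge.
float3 naiveNormalAtUV(float2 uv)
{
    float3 viewPos = viewSpacePosAtUV(uv);
    // Cross product order / sign depends on view-space handedness.
    return normalize(cross(ddy(viewPos), ddx(viewPos)));
}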
I should also note that I don't exactly match @atyuwen's example. I calculate each derivative from the difference of the center position and one offset position, rather than the difference of two offset positions. I'm not sure whether the original approach is better.
Compared my "centered" modification to the original "offset" approach.
The original is definitely worse, so I have to assume it was a typo in the article. In the image below, a phantom ghost edge pops up.
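To make the distinction concrete, here's how I read the two variants for the horizontal derivative, written as small helpers. This is my reading of the article rather than verbatim code from either source: useLeft is the result of the extrapolation test above, and posL2 / posL1 / posCenter / posR1 / posR2 are the reconstructed view-space positions at the five horizontal taps.

// "Offset" derivative, as the article appears to describe it: built only
// from the two taps on the chosen side, never touching the center pixel.
float3 offsetDerivative(bool useLeft, float3 posL2, float3 posL1, float3 posR1, float3 posR2)
{
    return useLeft ? posL1 - posL2 : posR2 - posR1;
}

// "Centered" modification from this thread: difference of the center
// position and the single chosen offset position.
float3 centeredDerivative(bool useLeft, float3 posL1, float3 posCenter, float3 posR1)
{
    return useLeft ? posCenter - posL1 : posR1 - posCenter;
}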
I realized I was being dumb here. For a post process you do still need an external script, since the inverse view matrix I'm using isn't the same as the one used when rendering the shader on a mesh.
However, it might still work if I replace it with unity_CameraToWorld instead. I'll have to try.
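If that swap does work, it would presumably look something like this. This is untested speculation in the same spirit as the tweet: unity_CameraToWorld is a built-in matrix, but it follows a camera-looks-down-+z convention, unlike Unity's -z-forward view space, hence the flip.

// Speculative: transform a reconstructed view-space normal to world space in
// a post process using unity_CameraToWorld instead of the inverse view
// matrix. Untested, as noted above.
float3 viewNormalToWorld(float3 viewNormal)
{
    // unity_CameraToWorld treats +z as camera forward; view space uses -z.
    viewNormal.z = -viewNormal.z;
    return normalize(mul((float3x3)unity_CameraToWorld, viewNormal));
}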
Does anyone know if there's a proper name for anti-aliased pixel art scaling? I don't mean xBR / hq#x / MMPX. I mean the equivalent of integer scaling the pixel art and then downscaling via bilinear.
Maybe "super sampled nearest neighbor scaling"?
But it doesn't actually need to be super sampled. It feels like there should be a proper name for it; maybe there is and it's just been appropriated by what people call pixel art scaling algorithms. shadertoy.com/view/ltBfRD
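Whatever the name is, the usual implementation is a UV snap with a roughly one-screen-pixel ramp at each texel boundary, sampled with plain bilinear filtering. A rough sketch of that idea (not the linked shadertoy; _MainTex_TexelSize is the standard Unity texel-size vector):

float4 _MainTex_TexelSize; // standard Unity (1/w, 1/h, w, h)

// "Anti-aliased nearest neighbor": snap UVs to texel centers, but ramp
// across each texel boundary over about one screen pixel so bilinear
// filtering only blends at the seams.
float2 pixelArtUV(float2 uv)
{
    float2 texelCoord = uv * _MainTex_TexelSize.zw;

    // Nearest texel boundary (texel edges sit at integer coordinates).
    float2 seam = floor(texelCoord + 0.5);

    // Ramp over one screen pixel near the seam, clamped so that away from
    // the seam we land exactly on a texel center (an unfiltered sample).
    float2 ramp = clamp((texelCoord - seam) / fwidth(texelCoord), -0.5, 0.5);

    return (seam + ramp) * _MainTex_TexelSize.xy;
}

Feed that UV into a bilinear tex2D of the texture and you get crisp texels with smooth edges at any scale or rotation.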
@IRCSS @benoitvimont @Atrix256 This is the code I used for converting a dithered alpha into an explicit coverage mask (either SV_Coverage or gl_SampleMask).
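The code itself isn't quoted here, so as a stand-in, the basic idea looks something like the sketch below (assumes 4x MSAA and SV_Coverage; this is not the code referenced in the tweet):

// Sketch: convert alpha into an explicit MSAA coverage mask via SV_Coverage.
struct FragOut
{
    float4 color    : SV_Target;
    uint   coverage : SV_Coverage;
};

FragOut frag(float4 color : COLOR0)
{
    FragOut o;
    o.color = color;

    // Number of covered samples, 0..4 (assumes 4x MSAA).
    uint covered = (uint)round(saturate(color.a) * 4.0);

    // Set that many of the low sample bits: 0x0, 0x1, 0x3, 0x7, 0xF.
    o.coverage = (1u << covered) - 1u;
    return o;
}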
I want to rant a bit about the recent price & spec reveal of the Samsung Odyssey G9, why it is an impressive bit of tech ... and how Samsung is lying by omission.
Basic specs from their website (which all the press dutifully copy-paste everywhere):
1000R 49” 5120x1440 HDR1000 QLED 1ms 240Hz with G-Sync and FreeSync Premium Pro support. This has all the bells and whistles you could ever hope for. Except there's one more spec that's important.
Weird new @unity3d bug I think I found. For the last few months I've had strange bugs with my builds that no one else on my team experiences. Totally clean projects will fail to build when using Build & Run, succeed when using Build, but may crash at consistent points.
@unity3d I didn't really think much of it. The errors started happening on a single new platform that I was having other issues with, so I figured something in my dev environment for that platform was broken.
Just started working on a new platform ... and I have the same issues.
@unity3d Both platforms error when doing Build & Run, both errors come from somewhere deep inside Unity's C++ code, and they're different errors. However, both occur inside sound-related systems (one complained about the DSP not being available, the other that FMOD had no device).
We eventually laid out a guide with known-good versions of Linux and graphics drivers, but it didn't matter. Part of the allure of Linux is the customizability, so few people actually stuck to the guide, and many wanted to run the game on older hardware we didn't support.