Kostas Anagnostou
Lead Rendering Engineer @WeArePlayground working on @Fable. DMs open for graphics questions or mentoring people who want to get into the industry. Views my own.
Aug 30, 2021 4 tweets 2 min read
Managed to clear the DM backlog; the large majority involved how to start learning gfx programming and how to find a job in the industry. There's a lot of anxiety: graphics is complex by nature, and the amount of learning resources available nowadays is overwhelming. (1/4) However complex it may seem, the core of graphics programming hasn't changed, it's still about lighting and shading. Understanding and implementing a lighting model and materials will take you quite far to begin with, and you can gradually extend it as your knowledge increases. (2/4)
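A minimal sketch of the kind of starting-point lighting model meant here, assuming a single directional light with Lambert diffuse plus Blinn-Phong specular (plain C++ with a hand-rolled Vec3, names of my own; the same maths drops straight into a pixel shader):

#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };

static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3 normalize(Vec3 v) { float l = std::sqrt(dot(v, v)); return {v.x / l, v.y / l, v.z / l}; }

// albedo and specPower describe the material; N, L, V are the unit surface normal,
// light direction and view direction; lightColour is the light's intensity.
Vec3 shade(Vec3 albedo, float specPower, Vec3 N, Vec3 L, Vec3 V, Vec3 lightColour)
{
    float NdotL = std::max(dot(N, L), 0.0f);                     // Lambert diffuse term
    Vec3 H = normalize({L.x + V.x, L.y + V.y, L.z + V.z});       // half vector
    float spec = std::pow(std::max(dot(N, H), 0.0f), specPower); // Blinn-Phong specular lobe
    // Combine: (diffuse albedo + white specular) * light, masked by NdotL.
    return { (albedo.x * NdotL + spec * NdotL) * lightColour.x,
             (albedo.y * NdotL + spec * NdotL) * lightColour.y,
             (albedo.z * NdotL + spec * NdotL) * lightColour.z };
}

From there the specular term can be swapped for a microfacet BRDF, more lights added, then shadows, ambient/image-based lighting and so on, extending the model gradually.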
May 1, 2021 4 tweets 1 min read
In graphics programming we use a lot of awesome-sounding names for techniques, which often trigger fantastic mental imagery (as well as actual imagery). There are too many to list them all; my top 3 favourites, in no particular order, are probably: (1/4) 1) "Ambient occlusion": the percentage of rays cast from a point, over the hemisphere centred around the surface normal, that are not occluded by (do not collide with) geometry. A value of 0 means all rays collide, 1 means none does. (2/4)
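A rough sketch of that definition, assuming a hypothetical occluded() ray/scene query supplied by the caller (everything here is illustrative, not from the thread):

#include <cmath>
#include <functional>
#include <random>

struct Vec3 { float x, y, z; };

// p is the surface point, n the unit normal; occluded(p, d) is whatever
// ray/scene intersection test is available.
float ambientOcclusion(Vec3 p, Vec3 n, int numRays,
                       const std::function<bool(Vec3, Vec3)>& occluded)
{
    std::mt19937 rng(1234);
    std::uniform_real_distribution<float> u(-1.0f, 1.0f);
    int open = 0;
    for (int i = 0; i < numRays; ++i) {
        // Uniform direction: rejection-sample the unit sphere, then flip the
        // direction into the hemisphere around the normal.
        Vec3 d{u(rng), u(rng), u(rng)};
        float len2 = d.x * d.x + d.y * d.y + d.z * d.z;
        if (len2 > 1.0f || len2 < 1e-6f) { --i; continue; }
        float len = std::sqrt(len2);
        d = {d.x / len, d.y / len, d.z / len};
        if (d.x * n.x + d.y * n.y + d.z * n.z < 0.0f) d = {-d.x, -d.y, -d.z};
        if (!occluded(p, d)) ++open;
    }
    return float(open) / float(numRays); // 0: all rays collide, 1: none does
}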
Oct 4, 2020 4 tweets 2 min read
During my years in graphics there have been many great conference presentations, but also a few that I found "eye opening" and that changed the way I think about and approach gfx programming. My top 3, in no particular order, are probably (1/4): "Uncharted 2: HDR Lighting" from GDC 2010, slideshare.net/ozlael/hable-j… by @FilmicWorlds, a great introduction to linear lighting and its importance in graphics (2/4)
Aug 29, 2020 4 tweets 2 min read
People starting to learn graphics techniques and a graphics API to implement them may find the whole process intimidating. In such a case there is the option to use a rendering framework that hides the API complexity, and handles asset and resource management. (1/4) There are quite a few frameworks out there, for example:

bgfx: github.com/bkaradzic/bgfx
The Forge: github.com/ConfettiFX/The…
Falcor: github.com/NVIDIAGameWork…
Cauldron: github.com/GPUOpen-Librar… (2/4)
Aug 18, 2020 6 tweets 3 min read
Great question from DMs: "How bad are small triangles, really?" Let me count the ways:

1) When a triangle shrinks to around pixel size it may miss the pixel centre and not get rasterised at all, wasting all the work done during vertex shading docs.microsoft.com/en-us/windows/… (1/6) 2) Even if it does get rasterised, since the GPU shades pixels in 2x2 quads, any work done for pixels in the quad not covered by the triangle will be wasted, leading to quad overshading. blog.selfshadow.com/publications/o…, blog.selfshadow.com/2012/11/12/cou… (2/6)
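A toy illustration of point 2 with made-up numbers: a tiny triangle covering 3 pixels that straddle two 2x2 quads still launches 8 pixel shader invocations.

#include <cstdio>

int main()
{
    int coveredPixels     = 3;                // pixels the small triangle actually covers
    int touchedQuads      = 2;                // 2x2 quads those pixels fall into
    int shadedInvocations = touchedQuads * 4; // the GPU shades whole quads
    std::printf("shading efficiency: %.1f%%\n",
                100.0f * coveredPixels / shadedInvocations); // 37.5%, the rest is overshading
}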
Jun 20, 2020 4 tweets 2 min read
Good DM question: "is it better to dispatch 1 threadgroup with 100 threads or 100 groups with 1 thread in each?" The GPU will assign a threadgroup to a Compute Unit (or SM), and will batch its threads into wavefronts (64 threads, on AMD GCN) or warps (32 threads on NVidia). (1/4) Those wavefronts/warps are executed on the CU's SIMDs, 64/32 threads at a time (per clock), in lockstep. If you have only one thread in the threadgroup you will waste most of the wavefront/warp as they can't contain threads from different threadgroups. (2/4)
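A back-of-the-envelope sketch of why: waves can't mix threads from different threadgroups, so the group size alone decides how full each wave is (numbers below use a 32-wide warp, function names are just for illustration).

#include <cstdio>

// Waves/warps are filled only with threads from the same threadgroup.
int wavesPerGroup(int threadsPerGroup, int waveSize)
{
    return (threadsPerGroup + waveSize - 1) / waveSize;
}

float waveUtilisation(int threadsPerGroup, int waveSize)
{
    return float(threadsPerGroup) / float(wavesPerGroup(threadsPerGroup, waveSize) * waveSize);
}

int main()
{
    const int waveSize = 32; // 64 on AMD GCN
    // 1 group of 100 threads: 4 warps, 100 of 128 lanes busy.
    std::printf("1 group of 100 threads: %.0f%% lane utilisation\n",
                100.0f * waveUtilisation(100, waveSize));
    // 100 groups of 1 thread: 100 warps, 1 of 32 lanes busy in each.
    std::printf("100 groups of 1 thread: %.0f%% lane utilisation\n",
                100.0f * waveUtilisation(1, waveSize));
}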
Apr 5, 2020 6 tweets 3 min read
A mini Importance Sampling adventure: imagine a signal that we need to integrate (sum its samples) over its domain. It could for example be an environment map convolution for diffuse lighting (1/6). Capturing and processing many samples is expensive, so we often randomly select a few and sum only these. If we select which samples to use uniformly (with the same probability), though, we risk missing important features in the signal, e.g. areas with large radiance (2/6).
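A minimal 1D sketch of the idea, under assumptions of my own (a peaky analytic signal instead of an environment map): draw samples from a pdf that roughly follows the signal and weight each sample by 1/pdf, rather than sampling uniformly.

#include <cmath>
#include <cstdio>
#include <random>

int main()
{
    // Signal with a sharp peak near x = 1 (think: a small, bright region of an env map).
    auto f = [](double x) { return std::pow(x, 8.0); }; // exact integral over [0,1] is 1/9

    std::mt19937 rng(42);
    std::uniform_real_distribution<double> u01(0.0, 1.0);
    const int N = 256;

    double uniformSum = 0.0, importanceSum = 0.0;
    for (int i = 0; i < N; ++i) {
        // Uniform sampling: pdf = 1, many samples land where the signal is ~0.
        uniformSum += f(u01(rng));

        // Importance sampling: pdf p(x) = 9x^8 follows the signal; sample it by
        // inverting its CDF (x = u^(1/9)) and weight each sample by 1/p(x).
        double x = std::pow(u01(rng), 1.0 / 9.0);
        importanceSum += f(x) / (9.0 * std::pow(x, 8.0));
    }
    std::printf("uniform %.4f  importance %.4f  exact %.4f\n",
                uniformSum / N, importanceSum / N, 1.0 / 9.0);
}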
Jan 26, 2020 4 tweets 1 min read
Question from DMs: "So can the GPU automatically generate a Hi-Z pyramid?"

The confusion comes from a GPU feature often called HiZ (esp. on AMD GPUs): for every tile of pixels (say 4x4 or 8x8), the GPU stores a min and max depth value in a special buffer while rendering. (1/4) Every time a pixel tile belonging to the same triangle arrives, the GPU uses the min/max values in that buffer that correspond to the tile and compares them against the min/max depth values of the incoming pixel tile. (2/4)
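For contrast with the per-tile hardware HiZ above, here is a CPU-side sketch (assumed, not from the thread) of what one downsample step of a manually built Hi-Z pyramid does: each coarse texel keeps the max depth of a 2x2 footprint from the finer mip, so coarse mips can conservatively reject occluded geometry.

#include <algorithm>
#include <vector>

// prev is a w x h depth mip (row major); returns the (w/2) x (h/2) next mip.
std::vector<float> buildHiZMip(const std::vector<float>& prev, int w, int h)
{
    std::vector<float> mip((w / 2) * (h / 2));
    for (int y = 0; y < h / 2; ++y)
        for (int x = 0; x < w / 2; ++x) {
            float d00 = prev[(2 * y)     * w + (2 * x)];
            float d10 = prev[(2 * y)     * w + (2 * x + 1)];
            float d01 = prev[(2 * y + 1) * w + (2 * x)];
            float d11 = prev[(2 * y + 1) * w + (2 * x + 1)];
            // Max is the conservative choice for a depth buffer where larger means
            // farther; with reversed-Z you would take the min instead.
            mip[y * (w / 2) + x] = std::max({d00, d10, d01, d11});
        }
    return mip;
}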
Jul 8, 2019 11 tweets 2 min read
I got an interesting question via Twitter today: why would a single, high-polycount mesh (>5m tris) render slowly? Without knowing more about the platform and actual use, off the top of my head (thread - please add any potential reasons I have missed): The vertex buffer may be too fat in terms of data types and/or store more data than we need. "Smaller" formats (bytes, half, or even fixed point) may help.
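Purely as an illustration of that first point (layouts and format choices assumed, not from the thread): a "fat" 44-byte vertex versus a 16-byte packed one, cutting vertex fetch bandwidth to roughly a third.

#include <cstdint>

struct FatVertex {                 // 44 bytes per vertex
    float position[3];             // 12 bytes, full float
    float normal[3];               // 12
    float tangent[3];              // 12
    float uv[2];                   // 8
};

struct PackedVertex {              // 16 bytes per vertex
    std::uint32_t normal;          // 4: e.g. 10:10:10:2 packed normal
    std::uint16_t position[3];     // 6: fixed point / half, relative to the mesh bounds
    std::uint16_t uv[2];           // 4: half-precision UVs
    std::uint16_t tangentAngle;    // 2: tangent encoded as an angle around the normal
};

static_assert(sizeof(FatVertex) == 44, "fat layout as annotated");
static_assert(sizeof(PackedVertex) == 16, "packed layout as annotated");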
Mar 22, 2019 7 tweets 3 min read
Whether you are working on PBR, shadows, area lights or GI, it always helps to have a "ground truth" raytraced image for reference. If you can't make your own, Mitsuba is an easy-to-use pathtracer that can give good results. mitsuba-renderer.org (thread) You can control the number of light bounces to emulate a more "traditional" game environment, without an advanced GI solution, to focus on the shape of the shadow or the response of a material to a dynamic light.
Feb 12, 2019 4 tweets 2 min read
Sweet, the Raytraced Shadows chapter (developer.nvidia.com/books/raytraci…) references my blog post with the first experiments into hybrid raytracing (interplayoflight.wordpress.com/2018/07/04/hyb…). Still good as an introduction, lots of pretty pictures. Since then I tried non-programmer art, discovered SAH, full BVH tree creation and lots of hardware-specific improvements that sped up raytracing a lot. interplayoflight.wordpress.com/2018/09/04/hyb…
Feb 7, 2019 5 tweets 2 min read
A few interesting posts on the Eidos website (thread):

Deferred+: next-gen culling and rendering for Dawn Engine eidosmontreal.com/en/news/deferr…
Depth proxy transparency rendering eidosmontreal.com/en/news/depth-…
Nov 15, 2018 9 tweets 4 min read
Scalarisation and wave intrinsics (which allow direct intra-warp communication) are what all the cool kids seem to be doing nowadays; here's a list of resources on the topic. Wave Programming in D3D12 and Vulkan …82yhj72224m8j.wpengine.netdna-cdn.com/wp-content/upl…
Sep 24, 2018 16 tweets 5 min read
Someone at work asked me today where I find all those presentations about graphics techniques, which made me realise that it might not be such common knowledge to people just starting gfx programming. Thread of links. SIGGRAPH's Advances in Real-Time Rendering is an invaluable resource for cutting-edge rendering techniques advances.realtimerendering.com
Jun 28, 2018 8 tweets 4 min read
Programming with compute shaders (efficiently), balancing workloads against resources and thinking in parallel, gives many opportunities to learn how GPUs really work (well, pretty close at least). A few links to get you started. (1/N) A very gentle intro to CUDA devblogs.nvidia.com/even-easier-in…, devblogs.nvidia.com/easy-introduct…, and then a great course on Udacity, "Intro to Parallel Programming" eu.udacity.com/course/intro-t… (2/N)
Jun 7, 2018 5 tweets 2 min read
A common theme in the questions I have received so far is that beginners feel intimidated by graphics programming and do not know how to start. They need not be, though, as graphics programming can be approached in different ways and at many levels of complexity (1/5). One can start by exploring shadertoy.com or something similar, which allows you to try basic pixel shader programming with instant feedback. Lots of good introductory tutorials online (gamedevelopment.tutsplus.com/tutorials/a-be…) (2/5).
Jun 3, 2018 7 tweets 2 min read
Early prototyping work for a previous game. This is combining volumetric lightshafts and low fog in one screenspace pass [image]. This is experimenting with screen space snow accumulation [image].
Jul 28, 2017 6 tweets 2 min read
Where do I start graphics programming? yosoygames.com.ar/wp/2017/07/whe… Great links to learn more about graphics, although not all of them are aimed at getting started.