So, looking at the UE5 demo, some thoughts...
The new GI system doesn't use lightmaps and is fully realtime... if you wait long enough. Looking again at the demo, it's obvious that the GI takes a few frames to adapt:
It's not eye adaptation, because the effect would be the opposite: less light = eyes open wider to get more light. So it would go from dark to bright.
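For reference, here is a minimal sketch of how a typical auto-exposure loop works (my own simplification, not Epic's code): exposure drifts toward a target derived from the average scene luminance, so a dark scene slowly gets *brighter*, the opposite of what the demo shows.

```cpp
#include <algorithm>
#include <cmath>

// Hypothetical auto-exposure update, run once per frame.
// In a dark scene the target exposure goes UP, so the image
// brightens over time -- the opposite of the GI lag in the demo.
float UpdateExposure(float currentExposure, float avgSceneLuminance, float deltaTime)
{
    const float keyValue = 0.18f;        // mid-grey target
    const float adaptationSpeed = 1.5f;  // adaptation rate (1/s), made up for the example

    float targetExposure = keyValue / std::max(avgSceneLuminance, 1e-4f);

    // Exponential drift toward the target, frame-rate independent.
    float t = 1.0f - std::exp(-adaptationSpeed * deltaTime);
    return currentExposure + (targetExposure - currentExposure) * t;
}
```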
The water simulation they presented is odd. Wave dispersion seems OK, but the height of the waves seems very high, or the reflections are weird. Also, no particles for splashes. It feels a bit cheap and old.
The high density of polygons they showcase is likely based on the recent technology Nvidia showed, named "Mesh Shaders", which allows streaming a lot more LODs of geometry. The PS5 was announced with a similar technology.
This means heavy meshes will likely be processed in advance, editor side, and then streamed during gameplay/rendering without visible pops/seams.
More details here: devblogs.nvidia.com/introduction-t…
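To make the idea concrete, here is a toy sketch of cluster-based LOD selection (entirely my own illustration, not Nvidia's or Epic's actual code): the mesh is split offline into small clusters, each with precomputed LODs, and at runtime you stream in the coarsest LOD whose projected error stays under a pixel.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <vector>

// Hypothetical offline-built cluster of triangles ("meshlet"),
// stored with several precomputed LODs. lods[0] is the finest.
struct MeshletLOD { uint32_t triangleCount; float geometricError; }; // error in world units
struct Meshlet    { float boundsCenter[3]; float boundsRadius; std::vector<MeshletLOD> lods; };

// Pick the coarsest LOD whose projected error stays under one pixel.
// 'distance' is camera-to-cluster distance, 'projScale' converts a
// world-space size at that distance into pixels.
uint32_t SelectLOD(const Meshlet& m, float distance, float projScale)
{
    for (uint32_t i = static_cast<uint32_t>(m.lods.size()); i-- > 0;)
    {
        float errorPixels = m.lods[i].geometricError * projScale / std::max(distance, 1e-3f);
        if (errorPixels < 1.0f)
            return i; // coarsest acceptable LOD -> less geometry to stream
    }
    return 0; // fall back to the most detailed LOD
}
```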
"You will be able to use 8K textures thanks to Virtual Texturing !"
Great, but I seriously doubt game content will be seen at that level. 8K is crazy, and mip-maps will stream off details very quickly.
Even 4K textures today are very rarely seen at 100% of their texel ratio.
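A back-of-the-envelope illustration of why (my own sketch): the hardware mip level grows with the log2 of the texel-to-pixel ratio, so an 8K texture only shows its full resolution when roughly one texel covers one pixel on screen.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <initializer_list>

// Approximate hardware mip selection: log2 of how many texels
// of the texture cover one screen pixel along the densest axis.
float MipLevel(float texelsPerPixel)
{
    return std::max(0.0f, std::log2(texelsPerPixel));
}

int main()
{
    const float size = 8192.0f; // an "8K" texture
    for (float ratio : {1.0f, 4.0f, 16.0f})
    {
        float mip = MipLevel(ratio);
        // At 4 texels per pixel the 8K texture already drops to
        // mip 2, i.e. an effective 2048px (2K) texture.
        std::printf("%5.0f texels/pixel -> mip %.0f (%.0fpx effective)\n",
                    ratio, mip, size / std::pow(2.0f, mip));
    }
}
```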
This will be more useful for animation/VFX productions, which render the final image at very high resolutions, rather than for games. Don't forget the time it takes to produce this kind of content as well (unless you have tools using procedurally generated details ;] ).
Happy to see new features for animation, with the contextual situations but also the foot warping and predictive placement. This will make interaction with the environment much easier and will help ground the characters.
Of course, this could already be done in UE4, but having access to such tools natively will make them easier to integrate. Animation blending today is still a difficult topic imho. I wonder if Epic is also looking into systems such as motion matching:
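For context, motion matching boils down to a nearest-neighbour search: every frame of a big animation database is described by a feature vector (foot positions, hip velocity, future trajectory...), and each game frame you jump to the database pose whose features best match the current character state. A minimal sketch (all names are mine):

```cpp
#include <cstddef>
#include <vector>

// Hypothetical per-frame features extracted from an animation database:
// e.g. foot positions, hip velocity, sampled future trajectory.
struct PoseFeatures { std::vector<float> values; };

// Brute-force motion matching: return the database frame whose
// features are closest to the current character state.
std::size_t FindBestFrame(const std::vector<PoseFeatures>& database,
                          const PoseFeatures& current)
{
    std::size_t best = 0;
    float bestCost = 1e30f;
    for (std::size_t i = 0; i < database.size(); ++i)
    {
        float cost = 0.0f;
        for (std::size_t j = 0; j < current.values.size(); ++j)
        {
            float d = database[i].values[j] - current.values[j];
            cost += d * d; // squared distance; real systems weight each feature
        }
        if (cost < bestCost) { bestCost = cost; best = i; }
    }
    return best; // blend toward this frame instead of playing canned clips
}
```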
I really like the fact that the bugs are handled via Niagara. The new particle system proves to be really versatile. Seeing examples like this makes me wonder how many things we will be able to simulate. :)
This statue comes "directly" from ZBrush, with 33 million triangles. I guess it still needs UVs, because you can't store all the material properties in vertex colors only.
Maybe they use material layering as well for this asset? It could make the texturing process easier. With the fidelity of the high polycount, the texture work can be less intensive. So you could just blend generic materials via masks.
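A minimal sketch of what I mean by mask-based layering (my own simplification): generic tiling materials blended per pixel by a low-resolution mask, so the asset itself needs very little unique texture data.

```cpp
// Minimal material sample; a real layering system would also
// carry normals, metalness, etc.
struct MaterialSample { float baseColor[3]; float roughness; };

// Blend two generic tiling materials with a painted/baked mask.
// Stacking more layers just repeats this lerp.
MaterialSample BlendLayers(const MaterialSample& a, const MaterialSample& b, float mask)
{
    MaterialSample out;
    for (int i = 0; i < 3; ++i)
        out.baseColor[i] = a.baseColor[i] + (b.baseColor[i] - a.baseColor[i]) * mask;
    out.roughness = a.roughness + (b.roughness - a.roughness) * mask;
    return out;
}
```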
Tools will need to evolve to make texturing such big assets easier: better handling of big textures, faster/better UV unwrapping tools. Maybe automatic UVs will be enough, because we will be able to compensate with bigger textures. That would be a big waste however.
With such high polycounts, the main bottleneck will now be the production time allowed for an asset, and less the performance of the asset itself. At least that is what seems to be promised.
It won't be possible to scale this level of detail (similar to VFX productions) to a full game without some kind of automation at some point. In VFX you usually focus on a single scene, with specific points of view. Games have free cameras.
This part of the demo is very impressive: it showcases how fast the renderer can stream high levels of geometry from the system, while running a physics simulation (baked via Alembic, or realtime?) at the same time:
No raytracing in the demo however. The GI system doesn't use it, shadow borders seem harsh and don't widen based on distance, and there are no shiny/mirror reflections.
Outside of the technical aspects (which promise great things I'm really looking forward to), the demo also shows a really nice artistic direction.
They have a great use of indirect lighting to light the different areas:
That's it! :)
Overall great demo, I love seeing this kind of stuff because it is always inspiring. I recommend watching the demo on vimeo to get the best quality: vimeo.com/417882964
Indeed, looking back at it you can see that the first light change is pure GI if you keep your eyes on the end of the tunnel. This is great, it means dynamic GI is not limited to the sun light. :)
Not RTX since there is no raytracing, but it could be some kind of voxel-based information (SVOGI maybe?). The multi-frame interpolation behaviour means they update the info in one place and then propagate it. A dense light-probe grid is another possibility.
Maybe something in-between. :)
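Whatever the exact structure, the multi-frame behaviour hints at an amortized update: refresh the lighting at a few points each frame and let the change spread over the following frames. A toy sketch of the idea with a probe grid (entirely my speculation, not Epic's system):

```cpp
#include <cstddef>
#include <vector>

// Toy amortized probe update: only a slice of the grid is refreshed
// each frame, which is why a sudden light change would take a few
// frames to fully settle, like in the demo.
struct Probe { float irradiance; float target; };

void UpdateProbes(std::vector<Probe>& probes, std::size_t frameIndex,
                  std::size_t probesPerFrame)
{
    if (probes.empty()) return;
    std::size_t start = (frameIndex * probesPerFrame) % probes.size();
    for (std::size_t n = 0; n < probesPerFrame; ++n)
    {
        Probe& p = probes[(start + n) % probes.size()];
        // Re-gather lighting for this probe ('target' stands in for
        // the gather here) and blend toward it over several frames.
        p.irradiance += (p.target - p.irradiance) * 0.5f;
    }
}
```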
Some games will continue to use high-to-low poly workflows, like stylized games. The main reason is the scaling from one platform to another. Automated tools will never beat handcrafted assets on that point.
If this Mesh Shader system runs on PS5, it means the AMD GPU handles it, so Xbox and AMD GPUs on PC will support it too at some point.
Indeed!
It also means the difference in look between a cinematic and the actual gameplay will be even smaller, thanks to the shared assets and quality level.
I wonder if tri-planar UVs could be enough in this case, given the nature of the materials/surfaces (noise/dust). This way you could paint on the asset in-engine directly.
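For reference, tri-planar mapping projects the texture along the three world axes and blends by the surface normal, so the mesh needs no UVs at all. A minimal sketch (the texture fetch is stubbed):

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Stub for a texture fetch; a real shader would sample a 2D texture.
static Vec3 SampleTexture(float u, float v) { return {u, v, 0.0f}; }

// Tri-planar mapping: sample along the three world axes and blend
// by the absolute normal, so the surface needs no UV unwrap.
Vec3 TriPlanar(const Vec3& worldPos, const Vec3& normal, float tiling)
{
    float wx = std::fabs(normal.x), wy = std::fabs(normal.y), wz = std::fabs(normal.z);
    float sum = wx + wy + wz + 1e-6f;
    wx /= sum; wy /= sum; wz /= sum; // normalize blend weights

    Vec3 sx = SampleTexture(worldPos.y * tiling, worldPos.z * tiling); // X projection
    Vec3 sy = SampleTexture(worldPos.x * tiling, worldPos.z * tiling); // Y projection
    Vec3 sz = SampleTexture(worldPos.x * tiling, worldPos.y * tiling); // Z projection

    return { sx.x * wx + sy.x * wy + sz.x * wz,
             sx.y * wx + sy.y * wy + sz.y * wz,
             sx.z * wx + sy.z * wy + sz.z * wz };
}
```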
In this case, it could be possible to enable soft vs hard shadows thanks to the pre-computed distance field. UE4 already supports that for its directional light. It has some limitations however.
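The classic trick behind distance-field soft shadows (this is the general technique, not necessarily Epic's exact implementation): sphere-trace the field toward the light and darken by how closely the ray grazes geometry relative to the distance travelled, so the penumbra widens naturally with distance.

```cpp
#include <algorithm>
#include <cmath>

// Any signed distance field: returns distance to the nearest surface.
// Toy scene here: a sphere of radius 1 at the origin.
float SceneSDF(float px, float py, float pz)
{
    return std::sqrt(px * px + py * py + pz * pz) - 1.0f;
}

// Sphere-traced soft shadow: 'k' controls penumbra sharpness.
// The closer the ray grazes geometry (small d at large t), the
// darker the result, and the penumbra widens with distance.
float SoftShadow(float ox, float oy, float oz,   // ray origin
                 float dx, float dy, float dz,   // direction to the light
                 float maxDist, float k)
{
    float shadow = 1.0f;
    for (float t = 0.05f; t < maxDist;)
    {
        float d = SceneSDF(ox + dx * t, oy + dy * t, oz + dz * t);
        if (d < 1e-4f) return 0.0f;            // fully occluded
        shadow = std::min(shadow, k * d / t);  // penumbra estimate
        t += d;                                // sphere tracing step
    }
    return shadow;
}
```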
Indeed there is some eye adaptation! I didn't mention it since it wasn't something "new" to the engine. UE4 4.25 introduced a big revamp of this system recently, by the way.
Feel free to send me more questions! :)
Yes!
That's why software like Keyshot/Marmoset Toolbag is popular. I think artists want to reduce the time between iterations as much as possible, and tools like that help a lot.
A good example of that is the work of Tyler Smith, which is made in UE4: artstation.com/tsmith3d
They could be using tessellation, but I doubt it. 🤔
Tessellation has a lot of caveats (like UV cracks and performance cost). Also, the polygon view they showed doesn't have the typical tessellation patterns.
Too many missing variables, such as:
- Original asset polycount
- The storage format of the micropolygon data in the engine
- Allocated memory for the demo
- Etc.
To guess, you would need to know the input and the context; we have neither.
Polycount doesn't matter much if the tri-planar computation is done in the pixel shader. Also, GPUs are good at parallelization, so accessing the vertex data to compute the blending mask should remain fast.
(Almost) nobody uses Ptex.
Any workflow that bases the texture data on the vertices is unpopular, because you lose everything as soon as the mesh changes. Computing LODs (or similar) becomes complicated (you need to transfer the data), etc.
It's some kind of temporal effect, yes, but not on the final image. Otherwise there would be some kind of ghosting or blurriness/loss of detail. In the native video the image remains sharp.
I won't speculate further; Epic Games should announce/present more technical details tomorrow. It's better to wait than to guess at this point. :)