Reading impressions of FFVII Remake online, I think I have come to the conclusion that I cannot inherently trust subjective reports AND that PC benchmarking perhaps needs to use captured-frame analysis rather than relying on Windows-based tools:
If you start up FFVIIR in its default mode, the first level has incredible stutter. Every effect displayed for the first time is accompanied by a long stutter. It has to be that way; there is no other possibility. It is just that some people do not subjectively perceive it.
Yet if you look at benchmarks online, like those from GameGPU, they report nothing of this in the GPU bench numbers. Curiously, the PC Gamer article which quotes John and me mentions they loaded up the game and saw none of the stuttering in the opening.
The first-load experience of a game, the first time you go to an area, is the most important, as it represents the clean playthrough of how someone actually experiences a game. Benchmark runs made after shaders are already cached, or runs that do not describe fluidity, do not capture this.
Reloading an area to check how it runs after the fact will not show the issues that occur the first time a player enters it, e.g. the game's intro. Similarly, Windows-based tools do not exactly capture a game's output. Only post-analysis of captured output frames does that 100%.
Regarding people subjectively not seeing stutter: I think some people are inured to it and have been trained through experience to accept it as just how things are. FFVIIR also has hit-freeze as a design choice, which may make people assume that the real stutters are intentional.
In the end I say these things because I want PC games to be the most fluid they can possibly be THE FIRST TIME YOU EXPERIENCE them. I really wish developers stressed this in their own testing: capturing the screen output of a first run and analysing it for smoothness.
Internal metrics like average fps, 1% lows, or even minimum framerate do not represent smoothness in the undeniable way a frametime graph built from captured frames does.
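To make that concrete, here is a minimal sketch (my own illustration, not any specific tool's code) of the kind of post-analysis I mean: recover per-frame presentation timestamps from captured video, derive frametimes, and flag the hitches that summary stats can bury.

```cpp
// Minimal sketch (my own illustration, not any specific tool's code): given
// presentation timestamps recovered from captured video frames, derive
// per-frame frametimes and flag the hitches that summary stats can bury.
#include <algorithm>
#include <cstdio>
#include <vector>

int main() {
    // Hypothetical timestamps in milliseconds, recovered from a capture:
    // a steady 60 fps run with one ~133 ms shader-compilation hitch.
    std::vector<double> ts = {0.0, 16.7, 33.4, 50.1, 183.0, 199.7, 216.4};

    std::vector<double> ft;
    for (size_t i = 1; i < ts.size(); ++i) ft.push_back(ts[i] - ts[i - 1]);

    double total = 0.0;
    for (double f : ft) total += f;
    printf("average fps: %.1f\n", 1000.0 * ft.size() / total);

    // An average cannot say whether its cost came as one long hitch or as
    // uniformly slow frames; the per-frame trace can. Flag spikes > 3x median.
    std::vector<double> sorted = ft;
    std::sort(sorted.begin(), sorted.end());
    double median = sorted[sorted.size() / 2];
    for (size_t i = 0; i < ft.size(); ++i)
        if (ft[i] > 3.0 * median)
            printf("stutter spike at frame %zu: %.1f ms\n", i + 1, ft[i]);
    return 0;
}
```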
Every executive/technical officer at Microsoft should watch this video. Summarized: Windows/DX12's current state is an active hindrance to responsive, smooth gaming on PC. Unlike under Windows/DX12, games on Bazzite do not suffer from intrusive shader compilation stutter. (1/4)
DX12 and Windows' default state is to leave competent shader compilation to devs, who regularly fail to do it as well as they should. That means games on PC can be hitchy, jittery messes. Microsoft and the DX team need to embrace that reality and have a mitigation strategy. (2/4)
Bazzite avoids shader compilation stutter thanks to Vulkan shader playback from Valve's Fossilize. Why on earth is there no Microsoft-made equivalent in the Windows/DX12 environment? Ten years of DX12, and its default experience without hefty dev work is "runs like junk". (3/4)
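For anyone unfamiliar with what a fix even looks like, here is a minimal sketch of the underlying mechanism (illustrative only; Fossilize itself records and replays full pipeline state system-wide, which is what DX12 has no built-in equivalent of): persist a Vulkan pipeline cache across runs so pipelines compile up front, not mid-gameplay.

```cpp
// Sketch: persist a Vulkan pipeline cache to disk so the next run can create
// its pipelines from cached blobs instead of compiling them at draw time.
#include <vulkan/vulkan.h>
#include <fstream>
#include <iterator>
#include <vector>

VkPipelineCache loadPipelineCache(VkDevice device, const char* path) {
    // Read last run's cache blob from disk, if present.
    std::ifstream in(path, std::ios::binary);
    std::vector<char> blob((std::istreambuf_iterator<char>(in)),
                           std::istreambuf_iterator<char>());

    VkPipelineCacheCreateInfo info{VK_STRUCTURE_TYPE_PIPELINE_CACHE_CREATE_INFO};
    info.initialDataSize = blob.size();
    info.pInitialData = blob.empty() ? nullptr : blob.data();

    VkPipelineCache cache = VK_NULL_HANDLE;
    vkCreatePipelineCache(device, &info, nullptr, &cache);
    return cache; // pass to every vkCreateGraphicsPipelines call
}

void savePipelineCache(VkDevice device, VkPipelineCache cache, const char* path) {
    // Serialize the warmed cache at shutdown so the next run skips compilation.
    size_t size = 0;
    vkGetPipelineCacheData(device, cache, &size, nullptr);
    std::vector<char> blob(size);
    vkGetPipelineCacheData(device, cache, &size, blob.data());
    std::ofstream(path, std::ios::binary).write(blob.data(), size);
}
```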
We all have our preferences, but one of mine in video game graphics that I could never shake is thanks to Doom 3.
When id Tech 4 was shown off at Macworld 2001 running on a GeForce 3, I was struck by Carmack saying "...the final unification of lighting and shadowing across all surfaces in a game..."
With game objects, moving and static, treated equally for lighting and shadowing, there were no longer obvious visual discontinuities in the game world. This was huge for me then and thereafter, as I had always known something was wrong but could not put my finger on it before.
I could always visually see the seams in a game due to how objects looked: "Oh! That object is about to move, it looks different." Video games had the problem old cartoons have, e.g. the boulder Wile E. Coyote is next to looks completely different from the surrounding rocks.
Perhaps an open discussion on reasonable usage of VRAM for visual return is coming. Talking with @HeyItsThatDon about Diablo 4 and reading the @ComputerBase article, I am scratching my head. These are the textures you need to use to get a stutter-free experience on a 12 GB GPU: (1/5)
IMO, textures like those in the screenshot above from ComputerBase and the one below from @HeyItsThatDon make me wonder why these are what you need to settle for on a 12 GB GPU. They honestly look lower-res than games from a decade ago! Surely we can expect better... (2/5)
As a contrast, here are screenshots from The Witcher 3 Complete Edition at a Diablo 4 camera angle, where each screenshot is taken at 8 GB of VRAM or less at 4K (no RT): much higher-fidelity textures here in TW3 with more reasonable VRAM utilisation. (3/5)
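Some back-of-envelope math on why this is surprising (my own figures, not from either article): a fully mipped 4096x4096 BC7 texture costs roughly 21 MB, so several hundred of them fit in a 12 GB budget.

```cpp
// Back-of-envelope VRAM math (my own illustration, not figures from either
// article): BC7 stores 1 byte per texel (16 bytes per 4x4 block), and a full
// mip chain adds roughly one third on top of the base level.
#include <cstdio>

int main() {
    const double bytesPerTexel = 1.0;          // BC7 block compression
    const double base = 4096.0 * 4096.0 * bytesPerTexel;
    const double withMips = base * 4.0 / 3.0;  // geometric series of mips
    printf("one 4K BC7 texture + mips: %.1f MB\n", withMips / (1024 * 1024));
    printf("textures per GB of VRAM:   %.0f\n", (1024.0 * 1024 * 1024) / withMips);
    return 0;
}
```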
Working more on the Checkerboard rendering vs. DLSS 2.0 video today for Death Stranding, and I think it is time to write an open letter about DLSS 2.0 as a quality-of-life feature on PC that devs should consider if they use TAA already. Devs please read! 🙂 (1/11 Thread)
TAA has been a great compromise between quality and speed, since graphics engines are full of specular aliasing and aliasing in motion, and MSAA and SSAA are too expensive or complex to integrate. Single-frame post-process AA (FXAA/SMAA) is just too unstable over time. (2/11)
With DLSS 2.0 we are essentially looking at a more perfect TAA, one that does not show the same characteristic full-frame faults that the usual accumulation TAA has, or even two-frame TAA like SMAA T2x has. It is doing temporal reprojection and rejection so much better that (3/11)
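For readers who have never implemented one, here is a scalar sketch of what "reprojection and rejection" means in a generic accumulation TAA (my illustration, not DLSS's actual algorithm; DLSS 2.0 replaces these hand-tuned heuristics with a learned model, which is why it fails less).

```cpp
// Scalar sketch of generic accumulation TAA: reproject history via motion
// vectors, reject mismatches by clamping to the current neighborhood, blend.
#include <algorithm>

struct Color { float r, g, b; };

Color taaResolve(const Color* current, const Color* history,
                 const float* motionX, const float* motionY,
                 int w, int h, int x, int y) {
    // 1. Reprojection: fetch where this pixel was last frame.
    int px = std::clamp(x - int(motionX[y * w + x]), 0, w - 1);
    int py = std::clamp(y - int(motionY[y * w + x]), 0, h - 1);
    Color hist = history[py * w + px];

    // 2. Rejection: clamp history to the min/max of the current frame's 3x3
    //    neighborhood, so stale or disoccluded history cannot ghost.
    Color lo{1e9f, 1e9f, 1e9f}, hi{-1e9f, -1e9f, -1e9f};
    for (int dy = -1; dy <= 1; ++dy)
        for (int dx = -1; dx <= 1; ++dx) {
            int nx = std::clamp(x + dx, 0, w - 1);
            int ny = std::clamp(y + dy, 0, h - 1);
            Color c = current[ny * w + nx];
            lo = {std::min(lo.r, c.r), std::min(lo.g, c.g), std::min(lo.b, c.b)};
            hi = {std::max(hi.r, c.r), std::max(hi.g, c.g), std::max(hi.b, c.b)};
        }
    hist = {std::clamp(hist.r, lo.r, hi.r),
            std::clamp(hist.g, lo.g, hi.g),
            std::clamp(hist.b, lo.b, hi.b)};

    // 3. Blend: mostly history for temporal stability, a little current frame.
    const float alpha = 0.1f;
    Color cur = current[y * w + x];
    return {cur.r * alpha + hist.r * (1 - alpha),
            cur.g * alpha + hist.g * (1 - alpha),
            cur.b * alpha + hist.b * (1 - alpha)};
}
```

The clamp step is exactly where standard TAA's characteristic faults come from: clamp too hard and you get shimmer and reintroduced aliasing, too soft and you get ghosting.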