Fortnite on PS5 draws a full 100W more power idling in the lobby than during gameplay. 🤨
If anything, I would've expected the reverse.
But even then, I would've expected the difference between idle and gameplay to be substantially smaller than that. Single-player vs. split-screen co-op makes a negligible difference in power consumption.
Jul 16, 2018 • 15 tweets • 3 min read
Gather 'round kids, story time.
I was at ATI (now AMD) back in 2004-2007. At the time ATI had a reputation for poor OpenGL drivers; NVIDIA was ahead in both performance and features. So ATI decided to rewrite the OpenGL driver from scratch. A terrible decision.
Technically, I'm sure the reasons seemed sound. I didn't write drivers, so I don't know firsthand, but I bet the architecture was dated and painful to work with. There were issues with code sharing across platforms, leading to Linux-only bugs that had nothing to do with OS integration.