I managed to get simple 3D physics working in the prototype for my WebXR game for #js13k. Best part is, I'm not using any extra libs! It's all vanilla JS, clocking in at 9 KB right now. The hand mesh alone is 3 KB so I'm sure there's plenty of room for optimization, too.
More #js13k progress! I added toon shading, a few textures, and replaced the hand mesh with a paw which has fewer vertices. All of this still fits in around 9 KB zipped!
Have you ever wanted to be Godzilla, wreaking havoc in a city?
I added rooftops to each building type and a road texture, but really, this tweet is about the fire breath.
Tonight I learned a few things about blending and animating UVs. I feel like I'm discovering the rendering techniques from the early 2000s—and I'm absolutely in love.
Here are a few work-in-progress shots from back when I was using different colors to tell the building fires apart from the fire breath. It's somewhere between the Warcraft 3 poison effect and The Matrix :) #js13k
For the fire sprites I'm using additive blending, and it was enlightening to learn about why different blending functions are needed, and when to use which one.
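For the curious, here's roughly what the two setups look like in raw WebGL. This is a minimal sketch, not necessarily the exact calls my code makes:

```js
gl.enable(gl.BLEND);

// "Standard" alpha blending: the source color is weighted by its alpha,
// the destination by the remainder. Good for translucent surfaces,
// but the draw order matters.
gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA);

// Additive blending: the source color is simply added on top of what's
// already in the framebuffer. Overlapping sprites brighten each other,
// which is exactly what fire and glow want.
gl.blendFunc(gl.SRC_ALPHA, gl.ONE);

// Particles typically also disable depth *writes* so they don't occlude
// each other, while still being depth-*tested* against solid geometry.
gl.depthMask(false);
```

A nice side effect of additive blending is that the draw order of the particles among themselves stops mattering, since addition is commutative.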
Lots of updates to my #js13k #webxr entry about the zen of destruction:
I've been experimenting with Pistol Whip-style gameplay for my #js13k #webxr entry today. To be honest, I'm not sure I like it. I'll try to tweak it a bit, but without a good soundtrack this might be a dead end.
Build size: 12.7 KB (yay optimizations!)
I came to gamedev for games, I stayed for the odd bugs. Here are the buildings moving in the direction they're facing rather than towards the player.
BUILDING POPCORN MACHINE!
I didn't post updates in this thread during the last week of the competition because I was stressed out by the deadline. I did, however, take notes and work-in-progress screen captures which I'd like to share now that the competition is over. #js13k
I was pleasantly surprised by the JavaScript and WebGL performance on the Oculus Quest. With a simple frustum-culling system which skips unnecessary draw calls, ROAR runs at a smooth 72 FPS.
The game is GPU-bound, and the cost of the extra in-frustum check is negligible. Here's a typical frame profile: sys_render takes 8 ms (83% of the frame), while the CPU-bound systems like sys_transform, sys_collide, and sys_cull add up to less than 1 ms!
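If you're curious how to collect per-system timings like these, plain performance.now() is enough. A minimal sketch; the run_system helper is made up for illustration, not ROAR's actual code:

```js
// Hypothetical helper: wrap each system call and record how long it took.
const timings = {};

function run_system(name, system, game, delta) {
    const start = performance.now();
    system(game, delta);
    timings[name] = performance.now() - start;
}

// In the frame loop:
run_system("sys_transform", sys_transform, game, delta);
run_system("sys_collide", sys_collide, game, delta);
run_system("sys_cull", sys_cull, game, delta);
run_system("sys_render", sys_render, game, delta);
```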
It shouldn't be surprising that so much time is spent rendering. To start, the headset has to render everything twice, once for each eye!
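This is baked into the WebXR API itself: each frame hands you one view per eye, and you run the full render pass for each of them. A simplified sketch of the standard frame loop; ref_space and render_scene are placeholders:

```js
// Standard WebXR frame loop: render the scene once per view (eye).
function on_frame(time, frame) {
    const session = frame.session;
    session.requestAnimationFrame(on_frame);

    const pose = frame.getViewerPose(ref_space);
    if (!pose) return;

    const layer = session.renderState.baseLayer;
    gl.bindFramebuffer(gl.FRAMEBUFFER, layer.framebuffer);
    gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);

    for (const view of pose.views) {
        const vp = layer.getViewport(view);
        gl.viewport(vp.x, vp.y, vp.width, vp.height);
        // Draw the whole scene with this eye's matrices.
        render_scene(view.projectionMatrix, view.transform.inverse.matrix);
    }
}
```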
Let me walk you through some numbers and measurements which made me realize I needed frustum culling.
The scene typically consists of 64 buildings, each made of 5 cubes on average. That's ca. 300 draw calls and over 10,000 vertices just for the environment. There are also another ~50 draw calls for the hands and other props.
Each building cube has a dormant particle emitter which activates when the building is set on fire. A control system slows the fire down and finally puts it out after ~20 seconds to limit the number of particles on the screen.
An emitter is additionally limited to at most 200 particles. Still, if you go crazy and set the entire city on fire, the GPU would need to render ca. 40,000 particles, each drawn as a textured point with additive blending.
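The control system itself can be as simple as a timer scaling the emission rate down to zero. A rough sketch with made-up names; ROAR's actual components differ:

```js
const FIRE_LIFETIME = 20; // Seconds until a fire goes out completely.

// Hypothetical control system, run every frame over all burning emitters.
function sys_control_fire(game, delta) {
    for (const emitter of game.emitters) {
        if (!emitter.active) continue;
        emitter.age += delta;

        // Linearly slow the spawning of new particles over the lifetime...
        const t = Math.min(emitter.age / FIRE_LIFETIME, 1);
        emitter.frequency = emitter.base_frequency * (1 - t);

        // ...and put the fire out completely at the end.
        if (t === 1) {
            emitter.active = false;
            emitter.age = 0;
        }
    }
}
```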
Keep in mind that each emitter is its own draw call, too. In the extreme case, that's almost 500 draw calls and up to 50,000 vertices per frame. This was a problem for performance.
When I tested it, I saw around 50-60 FPS with a reasonable number of fires started (still pretty good!), and down to 15 FPS in the extreme case when the whole city was burning.
(At this point I considered solving this through game design rather than through optimizations. E.g. I could have made the fire breath require some kind of "fuel" which would be in limited quantity. Breathing fire is fun, though, so I decided to try a technical solution.)
The first iteration of the culling system turned off the Render and EmitParticles components for entities whose positions fell outside the camera's frustum, tested in normalized device coordinates (NDC). This was enough to get the number of draw calls down to around 150 per frame.
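The core check boils down to projecting each entity's position into clip space and testing it against the NDC cube. Roughly, with illustrative names; transform_point is a hypothetical matrix helper:

```js
// Illustrative sketch: disable rendering and particle emission for
// entities whose position falls outside the camera's frustum.
// `pv` is the camera's premultiplied projection * view matrix.
function sys_cull(game) {
    for (const entity of game.entities) {
        // transform_point: multiply [x, y, z, 1] by a 4x4 matrix.
        const [x, y, z, w] = transform_point(game.camera.pv, entity.position);

        // After the perspective divide, visible points satisfy
        // -1 <= x/w, y/w, z/w <= 1; with w > 0 this is |x| <= w, etc.
        const visible =
            w > 0 && Math.abs(x) <= w && Math.abs(y) <= w && Math.abs(z) <= w;

        entity.render_enabled = visible;
        entity.emit_particles_enabled = visible;
    }
}
```

Testing against w directly avoids the division entirely, which keeps the per-entity cost tiny.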
Oculus docs recommend ~50-100 draw calls and ~50,000-100,000 triangles or vertices per frame. I suspect that ROAR got away with more because it only has two materials and changes them only once per frame. developer.oculus.com/documentation/…
(All textured objects, almost all of which are cubes, are drawn first, and then in a second pass all particles are rendered. This happens to work great for blending: translucent particles are drawn on top of all the textured objects.)
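Sketched out, a frame is then two passes with different GL state. This is simplified; the material and draw helpers are placeholders:

```js
// Pass 1: opaque, textured geometry. Depth test and depth writes on.
gl.disable(gl.BLEND);
gl.depthMask(true);
gl.useProgram(textured_material.program);
for (const mesh of textured_meshes) draw(mesh);

// Pass 2: translucent particles, drawn on top of the opaque scene.
// Additive blending; depth writes off so particles don't hide each other,
// but depth testing still on so buildings occlude them correctly.
gl.enable(gl.BLEND);
gl.blendFunc(gl.SRC_ALPHA, gl.ONE);
gl.depthMask(false);
gl.useProgram(particle_material.program);
for (const emitter of active_emitters) draw_points(emitter);
```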
The story of the culling optimization is not over, though. 150 draw calls per frame would be good enough if I could guarantee there couldn't ever be more.
Spoiler: there could be more.
At first, the culling only applied to entities behind the player or far off to the sides, outside their peripheral vision. This was good enough because at the time the player couldn't really move far from the center of the scene.
Once I implemented locomotion, however, it became possible to move away from the center of the scene, turn around, and see ALL the buildings. I was back at 300+ draw calls per frame! The rendering performance was bad again.
The solution was to use the oldest trick in 3D programming: the fog! I also decoupled the camera's far distance from the fog distance so that the missiles launched from far away are still rendered.
Thanks to the fog, I can turn off rendering of buildings fairly close to the player which would otherwise be in plain sight. The number of draw calls is now usually well under 100!
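The fog itself is just a few lines in the fragment shader: mix the lit color toward the fog color with distance, and anything past the fog distance can safely be skipped on the CPU. A sketch with made-up uniform and helper names:

```js
// GLSL fragment shader excerpt, embedded as a JS string.
const fragment_fog = `
    uniform vec4 fog_color;
    uniform float fog_distance;
    varying float vert_distance; // Distance from the camera, per vertex.

    vec4 apply_fog(vec4 color) {
        // 0 at the camera, 1 at fog_distance and beyond.
        float fog = clamp(vert_distance / fog_distance, 0.0, 1.0);
        return mix(color, fog_color, fog);
    }
`;

// CPU side: anything farther than the fog distance is invisible anyway.
// (`vec3_distance` is a hypothetical helper.)
entity.render_enabled =
    vec3_distance(entity.position, camera.position) <= fog_distance;
```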
The culling system still isn't perfect: it only considers objects' positions rather than their bounding boxes. Due to this, you can sometimes see objects at the edges of the screen disappear too early. Cf. these floating buildings:
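A cheap fix would be to cull against a bounding sphere instead of a point, e.g. with the classic Gribb-Hartmann frustum plane extraction. A sketch of the idea, not something ROAR actually does:

```js
// Extract the six frustum planes from a projection * view matrix.
// `m` is a flat, column-major Float32Array (WebGL convention).
// Each plane is [a, b, c, d] with ax + by + cz + d = 0.
function frustum_planes(m) {
    return [
        [m[3] + m[0], m[7] + m[4], m[11] + m[8],  m[15] + m[12]], // left
        [m[3] - m[0], m[7] - m[4], m[11] - m[8],  m[15] - m[12]], // right
        [m[3] + m[1], m[7] + m[5], m[11] + m[9],  m[15] + m[13]], // bottom
        [m[3] - m[1], m[7] - m[5], m[11] - m[9],  m[15] - m[13]], // top
        [m[3] + m[2], m[7] + m[6], m[11] + m[10], m[15] + m[14]], // near
        [m[3] - m[2], m[7] - m[6], m[11] - m[10], m[15] - m[14]], // far
    ];
}

// The sphere is visible unless it's fully behind any one plane.
function sphere_visible(planes, center, radius) {
    for (const [a, b, c, d] of planes) {
        // Normalize so the signed distance is in world units.
        const len = Math.hypot(a, b, c);
        const dist = (a * center[0] + b * center[1] + c * center[2] + d) / len;
        if (dist < -radius) return false;
    }
    return true;
}
```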
To sum up: I was able to achieve very good performance on the Oculus Quest thanks to two optimizations:
1. a culling system combined with the fog, and
2. a control system for fire which slows down the release of new particles and puts the fire out completely after some time.
PS. I used the OVR Metrics Tool to monitor the performance on the Oculus Quest during most of the development. It displays an overlay with performance measurements on the screen; it's super helpful!
Two weeks ago over Easter I started working on a submission to the @Gamedevjs Jam. The theme was 'Raw,' and I wanted to create a processing pipeline simulator. This is how Super Simple Salad Simulator came to be. #gamedevjs
It's my homage to The Incredible Machine from 1993, and also to the vegetable salad, sałatka jarzynowa in Polish. It's a springtime treat in Poland. The full list of ingredients includes potato, carrot, green peas, apple, pickled cucumber, onion, boiled egg, and mayonnaise.
When researching it for this project, I learned that the salad apparently started as a completely different recipe with many kinds of meat in it. Depending on where you are, you may know it as the French, the Italian, or the Russian salad. en.wikipedia.org/wiki/Olivier_s…
You Are The Snooze Button is a submission to Ludum Dare 50; the theme was 'Delay the inevitable.' I entered the Compo category, which meant the game had to be made solo, in less than 48 hours, and with original assets only.
ESCAPE is a 2.5D puzzle platformer in 13 KB of HTML and JS (zipped) by @michalbe and me. It's a short story about wildlife and nature on Earth after mankind leaves for good. You can play it right now in the browser, on desktop and mobile.
In ESCAPE, you solve environmental puzzles as you progress through a misty world abandoned by people. We were inspired by games like @Playdead's INSIDE and movies like @bladerunner, and we tried to convey a sense of loneliness, remorse, and confusion.
ESCAPE fits into a 13 KB zip, but we built it using fully-featured tools. All our assets, props, and scenes are built in Blender and imported into the game as glTF files. They take up a large part of our size budget, leaving even less space for the engine code and game logic.
It's August again which means @michalbe and I are taking part in @js13kGames! This year's theme is Space.
We're joining a week late with a new idea: a 2.5D puzzle platformer on Earth abandoned by humans and reclaimed by nature. #js13k
When the theme was announced over a week ago, we spent a few hours on Saturday and Sunday brainstorming game ideas. And to be frank, we didn't like a single one of them. They just didn't click.
Most of them weren't even game ideas, just plays on words that could double as game titles, but nothing more. There was RequiesCAT in Space (probably a game about an astronaut cat), Spaciba (probably a game about cosmonauts), Space Bar (probably a game about managing an inn?)...
The jam's only restriction is that the game must be playable in a web browser. It's a refreshing change from the size-constrained jams I typically take part in; I'm not used to not having to worry about code and asset size.
Part 2 of this experiment starts right now! Today, I plan to implement projectile, wall and pickup collisions. I'll generate the terrain procedurally, and I'll add a simple UI.
With this set of features, I want to build an MVP version of the game: loading a map, advancing through it, killing enemies and picking up items, and finally finding the exit.
The idea behind this milestone is twofold:
1. Test and evaluate the core gameplay loop. Show it to other users and gather early feedback.
2. (If this were a compo game) Have something that can be submitted at any time, in case I can't finish the game for any reason.