@Tocelot DALL·E, Midjourney, or Stable Diffusion generate exquisite images. However, it’s challenging to reconstruct specific subjects while maintaining high fidelity to their key visual features. Even with dozens of detailed iterations, the outcome is rarely consistent. Prompting is daunting…
Meet #Dreambooth. I fine-tuned SD 1.5 using a subset of 20 images and defining the corresponding class.
Once the model was trained (1 hr), I generated 160 weapons (swords, axes, daggers…), with a consistent design, in less than 10 minutes (I could have created 10x more):
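The batch step above (160 consistent variations in minutes) can be sketched in Python. This is a hypothetical illustration, not the author's actual script: the "sks" instance token, the weapon/material/style lists, and the checkpoint path are all assumptions; only the overall approach (one rare token anchoring every prompt to the fine-tuned concept) reflects how DreamBooth-style generation is typically batched.

```python
# Hypothetical sketch: batch-building prompts for a DreamBooth-fine-tuned
# Stable Diffusion 1.5 model. The "sks" token and all lists below are
# illustrative assumptions, not the author's real setup.
from itertools import product

WEAPON_TYPES = ["sword", "axe", "dagger", "mace", "spear", "bow", "shield", "hammer"]
MATERIALS = ["iron", "bronze", "obsidian", "gilded", "crystal"]
STYLES = ["ornate", "rune-etched", "battle-worn", "ceremonial"]

def build_prompts(types=WEAPON_TYPES, materials=MATERIALS, styles=STYLES):
    """Compose one prompt per (type, material, style) combination, all
    anchored on the same rare instance token so the fine-tuned concept
    stays consistent across every output."""
    return [
        f"a sks weapon, {style} {material} {wtype}, game prop, white background"
        for wtype, material, style in product(types, materials, styles)
    ]

prompts = build_prompts()
# 8 types x 5 materials x 4 styles = 160 prompts, matching the thread's count.

# Inference would then look roughly like this (needs a GPU and the
# fine-tuned weights; the path is hypothetical):
# from diffusers import StableDiffusionPipeline
# pipe = StableDiffusionPipeline.from_pretrained("./dreambooth-sd15-weapons").to("cuda")
# images = [pipe(p).images[0] for p in prompts]
```

Keeping the instance token identical across prompts while varying only the descriptors is what preserves the shared design language across the whole set.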
HMU if you want to see some of the high-def files. @photoroom_app was used to automatically remove the background on all images (kudos @matthieurouif & team, amazing tool and UX).
While 2D game props may seem like an easy example, the consistency of the outcome is impressive. Other ongoing tests (on styles & characters) are showing similar results.
Can’t wait to see game companies, game artists, and communities adopting this at scale.
Adding a few great recent tweets/threads on Gaming and/or Generative AI:
A special thanks to @hervenivon for his help with training data.
Also, @SylvioD, @clivedownie, @shay_i_am, @albn feel free to take a look at this, and what comes next! 🚀 There's enormous value to capture, for larger actors in the gaming space.
• • •
I've listened to hundreds of podcasts and interviews about the Metaverse, but this one honestly stands out as one of the best so far. In just 30 min, @AdamDraper explains why NFTs matter and why they will be a key element of the #Metaverse
And because I really enjoyed it, I extracted some key quotes from the transcript. But you should listen to the video too! And learn about NFTs, the Metaverse, DAOs, identity in the metaverse, and why you want to be part of new technologies that seem like “jokes” at first.
“Scarcity on the internet is gonna be a big deal. People are saying that we're in an NFT bubble. I think people are negative with hope, they hope that it pops so that they can buy in”.
So last weekend, I went down into the Catacombs of Paris. If you're not familiar, it's an insane anthill-like network of 200 miles of galleries and chambers, centuries old, 60 ft beneath the surface. @NewYorker wrote about it in 2019: newyorker.com/news/dispatch/…. Here's a short story.
1/12
The goal was to 3D scan as many places as possible, using only the iPhone 12 Pro/LiDAR (and two powerful LED lights). And so we spent 12 hrs underground, walking (or crawling) more than 20 miles, and scanning 30 different places using @Scaniverse, @PolycamAI, and @Sitescape.
2/12
All scans were uploaded to @Sketchfab and the whole series is available here: sketchfab.com/edemaistre/col…. Interestingly, I was able to process all the scans immediately on the device, down below (without service). A cool way to demonstrate the power of LiDAR scanning with the iPhone.