Pushed #gen2 to its limits on the first day & made a short.
AI animation will never be the same! As a filmmaker, I am deeply impressed by what is possible now and by where we already are in terms of quality.
It will only get better from here on. 😍🎞️🙏 Thanks @runwayml 🥳 #aianimation
"Taste of Duality" was made within the first day of my beta access to gen-2.
I started with image + text inputs, but soon discovered that text-only prompting gives more repeatable and refined outputs. Using --seed & slight text changes felt almost like directing! 🎦 #runwayml
With text-only prompting, it felt like not being too specific about what you want gives better results. So instead of defining every aspect of your vision, try to describe it in a more general, open way, so the AI can step in with its own creative realisation.
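For example (hypothetical prompts, not the exact ones from the film): instead of "a 35mm close-up of a woman with green eyes in a red coat on a rainy street at night under a neon sign", try something looser like "cinematic close-up, rainy neon-lit street at night" — then keep the seed fixed (e.g. --seed 42) and change only a word or two between generations to nudge the shot while keeping its overall look.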
I started a new YouTube channel just for my artwork creations.
Please give it some ❤️ and subscribe!
Also follow my art journey on
👉@visiblemaker
Thank you! 🙏
Pushed #gen2 again & made a movie trailer. #aicinema is finally here!
Every shot was made from text prompts, except one iconic shot you all know, which was done with #gen1
Made possible by @runwayml. @bazluhrmann, your movies have been a great inspiration! 😍🎞️🙏
Voices: @elevenlabsio #aianimation
Some stats about the AI tools used:
☑️500+ shots generated with #gen2 beta to get 65 shots that made it into the movie
☑️5000 credits used to generate 3 custom voices with @elevenlabsio that fit my taste in timbre and likeness
☑️Initial idea by me, script co-created with #Chatgpt
Some stats about the film editing process:
☑️Music: the most important part for me! Some shots inspired me to find 2 tracks; I put them together with the voices first, imagining the film only in my head
☑️Pace, timing, narration - all of that was done with the soundtrack first.
👇