Tú Michael 🇻🇳
✍ Ex-banking guy turned full-time freelancer 🌟 Founder of KTC - private hub for global freelancers ⏩ Want in? Fill the survey 👇
Nov 18 9 tweets 4 min read
[Day 8] Most people completely FAIL when trying to create an epic kaiju fight

Not because AI is weak, but because their pre-production is nonexistent.

If you want a cinematic monster battle that feels like IMAX… you need a process.

Here’s how to build one.

👇 Thread

1) What is a Kaiju Fight Scene?

Kaiju means “giant monster.”
A kaiju fight scene is not just two creatures trading blows.

It’s:
• scale
• shockwaves
• environmental destruction
• impossible camera angles
• energy and chaos

A real kaiju battle feels like a natural disaster with a narrative.
Nov 17 13 tweets 3 min read
[Day 7] Most people play with AI.
A few direct it.

Here’s what separates AI toy users from AI production artists - the people who ship projects, earn client trust, and build careers.

A thread inspired by @CoffeeVector, with extra tips I learned by making “Dreamsplice” 👇

1️⃣ Models are instruments, not factories

Treat models like cinematographers, each with a unique lens and motion rhythm.

Build a “model roster” doc listing what each one does best (camera motion, lighting, faces, color).

When a shot fails, swap the artist, not the idea.

Keep a few “fallback” models that trade visual beauty for reliability. Use them when deadlines are tight.
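
If it helps to see the roster idea concretely, here is a minimal sketch of one kept as plain data. The model names, strengths, and the pick_model helper are placeholders invented for illustration; swap in whatever models you actually use.

```python
# Minimal sketch of a "model roster" kept as structured data.
# All model names and strengths below are hypothetical placeholders.
ROSTER = {
    "model_a": {"best_at": ["camera motion", "lighting"], "reliable": True},
    "model_b": {"best_at": ["faces", "color"], "reliable": False},
    "model_c": {"best_at": ["faces"], "reliable": True},  # plainer but dependable fallback
}

def pick_model(needs, deadline_is_tight=False):
    """Return the roster model that covers the most of a shot's needs.

    When the deadline is tight, only reliable fallback models are considered.
    """
    candidates = {
        name: info for name, info in ROSTER.items()
        if info["reliable"] or not deadline_is_tight
    }
    return max(candidates, key=lambda n: len(set(needs) & set(candidates[n]["best_at"])))

print(pick_model(["faces", "color"]))                          # model_b
print(pick_model(["faces", "color"], deadline_is_tight=True))  # model_c
```

Because the roster lives outside any one project, swapping the artist on a failed shot becomes a lookup instead of a rewrite.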
Nov 14 8 tweets 4 min read
[Day 5] How to Use "The 7 Pre-Production Systems Cycle" Templates (Beginner Guide)

Many people DM me asking for a step-by-step guide to using "The 7 Pre-Production Systems Cycle" templates. So here it is.

These templates act like forms.
You fill in the blanks → the AI uses your answers → your images stay consistent.

You don’t need technical skills. Just describe what you imagine.

Quick note:
If you don't have those templates yet, make sure to read Day 4. I will take them down soon.

I will share more exclusive templates that save you tons of hours if these posts do well.

So make sure to bookmark and repost. Thank you. 😍

Step 1 — Pick the Template You Need

Each template has a purpose:

Character DNA → for people

Vehicle DNA → for cars/ships/mechs

Environment DNA → for locations

Scene DNA → for full cinematic moments

Lighting & Color → for mood logic

FX & Atmosphere → for rain/fog/glitch style

Lore Continuity → for story rules

Choose the one that matches what you want to create first.
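
To make “fill in the blanks” concrete, here is a rough sketch of how a filled Character DNA sheet could be folded into a reusable prompt block. The field names and the example character are my own assumptions for illustration; the actual Day 4 templates may use different fields.

```python
# Hypothetical Character DNA sheet. Field names are assumptions, not the
# actual Day 4 template; the idea is that this block is prepended to every
# prompt featuring the character, so their look stays locked.
character_dna = {
    "name": "Riku",
    "age": "19 years old",
    "hair": "short black hair, messy fringe",
    "eyes": "amber eyes",
    "outfit": "worn grey hoodie over a dark flight suit",
    "build": "lean, 175 cm",
}

def dna_to_prompt(dna):
    """Join the filled-in fields into one consistent prompt block."""
    return ", ".join(f"{key}: {value}" for key, value in dna.items())

# Only the per-shot description changes; the locked block stays identical.
shot = "standing on a rain-soaked rooftop at night, low-angle cinematic shot"
print(f"{dna_to_prompt(character_dna)}, {shot}")
```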
Nov 14 15 tweets 9 min read
[Day 4] The 7 Pre-Production Systems Cycle

The Harsh Truth: AI Artists Aren’t Failing at Art. They’re Failing at Pre-Production.

Most AI artists fail before they even start.

Not because their skills are bad, but because their pre-production is nonexistent.

If you want consistent characters, coherent worlds and cinematic anime scenes, you need systems… not vibes.

Here’s the AI Image-Generation Super Prompt I built for “Dreamsplice: Hollow Code”. I use 7 pre-production systems to lock character, vehicle, and world consistency (see Day 3 for the background).

I called it "The 7 Pre-Production Systems Cycle".

Steal it. Use it with Grok Imagine, Midjourney, Leonardo, or any anime model.

👇 Thread

The 7 pre-production systems form a foundational framework that stabilizes your entire anime universe before any image, scene, or animation is generated. They define what your world looks like, how it behaves, and why it feels consistent no matter which AI model, prompt style, or creative direction you use.

What it is:
A structured set of creative blueprints that guide character design, vehicles, environments, lighting, FX, and narrative rules.

Why it matters:
It prevents visual drift, style inconsistencies, and accidental contradictions. It keeps your creative identity stable even when experimenting with different tools or workflows.

How to use it:
Start by filling the templates with simple descriptions. Treat each system as a “rule sheet” that locks one dimension of your world. When all seven are complete, every image you generate will align with those rules automatically.
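
As a loose illustration of “when all seven are complete”, here is how the seven rule sheets could be concatenated into one stable base that every shot prompt starts from. The sheet contents below are placeholder one-liners, not the real Dreamsplice systems.

```python
# Placeholder one-line summaries of the seven systems; the real sheets are
# far more detailed. The point is only the assembly step: every shot prompt
# reuses the same prefix, so the world rules travel with it.
systems = {
    "character_dna": "Riku: 19, short black hair, amber eyes, grey hoodie",
    "vehicle_dna": "hover bike: matte black hull, twin blue thrusters",
    "environment_dna": "neon megacity, perpetual rain, layered skybridges",
    "scene_dna": "wide establishing shots, slow push-ins on dialogue",
    "lighting_color": "teal and magenta palette, hard rim light",
    "fx_atmosphere": "light fog, glitch artifacts on flashback cuts",
    "lore_continuity": "no daylight scenes, tech is pre-holographic",
}

def base_prompt(sheets):
    """Fold all completed rule sheets into one stable prompt prefix."""
    return " | ".join(sheets.values())

print(base_prompt(systems) + " | Riku repairs the hover bike in a back alley")
```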
Nov 13 15 tweets 5 min read
[Day 3] 7 Pre-Production Systems for AI Anime Creators to Lock in Character Consistency Before Generation.

So Your Protagonist Looks the Same in Every Scene and Your Story Finally Feels Cohesive (With a sneak peek of a SPECIAL PROMPT) 🧵

When I started making Dreamsplice, my first AI anime short,

Riku kept changing faces.

In one scene, he looked 19.

In the next, he looked 35.

Yuna’s hair changed length mid-dialogue.

That’s when I learned: AI doesn’t ruin consistency. You do—if you skip pre-production.
Nov 12 13 tweets 4 min read
[Day 2] The BIGGEST misconception of AI animation.

I won 3rd place in a Japanese AI film contest with my first anime.

One-man studio. No funding. Just obsession.

Now I’m making my second one to join a contest by @morphic and @0xFramer

Watch the full video below and read on 🧵

Most people think AI animation is “type a prompt → get a video.”

That’s wrong.

Good AI animation feels alive — because it’s written, directed, and edited like cinema.

AI only extends your reach; it doesn’t replace your craft.
Nov 11 13 tweets 5 min read
🧵 [Day 1] The 5-Step ANIME ENGINE Framework: How I Built a Full AI Anime Short for the Morphic x Framer Contest

I just joined the @morphic x @0xFramer contest — and made a full anime short "Dreamsplice" using AI.

Here’s the 5-Step ANIME ENGINE Framework I used to go from idea → full short film in 2 DAYS

(A main thread to update later. Bookmark this.)

The anime industry is about to explode.

By 2026, AI animation tools will go mainstream.
This means:

1 freelancer can do what used to take 30 people.

Indie creators can build their own studios.

A full 2D anime short can be done in weeks — not months.