Like, if you watched yesterday's stream, I did the absolute bare minimum of organization and wasted a lot of time clicking on clips trying to find the thing I was looking for, way more time than it would have taken to do proper labels, tags, or sub-clips.
Spend more time up-front on organization to save time in the actual edit.
Also, on the general subject of "learn the shortcuts": editing suites have *hundreds* of keyboard shortcuts, which is kinda overwhelming, but there's a core cluster (tools, shuttle controls, and operators) that aren't shortcuts so much as they're core controls.
Like, learning your way around shift/control/alt is pretty fundamental to using a full-suite NLE, on par with learning WASD controls if you want to play a PC shooter. It's just the way it be.
The general con is that they're all incredibly complicated with their own internal thought processes, assumptions, and vocabulary. There's a lot of differences ranging from minor to huge, but they're all mostly trying to achieve the same outcomes.
The biggest argument I'd make is that the process of editing is *so* complicated that it's actually worth skipping iMovie/WMM entirely and going straight to Resolve/Premiere/FCPX.
Basically if you're starting from zero where all these words are new and the UI is completely alien, it's really not that much harder to learn your way around a full suite.
Blender *has* a video editor, but technically so does Photoshop. It's very much bare functionality that's more for the convenience of Blender artists than a tool you would otherwise use for video editing.
This is where it comes down to what you want to do. If you already know Blender, are comfortable with it, and that's where a lot of your footage is coming from, it works. But I wouldn't recommend someone otherwise learn Blender specifically for use as an NLE.
Work at 24 or 30fps and let Resolve/Premiere/FCP handle the conversions. There's a lot of potential asterisks on this question, but assuming you've just got a mix of camcorder, cell phone, and webcam footage, just toss it all in a 30fps timeline.
That's just broad advice assuming you don't have some other, specific outcome in mind and just don't want your video to look weird.
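To make that concrete, here's a rough sketch of what "let the NLE handle the conversion" means for 24fps footage on a 30fps timeline. The function name is mine, and real NLEs use fancier cadences and optical-flow retiming options; this is just the naive repeat-frames math.

```python
def pulldown_map(src_fps: float, timeline_fps: float, seconds: int = 1):
    """Map each timeline frame index to the source frame shown there."""
    n_timeline = int(timeline_fps * seconds)
    return [int(i * src_fps / timeline_fps) for i in range(n_timeline)]

# One second of 24fps footage in a 30fps timeline: 30 timeline frames
# drawn from 24 source frames, so 6 source frames get held twice.
frames = pulldown_map(24, 30)
print(frames[:10])
print(f"{len(set(frames))} unique source frames across {len(frames)} timeline frames")
```

That doubling is why a 24-in-30 conform looks slightly stuttery on pans but perfectly fine for most talking-head footage.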
Expect to spend a day setting up keybindings and/or learning new ones. Resolve combines the slip and ripple tools into a single trim tool whose function changes depending on where you position the cursor. Adjusting to a non-modular UI. Lots of tiny things.
Founder's advantage. Enterprise-scale support for both hardware and software. Networking systems for integrating dozens of workstations into a single unified project file. Basically a ton of stuff that a solo, self-taught editor would never think about.
There aren't any, but OpenShot and Shotcut are probably the least-unusable. Avidemux is so bad that it ignores I-frame integrity, which gives it some niche functionality if you want to break video files on purpose.
Open source UX is just too consistently bad to recommend for something like video editing, which relies so heavily on UX, and they don't have the features to make it worth the trade-off, so I can't really give any of them the thumbs up for anything other than philosophical reasons.
I mean, at a certain point anything is usable, and the tool you have is better than the tool you don't. We used to make movies by literally taping strips of film together, so as long as a program will let you attach clips to one another you can make it work.
DaVinci Resolve has a free version that's only missing tools like Frame.io integration, stereoscopic video, and 4K+ workflows. I used it for years before I hit a thing I wanted from the Studio version.
Don't be vague about what you want to see or how you want it to flow. Be familiar with the footage so you're not asking for things that don't exist. Have a sense for when a request is going to take hours and should be a note instead of a live demo.
Speaking of compression, if you’re looking to get into video editing, you’re going to need some hardware, that’s just a reality, but the first thing you should be looking at is hard drives. Storage is right above electricity in the editor’s hierarchy of needs.
You don’t need something extravagant, but you need to get past the point where a 2GB export feels absurd, where every new video means deleting the previous one.
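For a rough sense of scale, export size is just bitrate times duration. The numbers below are illustrative, not recommendations, and the helper name is mine:

```python
def export_size_gb(mbps: float, minutes: float) -> float:
    """File size in gigabytes for a given average bitrate and duration."""
    bits = mbps * 1_000_000 * minutes * 60
    return bits / 8 / 1_000_000_000  # bits -> bytes -> gigabytes

# A 10-minute video at a 25 Mbps delivery bitrate lands near 2GB,
# and that's before counting the raw footage behind it.
print(round(export_size_gb(25, 10), 3))
```

Multiply that by every project you want to keep around, plus source footage at several times the delivery bitrate, and drives fill fast.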
FCPX's Magnetic Timeline is so particular that there's definitely an adjustment going to any other NLE, but the new Cut page in Resolve functions a lot like it, which might ease that transition.
The most common culprit of drift is a camera that uses variable frame rate (VFR) to reduce bitrate (like smartphones), which plays hell with external audio. If you can't avoid using said camera, consider looping the audio directly into the phone's mic input when shooting.
It also might just be as simple as a mismatched sample rate, in which case match your audio recorder (and project settings) to the camera's sample rate.
Standard is 48kHz but some cameras use 44.1kHz.
In this case you can also "fix" existing audio by converting it to match.
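If you want to see why a mismatched rate matters, the math is simple. This is a back-of-envelope sketch (the helper name is mine) assuming the rate is misinterpreted outright rather than resampled:

```python
def drift_seconds(recorded_rate: float, playback_rate: float, minutes: float) -> float:
    """Seconds of drift after `minutes` if audio recorded at one sample
    rate is played back as if it were the other (no resampling)."""
    real = minutes * 60                                # actual seconds of audio
    played = real * recorded_rate / playback_rate      # how long it lasts at the wrong rate
    return real - played

# 44.1kHz audio read as 48kHz runs ~8% fast, so over a 10-minute take
# it finishes tens of seconds early:
print(round(drift_seconds(44_100, 48_000, 10), 2))
```

In practice NLEs usually resample, so the audible symptom is subtler than this worst case, but the direction of the error is the same, which is why matching rates at the recorder beats fixing it in post.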
It's not a jump so much as it's starting over. AE is a compositor and frame renderer which seems like it ought to be just a special kind of video editing tool but is really an entirely separate toolkit.
It's taking one sequence or timeline and putting it inside another. It's a way to isolate particularly complicated elements that can be difficult to move around, especially if they need to be reused or intercut with other things.
Not at all, pre-rendering assets is super common. I definitely had a phase where I was over-reliant on "only render once," and my render times needlessly suffered, since the interactions are often multiplicative, not additive.
Dynamic Link is super rad, but a .PSD that would take 30 seconds to rasterize as an uncompressed .TGA with alpha might take Premiere 10 minutes to render.
A lot of people are asking this, but the answer is I don't know, because the answer is too personal. Some people self-teach well just by opening a program and trying to Do A Thing. Many don't. Lots need guided instruction.
It also depends on how far you want to go, if you just want to be proficient with making a video or if you want to be the kind of person who cares about timecode and deliverable packages.
Honestly, the general interface of NLEs has barely changed since the Avid/1 in 1989.
When Western Electric and Warner Brothers needed to agree on a standard speed for the Vitaphone sound system, they picked 90 feet-per-minute, a nice round number that was already commonly used in the 1920s.
This also translates to a 2.5:1 relationship with 60Hz electrical systems.
They considered a 2:1 relationship (30fps) but the 112.5 feet-per-minute speed put too much strain on the nitrocellulose film, resulting in a lot more breaks during recording and projection.
The reasons for it being "too slow" are actually pretty complicated.
For optical sound-on-film (which was in its roughest proof-of-concept phase at the time), 60 feet-per-minute was way too low a sample rate for pleasant audio.
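All the film-speed numbers above follow from one constant, 16 frames per foot of 35mm film. A quick sanity check:

```python
FRAMES_PER_FOOT = 16  # 35mm film, 4 perforations per frame

def fps_from_fpm(feet_per_minute: float) -> float:
    """Projection speed in frames per second for 35mm film."""
    return feet_per_minute * FRAMES_PER_FOOT / 60

print(fps_from_fpm(90))       # the Vitaphone standard: 24.0 fps
print(fps_from_fpm(112.5))    # the rejected 2:1 option: 30.0 fps
print(fps_from_fpm(60))       # too slow for sound-on-film: 16.0 fps
print(60 / fps_from_fpm(90))  # 60Hz mains over 24fps: the 2.5:1 ratio
```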
I'm not doing an AI video because I promised myself no more moving targets, but this would be the thrust of a big chunk of it: there's a real good chance generative AI is just too damn expensive for a product that's rapidly displaying its fragility.
Also, like, for ChatGPT, every day on Reddit I see the dumbest people you know spam it with inane nonsense until it gives them the answer they want, and that behaviour just isn't sustainable at scale given the compute cost of every query.
I experimented with various generative art tools during the winter to get a sense of the user experience & what this stuff actually does and almost instantly I found myself coming up with a prompt and just recycling it 20, 30, 40, 50 times until it gave me something... tolerable?
Okay, so, back in April I snapped at James in reply to a tweet that was linking to this video (which James has since delisted but not deleted) and I want to talk about the full context of that but I don't want to make a video, put your beatdown memes away.
The first bit of context is that I initially got keyed into James to fact-check his claims about indie filmmaking in Canada. As a filmmaker the entire Telos venture was immediately obvious as a juvenile fantasy dreamed up by someone with no idea how to make a movie.
Just wild claims about their plans that weren't worth debunking because they bordered on Not Even Wrong. But in watching one of these pitch videos I noticed that he had a $4000 current-gen camera in the background as a prop, and that seemed both pretentious and weird.
Okay, so, GameStop earnings report came out yesterday. Apes in my mentions have been super dickish for the last three months insisting that the company "is now profitable" but, shocking only them, GS lost money this quarter.
The current cope is that GS lost less this quarter YOY, but they've also shrunk the company pretty dramatically in the last year. Loss relative to revenue is slightly improved but still bad. GameStop remains massively over-valued relative to performance.
Those are the boring numbers rooted in reality, though. What we care about are the insane theories.
Bolger and Ball both pitch Metaverse visions where you can hide a pair of digital sneakers in a spot and they'll still be there years later, but neither addresses the implication that this inevitably creates digital littering.
Also conspicuously absent in all the metaverse reading: no one talks about malicious design.
Multiplayer FPSs used to have user-generated skin systems: you could build and share custom character models, and depending on server settings your skin could be automatically pushed to other server participants. Exactly the kind of self-expression the metaverse promises.
The thing that makes the meme stock saga keenly fascinating isn't that it's people piling in on a bad stock based on questionable hype, that happens all the time, it's how it's persisted and grown based on complete mythology.
The Ape theory of market mechanics is that these companies (GameStop, Bed Bath & Beyond, AMC) are otherwise normal, healthy companies being targeted for destruction by predatory hedge funds who use criminal naked short sales to drive the company out of business.
And yet in Bed Bath's 93-page bankruptcy petition, which includes 25 pages of "here's what went wrong," short sales, hedge funds, and predatory securities trading get not so much as a glance. Not even mentioned, let alone cited as a material influence on the company's decline.