#OSINT Workflow Wednesday 🚨

Another week, another workflow. This week we’re going to look at YouTube and how to deconstruct videos, analyze the outputs, and apply automation and lightweight AI to the results at scale.


Let’s go!

(1/10)
Step 1: Select a Video of Interest

Go to YouTube and find any video. Don’t pick anything too long for this workflow because it’ll be difficult to work with. Keep it under 15 minutes or 200 MB if you can. Great use cases are wartime footage, documentation of human rights violations, etc.

(2/10)
Step 2: Download the Video

You can inspect the page element and extract the video, use a Python script, or use a web app. I recommend a quick web app called YT1s: paste in a YouTube link and download the video.

yt1s.com
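
If you’d rather script it (the thread mentions a Python option without naming one), a widely used choice is the yt-dlp library — my suggestion, not the author’s. A minimal sketch, assuming `pip install yt-dlp`; the size cap mirrors the 200 MB guideline from Step 1:

```python
def build_ydl_opts(max_mb=200):
    """yt-dlp options: prefer a file under max_mb, fall back to best available."""
    return {
        "format": f"best[filesize<{max_mb}M]/best",
        "outtmpl": "%(id)s.%(ext)s",  # save as <video id>.<extension>
    }

def download(url, max_mb=200):
    # Imported lazily so the option helper above has no dependencies.
    from yt_dlp import YoutubeDL
    with YoutubeDL(build_ydl_opts(max_mb)) as ydl:
        ydl.download([url])
```

Call it as `download("https://www.youtube.com/watch?v=...")`; the `format` selector keeps clips small enough for the frame-splitting step below.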

(3/10)
Step 3: Split the Video into Images

You can upload the new video to an online tool called Video to JPG from Online Converter. It supports videos up to 200 MB. I like it because it downloads the results as a zip to your local drive.

onlineconverter.com/video-to-jpg
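
If you’d rather split frames locally (useful for sensitive material you don’t want to upload), ffmpeg can do the same job — my suggested alternative to the web converter. A sketch, assuming ffmpeg is on your PATH and the output directory exists; the filename `clip.mp4` is a placeholder:

```python
import subprocess

def extract_frames_cmd(video, out_dir="frames", fps=1):
    """Build an ffmpeg command that writes one JPEG per sampled frame.
    fps=1 keeps one frame per second of video."""
    return [
        "ffmpeg", "-i", video,
        "-vf", f"fps={fps}",
        f"{out_dir}/frame_%05d.jpg",
    ]

# To actually run it:
# subprocess.run(extract_frames_cmd("clip.mp4"), check=True)
```

Raising `fps` gives you more frames to sift through; 1 per second is usually plenty for spotting people and landmarks.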

(4/10)
Step 4: Analyze the Images for People

Once you’ve split your video into individual frames, it’s time to analyze the results. You’re looking for specific persons of interest within the images. Crop each face to remove any surrounding noise.
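
The cropping can be scripted too. A sketch using the face_recognition library (which the thread introduces in Step 8) to locate faces, with a small padding helper so the crop isn’t too tight; the margin value and filenames are my assumptions:

```python
def pad_box(top, right, bottom, left, margin, width, height):
    """Grow a (top, right, bottom, left) face box by `margin` pixels,
    clamped so it stays inside the image bounds."""
    return (max(top - margin, 0), min(right + margin, width),
            min(bottom + margin, height), max(left - margin, 0))

def crop_faces(frame_path, margin=20):
    # Sketch: needs `pip install face_recognition pillow`.
    import face_recognition
    from PIL import Image
    img = Image.open(frame_path)
    frame = face_recognition.load_image_file(frame_path)
    crops = []
    for (t, r, b, l) in face_recognition.face_locations(frame):
        t, r, b, l = pad_box(t, r, b, l, margin, img.width, img.height)
        crops.append(img.crop((l, t, r, b)))  # PIL wants (left, top, right, bottom)
    return crops
```

Run it over every extracted frame and save the crops; those become the inputs for the background-removal and matching steps below.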

(5/10)
Step 5: Isolate the Face

To increase the likelihood of success, remove the background from each image, leaving only the person’s face. A tool like remove.bg can pull this off in record time!
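
For a local, scriptable alternative to the remove.bg web service, the open-source rembg library does the same job — my suggestion, not the author’s. A sketch, assuming `pip install rembg pillow`:

```python
from pathlib import Path

def nobg_name(path):
    """Derive an output filename: 'face.jpg' -> 'face_nobg.png'."""
    p = Path(path)
    return str(p.with_name(p.stem + "_nobg.png"))

def strip_background(in_path):
    # Imported lazily so the filename helper stays dependency-free.
    from rembg import remove
    from PIL import Image
    img = Image.open(in_path)
    remove(img).save(nobg_name(in_path))  # output is PNG with transparency
```

Batch it over every face crop from Step 4 before moving on to reverse searches.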

(6/10)
Step 6: Analyze the Images for Landmarks

Let’s move into landmarks. Crop out any landmarks or unique identifiers within each image to sort through later. Remove the background for the landmarks unless the background is important. Blur any obstructions.

(7/10)
Step 7: Reverse Image Search

With the new cropped and edited images, let’s start reverse image searching. Use every reverse image search engine you can (see the attached infographic for reference). This works better for landmarks than people, but you might get lucky.
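
If your images are hosted somewhere public, you can generate the search links in bulk rather than pasting one at a time. A sketch; the engine list and URL patterns are my assumptions (they’re not from the thread’s infographic, and services change these without notice):

```python
from urllib.parse import quote

def reverse_search_urls(image_url):
    """Build reverse-image-search links for a publicly hosted image."""
    q = quote(image_url, safe="")
    return {
        "google_lens": f"https://lens.google.com/uploadbyurl?url={q}",
        "yandex": f"https://yandex.com/images/search?rpt=imageview&url={q}",
        "tineye": f"https://tineye.com/search?url={q}",
    }
```

Feed the resulting URLs to `webbrowser.open()` or drop them into your notes so each image gets checked against every engine.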

(8/10)
Step 8: Facial Recognition

Use the open-source face_recognition library to build a reference library of the people you’re looking for, then run the new faces against it to see if there’s a match.

github.com/ageitgey/face_…
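
A minimal sketch of that matching loop, assuming `pip install face_recognition` and a `known` dict of name → encoding you built earlier with `face_recognition.face_encodings()`; the 0.6 tolerance is the library’s documented default:

```python
def best_match(names, distances, tolerance=0.6):
    """Pick the known identity with the smallest face distance,
    or None if nothing clears the tolerance."""
    if not names:
        return None
    i = min(range(len(names)), key=lambda k: distances[k])
    return names[i] if distances[i] <= tolerance else None

def identify(known, unknown_path, tolerance=0.6):
    # known: dict mapping name -> 128-d face encoding. Sketch only.
    import face_recognition
    img = face_recognition.load_image_file(unknown_path)
    encs = face_recognition.face_encodings(img)
    if not encs:
        return None  # no face found in the image
    names = list(known)
    dists = face_recognition.face_distance([known[n] for n in names], encs[0])
    return best_match(names, list(dists), tolerance)
```

Lower the tolerance for fewer false positives; raise it when working with blurry video frames.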

(9/10)
Step 9: Share your Findings

#OSINT is being adopted across a ton of industries with a variety of use cases. If you have an interesting case study using this workflow, share it with everyone!


Thanks for reading!

(10/10)
