Another week, another workflow. This week we’ll take a look at how to build a basic bookmarklet to automate OSINT searches using #javascript. This will be ground floor level stuff, don’t worry!
Let’s go!
(1/9)
Step 1: Create an option for input
Start a new #javascript file and create a variable to accept input for usernames. This will create a prompt when you click the bookmarklet which we'll use to automate our process.
var username = prompt("Enter a username: ");
(2/9)
Step 2: Plug that username into a URL
Create a new variable called fb and set it to the URL of the site you want to check (Facebook in this example). We'll append the username to the end of that URL.
Next we'll use window.open(), a function built into the browser's #javascript environment. We want to OPEN that URL in a new tab after the username is entered, which we can do by passing "_blank" as the second argument.
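Putting steps 1 and 2 together, a minimal sketch might look like this. The buildProfileUrl helper is my own naming, not from the thread; the URL-building logic is split out so it can run and be checked outside a browser:

```javascript
// Build the profile URL from a username. Split into its own function so the
// string logic can run outside a browser; encodeURIComponent guards against
// characters that would otherwise break the URL.
function buildProfileUrl(base, username) {
  return base + encodeURIComponent(username.trim());
}

// In the bookmarklet itself, wire it up like this:
// var username = prompt("Enter a username: ");
// var fb = buildProfileUrl("https://www.facebook.com/", username);
// window.open(fb, "_blank"); // "_blank" opens the result in a new tab
```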
You'll want to minify your #javascript, i.e. make it one line, to prepare it for becoming a bookmarklet. You can do this easily on JavaScript Minifier's website.
There's a standard format for turning #javascript into a bookmarklet, but I found a site called MrColes that does it for you. Simply paste the minified #javascript into that site, then drag and drop the resulting bookmarklet onto your bookmarks bar.
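If you'd rather skip the website, the wrapping can be approximated in one line. The helper name here is hypothetical, but the javascript: URL shape is the standard format bookmarklets use:

```javascript
// Wrap minified code in the standard javascript: URL format. The IIFE
// wrapper keeps the bookmarklet's variables off the page's global scope.
function toBookmarklet(minifiedJs) {
  return "javascript:(function(){" + minifiedJs + "})();";
}
```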
You can add additional variables for checking multiple websites and add additional functions to open those websites with one motion. You can also add Google search URLs to automate Google Dorking. The possibilities are endless.
(8/9)
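A hedged sketch of the multi-site idea above (the site list and helper name are illustrative assumptions, not from the thread):

```javascript
// Keep one URL template per site; a single loop then opens every search.
var templates = [
  "https://www.facebook.com/",
  "https://twitter.com/",
  "https://www.google.com/search?q=" // Google-dork style queries fit here too
];

// Expand one username into a URL per template.
function buildUrls(templates, username) {
  return templates.map(function (t) {
    return t + encodeURIComponent(username);
  });
}

// In the bookmarklet:
// buildUrls(templates, prompt("Enter a username: "))
//   .forEach(function (u) { window.open(u, "_blank"); });
```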
Step 7: Share your Project
Bookmarklets are very basic, but they're also very useful in many cases. The Expand All bookmarklet I use for Facebook is an example of a sophisticated bookmarklet that's still free and open source. Share your work!
Another week, another set of tools. This week is a low tech week but we'll look at data set search engines and what to look for, tools for identifying scams and typosquatting, and finally, #OSINT for Github.
Let's go!
(1/6)
The first #OSINT tool is Dataset Search by Google. I've been working with datasets a lot lately and stumbled across this. Try searching for things like "webcams" or "cctv" to see the depth of what types of data you can work with.
The second #OSINT tool is called Registered Domain Names Search. It seems very basic; however, by searching for a keyword alone, you can find a ton of phishing prospects. They also have an API for extended results. Try searching for "facebook".
Another week, another workflow. This week we’re going to look at YouTube and how to deconstruct videos, analyze the outputs, and apply automation and AI-lite to the results at scale.
Let’s go!
(1/10)
Step 1: Select a Video of Interest
Go to YouTube and find any video. Don't pick anything too long for this workflow, because it'll be difficult to work with. Keep it under 15 minutes or 200 MB if you can. Great use cases are wartime videos, human rights violations, etc.
(2/10)
Step 2: Download the Video
You can inspect the element and extract the video, use a Python script, or use a web app. I recommend a quick web app called YT1s that lets you paste in a YouTube link and download the video.
Another week, another set of tools. This week let's look at Snapchat, Google Earth, and YouTube. This will include 2 #python tools and 1 web app. Shall we?
(1/5)
The first #OSINT tool is made by @djnemec and it's a #python tool called Snapchat Story Downloader. It allows you to create a db of locations of interest, then extract Snapchat stories from those locations indefinitely. It includes a classifier, too. Great!
The second #OSINT tool is a #python tool I made in response to @raymserrato, who was looking to automate screenshot capturing of Google Earth. Earthshot will open and screenshot a list of coordinates you specify in a CSV. It's slow though!
This week we’ll discuss how to find date/time information of web content even if it’s not obvious. This will help you establish a timeline of content or determine if an article has been altered since the original publication.
Let’s get started.
(1/8)
Step 1: Check the URL
This is a no-brainer, but a lot of web content will include the original publication date in the URL. Keep in mind that this could be an updated URL; we'll look at other data to determine that next.
(2/8)
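As a small illustration of the URL-date idea, here's a sketch in #javascript. The helper name and pattern are my own; many CMSs use /yyyy/mm/dd/ paths, but formats vary, so treat this as a starting point:

```javascript
// Pull a /yyyy/mm/dd/-style date out of a URL path, if one is present.
function dateFromUrl(url) {
  var m = url.match(/\/(\d{4})\/(\d{1,2})\/(\d{1,2})\//);
  if (!m) return null; // no date segment in this URL
  // Normalize to ISO-style yyyy-mm-dd, zero-padding single digits.
  return m[1] + "-" + m[2].padStart(2, "0") + "-" + m[3].padStart(2, "0");
}
```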
Step 2: Check the Sitemap
Simply add “/sitemap.xml” to the end of a URL to check if a sitemap is available. The sitemap usually includes the date/time stamp of when all content was updated on the website. This is great for finding different URL types on a website too!
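That step can be sketched in a couple of lines of #javascript (the helper name is mine; URL is a standard browser and Node global):

```javascript
// Derive a site's sitemap URL from any page URL on that site.
function sitemapUrl(pageUrl) {
  var u = new URL(pageUrl); // parse so paths and query strings don't leak in
  return u.origin + "/sitemap.xml";
}
```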
It’s time for another round of OSINT tools to help you improve your efficiency and uncover new information. A quick overview:
[+] Reversing Information
[+] Automating Searches with #Python
[+] Testing/Using APIs
RT for Reach! 🙏
(1/5) 👇
The first #OSINT tool is called Mitaka. It’s a browser extension that will reverse multiple data points right from your browser. Right-click what you want to reverse and Mitaka will show you what sources are available. Improve your efficiency!
The second #OSINT tool is called Sitedorks from @zarcolio. It’s a #Python tool to automate your Google queries. Enter a query and it’ll open up to 25+ tabs—checking your parameters across a variety of website categories.
The first #OSINT tool is a bookmarklet I built called Forage and is meant to expand your search across popular social media sites. It takes the username from FB/IG, Twitter, and LinkedIn and expands your search across the web. Expect updates!
The second #OSINT tool is a browser extension called TabTrum that takes a snapshot of your active tabs and saves them so you can reopen that same combination again with one click. It's saved me SO much time with work and OSINT investigations.