Shane 📈🚀
Jun 2 · 22 tweets · 7 min read
This is the literal step-by-step process I use for #keywordresearch and to determine what to write about next on my sites.

For this you'll need:
✅ @ahrefs
✅ Google Sheets
✅ A small bit of Python (my script + instructions are provided below)

#nichesites #seo #contentsites
🧵🪡
First, start with a super broad keyword that is relevant to your site and load that into Ahrefs.

I'm using a random word as an example for this thread.
Then click on the Questions section and create a filter for Keyword Difficulty with a max of 10.
This will give you a much shorter list of what Ahrefs believes to be easy-to-target keywords.
Next I export this list and import it into Google Sheets into a tab called Ahrefs.

Then copy all the keywords from column B and paste them into a file called keywords.txt
Now move that keywords.txt file into a folder called Google Counts.

Then head to this Gist page and copy the Python code for a script I've created.

In the Google Counts folder, save this script as main.py

gist.github.com/shanejones/8ca…
This script uses Requests to query the search URLs we're mimicking and Beautiful Soup to parse the result page HTML and extract the specific parts of the page we want.
ℹ️ NOTE: You may need to install Requests and Beautiful Soup 4 on your machine if you don't have them (`pip install requests beautifulsoup4`).

A quick search for "how to install Requests in Python" or "how to install Beautiful Soup 4 in Python" should help you out with this.
This Python script will look for the keywords.txt file in the same folder and, for each keyword, perform three Google searches, returning the number of results for

➡️ Standard search of search term
➡️ Search with quoted "search term"
➡️ An allintitle search: allintitle:search term
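The real script lives in the Gist above, but the core idea can be sketched roughly like this. This is a minimal sketch, not the actual script — the function names, the `result-stats` element id, and the parsing details are my assumptions about how a scraper like this is typically put together:

```python
import urllib.parse

import requests
from bs4 import BeautifulSoup


def build_queries(keyword):
    """The three searches run for each keyword: plain,
    exact-match quoted, and allintitle."""
    return [keyword, f'"{keyword}"', f"allintitle:{keyword}"]


def search_url(query):
    """Build a Google search URL for one query string."""
    return "https://www.google.com/search?q=" + urllib.parse.quote_plus(query)


def parse_result_count(html):
    """Read the total from Google's result-stats element, e.g.
    'About 259,000 results (0.41 seconds)' -> 259000.
    Returns 0 when the page hides the stats element."""
    soup = BeautifulSoup(html, "html.parser")
    stats = soup.find(id="result-stats")
    if stats is None:
        return 0
    digits = "".join(c for c in stats.get_text().split("(")[0] if c.isdigit())
    return int(digits) if digits else 0


def count_results(query):
    """Fetch one results page and parse the total. A browser-like
    User-Agent header makes the request look less like a bot."""
    resp = requests.get(
        search_url(query),
        headers={"User-Agent": "Mozilla/5.0"},
        timeout=10,
    )
    return parse_result_count(resp.text)
```

Looping `build_queries` over every line of keywords.txt and calling `count_results` on each query gives you the three totals per keyword.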
Run this script using
python3 main.py

You'll see that your console will start churning through the list
⚠️ NOTE: Google doesn't like you doing this and may stop returning results after about 200 keywords, so run the script in batches of 200 keywords and maybe switch VPNs to get around it :)
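A simple way to handle that batching (my own helper, not part of the Gist) is to split the keyword list into chunks before querying:

```python
def batches(items, size=200):
    """Split a list into consecutive chunks of at most `size` items,
    e.g. to feed the scraper 200 keywords at a time."""
    return [items[i:i + size] for i in range(0, len(items), size)]

# Typical usage with keywords.txt:
# with open("keywords.txt") as f:
#     keywords = [line.strip() for line in f if line.strip()]
# for batch in batches(keywords):
#     ...  # run the searches, then pause or switch VPN
```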
When the script completes, it will output a CSV in your folder with the totals for each search alongside the search term.

Once you have this CSV, come back to Google Sheets and import the file into a new tab called Google Results.
Then, back in your main Ahrefs tab, add three new columns:
- Search Results
- Quoted Results
- Allintitle Results

Then create a VLOOKUP in each of these columns to look up the values for each keyword from the Google Results tab
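Assuming the script's CSV puts the keyword in column A and the three counts in columns B to D of the Google Results tab (check your own export — this layout is my assumption), the formula for the Search Results column might look like:

```
=IFERROR(VLOOKUP(B2, 'Google Results'!A:D, 2, FALSE), 0)
```

Change the column index to 3 for Quoted Results and 4 for Allintitle Results; the IFERROR wrapper returns 0 for any keyword the scrape didn't get to.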
ℹ️ NOTE: some keywords may return 0 across all three searches. Since using this script I've noticed that some search results pages actually hide the result total.
You now have a complete data sheet where you can look for opportunities.

Here is the completed sheet I created for this thread, with a little conditional formatting to help me spot things more easily
docs.google.com/spreadsheets/d…
I use these basic criteria when searching for opportunities in this keyword sheet.

✅ Under 1 million search results
✅ Under 1,000 quoted results
✅ Low or no allintitle results
✅ Any search volume

This should result in easy traffic
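Those thresholds are easy to sanity-check in code. A sketch — the column names are my own, and I've read "low or no allintitle results" as at most 1, which is a judgment call:

```python
def is_opportunity(row, allintitle_max=1):
    """Apply the opportunity thresholds to one keyword row.
    'Any search volume' means volume isn't filtered on at all."""
    return (
        row["search_results"] < 1_000_000
        and row["quoted_results"] < 1_000
        and row["allintitle_results"] <= allintitle_max
    )

# The worked example from this thread, "what is borderline ecg
# sinus rhythm" (259,000 results, 2 quoted, 1 allintitle),
# passes every check.
```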
As an example we have "what is borderline ecg sinus rhythm"

This has 100 searches per month, 259,000 results, 2 quoted results and 1 allintitle result.
Then from here I manually look at the SERPs to check the top 3 to 5 positions, and check a mix of quoted and allintitle searches too.

Allintitle shows you who is going after that keyphrase, as they have it exactly in their title.

You can also research a specific SERP in Ahrefs too.
This quick bit of research helps me determine how much content to create before adding a ticket to my content board in Notion ready for when I next go on a writing binge 😄
I've also worked out a potential parent page for this that could be on ECG Rhythms.

We can now do some more research around ECG Rhythms to populate that parent page and to discover other potential child pages too.

This is where the process can repeat continuously.
Now all you need to do is write the content and get it published. That's the easy part, right?

🥇 Using this exact method I've had posts reach Page 1 in as little as 2 weeks to a month.

Let me know how you get on too.
If you liked this thread: I tend to do these about once a week at the moment, so be sure to give me a follow so you don't miss the next one, and maybe give the original thread tweet a RT too. Thanks!

twitter.com/shanejones
