I hear you.
I see you.
I've got you.
Because I was you.
For all you "non-technical" SEOs out there, here's a shortcut to getting better with #techseo 👇🧵
Technical SEO is about uncovering the inefficiencies that stop a site's webpages from being crawled, rendered, indexed, and ranked.
But this scope of work is SUPER broad.
📋 Find out what could be holding it back with a 170+ point technical SEO checklist I made just for WordPress builds 👇
Now before you get started, a few things:
• you do NOT have to complete all the checks
• there is no fail grade; instead,
• a 'triage' using IF statements gives you a prompt to consider your next steps
• pass = ✔️ (generally speaking)
• requires attention = dig a lil deeper
One more thing: technical SEO is not a one-off task because, "QA is an evolving process" - @myriamjessier + @tentaclequing
And a checklist is a great starting point. However, completing this checklist ≠ technical audit.
The checklist is in the next tweet (I promise) 👇
First, open the link and MAKE A COPY of the Google Sheet 👇
🔍 Are crawlers or search engine crawlers being blocked in the robots.txt file?
⚠️ yes, investigate what is being blocked and why.
✔️ no, but are there files/assets/URLs that should be blocked?
🔍 Are paginated URLs being blocked in robots.txt?
⚠️ yes, investigate what is being blocked and why.
⚠️ no, understand what pagination strategy is being used and decide whether these should be blocked in robots.txt.
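The two robots.txt checks above can be spot-checked with Python's standard library. A minimal sketch, assuming a made-up robots.txt and example.com URLs — swap in the site you're actually auditing:

```python
# Spot-check robots.txt rules against specific URLs.
# The robots.txt content and URLs below are fabricated examples.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /blog/page/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Check a normal post vs. a paginated URL as Googlebot would see them.
for url in ("https://example.com/blog/post-1/",
            "https://example.com/blog/page/2/"):
    allowed = parser.can_fetch("Googlebot", url)
    print(url, "->", "allowed" if allowed else "blocked")
```

Note `urllib.robotparser` does plain prefix matching, so rules relying on `*` wildcards may need a dedicated library to evaluate correctly.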
🔍 According to Search Console crawl stats, have HTML files been crawled in the last 72 hours?
✔️ yes, Google is crawling HTML pages - good!
⚠️ no, Google hasn't crawled any pages - this is not good and you should allocate time to finding out and verifying why this is the case.
🔍 Does the website have a sitemap?
✔️ yes, nice.
⚠️ no, is it because the website has fewer than 50 URLs? If not, a sitemap will be beneficial.
💡Here is @danielwaisberg explaining what a sitemap is and why it helps with crawling 👇
🔍 Are the sitemap index URL(s) referenced in the robots.txt file?
✔️ yes, crawlers can easily find the URLs that you care about.
⚠️ no, is there a sitemap and has it been submitted to GSC?
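Checking for a sitemap reference is a one-liner-ish job: robots.txt declares sitemaps with `Sitemap:` lines. A quick sketch, assuming a fabricated robots.txt (in a real check you'd fetch `https://yoursite.com/robots.txt` first):

```python
# Extract any Sitemap: declarations from a robots.txt file.
# The robots.txt content below is a made-up example.
robots_txt = """\
User-agent: *
Disallow: /wp-admin/

Sitemap: https://example.com/sitemap_index.xml
"""

sitemaps = [line.split(":", 1)[1].strip()
            for line in robots_txt.splitlines()
            if line.lower().startswith("sitemap:")]

if sitemaps:
    print("Sitemap(s) referenced:", sitemaps)
else:
    print("No sitemap referenced - check whether one was submitted in GSC instead.")
```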
"It is so hard to find a good writer. We've tried multiple writers, paid them well, and even hired locals. But the content sucks! The time it takes to edit the content - we may as well write it ourselves!"
If this sounds familiar...
🧵👇🏾
Here is a real live feed of your writer.
They write. Doh!
They may do some research, but primarily, they put words on a page for you.
Most writers are great.
But they're not SEOs.
Their task is to write.
Your task is to tell them what to write.
Via a content brief.
💡 FYI, these are not content briefs:
❌ a topic/headline
❌ pre-populated headings + subheadings
❌ a link to a page to paraphrase
❌ number of words to aim for
❌ FAQs pulled from People Also Ask
❌ list of keywords
❌ SurferSEO, Frase, or Clearscope report