Hello friends following along this Friday - we're ranting about "implicit vs explicit search intent."
When you've been around the SEO industry for a while, you start hearing new terms emerge: search intent, user intent, keyword intent. As usual, I'm going to start with a bit of history.
1/n
Informational, navigational, and transactional queries - this categorization has been around for as long as I've been doing SEO, and it takes an internet search to appreciate how old it is - 2002. You can still split up queries this way, but it's not great.
Happy Friday folx - we're ranting about pagination.
We're going to break down the problem pagination was supposed to fix, the problems it ended up creating, and why I want to kill it with fire. Plus something about #passageindexing
And fun. And wacky. And evolving. More people started connecting to the internet, owning websites, publishing content, selling stuff online (who would trust that?!), and much more.
2/n
Then there were at least three problems:
- Folx created more content, so it was no longer feasible to put it all in one URL
- People broke up their content into chunks for better user experience or ad monetization
- Inventory-driven websites were confusing search engines
3/n
Today we're going to rant about "microsites," a broad term used to describe creating a "new" website experience. We'll cover the why, some pros, some cons, and probably go off topic.
1/n
Microsites can be subdomains, brand new domains, or even subfolders of a brand - the telling sign is generally that the tech stack differs (not always true). A quick hop on builtwith.com and you'll often see React apps powering parallax experiences (RIP Flash websites) - a rough sketch of that kind of check is below.
3/n
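For the curious, here's roughly what that tech-stack sniffing looks like - a minimal Python sketch using a few hand-picked framework fingerprints (my own assumptions, and nowhere near as thorough as builtwith.com actually is):

```python
from urllib.request import urlopen

# Rough heuristic only: grep a page's HTML for common front-end
# framework fingerprints. The marker lists are illustrative, not exhaustive.
FINGERPRINTS = {
    "React": ["data-reactroot", "__NEXT_DATA__", "react-dom"],
    "Vue": ["data-v-app", "vue.runtime"],
    "Angular": ["ng-version"],
}

def guess_stack(url: str) -> list[str]:
    html = urlopen(url, timeout=10).read().decode("utf-8", errors="ignore")
    return [name for name, needles in FINGERPRINTS.items()
            if any(needle in html for needle in needles)]

# guess_stack("https://microsite.example.com")  # hypothetical URL
```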
Happy Friday - let's rant about keyword density, LSI keywords, internal/external links per N words, and word count - vernacular you want to get out of your processes sooner rather than later.
1/n
Keyword density, usually defined as the number of times a keyword shows up divided by the total number of words on the page (expressed as a percentage), is a product of the keyword-stuffing years of SEO.
e.g. I heard SEO Expert Chad recommends pages should have a Keyword Density of 2%! (the math is sketched below)
2/n
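To make that 2% rule concrete, here's the arithmetic as a tiny Python sketch, assuming the simple "keyword occurrences over total words" definition above (multi-word phrases would need more care):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Keyword occurrences as a percentage of total words on the page."""
    words = re.findall(r"\b\w+\b", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return 100.0 * hits / len(words)

page = "seo tips for better seo content and seo audits plus other stuff"
print(round(keyword_density(page, "seo"), 1))  # 25.0 - way past "Chad's" 2%
```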
Search engines were still using the keywords meta tag for rankings, the number of times a keyword was repeated on a page had a huge influence on whether it would rank, and there were algorithms that would pick up keyword stuffing. Yo, it's not pre-2K or the early 2000s anymore.
3/n
The subdomains vs subfolders/subdirectories debate just won't die. Search engines have evolved their treatment of "what is a website" over time, and yet the debates cling to old case studies.
At some point in this rant we're going to talk about ccTLDs, sub-subdomains, and subdomain plus subfolder combinations with ccTLDs, because #teamsubfolders uses the same argument for everything.
2/N
The concept of a "website" in the early days of the internet was that subdomains were separate entities from the "home" site. This article on website boundaries from Bing is worth revisiting. blogs.bing.com/webmaster/nove…
Websites are "leasing" subdomains/subfolders to rank stuff.
3/N
Happy Friday friends. Today we're going to rant about the trials and tribulations of data accuracy for digital marketers, why data-based goal setting is hard, and why you can't fix either of these things once and for all, using the fabled "Bounce Rate" as an example.
1/n
A common mistake in digital marketing is not understanding, fundamentally, how your website data or 3rd-party competitor website data is being collected. How are SimilarWeb, SEMRush, and Ahrefs collecting data? How do GA, GSC, Bing, Piwik, Adobe, HubSpot collect data?
2/n
I can't count the number of times a Google Analytics implementation has gone awry - double tags, a scraper site using your GA ID, missing tags, non-standard implementations that weren't working, deprecated implementations, code in the wrong place... (a quick double-tag check is sketched below)
3/n
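To illustrate the "double tags" failure mode, a minimal sketch (the regex and sample HTML are my own, not any real audit tool) that flags a page configuring more than one GA4 measurement ID:

```python
import re

# Look for GA4-style gtag('config', 'G-...') calls in rendered HTML.
def find_ga_ids(html: str) -> list[str]:
    return re.findall(r"gtag\(\s*'config'\s*,\s*'(G-[A-Z0-9]+)'", html)

sample_html = """
<script>gtag('config', 'G-AAAA1111');</script>
<script>gtag('config', 'G-BBBB2222');</script>
"""

ids = find_ga_ids(sample_html)
if len(ids) > 1:
    print("Possible double tagging:", ids)  # ['G-AAAA1111', 'G-BBBB2222']
```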