The subdomains vs subfolders/subdirectories debate just won't die. Search engines have evolved their treatment of "what is a website" over time, and yet the debates cling to old case studies.
At some point in this rant, we're going to talk about ccTLDs, sub-subdomains, and subdomain plus subfolder combinations with ccTLDs because #teamsubfolders uses the same argument for everything.
2/N
The concept of a "website" in the early days of the internet was that subdomains were separate entities from the "home" site. This article on website boundaries from Bing is worth revisiting. blogs.bing.com/webmaster/nove…
Websites are "leasing" subdomains/subfolders to rank stuff.
3/N
Oh and that stuff's working both ways. Why?
It's because the definition of a website is defined by internal/external links more than by your URL path or keywords-in-subdomain.
Ever look up a local restaurant and get 10 YELP results? In 2020 you'll usually get 1-2 results max from the same website because of this "reranking algorithm."
6/n
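The "reranking" above can be sketched as a host-crowding pass over the raw result list. This is illustrative Python only; the function names and the two-per-host cap are my assumptions, not Google's actual implementation.

```python
# Toy host-crowding reranker: cap how many results from one site
# appear in a SERP. Purely illustrative, not a real search engine's code.
from collections import defaultdict
from urllib.parse import urlparse

def rerank(results, max_per_host=2):
    """Keep at most `max_per_host` results per hostname, in order."""
    seen = defaultdict(int)
    kept = []
    for url in results:
        host = urlparse(url).netloc
        if seen[host] < max_per_host:
            seen[host] += 1
            kept.append(url)
    return kept

serp = [
    "https://www.yelp.com/biz/a",
    "https://www.yelp.com/biz/b",
    "https://www.yelp.com/biz/c",
    "https://example-restaurant.com/",
]
print(rerank(serp))  # keeps two Yelp results, drops the third
```

This is also why "what counts as one website" matters: if the engine treats your subdomains as separate hosts, each one gets its own cap.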
In the early 2000's, "dominating" a SERP meant you were showing up in positions 1, 2, 3, 4, 5, etc. of a page. In 2020, "dominating" a SERP means you're in a featured snippet (FS), the position after that, possibly an image result, and you've got partnerships with the websites ranking for said keyword
7/n
Again, if you can game Google's recognition of what counts as a website, you can manipulate the maximum number of results that can show up in a SERP.
Google/Bing Ads found Ads + Organic Rankings = More net clicks.
(Disclaimer: you should always test things yourself; if the unit economics make sense, you do it)
You know who's killing it with multi-site strategy? Dotdash.
9/n
You know who else is killing it with the multi-site strategy? Wayfair.
You might be arguing "that's a different domain, not a different subdomain" - yes, but again, the crux of the issue is "what is a separate website?"
You're missing the point.
10/n
Ket's share an old document folks stopped talking about. http://162.250.19.7/ac0xl/Dont-Be-Evil/Fake%20News/Twiddler%20Quick%20Start%20Guide%20-%20Superroot.pdf "BlogCategorizer places all the results from a blog in a
max_total category to prevent too many being shown. "
11/n
*K Let's* ahhh that edit button.
So where do ccTLDs fall in this equation when determining whether it's the same site or a separate site? Separate website.
Go check out those Pinterest ccTLDs ranking for "ideas/inspiration"
The thing a lot of folks "mess up" in migrations is doing content, design, and URL changes in one swoop. We correlate the wrong factor as the root cause: missing 301 redirects or equivalent destinations, outdated internal links, or lost branded traffic...
13/n
Or the opposite: a consolidated set of pages from pruning, improved internal linking, improved page experience (speed/HTTPS), removal of extraneous links in nav structures... these extraneous variables often get mixed up with the move from a subdomain to a subfolder.
14/n
There's a ton of variables and it's almost impossible to isolate things down to just a URL change. I'm excited to say I'm working on a project that is doing just that.
We mirrored existing content but updated the design using proxy redirects.
15/n
Then when we replicated all functionality (with the infra change we wanted) we switched the 302's to 301's. Boom.
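A minimal sketch of that cutover logic, assuming the temporary phase used 302s. Every name here is hypothetical; the actual project's code isn't shown in this thread.

```python
# Illustrative migration cutover: serve temporary redirects while
# mirroring content on the new infra, then flip to permanent 301s
# once everything is replicated.

def redirect_status(cutover_complete: bool) -> int:
    """302 = temporary (engines keep indexing the old URL),
    301 = permanent (engines transfer signals to the new URL)."""
    return 301 if cutover_complete else 302

def redirect_location(path: str, new_origin: str = "https://www.example.com") -> str:
    """Map an old path to its equivalent destination on the new origin."""
    return new_origin + path
```

The point of the two-phase flip: you can test the mirrored site under real traffic without committing search engines to the move until you're sure.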
I've ranted a ton, so let's put things to the test?
Because at the end of the day, SEO is evolving. Your site =/= my site. Your strategy =/= my strategy.
16/end
Happy Friday folks. If you missed me for the last week or so (e.g. my rant about "BRAND" as a ranking factor) then you'll find it somewhere deep in the links:
The SEO industry needs fewer gatekeepers and more pragmatists. The typical lifecycle of a "community" is that pragmatists often become gatekeepers, or incorrectly equate their experiences as superior to others'.
We can all be right. We can all be wrong.
1/N
Web rings and blogrolls. Social bookmarking websites. Directory submissions. Article submission/content syndication websites. Authorship. Penguin. Panda. Hummingbird. Mobilegeddon. Interstitials. Locality. Query deserves freshness. PageSpeed. HTTPS.
The opinions don't stop.
2/N
In the early 2000's all the articles on SEO were about trying to get your website in a respected web directory, pages on article directories, and how to take advantage of "web 2.0" social media websites - link building was very manual and websites weren't easy to make.
3/N
Hello friends following me this Friday on "implicit vs explicit search intent."
When you've been around the SEO industry for a while you start hearing new terms emerge. Search intent, user intent, keyword intent. As usual, I'm going to start with a bit of history.
1/n
Informational, navigation, and transactional queries - this categorization has been around since I've been doing SEO, and it takes an internet search to appreciate how old it is - 2002. You can still split up queries this way, but it's not great.
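That 2002 three-way split (Broder's taxonomy) can be sketched as a toy rule-based classifier. The keyword lists are purely illustrative assumptions, and the default fallthrough to "informational" hints at why the split is "not great."

```python
# Toy classifier for the informational/navigational/transactional
# query taxonomy. Keyword lists are illustrative, not production-grade.

TRANSACTIONAL = {"buy", "price", "cheap", "coupon", "order"}
NAVIGATIONAL = {"login", "homepage", "official", "www"}

def classify(query: str) -> str:
    words = set(query.lower().split())
    if words & TRANSACTIONAL:
        return "transactional"
    if words & NAVIGATIONAL:
        return "navigational"
    # Everything else lands here, including ambiguous or mixed-intent
    # queries -- one reason this categorization breaks down in practice.
    return "informational"
```

Real queries carry both implicit and explicit intent, which is exactly what simple buckets like these fail to capture.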
Happy Friday folx - we're ranting about pagination.
We're going to break down the problem pagination was supposed to fix, the problems it ended up creating, and why I want to kill it with fire. Something #passageindexing
And fun. And wacky. And evolving. More people started connecting to the internet, owning websites, publishing content, selling stuff online (who would trust that?!), and much more.
2/n
Then there were at least three problems:
- Folx created more content; it was no longer feasible to put it all in one URL
- People broke up their content into chunks for better user experience or ad monetization
- Inventory-driven websites were confusing search engines
3/n
Today we’re going to rant about “microsites,” a broad term used to describe creating a “new” website experience. It will cover the why, some pros, some cons, and probably go off topic.
1/n
Microsites can be subdomains, brand new domains, or even subfolders of a brand - the telling sign is generally a differing tech stack (not always true). A quick hop on builtwith.com and you'll often see React apps powering parallax experiences (RIP Flash websites).
3/n
Happy Friday - let's rant about keyword density, LSI keywords, internal/external links per N words, and word count - vernacular you want to get out of your processes sooner rather than later.
1/n
Keyword density, usually defined as the number of times a keyword appears divided by the total number of words on the page, expressed as a percentage, is a product of the keyword-stuffing years of SEO.
e.g. I heard SEO Expert Chad recommends pages should have a Keyword Density of 2% !
2/n
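To make the (outdated) metric concrete, here's how keyword density is usually computed. A sketch only; the point of this rant is that you shouldn't optimize for this number.

```python
# Keyword density: occurrences of a keyword divided by total words
# on the page. Shown to demystify the metric, not to recommend it.
import re

def keyword_density(text: str, keyword: str) -> float:
    words = re.findall(r"[a-z0-9']+", text.lower())
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words) if words else 0.0

page = "seo tips and more seo tips for seo"
print(round(keyword_density(page, "seo") * 100, 1))  # → 37.5
```

"SEO Expert Chad's" 2% target would mean one keyword repetition per 50 words, which tells you nothing about whether the page actually answers the query.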
Back then, search engines were still using the keywords meta tag for rankings, the number of times a keyword was repeated on a page had a huge influence on whether it would rank, and then algorithms came along that would pick up keyword stuffing. Yo, it's not pre-2K or the early 2000s anymore.
3/n