Happy Friday friends. Today we're going to rant about the trials and tribulations of data accuracy for digital marketers, why data-based goal setting is hard, and why you can never fix either of these things once and for all, using the fabled "Bounce Rate" as an example.
1/n
A common mistake in digital marketing is not understanding, fundamentally, how your website data or 3rd-party competitor website data is being collected. How are SimilarWeb, SEMrush, and Ahrefs collecting data? How do GA, GSC, Bing, Piwik, Adobe, and HubSpot collect data?
2/n
I can't count the number of times different implementations of Google Analytics have gone awry - double tags, a scraper site using your GA ID, missing tags, non-standard implementations that weren't working, deprecated implementations, code in the wrong place...
3/n
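A quick sanity check for the "double tags" flavor of this, sketched under assumptions (plain gtag.js / analytics.js snippets only; anything injected via GTM or server-side needs a proper audit), is to count the loaders from the browser console:

```typescript
// Count Google Analytics loader scripts currently in the DOM.
// Two or more hits pointing at the same property ID usually means double tagging.
const gaScripts = Array.from(
  document.querySelectorAll<HTMLScriptElement>(
    'script[src*="googletagmanager.com/gtag/js"], script[src*="google-analytics.com/analytics.js"]'
  )
);

console.log(`Found ${gaScripts.length} analytics loader(s):`);
gaScripts.forEach((s) => console.log(s.src));
```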
The reason a lot of SEOs get really good at troubleshooting these things is that our goals are often directly tied to traffic or "user engagement" signals.
Work hack? Increase traffic by properly attributing it to organic. There are worse things you could do.
4/n
Sometimes the mess is just enormous and you're working with a black box. Groupon ran the experiment in 2014: by de-indexing their whole website for 6 hours, they found that up to 60% of their "direct" traffic was actually organic searchengineland.com/60-direct-traf…
5/n
And sometimes you're working on Page Speed, and you thought the traffic increase came from that... but really a faster page just lets the analytics tag fire before impatient visitors leave:
"Speed" = "Better Analytics"
"More Traffic Recorded" = "Traffic Increased"
6/n
Good marketers want the data to be as descriptive of actual website visitor behavior as possible. Bad implementation creates the opposite effect. Before goals are set, you need to do a "good faith" clean-up. Sometimes you let certain metrics be useless. Me? GA Bounce Rate.
7/n
How did GA Bounce Rate become useless? Well, we started tracking scroll depth. That's an "action" that's triggered the moment a visitor scrolls, and once that action fires, the visitor no longer counts as a bounce. There are other similar examples out there in the wild.
8/n
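To make that concrete, here's a minimal sketch (gtag.js with Universal Analytics assumed; the event name is made up) of how a scroll-depth event quietly turns every scrolling visitor into a non-bounce - and the non_interaction flag that avoids it:

```typescript
// Assumes the gtag.js snippet is already loaded on the page.
declare function gtag(...args: unknown[]): void;

let scrollTracked = false;

window.addEventListener("scroll", () => {
  if (scrollTracked) return;
  scrollTracked = true;

  // Interaction event: the session is no longer a bounce, so Bounce Rate tanks.
  gtag("event", "scroll_depth", { event_category: "engagement" });

  // To keep scroll tracking WITHOUT distorting Bounce Rate, send it as a
  // non-interaction hit instead:
  // gtag("event", "scroll_depth", { event_category: "engagement", non_interaction: true });
});
```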
As you get into the nuances of web analytics, you start to learn how to break or bias them - and that includes 3rd-party tools. Blocking the AhrefsBot or SemrushBot user-agents when they visit your servers? Yeah, it's a thing.
Fun, but keep the knowledge creating value.
9/n
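For the curious, a hedged sketch of what blocking those crawlers can look like at the application layer (Express-style middleware; the route and port are illustrative only - most people do this in robots.txt or at the server/CDN config instead):

```typescript
import express from "express";

const app = express();

// Real crawler user-agent substrings; extend the list as needed.
const BLOCKED_BOTS = /AhrefsBot|SemrushBot/i;

app.use((req, res, next) => {
  const ua = req.headers["user-agent"] ?? "";
  if (BLOCKED_BOTS.test(ua)) {
    // Refuse the request so these tools can't map your backlinks/content.
    return res.status(403).send("Forbidden");
  }
  next();
});

app.get("/", (_req, res) => res.send("Hello"));
app.listen(3000);
```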
This is why it's very easy to feel uncomfortable with setting goals - especially when you're likely improving the website whilst playing the analytics consultant. Don't let that stop you.
Fixing bad data usually shows up as a visible % shift in your numbers. Adjust your baseline for that and move on.
10/n
The same is true when a new ad blocker, browser security feature, or cookie-consent law and banner eats away at your website analytics. You simply have to accept it and move on (or use a web analytics platform that mostly relies on web server logs).
11/n
We care about traffic and behaviors that influence $$$. It's easy to fall into the trap of "how do I increase this number" and forget the human on the other side. Myopically chasing "user experience" metrics can lead to dark patterns. darkpatterns.org
12/n
The next version of a browser, a new privacy law, a change in how web analytics providers "calculate" stuff - these are all things I've seen in my career. This is a problem that's imperfect and evolving. Adapt and move on. Create better user experiences, as they help your bottom line.
13/end
I reference these documents all. the. time. No Shame.
The SEO industry needs fewer gatekeepers and more pragmatists. The typical lifecycle of a "community" is that pragmatists often become gatekeepers, or incorrectly treat their own experiences as superior to others'.
We can all be right. We can all be wrong.
1/N
Web rings and blogrolls. Social bookmarking websites. Directory submissions. Article submission/content syndication websites. Authorship. Penguin. Panda. Hummingbird. Mobilegeddon. Interstitials. Locality. Query deserves freshness. PageSpeed. Https.
The opinions don't stop.
2/N
In the early 2000s, all the articles on SEO were about trying to get your website into a respected web directory, getting pages onto article directories, and how to take advantage of "web 2.0" social media websites - link building was very manual and websites weren't easy to make.
3/N
Hello friends following me this Friday on "implicit vs explicit search intent."
When you've been around the SEO industry for a while you start hearing new terms emerge. Search intent, user intent, keyword intent. As usual, I'm going to start with a bit of history.
1/n
Informational, navigational, and transactional queries - this categorization has been around since I've been doing SEO, and it takes an internet search to appreciate how old it is - 2002. You can still split up queries this way, but it's not great.
Happy Friday folx - we're ranting about pagination.
We're going to break down the problem pagination was supposed to fix, the problems it ended up creating, and why I want to kill it with fire. Something #passageindexing
And fun. And wacky. And evolving. More people started connecting to the internet, owning websites, publishing content, selling stuff online (who would trust that?!), and much more.
2/n
Then there were at least three problems:
- Folx created more content, and it was no longer feasible to put it all on one URL
- People broke up their content into chunks for better user experience or ad monetization
- Inventory-driven websites were confusing search engines
3/n
Today we’re going to rant about “microsites,” a broad term used to describe creating a “new” website experience. It will cover the why, some pros, some cons, and probably go off topic.
1/n
Microsites can be subdomains, brand-new domains, or even subfolders of a brand - the telling sign is that the tech stack generally differs (not always true). A quick hop on builtwith.com and you'll often see React apps powering parallax experiences (RIP Flash websites).
3/n
Happy Friday - let's rant about keyword density, LSI keywords, internal/external links per N words, and word count - vernacular you want to get out of your processes sooner rather than later.
1/n
Keyword density, usually defined as the number of times a keyword appears on a page divided by the total number of words on the page (expressed as a percentage), is a product of the keyword-stuffing years of SEO.
e.g. I heard SEO Expert Chad recommends pages should have a Keyword Density of 2%!
2/n
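Just so that formula is unambiguous, a toy illustration (function name and sample text are mine; this describes the metric, it doesn't endorse it):

```typescript
// keyword density (%) = occurrences of the keyword / total words on the page * 100
function keywordDensity(text: string, keyword: string): number {
  const words = text.toLowerCase().split(/\s+/).filter(Boolean);
  const hits = words.filter((w) => w === keyword.toLowerCase()).length;
  return words.length ? (hits / words.length) * 100 : 0;
}

// "SEO Expert Chad" math: hitting 2% on a 500-word page means repeating the keyword 10 times.
console.log(keywordDensity("seo tips for seo beginners learning seo", "seo")); // ≈ 42.9
```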
Search engines were still using the keywords meta tag for rankings, the number of times a keyword was repeated on a page had a huge influence on whether it would rank, and there were algorithms that would pick up on keyword stuffing. Yo, it's not pre-2000 or the early 2000s anymore.
3/n