.
:: *sigh* - someone found the cookie-cutters! ::

*checks calendar*

@NicheSiteLady & @NicheDown
It's called "cookie-cutter content":
you basically copy a piece of content and change a tiny percentage of it, to rank for n+ terms.

developers.google.com/search/docs/es…

#SEO #BrandSwapping
Now, I know it says "affiliate",
but it applies to just about any type of site,
whatever its monetisation:
* direct sales
* ad-rev
* affiliate payments
* referral fees

(The term "MFA" (Made For Ads/Affiliates) used to be applied to such sites.)

developers.google.com/search/docs/es…
So, the problem is - though it can (does!) work,
(bad Google, bad!),
it's possible that G will catch it at some point,
and may hammer a site for it
(so please - at least give people a warning!).

There are ways to handle it "better",
with reduced risk.
It's an established (very old) SEO approach,
based on ranking for specific keywords.
(I think it even predates "spinning" - it was typically a simple S&R (search and replace))

Several types of business/site/content tend to utilise it,
including localisation, real estate and e-com (product variants).
Option 1: Increase the differences!

a) Alter keywords
b) Alter non-keywords/phrases (synonymise, add/remove bits of a sentence etc.)
c) Syntactic changes (break/combine sentences, remove/add some, swap order etc.)
d) Inject "variant" specific data in (dates, quantities etc.)
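The four steps above can be sketched programmatically. This is a minimal illustration, not a production tool; the synonym lists, template, and data are all made-up placeholders.

```python
import random

# Hypothetical synonym table - in practice writers/editors supply these.
SYNONYMS = {
    "cheap": ["affordable", "budget-friendly", "low-cost"],
    "buy": ["purchase", "order", "pick up"],
}

def synonymise(text: str, rng: random.Random) -> str:
    """(a/b) Replace known words with a randomly chosen synonym."""
    words = []
    for word in text.split():
        bare = word.strip(".,").lower()
        words.append(rng.choice(SYNONYMS[bare]) if bare in SYNONYMS else word)
    return " ".join(words)

def inject_data(template: str, data: dict) -> str:
    """(d) Fill variant-specific facts (city, quantities etc.) into a template."""
    return template.format(**data)

rng = random.Random(42)
base = "buy cheap widgets in {city} from {stock} suppliers"
print(inject_data(synonymise(base, rng), {"city": "Leeds", "stock": 14}))
```

Syntactic changes (step c) would be extra passes of the same shape: split/merge sentences, reorder clauses, drop optional fragments.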
Option 2: Canonicalise!

Create a "primary version" (brandless).
That's your canonical.

Create "brand variants".
Canonicalise them to your canonical.

G will see you have N variants of X,
and will show whichever it thinks is the most relevant when people search.
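As a sketch of that setup (the URL scheme here is an assumption): every brand variant emits the same canonical tag pointing at the brandless primary.

```python
# The brandless "primary version" - this is the canonical.
PRIMARY = "https://example.com/widgets/"

def canonical_tag(primary_url: str) -> str:
    """Build the <link rel="canonical"> tag for a variant page's <head>."""
    return f'<link rel="canonical" href="{primary_url}">'

# Brand variants all canonicalise to the primary.
variants = [
    "https://example.com/acme-widgets/",
    "https://example.com/bravo-widgets/",
]

for url in variants:
    print(url, "->", canonical_tag(PRIMARY))
```

Note that rel="canonical" is a hint, not a directive - G may still pick its own canonical if the pages diverge too much.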
Option 3: Get programmatic!

There are usually several different ways to say things.
If you know you are going to generate N variants,
get the writers to provide several versions of each sentence/paragraph/section.

You can then use these to create additional versions.
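A toy sketch of that combinatorial approach (the sentences are placeholders): with several versions per section, one choice per slot multiplies out into many distinct variants.

```python
import itertools
import random

# Writers provide multiple versions of each sentence/section.
SECTIONS = [
    ["Widgets save you time.", "A good widget cuts hours of work."],
    ["Ours ship in 24 hours.", "Delivery takes one working day."],
    ["Order online today.", "Buy now from our site."],
]

def all_variants(sections):
    """Every combination: one version per section (2 x 2 x 2 = 8 here)."""
    return [" ".join(combo) for combo in itertools.product(*sections)]

def random_variant(sections, rng):
    """One randomly assembled variant (for N pages, sample N of these)."""
    return " ".join(rng.choice(versions) for versions in sections)

print(len(all_variants(SECTIONS)))  # 8 distinct versions from 6 sentences
```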
Personally - I advocate ALL 3 !

You should be taking a programmatic approach,
tagging parts of the content as optional (random inclusion).

... and pulling in additional data,

... and utilising canonicalisation (when/where applicable).
Taking such steps helps you avoid G's filters for longer, if not completely.

(Remember - it's not just about dissimilarity - it is meant to be useful/beneficial to the user)

The next thing is ... knowing what to do if G does find it,
dislike it, and kick your site down!
If G does hammer your site:

1) Look for a Manual Action in GSC

2) Look at what pages/terms are hit via GSC

3) Decide on a "fix"
a) Start 301 Redirecting variants to a primary
(or weaker/less valuable versions to the biggest earner etc.),
b) Noindex the weaker ones until rewritten
Once you've made the fixes:
i) If there was a MA, fire a Reconsideration Request
or
ii) You have to wait until whatever quality algo/filter hit you decides you're good enough again (may take months (and months!))

(See why I say you should warn people about this?)
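Step 3 above can be sketched as a simple triage: map near-worthless variants onto the biggest earner (for 301s), and flag the rest for noindex until rewritten. URLs, earnings figures and the threshold are all hypothetical example data.

```python
# Monthly earnings per variant (example data).
pages = {
    "/acme-widgets/": 120.0,
    "/bravo-widgets/": 15.0,
    "/zeta-widgets/": 3.0,
}

def build_fixes(pages, redirect_threshold=10.0):
    """Redirect near-worthless variants to the top earner; noindex the rest."""
    primary = max(pages, key=pages.get)
    redirects, noindex = {}, []
    for url, earnings in pages.items():
        if url == primary:
            continue
        if earnings < redirect_threshold:
            redirects[url] = primary      # 301 this variant to the big earner
        else:
            noindex.append(url)           # keep, but noindex until rewritten
    return redirects, noindex

redirects, noindex = build_fixes(pages)
print(redirects)  # {'/zeta-widgets/': '/acme-widgets/'}
print(noindex)    # ['/bravo-widgets/']
```

The output would then feed your redirect rules and templates; the actual 301s/noindex tags live in the server config or page heads.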

More from @darth_na

Oct 13
.
:: Content Design ::

The UX magic happens when content creation, design principles and application/HTML markup/CSS intersect.

But it's not as easy as it appears,
due to display variances (such as "5 lines" on Desktop isn't 5 lines on Mob!)

#SEO #Content #ContentDesign

>>>
It starts with ... the User.

Good content design is grounded in knowing what the user wants, and how they want it.

(I know the general rule is stuff like "write for an 8th grader" or "16yo" etc. - but that doesn't work when your audience is literally brain surgeons!)
The audience's language and knowledge "levels"
define things like:
* abbreviations
* topic/industry terms/jargon
* sentence length
* sentences per paragraph
* distance between references
* overall length of content
etc.
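Those audience-dependent constraints lend themselves to a simple draft check. The limits and the jargon list below are made-up placeholders to show the shape of such an audit, not real editorial thresholds.

```python
# Per-audience limits (illustrative values only).
LIMITS = {
    "general": {"max_sentence_words": 20, "jargon_ok": False},
    "expert":  {"max_sentence_words": 35, "jargon_ok": True},
}
JARGON = {"canonicalisation", "craniotomy"}  # hypothetical examples

def audit(text: str, audience: str):
    """Return a list of warnings for sentences that break the audience's limits."""
    rules = LIMITS[audience]
    warnings = []
    for sentence in filter(None, (s.strip() for s in text.split("."))):
        words = sentence.split()
        if len(words) > rules["max_sentence_words"]:
            warnings.append(f"too long ({len(words)} words): {sentence[:40]}...")
        if not rules["jargon_ok"] and JARGON & {w.lower() for w in words}:
            warnings.append(f"jargon for a general audience: {sentence[:40]}...")
    return warnings

print(audit("Canonicalisation helps here.", "general"))
```

The same sentence passes for "expert" and fails for "general" - which is the whole point: one set of limits doesn't fit all audiences.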
Oct 11
1/?
:: Crawling Large, Huge and Mega sites ::
:: Partial vs Full crawls ::

@JetOctopus has done a piece looking at some of the issues that can arise from not doing a Full Crawl,
or, as I'd phrase it, doing a really shoddy Partial Crawl.

#SEO #Crawling

>>>
2/?
>>>

Personally, I hold a slightly different opinion.

Yes - you should do a Full Crawl,
but you shouldn't need to do one every single time.

Instead, you should do an initial one,
and run another after any significant change (such as a platform migration), preferably on the dev version first.

>>>
3/?
>>>

But most of the time, Partial Crawls should be sufficient,
if done properly!

So, here's a quick guide for Partial Crawling ...

> Priorities
You should know what pages are Vital to the site/business

> Updates
You should know URLs of new and altered content

>>>
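The Priorities + Updates guidance above can be sketched as a crawl-list builder: vital pages always go in, and recently added/changed URLs fill the remaining budget. All URLs and the budget are hypothetical.

```python
# Pages vital to the site/business - always crawled.
VITAL = ["/", "/pricing/", "/widgets/"]

def partial_crawl_list(vital, changed_urls, budget=100):
    """Vital pages first; fill the remaining budget with new/altered URLs."""
    seen, queue = set(), []
    for url in vital + sorted(changed_urls):
        if url not in seen and len(queue) < budget:
            seen.add(url)
            queue.append(url)
    return queue

print(partial_crawl_list(VITAL, {"/blog/new-post/", "/pricing/"}))
# ['/', '/pricing/', '/widgets/', '/blog/new-post/']
```

A real crawler would also pull the "changed" set from sitemaps/CMS logs rather than a hand-made list.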
Oct 9
1/?

:: B2B SEO difficulty ::

Getting across the idea that ToF/MoF (early-journey) content is as valuable as BoF/Conversion content.

Due to some SEOs and the way G says things,
there's this stupid misunderstanding about ranking pages (not sites),
and not getting how X supports/improves Z.

#SEO
2/?
From a marketing/consumer perspective,
that content enables early awareness/recognition of the company,
and starts building trust, rapport and emotion/loyalty.

From an SEO perspective, it increases topicality and Link value flow (internal links have been important for years)
3/?

There's also the wonderful confusion (read as *fucking annoying) regarding the "funnel".
For some reason - people seem to think there's only one.
In most cases - there's 2 "broad" funnels (marketing and sales).
And the "marketing funnel" is often several funnels!
Sep 26
.
:: Regular performance audits are good ::
:: Alerts on Priority content is better! ::

You should have a separate segment,
tracking priority pages:
1) Those that do contribute heavily to Goals
2) Those that ideally would contribute to Goals

Alert when they drop!

#SEO

>>>
>>>

For SEO, the primary metrics will be:
1) Ranking position per term
2) The URL per term
3) CTR per term/URL
4) Impressions per term/URL

For Business, these may in turn influence:
Business Goal KPI (such as revenue)

When you start to see negative shifts,
>>>
>>>

... you want to start keeping an eye on things!

For starters - don't panic.
Verify!

A) Has it impacted the Goal/KPI?
B) Is it actually a drop?
C) Is it seasonal?
D) Is the same shift happening for everyone?

Not all drops are painful or unusual!

>>>
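The "alert when they drop" idea above reduces to comparing current metrics against a baseline. A minimal sketch, with arbitrary example figures and a 20% threshold that is illustrative, not a recommendation:

```python
def check_drops(baseline, current, threshold=0.2):
    """Flag any term/URL whose metric fell more than `threshold` (default 20%)."""
    alerts = []
    for key, base in baseline.items():
        now = current.get(key, 0)
        if base > 0 and (base - now) / base > threshold:
            alerts.append((key, base, now))
    return alerts

# Clicks per (term, URL) - example data for a priority segment.
baseline = {("widgets", "/widgets/"): 500, ("cheap widgets", "/widgets/"): 200}
current  = {("widgets", "/widgets/"): 480, ("cheap widgets", "/widgets/"): 90}

print(check_drops(baseline, current))
# [(('cheap widgets', '/widgets/'), 200, 90)]
```

The same function works per position, CTR or impressions - just feed it a different metric dict.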
Sep 24
1/5
🚧:: Page Layout and Content Concepts ::🚧
:: There is (should be!) a lot in a page! ::

For example, the basic breakdown below is missing:
* Date (Publish/Edit)
* Author
* Secondary Nav (Breadcrumbs)
* Primary Heading
* Secondary Heading

#SEO #Webdesign

>>>
2/5
>>>

* Introduction/Value Proposition
* Page Index/Content/Jump Links
* SubHeadings
* CTAs (Primary/Secondary)
* Support/Prop Links (to Primary/BoF/Goal page)
* Tertiary Nav (Section/Siblings)

And those are the bare-basics!
There's also optional things...

>>>
3/5
>>>

Optional elements would include things like:
* Read time (est. based on word count)
* Social action/share links
* Pull quotes (and share-excerpt links for social)
* Comments/Reviews (and indication of such at top)
* Sub-Images (non-hero, set as lazy etc.)
etc.

>>>
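The element checklists above can be turned into a rough automated audit. This sketch (stdlib only) just checks for a few tag-level proxies; which tags count as "present" is a big simplification for illustration.

```python
from html.parser import HTMLParser

# Proxies for a few of the basics: primary heading, breadcrumbs, date.
REQUIRED = {"h1", "nav", "time"}

class ElementScan(HTMLParser):
    """Collect every tag name that appears in the page."""
    def __init__(self):
        super().__init__()
        self.seen = set()

    def handle_starttag(self, tag, attrs):
        self.seen.add(tag)

def missing_elements(html: str):
    scanner = ElementScan()
    scanner.feed(html)
    return sorted(REQUIRED - scanner.seen)

page = "<html><body><h1>Widgets</h1><nav>Home / Widgets</nav></body></html>"
print(missing_elements(page))  # ['time'] - no publish/edit date on the page
```

A real audit would match on classes/roles and structured data, not bare tag names - this only shows the shape of the check.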
Sep 23
.
:: Google doesn't have a % for Duplicate Content? ::

1) How does G avoid showing dupes in the SERPs?
2) How does G identify candidates for auto-canonicalisation?
3) Or decide to act on Redirects/CLE?

... and this is why Googlers typically don't answer my questions :(

#SEO

[Screenshot (cropped): two tweets from Bill Hartzer (@bhartzer)]
Logic says G not only has a method (at least one),
but uses it too!

And I've asked (at least 4 times) for an indication of what the threshold is,
and whether it's based solely on "content",
or if it includes code etc.,
and been ignored/evaded - every single time :(
I don't think there is a "specific" threshold,
I believe it is variable - based on availability.

G can/does show duplicates and highly similar results in a SERP,
esp. if there is little else for them to show.

The more "options" to show (that aren't dupes),
the fewer G shows.
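There's no published threshold, but one classic way engines estimate near-duplication is shingling: Jaccard similarity over word n-grams. This is an illustrative reconstruction of that general technique, not Google's actual method.

```python
def shingles(text: str, n: int = 3) -> set:
    """All n-word sequences ("shingles") in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a: str, b: str) -> float:
    """Jaccard similarity of the two texts' 3-word shingles (0..1)."""
    sa, sb = shingles(a), shingles(b)
    if not sa and not sb:
        return 1.0
    return len(sa & sb) / len(sa | sb)

a = "buy cheap widgets online today from our store"
b = "buy cheap widgets online now from our store"
print(round(similarity(a, b), 2))  # 0.33 - one swapped word breaks 3 shingles
```

Whatever G actually uses, a score like this only becomes a "dupe" decision relative to what else is available - which fits the variable-threshold behaviour described above.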
