Noel Ceta · Oct 24 · 16 tweets · 3 min read
I used AI to analyze and rebuild the internal linking structure for a 2,500-page site.

Organic traffic increased 47% in 90 days.

Zero new content. Zero new backlinks.

Here's the exact AI-powered internal linking system: 🧵
1/ Why internal linking matters:

It tells Google:
What pages are most important
How topics relate to each other
Where to distribute PageRank
What content is authoritative

Most sites have chaotic internal linking that wastes authority and confuses crawlers.
2/ The Traditional Problem:

Manual internal linking for 500+ pages is:

Time-intensive (months of work)
Inconsistent (humans miss opportunities)
Subjective (no data-driven decisions)
Impossible to maintain (new pages = new links needed)

This is where AI changes everything.
3/ The AI-Powered System (5 Steps):

Extract and analyze all content
Map semantic relationships
Identify linking opportunities
Generate contextual anchor text
Monitor and optimize

Let me break down each step:
4/ Step 1: Extract and Analyze All Content

Use Python + OpenAI API to:

Scrape all page content
Extract main topics from each page
Identify primary and secondary keywords
Categorize by content type and funnel stage

AI understands semantic meaning, not just keyword matching.

Output: Database of all pages with topic vectors.
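
Here's a minimal Python sketch of this step, assuming a flat list of URLs and an OpenAI API key in the environment. The function names, prompt wording, and model choice are illustrative placeholders, not the exact script behind this thread:

```python
# Minimal sketch: fetch each page and ask an LLM to summarize its topics.
# Error handling, rate limiting, and storage are omitted for brevity.
import json

import requests
from bs4 import BeautifulSoup
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def extract_text(url: str) -> str:
    """Fetch a page and return its visible body text."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style", "nav", "footer"]):
        tag.decompose()
    return soup.get_text(separator=" ", strip=True)

def analyze_page(url: str) -> dict:
    """Ask the model for topics, keywords, content type, and funnel stage."""
    text = extract_text(url)[:6000]  # keep the prompt small
    prompt = (
        "Return JSON with keys: main_topic, secondary_keywords (list), "
        "content_type, funnel_stage. Page text:\n\n" + text
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
        response_format={"type": "json_object"},
    )
    data = json.loads(response.choices[0].message.content)
    data["url"] = url
    return data

pages = [analyze_page(u) for u in ["https://example.com/seo-tools"]]  # placeholder URL
print(pages)
```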
5/ Step 2: Map Semantic Relationships

Use AI embeddings to find:

Topically related pages (cosine similarity >0.7)
Parent-child relationships (pillar → cluster)
Supporting content connections
Cross-selling opportunities

AI finds connections humans would miss.

Example: "SEO tools" semantically links to "keyword research process" even without shared keywords.
6/ Step 3: Identify Linking Opportunities

AI analyzes:
Pages with high authority (backlinks, traffic)
Pages needing authority boost (low rankings)
Content gaps (topics mentioned but not linked)
Broken or weak internal link paths

Priority matrix:
High authority page → Low performing related page = Top priority link
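
One way to score that priority matrix in pandas. The column names and scoring formula below are illustrative assumptions, not the exact weights used on this site:

```python
# Minimal sketch: rank candidate links so that
# "high-authority source -> underperforming related target" comes first.
import pandas as pd

candidates = pd.DataFrame(
    {
        "source": ["/pillar-guide", "/old-post"],
        "target": ["/struggling-page", "/money-page"],
        "source_authority": [0.9, 0.2],   # e.g. normalized backlinks/traffic
        "target_avg_position": [14, 6],   # current ranking of the target
        "similarity": [0.82, 0.74],       # from the embedding step
    }
)

# Targets sitting just outside the top 10 gain the most from a new link.
candidates["needs_boost"] = candidates["target_avg_position"].between(11, 20).astype(int)
candidates["priority"] = (
    candidates["source_authority"] * candidates["similarity"] * (1 + candidates["needs_boost"])
)
print(candidates.sort_values("priority", ascending=False))
```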
7/ Step 4: Generate Contextual Anchor Text

AI creates natural anchor text by:

Analyzing surrounding paragraph context
Suggesting 3-5 anchor text variations
Avoiding over-optimization (varied anchors)
Maintaining natural reading flow

Example: Instead of "click here" or exact match "best SEO tools", AI suggests "we've covered the top keyword research platforms" with contextual fit.
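
A minimal sketch of the anchor-suggestion step. The prompt wording and model are assumptions; in practice you'd feed in the real paragraph surrounding the planned link:

```python
# Minimal sketch: ask the model for a few natural anchor phrasings that fit
# the surrounding paragraph.
from openai import OpenAI

client = OpenAI()

def suggest_anchors(paragraph: str, target_topic: str) -> str:
    prompt = (
        f"This paragraph will link to a page about '{target_topic}'.\n"
        "Suggest 3-5 anchor text variations that read naturally in context, "
        "avoid exact-match repetition, and never use 'click here'.\n\n"
        f"Paragraph:\n{paragraph}"
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

print(suggest_anchors(
    "Before picking a platform, map out the queries your audience actually uses.",
    "keyword research process",
))
```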
8/ The Technical Implementation:

Tools I use:
Python + BeautifulSoup (content extraction)
OpenAI API (semantic analysis)
Pandas (data processing)
Google Sheets API (review interface)
WordPress API (implementation)

The entire process takes 3-5 hours for 1,000+ pages vs weeks manually.
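
For the last mile of that stack, here's a hedged sketch of pushing updated post HTML back through the WordPress REST API, assuming an application password. The site URL, credentials, and post ID are placeholders:

```python
# Minimal sketch: write the reviewed, re-linked HTML back to WordPress.
import requests

WP_SITE = "https://example.com"
AUTH = ("editor-user", "application-password")  # placeholder credentials

def update_post_content(post_id: int, new_html: str) -> None:
    """Update one post's body via the WordPress REST API."""
    response = requests.post(
        f"{WP_SITE}/wp-json/wp/v2/posts/{post_id}",
        json={"content": new_html},
        auth=AUTH,
        timeout=15,
    )
    response.raise_for_status()

# update_post_content(123, "<p>...updated body with new internal links...</p>")
```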
9/ The Linking Strategy Rules:

AI follows these parameters:

Max 5 internal links per 1,000 words
Link to pages within 2 topic degrees
Prioritize money pages (3× more links)
Deep link to old content (not just homepage)
Vary anchor text (no repetition)
Link bidirectionally when relevant
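
Those parameters are easy to enforce as a pre-publish check. A minimal sketch, assuming the proposed links for a page arrive as a list of {target, anchor} dicts:

```python
# Minimal sketch: validate proposed links against the density and
# anchor-variety rules before anything gets published.
def validate_links(word_count: int, links: list[dict]) -> list[str]:
    """links: [{'target': str, 'anchor': str}, ...] proposed for one page."""
    issues = []
    max_links = max(1, round(word_count / 1000 * 5))  # max 5 links per 1,000 words
    if len(links) > max_links:
        issues.append(f"{len(links)} links exceeds cap of {max_links}")
    anchors = [link["anchor"].lower() for link in links]
    if len(set(anchors)) < len(anchors):
        issues.append("duplicate anchor text detected")
    return issues

print(validate_links(1200, [
    {"target": "/keyword-research-process", "anchor": "keyword research platforms"},
    {"target": "/seo-tools", "anchor": "our tool comparison"},
]))
```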
10/ Real Results from Implementation:

Site metrics after 90 days:

47% increase in organic traffic
156 keywords moved from position 11-20 to top 10
23% improvement in crawl efficiency
31% increase in pages receiving organic traffic
Average session duration +18%

No new content. Just better internal linking.
11/ The Authority Distribution Effect:

Before AI optimization:

Homepage: 95% of authority
Category pages: 3% of authority
Deep content: 2% of authority

After AI optimization:

Homepage: 60% of authority
Category pages: 25% of authority
Deep content: 15% of authority

Better distribution = better rankings site-wide.
12/ The Maintenance System:

AI automation for ongoing optimization:

Weekly: Scan for new content, suggest links
Monthly: Identify orphan pages (0 internal links)
Quarterly: Re-analyze semantic relationships
Continuous: Monitor how rankings respond to each linking change

Internal linking is not one-and-done.
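
The monthly orphan check takes only a few lines once you have the crawl data from step 1. A minimal sketch with placeholder data:

```python
# Minimal sketch: pages that exist in the sitemap but receive zero internal links.
all_pages = {"/a", "/b", "/c", "/old-guide"}                  # from the sitemap/crawl
internal_links = [("/a", "/b"), ("/b", "/c"), ("/c", "/a")]   # (source, target) pairs

linked_targets = {target for _, target in internal_links}
orphans = all_pages - linked_targets
print("Orphan pages needing links:", sorted(orphans))
```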
13/ Common Internal Linking Mistakes AI Fixes:

❌ Linking to irrelevant pages (keyword match only)
❌ Over-linking to homepage (wasted authority)
❌ Orphan pages (no internal links)
❌ Using same anchor text repeatedly
❌ Ignoring contextual relevance
❌ No links to deep content

AI catches all of these systematically.
14/ How to Start Without Coding:

If you can't build the system:

Use these AI tools:
Link Whisper (WordPress plugin with AI suggestions)
Surfer SEO (internal link recommendations)
ChatGPT + site crawl data (manual analysis)

Or hire a developer for custom implementation.

The ROI justifies the investment.
15/ AI-powered internal linking:

✓ Saves 100+ hours of manual work
✓ Finds opportunities humans miss
✓ Distributes authority strategically
✓ Scales with your content growth
✓ Compounds ranking improvements

Stop linking randomly. Start linking strategically with AI.

Bookmark this for your next site audit
