Noel Ceta
Dec 9 · 9 tweets · 3 min read
I audited 240 AI-generated articles that weren't ranking.

Applied a systematic quality improvement process.

Average position improved from 28 to 11 over 5 months (67% improvement).

Here's the exact audit framework: 🧵👇
1/ The baseline situation:

Starting performance data:

Content analyzed:

- 240 articles published over 8 months
- All AI-generated (Claude and GPT-4)
- Human editing: 20-30 minutes per article
- Average word count: 1,800 words

Performance metrics (after 4 months):

- Average ranking position: 28
- Page 1 rankings: 18 articles (7.5%)
- Organic traffic: 3,200 sessions/month
- Engagement: 1:24 average time on page

Clear underperformance relative to expectations.
2/ The systematic audit methodology:

Quality assessment process:

Evaluated each article across 8 categories (scored 0-10):

1. Factual accuracy (sources cited, data current)
2. Content depth (comprehensive vs superficial)
3. Unique value (original insights vs rehashed)
4. User intent match (answers actual query)
5. Structure quality (scannable, logical flow)
6. E-E-A-T signals (expertise demonstrated)
7. Technical SEO (proper optimization)
8. Engagement elements (visuals, examples, CTAs)

Articles scoring under 60/80 flagged for improvement.

Result: 187 articles needed substantial updates (78%).
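
Not part of the original thread, but here's a minimal sketch of how an 8-category scorecard like this could be tracked programmatically. The category names come from the list above; the `Article` structure, threshold constant, and sample values are illustrative assumptions:

```python
from dataclasses import dataclass, field

# The eight audit categories from the thread, each scored 0-10.
CATEGORIES = [
    "factual_accuracy", "content_depth", "unique_value", "intent_match",
    "structure", "eeat_signals", "technical_seo", "engagement",
]

FLAG_THRESHOLD = 60  # out of a possible 80 (8 categories x 10 points)

@dataclass
class Article:
    url: str
    scores: dict = field(default_factory=dict)  # category -> 0-10 score

    @property
    def total(self) -> int:
        return sum(self.scores.get(c, 0) for c in CATEGORIES)

    @property
    def needs_update(self) -> bool:
        return self.total < FLAG_THRESHOLD

def audit(articles: list[Article]) -> list[Article]:
    """Return only the articles flagged for substantial improvement."""
    return [a for a in articles if a.needs_update]

# Example: an article scoring 7/10 in every category (56/80) gets flagged.
sample = Article("example.com/post", {c: 7 for c in CATEGORIES})
print(sample.total, sample.needs_update)  # 56 True
```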
3/ Common AI content problems identified:

Pattern analysis across 240 articles:

Issue 1: Generic information (found in 68% of articles)

- Restated common knowledge
- No unique perspective
- Indistinguishable from competitors

Issue 2: Weak examples (found in 71% of articles)

- Generic hypotheticals ("imagine a company...")
- No specific case studies
- Vague scenarios

Issue 3: Missing depth (found in 64% of articles)

- Surface-level coverage
- Key questions unanswered
- Insufficient how-to detail

Issue 4: Poor E-E-A-T (found in 82% of articles)

- No author expertise shown
- Sources not cited
- No original data or research
4/ The improvement protocol:

Step-by-step enhancement process:

For each flagged article (187 total):

Week 1-4: Batch 1 (60 articles)

- Add 3-5 authoritative sources (linked)
- Insert 1-2 specific examples
- Expand thin sections by 300-500 words
- Add author expertise note
- Update publish date

Week 5-8: Batch 2 (64 articles)

- Create original data visualization
- Add industry-specific insights
- Improve structure with better H2s
- Insert FAQ section with schema (see the JSON-LD sketch after this section)

Week 9-12: Batch 3 (63 articles)

- Continue same protocol
- Focus on user intent refinement

Time per article: 90-120 minutes (vs original 20-30 minutes).
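
The FAQ schema mentioned in Batch 2 refers to schema.org's FAQPage structured data. Not the author's tooling, but a minimal Python sketch of generating that JSON-LD block (the question/answer pair is a placeholder):

```python
import json

def faq_jsonld(pairs: list[tuple[str, str]]) -> str:
    """Build a schema.org FAQPage JSON-LD block from (question, answer) pairs."""
    data = {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": question,
                "acceptedAnswer": {"@type": "Answer", "text": answer},
            }
            for question, answer in pairs
        ],
    }
    # Embed the output in the page inside a <script type="application/ld+json"> tag.
    return json.dumps(data, indent=2)

print(faq_jsonld([("How long does the audit take?", "Roughly 90-120 minutes per article.")]))
```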
5/ Specific enhancement tactics:

Actionable improvements applied:

Tactic 1: Source addition

- Before: Claims without attribution
- After: 3-5 links to authoritative sources (studies, government data, industry reports)

Tactic 2: Example specificity

- Before: "Many companies struggle with X"
- After: "According to Gartner's 2024 survey of 500 enterprises, 67% report challenges with X, primarily due to Y and Z"

Tactic 3: Depth expansion

- Before: 200-word section covering topic
- After: 500-word section with subsections, examples, and actionable steps

Tactic 4: E-E-A-T signals

- Before: Anonymous content
- After: Author bio with credentials, "Based on analysis of 50+ client implementations"
6/ Results tracking methodology:

Performance monitoring process:

Tracked weekly for 20 weeks:

- Position changes (Google Search Console)
- Click-through rate improvements
- Organic traffic per article
- Engagement metrics (GA4)

Measured in cohorts:

- Batch 1 (improved weeks 1-4): Tracked from week 5
- Batch 2 (improved weeks 5-8): Tracked from week 9
- Batch 3 (improved weeks 9-12): Tracked from week 13

Control group: 53 articles left unchanged for comparison.
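
A rough sketch of how the cohort comparison could be computed, assuming a weekly Google Search Console export with `page`, `week`, and `position` columns plus a separate page-to-cohort mapping; the file names and column names are assumptions, not from the thread:

```python
import pandas as pd

# Assumed inputs: a weekly GSC export and a page -> cohort mapping
# ("batch_1", "batch_2", "batch_3", or "control").
gsc = pd.read_csv("gsc_weekly_positions.csv")      # columns: page, week, position
cohorts = pd.read_csv("cohort_assignments.csv")    # columns: page, cohort

df = gsc.merge(cohorts, on="page", how="inner")

# Average position per cohort per week (lower is better).
trend = (
    df.groupby(["cohort", "week"])["position"]
      .mean()
      .round(1)
      .unstack("cohort")
      .sort_index()
)
print(trend)

# Improvement vs. each cohort's first tracked week (positive = moved up).
print(trend.iloc[0] - trend.iloc[-1])
```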
7/ Improvement results by timeline:

Performance progression data:

Month 1 post-improvement:

- Average position: 28 → 23 (18% improvement)
- Page 1 rankings: 18 → 29 articles
- Traffic: 3,200 → 4,100 sessions/month (+28%)

Month 3 post-improvement:

- Average position: 23 → 15 (46% improvement)
- Page 1 rankings: 29 → 67 articles
- Traffic: 4,100 → 8,900 sessions/month (+178%)

Month 5 post-improvement:

- Average position: 15 → 11 (67% total improvement)
- Page 1 rankings: 67 → 89 articles
- Traffic: 8,900 → 12,400 sessions/month (+288% from start)

Control group (unchanged articles):

- Average position: 29 → 27 (minimal change)
8/ The AI content audit improved rankings because:

✓ Systematic quality assessment (8-category scoring)
✓ Pattern identification (68-82% of articles had common issues)
✓ Specific enhancements (sources, examples, depth, E-E-A-T)
✓ Substantial time investment (90-120 min per article vs original 20-30)
✓ Phased implementation (12-week improvement cycle)
✓ Performance tracking (weekly monitoring, control group)

Timeline: 5 months from audit start to 67% improvement.

Investment: 280-375 hours total editing time (187 articles × 90-120 min).

AI content can perform well, but requires quality control and strategic enhancement.

Initial light editing (20-30 min) insufficient for competitive niches.

Proper enhancement (90-120 min) brings AI content to competitive performance levels.

More from @noelcetaSEO

Dec 10
Client had 2,847 pages with 404 errors.

Each 404 = wasted opportunity.
Some had backlinks pointing to them.
Link equity flowing to dead pages.

Created 404 recovery strategy.

Results:

- Recovered 340 backlinks
- Rankings improved for 67 pages
- Traffic increased 45%

Here's the exact 404 fixing process: 🧵👇
1/ Finding all your 404s

Method 1: Google Search Console

- Index → Pages → Not found (404)
- Export list
- Sort by referring pages (priority)

Method 2: Screaming Frog

- Crawl site
- Filter: Status Code → 404
- Export with inlinks

Method 3: Server logs (parsing sketch after this section)

- Check 404 requests
- See which get traffic
- Prioritize those

Client had:

- 2,847 total 404s
- 340 with backlinks
- 127 with actual traffic

Fixed high-value ones first.
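
For the server-log method, a minimal parsing sketch. It assumes the common/combined Apache or Nginx access-log format; the file path is a placeholder:

```python
import re
from collections import Counter

# Matches the request path and status code in common/combined log format lines.
LINE = re.compile(r'"[A-Z]+ (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

hits = Counter()
with open("access.log") as fh:          # placeholder path
    for line in fh:
        m = LINE.search(line)
        if m and m.group("status") == "404":
            hits[m.group("path")] += 1

# Most-requested 404s first: these are the URLs worth redirecting or rebuilding.
for path, count in hits.most_common(25):
    print(f"{count:>6}  {path}")
```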
2/ The 404 triage system

Priority 1 (fix immediately):

- Has backlinks
- Has traffic
- High-value keyword potential

Priority 2 (fix this week):

- Has backlinks OR traffic
- Part of important section
- Frequently requested

Priority 3 (fix when possible):

- No backlinks, no traffic
- Old/irrelevant content
- Low priority

Client focused on Priority 1:

- 340 pages
- Fixed in 2 weeks
- Massive impact
Dec 10
One law firm generated $10M in signed case value from organic search over 18 months.

No paid ads. Pure SEO strategy.

Here's the complete playbook they used: 🧵👇
1/ Firm profile and starting position:

Baseline metrics before SEO investment:

Law firm details:

- Practice area: Personal injury (car accidents, slip and fall)
- Location: Mid-sized US city (population 500K)
- Firm size: 4 attorneys
- Years established: 12 years

Starting SEO performance (Month 0):

- Organic traffic: 850 sessions/month
- Ranking keywords: 120
- Leads from organic: 3-5/month
- Case value from organic: ~$150K/year

Decided to invest seriously in SEO after paid ads became too expensive ($800+ per lead).
2/ Investment and timeline:

Budget allocation over 18 months:

SEO budget: $6,500/month average

Breakdown:

- Content creation: $3,000/month (15-20 articles/month)
- Technical SEO: $1,000/month (site optimization, local SEO)
- Link building: $1,500/month (local citations, PR, guest posts)
- Strategy and management: $1,000/month

Total investment: $117,000 over 18 months

Return: $10M in case value (85x ROI)

Note: legal cases have long sales cycles, so the signed case value accumulated over extended settlement timelines.
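
A quick sanity check on those numbers, using only the budget breakdown and timeline stated above:

```python
# Monthly budget breakdown from the thread (USD).
monthly = {
    "content": 3_000,
    "technical_seo": 1_000,
    "link_building": 1_500,
    "strategy": 1_000,
}
months = 18
case_value = 10_000_000

total_spend = sum(monthly.values()) * months
print(total_spend)                      # 117000
print(round(case_value / total_spend))  # ~85x return on spend
```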
Dec 9
E-E-A-T (Experience, Expertise, Authoritativeness, Trustworthiness) determines rankings in competitive niches.

Built authority from zero to ranking in 90 days using this framework.

Here's the complete implementation plan: 🧵👇
1/ Understanding E-E-A-T signals:

What Google evaluates:

Experience: First-hand involvement

- Personal case studies
- Direct product usage
- Real implementation examples
- Industry participation

Expertise: Demonstrated knowledge

- Credentials and qualifications
- Deep subject matter understanding
- Accurate information
- Professional recognition

Authoritativeness: Industry recognition

- Citations by others
- Speaking engagements
- Published research
- Awards or recognition

Trustworthiness: Reliability

- Accurate information
- Transparent practices
- Secure website
- Clear policies
2/ The 90-day implementation timeline:

Phase-based approach:

Days 1-30: Foundation (Trust + Expertise signals)
Days 31-60: Credibility (Authority building)
Days 61-90: Validation (Experience demonstration)

Each phase builds on previous work.

Sequential execution critical for maximum impact.
Dec 8
Built 200+ local links in 90 days

Without sending a single outreach email.

No begging. No templates. No spam.

Here are 25 link-building tactics that actually work in 2025: 🧵👇
1/ Category 1: Easy Wins (Do First)

TACTIC 1: Local Business Directories

- Chamber of Commerce
- Better Business Bureau
- City business licenses
- Industry associations

Time: 1 hour
Links: 10-15
DR: 40-70
2/ TACTIC 2-4: Easy Wins (continued)

TACTIC 2: Local News Submissions
Submit to community calendars, event listings

TACTIC 3: Supplier/Vendor Pages
Ask suppliers to list you as customer

TACTIC 4: Client Testimonials
Give testimonial → get link back

Each: <30 minutes
Quality: Medium-High
Dec 1
Tested AI vs human-written meta descriptions across 500 pages for 6 months.

AI-generated descriptions boosted CTR by 23% on average. Saved 20+ hours of writing.

Humans write. AI converts.

Here’s why it works and how to implement it at scale 🧵👇
1/ The testing methodology:

Fair comparison setup:

Sample: 500 product pages split into two groups

- Group A (250 pages): Human-written meta descriptions
- Group B (250 pages): AI-generated meta descriptions
- All pages had similar traffic and rankings
- Tracked for 6 months

Metric measured: Click-through rate (CTR) from search results.
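
Not the author's tooling, but a minimal sketch of the group comparison, assuming a page-level GSC export with `page`, `group`, `clicks`, and `impressions` columns; the file name and group labels are placeholders:

```python
import pandas as pd

# Assumed export: one row per page with clicks, impressions, and its test group.
df = pd.read_csv("meta_description_test.csv")  # columns: page, group, clicks, impressions

# Aggregate CTR per group (total clicks / total impressions).
summary = df.groupby("group")[["clicks", "impressions"]].sum()
summary["ctr"] = summary["clicks"] / summary["impressions"]
print(summary)

# Relative uplift of the AI group over the human-written group.
uplift = summary.loc["ai", "ctr"] / summary.loc["human", "ctr"] - 1
print(f"AI vs human CTR uplift: {uplift:+.1%}")
```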
2/ CTR performance results:

Click-through rate comparison:

Human-written meta descriptions:

- Average CTR: 3.2%
- Range: 1.8% to 5.4%
- Time to write: 3-5 minutes each

AI-generated meta descriptions:

- Average CTR: 3.94%
- Range: 2.4% to 6.1%
- Time to generate: 10 seconds each

Difference: +23% higher CTR with AI descriptions.

Time saved: 1,250 minutes (20+ hours) for 500 descriptions.
Nov 30
Security headers improve SEO?

Yes.

Google wants secure sites.
Security headers signal trustworthiness.

Added proper headers to client site.

Results:

- Security score: D → A+
- Site speed improved
- Google's trust increased
- Rankings up 15% average

Takes 10 minutes to implement: 🧵👇
1/ Why security headers matter for SEO

Google's perspective:

- Secure sites = better user experience
- Headers prevent attacks
- Headers improve performance
- Trust signal for rankings

Headers that matter:

- HSTS (force HTTPS)
- CSP (prevent XSS)
- X-Frame-Options (prevent clickjacking)
- X-Content-Type-Options (prevent MIME sniffing)
- Referrer-Policy (control referrer info)

Client before headers:

- SecurityHeaders.com score: F
- Vulnerable to basic attacks

After:

- Score: A+
- Rankings improved
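
The thread implements these headers at the web-server level (next tweet shows Apache and Nginx). As a hedged alternative for sites served directly by a Python app, the same headers can be attached in an after-request hook; this minimal Flask sketch uses common default values, not the client's exact configuration:

```python
from flask import Flask

app = Flask(__name__)

SECURITY_HEADERS = {
    # Force HTTPS for a year, including subdomains.
    "Strict-Transport-Security": "max-age=31536000; includeSubDomains; preload",
    # Restrictive starting point for CSP; relax per-site as needed.
    "Content-Security-Policy": "default-src 'self'",
    "X-Frame-Options": "DENY",
    "X-Content-Type-Options": "nosniff",
    "Referrer-Policy": "strict-origin-when-cross-origin",
}

@app.after_request
def add_security_headers(response):
    # Attach every header to every response.
    for name, value in SECURITY_HEADERS.items():
        response.headers[name] = value
    return response

@app.route("/")
def index():
    return "ok"
```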
2/ HSTS: Force HTTPS permanently

HTTP Strict Transport Security.

Tells browsers: "Always use HTTPS for this site."

Implementation:

```apache
# Apache .htaccess
Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains; preload"
```

```nginx
# Nginx
add_header Strict-Transport-Security "max-age=31536000; includeSubDomains; preload" always;
```

Benefits:

- No HTTP→HTTPS redirect on repeat visits (faster)
- Prevents protocol downgrade attacks
- Google loves it

Preload:
Submit to hstspreload.org
Chrome/Firefox will always use HTTPS for your domain.
