The default view shows you all of the domains that have recently been deleted.
Now let's customise our view to surface more useful information. Click Column Manager, tweak the columns to show the data you want, and hit Save.
You can see my columns are more focused on backlinks.
Click back on Deleted Domains at the top and you should now see the deleted domains with more backlink data.
Now, we obviously only want to see domains that are available, so click Show Filter and tick this box on the Common tab.
We also want to get rid of any domains with no backlinks, so click on the Majestic tab, look for Ext BL and set a Min of 1.
You can also search for domains that have .edu or .gov links using this filter too.
Click Apply Filter to see the domains update.
Let's say you're in a travel niche looking for a backlink to a Paris landing page.
Search for Paris at top right.
Then sort by MBL to see domains with the most backlinks according to Majestic.
I've also set the filter so this view only shows .co.uk and .com domain names.
It's a case of finding a domain that might fit what we're after here.
Here we can see a domain that might suit us. If you click the ACR link, we can inspect the site and see what it used to look like on the Wayback Machine.
This site turns out to be a personal blog, so it could work for us.
The domain expired in 2019, so this blogger has likely moved on to other things and potentially no longer needs a website.
Diving into Ahrefs, you can see how many backlinks to this domain are still active and dig further into whether it's a decent candidate for what we're looking to do.
Back in Wayback Machine, look for the last date that this blog had content indexed.
When you've found the last, best-looking version of the site, grab the URL and look for the section with web/20180914042141 - that's the snapshot timestamp.
There are probably similar tools out there for Windows if you have a search around GitHub.
Follow the install guide and then head to your terminal.
Now let's scrape.
Using the commands in the documentation, add the domain name and that timestamp from the Wayback Machine.
The tool will take a snapshot of the site from that date, with all its assets, and generate it into a static website.
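If you went with the open-source wayback_machine_downloader Ruby gem (an assumption on my part, any equivalent tool works), the command ends up looking roughly like this, with the domain and output folder as placeholders:

gem install wayback_machine_downloader
# --to pins the download to that Wayback snapshot timestamp or earlier
wayback_machine_downloader http://example.com --to 20180914042141 --directory ~/websites/example.com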
Once all the files have been downloaded, test what it looks like.
Browse to the folder where your files were downloaded; in my case it was
~/websites/
Go into the folder you downloaded and then run a local server using
npx serve
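Put together, that step is just this (the folder name is a placeholder; yours will match whatever the downloader created):

cd ~/websites/example.com   # placeholder path
ls                          # quick sanity check that index.html and the assets came down
npx serve                   # serves the folder, by default on port 3000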
Now if I head to localhost:3000 in my browser I can see the scraped website running fine.
So we want a backlink on this site.
Using your code editor, you can open up this site and either add a backlink somewhere, or replace some of the content with something more relevant that includes a link back to your site.
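As a small sketch of that step, you can hunt down likely pages from the terminal first; the search term, folder and link below are just placeholders:

# list every page that mentions paris, then open one in your editor and drop in something like
# <a href="https://your-site.example/paris-guide">Paris city guide</a>
grep -ril --include="*.html" "paris" ~/websites/example.com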
Easy link building is easy.
Once you've edited the content, you essentially have a static website ready to host.
Netlify gives you free static hosting.
This will likely be the easiest website you've ever deployed too!
No FTP or SSH needed here!
Once you're signed up for Netlify, you should see something like this.
Click browse to upload and select the folder we want to host.
You should then see something like this. That's it, your site is deployed.
Netlify gives new sites random string URLs. Click Site Settings, then Change site name, and rename it to something more relevant.
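If you'd rather stay in the terminal, the Netlify CLI can do the same deploy; the folder path here is a placeholder:

npm install -g netlify-cli
netlify login          # opens a browser window to authorise
netlify deploy --prod --dir ~/websites/example.com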
Now, all we need to do is buy the domain name from a provider and point it to your Netlify instance.
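For reference, that usually means DNS records along these lines; treat the values as illustrative and use whatever Netlify's domain settings tell you:

# at your registrar's DNS panel:
#   A      @     75.2.60.5                    # Netlify's load balancer, check their current docs
#   CNAME  www   your-site-name.netlify.app
dig +short yourdomain.example A                # confirm the records have propagated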
Once that's all done and the search engines pick this site back up, you have a nice link on an expired domain.
Obviously, there are a few more things you can do next, like optimising the scraped site, adding a sitemap.xml and pushing that into Google Search Console, etc.
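For the sitemap part, a crude sketch is to generate one from the downloaded pages; the domain is a placeholder:

cd ~/websites/example.com
{ echo '<?xml version="1.0" encoding="UTF-8"?>'
  echo '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
  # turns ./about/index.html into <url><loc>https://example.com/about/index.html</loc></url>
  find . -name "*.html" | sed 's|^\.|https://example.com|' | sed 's|.*|<url><loc>&</loc></url>|'
  echo '</urlset>'; } > sitemap.xml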
Then again, you probably shouldn't do this at all.
It's all very dodgy this stuff.
If it's someone's old blog, you should really remove any images and source your own.
Maybe even the content too, as it could have been migrated elsewhere and end up duplicated.
If the domain you choose turns out to be an old brand's website, they could still own certain assets for that site, and you could end up in hot water with legal teams sending cease-and-desists and other things.
Maybe a better approach would be to buy that domain, relaunch it with your own content and assets, redirect the old links to new pages, and add your links there.
But I guess you'll be slowly heading down the PBN route then.
So let's say this thread was just for educational purposes and leave it there.
Learning the sketchy #SEO techniques is just as important as learning the non-sketchy ones.
✅ Relevant image. Optimised and responsive.
✅ Large title, usually the main keyword
✅ Breadcrumb - make sure to use Breadcrumb Schema
✅ Table of Contents - on longer posts this should be collapsed behind a "show table of contents" button
Core Section - We have an introduction section which includes
✅ H2 with a variation on the H1
✅ Large lead paragraph
✅ Bold paragraph that solves the question the page is trying to answer
✅ Continuation paragraph to lead into the rest of the site's content
Here is some more proof. Below you can see the before and after screenshots of my tests and times in the Chrome Network tab.
Caveat: the Finish time never settles on the CSS test because external scripts keep running. The screenshot was taken once it had stabilised though.
Post 3! The game is up! I didn’t really do this. :D
I’m going to show you how you can use DevTools to measure the potential gains to be had by removing bloated scripts. This method is great for getting buy-in from other teams before you go ahead and delete anything from the code.
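As a complementary, scriptable take on the same idea (not the DevTools method this post covers), the Lighthouse CLI can run the before and after for you by blocking a script with a URL pattern; the URL and pattern below are placeholders:

npx lighthouse https://example.com --only-categories=performance --output=json --output-path=./before.json
# same run, but with the suspect script blocked
npx lighthouse https://example.com --only-categories=performance --blocked-url-patterns="*bloated-analytics*" --output=json --output-path=./after.json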
Reduce response times from the server - CDNs can help here
Reduce render-blocking JS and CSS - try to load only the minimum on a per-page basis. Minify, inline and defer are your friends. Check usage with the Coverage tool in DevTools
— Optimising Largest Contentful Paint - cont.
Imagery - Make sure images are well optimised and compressed, and keep an eye on their sizes. Look into image CDNs and lazy loading
Preloading - You can use preload hints for the most important elements on your page