TL;DR — the most common site speed monitoring pitfalls:
- Synthetic tools only
- Trust Lighthouse Score
- Improper metrics
- Mobile (or Desktop) only
- Confound domain and page-level data
- Slow outcomes validation
- Measure all your internal pages
- ...
Synthetic tools only
Synthetic tools are useful during debugging and development, but they never represent your real users' experiences.
You should primarily monitor with a RUM (Real User Monitoring) tool and combine it with a synthetic one.
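A minimal RUM sketch using Google's open-source web-vitals library (v3 function names); the /rum endpoint is a placeholder for whatever collector you use:

```ts
// RUM collection sketch: report field metrics from real users' browsers.
// Assumes web-vitals v3+ (`npm i web-vitals`); /rum is a hypothetical endpoint.
import { onTTFB, onFCP, onLCP, onFID, onCLS, type Metric } from 'web-vitals';

function sendToAnalytics(metric: Metric) {
  const body = JSON.stringify({
    name: metric.name,       // e.g. "LCP"
    value: metric.value,     // ms (unitless for CLS)
    id: metric.id,           // unique per page load, for deduplication
    page: location.pathname, // lets you aggregate per page type later
  });
  // sendBeacon survives page unload; fall back to fetch with keepalive.
  if (!navigator.sendBeacon('/rum', body)) {
    fetch('/rum', { method: 'POST', body, keepalive: true });
  }
}

onTTFB(sendToAnalytics);
onFCP(sendToAnalytics);
onLCP(sendToAnalytics);
onFID(sendToAnalytics);
onCLS(sendToAnalytics);
```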
Trust Lighthouse score
You shouldn't trust the Lighthouse score, for two reasons:
1- 50% of pages with a perfect score don't pass the Core Web Vitals assessment. @philwalton wrote about it: bit.ly/3vxMBEh
2- Run three consecutive audits on the same page and you'll never get the same score: variability! (See the sketch below.)
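You can see the variability for yourself with a quick Node sketch that shells out to the Lighthouse CLI three times (the target URL and example scores below are placeholders):

```ts
// Run the same Lighthouse performance audit three times and compare scores.
// Assumes the lighthouse CLI is installed (`npm i -g lighthouse`) plus Chrome.
import { execSync } from 'node:child_process';

const url = 'https://example.com'; // placeholder target page
const scores: number[] = [];

for (let i = 0; i < 3; i++) {
  const json = execSync(
    `lighthouse ${url} --only-categories=performance ` +
      `--output=json --output-path=stdout --quiet --chrome-flags="--headless"`,
    { encoding: 'utf8', maxBuffer: 64 * 1024 * 1024 },
  );
  scores.push(Math.round(JSON.parse(json).categories.performance.score * 100));
}

// Three "identical" runs, three (usually) different scores.
console.log(scores); // e.g. [91, 87, 94] — illustrative values only
```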
Improper site speed metrics
Many of the well-known web performance metrics are OUTDATED today.
Stick with the ones that matter most. Here are mine (a native-API sketch follows the list):
TTFB for the server;
FCP and LCP for loading;
FID and Responsiveness for interactivity;
CLS for layout stability.
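For reference, here is how those metrics surface in the browser's native Performance APIs; a sketch only — a library like web-vitals handles the many edge cases for you:

```ts
// TTFB: responseStart on the navigation timing entry.
const [nav] = performance.getEntriesByType('navigation') as PerformanceNavigationTiming[];
console.log('TTFB', nav.responseStart);

// FCP: paint timing entries.
new PerformanceObserver((list) => {
  for (const e of list.getEntries()) {
    if (e.name === 'first-contentful-paint') console.log('FCP', e.startTime);
  }
}).observe({ type: 'paint', buffered: true });

// LCP: the last candidate reported before user input is the final value.
new PerformanceObserver((list) => {
  const entries = list.getEntries();
  console.log('LCP candidate', entries[entries.length - 1].startTime);
}).observe({ type: 'largest-contentful-paint', buffered: true });

// FID: delay between the first input and when its handler could run.
new PerformanceObserver((list) => {
  const e = list.getEntries()[0] as PerformanceEventTiming;
  console.log('FID', e.processingStart - e.startTime);
}).observe({ type: 'first-input', buffered: true });

// CLS: sum of layout shifts not caused by recent user input.
// (LayoutShift is not in the standard TS lib types yet, hence the cast.)
let cls = 0;
new PerformanceObserver((list) => {
  for (const e of list.getEntries() as any[]) {
    if (!e.hadRecentInput) cls += e.value;
  }
  console.log('CLS so far', cls);
}).observe({ type: 'layout-shift', buffered: true });
```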
Monitor on a single device type
Don't neglect monitoring site speed on every device type your audience meaningfully uses.
Above all, compare site speed data between devices, for example Desktop vs. Mobile (it will surprise you!).
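Comparing devices starts with tagging every RUM beacon with a device type. A sketch — the breakpoints and the /rum endpoint are assumptions:

```ts
type DeviceType = 'mobile' | 'tablet' | 'desktop';

function deviceType(): DeviceType {
  // Prefer User-Agent Client Hints where available (Chromium browsers),
  // fall back to a viewport-width heuristic elsewhere.
  const uaData = (navigator as any).userAgentData;
  if (uaData?.mobile === true) return 'mobile';
  const w = window.innerWidth;
  return w < 768 ? 'mobile' : w < 1024 ? 'tablet' : 'desktop';
}

function tagBeacon(metric: { name: string; value: number }) {
  navigator.sendBeacon('/rum', JSON.stringify({
    ...metric,
    device: deviceType(),    // aggregate per device type on the server
    page: location.pathname,
  }));
}
```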
Confound domain and page-level data
Domain (origin) data is useful for evaluating how your site performs in aggregate.
Page-level data is where you dive into performance details for individual page types on your website (product page, listing page, content page, etc.).
Monitoring both is a must-do! And the most important part is the TRANSITION from the first to the second (see the CrUX sketch after these steps):
1⃣My domain LCP is not that good 🙃
2⃣Which page type contributes the most to it? 👀
3⃣Prioritize it 🙌
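The same origin-to-page transition, scripted against the CrUX API; CRUX_KEY and the example URLs below are placeholders:

```ts
// Query p75 LCP from the Chrome UX Report API, first for the whole origin,
// then for one representative URL of the suspect page type.
const API = 'https://chromeuserexperience.googleapis.com/v1/records:queryRecord';

async function p75Lcp(query: { origin?: string; url?: string }): Promise<number> {
  const res = await fetch(`${API}?key=${process.env.CRUX_KEY}`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ ...query, metrics: ['largest_contentful_paint'] }),
  });
  const data = await res.json();
  return data.record.metrics.largest_contentful_paint.percentiles.p75;
}

// 1⃣ Origin level: "my domain LCP is not that good"
console.log('origin p75 LCP:', await p75Lcp({ origin: 'https://example.com' }));
// 2⃣ Page level: which page type contributes the most?
console.log('product p75 LCP:', await p75Lcp({ url: 'https://example.com/product/sample' }));
```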
Slow outcomes validation
You made a webperf optimization and pushed it live. Now what?
You wait a month to validate the outcomes with real users? It's too late ☠️
You need fast (daily) RUM validation to confirm wins or detect regressions per page type.
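A sketch of such a daily check: compare yesterday's p75 LCP per page type against a 7-day baseline. fetchDailyP75 is a stand-in for a query against your own RUM backend, and the 10% threshold is an arbitrary assumption:

```ts
// Flag per-page-type regressions (or wins) within a day of shipping a change.
type FetchP75 = (pageType: string, daysAgo: number) => Promise<number>;

async function checkRegressions(
  pageTypes: string[],
  fetchDailyP75: FetchP75, // stand-in for your RUM store's query API
  threshold = 0.1,         // assumed: a 10% change is worth flagging
) {
  for (const pt of pageTypes) {
    const yesterday = await fetchDailyP75(pt, 1);
    // Baseline: mean of the 7 days before yesterday.
    const history = await Promise.all(
      [2, 3, 4, 5, 6, 7, 8].map((d) => fetchDailyP75(pt, d)),
    );
    const baseline = history.reduce((a, b) => a + b, 0) / history.length;
    const delta = (yesterday - baseline) / baseline;
    if (delta > threshold) {
      console.warn(`⚠️ ${pt}: p75 LCP regressed ${(delta * 100).toFixed(1)}%`);
    } else if (delta < -threshold) {
      console.log(`✅ ${pt}: p75 LCP improved ${(-delta * 100).toFixed(1)}%`);
    }
  }
}
```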
Measure all your internal pages
Crawling your website and measuring the web performance of every single page is neither useful nor effective.
Instead, choose the most popular page of each page type.
All your listing pages share (99% of the time) the same template, and therefore the same webperf issues.
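A sketch of that selection: classify URLs from your analytics by page type, then keep only the most-viewed page of each type (the URL patterns are assumptions about your site structure):

```ts
interface PageView { url: string; views: number; }

// Hypothetical page-type patterns — adapt to your own URL structure.
const TYPE_PATTERNS: Record<string, RegExp> = {
  product: /^\/product\//,
  listing: /^\/category\//,
  content: /^\/blog\//,
};

function representativePages(pages: PageView[]): Record<string, string> {
  const best: Record<string, PageView> = {};
  for (const p of pages) {
    const path = new URL(p.url, 'https://example.com').pathname;
    for (const [type, re] of Object.entries(TYPE_PATTERNS)) {
      if (re.test(path) && (!best[type] || p.views > best[type].views)) {
        best[type] = p; // keep the most popular page of this type
      }
    }
  }
  return Object.fromEntries(Object.entries(best).map(([t, p]) => [t, p.url]));
}
```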
Want to avoid all the above pitfalls?
I'm building @SpeetalsTool, a tool that saves you time and helps you approach site speed the right way.
[Thread] New details from @googlewmc on Mobile-first indexing best practices! They added (parity-check sketch after this first batch):
- Focus on having the "same content" on Mobile vs. Desktop
- Same meta robots
- No interaction-based lazy loading (Googlebot doesn't interact with pages, so that content won't be indexed)
- Resources must be crawlable
- Same structured data on both versions
1/4
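Several of these "same on Mobile and Desktop" items are scriptable. A minimal parity-check sketch, assuming Node 18+ (built-in fetch); the regexes are a simplification and the URLs are placeholders — a real audit should use an HTML parser:

```ts
// Compare title, meta description, meta robots, and canonical between the
// desktop and mobile versions of a page.
const PAIRS = [
  { desktop: 'https://example.com/', mobile: 'https://m.example.com/' },
];

const EXTRACTORS: Record<string, RegExp> = {
  title: /<title[^>]*>([^<]*)<\/title>/i,
  description: /<meta[^>]+name=["']description["'][^>]+content=["']([^"']*)["']/i,
  robots: /<meta[^>]+name=["']robots["'][^>]+content=["']([^"']*)["']/i,
  // Note: a correct m-dot setup canonicalizes BOTH versions to the desktop
  // URL, so equal canonicals are the expected, correct state here.
  canonical: /<link[^>]+rel=["']canonical["'][^>]+href=["']([^"']*)["']/i,
};

const pick = (html: string, re: RegExp) => html.match(re)?.[1]?.trim() ?? '(missing)';

for (const { desktop, mobile } of PAIRS) {
  const [d, m] = await Promise.all(
    [desktop, mobile].map((u) => fetch(u).then((r) => r.text())),
  );
  for (const [field, re] of Object.entries(EXTRACTORS)) {
    const dv = pick(d, re);
    const mv = pick(m, re);
    console.log(dv === mv ? `✅ ${field} matches` : `❌ ${field} differs: "${dv}" vs "${mv}"`);
  }
}
```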
- URLs in Mobile structured data must point to the mobile versions! [😅]
- Same meta tags [title, description]
- High-quality images
- Persistent image URLs
- Same alt text for Mobile/Desktop images
- Use supported video formats
- Don't put videos at the bottom of the page
2/4
Additional best practices for separate URLs (m-dot):
- The Mobile version shouldn't return an error
- No fragment URLs for Mobile
- Don't serve one Mobile URL for many Desktop URLs
- Verify both versions on Search Console
- Same robots.txt file
- Correct Canonical and Hreflang implementation
3/4