) who shared what they had done during the #vamostalegon event.
I've extended it by adding the main Spanish e-commerce sites.
How do you get the data?
1. In general, when there's an autosuggest, the site is fed by data from an API that you can query through a URL.
The format is almost always JSON. If you know how to look, finding it isn't very complicated.
2. Using nodatanobusiness.com/importjson/, we can use a new formula, IMPORTJSON, to pull this data into Sheets without making life complicated.
3. By detecting the pattern for each site, we can retrieve its data from a seed keyword. We remove duplicates across sites and get our mini keyword research with search volumes via the Keyword Surfer API.
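The dedup step above is trivial to sketch outside Sheets too. Here is a minimal Python sketch, assuming you've already captured the autosuggest JSON from two shops (the response shape and suggestion values are made up; every site's autosuggest API looks different, so inspect the network tab to find the real URL and structure):

```python
import json

# Hypothetical autosuggest JSON responses captured from two shops.
# Real endpoints and shapes differ per site.
shop_a = json.loads('{"suggestions": ["running shoes", "running socks"]}')
shop_b = json.loads('{"suggestions": ["running shoes", "running jacket"]}')

def extract(payload):
    """Pull normalized suggestion strings out of one response."""
    return [s.lower().strip() for s in payload["suggestions"]]

# Merge both lists and drop duplicates while keeping order.
keywords = list(dict.fromkeys(extract(shop_a) + extract(shop_b)))
print(keywords)  # ['running shoes', 'running socks', 'running jacket']
```

From there, the deduplicated list is what you'd send to a volume API such as Keyword Surfer.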
If you're looking for a new niche, using this technique as a starting point is useful and saves time.
I'll also tell you that some e-commerce sites expose an amount of data (such as conversion per search term) that you can query without any problem, which blows my mind.
The file isn't 100% finished, so I can't share it yet, but if you're interested, I'll do so in a few days with detailed usage instructions.
Let's go through the most common formulas you need to master to work quicker.
Most of them can also be used in Excel, but not all of them.
1. VLOOKUP
THE formula you have to master, because it allows you to merge data from different tables. Very useful to combine Search Console and Analytics data, for instance.
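For instance, assuming your Search Console export lives in a sheet named GSC with URLs in column A and clicks in column C (sheet and column layout are illustrative), you could pull clicks next to each URL of your Analytics sheet like this:

```
=VLOOKUP(A2, GSC!A:C, 3, FALSE)
```

The fourth argument, FALSE, forces an exact match, which is almost always what you want when joining on URLs.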
2. FILTER
I've explained everything about this formula in a separate thread:
🕵️ How can you spy on a competitor's content strategy? 🕵️
Your strategy must never be a pale copy of what others are doing, but it's always a good idea to know what they are up to.
Let me show you, with a real example, how you can generate insights quickly.
Let's assume we're working in the travel industry and one of our competitors is Skyscanner.
We want to understand what they are doing on their blog and generate some insights based on the data we have at our disposal.
First step: get an exhaustive list of their URLs
This could be done through a crawl, but I'd rather get the list from a sitemap. Not always doable, but in this case, it was easy to find what I was looking for.
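Once you've located the sitemap, extracting the URL list takes a few lines. A minimal sketch, parsing an inline sample here (the URLs are made up; in practice you'd fetch the real sitemap with something like `requests.get(url).text` first):

```python
import xml.etree.ElementTree as ET

# Stand-in for a downloaded sitemap; fetch the real one over HTTP.
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/blog/post-1</loc></url>
  <url><loc>https://example.com/blog/post-2</loc></url>
</urlset>"""

# Sitemaps declare a default XML namespace, so lookups must include it.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(sitemap_xml)
urls = [loc.text for loc in root.findall("sm:url/sm:loc", ns)]
print(urls)
```

For a large site you'd usually hit a sitemap index first and repeat this over each child sitemap.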
Content rehydration is a process in which a website built with a JavaScript framework, such as Angular or React, dynamically updates the content on a page without requiring a full-page refresh.
Why use rehydration instead of relying only on SSR? It's faster!
What is the issue with content rehydration?
It will add a script to the raw response sent by your server with all the required code to make the application dynamic. Out of the box, this script can easily represent more than 90% of the total HTML size.