Digital Marketing Specialist
Czech Republic
Google Search Console, Google Analytics, Looker Studio
Ahrefs, Screaming Frog
Google Ads, AutoHotkey
On-page SEO, Off-page SEO, Content Strategy
PPC Strategy, SEO Reporting
Technical SEO, Revenue Forecasting, Budget Planning
Hungarian
English
Chinese
Czech, German
Description: Developed various AutoHotkey scripts to automate repetitive tasks, improve workflow efficiency, and enhance productivity in digital marketing operations.
This is an overview of SEO reporting and of the insights you can learn about your business and your customers using your own data. I have heard of many people who do not use Google Search Console and limit their insights to Google Analytics results, perhaps with one or two external tools such as Ahrefs or Screaming Frog, but this is not the case for many companies.
First, I want to walk through the metrics and dimensions that I use for creating my SEO reporting strategy.
The SEO funnel starts with Visibility, showing how many keywords fall into the ranking groups at positions 1-3, 4-10, 11-20, 21-50 and 50+.
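As a minimal sketch, assuming a GSC performance export with hypothetical "query" and "position" columns, the bucketing into these ranking groups could look like this:

```python
import pandas as pd

# A minimal sketch, assuming a GSC performance export with "query" and "position" columns.
df = pd.read_csv("gsc_queries.csv")  # hypothetical export file name

# Bucket each keyword into the ranking groups used in the visibility step.
bins = [0, 3, 10, 20, 50, float("inf")]
labels = ["1-3", "4-10", "11-20", "21-50", "50+"]
df["ranking_group"] = pd.cut(df["position"], bins=bins, labels=labels)

# Count how many keywords fall into each group.
print(df.groupby("ranking_group", observed=True)["query"].nunique())
```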
Then we follow up on Impressions, Clicks and Average Position from Google Search Console. I can hardly imagine a business running without GSC, so you can use these metrics along with your URLs and queries.
So you will see plenty of reports here based on your own GSC data. It is all yours, free of charge and available for 16 months, while the other tools on which you spend hundreds of euros a month ask for access to your GSC and even your GA4 data.
It makes no sense to share your data and then pay to look into it and learn about your own performance.
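If you would rather own the raw export than rely on the web UI alone, a minimal sketch like the following pulls the same page- and query-level data through the Search Console API; the service-account file, date range and site URL are placeholders:

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Placeholder credentials file and site URL; use your own verified property.
creds = service_account.Credentials.from_service_account_file(
    "service_account.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)

# Pull page- and query-level Clicks, Impressions and Position for a date range.
response = service.searchanalytics().query(
    siteUrl="https://www.example.com/",
    body={
        "startDate": "2024-01-01",
        "endDate": "2024-03-31",
        "dimensions": ["page", "query"],
        "rowLimit": 25000,
    },
).execute()

for row in response.get("rows", []):
    page, query = row["keys"]
    print(page, query, row["clicks"], row["impressions"], row["position"])
```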
The next step is to see what happens to your visitors on the web pages: pull the Default Channel Group and analyse Users, Engaged Sessions, Views, Engagement Rate, Session Duration and Events if they are relevant to your stakeholders, then narrow down to Transactions and Revenue.
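As a sketch of how those GA4 numbers can be pulled programmatically, the following uses the GA4 Data API; the property ID is a placeholder and the metric list simply mirrors the funnel above:

```python
from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import DateRange, Dimension, Metric, RunReportRequest

# Uses Application Default Credentials; replace the property ID with your own.
client = BetaAnalyticsDataClient()

request = RunReportRequest(
    property="properties/123456789",
    date_ranges=[DateRange(start_date="2024-01-01", end_date="2024-03-31")],
    dimensions=[Dimension(name="sessionDefaultChannelGroup")],
    metrics=[
        Metric(name="activeUsers"),
        Metric(name="engagedSessions"),
        Metric(name="screenPageViews"),
        Metric(name="engagementRate"),
        Metric(name="averageSessionDuration"),
        Metric(name="transactions"),
        Metric(name="totalRevenue"),
    ],
)
report = client.run_report(request)

# Print one row per channel group with the funnel metrics in order.
for row in report.rows:
    channel = row.dimension_values[0].value
    metrics = [m.value for m in row.metric_values]
    print(channel, metrics)
```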
I have seen companies decide that SEO will be accountable only for Impressions, Clicks and Positions along with the other performance data in GSC (indexed pages, 404 pages, link-building activities), because what happens on the website is not solely the responsibility of the SEO team. With a poor UX/UI you can have great content and the revenue will still not come. If UX and SEO work in harmony but IT is underperforming, with slow page loading or a non-functioning cart, registration page or buttons, the revenue will not come either.
I very much agree with shared responsibility for success between the teams.
Avoid manual work, save time and reduce errors by bringing your GSC data into Looker Studio. With this dynamic dashboard you will be able to see all of your core organic performance.
See your website's URLs with their Impressions, Clicks and related Average Positions. I believe you know what these metrics mean; if not:
Impressions: How often someone saw a link to your site on Google. Depending on the result type, the link might need to be scrolled or expanded into view.
Clicks: How often someone clicked a link from Google to your site.
Average Position: A relative ranking of the position of your link on Google, where 1 is the topmost position, 2 is the next position, and so on. Shown only for Google Search results.
Bring these all into one table. Add a date filter and perhaps other filters for the metrics and for your URLs.
You can create calculated fields to name URLs and add labels for page type, create line charts to follow position changes over time, and much more.
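If you prepare the data outside Looker Studio, the same page-type labelling can be sketched in Python; the path patterns ("/product/", "/category/", "/blog/") are purely illustrative and should be adjusted to your own URL structure:

```python
import pandas as pd

# A sketch, assuming your table has a "page" column with full URLs.
df = pd.read_csv("gsc_pages.csv")  # hypothetical export: page, clicks, impressions, position

def page_type(url: str) -> str:
    # Label each URL by page type based on its path; adjust to your own URL structure.
    if "/product/" in url:
        return "Product"
    if "/category/" in url:
        return "Category"
    if "/blog/" in url:
        return "Blog"
    return "Other"

df["page_type"] = df["page"].apply(page_type)
print(df.groupby("page_type")[["clicks", "impressions"]].sum())
```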
Now copy this page and use the same dataset to create a pivot table with all of these dimensions and metrics, but add the queries. This way you see all the URLs and queries, from ranking keywords down to long-tail keywords.
You can use it as a cannibalisation tool, and you can use it to see what search terms visitors used. If you find new products in the queries, do not neglect the results: hand the data over to procurement and tell them that people are landing on your web pages looking for these items, so stock them and make more revenue.
Also, big impressions with small clicks might mean that visitors see your link on their SERPs but prefer not to click on it. In this case you can work on your good old meta titles and descriptions, and support your brand with more visibility, perhaps using a spot-on PPC campaign.
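Both ideas can be sketched on a page-and-query export; the column names and the impression/CTR thresholds below are assumptions to adjust to your own data:

```python
import pandas as pd

# A sketch on page + query rows (assumed columns: page, query, clicks, impressions).
df = pd.read_csv("gsc_page_query.csv")

# Possible cannibalisation: queries where more than one URL collects impressions.
pages_per_query = df.groupby("query")["page"].nunique()
print(pages_per_query[pages_per_query > 1].sort_values(ascending=False).head(20))

# Big impressions, small clicks: high-visibility queries with a weak CTR.
by_query = df.groupby("query")[["clicks", "impressions"]].sum()
by_query["ctr"] = by_query["clicks"] / by_query["impressions"]
weak_ctr = by_query[(by_query["impressions"] > 1000) & (by_query["ctr"] < 0.01)]
print(weak_ctr.sort_values("impressions", ascending=False).head(20))
```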
Bring GSC and Google Analytics 4 into Looker Studio to see where the revenue landed, merging the datasets and making this report dynamic as well.
Moreover, make this report show you whether product or category pages are your organic revenue flagship. Based on several criteria you can steer your product page SEO or your category page SEO, or create more appealing content that aids conversion and brings in revenue.
If you merge GSC and GA4 data, you can check which URLs receive what average engagement time per active user and link the event count to it. Can you imagine 12 sessions of 0 seconds? You might be surprised how large a percentage of your traffic comes from bots.
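Here is a rough sketch of that merge on CSV exports; the file and column names are assumptions, and GA4 landing pages usually need to be normalised to match GSC's full URLs:

```python
import pandas as pd

# Assumed exports: GSC pages (page, clicks, impressions, position) and
# GA4 landing pages (landing_page, sessions, avg_engagement_time, event_count, total_revenue).
gsc = pd.read_csv("gsc_pages.csv")
ga4 = pd.read_csv("ga4_landing_pages.csv")

# GA4 landing pages are usually paths, GSC pages are full URLs, so normalise before merging.
gsc["path"] = gsc["page"].str.replace(r"^https?://[^/]+", "", regex=True)
merged = gsc.merge(ga4, left_on="path", right_on="landing_page", how="left")

# See which URLs earn organic revenue.
print(merged.sort_values("total_revenue", ascending=False).head(20))

# Flag sessions with no engagement at all (possible bot traffic).
suspicious = merged[(merged["sessions"] > 0) & (merged["avg_engagement_time"] == 0)]
print(suspicious[["page", "sessions", "avg_engagement_time", "event_count"]])
```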
I am not keen on expensive SEO tools; I really love Ahrefs, but I would highly recommend Screaming Frog.
You can use Screaming Frog to learn about your best competitors' content strategies and status, and for twenty-something euros per month you can run a crawl on your own web pages.
Bring in GSC and merge it with your Screaming Frog crawl report saved in Google Sheets to find out which URLs rank at which average position, and what their content length, headers and metadata look like.
You will probably focus your resources on finding the content length that delivers the highest rankings with the best CTR.
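A sketch of that join, assuming a standard Screaming Frog internal HTML export (column names such as "Address", "Word Count", "Title 1 Length" and "H1-1" may vary by version) and the GSC page report from earlier:

```python
import pandas as pd

# Assumed inputs: Screaming Frog internal_html.csv and a GSC export with page, clicks, impressions, position.
sf = pd.read_csv("internal_html.csv")
gsc = pd.read_csv("gsc_pages.csv")

# Join crawl data to search performance on the full URL.
merged = sf.merge(gsc, left_on="Address", right_on="page", how="inner")

# Compare content length, title length and H1 against ranking and CTR.
merged["ctr"] = merged["clicks"] / merged["impressions"]
print(
    merged[["Address", "Word Count", "Title 1 Length", "H1-1", "position", "ctr"]]
    .sort_values("position")
    .head(20)
)
```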
You can also use Screaming Frog to crawl your best competitors and learn from their content strategy.
Where do your organic positions place you in the market? It may sound as if I am talking you into it, but I want us to build on the same common ground.
Search volume shows an organic keyword's estimated monthly search volume. You may not get a perfectly accurate value this way, but let's agree that we use a tool to find the search volume.
Impressions in GSC are defined as "how often someone saw a link to your site on Google", and an impression is generated after a relevant search term is entered into Google's search bar.
Say a keyword (call it a query or a search term, as you like) has an estimated monthly search volume of 10,000, and for the same keyword and its variants you have a total of 6,421 impressions; then we can state, as an indication, that your keywords cover a large part of the market.
Let's not claim that the coverage is exactly 64%, because the keyword tools only provide an estimate and your keywords' impressions do not match every aspect of the SERP. But you can position your business by comparing 10,000 of search volume against 6,421 impressions versus 10,000 against 777 impressions: a strong position in the market or a weak one.
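The arithmetic behind that indication is trivial, but worth writing down once; the numbers are simply the examples from the text:

```python
# Indication-level market coverage: impressions relative to estimated search volume.
search_volume = 10_000   # estimated monthly search volume from a keyword tool
impressions = 6_421      # total GSC impressions for the keyword and its variants

coverage = impressions / search_volume
print(f"Indicative market coverage: {coverage:.0%}")   # ~64%, an indication only, not an exact share

weak_impressions = 777
print(f"Indicative market coverage: {weak_impressions / search_volume:.0%}")  # ~8%
```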
Do you use content auto-generated by AI or by other means? Can you identify it from your URL structure or other indicators? Then you can take your GSC report as a base, apply a few formulas and draw conclusions about which content type delivers what kind of performance.
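As a sketch, assuming the AI-generated pages can be recognised by a "/ai/" path segment (purely an illustrative indicator) in the GSC page export used above:

```python
import pandas as pd

# Assumed export: page, clicks, impressions, position.
df = pd.read_csv("gsc_pages.csv")

# Label content type from the URL structure; "/ai/" is an illustrative marker only.
df["content_type"] = df["page"].apply(
    lambda url: "AI-generated" if "/ai/" in url else "Editorial"
)

# Compare the performance of the two content types.
summary = df.groupby("content_type").agg(
    pages=("page", "nunique"),
    clicks=("clicks", "sum"),
    impressions=("impressions", "sum"),
    avg_position=("position", "mean"),
)
summary["ctr"] = summary["clicks"] / summary["impressions"]
print(summary)
```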
You can fine-tune such reports and combine other report types with each other based on your preferences, resources and strategies.
Check your performance in GSC, find out the nature of your errors and correct them. A report could focus only on indexed pages, error pages, 404 pages and Core Web Vitals. You should also monitor external links: find malicious sites pointing to your website and disavow them, and filter the best-performing and most relevant sites into a list for your internal linking strategies.
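For the disavow step, Google expects a plain-text file with one URL or one domain: entry per line; here is a minimal sketch that writes such a file from a curated list of flagged domains (the domains are placeholders):

```python
# Write a disavow file from a curated backlink audit; the domains below are placeholders.
flagged_domains = ["spammy-links.example", "malicious-site.example"]

with open("disavow.txt", "w") as f:
    f.write("# Domains flagged during the backlink audit\n")
    for domain in flagged_domains:
        f.write(f"domain:{domain}\n")
```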
Google Search Console is indeed an amazing tool, and it is freely available to you.
After finding out everything about your business, make the next move and learn as much as possible about your successful competition. Use Screaming Frog, Ahrefs, or build tools with the assistance of AI. Learn about their link building, content layout, meta titles and descriptions, and everything else that is possible.
Learning from those who do it better is cheaper. They did a great job and you should use those insights. Reach out to me for help with implementing your core SEO reporting.