How to Do Log File Analysis | Lesson 4/7 | SEMrush Academy

In this lesson, we'll use the Log File Analyzer to gain an insightful and in-depth understanding of how Googlebot crawls your website.

0:16 Log File Analyzer
0:51 Googlebot Activity graph
2:41 Summary

After probing your website's crawlability with the help of the Crawlability report within the Site Audit tool, it's high time to examine the access logs of your web server.

Log File Analyzer
First, you need to download the log files from your website. In most cases, this can be done via an FTP client. Once you've acquired your log files, make sure they are in the Combined Log Format and don't exceed 1 GB each. Then, go to the Log File Analyzer page and upload them. After a short while, your files will be uploaded and the report will be ready.
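The Combined Log Format has a fixed field order (host, identity, user, timestamp, request line, status code, response size, referer, user agent), so each line can be parsed with a single regular expression. Here is a minimal Python sketch; the sample log line is invented for illustration:

```python
import re

# Fields of the Combined Log Format, in order:
# host ident user [timestamp] "request" status size "referer" "user-agent"
LOG_PATTERN = re.compile(
    r'(?P<host>\S+) (?P<ident>\S+) (?P<user>\S+) \[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<size>\S+) '
    r'"(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)

def parse_line(line):
    """Return a dict of fields from one Combined Log Format line, or None."""
    m = LOG_PATTERN.match(line)
    return m.groupdict() if m else None

# Invented sample line for demonstration
sample = ('66.249.66.1 - - [10/Oct/2023:13:55:36 +0000] '
          '"GET /index.html HTTP/1.1" 200 2326 "-" '
          '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"')

entry = parse_line(sample)
print(entry['host'], entry['status'], entry['agent'])
```

A line that doesn't match the pattern (a different log format, for example) returns None, which is a quick way to check your files are in the expected format before uploading.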

At the top of the report, you can see the Googlebot activity graph. By default, it shows how many times Googlebot hit your website each day over the entire logging period. You can change the view by selecting a specific Googlebot and a time period in the dropdown lists above. Then, use the selector to see the distribution of crawled pages' status codes and file types over the chosen time frame.
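The daily-hit view behind this graph can be approximated from parsed log entries. A sketch, assuming entries are dicts with 'time' and 'agent' fields as a Combined Log Format parser would produce (an illustration of the idea, not how the tool works internally):

```python
from collections import Counter
from datetime import datetime

def daily_googlebot_hits(entries):
    """Count hits per calendar day for requests whose user agent mentions Googlebot.

    Assumes each entry is a dict with a 'time' field in Combined Log Format
    ('%d/%b/%Y:%H:%M:%S %z') and an 'agent' field with the user-agent string.
    """
    hits = Counter()
    for e in entries:
        if 'Googlebot' in e['agent']:
            day = datetime.strptime(e['time'], '%d/%b/%Y:%H:%M:%S %z').date()
            hits[day] += 1
    return hits

# Invented sample entries for demonstration
entries = [
    {'time': '10/Oct/2023:13:55:36 +0000',
     'agent': 'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)'},
    {'time': '10/Oct/2023:14:02:11 +0000',
     'agent': 'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)'},
    {'time': '11/Oct/2023:09:12:45 +0000',
     'agent': 'Mozilla/5.0 (Windows NT 10.0) Chrome/117.0'},  # a browser, not a bot
]
print(daily_googlebot_hits(entries))
```

Note that matching on the user-agent string alone can be spoofed; the real tool also has to verify bot identity, which this sketch omits.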

Below, you'll see a list of your website's crawled files and folders, sorted by the number of bot hits by default. For each file, you can see and sort by file type, share of hits, crawl frequency, and last crawl date.
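The hits-and-share view reduces to counting hits per crawled path and expressing each count as a share of the total. An illustrative Python sketch, not the tool's actual implementation:

```python
from collections import Counter

def hit_shares(paths):
    """Return (path, hit_count, percent_share) tuples, sorted by hits descending.

    `paths` is assumed to be the list of requested paths extracted from the log.
    """
    counts = Counter(paths)
    total = sum(counts.values())
    return [(p, n, round(100 * n / total, 1)) for p, n in counts.most_common()]

# Invented sample paths for demonstration
shares = hit_shares(['/index.html', '/blog/', '/index.html', '/style.css'])
print(shares)
```

The most-hit paths come first, which is exactly how the report orders the list by default.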

To refine the list and see only specific elements, apply one of the following filters:

By path - if you need to find a particular file or folder and know its name
By last status code recorded - if you need to see which pages return, say, errors or redirects
By file type
You can combine these filters as you like.
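Combining the filters amounts to applying each non-empty condition in turn. A minimal sketch, assuming entries are dicts with hypothetical 'path' and 'status' keys (any filter left as None is skipped):

```python
def filter_entries(entries, path=None, status=None, file_type=None):
    """Apply any combination of path, status-code, and file-type filters.

    Assumes each entry is a dict with 'path' and 'status' keys; these names
    are an illustration, not the tool's internal schema.
    """
    result = []
    for e in entries:
        if path is not None and path not in e['path']:
            continue  # path substring filter
        if status is not None and e['status'] != status:
            continue  # last-recorded-status-code filter
        if file_type is not None and not e['path'].endswith('.' + file_type):
            continue  # file-type (extension) filter
        result.append(e)
    return result

# Invented sample entries for demonstration
entries = [
    {'path': '/blog/post-1.html', 'status': '200'},
    {'path': '/blog/old-post.html', 'status': '404'},
    {'path': '/assets/logo.png', 'status': '200'},
]
print(filter_entries(entries, path='/blog/', status='404'))
```

Because each filter only narrows the result, they can be combined in any order, matching the "combine these filters as you like" behavior described above.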

All this data provides you with valuable insights that help you:

Get rid of structural and navigational problems that affect the accessibility of your pages.
Track the occurrence of technical problems (broken pages, incorrect redirects, etc.) over a chosen time frame.
Optimize your crawl budget by finding areas of ineffective spending.
See how Googlebot prioritizes content and build up your SEO strategy by drawing the bot's attention to your most important pages.
Plan your content production wisely, synchronizing your content creation plan with the actual timing of its emergence in SERPs.
Verify that your website, if it was migrated, came through the migration without losses.
You can upload another log file at any time. If it's an up-to-date version of a file you've already uploaded, the tool will automatically update the report accordingly.

#TechnicalSEO #TechnicalSEOcourse #SEOlogFile #LogFileAnalysis #SEMrushAcademy

