SEO With Deepcrawl
SEO tools have evolved so much that they now let us keep everything under control. Today, I would like to tell you about a tool that I use frequently and that is easy to learn and easy to use: Deepcrawl.
Deepcrawl not only crawls your site from a technical standpoint, it also supports log analysis, which makes deeper investigation much more convenient. Rather than listing every feature here, I will give a rough overview of how it can solve SEO problems and work for you.
Note: Deepcrawl is a paid tool. I am promoting it entirely voluntarily and receive no fee for doing so. =)

Before we begin: please log in to the Deepcrawl system so you can follow along with the steps in this article.
What is Deepcrawl?
Deepcrawl is an SEO tool that crawls and analyzes hundreds of thousands of pages to help you better understand the sources of problems on your site.
Using Deepcrawl
You can configure your site address so that both the HTTP and HTTPS versions are crawled, and you can also crawl on a subdomain basis.
Along with these, you can expand the crawl by adding:
- Sitemaps,
- Google Search Console,
- Google Analytics,
- Backlinks,
- Log analysis data,
- URL lists.
For backlinks, there is an integration designed to work with popular SEO tools such as Majestic. Depending on your account plan, both the log analysis options and the number of backlinks you can add may vary.
For log analysis, you unfortunately cannot feed your log file to Deepcrawl directly; you need to go through tools such as Splunk or Logz.io. The data obtained from them is interpreted and presented to you visually, and this visualization and analysis matters a great deal once you are dealing with millions of pages. Depending on your account plan, you can also enable JavaScript rendering so that JS-driven pages are crawled as well.
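Before shipping logs to a tool like Splunk or Logz.io, it often pays to pre-filter them down to search-engine bot traffic. Here is a minimal Python sketch of that idea; the file names and the Apache/Nginx "combined" log format are assumptions, so adjust the regex to whatever your server actually writes (serious setups also verify Googlebot via reverse DNS, which this skips):

```python
import re

# Matches a typical Apache/Nginx "combined" log line (an assumption --
# adapt this to your own log format).
LOG_LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) \S+ '
    r'"[^"]*" "(?P<agent>[^"]*)"'
)

# access.log and googlebot.log are placeholder file names.
with open("access.log", encoding="utf-8", errors="replace") as src, \
     open("googlebot.log", "w", encoding="utf-8") as dst:
    for line in src:
        match = LOG_LINE.match(line)
        if match and "Googlebot" in match.group("agent"):
            dst.write(line)
```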
Once all the settings are in place and you start the crawl, you can follow its progress in real time. You can download the reports or share them with others.
Let’s continue by touching on some of the important parts of Deepcrawl.
Summary
Dashboard
When the crawl has completed successfully, the dashboard shows you an overview. Here you can learn about many topics such as thin pages, broken URLs, duplicate content and non-indexable pages.
If there is a previously saved crawl for the same site, you can see what has changed compared to that crawl, which URLs could not be crawled, and charts such as web crawl depth. Log analysis in particular can be used to optimize your crawl budget. Note that in some cases you may need to contact your developers to fix these errors.
Issues
You can follow error trends and how they change compared to the previous crawl.
Here you can see errors on subjects such as:
- AMP errors,
- Failed URLs,
- Pages with thin content,
- Max HTML size,
- Max fetch time,
- Outbound links,
- Canonical problems,
- Pages with a missing or empty H1,
- Short titles,
- Long titles,
- Duplicate pages,
- Pagination issues,
- Problems with URLs in the sitemap,
- Duplicate descriptions.
Indexation
The section where you can see an overview of your status in the Google index. If you connect Google Search Console, you will get much more comprehensive reports.
Indexable & Non-Indexable Pages
Here you can find the indexable and non-indexable pages, and analyze whether a non-indexable page is blocked by a canonical tag, a Disallow rule, nofollow or noindex. If you want to spot-check these signals on a single page yourself, a small sketch follows below.
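This is a minimal Python sketch of such a spot check, assuming the third-party requests and beautifulsoup4 packages are installed; the URL is a placeholder. It only covers page-level and header signals; the Disallow side is covered under Uncrawlable Pages further down.

```python
import requests
from bs4 import BeautifulSoup

def indexability_signals(url: str) -> dict:
    """Collect the page-level signals that commonly block indexing."""
    resp = requests.get(url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")

    meta_robots = soup.find("meta", attrs={"name": "robots"})
    canonical = soup.find("link", rel="canonical")

    return {
        "status_code": resp.status_code,
        # noindex/nofollow can be sent as an HTTP header too
        "x_robots_tag": resp.headers.get("X-Robots-Tag"),
        "meta_robots": meta_robots.get("content") if meta_robots else None,
        "canonical": canonical.get("href") if canonical else None,
    }

print(indexability_signals("https://example.com/some-page"))  # placeholder URL
```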
Mobile Pages
We always think about the status of our desktop pages, but what about mobile? And it does not end with mobile alone: if you are using AMP, are there indexing problems on your AMP pages? Is the amphtml link used correctly, and what about the canonicalized pages? You can see all of this in the relevant section.
Uncrawlable Pages
You may have blocked some pages from being crawled, intentionally or not. With a handful of URLs this can be checked manually, but with thousands of URLs it becomes a real problem. Here you can see which pages could not be crawled because you blocked them in robots.txt, and likewise detect crawl problems with external links.
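For a handful of URLs, Python's standard-library robot parser is enough to reproduce this check yourself; a minimal sketch, with placeholder domain and URLs:

```python
from urllib.robotparser import RobotFileParser

# Placeholder domain; point this at your own robots.txt.
robots = RobotFileParser()
robots.set_url("https://example.com/robots.txt")
robots.read()

urls = [
    "https://example.com/",
    "https://example.com/private/report.html",
]
for url in urls:
    allowed = robots.can_fetch("Googlebot", url)
    print(f"{'ALLOWED' if allowed else 'BLOCKED':7} {url}")
```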
Paginated 2+ Pages
In short, you can see the status of your paginated content: pages two and beyond of a paginated series, and how they compare with the first pages.
You will sometimes come across a value called DeepRank in the reports. DeepRank works similarly to Google's PageRank algorithm and assigns each page a score based on internal linking; a toy illustration of the idea follows below.
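This is only a toy power-iteration sketch of the PageRank idea that link-based metrics like DeepRank build on; the graph and damping factor are made up, and this is certainly not Deepcrawl's exact formula:

```python
# page -> pages it links to (a made-up three-page site)
links = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
}
pages = list(links)
damping = 0.85
rank = {p: 1 / len(pages) for p in pages}

for _ in range(50):  # power iteration until roughly converged
    new_rank = {p: (1 - damping) / len(pages) for p in pages}
    for page, outgoing in links.items():
        share = rank[page] / len(outgoing)
        for target in outgoing:
            new_rank[target] += damping * share
    rank = new_rank

print({p: round(r, 3) for p, r in rank.items()})
```

Pages that collect more internal link weight end up with higher scores, which is why reducing the crawl depth of important pages tends to lift them.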
Content
The section where you get a general view of your content. Alongside the content itself, you can find broken images and CSS or JS files.
What reports are there?
- Empty, short and long SEO titles,
- Duplicate titles and descriptions,
- Empty, short and long descriptions,
- Descriptions too long for mobile,
- Pages with no description,
- Empty pages,
- Thin pages,
- Pages with duplicate or missing H1s,
- Noarchive pages,
- JS, CSS and PDF files,
- Disallowed JS/CSS resources,
- Broken images.
Config
One of the most technical parts of Deepcrawl. The hreflang reports in particular can be a lifesaver if you run a multilingual website.
Canonical
Here you can find many details such as faulty canonical configurations, multiple canonical tags, or canonical targets that do not return a 200 status code. You should look here in particular to avoid duplicate content and indexing problems; a quick self-check is sketched below.
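This minimal sketch fetches a page's canonical target and verifies it returns 200, again assuming requests and beautifulsoup4 and a placeholder URL:

```python
import requests
from bs4 import BeautifulSoup

def check_canonical_target(url: str) -> None:
    """Fetch a page's canonical URL and verify the target returns 200."""
    page = requests.get(url, timeout=10)
    soup = BeautifulSoup(page.text, "html.parser")
    tag = soup.find("link", rel="canonical")
    if tag is None or not tag.get("href"):
        print(f"{url}: no canonical tag")
        return
    target = tag["href"]
    # Deliberately don't follow redirects: a canonical that points at a
    # redirect is itself worth flagging.
    resp = requests.get(target, timeout=10, allow_redirects=False)
    verdict = "OK" if resp.status_code == 200 else f"PROBLEM ({resp.status_code})"
    print(f"{url} -> {target}: {verdict}")

check_canonical_target("https://example.com/some-page")  # placeholder URL
```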
Redirects
Analyses of redirects on the site, such as 301s and 302s, live here. The tool can even detect meta refresh redirects if you use them. You can easily see every URL involved in broken redirects and redirect chains, and JavaScript redirects show up here too. A single chain is easy to trace yourself, as sketched below.
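A minimal sketch with a placeholder URL; requests follows redirects by default and records each hop in resp.history:

```python
import requests

def trace_redirects(url: str) -> None:
    """Print every hop in a URL's redirect chain."""
    resp = requests.get(url, timeout=10)  # redirects are followed by default
    for hop in resp.history:
        print(f"{hop.status_code} {hop.url}")
    print(f"{resp.status_code} {resp.url}  <- final")

trace_redirects("http://example.com/old-page")  # placeholder URL
```

Note this only catches HTTP-level redirects; meta refresh and JavaScript redirects require HTML parsing or rendering, which is exactly where a tool like Deepcrawl earns its keep.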
HTTPS
When you serve your site over HTTPS, some of your pages may still contain internal links pointing to HTTP URLs. Deepcrawl steps in here and tells you which pages still carry plain HTTP links; a page-level version of the check is sketched below.
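A minimal sketch of the same check for one page, assuming requests and beautifulsoup4 (placeholder URL):

```python
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

def find_http_links(page_url: str) -> list[str]:
    """Return internal links on a page that still use plain http://."""
    resp = requests.get(page_url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")
    host = urlparse(page_url).hostname
    insecure = []
    for a in soup.find_all("a", href=True):
        target = urljoin(page_url, a["href"])  # resolve relative links
        parsed = urlparse(target)
        if parsed.scheme == "http" and parsed.hostname == host:
            insecure.append(target)
    return insecure

print(find_http_links("https://example.com/"))  # placeholder URL
```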
Hreflang
Correct use of hreflang on your multilingual pages is one of the most critical points in SEO. You can see every page with and without hreflang, provided the page is open to indexing, of course. =) A quick way to inspect a single page's annotations is sketched below.
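A small sketch for peeking at the hreflang annotations of one page (requests + beautifulsoup4, placeholder URL):

```python
import requests
from bs4 import BeautifulSoup

def hreflang_annotations(url: str) -> dict[str, str]:
    """Return the language -> URL hreflang pairs declared on a page."""
    resp = requests.get(url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")
    return {
        link["hreflang"]: link["href"]
        for link in soup.find_all("link", rel="alternate", hreflang=True)
    }

# A healthy page should list itself plus every language variant.
print(hreflang_annotations("https://example.com/en/"))  # placeholder URL
```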
You can also use the Pagination and URL Design sections under this menu.
Links
The report section where you can analyze your internal links, external links and sitemaps.
Internal Links
This section helps you find problems in your on-site link structure and verify correct usage. Broken links, missing image alt attributes and pages that will not be followed because of nofollow all live here. You can also analyze whether your sitemaps are broken; a page-level version of these checks is sketched below.
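A minimal sketch of those checks for one page (requests + beautifulsoup4, placeholder URL):

```python
import requests
from bs4 import BeautifulSoup

def link_hygiene(url: str) -> None:
    """Report images missing alt text and nofollow links on one page."""
    resp = requests.get(url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")

    for img in soup.find_all("img"):
        if not img.get("alt"):  # missing or empty alt attribute
            print(f"no alt: {img.get('src')}")

    for a in soup.find_all("a", href=True):
        # BeautifulSoup parses rel as a list of tokens
        if "nofollow" in (a.get("rel") or []):
            print(f"nofollow: {a['href']}")

link_hygiene("https://example.com/")  # placeholder URL
```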
External Links
You can use this section to audit the links pointing out of your site.
Sitemaps
The section that surfaces many SEO problems, such as broken links across all your sitemaps, or URLs that appear in a sitemap despite carrying a noindex directive; a minimal version of both checks is sketched below.
You can also check the status of your backlinks when you import them.
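A minimal sketch of both sitemap checks (requests + beautifulsoup4, placeholder URL; note this handles a plain urlset sitemap, not a nested sitemap index):

```python
import xml.etree.ElementTree as ET
import requests
from bs4 import BeautifulSoup

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def audit_sitemap(sitemap_url: str) -> None:
    """Flag sitemap URLs that are broken or carry a noindex directive."""
    root = ET.fromstring(requests.get(sitemap_url, timeout=10).content)
    for loc in root.findall(".//sm:loc", NS):
        url = loc.text.strip()
        resp = requests.get(url, timeout=10)
        if resp.status_code != 200:
            print(f"{resp.status_code}: {url}")
            continue
        robots = BeautifulSoup(resp.text, "html.parser").find(
            "meta", attrs={"name": "robots"}
        )
        if robots and "noindex" in (robots.get("content") or "").lower():
            print(f"noindex but in sitemap: {url}")

audit_sitemap("https://example.com/sitemap.xml")  # placeholder URL
```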
Traffic
If you connect Google Search Console, you can find many more visualized details here, such as page views and click counts.
Mobile
The section where you can see the status of your site’s responsive and AMP pages in detail. If there is a problem, you can spot it quickly and focus on the fix.
Log Files
One of the best features of the tool, and one I use often. 🙂 I plan to write an article on how log analysis is done for SEO in the near future. This is a key report for crawl budget optimization: you can see which pages the bots visited, when they visited, and which status codes were returned. A taste of what such a report aggregates is sketched below.
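This sketch counts bot hits per path and status code from the googlebot.log file produced by the filtering sketch earlier; the file name and combined log format remain assumptions:

```python
import re
from collections import Counter

# Pulls the request path and status code out of a combined log line
# (same format assumption as before).
LOG_LINE = re.compile(r'"\S+ (?P<path>\S+) \S+" (?P<status>\d{3}) ')

hits = Counter()
with open("googlebot.log", encoding="utf-8", errors="replace") as f:
    for line in f:
        m = LOG_LINE.search(line)
        if m:
            hits[(m.group("path"), m.group("status"))] += 1

# Paths crawled heavily with non-200 codes are prime crawl budget waste.
for (path, status), count in hits.most_common(20):
    print(f"{count:6} {status} {path}")
```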
Performance
How fast are your pages? In the Fetch Time section, you can see which pages load very fast, moderately, or very slowly. In addition, with Page Size you can examine the total size of each page in detail. A rough single-URL measurement of both numbers is sketched below.
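A small sketch with a placeholder URL; note this times only the HTML document, not images, CSS or JS:

```python
import time
import requests

def fetch_stats(url: str) -> None:
    """Measure rough fetch time and HTML size for one page."""
    start = time.perf_counter()
    resp = requests.get(url, timeout=30)
    elapsed = time.perf_counter() - start
    size_kb = len(resp.content) / 1024
    print(f"{url}: {elapsed:.2f}s, {size_kb:.1f} KB (HTML only)")

fetch_stats("https://example.com/")  # placeholder URL
```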
For more information, check out https://deepcrawl.com . The information here may change over time, and new features may be added to the tool. If something has changed that I don’t know about, would you please let me know? 🙂 It would also be very helpful if users of the tool shared what they like or dislike about it.