What Is On-Site (Internal) SEO and How To Do It?
The main goal of internal SEO work is to turn your site into the kind of site Google wants to rank, and a series of improvements is required to get there. Once the work is done, both Googlebot and users can navigate the site more easily, comfortably and quickly. Google rewards this, and the site can be listed at the top of the search results for its target queries and keywords.
What Benefits Does Internal SEO Provide?
Internal SEO involves very detailed work, ranging from page titles to crawl budget optimization. These efforts make the site run smoothly and make it easy to use for both users and Googlebot.
The work covers page titles, descriptions, crawlability, content, heading tags, internal linking, incorrect redirects, broken links, crawl errors and much more. After it is done, an increase in rankings for the targeted keywords and queries can be expected. The basic rule is to match, as closely as possible, what Google wants to see on a page.
If the work is carried out naturally and in the way Google expects, search rankings will improve after a while (how quickly depends on the targeted keywords), and the site will start to receive organic visitors from its target keywords and queries.

What To Do in Internal SEO Work?
You can see an example list of internal SEO work steps below. You can also use this list as a checklist to review and adjust your own website:
- Determining the keywords and queries for which a rise in the SERPs is targeted,
- Registering the site with Google Search Console, Yandex Webmaster Tools and Bing Webmaster Tools,
- Deciding which pages will target which keywords and queries,
- Checking and editing structured data markup (see the JSON-LD sketch after this list),
- Checking the HTTPS protocol (SSL certificate),
- Crawling the entire site with tools such as DeepCrawl or Screaming Frog SEO Spider and identifying errors,
- Editing page titles (meta title) for the target keyword or query,
- Writing page descriptions (meta descriptions) that match the target query and the page content (see the head-section example after this list),
- Ensuring SEO-friendly URLs are used,
- Checking canonical tags,
- Checking W3C validation,
- Keeping URLs short and consistent with the keyword targeted on the relevant page,
- Ensuring a correct hierarchy of heading tags (h1, h2, h3, h4, h5, h6) on the pages,
- Checking image alt attributes (see the heading and alt example after this list),
- Creating XML sitemaps and submitting them to search engines (a sample sitemap appears after this list),
- Creating the robots.txt file and blocking bots from unnecessary directories (see the robots.txt sketch after this list),
- Using AMP technology and fixing errors on AMP pages,
- Establishing an internal linking strategy and keeping the internal links on each page balanced,
- Identifying broken (non-working, unredirected) links within the site and replacing them with working URLs,
- Checking crawl errors and redirecting pages that return error codes such as 404, or that do not open at all, to the relevant pages with 301 redirects (see the redirect example after this list),
- Fixing mobile usability problems and ensuring full mobile compatibility,
- Fixing design and software issues so the site works properly on all devices,
- Maximizing page loading speed (PageSpeed),
- Enabling browser caching (see the server configuration sketch after this list),
- Checking image file sizes and compressing or scaling down large files,
- Checking CSS files and compressing them to reduce their size,
- Minifying JavaScript files to the smallest possible size,
- Designing the 404 error page with the user experience in mind,
- Identifying thin, empty and non-indexable pages as part of crawl budget optimization,
- Marking thin or empty pages that do not need to be indexed as "noindex" and blocking bot access to them via robots.txt (see the noindex example after this list),
- Checking the links inside the site and removing unnecessary or harmful ones.
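
The short sketches below illustrate several of the items above. All domain names, URLs and values in them are placeholders, so treat them as minimal examples rather than ready-to-use code. For the page title, meta description and canonical items, a typical head section looks roughly like this:

```html
<head>
  <!-- Title tag: contains the target keyword, kept reasonably short -->
  <title>On-Site SEO Guide | Example Site</title>
  <!-- Meta description: summarizes the page for the target query -->
  <meta name="description" content="Learn what on-site (internal) SEO is and how to optimize titles, descriptions and technical settings step by step.">
  <!-- Canonical: the preferred, SEO-friendly URL of this page -->
  <link rel="canonical" href="https://www.example.com/on-site-seo-guide/">
</head>
```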
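
For the structured data item, a minimal JSON-LD block for an article page might look like the sketch below; the type, headline, dates and names are placeholder values and should follow the schema.org type that actually fits the page (Article, Product, FAQPage and so on). Markup like this can be checked with Google's Rich Results Test before it goes live.

```html
<!-- Minimal JSON-LD structured data for an article page (placeholder values) -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "What Is On-Site (Internal) SEO?",
  "datePublished": "2021-01-15",
  "author": { "@type": "Person", "name": "Example Author" },
  "publisher": { "@type": "Organization", "name": "Example Site" }
}
</script>
```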
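
For heading hierarchy and image alt attributes, the idea is a single h1 per page, subheadings nested in order, and descriptive alt text on meaningful images, as in this simplified sketch:

```html
<body>
  <h1>On-Site SEO Guide</h1>   <!-- one h1 per page -->
  <h2>Page Titles</h2>         <!-- h2 sections under the h1 -->
  <h3>Title Length</h3>        <!-- h3 only under an h2 -->
  <h2>Image Optimization</h2>
  <!-- Descriptive alt text instead of an empty or missing attribute -->
  <img src="/images/crawl-budget-chart.png" alt="Chart showing crawl budget usage by page type">
</body>
```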
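
The sitemap item refers to a standard XML sitemap. A minimal one with placeholder URLs is shown below; it is submitted through Search Console and the other webmaster tools, and can also be referenced from robots.txt.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2021-01-20</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/on-site-seo-guide/</loc>
    <lastmod>2021-01-15</lastmod>
  </url>
</urlset>
```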
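
For the robots.txt item, a simple file that blocks bots from unnecessary directories and points to the sitemap could look like this (the directory names are placeholders for whatever should stay out of the crawl on your site):

```
User-agent: *
Disallow: /admin/
Disallow: /cart/
Disallow: /internal-search/

Sitemap: https://www.example.com/sitemap.xml
```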
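
How 301 redirects are configured depends on the server. As one common case, assuming an Apache server where .htaccess overrides are allowed, a rule like the one below permanently redirects a removed URL to its current equivalent; on nginx or at the application level the same redirect is set up differently.

```apacheconf
# Permanently (301) redirect a removed page to its replacement (Apache .htaccess example)
Redirect 301 /old-seo-article/ https://www.example.com/on-site-seo-guide/
```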
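
Browser caching and text compression are also server-level settings. Again assuming Apache, with mod_expires and mod_deflate available, a minimal sketch would be:

```apacheconf
# Let browsers cache static assets (mod_expires)
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>

# Compress HTML, CSS and JavaScript responses (mod_deflate)
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
```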
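
Finally, for thin or empty pages that should stay out of the index, the noindex directive is placed in the page's head section. Keep in mind that bots must still be able to crawl a page to see this tag, so a page blocked in robots.txt cannot rely on it.

```html
<!-- Keep this thin page out of the search index while still letting bots follow its links -->
<meta name="robots" content="noindex, follow">
```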