Search engine optimization

A successful increase in sales of products and services, the ultimate goal of website promotion, is impossible without optimizing the website for the ranking algorithms of search engines.

Search engine optimization means bringing the website content (not just the meta tags, but the content itself) into line with the requirements of search engines. "Fixing up" the content is easy to say: a standard optimizer's checklist has 35 items. In recent years the market of website optimization service providers has split into two types of companies. The TOP companies in website promotion do a pretty decent job of optimizing your website and work with metrics and objectives, but they are not always affordable for small businesses. Small companies and freelancers usually do not allocate a lot of funds to website optimization, or they optimize only the pages containing the promoted keywords (typically 10-25% of all pages). Such an approach leads to uneven website promotion, with some parts of the site getting to the TOP of the rankings and others not even showing in the top 500. Every page of the website should be optimized.

Let’s move on from theory to practice and demonstrate how professional search optimization works.

Optimizing website pages

Each website consists of HTML pages (with the exception of Flash sites). The HTML code of a page is a set of special tags (the markup language that browsers understand) from which the page is "woven". Optimizing website pages is the most important phase of search optimization, second probably only to creating unique content. The process of optimizing a page can be broken down as follows:

  1. Optimization of the service tags (keywords, description, title); see the sketch below;
  2. Optimization of the content design tags;
  3. Selection of the optimal keyword density for the page;
  4. Getting rid of the non-informative parts of the code. Such parts are excluded from indexing with the noindex tag (it only works for Yandex; also shown in the sketch below). Non-informative parts are foreign pieces of code used, for example, to gather statistics (e.g. search engine counters), or JavaScript plug-ins used for creating various dynamic effects;
  5. Correcting W3C errors and addressing W3C recommendations. This item is not mandatory, but it is important. The W3C sets very strict requirements for the code: closed tags and a correct tag structure. In addition, one can check for extraneous code that carries no functional load and for extra parameters;
  6. Checking for the presence of parts of the code that are not visible to users but visible to search robots (these could be zero-size images, text in the same color as the background, or hidden blocks, for example, span tags styled to be invisible).

All examples listed in the last item are so-called «black SEO» methods and could lead to a forced drop in search results or even to a complete ban of the website from search engine listings. Many optimizers put on this "helmet of horror" to speed up site promotion, just to make extra money; usually, the websites get blocked as a result.
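To make items 1 and 4 concrete, here is a minimal sketch of the relevant fragments of a page's HTML. The shop name, keywords and counter address are assumptions made up for illustration, and the comment form of the noindex tag is the variant Yandex documents for keeping the HTML valid:

  <head>
    <!-- Item 1: service tags read by search engines -->
    <title>Buy garden furniture in Moscow | Example Shop</title>
    <meta name="description" content="Garden furniture at reasonable prices with delivery across Moscow.">
    <meta name="keywords" content="garden furniture, buy garden furniture, garden furniture Moscow">
  </head>
  <body>
    ...
    <!-- Item 4: a non-informative statistics counter, excluded from
         indexing with the noindex tag (honored by Yandex only) -->
    <!--noindex-->
    <script src="http://counter.example.com/hit.js"></script>
    <!--/noindex-->
  </body>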

There is also, of course, optimization of tags such as h1-h6, the alt attribute of img, strong, em, b and so on.
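Continuing the assumed garden furniture example, a rough sketch of these content design tags might look like this:

  <h1>Garden furniture</h1>
  <h2>Wooden garden furniture for the country house</h2>
  <p>Our <strong>garden furniture</strong> is made of <em>weatherproof teak</em>.</p>
  <img src="teak-set.jpg" alt="Teak garden furniture set">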

The question of how many times a keyword should appear on a page is often raised on the internet among optimizers and discussed at SEO seminars and conferences. There is no straightforward answer to it yet. From general experience we know that a keyword should appear at least once on a page (although there are exceptions, when there are no keywords on the page and yet it comes up at the top of search results). It is also beneficial to have the keyword or phrase included in the title of the HTML page, as well as in the page headings (h1, h2, h3, and so forth).

Keywords should be used on the pages of a website only for the users’ sake. Injecting them left and right, distorting the logic of the page and corrupting the language, is not right. Remember that a quality website (from both the search engines’ and the users’ point of view) should serve the end users first and foremost. This is often emphasized by Yandex and Google representatives. Do not overdo it with the keywords!

The community of optimizers has been debating long and hard about the optimal density of keywords. Most agree on a 3-9% range. Some real cases confirm this, some don’t. We believe that search engines arrive at a different optimal keyword density for each segment of websites. So before implanting your keywords, you should determine the density on the competitors’ websites that appear in the TOP 20.
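As a worked example with assumed numbers: keyword density is usually computed as the number of occurrences of the keyword divided by the total number of words on the page, multiplied by 100%. A 500-word page that uses the promoted phrase 20 times has a density of 20 / 500 × 100% = 4%, which falls within the 3-9% range mentioned above.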

On top of everything else, the following procedures should be done:

Eliminating duplicate content on a page and on a website

Historically, during website development, two versions of the site are created (test.ru and www.test.ru). From the search engine’s perspective, these are two different websites with duplicate content, which creates the problem of non-unique content. Search engines are very critical of such pages. As a rule, the search robot chooses one page and indexes it while ignoring its duplicates. When there are several duplicate pages on a website, a search engine may sanction it or make indexing more problematic. Among other things, we often see websites on the same domain which have pages with duplicate content.
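The text does not prescribe a particular fix, but a common remedy is to declare a single preferred address with a rel="canonical" link in the head of every duplicate page; keeping to the test.ru example above:

  <link rel="canonical" href="http://www.test.ru/">

Most search engines then consolidate the duplicates under the canonical address; a server-side 301 redirect from test.ru to www.test.ru achieves the same result more strictly.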

Ensuring the growth of organic backlinks

Organic links are backlinks to the website that were placed by users without coercion. Such links may appear on blogs and forums, through RSS feeds, and so on. At this point, search engines are very aggressive towards paid (rented) backlinks. Not so long ago Google, for example, declared war on SAPE, the infamous and biggest broker of paid links.

Also, with the growing popularity of social networks, your website should implement functional controls (buttons or icons) which allow users, with just one click, to share information from your website on blogs, forums and social networks, as well as links to add a page to favorites and to send it by email. These controls will be of great convenience to users who wish to put a link back to one of your pages on their blog or website, enabling them to quickly repost and forward the content they like.
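A minimal sketch of such controls as plain links (the Facebook, Twitter and VK share endpoints are their publicly documented ones; http://www.test.ru/page.html stands in for the page being shared):

  <a href="https://www.facebook.com/sharer/sharer.php?u=http://www.test.ru/page.html">Share on Facebook</a>
  <a href="https://twitter.com/intent/tweet?url=http://www.test.ru/page.html">Share on Twitter</a>
  <a href="https://vk.com/share.php?url=http://www.test.ru/page.html">Share on VK</a>
  <a href="mailto:?subject=Interesting%20page&body=http://www.test.ru/page.html">Send by email</a>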

Checking the uniqueness of the content

In 2013 unique content became probably the most important factor for search engines. Borrowing texts from other websites is frowned upon, while unique content will definitely earn you their good graces. Non-unique content may lower your website’s ranking, up to manual sanctions by the search engines and various blocking filters applied to your website.

To get a sense of the situation (whether the content is borrowed or not), it is enough to check just a dozen pages. You can test your website either on www.copyscape.com or via Yandex, by copying a few sentences of text and typing them into the search box (in quotation marks, to search for the exact phrase).

Besides optimizing the code, it is important to review the website’s structure. We will talk about it in the next section.

If your company hasn’t yet optimized its website—we are here to help you!

Request a cost estimate

+7 495 506-40-08