The semantic core of a site. A simple example of compiling a semantic core

Useful materials from the blog on collecting keys for semantics, clustering queries and optimizing site pages.

Article topics:


Semantic core



A correctly composed semantic core sends exactly the right users to your site; an unsuccessful one can bury the site deep in the search results.

Working with the queries included in the semantic core (SC) consists of collection, cleaning, and clustering. Having received the grouping results, you need to determine the optimal place for each group: a page of your resource, part of your site's content, or a third-party site.


How to collect keys for the SC


Briefly about the essentials: which operators to use in Wordstat to see the queries you need, and how to make working with the service easier.

Wordstat does not provide absolutely accurate information: it does not contain all queries, and the data can be distorted because not all consumers use Yandex. Nevertheless, this data lets you draw conclusions about the popularity of a topic or product, roughly predict demand, collect keys, and find ideas for new content.

You can search the data by simply entering a query into the service's search box, but there are also operators for refining queries: additional characters that narrow the results. They work on the search tabs by words and by regions; on the query history tab, only the "+query" operator can be used.
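
For reference, the most commonly used Wordstat operators are:

  • "query" (quotation marks) - counts only this phrase and its word forms, without additional words;
  • !word (exclamation mark) - fixes the exact word form;
  • +word (plus) - forces prepositions and conjunctions, which Wordstat normally ignores, to be taken into account;
  • -word (minus) - excludes queries containing the word;
  • (word1|word2) (parentheses with a vertical bar) - groups alternatives in one query;
  • [word1 word2] (square brackets) - fixes the word order.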

In the article:

  • Why you need Wordstat
  • Working with Wordstat Operators
  • How to read Wordstat data
  • Extensions for Yandex Wordstat

Aiming for the top of the search results: how analyzing top-ranked articles helps you work on content, which criteria to analyze, and how to do it faster and more efficiently.

It is difficult to track the results of blogging and publishing other texts on the site without detailed analytics. How do you understand why competitors' articles are in the top but yours are not, even though you write better?

In the article:

  • What is usually advised
  • How to analyze
  • Cons of the approach
  • Benefits of Content Analysis
  • Tools

How to write optimized texts


What content gets more links and social signals? Backlinko partnered with BuzzSumo to analyze 912 million blog posts (article length, title format, social signals, and backlinks) and made recommendations for content marketing. We translated and adapted the study.

In the article:

  • Brief findings from the content research
  • New knowledge about content marketing, in detail:
  1. What content gets the most links
  2. What texts are most popular on social networks
  3. Why backlinks are hard to get
  4. What materials get all the reposts
  5. How are the numbers of backlinks and reposts related?
  6. Which titles get the most reposts?
  7. What day of the week is best to post content?
  8. What content format is reposted most often
  9. Which content format gets more links
  10. How B2B and B2C content generates links and reposts

The semantic core (abbreviated SC) is a specific list of keywords that describe the site's theme as fully as possible.

Why you need to compile the semantic core of a site

  • the semantic core characterizes the site: it is what indexing robots use to determine not only the naturalness of the text but also its subject, so that the page can be placed in the appropriate search category. Robots work fully autonomously once the page address enters the search engine's database;
  • a well-written core is the semantic basis of the site and reflects a structure suitable for SEO promotion;
  • each page of the site is accordingly tied to a certain part of the SC;
  • the semantic core shapes the promotion strategy in search engines;
  • the semantic core lets you estimate how much promotion will cost.

Basic rules for compiling a semantic core

    To assemble the SC, you will need to collect sets of keywords. Here you need to realistically evaluate your resources with respect to promoting high- and mid-frequency queries: if you want the maximum number of visitors and have the budget for it, use high- and mid-frequency queries; if not, focus on mid- and low-frequency ones.

    Even with a high budget it makes no sense to promote a site only on high-frequency queries. Such queries are often too general and carry vague intent, for example, "listen to music", "news", "sports".

When choosing search queries, analyze a set of indicators for each phrase (a simple record for tracking them is sketched after the list):

  • number of impressions (frequency);
  • the number of impressions without morphological changes and word combinations;
  • the pages the search engine returns for the query;
  • the pages in the search TOP for the key queries;
  • the estimated cost of promotion for the query;
  • keyword competition;
  • predicted number of transitions;
  • bounce rate (closing the site after clicking on the link) and seasonality of the service;
  • keyword geo-dependency (geographic location of the company and its customers).
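
To make the checklist concrete, here is a minimal sketch in Python of a record you might keep for each phrase while evaluating it; the field names are illustrative, not taken from any specific tool:

    from dataclasses import dataclass

    @dataclass
    class QueryIndicators:
        phrase: str
        broad_frequency: int        # impressions, including word forms and extra words
        exact_frequency: int        # impressions without morphological changes
        top_pages: list[str]        # pages in the search TOP for this query
        promotion_cost: float       # estimated cost of promotion for the query
        competition: float          # keyword competition level
        predicted_clicks: int       # predicted number of transitions
        bounce_rate: float          # share of users closing the site after the click
        is_seasonal: bool           # seasonality of the service
        is_geo_dependent: bool      # tied to the company's and customers' location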

How to build a semantic core

In practice, the selection of the semantic core can be carried out by the following methods:

    Competitor websites can serve as a source of keywords for the semantic core. There you can quickly pick up keywords and determine the frequency of their "environment" using semantic analysis: run a semantic assessment of a page's text, and the most frequently mentioned words make up its morphological core;

    We recommend creating your own semantic core based on the statistics of special services. Use, for example, Yandex Wordstat, the statistics service of the Yandex search engine. There you can see the frequency of a search query and find out what users search for along with a given keyword;

    "Hint" systems (search suggestions) appear when you start typing a search phrase into the search bar. These words and phrases can also enter the SC as related queries;

    Closed databases of search queries, for example Pastukhov's database, can be a source of keywords for the SC. These are special data arrays containing information about effective combinations of search queries;

    Internal site statistics can also be a source of data about the search queries that interest users. They record the traffic source: where the reader came from, how many pages they viewed, and which browser they used.

Free tools for compiling a semantic core:

Yandex.Wordstat is a popular free tool used in compiling a semantic core. With the service you can find out how many times visitors entered a specific query into the Yandex search engine. It also lets you analyze the dynamics of demand for a given query by month.

Google AdWords is one of the most widely used systems for compiling the semantic core of a site. With the Google Keyword Planner you can calculate and forecast the impressions of specific queries in the future.

Yandex.Direct is used by many developers to select the most profitable keywords. If advertisements are later placed on the site, this approach will bring the resource owner a good profit.

Slovoeb is the younger brother of Key Collector and is used to compile the semantic core of a site. It takes Yandex data as its basis. Among its advantages are an intuitive interface and accessibility not only to professionals but also to beginners who are just starting out in SEO analytics.

Paid tools for compiling a semantic core:

Pastukhov's databases, according to many experts, have no competitors. They display queries that neither Google nor Yandex shows. There are many other features inherent to Max Pastukhov's databases, including a convenient software shell.

SpyWords is an interesting tool for analyzing competitors' keywords. With it you can run a comparative analysis of the semantic cores of the resources you are interested in and get full data on competitors' PPC and SEO campaigns. The resource is in Russian, so figuring out its functionality is no problem.

A paid program created specifically for professionals. It helps compose the semantic core by identifying relevant queries and is used to estimate the cost of promoting a resource for the keywords of interest. Besides a high level of efficiency, this program stands out for its ease of use.

SEMrush allows you to determine the most effective keywords based on data from competing resources. It can be used to select low-frequency queries with a high level of traffic. As practice shows, such queries make it very easy to promote a resource to the first positions of the search results.

SeoLib is a service that has earned optimizers' trust. It has extensive functionality: it lets you competently compose a semantic core and perform the necessary analytics. In free mode you can analyze 25 queries per day.

Promoter allows you to assemble a primary semantic core in just a few minutes. It is a service used mainly for analyzing competing sites and for selecting the most effective key queries. Keyword analysis can be run against Google for Russia or Yandex for the Moscow region.

The semantic core comes together fairly quickly if you use these sources and databases as a starting point.

The following processes should be distinguished:

- Based on the site's content and relevant topics, key queries are selected that most accurately reflect the semantic load of your web portal.
- From the selected set, superfluous queries are eliminated - those that could worsen the indexing of the resource. Keywords are filtered based on the results of the analysis described above.
- The resulting semantic core is distributed evenly across the site's pages; if necessary, texts with the required theme and keyword volume are commissioned.

An example of collecting a semantic core using the Wordstat Yandex service

For example, you are promoting a nail salon in Moscow.

We brainstorm and select all kinds of words that fit the site's theme.

The company's line of business

  • manicure salon;
  • nail service salon;
  • nail service studio;
  • manicure studio;
  • pedicure studio;
  • nail design studio.

General service names

  • pedicure;
  • manicure;
  • nail extension.

Now we go to the Yandex service and enter each query, having first selected the region in which we are going to promote.

We copy all the words from the left column into Excel, plus the related phrases from the right one.

We remove unnecessary words that do not fit the topic; such words are highlighted in red below.

The figure of 2,320 impressions shows how many times people typed this query, not only in its pure form but also as part of other phrases, for example: "manicure and price in Moscow", "price for manicure and pedicure in Moscow", etc.

If you enter our query in quotation marks, you get a different, smaller figure that takes into account only this phrase and its word forms, for example: "manicure prices", "price of a manicure", etc.

If you enter the same query in quotation marks with an exclamation mark before each word, you will see how many times users typed exactly the query "manicure price".
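
To sum up the three levels of precision in one place (the counts for the narrower matches are hypothetical, for the sake of the example):

    manicure price          - 2320 impressions: the phrase in any form, including inside longer queries
    "manicure price"        - only this phrase and its word forms (e.g. "manicure prices")
    "!manicure !price"      - only this exact wording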

Next, we distribute the resulting list of words across the site's pages. For example, we will leave high-frequency queries for the home page and the main sections of the site, such as: manicure, nail service studio, nail extension. Mid- and low-frequency queries are distributed across the remaining pages, for example: manicure and pedicure prices, gel nail extension design. Words should also be divided into groups by meaning.

  • Main page - studio, nail service salon, etc.
  • 3 sections - pedicure, manicure, prices for manicure and pedicure.
  • Pages - nail extension, hardware pedicure, etc.

What mistakes can be made when compiling a semantic core

When compiling a semantic core, no one is immune from errors. The most common include the following:

  1. There is always the danger of choosing inefficient queries that generate the minimum number of visitors.
  2. When re-promoting a site, you should not completely replace the content already posted on it. Otherwise all previously earned parameters will be reset, including the ranking in the search results.
  3. You should not use queries that are grammatically incorrect in Russian: search robots now recognize such queries well and remove pages from the search for "spamming" with keywords.

We wish you good luck in promoting your site!

What is the semantic core of the site? The semantic core (hereinafter SC) is a set of keywords and phrases for which the resource is promoted in search engines and which indicate that the site belongs to a certain topic.

For successful promotion in search engines, keywords must be correctly grouped, distributed across the site's pages, and contained, in a certain form, in the meta tags (title, description, keywords) and in the H1-H6 headings. At the same time, over-spamming must be avoided so as not to "fly off" to Baden-Baden, Yandex's filter for over-optimized texts.

In this article we will try to look at the issue not only from a technical point of view but also through the eyes of business owners and marketers.

How can an SC be collected?

  • manually - feasible for small sites (up to 1,000 keywords);
  • automatically - programs do not always correctly determine the context of a query, so there may be problems with distributing keywords across pages;
  • semi-automatically - phrases and frequencies are collected automatically, while the distribution of phrases and the final refinement are done manually.

In our article, we will consider a semi-automatic approach to creating a semantic core, since it is the most effective.

In addition, there are two typical cases when compiling an SC:

  • for a site with a ready-made structure;
  • for a new site.

The second option is preferable, since it allows you to create an ideal site structure for search engines.

What does the process of compiling an SC look like?

Work on the formation of the semantic core is divided into the following stages:

  1. Identifying the directions in which the site will be promoted.
  2. Collection of keywords, analysis of similar queries and search suggestions.
  3. Frequency parsing, elimination of "empty" requests.
  4. Clustering (grouping) requests.
  5. Distribution of queries across the pages of the site (drawing up the ideal structure of the site).
  6. Recommendations for use.

The better you make the site's core (quality here meaning the breadth and depth of the semantics), the more powerful and reliable a flow of search traffic you can send to the site, and the more customers you will attract.

How to make the semantic core of the site

So, let's take a closer look at each item with various examples.

At the first step, it is important to determine which goods and services present on the site will be promoted in the search results of Yandex and Google.

Example #1. Suppose a site offers two kinds of services: computer repair at home and training in Word/Excel at home. It was decided that the training is no longer in demand, so there is no point in promoting it or collecting semantics for it. Another important point: you need to collect not only queries containing "computer repair at home" but also "laptop repair", "pc repair", and others.

Example #2. A company is engaged in low-rise construction but builds only wooden houses. Accordingly, semantics for directions such as "construction of aerated-concrete houses" or "building brick houses" need not be collected.

Collection of semantics

We will consider two main sources of keywords: Yandex and Google. We will tell you how to collect semantics for free and briefly review paid services that allow you to speed up and automate this process.

In Yandex, key phrases are collected from the Yandex.Wordstat service; in Google, from query statistics in Google AdWords. If available, data from Yandex Webmaster and Yandex Metrica, Google Webmaster and Google Analytics can be used as additional sources of semantics.

Collection of keywords from Yandex.Wordstat

Collecting queries from Wordstat can be considered free: to view the service's data you only need a Yandex account. So let's go to wordstat.yandex.ru and enter a keyword. Consider an example of collecting semantics for a car rental company's website.

What do we see in this screenshot?

  1. Left column. Here is the main query and its various variations with "tails". Next to each query is a number showing how many times it has been used, in all its forms, by various users.
  2. Right column. Queries similar to the main one, with their overall frequency. Here we see that a person who wants to rent a car may, besides "car rental", type "rent a car", "car hire", "hire a car" and others. This is very important data to watch so as not to miss a single query.
  3. Regionality and history. By choosing one of these options, you can check the distribution of queries by region, the number of queries in a particular region or city, and the trend over time or across seasons.
  4. Devices from which the query was made. By switching tabs, you can find out which devices people search from most often.

Check different variants of key phrases and record the received data in Excel or Google Sheets. For convenience, install the Yandex Wordstat Helper plugin: after installation, plus signs appear next to the search phrases, and clicking one copies the word together with its frequency, so there is no need to select and paste anything manually.

Collecting keywords from Google AdWords

Unfortunately, Google does not have a public source of search queries with frequency metrics, so a workaround is needed here, and for it we need a working Google AdWords account.

We register a Google AdWords account and top up the balance by the minimum possible amount, 300 rubles (an account with no budget activity displays only approximate data). After that, go to "Tools" - "Keyword Planner".

A new page will open; in the tab "Search for new keywords by phrase, site or category", enter the keyword.

Scroll down, click "Get Options" and see something like this.

  1. Main query and average queries per month. If the account is not paid, then you will see approximate data, that is, the average number of requests. When there are funds on the account, the exact data will be shown, as well as the dynamics of changes in the frequency of the entered keyword.
  2. Keywords by relevance. This is the same as similar queries in Yandex Wordstat.
  3. Downloading data. This tool is convenient in that the data obtained in it can be downloaded.

We considered working with two main sources of statistics on search queries. Now let's move on to automating this process, because collecting semantics manually takes too much time.

Programs and services for collecting keywords

Key Collector

The program is installed on your computer. Working accounts, from which statistics will be collected, are connected to it. Then a new project and a folder for keywords are created.

Select "Batch collection of words from the left column of Yandex.Wordstat" and enter the queries for which we are collecting data.

The screenshot shows a simplified example; in fact, for a more complete SC you would also need to collect all query variants with car makes and classes, for example "rent a bmw", "rent a toyota with a purchase option", "rent an SUV", and so on.

SlovoEb

A free analogue of the previous program. This is both a plus (you do not need to pay) and a minus (the functionality is significantly reduced).

To collect keywords, the steps are the same.

Rush-analytics.ru

An online service. Its main advantage is that you do not need to download or install anything: register and go. The service is paid, but on registration you get 200 coins on your account, which is quite enough to collect a small semantic core (up to 5,000 queries) and parse frequencies.

The minus: it collects semantics only from Wordstat.

Checking the frequency of keywords and queries

Checking the collected phrases with the operators described above, we again notice a decrease in the number of queries. Let's go further and try another word form of the same query.

We see that the singular form is searched for by far fewer users, which means the initial query has a higher priority for us.

Such manipulations must be carried out with every word and phrase. Queries whose final frequency turns out to be zero (when checked with quotation marks and exclamation marks) are eliminated: a "0" means nobody enters such a query on its own; it occurs only as part of other queries. The point of compiling a semantic core is to select the queries people actually use to search. All remaining queries are then placed in an Excel table, grouped by meaning, and distributed across the site's pages.
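
To give an idea of this filtering step, here is a minimal sketch in Python, assuming you have exported the phrases and their exact ("!word") frequencies to a CSV file; the file and column names are assumptions, adjust them to your export:

    import pandas as pd

    # Load the exported keyword list (column names are assumed: phrase, exact_freq)
    df = pd.read_csv("keywords.csv")

    # Drop the "dummies": zero exact frequency means nobody searches this phrase on its own
    df = df[df["exact_freq"] > 0]

    # Sort the survivors so the most promising keys come first
    df = df.sort_values("exact_freq", ascending=False)
    df.to_csv("keywords_clean.csv", index=False)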

Doing this manually is simply unrealistic, so there are many services on the Internet, paid and free, that automate the process. Here are a few:

  • megaindex.com;
  • rush-analytics.ru;
  • tools.pixelplus.com;
  • key-collector.ru.

Removing non-target requests

After sifting keywords, you should remove unnecessary ones. What search queries can be removed from the list?

  • requests with the names of competitors' companies (can be left in contextual advertising);
  • requests for goods or services that you do not sell;
  • requests that indicate the district or region in which you do not work.

Clustering (grouping) requests for site pages

The essence of this stage is to combine requests that are similar in meaning into clusters, and then determine which pages they will be promoted to. How to understand which queries to promote to one page, and which to another?

1. By request type.

We already know that all queries in search engines are divided into several types, depending on the purpose of the search:

  • commercial (buy, sell, order) - they are promoted to landing pages, product category pages, product cards, service pages, price lists;
  • informational (where, how, why, why) - articles, forum topics, a section answering a question;
  • navigational (phone, address, brand name) - page with contacts.

If you are in doubt about a query's type, enter it into the search bar and analyze the results: for a commercial query there will be more pages offering a service; for an informational one, more articles.

There are also geo-dependent and geo-independent queries. Most commercial queries are geo-dependent, since people tend to trust companies located in their own city.

2. Request logic.

  • "buy iphone x" and "iphone x price" - you need to promote one page, since in the first and second cases, the same product is searched, and more detailed information about him;
  • "buy iphone" and "buy iphone x" - you need to promote on different pages, since in the first request we are dealing with a general request (suitable for the product category where iPhones are located), and in the second, the user is looking for a specific product and this request should be promoted to the product card;
  • "how to choose good smartphone» - this request is more logical to promote to a blog article with the appropriate title.

3. By search results. Check the search results for the queries: if you look at which pages of different sites rank for "building houses from timber" and "building brick houses", in 99% of cases they are different pages.

4. Automatic grouping by software and manual refinement.

The 1st and 2nd methods are fine for compiling the semantic core of small sites with at most 2-3 thousand keywords. For a large SC (from 10,000 queries to infinity), the help of machines is needed. Here are a few programs and services that perform clustering:

  • KeyAssistant - assistant.contentmonster.ru;
  • semparser.ru;
  • just-magic.org;
  • rush-analytics.ru;
  • tools.pixelplus.com;
  • key-collector.ru.

After the completion of automatic clustering, it is necessary to check the result of the program's work manually and, if errors are made, correct them.

Example: the program can send the following queries to one cluster: “rest in Sochi 2018 hotel” and “rest in Sochi 2018 hotel breeze” - in the first case, the user is looking for various options for hotels to stay, and in the second, a specific hotel.

To eliminate the occurrence of such inaccuracies, you need to manually check everything and, if errors are found, edit.
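
For illustration, here is a minimal sketch in Python of the SERP-overlap idea behind such tools: two queries fall into one cluster when their top-10 results share at least a threshold number of URLs. The data and the threshold are assumptions for the example; real services use more signals than this:

    # Minimal SERP-overlap clustering sketch.
    # serps maps each query to the set of URLs in its top-10 results,
    # collected beforehand (e.g. with a rank-tracking service).
    def cluster_queries(serps: dict[str, set[str]], min_shared: int = 4) -> list[list[str]]:
        clusters: list[tuple[set[str], list[str]]] = []  # (union of URLs, queries)
        for query, urls in serps.items():
            for cluster_urls, queries in clusters:
                if len(urls & cluster_urls) >= min_shared:
                    queries.append(query)      # enough shared results: same cluster
                    cluster_urls |= urls       # widen the cluster's URL pool in place
                    break
            else:
                clusters.append((set(urls), [query]))  # no match: start a new cluster
        return [queries for _, queries in clusters]

    serps = {
        "rest in Sochi 2018 hotel": {"a.com", "b.com", "c.com", "d.com", "e.com"},
        "sochi hotels by the sea": {"a.com", "b.com", "c.com", "d.com", "f.com"},
        "hotel breeze sochi": {"x.com", "y.com", "z.com", "b.com", "q.com"},
    }
    print(cluster_queries(serps))  # first two group together; the hotel-specific query stays apart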

What to do next after compiling the semantic core?

Based on the collected semantic core, we then:

  1. compose the ideal structure (hierarchy) of the site from the point of view of search engines, or, in agreement with the customer, change the structure of the old site;
  2. write technical specifications for copywriters, taking into account the cluster of queries to be promoted on each page, or rework the old texts on the site.

It looks like this.

For each generated query cluster, we create a page on the site and determine its place in the site structure. The most popular queries are promoted on the pages highest in the resource hierarchy; the less popular ones sit below them.

For each of these pages we have already collected the queries we will promote on it. Next, we write technical specifications for the copywriters who will produce the texts for these pages.

Terms of reference for a copywriter

As in the case of the site structure, we will describe this stage in general terms. So, the terms of reference for the text:

  • number of characters without spaces;
  • page title;
  • subheadings (if any);
  • a list of words (based on our kernel) that should be in the text;
  • uniqueness requirement (always require 100% uniqueness);
  • desired text style;
  • other requirements and wishes according to the text.

Remember: don't try to promote hundreds of queries on one page. Limit yourself to 5-10 plus their "tail", otherwise you will be penalized for over-optimization and knocked out of the race for TOP positions for a long time.

Conclusion

Compiling the semantic core of a site is painstaking, hard work that deserves special attention, because the site's further promotion is built on it. Follow the simple instructions given in this article and get started.

  1. Choose the direction of advancement.
  2. Collect all possible requests from Yandex and Google (use special programs and services).
  3. Check the frequency of the queries and get rid of the dummies (those with a frequency of 0).
  4. Delete non-target queries: services and products that you do not sell, and queries mentioning competitors.
  5. Form clusters of requests and distribute them across pages.
  6. Create an ideal site structure and draw up technical specifications for filling the site.

The semantic core is a scary name that SEOs have come up with to refer to a fairly simple thing. We just need to select the key queries for which we will promote our site.

And in this article, I will show you how to properly compose a semantic core so that your site quickly reaches the TOP, and does not stagnate for months. Here, too, there are "secrets".

And before we move on to compiling the SC, let's look at what it is and what we should end up with.

What is the semantic core in simple words

Oddly enough, the semantic core is just a regular Excel file that lists the key queries for which you (or your copywriter) will write articles for the site.

For example, here is how my semantic core looks like:

I have marked in green the key queries for which I have already written articles, in yellow those I am going to write about in the near future; the colorless cells mean those queries will come a little later.

For each key query I have determined the frequency and competitiveness and invented a "catchy" title. Your file should look roughly the same. My SC now consists of 150 keywords, which means I am provided with "material" for at least 5 months ahead (even if I write one article a day).

A little lower we will talk about what to expect if you decide to order the collection of a semantic core from specialists. I will say briefly here: they will give you the same kind of list, only with thousands of "keys". But in an SC it is not quantity that matters but quality, and that is what we will focus on.

Why do we need a semantic core at all?

But really, why do we need this torment? You can, in the end, just write high-quality articles just like that, and attract an audience with this, right? Yes, you can write, but you can’t attract.

The main mistake of 90% of bloggers is just writing high-quality articles. I'm not kidding, they have really interesting and useful materials. But search engines don't know about it. They are not psychics, but just robots. Accordingly, they do not put your article in the TOP.

There is another subtle point here, concerning the title. For example, you have a very high-quality article on the topic "How to do business in the 'mug book'" (a slang name for Facebook). There you describe everything about Facebook in great detail and professionally, including how to promote communities there. Your article is the highest-quality, most useful and interesting piece on the Internet on this topic; nothing else comes close. But it still won't help you.

Why quality articles fly out of the TOP

Imagine that your site was visited not by a robot but by a live checker (an assessor) from Yandex. He understood that you have the coolest article and manually put you in first place in the search results for the query "Community promotion on Facebook".

Do you know what happens next? You will be thrown out of there very soon, because no one will click on your article even in first place. People enter the query "Community promotion on Facebook", and your headline reads "How to do business in the 'mug book'". Original, fresh, funny, but... not what was asked for. People want to see exactly what they were looking for, not your creativity.

Accordingly, your article will occupy a place in the TOP of the results for nothing. And the living assessor, an ardent admirer of your work, can beg his superiors as long as he likes to keep you at least in the TOP-10, but it won't help. All the first places will be occupied by articles as empty as sunflower-seed husks, copied from one another by yesterday's schoolchildren.

But those articles will have correct, "relevant" titles: "Community promotion on Facebook from scratch" (step by step, in 5 steps, from A to Z, free, etc.). Is it a shame? Sure it is. Well then, let's fight the injustice and make a competent semantic core so that your articles take the first places they deserve.

Another reason to start compiling an SC right now

There is one more thing that for some reason people don't think much about. You need to write articles often: at least every week, and preferably 2-3 times a week, to get more traffic faster.

Everyone knows this, but almost no one does it. And all because they have “creative stagnation”, “they can’t force themselves”, “just laziness”. But in fact, the whole problem is precisely in the absence of a specific semantic core.

Step #1 - Selecting Base Keys

I entered one of my base keys, "smm", into the search field, and Yandex immediately gave me a dozen suggestions of what else might interest people who are interested in "smm". All I have to do is copy these keys into a notebook. Then I check each of them in the same way and collect their suggestions as well.

After this first stage of collecting the SC, you should end up with a text document containing 10-30 broad base keys, which we will work with further.

Step #2 - Parsing Basic Keys in SlovoEB

Of course, if you write an article for the query "webinar" or "smm", then a miracle will not happen. You will never be able to reach the TOP for such a broad query. We need to break the base key into many small queries on this topic. And we will do this with the help of a special program.

I use Key Collector, but it is paid. You can use its free analogue, the SlovoEB program, which can be downloaded from the official site.

The most difficult thing about this program is setting it up correctly. I show how to set up and use Slovoeb in a separate article, but there I focus on selecting keys for Yandex.Direct.

And here let's take a look at the features of using this program for compiling a semantic core for SEO step by step.

First we create a new project and name it according to the broad key you want to parse.

I usually give the project the same name as my base key so I don't get confused later. And I'll warn you against another mistake: don't try to parse all the base keys at once, or it will be very difficult to separate the "empty" key queries from the golden grains. Parse one key at a time.

After creating the project, we carry out the basic operation: we parse the key through Yandex Wordstat. To do this, click the "Wordstat" button in the program interface, enter your base key, and click "Start collecting".

For example, let's parse the base key for my blog "contextual advertising".

After that, the process will start, and after a while the program will give us the result - up to 2000 key queries that contain "contextual advertising".

Also, next to each request there will be a “dirty” frequency - how many times this key (+ its word forms and tails) was searched per month through Yandex. But I do not advise you to draw any conclusions from these figures.

Step #3 - Gathering the exact frequency for the keys

The "dirty" frequency by itself tells us nothing. If you rely on it, don't be surprised later when a key with 1,000 queries doesn't bring a single visitor per month.

We need to find the exact ("net") frequency. To do this, first select all the found keys with checkmarks, then click the Yandex.Direct button and start the process again. Now Slovoeb will look up the exact monthly frequency for each key.

Now we have an objective picture - how many times what request was entered by Internet users over the past month. Now I propose to group all key queries by frequency, so that it would be more convenient to work with them.

To do this, click the filter icon in the 'Frequency "!"' column and specify: show keys with a value "less than or equal to 10".

Now the program will show only the queries whose frequency is less than or equal to 10. You can delete these queries or copy them to another keyword group for later. Less than 10 is very low; writing articles for these queries is a waste of time.

Now we need to choose the keywords that will bring us more or less good traffic. For that we need to find out one more parameter: the query's level of competition.

Step #4 - Checking Query Competition

All "keys" in this world are divided into 3 types: high-frequency (HF), mid-frequency (MF), low-frequency (LF). And they can also be highly competitive (VC), medium competitive (SC) and low competitive (NC).

As a rule, HF requests are simultaneously VC. That is, if a query is often searched on the Internet, then there are a lot of sites that want to advance on it. But this is not always the case, there are happy exceptions.

The art of compiling a semantic core lies precisely in finding such queries that have a high frequency, and their level of competition is low. Manually determining the level of competition is very difficult.

You can look at indicators such as the number of home pages in the TOP-10, the length and quality of the texts, and the trust level of the sites in the TOP of the results for the query. All of this gives some idea of how tough the competition for this particular query is.

But I recommend using the Mutagen service. It takes into account all the parameters I mentioned above, plus a dozen more that neither you nor I have probably even heard of. After the analysis, the service gives an exact value: the level of competition for the query.

Here I checked the query "setting up contextual advertising in google adwords". Mutagen showed that this key has a competition level of "more than 25", the maximum value it reports, while the query gets only 11 views per month. So it doesn't suit us.

We can copy all the keys picked up in Slovoeb and run a mass check in Mutagen. After that, we only have to look through the list and take the queries with many searches and a low level of competition.

Mutagen is a paid service, but you can do 10 checks per day for free. Besides, the cost of a check is very low: in all the time I have worked with it, I have not yet spent even 300 rubles.

By the way, about the competition level: if you have a young site, it is better to choose queries with a competition level of 3-5; if you have been promoting for more than a year, you can take 10-15.

And about query frequency: we now need to take the final step, which will let you attract a lot of traffic even with low-frequency queries.

Step #5 - Collecting "tails" for the selected keys

As has been proven and verified many times, your site will receive the bulk of traffic not from the main keys, but from the so-called “tails”. This is when a person enters strange key queries into the search box, with a frequency of 1-2 per month, but there are a lot of such queries.

To see the "tail" - just go to Yandex and enter your chosen key query in the search bar. Here's what you'll see.

Now you just need to write out these additional words in a separate document and use them in your article. Note that you do not have to place them right next to the main key every time; otherwise search engines will see "over-optimization" and your articles will fall in the search results.

Just use them in different places in your article, and then you will receive additional traffic from them as well. I would also recommend that you try to use as many word forms and synonyms as possible for your main key query.

For example, take the query "setting up contextual advertising". Here is how it can be reformulated:

  • Setting up = set up, make, create, run, launch, enable, host...
  • Contextual advertising = context, Direct, teaser ads, YAN, AdWords, display network (KMS)...

You never know exactly how people will look for information. Add all these additional words to your semantic core and use them when writing texts.
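
As a rough illustration, here is a small Python sketch that multiplies such synonym lists into phrase variants; the word lists are just the example ones above:

    from itertools import product

    # Synonym groups for the two slots of the phrase (from the example above)
    setup_words = ["setup", "set up", "create", "run", "launch"]
    ad_words = ["contextual advertising", "context", "direct", "adwords"]

    # Every combination is a potential "tail" wording to scatter through the text
    variants = [f"{a} {b}" for a, b in product(setup_words, ad_words)]
    print(len(variants), "variants, e.g.:", variants[:3])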

This way we collect a list of 100-150 keywords. If you are compiling a semantic core for the first time, it may take you several weeks.

Or maybe you'd rather not strain your eyes over it? Perhaps the compilation of the SC can be delegated to specialists who will do it better and faster? There are such specialists, but you don't always need their services.

Is it worth ordering an SC from specialists?

By and large, semantic core specialists will only do steps 1-3 of our scheme for you. Sometimes, for a large additional fee, they will also do steps 4-5 (collecting tails and checking query competition).

After that, they will give you several thousand key queries with which you will need to work further.

And the question here is whether you are going to write articles yourself, or hire copywriters for this. If you want to focus on quality, not quantity, then you need to write it yourself. But then it won't be enough for you to just get a list of keys. You will need to choose those topics that you understand well enough to write a quality article.

And here the question arises - why then do we actually need specialists in SA? Agree, parsing the base key and collecting the exact frequencies (steps #1-3) is not at all difficult. It will take you literally half an hour.

The most difficult thing is to choose high-frequency queries with low competition, and moreover HF-LC queries on which you can write a good article. That is exactly what will take 99% of your time working on the semantic core, and no specialist will do it for you. Is it worth spending money on such services?

When the services of SC specialists are useful

Another thing is if you initially plan to attract copywriters. Then you do not need to understand the subject of the request. Your copywriters will also not understand it. They will simply take a few articles on this topic and compile “their” text from them.

Such articles will be empty, miserable, almost useless. But there will be many. On your own, you can write a maximum of 2-3 quality articles per week. And the army of copywriters will provide you with 2-3 shitty texts a day. At the same time, they will be optimized for requests, which means they will attract some kind of traffic.

In this case, yes, go ahead and hire SC specialists, and let them draw up the technical specifications for the copywriters while they're at it. But you understand, that will also cost money.

Summary

Let's go over the main ideas in the article again to consolidate the information.

  • The semantic core is just a list of keywords for which you will write articles on the site for promotion.
  • Texts must be optimized for exact key queries; otherwise even your highest-quality articles will never reach the TOP.
  • The SC is like a content plan for social networks: it keeps you from falling into a "creative block" and lets you know exactly what you will write about tomorrow, the day after tomorrow and in a month.
  • To compose a semantic core, the free Slovoeb program is all you really need.
  • The five steps of compiling an SC are: 1 - selecting base keys; 2 - parsing the base keys; 3 - collecting the exact frequency for queries; 4 - checking the competitiveness of keys; 5 - collecting "tails".
  • If you want to write the articles yourself, it is better to make the semantic core yourself, for yourself; SC-compilation specialists cannot help you here.
  • If you want to work on quantity and use copywriters to write articles, then you can well delegate the compilation of the semantic core, as long as there is enough money for everything.

I hope this guide was helpful to you. Save it to your favorites so as not to lose it, and share it with your friends. And don't forget to download my book, where I show the fastest way from zero to the first million on the Internet (squeezed from 10 years of personal experience =).

See you later!

Your Dmitry Novoselov

From this article you will learn:

  • How to make the semantic core of the site
  • What programs to use for this
  • How to analyze the semantic core of a competitor's website
  • What mistakes are most often made in the assembly of the semantic core
  • How much does it cost to order a ready-made semantic core of the site

The semantic core is the basis of any Internet resource, the key to its successful promotion and attracting the target audience. How to create the semantic core of the site and what mistakes to avoid, you will learn from this article.

What is the semantic core of the site

The simplest and yet most effective way to attract visitors to your site is to make sure they show interest in it themselves by clicking a link in Yandex or Google search results. To do this, you need to find out what your target audience is interested in and how, with what words, users search for the information they need. The semantic core will help you with this.

The semantic core is a collection of individual words and phrases that characterize the subject and structure of your site. Semantics was originally the branch of philology dealing with the meaning of words; nowadays it is more often understood as the study of meaning in general.

From this we can conclude that "semantic core" and "sense core" are synonymous concepts.

The purpose of creating the semantic core of the site is to fill it with content that is attractive to users. To do this, you need to find out what keywords they will use to search for information posted on your site.

The selection of the semantic core of the site involves the distribution of search queries or groups of queries across pages in such a way that they satisfy the target audience as much as possible.

This can be achieved in two ways. The first is to analyze the search phrases of users and, based on them, create a site structure. The second way is to first come up with a framework for the future site, and then, after analysis, distribute keywords over it.

Each method has a right to exist, but the second one is more logical: first you create the structure of the site, and then fill it with search queries, by which potential customers can find the content they need through search engines.

This way you show proactivity: you decide for yourself what information to convey to site visitors. Otherwise, by building the site structure from keywords alone, you merely adapt to the surrounding reality.

There is a fundamental difference between the approach to creating the semantic core of the site of an SEO specialist and a marketer.

A classic optimizer will tell you: to create a website, you need to select phrases and words from search queries for which you can get to the TOP of search results. Then, on their basis, form the structure of the future site and distribute the keywords across the pages. Page content is created for the selected keywords.

A marketer or entrepreneur will approach the issue of creating a website differently. First, he will think about what the site is for, what information it will carry to users. Then he will come up with an approximate structure of the site and a list of pages. At the next stage, he will create the semantic core of the site in order to understand what search queries potential customers are looking for information on.

What are the disadvantages of approaching the semantic core from the SEO specialist's position? First of all, with this approach the quality of information on the site deteriorates significantly.

The company should decide for itself what to tell customers, rather than serving content in response to the most popular search queries. Such blind optimization can also eliminate promising queries with low frequency.

The result of creating a semantic core is a list of keywords distributed across the site's pages, indicating the page URLs, the keywords, and their query frequency.

An example of the semantic core of the site

How to compose the semantic core of the site: step by step instructions

Step 1. Compile the initial list of requests

First you need to select the most popular search queries on the subject of your site. There are two options for how to do this:

1. The brainstorming method: over a short period of time, you yourself or together with colleagues write down all the words and phrases by which, in your opinion, users might search for the information posted on your site.

Write down all possible options, including:

  • variations in the spelling of the name of a product or service, synonymous words, ways of writing the name in Latin and Cyrillic letters;
  • full names and abbreviations;
  • slang words;
  • references to the constituent elements of a product or service, for example, building materials - sand, brick, corrugated board, putty, etc.;
  • adjectives that reflect significant characteristics of a product or service (quality repairs, fast delivery, painless dental treatment).

2. Analyze your competitors' sites. Open your browser in incognito mode for your region and look at the competitor websites shown in the search results for your topic, collecting all potential keywords. You can determine the semantic core of a competitor's website using services such as bukvarix.com.

3. Analyze contextual advertising. On your own or with the help of specialized services (for example, spywords.ru or advodka.com), study the semantic core of other people's sites and find out which keywords competitors use.

By applying all three approaches, you will get a fairly large list of keywords. But it will still not be enough to create an effective semantic core.

Step 2. Expanding the resulting list

At this stage, the Yandex.Wordstat and Google AdWords services will help you. Enter the words from the key list you generated at the first stage, one by one, into the search bar of either service, and at the output you will get a list of refined and associated search queries.

Refined queries can include other words or phrases in addition to your word. For example, if you enter the keyword "dog", then the service will give you 11,115,538 queries with this word, which include such queries for the last month as "photos of dogs", "treatment of dogs", "breeds of dogs", etc.


Association queries are the words or phrases that users searched for along with your query. For example, along with the keyword “dog”, users entered: “dry food”, “royal canin”, “Tibetan mastiff”, etc. These search queries can also be useful to you.


In addition, there are special programs for creating a site's semantic core, such as KeyCollector and SlovoEB, and online services such as Topvisor, serpstat.com, etc. They allow you not only to select keywords but also to analyze and group search queries.

To expand the list of keys as much as possible, see what the service's search suggestions show. There you will find the most popular search terms that start with the same letters or words as yours.

Step 3. Remove unnecessary requests

Search queries can be classified in different ways. Depending on the frequency, requests are:

  • high-frequency (more than 1500 requests per month);
  • mid-frequency (600-1500 requests per month);
  • low-frequency (100-200 requests per month).

This classification is highly arbitrary. Assigning a request to one category or another will depend on the subject of a particular site.

In recent years, there has been an upward trend in low-frequency queries. Therefore, to promote the site, the semantic core should include mid- and low-frequency queries.

There is less competition among them, so it is much easier to raise a site to the first page of search results than with high-frequency queries. Besides, many search engines respond favorably when sites use low-frequency keywords.
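
A tiny sketch of such a band classifier in Python, using the thresholds above; note that the source bands leave the 200-600 range unassigned, so here it is treated as mid-frequency by assumption:

    def frequency_band(monthly_queries: int) -> str:
        # Thresholds from the classification above; the 200-600 gap goes to "mid" by assumption
        if monthly_queries > 1500:
            return "high"
        if monthly_queries >= 200:
            return "mid"
        if monthly_queries >= 100:
            return "low"
        return "dummy"  # too rare to target on its own

    print(frequency_band(2320))  # high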

Another classification of search queries is by search objectives:

  1. Informational: keywords users enter in search of specific information. For example: "how to tile a bathroom yourself", "how to connect a dishwasher".
  2. Transactional: keywords users enter when planning to perform an action. For example: "watch a movie online for free", "download a game", "buy building materials".
  3. Vital: queries users enter in search of a specific site. For example: "Sberbank online", "buy a refrigerator on Yandex.Market", "vacancies on HeadHunter".
  4. Other (general): all remaining search queries, from which it is unclear exactly what the user wants. For example, the query "car" could come from someone who wants to sell, buy, or repair a car.

Now it's time to remove from the list of keywords all unnecessary ones that:

  • do not correspond to the theme of your site;
  • include competitor brand names;
  • include the names of other regions (for example, buy an iPhone in Moscow if your site works only for Western Siberia);
  • contain typos or errors (a misspelled word like "dgo" instead of "dog" is treated by the search engine as a separate search query).

Step 4. Define competitive requests

To effectively distribute keywords across the site's pages, filter them by importance using the Keyword Effectiveness Index (KEI). The calculation formula:

KEI = P² / C,

where P is the frequency of the keyword's impressions over the last month, and C is the number of sites optimized for this search query.

The formula shows that the more popular the keyword, the higher the KEI, the more targeted traffic you will attract to your site. High competition for a search query makes it difficult to promote a site on it, which is reflected in the KEI value.

Thus, the higher the KEI, the more popular the search query, and vice versa: the lower the keyword performance index, the higher the competition for it.

There is a simplified version of this formula:

KEI = P² / U,

where U, the number of pages optimized for the keyword, is used instead of C.

Let's look at an example of how to use the Keyword Effectiveness Index (KEI). Let's determine the frequency of requests using the Yandex Wordstat service:


At the next step, let's see how many pages are in the search results for the search query we are interested in for the last month.


Substitute the found values of the variables into the formula and calculate the keyword effectiveness index KEI:

KEI = (206 146 * 206 146) / 70 000 000 = 607

How to evaluate KEI values:

  • if KEI is less than 10, then search queries are ineffective;
  • if KEI is from 10 to 100, then search queries are effective, they will attract the target audience to the site;
  • if KEI is from 100 to 400, then search queries are very effective, they will attract a significant share of traffic;
  • with a KEI of more than 400, search queries have maximum efficiency and will attract a huge number of users.
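
Putting the formula and the scale together, a minimal sketch in Python, using the worked example above as the test values:

    def kei(impressions: int, competing: int) -> float:
        """Keyword Effectiveness Index: KEI = P^2 / C (or P^2 / U in the simplified form)."""
        return impressions ** 2 / competing

    def kei_grade(value: float) -> str:
        # Evaluation scale from the article
        if value < 10:
            return "ineffective"
        if value < 100:
            return "effective"
        if value < 400:
            return "very effective"
        return "maximally effective"

    score = kei(206_146, 70_000_000)
    print(round(score), kei_grade(score))  # 607 maximally effective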

Keep in mind that the gradation of KEI values depends on the site's theme, so the scale above cannot be applied to all Internet resources: for some, even KEI > 400 may be insufficient, and for highly specialized sites the classification is not applicable at all.

Step 5. Group keywords on the site

Clustering the semantic core of a site is the process of grouping search queries on logical grounds and on the basis of search engine results. Before starting the grouping, it is important to make sure that the specialist who will carry it out understands all the intricacies of the company and its product and knows their specifics.

This work is expensive, especially when it comes to filling a multi-page Internet resource. But it doesn't have to be done by hand. You can cluster the semantic core of the site automatically using special services, such as Topvisor, Seranking.ru, etc.

But it is better to double-check the results obtained, since the logic of separating keys into groups for programs may not coincide with yours. In the end, you will get the final structure of the site. Now you will clearly understand which pages you need to create and which ones to eliminate.

When is it necessary to analyze the semantic core of a competitor's website?

  1. When starting a new project.

You are working on a new project and are building the semantic core of the site from scratch. To do this, you decided to analyze the keywords that competitors use to promote their sites.

Many are suitable for you, so you use them to replenish the semantic core. It is worth considering the niche in which competitors operate. If you plan to occupy a small market share, and competitors operate at the federal level, then you cannot just take and completely copy their semantic core.

  2. When expanding the semantic core of a working site.

Suppose you have a website that needs to be promoted. Its semantic core was formed long ago but works inefficiently; the site needs optimization, restructuring, and updated content to increase traffic. Where do you start?

First of all, you can analyze the semantic core on competing sites using specialized services.

How to use keywords from competitor sites in the most effective way?

Here are some easy rules. First, take into account the percentage of matches for keys from your site and from other people's Internet resources. If your site is still under development, then choose any competing site, analyze it and use keywords as the basis for creating your semantic core.

In the future, you will simply compare how much your reference keys intersect with the keys from competitor sites. The easiest way is to use a service to download a list of all competing sites and filter them by the percentage of intersections.

Then export the semantic cores of the first few sites to Excel or Key Collector and add the new keywords to the semantic core of your own site.
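
Counting the intersection percentage itself is trivial once both lists are exported; a minimal sketch with illustrative keys:

```python
def intersection_share(own_keys: set, competitor_keys: set) -> float:
    """Percentage of the competitor's keys that your core already contains."""
    if not competitor_keys:
        return 0.0
    return 100 * len(own_keys & competitor_keys) / len(competitor_keys)

own = {"semantic core", "collect keywords", "query clustering"}
competitor = {"semantic core", "collect keywords", "seo texts", "wordstat operators"}

print(f"overlap: {intersection_share(own, competitor):.0f}%")  # overlap: 50%
print("candidates to add:", sorted(competitor - own))
```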

Secondly, before copying keys from a donor site, be sure to review that site visually.

  3. When buying a ready-made site for the purpose of subsequent development or resale.

Consider an example: you want to buy a certain site, but before making a final decision, you need to evaluate its potential. The easiest way to do this is to study the semantic core, so you can compare the current coverage of the site with competitors' sites.

Take the strongest competitor as a benchmark and compare its visibility with that of the Internet resource you plan to purchase. If the gap to the reference site is significant, this is a good sign: the site you are buying has the potential to expand its semantic core and attract new traffic.

Pros and cons of analyzing the semantic core of competitors through special services

The principle of operation of many services for determining keywords on other people's sites is as follows:

  • a list of the most popular search queries is formed;
  • for each key, the top 1-10 pages of search results (SERP) are collected;
  • this collection of key phrases is repeated with a certain frequency (weekly, monthly or yearly).
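
The consequence of this scheme is easy to show in code: a service can attribute to a site only the keys that are in its query base and had the site visible in the stored SERP snapshot. A minimal sketch with made-up data:

```python
# Illustrative data; real services use bases of millions of queries.
query_base = ["semantic core", "collect keywords", "seo texts"]
serp_snapshot = {  # top results stored at collection time
    "semantic core": ["site-a.com", "site-b.com"],
    "collect keywords": ["site-b.com", "site-c.com"],
    "seo texts": ["site-c.com"],
}

def visible_keys(domain: str) -> list:
    """Keys for which the domain appeared in the stored SERP snapshot."""
    return [q for q in query_base if domain in serp_snapshot.get(q, [])]

print(visible_keys("site-b.com"))  # ['semantic core', 'collect keywords']
# A key missing from query_base, or a page that ranked only after the
# snapshot was taken, is invisible to the service: only a partial core
# of the competitor can ever be recovered this way.
```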

Disadvantages of this approach:

  • services see only the visible part of the search queries on the websites of competing organizations;
  • services keep a kind of "snapshot" of the search results made at the moment the keywords were collected;
  • services can determine visibility only for those search queries that are already in their databases, i.e. they show only the keywords they know;
  • to get reliable data about the keywords of a competing site, you need to know when the search queries were collected (visibility analysis);
  • not all queries are reflected in the search results, so the service does not see them; the reasons vary: the site's pages have not yet been indexed, or the search engine does not rank them because they load slowly, contain viruses, etc.;
  • usually there is no information about which keys are included in the base the service uses to collect search results.

Thus, the service does not form a real semantic core that underlies the site, but only a small visible part of it.

Based on the foregoing, the following conclusions can be drawn:

  1. The semantic core of the competitor's website, formed with the help of special services, does not give a complete up-to-date picture.
  2. Checking the semantic core of a competitor's site helps to complement the semantics of your Internet resource or analyze the marketing policy of competing companies.
  3. The larger the keyword base of the service, the slower it processes the search results and the less relevant the resulting semantics: while the service is collecting results for the beginning of its database, the data at the end of the database becomes obsolete.
  4. Services do not disclose information about the degree of relevance of their databases and the date of the last update. Therefore, you cannot know to what extent the keywords selected by the service from a competitor's site reflect its real semantic core.
  5. A significant advantage of this approach is that you get access to a large list of competitor keywords, many of which you can use to expand the semantic core of your site.

TOP 3 paid services where you can find out the semantic core of competitors

Megaindex Premium Analytics


This service has a rich arsenal for analyzing the semantic core of competing sites. Using the Site Visibility module, you can find and download a list of keywords and identify sites with a similar theme that can be used to expand the semantic core of your own site.

One of the disadvantages of Megaindex Premium Analytics is that you cannot filter the lists of keys in the program itself: you first need to export them to Excel.


Keys.so


In order to analyze the semantic core using the keys.so service, you need to enter the URL of a competitor's site, select suitable sites by the number of matching key phrases, analyze them and download the list of search queries for which they are promoted. The service makes this quick and simple, and the modern interface is a nice bonus.

Cons: a small database of search phrases and insufficiently frequent visibility updates.


Spywords


This service not only analyzes visibility, but also provides statistics on advertisements in Yandex.Direct. The spywords.ru interface is difficult to deal with at first because it is overloaded with functionality, but in general the service does its job well.

With the help of the service, you can analyze competing sites, identify intersections in key phrases, and upload a list of competitor keys. The main disadvantage is the insufficient base of the service (about 23 million search phrases).


Thanks to specialized services, competitors' sites and their semantic cores are no longer a mystery to you. You can easily analyze any Internet resources of the competitors you are interested in. Here are some tips for using the information you receive:

  1. Use keywords only from sites with similar topics (the more intersections with yours, the better).
  2. Do not analyze portals: their semantic cores are too large. As a result, you will not supplement your own core, but only expand it, and that, as you already know, can be done endlessly.
  3. When buying a site, be guided by the indicators of its current visibility in the search engine, compare them with the sites included in the TOP to assess the development potential.
  4. Take keywords from competitor sites to complement the semantic core of your site, rather than building it from scratch.
  5. The larger the base of the service you use, the more complete your semantic core will be. But pay attention to the frequency of updating search phrase databases.

7 services that will help you create the semantic core of the site from scratch online

Google Keyword Planner


If you are thinking about how to create a semantic core of a site, pay attention to this service. It can be used not only in Runet, but also in other segments where AdWords works.

Open Google AdWords. In the top bar, in the "Tools" section, click the Keyword Planner option. In the menu that appears, select the section "Search for new keywords by phrase, site or category". Here you can configure the following settings:

  • the keyword or phrase to search for;
  • subject matter of the product or service;
  • region of search queries;
  • the language in which users enter search queries;
  • keyword search engine;
  • negative keywords (words that should not appear in the selected phrases).

Next, click the "Get Options" button, after which Google AdWords will suggest possible synonyms for your keyword or phrase. The received data can be uploaded to Google Docs or downloaded as a CSV file.
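
After exporting, the list can be filtered locally. A minimal sketch, assuming a keywords.csv export with hypothetical column names "Keyword" and "Avg. monthly searches" (check the actual headers of your file):

```python
import csv

NEGATIVE = {"free", "download"}  # example negative words to drop

with open("keywords.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

# Keep phrases that contain none of the negative words.
kept = [r for r in rows
        if not set(r["Keyword"].lower().split()) & NEGATIVE]

# Sort by frequency, highest first (column name is an assumption).
kept.sort(key=lambda r: int(r.get("Avg. monthly searches", 0) or 0),
          reverse=True)

for r in kept[:20]:
    print(r["Keyword"], r.get("Avg. monthly searches", ""))
```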

Benefits of using the Google AdWords service:

  • the ability to select synonyms for the key phrase;
  • use of negative keywords to refine the search query;
  • access to a huge database of search queries of the Google search engine.

The main disadvantage of the service: if you have a free account, then Google AdWords will provide inaccurate data on the frequency of search queries. The error is so significant that it is impossible to rely on these indicators when promoting the site. The way out is to buy access to a paid account or use another service.

Serpstat


This service allows you to comprehensively collect user search queries by keywords and site domains. Serpstat is constantly expanding the number of regional databases.

The service allows you to identify your site's key competitors, determine the search phrases by which they are promoted, and form a list of them for subsequent use in the semantic core of your Internet resource.

Benefits of the Serpstat service:

  • a large selection of tools for analyzing the semantic core of competitor sites;
  • informative reporting forms reflecting the frequency indicators for the selected region;
  • option to upload search queries for specific pages of the site.

Cons of the Serpstat service:

  • although the service's databases are constantly updated, there is no guarantee that the frequency data shown between updates reflects reality;
  • not all low-frequency search phrases are displayed by the service;
  • the set of languages and countries the service works with is limited.

Key Collector


This tool will help you not only assemble the semantic core of the site, but also expand, clean and cluster it. Key Collector can collect search queries, provide data on their frequency in the selected regions, and process the semantics.

The program expands key phrases from starting (seed) lists and can work with databases of various formats.

Key Collector can show the frequency of keywords from data downloaded from Serpstat, Yandex Wordstat and other services.
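
Outside Key Collector, the same side-by-side comparison can be done with a few lines of code; the keys and numbers below are stand-ins for data exported from the services:

```python
# Frequency per key from two sources (illustrative values).
wordstat = {"semantic core": 5400, "collect keywords": 880}
serpstat = {"semantic core": 4900, "seo texts": 1300}

all_keys = sorted(set(wordstat) | set(serpstat))
print(f"{'key':<20}{'wordstat':>10}{'serpstat':>10}")
for key in all_keys:
    print(f"{key:<20}{wordstat.get(key, 0):>10}{serpstat.get(key, 0):>10}")
```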

Semrush


Compiling the semantic core of the site in Semrush is free, but you will receive no more than 10 key queries with data on their frequency in the selected region. In addition, the service shows what other search queries users in other regions enter for your keyword.

Advantages of the Semrush service:

  • works all over the world; you can collect data on the frequency of search queries in Western regions;
  • for each key phrase it shows the TOP sites in the search results, which you can use later as a guide when forming the semantic core of your own site.

Cons of the Semrush service:

  • if you want to get more than 10 keywords, you need to purchase a paid version for $100;
  • it is not possible to download the complete list of key phrases.

Keywordtool


This service allows you to collect key phrases for the semantic core of the site from foreign Internet resources in broad match. Keywordtool also allows you to select search suggestions and phrases that contain the base key.

If you use the free version of the program, then in one session you can get no more than 1000 search phrases without data on their frequency level.

Advantages of the Keywordtool service:

  • works with different languages and in many countries of the world;
  • shows search queries not only from search engines, but also from popular online stores (Amazon, eBay, App Store) and the largest video hosting service YouTube;
  • the breadth of coverage of search phrases exceeds that of Google AdWords;
  • the generated list of search queries can be easily copied into a table of any format.

Disadvantages of the Keywordtool service:

  • the free version does not provide data on the frequency of search queries;
  • there is no way to upload keywords all at once as a list;
  • it searches for keywords only within phrases that can contain them and does not take possible synonyms into account.

Ubersuggest


The semantic core of the site in the Ubersuggest service can be created based on the search queries of users from almost any country in the world in any language. If you use the free version, you can get up to 750 search phrases per query.

The advantage of the service is the ability to sort the list of keywords in alphabetical order, taking into account the prefix. All search queries are automatically grouped, which makes it easier to work with them when forming the semantic core of the site.

The disadvantages of Ubersuggest are the inaccurate search query frequency data in the free version of the program and the inability to search by keyword synonyms.

Ahrefs Keywords Explorer


This service can collect keywords for your semantic core in broad, phrase and exact matches in the selected region, taking into account the frequency level.

There is an option to select negative keywords and view the TOP search results in Google for your main keywords.

The main disadvantages of Ahrefs Keywords Explorer are the lack of a free version and the dependence of data accuracy on how up-to-date its databases are.

Frequently asked questions on compiling the semantic core of the site

  • How many keys are enough to create the semantic core of the site (100, 1000, 100,000)?

This question cannot be answered unambiguously. It all depends on the specifics of the site, its structure, and the actions of competitors. The optimal number of keys is determined individually.

  • Is it worth using ready-made databases of key phrases to form the semantic core of the site?

On the Internet you can find many resources with thematic databases of keys: for example, the Pastukhov Base, UP Base, Mutagen, KeyBooster, etc. It cannot be said that such sources should be avoided: these databases contain large archives of search queries that will be useful for website promotion.

But remember such indicators as the competitiveness and relevance of the keys, and keep in mind that your competitors can use the same ready-made bases. Another disadvantage of such sources is the likelihood of missing key phrases that matter to you.

  • How to use the semantic core of the site?

Key phrases selected for the semantic core are used to compile a relevance map: it includes the title and description tags and the h1-h6 headings needed to promote the site. The keys are also taken as the basis for writing SEO texts for the site's pages.
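
As a sketch of what a relevance map skeleton might look like in code (the URLs, keys and template strings are illustrative placeholders, not recommended wording):

```python
# Page -> main key; in practice this mapping comes from your clustering results.
pages = {
    "/semantic-core/": "semantic core of the site",
    "/keyword-clustering/": "query clustering",
}

relevance_map = {
    url: {
        "title": f"{key.capitalize()}: guide and examples",
        "description": f"How to work with {key}: step-by-step instructions.",
        "h1": key.capitalize(),
    }
    for url, key in pages.items()
}

for url, tags in relevance_map.items():
    print(url, "->", tags["title"])
```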

  • Is it worth taking requests with zero frequency for the semantic core of the site?

This is useful in the following cases:

  1. If you spend a minimum of resources and time creating pages with such keys, for example, when SEO filter pages in online stores are generated automatically.
  2. The zero frequency is not absolute: at the time the data was collected the frequency was zero, but the search engine's history shows queries for this word or phrase.
  3. The frequency is zero only in the selected region, while in other regions it is higher.

5 typical mistakes when collecting a semantic core for a website

  1. Avoiding keyword phrases with high competition. Including such a phrase in the core does not oblige you to bring the site to the TOP for that key at all costs. You can use it as an addition to the semantic core, as a content idea.
  2. Refusing to use low-frequency keys. Such search terms can also be used as content ideas.
  3. Creating separate web pages for near-identical search queries. Surely you have seen sites where similar queries (for example, "buy a wedding cake" and "make a wedding cake to order") each have their own page, although users entering them want the same thing. There is no point in making multiple pages.
  4. Building the semantic core of the site exclusively with services. Of course, collecting keys automatically makes life easier, but the value of the result will be minimal if you do not analyze it: only you understand the specifics of the industry, the level of competition and the affairs of your company.
  5. Over-focusing on collecting keys. If you have a small site, start by collecting semantics using Yandex or Google services. Do not immediately analyze the semantic core of competitor sites or collect keys from different search engines; all of these methods will come in handy when you realize it is time to expand the core.

Or maybe it is better to order the compilation of the semantic core of the site?

You can try to compose the semantic core yourself using the free services we have talked about; for example, Google Keyword Planner can give a good result. But if you are interested in creating a large, high-quality semantic core, plan this item into your budget.

On average, the development of the semantic core of the site will cost from 30 to 70 thousand rubles. As you remember, the final price depends on the subject of the business and the optimal number of search queries.

How not to buy a pig in a poke

A high-quality semantic core will not be cheap. To make sure that the performer understands this work and will do everything at a high level, ask them to collect trial semantics for a single query. This is usually done free of charge.

To check the results, run the list of keys through Mutagen and analyze how many of them are high-frequency and low-competition. Performers often provide lists with a large number of key phrases, many of which are completely unsuitable for further use.
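
The check is easy to semi-automate once the performer's list is exported with frequency and competition values. A minimal sketch, assuming a CSV with hypothetical columns "key", "frequency" and "competition" (the thresholds are examples, not norms):

```python
import csv

MIN_FREQUENCY = 300   # example threshold for a usable frequency
MAX_COMPETITION = 5   # example threshold for low competition

with open("trial_semantics.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

good = [r for r in rows
        if int(r["frequency"]) >= MIN_FREQUENCY
        and int(r["competition"]) <= MAX_COMPETITION]

share = 100 * len(good) / len(rows) if rows else 0
print(f"{len(good)} of {len(rows)} keys pass the thresholds ({share:.0f}%)")
```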

