A well-formed semantic core is the basis of successful promotion: working with the semantic core in practice

What is the semantic core of a site? The semantic core (hereinafter referred to as SA) is the set of keywords and phrases for which a resource is promoted in search engines and which indicate that the site belongs to a certain topic.

For successful promotion in search engines, keywords must be correctly grouped, distributed across the pages of the site, and included in a certain form in the meta tags (title, description, keywords) as well as in the H1-H6 headings. At the same time, keyword stuffing must be avoided, or the site may "fly off" into Yandex's Baden-Baden filter.

In this article, we will try to look at the issue not only from a technical point of view, but also look at the problem through the eyes of business owners and marketers.

How is the SA collected?

  • manually: feasible for small sites (up to 1,000 keywords);
  • automatically: programs do not always correctly determine the context of a query, so there may be problems with distributing keywords across pages;
  • semi-automatically: phrases and frequencies are collected automatically, while the distribution and refinement of phrases are done manually.

In this article, we will follow the semi-automatic approach to creating a semantic core, since it is the most effective.

In addition, there are two typical scenarios when compiling an SA:

  • for a site with a ready-made structure;
  • for a new site.

The second option is preferable, since it allows you to create an ideal site structure for search engines from scratch.

What does compiling an SA involve?

Work on the formation of the semantic core is divided into the following stages:

  1. Identifying the directions in which the site will be promoted.
  2. Collecting keywords, analyzing similar queries and search suggestions.
  3. Parsing frequencies and eliminating "empty" queries.
  4. Clustering (grouping) the queries.
  5. Distributing queries across the pages of the site (drawing up the ideal site structure).
  6. Recommendations for use.

The better you build the site's core (and quality here means the breadth and depth of the semantics), the more powerful and reliable the flow of search traffic you can send to the site, and the more customers you will attract.

How to make the semantic core of the site

So, let's take a closer look at each item with various examples.

At the first step, it is important to determine which of the goods and services on the site will be promoted in the Yandex and Google search results.

Example #1. Suppose a site offers two services: computer repair at home and training in Word/Excel at home. Suppose it was decided that the training is no longer in demand, so there is no point in promoting it, and therefore no point in collecting semantics for it. Another important point: you need to collect not only queries containing "computer repair at home", but also "laptop repair", "pc repair" and others.

Example #2. A company is engaged in low-rise construction, but it builds only wooden houses. Accordingly, queries and semantics for directions such as "construction of houses from aerated concrete" or "building brick houses" need not be collected.

Collection of semantics

We will consider two main sources of keywords: Yandex and Google. We will tell you how to collect semantics for free and briefly review paid services that allow you to speed up and automate this process.

In Yandex, key phrases are collected from the Yandex.Wordstat service; in Google, through query statistics in Google AdWords. If available, you can use data from Yandex.Webmaster and Yandex.Metrica, Google Webmaster tools and Google Analytics as additional sources of semantics.

Collection of keywords from Yandex.Wordstat

Collecting queries from Wordstat is essentially free: to view the service's data, you only need a Yandex account. So let's go to wordstat.yandex.ru and enter a keyword. Consider an example of collecting semantics for a car rental company's website.

What do we see in this screenshot?

  1. Left column. Here is the main query and its various variations with a "tail". Next to each query is a number showing how many times users entered that query overall.
  2. Right column. Queries similar to the main one, with their overall frequency figures. Here we see that a person who wants to rent a car may, apart from "car rental", use synonymous phrasings such as "rent a car", "car hire" and others. This is very important data to watch so as not to miss a single query.
  3. Regionality and history. By choosing one of these options, you can check the distribution of queries by region, the number of queries in a particular region or city, and the trend over time or across seasons.
  4. Devices the query was made from. By switching tabs, you can find out which devices people search from most often.

Check different variants of key phrases and record the received data in Excel or Google Sheets. For convenience, install the Yandex Wordstat Helper plugin. Once it is installed, plus signs appear next to the search phrases; clicking them copies the words along with their frequency, so there is no need to select and paste everything manually.
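If you prefer to script this recording step rather than copy values by hand, the phrase/frequency pairs can be dumped to a CSV file that opens directly in Excel or Google Sheets. A minimal sketch (the phrases and numbers below are made-up examples, not real Wordstat data):

```python
import csv

# Hypothetical phrase -> monthly frequency pairs copied from Wordstat
phrases = [
    ("car rental", 45000),
    ("car rental moscow", 12000),
    ("rent a car cheap", 3100),
]

# Write a CSV with a header row; Excel and Google Sheets open this directly
with open("wordstat.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["phrase", "frequency"])
    writer.writerows(phrases)
```

From here the file can be sorted, filtered and annotated like any other spreadsheet.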

Collecting keywords from Google AdWords

Unfortunately, Google doesn't have a public source of search queries with frequency metrics, so a workaround is needed here, and for that we need a working Google AdWords account.

Register a Google AdWords account and top up the balance with the minimum possible amount, 300 rubles (an account with no budget activity only shows approximate data). After that, go to "Tools" - "Keyword Planner".

A new page will open; in the tab "Search for new keywords by phrase, site or category", enter the keyword.

Scroll down, click "Get Options" and see something like this.

  1. Main query and average monthly searches. If the account is unpaid, you will see approximate data, that is, the average number of searches. When there are funds on the account, exact data will be shown, as well as the dynamics of the keyword's frequency over time.
  2. Keywords by relevance. This is the equivalent of similar queries in Yandex.Wordstat.
  3. Downloading data. This tool is convenient in that the data obtained in it can be downloaded.

We have covered the two main sources of search query statistics. Now let's move on to automating the process, because collecting semantics manually takes too much time.

Programs and services for collecting keywords

Key Collector

The program is installed on your computer. You connect the working accounts from which statistics will be collected, then create a new project and a folder for keywords.

Select "Batch collection of words from the left column of Yandex.Wordstat", enter queries for which we collect data.

A sample query is shown in the screenshot; in practice, for a more complete SA you would additionally collect all query variants with car makes and classes, for example "rent a bmw", "rent a toyota with an option to buy", "rent an SUV" and so on.

SlovoEb

A free analogue of the previous program. This is both a plus (you don't need to pay) and a minus (the functionality is significantly reduced).

To collect keywords, the steps are the same.

Rush-analytics.com

An online service. Its main advantage is that you don't need to download or install anything: register and go. The service is paid, but on registration you get 200 coins on your account, which is quite enough to collect a small semantic core (up to 5,000 queries) and parse frequencies.

The minus: it collects semantics only from Wordstat.

Checking the frequency of keywords and queries

Comparing the operators, we again notice a decrease in the number of queries. Let's go further and try another word form of the same query.

We see that the singular form is searched by far fewer users, which means the initial query is the higher priority for us.

These manipulations must be carried out for every word and phrase. Queries whose final frequency turns out to be zero (when using quotes and the exclamation mark) are eliminated, because "0" means that no one enters such queries: they exist only as parts of other queries. The point of compiling a semantic core is to select the queries that people actually use to search. All remaining queries are then placed in an Excel table, grouped by meaning and distributed across the pages of the site.

Doing this manually is simply unrealistic, so there are many services on the Internet, paid and free, that do it automatically. Here are a few:

  • megaindex.com;
  • rush-analytics.ru;
  • tools.pixelplus.com;
  • key-collector.ru.
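The elimination step described above is trivial to automate once the exact frequencies (collected with quotes and the "!" operator) are in hand. A hedged sketch with made-up numbers:

```python
# Hypothetical exact frequencies collected with the quotes + "!" operators
exact_freq = {
    "rent a car": 2400,
    "rent a car with driver": 140,
    "rent a car green polka dots": 0,  # dummy query: nobody actually searches it
}

# Keep only queries that real users enter; zero-frequency "dummies" are dropped
core = {query: freq for query, freq in exact_freq.items() if freq > 0}
```

The surviving dictionary is what gets grouped by meaning and distributed across pages.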

Removing non-target requests

After collecting the keywords, you should remove the unnecessary ones. Which search queries can be removed from the list?

  • queries with the names of competitors' companies (these can be kept for contextual advertising);
  • queries for goods or services that you do not sell;
  • queries that mention a district or region where you do not operate.
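This cleanup can also be scripted as a simple stop-marker filter. The markers and queries below are invented for illustration; in practice the list would hold your real competitors, regions and unsold services:

```python
# Hypothetical stop markers: competitor names, regions we don't serve,
# services we don't sell
stop_markers = ["competitorbrand", "novosibirsk", "truck rental"]

queries = [
    "car rental moscow",
    "competitorbrand car rental",
    "truck rental price",
    "rent a car for a wedding",
]

# Keep a query only if it contains none of the stop markers
targeted = [q for q in queries
            if not any(marker in q for marker in stop_markers)]
```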

Clustering (grouping) requests for site pages

The essence of this stage is to combine queries that are similar in meaning into clusters, and then determine which pages they will be promoted on. How do you decide which queries to promote on one page, and which on another?

1. By query type.

We already know that all queries in search engines are divided into several types, depending on the purpose of the search:

  • commercial (buy, sell, order): promoted on landing pages, product category pages, product cards, service pages, price lists;
  • informational (where, how, why): articles, forum topics, sections answering a question;
  • navigational (phone, address, brand name): the contacts page.

If you are in doubt about a query's type, enter it in the search bar and analyze the results. For a commercial query there will be more pages offering a service; for an informational one, more articles.

There are also geo-dependent and geo-independent queries. Most commercial queries are geo-dependent, as people tend to trust companies located in their own city more.

2. By query logic.

  • "buy iphone x" and "iphone x price" - you need to promote one page, since in the first and second cases, the same product is searched, and more detailed information about him;
  • "buy iphone" and "buy iphone x" - you need to promote on different pages, since in the first request we are dealing with a general request (suitable for the product category where iPhones are located), and in the second, the user is looking for a specific product and this request should be promoted to the product card;
  • "how to choose good smartphone» - this request is more logical to promote to a blog article with the appropriate title.

3. By search results. View the search results for the queries. If you check which pages of different sites rank for "building houses from timber" and "building houses from bricks", in 99% of cases they are different pages.

4. Automatic grouping by software and manual refinement.

The 1st and 2nd methods are fine for compiling the semantic core of small sites, where at most 2-3 thousand keywords are collected. For a large SA (from 10,000 queries upward), machine help is needed. Here are a few programs and services that perform clustering:

  • KeyAssistant - assistant.contentmonster.ru;
  • semparser.ru;
  • just-magic.org;
  • rush-analytics.ru;
  • tools.pixelplus.com;
  • key-collector.ru.

After automatic clustering completes, check the program's results manually and correct any errors it has made.

Example: the program may put the following queries into one cluster: "holiday in Sochi 2018 hotel" and "holiday in Sochi 2018 hotel breeze". In the first case the user is comparing hotel options; in the second, they are looking for one specific hotel.

To avoid such inaccuracies, check everything manually and edit wherever errors are found.
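To make the idea of clustering concrete, here is a deliberately naive lexical sketch: queries sharing the same set of "meaningful" words land in one cluster. The stop-word list and queries are illustrative; production tools typically cluster by TOP-10 search-result overlap instead, which is far more reliable:

```python
from collections import defaultdict

# Words ignored when comparing queries (illustrative, not a real SEO stop list)
STOP = {"a", "the", "in", "for", "to", "buy", "price"}

def signature(query: str) -> frozenset:
    """Normalized word set of a query, with stop words removed."""
    return frozenset(w for w in query.lower().split() if w not in STOP)

queries = [
    "buy iphone x",
    "iphone x price",
    "buy iphone",
    "how to choose good smartphone",
]

# Queries with identical signatures go to the same cluster
clusters = defaultdict(list)
for q in queries:
    clusters[signature(q)].append(q)
```

Note how this reproduces the article's logic: "buy iphone x" and "iphone x price" share a cluster, while the general query "buy iphone" gets its own.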

What to do next after compiling the semantic core?

Based on the collected semantic core, we then:

  1. compose the ideal structure (hierarchy) of the site from the point of view of search engines, or, in agreement with the customer, change the structure of the old site;
  2. write briefs for copywriters, taking into account the cluster of queries a given page will be promoted for, or rework the old texts already on the site.

It looks like this.

For each resulting query cluster, we create a page on the site and determine its place in the site structure. The most popular queries are promoted on the pages highest in the resource hierarchy; the less popular ones sit below them.

For each of these pages, we have already collected the queries we will promote on them. Next, we write the briefs for copywriters to produce texts for these pages.

Terms of reference for a copywriter

As with the site structure, we will describe this stage in general terms. So, the brief for a text includes:

  • number of characters without spaces;
  • page title;
  • subheadings (if any);
  • a list of words (based on our core) that must appear in the text;
  • uniqueness requirement (always require 100% uniqueness);
  • desired text style;
  • other requirements and wishes for the text.

Remember: don't try to cram hundreds of queries onto one page. Limit yourself to 5-10 plus their tails, otherwise you will be penalized for over-optimization and sit out of the running for TOP places for a long time.

Conclusion

Compiling a site's semantic core is painstaking work that deserves special attention, because all further promotion of the site is built on it. Follow the simple instructions given in this article and get going.

  1. Choose the directions for promotion.
  2. Collect all possible queries from Yandex and Google (use the special programs and services).
  3. Check the frequency of the queries and get rid of dummies (those with a frequency of 0).
  4. Delete non-targeted queries: services and products you do not sell, and queries mentioning competitors.
  5. Form query clusters and distribute them across pages.
  6. Create the ideal site structure and draw up briefs for filling the site.

If you know the pain of search engines' "dislike" for your online store's pages, read on. I will talk about the path to increasing a site's visibility, or rather about its first stage: collecting keywords and compiling a semantic core, the algorithm for creating it, and the tools used for this.

You can order semantic core collection from the SEO specialists of the Netpeak agency.

Why make up a semantic core?

To increase the visibility of the site's pages, so that Yandex and Google search robots start finding your site's pages for users' queries. Collecting keywords (compiling the semantics) is the first step toward this goal. Next, a rough "skeleton" is sketched out to distribute the keywords across different landing pages. Only then are articles and meta tags written and implemented.

By the way, on the Internet you can find many definitions of the semantic core.

1. "The semantic core is an ordered set of search words, their morphological forms and phrases that most accurately characterize the type of activity, product or service offered by the site." Wikipedia.

To collect competitors' semantics in Serpstat, enter one of the key queries, select a region, click "Search" and go to the "Keyword Analysis" category. Then select "SEO Analysis" and click "Select Phrases". Export the results:

2.3. Using Key Collector/Slovoeb to create a semantic core

If you need to create a semantic core for a large online store, you cannot do without Key Collector. But if you are a beginner, it is more convenient to use a free tool, Slovoeb (don't let the name scare you; it is rude in Russian). Download the program, and in the Yandex.Direct settings specify the login and password of your Yandex.Mail account:

Create a new project. In the "Data" tab, select the "Add phrases" function. Select a region and enter the queries you collected earlier:

Tip: create a separate project for each new domain, and a separate group for each category/landing page. Now collect the semantics from Yandex.Wordstat: open "Data collection" - "Batch collection of words from the left column of Yandex.Wordstat". In the window that opens, tick "Do not add phrases if they already exist in any other groups". Enter a few of the most popular (high-frequency) user phrases and click "Start collecting":

By the way, for large projects in Key Collector, you can collect statistics from competitor analysis services SEMrush, SpyWords, Serpstat (ex. Prodvigator) and other additional sources.

The semantic core is a scary name that SEOs have come up with for a fairly simple thing: we just need to select the key queries for which we will promote our site.

In this article, I will show you how to properly compose a semantic core so that your site quickly reaches the TOP instead of stagnating for months. There are "secrets" here, too.

But before we move on to compiling the SA, let's look at what it is and what we should end up with.

What is the semantic core in simple words

Oddly enough, the semantic core is an ordinary Excel file that lists the key queries for which you (or your copywriter) will write articles for the site.

For example, here is how my semantic core looks like:

I have marked in green the key queries for which I have already written articles; in yellow, those I am going to write about in the near future. Colorless cells mean those queries will come a bit later.

For each key query, I have determined the frequency and competitiveness, and come up with a "catchy" title. You should end up with roughly the same kind of file. Right now my SA consists of 150 keywords, which means I have "material" for at least 5 months in advance (even if I write one article a day).

A little lower, we will talk about what to expect if you decide to order the collection of a semantic core from specialists. For now I will say briefly: they will give you the same kind of list, but with thousands of "keys". However, in an SA it is not quantity that matters, but quality, and that is what we will focus on.

Why do we need a semantic core at all?

But really, why this torment? You could just write high-quality articles and attract an audience that way, right? You can write them, but you won't attract anyone.

The main mistake of 90% of bloggers is simply writing high-quality articles. I'm not kidding: they have genuinely interesting and useful material. But search engines don't know about it. They are not psychics, just robots, so they don't put those articles in the TOP.

There is another subtle point here: the title. Say you have a very high-quality article on "How to do business on the 'muzzle book'" (a jokey nickname for Facebook). In it you describe everything about Facebook in great detail and professionally, including how to promote communities there. Your article is the highest-quality, most useful and interesting one on the Internet on this topic; nothing else comes close. But it still won't help you.

Why quality articles fly out of the TOP

Imagine your site was visited not by a robot but by a live assessor from Yandex, who understood that you have the coolest article and manually put you in first place in the results for the query "Community promotion on Facebook".

Do you know what happens next? You will be gone from there very soon, because no one will click on your article even in first place. People enter the query "Community promotion on Facebook", and your headline is "How to do business on the 'muzzle book'". Original, fresh, funny, but... not what they asked for. People want to see exactly what they were looking for, not your creativity.

Accordingly, your article will occupy its place in the TOP in vain. And the living assessor, an ardent admirer of your work, can beg the management as long as he likes to keep you at least in the TOP-10; it won't help. All the first places will be taken by empty articles, like husks from sunflower seeds, copied from one another by yesterday's schoolchildren.

But those articles will have the correct, "relevant" titles: "Community promotion on Facebook from scratch" ("step by step", "in 5 steps", "from A to Z", "free", etc.). Offensive? You bet. Well then, let's fight the injustice and make a competent semantic core so that your articles take the first places they deserve.

Another reason to start compiling SA right now

There is one more thing people somehow don't think much about. You need to write articles often: at least every week, and preferably 2-3 times a week, to get more traffic faster.

Everyone knows this, but almost no one does it. And all because of "creative stagnation", "can't force myself", "just lazy". In fact, the whole problem lies precisely in the absence of a specific semantic core.

I entered one of my basic keys, "smm", into the search field, and Yandex immediately gave me a dozen suggestions of what else interests people who care about "smm". All I have to do is copy these keys into a notebook, then check each of them in the same way and collect their suggestions as well.

After the first stage of collecting the SA, you should end up with a text document containing 10-30 broad basic keys, which we will work with further.

Step #2 - Parsing Basic Keys in SlovoEB

Of course, if you write an article for the query "webinar" or "smm", a miracle will not happen: you will never reach the TOP for such a broad query. We need to break the basic key down into many small queries on the topic, and we will do this with a special program.

I use Key Collector, but it's paid. You can use the free analogue, the SlovoEB program, which you can download from the official site.

The hardest part of working with this program is setting it up correctly. I have shown elsewhere how to set up and use Slovoeb properly, but that article focuses on selecting keys for Yandex.Direct.

Here, let's look step by step at using this program to compile a semantic core for SEO.

First we create a new project and name it according to the broad key you want to parse.

I usually give the project the same name as my basic key so I don't get confused later. And let me warn you against another mistake: don't try to parse all your basic keys at once, or it will be very hard to separate the "empty" key queries from the golden grains. Parse one key at a time.

After creating the project, we carry out the basic operation: parsing the key through Yandex Wordstat. To do this, click the "Wordstat" button in the program interface, enter your basic key, and click "Start collecting".

For example, for my blog, let's parse the basic key "contextual advertising".

After that, the process will start, and after a while the program will give us the result: up to 2,000 key queries containing "contextual advertising".

Next to each query there will also be a "dirty" frequency: how many times this key (plus its word forms and tails) was searched per month through Yandex. But I do not advise drawing any conclusions from these figures.

Step #3 - Gathering the exact frequency for the keys

The dirty frequency tells us nothing. If you rely on it, don't be surprised later when your key with 1,000 searches doesn't bring a single click per month.

We need the net (exact) frequency. To get it, first select all the found keys with checkmarks, then click the Yandex.Direct button and start the process again. Now Slovoeb will look up the exact monthly frequency for each key.

Now we have an objective picture of how many times each query was entered by users over the past month. I suggest grouping all key queries by frequency to make them easier to work with.

To do this, click the "filter" icon in the column "Frequency "!"" and set it to show keys with a value "less than or equal to 10".

The program will now show only the queries whose frequency is less than or equal to 10. You can delete these queries or copy them to another keyword group for later. Less than 10 is very low; writing articles for these queries is a waste of time.

Now we need to choose the keywords that will bring us reasonably good traffic, and for that we need one more parameter: the query's level of competition.

Step #4 - Checking Query Competition

All "keys" in this world are divided into 3 types: high-frequency (HF), mid-frequency (MF), low-frequency (LF). And they can also be highly competitive (VC), medium competitive (SC) and low competitive (NC).

As a rule, HF queries are also VC: if a query is searched often, there are a lot of sites that want to rank for it. But this is not always the case; there are happy exceptions.

The art of compiling a semantic core lies precisely in finding queries that have high frequency but low competition. Determining the level of competition manually is very difficult.

You can look at indicators such as the number of home pages in the TOP-10, the length and quality of the texts, and the trust level of the sites in the TOP for the query. All of this gives some idea of how tough the competition for positions is for a particular query.

But I recommend using the Mutagen service. It takes into account all the parameters I mentioned above, plus a dozen more that neither you nor I have probably even heard of. After the analysis, the service gives an exact value for the query's level of competition.

Here I checked the query "setting up contextual advertising in google adwords". Mutagen showed that this key has a competition level of "more than 25", the maximum value it reports, and only 11 searches per month. So it doesn't suit us.

We can copy all the keys we picked up in Slovoeb and run a mass check in Mutagen. After that, we only have to go through the list and take the queries with high frequency and a low level of competition.
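The selection itself boils down to a sort with a competition cap. A hedged sketch with invented numbers (the threshold of 5 follows the article's advice for a young site; real values would come from your Slovoeb and Mutagen exports):

```python
# Hypothetical (query, exact_frequency, competition_level) triples
checked = [
    ("setting up contextual advertising", 900, 12),
    ("contextual advertising for dentists", 150, 4),
    ("what is contextual advertising", 2500, 25),
    ("contextual advertising audit checklist", 300, 3),
]

MAX_COMPETITION = 5  # the article suggests 3-5 for a young site

# Keep low-competition keys only, most frequent first
picks = sorted(
    (row for row in checked if row[2] <= MAX_COMPETITION),
    key=lambda row: row[1],
    reverse=True,
)
```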

Mutagen is a paid service, but you can do 10 checks per day for free, and the cost of a check is very low. In all the time I have worked with it, I have not yet spent even 300 rubles.

By the way, about the level of competition: if your site is young, it is better to choose queries with a competition level of 3-5; if you have been promoting for more than a year, you can take 10-15.

And by the way, about query frequency: we now need to take a final step that will let you attract plenty of traffic even from low-frequency queries.

Step #5 - Collecting "tails" for the selected keys

As has been proven and verified many times, your site will get the bulk of its traffic not from the main keys but from the so-called "tails": strange key queries people type into the search box with a frequency of 1-2 per month, but of which there are a great many.

To see the "tail" - just go to Yandex and enter your chosen key query in the search bar. Here's what you'll see.

Now just write these additional words out into a separate document and use them in your article. That said, you don't have to place them right next to the main key every time; otherwise search engines will see "over-optimization" and your articles will fall in the results.

Just use them in different places in your article, and you will receive additional traffic from them as well. I also recommend using as many word forms and synonyms of your main key query as possible.

For example, we have the query "setting up contextual advertising". Here's how you can reformulate it:

  • Setting up = set up, make, create, run, launch, enable, host…
  • Contextual advertising = context, Direct, teaser ads, YAN, AdWords, KMS…

You never know exactly how people will search for information. Add all these additional words to your semantic core and use them when writing texts.
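If you want to enumerate such verb/object combinations mechanically rather than by hand, the cross product does it in a couple of lines. The synonym groups below are a shortened version of the article's example:

```python
from itertools import product

# Illustrative synonym groups (shortened from the article's example)
verbs = ["set up", "create", "launch"]
objects = ["contextual advertising", "context", "adwords"]

# Every verb paired with every object: 3 x 3 = 9 phrase variants
variants = [f"{v} {o}" for v, o in product(verbs, objects)]
```

Each generated variant can then be checked for frequency like any other key before going into the core.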

This is how we collect a list of 100-150 keywords. If you are compiling a semantic core for the first time, it may take you several weeks.

Or is all this too much for your eyes? Maybe you can delegate compiling the SA to specialists who will do it better and faster? There are such specialists, but their services are not always worth using.

Is it worth ordering an SA from specialists?

By and large, specialists in compiling a semantic core will only do steps 1-3 of our scheme for you. Sometimes, for a large extra fee, they will also do steps 4-5 (collecting tails and checking query competition).

After that, they will give you several thousand key queries with which you will need to work further.

The question here is whether you are going to write the articles yourself or hire copywriters. If you want to focus on quality rather than quantity, you need to write them yourself, and then a bare list of keys won't be enough: you will need to choose topics you understand well enough to write a quality article about.

And here the question arises: why do we need SA specialists at all? Agree, parsing a basic key and collecting exact frequencies (steps #1-3) is not difficult at all; it will take you literally half an hour.

The hardest part is choosing high-frequency queries with low competition, and, as it turns out, HF-NC queries you can actually write a good article for. That is exactly what will take 99% of your time working on the semantic core, and no specialist will do it for you. So is it worth spending money on such services?

When the services of SA specialists are useful

It's another matter if you plan from the start to hire copywriters. Then you don't need to understand the subject of the queries, and neither will your copywriters: they will simply take a few articles on the topic and compile "their" text from them.

Such articles will be empty, miserable, almost useless, but there will be many of them. On your own you can write at most 2-3 quality articles per week, while an army of copywriters will supply 2-3 shoddy texts a day. Since they will be optimized for the queries, they will attract at least some traffic.

In that case, yes, go ahead and hire SA specialists, and let them draw up the briefs for the copywriters at the same time. But you understand, that will also cost money.

Summary

Let's go over the main ideas in the article again to consolidate the information.

  • The semantic core is just a list of keywords for which you will write articles for the site's promotion.
  • Texts must be optimized for exact key queries, otherwise even your highest-quality articles will never reach the TOP.
  • The SA is like a content plan for social networks: it keeps you out of "creative block" and means you always know exactly what you will write about tomorrow, the day after, and in a month.
  • To compose a semantic core, the free Slovoeb program is convenient, and it is all you need.
  • The five steps for compiling an SA: 1 - selecting basic keys; 2 - parsing the basic keys; 3 - collecting exact frequencies for the queries; 4 - checking the keys' competition; 5 - collecting "tails".
  • If you want to write the articles yourself, it is better to build the semantic core yourself, for yourself; SA specialists won't be able to help you here.
  • If you want to go for quantity and use copywriters, then it makes sense to delegate compiling the semantic core as well, as long as there is enough money for everything.

I hope this guide was helpful to you. Save it to your bookmarks so you don't lose it, and share it with your friends. And don't forget to download my book, where I show the fastest way from zero to the first million on the Internet (distilled from 10 years of personal experience =)

See you later!

Your Dmitry Novoselov

The semantic core is the set of search words and phrases used to promote a site. These words and phrases help search robots determine the subject of a page or of the entire service, that is, to find out what the company does.

In linguistics, semantics is the branch of language science that studies the meaning of lexical units. Applied to search engine optimization, this means that the semantic core is the semantic content of the resource: it helps you decide what information to convey to users and in what way. Semantics is therefore the foundation, the basis of all SEO.

What is the semantic core of the site for and how to use it?

  • A correct semantic core is needed to accurately estimate the cost of promotion.
  • Semantics sets the direction for internal SEO optimization: the most relevant queries are selected for each service or product so that users and search robots can find them more easily.
  • The site structure and the texts for thematic pages are created on its basis.
  • Keys from the semantics are used when writing snippets (short page descriptions).

Here is an example of a semantic core compiled for a construction company's website:

The optimizer collects the semantics, splits it into logical blocks, finds out the number of impressions for each, and, based on the cost of the queries in the Yandex and Google top, calculates the total cost of promotion.
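The budget calculation just described might be sketched as a simple sum over per-query costs. The costs and the 15% safety margin below are invented placeholders, not real market figures:

```python
# Sketch: estimating a promotion budget from per-query costs in the top.
# The costs and the 15% safety margin are invented placeholders.
query_costs = {
    "build a house from glued beams": 120.0,
    "glued beam house projects": 85.0,
    "order house construction": 140.0,
}

def promotion_budget(costs, margin=0.15):
    """Sum the per-query costs and add a margin for re-checks."""
    base = sum(costs.values())
    return round(base * (1 + margin), 2)

print(promotion_budget(query_costs))  # 345.0 * 1.15 -> 396.75
```

In practice the per-query figures change with the season and the region, which is one more reason the estimate is agreed with the customer rather than fixed forever.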

Of course, the specifics of the company's work are taken into account when selecting the semantic core: for example, if the company does not design and build houses from glued beams, we delete the corresponding queries and do not use them further. That is why coordinating the semantics with the customer is a mandatory stage of the work: no one knows the specifics of the company better than the customer.

Types of keywords

There are several parameters by which keywords are classified.

  1. By frequency:
    • high-frequency - words and phrases with 1000 or more impressions per month;
    • mid-frequency - up to 1000 impressions per month;
    • low-frequency - up to 100 impressions per month.

  Collecting frequencies for keywords shows what users request most often. But a high-frequency query is not necessarily a highly competitive one, and assembling semantics that combines high frequency with low competitiveness is one of the main goals of working with a semantic core.

  2. By type:
    • geo-dependent and non-geo-dependent - queries tied to a region and those that are not;
    • informational - the user is looking for information; keys of this type are usually used in articles, for example reviews or useful tips;
    • branded - containing the name of the promoted brand;
    • transactional - implying an action from the user (buy, download, order, and so on).

  Some queries are hard to assign to any type: take the key "profiled timber". A user entering such a query could mean anything: buying timber, its properties, comparisons with other materials, and so on.

  From the experience of our company, promoting a site for such queries is very difficult: as a rule, they are high-frequency and highly competitive, which makes them not only hard to optimize for but also expensive for the client.
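The type taxonomy above can be roughly approximated with trigger-word rules. A sketch only: the trigger lists and the brand name "acme" are invented, and real classification needs manual review:

```python
# Sketch: rough query-type classification with trigger words.
# The trigger lists and the brand name "acme" are illustrative only.
TRANSACTIONAL = ("buy", "order", "price", "download")
INFORMATIONAL = ("how", "what", "why", "review")

def classify(query, brand="acme"):
    q = query.lower()
    words = q.split()
    if brand in q:
        return "branded"
    if any(w in words for w in TRANSACTIONAL):
        return "transactional"
    if any(w in words for w in INFORMATIONAL):
        return "informational"
    return "other"  # e.g. "profiled timber": the intent is ambiguous

print(classify("buy profiled timber"))   # transactional
print(classify("how to choose timber"))  # informational
print(classify("profiled timber"))       # other
```

Note how "profiled timber" falls through every rule into "other", which matches the point above: with no trigger word, the intent cannot be read off the query itself.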

How to build a semantic core for a website?

  • By analyzing competitor sites: services such as SEMrush or Serpstat let you see competitors' semantic cores.

The process of compiling a semantic core

The collected queries are not yet a semantic core: we still need to separate the wheat from the chaff so that only queries relevant to the client's services remain.

To form the semantic core, the queries must be clustered (divided into blocks according to the logic of the service). You can do this with programs (for example, KeyAssort or TopSite), which helps especially when the semantics is voluminous, or you can evaluate and sort the whole list manually, removing unsuitable queries.

Then send the result to the client so they can check it for errors.

The finished semantic core is a yellow brick road to the content plan: blog articles, texts for product cards, company news, and so on. It is a table of your audience's needs that your site can satisfy.

  • Distribute the keys across pages.
  • Use keywords in the meta tags (title, description) and in the H1-H6 headings (especially in the first-level heading H1).
  • Insert the keys into the texts for the pages. This is one of the white-hat optimization methods, but it is important not to overdo it: you can fall under search engine filters for overspam.
  • Save the remaining search queries, and any that do not fit into a section, under the name "What else to write about". You can use them later for informational articles.
  • And remember: focus on the needs and interests of users, so trying to cram all the keys into one text is pointless.

Collecting a semantic core for a site: main mistakes

  • Refusing highly competitive keys. Yes, you may never reach the top for "buy profiled timber" (and that will not stop you from successfully selling your services), but you still need to include it in the texts.
  • Refusing low-frequency keys. This is a mistake for the same reason as rejecting highly competitive queries.
  • Creating pages for queries and for the sake of queries. "Buy profiled timber" and "order profiled timber" are essentially the same thing; it makes no sense to split them into separate pages.
  • Absolute and unconditional trust in software. You cannot do without SEO programs, but manual analysis and data verification are still necessary: no program can yet assess the industry and the level of competition, or distribute keys without errors.
  • Treating keys as everything. No: a convenient, understandable website and useful content are everything. Any text needs keys, but if the text is bad, the keys will not save it.

The semantic core is a rather hackneyed topic, isn't it? Today we will fix that together by collecting semantics in this tutorial! Don't believe me? See for yourself: just type the phrase "the semantic core of the site" into Yandex or Google.
I think that today I will correct this annoying mistake.

But really, what is perfect semantics to you? You might think this is a silly question, but in fact it is not: most webmasters and site owners firmly believe that they can compose semantic cores, that any student can handle it, and they even try to teach others... In reality, everything is much more complicated. I was once asked what should come first, the site and its content or the semantic core, and the person asking did not consider himself a newcomer to SEO. That question made me appreciate the complexity and ambiguity of the problem.

The semantic core is the foundation of foundations: the very first step before launching any advertising campaign on the Internet. At the same time, collecting a site's semantics is the most tedious part of the process; it will take a lot of time, but it will more than pay off in any case.

Well then, let's create it together!

A small preface

To create the site's semantic field we need a single program: Key Collector. Using the Collector as an example, I will walk through collecting a small semantic core.
Apart from this paid program, there are free analogues such as SlovoEb and others.

Semantics is collected in several basic stages:

  • brainstorming - analysis of basic phrases and preparation for parsing;
  • parsing - extending the basic semantics using Wordstat and other sources;
  • screening - weeding out junk queries after parsing;
  • analysis - analysis of frequency, seasonality, competition and other important indicators;
  • refinement - grouping, and separating the commercial and informational phrases of the core.

The most important stages of collection are discussed below.

VIDEO - compiling a semantic core from competitors

Brainstorming when creating a semantic core

At this stage you need to mentally outline the semantic core of the site and come up with as many phrases as possible for your topic. Launch Key Collector and select Wordstat parsing; a small window opens where you enter the maximum number of phrases on your subject.
As I already said, in this article we will create an example set of phrases for this blog, so the phrases could be:

  • seo blog
  • blog about seo
  • promotion
  • project promotion
  • blog promotion
  • article promotion
  • miralinks
  • work in SAPE
  • buying links
  • optimization
  • page optimization
  • internal optimization
  • self-promotion
  • how to promote a resource
  • how to promote your site
  • how to promote a website yourself
  • free promotion
  • search engine optimization
  • how to promote a website in Yandex
  • promotion under Yandex
  • promotion in Google
  • indexing
  • speeding up indexing
  • donor site selection
  • donor screening
  • promotion with blogs
  • Yandex algorithm
  • TIC updates
  • search database update
  • Yandex update
  • eternal links
  • link rental
  • monthly payment links
  • compiling a semantic core
  • promotion secrets
  • SEO secrets
  • optimization secrets

I think that is enough; as it is, the list takes half a page ;) The idea at the first stage is to analyze your industry as fully as possible and select the maximum number of phrases reflecting the site's theme.
Although, if you missed something at this stage, do not despair: the missing phrases will surface in the next steps, you will just have to do a bit of extra work. Take the list and copy it into Key Collector, then click the Parse with Yandex.Wordstat button.

Parsing can take quite a long time, so be patient. A semantic core is usually assembled in 3-5 days, and the first day will be spent preparing the basic semantics and parsing.

In addition, instead of brainstorming we can use competitors' ready-made semantics via one of the specialized services, for example SpyWords. In its interface we simply enter the keyword we need and see the main competitors in the TOP for that phrase. Moreover, the full semantics of any competitor's site can be exported with this service. We can then pick any of them, pull out its queries, filter out the garbage, and use the rest as basic semantics for further parsing. Or we can do it even simpler and use .

Cleaning up the semantics

As soon as the Wordstat parsing finishes, it is time to weed out the semantic core.
This stage is very important, so treat it with due attention.

My parsing is over, but there are so many phrases that screening the words could take extra time. Therefore, before determining the frequencies, we carry out a primary cleaning of the words in several steps.

1. Filter out queries with very low frequencies

Click the sort-by-frequency symbol and clear out all queries with a frequency below 30. I think you can handle this item easily.

2. Remove inappropriate queries

Some queries have sufficient frequency and low competition but are completely irrelevant to our theme. Such keys must be removed before checking the exact occurrences, since that check can take a very long time; we delete them manually. For my blog, the following turned out to be superfluous:

  • search engine optimization courses
  • sell a promoted site

Semantic core analysis

At this stage we need to determine the exact frequencies of our keys by clicking the magnifying-glass symbol. The process is quite long, so you can go and make yourself some tea.

When the check completes, we continue cleaning the core. I suggest removing all keys with a frequency of less than 10 queries.
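The primary cleaning described above (a frequency floor plus manual removal of off-topic phrases) can be sketched like this. The phrase data, the stop-word list, and the threshold of 30 mirror the examples in the text but are otherwise invented:

```python
# Sketch: primary cleaning - drop phrases below a frequency floor and
# phrases containing off-topic stop words. Data and thresholds mirror
# the examples in the text but are otherwise invented.
parsed = {
    "seo blog": 540,
    "search engine optimization courses": 210,
    "sell a promoted site": 95,
    "blog promotion": 25,
}
STOP_WORDS = {"courses", "sell"}

def clean(phrases, min_freq=30):
    return {
        phrase: freq for phrase, freq in phrases.items()
        if freq >= min_freq and not (set(phrase.split()) & STOP_WORDS)
    }

print(clean(parsed))  # only on-topic phrases with freq >= 30 survive
```

In Key Collector this is done through the interface rather than code, but the logic is the same: cheap automatic filters first, then manual review of what remains.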
Also, for my blog I will delete all queries with values above 1,000, since I do not plan to target such queries yet.

Export and grouping of the semantic core

Do not think that this stage will be the last. Not at all! Now we transfer the resulting group to Excel for maximum clarity, sort it by pages, and then many shortcomings will become visible, which we will fix.

Exporting the site's semantics to Excel is not difficult: just click the corresponding symbol. After pasting into Excel, delete the unneeded columns and create another table that will contain the final semantic core.

The new table has three columns: the page URL, the key phrase, and its frequency. As the URL, choose either an existing page or a page that will be created later. First, let's pick the keys for the home page of my blog.
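The three-column layout just described (URL, key phrase, frequency) can be sketched with Python's standard csv module; the URLs and rows below are invented examples:

```python
# Sketch: the final three-column table (URL, key phrase, frequency)
# written with the standard csv module. URLs and rows are invented.
import csv
import io

core = [
    ("https://example.com/", "seo blog", 480),
    ("https://example.com/", "seo news", 150),
    ("https://example.com/articles/", "seo articles", 90),
]

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["URL", "key phrase", "frequency"])
writer.writerows(core)
print(buf.getvalue())
```

A CSV file like this opens directly in Excel, so the same grouping-by-URL review described in the text can continue there.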
And several conclusions immediately arise:

  1. high-frequency queries should have a much larger tail of less frequent phrases than we see;
  2. seo news;
  3. a new key has surfaced that we did not take into account earlier - SEO articles. This key needs to be parsed.

As I said, not a single key can hide from us. The next step is to brainstorm these phrases, and after brainstorming we repeat all the steps from the very first point for these keys. All this may seem too long and tedious, but that is how it is: compiling a semantic core is responsible and painstaking work. On the other hand, a well-composed semantic field greatly helps website promotion and can save a lot of budget.

After all the operations done, we obtained new keys for the main page of this blog:

  • best seo blog
  • seo news
  • SEO articles

And some others. I think you understand the method.

After all these manipulations we will see which pages of our project need to be changed and which new pages need to be added. Most of the keys we found (with a frequency of up to 100, and sometimes much higher) can be promoted easily.

Final elimination

In principle, the semantic core is almost ready, but there is one more rather important point that will significantly improve our semantic group. For this we need SeoPult (in fact, any similar service that shows keyword competition, for example Mutagen, will do here).

So, we create another table in Excel and copy only the key names there (the middle column).
In order not to waste a lot of time, I will copy only the keys for the main page of my blog.

Then we check the cost of getting one click for our keywords. For some phrases the cost per click exceeded 5 rubles; such phrases must be excluded from our core. Your own preferences may differ: you can exclude cheaper phrases, or vice versa. In my case, I deleted 7 phrases.

Helpful information!

See also the material on compiling a semantic core with an emphasis on screening for the least competitive keywords. And if you have your own online store, read the piece describing how the semantic core can be used there.

Semantic core clustering

I am sure that you have heard this word in relation to search promotion before. Let's figure out what kind of animal this is and why it is needed when promoting a site.
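The cost cut-off just described is a one-line filter; the per-click costs below are invented for illustration, and the 5-ruble cap mirrors the figure in the text:

```python
# Sketch: drop keys whose estimated cost per click exceeds a cap,
# mirroring the "over 5 rubles" cut above. The costs are invented.
click_costs = {
    "best seo blog": 2.4,
    "seo news": 6.1,
    "seo articles": 4.8,
    "seo secrets": 7.3,
}

def affordable(costs, cap=5.0):
    """Return the keys cheap enough to keep, sorted alphabetically."""
    return sorted(key for key, cost in costs.items() if cost <= cap)

print(affordable(click_costs))  # ['best seo blog', 'seo articles']
```

Raising or lowering `cap` is the same trade-off the text mentions: a stricter cap saves budget, a looser one keeps more queries.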
The classic model of search promotion is as follows:

  • selection and analysis of search queries;
  • grouping the queries by site pages (creating landing pages);
  • preparation of SEO texts for the landing pages based on each page's group of queries.

Clustering is used to facilitate and improve the second stage. Basically, clustering is a programmatic method that simplifies this stage when working with large semantics, but not everything is as simple as it might seem at first glance.

To better understand the theory of clustering, a short digression into the history of SEO helps:

Just a few years ago, when the term "clustering" did not yet peek out from behind every corner, SEOs in the vast majority of cases grouped semantics by hand. But when grouping huge semantics of 1,000, 10,000 and even 100,000 queries, this procedure turned into real hard labor for an ordinary person. So the method of grouping by meaning came into use everywhere (and many still use this approach): queries that have a semantic relationship are combined into one group. For example, the queries "buy a washing machine" and "buy a washing machine under 10,000" would be combined into one group.
And everything would be fine, but this method contains a number of critical problems, and to understand them we must introduce a new term into our narrative, namely "query intent".

The easiest way to describe this term is as the user's need, their desire. An intent is nothing more than the desire of the user entering a search query. The basis of grouping semantics is collecting queries with the same, or as close as possible, intents into one group, and here two interesting features pop up at once:

  • the same intent can stand behind several queries that have no semantic similarity at all, for example "car service" and "sign up for MOT";
  • queries with absolute semantic proximity can contain radically different intents; a textbook situation is "mobile phone" versus "mobile phones", where in one case the user wants to buy a phone and in the other wants to watch a movie.

So, grouping semantics by semantic correspondence does not take query intents into account, and groups composed this way will not let you write a text that reaches the TOP. In the days of manual grouping, people with the job title "SEO specialist's assistant" analyzed the search results by hand to eliminate this mismatch.

The essence of clustering is comparing the search engine's generated results and looking for patterns.
From this definition you should immediately note that clustering itself is not the ultimate truth, because the generated results may not fully reveal the intent (the Yandex index may simply not contain a site that combined the queries into a group correctly).

The mechanics of clustering are simple and look like this:

  • the system submits each query given to it to the search engine in turn and remembers the results from the TOP;
  • after entering the queries one by one and saving the results, the system looks for intersections in the output: if the same site with the same document (site page) is in the TOP for several queries at once, these queries can theoretically be combined into one group;
  • a parameter called grouping strength tells the system exactly how many intersections there must be for queries to be added to one group; for example, a grouping strength of 2 means there must be at least two intersections in the results for 2 different queries, that is, at least two pages of two different sites must be simultaneously in the TOP for both queries;
  • when grouping large semantics, the logic of relationships between queries becomes relevant, and on its basis 3 basic types of clustering are distinguished: soft, middle and hard. We will talk about the types of clustering in future entries of this diary.
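The intersection mechanics above can be sketched in a few lines. The miniature "SERPs" are invented, and comparing each candidate against the group's first query is one simple (soft-style) policy among the several the text mentions:

```python
# Sketch of the intersection mechanics: two queries join one group when
# at least `strength` identical URLs appear in both of their TOPs.
# The miniature "SERPs" below are invented.
serps = {
    "car service": ["a.com/s", "b.com/x", "c.com/1"],
    "sign up for MOT": ["a.com/s", "b.com/x", "d.com/2"],
    "mobile phones": ["e.com/f", "f.com/g", "g.com/h"],
}

def cluster(serps, strength=2):
    groups = []
    for query, urls in serps.items():
        for group in groups:
            # Soft-style policy: compare against the group's first query.
            if len(set(urls) & set(serps[group[0]])) >= strength:
                group.append(query)
                break
        else:
            groups.append([query])
    return groups

print(cluster(serps))
# [['car service', 'sign up for MOT'], ['mobile phones']]
```

Note how "car service" and "sign up for MOT" end up together despite having no words in common: the shared TOP pages stand in for the shared intent, which is exactly the point of SERP-based clustering. A hard-clustering variant would instead require the intersection to hold against every query already in the group.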
window.open(link.replace("_","http://","https://"));} // ]]></script> <script type="text/javascript">(window.Image ? (new Image()) : document.createElement('img')).src = location.protocol + '//vk.com/rtrg?r=VaOkaUHa8b7aOnpvk6*6aJe31KwblJWqkjoKUg6hphyBGU1uxyonnbQb1r4oIvdJyeruVbHyREoFF4QvP9MgyWw60g4RK7x6C7HK4QqiFEFVR9ga/77OHVf6lAy59pIahKMAGkZelNuxMkkmSG4v6PPVXCRWTvnrTQdWQiLrJII-';</script> </body> </html>