
What is the semantic core of a site? The semantic core of a site (hereinafter referred to as the SC) is the set of keywords and phrases for which the resource is promoted in search engines and which indicate that the site belongs to a certain topic.

For successful promotion in search engines, keywords must be correctly grouped and distributed across the pages of the site, and they must appear in a certain form in the meta tags (description, keywords) as well as in the H1-H6 headings. At the same time, overspam must be avoided, so as not to "fly away" under search-engine filters.

In this article we will try to look at the issue not only from a technical point of view, but also to look at the problem through the eyes of business owners and marketers.

How to create a semantic core of a website

So, let's look at each point in more detail with various examples.

At the first step, it is important to determine which products and services present on the site will be promoted in the search results of Yandex and Google.

Example No. 1. Let's say the site offers two kinds of services: computer repair at home and training in Word/Excel at home. In this case, it was decided that the training was no longer in demand, so there was no point in promoting it, and therefore no point in collecting semantics for it. Another important point: you need to collect not only queries containing "computer repair at home" but also "laptop repair", "PC repair" and others.

Example No. 2. The company is engaged in low-rise construction, but it builds only wooden houses. Accordingly, queries and semantics for areas such as "construction of houses from aerated concrete" or "construction of brick houses" need not be collected.

Collection of semantics

We will look at two main sources of keywords: Yandex and Google. We’ll tell you how to collect semantics for free and briefly review paid services that can speed up and automate this process.

In Yandex, key phrases are collected through the Yandex.Wordstat service, and in Google through query statistics in Google AdWords. If available, you can use data from Yandex.Webmaster and Yandex.Metrica, Google Webmaster Tools and Google Analytics as additional sources of semantics.

Collecting keywords from Yandex.Wordstat

Collecting queries from Wordstat can be considered free. To view the data of this service, you only need a Yandex account. So let's go to wordstat.yandex.ru and enter the keyword. Let's consider an example of collecting semantics for a car rental company website.

What do we see in this screenshot?

  1. Left column. Here is the basic query and its various variations with a "tail". Next to each query is a number indicating how many times, in total, it has been entered by users.
  2. Right column. Queries similar to the main one, with their overall frequency figures. Here we see that a person who wants to rent a car, in addition to the query "car rental", may use synonymous phrasings such as "rent a car", "car hire" and others (in Russian these are all distinct queries). This is very important data to pay attention to so as not to miss a single query.
  3. Regionality and history. By selecting one of the possible options, you can check the distribution of requests by region, the number of requests in a particular region or city, as well as the trend of changes over time or with the change of season.
  4. Devices from which queries were made. By switching tabs, you can find out which devices people most often search from.

Check different versions of key phrases and record the received data in Excel tables or Google Sheets. For convenience, install the Yandex Wordstat Helper plugin. After installing it, plus signs appear next to the search phrases; click one and the phrase is copied to a list together with its frequency, so you no longer need to select and paste anything manually.

Collecting keywords from Google AdWords

Unfortunately, Google has no open service that shows search queries with their frequencies, so here we have to work around that. And for this we need a working Google AdWords account.

We register an account in Google AdWords and top up the balance with the minimum possible amount, 300 rubles (an account with no active budget only shows approximate data). After that, go to "Tools" - "Keyword Planner".

A new page will open, where in the “Search for new keywords by phrase, site or category” tab, enter the keyword.

Scroll down, click “Get options” and see something like this.

  1. Top request and average number of requests per month. If the account is not paid, then you will see approximate data, that is, the average number of requests. When there are funds on the account, exact data will be shown, as well as the dynamics of changes in the frequency of the entered keyword.
  2. Keywords by relevance. This is the same as similar queries in Yandex Wordstat.
  3. Downloading data. This tool is convenient because the data obtained in it can be downloaded.

We looked at working with two main sources of statistics on search queries. Now let's move on to automating this process, because collecting semantics manually takes too much time.

Programs and services for collecting keywords

Key Collector

The program is installed on your computer. You connect to it the working accounts from which statistics will be collected, then create a new project and a folder for keywords.

Select “Batch collection of words from the left column of Yandex.Wordstat”, enter the queries for which we collect data.

An example is shown in the screenshot; in fact, for more complete semantics you would additionally need to collect all query variants with car makes and classes. For example, "bmw for rent", "rent a toyota with an option to buy", "rent an SUV" and so on.

SlovoEB

A free analogue of the previous program. This is both a plus (you don't need to pay) and a minus (the program's functionality is significantly reduced).

To collect keywords, the steps are the same.

Rush-analytics.ru

An online service. Its main advantage is that you don't need to download or install anything: register and use it. The service is paid, but on registration you get 200 coins in your account, which is enough to collect a small semantic core (up to 5,000 queries) and parse frequencies.

The downside is that semantics are collected only from Wordstat.

Checking the frequency of keywords and queries

Wordstat shows the base frequency by default, so we check each phrase with operators: quotation marks fix the phrase composition, and an exclamation mark fixes the word form. Entering the query in quotation marks, we again notice a decrease in the number of requests. Let's go further and try another word form of the same query.

We note that in the singular this query is entered by far fewer users, which means the original form is the higher priority for us.

Such checks must be carried out for every word and phrase. Queries whose final frequency (measured with quotation marks and an exclamation mark) equals zero are eliminated, because "0" means that no one enters such a query and it exists only as part of other queries. The whole point of compiling a semantic core is to select the queries people actually use to search. All queries are then placed in an Excel table, grouped by meaning and distributed across the pages of the site.
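Once the frequencies are collected (by hand or through the services listed below), the zero cleanup itself is easy to script. A minimal sketch in Python with pandas, assuming a CSV with hypothetical columns phrase, base_freq and exact_freq:

```python
import pandas as pd

# Assumed input: one row per query, with the base ("dirty") frequency
# and the exact frequency measured with the quotes + "!" operators.
df = pd.read_csv("keywords.csv")

# Drop "dummy" queries: exact frequency 0 means nobody types the
# phrase itself - its volume comes only from longer queries.
core = df[df["exact_freq"] > 0]
core.to_csv("keywords_clean.csv", index=False)
```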

Doing this manually for thousands of phrases is unrealistic, so there are many services on the Internet, paid and free, that do it automatically. Here are a few:

  • megaindex.com;
  • rush-analytics.ru;
  • tools.pixelplus.ru;
  • key-collector.ru.

Removing non-target requests

After sifting through the keywords, you should remove unnecessary ones. What search queries can be removed from the list?

  • requests with the names of competitors' companies (can be left in);
  • requests for goods or services that you do not sell;
  • requests that indicate a district or region in which you do not work.

Clustering (grouping) of requests for site pages

The essence of this stage is to combine queries with similar meaning into clusters, and then determine which pages they will be promoted to. How can you understand which requests to promote to one page and which to another?

1. By request type.

We already know that queries are divided into several types, depending on the purpose of the search:

  • commercial (buy, sell, order) - promoted to landing pages, pages of product categories, product cards, pages with services, price lists;
  • informational (where, how, why, what for) - articles, forum topics, Q&A sections;
  • navigation (telephone, address, brand name) - page with contacts.

If you are in doubt about a query's type, enter it in the search bar and analyze the results. For commercial queries there will be more pages offering services; for informational queries, more articles.

There is also geo-dependence. Most commercial queries are geo-dependent, since people are more likely to trust companies located in their own city.
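If you have many queries, this first pass can be roughed out in code before any manual SERP checks. A small sketch; the marker-word lists are assumptions to adapt to your topic, not a complete classifier:

```python
# Marker words are illustrative; extend them for your own subject area.
COMMERCIAL = {"buy", "order", "price", "sell", "rent"}
INFORMATIONAL = {"how", "why", "where", "what", "choose"}

def query_type(query: str) -> str:
    words = set(query.lower().split())
    if words & COMMERCIAL:
        return "commercial"
    if words & INFORMATIONAL:
        return "informational"
    return "check the SERP manually"

print(query_type("buy iphone x"))                # commercial
print(query_type("how to choose a smartphone"))  # informational
```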

2. Request logic.

  • "buy iphone x" and "iphone x price" - should be promoted on one page, since in both cases the search is for the same product and for more detailed information about it;
  • "buy iphone" and "buy iphone x" - should be promoted on different pages, since the first is a general query (suited to the product category where the iPhones live), while in the second the user is looking for a specific product, so the query should be promoted on a product card;
  • "how to choose a good smartphone" - this query is more logically promoted on a blog article with a matching title.

3. By search results.

View the search results for the queries. If you check which pages on different sites rank for "construction of houses made of timber" and "construction of houses made of bricks", in 99% of cases they are different pages (a code sketch combining this check with automatic grouping follows this section).

4. Automatic grouping using software and manual refinement.

Methods 1 and 2 work well for compiling the semantic core of small sites, where at most 2-3 thousand keywords are collected. For a large core (from 10,000 queries upward), machine help is needed. Here are several programs and services that can perform clustering:

  • KeyAssistant - assistant.contentmonster.ru;
  • semparser.ru;
  • just-magic.org;
  • rush-analytics.ru;
  • tools.pixelplus.ru;
  • key-collector.ru.

After automatic clustering is complete, you need to check the program's results manually and correct any errors.

Example: the program may put the following queries into one cluster: "vacation in Sochi 2018 hotel" and "vacation in Sochi 2018 hotel Breeze". In the first case the user is looking at various hotel options for accommodation; in the second, for a specific hotel.

To eliminate the occurrence of such inaccuracies, you need to manually check everything and, if errors are found, edit.
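For those comfortable with code, methods 3 and 4 can be combined: group queries whose top-10 results overlap strongly, then review the clusters by hand as described above. A sketch that assumes a hypothetical get_top10() helper returning the set of top-10 URLs for a query (for example, via a paid SERP API; no specific one is implied):

```python
def cluster_by_serp(queries, get_top10, threshold=4):
    """Greedy clustering: a query joins the first cluster whose
    representative shares at least `threshold` top-10 URLs with it."""
    tops = {q: get_top10(q) for q in queries}
    clusters = []
    for q in queries:
        for cluster in clusters:
            if len(tops[q] & tops[cluster[0]]) >= threshold:
                cluster.append(q)
                break
        else:
            clusters.append([q])
    return clusters
```

The threshold plays the same role as the grouping strictness setting in clustering services: the higher it is, the smaller and cleaner the clusters, and the more manual merging remains.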

What to do next after compiling the semantic core?

Based on the collected semantic core, we then:

  1. We create the ideal structure (hierarchy) of the site from the point of view of search engines, or, in agreement with the customer, change the structure of the old site;
  2. We write technical assignments for copywriters to produce texts that take into account the cluster of queries being promoted on each page, or we update the old articles and texts on the site.

It looks something like this.

For each generated request cluster, we create a page on the site and determine its place in the site structure. The most popular queries are promoted to the top pages in the resource hierarchy, less popular ones are located below them.

And for each of these pages, we have already collected requests that we will promote on them. Next, we write technical specifications to copywriters to create text for these pages.

Technical specifications for a copywriter

As in the case of the site structure, we will describe this stage in general terms. So, technical specifications for the text:

  • number of characters without spaces;
  • page title;
  • subheadings (if any);
  • a list of words (based on our core) that should be in the text;
  • uniqueness requirement (always require 100% uniqueness);
  • desired text style;
  • other requirements and wishes in the text.

Remember: don't try to promote hundreds of queries on one page. Limit yourself to 5-10 plus their "tail", otherwise you will be penalized for over-optimization and will drop out of the race for TOP positions for a long time.

Conclusion

Compiling the semantic core of a site is painstaking and hard work, which needs to be given especially close attention, because it is on this that the further promotion of the site is based. Follow the simple instructions given in this article and take action.

  1. Choose the direction of promotion.
  2. Collect all possible queries from Yandex and Google (use special programs and services).
  3. Check the frequency of queries and get rid of dummies (those with a frequency of 0).
  4. Remove non-target requests - services and goods that you do not sell, requests mentioning competitors.
  5. Form query clusters and distribute them across pages.
  6. Create an ideal site structure and draw up technical specifications for the content of the site.

The semantic core is a scary name that SEOs came up with to denote a rather simple thing. We just need to select the key queries for which we will promote our site.

And in this article I will show you how to correctly compose a semantic core so that your site quickly reaches the TOP, and does not stagnate for months. There are also “secrets” here.

And before we move on to compiling the SC, let's figure out what it is and what we should end up with.

What is the semantic core in simple words

Oddly enough, the semantic core is just a regular Excel file containing a list of the key queries for which you (or your copywriter) will write articles for the site.

For example, this is what my semantic core looks like:

I have marked in green those key queries for which I have already written articles. Yellow - those for which I plan to write articles in the near future. And colorless cells mean that these requests will come a little later.

For each key query, I have determined the frequency and competitiveness, and come up with a "catchy" title. You should end up with roughly the same kind of file. Right now my SC consists of 150 keywords. That means I have "material" for at least 5 months in advance, even if I write one article a day.
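If you prefer to keep the same table outside Excel, it is trivial to generate. A sketch with illustrative column names and made-up rows (not my actual core):

```python
import csv

rows = [
    # keyword, exact monthly frequency, competition, working title, status
    ("smm promotion from scratch", 150, 4, "SMM Promotion From Scratch", "written"),
    ("contextual advertising setup", 90, 5, "Setting Up Contextual Ads", "planned"),
]
with open("semantic_core.csv", "w", newline="") as f:
    w = csv.writer(f)
    w.writerow(["keyword", "exact_freq", "competition", "title", "status"])
    w.writerows(rows)
```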

Below we will talk about what to expect if you decide to order the collection of a semantic core from specialists. Here I will say briefly: they will give you the same kind of list, only with thousands of "keys". However, in an SC it is not quantity that matters, but quality. And that is what we will focus on.

Why do we need a semantic core at all?

But really, why all this torment? You could, after all, just write high-quality articles and attract an audience, right? You can write them, but you won't attract anyone.

The main mistake of 90% of bloggers is simply writing high-quality articles. I'm not kidding, they have really interesting and useful materials. But search engines don’t know about it. They are not psychics, but just robots. Accordingly, they do not rank your article in the TOP.

There is another subtle point, concerning the title. For example, you have a very high-quality article on the topic "How to properly run a business on Facebook". There you describe everything about Facebook in great detail and professionally, including how to promote communities there. Your article is the highest-quality, most useful and interesting on the Internet on this topic. Nothing else comes close. But it still won't help you.

Why high-quality articles fall from the TOP

Imagine that your website was visited not by a robot but by a live inspector (assessor) from Yandex, who realized that you have the coolest article and manually put you in first place in the search results for the query "Promoting a community on Facebook".

Do you know what happens next? You will fly out of there very soon anyway, because no one will click on your article even in first place. People enter the query "Promoting a community on Facebook", and your headline is "How to properly run a business on Facebook". Original, fresh, funny, but... not what was asked for. People want to see exactly what they were looking for, not your creativity.

Accordingly, your article will vacate its place in the TOP of the search results. And the living assessor, an ardent admirer of your work, can beg his superiors as much as he likes to leave you at least in the TOP 10. It won't help. All the first places will be taken by empty articles, like sunflower-seed husks, that yesterday's schoolchildren copied from one another.

But these articles will have the correct, "relevant" title: "Promoting a community on Facebook from scratch" (step by step, in 5 steps, from A to Z, free, etc.). Offensive? Of course. Well then, let's fight the injustice and create a competent semantic core, so that your articles take the first places they deserve.

Another reason to start compiling the SC right now

There is one more thing that for some reason people don’t think much about. You need to write articles often - at least every week, but preferably 2-3 times a week - to get more traffic, faster.

Everyone knows this, but almost no one does it. And all because they have "creative stagnation", they "just can't force themselves", they are "simply lazy". In fact, the whole problem lies in the absence of a specific semantic core.

Step #1 — Selecting basic keys

I entered one of my basic keys, "smm", into the search field, and Yandex immediately gave me a dozen hints about what else might interest people who are interested in "smm". All I have to do is copy these keys into a notebook. Then I will check each of them in the same way and collect hints on them as well.

After the first stage of collecting key words, you should end up with a text document containing 10-30 broad basic keys, which we will work with further.

Step #2 — Parsing basic keys in SlovoEB

Of course, if you write an article for the request “webinar” or “smm”, then a miracle will not happen. You will never be able to reach the TOP for such a broad request. We need to break the basic key into many small queries on this topic. And we will do this using a special program.

I use Key Collector, but it's paid. You can use its free analogue, the SlovoEB program, which you can download from the official website.

The most difficult thing about working with this program is setting it up correctly. I have already shown how to properly set up and use SlovoEB, but in that article I focused on selecting keys for Yandex Direct.

And here let’s look step by step at the features of using this program for creating a semantic core for SEO.

First, we create a new project and name it by the broad key that you want to parse.

I usually give the project the same name as my base key to avoid confusion later. And yes, let me warn you against one more mistake: don't try to parse all your base keys at once, or it will be very difficult to separate the "empty" queries from the golden grains. Parse one key at a time.

After creating the project, we carry out the basic operation, that is, we parse the key through Yandex Wordstat. To do this, click the "Wordstat" button in the program interface, enter your base key, and click "Start collection".

For example, let's parse the base key for my blog “contextual advertising”.

After this, the process will start, and after some time the program will give us the result - up to 2000 key queries that contain “contextual advertising”.

Also, next to each request there will be a “dirty” frequency - how many times this key (+ its word forms and tails) was searched per month through Yandex. But I do not advise drawing any conclusions from these figures.

Step #3 - Collecting the exact frequency for the keys

Dirty frequency will not show us anything. If you focus on it, then don’t be surprised when your key for 1000 requests does not bring a single click per month.

We need to determine the exact frequency. To do this, first select all the found keys with checkmarks, then click the "Yandex Direct" button and start the process again. Now SlovoEB will look up the exact monthly frequency for each key.

Now we have an objective picture - how many times what query was entered by Internet users over the past month. I now propose to group all key queries by frequency to make it easier to work with them.

To do this, click the filter icon in the exact-frequency column and specify the condition "less than or equal to 10".

Now the program will show you only those requests whose frequency is less than or equal to the value “10”. You can delete these queries or copy them to another group of key queries for future use. Less than 10 is very little. Writing articles for these requests is a waste of time.

Now we need to select those key queries that will bring us more or less good traffic. And to do this, we need to find out one more parameter - the level of competitiveness of the request.

Step #4 — Checking the competitiveness of requests

All "keys" in this world are divided into three types: high-frequency (HF), mid-frequency (MF) and low-frequency (LF). They can also be highly competitive (HC), moderately competitive (MC) and low-competitive (LC).

As a rule, HF queries are also HC. That is, if a query is searched often, there are a lot of sites that want to rank for it. But this is not always the case; there are happy exceptions.

The art of compiling a semantic core lies precisely in finding queries that have a high frequency and a low level of competition. It is very difficult to manually determine the level of competition.

You can look at indicators such as the number of home pages in the TOP 10, the length and quality of the texts, and the trust level and TIC (Yandex citation index) of the sites in the TOP of the results for the query. All of this will give you some idea of how tough the competition for rankings is for this particular query.

But I recommend using the Mutagen service. It takes into account all the parameters I mentioned above, plus a dozen more that neither you nor I have probably even heard of. After the analysis, the service gives an exact value for the level of competition for the query.

Here I checked the query "setting up contextual advertising in google adwords". Mutagen showed that this key has a competitiveness of "more than 25", the maximum value it reports, and only 11 views per month. So it definitely doesn't suit us.

We can copy all the keys we found in SlovoEB and do a mass check in Mutagen. After that, we just look through the list and take the queries with plenty of searches and a low level of competition.
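If the check results are exported to a table, the selection reduces to a filter on two columns. A sketch with assumed column names and thresholds (the competition limit of 5 matches the advice for young sites below):

```python
import pandas as pd

df = pd.read_csv("keywords_checked.csv")  # phrase, exact_freq, competition

# Keep HF-LC keys: enough monthly searches, low Mutagen score.
picked = df[(df["exact_freq"] >= 100) & (df["competition"] <= 5)]
picked.sort_values("exact_freq", ascending=False).to_csv(
    "article_candidates.csv", index=False
)
```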

Mutagen is a paid service, but you can run 10 checks per day for free. Besides, the cost of checks is very low: in all the time I have worked with it, I have not yet spent even 300 rubles.

By the way, about the level of competition. If you have a young site, then it is better to choose queries with a competition level of 3-5. And if you have been promoting for more than a year, then you can take 10-15.

By the way, regarding the frequency of requests. We now need to take the final step, which will allow you to attract a lot of traffic even for low-frequency queries.

Step #5 — Collecting “tails” for the selected keys

As has been proven and tested many times, your site will receive the bulk of traffic not from the main keywords, but from the so-called “tails”. This is when a person enters strange key queries into the search bar, with a frequency of 1-2 per month, but there are a lot of such queries.

To see the “tail”, simply go to Yandex and enter the key query of your choice into the search bar. Here's roughly what you'll see.

Now you just need to write down these additional words in a separate document and use them in your article. Moreover, there is no need to always place them next to the main key. Otherwise, search engines will see “over-optimization” and your articles will fall in search results.

Just use them in different places in your article, and then you will receive additional traffic from them as well. I would also recommend that you try to use as many word forms and synonyms as possible for your main key query.

For example, we have a request - “Setting up contextual advertising”. Here's how to reformulate it:

  • Setup = set up, make, create, run, launch, enable, place...
  • Contextual advertising = context, Direct, teaser ads, YAN, AdWords, KMS (display network)...

You never know exactly how people will search for information. Add all these additional words to your semantic core and use them when writing texts.
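Enumerating such combinations by hand is error-prone, so it is worth generating them. A sketch using the synonym lists above (the lists are examples, not exhaustive):

```python
from itertools import product

actions = ["set up", "create", "run", "launch"]
objects = ["contextual advertising", "context", "direct", "adwords"]

# Cartesian product: every action paired with every object.
variants = [f"{verb} {noun}" for verb, noun in product(actions, objects)]
print(variants[:3])  # ['set up contextual advertising', 'set up context', ...]
```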

So, we collect a list of 100 - 150 key queries. If you are creating a semantic core for the first time, it may take you several weeks.

Or maybe you shouldn't strain your eyes over it? Maybe you can delegate compiling the SC to specialists who will do it better and faster? Yes, such specialists exist, but you don't always need their services.

Is it worth ordering the SC from specialists?

By and large, specialists in compiling semantic cores will only do steps 1-3 of our scheme for you. Sometimes, for a large additional fee, they will also do steps 4-5 (collecting tails and checking the competitiveness of queries).

After that, they will give you several thousand key queries that you will need to work with further.

And the question here is whether you are going to write the articles yourself, or hire copywriters for this. If you want to focus on quality rather than quantity, then you need to write it yourself. But then it won't be enough for you to just get a list of keys. You will need to choose topics that you understand well enough to write a quality article.

And here the question arises: why, then, do we actually need SC specialists? Agree, parsing a base key and collecting exact frequencies (steps #1-3) is not difficult at all; it will take you literally half an hour.

The most difficult thing is to choose HF queries with low competition. And then, as it turns out, you need those HF-LC keys on which you can write a good article. This is exactly what will take up 99% of your time working on the semantic core, and no specialist will do it for you. So is it worth spending money on such services?

When are the services of SC specialists useful?

It’s another matter if you initially plan to attract copywriters. Then you don't have to understand the subject of the request. Your copywriters won’t understand it either. They will simply take several articles on this topic and compile “their” text from them.

Such articles will be empty, miserable, almost useless. But there will be many of them. On your own, you can write a maximum of 2-3 quality articles per week. And an army of copywriters will provide you with 2-3 shitty texts a day. At the same time, they will be optimized for requests, which means they will attract some traffic.

In this case, yes, feel free to hire SC specialists, and let them draw up the technical specifications for the copywriters at the same time. But, as you understand, this will also cost money.

Summary

Let's go over the main ideas in the article again to reinforce the information.

  • The semantic core is simply a list of key queries for which you will write articles on the site for promotion.
  • It is necessary to optimize texts for precise key queries, otherwise even your highest-quality articles will never reach the TOP.
  • The SC is like a content plan for social networks: it saves you from "creative crises" because you always know exactly what you will write about tomorrow, the day after tomorrow and in a month.
  • To compile a semantic core, it is convenient to use the free SlovoEB program; you just need to set it up correctly.
  • Here are the five steps of compiling the SC: 1 - selecting basic keys; 2 - parsing the basic keys; 3 - collecting the exact frequency for queries; 4 - checking the competitiveness of the keys; 5 - collecting the "tails".
  • If you want to write the articles yourself, it is better to build the semantic core yourself, for yourself; SC specialists will not be able to help you here.
  • If you want to work on quantity and use copywriters, then delegating the compilation of the semantic core is quite possible, as long as there is enough money for everything.

I hope this guide was useful to you. Save it to your bookmarks so you don't lose it, and share it with your friends. And don't forget to download my book, where I show the fastest way from zero to the first million on the Internet (a summary of 10 years of personal experience =)

See you soon!

Yours Dmitry Novoselov

Hello, dear readers! In personal conversations with webmasters I often see a lack of understanding of how important keywords are for search engine promotion, so I decided to write a separate post on this topic. What the semantic core of a site is, its pros and cons, an example of a finished core, and tips on how to compile it correctly - that is the subject of this article. You will learn why correctly selected keywords are so important for a web resource in terms of SEO promotion, feel their importance, and see the power of using them.

Why did I write this post? Firstly, my article is intended for novice webmasters who have only just begun to get to grips with SEO promotion. And secondly, in the Yandex top 20 there is not a single article that reveals the essence of the semantic core to the searcher in detail. Let's fix that omission. I hope the Russian search engine takes it into account. 🙂

The concept of the semantic core

What is it

Each of us knows how to use a telephone directory. Whether it is a pocket alphabet version in the form of a notebook or a large printed Talmud of all the telephones in the city - the principle of its use is very simple. We are looking for a specific last name for all the contact information of the person we are looking for. To do this, we need to know the person’s full name. Usually, already knowing a person’s last name, we can easily look at all the entries in the directory and select the one we need. That is, knowing a certain word (last name), we can get all the data on a person that is recorded in the book.

The semantics of the site work on this principle - we enter a search query in a search engine (analogous to the name for which we are looking for contacts in the directory) and receive a list of the most accurately answering documents from various web resources. These documents are individual pages of websites or blogs, the content of which is optimized for the search query we are requesting, which is called a keyword. Thus, the usual semantic core is a set of search queries (keywords) that accurately characterize the subject of a web resource and the type of its activity (usually for commercial sites that offer services or goods on the Internet).

Therefore, the semantic core performs a very important function - it allows a web resource to receive users from search engines on its pages, which are necessary to perform various tasks. For example, for a commercial site such tasks may be selling goods or services, for a news portal - advertising third-party sources using contextual or banner advertising, for a blog - advertising affiliate programs, etc. That is, the core is exactly the foundation that is necessary to receive search traffic using SEO promotion. Without the correct and high-quality set of keywords, it is not possible to get such traffic.

Therefore, if we want to use the resources and capabilities of search engines to promote our web resource, it needs to have a competent semantic core. If we don’t want to receive search traffic, if we don’t need a target audience from search, then the presence of semantics for our site is not justified. We don't need keywords.

Types of semantic core

In search engine promotion there are a main and a secondary semantic core. The main core comprises the principal keywords with whose help the site realizes its goals and objectives. For example, these could be the search queries through which potential buyers of goods or services come to a commercial website from search. That is, through the queries of the main core, the site or blog fulfills its purpose; it is these keywords that turn an ordinary search user into a client.

The secondary semantic core allows the web resource to solve less significant problems or receive additional traffic to attract more potential customers. In this case, the range of keywords can be not only the main subject of the site, but also related topics. Typically used in a number of commercial sites that want to convert visitors with additional articles to sell services or products on sales pages. For example, this is what many sites that provide SEO services do. On their web resources, in addition to the main selling and explanatory pages, there is an additional large set of documents that together form a blog option.

For bloggers, the main core is a list of keywords that bring them search traffic, since this is the main task when monetizing a blog with the help of a target audience from search. And a blog usually doesn’t have secondary semantics. Because whatever the blog’s task, all traffic on all pages promoted in search engines goes to solving its commercial task. But, usually, keywords on the main topic give much more commercial conversions than keywords on a non-core topic.

What you need to know when creating a semantic core

Keywords for the semantic core are selected according to various parameters (frequency, competitiveness, etc.), which I have written about in detail in a separate post. Depending on these parameters, semantic cores are compiled for commercial sites, blogs, news portals and so on. The choice of particular values of these characteristics depends on three things: the promotion strategy, the budget allocated for it, and the topic of the site being promoted.

The promotion strategy dictates the plan for selecting keywords, focusing on the relationship of the promoted pages with the site and their quality. Firstly, the page linking scheme is important here - the number and importance of documents that will be promoted in search engines depends on its correct choice. Each web resource should have its own scheme, with the help of which there will be a better distribution of weight among the promoted documents. Secondly, you need to know the volume of content of the page being promoted (for which keywords are selected) - depending on the number of characters, one or another number of keywords is used for it.

The possibility of using competitive queries in the semantic core depends on the amount of budget. The more financial resources a webmaster can allocate to his project, the better words he can include in its semantics. It’s no secret that the highest quality promoted keywords always cost a decent amount of money to get into the top 10 for them. This means that to promote pages for these key queries, internal optimization is not enough - the purchase of external links is necessary, and material resources are needed to purchase them.

First of all, the minimum keyword-frequency threshold usable for search traffic depends on the topic. The more competitive and the narrower (in terms of popularity) the topic, the more the webmaster turns toward low- and micro-frequency words (frequency below 50-100, depending on the topic). For example, the topic of my blog (SEO and web analytics) is very narrow, and I cannot use higher-frequency keywords to build my semantic core; most of the keywords on my blog are LF and micro-frequency queries with a frequency of no more than 100. Compare that with popular topics (in cooking, for instance, LF queries reach a frequency of 1000!) and it is easy to see that getting serious traffic in a narrow topic is very difficult: you need a huge number of low-frequency (but non-competitive) keywords, which means a huge number of promoted pages.

Plan for creating a semantic core

To create a semantic core, you need to perform a number of sequential actions:

  1. Select the website topics for which keywords will be selected. Write down all the areas of activity the web resource is dedicated to. If it is an online store, list the main categories of goods (keywords are usually not collected for individual products, unless a product has its own model number rather than just a generic name). If it is a blog, list all the subtopics of the general topic as well as the titles of important individual articles. If the site provides services, the future keywords will be the names of those services, and so on.
  2. Based on the selected topics, create a list of masks - the initial queries that will be used to parse the semantic core. You can learn more about masks in another post on this topic.
  3. Find out the region where your website is being promoted. Depending on the location of the resource, some keywords may have varying degrees of competition. The narrower and more precise the region, the greater its importance in terms of promotion. For example, many keywords are not as competitive in the Russia region as they are in Moscow.
  4. Parse all kinds of search queries from the Yandex and Google search engines (if necessary) using various methods (manual, in the Key Collector program, competitor analysis, etc.). The described parsing options, for which I have written detailed step-by-step guides, can be found via the links in the block of additional articles after this post.
  5. The received search queries must be analyzed, keeping only the highest-quality ones. Query analysis will weed out the dummy words, separate competitive words from non-competitive ones, and reveal the most effective keywords.
  6. Distribute the received keywords among the documents of the web resource. This is the final stage of creating a semantic core, in which you assign keywords to each promoted page depending on the promotion conditions. Thanks to this distribution, the webmaster can immediately see the main query and the additional ones for each page, and also sketch the structure of the future content. At this point the core is divided into primary and secondary (if necessary).

After the full semantic core of the site has been created, in terms of promotion, it’s time to take on internal page optimization and then do effective linking.

Example of a ready-made semantic core (keyword table)

So that you can see the finished result, I decided to post a piece of a semantic core that was ordered from me. In this screenshot you can see a table of selected keywords (the picture is clickable):

As you can see, all keywords are distributed among articles (distribution is carried out by the customer himself or by me for a fee). You can immediately see which search queries from the table can be used as the main query, and which in the form of additional keywords. All this allows the customer to immediately implement the received keywords into finished posts without delay or make a rough plan for future posts - everything is clearly and clearly visible.

Where can you order an excellent semantic core?

If you want a full-fledged semantic core assembled for a commercial website or information project, you can order one from specialists in this work.

Pros and cons of the semantic core

As in any business, creating a semantic core has its positive and negative sides. Let's talk about the cons first:

  • To obtain a high-quality semantic core of a website, various cost options are needed. If you ordered your list of keywords from a specialist, then these costs will be a decent amount. The more words you order, the more rubles you give away. If you select keywords yourself, you can spend quite a lot of time learning how to select the most effective queries. But the experience gained and the quality of the keys will return the hours spent collecting the core in the form of excellent conversion and targeted search traffic.
  • To select and analyze the best keywords from those already parsed, knowledge is required. Not every webmaster will delve into this matter - after all, in addition to knowledge of optimization, you need to know the basics of working in Excel and be able to count (for formatting data, first of all). But, again, if all this happens, the quality of your semantic core will be many times higher than a simple set of parsed keys from Wordstat.

As you can see, there are few disadvantages, but they are significant. And if material resources are not so important, then studying the necessary material requires time, which is valued most of all. After all, it cannot be returned... Let's move on to the advantages:

  • The biggest and most important advantage is that you get a strong foundation for promoting your web resource. Without keywords, the visibility of a website or blog will be very low, which will not attract much search traffic to it. This means that the assigned tasks and goals will not be achieved. Simply writing high-quality content is not enough to attract search engine users. Search engines also need the presence of keywords in the article (in the text) and on the page (in its meta data). Therefore, creating a semantic core is the first stage, without which search engine promotion is simply unrealistic!
  • Thanks to a good semantic core (of course, taking into account its competent internal optimization), the promoted pages of a web resource will not only receive exactly the target users from the search, but will also allow the site to have excellent behavioral factors. When using bad keywords (or their absence), the bounce rate increases noticeably, and the number of targeted conversions decreases.
  • Having a ready-made semantic core at hand, any webmaster will have an idea of ​​the future of his web resource in terms of content creation. By looking at the keywords, he will see future topics of his articles on the site. That is, he can plan his activities much more effectively - give the copywriter a task in advance, better prepare for seasonal jumps, or come up with the next series of posts.
  • As a rule, a blog accumulates a large number of posts, so you need to find a lot of keywords for them. But don't take the first ones you come across: look for the best options, evaluate them, analyze the parameters of the search queries. You may spend more time, but you will end up with high-quality keywords that attract better traffic than a pile of the first queries you saw. Set yourself a goal of finding keywords for one article every day. Once you get your hand in, you will collect them much faster, covering 3-5 posts at a time.
  • You don't need to read a pile of SEO materials, chase every super-duper new optimization trick, or look for the perfect way to create semantics for your blog - it doesn't exist. Take the method you understand and use it to select keywords, but trust only verified materials about the SC.
  • It often happens that we find it difficult to come up with a topic for a future post, especially from the point of view of search promotion, when the priority is maximum search traffic. A ready-made semantic core solves this problem once and for all: at any moment (!) you can see ready-made topics for the next articles before your eyes. And not only that - you can also plan out a future series of posts, which can later turn into your own information product. 🙂

And finally, the most important advice. Don't chase big, "fat" keywords. It will bring no benefit; you will only create problems for yourself and waste a lot of time. Instead, consider my three-phase principle for selecting keywords, which I use successfully on my blog (more than 80% of my keywords are in the top 30, about 49% in the Yandex top 10, and more than 20% in the Google top 10 - and that is in my difficult, competitive topic!):

  1. Check the parsed words you have collected in your own way (for example, using Wordstat, the SlovoEB program, competitor analysis, etc.) for competitiveness - estimate their promotion cost (how to do this is explained in my free book and in a separate additional article - link after this post). Remove all competitive queries, leaving only those with the minimum cost (for example, in SeoPult these are keywords costing 100 rubles, in SeoPult Pro - 25).
  2. For the remaining search queries, rate their quality. To do this, calculate the ratio of the exact frequency to the base frequency (covered both in the book and in the post about analyzing search queries; see also the sketch after this list). Depending on the topic, exclude the low-quality words and dummy words.
  3. Now distribute the quality words you have obtained among your articles. If more than one keyword is proposed for a post, choose the strongest one - the keyword with the highest exact frequency. It will be the main query and will bring most of the search traffic for the promoted article. If two of the keywords proposed for the article are suitable for the lead role (more than two is rare), take both: write one into the title and the other into the h1.
A couple of tools deserve a separate mention: the multifunctional Key Collector platform and, for a blogger compiling his own semantic core, its free analogue - it works quickly and costs nothing. I recommend it!
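The quality ratio from step 2 of the list above fits in a few lines; here is a sketch of the idea (the 0.05 threshold is purely illustrative - pick your own per topic):

```python
def quality(exact_freq: int, base_freq: int) -> float:
    """Share of the base ("dirty") frequency made up of the exact phrase."""
    return exact_freq / base_freq if base_freq else 0.0

# base 1000, exact 40 -> 0.04: almost all the volume comes from longer
# queries that merely contain the phrase - likely a dummy key.
is_dummy = quality(40, 1000) < 0.05
```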

Before starting SEO promotion, you need to create the semantic core of the site - a list of the search queries that potential clients use when looking for the goods or services you offer. All further work - internal optimization and work with external factors (link purchasing) - is carried out in accordance with the list of queries defined at this stage.

The final cost of promotion and even the expected level of conversion (number of calls to the company) also depend on the correct collection of the core.

The more companies promote using the chosen word, the higher the competition and, accordingly, the cost of promotion.

Also, when choosing the list of queries, you should not rely only on your own ideas about what words your potential clients use; trust the opinion of professionals as well, because not all expensive and popular queries convert well, and promoting some words directly related to your business may simply be unprofitable, even if you achieve the ideal result of TOP-1.

A correctly formed semantic core, all other things being equal, ensures that the site is confidently positioned in the top positions of search results for a wide range of queries.

Principles for compiling semantics

Search queries are formed by people - potential site visitors - based on their goals. It is difficult to keep up with the statistical-analysis methods embedded in the algorithms of search engine robots, especially since they are continuously refined and improved, and therefore keep changing.

The most effective way to cover the maximum number of possible queries when forming the initial core of a site is to look at it as if from the position of a person making a request in a search.

The search engine was created to help a person quickly find the most suitable source of information for a search query. The search engine is focused, first of all, on a quick way to narrow down to several dozen most suitable answer options for the key phrase (word) of the request.

When forming a list of these keywords, which will be the basis of the site’s semantics, the circle of its potential visitors is actually determined.

Stages of collecting the semantic core:

  • First, a list of the main key phrases and words found in the information field of the site and characterizing its target orientation is compiled. In this case, you can use the latest statistical information about the frequency of requests in the direction in question from the search engine. In addition to the main variants of words and phrases, it is also necessary to write down their synonyms and variants of other names: washing powder - detergent. The Yandex Wordstat service is perfect for this work.

  • You can also write down the components of the name of any product or subject of the request. Very often, queries include words with typos, misspellings, or simply spelled incorrectly due to the lack of literacy of a large part of Internet users. Taking this feature into account can also attract additional resources from site visitors, especially if any new names appear.
  • The most common queries, also called high-frequency queries, rarely lead a person to the exact site they are looking for. Low-frequency queries, that is, queries with clarifications, work better: for example, the query "ring" returns very general results, while "piston ring" provides more specific information. When collecting, it is better to focus on such queries. This will attract targeted visitors - for example, potential buyers, if it is a commercial site.
  • When compiling the list of keywords, it is also advisable to take into account widespread slang - so-called folk names that have become generally accepted, stable names for certain objects, concepts and services (in Russian, for instance, a mobile phone has several such colloquial names). Taking such usage into account can in some cases noticeably expand the target audience.
  • In general, when compiling a list of keys, it is better to initially focus specifically on the target audience, that is, those website visitors for whom the product or service is intended. The core should not contain a little-known name of an item (product, service) as the main option, even if it needs to be promoted. Such words will be found extremely rarely in queries. It is better to use them with clarifications or use more popular similar names or analogues.
  • When the semantics are ready, they should be passed through a series of filters to remove junk keywords - those that bring the wrong target audience to the site.

Taking into account the semantics of associated queries

  • To the initial list of the SEO core, compiled from the main keys, you should add a number of auxiliary low-frequency queries - important words that were overlooked and simply didn't come to mind during compilation. The search engine itself will help: as you repeatedly type key phrases from the list, it suggests frequently occurring phrases in that area.
  • For example, if the phrase "computer repair" is entered, followed by a second query "matrix", the search engine will treat them as associated - interconnected in meaning - and will suggest various frequently occurring queries in this area. Such key phrases can be used to expand the original semantics.
  • Knowing a few main words from the core, you can expand it significantly with associated phrases using a search engine. If the search engine does not produce enough such additional keys, you can obtain them using a thesaurus - a set of concepts (terms) for a specific subject from the same conceptual area. Dictionaries and reference books can help here.

Logical scheme for selecting semantics for a site

Formation of a list of requests and their final editing

  • The key phrases making up the semantics generated in the first two steps require filtering. Among them there may be useless phrases that only weigh the core down without bringing any tangible benefit in attracting the target audience. Phrases obtained by analyzing the site's target orientation and expanded with associated keys are called masks. This is an important list that makes the site visible: when a search engine processes a matching query, the site will appear in the list of results.
  • Now you need to create lists of search queries for each mask. To do this, you will need to use the search engine that this site is oriented to, for example, Yandex, Rambler, Google, or others. The created list for each mask is subject to further editing and cleaning. This work is carried out based on clarification of the information posted on the site, as well as the actual search engine ratings.
  • Cleaning consists of removing unnecessary, uninformative and harmful queries. For example, if the list of a building-materials website includes phrases with the words "course work", they should be removed, since they are unlikely to expand the target audience. After cleaning and final editing, you get a set of actually working key queries, whose content will be in the so-called visibility zone of search engines; the search engine will then be able to surface the right page from the semantic core through the site's internal links.

Summarizing all of the above, we can briefly say that the semantics of a site is determined by the total number of search engine query formulations used and their total frequency in the statistics of hits for a specific query.

All work on the formation and editing of semantics can be reduced to the following:

  1. analysis of the information posted on the site, the goals pursued by the creation of this site;
  2. compiling a general list of possible phrases based on site analysis;
  3. generating an extended version of keywords using associated queries (masks);
  4. generating a list of query options for each mask;
  5. editing (cleaning) the list to exclude unimportant phrases.

From this article you learned what the semantic core of a website is and how it should be compiled.


