Every company is becoming more digital, and SEO goes hand in hand with that. Search Engine Optimization is the process of improving the quality and quantity of your organic search traffic and your rankings. Good SEO is highly advantageous in general: it attracts more people to your website, and if you monetize your content, your earnings can easily grow by hundreds of percent. The biggest players on the internet (such as YouTube or Facebook) pay enormous amounts of money to SEO specialists because they realize how important it is.
No matter what market you’re in, I can assure you that there are at least two or three competitors trying to steal your audience. It might seem harsh, but if you’re smart, you will do the same. SEO can help you move up in search engine results and make your links look better, which draws more attention from end users. To give you some motivation, here are two facts:
- more than 80% of organic clicks happen on the first 10 positions in search results
- more than 60% of clicks go to the first three positions in search results
Don’t be scared. Good SEO is not as hard as it seems, and it does not require a specialist sitting next to you. To help you, we have prepared a simple checklist to get you started.
Introduction to SEO optimization
Unfortunately, SEO is not something you achieve by installing a plugin into WordPress. It takes hard work and plenty of time to move your site to the first few positions in the search results.
Take GuideArea.com as an example. This website previously lived on the domain vladmarton.com, where it served as the portfolio page of GuideArea’s creator. Vlad decided to share some of his professional experience with the rest of the world. After about a year, there were roughly 20 high-quality articles, yet the site’s traffic was only about 25 unique users per day. That number was not only too low, it was also not growing.
Vlad decided to make some changes: he improved the site speed and rewrote the titles of his articles. Because his articles had a how-to character, he added the word “Tutorial” to the beginning of each title. Believe it or not, this almost doubled the organic traffic on vladmarton.com. People who look for guides on the internet very often include the word “tutorial” in their queries. Every day, 70–90% more people clicked on the search result link, simply because it looked more attractive and matched the search query better.
This is a real example of SEO optimization. There is no hard limit to what it can achieve; it is only a matter of your creativity and endurance. The traffic described above was very low to begin with, but keep in mind that doubling the traffic of a website with just 20 articles – none of them aimed at the most searched keywords on the internet – is pretty darn good.
In this article, I will aim to tell you how you can achieve a similar result, all by yourself. Let’s start with some theory.
To simplify the definition “process of affecting the online visibility”, imagine SEO as a pyramid of building blocks that you have to climb to be perfectly optimized.
The first level is the hardest to climb, but it also has the highest impact on the final result. High-quality content takes a lot of work, yet it is the biggest factor in your future success.
We’ll dig deeper into each of the factors of all the levels in the pyramid.
High-quality, unique content
… is the first and most important ingredient of an SEO-friendly website. Write about something you love. If you have no passion for the main topic of your content, your articles will never be good enough. People who visit your site will feel they wasted their time reading them and will never come back. Do you know how often, every week, I see a site in search results where I previously read a crappy article and skip it, even though the new result might be good? I do it because there is a risk that this one is crappy as well.
You can’t afford to lose your audience like this. Make your articles understandable, easy to read, and full of good points, facts, and valid, respectful opinions. Your daily traffic consists largely of returning visitors, so make them want to return!
Avoid copying content from other websites and passing it off as your own articles. Search engines are not that stupid. When they see duplicate content, they have to decide which copy to show: it is hard to say who the original author is, so one of you will be chosen and the other won’t be shown at all. I mean, why should it be? The engine already displayed the same content once. Because you were the one who copied, your article will be younger, indexed for less time, and will have lower traffic. Which one do you think the engine will display?
Your content is the most important part of SEO. All you have to do is put enough time into it so that it is worth reading.
A keyword is a term closely related to the content of your website. One of the most important steps in building your SEO strategy is doing a good job of finding your keywords – they will have a high impact on your on-site optimization.
One more note before we get to the actual research. Keep in mind that the more specific your keyword is, the more relevant and accurate your search results will be. This type of keyword is known as a long-tail keyword (e.g. keyword: hotel; long-tail keyword: cheap hotel in London).
Okay, now we know what keywords are, but how do we find them? Here are some tips:
Offline keyword research
- Brainstorming – get together with your team and brainstorm ideas. Your colleagues know which words are popular in your industry, and they interact with your customers.
- Ask your customers – who should know your best keywords better than your customers? Put together a survey and spread it out.
- Old marketing material – dig up old leaflets, ads, etc. from your collection and reuse some of their keywords – people might already have them in their minds, right?
Online keyword research
- Google autocomplete – what could be easier than using good old Google to try out the ideas you gathered from offline keyword research and see which ones are searched for the most?
- Keyword Planner (Google) – just insert the keywords you have in mind, select the location you are targeting, and click “Get ideas” – yes, it is that simple!
- Google Trends – you need to keep up! Check out the rises and falls of your keywords.
- SEMrush – this tool is great, but you need an account to use it properly. You can register for free and try 10 different searches – it is definitely worth checking out.
Now that you are done with the research, do not forget to implement your best keywords in the URL, title, meta description, alt texts, H1 headings, and body text – and you should be good to go!
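As a purely illustrative sketch, here is where those keywords could live in a page’s markup. The long-tail keyword “cheap hotel in London” from the earlier example, as well as all file names and copy, are invented:

```html
<!-- URL (permalink): /cheap-hotel-in-london -->
<head>
  <!-- Keyword at the start of the title, under ~55 characters -->
  <title>Cheap Hotel in London – 10 Places Under £50</title>
  <!-- Meta description: unique, specific, keyword included -->
  <meta name="description"
        content="Looking for a cheap hotel in London? Here are ten well-rated places under £50 a night.">
</head>
<body>
  <!-- One h1 that repeats the title keywords -->
  <h1>Cheap Hotel in London: Our Top 10 Picks</h1>
  <!-- Alt text describes the image and carries the keyword naturally -->
  <img src="hotel-lobby.jpg" alt="Lobby of a cheap hotel in London">
  <p>Body text that naturally repeats the keyword a few times…</p>
</body>
```

Each of these placements is covered in more detail in the sections that follow.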
On-site SEO optimization
This type of optimization includes all the factors that influence your website in regular search results and that are controlled by you. There are many factors that affect on-site SEO; I will describe the most important ones in this chapter.
Together with the URL, the title tag is the most important parameter. It should not exceed 55 characters. Your keywords must be present in the title tag, preferably right at the beginning.
URLs should be user friendly and should preferably contain the title’s keywords. For example, WordPress lets you replace the default article URLs with permalinks. This means an article’s URL will not be guidearea.com/article/4510, but guidearea.com/how-to-create-permalink instead.
This permalink is pleasant to the human eye. It also shows consistency between the article title and its URL – search engines may not reward this directly, but users scanning the search results will appreciate it.
URLs should not contain stop-words such as “a”, “an”, “and”, or “the”. Get rid of those – search engines do not like them.
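Stripping stop-words from a slug is easy to automate. Here is a small, hypothetical Python helper (not part of WordPress) that lowercases a title, drops a few stop-words, and joins the remaining words with hyphens; the stop-word list is a minimal illustrative sample:

```python
import re

# Illustrative stop-word list; extend it as needed for your content.
STOP_WORDS = {"a", "an", "and", "the"}

def slugify(title: str) -> str:
    """Turn an article title into an SEO-friendly URL slug."""
    words = re.findall(r"[a-z0-9]+", title.lower())
    kept = [w for w in words if w not in STOP_WORDS]
    return "-".join(kept)

print(slugify("How to Create a Permalink"))  # how-to-create-permalink
```

Note how the result matches the permalink style shown above: short, lowercase, hyphen-separated, no stop-words.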
Headings are elements that help crawlers identify the parts of your content. If you don’t include any and your article is one large paragraph, crawlers will see it as messy and unstructured, which hurts your SEO.
Each page or article should contain at least one h1 heading, and it should include the keywords you used in the title tag.
Keyword density
… means how often your keywords appear in the body text. A density of 0.5–5% is the suggested amount – if you go lower, search engines might conclude that the content is unfocused and that you are not really writing about the topic you claim to.
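Keyword density is easy to measure yourself. This short Python sketch (the sample text is invented and deliberately tiny, so its density is far above the range a real article should target) counts exact single-word matches as a percentage of all words:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return keyword occurrences as a percentage of all words in text."""
    words = re.findall(r"[a-zA-Z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

# Toy text: 3 hits out of 12 words. Real articles should land in 0.5-5%.
body = "SEO is a process. Good SEO takes time, but SEO pays off."
print(round(keyword_density(body, "SEO"), 1))  # 25.0
```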
Meta description
This description is shown on the search engine results page under the website URL. It should be specific and unique, telling a short story about your page.
Meta description does not affect SEO directly, but it is the best tool to attract a potential audience. The text should be between 150 and 200 characters and should include your most important keywords.
Avoid randomly generated meta descriptions – or worse, no description at all.
Alts are alternative descriptions that are shown when an image cannot be displayed. Search engines use these descriptions, together with the text surrounding the image, to figure out what the image is about.
Your pages and articles should point to other parts of your website, to increase traffic and lower the bounce rate. Both of these measures impact your SEO and search result position.
Use keywords in your anchor texts to describe what the linked article is about and how it relates to the current page.
These links point to other websites, pages, and articles. They can build the relevance and value of your content by offering alternative places to discover more information about your topic. It’s also a way to “contact” other websites – when you link to them, they receive a pingback telling them about the existence of your page.
Be careful! You don’t want your users to follow an external link and never come back. External linking should be done with awareness…
Off-site SEO optimization
As the heading suggests, this is the part of SEO optimization that does not happen on your website. Instead, it takes place on all the other websites throughout the internet.
Search engine crawlers go through millions of pages every day. Let’s say they crawled your sitemap and all the links in it. The engine indexed them all, and you now appear in search results. Your content might be relevant and high quality, but you still land on the 36th page of the results. (You can read about the reason for this by skipping ahead to the chapter Domain age and traffic.)
Your website is not the only one indexed by search engines. When a crawler arrives at a very popular website with an excellent SEO rating and search appearance and sees a link to your website or to a specific article, it gives you credibility – a great deal of it. The more references like this you have, the better.
You can achieve this by:
- Writing in corresponding threads on social media, such as Reddit
- Leaving comments that point to your content on other bloggers’ and writers’ articles
- Sharing on social networks like Facebook or LinkedIn
- Having people write an article or shout-out about your website
Link building is a long-term effort, but it is definitely worth looking into. Done properly, it can move your website up noticeably in search results.
Search engines do not know to what extent your website is or isn’t responsive – and I’d say they don’t care. What they do care about are time on page and bounce rate. They measure these two metrics, and they are among the factors that decide how good your content is.
Time on page is the amount of time your users spend on your website. If your content is good, your average time on page will be several minutes; if it is not, the average will be under one minute.
Bounce rate is the percentage of visitors who leave your website after viewing just one page. It is highly recommended to insert links to your other pages in each article, and to implement a suggestion box at the end of each article showing related content from your site. If you do this right, your bounce rate will be low, and that is a good sign for search engines: it basically says your content is good and should therefore be ranked higher in search results.
If your website isn’t responsive, visitors will not like what they see, and there is a very high chance they will bail – quickly and without thinking. That means each of them will generate an extremely low time on page and a 100% bounce rate.
Site structure (Error pages)
There is not much you can do about pages that do not exist on your website. They will always be requested: even if you keep your sitemap updated and no link in it points to a 404 page, visitors can still type meaningless URLs into their browsers.
As an example, try entering the URL guidearea.com/df into your browser. This URL does not point to any content here on GuideArea, and we cannot control what URLs you type. But we made sure that when you land on a “Not found” page, you see a page with the same theme as the rest of our website. It points you to a search bar where you can try searching again, and to some of our articles that might interest you.
Avoid letting your users see the ugly default 404 page. Take a look at our 404 page – what is your first impression?
By creating such an error page, you improve your bounce rate. People often leave simply because they can’t find anything. Even though that can happen with a customized error page too, there is a good chance they will instead stay and search for content, or click one of the suggested articles. The lower the bounce rate, the better rank your site gets from search engines. And by referring users to other articles, you don’t lose traffic. A small price to pay, huh?
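On an Apache server, pointing visitors at your themed template instead of the default error page can be a one-line sketch like the following (the path is a placeholder; WordPress themes typically handle this automatically via a 404.php template):

```
# .htaccess – serve a themed "Not found" page instead of the server default
ErrorDocument 404 /404.html
```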
Domain age and traffic
On GuideArea we have some articles about programming in Java. One of our many competitors – probably the biggest one – is tutorialspoint.com. When you search for “Java tutorial” on Google, GuideArea does not even show up. Well, it probably does, but chances are you’d need half an hour of paginating through the results to find a link pointing to this website. Tutorialspoint, on the other hand, holds the number one spot. Why is that?
It’s because tutorialspoint has been around for many years and its monthly traffic exceeds 20 million unique visitors. Search engines prefer to show pages with high maturity and a high amount of daily traffic. If your domain has been around for 6 months and your main topic is food, you cannot expect to appear on the first page of results just like that.
Keep your domain up. Don’t transfer to another domain. Find one you are sure you will be comfortable with for the next 50 years, and stick to it. If you keep your content at a high level and do your best to grow your audience, in a few years you might be the one showing up on the first page – and heck, maybe even in the first place in the search results.
Site speed and loading times
Nobody likes slow websites. If your article takes 10 seconds to load, your audience might get fed up with waiting and leave. And even if your website fully loads in 4 seconds in your browser, the result will not be the same in locations with a slow internet connection.
Losing your audience (and traffic) over something optimizable is a shame. There are many ways to make your site faster, and I’m not going to explain each of them. These are the least intrusive steps you should consider if your website is too slow:
- Choose a lightweight theme for your website
- Make your static files (images, etc.) smaller.
- Minify and merge your CSS and JS files
- Load your JS files asynchronously
- Use a CDN network to move your static files closer to visitors
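The minification and asynchronous-loading steps from the list above boil down to markup along these lines (the file names are placeholders):

```html
<!-- Minified, merged stylesheet: one small request instead of many -->
<link rel="stylesheet" href="/assets/styles.min.css">
<!-- "async" lets the browser keep rendering while the script downloads -->
<script src="/assets/app.min.js" async></script>
```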
There are many sites where you can test your website’s speed. The following three are among the most reliable:
PageSpeed Insight – https://developers.google.com/speed/pagespeed/insights/
GTmetrix – https://gtmetrix.com/
Pingdom tools – https://tools.pingdom.com/
If your website scores 80% or higher on both desktop and mobile (or grade A or B on GTmetrix), you can sleep peacefully. It’s very hard to reach 100% nowadays, and unless you go deep under 80%, you will be just fine. At the time of writing this article, GuideArea ranked on GTmetrix as follows:
The fully loaded time depends on many factors – one of them being your advertisement platform. Because ad loading is deferred, it adds a few seconds to the page load time. Back in the days when vladmarton.com was still active, its loading time was under 2 seconds; if you’re not loading many JS scripts, you should try to reach a similar number.
SSL certificates for HTTPS protocol
Having an SSL certificate not only enhances website security, it also provides a certain level of credibility. Search engines will take your website more seriously if it is served over the HTTPS protocol, and you avoid being penalized for not having it.
It might not have been mandatory in the past, but if you want your page to get a better score, you should think about getting a certificate. I remember times when I could not afford one because it cost over $100 a year. Nowadays, the story is completely different.
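Once a certificate is installed, you will typically also redirect plain HTTP traffic to HTTPS. A common Apache mod_rewrite sketch looks like this (nginx and other servers have equivalent directives):

```
# .htaccess – permanently redirect every HTTP request to its HTTPS counterpart
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

The 301 status tells search engines the move is permanent, so they update their index to the HTTPS URLs.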
Above are the prices from one of the biggest SSL certificate providers, NameCheap (NameCheap SSL Certificates). Even though I do not recommend going for the cheapest certificate, you can still do it, and it will cost you close to nothing.
Sitemap – communication with engine
Search engine crawlers usually land on your page and then look for internal links. If your categories are in the menu, your articles are on the category pages, and so forth, it’s very possible your website will be crawled successfully. But it also means that:
- Crawlers will have to do more work to index all your pages
- You won’t be able to filter which pages you want indexed and which you don’t
Having a website without a sitemap will hurt your overall score. Indexing of your new articles and of updates to existing ones will also be unreliable: a search engine crawler will return to your page again from time to time, but that could easily take half a year, and during this period your new content wouldn’t be available in search results.
Having a properly configured sitemap is like having a direct communication channel with the search engine. It’s like saying: “Hey, this is the list of my pages that you should index. New content usually arrives every week, so come back in a week.”
Search engines appreciate sitemaps because they make their job easier and faster. Not having a sitemap nowadays is a huge mistake.
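For reference, a minimal sitemap is just an XML file like the sketch below; the URL and values are illustrative, and in practice an SEO plugin will generate the real thing for you:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per page you want indexed -->
  <url>
    <loc>https://guidearea.com/how-to-create-permalink</loc>
    <lastmod>2019-01-15</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```

The optional changefreq value is exactly the “come back in a week” hint described above.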
Webmaster Tools & Search Consoles
As soon as you have your sitemap generated, you should submit it to search engines. As I said before, a sitemap is a way to communicate directly with a search engine – that does not change whether you submit it or not. But if you do, the engines will find it faster and more reliably.
It’s a good idea to do this for all the big players out there. Go ahead and register your website with both Google Search Console and Bing Webmaster Tools. Of course, there are many other search engines, such as Baidu, but submitting to Google and Bing should be enough for your needs. In Q4 2018, Google’s market share was about 90%, while Bing’s was close to 4%.
There is no guarantee that you will be indexed right away. To quote my own article, Why doesn’t my WordPress website appear in Google search results, where I explained why your site may not get indexed right after submitting a sitemap:
Just upload your sitemap… and be patient.
Breadcrumbs in content
So you have a sitemap that lists all of your URLs. That’s good, but not good enough. If you have nicely formatted permalinks active, search engine crawlers will never know where an article belongs on your site. Is it under the homepage? Under a specific page? Or a category?
This specific advice is not mandatory, but it definitely is advantageous. When a crawler arrives at your post or page, the only things that can help it understand where that post belongs are the URL and the referrer URL. If the post URL is a permalink and the referring URL was your homepage, the crawler will assume the post is a sub-page of your homepage.
And that is not true, because your post belongs to a specific category – but the crawler will never deduce that on its own.
That is why you should implement breadcrumbs. Breadcrumbs are like a stack trace consisting of the links that were clicked before arriving at a specific post or page. Below is an example of breadcrumbs, taken from ebay.com:
This helps crawlers understand the hierarchy of your website and may give your site a better score. On the other hand, at the time of writing this article, Amazon did not have any breadcrumbs implemented. As I said: advantageous, but not mandatory.
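One common way to expose breadcrumbs to crawlers is schema.org’s BreadcrumbList markup in JSON-LD. The sketch below is illustrative; the category page URL is invented:

```json
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home",
      "item": "https://guidearea.com/" },
    { "@type": "ListItem", "position": 2, "name": "WordPress",
      "item": "https://guidearea.com/category/wordpress/" },
    { "@type": "ListItem", "position": 3, "name": "How to create a permalink" }
  ]
}
```

Many SEO plugins can emit this markup for you alongside the visible breadcrumb trail.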
Robots.txt directive file
This file should be placed in the root of your website and should contain crawling rules. You can allow and disallow paths to different directories to let search engine crawlers know how each directory should be treated. Let’s take a standard WordPress robots.txt as an example:
```
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /xmlrpc.php
```
This file allows all of your images and attachments to be indexed. At the same time, it forbids crawlers from discovering URLs that point to protected directories. You generally do not want those indexed; otherwise, googling “site:guidearea.com” would show the public all kinds of weird results.
Helping crawlers to properly crawl your page is good SEO practice, and everyone who owns a website should follow it. If your website has no robots.txt defined and holds 100,000 files in its folders, the crawler will try to crawl every single one of them. After crawling for a long time, it will give up, since search engine crawlers have a sort of time cap per site (after which they stop crawling and jump to the next website).
This hurts the indexability of your pages. It also means that pages you really wanted indexed might not be indexed at all. It is very important to have robots.txt defined and properly configured, so you don’t lose your audience over a few lines of directives.
Don’t be afraid of SEO. If you find the time to deal with it, it can help you achieve great things. It takes a lot of work and patience to land on the first page of search engine results, but it definitely is doable.
Nowadays, there are plenty of content management systems (such as WordPress) that let you install an SEO plugin. These systems and their plugins usually take care of most of the steps I described: they create your sitemap and robots.txt file, optimize page speed, and so on.
If you follow all the steps from this guide in combination with an SEO plugin, your high-quality SEO should be up and running in no time. After that, it all comes down to the quality of your content.
PS: Here are some online tools that can help you with SEO optimization: