Mittwoch, 28. Oktober 2015

The question comes up over and over again: whether and how HTML markup can negatively impact SEO. Googlebot is indeed a smart HTML interpreter:

it is highly tolerant of HTML syntax errors,

it doesn't force websites to comply with W3C validation rules.

Nevertheless, there are some HTML misuses which can painfully hurt SEO. To introduce the topic, I refer to a pair of posts by two respected Googlers, and by commenting on those posts I list the HTML issues that cause negative SEO effects:

Dienstag, 27. Oktober 2015

Many webmasters are affected by a weird issue: Google indexes (or at least crawls) non-existing URLs. The issue doesn't depend on whether one uses WordPress or another CMS. The question of why Google crawls and/or indexes non-existing URLs appears in all webmaster forums, Google Groups and so on, but without a clear solution.

The fact that Googlebot creates and crawls a bunch of non-existing URLs raises some questions:

Where do non-existing URLs come from?

Why is it suboptimal if non-existing URLs are crawled or even indexed?

Dienstag, 13. Januar 2015

URLs with query strings can be real poison for SEO. The main and most harmful damage done by untreated URLs with query strings is an incalculable growth in the number of URLs serving the same content with HTTP response code 200 and no indexing management, also known as duplicate content. Another issue caused by query strings in URLs is overspending the crawl budget on URLs which would better be excluded from crawling and indexing.

This way, a site with untreated query-string URLs gets, on the one hand, URLs into the index which don't belong there; on the other hand, the crawl budget for good URLs can run short, because it is overspent.

There are some passive techniques to deal with query strings in URLs. Originally I planned to publish the existing techniques for dealing with query strings in URLs, and my solution for the SEO problems they cause, in my ultimate htaccess SEO tutorial, but the topic grew in detail, so I decided to create a separate article about query strings in URLs and SEO.

Existing SEO approaches to dealing with query strings in URLs

While Google says it can deal with query strings in URLs, it recommends adjusting the bot's settings in Webmaster Tools for each existing query string.

URLs with query strings can be disallowed in robots.txt with rules like

Disallow: /?*
Disallow: /*?

If the header information of the HTML or PHP files available at query-string URLs can be edited, it is possible to add rules for indexing management and URL canonicalisation, like
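A minimal sketch of such header rules (the canonical URL shown is a placeholder for your own):

```html
<!-- inside <head>: point all query-string variants to one canonical URL -->
<link rel="canonical" href="http://www.example.com/page/">
<!-- or keep the variant out of the index while still following its links -->
<meta name="robots" content="noindex, follow">
```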

These methods are mainly manual, require an unpredictable amount of work and solve the problems only partly. But the good news is: I have a universal solution which works for all URLs with query strings and gets rid of all SEO troubles caused by query strings in URLs.

Dienstag, 23. September 2014

One of the common website performance optimization techniques is reducing the number of HTTP requests. Each website asset, such as an image, needs an HTTP request to be loaded. This is the idea behind embedding website images as base64-encoded data URIs. Once an image is embedded directly into the HTML or CSS of the website, no additional HTTP request is needed to load it - it is no longer an external resource, but becomes part of the source code. This is the good part.
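As an illustration of how such a data URI is produced, here is a generic sketch using Python's standard library (my own example, not code from the post; the GIF bytes are shortened to a stand-in):

```python
import base64

def image_to_data_uri(image_bytes, mime="image/png"):
    """Encode raw image bytes as a base64 data URI usable in HTML or CSS."""
    encoded = base64.b64encode(image_bytes).decode("ascii")
    return "data:%s;base64,%s" % (mime, encoded)

# Embedding a (truncated) GIF header as a data URI:
uri = image_to_data_uri(b"GIF89a", mime="image/gif")
print(uri)  # data:image/gif;base64,R0lGODlh
```

The resulting string can be used directly as an `src` value in HTML or a `url()` value in CSS.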

Mittwoch, 17. September 2014

Certain ecommerce sites with only a few product categories and some thousands of products can generate thousands upon thousands of useless URLs through product search, product filter and product option URLs. Sad, but true. We can't act as if this problem didn't exist. Leaving such URLs unhandled would bring tons of negative SEO impact. There are only a few ways of dealing with such URLs:

to get rid of them completely,

to turn a part of the useless URLs into useful ones, and

to reduce the negative SEO impact of the remaining useless URLs.

Note! There is no magic method - no single existing SEO technique does the trick alone. What works is the combination of SEO techniques which I collect in this article.
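As a taste of one ingredient of such a combination, search and filter URLs can be kept away from crawlers via robots.txt; the `/search` path and `filter` parameter below are assumptions to be replaced with your shop's real patterns:

```text
# robots.txt: keep crawlers away from on-site search and filter URLs
User-agent: *
Disallow: /search
Disallow: /*?filter=
```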

Donnerstag, 11. September 2014

.htaccess (hypertext access) is a text file, placed mostly in the root folder of a site and invisible because of the dot at the beginning. .htaccess contains directives for the server, server software, robots and browsers about the handling of files, folders and paths / URLs.

Generally there are 2 areas where .htaccess can be used for SEO purposes:

Mod_alias and mod_rewrite directives (URL redirects and rewrites)

load time optimization
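Both areas can be sketched in a few lines of .htaccess; the domain and cache lifetimes below are illustrative assumptions, not recommendations from the tutorial itself:

```apache
# 1) mod_alias / mod_rewrite: 301-redirect an old path to its new URL
Redirect 301 /old-page.html http://www.example.com/new-page/

# 2) load time optimization: let browsers cache static assets (mod_expires)
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType text/css  "access plus 1 week"
</IfModule>
```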

Site security has, in my opinion, only an indirect connection to SEO, so I decided not to make it a topic of this article.

The last, fifth part of my HASCH OnPage SEO framework is about the SEO mission of .htaccess. I aim to create a kind of multipurpose, explained and example-illustrated checklist on using .htaccess for mod_rewrite, robots manipulation and load time optimization as advanced SEO objectives. This ".htaccess for SEO" tutorial will be helpful (for me and for you) when performing site audits and building new, strictly SEO-minded sites. Read the tutorial →

Donnerstag, 7. August 2014

Today we all learned officially from Google that an HTTPS-secured connection is now a ranking signal. Some of us will run to buy a certificate. Others will try to get one for free, e.g. from StartSSL. While we don't know at the moment how much weight the secured connection carries as a ranking signal, another issue exists for all website owners: what if somebody tries to reach your site via HTTPS, but you have no certificate installed? Any browser will raise an error, something like ERR_SSL_PROTOCOL_ERROR, and the visitor will not see your site. What to do? If you search for an answer, you will mostly be told that no redirection is possible, because the SSL handshake happens before the HTTP exchange, so any redirect will never be reached. But there is a mighty little thing named .htaccess, which allows us to make our sites visible to any visitor, regardless of which URL the visitor uses to reach them. The trick is

Donnerstag, 31. Juli 2014

The short answer is: no. There is no such metric or signal or anything of the kind that Google would measure.

The long answer is: ...yes! How? By optimizing the text-to-code ratio in favor of text, we reduce the amount of code. Less code generally means less page loading time. And that is very much a measurable signal, evaluated by Google and influencing rank.

So what now? The text-to-code ratio as a measurable number isn't relevant for SEO in any direct way. But it does matter as a first symptom of possible loading time issues related to dispensable code inside a web document. How to reduce the amount of website code?
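The symptom itself is easy to measure. A rough sketch (my own illustration, treating everything outside of tags as text):

```python
import re

def text_to_code_ratio(html):
    """Rough text-to-code ratio: visible text length vs. total source length."""
    text = re.sub(r"<[^>]*>", "", html)       # strip tags
    text = re.sub(r"\s+", " ", text).strip()  # collapse whitespace
    return len(text) / len(html) if html else 0.0

page = "<html><body><p>Hello, world!</p></body></html>"
print(round(text_to_code_ratio(page), 2))  # 0.28
```

A falling ratio on a content page is a hint to go looking for inline styles, scripts and markup bloat that could be externalized or removed.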

Samstag, 26. Juli 2014

How to relaunch a website keeping ranking and position in the SERPs

Recently, as I realized, most of the interest in the topic "how to relaunch a website" comes from Germany, although I don't think this topic is interesting only there.

One fact is certain: any website relaunch means losses. Losses of traffic, of ranking, of position in the SERPs, which in the end are losses of money. I've seen losses of 4% and of 40%, and those aren't the final numbers. The main objective of SEO is to minimize such negative impacts.

In short, every website relaunch is a change of design and/or content structure. If the content structure changes, the URL structure definitely changes with it. So there are generally 2 fields where the new website version could meet poor acceptance, followed by a loss of traffic:

users could dislike the new design,

search engines could become picky about indexing the new URL structure, and some pages will fall out of searchers' scope.

So how should we act during a relaunch to come out of the relaunch battle as a winner with both our visitors and the search engines? Look:

Mittwoch, 2. Juli 2014

If your task is to optimize a PDF file for search engines, you must ensure that your PDF file is text-based and not image-based. To do so, create your PDF file with a text editor like LibreOffice / OpenOffice or the like, and never with an image editor like Photoshop.

The SEO-for-PDF procedure isn't tricky, but the optimization quality depends vitally on your general HTML and SEO knowledge:

Freitag, 27. Juni 2014

Each website begins with the header. There are no minimal or required values inside it: the header content depends fully on our goals and our laziness :) Another cause of poorly designed headers is the belief that Google doesn't understand meta tags or doesn't take meta tags as a ranking factor. Well, Google says it understands just a small set of the available meta tags. Here is Google's statement about its understanding of meta tags. But I say:

use not only the meta tags that Google definitely understands,

use meta tags extensively,

be redundant in your use of meta tags.

Why? Just because the web contains much more than only Google and you. There are a great many bots and spiders on the internet whose goal is to parse web content and build various indexes. Their parsing criteria can include a bunch of the existing meta tags, and their parsing results can in turn be parsed by Googlebot. So Google gets into its index the content from your meta tags which Google claims not to understand.

Good, now we agree about the benefits of using meta tags. The main SEO point is utilizing header rules to give the search bot as much information about our site as possible. Now I list the meta tags one by one and give the correct syntax, some possible values, and my view of the practical SEO effects.
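As a hedged example of such a generously filled header (all values are placeholders, and the selection is mine, not the article's full list):

```html
<head>
  <title>Page title - roughly 50 to 60 characters</title>
  <meta charset="utf-8">
  <meta name="description" content="Short summary shown in search snippets">
  <meta name="robots" content="index, follow">
  <link rel="canonical" href="http://www.example.com/page/">
</head>
```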

Donnerstag, 26. Juni 2014

Some C++ programmers dream of, or even try, creating their own operating system. The creation is meant to be the best of the existing OSs and to demonstrate the state of the art. Some PHP programmers dream of, or even try, creating their own content management system, targeting the same. Why do they want to do it? I guess such creations would include all the best-practice examples, would get rid of all existing bugs and misbehaviors, and would systematize and list all current knowledge.

OnPage SEO is a sophisticated knowledge area, with a great deal of unsystematized and unvalidated knowledge from many different knowledge segments, like web design, web development, server administration, linguistics, marketing and psychology. With the HASCH OnPage SEO framework I aim to systematize the OnPage SEO knowledge and to get rid of its unvalidated parts.

So let's get down to the nitty-gritty:

HASCH: the OnPage SEO framework

PS:
I'm sure this framework will be a good help, cheat sheet and rulebook for everyone who performs SEO audits or creates SEO-minded sites. The nature of SEO is such that this SEO framework will never be finished, so it will always be in public beta and continuously updated. I'm very happy about any additional advice you would share with me!

Dienstag, 1. April 2014

What does a SEO need to know?

The web is flooded with infographics and cheat sheets. Somebody once claimed that such giveaways are good for SEO as link-building assets, and now everybody makes some, at least as copies and shares. I will not speculate about whether or how many of them bring real value; imo most of them are redundant. But my personal biggest problem with them was: THE cheat sheet was NEVER at hand when it was really needed (at least for me). Indeed, the sense and convenience of cheat sheets lies in their being there just in time, at the moment one needs them. So I decided to create a collection of all the cheat sheets I have ever used in my SEO activities and share it. This cheat sheet suite is evergreen knowledge, hints and tricks, which will always be helpful. Surely this knowledge isn't enough to call oneself an expert, but for somebody who does SEO, especially technical SEO and OnPage SEO, these cheat sheets will render a great service. And for somebody who is learning SEO at the moment, they give a great summary of the things that must be learned. These cheat sheets already cover all the essential knowledge segments a SEO brings into action daily. Before publishing, I reviewed all the cheat sheets to find a possibly fresher version - for some of them I indeed found one.

Montag, 17. März 2014

Insert scripts properly into a Blogger template's head

There are many how-tos for implementing third-party scripts in Blogger templates, mostly custom CSS and JavaScript. I personally use SyntaxHighlighter and Google Analytics. But most how-tos advise users to insert scripts into the template's head. This approach goes against all best practices for optimizing a site's load times. If scripts are inserted into the head, the site's content will not load until all scripts are fully loaded. This negatively influences the whole site's load time, which, as you know, is an important ranking signal. So there is a strong dependency: more scripts in the head - longer load time - poorer ranking.

My advice for you (I tested it myself without any issue): insert scripts at the bottom of the body, just before the closing tag. If something doesn't work, you can still move the scripts back into the head one by one.
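In markup terms this means, roughly (the script paths are placeholders for your own):

```html
  <!-- ... page content loads first ... -->
  <!-- scripts go here, just before the closing body tag -->
  <script src="scripts/syntaxhighlighter.js"></script>
  <script src="scripts/analytics.js"></script>
</body>
```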

Samstag, 1. Februar 2014

We agree that assets like HTML, CSS and JavaScript are better off minified. If not, YSlow, Google PageSpeed test and similar tools will give your site fewer points and advise you to minify. Enough has been said about the importance of load optimization for SEO, so let's look at what we can do and what we can use to achieve the best possible result.
I tested 17 free tools to minify JavaScript online: after minifying with the best tool, about 43% of the code remained; the "worst" tool (not really worst) left about 51% of the code. Minifying is worth it. Minifying CSS gives about 60% less code - so it is worth minifying too. Read further - I list all the tested online minifying tools with stats, plus a few server-side traffic-saving hints:

Donnerstag, 3. Oktober 2013

It's not the kind of trick you can use at every corner. But this hint could help you diversify and enrich your assets and increase the spread of your source. But don't misuse it - you could quickly be penalized ;) Google doesn't like it when you get too clever. I'm talking about imagemaps, the links inside imagemaps, and making use of imagemaps for SEO purposes.
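For reference, an imagemap is simply an image with clickable, linkable regions; a generic example (URLs and coordinates are made up):

```html
<img src="overview.png" alt="Site overview" usemap="#nav">
<map name="nav">
  <area shape="rect" coords="0,0,100,50"   href="/products/" alt="Products">
  <area shape="rect" coords="0,50,100,100" href="/contact/"  alt="Contact">
</map>
```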

Donnerstag, 12. September 2013

You surely want your site to appear in the relevant SERPs with its images. There are some techniques which will enrich your images and let them be indexed better. Besides this, we'll talk about image size optimization - size matters, you know.
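The simplest of these techniques is descriptive markup around the image itself; a generic illustration (filename and dimensions are made up):

```html
<img src="images/red-running-shoes.jpg"
     alt="Red running shoes, side view"
     width="400" height="300">
```

A descriptive filename, a meaningful alt attribute and explicit dimensions give image crawlers something to index and let the browser reserve layout space before the image loads.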