Build More Links by Being More Human

Building links to a website is one of the most crucial aspects of online marketing, because building links means you rank higher in Google's search engine, and Google still uses links as its main factor for ranking a website.

The problem with link building is that it is more a problem of human persuasion than anything related to a mechanical algorithm.

The process of link building causes us to naturally forget that websites don’t actually give the link. It’s people who link to websites, and it’s people who can be emotional, bear grudges, act irrationally. It is not some objective piece of computer code that gives the link. It is a human being who has to enter the URL into the HTML code and cause the Google bot to sit up and notice.

It is because of this that the soft arts of influence and persuasion are profoundly important when employing a link building strategy.

The recent changes in the way Google ranks websites have caused people to change the way they create content on their websites and the way they get links. You can no longer rely simply on scaled, farmed content to create your links, and so the alternative is to create content that is good enough to be read and linked to by people.

There is a way to increase your link building effectiveness without having to focus purely on content, although you still need to create content for it to work, and that content does need to be useful, interesting, and attractive to people.

The solution is to become more popular within the niche you operate in and to focus on attracting the influential people who give links, rather than on the links themselves.

In other words, become more human in your link building.

You could of course follow the mantra of “Build it and they will come”. But I do not recommend this.

Much better to persuade, influence, inspire and excite people into linking to your content.

Link Building with Content is a Relatively Simple Process:

Create content

Tell people about the content

People link to your content

However, anyone who has attempted this knows that, in practice, it is not so easy.

That is because, by definition, most people are average and do not get noticed, while others have huge buckets of popularity to whitewash their content with.

Simply creating content, telling everyone you know and posting it on every social media account may not be enough.

In fact, if you look at the average person's ability to influence, you will notice that it definitely will not be enough.

By increasing your popularity you can help solve this problem.

Whom Do We Want Links from?

We want links from sites that will make a difference to our own rankings and allow real people to follow the link to our own websites.

We can use all manner of tools to assess authority initially. I use the Site Explorer tool to get a quick snapshot of the authority of the site, along with a visual check of the website's social media.

I try to find the people behind the site; if they are not hiding and can easily be found, that’s a good thing.

Social media is still a good way to determine who is a player and who is just faking it; by cross-referencing a number of social media platforms you can get a good assessment of a person's standing.

I check out their main social media accounts, if they have them; if they do not, I downgrade their importance.

Twitter

I note the ratio of how many people follow them compared to how many they follow. If I see someone who only has 5k followers and they are only following 50, I will be taking more interest in them.

However, if their Twitter account has 120k followers and they follow 100k, I usually wonder what tool they used to inflate their account and if their following is actually real.

LinkedIn

LinkedIn is a bit more useful in this regard as it is difficult for someone who is not popular to game it. Although it still has its issues, LinkedIn is very useful to determine whether someone is a player or not.

Facebook

Facebook is a great way to find out who the real person is, but you have to factor in the constant stream of cats, sunsets, births, deaths, aches and pains, which adds to the noise rather than giving you something specific. But because it’s mostly real people, it is a data mine for those who spend time there.

Google+

I don't really take Google+ seriously in this regard at the moment, but that could change.

I am deliberately glossing over each social media platform, as each one could have its own book, and a quick look at Amazon tells you that they do.

Be a Human

Once you have found out who the person behind the website is and dug through their social media accounts for information about them, you will be able to see if they give links and if those links are worth getting.

You will now have a good profile of the linker and know what they like and what they don’t like and what they will link to and what they will not.

If you spend a lot of time in your niche, you will not have to go through this process for long: you soon build up a nice list of contacts whom you are able to persuade to help.

Beware the Easy Route

The problem for most is that when they get to this point the temptation is high to simply send an email blast to all on their list with the same, generic content.

I see this every day in my email inbox and a lot on LinkedIn. I replied to one of these spam blasts to tell the sender he was ruining his reputation after he visibly CC'd in about 200 names. He replied that it was all good, as he got a bit of business out of it.

He may well have got a bit of business, but had he taken the time to craft individual messages to the people with whom he had a familiar connection, his response might have been a lot greater.

Increasing Response Rate by Being Human

Human beings exist and operate within communities; in fact, our brains are hard-wired to function best in communities of about 150 people, which is roughly the size of a small village.

There is a level of intimacy missing from online communication: what you gain in immediacy and numbers, you lack in personal, face-to-face, one-to-one communication.

The goal is to aim for a village of close and trusted friends, no more than 150, plus a wider group of fans and followers of about 1,000.

Don't bother trying to swallow gallons of superhero guru juice. These people are outliers; their success cannot be replicated, and you must travel your own path.

1 Rabid Fan at a Time

Your goal is to make one person like you more today than he/she did yesterday.

There are a number of ways to do this:

Do good things without expecting something back

Have good manners

Don’t be a bore

Be nice

Be useful

Be helpful

Don’t be creepy

Avoid politics

Avoid religion

Avoid spoilers for Game of Thrones

Conclusion

Be human: treat online interaction by the same rules as you would offline interaction. You wouldn't go up to a stranger in the pub and ask for a fiver.

You have the tools and ability to be popular, and there is no big secret to it, once you realize there is no shortcut to genuine popularity.

Google Penguin Recovery Success Using 301 Redirects and 404s

This is a TRUE & SUCCESSFUL story from Dale Gillespie, one of our customers.

"Below you will see an interesting recovery story from one of our great customers, Mr. Dale Gillespie. Happily, he chose to share with us his experience regarding a Google Penguin recovery and a successful implementation of a 301 redirect strategy mixed with 404s on pages affected by unnatural links. It's a great story about perseverance, patience and thinking ahead!"

I'd like to share with you my experience regarding a Google penalty that we received, and tell you how we managed to get rid of it and get back on track again. It was a pain of a job, to be honest! But it was made drastically easier, yet still laborious, by perseverance and the use of cognitiveSEO.

1. Multiple Google Penguin Penalties

We were aware of the fact that we had some SEO problems with our existing domains, but in the spring of 2013 things became clearer as all of our domains were hit with algorithmic penalties.

As a multimillion-turnover and long-established car dealer group, we were simply badly equipped for the internet revolution and had taken bad advice. We ended up with all of our sites having directory links and potentially bad links in their profiles.

The sites that we're talking about are:

Jennings-ford.co.uk

Jenningskia.co.uk

Jennings-mazda.co.uk

Jennings-Seat.co.uk

Jenningsseat.co.uk

Below are the screenshots taken from Analytics, where the drops can be easily spotted.

The smaller sites were worst affected by the Google Penguin update from May 2013, and other Penguin updates had caused minor drops in traffic. The stronger, more established sites weathered the storm, but the drops were definitely affecting business.

Google Analytics Screenshot – Jennings-mazda.co.uk Ranking Drop

Google Analytics Screenshot – Jennings-Seat.co.uk Ranking Drop

2. The Rebranding Decision

With all our domains penalized by Google Penguin, something needed to be done in order to recover our rankings, so we started working on some plans of action. A marketing strategy was agreed to rebrand as "Jennings Motor Group" and move all our penalized sites under one new domain: www.jenningsmotorgroup.co.uk.

The main concern with doing a 301 redirect of all sites to a new domain was the possibility that the penalties would follow and affect all of the new domain's traffic and search rankings.

Also, our domains had real-world value, with over 14,000 cars on the road across all the brands that we sell. They had solid natural search that could not be abandoned, backed up by the domain names, press, radio and other reinforcement. Therefore, among our worries was how to make sure that any branding change wouldn't affect our hard-earned SEO value.

We wanted to tell both users and search engines that our original pages were no longer relevant and that the most relevant and up-to-date information could be found on a brand new page, but also to transfer our old domains' search engine rankings to the new web address.

Car buying requires about 10 website visits before a decision, and even after a decision is made, people keep coming back to look at the car before they get it physically. So maintaining our brand's authority and awareness was crucial for us.

3. The Implementation of the 301 Redirects

After analyzing the situation, the agreed process was to implement a 301 redirect.

The process was implemented around the 8th of May 2014, and the 301s were done on a page-to-page basis for each website individually. We also 404'd some pages, and we had to do further tidying up after this impacted traffic. So, long story short, we combined multiple domains into one single motor group domain, which meant tidying up the backlinks of each site individually and then 301ing it into the main site.
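
To make the mechanics concrete, here is a minimal sketch of what such a setup can look like in an old domain's .htaccess on Apache; the paths are illustrative placeholders, not the actual pages involved:

    # Page-to-page 301s from the old domain to its equivalent on the new one
    Redirect 301 /used-cars/ford-focus http://www.jenningsmotorgroup.co.uk/used-cars/ford-focus
    Redirect 301 /contact-us http://www.jenningsmotorgroup.co.uk/contact-us

    # Pages targeted by unnatural links are killed off instead of redirected,
    # so they pass nothing on to the new domain ("gone" returns a 410,
    # which signals intentional removal even more explicitly than a 404)
    Redirect gone /spammy-landing-page

The point of the split is that a 301 passes users and link equity along, while a 404 or 410 deliberately drops the page and everything pointing at it.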

We then had to factor out all the other activities that create solid organic traffic: real-world radio, TV, press and so on. Our organic traffic seldom drops off a cliff, because we constantly ramp up ads, which in turn generate increased organic traffic. We now needed to establish Jenningsmotorgroup.co.uk as a standalone site, do some natural link building and do some content marketing for it. Because we retained the legacy of the redirected domains, we also needed to do some cleaning in the link yard. We started cleaning the new profile and building on the value of JMG with high-quality content and unique PR activity.

4. The Immediate Penguin Recovery Results

The first 301s occurred in August 2014, producing a burst in visibility, but, as expected, a rapid decline followed, resulting from the combined effect of all the bad links. The Google Penguin recovery was not done yet!

5. The Google Disavow "Dilemma" and the 404 Recovery Technique

Although sustained link building, together with quality content and correct 301 redirects, was in place, there were still bad links in our profile, so we considered the possibility of submitting a disavow.

On the other hand, we thought that every site has some spammy link ratio, and that maybe it was more important to get our profile as clean as we possibly could. Using cognitiveSEO to identify the bad links, contacting the linking sites to get them taken down and creating a good content strategy should do just fine for our profile. We had created the mess, so we had to clean it. Disavow was not a route we wished to take, but more of a last resort.

Until now, we have not done a disavow, but we keep in mind that it is an option, although we avoid it by killing off some of the internal pages with 404s rather than 301s.

6. A Bright Future Ahead of Us!

If you look at the SEO visibility over a longer period, from, let's say, August last year until today, you will see the huge improvements we've made. The big backlink tidy-up that was done using cognitiveSEO, with constant monitoring, got us here. This year has kicked off with some good, solid traffic improvements, and we are enjoying solid traffic levels.

Monitoring is a never-ending job. Yet, after all these efforts, we can now proudly look at our link history chart and see a promising evolution of our OK links. We no longer have unnatural links in our profile, and the suspect links trend is descending. As I said at the beginning of this post, it was a pain of a job, but we managed to get things right after all, with a lot of hard work and the right tools.

Disclosure: This is not a paid post, and cognitiveSEO didn't make any kind of agreement with the author. This is the successful Penguin recovery story of Dale Gillespie, written and documented by himself.

Dale has 20 years of experience in the motor trade in systems, sales, marketing and analysis roles. He is responsible for heading up the internet car sales team, along with online automotive marketing and website development, for JenningsMotorGroup.co.uk, and recently refreshed his skills with a Post Grad in Digital Marketing in 2012/13.

Defend against SERP Hijacking before You Lose Your Rankings

Site hacking poses a constant threat to webmasters and online business owners. Whether you're just curious about how to deal with hacking, or you've unfortunately received a "Notice of suspected hacking" message in Google Webmaster Tools, in this article I'll try to give you some tips on how to deal with it and even how to prevent it in the future. Even though Google does what it can to stop spam and SERP hijacking, black hatters still use this shady tactic to rank up various sites. The hacking can basically go two ways: one is inserting links and keywords pointing to a business the hacker wants to rank higher, and the other is inserting malware with the purpose of infecting end users to gain their financial information. This article will tackle the first kind of hacking, the one we call, in the online world, SERP hijacking.

In this dreaded circumstance, a hacker gains access to your website and inserts keywords and links pointing to his business. Worst of all, these links and keywords are usually hidden from the webmasters, and the website looks the same way they're used to seeing it day after day. Only when the pages are accessed from the SERPs is a totally different website displayed: the one inserted by the hacker. This leads to huge losses in traffic, can even bring your own website a Google penalty, and will give you a lot of work to do in order to recover from the damage. Luckily, there are ways to catch the hacking before it does too much damage, and even before Google takes note of it and starts throwing penalties left and right.

Why Is “Hacking Sites for Rankings” Such a Profitable Business?

Even though Google wages a constant war on hacking through regular algorithm updates, hacking for rankings is still used today. And why is it so often used? Unfortunately, because it works. Simply take a look at the Google search results for highly competitive terms like "buy viagra", "buy cialis" or "payday loans".

As you can see for yourself, of the first 5 results, 3 are hosted illegally on servers that have nothing to do with Cialis. They latch onto high-authority, innocent domains to climb the search engine ladder.

Due to the illegal nature of this technique, most people began to call these hackers crap hatters.

This kind of hacking is very appealing to these crap hatters. Clicking the results will redirect users to completely different pages with the help of a cloaking script and clever HTTP_REFERER identification. As stated before, the problem here is that the webmaster will only see the hacked content by accessing the site through the search results page. This can lead to late detection of the hacking, which, in turn, leads to loss of traffic and potential Google penalties.

Only after a closer inspection of the URL can we see that the domains don’t have anything remotely related to drugs:

This technique is called Parasite Hosting, and it's defined as hosting a webpage on someone's server without their consent. The hackers usually target high-authority domains that rank really well in Google. The planted pages point to a certain site that the hackers wish to promote, boosting its position in the SERPs. As the name suggests, these pages are parasites, feeding on the high authority earned by other websites in Google.

Luckily, Google is working hard to fix this issue. Its latest Payday Loan update took things down a notch, but there is still a big delay between the indexing of the hacked site and the moment Google uncovers the scheme. Hackers who use this technique still reap the benefits of ranking in the top 10 of the SERPs, as Google just can't detect them right away. In the end they will be uncovered, and penalties will be applied.

You will always find this technique used together with other types of black hat SEO strategies for better performance. Parasite pages do have the power to influence rankings on their own, but if someone goes through the trouble of hacking a children's cancer foundation website, you can be sure other black hat techniques are also being used to promote their business.

Another black hat tactic that goes hand in hand with parasite hosting is Google bombing. This technique is not in the least new, as it was first encountered in 1999, when a search for "more evil than Satan himself" returned the Microsoft homepage as the no. 1 result. A simple definition of Google bombing is: heavily linking with the same anchor text, from many domains, to the same URL. This results in increased rankings for the linked page on the desired keyword. Google bombing quickly evolved from a simple prank to an often-used black hat technique for boosting the rankings of certain websites.

Google tries to stop Google Bombing from working, but black hatters still use it to their advantage these days.

Let’s take an example and see how it performed together with parasite hosting:

As you can see from the screenshot, this URL has zero history, which shows that it's a new page, and yet in the last few days it has generated huge spikes of new links. This seems abnormal, so a more in-depth look at the domain's past link gain is required.

The website has had strong, constant link building going back to 2008. The domain built solid authority, making it a perfect target for some black hat action. The last-90-days graph on the right raises suspicion, since in only a couple of months the domain gained thousands of links. Let's take a look at where the links come from:

The newly generated links are actually parasite doorways: doorways, in fact, to the landing pages of drug stores selling Viagra. This kind of action does two things:

it redirects all the organic traffic from your website to the hacker's page;

once Google discovers this, it penalizes your website, resulting in more lost traffic, a decline in trustworthiness and headaches for your business.

Hacked Sites in the SERPs: A Closer Look

I gave you some examples of each of the black hat techniques above, but we should take a closer look at what a hacker does to your site. A hacker can gain access to your website either by finding out your login information or by taking advantage of some vulnerability in the widgets or plugins you are using on your website.

Let’s take the example of Community Partners with Youth:

We found out above that it appears in the SERPs for the "buy cialis" query. Accessing it directly (not through the search engine results page) displays a normal website, nothing too shady at first glance. To see what pages are indexed by Google, we can do a "site:cpymn.org" search. Let's take a look at the results:

Using this query is a great way to bring up all the pages indexed by Google and to spot for yourself, even before the search engines, any potential hacking. As you can clearly see, on the first page of results alone there are 5 hacked pages. If they are removed before the warning from Google, you might not even get penalized. Let's take a closer look at the code to see if we can spot the hacked content:

Here we can clearly see the not-so-hidden "HiddenDiv" filled with keywords and links pointing to websites that have nothing to do with the cpymn.org domain.
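
As a hypothetical reconstruction (the div name comes from the example above; the URLs are placeholders), the injected markup typically looks something like this:

    <!-- invisible to visitors, but crawled and indexed by Google -->
    <div id="HiddenDiv" style="display:none">
        <a href="http://pharma-store.example.com/cialis">buy cialis</a>
        <a href="http://pharma-store.example.com/viagra">buy viagra online</a>
        cheap cialis, viagra without prescription, payday loans
    </div>

Because the div is styled as invisible, the site owner sees nothing unusual in the browser, while GoogleBot reads and counts every keyword and link inside it.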

Hacking is a huge problem but you can deal with it if you catch it early on. Let’s move on to the prevention part.

How to Identify If Your Site was Hacked for Ranking Purposes

The first way to identify the problem is closely related to what I did in the example above. Simply set up a Google Alert for the query "site:domain.com". This way you will be instantly notified of any new content indexed by Google, and if that content looks shady to you, you can take the necessary action to remove it as soon as possible, even before Google detects that it is actually a hack.

To set up an alert in Google you can access this page. In the query field, insert "site:yourdomain.com"; I recommend setting the frequency to "As-it-happens" to get notified instantly. You will receive the alerts in your email account, and hopefully you will have enough time to react to them should worst come to worst.

The second method of identifying the hacking is with the help of Google Webmaster Tools. Inside the tool you can set up email alerts for when Google detects that you've been hacked.

Unlike the first method, in which you notice the hacking before Google does, in this one it's Google who discovers the hacking for you. It might be a bit later, and action against your site could already have been taken, but it's still a good way to discover it.

How to Secure Your Site

But as usual, the best way to protect yourself from hacking is making sure your website is unhackable (or really, really hard to hack). It's worth taking the extra steps in order not to be an easy target for hackers.

1. Block WordPress Login

Start by limiting access to the "wp-admin" folder in WordPress. Allow only a selection of IP addresses to access the WordPress login, and redirect everyone else to a page of your choosing.
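
A minimal sketch, assuming an Apache server, is to drop the restriction into the .htaccess of your WordPress root; the IP address and the fallback page below are placeholders:

    # Lock the login page down to known IP addresses
    <Files wp-login.php>
        Order Deny,Allow
        Deny from all
        Allow from 203.0.113.42
    </Files>

    # Serve everyone else a page of your choosing instead of the login form
    ErrorDocument 403 /welcome-visitors.html

With this in place, a request for wp-login.php from any other address never reaches WordPress at all, which also blunts brute-force password attempts.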

2. Use Only Trusted WordPress Plugins

I can't stress this point enough: WordPress plugins have been a target for hackers since they were first introduced. In the summer of last year we witnessed more than 50,000 websites being hacked, thanks to a WordPress plugin that had major security gaps in its code.

Unfortunately, there is no "trusted plugins only" tab in the WordPress plugin section, but before installing any plugin on your website, take some time to research it and its developers, and see what other people think of it. This 30-minute research may save you many more hours in the future, should you otherwise have to recover from a hacking.

3. Have a Backup Plan

There is great stress in having an online business and knowing that it could be hacked at any moment. A great treatment for this stress comes in the form of daily backups of your website. You surely don't want to wake up in the morning and, by the time you take your first sip of coffee, notice that your life's work is off the grid. Such a waste of a good coffee.

Luckily there are many ways of backing it up. And the backups can be done automatically.
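
One common approach, sketched here under the assumption of a typical Linux host with a MySQL-backed site (all paths, names and credentials are placeholders), is a small shell script that archives the web root and dumps the database:

    #!/bin/sh
    # site-backup.sh - keep one dated copy of the site and its database per day
    DATE=$(date +%F)
    tar -czf /backups/site-$DATE.tar.gz /var/www/html
    mysqldump -u backup_user -p'secret' wordpress_db | gzip > /backups/db-$DATE.sql.gz

Scheduled from cron with a single line, it runs every night at 3:00 without you ever thinking about it:

    0 3 * * * /usr/local/bin/site-backup.sh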

These few lines of code can give you the good night's sleep you deserve. It only takes a couple of minutes to set them up on your server, and you've covered the worst-case scenario.

4. Use a Website Firewall

A great tool for adding extra protection to your website is provided by the Sucuri team. This is one of those extra steps mentioned earlier. Not only does it give extra protection against hacking attempts, it also provides defense against DDoS attacks. As a plus, it offers load balancing and high availability for your websites.

Setting up the Sucuri solution for WordPress is easy. All you have to do is install the plugin from the CMS dashboard, activate it and configure it to your liking. They even provide a quick guide for it here.

How Does Google Tackle the Hacking Problem?

Above are the extra steps that you can take to prevent hacking. Google also provides its own security tools to help out webmasters:

Google's Safe Browsing service diagnoses your website for malicious software. The Safe Browsing tool enables apps to check URLs against Google's constantly updated lists of suspected phishing and malware pages. Using this service, users are notified before clicking on links that lead to malicious pages, which also stops hackers from profiting when they plant links to known phishing pages on your website.

How to Recover a Hacked Site

No matter how quickly a webmaster fixes the security issues, recovering from the penalties and the hacking takes a while. Google is trying to make the recovery process easier for webmasters with features like Security Issues and a special section on the forums designated for hacked sites. They shared two interesting recovery stories in their latest blog post.

The first case was a WordPress user who received a notification in Google Webmaster Tools about their website being labeled as hacked. The webmaster found his code filled with "buy viagra" and "buy cialis" keywords. Even after the webmaster's cleanup, Google warned him that there were still hacked scripts hidden inside the .htaccess file. Only after fixing that issue did Google remove the hacked site label.

The following are the steps the webmaster takes in order to keep his website clean:

Keep the CMS updated to its latest version. The same goes for the plugins;

The second case presented in Google's blog post is one in which a business owner had a hacker plant some pages on his domain. Hidden from GoogleBot, the pages were untraceable using the "Fetch as Googlebot" feature in Google Webmaster Tools. The webmaster finally fixed his issues and, after a couple of reconsideration requests, Google removed the hacked label. Here are the steps the webmaster now takes in order to avoid being hacked:

Use SFTP when transferring files;

Secure access to the .htaccess file (see the sketch after this list);

Be vigilant and look for new and unfamiliar users in the admin panel or any place where users can modify the website.
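
On the second point, a minimal sketch for Apache looks like the following; most default Apache configs already ship an equivalent rule, so treat this as a belt-and-braces check rather than new functionality:

    # Deny web access to .htaccess itself and any other .ht* file
    <FilesMatch "^\.ht">
        Order Allow,Deny
        Deny from all
    </FilesMatch>

Restrictive file permissions on the server complete the picture, since the rule above only covers access over the web.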

Consequences of a Hacked Site

The aftermath of a hacked site is always hard to deal with, but it's not impossible to recover from. Let's take this example from a forum member of WebmasterWorld.

In his case, just like in many others, the hack came from a pharma network, which started inserting viagra/cialis pages on the user's server. The hackers hid the pages from users, making them visible only to the search engine bots; they could have used a form of hacking like the one I explained above. The website was originally a WordPress site; the hackers managed to move it to Drupal and stopped inserting additional pages on the server after the migration.

The webmaster did not notice the hacking before Google did, and the site was hit by two Google penalties: one for the hacked pages, and one for unnatural links. Only after the Google penalties were issued did the webmaster take action to remove the links from the webpage, disavow the unnatural hacked links using the disavow tool and send the first reconsideration request to begin the recovery process. As you can see from his report, Google did remove the unnatural links penalty. It was only after the second reconsideration request that Google removed the hacked pages penalty.

We can draw some conclusions from the mishaps of this webmaster. Google understands that the unnatural backlinks pointing to your website are actually aimed at the hacked pages. Even though it penalizes both the unnatural link profile and the hacked content, Google is willing to remove the penalties after you have taken action to properly fix the problems on your website.

Conclusion

Going through a hacking problem is one of the most stressful and complex issues a webmaster will ever face. Even if your business doesn't rely too much on organic traffic, the penalties received from Google and having a note on your website that says "This site may contain malicious software" will ultimately bring heavy losses. I don't want to sound too pessimistic about it, but hacking is a constant threat to online businesses. It's better for webmasters to spend a couple of hours making sure the website is as secure as possible than to spend weeks or even months recovering from a hacking.

It's encouraging to see that Google helps a lot in this matter; with the constant updates made to the algorithm, they will catch the hacks faster. Also, using the security tools available will make your website more secure, and, most of all, you will have a backup plan to fall back on if worst comes to worst.

So what other ways or tools do you use for extra defense against hackers?

Your Digital Marketing Dashboard Now Supports 3rd Party Metrics

As you may have gotten used to by now, we at cognitiveSEO are ceaselessly working on improving our existing features and adding new ones in order to offer the best business solutions for our Cognitives, meaning our great customers. You might already be familiar with the fully customizable SEO Dashboard that we launched around half a year ago, and, as we believe that innovation is the key to success, we've added some new features to it. We added more widgets to our already famous SEO Dashboard in order to give you more customization power. This update is about how quickly you can get your data and how you can adjust it to fit your needs.

And since the online world is all about collaboration, don’t forget that these dashboards can be publicly shared with your customers or team members.

To get a better grasp of the dashboard, you need to see it. Here is a live dashboard. Give it 15-30 seconds to load. It's stuffed with great widgets!

The dashboard is great for monitoring, pitching and reporting!

Let’s browse a bit through all the widgets that we’ve added and let’s see each one’s superpower.

1. HTML Widget – Add Your Own Third-Party Metrics

Wouldn't it be great if you could add whatever widget you want to your digital marketing dashboard, related to the tool or not, updated in real time? Well, we have very good news for you: now you can! We promised a fully customizable dashboard, and this is what we are going to deliver. Regardless of whether you want to add data from the weather channel or you want to monitor exchange rates, conversions, KPIs or sales, you now have the possibility to do this with our brand new HTML widget. There are no restrictions on how many HTML widgets you can add; therefore, if this answers your needs, you can create a dashboard made exclusively of this kind of widget.

Another great thing about this, along with the fact that you can add the external and internal data that you are most interested in, is that you can set a reload time starting from 1 minute and you can add your own URL. This means that you will get accurate, real-time data on what interests you the most. Whether you want to analyze the volatility of exchange rates or you want to monitor your sales in real time, the HTML widget is the perfect solution for you.

2. Alerts Widgets – See Your Alerts Directly on the Dashboard

After valuable feedback from our customers and serious research of the market, we concluded that seeing the alerts directly on the dashboard is highly productive and really comes in handy. So, we decided to add the following widgets to the marketing dashboard:

Alerts

Alerts – New Links

Alerts – Lost Links

Alerts – Rank Tracking

All Alerts List

Alerts List – New Links

Alerts List – Lost Links

Alerts List – Rank Tracking

If TV is now in the era of the 24-hour news networks, the web is a network that has always been a 24-hour news affair. We realize that getting the news as it happens can be crucial to businesses, so now we give you the opportunity to live in the moment. You can choose from multiple variations of alerts, and what is even greater is that, once added, the widget is clickable and works as a shortcut to the Alerts module. So, you can just glance at the numbers of your alerts, or you can go further with your analysis and, with just one click, find out what your actual alerts are.

You can view alerts about new links, lost links or rank tracking directly on your dashboard and keep track of what has been influencing your ranking. Seeing new keywords you rank for is a piece of information that can become valuable instantly, and we want you to be able to use that. If you see that a new keyword has become relevant to your page, you can adapt accordingly before others pick up on the trend. With rankings being as volatile as they sometimes are (and seemingly insignificant changes actually having a great impact on your place in the search result pages), this is one item we wanted to make sure you stay on top of. Live links and broken links come in handy because they can greatly influence traffic, but they might also give you clues about how prone you are to penalties.

3. Anchors Widgets – Customize Your Link Power Board Even More

We've realized that keeping track of your anchor text's general "health" and keeping up to speed on an as-needed basis can be essential. This is why we came up with the following list of widgets for anchors:

Anchor Text Cloud

Anchor Text Distribution

Anchor Text History – Live Links

Anchor Text History – Lost Links

Anchor text distribution can now be used as a fine-tuning instrument, since you can look at it by time interval, and you also get updates on the anchor text evolution of live and broken links. This gives you a very valuable tool in terms of timely updates and adjustments to your overall strategy. We also added a full anchor text cloud, which can likewise be viewed for a specific time interval. See how last week's anchor text cloud compares to this week's, or to the one from 2 months ago. This way you can keep a close eye on what has only recently become relevant, but also on what used to be relevant and no longer attracts as many views (it might be time for a slight tweak, or even for a complete re-organization).

4. Text Widget – Add Your Own Labels & Descriptions

In addition to all these alerts, we figured we'd give you your very own custom widget, where you can add your own text and set what you want to be made aware of.

We know that everyone's needs are specific, and despite our best efforts we cannot offer a one-size-fits-all solution. This is why we're giving you the possibility of trying out various combinations of alerts for new links, lost links and ranking changes. You know best what kind of information your business needs, and you should be able to take advantage of that. With the text widget you can segment your dashboard just the way you want, defining well-established categories that interest you the most. The example below shows how you can define your dashboard segments and choose to create specific widgets for Google penalty monitoring or for the metrics that matter most to you at a certain point, or simply make things clear for your clients or the team you are sharing your SEO dashboard with.

Just because you're watching your business doesn't mean you cannot look sideways too. In fact, we recommend it. This is why one of the new alerts that we've created allows you to look at the top 10/20/30 ranking entries and exits for all tracked or specific keywords. You can choose the keyword and the search engine to watch (just because one of them is more famous doesn't mean the others aren't worth chasing) and what type of change to pick up on (entry, exit, improvement or decline). Add as many triggers as you want and give each of them a unique name so you can keep better track. Now you're not just looking at your business, you're looking at your business in the context of the business domain.

What is Coming Next?

We're not stopping here. More widgets will be added in order to provide a very flexible and customizable marketing dashboard that both technical and management people will use. In the meantime, take the time to get acquainted with our newest offering and take full advantage of your dashboard in whichever way fits you best.

What’s the next widget you want to see on your Digital Marketing Dashboard?

8 Renowned Experts Bust Common Google Disavow Tool Myths

Google wants us to use the disavow tool when we get a notice from them telling us we have websites pointing dodgy links at us. Although this would seem like a straightforward task, there are some myths and elements of confusion that have built up around using the disavow tool.

As there are numerous people who are far more experienced and expert on this subject than I am, I decided to ask a few. Here is what they said.

I asked:

What do you think the myths of using the Disavow tool are and what is the thing that most people seem to get wrong?

Mark Porter

@markcporter – ScreamingFrog.co.uk
There do seem to be a few misconceptions around the disavow tool recently that, based on our experience, simply aren't the case. The main ones for me are:

Myth – You always experience a recovery after using the disavow tool

Unfortunately, this simply isn't the case. While it depends on the current situation (whether it's a manual action or an algorithmic one, such as Penguin), we've found that recovery largely depends on what genuine, reputable links the site has aside from the unnatural links that are being disavowed. If the domain does not have the backlink profile to support itself once these unnatural links have been disavowed, then you are unlikely to experience a full recovery. It may be that these unnatural links were the only reason the site was ranking in the first place, in which case you'll probably tumble away into the dark depths of Google.

Myth – You should ALWAYS disavow at domain level

While this may be true 95% of the time, there are some exceptions where it doesn't make sense to disavow a link at domain level. For example, perhaps you have a client who used to send out press releases with multiple anchor text links in them, which got picked up or syndicated on a decent, high-value site. It's usually best to disavow these at URL level, in case they receive natural pickup from these sites further down the line.

Myth – You can pick and choose your links

I believe this was one of the arguments for why the disavow tool shouldn't be used in a recent Moz post. The disavow tool is not there for you to select which links are helping or hurting the site, regardless of how natural or unnatural they are. Links should be audited based upon whether they meet Google's guidelines or not. Unnatural links which are undoubtedly helping a site may have to be disavowed to avoid problems in the future – you have to take it on the chin and move on!

Myth – You have to make an effort to remove links

Again, this is somewhat dependent on whether the site has a manual action or an algorithmic problem, but we've found that just using the disavow tool is sufficient and that no repeated efforts need to be made to get links removed from sites.

Krystian Szastok

Some people (including me, at the very early stages) used to believe that humans read disavow files, so I used to put in comments explaining why certain domains were disavowed.

What's the thing that most get wrong? Disavowing at link level.

Most websites – especially the spammy ones – will give you many links from the same domain. Directories are a good example: you get a link from your listing and from any type of category page.

Lesson: always disavow at domain level.
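
For illustration, here is what the two levels look like inside a disavow file (the domain is a placeholder); the domain: line covers the listing page, every category page and anything else the directory throws at you in one shot:

    # URL-level: disavows links from this one page only
    http://spammy-directory.example.com/widgets/my-listing.html

    # Domain-level: disavows every link from the entire domain
    domain:spammy-directory.example.com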

One more thing many get wrong: trying not to disavow all keyword-rich anchor text links.

If you got penalized, you need to disavow every suspicious link out there; otherwise your chances of getting out are slim.

Getting out of a penalty on the first try is a skill; one doesn't really manage it without past experience. Any number of software tools can dictate to you which links are bad and which are good.

I use cognitiveSEO's Link Navigator, which allows me to manually go through each link – one per domain – and review them with my own eye. I judge each link on the basis of 'does this link have a purpose other than giving me some tiny little bit of PageRank'. If the answer is 'yes', then the link usually stays. If the answer is 'no', I remove it.

Marie Haynes

I think that the biggest myth out there when it comes to using the disavow tool is that it doesn’t work.

I have seen so many people say, "I used the disavow tool, and my rankings did not improve, so it does not work!" But there are many other factors that can be in play. For example, perhaps the site had very few good links and was only ranking well previously because of the power of links that are now considered unnatural. Or maybe they disavowed the wrong links. Or perhaps the site is also dealing with other issues, such as suppression by the Panda algorithm.

I have seen many sites that have escaped Penguin, or have had a manual link penalty removed, that would not have been able to succeed without using the disavow tool. In my opinion it is a vitally important tool that needs to be used by any site that is suffering because of the presence of unnatural links.

Another myth surrounding the disavow tool is that you can do damage to a site by disavowing it.

If I disavow my competitor, I am not going to do any harm to my competitor's site. I'm just going to ask Google not to count any links from their site to mine when it comes to calculations for algorithms like Penguin. I do not believe that Google crowdsources this data and says, "Ah! This site has been disavowed by many people. Let's penalize it!"

And a third misconception about the tool is that it is read by the webspam team.

I see a lot of cases where the disavow tool was used with the intention of it being a reconsideration request. In these cases, the first line of the disavow file contains a comment that says something like this:

#We contacted all of the sites below and tried to get links removed but could not. Please reconsider our site as we are not ranking well because of past SEO link building efforts.

However, disavow files are completely machine-read and will not be seen by a Google employee. The comments in the file are really for your own use, to make it easier to understand the changes you make as you continually update the file.

What do people most often get wrong when using the disavow file? I think the most common mistake I see is that the wrong links are disavowed. The most common reason for this is that too much reliance has been placed on an automated link auditing tool. These tools can be useful when used in conjunction with a manual link-by-link audit. But if you are running an automated report and going straight to disavow, then you are definitely missing unnatural links, and there is also a decent chance that you will disavow good links.

I think that the disavow tool can be extremely helpful if used correctly.

Chuck Price

@ChuckPrice518 – reconsideration.org
This is a great question, as there are very few tools that have generated as much FUD as this one. The most common misconception is that the disavow tool doesn't work. It does.

For a manual penalty, the disavow file works when used as a last resort.

That means that a full-fledged and well-documented link removal campaign must precede it. The disavow file, combined with a detailed reconsideration request, is a core component in successfully getting a manual penalty revoked.

For an algorithmic hit (Penguin), the disavow file also works.

What “most” people get wrong is that a manual link removal effort is NOT required.

In fact, manual link removal is a waste of resources. Since most link removal campaigns result in only 5-10% of the links being removed, it is generally not enough to move the dial.

The Penguin algorithm is driven primarily by the ratio of good links to bad links. Your time is better spent replacing the disavowed links with good ones.

Another point most people get wrong is their expectation of what a "recovery" looks like.

Many people expect that a recovery means regaining their former SERPs after they have escaped a manual penalty or algorithmic hit. Since many of the links that once propped up those SERPs are now in the disavow file, that's not going to happen. At least not until those links are replaced with good ones.

Gabriella Sannino

@SEOcopy – level343.com

Oh dear, dare I say I have never heard of the "myth", rather opposition to the action? We've used the disavow process with several clients who had either been penalized or where we took proactive measures when we noticed a large amount of toxic links. Granted, you can't say definitively that any recovery came directly from the disavow process; however, their (the clients') sites show signs of recovery when we've done them.

As for what people seem to get wrong: they disavow specific links rather than the domain.

In our case, when we have proactively used the disavow tool (without having been manually penalized), I would say the results are inconclusive but show STRONG correlation. I would still advise and recommend doing it as part of any link profile clean-up.

Russell Jarvis

Myth 1: Submitting links for disavowal guarantees that your site will be out of the red for link malpractice immediately.

Reality:
If your site has been negatively impacted by Penguin, you generally have to wait until Google crawls all the requested domains, and then for the next Penguin update to roll out, before seeing a full recovery in SERP visibility.
If you have been hit by a manual penalty, it usually takes a lot more work than just submitting domains to be disavowed.
Manually removing spammy links to your site trumps disavowing links.

Myth 2: Using the disavow tool flags your site as using spammy link building practices to Google.

Reality: This is untrue. Google does not flag your site as spam for using the disavow tool. They are usually very good at picking up unnatural link activity and will probably hit you with a manual penalty on detection. The same goes for having your domain submitted in another site's disavow list; remember, links that may seem unnatural for some sites can be more natural for others (it depends on industry, content, context and a bunch of other factors).

Myth 3: Once you disavow links there is no going back.

Reality: This is untrue, as Google has confirmed that it is possible to reavow links. You simply have to remove the links from the disavow list and resubmit it. However, reavow requests take much longer to be recognized, as Google takes even stricter precautionary measures in this process to combat spammers who are trying to find loopholes. They usually crawl a domain a couple of times before revoking the disavow request.

On the topic of resubmitting a disavow list, remember that when you submit a new list in the tool, it overrides the old one. Thus, always work from the original list and update it. You can also add comments using hashes (# add comment) to the file, to make notes for yourself and other webmasters who may need the info in the future.

Don't rely only on tools to classify your links as spam or quality. Nothing is more accurate than the eye and instincts of an experienced search engine marketer. It may be a tedious task working through hundreds or thousands of links and categorizing their quality, but I guarantee it's worth the time and effort. Disavowing too many links (including good quality links) is just as dangerous as disavowing too few.

In short, marketers shouldn't experiment with the disavow tool; it should be used only when necessary.

It is not there to clean up after the mess of knowingly dodgy link building practices.

Emory Rowland

The worst and most deceptive myth is that uploading a list of unsavory domains with the disavow tool gets you out of jail.

Why would Google want you to do just this? Consider the absurdity of this cycle. You build links. Get a short-term boost. Blame it on spammers. Disavow. Build more links. Rinse and repeat. Soon someone creates automated disavow software, so you no longer even have to think about the consequences of bad links; just click the auto-disavow button.

Somehow, people forget that the point is to lose the links. And Google is taking account of the ones that fall off. The disavow tool doesn’t prove your motivations. Actions speak louder than disavows. Get it right the first time by disavowing and losing links.

David Cohen

@explorionary
In my experience, the worst disavow tool myth of all is that you must remove some or all links in order to restore the organic search visibility and traffic that's been lost because of a Penguin-related issue. Using the disavow tool and blindly removing links on a wholesale scale isn't a smart and strategic approach to solving the problem; it's sloppy.

It’s not a myth that you can successfully use the Disavow Tool, not remove links, and experience a restoration of what’s been lost.

Pure opinion, but I feel the last step you'd ever want to take is actually removing links to your site as part of the entire disavow and restoration process. The best way to both avoid the myth and avoid making your site perform worse is to hire a professional who has a proven track record of solving Penguin-related problems. Using amateurs will always end up costing you more than using professionals in this scenario.

Conclusion

I would like to thank all those who took part in this expert round-up for taking the time to participate, as I know we all get very busy.

As we can see there are some very interesting issues raised here by our panel of experts.

While some people disagree about certain techniques regarding the disavow tool, one thing is certain: you have to be careful and you have to know what you are doing.

If I can make a shameless plug here, cognitiveSEO has a very useful unnatural link detection tool that will help a lot in the disavow process.

Content Pruning – the Technique That Will Protect Your Rankings from Google Panda

Time has passed since the Panda 4.0 update and there's still a lot of talk around this subject. At the time, the headlines were filled with words of panic that foretold an impending doom. Yet time proved that the latest Google Panda shouldn't be discussed in terms of penalties. Instead, targeting and deranking low-quality content are the more appropriate concepts to bring into the discussion. So it seems that the Panda update affects the ones who do not comply with Google's idea of valuable content: high-quality content that is both useful and entertaining to the reader.

It's all the rage in the SEO community, and specialists are looking at ways of improving their strategies in order to work around it. Most webmasters rely on digital marketing strategies that produce content on a regular basis. The content production may take the form of a blog tab on the site or just a section of press releases; in its greater form, it may be a large-scale SEO strategy that also prunes old content that no longer generates traffic.

If we were to generalize the SEO formula to the extreme, you could say that your success comes from the number of backlinks and the number of indexed pages. With this raw formula in mind, underperforming content that gets indexed by Google might pull the whole site down in rankings. The problem is that many webmasters tend to look only forward when they are optimizing their site for search engines, leaving the old content behind.

New and up-to-date content is always going to be the focal point, but people tend to ignore the old content. And that's how everything tends to turn into a huge pile of pages that gather little to no traffic for the site.

After Panda 4.0 rolled out, this forgotten bundle of underperforming content may even drag your site's ranking down. That's why content pruning should become an essential part of your ranking strategy. Even if in the past a lot of SEOs would have argued against this point of view, I think Google Panda 4.0 showed it is a real possibility.

What Is Content Pruning?

Since Google rolled out the Panda 4.0 update, some webmasters have been forced to reduce the number of indexed pages and have begun pruning low-quality content that didn't add any value to the site. Removing pages from the Google index is not an easy decision and, if handled improperly, it may go very wrong.

Your sites’ obsolete or low quality content may be one of the problems that generate a drop in rankings.

Google has made a lot of effort to make the search algorithm detect quality content just as human visitors would when they read it. This means that low-quality pages may affect the performance of the whole site. Even if the website itself plays by the rules, low-quality indexed content may ruin the ranking of the whole batch.

Content pruning isn’t a technique that should be taken into consideration only by the small sites or the emerging businesses. Reuters, for instance is one of the biggest and the most known international news agencies, managing to be an important player in the field since the last century. Yet, they too should prune their content in order to give the user the best experience there is. As we look at the screenshot above, we can see a list of pages that can hardly be found by searchers, therefore, they shouldn’t be indexed. Moreover, the listed pages don’t offer valuable info, have duplicate content (highly penalized by the Panda Algorithm) and definitely pull the whole site back than pushing it forward in rankings.

Why Should I Prune My Own Content?

Every site has it’s evergreen content, which attracts organic traffic, and some pages that are deadwood. In the Google Index you should only have pages that you’re interested in ranking. Otherwise you might end up polluting your rank. Those pages filled with obsolete or low quality content aren’t useful or interesting to the visitor. Thus, they should be pruned for the sake of your website’s health.

Keep the Google index fresh with info that is worth ranking and that helps your users.

Those low-quality pages that make for a very stale user experience are considered unacceptable from Google's point of view. Your site should be cleaned of these pages because they make for a poor and confusing experience for the searcher. Having lots of pages might make you look like an established site, but if your content isn't on point it only means you're writing for the search engine, not for the user.

How Should I Prune My Own Content?

To successfully prune your own content you should follow these steps:

1. Identify Your Site’s Google Indexed Pages

For this task, you have two methods you can use to identify all your site's indexed pages:

Method A. The first one, explained below, uses Google Webmaster Tools. The disadvantage of this method is that the list of pages displayed does not contain only indexed pages; here you'll find all the pages found by GoogleBot during crawling. However, you do get the total number of indexed pages and a graphical display of its evolution in the Index Status report.

You can access the whole bundle of internal links for your website by going to Search Traffic > Internal Links in Google Webmaster Tools. There you'll be able to download the data in CSV or Google Docs format. That way you'll have a list of all the pages from your website (indexed or not), along with the number of internal links pointing to each. This can be a great starting point for discovering which pages should be subjected to content pruning.

This method is recommended for very big sites that have a lot of pages.
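Once you have the export, a few lines of Python are enough to surface the least-linked pages. This is just a minimal sketch: the file name and the column headers below are assumptions, so rename them to match whatever your actual download contains.

import csv

with open("internal_links.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

# Sort ascending by internal link count: the pages with the fewest
# internal links are often the forgotten ones worth reviewing first.
rows.sort(key=lambda r: int(r.get("Links", "0") or 0))
for row in rows[:20]:
    print(row.get("Target pages", "?"), "-", row.get("Links", "0"), "internal links")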

Method B. Site:Domain Query and Extract Only Indexed Pages.

While the first method took the whole bunch of pages GoogleBot crawled for your site, this one delivers a list that contains only the indexed pages. By using the command site:examplesite.com you will only receive the actual results that are displayed in the search engine results page.

To speed up the process, you need to tweak a few of the search engine's settings. Go to the Search settings page and set Results per page to 100, so that you see more results at a time. You should also check the Never show Instant results option under Google Instant predictions. Given that you're going to see many more results per page, you want to remove any hiccups you might encounter.

The next step is done with the help of a bookmarklet that scans the results displayed in the SERP and generates another window with a numbered list containing the links and their anchor texts. To install the bookmarklet, click and drag its button onto your bookmarks bar.

Be sure that you are viewing the SERP as you activate the bookmarklet. As you can see from the image below, it will generate a list with all the links and all the anchor texts from the SERP. Remember, if your site has more than 100 results, you have to repeat the process for each results page; for big sites, the process may take a while. Just be thorough and try to capture as many indexed links as possible.
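If you'd rather not use a bookmarklet, the same extraction can be done offline. The sketch below assumes you've saved a results page as serp_page.html; the "h3 a" selector matched Google's result titles at the time of writing and may need adjusting if the markup changes.

from bs4 import BeautifulSoup  # third-party: pip install beautifulsoup4

with open("serp_page.html", encoding="utf-8") as f:
    soup = BeautifulSoup(f, "html.parser")

# Print each result's anchor text and URL, numbered like the bookmarklet does.
for i, link in enumerate(soup.select("h3 a"), start=1):
    print(i, link.get_text(strip=True), link.get("href"))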

2. Identify Your Site’s Low Ranking Pages

To identify the low-ranking pages you can use Google's own free tool, Webmaster Tools. Google Webmaster Tools provides accurate data on the number of impressions, clicks, click-through rate and average position. The feature that lets you view all this data is called Search Queries, and you can reach it in the Search Traffic category. There you will find a list of the top pages from your website. Identifying low-ranking pages becomes really easy, as you can order the list by the number of clicks to see the least-performing pages.

3. Identify Underperforming Pages

Understanding a site’s structure and identifying the obsolete paths and pages are critical in order to “clean up” your content. You can use metrics from Google Webmaster Tools such as number of clicks or number of impressions in a certain period of time. This way, you can check the interest of the user in reading a certain page or his involvement in doing something actionable on your site. For instance, you can use the “Clicks” element in order to show the pages that have zero clicks for a certain query. This way, you can have some insight about what your users are in to.

You can also use these metrics for data analysis regarding the performance of your content. For instance, you can download the data provided by GWT as CSV or Google Docs. Once you have these files, you can mark the pages that are underperforming, keep an eye on them and start filtering the content posted there to get the most out of your content pruning campaign.
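As a rough illustration, here is how that filtering might look in Python on the downloaded file. The file name and the Page/Clicks/Impressions headers are assumptions; adjust them to whatever your export actually contains.

import csv

with open("top_pages.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        clicks = int((row.get("Clicks") or "0").replace(",", ""))
        impressions = int((row.get("Impressions") or "0").replace(",", ""))
        # Pages that are shown to searchers but never clicked are
        # prime candidates for pruning or rewriting.
        if impressions > 0 and clicks == 0:
            print(row.get("Page", "?"), "-", impressions, "impressions, 0 clicks")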

4. No-Index the Underperforming Pages

After you have successfully identified the pages that don't bring you any added value, you should start to no-index them. Keep in mind that blocking a page from crawling is not the same thing as removing it from the index: a page disallowed in robots.txt can still remain indexed if other pages link to it, which is why the noindex tag is the more reliable route. Here are two ways you can approach this:

You can disallow those pages in the robots.txt file to tell Google not to crawl that part of the website (see the example after these two options). Keep in mind an important consideration if you plan on using the robots.txt method: be sure you get it right, as you can easily break your rankings with it!

You can tag certain pages. After you identify the pages that should be de-indexed, just apply the meta noindex tag to them. If you want the page's links to still be followed by Google's crawlers, add the <META NAME="ROBOTS" CONTENT="NOINDEX, FOLLOW"> tag. That tells Google not to index the page, but at the same time to crawl the other pages that are linked from it.
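For the first option, the robots.txt entry is only a couple of lines. The folder below is purely hypothetical; substitute the section you actually want to keep away from crawlers:

User-agent: *
Disallow: /old-press-releases/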

Don’t Jump the Gun

De-indexing pages, or parts of them, is quite a big decision, so you need to think it through before you start implementing it. Even though content pruning may present itself as a workaround to the recent Google updates, use it with caution. As with everything in life, don't abuse it. It's natural to want to solve your issues swiftly, especially when you're facing major rank and traffic drops, but be sure to proceed gradually and know exactly what you block, so you don't block entire folders from being crawled or lock the Google crawler out of important sections of your site.

Conclusion

Good content takes a lot of time to build. Pruning it might take even longer, but it pays off in the long run. While providing new content to your readers should remain the main priority, you should still overhaul the old content; neglecting jaded content could harm your website's ranking! A content pruning campaign is not effective just from a ranking point of view; it can also be part of the content marketing strategy. High-quality content will increase the overall credibility of your site and improve the user experience, and will therefore positively affect conversions.

Critical Mistakes in Your Robots.txt Will Break Your Rankings and You Won't Even Know It
Thu, 12 Feb 2015

The use of a robots.txt file has long been debated among webmasters, as it can prove to be a powerful tool when well written, or a way to shoot yourself in the foot. Unlike other SEO concepts that could be considered more abstract and for which we don't have clear guidelines, the robots.txt file is completely documented by Google and the other search engines.

You need a robots.txt file only if you have certain portions of your website that you don’t want to be indexed and/or you need to block or manage various crawlers.

*Thanks to Richard for the correction on the text above (check the comments for more info).

What’s important to understand in the case of the robots file is the fact that it doesn’t serve as a law for crawlers to obey to, it’s more of a signpost with a few indications. Compliance with those guidelines can lead to a faster and better indexation by the search engines, and mistakes, hiding important content from the crawlers, will eventually lead to a loss of traffic and indexation problems.

Robots.txt History

We’re sure most of you are familiar with robots.txt by now, but just in case you heard about it a while ago and since have forgotten about it, the Robots Exclusion Standards as it’s formally known, is the way a website communicates with the web crawlers or other web robots. It’s basically a text file, containing short instructions, directing the crawlers to or away from certain parts of the website. Robots are usually trained to search for this document when they reach a website and obey its directives. Some robots do not comply with this standard, like email harvesters, spambots or malware robots that don’t have the best intentions when they reach your website.

It all started in early 1994, when Martijn Koster proposed the standard after a misbehaving web crawler caused what amounted to a denial-of-service attack on his server. The standard was created to guide web crawlers and block them from reaching certain areas. Since then, the robots file has evolved to carry additional information and serve a few more uses, but we'll get to that later on.

How Important Is Robots.txt for Your Website?

To get a better understanding of it, think of robots.txt as a tour guide for crawlers and bots. It takes the non-human visitors to the amazing areas of the site where the content is and shows them what is and isn't important to index, all with the help of a few lines in a txt file. Having a well-experienced robot guide can increase the speed at which the website is indexed, cutting the time robots spend going through lines of code to find the content users are looking for in the SERPs.

Over time, more information has been included in the robots file to help webmasters get faster crawling and indexation of their websites.

Nowadays most robots.txt files include the sitemap.xml address, which increases the crawl speed of bots. We've even found robots files containing job recruitment ads and instructions meant to educate robots for when they become self-aware.

Keep in mind that even though the robots file is meant strictly for robots, it's still publicly available to anyone who appends /robots.txt to your domain. When you try to hide private information from search engines this way, you end up showing its URL to anyone who opens the robots file.

How to Validate Your Robots.txt

The first thing to do once you have your robots file is to make sure it is well written and to check for errors. One mistake here can and will cause you a lot of harm, so after you've completed the robots.txt file, take extra care in checking it for mistakes. Most search engines provide their own tools for checking robots.txt files and even let you see how the crawlers view your website.

Google’s Webmaster Tools offers the robots.txt Tester, a tool which scans and analyzes your file. As you can see in the image below, you can use the GWT robots tester to check each line and see each crawler and what access it has on your website. The tool displays the date and time the Googlebot fetched the robots file from your website, the html code encountered, as well as the areas and URLs it didn’t have access to. Any errors that that are found by the tester need to be fixed since they could lead to indexation problems for your website and your site could not appear in the SERPs.

The tool provided by Bing displays the data as seen by the BingBot. Fetching as the Bingbot even shows your HTTP headers and page source as they look to the Bingbot. This is a great way to find out whether your content is actually seen by the crawler and not hidden by some mistake in the robots.txt file. Moreover, you can test each link by adding it manually, and if the tester finds any problem with it, it will display the line in your robots file that blocks it.

Remember to take your time and carefully validate each line of your robots file. This is the first step in creating a well written robots file, and with the tools at your disposal you really have to try hard to make any mistakes here. Most of the search engines provide a “fetch as *bot” option so after you’ve inspected the robots.txt file by yourself, be sure to run it through the automatic testers provided.

Be Sure You Do Not Exclude Important Pages from Google’s Index

Having a validated robots.txt file is not enough to ensure that you have a great robots file. We can't stress this enough: a single line in your robots file that blocks an important content section of your site from being crawled can harm you. To make sure you do not exclude important pages from Google's index, you can use the same tools that you used for validating the robots.txt file.

Fetch the website as the bot and navigate it to make sure you haven’t excluded important content.
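You can also automate this check. Python's standard library ships a robots.txt parser, so a short script can warn you whenever a must-rank URL becomes blocked. This is a sketch only; the domain and page list below are hypothetical.

from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")
rp.read()

# Pages that must always stay crawlable - adjust to your own site.
important_pages = [
    "http://www.example.com/",
    "http://www.example.com/blog/",
    "http://www.example.com/products/",
]

for url in important_pages:
    if not rp.can_fetch("Googlebot", url):
        print("WARNING: robots.txt blocks", url)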

Before inserting pages to be excluded from the eyes of the bots, make sure they are on the following list of items that hold little to no value for search engines:

Code and script pages

Private pages

Temporary pages

Any page you believe holds no value for the user.

What we recommend is that you have a clear plan and vision when creating the website’s architecture to make it easier for you to disallow the folders that hold no value for the search crawlers.

How to Track Unauthorized Changes in Your Robots.txt

Everything is in place now: the robots.txt file is completed and validated, and you've made sure there are no errors or important pages excluded from Google's crawling. The next step is to make sure that nobody makes changes to the document without you knowing about it. And it's not only about changes to the file; you also need to be aware of any errors that appear while the robots.txt document is in use.

1. Change Detection Notifications – Free Tool

The first tool we want to recommend is changedetection.com. This useful tool tracks any changes made to a page and automatically sends an email when it discovers one. The first thing you have to do is insert the robots.txt address and the email address you want to be notified at. The next step lets you customize your notifications: you can change their frequency and set alerts only if certain keywords in the file have changed.
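If you prefer to run the check yourself, the idea behind such a service is simple: fetch the file periodically, hash it, and raise an alert when the hash changes. Here is a bare-bones sketch; the URL and the hourly interval are just placeholders.

import hashlib
import time
import urllib.request

ROBOTS_URL = "http://www.example.com/robots.txt"

def robots_hash(url):
    # Hash the raw bytes of the file so any edit, however small, is caught.
    with urllib.request.urlopen(url) as resp:
        return hashlib.sha256(resp.read()).hexdigest()

last_seen = robots_hash(ROBOTS_URL)
while True:
    time.sleep(3600)  # check once an hour
    current = robots_hash(ROBOTS_URL)
    if current != last_seen:
        print("robots.txt changed!")  # plug in email/IM alerting here
        last_seen = current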

2. Google Webmaster Tools Notifications

Google Webmaster Tools provides an additional alert tool. The difference with this one is that it works by sending you notifications of any error in your code each time a crawler reaches your website. Robots.txt errors are also tracked, and you will receive an email each time an issue appears. Here is an in-depth guide to setting up Google Webmaster Tools alerts.

3. HTTP Error Notifications – Free & Paid Tool

In order not to shoot yourself in the foot when serving a robots.txt file, only the following HTTP status codes should ever be returned for it.

The 200 code basically means that the page was found and read;

The 403 and 404 codes, which mean that the file could not be accessed; the bots will assume you have no robots.txt file and will therefore crawl and index your whole website accordingly.

The SiteUptime tool periodically checks your robots.txt URL and is able to instantly notify you if it encounters unwanted errors. The critical error you want to keep track of is the 503 one.

A 503 error indicates that there is an error on the server side and if a robot encounters it, your website will not be crawled at all.
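A tiny self-hosted check along the same lines can distinguish the harmless codes from the critical 503. This is a sketch only; the URL is a placeholder.

import urllib.error
import urllib.request

try:
    resp = urllib.request.urlopen("http://www.example.com/robots.txt")
    print("robots.txt returned", resp.getcode())  # 200: found and read
except urllib.error.HTTPError as e:
    if e.code in (403, 404):
        print("No robots.txt; bots will crawl the whole site.")
    elif e.code == 503:
        print("CRITICAL: 503 - crawlers may stop crawling the site entirely.")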

Google Webmaster Tools also provides constant monitoring and shows a timeline of each time the robots file was fetched. In the chart, Google displays the errors it found while reading the file; we recommend you look at it once in a while to check whether it shows any errors other than the ones listed above. As we can see below, Google Webmaster Tools provides a chart detailing how often the Googlebot fetched the robots.txt file, as well as any errors it encountered while fetching it.

Critical Yet Common Mistakes

1. Blocking CSS or Image Files from Google Crawling

Last October, Google stated that being able to crawl your CSS, Javascript and even images (we've written an interesting article about it) counts towards your website's overall ranking. Google's algorithm keeps getting better and is now able to read your website's CSS and JS code and draw conclusions about how useful the content is for the user. Blocking these resources in the robots file can do you real harm and keep you from ranking as high as you probably should.

2. Wrong Use of Wildcards May De-Index Your Site

Wildcards, symbols like "*" and "$", are a valid option for blocking batches of URLs that you believe hold no value for the search engines. Most of the big search engine bots recognize and obey them in the robots.txt file. They are also a good way to block access to deep URLs without having to list them all in the robots file.

So in case you wish to block, let's say, URLs with the .pdf extension, you could write the following lines in your robots file:

User-agent: googlebot
Disallow: /*.pdf$

The * wildcard matches any sequence of characters, while the $ anchors the pattern to the end of the URL. Together they tell the bots that only URLs actually ending in .pdf shouldn't be crawled, while any other URL that merely contains "pdf" (for example pdf.txt) can still be crawled.
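The same two symbols cover most practical cases. For instance, the hypothetical patterns below block session-ID URLs and spreadsheet files inside a private folder:

User-agent: *
Disallow: /*?sessionid=
Disallow: /private/*.xls$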

*Note: Like any other URL, the paths in the robots.txt file are case-sensitive, so take this into consideration when writing the file.

Other Use Cases for the Robots.txt

Since it’s first appearance, the robots.txt file has been found to have some other interesting uses by some webmasters. Let’s take a look at other useful ways someone could take advantage of the file.

1. Hire Awesome Geeks

Tripadvisor.com’s robotos.txt file has been turned into a hidden recruitment file. It’s an interesting way to filter out only the “geekiest” from the bunch, and finding exactly the right people for your company. Let’s face it, it is expected nowadays for people who are interested in your company to take extra time in learning about it, but people who even stalk for hidden messages in your robots.txt file are amazing.

2. Stop the Site from Being Hammered by Crawlers

Another use for the robots file is to stop pesky crawlers from eating up all the bandwidth. The Crawl-delay directive can be useful if your website has lots of pages. For example, if your website has about 1,000 pages, a web crawler can crawl your whole site in a few minutes. Adding Crawl-delay: 30 tells it to take it easy and use fewer resources; at 30 seconds per page, those 1,000 pages would take roughly eight hours to crawl instead of a few minutes.

We don’t really recommend this use since the Google doesn’t really take into consideration the crawl-delay command, since the Google Webmaster Tools has an in-built crawler speed tuning function. The uses for the Crawl-delay function work best for other bots like Ask, Yandex and Bing.

3. Disallow Confidential Information

Disallowing confidential information is a bit of a double-edged sword. It's great to keep Google from accessing confidential information and displaying it in snippets to people who shouldn't see it. But, mainly because not all robots obey robots.txt commands, some crawlers can still reach that content. Similarly, a human with the wrong intent who reads your robots.txt file can quickly find the areas of the website which hold precious information. Our advice is to use it wisely, take extra care with the information you place there, and remember that robots aren't the only ones with access to the robots.txt file.

Conclusion

It’s a great case of “with great power, comes greater responsibility”, the power to guide the Googlebot with a well written robot file is a tempting one. As stated below the advantages of having a well written robots file are great, better crawl speed, no useless content for crawlers and even job recruitment posts. Just keep in mind that one little mistake can cause you a lot of harm. When making the robots file have a clear image of the path the robots take on your site, disallow them on certain portions of your website and don’t make the mistake of blocking important content areas. Also to remember is the fact that the robots.txt file is not a legal guardian, robots do not have to obey it, and some robots and crawlers don’t even bother to search for the file and just crawl your entire website.

Unique Technique Reveals Competitors' Evergreen Content Secrets
Tue, 10 Feb 2015

One of the most powerful and rewarding types of content you can publish is evergreen content. By identifying usable, evergreen content in your niche and publishing your own version, you can boost the traffic and branding of your website.

The best way to establish the types of content that are going evergreen in your niche is to see what the leading websites are doing. By analyzing the data from our target websites we can quickly ascertain which content is evergreen and still attracting eyeballs.

The cognitiveSEO tool has a nifty way of doing this: using the Social Visibility feature, you can accurately identify the hot evergreen topics in your niche, giving you a good idea of what works.

What is Evergreen Content?

Evergreen content is content that remains valid over time; it is timeless and always attracts new readers. Some evergreen content will never need updating, whilst other types may need a little tweaking over time but will still be of great use to the reader.

Certain things in life are constant; certain things will constantly change. The aim of evergreen content is to always be useful and therefore always attract a level of interest that reflects its subject.

Examples of Evergreen Content are:

How to get a job in marketing

10 Personality traits of an epic programmer

History of digital marketing on the Internet

10 Biographies of giants in business

A timeline of the Internet

A list of topics that are evergreen:

Time Management

Freelancing

Creativity

Human communication

Design laws

Human emotion

Greed

Gravity

You can see from this list that there are some topics that do not change, whilst others may come in and out of fashion. Human communication, for example, is a subject of continual interest, while an article such as "How to Publish a Website" would be considered evergreen with regular tweaks. Content can be created around different aspects of these topics.

The key is to create a piece of content that is still of interest long after the publishing date. Social media is the main vehicle for identifying what is evergreen content and what is not, as it presents us with a verifiable human signal that can be recorded and analysed for patterns.

By using the Social Visibility tool on the cognitiveSEO dashboard we can quickly analyse a tracked website to identify the evergreen content.

What Evergreen Content is Not

It is not the latest fashionable tip that is all the rage at the recent SEO conference and is being pounced on by hordes of hungry SEO types. First of all, these tips tend to go public only after they have been blitzed to death on the dark SEO forums. Second, they usually take advantage of a flaw in the Google system: the tip may work great for a while, but it stops working once Google figures out how to plug the gap, making that 2,000-word blog post you wrote on the subject completely out of date and dead in the water.

Whilst there is room for web content that is highly topical and of the moment, it is good to have a mix of content on your website.

A list of blog topics that are not Evergreen:

This year's Super Bowl commercials

A Google algo change

How to employ specific techniques in search engines

How to take advantage of how specific social media platforms work

How a new law has changed things

Content referring to a cultural event such as a movie

A common topic on SEO and social media blogs is how to game social media, or as some blogs call it, "optimising the social media system."

Each social media platform has its little idiosyncrasies and quirks that can be taken advantage of, and experts in each platform regularly blog about how to do this on Twitter and how to do that on Facebook.

This content is useful in the short term, or for as long as the social media platform does not change its system and eradicate the technique. Once the technique is negated by a change in the platform, the content that promoted it becomes out of date and useless. Content such as this should always be updated to inform your readers that you no longer promote the technique.

Although the waters get muddied, especially in SEO as techniques come and go, sometimes very quickly, those who can utilise the new technique can get ahead of the game quickly and get rewarded.

Thus, content that is only valid for a specific time period is only as valuable as the technique allows, and unless you are a hardcore blackhat SEO, simply chasing these techniques from an organic Google search perspective is very short-sighted.

Much better to have a mix of content, both topical and evergreen. This will mean that some of your content will always be valid.

Content that has proved its worth over time.

There are some blog posts that never get old and yet at the same time reveal something very interesting: 1,000 True Fans by Kevin Kelly and The Long Tail by Chris Anderson.

Evergreen content is:

Useful

Timeless

High Quality

Evergreen content is a way to scale the act of publishing content.

You may have to spend longer creating evergreen content, but the interest will be compounded over a longer period of time. A post that takes only a short time to create may enable you to publish content more often, but it will be less likely to have the longevity of evergreen content.

Evergreen content can be re-promoted, and if it is useful to your readers, why should you not re-promote something you wrote a few months ago?

Some people will change the date in their WordPress system so that the content seems fresh. I have tested this, and whilst it does get attention, it did not feel right; I felt the reader was in some way being misled. This is easily solved by stating that the content is from the "archives", or a blast from the past which is still valid.

Evergreen content can be internally linked to from fresh content. It means that you can get a further payoff from the time and effort you invested in the content originally.
Consider that a tweet has a useful lifespan of 18 minutes (unless you are a politician tweeting a photo of your bits and bobs, in which case it may have a longer span), whilst evergreen content will go on and on.

Evergreen content is deep and drills down into specific topics further than most posts tend to and is usually bigger than the average piece of content.

Using the cognitiveSEO tool allows us to quickly see which content on a competitor's website is evergreen and still receiving social signals long after the publishing date.

How to Create Evergreen Content

There are certain things to be mindful of when creating evergreen content. The most important is that it’s not you who decides what is evergreen and what is not, it is the readers who will decide. This means we need to follow certain rules to allow for the maximum chance for your content to be considered evergreen.

1. Your content should be useful and remain useful.

It must serve a purpose that your reader finds interesting or solves a problem and it should continue to be interesting. If something changes in the world that causes your content to lose its appeal, then it is no longer evergreen.

2. It must attract those at the early stages of a subject. Yes, noobs = profit.

The reason for this is that many more people start something than finish it, and not all who attempt a subject achieve mastery; thus there are many more people at the beginner stage than at the expert stage. By focusing your content on the new-to-the-subject crowd, you are guaranteed a constant stream of people who are interested in information that will help them out.

3. The topic of the evergreen content should focus on a specific aspect and not be a rambling view of a subject.

It should offer a promise that after the reader has consumed your content they will have earned knowledge that can be applied in their own lives to be of benefit. For example, “How to Take the Perfect Landscape Photograph”, narrows down the exact subject matter and the benefit of reading the article.

4. It should be of high interest within the niche you are in.

There are some subjects that are hot in your niche and if you are experienced within your niche you will know them. These are the subjects that you can build your evergreen content on. Again, the Social Visibility tool on cognitiveSEO.com will help identify specific, hot subject areas.

How Your Competitors Can Show You Which Evergreen Content Works in Your Niche

Using the Social Visibility tool on cognitiveSEO reveals social signals over time, and it is an ideal way to identify evergreen content on a competitor's site. If we notice a post that was created some time ago and is still getting social signals after the initial promotion, there is reason to believe that people consider this content still valid after the date of publication, and it is therefore evergreen content. It will require further research to determine whether the content will stay valid over many years, but as long as the content is still getting interest it can be considered evergreen.

Important questions to ask when analysing for evergreen content:

When was it published?

Is it still creating social signals after the initial period of publication and promotion?

What is the cause of the continual interest? Isolate the key factors in the content.

Can the content be updated further, making it more attractive?

Here is a step-by-step visual guide to using the Social Visibility tool to find evergreen content.

Go to Social Analysis by clicking on the Social Visibility tab on the cognitiveSEO tool.

Check the Anti-Noise Technology box. This gets rid of duplicate pages: because of the way web pages can be submitted to social media, the same web page may be submitted under different URL addresses. By excluding possible duplicates we get rid of this problem.

Scroll down to "Pages with Shares Increase for…".

Here we see that the account we are using is a recent one, first crawled on the 30th of January, with the latest crawl on the 6th of February, which means we will be working with one week of data between the first crawl and the latest.
You can select multiple data points for the period you want to analyse, but in this case we are going with the last crawl.

We see a table with web pages, share increase and a breakdown of the social signals and referring domains.

We can see straight away that listing 2 has had a share increase of +81. This is the largest increase in social signals in the top 10 and is worth investigating. Clicking through to the web page, we find it was a piece of content published on October 13th, 2014 and titled Unmasking The Hidden Digital Marketing Strategies of 9 Successful Startups.

By hovering over the +81 we get the following pop-up which gives us further data about the social signals this page has generated since the last crawl.

We find out that the +81 represents about 4% of the signals gathered previously, which total a very large 1,648 tweets, 520 Facebook likes and 78 Google+ shares.
Scanning the content, we see that it's made up of stories of digital marketing strategies from multiple startups.
The recent social signals, amounting to 4% of the total the page has received, tell me that the content is still getting not only read but shared too.

This indicates to me that the topic has strong evergreen value.
But what is very tasty to me is that we can also see the number of referring domains, 13, by hovering over the green referring-domains bar.

Clicking the green bar brings up a popup of the referring domains, from which we can click through to each web page linking to the evergreen content. When looking at what a competitor is using for evergreen content, I can not only identify which content is working and which is not, but also see who is interested enough in the content to link to it.

This technique is very powerful when researching a niche and discovering which type of content is evergreen and still getting social signals. If a web page is consistently getting regular social signals, it is a strong indicator that the topic is one which could produce consistent results if covered on our own website.

Conclusion

What is evergreen content?
It is content that is timeless and generates interest long after the initial publishing date.

Why is evergreen content important?
It demonstrates a deep understanding of the subject and allows you to scale interest over time.

How do you create evergreen content?
Identify the subjects that are evergreen, address the issues which readers find interesting and solve their problems. By using the Social Visibility tool on cognitiveSEO you can identify what is working in your niche.

Traffic Boosts Organic Rankings? New Research Reveals Some Interesting Facts
Thu, 05 Feb 2015

One of the things we do here, aside from trying to help others figure out how search engines work, is to deeply research any pattern and data we can in order to best figure it out for ourselves. We keep a close eye on Google and the like, ask questions, and sometimes even experiment to make sense of what goes on behind the scenes. But as is the case with all human observation (from quantum physics to sociology), it is impossible to study something without, at least at some point, becoming part of what you study.

1. The Story – 20k Visitors from Reddit = Boosted Rankings

We became part of our own study recently, when an article we published 4 months ago got picked up by Reddit, the self-professed "front page of the Internet", in one of its sections (called subreddits). The article was an attempt at predicting how Google might "read" and rank your images in the near future, and it was fittingly picked up by the Futurology subreddit. Whether Reddit really is the front page of the Internet is open to debate, but the community is definitely a force to be reckoned with: the Futurology subreddit alone boasts (as of now) over 2.2 million subscribers. Not all of them may have been actively accessing the subreddit at the time our article got onto it, but we definitely felt an immediate and significant effect of our presence over there. What is interesting, beyond the effect itself, is how to interpret it, as we will shortly discuss.

2. The Huge Referral Traffic Spike on cognitiveSEO

The immediate effect that we saw on closer analysis was the impressive (if you look at the chart below) spike in traffic around the day it was picked up by Reddit (January 5th). Within one day, the traffic on our blog hit 20,000 visits, an impressive feat by any standards, but also quite a singular one in recent history if you look at the rest of the chart. Of course, quantity does not necessarily equal quality, as evidenced by the no-less-dramatic drop in average session duration (almost inversely proportional to the spike in the number of visitors). On average, the users that day spent about 9 seconds on our pages. But before we sink into the pool of "youth-these-days" disappointment, remember that averages are statistical tools which can easily deceive. In this particular case, the average is a relative measurement which sometimes blurs the facts: it is likely that the absolute number of people who spent a really long time on our page was still as high as (if not higher than) on any other day. Their numbers simply got cancelled out in the sea of users who clicked and bailed within a couple of seconds.
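A toy calculation makes the point. The numbers below are purely illustrative, not our actual analytics data, but they show how a flood of short visits collapses the mean while the count of long sessions stays untouched:

# Hypothetical session durations, in seconds.
regular = [180] * 300       # 300 engaged readers at ~3 minutes each
drive_by = [9] * 19700      # 19,700 drive-by visitors at ~9 seconds each

sessions = regular + drive_by
average = sum(sessions) / len(sessions)
print(round(average, 1), "seconds")  # ~11.6 - the average collapses

# The 300 long sessions are still there; the relative measure just hides them.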

Statistical clarifications aside, the fact remains: a dramatic increase in the number of visitors does not necessarily mean a dramatic increase in quality visitors.

3. The Traffic Statistics for the cognitiveSEO Blog

So what exactly happened to make so many users feel the need to leave the page as soon as they got there? The bounce rate was at an all-time high (96.19%), meaning that the vast majority just clicked, glanced and left in a hurry, as you can see in the chart below.

Accidental clicks (followed, naturally, by hurried departures) are probably responsible for a large portion of the bounces, but surely they can't all have been accidents? The profile of the Reddit user is also something to take into account at this point. They are likely avid readers, parsing through articles in rapid succession without necessarily scanning the title in depth before clicking on it, and only deciding whether they want to read it seconds after the page has loaded. Another scenario to consider is that Reddit is first and foremost a community, and its users are primarily preoccupied with interaction (even when it's for the purpose of finding out information). The comments section is often the actual prized content and not just a redundant annex to it. It is then not unthinkable that, while most people who would comment on the topic would want to read the article, a considerable number were more interested in the comments than in the article itself. There are probably a lot of other factors at work here to explain the staggering bounce rate. Despite all the bounces, however, the mere presence of so many visitors in so little time led to an even more impressive effect.

As expected and shown above, the average session duration was very low. Yet, as the high traffic went away, the average session duration increased. This means that our usual reader base maintained its behavior, and after all the buzz, our traffic got back to normal.

4. The Keyword Ranking Boost on Ultra Competitive Keywords

Because the article was all about Google Images, it actually got noticed by someone else, namely Google itself. Well, it was probably noticed the first time around, but this time it got really noticed. Like that time in college when someone threw a party and everyone came: people knew that person before, but after the party they really knew him or her, at least for a while. Much like this fictional American-themed example, our fame was intense but short. We had a truly impressive visibility spike for what is undoubtedly a very competitive phrase ("google images"), getting from the 74th position to the 8th in only one day! That is a feat by any standards. Yet, as you can see from the chart below, after just a few days our rankings took a big downturn.

We also won a lot of points in terms of SEO visibility, going from around 500 to around 1,350 (almost three times more). Alas, our visibility fell less abruptly than it rose, but by the end of the month the levels had gotten back to where we started. By then, it was probably obvious to Google that what had happened on the 5th had been an isolated case and not the rule.

Although the saying goes "impressions don't buy, real people do", we cannot ignore the high number of impressions registered on the 5th of January: almost 60,000. For a quick reminder, impressions are the number of times a post is displayed, whether the post is clicked or not. People may see multiple impressions of the same post: for example, someone might see a page update in their news feed once and then see it a second time when a friend shares it, and so on. Even so, as you can see from the chart below, the number of impressions reached for the queries containing "google images" is impressive and needs to be taken into consideration as an important factor in our present research.

Our 15 minutes of fame lasted a bit longer, but not by much. Everything happened on the 5th, as the organic traffic spike confirms. You can see this yourself in the screenshot below, taken from Google Analytics.

As more and more people saw the thread, up-voted it and commented on it, the post got pushed higher and higher and started to attract even more views and comments. Soon enough, it was all a matter of the rich getting richer. As with most threads on reddit, the buzz eventually died out and other posts took turns at the popularity contest. Posts usually lose popularity on reddit because others start to down-vote them and balance out the up-votes. This usually happens naturally, but sometimes it can also be influenced by users with administrative rights so as to keep a steady influx of new threads. Either way you look at it, the decrease of the organic traffic was a normal occurrence. The interesting fact was the direct influence the increase (and decrease) of the number of visitors had on the increase (and decrease) of the page ranking in the search engine.

An interesting article written 4 years ago brought into discussion the same issue, still topical apparently:

Can a dozen of K new visitors influence the position in the SERPs? It looks like it could be.

Yet the author also stresses the importance of concentrating on obtaining links from webpages that can convert traffic, rather than looking at outdated metrics such as PageRank. So concentrating on obtaining quality traffic is crucial, and you can monitor that traffic using free tools like Google Analytics. Nevertheless, we cannot ignore the benefit of referral traffic in terms of how it influences rankings within Google. As in our case, it seems that obtaining links which provide a high volume of referrals and click-through data influences rankings within search engines. We cannot draw a 100% certified conclusion here, but, with our own experience as a starting point, we can assume that referral traffic may influence rankings. More research needs to be done on this matter (and we will probably make an analysis of this kind in the near future), but I think our findings describe a situation that needs to be explored and from which there are a lot of lessons to be learned.

5. Could the Links Ranking Signal Actually Have Boosted Our Rankings?

Of course, this whole analysis is basically trying to establish a cause-effect relationship between two elements: the referral traffic and the rankings in the SERP. The basic conditions seem to be met: one happened before the other, a co-variation in one corresponded to a co-variation in the other, and the two are relevant to each other.

However, we have ignored one additional criterion of a causal relationship: that there are no alternative causes that could lead to the same effect. There is one such possible alternative in our scenario: the Reddit link that was posted was a dofollow link that could have potentially influenced the ranking of the page. Looking at our analysis below, you can see that the subreddit has both a high link authority and a high domain authority. While the actual measurable value and impact of the referral link is not very clear, it is definitely a different factor to take into account than the simple increase in number of visitors.

Could a single referring domain cause such a spike in rankings, and within only a couple of hours of the link being posted?

We cannot be entirely sure about this, but it is not unlikely.

We picked up some of the Reddit links and analyzed them in order to see whether the scenario in which one domain could have produced such a boost was possible. As we browsed the links, we observed that most of them were indeed links from pages with good domain authority, yet they were nofollow.

6. Conclusion

So could one link from a high-authority domain boost the organic rankings to the top almost instantly for such a competitive keyword, or did the actual traffic signal boost them? This is open to testing, but my gut feeling tells me it is a combination of the two signals, and that both links and traffic contribute to considerable spikes like the ones we have seen in our case. There is a lot to discuss here, and as listening to others' opinions and analyzing them is one of a researcher's best skills, we would be more than happy if you shared your opinion on this matter.

So, what do you think? Does referral traffic influence the SERPs? Or can only one referring domain boost a site’s rankings from a low position to a top position in only a couple of hours?

Powerful SEO & Digital Marketing API Launched by cognitiveSEO
Tue, 27 Jan 2015

We are extremely pleased and excited to announce that the cognitiveSEO API has been launched! That's right! After a long wait, we've finally released the first version of our API. The time was well spent, we say, as we wanted to deliver a cutting-edge, high-quality product. With our freshly released API you […]

We are not going to say too much here, but you can check out the dedicated SEO API page that explains it all, as well as the documentation, which provides full information on all the commands and how to use them. And if you still have questions or suggestions, or you just want to find out more, just write to us at support@cognitiveseo.com and we will be more than happy to reply.