Thursday, 31 January 2013

URL shorteners are increasingly used on social media sites like Twitter and Facebook. Since a microblogging site like Twitter allows only 140 characters per post, URL shorteners are gaining in popularity. But how does shortening a URL affect its search engine ranking and Search Engine Optimization (SEO)? Does the link credit go to the original article, or is it lost somewhere in between?

Matt Cutts, a well-known Google staffer, recently answered this question in a video. According to him, URL shortening does not harm an article’s Search Engine Optimization as long as the shortener uses a 301 redirect to the original link. A 301 redirect is a permanent redirect and passes the credit to the actual link. Most URL shorteners, like bit.ly, use 301 redirects. Such redirects help create inbound links to the article/site and consequently help its Search Engine Optimization.

However, some URL shorteners use 302 redirects. These are temporary redirects that pass little or no link credit to the original article and place more emphasis on the shortened link. Such URL shorteners are better avoided.

So in case you had some worries about using URL shorteners for SEO, rest assured: they only add to your link credit. You just have to ensure that they use 301 redirects.
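You can actually check this yourself before committing to a shortener, by inspecting the status code it answers with. Here is a minimal sketch using only the Python standard library (the helper names are my own, and the example short URL in the comment is a placeholder, not a real link):

```python
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects so the raw status code is visible."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def redirect_status(short_url):
    """Return the HTTP status code the shortener answers with."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        return opener.open(short_url).status
    except urllib.error.HTTPError as e:
        # With redirect-following suppressed, urllib raises for 3xx responses;
        # the status code we want is on the exception.
        return e.code

def passes_link_credit(status):
    # 301 is a permanent redirect and passes link credit to the target;
    # 302 is temporary and may not. (308 is the modern permanent variant.)
    return status in (301, 308)

# Example (requires network access; URL is a placeholder):
# passes_link_credit(redirect_status("https://bit.ly/your-short-link"))
```

If `passes_link_credit` comes back `False` for your shortener of choice, pick a different one.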

Tuesday, 29 January 2013

That’s just stupid, honestly. I’ve seen so many “Is SEO Dead” threads
and articles popping up everywhere telling people that all hope is
lost, and that’s just ridiculous. SEO will NEVER die, period!

You simply need to adjust your strategy a bit, write high-quality content and stick to white-hat SEO techniques if you’re doing any form
of link building. That’s it, that’s all you need to do to “conquer” any
update Google throws at you.

If you write something on your website, ask yourself the following
question: “Is this what I want to see on Google’s first page if this is
what I searched for?” If you can answer honestly and it’s “yes”, then
you should be fine. It’s all about relevancy and user experience. If
Google feels you’re REALLY contributing to the WWW then they won’t touch
your rankings. But if you’re doing anything sneaky to try and fool
Google or just try to make a quick buck without providing sufficient
value to your readers, whatever good rankings you have now certainly
won’t last long. People who say SEO is dead have no idea what they’re
talking about; they’re quitters, wannabes or, most likely, both.

Here’s a breakdown of the two algorithm changes made by Google
that you can inspect, fix and master to avoid ever getting
slapped again.

Farmer/Panda Updates: As stated before, the Panda update
was implemented to remove article directories and thin-content sites
(like minisites) from the SERPs. The de-indexing of many article
directories led to a lot of sites losing backlinks on a major scale,
which is what caused some sites’ rankings to drop dramatically. More
recent updates (Dec 2011 – Apr 2012) targeted private blog networks like
SEO Link Vine (this one was hit in Dec 2011, judging from personal
experience), Build My Rank and High PR Society, just to name a few. Panda
was also said to target web 2.0 sites, but this varies depending on the
amount and quality of content posted on those sites.

How to Fix your Site that was Hit by Google Panda?

Easy, focus your link building efforts on social networks and don’t
spam the living hell out of the internet with every flashy SEO tool you
can get your hands on. Stick to white hat SEO and you’re good to go.

The Dreaded Penguin Updates: The Penguin update
focused more on the on-page SEO of sites, penalizing sites that stuffed
too many keywords into every post or page. Sure, it also targeted
anchor-text backlinks and obvious footprints from backlinking, but it is
more focused on your on-page content itself and not so much on the
off-page factors.

How to Fix your Site after being Penguin Slapped?

This is what I did, and it worked. Since I myself have been guilty
of a bit of keyword stuffing here and there (especially in the older
posts), I’ve had some keywords drop from page 1 into oblivion
overnight. Here’s what I did to fix my site after being hit by the
Penguin updates.

I edited each and every one of my posts that made me money and removed
many duplicate keywords to get a keyword density of between 0.8% and
1.2%. My rankings are coming back in a big way; here’s a screenshot of
my Google Webmaster Tools account (on the left) to show you proof
that this is the way to “beat Google Penguin”. These rankings had
disappeared but are making a big comeback now that I’ve edited my posts.
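To make that fix concrete, here is a minimal sketch of a keyword-density check. The 0.8%–1.2% band is the one described in this post, not an official Google number, and the function names are my own:

```python
import re

def keyword_density(text, keyword):
    """Occurrences of `keyword` as a fraction of total words, in percent.

    Handles multi-word keywords by counting how many words of the text
    belong to a keyword match.
    """
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    kw = keyword.lower().split()
    n = len(kw)
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == kw)
    return 100.0 * hits * n / len(words)

def in_safe_band(density, low=0.8, high=1.2):
    # The band this post suggests aiming for after a Penguin hit.
    return low <= density <= high
```

Run your money posts through `keyword_density` and prune duplicate keywords until the figure lands inside the band.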

Another thing that triggers the wrath of the Penguin is
linking to unrelated sites. So delete any outbound links to unrelated
sites, because those will get you slapped faster than
you can say “Penguin”.

That’s it, that’s all you need to do! I also suggest following Matt
Cutts on Google Plus and keeping a close eye on what he has to say. Know
that it will take some time to get your rankings back, but if you follow
my advice they are sure to come back sooner rather than later. Know that
SEO is NOT dead and it will never die, so ignore anyone telling you
otherwise.

You now have everything you need to know to recover from both Panda and Penguin updates, so get to work and focus on producing great content mixed with relevant, white hat link building techniques.

Thursday, 22 November 2012

Last week, in an interview with Eric Enge, Matt Cutts mentioned
that Google might discount infographic links in the future. To quote Matt:
“…if at some point in the future we did not start to discount these
infographic-type links to a degree. The link is often embedded in the
infographic in a way that people don’t realize, vs. a true endorsement
of your site.”

As a justification for this probable move, he cited a few reasons:

“What concerns me is the types of things that people are doing with
them. They get far off topic, or the fact checking is really poor. The
infographic may be neat, but if the information it’s based on is simply
wrong, then it’s misleading people.”

He also mentioned, “people don’t always realize what they are linking to
when they reprint these infographics. Often the link goes to a
completely unrelated site, and one that they don’t mean to endorse.”

So to summarize, the three reasons why Google might discount infographic links in the future are:
- The infographic could be far off topic in relation to what the business deals with
- The fact-checking behind the infographic is really poor, resulting in misleading info
- People don’t realize what they are linking to when they republish an infographic

And for these reasons, Google might discount all infographic links. Really?
Are you kidding me? It is completely ridiculous, and it seems Google is
increasingly developing a God complex.
Google has always talked about creating extraordinary content that
people would love to link to, and now that people have identified a
definitive form of such content, they want to discount those links.

Let’s take a more detailed look at the points mentioned above.

Off-topic Infographics: Yes, this could definitely be a
valid reason to discount the links. If we deal with SEO and
publish an infographic on the most influential political leaders of the
world, there is every reason and justification for Google to devalue
any link the site gets through it, and they also have the capability
to judge the contextual relevancy of the graphic to the overall theme
of the website.

Poor Research Data: How is Google going to determine the
quality of the research data? In an infographic, all research data is
graphically represented, and while Google might have really advanced
its capability to read and understand images, I don’t believe it is
anywhere close to interpreting graphically represented research data.
The only option is manual verification, which is neither a scalable nor a
feasible process given the volume of infographics published. Also,
two different reputable sources could give two different values for the same
data point; what if Google looks at a source other than the one you used
for your infographic? Does that depreciate the data quality of
your infographic?

People don’t realize what they are linking to when republishing infographics:
Really? Are webmasters and content editors that foolish? Someone who
maintains a good-quality website (because that is already a
prerequisite for the link to be valuable) would definitely be wise
enough to know and check what they are linking to. For a second, let’s
accept that webmasters are foolish enough to link to a website without
checking it. In such a case, whose responsibility is that? When I link
to a website from my site, in whatever form, it is my
responsibility to check what I am linking to; if I link to
something wrong, irrelevant or unethical, that should count against me and
not the site I am linking to. So in this case, if Google has to
take any action at all, it should be against the republishing website
and not the site that created the infographic.

I have worked on several infographics for different projects and websites,
and know for sure that an infographic with poor data or poor graphics would
never succeed (yes, we tried that too and learnt from the mistake).

How do infographics get links?

Let’s look at how infographics get their links. Once you create an
infographic, the first thing you do is publish it on social
media channels, and as it starts getting shared, it catches the attention
of bloggers, who start republishing it. Now, the prerequisite here is the
infographic getting “shared”, and that only happens when it is of a certain
quality and actually provides some interesting or useful information for
the readers. So if the content isn’t of good quality, it won’t get shared,
nor will it get a substantial number of links. And when people have
endorsed the infographic through social sharing (and consequently
by linking), why does Google have a problem with it?

Of course, there are other ways to get links for infographics, like
mailing bloggers directly, doing press releases, etc., but even there,
anyone who republishes an infographic would definitely spend a couple
of moments evaluating its quality, so Google discounting
these links seems like sheer disrespect towards people’s judgement.
This is unbelievable arrogance resulting from Google’s monopoly in
the search space.

Is Google Socially Blind?

Search engines today rely increasingly on social data, and in this
case social data could be one of the key indicators of the quality of
an infographic. Do we really have to believe that Google doesn’t
have the access or the capability to judge the social response to a page? And
when they see a major positive reaction, isn’t that enough to tell them
about the quality of the content?

The Embed Code Issue

Google can definitely have some problem with the embed codes that are
provided with infographics, as an embed code proactively suggests the link and
gives every publishing site the same anchor-text
link. However, with Penguin in place, it should not be a tough job for
Google to work out the anchor-text bit. But if no embed code is
provided, there will be a ton of people copying and republishing
infographics without crediting the original source; what happens then?
We have seen Google credit authority websites when they republish
great content that was originally created by lesser-known
sites, and while most reputed bloggers do provide the necessary citation to the
source, I have encountered two cases where extremely reputed
authority sites published our infographics without any credit
(they added a link to us only after we requested that they mention us as
the source). For one of those infographics, Google still ranks the
authority site above our site, even though the original site has received
plenty of links and social mentions. In this situation, can a business
investing in a good infographic really afford not to use an
embed code?

I look at providing an embed code as an initiative to make the content more
linkable. If you are creating good content that you know people are
going to love and link to, what is wrong with making it a little easier
for them?
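For readers who haven’t built one, a sketch of what a typical infographic embed snippet looks like, generated here in Python. All URLs and the anchor text are hypothetical placeholders; using the brand name as the anchor, rather than an exact-match keyword, avoids stamping the same keyword anchor across every site that reprints the graphic, which is the Penguin-style footprint discussed above:

```python
# Build the HTML snippet a republisher would paste into their post.
# Every value passed in below is a placeholder for illustration.
def embed_code(img_url, source_url, anchor_text):
    return (
        f'<a href="{source_url}">'
        f'<img src="{img_url}" alt="{anchor_text}" width="600"></a>\n'
        f'<p>Infographic by <a href="{source_url}">{anchor_text}</a></p>'
    )

snippet = embed_code(
    "https://example.com/images/seo-infographic.png",  # hypothetical image URL
    "https://example.com/seo-infographic/",            # hypothetical source post
    "Example SEO Blog",                                # brand anchor, not a keyword
)
print(snippet)
```

The credit line below the image is what protects the original creator in the “authority site republished it without attribution” scenario described above.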

I can understand if they decide to discount links coming from
infographic directories, as anyone can get a link from those, but saying
that they might discount the links an infographic receives sounds
ridiculous. It is as good as saying they may devalue the organic
links you have earned by creating some awesome content that loads
of people loved, linked to and shared.

This is one of those frustrating moments when I really wish Google had a
strong competitor that would make them think twice before
contemplating such ridiculous steps.

Ranking no. 1 in Google for targeted keywords is definitely the dream
and objective of each and every SEO consultant. However, it has never
been easy, particularly in these troubled times when Google has
successfully created an unnerving experience for SEOs using the cutest
of animals: Panda and Penguin.

But well, all is not lost. Matt Cutts here tells you a simple formula for
ranking #1. Watch the video and get your SEO guns blazing.

So, are you ready to follow his instructions?

* This post is just for fun and is actually a mash-up of multiple
genuinely helpful videos from Matt, put together by Sam
Applegate. Please do not follow the instructions given in the video for
SEO, as they are sure to backfire.

Friday, 9 November 2012

Parked domains are discouraged by Google, but their official pages tell a different story

Matt Cutts answers questions asked by people in Google’s ‘Webmaster Help
Forum’. Now, that’s not news, but yesterday, instead of selecting a
question from the forum list, Matt opted to answer his own question. He
probably did this intentionally to convey a valuable message to
webmasters worldwide.

The question is as follows:

“I have a parked domain and want to launch a
new website on it. Are there any pitfalls I should avoid? Should I keep
my domain parked or put some sort of stub page there?”

Matt mentions that Google has a parked-domain filter, or detector, that
prevents these parked pages from appearing in Google’s search results. He
mentions that if you have your domain parked for a while before
launching the actual site (rich with content and links), it takes time
for this filter to advise Google’s algorithm that the
site is no longer ‘parked’. Matt advises users to add a paragraph or
more mentioning that the domain will be the future home of the XYZ site,
along with more about the business that is getting ready to kick-start
or that already exists. He warns that leaving the domain bare, without
content, because you lacked business ideas at the time you purchased it,
will end up with the filters detecting your launched website a little
later than usual.

What is a Parked Domain?

Parked domains are additional domains, placed without any content, that
mainly serve advertising purposes for a primary domain. These domains are
single-page websites often set up by webmasters for future use. You can
also launch a parked domain right before the actual launch of a website.

Data Refresh

Just a refresher on an old blog post by Google,
posted back in 2011. As mentioned in that post, one of the
search refreshes Google made concerned ‘parked domains’: not
showing them in search results anymore.

“New “parked domain” classifier:
This is a new algorithm for automatically detecting parked domains.
Parked domains are placeholder sites with little unique content for our
users and are often filled only with ads. In most cases, we prefer not
to show them.”

It doesn’t make sense when they lead us to a ‘setup instructions’ page
and a ‘Help Center’ page (both landing on a single page), which
deliberately encourage and fuel the creation of parked pages. The rule-maker
is the rule-breaker here! Does this not apply to Google’s AdSense for
Domains, which cheered for parked pages?

About Me

Amit here, looking for strong social networking groups within the professional sector. I love watching cricket, travelling and listening to music, and I’m crazy about all things internet. I have approximately 3 years’ experience in Internet Research.