ICMR-4Ps B&M brings to you inet, a comprehensive ranking of India's most popular websites, based on a thorough analysis with research inputs from primary as well as secondary sources.

4Ps Business & Marketing magazine, in association with Indian Council for Market Research (ICMR), conducted a perception survey among Internet users to identify India’s most popular websites. Here’s how we did it:

FIRST PHASE:

The first phase of the research began with a single question to the respondents: “Which are your top five most preferred websites?” This survey was conducted among 1,000 respondents across five Indian cities – Delhi, Mumbai, Kolkata, Chennai and Bangalore. The respondents included students, housewives, working professionals and others, so as to generate a list of preferred websites across various categories. Based on the frequency of responses, the websites were first sorted and placed into their respective categories. The following is the list of categories covered under the survey.

SECOND PHASE:

As per the table given above, 23 broad categories were taken into consideration for the next phase of the survey. Structured questionnaires were designed to conduct one-on-one interviews. The research was conducted in five major cities of India – New Delhi, Mumbai, Kolkata, Bangalore and Chennai – with an aggregate sample size of 1,089 respondents.

The respondent profile varied across different categories with their respective age groups, gender and education levels. Each of the respondents was asked to rate the websites on a scale of 1-10 (where 1 is the lowest and 10 is the highest) on the following parameters:

User Friendliness: The main goal of a website is to provide its visitors with a pleasant and fulfilling experience. User friendliness includes factors such as page loading speed, accessibility, navigation and information.

Look/Design: The parameter of look/design is also considered important as it creates the first impression on the visitor. The look/design of the website is primarily based on colours, graphics, white spaces and other technical factors.

Content Quality: A website is known for its content quality apart from the overall look and design. The content quality/credibility of a website is what drives a visitor to make frequent visits to the website.

Trustworthiness: At present, there are a number of websites available for online shopping, money transfer, photo/video sharing and other transactions. In this context, overall privacy and security are all the more important for a visitor to a particular site.

Overall Satisfaction: This parameter was included to ascertain the respondents’ perception on the website in a holistic sense.
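The five parameters above can be combined into a single survey score per website. As an illustration only – the averaging method and the ratings below are assumptions, not the magazine's published formula – a simple approach is to average each respondent's 1–10 ratings across the parameters, then average across respondents:

```python
# Illustrative aggregation of the 1-10 parameter ratings described above.
# The equal-weight averaging scheme is an assumption for this sketch.

PARAMETERS = ["user_friendliness", "look_design", "content_quality",
              "trustworthiness", "overall_satisfaction"]

def survey_score(responses):
    """responses: list of dicts mapping each parameter to a 1-10 rating."""
    per_respondent = [
        sum(r[p] for p in PARAMETERS) / len(PARAMETERS) for r in responses
    ]
    return sum(per_respondent) / len(per_respondent)

# Two hypothetical respondents rating one website:
score = survey_score([
    {p: 8 for p in PARAMETERS},   # respondent 1 rates everything 8
    {p: 6 for p in PARAMETERS},   # respondent 2 rates everything 6
])
```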

THIRD PHASE:

The final overall ranking is a weighted average of Alexa rankings and the ICMR survey rankings. Based on expert opinion and peer review, Alexa rankings were given a 30% weightage and ICMR rankings a 70% weightage.
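The 30/70 weighting can be sketched as follows. This assumes each website already has a normalized 0–100 score from each source; the site names and scores are hypothetical:

```python
# Sketch of the 30% Alexa / 70% ICMR weighted average described above.
# Scores are assumed to be normalized to a common 0-100 scale first.

ALEXA_WEIGHT, ICMR_WEIGHT = 0.30, 0.70

def overall_score(alexa_score, icmr_score):
    return ALEXA_WEIGHT * alexa_score + ICMR_WEIGHT * icmr_score

# Hypothetical normalized scores: (alexa_score, icmr_score)
sites = {
    "site-a.example": (90.0, 70.0),   # strong on Alexa, weaker in survey
    "site-b.example": (60.0, 95.0),   # weaker on Alexa, strong in survey
}
ranking = sorted(sites, key=lambda s: overall_score(*sites[s]), reverse=True)
```

Because the survey carries 70% of the weight, site-b.example (strong in the survey) outranks site-a.example despite the lower Alexa score.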

[Editor's Note: Researchers should note that Alexa rankings, like any other Internet ranking tool, can be manipulated to a far greater extent than primary responses from a sample. Researchers should also note that a few websites that might otherwise have scored extremely high may have a very low Alexa rank because their internal server security prohibits Alexa's web crawlers from accessing them (typical examples include Gmail, Hotmail, Yahoo! Mail et al).] Consequently, certain websites featured in the ICMR survey list but do not have an Alexa ranking; these websites have not been considered in the overall ranking, though their individual scores are revealed in the following pages. Also, while many websites rank high in their respective categories, they may have slipped in the overall rankings because other websites serve more mass functions (search engines, social networking, web portals, et al).

Note: The Alexa rankings we have considered for the respective websites are ranks as on August 13, 2012. Also, certain categories have been omitted in the second phase of the survey e.g. Torrent sites, websites with adult content, et al.

For the past several years, Google has been continually updating its PageRank algorithm to clear out web spam and reward websites that play by its rules. The search engine giant has struck again, this time with “Penguin”. What does it mean for search engine optimizers (SEOs), and how can they survive the attack...

The world turned upside down for the whole new breed of SEO (Search Engine Optimization) experts across the globe when Ken Krogue, a contributor at Forbes Magazine, penned an article titled “The Death Of SEO: The Rise of Social, PR, And Real Content” on July 20, 2012. It was followed by another article titled “The Death Of SEO: Generating Real Content” on August 4, 2012. What followed these two articles was not just a flurry of responses and feedback (the two articles have been viewed by some 1,10,426 people so far), but also thousands asking, “Is it really true?”

In one of his pieces, Krogue had written about an exhaustive conversation he had over lunch with Adam Torkildson, a Utah (United States) based SEO consultant, sometime back in March 2012. The revelation: “Google is in the process of making the SEO industry obsolete; SEO will be dead in two years.” Krogue doubted the words, but a month later, on April 24, 2012, when Google proved Torkildson right by announcing the launch of an all-new search engine algorithm code-named “Penguin”, he shared the conversation with the world, creating a sensation.

It’s a fact that whenever Google has come out with a new algorithm for PageRank (a link analysis algorithm used by the search engine and named after co-founder Larry Page, which assigns a numerical weighting to each element of a hyperlinked set of documents, such as the World Wide Web, with the purpose of measuring its relative importance within the set), it has created ripples across the digital world. After all, every single individual or company wants to be in the top league when it comes to Google’s search results page – a miss means losing out (in terms of customers, brand recall, et al) to a competitor that has managed its way to the top.
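The core PageRank idea described above can be sketched in a few lines. This is a minimal textbook-style power iteration with a standard 0.85 damping factor, not Google's actual implementation, and the three-page link graph is invented for illustration:

```python
# Minimal sketch of the PageRank idea: iteratively redistribute each
# page's score to the pages it links to, with damping factor 0.85.
# This is a simplified illustration, not Google's production algorithm.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with equal scores
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:                     # dangling page: spread evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                for target in outlinks:
                    new_rank[target] += damping * rank[page] / len(outlinks)
        rank = new_rank
    return rank

# Three hypothetical pages: B is linked to by both A and C,
# so it ends up with the highest relative importance.
ranks = pagerank({"A": ["B"], "B": ["C"], "C": ["A", "B"]})
```

The scores always sum to 1, so each page's rank can be read as its share of importance within the set, exactly the "relative importance" the parenthetical above describes.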

If industry experts are to be believed, Google alters its PageRank algorithm over 500 times a year. Most of these changes are minor, but major updates keep happening at regular intervals, keeping search engine optimizers on their toes. A case in point is Panda, a prominent change to Google’s search ranking algorithm that was first released in February 2011 (interestingly, there have been about 13 data refreshes of it since then). The algorithm made news for cracking down on sites with poor-quality or plagiarized content, as well as sites with a high ad-to-content ratio. Its impact on PageRank too was evident: CNET, a tech media website, reported a sudden rise in the rankings of news websites and social networking sites, and a drop in rankings for sites plagued with advertisements.

Interestingly, despite this and several other changes to the PageRank algorithm, the SEO industry has not just survived but flourished. In fact, a new eMarketer study says that search ad spend is expected to grow 27% from 2011 to 2012, from $15.36 billion to $19.51 billion. By 2016, it is expected to reach almost $30 billion annually. As expected, Google sites lead the US explicit core search market with a 66.8% market share, followed by Microsoft and Yahoo sites with 15.7% and 13% respectively (as per the latest data released by comScore on August 15, 2012). If this is the situation, then why is there such an outcry over the launch of yet another update – Penguin?

When 4Ps B&M got in touch with Torkildson, who made the shocking revelation, he gave us the same response he had been giving to a host of bloggers and writers since the statement was published in Forbes (although this time he sounded more diplomatic than the way he was quoted by Krogue). “Google is definitely pretty far away from being able to personalise everything, and make their search results 100% accurate. But they are working on it very hard, and by invoking Moore’s Law on their behalf, if their algorithm today doubles in effectiveness two years from now, most search engine optimizers will be out of a job just because of that fact,” says Torkildson. Pessimism is evident in the way he puts it, but is that really the situation?

When the launch of Penguin was announced by Google on April 24, 2012, Matt Cutts, Head of the webspam team at Google, had stated on the Official blog that “in the next few days, we’re launching an important algorithm change targeted at web-spam. The change will decrease rankings for sites that we believe are violating Google’s existing quality guidelines. We’ve always targeted web-spam in our rankings, and this algorithm represents another improvement in our efforts to reduce webspam and promote high quality content.” Therefore, one can say that it’s nothing but an ongoing battle between “Black Hat” search engine optimizers and Google, one which will probably never be resolved. The basic idea behind Penguin is to put a stop to “Black Hat” techniques and reward ethical SEOs.

For starters, while “White hat” search engine optimizers are the ones who focus on improving the usability of a website by creating quality content which is relevant to that particular website, “Black hat” search engine optimizers use unethical techniques and tactics (tactics such as keyword stuffing, cloaking, and content spinning), primarily by looking for shortcuts or loopholes, to rank a website higher than it deserves to be ranked.

Penguin targets all such tactics that search engine optimizers have been using for the past several years to improve a website’s ranking unethically. Agrees Todd Bailey, Boston-based SEO and Vice President – Marketing at WebiMax, as he tells 4Ps B&M, “If you had done a lot of these things, you will see dramatic shuffling of search indexes and drops in ranking.” To put numbers on it: while Panda affected about 12% of queries to a significant degree, Penguin affects about 3.1% of queries in English (about 3% of queries in languages like German, Chinese and Arabic, and an even bigger percentage in “highly-spammed” languages) to a degree that a regular user might notice. In fact, the percentage is likely to increase as Google launches further Penguin updates.