Mobile search, and thus mobile SEO, is a hot topic today. Apparently, much focus was put on mobile search at CTIA recently. If you follow news in the mobile search industry, you have probably heard of a new mobile SEO company that officially launched a few days ago: visibility mobile, based in Ireland and co-founded by Bena Roberts, a mobile search blogger.

visibility mobile introduced something that they claim will revolutionize mobile search and mobile SEO… It’s called MetaTXT, and it is basically a .txt file that you put in the root folder of your site (like robots.txt), whose goal is to tell search engines, browsers, and other applications where all the versions of your site are: mobile, desktop, RSS, podcast, etc.
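To make this concrete, here is a sketch of what such a file might look like. Note that the field names below are my own illustrative guesses, not the official MetaTXT specification:

```
# meta.txt — hypothetical sketch, served at http://example.com/meta.txt
# (field names are illustrative, not the official MetaTXT spec)
name: Example News
description: A news site about examples
desktop: http://example.com/
mobile: http://example.com/m/
rss: http://example.com/feed.xml
podcast: http://example.com/podcast.xml
```

The idea, like robots.txt, is that any client can fetch this one small file instead of downloading and parsing the full home page.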

visibility mobile hopes that search engines will support this file in order to improve the way mobile search works. I’ve read the paper several times, scratching my head dozens of times along the way, and I still don’t see how MetaTXT can improve anything in the mobile search industry. Plus, it would surprise me if Google or any of the leading search engines agreed to adopt a standard suggested by an SEO company.

It does help a bot find the mobile version of a site when it crawls the desktop version: but how is that different from having a mobile sitemap, for example, or simply linking to it? It’s true that linking is not as common on the mobile web as on the desktop web, but that’s because mobile site owners still don’t really realize that SEO can help them get more traffic.

Most mobile search engines will display web results first in their search results, and use proxies to transform web pages so that they display properly on a mobile phone. So if you do own a mobile site, you can use a link tag in your desktop site, as recommended by the W3C in its Content Transformation Guidelines and currently supported by Google (though not by Yahoo! OneSearch or Live Mobile Search), so that users are forwarded to that site instead.
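For reference, the markup in question is a single alternate link placed in the head of the desktop page, pointing to the mobile version; the URLs below are placeholders:

```html
<!-- In the <head> of the desktop page http://example.com/ -->
<link rel="alternate" media="handheld" href="http://m.example.com/" />
```

A crawler or transcoder that honors this hint can send mobile users straight to the handheld version instead of reformatting the desktop page.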

Some things in MetaTXT are great, though, such as the geolocation of your site: I don’t think people pay attention to their top-level domain or hosting location for mobile SEO, and that kind of information can help search engines return the most relevant results to users.

But for the rest, it seems to me like MetaTXT tries to solve a problem that does not exist…Or I’m just dumb.

Today, Bena Roberts released a “Mobile SEO whitepaper”, which you can find here (PDF). I was quite surprised that it was called a “whitepaper”, as it doesn’t really explain how to improve one’s mobile SEO strategy; it lacks depth, and the approximate vocabulary and jargon reduce the firm’s credibility.

- The first part of the document, “Online techniques vs mobile” (note that I don’t know why they chose to call the desktop web “online”; the mobile web is also “online”…), says:

The mobile web is going through a learning curve and more often than not OneWeb standards are implemented to direct users to the right mobile site or page. This means that HTML codes are inserted into sites that tell mobile search engines to redirect the page.

<link> rel=â€mobileâ€

This is insufficient. Moreover, it is a lengthy process where search engines are often redirected to the online site only to be forwarded to the mobile site.

Currently, it does what it is supposed to do: redirecting users to a mobile page. OK, there is a redirection, but it’s barely noticeable by the end user, and whether you like it or not, the most popular mobile search engines (Google has a 63% market share, and Yahoo! 34.6%, according to this new study from comScore) will display web pages first in search results, so this alternate link is “sufficient” if you want to ensure a good mobile user experience.

Another point they consider problematic is raised: the lack of linking between mobile sites. That’s true; there are many mobile sites out there that are highly popular but do not have as many links as they would if they were desktop sites. But links do count in mobile SEO, and if you get links you can outrank your competitors. So more links should be created, and that will help discovery.

- Instead, visibility mobile suggests using its metaTXT file to help search engines find mobile sites, precisely because of that lack of links. OK, so let’s say I add this entry on my PC site domain.com, indicating that my mobile site is domain.com/m. How is that different from linking to it? Would search engines not find my mobile site if I only link to it? Of course they would.

And OK, even if you tell search engines where your mobile site is, would that be sufficient? Nope, you’ll still need links.

The following claim is then made regarding keywords:

Indeed, nowadays the use of meta tags has decreased with companies such as Google saying that Meta Tags are often over populated with the wrong keywords to trick or misguide users. But this does not mean that Google does not use Meta Tags in its search analysis. If used correctly and not-overpopulated with illicit or incorrect words – it remains an effective way of gathering relevant site information.

In mobile where sites are usually scaled down versions of online sites due to the limitations of the small screen and for effective usability – manipulating the text with keywords can be tricky. So once again Meta Data is an ideal way in which to ensure that the main focus of the site is highlighted.

We won’t hold that debate again; just do a test: put a unique keyword inside a keyword meta tag and see if the page comes up when you search for it on Google Mobile (Mobile Web).

It’s true that space is limited on mobile pages, but you can still use title tags and page content to include your keywords. There is no need for the meta keywords tag, which, unlike visible page markup such as titles and headings, carries no weight.
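As an illustration of keyword placement without meta keywords, a minimal mobile page might look like this (the site and keyword are made up):

```html
<!-- Hypothetical mobile page: the keyword "widget repair" appears in the
     title, heading, and body text rather than in a meta keywords tag -->
<html>
<head>
  <title>Widget Repair in Dublin - Example Mobile Site</title>
</head>
<body>
  <h1>Widget Repair</h1>
  <p>We repair widgets of all sizes, with same-day service in Dublin.</p>
</body>
</html>
```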

- The whitepaper then gets even funnier. It includes a case study of a mobile SEO campaign. What strikes me is that they say they used mobile SEO techniques to optimize some test sites, and part of that was PPC advertising… I agree that there are no standards in SEO, but come on, SEO and PPC are two different things.

The campaigns are explained as follows:

Using building vs buying techniques we optimised two mobile sites with different specifications. I created these sites myself with mobile site creation tools. Then they were submitted to mobile search engines – after a 4 week wait we started this analysis. The analysis below is a synopsis of the results only. Our techniques remain proprietary to our visibility mobile brand.

The first was bkimedia.zinadoo.mobi and the second was gomonews.mobi.

Our mission was:

Bkimedia.zinadoo.mobi
Our aim was to make BKI Media a success in traditional search engines (Yahoo!, Google) and also to get into the top five of all mobi related search engines and directories.

Gomonews.mobi
Our aim was to be found at the top of multimedia search engines such as Taptu and onsearch.mobi, and on viral online searches as well.

The results: they ranked well for their brand names (“bki media” and “gomo news mobi”) on Google Web, Find.mobi , Taptu, Mobiseer, and Yahoo OneSearch.

Ahem… I don’t think you need more than an hour of experience with SEO to rank well for your own brand name. Sorry, but I’m not convinced. Plus, if you want to get traffic from mobile SEO, target the main mobile search engines (Google and Yahoo!), not barely known directories or “.mobi” search engines.

The author concludes by saying that they achieved these rankings by using “proprietary techniques”. Really? Wow. I’d love to meet the engineers behind your techniques.

It’s true that mobile SEO can seem hard to do, but the same techniques apply as with traditional SEO; there’s no secret: observe, test, measure, adjust. If you don’t have time to do it yourself, my advice would be to go with a traditional SEO company, which will have more knowledge of search engine algorithms and ranking techniques.

Posted by Nadir
on Tuesday, September 16th, 2008 at 10:01 am.

7 Responses to “My Take on MetaTXT and visibility mobile’s Mobile SEO Whitepaper”

Ha Ha Nadir – it was going to be a white paper, then it changed to a paper, and it was written not for SEOs but for brands. Sorry you didn’t like it – every PR, female, brand agency and non-techie has thanked me for being so simple. I would expect that you would find Niall’s white paper a lot better than this “paper”.

I blogged a couple of times that I was going to dumb it down. But I do think that it’s easy for you to say – I just checked the position of Gameloft on Taptu, and something that you say is easy hasn’t been achieved?

Thanks for the attention, and as a previous subscriber of BKI – I know you know my writing isn’t that bad.
bena

Regarding Taptu, I’ve never tried to optimize for this search engine; plus, it’s a search engine that has “human input”, so clearly it’s not a factor that an SEO can directly have an impact on.
Taptu is best suited for content search, not web pages.

Let’s run through the various components of the mobile search ecosystem to determine why this is a false conclusion:

1- Web crawlers. It’s true that Google and Yahoo do in fact crawl the entire Web, and thus it’s possible for them to parse this tag during that crawl. I’ve seen no evidence of that happening yet, as the majority of mobile-optimized sites continue to show up in Google’s index under the PC-optimized URL, and similarly on Yahoo, many mobile-optimized sites cannot be found in its “Mobile Web” results. But more importantly, the vast majority of mobile search engines do not crawl the entire Web; they stick to the mobile-optimized sites. Thus these tags are never seen, and thus not part of any solution.

2- Transcoders. It does seem that transcoders notice the tag. However, every transcoder I’ve seen will render the PC-optimized page, providing only a far-from-prominent link within the footer indicating that a mobile-optimized page exists. To me, this is the opposite of the ideal behavior. However, it makes sense: to see the tag, the transcoder must have already loaded the PC-optimized page, and by the time it has parsed that tag, it has likely loaded the style sheet and most of the images.

3- Browsers. Similar to transcoders, a browser has to load the PC-optimized page just to discover a mobile-optimized version exists. A key reason mobile-optimized pages exist is that PC-optimized pages are too large and heavy for mobile browsers, even full HTML browsers. Meta.txt allows a browser to quickly determine if a mobile-optimized site exists, thus avoiding downloading tens or hundreds of kilobytes. As most subscribers pay per kilobyte, this can make the difference between adoption of Mobile Internet and not.

4- RSS readers and other tools. Meta.txt is designed for more than search and more than browsers. It allows for discovery of a site’s RSS feeds, podcasts, and other non-page content, again without having to load the PC-optimized web page.

Taking a step back, using the PC-optimized home page as the repository for a site’s meta-data is simply not a good design. This is the same reasoning which spawned CSS. Markup is for describing page content, CSS for style, and now meta.txt for site meta-data.

Take one more look at the whitepaper, this time thinking about the above design principle. Then try answering these questions without loading the PC-optimized home page:

Q1- What is the name of the site? There is no answer to this today. Google, Yahoo, etc. use the name of the web page. Meta.txt allows a web master to name the site.

Q2- What is a short description of the site? This can be specified today as a description meta tag within the home page. However, Google, Yahoo, etc. all seem to ignore this value, and instead generate a synopsis of the page based on keywords. Meta.txt allows a web master to describe the site.

Q3- Does this web site have an RSS feed? Meta.txt allows for discovery of RSS feeds, podcasts, and other content which are not web pages. This can be expanded in the future as novel forms of interactions are invented and adopted, e.g. IPTV, OpenSearch, OpenID.

Q4- What is the mobile-optimized entry point for the site? Unfortunately, few sites use a single URL for both PC and mobile access. Mobile entry points today are only discoverable via the link tag within the PC-optimized entry point, or by web masters specifically registering sites with search engines and portals. Meta.txt allows these entry points to be discovered without mixing markup and meta-data, and without loading the scripts and other data found in the headers of PC-optimized web pages. The Medio meta.txt file is all of 298 bytes. That’s only a few bytes more than a handful of header tags in the http://medio.com home page, and 12K smaller than a single tag on that same page. Wireless networks are far too precious to load and then immediately throw away 22K simply to redirect to http://m.medio.com, which itself is less than 3K in size.
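As a concrete example of the page-load problem in Q3 and Q4: today, both feed discovery and mobile-site discovery rely on link tags inside the PC-optimized page head, which a client can only read after downloading that page (the URLs below are placeholders):

```html
<!-- Both discovery hints live in the desktop page's <head>, so a client
     must fetch the full desktop page before it can find either URL -->
<link rel="alternate" type="application/rss+xml"
      title="Site Feed" href="http://example.com/feed.xml" />
<link rel="alternate" media="handheld" href="http://m.example.com/" />
```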

All this said, there is no reason to abandon the use of the link tag to aid in the discovery of mobile sites. My point is that the solution is far from ideal and far from complete. A better, simpler solution is meta.txt.

I talked to many publishers and a few browser vendors at CTIA last week, and found universal support. As co-founder of Medio, I’m well aware of the issues around mobile web crawlers and mobile search. Medio views meta.txt as a very useful solution to aid in the discovery of mobile web sites.

I think on your first pass you likely missed some of the use cases I’ve outlined above, and overlooked the potential network wastage from the link tag. Hopefully my long response clears all that up.

Hi Luni, thanks for your lengthy reply, it does shed some light on the subject.

Regarding the questions you raised:

1. Why not. But I don’t see in what context it would be used, nor how search engines would treat it. People search for keywords/phrases; some keywords are brand names, some are products, services, questions, etc. How would the name of a site help, and what URLs should a search engine return when I type a keyword contained in that name? Because it’s not associated with a specific page, but rather with a whole website.

3. RSS: you can specify that with a link tag in the page head, but I’m not familiar enough with the technology behind RSS crawlers or readers to know whether that poses a problem.

4. I agree that for site owners who do not use user agent detection, that can be handy. But regarding search engines, it’s just that the technique of using Meta Data to help them find a site goes against the techniques used to find sites.

Even if all PC site owners added meta.txt, and let’s say they have a mobile site but don’t link to it, that would mean a crawler would need to crawl all PC websites to find mobile websites. Would Medio or other mobile-only search engines do that?
I guess not, so registering sites or using links to help mobile search engines index them will always be the most reliable approach.

Plus, some sites are mobile-only; they don’t have a PC version. If these mobile sites are indexed nowhere, meta.txt won’t help: you’ll still need links to help bots discover them.

Note that I’m not criticizing meta.txt and the ambitions behind it; it’s just that it has been introduced as a “Mobile SEO tool” that can immediately increase the visibility of a site and solve everything, which IMO is exaggerated.

The biggest problem in mobile search is the discovery of mobile web sites. Meta.txt helps solve that problem.

If meta.txt were adopted widely, then the mobile search engines could simply grab the list of all .com, .net, .org, .co.uk, .de, etc. domains, and one by one, fetch the meta.txt file and determine whether a mobile site exists. Plus, in the process, gather the name and description of the site, along with any RSS feeds, podcasts, and other content consumable on mobile phones.

This then doesn’t eliminate the need to crawl the whole site and index the contents, but it at least provides a means to determine the collection of sites which are mobile-optimized, in order to create a far more complete index of mobile sites than any search engine has today.

Discovery of a site is step #1 in any SEO process, and thus meta.txt is potentially the most powerful SEO tool in mobile. Sitemaps, keyword optimization, inbound links, and every other SEO “trick” are useless until a site is known to search engines.

Registration of sites directly with search engines is the only true alternative. Registration works for Google, Yahoo, and even Medio, but it hasn’t worked for the smaller search engines, and fails completely to help new search engines in new markets. Meta.txt is instead an open standard which solves this issue in a simple, open manner.
