Hottest topics at SearchMarketingDay 2012 – conference coverage

03/07/2012

The second edition of the international SEO conference Search Marketing Day came to an end a month ago. For two days Stary Browar (the Old Brewery shopping and culture centre), the conference venue, was taken over by 18 experts from Poland and around the world, who came to share their knowledge with over 130 conference participants. It’s time for some conclusions: what are the important developments in SEM and affiliate marketing, which new insights did the participants benefit from most, and what went well and what didn’t?
The conference started at 10.00 on Tuesday in the screening rooms of the Multikino cinema in Stary Browar. The organisers planned twice as many presentations as the year before, so the talks were divided into two parallel thematic sessions (each with 8 presentations plus a panel discussion). One session was devoted to SEO and website positioning, while the other focused on sponsored links and affiliate marketing. Though often underrated as a venue, the cinema once again proved a great choice: comfortable armchairs, clear sightlines from every seat, impressive sound quality, and a large, high-resolution screen.

At the registration desk

Dixon Jones and Kaspar Szymański

A full room

David Harling presenting a case study of a campaign for Audi

Magnus Brath talking about scalability in SEO

Dixon Jones running with the mike during the Q&A part as part of his slimming plan :-)

Ralph Tegtmeier showing automatically generated texts

Dominik Wojcik talking about Fast Flux Networks

Ralph Tegtmeier’s T-shirt caused a sensation

The presentations were a good barometer of the most important problems in search engine optimisation that have lately been troubling SEOs. Some topics kept cropping up in most of the presentations, in some even as the main theme:

Penguin/ Panda,

good-quality content,

SEO’s work organisation.

Penguin

Even though Google gives the names of cuddly animals to successive algorithm updates, the SEO world receives them with apprehension, if not anger. It should thus come as no surprise that Penguin was mentioned in almost all presentations in the SEO session, with at least two talks entirely devoted to it. Google does not explain what the changes consist of, so speculation runs high. Successive speakers argued with each other: “Penguin weeds out over-optimised sitewide anchors from poor-quality websites”, Dixon Jones argued. Andre Alpar in turn believes that “when Google flags a website as spam, it takes its users’ reactions into consideration.”

one factor that triggers Penguin is the presence of sitewide links (links coming from all pages of a source website, for example from the footer) with poorly varied anchor texts, coming from poor-quality websites (Dixon Jones);

a 301 redirect from a website flagged as spam could be harmful for the target website (Dixon Jones);

you can use available tools (from Majestic SEO) to check whether Penguin could be a threat to you; for example, you can see the sitewide links pointing to your website (Dixon Jones);

simplifying things a bit, Panda and Penguin are about reducing the importance of ranking factors that could be collectively labelled as “relevancy” and increasing the importance of “authority” type factors; fabricating authority is much more difficult for spammers than fiddling relevancy (Andre Alpar);

Google is suspicious not only of over-optimised anchors and sitewide links but also of the users’ behaviour. When filtering out poor-quality websites Google considers their bounce rate and time on site. The search engine makes use of many tools that collect this kind of data (Chrome, Android, Analytics, Google Toolbar) (Andre Alpar);

suggestions: shifting weight from relevancy to authority might mean that exact-match domains do you more harm than good, that a few strong links give you more than many poor ones, and that some resources are worth shifting from offsite to onsite activities (Andre Alpar);

affiliate programmes have come under scrutiny, so why not give dropshipping a try instead (Andre Alpar);

the procedure after Penguin and any future algorithm update should be the same: methodically analyse the most important metrics of your own website and of the websites that rank higher in the search results, and draw conclusions. Excel will be indispensable, and preferably some programming skills as well (Joe Sinkwitz).
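Dixon Jones’s sitewide-link warning lends itself to a quick sanity check. The sketch below is a minimal illustration, assuming backlinks have already been exported from a tool such as Majestic SEO into `(referring_domain, source_url, anchor_text)` tuples; the input format and the thresholds are hypothetical, not any tool’s actual API:

```python
from collections import defaultdict

def flag_sitewide_anchors(backlinks, page_threshold=50, max_anchor_variants=2):
    """Flag referring domains that link from many pages with barely varied
    anchor text -- the sitewide-link pattern described as a Penguin risk.

    `backlinks` is an iterable of (referring_domain, source_url, anchor_text)
    tuples, e.g. parsed from a backlink-tool CSV export (hypothetical format).
    """
    pages = defaultdict(set)    # domain -> set of linking pages
    anchors = defaultdict(set)  # domain -> set of distinct anchor texts
    for domain, url, anchor in backlinks:
        pages[domain].add(url)
        anchors[domain].add(anchor.strip().lower())
    return [
        domain
        for domain in pages
        if len(pages[domain]) >= page_threshold
        and len(anchors[domain]) <= max_anchor_variants
    ]
```

A flagged domain is only a candidate for review, not proof of a penalty; the thresholds would need tuning to your own link profile.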

Good-quality content

“Produce interactive infographics, they attract thousands of links and likes!”, Russell Smith of the BBC argued. “Develop creative systems of global content distribution”, David Harling added. They were countered by Magnus Brath (“but SEO has to be scalable!”) and Ralph Tegtmeier (“good enough content can be generated by a machine”).

This opposition between Smith and Harling on the one hand and Brath and Tegtmeier on the other follows from the ongoing, more or less open discussion on the Internet about the future of Google’s anti-spam algorithms. Opinions fall into two major camps. The idealist variant assumes that Google will gradually approach a human-like understanding of websites and links, which would drastically curb spamming practices. The cynical variant assumes that Google will keep groping in the dark as it tries to weed out spam, introducing new imperfect factors to the algorithm that SEOs will quickly learn to fiddle. Whichever variant wins, content will keep gaining importance at the expense of links. Under the first variant, SEO would eventually equal PR; under the second, it would drift towards techniques that generate seemingly good-quality content at scale and at low cost. The content trend was strongly present at SMD: at least two entire presentations were devoted to it (Russell Smith at the PPC & Affiliate session and Ralph Tegtmeier at the SEO session), and it kept cropping up in other talks as well.

“Database journalism” is journalism based not only on the text but also on interactive apps which make personalised content more attractive and more accessible (for example: The world at seven billion) (Russell Smith),

“responsive technology” is a system of providing content to users based on the “write once, publish often” rule (write a text once, publish it simultaneously in dedicated versions for various devices: PC/ mobile/ tablet/ smartTV); use “responsive design” (content that adjusts itself to the device and connection speed) and remember to enrich your content with metadata (Russell Smith);

there exist technologies that allow automated generation of texts whose quality is acceptable to both the algorithm and Google’s human reviewers; such mechanisms ensure varied form and length, and sometimes even make pseudorandom slips (typos, spelling errors). They can generate texts on one topic that are more diversified than those produced by human copywriters! Unfortunately, those technologies are so far available only for English and German (Ralph Tegtmeier);

creating content also requires a system for distributing it outside the website being positioned, e.g. in social media and on third-party websites. It’s worth doing for the sake of brand recognisability and, obviously, links (David Harling).
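Tegtmeier’s generated texts rely on machinery far beyond anything shown here, but the core idea of template-based variation can be illustrated with a toy spintax expander. The `{a|b|c}` syntax and the `spin` function are my own illustration, not his tooling:

```python
import random
import re

# Matches the innermost {a|b|c} group (no nested braces inside it).
_GROUP = re.compile(r"\{([^{}]*)\}")

def spin(template, rng=random):
    """Expand a spintax template such as "{Cheap|Affordable} flights to {Rome|Paris}"
    by picking one alternative per group. Real systems go much further:
    they vary sentence structure and length, and even plant pseudorandom typos.
    """
    while True:
        m = _GROUP.search(template)
        if not m:
            return template
        choice = rng.choice(m.group(1).split("|"))
        template = template[:m.start()] + choice + template[m.end():]
```

Nested groups such as `{a {b|c}|d}` resolve innermost-first, so each call yields one of several human-looking variants; pass a seeded `random.Random` as `rng` for reproducible output.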

SEO’s work organisation

“The work of an SEO should be based on procedures, and those procedures should be scalable. Otherwise you stand no chance of succeeding in the most difficult industries, such as casino/ poker, and soon maybe others”, Magnus Brath said. “SEO is constantly changing, so learn how to react to changes in the algorithm, whatever they may be”, Joe Sinkwitz added.

Good SEOs should know how to break their work down into procedures that can be handed to employees as tasks they can carry out successfully, largely regardless of individual talent. Good SEOs should know how to build a machine that functions with or without them at the wheel: “Invest in a company an idiot can run, because one day an idiot will” (a thought Magnus Brath based on a quote from Warren Buffett);

There is no scalability without good procedures. Without scalability you stand no chance of succeeding in the more competitive industries such as casino/ gambling – perhaps soon in other industries as well (Magnus Brath);

The most difficult task is to strike a balance between performing the tasks that make up a procedure automatically and keeping them creative and adjusted to the current situation. Using software? Yes, but only to a limited extent and under strict human supervision (Magnus Brath);

After Penguin, as after any update that follows, the procedure should be identical: methodically analyse the most important metrics of your own website and of the websites that rank higher in the search results, and draw conclusions. Excel will be indispensable, and preferably some programming skills as well (Joe Sinkwitz);

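Sinkwitz’s “Excel plus some programming” procedure can be sketched as a small comparison script. Everything below is hypothetical scaffolding: the row format and metric names stand in for whatever your rank tracker and link tools actually export:

```python
from statistics import mean

def compare_with_rankers_above(rows, my_domain,
                               metrics=("referring_domains", "word_count")):
    """For each metric, compare your value with the average of the sites
    ranking above you, in the methodical post-update spirit Sinkwitz describes.

    `rows` is a list of dicts, one per ranking URL, e.g.
    {"domain": ..., "position": ..., "referring_domains": ..., "word_count": ...}
    (a hypothetical export format).
    """
    mine = next(r for r in rows if r["domain"] == my_domain)
    above = [r for r in rows if int(r["position"]) < int(mine["position"])]
    return {
        m: {
            "mine": float(mine[m]),
            "above_avg": mean(float(r[m]) for r in above),
        }
        for m in metrics
    }
```

A gap between `above_avg` and `mine` on some metric is a candidate conclusion to investigate, not proof of a ranking factor.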