Data-based marketing is the wave that marketing and communication professionals, technology providers and agencies are all riding, and the topic they are all talking about. Because people’s media consumption is constantly changing and digitising, target groups have to be profiled more precisely.

They have to be addressed in the right time window and environment in a needs-oriented manner and converted to customers as efficiently as possible.
The plan is good and right. And at best, it ends the unnecessary media-channel debates between “classic” and “digital” supporters.

However, companies and their communication service providers must first of all face the challenge of being able to adequately meet these channel-neutral requirements.

My trend for 2018 is therefore to look at data-based marketing from a new perspective: many marketing decision-makers should aim for, and initiate, a strategic change of direction away from the tried and tested towards new data-based marketing.

In this way, we can generate data for you, process it in a meaningful way, and then deliver needs-oriented content at the right time and in the right place.

Diana Degraa, 20 December 2017: “Data is the oil of the 21st century”

When searching for the trends for 2018, it is tempting to look reflexively at one’s own desk and describe the topics that can be found there.

I’m afraid that won’t work much longer. The long-preached collaboration of all disciplines and perspectives will become a trend in 2018, because practice will demand it. Open thinkers will lead, combining offline, online, artificial intelligence and common sense into a strong commitment to brands and customers.

Collaboration in our understanding transcends all boundaries. Between departments, companies, customers, partners. It is a matter of close cooperation between people with different perspectives and skills, but with a common goal: to develop new, future-oriented solutions.

Undoubtedly, this will be a wonderful, inevitable feat of strength! And we will all have to stretch beyond the horizon of our own desks.

In 2018 we will creatively prepare ourselves to combine technology and storytelling in new ways. With the rapid pace of technical development, it will not only be possible to create different content for different people, but also to make the content reactive. A film that notices that I’m in a bad mood and tries to cheer me up, but surprises another user with another ending; a visual that immerses my mood in colours; music that adapts to my situation. In terms of narrative, this results in infinite, progressive possibilities.

New wine into old wineskins and an endless relationship drama are what kept us busy last month. Meanwhile, in the Far East, a new player is gaining ground on the search engine stage. Find out who it is – and much more – in the SEO News for December.

1) YouTube data now available in Google Trends

Often belittled by SEO experts as a child’s plaything, Google Trends has nevertheless been an extremely popular tool for many years, used to analyse the search market quickly and easily. In addition, the company from Mountain View likes to use its trend feature as a PR vehicle for clickable headlines (“The most important search terms of 2017”). Since the end of November, however, even experienced SEO experts have had a reason to look at the web tool, which quickly and simply provides a comparative overview of search demand, and its development over time, for up to five keywords simultaneously. While the data used to be based solely on Google’s web search, the results can now be filtered by Google’s news, shopping and image categories. What’s more, the search volumes from Google’s YouTube video portal can also be displayed separately. Particularly at a time when moving-image content is becoming increasingly important, Google thus provides a reliable source for preparatory market analysis and monitoring.

2) The pivotal question: Is social media important for SEO?

You might think this question is as old as humanity itself. That cannot be true, of course, as humanity is much older than Facebook, StudiVZ and Myspace together. Nevertheless, since the rise (and fall) of social media portals and apps, the search scene has been wondering: do I really need this to do my SEO job right? To put it bluntly: social media content is not a direct ranking factor in the way that, for example, backlinks are. The limited visibility of many posts and likes, hidden from search engines behind the login barriers of social media applications, already rules that out. Viewed from a distance, however, it becomes clear that Social and Search work towards the same goals: both want to attract the attention of users, satisfy their need for information or entertainment, and anchor a product or service as a brand in the collective consciousness of Internet users along the intricate paths of their user journeys. These paths can cross at various points, for example when social media content appears among search engine hits. Even though a measurable connection is hard to verify, the conclusion is obvious: Social and Search are brothers in spirit who can strengthen each other.

3) Top ranking factors of 2018 according to SEMrush

Now is the time for SEO experts to reflect on the achievements of the fading year and to ask themselves what they might be up against in 2018. We already began looking ahead to the coming year in the last SEO News. A new study by the popular analysis tool SEMrush has now examined more than 600,000 keywords with the help of a self-learning algorithm and compiled the 17 most important ranking factors. Not surprisingly, direct user signals are at the top of the ranking, such as the amount of direct traffic to a page, the time spent on the page and the bounce rate. Interestingly, the often disregarded off-page factors were rated as relatively important by SEMrush: classic factors such as referring domains, backlinks or referring IPs still rank ahead of content factors such as text length, metadata or rich-media integration. This means that the findings of the study at least partly contradict the publicly stated position of the major search engines such as Google and Bing. Every search engine practitioner should definitely take a look at the study – if the holiday season permits.

Whether the future will be built in China can only be answered with certainty in a few years’ time. The fact is, however, that China is rapidly on its way to becoming a new centre of technological development. The Chinese technology giant Tencent is the company behind the successful chat apps WeChat and QQ. Its search engine Sogou (literally: “search dog”) has been around since 2004, but never managed to break out of the pack of also-rans behind industry giant Baidu. This is now set to change with the help of fresh money from an IPO and massive investments in artificial intelligence. If the parent company has its way, Sogou users will also be able to search English-language websites within China’s legal boundaries. Tencent also wants to use its immense data pool from WeChat to raise the recognition of natural language and user intentions to a new level. Whether a new “Google of Asia” will emerge here remains to be seen.

It’s all about speed when it comes to online marketing. So in November we are already looking ahead to the new year and everything that will change in 2018. Will SEO be dead and gone, and will robots take over the world? It won’t be that bad, but there is a grain of truth in it. You can find out more in the current SEO News.

1) Google launches its Mobile-First-Index (a little)

The launch of the Mobile-First Index will be the dominant topic for SEOs in 2018. A year ago, the Mountain View-based search engine announced that in future it will use the mobile versions of websites, instead of the desktop versions, as the reference for content and rankings. However, it is not all going to happen on one specific day; the change will be gradual and accompanied by extensive tests, according to Google. Google spokesperson John Mueller has now announced that work has begun on converting the first websites to the mobile index in trial operation. It is still too early to talk about an official launch of regular operation; this is more of an initial testing phase. However, the ranking changes that webmasters observed in mid-October are not related to these tests, according to Mueller.

2) 2018 SEO expert oracle

A glimpse into the SEO crystal ball fascinates the search industry every year. Renowned experts have made their predictions for 2018 on what the dominant trends of the coming 12 months will be. They all agree that Google’s transition to the Mobile-First Index, the rapidly increasing use of voice assistants and the triumph of artificial intelligence will bring serious changes to the technological side of search marketing. Companies and webmasters should watch these changes closely, because the fight for organic traffic will quickly intensify. Since Google increasingly acts as a publisher and already answers many queries directly on its search results pages using so-called featured snippets, the use of structured data, in-depth analysis of content and user behaviour, and a focus on good user experience remain the most important areas of activity. Aaron Wall of SEO Book even speculates that Google’s dominance in the search sector will decline and that users will increasingly turn to specialised search systems. In summary, SEO expert John Lincoln aptly adapts an old classic: “The old SEO is dead and gone – welcome to a new era. It’s 100 times better and much more exciting.”

3) Microsoft and Google rely on human support

Barely a day goes by without something being written about the unstoppable spread of artificial intelligence and its effects on online marketing. The search giants Google and Microsoft both rely on learning machines. Look closely, however, and there is also a counter-trend: Microsoft’s search engine Bing announced back in August that it wants to rely more on collaboration with users in its “Bing Distill” community in order to improve the quality of its direct answers (we reported). At the start of October, Google invited its “Local Guides” community to its second conference in San Francisco. According to the company, this organised user community already has around fifty million participants worldwide, who primarily check and correct entries in Google Maps. In addition, local guides compose almost 700,000 new entries every day. Google says this is a great help, especially in developing countries, where information on local businesses and services is difficult to record and check automatically. It remains to be seen whether this trend will take hold, or whether humans are just a bridge technology until artificial intelligence has acquired the same skill set.

4) How artificial intelligence will change search engine optimisation

Search marketing faces great changes and, at their core, it is all about the effects of integrating artificial intelligence and machine learning into the technology of the major platforms. For organic search, according to SEO veteran Kristopher Jones, this means that keyword rankings will no longer be subject to dramatic swings and that there will be no single, universal algorithm. Instead, specialised and dynamic algorithms in many variants will be used for different search requests. Ultimately, the search providers’ aim is to grasp the exact intention of the user with technological aids and so deliver better results, according to Jones. The search expert believes that classic keyword analysis and technical SEO will therefore become obsolete. In response to the challenges of artificial intelligence, Jones suggests a combination of user-experience optimisation, strictly tailoring content to user intentions, and using more natural speech patterns for voice search. Search engine optimisers, he adds, will not be able to develop their own AI-based analysis tools, so agencies and advertisers will have to find strong answers to the technological challenges in order not to be overwhelmed by the pace of progress.

When Google announced the dawn of the Age of Assistance [1] last spring, it certainly didn’t underestimate all the changes it would trigger.

Interaction with various technologies and interfaces means that the way we search for information on the Internet is being turned completely on its head.
And along with it, the way we need to think about how we use SEO.

First, you no longer have (full) ownership of your content…

It has been rumoured for a few years, and we are now going to have to come to terms with it: the web has become a series of platforms. This global phenomenon means that access to the digital audience is centralized, managed by a handful of players in Silicon Valley.

Google AMP, with examples of how the Washington Post and the New York Times are displayed

If we only consider access to information, it is primarily Google and Facebook that lay down their own law. For commercial reasons – mainly audience retention and control of advertising space – each of them has deployed its own platform for hosting content: Instant Articles by Facebook [2], AMP (Accelerated Mobile Pages) by Google [3]. If you want to reach a large audience, especially in the media sector, it has quickly become essential to consider sharing information on these platforms.

The benefit for the reader is obvious. With these technologies, users can access and consume information more quickly. With Facebook for example, you don’t need to wait for a web browser to open up, the articles are readily available in the Facebook app on your smartphone. It’s the same with Google, AMP is an “accelerated” platform, a no-frills setup that delivers an optimal mobile experience.

However, when you post an article on AMP or Instant Articles, you abandon your website and depend solely on the environment that the GAFAs agree to provide for you. While it is still possible for your brand to emerge, a great many web user habits have begun to disappear: auxiliary browsing, page separation, and of course advertising design. Content is brought right down to its bare bones.

This can be a good thing, in that the news, content or function displayed is exactly what the user was looking for. But it has a huge impact on how the information is presented, and especially how the internet has learned to capitalize on its readership over the last 20 years.

With AMP, it’s no longer the page, but its content… that generates satisfaction, viewing habits, even repeat behavior.

And if you’re not running a media site?

Rest assured, Google hasn’t forgotten you either. The search engine is developing and distributing a Progressive Web App format [4] that will eventually become the AMP for eCommerce and transactional platforms. In the same way that press articles are fast-tracked by smartphones, transaction forms – flight check-in, quote requests, etc. – will be fast-tracked for an improved mobile experience. The outlook looks promising in terms of the user experience [5], but the impacts for the industries that will use this format remain to be seen.

…by the way, you no longer (really) need to try to rank web pages…

AMP and Instant Articles have already changed how content is perceived on the Internet, and have begun to separate it from its traditional base: the web page. Featured snippets (SEO experts refer to these results as being in position zero) in Google results [6] are also having a massive impact on the way SEO is approached.

Basically, a featured snippet is a ready-made answer, generated by Google, to a question from the user.

It takes the form of a paragraph of boxed text, sometimes even with illustrations, that is presented above the usual organic search results. And that’s why it’s called position zero.

A Google featured snippet, generated for a search into… featured snippets…

The format appears mainly when the user asks questions; full sentences in interrogative form… but also when requests are understood to be searches for information on processes or concepts. Anything that requires more explanation than transactional search results.

How does Google identify the information that will appear in a featured snippet? It first evaluates the relevance of a page on the subject requested and then extracts the paragraph or paragraphs it considers to be the most explicit.

What matters then to the engines is no longer just the relevance of a page on a specific keyword, but how part of this page can answer a concrete question.
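To make that selection step concrete, it can be sketched as a toy scoring function: score each paragraph of a page by how many terms it shares with the question and extract the winner. This is a deliberately naive illustration of the idea, not Google’s actual algorithm:

```python
def best_snippet(question: str, paragraphs: list[str]) -> str:
    """Return the paragraph sharing the most terms with the question.

    A toy stand-in for featured-snippet extraction; real engines use
    far richer relevance models than simple word overlap.
    """
    terms = set(question.lower().split())

    def overlap(paragraph: str) -> int:
        return len(terms & set(paragraph.lower().split()))

    return max(paragraphs, key=overlap)

page = [
    "Our company was founded in 1995 and is based in Munich.",
    "A featured snippet is a ready-made answer that Google presents "
    "in a box above the usual organic search results.",
]
print(best_snippet("what is a featured snippet", page))
```

Even in this crude form, the paragraph about featured snippets wins for the question about featured snippets, while a question about the founding year would surface the other paragraph instead.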

The evolution of searches towards featured snippets means that content creators have to stop thinking only in terms of web pages, and start thinking in text units – paragraphs, lists, processes – and how they can work on their content so that it is presented as a response rather than raw information. This is probably going to change a lot of editorial style guides.

The impact of featured snippets goes hand-in-hand with the deployment of AMP technologies. If Google is able to find, and therefore provide the user with, a suitable response in one paragraph, there is no longer any benefit in it driving traffic to a website. The featured snippet can potentially be enough for the user. Finding a web page isn’t the user’s priority any more!

…so you no longer (really) need to target keywords…

Where is all this going? The two revolutions we’ve covered so far only involve the display and processing of web data. They do not really affect how the user interacts with search engines. And yet, the biggest revolution in progress is coming straight from the users themselves.

By relying more and more on their smartphones (people now pull out their devices more than 150 times a day [7]), users are favouring their own micro-questions and moving away from the keyboard. The emergence of featured snippets in Google is a direct consequence of this change in behavior [8].

Last spring, nearly 20% of queries made using the Google app in the US were voice queries [9].

And these figures are set to rise. Many queries are now only related to smartphones: to look for directions, call a contact or play a track.

But other uses are emerging, such as requests for homework assistance from teenagers (31%) or queries about movie showtimes from adults (9%) [10]. These are searches for basic information.

How are Internet users’ voice queries formulated? It’s quite simple, they are spoken. When we make a voice query, we no longer depend on a keyword, we ask a real question. Advertisements for voice assistants – like Apple’s Siri – have inspired this behavior.

When questions are fully formulated, they have the advantage of being able to concentrate on a point of detail about a person (age, place of birth, role for an actor, etc.) or a retail outlet (location, opening hours, etc.). And that’s exactly what featured snippets are designed to do: they provide a precise answer to a specific question [11].

So what does that change in how websites are designed? It changes what kind of information is displayed: we’re no longer trying to position results against a request, but to answer a question.

It’s no longer about trying to display as much information about Angelina Jolie as possible, but about convincing Google that you are the best source for giving her age. This involves a little technical skill – microformatting, information management – but above all it means thinking about content in a different way: separating it into basic information blocks, based on the user’s questions, rather than writing long encyclopaedic articles.
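One common way to implement such an information block is structured data in schema.org’s JSON-LD format. Here is a minimal sketch, built with Python’s standard json module purely for illustration; in practice the markup would sit directly in the page’s HTML:

```python
import json

# A minimal schema.org "Person" block: an isolated, machine-readable
# fact (the birth date) rather than a long encyclopaedic article.
person = {
    "@context": "https://schema.org",
    "@type": "Person",
    "name": "Angelina Jolie",
    "birthDate": "1975-06-04",
}
markup = json.dumps(person, indent=2)
print(markup)
```

A block like this hands the engine exactly one answerable fact, which is precisely the shape a voice assistant or featured snippet needs.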

Questions, unlike queries, require quick and simple answers. And so content needs to be quick and simple too.

…anyway, soon you’ll no longer be displaying any text at all…

The next revolution will be simplicity. Can you see where we’re going with this? The natural partner for voice queries is of course voice responses. Voice assistants – Amazon Echo, Google Home and Apple HomePod – mark the latest development in how we search for information on the web.

Google Home, Google’s voice assistant, is of course the future of Search

Amazon Echo, the most popular voice assistant, was installed in nearly 9 million American homes last spring [12].

These terminals have no screens – even though Amazon has been testing new versions of its terminal [13] – so answers to users’ questions are purely vocal and should leave no doubt or room for interpretation.

Because the main problem with searches, and especially answers, is in how they are interpreted, and the doubt that can be generated by the absence of visual support or backup solutions. On a computer screen, if the first result of a query isn’t what you want, you can always click on the next one. And if you do not fully understand a featured snippet, it often comes with a visual to illustrate the subject, or a link to go into more details.

This reassurance, or redirection, is crucial in the user experience; it encourages them to go deeper into a search, to check back, to explore further.

In voice queries, answers don’t offer any further options. You can’t ask for another result or confirm the first outcome with an illustration.

In voice queries, “I did not understand” doesn’t exist. In fact, anything that could be confusing about the answer is eliminated.

First, the source of the generated information [14] can have a tremendous impact on its meaning. Users must make do with what they have, and assume that if they have chosen an assistant produced by the Amazon brand, they agree to see the world, or at least part of it, through the eyes and sources that are available from Amazon.

But there is also the way content providers formulate their answers. The spoken media is not the written media – print and radio reporters are well aware of that – and being convincing on a Google Home device is not the same as being reassuring from the search engine’s homepage.

The answers provided by brands will have to evolve towards facts rather than elements of communication or “projection”…

Are you still doing SEO?
Yes, but not really in the same way as before…

We have cast SEO aside so many times that it will probably still survive many a revolution in the future. Appearing on a search result page or being quoted by a voice assistant will always require a minimum amount of information structuring and technical expertise, a minimum amount of thinking and the ability to use algorithms.

Having said that, voice searches, and especially the emergence of new consultation tools on the web, are sure to drastically change the way we think about how we optimize access to information. First, because the page as a unit of measurement for the web will soon disappear. Social networks have already eliminated the need for a website[15].

The web page is still the basic unit of SEO optimization. We use web pages to consider how we segment content, tree structures, semantic silos, etc. Separating content providers from web pages will require most information specialists to take a look back at their content: texts and semantic notions. And to reflect on this moving matter without necessarily sticking to permanent and structured support tools.

The death of the web page will force site managers to think in terms of information flow, and no longer only in terms of support tools.

A great many content managers and community managers have been getting into that habit over the last few years. They flick between broadcast media – Facebook, Twitter, YouTube, etc. – depending on the targets and objectives of their content, and have learned to handle information flows and divert physical media. They have also learned how to adjust content and change its structure, depending on its objectives and how it is published.

In some ways, community managers have also learned to speak to algorithms before addressing human beings.

And that goes to show that businesses and expertise are getting closer, coming together, and merging.

After all, if conversational interfaces are the search engines of tomorrow, might community managers become the SEOs of tomorrow?

Inspirations

– One album: Us (Peter Gabriel – 1992), for the introduction track Come Talk to Me 🙂
– One book: The Library of Babel (Jorge Luis Borges – 1941)

Why the approach for search-engine advertising in future will focus more on context

The days when search-engine advertising (SEA) consisted chiefly of category accounts and generic campaigns with thousands of ad groups and keywords are coming to an end. Nowadays, excessive keywording is actually proving to be somewhat counter-productive: you could be extolling the virtues of a small Kölsch beer to a fan of Munich’s famous stein – a complete waste of time. The main reason for turning our backs on absolute keyword dominance is that Google has continually expanded its keyword matching options, so the context in which a keyword appears is becoming ever more relevant.

Originally, when planning SEA campaigns, care was taken to choose keywords that were as precise as possible, not least because of Google’s strict policy. Over time, however, Google has become increasingly flexible in its interpretation of the “exact match” and “phrase match” keyword options. For some time now, to ensure that potential customers do not fall through the cracks, misspelt keywords, regional language variations (e.g. tram instead of trolley) and abbreviations have also triggered ads. The aim was not only to generate more clicks, and thus more money for Google; it was also intended to make it easier for advertisers to focus on the intentions of the people searching and so deliver more relevant results.
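The loosening of match types described above can be illustrated with a deliberately simplified matcher. This is a sketch of the basic idea only; Google’s real “close variant” logic additionally handles misspellings, word order, abbreviations and synonyms:

```python
def matches(query: str, keyword: str, match_type: str) -> bool:
    """Simplified keyword matching for illustration only."""
    q = query.lower().split()
    k = keyword.lower().split()
    if match_type == "exact":
        return q == k
    if match_type == "phrase":
        # The keyword's words must appear as a contiguous run in the query.
        return any(q[i:i + len(k)] == k for i in range(len(q) - len(k) + 1))
    if match_type == "broad":
        # Every keyword word must appear somewhere in the query.
        return set(k) <= set(q)
    raise ValueError(f"unknown match type: {match_type}")

print(matches("buy hiking shoes online", "hiking shoes", "phrase"))  # True
print(matches("buy hiking shoes online", "hiking shoes", "exact"))   # False
```

Each relaxation of these rules widens the set of queries an ad can serve on, which is exactly why context now matters more than the literal keyword.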

Ultimately, all users are different and therefore have a different way of expressing themselves and a different idea of which adverts are personally relevant to them. There has also been a change in user behaviour with the increasing use of voice technology. According to Google, more than a fifth of enquiries are already made on Android smartphones using voice input. Consequently, for the SEA business, dynamic search ads (DSAs) are becoming more and more appealing.

With dynamic search ads, keywords are no longer entered – apart from the keywords which should not be used at all (the so-called negatives). Instead, the text displayed is created (semi) automatically. In connection with dynamic search ads, keywords are only used in the context of a negative exclusion test, i.e. to ensure campaign granularity. With DSAs, rather than using keywords, Google compares search queries with the contents of a website or data feed. If there is relevant information with regard to the search enquiry, Google returns an advertisement, after approval of the campaign – automatically and with no keywords entered. In so doing, both the combination of words in the ad title and the URL of the target page are generated individually based on the way the search query is worded.

An example: a user is about to travel to Switzerland and wants to buy a new pair of hiking shoes. In the Google search window, he enters “men’s hiking shoes”, whereupon, based on the contents of the web shop, an ad is generated with the title “men’s hiking shoes”. Google takes the ad text from the product data feed or from the text on the website. Because data feeds are often based on identical manufacturer information across all web shops, no distinction can be made in the headline of the text ad without content optimisation. Companies therefore need to specify which type of criterion (price, selection, delivery, promotion etc.) should be highlighted. User- and query-related optimisation of content in the web shop and data feed is the key to higher conversion rates among competing search ads.
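The matching step behind this hiking-shoes example can be sketched as follows: pick the feed entry whose title best overlaps the query and derive the headline from it. The feed entries and field names here are invented for illustration; Google’s actual DSA matching works on full page content and is far more sophisticated:

```python
# Hypothetical product feed; entries and field names are illustrative only.
feed = [
    {"title": "Men's hiking shoes", "url": "/shop/hiking-shoes-men"},
    {"title": "Women's trail running shoes", "url": "/shop/trail-women"},
]

def build_ad(query: str, feed: list[dict]) -> dict:
    """Pick the best-matching feed entry and derive ad headline and URL."""
    terms = set(query.lower().split())

    def score(item: dict) -> int:
        return len(terms & set(item["title"].lower().split()))

    best = max(feed, key=score)
    return {"headline": best["title"], "final_url": best["url"]}

print(build_ad("men's hiking shoes", feed))
```

The sketch also makes the article’s point about feed quality visible: the headline is only as distinctive as the title stored in the feed, so identical manufacturer data produces identical headlines.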

Dynamic search ads are already delivering very good results. The reason is that the increased targeting options on data feeds and the new opportunities in text design mean that ads can be placed with great precision, without losing highly relevant traffic or paying more for it. The more precise and relevant the ad, the more likely a potential purchaser is to click on it. DSAs also reach considerably more users than campaigns based purely on keywords.

However, this partial automation certainly does not give Google a free licence, or leave the search-engine giant alone to decide which hits are delivered. On the contrary: agencies need to master the complex craft of dynamic search ads. This means specifying negative exclusions which ensure that no advertisement is returned for unsuitable queries. In the earlier hiking-shoes example, this could be a combination of the terms “fall”, “mountaineering accident”, or something similar. Agencies also need to put together campaigns with clearly structured themes and optimise data feeds, the website structure and URLs.

It is also necessary to continuously monitor the quality of campaigns, constantly adjust what is returned and continuously review the rules and regulations that govern this. The tasks for agencies are therefore changing.

The e-commerce sector in particular stands to benefit from dynamic search ads. Online shops often have a large and ever-changing product range and extensive content, such as product data feeds, which can be accessed by generating dynamic ads. Despite the freedom offered by DSAs, campaigns can be closely controlled, as there is an option to advertise on the basis of the website as a whole, as well as specific categories. There is also a logistical advantage with dynamic search ads: because an online shop’s range often changes, this used to require a great deal of work with classic search-engine advertising. This work has been reduced to a minimum by the semi-automatic creation of DSAs.

Therefore, the work of search-engine advertisers will also change in future: we shall no longer be putting most of our effort into keyword sets and variant texts. Instead, for our SEA campaigns, we shall use dynamic search ads that extract their information automatically from data feeds and websites. In future, agencies will have to deal much more with pure campaign optimisation to ensure quality and thus long-term success. Furthermore, website optimisation and data-feed optimisation will increasingly become a central focus.

The issue of security doesn’t just motivate people when they’re casting their vote; it also motivates them on the internet. This is where the search engine giant Google is now exerting its market power. Also in SEO news – trends and new developments in the B2B sector and the Chinese market, the end of a long relationship, as well as a fresh look at a fundamental question: do I really need backlinks to be successful in SEO?

1) Google withdraws trust in Symantec

Google has announced that it plans to withdraw trust from Symantec’s security certificates and will remove them from its Chrome browser step by step. With a market share of around 40 per cent, Symantec is the largest issuer of security certificates, e.g. for verifying secure SSL connections. Google accuses Symantec of quality deficiencies and will gradually distrust Symantec’s products from March 2018. Users of the popular Chrome browser will then no longer be able to access websites with the affected certificates directly. Webmasters are therefore being asked to replace Symantec certificates with an alternative as quickly as possible.

2) Great SEO potential in B2B


3) 2017: What’s new at the Chinese search engine Baidu

Baidu, the Chinese search market leader with a share of around 77 per cent, copies the strategies of its American competition to a large extent. With its version of mobile-optimised websites, MIP (Mobile Instant Pages), the Chinese company is backing the same horse as Google with AMP (Accelerated Mobile Pages). The preferred presentation of secure HTTPS websites and the relatively new Progressive Web Apps (PWA) technology also underline the fact that Baidu wants to see itself as a technology leader. In addition to an algorithm update called "Hurricane", which cracks down on illegally used, protected content, Baidu has also introduced a new crawler that is better able to understand the layout and UX of the page examined. Here are some practical SEO tips for the Chinese market: a website should not be larger than 128 KB and, where possible, URLs should be shorter than 76 characters. You should avoid using Chinese characters in URLs, not least because they are percent-encoded and quickly inflate the URL length. In contrast to Google or Bing, 404 pages can be de-indexed using an XML file, and new domain endings such as .TOP or .WIN are categorised as spam by Baidu.
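The length and size limits above are easy to check automatically. The following sketch applies the thresholds exactly as stated in the article (URL under 76 characters, ASCII-only URL, HTML under 128 KB); the example URL is invented.

```python
def check_baidu_friendly(url, html_bytes):
    """Return a list of warnings for a URL and its raw HTML payload,
    based on the rule-of-thumb limits for the Chinese search market."""
    warnings = []
    if len(url) >= 76:
        warnings.append(f"URL has {len(url)} characters (keep it under 76)")
    if not url.isascii():
        warnings.append("URL contains non-ASCII (e.g. Chinese) characters")
    if len(html_bytes) > 128 * 1024:
        warnings.append(f"page is {len(html_bytes)} bytes (keep it under 128 KB)")
    return warnings

# A URL with Chinese characters triggers a warning:
print(check_baidu_friendly("https://shop.example.cn/\u4ea7\u54c1", b"<html></html>"))
```

Running such a check over a sitemap is a cheap way to spot pages that are unlikely to perform well on Baidu before investing in further optimisation.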

4) Apple’s Siri now loves Google (not Bing any more)

People who search on their iPhone using Siri will in future no longer be shown search results from Microsoft Bing when Apple's voice assistant can't provide a spoken answer. The Cupertino, California-based company explained that it changed the iPhone's search provider in order to standardise search technologies across Siri on the iPhone, Spotlight on the Mac and in-app search on iOS. Apple states that the displayed results will include website links and videos. Since the market launch of the iPhone 4s in 2011, the Siri voice assistant had used Microsoft's Bing search engine as standard.

5) Bing: Links are still an important ranking factor

In the beginning there was the link. The era of PageRank started in the late 1990s with Google. PageRank weights websites according to the number of links pointing to them and the quality of that linking structure. For many years this meant that a good backlink was the gold standard in the SEO sector. Unfortunately, the optimisation industry exploited the potential for manipulation in this technology, which is why search engines started to downgrade links as a ranking factor around 2012. They were supplemented by metrics that are less open to manipulation, such as social signals, clickstream data and engagement data. This doesn't mean that the age of backlinks is anywhere near over, though. Microsoft has now confirmed that its Bing search engine is still not at the point where backlinks can be forgone as a ranking factor. According to Microsoft, links from authoritative pages that add value remain essential.
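The core idea behind PageRank (a page's weight depends on how many pages link to it, and how much weight those pages carry themselves) can be illustrated with a toy power-iteration sketch. This is a simplified textbook version, not Google's production algorithm, and the three-page graph is invented.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank: `links` maps each page to the pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Every page keeps a small base share, plus what its backlinks pass on.
        new = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            if not outgoing:  # dangling page: spread its rank evenly
                for p in pages:
                    new[p] += damping * rank[page] / len(pages)
            else:
                for target in outgoing:
                    new[target] += damping * rank[page] / len(outgoing)
        rank = new
    return rank

graph = {"a": ["b"], "b": ["a", "c"], "c": ["a"]}
ranks = pagerank(graph)
print(max(ranks, key=ranks.get))  # the page with the strongest backlink profile
```

In this graph, page "a" is linked to by both "b" and "c" and accordingly ends up with the highest rank, which is exactly the "good backlinks from strong pages" logic that Bing says it still relies on.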

While half of Germany is still enjoying the holidays, we present the SEO news for September, with do-it-yourself search results, the renaissance of an old SEO theory, an e-commerce update featuring Amazon's Alexa, and an important technology update from Russia.

1) Bing lets users optimise direct answers themselves

So far, the quality community "Bing Distill", which Microsoft launched around two years ago, has remained relatively quiet. Now, observations from the USA are attracting attention: users of the search engine can directly revise the answers to commonly asked questions shown in the results (direct answers, e.g. "How do I change a car tyre?") and send optimisation suggestions to Microsoft. In contrast to the fully automated Google system, which is also seeing a sharp increase in the use of instant answers, Microsoft is thus taking a completely different, human-based approach to ensuring the quality and relevance of this popular feature. According to the company, membership of the Bing Distill group is theoretically open to every user.

2) Google stirs up the electoral campaign

The US search engine has, for the first time, invited around 4,500 German politicians to fill the information box to the right of the search results with their own positions on their election programmes. Each politician has a maximum of 500 characters to present their manifesto and appeal to voters. Additionally, each politician can set out three key points of up to 140 characters each. According to the company, the offer is optional and primarily aimed at candidates who are not yet widely known.

3) SEO quality factor with Google – so yes after all?

Search guru Rand Fishkin has recently breathed fresh air into the discussion of an old SEO theory: does Google have a quality factor for evaluating websites that works for organic search in a similar way to the official "Quality Score" in the paid AdWords programme? There, advertisers are informed about the quality of their advertisements, keywords and landing pages, which is certainly an important optimisation aid. With regard to organic search, Google has never commented on the speculation. However, observations from previous years strongly suggest that a high level of user engagement on selected pages (click-through rates, average dwell time, clicks per visit, etc.) can have a positive effect on the visibility of an entire domain. According to Fishkin, the aim should therefore be to improve the performance of entire domains through the targeted quality optimisation of individual pages. This includes not only the development of new subpages, but also the exclusion of content with low user interest.

4) With Amazon’s Alexa, SEO Score has a name

The topic of voice search is increasingly becoming a part of classic e-commerce. After traffic reports and weather forecasts, the sale of goods and services via voice-activated assistants is steadily coming to the fore. In mid-August, it was announced that Google Home would be collaborating with Wal-Mart, the market leader in US retail, which clearly shows the hopes being pinned on the new technology. The fact that the Alexa assistant and the Echo speaker are direct sales channels for Amazon makes things clearer, but not necessarily easier. It is important to know that, on receiving a spoken command, Alexa first browses the user's purchase history for previously ordered items of the same type. If Alexa does not find an identical item, she automatically suggests a product from the "Amazon's Choice" selection. It is therefore essential for retailers to qualify their products for the "Amazon's Choice" programme. Similar to a quality score, a product must be available via Prime and have a high conversion rate, a competitive price and positive reviews.

5) Yandex takes off with artificial intelligence

After Google declared the "AI first" motto at its developer conference in May, thus proclaiming artificial intelligence as the basis of its search technology, the Russian search engine Yandex is now trying to emulate it. The company announced that an update based on self-learning neural networks called "Korolyov" (named after a Soviet space research centre) had gone live. Similarly to Google's RankBrain algorithm, this new technology lets Yandex gain a better understanding of the intention behind rare and complex search queries and apply the detected search intention to entire websites at scale. Until now, Yandex had only used the headings of web content to match search intentions to the relevant pages. This is an important step towards better voice-search expertise for the company.