As the year draws to a close we are looking back at the year’s greatest achievements. This case study dives into the metrics achieved in one of our most successful link-building campaigns and how they resulted in significant measurable benefits for Client X’s site. Client X has seen a dramatic uplift in links, traffic and domain authority thanks to an SEO strategy that included these campaigns.

Client X’s website improvements:

Organic Traffic (not including traffic to customer pages or the creative pieces themselves): +69% year on year

Linking Domains: 1K -> 3K

Domain Authority: 45 -> 51

Why did Client X want to focus on link generation?

Client X exists in a very competitive space.

Client X was recovering from having a low-quality backlink profile

A link-building campaign is something that can run in tandem with other main site improvements without taking up developer resource.

Creative campaigns could gain brand exposure in publications that have not covered the brand previously.

Primary campaign goal:

Generate quality backlinks to Client X’s site to support SEO ranking improvements

What Distilled created:

Distilled has been consulting for Client X for two years. As well as working on technical and SEO strategy, we produced two Creative campaigns in 2017 and started to see results. This led to an even bigger investment over the course of 2018, where we created 7 link-building campaigns for Client X, achieving the following:

245 links (79% followed)

40 links per campaign average

3,500+ social shares

100,620 sessions to Creative pages

This helped improve Domain Authority from 45 to 51 over the course of the year.

Using results from multiple tools

Because no tracking tool has a perfect understanding of the internet, we don’t rely on data from just one source when we review campaign impact. Different tools report different absolute numbers (for instance, Moz’s known linking domains versus SEMrush’s referring domains), so instead we look for agreement in direction across tools, whether that’s progress, stagnation, or decline.
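As a sketch of that cross-check: given each tool’s reported series over the year, we only care whether the directions of travel agree. The Moz figures echo the ones in this case study; the SEMrush series is invented purely for illustration.

```python
def trend_direction(series, tolerance=0.05):
    """Classify a metric series as growth, decline or stagnation
    by its relative change from first to last value."""
    change = (series[-1] - series[0]) / series[0]
    if change > tolerance:
        return "growth"
    if change < -tolerance:
        return "decline"
    return "stagnation"

tools = {
    "Moz linking domains": [977, 1500, 2200, 3000],
    "SEMrush referring domains": [1100, 1700, 2500, 3200],  # invented
}
directions = {name: trend_direction(series) for name, series in tools.items()}
verdict = "tools agree" if len(set(directions.values())) == 1 else "tools disagree"
print(directions, "->", verdict)
```

The tolerance threshold is a judgment call; the point is that absolute counts per tool never need to match for the verdict to be useful.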

Links (Moz):

‘Competitor C’ acquired 2 other relatively large sites during 2018, meaning that they benefited from a big influx of redirected links during this same time period.

Known links: 31.1K -> 186K

Known linking domains: 977 -> 3000

Domain Authority (SEMRush):

Site Visibility (Searchmetrics, Sistrix)

Traffic value (SEMRush)

SEMrush uses rank tracking and PPC data to estimate how much organic traffic a site is getting, and how much it would cost to buy the same amount of traffic through Google Ads. SEMrush doesn’t account for all of Client X’s traffic (we know the real numbers are much higher), but it still shows a marked increase throughout the year and estimates the traffic’s worth at $97,700.

Actual Traffic

To recap Client X’s website improvements:

Organic Traffic (not including customer pages or creative pages): +69% year on year

Linking Domains: 1K -> 3K

Domain Authority: 45->51

Conclusion:

While links are only one part of a balanced SEO approach, this case study demonstrates that the coverage achieved by these pages is helping our strategy for Client X progress in the right direction. We are looking forward to creating more campaigns in 2019, and building on what we have learnt.

Excluding a website’s homepage, category pages generate most of an e-commerce site’s organic traffic. Are any of you surprised by this statement? If it comes as a shock, I have bad news: you might need to reconsider your information architecture. If you have done your job right, you have nothing to worry about.

Curious about how much organic traffic category pages actually account for, I decided to dig into the Google Search Console of a client of Distilled which has been a very successful e-commerce site for several years. These were my findings over the past 6 months (at the time of writing, November 2018).

Bear in mind this is just an example that shows a fictitious URL structure for an average e-commerce site - the level of category and subcategory pages often differs between sites.

Type of page | Proportion of clicks | Example URL
Category pages | 5.0% | example.co.uk/mens-jeans
1st level subcategory pages | 25.0% | example.co.uk/mens-jeans/skinny-jeans
2nd level subcategory pages | 16.5% | example.co.uk/mens-jeans/skinny-jeans/ripped
Homepage | 40.0% | example.co.uk
Non-transactional pages | 5.0% | example.co.uk/about-us
Product pages | 8.5% | example.co.uk/black-ripped-skinny-jeans-1

This simple exercise very much confirms my thesis: category and subcategory pages on e-commerce sites do account for the biggest chunk of organic traffic outside the homepage - in our case, 46.5% of the total.
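For what it’s worth, a breakdown like this can be reproduced from a Search Console URL export with a short script. The classification rules below are illustrative only, matching the fictitious URL structure shown; a real site needs its own rules (and its own product-URL convention).

```python
import re
from urllib.parse import urlparse
from collections import Counter

def classify(url):
    """Bucket a URL into a page type by its path shape.
    Purely illustrative rules for the fictitious structure above."""
    path = urlparse(url).path.strip("/")
    if not path:
        return "homepage"
    segments = path.split("/")
    if segments[0] in ("about-us", "contact", "delivery"):
        return "non-transactional"
    if re.search(r"-\d+$", segments[-1]):  # assumed product ids like ...-jeans-1
        return "product"
    if len(segments) == 1:
        return "category"
    if len(segments) == 2:
        return "1st level subcategory"
    return "2nd level subcategory or deeper"

# (url, clicks) pairs as exported from Google Search Console
rows = [
    ("https://example.co.uk/", 400),
    ("https://example.co.uk/mens-jeans", 50),
    ("https://example.co.uk/mens-jeans/skinny-jeans", 250),
    ("https://example.co.uk/mens-jeans/skinny-jeans/ripped", 165),
    ("https://example.co.uk/black-ripped-skinny-jeans-1", 85),
    ("https://example.co.uk/about-us", 50),
]

clicks = Counter()
for url, n in rows:
    clicks[classify(url)] += n

total = sum(clicks.values())
for page_type, n in clicks.most_common():
    print(f"{page_type}: {100 * n / total:.1f}%")
```

Run on a real export with tens of thousands of rows, the same grouping gives you the click share per page type in seconds.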

So, we have now shown an example of how important these pages are from an organic standpoint, without answering the question of why. Let’s take a step back and look at the bigger picture.

Why are category pages so important for SEO?

Put simply, users are more likely to search for generic, category-like keywords with strong intent rather than very specific product names. If I want to buy a new jumper, chances are I will start searching for broad search queries, such as “men’s jumpers” or “men’s jumpers 2018” with the potential addition of “brand/retailer” to my query, instead of a very precise and long tail search. Who really searches for “Tommy Jeans icon colorblock cable knit jumper in navy multi” anyway unless you are reading from the label, right? For such specific searches, it is your product pages’ job to capture that opportunity and get the user as close to a conversion as possible.

Having optimised category and subcategory pages that match users’ searches makes their life much easier and helps search engines better crawl and understand your site’s structure.

Sitting close to the top of the hierarchy, category pages also benefit from more internal link equity than deeper or isolated pages (more on this later). And let’s not forget about backlink equity: in most instances, category and subcategory pages receive the highest number of external links pointing to them. As of 2018, links remain one of the most important off-site ranking factors, according to reputable industry sources such as SEMrush.

By now three main elements should be clear: for e-commerce sites, category pages are key from a traffic, information architecture and linking point of view. Pretty important! At this stage, the next question is simple: how do we go about creating new pages then?

Creating new category pages: when and why

Before starting with the process, ask yourself the following questions:

What is my main objective when opening a new page? What am I trying to achieve?

If your intent is to capitalise on new keywords that show an opportunity from a search volume standpoint, and/or to improve the hierarchical structure of your site so users can find products more easily, then well done: move on to the next question.

Do I have enough products to support new categories? Are my current category pages thin already or do they contain enough items?

In order for category pages to be relevant and carry enough SEO weight, they should contain ‘some’ products. Having thin or empty category pages makes no sense, and Google will see it the same way: the poor SEO and UX value associated with them will drag the page’s rankings down. There is no magical minimum number I would recommend; just use your logic, check the competition (if your top competitors are selling 100 products and you are selling 10, chances are your category page will not be as strong) and, most importantly, think about the users first. Do you want users to land on a page that displays only 2 products? It is likely they will bounce back to the SERP immediately, and they won’t remember your site for providing a good experience.

When should I think of opening new category pages? Which instances are recommended?

Generally speaking, you should always keep an eye on your categorisation; it is not a one-time task. It is vital that you regularly monitor your SEO performance and spot new opportunities that can help you progress.

As for specific instances, any of the following situations might be a good time to evaluate your category pages. Marketing or branding are pushing for new products? Definitely a good time to think about new category pages. A new trend or term has gone viral? Think about it. 2019 is approaching and you are launching a new collection? Surely a good idea. A site migration is another great chance to re-evaluate your category (and subcategory) pages. Whatever form of migration you are going through (URL restructuring, domain change, platform change, etc.), it is vital to have a plan for your category pages, and re-assessing your full list is a good idea.

Always have a purpose when you create a new page; don’t do it for the sake of it or because of internal pressure that might encourage you to do so. Refer to points 1 and 2 and prove the value of SEO when making this decision. Without a clear idea in mind, you might soon end up with more risks than benefits.

How to identify the opportunity to open new categories

After having touched on some key considerations before opening new category pages, let’s now go through the process of how to go about it.

Keyword research, what else?

Everything starts with keyword research, the backbone of any content and SEO strategy.

When you approach this task, try and keep an open mind. Use different tools and methodologies, don’t be afraid to experiment.

Here at Distilled, we love Ahrefs so check out a post on how to use it for keyword research. Their content gap feature is also brilliant, so I recommend taking advantage of that.

Here is my personal list of things I use when I want to expand my keyword research a bit further:

Keyword Planner (if you have an AdWords account with decent spending, otherwise data ranges are a downer)

SEMrush: particularly interesting for competitive keyword research (more on that later)

Keyword Tool: particularly useful to provide additional suggestions, and also provides data for many platforms other than Google

Answer The Public: great tool to find long tail keywords, especially among questions (useful for featured snippets), prepositions and comparisons

(Note: Answer The Public obscures some of its data unless you pay for the pro version.)

If you find valuable terms with significant search volume, then bingo! That is enough to support the logic of opening new category pages. If people are searching for a term, why not have a dedicated page about it?

Look at your current rankings

Whatever method or tool you are using to track your existing organic visibility, dig a bit deeper and try to find the following instances:

Are my category pages ranking for any unintentional terms? For example, if my current /mens-jumpers/ page is somehow ranking (maybe not well) for the keyword “cardigans”, this is clearly an opportunity, don’t you think? Monitor the number of clicks those unintentional keywords are bringing and check their search volume before making a decision.

Is my category page ranking for different variations of the same product? Say your /mens-jumpers/ page is also ranking (maybe not well) for “roll neck jumpers”; this might be an opportunity to create a subcategory page and capitalise on the demand for that specific product type.

Are my product pages ranking for category-level terms? This is clearly an indication I might need a category page! Not only will I be able to capitalise on the search volume of that particular category-level keyword, but I would be able to provide a better experience for the user who will surely expect to see a proper category page with multiple products.

Last but not least: are my category pages being outranked by my competitors’ subcategory pages for certain keywords? For instance, you dig into your keyword tracking platform of choice and see that, for a group of terms, your /mens-jeans/ page is outranked by more refined subcategory pages such as /slim-fit-jeans/ or /black-jeans/. Chances are your competitors have done their research and targeted clear sets of keywords by opening dedicated subcategory pages while you have not - keep reading to learn how to quickly capitalise on competitors’ opportunities.
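The third check above (product pages ranking for category-level terms) can be partly automated from a Search Console query + landing page export. A minimal sketch, assuming the purely illustrative convention that product URLs end in a numeric id; the query-length proxy and click threshold are assumptions, not a standard:

```python
def looks_like_product_page(path):
    # assumed convention: product URLs end in a numeric id, e.g. ...-jeans-1
    return path.rstrip("/").rsplit("-", 1)[-1].isdigit()

def category_term_opportunities(rows, min_clicks=10):
    """Short queries (three words or fewer, a rough proxy for
    category-level intent) that currently land on product pages."""
    flagged = [
        (query, page, clicks)
        for query, page, clicks in rows
        if clicks >= min_clicks
        and len(query.split()) <= 3
        and looks_like_product_page(page)
    ]
    return sorted(flagged, key=lambda r: -r[2])

# (query, landing_page, clicks) rows from a Search Console export
rows = [
    ("mens skinny jeans", "/mens-jeans/skinny-jeans", 400),
    ("ripped jeans", "/black-ripped-skinny-jeans-1", 55),
    ("tommy jeans icon colorblock cable knit jumper", "/tommy-jumper-7", 3),
]
for query, page, clicks in category_term_opportunities(rows):
    print(f"consider a category page for '{query}' (currently {page})")
```

Anything the script flags still needs a human sanity check against search volume before you commit to a new page.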

Check your competition

Most of the time, your competitors have already spotted these opportunities, so what are you waiting for? Auditing your competition is a necessary step to find categories you are not covering.

Here is my advice when it comes to analysing competitors’ category and subcategory pages:

Start by checking their sites manually. Their top navigation combined with the faceted navigation menu will give you a good idea of their site structure and keyword coverage.

Use tools like Ahrefs or SEMrush to audit your competitors. If interested in how to use SEMrush for this purpose, check out this post from another Distiller.

Do a crawl of competitor sites to gather information in bulk about their category URLs and meta data: page titles, meta descriptions and headings. Their most important keywords will be stored there, so it is a great place to investigate for new opportunities. My favourite tool in this regard is Screaming Frog.
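A dedicated crawler like Screaming Frog is the right tool for this at scale, but as an illustration of what it extracts per page, here is a minimal sketch that pulls the title, meta description and h1 out of a page’s HTML using only the Python standard library (the sample HTML is invented):

```python
from html.parser import HTMLParser

class MetaExtractor(HTMLParser):
    """Pulls out the on-page elements worth auditing in bulk:
    <title>, the meta description and the first <h1>."""
    def __init__(self):
        super().__init__()
        self.title = self.h1 = self.description = None
        self._in = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("title", "h1"):
            self._in = tag
        elif tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content")

    def handle_endtag(self, tag):
        if tag == self._in:
            self._in = None

    def handle_data(self, data):
        if self._in == "title" and self.title is None:
            self.title = data.strip()
        elif self._in == "h1" and self.h1 is None:
            self.h1 = data.strip()

html = """<html><head><title>Cheap Mens V-Neck Jumpers | Example</title>
<meta name="description" content="Shop men's v-neck jumpers."></head>
<body><h1>Men's V-Neck Jumpers</h1></body></html>"""

parser = MetaExtractor()
parser.feed(html)
print(parser.title, "|", parser.h1)
```

Run over a list of fetched competitor category URLs, the extracted titles and headings give you their target keywords in bulk.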

Content on category pages

Different SEOs have different views on this as it is quite a controversial topic.

Just take a look at an internal Slack conversation on the topic - we surely like to debate SEO at Distilled!

Some say that descriptions on category pages are purely done for SEO purposes and have very little value for the user. As with many other things, it depends on how it is done, in my opinion. Many e-commerce sites out there tend to have a poorly written 150 to 250 character description of the category, often stuffed with keywords and placed at either the top or the bottom of the page.

Look at the example below from a fashion retailer: the copy is placed at the bottom of a handbags and purses page, so the user would need to scroll all the way down just to see it, but most importantly it does not add any value as it is extremely generic and too keyword-rich.

My preference is the following:

short but unique description which can be expanded/collapsed by the user (especially convenient on mobile where space is precious)

keyword-informed description written in a way that is useful to the user and provides additional information compared to the meta data (yes, make that extra effort)

placement at the top of the page and not at the bottom so it gets noticed by the user and the search engine’s bot

By using the description as a useful addition to our page’s meta data, we are helping Google understand what the page is about - especially for new pages that the search engine is crawling for the first time.

Also, let’s not forget about internal linking opportunities such content may be able to offer, especially for weaker/newer pages we may have on the website (more on this later).

Looking closely at Next’s Men’s Knitwear page, we can see how they used the copy for internal linking purposes.

Have you considered your Quality Score?

Also, a very important note: a good description hugely helps an underrated element of our digital marketing, PPC Quality Score, which is an aggregated estimate of the quality of your ads. Since category pages tend to be the main destination for PPC ads, we should do whatever is in our power to improve the quality and efficiency of our work.

Landing page experience is “Google Ads’ measure of how well your website gives people what they’re looking for when they click your ad” and is one of the most important elements of a keyword’s Quality Score. By using the category page’s content to cover some of the keywords we are targeting in our ad copy, we can significantly improve our overall Quality Score, which directly impacts our CPC and hence the efficiency of the whole account!

What about the risks of creating new pages?

Creating a new category page is a delicate decision that should be thought through carefully, as it does have its risks.

Be aware of keyword cannibalisation

You have decided to create a new category page and are about to write great meta data and a description for it, off the back of the keyword research and other tips provided above. Before rushing into copywriting, take a minute to evaluate the potential risk of keyword cannibalisation: make sure your new page’s meta data does not cannibalise your existing pages. This often-forgotten check will save you a lot of time further down the line.

The risk of cannibalisation is very real: having pages which are too closely related from an SEO standpoint, especially when it comes to on-page elements (title tags, headings in particular) and content on the page, can cause some serious setbacks. As a result, the pages suffering from this problem will not live up to their full organic potential and will compete for the same keywords. Search engines will struggle to identify which page to favour for a certain keyword / group of keywords and you will end up being worse off.

Not only is it overly optimised and too long, but it also clashes with its subcategory pages, which are optimised for most of the terms already included in the parent page’s title.

Their Men’s V Neck Jumpers page title is the following:

Cheap Mens V-Neck Jumpers | MandM Direct

Given their subcategory page (Men’s V-Neck Jumpers, for instance), I personally would have re-optimised the parent page’s title in order to allow the subcategory page to live up to its full potential for its key terms:

Re-optimised Men’s Knitwear page title:

Mens Jumpers, Cardigans & Knitwear - UK Sale | MandM Direct

How do you prevent this from happening? Do your research, monitor your keywords and landing pages and make sure to write unique meta data & on-page content. Also, don’t be afraid to re-optimise and experiment with your existing meta data when opening new categories. Sometimes it will take you more than one attempt to get things right.
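Monitoring for cannibalisation can also be partly automated. A minimal sketch that flags queries where more than one landing page earns meaningful clicks in a Search Console export (the threshold and sample data are illustrative assumptions):

```python
from collections import defaultdict

def cannibalised_queries(rows, min_clicks=5):
    """Group Search Console rows by query and flag queries where
    more than one page receives meaningful clicks - a common
    symptom of cannibalisation."""
    pages_by_query = defaultdict(dict)
    for query, page, clicks in rows:
        if clicks >= min_clicks:
            pages_by_query[query][page] = clicks
    return {q: pages for q, pages in pages_by_query.items() if len(pages) > 1}

# (query, landing_page, clicks) rows from a Search Console export
rows = [
    ("mens v neck jumpers", "/mens-knitwear", 120),
    ("mens v neck jumpers", "/mens-knitwear/v-neck", 90),
    ("mens cardigans", "/mens-knitwear/cardigans", 200),
]
for query, pages in cannibalised_queries(rows).items():
    print(query, "->", sorted(pages))
```

A flagged query isn’t automatically a problem (the homepage and a category page ranking together can be fine); it is a prompt to compare the competing pages’ titles and headings.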

Crawl budget concerns

Google defines crawl budget as “the number of web pages or URLs Googlebot can or wants to crawl from your website”.

One of the arguments against opening new category pages is crawl budget consumption. For large e-commerce sites with millions of pages, opening many new category pages carries a risk: some parts of your site might no longer be crawled, or crawled less often.

In my opinion, this is a concern only for (very) large e-commerce sites which are not necessarily well-maintained from an SEO point of view. Gary Illyes from Google seems to be on my side:

Internal linking equity

This is more of a real problem than crawl budget, in my modest opinion, and here is why: creating additional pages means that the internal linking equity across your site gets re-distributed. If not closely monitored, you might end up diluting it without a clear process in mind or, worse, wasting it across the wrong pages.

When creating new pages, make sure to consider how your internal link equity is impacted: needless to say, opening 10 pages is very different from opening 1,000! Focus on creating more inlinks for important pages by exploring options such as navigation tabs (main and side navigation) and on-page content (remember the paragraph above?).

The rule of thumb here is simple: when approaching new category pages, don’t forget to think about your internal linking strategy.
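A simple way to sanity-check that strategy is to count inlinks per page from a crawler’s (source, target) edge export and confirm your new category pages aren’t being starved. A minimal sketch, with an invented edge list:

```python
from collections import Counter

def inlink_counts(edges):
    """Count internal links pointing at each page from a
    (source, target) edge list, e.g. exported from a site crawler."""
    return Counter(target for _, target in edges)

edges = [
    ("/", "/mens-jeans"),
    ("/", "/mens-knitwear"),
    ("/mens-jeans", "/mens-jeans/skinny-jeans"),
    ("/mens-knitwear", "/mens-jeans/skinny-jeans"),
]
for page, n in inlink_counts(edges).most_common():
    print(page, n)
```

Pages near the bottom of this list that you care about are the ones to surface in navigation or on-page copy.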

Conclusion

Category pages are the backbone of e-commerce sites, so they should be closely monitored by SEOs and webmasters. They are vital from an information architecture and internal (and external) linking point of view, and they attract the largest share of organic traffic beyond the homepage. By following the above tips, it becomes easier to identify opportunities where new category pages can be ‘opened’ in order to capture additional organic traffic.

I am curious to hear other people’s opinions on the topic, so please get in touch with me or Distilled by using the comments below or my email: samuel.mangialavori@distilled.net.

Interviewing Google’s John Mueller at SearchLove: domain authority metrics, sub-domains vs. sub-folders and more

Will Critchlow, Mon, 10 Dec 2018

I was fortunate enough to be able to interview Google’s John Mueller at SearchLove and quiz him about domain authority metrics, sub-domains vs. sub-folders, and how bad rank tracking really is.

I have previously written and spoken about how to interpret Google’s official statements, technical documentation, engineers’ writing, patent applications, acquisitions, and more (see: From the Horse’s Mouth and the associated video as well as posts like “what do dolphins eat?”). When I got the chance to interview John Mueller from Google at our SearchLove London 2018 conference, I knew that there would be many things that he couldn’t divulge, but there were a couple of key areas in which I thought we had seen unnecessary confusion, and where I thought that I might be able to get John to shed some light. [DistilledU subscribers can check out the videos of the rest of the talks here - we’re still working on permission to share the interview with John].

Mueller is Webmaster Trends Analyst at Google, and these days he is one of the most visible spokespeople for Google. He is a primary source of technical search information in particular, and is one of the few figures at Google who will answer questions about (some!) ranking factors, algorithm updates and crawling / indexing processes.

New information and official confirmations

In the post below, I have illustrated a number of the exchanges John and I had that I think revealed either new and interesting information, or confirmed things we had previously suspected, but had never seen confirmed before on the record.

I thought I’d start, though, by outlining what I think were the most substantial details:

Confirmed: Google has domain-level authority metrics

We had previously seen numerous occasions where Google spokespeople had talked about how metrics like Moz's Domain Authority (DA) were proprietary external metrics that Google did not use as ranking factors (this, in response to many blog posts and other articles that conflated Moz's DA metric with the general concept of measuring some kind of authority for a domain). I felt that there was an opportunity to gain some clarity.

"We've seen a few times when people have asked Google: "Do you use domain authority?" And this is an easy question. You can simply say: "No, that's a proprietary Moz metric. We don't use Domain Authority." But, do you have a concept that's LIKE domain authority?"

Confirmed: Google does sometimes treat sub-domains differently

I expect that practically everyone around the industry has seen at least some of the long-running back-and-forth between webmasters and Googlers on the question of sub-domains vs sub-folders (see for example this YouTube video from Google and this discussion of it). I really wanted to get to the bottom of this, because to me it represented a relatively clear-cut example of Google saying something that was different to what real-world experiments were showing.

I decided to set it up by coming from this angle: by acknowledging that we can totally believe that there isn’t an algorithmic “switch” at Google that classifies things as sub-domains and ranks them deliberately lower, but that we do regularly see real-world case studies showing uplifts from moving, and so asking John to think about why we might see that happen. He said [emphasis mine]:

in general, we ... kind of where we think about a notion of a site, we try to figure out what belongs to this website, and sometimes that can include sub-domains, sometimes that doesn't include sub-domains.

Sometimes that includes sub-directories, sometimes that doesn't include specific sub-directories. So, that's probably where that is coming from where in that specific situation we say, "Well, for this site, it doesn't include that sub-domain, because it looks like that sub-domain is actually something separate. So, if you fold those together then it might be a different model in the end, whereas for lots of other sites, we might say, "Well, there are lots of sub-domains here, so therefore all of these sub-domains are part of the main website and maybe we should treat them all as the same thing."

And in that case, if you move things around within that site, essentially from a sub-domain to a sub-directory, you're not gonna see a lot of changes. So, that's probably where a lot of these differences are coming from. And in the long run, if you have a sub-domain that we see as a part of your website, then that's kind of the same thing as a sub-directory.

To paraphrase that, the official line from Google is:

Google has a concept of a “site” (see the discussion above about domain-level metrics)

Sub-domains (or even sub-folders) can be viewed as not a part of the rest of the site under certain circumstances

If we are looking at a sub-domain that Google views as not a part of the rest of the site, then webmasters may see an uplift in performance by moving the content to a sub-folder (that is viewed as part of the site)

Unfortunately, I couldn’t draw John out on the question of how one might tell in advance whether your sub-domain is being treated as part of the main site. As a result, my advice remains the same as it used to be:

In general, create new content areas in sub-folders rather than sub-domains, and consider moving sub-domains into sub-folders with appropriate redirects etc.

The thing that’s changed is that I think that I can now say this is in line with Google’s official view, whereas it used to be at odds with their official line.

Learning more about the structure of webmaster relations

Another area that I was personally curious about going into our conversation was about how John’s role fits into the broader Google teams, how he works with his colleagues, and what is happening behind the scenes when we learn new things directly from them. Although I don’t feel like we got major revelations out of this line of questioning, it was nonetheless interesting:

I assumed that after a year, it [would be] like okay, we have answered all of your questions. It's like we're done. But there are always new things that come up, and for a lot of that we go to the engineering teams to kind of discuss these issues. Sometimes we talk through with them with the press team as well if there are any sensitivities around there, how to frame it, what kind of things to talk about there.

For example, I was curious to know whether, when we ask a question to which John doesn’t already know the answer he reviews the source code himself, turns to an engineer etc. Turns out:

He does not generally attend search quality meetings (timezones!) and does not review the source code directly

He does turn to engineers from around the team to find specialists who can answer his questions, but does not have engineers dedicated to webmaster relations

For understandable reasons, there is a general reluctance among engineers to put their heads above the parapet and be publicly visible talking about how things work in their world. We did dive into one particularly confusing area that turned out to be illuminating - I made the point to John that we would love to get more direct access to engineers to answer these kinds of edge cases:

Concrete example: the case of noindex pages becoming nofollow

At the back end of last year, John surprised us with a statement that pages that are noindex will, in the long run, eventually be treated as nofollow as well.

Although it’s a minor detail in many ways, many of us felt that this exposed gaps in our mental model. I certainly felt that the existence of the “noindex, follow” directive meant that there was a way for pages to be excluded from the index, but have their links included in the link graph.

What I found more interesting than the revelation itself was what it exposed about the thought process within Google. What it boiled down to was that the folk who knew how this worked - the engineers who’d built it - had a curse of knowledge. They knew that there was no way a page that was dropped permanently from the index could continue to have its links in the link graph, but they’d never thought to tell John (or the outside world) because it had never occurred to them that those on the outside hadn’t realised it worked this way [emphasis mine]:

it's been like this for a really long time, and it's something where, I don't know, in the last year or two we basically went to the team and was like, "This doesn't really make sense. When people say noindex, we drop it out of our index eventually, and then if it's dropped out of our index, there's canonical, so the links are kind of gone. Have we been recommending something that doesn't make any sense for a while?" And they're like, "Yeah, of course."

More interesting quotes from the discussion

Our conversation covered quite a wide range of topics, and so I’ve included some of the other interesting snippets here:

Algorithm changes don’t map easily to actions you can take

Googlers don’t necessarily know what you need to do differently in order to perform better, and especially in the case of algorithm updates, their thinking of “search results are better now than they were before” doesn’t translate easily into “sites that have lost visibility in this update can do XYZ to improve from here”. My reading of this situation is that there is ongoing value in the work SEOs do to interpret algorithm changes and the longer-running directional themes in Google’s changes, and to use that to guide webmasters’ roadmaps:

[We don’t necessarily think about it as] “the webmaster is doing this wrong and they should be doing it this way”, but more in the sense “well, overall things don't look that awesome for this search result, so we have to make some changes." And then kind of taking that, well, we improved these search results, and saying, "This is what you as a webmaster should be focusing on", that's really hard.

Do Googlers understand the Machine Learning ranking factors?

I’ve speculated that there is a long-run trend towards less explainability of search rankings, and that this will impact search engineers as well as those of us on the outside. We did at least get clarity from John that at the moment, they primarily use ML to create ranking factors that feed into more traditional ranking algorithms, and that they can debug and separate the parts (rather than a ML-generated SERP which would be much less inspectable):

[It’s] not the case that we have just one machine learning model where it's like oh, there's this Google bot that crawls everything and out comes a bunch of search results and nobody knows what happens in between. It's like all of these small steps are taking place, and some of them use machine learning.

And yes, they do have secret internal debugging tools, obviously, which John described as:

Kind of like search console but better

Why are result counts soooo wrong?

We had a bit of back-and-forth on result counts. I get that Google has told us that they aren’t meant to be exact, and are just approximations. So yeah, sure, but when you sometimes get a site: query that claims 13m results, then click to page 2 and find that there are actually only 11 - not 11m, just 11 - you say to yourself that this isn’t a particularly helpful approximation. We didn’t really get any further on this than the official line we’ve heard before, but if you need that confirmed again:

We have various counters within our systems to try to figure out how many results we approximately have for some of these things, and when things like duplicate content show up, or we crawl a site and it has a ton of duplicate content, those counters might go up really high.

But actually, in indexing and later stage, I'm gonna say, "Well, actually most of these URLs are kinda the same as we already know, so we can drop them anyway."

…

So, there's a lot of filtering happening in the search results as well for [site: queries], where you'll see you can see more. That helps a little bit, but it's something where you don't really have an exact count there. You can still, I think, use it as a rough kind of gauge. It's like is there a lot, is it a big site? Does it end up running into lots of URLs that are essentially all eliminated in the end? And you can kinda see little bit there. But you don't really have a way of getting the exact count of number of URLs.

More detail on the domain authority question

On the domain authority question that I mentioned above (not the Moz Domain Authority proprietary metric, but the general concept of a domain-level authority metric), here’s the rest of what John said:

I don't know if I'd call it authority like that, but we do have some metrics that are more on a site level, some metrics that are more on a page level, and some of those site wide level metrics might kind of map into similar things.

…

the main one that I see regularly is you put a completely new page on a website. If it's an unknown website or a website that we know tends to be lower quality, then we probably won't pick it up as quickly, whereas if it's a really well-known website where we'll kind of be able to trust the content there, we might pick that up fairly quickly, and also rank that a little bit better.

…

it's not so much that it's based on links, per se, but kind of just this general idea that we know this website is generally pretty good, therefore if we find something unknown on this website, then we can kind of give that a little bit of value as well.

…

At least until we know a little bit better that this new piece of thing actually has these specific attributes that we can focus on more specifically.

Maybe put your nsfw and sfw content on different sub-domains

I talked above about the new clarity we got on the sub-domain vs. sub-folder question and John explained some of the “is this all one site or not” thinking with reference to safe search. If you run a site with not safe for work / adult content that might be filtered out of safe search and have other content you want to have appear in regular search results, you could consider splitting that apart - presumably onto a different sub-domain - and Google can think about treating them as separate sites:

the clearer we can separate the different parts of a website and treat them in different ways, I think that really helps us. So, a really common situation is also anything around safe search, adult content type situation where you have maybe you start off with a website that has a mix of different kinds of content, and for us, from a safe search point of view, we might say, "Well, this whole website should be filtered by safe search."

Whereas if you split that off, and you make a clearer section that this is actually the adult content, and this is kind of the general content, then that's a lot easier for our algorithms to say, "Okay, we'll focus on this part for safe search, and the rest is just a general web search."

John can “kinda see where [rank tracking] makes sense”

I wanted to see if I could draw John into acknowledging why marketers and webmasters might want or need rank tracking - my argument being that it’s the only way of getting certain kinds of competitive insight (since you only get Search Console for your own domains) and also that it’s the only way of understanding the impact of algorithm updates on your own site and on your competitive landscape.

I struggled to get past the kind of line that says that Google doesn’t want you to do it, it’s against their terms, and some people do bad things to hide their activity from Google. I have a little section on this below, but John did say:

from a competitive analysis point of view, I kinda see where it makes sense

But the ToS thing causes him problems when it comes to recommending tools:

how can we make sure that the tools that we recommend don't suddenly start breaking our terms of service? It's like how can we promote any tool out there when we don't know what they're gonna do next.

We’ve come a long way

It was nice to end with a shout out to everyone working hard around the industry, as well as a little plug for our conference [emphasis mine, obviously]:

I think in general, I feel the SEO industry has come a really long way over the last, I don't know, five, ten years, in that there's more and more focus on actual technical issues, there's a lot of understanding out there of how websites work, how search works, and I think that's an awesome direction to go. So, kind of the voodoo magic that I mentioned before, that's something that I think has dropped significantly over time.

And I think that's partially to all of these conferences that are running, like here. Partially also just because there are lots of really awesome SEOs doing awesome stuff out there.

Personal lessons from conducting an interview on stage

Everything above is about things we learned or confirmed about search, or about how Google works. I also learned some things about what it’s like to conduct an interview, and in particular what it’s like to do so on stage in front of lots of people.

I mean, firstly, I learned that I enjoy it, so I do hope to do more of this kind of thing in the future. In particular, I found it a lot more fun than chairing a panel. In my personal experience, chairing a panel (which I’ve done more of in the past) requires a ton of mental energy on making sure that people are speaking for the right amount of time, that you’re moving them onto the next topic at the right moment, that everyone is getting to say their piece, that you’re getting actually interesting content etc. In a 1:1 interview, it’s simple: you want the subject talking as much as possible, and you can focus on one person’s words and whether they are interesting enough to your audience.

In my preparation, I thought hard about how to make sure my questions were short but open, and self-contained enough to be comprehensible to John and the audience, and to allow John to answer them well. I think I did a reasonable job but can definitely continue practicing to get my questions shorter. Looking at the transcript, I did too much of the talking. Having said that, my preparation was valuable. It was worth understanding John’s background and history first, gathering my thoughts, and giving him enough information about my main lines of questioning that he could go looking for information he might not have had at his fingertips. I think I got that balance roughly right: enabling him to prep a reasonable amount while keeping a couple of specific questions for on the day.

I also need to get more agile and ask more follow-ups and continuation questions - this is hard because you are having to think on your feet. I think I did it reasonably well in areas where I’d deliberately prepped to do it. This was mainly in the more controversial areas where I knew what John’s initial line might be, but also knew what I ultimately wanted to get out of it or dive deeper into. I found it harder in the areas where I hadn’t expected to come away without quite 100% of what I was looking for. It’s surprisingly hard to parse everything that’s just been said and figure out on the fly whether it’s interesting, new, and complete.

And that’s all from the comfort of the interrogator’s chair. It’s harder to be the questioned than the questioner, so thank you to John for agreeing to come, for his work in the prep, and for being a good sport as I poked and prodded at what he’s allowed to talk about.

I also got to see one of his 3D-printed Googlebot-in-a-skirt characters - a nice counterbalance to the gender assumptions that are too common in technical areas.

Things John didn’t say

There are a handful of areas where I wish I’d thought quicker on my feet or where I couldn’t get deeper than the PR line:

"Kind of like Search Console"

I don’t know if I’d have been able to get more out of him even if I’d pushed, but looking back at the conversation, I think I gave up too quickly, and gave John too much of an “out” when I was asking about their internal toolset. He said it was “kind of like Search Console” and I put words in his mouth by saying “but better”. I should have dug deeper and asked for some specific information they can see about our sites that we can’t see in Search Console.

John can “kinda see where [rank tracking] makes sense”

I promised above to get a bit deeper into our rank tracking discussion. I made the point that “there are situations where this is valuable to us, we feel. So, yes we get Search Console data for our own websites, but we don't get it for competitors, and it's different. It doesn't give us the full breadth of what's going on in a SERP, that you might get from some other tools.”

We get questions from clients like, "We feel like we've been impacted by update X, and if we weren't rank tracking, it's very hard for us to go back and debug that." And so I asked John “What would your recommendation be to consultants or webmasters in those situations?”

I think that's kinda tricky. I think if it's your website, then obviously I would focus on Search Console data, because that's really the data that's actually used when we showed it to people who are searching. So, I think that's one aspect where using external ranking tracking for your own website can lead to misleading answers. Where you're seeing well, I'm seeing a big drop in my visibility across all of these keywords, and then you look in Search Console and it's like, well nobody's searching for these keywords, who cares if I'm ranking for them or not?

…

From our point of view, the really tricky part with all of these external tools is they scrape our search results, so it's against our terms of service, and one thing that I notice kind of digging into that a little bit more is a lot of these tools do that in really sneaky ways.

(Yes, I did point out at this point that we’d happily consume an API).

They do things like they use proxies on mobile phones. It's like you download an app, it's a free app for your phone, and in the background it's running Google queries, and sending the results back to them. So, all of these kind of sneaky things where in my point of view, it's almost like borderline malware, where they're trying to take users' computers and run queries on them.

It feels like something that's like, I really have trouble supporting that. So, that's something, those two aspects, is something where we're like, okay, from a competitive analysis point of view, I kinda see where it makes sense, but it's like where this data is coming from is really questionable.

Ultimately, John acknowledged that “maybe there are ways that [Google] can give you more information on what we think is happening” but I felt like I could have done a better job of pushing for the need for this kind of data on competitive activity, and on the market as a whole (especially when there is a Google update). It’s perhaps unsurprising that I couldn’t dig deeper than the official line here, nor could I have expected to get a new product update about a whole new kind of competitive insight data, but I remain a bit unsatisfied with Google’s perspective. I feel that tools which aggregate the shifts in the SERPs when Google changes their algorithm, and tools that let us understand the SERPs where our sites appear, are both valuable - and that Google is fixated on the ToS without acknowledging why this data is needed.

Are there really strong advocates for publishers inside Google?

John acknowledged being the voice of the webmaster in many conversations about search quality inside Google, but he also claimed that the engineering teams understand and care about publishers too:

the engineering teams, [are] not blindly focused on just Google users who are doing searches. They understand that there's always this interaction with the community. People are making content, putting it online with the hope that Google sees it as relevant and sends people there. This kind of cycle needs to be in place and you can't just say “we're improving search results here and we don't really care about the people who are creating the content”. That doesn't work. That's something that the engineering teams really care about.

I would have liked to have pushed a little harder on the changing “deal” for webmasters as I do think that some of the innovations that result in fewer clicks through to websites are fundamentally changing that. In the early days, there was an implicit deal that Google could copy and cache webmasters’ copyrighted content in return for driving traffic to them, and that this was a socially good deal. It even got tested in court [Wikipedia is the best link I’ve found for that].

When the copying extends so far as to remove the need for the searcher to click through, that deal is changed. John managed to answer this cleverly by talking about buying direct from the SERPs:

We try to think through from the searcher side what the ultimate goal is. If you're an ecommerce site and someone could, for example, buy something directly from the search results, they're buying from your site. You don't need that click actually on your pages for them to actually convert. It's something where when we think that products are relevant to show in the search results and maybe we have a way of making it more such that people can make an informed choice on which one they would click on, then I think that's an overall win also for the whole ecosystem.

I should have pushed harder on the publisher examples - I’m reminded of this fantastic tweet from 2014. At least I know I still have plenty more to do.

So. Thank you John for coming to SearchLove, and for being as open with us as you were, and thank you to everyone behind the scenes who made all this possible.

Finally: to you, the reader - what do you still want to hear from Google? What should I dig deeper into and try to get answers for you about next time? Drop a comment below or drop me a line on Twitter.

An introduction to HTTP/2 for SEOs

Tom Anthony - Wed, 05 Dec 2018 - https://www.distilled.net/resources/an-introduction-to-http2-for-seos/

In the mid 90s there was a famous incident where an email administrator at a US University fielded a phone call from a professor who was complaining his department could only send emails 500 miles. The professor explained that whenever they tried to email anyone farther away their emails failed -- it sounded like nonsense, but it turned out to actually be happening. To understand why, you need to realise that the speed of light actually has more impact on how the internet works than you may think. In the email case, the timeout for connections was set to about 6 milliseconds - if you do the maths that is about the time it takes for light to travel 500 miles.

We’ll be talking about trucks a lot in this blog post!

The time that it takes for a network connection to open across a distance is called latency, and it turns out that latency has a lot to answer for. Latency is one of the main issues that affects the speed of the web, and was one of the primary drivers for why Google started inventing HTTP/2 (it was originally called SPDY when they were working on it, before it became a web standard).

HTTP/2 is now an established standard and is seeing a lot of use across the web, but it is still not as widespread as it could be across most sites. It is an easy opportunity to improve the speed of your website, but it can be fairly intimidating to try to understand.

In this post I hope to provide an accessible top-level introduction to HTTP/2, specifically targeted towards SEOs. I do brush over some parts of the technical details and don’t cover all the features of HTTP/2, but my aim here isn’t to give you an exhaustive understanding, but instead to help you understand the important parts in the most accessible way possible.

HTTP 1.1 - The Current Norm

Currently, when you request a web page or other resource (such as images, scripts, CSS files etc.), your browser speaks HTTP to a server in order to communicate. The current version is HTTP/1.1, which has been the standard for the last 20 years, with no changes.

Anatomy of a Request

We are not going to drown in the deep technical details of HTTP too much in this post, but we are going to quickly touch on what a request looks like. There are a few bits to a request:

The top line here is saying what sort of request this is (GET is the normal sort of request, POST is the other main one people know of), and what URL the request is for (in this case /anchorman/) and finally which version of HTTP we are using.

The second line is the mandatory ‘Host’ header, which is part of all HTTP/1.1 requests and covers the situation where a single webserver may be hosting multiple websites and needs to know which one you are looking for.

Finally, there will be a variety of other headers, which we are not going to get into. In this case I’ve shown the User-Agent header, which indicates which sort of device and software (browser) you are using to connect to the website.
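Putting those pieces together, a raw HTTP/1.1 request is just a few lines of text. Here is a minimal sketch in Python (the /anchorman/ path comes from the example above; the hostname and User-Agent value are illustrative):

```python
# Build the raw text of the HTTP/1.1 request described above:
# a request line, the mandatory Host header, and other headers
# such as User-Agent, each line separated by CRLF.
request = (
    "GET /anchorman/ HTTP/1.1\r\n"   # method, URL path, HTTP version
    "Host: www.example.com\r\n"      # which site on this server we want
    "User-Agent: Mozilla/5.0\r\n"    # what device/browser is asking
    "\r\n"                           # blank line ends the headers
)

# Split the request line back apart to show its three parts.
request_line = request.split("\r\n")[0]
method, path, version = request_line.split(" ")
print(method, path, version)  # → GET /anchorman/ HTTP/1.1
```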

HTTP = Trucks!

In order to help explain and understand HTTP and some of the issues, I’m going to draw an analogy between HTTP and … trucks! We are going to imagine that an HTTP request being sent from your browser is a truck that has to drive from your browser over to the server:

A truck represents an HTTP request/response to a server

In this analogy, we can imagine that the road itself is the network connection (TCP/IP, if you want) from your computer to the server:

The road is a network connection - the transport layer for our HTTP Trucks

Then a request is represented by a truck, that is carrying a request in it:

HTTP Trucks carry a request from the browser to the server

The response is the truck coming back with a response, which in this case is our HTML:

HTTP Trucks carry a response back from the server to the browser

“So what is the problem?! This all sounds great, Tom!” - I can hear you all saying. The problem is that in this model, anyone can stare down into the truck trailers and see what they are hauling. Should an HTTP request contain credit card details, personal emails, or anything else sensitive, anybody can see your information.

HTTP Trucks aren’t secure - people can peek at them and see what they are carrying

HTTPS

HTTPS was designed to combat the issue of people being able to peek into our trucks and see what they are carrying.

Importantly, HTTPS is essentially identical to HTTP - the trucks and the requests/responses they transport are the same as they were. The response codes and headers are all the same.

The difference all happens at the transport (network) layer; we can imagine it as a tunnel over our road:

In HTTPS, requests & responses are the same as HTTP. The road is secured.

In the rest of the article, I’ll imagine we have a tunnel over our road, but won’t show it - it would be boring if we couldn’t see our trucks!

Impact of Latency

So the main problem with this model is related to the top speed of our trucks. In the 500-mile email introductory story we saw that the speed of light can have a very real impact on the workings of the internet.

HTTP Trucks cannot go faster than the speed of light.

HTTP requests and many HTTP responses tend to be quite small. However, our trucks can only travel at the speed of light, and so even these small requests can take time to go back and forth from the user to the website. It is tempting to think this won’t have a noticeable impact on website performance, but it is actually a real problem…

The farther the distance of the network connection between a user’s browser and the web server (the length of our ‘road’) the farther the request and response have to travel, which means they take longer.
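The numbers involved are easy to sanity-check. As a rough sketch (treating the timeout in the 500-mile story as a round trip, and ignoring that signals in fibre travel slower than light in a vacuum, plus all server processing time):

```python
# Light travels roughly 186,282 miles per second, i.e. ~186.3 miles/ms.
SPEED_OF_LIGHT_MILES_PER_MS = 186.3

# The 500-mile email story: a ~6ms timeout allows ~6ms of round-trip
# travel, i.e. roughly 3ms each way at the speed of light.
one_way_miles = 3 * SPEED_OF_LIGHT_MILES_PER_MS
print(round(one_way_miles))  # ≈ 559 miles - the "500-mile" email limit

# A user 3,000 miles from a web server: the physics floor on one
# request/response round trip, before any real-world overhead.
round_trip_ms = 2 * 3000 / SPEED_OF_LIGHT_MILES_PER_MS
print(round(round_trip_ms, 1))  # ≈ 32.2ms per round trip, minimum
```

Real round trips are considerably slower than this floor, which is why the per-request cost adds up so quickly.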

Now consider that a typical website is not a single request and response, but is instead a sequence of many requests and responses. Often a response will mean more requests are required - for example, an HTML file probably references images, CSS files and JavaScript files:

Some of these files then may have further dependencies, and so on. Typically websites may be 50-100 separate requests:

Web pages nowadays often require 50-100 separate HTTP requests.

Let’s look at how that may look for our trucks…

Send a request for a web page:

We send a request to the web server for a page.

Request travels to server:

The truck (request) may take 50ms to drive to the server.

Response travels back to browser:

And then 50ms to drive back with the response (ignoring time to compile the response!).

The browser parses the HTML response and realises there are a number of other files that are needed from the server:

After parsing the HTML, the browser identifies more assets to fetch. More requests to send!

Limit of HTTP/1.1

The problem we now encounter is that there are several more files we need to fetch, but with an HTTP/1.1 connection each road can only handle a single truck at a time. Every HTTP request needs its own TCP (networking) connection, and each truck can only carry one request at a time.

Each truck (request) needs its own road (network connection).

Furthermore, building a new road (opening a new networking connection) also requires a round trip. In our world of trucks we can liken this to needing a steamroller to first lay the road and then add our road markings. This is another whole round trip, which adds more latency:

New roads (network connections) require work to open them.

This means another whole round trip to open new connections.

Typically browsers open around 6 simultaneous connections at once:

Browsers usually open 6 roads (network connections).

However, if we are looking at 50-100 files needed for a webpage we still end up in the situation where trucks (requests) have to wait their turn. This is called ‘head of line blocking’:

Often trucks (requests) have to wait for a free road (network connection).

If we look at the waterfall diagram (in this example, from this HTTP/2 site) of a simple page that has a CSS file and a lot of images, you can see this in action:

Waterfall diagrams highlight the impact of round trips and latency.

In the diagram above, the orange and purple segments can be thought of as our steamrollers, where new connections are made. You can see initially there is just one connection open (line 1), and another connection being opened. Line 2 then re-uses the first connection, and line 3 is the first request over the second connection. When those complete, lines 4 & 5 are the next two images.

At this point the browser realises it will need more connections so four more are opened and then we can see requests are going in batches of 6 at a time corresponding with the 6 roads or network connections that are open.
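A back-of-the-envelope model shows why this batching hurts. Assuming (purely for illustration) 60 assets, a 50ms one-way trip, and ignoring server time and connection setup:

```python
import math

requests = 60        # assets needed for the page
connections = 6      # simultaneous roads (typical browser limit)
round_trip_ms = 100  # 50ms out + 50ms back per truck

# HTTP/1.1: each connection carries one request at a time, so the
# requests go out in batches of 6, and each batch costs a full
# round trip while the remaining trucks wait (head of line blocking).
batches = math.ceil(requests / connections)
http1_ms = batches * round_trip_ms
print(batches, http1_ms)  # → 10 1000 (10 batches x 100ms = 1 second)
```

A full second spent purely on waiting for round trips, before a single byte of server processing or transfer time is counted.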

Latency vs Bandwidth

In the waterfall diagram above, each of these images may be small but each requires a truck to come and fetch it. This means lots of round trips, and given we can only run 6 simultaneously at a time there is a lot of time spent with requests waiting.

It is sometimes difficult to understand the difference between bandwidth and latency. Bandwidth can be thought of as the load capacity of our trucks: more bandwidth means each truck can carry more. This often doesn’t help with page load times, though, because requests and responses cannot share a truck. This is why increasing bandwidth has been shown to have a limited impact on the load time of pages, in research conducted by Mike Belshe at Google which is discussed in this article from Googler Ilya Grigorik.

The reality was clear that in order to improve the performance of the web, the issue of latency would need to be addressed. The research above was what led to Google developing the SPDY protocol which later turned into HTTP/2.
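A toy model of page load time makes that finding intuitive. This is not Belshe’s actual methodology, just an illustrative sketch with made-up numbers (a page needing 10 round trips and 1MB of data):

```python
def load_time_ms(round_trips, rtt_ms, megabytes, mbps):
    # Total time = waiting on round trips + time to transfer the bytes.
    # (megabytes * 8 * 1000) / mbps converts the payload into milliseconds
    # of transfer time at the given bandwidth.
    return round_trips * rtt_ms + (megabytes * 8 * 1000) / mbps

# Baseline: 10 round trips at 100ms RTT, 1MB page at 8Mbps.
print(load_time_ms(10, 100, 1, 8))    # → 2000.0 ms

# Keep doubling bandwidth: gains shrink towards the 1000ms latency floor.
print(load_time_ms(10, 100, 1, 16))   # → 1500.0 ms
print(load_time_ms(10, 100, 1, 64))   # → 1125.0 ms

# Halve the latency instead: a full 500ms comes straight off, and every
# further halving keeps paying off linearly.
print(load_time_ms(10, 50, 1, 8))     # → 1500.0 ms
```

Bandwidth improvements asymptote towards the latency floor; latency improvements keep paying off, which is exactly the pattern the research found.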

Improving the impact of latency

In order to improve the impact that latency has on website load times, there are various strategies that have been employed. One of these is ‘sprite maps’ which take lots of small images and jam them together into single files:

Sprite maps are a trick used to reduce the number of trucks (requests) needed.

The advantage of sprite maps is that they can all be put into one truck (request/response) as they are just a single file. Then clever use of CSS can display just the portion of the image that corresponds to the desired image. One file means only a single request and response are required to fetch them, which reduces the number of round trips required.

Another thing that helps to reduce latency is using a CDN platform, such as CloudFlare or Fastly, to host your static assets (images, CSS files etc. - things that are not dynamic and the same for every visitor) on servers all around the world. This means that the round trips for users can be along a much shorter road (network connection) because there will be a nearby server that can provide them with what they need.

CDNs have servers all around the world, can make the required roads (network connections) shorter.

CDNs also provide a variety of other benefits, but latency reduction is a headline feature.

HTTP/2 - The New World

So hopefully, you have now realised that HTTP/2 can help reduce latency and dramatically improve the performance of pages. How does it go about it?

Introducing Multiplexing - More trucks to the rescue!

With HTTP/2 we are allowed multiplexing, which essentially means we are allowed to have more than one truck on each road:

With HTTP/2 a road (network connection) can handle many trucks (requests/responses).

We can immediately see the change in behaviour on a waterfall diagram - compare this with the one above (note the change in the scale too - this is a lot faster):

We now only need one road (connection) then all our trucks (requests) can share it!

The exact speed benefits you may get depend on a lot of other factors, but by removing the problem of head of line blocking (trucks having to wait) we can immediately get a lot of benefits, for almost no cost to us.

Same old trucks

With HTTP/2 our trucks and their contents stay essentially the same as they always were; we can just imagine we have a new traffic management system.

Requests look as they did before:

The same response codes exist and mean the same things:

Because the content of the trucks doesn’t change, this is great news for implementing HTTP/2 - your web platform or CMS does not need to be changed and your developers don’t need to write any code! We’ll discuss this below.

Server Push

A much anticipated feature of HTTP/2 is ‘Server Push’, which allows a server to respond to a single request with multiple responses. Imagine a browser requests an HTML file, but the server knows that the browser will also need a specific CSS file and a specific JS file. The server can then just send those straight back, without waiting for them to be requested:

Server Push: A single truck (request) is sent...

Server Push: ... but multiple trucks (responses) are sent back.

The benefit is obvious - it removes another whole round trip for each resource that the server can ‘anticipate’ the client will need.

The downside is that at the moment this is often implemented badly, and it can mean the server sends trucks that the client doesn’t need (as it has cached the response from earlier) which means you can make things worse.

For now, unless you are very sure you know what you are doing you should avoid server push.

Implementing HTTP/2

Ok - this sounds great, right? Now you should be wondering how you can turn it on!

The most important thing is to understand that because the requests and responses are the same as they always were, you do not need to update the code on your site at all. You need to update your server to speak HTTP/2 - and then it will do the new ‘traffic management’ for you.

If that seems hard (or if you already have one) you can instead use a CDN to help you deploy HTTP/2 to your users. Something like CloudFlare or Fastly (my favourite CDN - it requires more advanced knowledge to set up but is super flexible) would sit in front of your webserver and speak HTTP/2 to your users:

A CDN can speak HTTP/2 for you whilst your server speaks HTTP/1.1.

Because the CDN will cache your static assets, like images, CSS files, Javascript files and fonts, you still get the benefits of HTTP/2 even though your server is still in a single truck world.

HTTP/2 is not another migration!

It is important to realise that to get HTTP/2 you will need to already have HTTPS, as all the major browsers will only speak HTTP/2 when using a secure connection:

HTTP/2 requires HTTPS

However, setting up HTTP/2 does not require a migration in the same way as HTTPS did. With HTTPS your URLs were changing from http://example.com to https://example.com and you required 301 redirects, and a new Google Search Console account and a week long meditation retreat to recover from the stress.

With HTTP/2 your URLs will not change, and you will not require redirects or anything like that. For browsers and devices that can speak HTTP/2 they will do that (it is actually the guy in the steamroller who communicates that part - but that is a-whole-nother story..!), and other devices will fall back to speaking HTTP/1.1 which is just fine.

We also know that Googlebot does not speak HTTP/2 and will still use HTTP/1.1:

This means that Google will notice the benefit you have provided to users with HTTP/2, and that information will make it back into Google’s evaluation of your site.

Detecting HTTP/2

If you are interested in whether a specific site is using HTTP/2 there are a few ways you can go about it.

My preferred approach is to turn on the ‘Protocol’ column in the Chrome developer tools. Open up the dev tools, go to the ‘Network’ tab and if you don’t see the column then right click to add it from the dropdown:

Alternatively, you can install this little Chrome Extension which will indicate if a site is using it (but won’t give you the breakdown for every connection you’ll get from doing the above):

Wrap Up

Hopefully, you found this useful. I’ve found the truck analogy makes something that can seem hard to understand somewhat more accessible. I haven’t covered a lot of the intricate details of HTTP/2 or some of the other functionality, but this should help you understand things a little bit better.

I have, in discussions, extended the analogy in various ways, and would love to hear if you do too! Please jump into the comments below for that, or to ask a question, or just hit me up on Twitter.

Help us find the next search industry rising star for SearchLove San Diego 2019
Will Critchlow - Tue, 04 Dec 2018 - https://www.distilled.net/resources/help-us-find-the-next-search-industry-rising-star/

After the success of running community speaker sessions at SearchLove London we are delighted to be bringing them to our American conferences, starting with SearchLove San Diego in March 2019. Our community speaker sessions are 20 minutes long, presented by relatively new speakers, who we support and coach and then give the chance to stand on stage in front of 200+ digital marketers from around the world.

If this sounds like something you would love to experience, then we are on the lookout for speakers who:

Are San Diego locals (no more than a 2-2.5 hour drive from the venue). We want to support the community where our conference runs and help our speakers raise their local profile.

Have some speaking experience, preferably a couple of speaking gigs, and are looking to break onto an even bigger stage.

What’s in it for us and our audience?

Every time we put on a SearchLove show, we (led by our head of events, Lynsey) scour the industry for the best speakers we can find. We often invite back people who blow us away and wow our audience, but we also want to find speakers no-one has seen before. Sometimes we find great speakers who have deep experience in related fields who are underexposed to the search industry, but sometimes we want to be the ones who help people break out for the first time.

In addition to the long game of building partnerships with the great speakers of tomorrow, we believe that these shorter sessions with a little less pressure could end up bringing perspectives and viewpoints we can’t get from our more experienced speakers. Speaking experience often comes with general experience, and that often accompanies moves to management or the growth of the speakers’ own companies. One of the things we also want to see is hands-on advanced and actionable advice from practitioners who are doing the work every day.

We’re also hopeful that we can access a more diverse pool of speakers with different backgrounds and experiences. There are unfortunate barriers in place to getting some of the experience that we typically require and while we put a lot of effort into broadening our intake, we hope that this initiative can play a key role in building the pipeline. (You can tell we’re serious about building a safe, inclusive and welcoming environment for our speakers and delegates from the way we bake our code of conduct into our events, and our recent progress: in an industry with too many manels, SearchLove Boston 2018 was our first conference with >50% women speakers and had women appearing as the top-rated speaker and 3 of the top 5 speakers by overall rating).

We know that we will get a bunch of overconfident white men applying (yes, I see the irony in my writing that) but if that doesn’t describe you, I’d especially encourage you to throw your hat in the ring.

What’s in it for you?

This should be the shortest path from knowing what you’re talking about to getting full speaking opportunities on the industry’s biggest stages. We’ve already seen our SearchLove London community speakers getting accepted to speak at bigger events. We have also seen first hand the speed that you can move from presenting at a meet-up to local conferences to SearchLove, MozCon or Inbound. But up until now, most of the non-Distilled speakers who made it to the SearchLove stage did so after proving themselves at another major conference.

Here’s the full package you’ll receive if you are successful (along with your 20 minutes on stage!):

Introduction call with the Distilled team

Multiple video calls to run through your presentation with the Distilled team and get feedback

Deck review and content call to bounce around your session ideas

1 to 1 ongoing support from a Distilled team member

Final in-person review with myself and the Distilled team in San Diego before the conference

VIP ticket to attend SearchLove San Diego including attending the VIP dinner with all the other speakers the night before the conference

A nice bunch of Distilled and SearchLove swag

We are extremely excited to be rolling this scheme out to our US events after the success of London (all 3 community speakers ranked in the top 8 speakers, and one broke into the top 3!). We want to give this opportunity to local folks, so we’ll only be accepting pitches from applicants living within the San Diego area (a 2-2.5 hour drive from the venue) for this particular conference.

A note on the video requirement

You will note that the application form asks for a video. We debated whether to include this as a requirement and ultimately came down on the side of including it: by far the biggest limitation with less experienced speakers’ pitches is that, without a ton of speaking experience or professional on-stage video, we have no way to judge how they’ll perform on stage. We hope that a self-recorded video is the most inclusive way of solving this - it avoids the too-high hurdle of requiring professional footage, and it’s something everyone can put together.

We are not expecting you to put together a professionally-lit and shot promo video. We want to see your enthusiasm, public speaking capability, and maybe a bit of your depth of knowledge. A selfie video shot on a mobile phone can totally do the job, but think about how you are going to stand out from the crowd and show us what we need to see. Once you have recorded the video upload it to a hosting platform such as Google Drive, Wistia, YouTube or Vimeo and share the URL in your application form.

In order to avoid asking you to do something I wouldn’t be prepared to do myself, I’ve recorded a short pitch myself. You can see that it’s shot on a phone, and didn’t use any editing:

A personal note

I’ve seen in my own career how powerful it has been to get better at public speaking and also the benefits of appearing on bigger stages. Having run a successful Community Speaker program at our London conference, I know that we can help more people on this journey.

I’ve been lucky enough to get enough of a start at our own events to bootstrap my way to bigger opportunities but I remember the 20 or so people who paid less than 20 bucks each to come to our first meet-up. We have also now built up enough of a support and coaching capability within Distilled that we have helped members of our team go from their first speaking opportunity to highly-rated SearchLove sessions in a matter of months. I want to bring those opportunities to more people. That means YOU.

I would strongly encourage you to think about the actual requirements. Don’t fall prey to imposter syndrome: are there things you are passionate about, where you have deep hands-on knowledge, and where you can teach even an experienced audience new things? If so, don’t sweat your speaking experience - let us be the judge of potential and get your application in.

How to apply

You’ll need to tell us:

Why you’d like to speak at SearchLove San Diego

Where you are based

What your speaking experience looks like so far

What topic you’d like to talk about - the more specific and actionable a topic you can describe, the better

Autumn’s roundup focuses on making the mundane memorable. Whether it’s weather-watching, logging into a website, or scrolling endlessly through a long form, we are used to norms. Login forms are dull, grey boxes, a cursor is an arrow, and weather predictions are told with satellite images – it’s what we are used to. This roundup looks at how the ordinary has been reimagined with little touches that make you look twice.

UX reimagined

The cursor reimagined. This scrolling piece uses some very fitting visual imagery. Discussing the carbon footprint of a cocaine habit, a white line links one piece of content to the next. As you scroll you find out the carbon equivalent of various amounts of cocaine, which are astounding! 50g is the same as buying 56 new iPhones and throwing them in a river. My favourite touch here though is the cursor, which is a nose!

Login fields reimagined. Darin Senneff has created an animated SVG login avatar. Login forms are so dull - the bane of my life, in fact! Anything that makes them less laborious should be considered, provided it doesn’t throw up UX issues: fewer fields, conversational intro text, colour, password managers, and in this case a little guy who reacts as your details are entered - smiling when you type your email address, and comically covering his eyes so as not to peek while you enter your password.

Logo reimagined. We have all seen logos animate, or become smaller and perhaps more simplified as you scroll down a page, but Creative Boom has been more inventive. The ‘OO’ in the word ‘boom’ animates, turning into a pair of eyes which move to the centre of the page and follow the cursor position as you scroll.

Offline mode reimagined. When Chrome fails to connect, it launches a little one-button game (of which we are big fans at Distilled: see the Brexit bus game and the Emergency stop game). You use the spacebar to jump, and the dinosaur dies when it hits a cactus. The way the dinosaur’s eyes bulge out here is my favourite part. I got to 560 points, and no, I haven’t spent the last half hour playing it…

Onsite content reimagined

Identity reimagined. About pages can be hard; there are certainly some questionable ones out there. Not everyone likes having their photo taken, and unless you have a studio, it can be hard to maintain consistent backgrounds and lighting. But what is an about us page trying to show? That the people behind the company are human, smart, approachable, and someone you would want to collaborate with. We build relationships based on connections. Cornett’s about page is a square product montage of the things that matter in each person’s life - showing their passions, fashions, and identities, not through a photo of their face but with objects that mean something to them.

Weather predictions reimagined. It can be hard to imagine what changes in the weather will feel like, especially dramatic ones. We are used to seeing aerial satellite shots of swirling storms approaching - showing the direction, speed and perhaps to some extent the ferocity of a storm - but they don’t explain what it will feel like on the ground. The Weather Channel used a virtual reality studio to do just that: gigantic walls of water circling the presenters, showing just how high the water could rise. This realism, I would hope, persuaded people in at-risk areas to evacuate without hesitation.

Media reimagined

Podcasts reimagined. This piece could easily have worked as a podcast; it is after all about sound. But there is something about sound paired with amazing photography that shapes a different experience that sound alone, or video (sound with moving image) could not create. The scrolling mechanism also allows time for pause and reflection, and like reading a book over watching a film, the lack of movement leaves some of the story to the imagination.

My favourite snippet from this piece is the noise that city rats make - they’ve even captured rat laughter! To do this, the recordings had to be pitched down into the range of human hearing.

Money Reimagined

Bank notes reimagined. Distilled created a piece on this theme, called Dead Men on Dollar Bills, which highlighted that 85% of the world’s banknotes feature men. We imagined how notes would look with inspirational female figures on them, including Emmeline Pankhurst and J.K. Rowling. Google has taken this a step further, creating 100 notes featuring women, along with the stories of why they deserve to be commended.

Products reimagined

Bath bombs reimagined. The holidays are the perfect time to launch a product range with a twist. Lush cosmetics turned their bath bombs into googly eyes! There is undoubtedly something chilling about these unblinking balls all gazing mindlessly.

Data reimagined

Graphs reimagined. We often have internal battles here at Distilled about the clearest and most creative way of showing data. Clear and creative can feel like opposite ends of a spectrum, with the humble bar graph usually the clearest - and arguably not the most creative. This piece, however, succeeds in telling a simple story in an interesting way. Animating small multiples to show percentages communicates the diversity of candidates in the recent midterm elections. The cut-out heads let us see the 26 lesbian, gay, bisexual or transgender candidates, and as the piece zooms out, it shows how the share of candidates who are white men, at 58%, is the lowest in the past four elections!

What content have you enjoyed lately? Let us know in the comments.

How to rank for head terms
Tom Capper - Tue, 20 Nov 2018 - https://www.distilled.net/resources/how-to-rank-for-head-terms/

Over the last few years, my mental model for what does and doesn’t rank has changed significantly, and this is especially true for head terms - competitive, high-volume, “big money” keywords like “car insurance”, “laptops”, “flights”, and so on. This post is based on a bunch of real-world experience that confounded my old mental model, as well as some statistical research I did for my presentation at SearchLove London (create a free Distilled account to access the video) in early October. I’ll explain my hypothesis in this post, but also how I think you should react to it as SEOs - in other words, how to rank for head terms.

My hypothesis in both cases is that head terms are no longer about ranking factors - by which I mean static metrics you can source by crawling the web and weight to decide who ranks. Many before me have claimed that user signals are increasingly influential for competitive keywords, but that is still an extension of the ranking factors model, whereby data goes in and rankings come out. My research and experience are leading me towards a more dynamic and responsive model, in which Google systematically tests, reshuffles and refines rankings over short periods, even when the sites themselves do not change.

Before we go any further, this isn’t an “SEO is dead”, “links are dead”, or “ranking factors are dead” post - rather, I think those “traditional” measures are the table stakes that qualify you for a different game.

Evidence 1: Links are less relevant in the top 5 positions

Back in early 2017, I was looking into the relationship between links and rankings, and I ran a mini ranking factor study which I published over on Moz. It wasn’t the question I was asking at the time, but one of the interesting outcomes of that study was that I found a far weaker correlation between DA and rankings than Moz had done in mid-2015.

The main difference between our studies, besides the time that had elapsed, was that Moz used the top 50 ranking positions to establish correlations, whereas I used the top 10, figuring that I wasn’t too interested in any magical ways of getting a site to jump from position 45 to position 40 - the click-through rate drop-off is quite steep enough just on the first page.

Statistically speaking, I’d maybe expect a weaker correlation when using fewer positions, but I wondered if perhaps there was more to it than that - maybe Moz had found a stronger relationship because ranking factors in general mattered more for lower rankings, where Google has less user data. Obviously, this wasn’t a fair comparison, though, so I decided to re-run my own study and compare correlations in positions 1-5 with correlations in positions 6-10. (You can read more about my methodology in the aforementioned post documenting that previous study.) I found even stronger versions of my results from 2 years ago, but this time I was looking for something else:

The first thing to note here is that these are some extremely low correlation numbers - that’s to be expected when we’re dealing with only 5 points of data per keyword, and a system with so many other variables. In a regression analysis, the relationship between DA and rankings in positions 6-10 is still 98.5% statistically significant. However, for positions 1-5, it’s only around 41% statistically significant. In other words, links are fairly irrelevant for positions 1-5 in my data.
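For the curious, the shape of that analysis can be sketched in a few lines. This is illustrative only - the SERP rows are invented, the tie handling is naive, and it is not my actual study code:

```python
# Naive Spearman correlation (ties get arbitrary distinct ranks),
# applied separately to positions 1-5 and 6-10 of a hypothetical SERP.
from statistics import mean

def spearman(xs, ys):
    def ranks(vals):
        order = sorted(range(len(vals)), key=lambda i: vals[i])
        r = [0.0] * len(vals)
        for rank, i in enumerate(order):
            r[i] = float(rank)
        return r
    rx, ry = ranks(xs), ranks(ys)
    mx, my = mean(rx), mean(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# Invented rows: (keyword, position, domain_authority)
serp_rows = [
    ("flowers", 1, 60), ("flowers", 2, 72), ("flowers", 3, 55),
    ("flowers", 4, 48), ("flowers", 5, 66), ("flowers", 6, 58),
    ("flowers", 7, 51), ("flowers", 8, 44), ("flowers", 9, 40),
    ("flowers", 10, 33),
]

def correlation_for(rows, lo, hi):
    subset = [(pos, da) for _, pos, da in rows if lo <= pos <= hi]
    # Position 1 is best, so a negative value means higher DA
    # goes with better rankings.
    return spearman([p for p, _ in subset], [d for _, d in subset])

print("positions 1-5: ", correlation_for(serp_rows, 1, 5))
print("positions 6-10:", correlation_for(serp_rows, 6, 10))
```

In a real study you would aggregate this per-keyword correlation across thousands of keywords before testing significance.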

Now, this is only one ranking factor, and ~5,000 keywords, and ranking factor studies have their limitations.

However, it’s still a compelling bit of evidence for my hypothesis. Links are the archetypal ranking factor, and Moz’s Domain Authority* is explicitly designed and optimised to use link-based data to predict rankings. This drop off in the top 5 fits with a mental model of Google continuously iterating and shuffling these results based on implied user feedback.

*I could have used Page Authority for this study, but didn’t, partly because I was concerned about URLs that Moz might not have discovered, and partly because I originally needed something that was a fair comparison with branded search volume, which is a site-level metric.

Evidence 2: SERPs change when they become high volume

This is actually the example that first got me thinking about this issue - seasonal keywords. Seasonal keywords provide, in some ways, the control that we lack in typical ranking factor studies, because they’re keywords that become head terms for certain times of the year, while little else changes. Take this example:

This keyword gets the overwhelming majority of its volume in a single week every year. It goes from being a backwater search term where Google has little to go on besides “ranking factors” to a hotly contested and highly trafficked head term. So it’d be pretty interesting if the rankings changed in the same period, right? Here’s the picture 2 weeks before Mother’s Day this year:

I’ve included a bunch of factors we might consider when assessing these rankings. I’ve chosen Domain Authority as it’s the site-level link-based metric that best correlates with rankings, and branded search volume (“BSV”) as it’s a metric I’ve found to be a strong predictor of SEO “ranking power”, both in the study I mentioned previously and in my experience working with client sites. The “specialist” column is particularly interesting, as the specialised sites are obviously more focused, but typically also better optimised - M&S (marksandspencer.com, a big high-street department store in the UK) was very late to the HTTPS bandwagon, for example. However, it’s not my aim here to persuade you that these are good or correct rankings. For what it’s worth, the landing pages are fairly similar (with some exceptions I’ll get to), and these are the kinds of questions I’d be asking, as a search engine, if I lacked any user-signal data.

Here’s the picture that then unfolds:

Notice how everything goes to shit about seven days out? I don’t think it is at all a coincidence that that’s when the volume arrives. There are some pretty interesting stories if we dig into this, though. Check out the high-street brands:

Not bad eh? M&S, in particular, manages to get in above those two specialists that were jostling for 1st and 2nd previously.

These two specialist sites have a similarly interesting story:

These are probably two of the most “SEO’d” sites in this space. They might well have won a “ranking factors” competition: they have the targeting sorted, decent technical health and site speed, structured data for rich snippets, and so on. But you’ve never heard of them, right?

But there are also two sites you’ve probably never heard of that did quite well:

Obviously, this is a complex picture, but I think it’s interesting that (at the time) the latter two sites had a far cleaner design than the former two. Check out Appleyard vs Serenata:

Just look at everything pulling your attention on Serenata, on the right.

Flying Flowers had another string to their bow, too - along with M&S, they were one of only two sites mentioning free delivery in their title.

But again, I’m not trying to convince you that the right websites won, or work out what Google is looking for here. The point is more simple than that: Evidently, when this keyword became high volume and big money, the game changed completely. Again, this fits nicely with my hypothesis of Google using user signals to continuously shuffle its own results.
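One way to flag keywords that behave like this is to measure how concentrated their annual search volume is. A tiny sketch, with invented numbers standing in for a real volume export:

```python
# Hypothetical weekly search volumes for a Mother's Day keyword:
# near-zero all year, with one enormous spike.
weekly_volume = [200] * 52
weekly_volume[18] = 40000  # invented spike week

peak_share = max(weekly_volume) / sum(weekly_volume)
print(f"Peak week's share of annual volume: {peak_share:.0%}")
```

A keyword whose peak week carries most of its annual volume is exactly the kind of part-time head term where rankings may be reshuffled as the volume arrives.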

Evidence 3: Commercial terms going informational

My last piece of evidence is very recent - it relates to the so-called “Medic” update of August 1st. Distilled works with a site that was heavily affected by this update - they sell cosmetic treatments and products in the UK. That makes them a highly commercial site, and yet here’s who won for their core keywords when Medic hit:

Site           Visibility   Type
WebMD          +6.5%        Medical encyclopedia
Bupa           +4.9%        Healthcare
NHS            +4.6%        Healthcare / Medical encyclopedia
Cosmopolitan   +4.6%        Magazine
Elle           +3.6%        Magazine
Healthline     +3.5%        Medical encyclopedia

Data courtesy of SEOmonitor.

So that’s two magazines, two medical encyclopedia-style sites, and two household name general medical info/treatment sites (as opposed to cosmetics). Zero direct competitors - and it’s not like there’s a lack of direct competitors, for what it’s worth.

And this isn’t an isolated trend - it wasn’t for this site, and it’s not for many others I’ve worked with in recent years. Transactional terms are, in large numbers, going informational.

The interesting thing about this update for this client is that although they’ve now regained their rankings, even at its worst it never really hit their revenue figures. It’s almost like Google knew exactly what it was doing, and was testing whether people would prefer an informational result.

And again, this reinforces the picture I’ve been building over the last couple of years - this change is nothing to do with “ranking factors”. Ranking factors being re-weighted, which is what we normally think of with algorithm updates, would have only reshuffled the competitors, not boosted a load of sites with a completely different intent. Sure enough, most of the advice I see around Medic involves making your pages resemble informational pages.

Explanation: Why is this happening?

This CNBC interview with Google’s search team is great in many ways - its intentions have nothing to do with SEO; it was politically motivated, published after Trump called Google biased in September of this year. Nonetheless, it affords us a level of insight from the proverbial horse’s mouth that we’d never normally receive. My main takeaways are these:

When the interview was conducted, the team that CNBC talked to was working on an experiment involving increased use of images in search results. The metrics they were optimising for were:

The speed with which users interacted with the SERP

The rate at which they quickly bounced back to the search results (note: if you think about it, this is neither equivalent to, nor probably even correlated with, bounce rate in Google Analytics).

It’s important to remember that Google search engineers are people doing jobs with targets and KPIs just like the rest of us. And their KPI is not to get the sites with the best-ranking factors to the top - ranking factors, whether they be links, page speed, title tags or whatever else are just a means to an end.

Under this model, with those explicit KPIs, as an SEO we equally ought to be thinking about “ranking factors” like price, aesthetics, and the presence or lack of pop-ups, banners, and interstitials.

Now, admittedly, this article does not explicitly confirm or even mention a dynamic model like the one I’ve discussed earlier in this article. But it does discuss a mindset at Google that very much leads in that direction - if Google knows it’s optimising for certain user signals, and it can also collect those signals in real-time, why not be responsive?

Implications: How to rank for head terms

As I said at the start of this article, I am not suggesting for a moment that the fundamentals of SEO we’ve been practising for the last however many years are suddenly obsolete. At Distilled, we’re still seeing clients earn results and growth from cleaning up their technical SEO, improving their information architecture, or link-focused creative campaigns - all of which are reliant on an “old school” understanding of how Google works. Frankly, the continued existence of SEO as an industry is in itself reasonable proof that these methods, on average, pay for themselves.

But the picture is certainly more nuanced at the top, and I think those Google KPIs are an invaluable sneak peek into what that picture might look like. As a reminder, I’m talking about:

The speed with which users interact with a SERP (quicker is better)

The rate at which they quickly bounce back to results (lower is better)

There are some obvious ways we can optimise for these as SEOs, some of which are well within our wheelhouse, and some of which we might typically ignore. For example:

Metadata - we’ve been using this to stand out in search results for years

E.g. “free delivery” in title

E.g. professionally written meta description copy

Brand awareness/perception - think about whether you’d be likely to click on the Guardian or Forbes with similar articles for the same query

Optimising for rate of return to SERPs:

Sitespeed - have you ever bailed on a slow site, especially on mobile?

First impression - the “this isn’t what I expected” or “I can’t be bothered” factor

Price

Pop-ups etc.

Aesthetics(!)
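Some of these signals can be audited programmatically. As a small illustration of the metadata point, here is a sketch that checks a set of SERP titles for standout features such as “free delivery” - the titles and the length threshold are invented examples, not real rankings:

```python
# Hypothetical sketch: audit how snippets might stand out in a SERP.
import re

serp_titles = [
    "Flower Delivery | Free Next-Day Delivery | ExampleFlorist",
    "Send Flowers Online - Bouquets from £20",
    "Flowers Delivered - Order by 9pm",
]

def snippet_signals(title: str) -> dict:
    lower = title.lower()
    return {
        "mentions_free_delivery": "free" in lower and "delivery" in lower,
        "mentions_price": bool(re.search(r"[£$€]\d+", title)),
        "length_ok": len(title) <= 60,  # rough truncation proxy, not an official limit
    }

for t in serp_titles:
    print(t, snippet_signals(t))
```

Running this across your competitors’ titles is a quick way to see which click-worthy claims are already taken and which are still free.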

As I said, some of these can be daunting to approach as digital marketers, because they’re a little outside of our usual playbook. But actually, lots of stuff we do for other reasons ends up being very efficient for these metrics - for example, if you want to improve your site’s brand awareness, how about top of funnel SEO content, top of funnel social content, native advertising, display, or carefully tailored post-conversion email marketing? If you want to improve first impressions, how about starting with a Panda survey of you and your competitors?

Similarly, these KPIs can seem harder to measure than our traditional metrics, but this is another area where we’re better equipped than we sometimes think. We can track click-through rates in Google Search Console (although you’ll need to control for rankings & keyword make-up), we can track something resembling intent satisfaction via scroll tracking, and I’ve talked before about how to get started measuring brand awareness.
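For example, one simple way to control click-through rate for position is to compare each query’s observed CTR against an expected CTR for its average position. In this sketch both the Search Console rows and the expected-CTR curve are invented placeholders, not published benchmarks:

```python
# Position-adjusted CTR check on a hypothetical Search Console export.
expected_ctr = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}  # illustrative curve

gsc_rows = [
    # (query, avg_position, impressions, clicks) - invented numbers
    ("mothers day flowers", 2, 12000, 2100),
    ("flower delivery", 4, 8000, 480),
    ("cheap flowers", 1, 5000, 1200),
]

for query, pos, impressions, clicks in gsc_rows:
    observed = clicks / impressions
    baseline = expected_ctr.get(round(pos), 0.02)
    # Ratio > 1: the snippet out-performs its position;
    # ratio < 1: the title/description may be underselling the page.
    print(f"{query}: observed {observed:.1%}, expected {baseline:.1%}, "
          f"ratio {observed / baseline:.2f}")
```

Queries with a ratio well below 1 are candidates for title and meta description work; a ratio above 1 suggests the snippet is already pulling more than its position’s weight.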

Some of this (perhaps frustratingly!) comes down to being “ready” to rank - if your product and customer experience is not up to scratch, no amount of SEO can save you from that in this new world, because Google is explicitly trying to give customers results that win on product and customer experience, not on SEO.

There’s also the intent piece - I think a lot of brands need to be readier than they are for some of their biggest head terms “going informational on them”. This means having great informational content in place and ready to go - and by that, I do not mean a quick blog post or a thinly veiled product page. Relatedly, I’d recommend this in-depth article about predicting and building for “latent intents” as a starting point.

Summary

I’ve tried in this article to summarise how I see the SEO game changing, and how I think we need to adapt. If you take away two things, I’d like them to be those two KPIs - the speed with which users interact with a SERP (quicker is better), and the rate at which they quickly bounce back to the results (lower is better) - and what they really mean for your marketing strategy.

What I don’t want you to take away is that I’m in any way undermining SEO fundamentals - links, on-page, or whatever else. That’s still how you qualify, how you get to a position where Google has any user signals from your site to start with. All that said, I know this is a controversial topic, and this post is heavily driven by my own experience, so I’d love to hear your thoughts below!

Win before you begin - creative checklist for success
James Roach - Wed, 14 Nov 2018 - https://www.distilled.net/resources/creative-checklist-for-success/

Part of my role as Manager of the Creative team at Distilled is to ensure that both we, and the clients we work with, are set up for mutual success. That’s why I created the ‘Creative one pager’: a document I send to all new clients to set out what to expect from Distilled, and what they should be doing internally to ensure a smooth project.

I started using this after seeing a project get sidelined at the end of production due to internal differences on the client side. I didn’t want that to happen again, and a more structured way of highlighting the potential pitfalls seemed the obvious answer.

I’ve shared the contents of the document below in the hope you may find it useful too. After all, life’s so much easier when we’re on the same page.

Creative one pager

What next?

Creative productions can be as simple or complex as we make them, but nobody likes missing deadlines.

To ensure a smooth production ride, we’ve highlighted a few things which, if adhered to, should make the process run like clockwork.

What to expect from Distilled

Scheduling - We’ll send you a schedule so you can see key milestones and feedback dates to make sure we can deliver the final piece on time.

Regular project updates - Your designated project lead will be in contact to either set up a weekly call to discuss progress, or set a time for a weekly email update that suits you.

A typical weekly agenda looks like:

Project status - Where are we with the project, are we on schedule?

Next steps - What do we need to do next in order to get closer to completion?

Milestones - What you can expect in the week ahead (including things such as feedback you may be required to give on design, etc.)

General discussion

An open door policy - Although we’ll be in constant contact, feel free to reach out at any time with questions or concerns; we’re always happy to help.

What we expect from you

Rally the troops - The impact of this cannot be overstated. Make sure to include all the decision makers and affected departments. Those who have an interest in the project should be involved throughout the entire process. That includes:

SEO - This is more than likely the team you’re in right now, so hopefully we have that covered.

Design - Your in-house design team may have certain design guidelines we need to adhere to; let’s find out what those are sooner rather than later.

PR - The PR we do can be different to that of in-house PR teams; let's communicate from the get-go to make sure we’re all on the same page.

Tech - We’ll need to speak to your tech team to know what technical limitations there are with creating something to sit on your site.

Your Boss! - Or whoever may have the final sign off if that is not you.

Swift feedback - Another common blocker to a project sticking to its intended schedule is the time taken to provide feedback at key milestones. When we send a design or other assets for your feedback, we’d like it back within 24 hours to ensure the schedule can be kept to. If that’s not going to be possible, it’s crucial you let us know from the outset so we can schedule in more time.

If you’ve ever pitched anything to a journalist, you’re probably familiar with the sinking feeling that comes with a failed or unanswered email pitch. In my time working in PR, I’ve written countless emails to journalists. From the good (instant, enthusiastic responses), the bad (no replies after hammering away at my keyboard for hours on end), to the downright ugly (one response from a particularly disgruntled journalist simply read “Why would you do this?”), I’ve seen it all.

While this journey hasn’t always been plain sailing, I’ve learnt a lot about what works — and even more about what doesn’t. Here’s what this process has taught me.

Before you pitch to anyone, make sure you…

Do your research

Not knowing a journalist's beat is tantamount to accidentally calling someone you’ve just met the wrong name. Journalists get bombarded with hundreds of emails every single day, so the least you can do is make sure you’re pitching a story that’s relevant to them. Do your due diligence to avoid starting off on the wrong foot. There are a handful of key questions you should always investigate first: Does this journalist still write for the same publication? Are they still covering this topic? Has their job title changed, and if so, will your content still be relevant to them? And do they tend to cover similar content, or was an article they wrote about something similar a one-off?

it's been seven months since I moved out of the fashion & beauty dept, and yet a solid half of the PR pitches I get in my inbox are F&B-related. c'mon y'all.

Don’t pitch self-serving stories

No matter how well disguised you think your motives might be, any journalist worth their salt will see straight through your spiel, almost guaranteeing your message will be deleted before they’ve had a chance to consider it.

Similarly…

Don’t pitch a story that’s already been written

I know how tempting this can be. If they’ve already covered that, they must be interested in this, right? Not necessarily. If you go to a journalist with a story that’s too similar to something they’ve already covered, you’ll only end up looking like you haven’t done your research. That said, some topics do run on and on — but revisiting more niche stories can be overkill, so use your common sense to work out which category yours falls into.

When you publish a story on a specific topic, and then are flooded with PR pitches about that exact topic... I guess I get it, on the off chance of doing a follow-up or as an FYI. But for the most part, these pitches are irrelevant b/c I'm not going to do the same piece twice.

When writing your pitch, you should always remember…

Introductions matter

You need to make your subject line snappy so that dreaded ‘delete’ button doesn’t get hit before the email has even been opened. This doesn’t mean you should go to town with puns and wordplay — you need to summarise your story as concisely as possible. When introducing yourself, avoid spending too long explaining the intricacies of what you do. Journalists are only interested in the story, so get to it as quickly as possible.

Keep everything succinct

Journalists are some of the busiest people you’ll meet and therefore won’t want to sacrifice any more of their time than they have to. Make their lives easier by cutting the waffle. Get straight to the point, try to avoid dressing things up in cliches or complicated terminology, and do your best to keep your email just a few sentences long.

Here's a tip for PR folks: If I have to read your pitch email three times to find out what your client does, I'm unlikely to write about them. @muckrack#cringeworthyprpitches

Make it personal

You want people to remember you, but there’s a fine line between being fun and being annoying. Overly quirky ice breakers often have the opposite effect, so think twice before wasting your time penning a seemingly hilarious email like this only for it to be misunderstood. Instead, personalise your messages by explaining how your story relates to the recipient’s specific interests or the topics they regularly write about.

PR pitch was already on shaky ground, what with the egregious use of the emoji and all, but come on already with this shit. pic.twitter.com/OYcLaoS4d0

Proofread and spell check everything

When you’ve finished writing, go through your email with a fine-toothed comb, looking for any mistakes. Then reread it. And then a third time. Keep going until what you’ve written is a flawless, perfectly flowing pitch that is easy to digest and skim read. Typos are rarely forgivable, especially when speaking to someone who writes for a living, so avoid falling at the final hurdle by spending time looking for errors before you hit ‘send’. If spelling and grammar aren’t your forte, find someone who is more comfortable with words to proofread what you’ve written. At the very least, you can rely on an online tool such as Grammarly or Hemingway to improve your emails — and your chances of getting a reply from an interested journalist.

You ever get a PR pitch so badly written you consider responding with edits out of pure pity for the sender?

After you’ve sent your pitch, you should…

Find reasons to follow up

Don’t lay all your cards on the table in your first email. Perhaps you have a photograph of that new piece of tech your company is working on that you mentioned in your first email, or a video that adds a bit of colour to the story. Send across any additional material that’s relevant to your pitch, as this gives you a good excuse to get in touch again without sending a pestering email simply asking whether they are interested in your story.

I usually don't complain about PR pitches. I know that people have a job to do. But this email, just received, is just.... pic.twitter.com/OFkumL5rna

Answer any questions in a timely fashion

The longer you take to answer a journalist’s questions, the less likely the opportunity becomes, so don’t let this slide to the bottom of your to-do list. If it’s something that you can’t answer immediately, let them know you’re working on it and that you’ll be back in touch ASAP.

Try not to give up hope

PR is an art, not a science, so try not to beat yourself up if you don’t nail it first time round. It might not be the right fit for that particular journalist, but that doesn’t mean it won’t be a better fit for someone else. Rewrite your pitch, try tackling the topic from a slightly different angle, or go back to the drawing board entirely, armed with your learnings.

This post was written by Arpun Bhuhi and Tammy Yu. The two spent their summers interning in the London and Seattle office, respectively. Arpun and Tammy have chosen to continue their digital marketing journey as analysts at Distilled.

To aspiring digital marketers and SEOs, the Distilled digital marketing internship is a wonderful opportunity to learn, grow, and start your career. As interns, we had the chance to work for a company with a great work culture and learn from some of the best professionals in the industry! Aside from the greatest experience ever, here are our top 11 perks (we couldn't agree on just 10!) of being a Distilled intern:

11. Throwback to the Nintendo 64

Each of the offices includes a Nintendo 64. After work, you can often catch Distillers racing each other for first place on Mario Kart.

10. Casual attire

At Distilled, we pride ourselves on working smarter, not harder. That applies to office attire. Come in wearing what you feel most comfortable in and do great work!

9. Explore a new city

Distilled have smashed it when it comes to the location of the offices. You will find yourself in the middle of London, New York or Seattle, and you will never be short of things to do. The summer internship is three months long, which is the perfect amount of time to experience a new city and see whether it’s somewhere you want to pursue your career. Wherever you are placed, remember to enjoy the experience - work hard, play hard.

8. Summer hours

During the summer, Distilled adopts “summer hours”. Essentially you work 9am - 6pm Monday through Thursday, and Friday becomes a joyous, happy half day! You will be in either London, New York or Seattle, so think about how much you could fit into the free afternoon. Everyone loves a sunny Friday afternoon to themselves!

7. Party time!

Each office hosts its own summer party as well as bi-monthly parties. It might be an all-day event filled with breakfast, snacks, boating and/or an evening happy hour. The best thing is getting to know your colleagues outside of a work environment - and it’s paid for by Distilled!

6. Attend conferences and collect all the SWAG

With Distilled you will have the opportunity to attend digital marketing conferences, whether that is SearchLove, MozCon for the Seattle interns, or BrightonSEO for the London interns (it does depend on when the dates fall). If you are lucky, this is a great time to widen your knowledge base, network with industry experts, understand Distilled’s role within the industry, and most importantly, collect the free swag.

5. Flexible hours

Need to come in late or leave early? Need to catch an early commuter bus that will force you to be in the office early? That’s totally fine! As long as you attend all your meetings, get your work done, and get your weekly hours in, work hours are pretty flexible.

4. Friday Snacks 'n' Drinks

Friday snacks 'n' drinks happen at 5 pm on the dot, every Friday without fail (who would have guessed?). All drinks are work-sponsored, including a range of alcoholic and non-alcoholic beverages - fun for all! This is not only great for the obvious reason of beer on a Friday, but it is also a perfect time to get to know your team, learn about the office and play some card games!

3. Build your network

Networking is often regarded as one of the most important career-advancing strategies. With Distillers across the globe, you are sure to make connections that will last beyond the duration of your internship. One of them might be your future mentor, your next “in” to a new adventure, or the person who gives you the stellar reference that makes your application stronger.

2. Nobody puts Baby in the corner

No one is ever physically placed in a corner, fear not! As an intern you are free to explore all areas of Distilled; you can delve into the deep sea of SEO, explore your creative side and help with Distilled’s own marketing.

1. Knowledge, knowledge, knowledge

Distilled loves knowledge, and each and every Distiller is keen to learn more! When you first join, it seems that everyone around you has an unlimited capacity for knowledge. You will hear words like “hreflang” and “canonical” flying around, and then panic-Google them. Everyone at Distilled is always happy to help, and happy to spend time explaining everything from the broadest concept to the tiniest detail. Distilled also have their own online university platform called DistilledU, so you will have the learning resources at your fingertips. You will learn so much in just your first week.

This year, Hootsuite announced that 3.196 billion people are now active social media users. That is 42% of all the people on earth. In the UK, that percentage climbs to 66% and it’s 71% in the US. Even with recent data protection scandals, platforms like Facebook, Twitter, Instagram, LinkedIn, Wechat, and Pinterest are a huge part of daily life.

This kind of impressive cut-through makes it more likely that we can use social media to find our audience, but that doesn’t mean that everyone on a platform is desperate to hear from us. In reality, when we use social media as businesses we’re competing for what might be a very small, very niche, but very valuable cross-section of a network. This means that whenever we do social media marketing we need a strategy, and to have a successful social media marketing strategy it’s vital to know how we compare to our competitors: what we’re doing well and what threats we should be worrying about.

Without effective social media competitor analysis we’re working in the dark. Unfortunately, a lot of the time when we compare social media communities we keep coming back to the same metrics which aren’t always as informative as we might like. Fear not! Here’s a guide to find the social media stats which really tell us which competitors to watch out for and why.

What are we trying to achieve with social media?

One of the biggest problems with creating a social media strategy is that the subjectivity of social can make it incredibly hard to get solid, reliable performance data that tells us what to do next. If we want actionable information about how we compare to competitors, it’s important to start with why we’re on the platforms in the first place (we’ll use these agreed facts in later sections). If we agree that:

The value of a social media competitor analysis is to help us perform better on social

The value of social is to help us achieve the business objectives that we set out in the first place.

Then we can agree that the numbers we look at in a social media competitor analysis must be defined by what we actually need the networks to achieve (even if it takes a while for engagement to become page views).

With that in mind, here are the most common aims I think we try to achieve through social media, ordered roughly from high commitment on the part of our audience, to low. When we are comparing social networks we need to make sure we have an idea of how the numbers we look at can contribute to at least one of the items below (and how efficiently).

Sales (this can include donations or affiliate marketing as well as traditional sales)

Support (event attendance etc.; paid event attendance is included under sales)

Site visits (essentially ad sales; visits to websites that don't run on ads can be considered a step towards a sale)

Impressions/staying front of mind (this is also a prerequisite for each of the above).

Why we should stop talking about raw follower counts

We often hear social media accounts evaluated and compared based on raw follower counts. If we agree that we should look at numbers defined by our key goals, here are some reasons why I don’t think we should talk about follower counts as much as we do.

“Followers” is a static number trying to represent a dynamic situation

When we compare social communities we don’t care how effective they were in 2012. The only reason we care about how effective they were over the last six months is that it’s a better predictor of how much of the available audience attention, and how many conversions, they'll take up over the next six months. What’s more, as social networks grow and implement or update sharing algorithms, the goalposts keep moving, so what happened a few years ago becomes even less relevant to the present.

Unfortunately, raw follower count includes none of that context, it’s just a pile of people who have expressed an interest at some point. Trying to judge how successful a community will be based on follower count is like trying to guess the weather at the top of a large hill based solely on its height - if it gets really big you can probably guess it’ll be colder or windier, but you’re having to ignore a whole bunch of far more relevant factors.

Follower buying can also really throw off these numbers. If you want to check competitors for follower buying you may be able to find some signs by checking for sudden, unusual changes in follower numbers (see “What we should look at instead”), or by exporting all their followers with a service like Export Tweet and checking for a large number of accounts with short lifespans, low follower numbers or matching follower numbers.
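As a sketch of that second check: assuming you have exported followers into rows holding a creation date and a follower count (the column names here are hypothetical, not the actual headers of any export tool), a quick filter for very new, near-followerless accounts might look like this:

```python
from datetime import datetime, timedelta

def flag_suspicious(rows, now, max_age_days=90, max_followers=10):
    """Flag follower accounts that look bot-like: recently created
    and with almost no followers of their own."""
    cutoff = now - timedelta(days=max_age_days)
    flagged = []
    for row in rows:
        created = datetime.strptime(row["created"], "%Y-%m-%d")
        if created > cutoff and int(row["followers"]) <= max_followers:
            flagged.append(row["handle"])
    return flagged

# Hypothetical rows shaped like a follower export:
rows = [
    {"handle": "@long_time_fan", "created": "2014-03-02", "followers": "250"},
    {"handle": "@acct48151623", "created": "2018-10-01", "followers": "2"},
]
print(flag_suspicious(rows, now=datetime(2018, 11, 14)))  # ['@acct48151623']
```

If a large proportion of an account’s followers trip a filter like this (or share identical follower counts), that is a reasonable hint of follower buying, though no single signal is conclusive on its own.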

An “impression” is required for every other social goal

I’m going to move on to what other numbers we should look at in the next section, but we have to agree that in order for anyone to do anything you want with your content, they have to have come into contact with it in some way.

Because of the nature of social networks we can also agree the number of impressions is unlikely to exactly match the follower number, even in a perfect system - some people who aren’t following will see your content, some people who are following won’t. So we’ve started to decouple “follows” from “impressions” - the most basic unit of social media interaction.

Next we can agree - if an account stops producing effective content, or stops producing content altogether, follower count will make no difference. A page that posts nothing will not have people viewing its nonexistent posts. So follower count isn't sufficient for impressions and impressions are necessary for any other kind of success.

Depending on the kind of social network, the way in which content spreads through it will change, which means follower count can be less decisive than other systems in different ways. We’ll look at each format below in isolation; where a network relies on more than one means (for instance hashtags and shares) the effect is compounded rather than cancelled out.

Discovery driven by hashtags

Ignoring other amplification mechanisms (which we’ll discuss below), follower count can be much less relevant than the ability to cut through hashtags. The end result of either a large, active following or content effectively cutting through a hashtag (or both) will show up in the engagement metrics on the content itself. We have those numbers, so why rely on follows?

Discovery driven by shares and interaction

The combined followings or networks of everyone who follows you (even at relatively small numbers) can easily outweigh your audience or the audience of your competitors. Engagement or shares (whatever mechanism the platform uses to spread data via users) becomes a better predictor of how far content will reach, and we have those numbers, so why rely on follows?

If you’re interested in analysing your followers or competitor followers to find out how many followers those followers have and compare those numbers, services like Export Tweet will let you export a CSV of all the followers of an account, complete with their account creation date and follower number. Also, if you have to look into raw follower numbers this can be a way of checking for fake followers.

Discovery guided by algorithms

In this case, content won’t be shown to the entire following; the platform will start by showing it to a small subsection to gather data about how successful the post is. A successful post is likely to be seen by most of the following, and probably by users that don’t follow that account too; a less successful post will not be shown to much more than the testing group. The key feedback the platforms use to gauge post success is engagement and, as we’ve said, we have those numbers, so why rely on follows?

This particular scenario is interesting because having a very large audience of mostly disengaged followers can actually harm reach - when the platform tests your content with your audience, it's less likely to be seen by the engaged subset, so early post success metrics are likely to fare worse and the content will look less worthy of being shared more widely by the platform. This can mean that tactics like buying followers, or running short-term competitions just to boost follower count without a strategy for continually engaging those followers, can backfire.

I’m not saying follower count has no impact at all

A large number of follows does give an advantage, and makes it more likely that content is widely seen. But in most cases engagement metrics tell us whether posts were widely seen, so they are a much more accurate way to get a snapshot of current effectiveness. Engagement numbers are also far closer to the business objectives we laid out above, so I’ll say it again: why rely on follows?

At most I’d only ever want to use follower count to prioritise the first networks to investigate - as far as I’m concerned it isn’t a source of the actionable insights we said we wanted.

What we should look at instead

Engagements

In many ways, engagement-based numbers are the best to look at if we want to put together a fair and informative comparison including accounts we don't own.

Engagement numbers are publicly visible on almost every social network (ignoring private-message platforms), meaning we aren't having to work with estimates. What’s more, engagement is content-specific and requires some level of deliberate action on behalf of the user, meaning they can be a much better gauge of how many people have actually seen and absorbed a message, rather than glancing at something flying past their screen at roughly the top speed of a Honda Civic.

What business goal does this relate to?

Impressions. As mentioned above, engagements require the content to be on-screen and for the user to have recognised it at some level. Because engagements are like opt-in impressions, we can judge comparative success at staying front of mind. We could also use them as a sign that our audience is likely to take further action, like visiting our site or attending an event, depending on how you interpret the numbers (as long as it’s consistent). It’s fuzzy, but in a lot of ways less fuzzy than follows (which are further removed from actual business goals) or actual impressions (for which we often lack data). What's more, the inaccuracy of this data leans towards only counting users who cared about the content, so it’s something I’m happy to live with.

That being said, when you’re comparing your own community to itself over time (and not worrying about competitors), impressions are still a good metric to use - most social platforms will give you that number and it can give you a fuller idea of your funnel (we’ll cover impressions more below).

What numbers should you use?

As with follower change and impressions (which I discuss below), we need to control for varying follower base and posts-per-day. I’d recommend:

Engagements per (post*follower) (where you multiply total follower count by total updates posted)

Engagements per post

Total engagements.

The first number should help you compare how well a follower base is being engaged, the second should give an idea of return on investment, and the third is to avoid being totally thrown off by tiny communities which might not actually be moving the needle for business objectives.
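As a quick sketch of how those three numbers work together (the helper and all figures here are hypothetical):

```python
def engagement_metrics(engagements, posts, followers):
    """The three comparison numbers described above, for one account
    over one period."""
    return {
        "per_post_follower": engagements / (posts * followers),
        "per_post": engagements / posts,
        "total": engagements,
    }

# A big account posting often vs a small account posting rarely:
big = engagement_metrics(engagements=1200, posts=40, followers=10_000)
small = engagement_metrics(engagements=900, posts=15, followers=2_000)
# big wins on total engagement (1200 vs 900), but small engages its
# follower base far more efficiently (0.03 vs 0.003 per post-follower).
```

Looking at all three together stops either a huge-but-sleepy community or a tiny-but-lively one from winning the comparison outright.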

What tools should you use?

The platforms themselves are an option for gathering engagement numbers, which is one of the reasons this kind of check is ideal. This can be as simple as scrolling through competitor timelines and making notes of what engagement they’ve received. Unfortunately, sometimes this is time-consuming and many platforms take steps to block scraping of elements. However, I’ve found some success with scraping engagement numbers from Facebook and Twitter and I’ve included my selectors in case you do manage to use a tool like Agenty or Artoo.js to help automate this.

Facebook

Shares - .UFIShareLink

Likes - ._4arz span

Comments - .UFICommentActorAndBody

Additional comments - .UFIPagerLink

All visible posts - ._q7o

Twitter

Interactions - span.ProfileTweet-actionCountForPresentation

All visible posts - span._timestamp
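To show how selectors like these might be used once you’ve saved a page’s HTML, here’s a minimal standard-library sketch that counts elements by class name. It only handles the single-class part of a selector (the descendant half of a selector like ._4arz span is ignored), and the HTML snippet is a made-up stand-in - the real markup, like the selectors themselves, changes whenever the platforms update their front end:

```python
from html.parser import HTMLParser

class ClassCounter(HTMLParser):
    """Count elements carrying any of the watched class names."""
    def __init__(self, class_names):
        super().__init__()
        self.counts = {name: 0 for name in class_names}

    def handle_starttag(self, tag, attrs):
        classes = dict(attrs).get("class", "").split()
        for name in self.counts:
            if name in classes:
                self.counts[name] += 1

# A made-up fragment standing in for a saved competitor timeline:
html = ('<div class="_q7o"><a class="UFIShareLink">Share</a>'
        '<span class="_4arz"><span>12</span></span></div>')
counter = ClassCounter(["_q7o", "UFIShareLink", "_4arz"])
counter.feed(html)
print(counter.counts)  # {'_q7o': 1, 'UFIShareLink': 1, '_4arz': 1}
```

Counting share links, like counts and comments per visible post in this way gets you rough per-post engagement numbers without clicking through every update by hand.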

Facebook Insights is another great source of information because it’ll give you some direct comparisons between your page and others. It’s not quite the level of granularity we’d like but it’s easy, free, and direct, so gift horses and all that.

NapoleonCat - I don’t work for this company, but they have a 14-day free trial and their reports offer exactly the kind of information I’d be looking for, for both managed profiles and ones you are watching. That includes daily raw engagement numbers, calculated engagement rate, and SII, their “Social Interaction Index”, which claims to account for differing audience size, allowing direct comparison between communities.

The hitch is that Twitter and Instagram only start collecting information from when you add them to the account, so if you want to collect data over time you’ll need to pay the premium fees. On the other hand, their support team has confirmed that they’re perfectly happy with you upgrading for a month, grabbing the stats you need, removing your payment card for a few months (losing access in the process) and repeating six months later for another snapshot.

Socialblade - offers some engagement rate metrics for platforms like Instagram and Twitter. It doesn’t require you to log in, but the data isn’t tracked over time, so your information is only as good as your dedication to recording it.

Fanpage Karma does an impressive job of trying to give you actionable information about what is engaging. For instance, it’ll give you a scatter chart of engagement for other pages, colour coded by post type. Unfortunately, anything more than a small number of posts can make that visualisation incredibly noisy and hard to read. The engagement-by-post-type charts are easier to read but sacrifice some of that granularity (honestly I don’t think there is a visualisation that has engagement number and post type over time that isn’t noisy).

It’ll also let you compare multiple pages in the same kind of visualisation, where the dots still show engagement numbers but are colour coded by page instead of post type. Patterns can be a bit easier to divine with that one, but the same tension can arise.

If you’re tracking these stats for your own content, Twitter analytics and Instagram Insights are great, direct sources of information. Any profile can view Twitter analytics, but you’ll need an Instagram business profile to look at the Instagram data. At the very least, each can be a quick way of gathering stats about your own content’s impressions and engagement numbers, so you don’t have to collect numbers manually.

If you have to include a follower metric…

If you have to include a follower metric, I’d advise focusing on something far more representative of recent activity. Rather than total or raw number of follows, we can use recent change in followers.

While I still think this is a bit too close to raw followers for my liking, there’s one important difference - this can give you more of an idea of what’s happening now. A big growth in followers could mean a network is creating better content, it could also mean they’ve recently bought a bunch of followers, either way, we know they’re paying attention.

What business goal does this relate to?

Some people might use this number to correlate with impressions, but as I said we can use other numbers to more accurately track that. This number (along with raw post frequency) is one means of gauging effort put into a social network, and so can inform your idea of how efficient that network is, when you are looking at the other metrics.

These numbers are also likely closer to what senior managers are expecting so they can be a nice way to begin to refocus.

What number should you use?

We need to account for differing community histories; one way to do this is to consider both:

Raw followers gained over a recent period

Followers gained over a recent period as a proportion of total current followers.

We can use these two numbers to get an idea of how quickly networks are growing at the moment. The ideal would be to graph these numbers over time; that way we can see if follower growth has recently spiked, particularly in comparison to other accounts of similar focus or size.
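Both numbers are trivial to compute once you have a pair of follower snapshots for an account (the figures below are invented):

```python
def follower_growth(current, previous):
    """Raw followers gained between two snapshots, and that gain as a
    proportion of the current total."""
    gained = current - previous
    return gained, gained / current

# A small account vs a much larger one over the same period:
print(follower_growth(5_500, 5_000))      # gained 500, ~9% of its base
print(follower_growth(102_000, 100_000))  # gained 2,000, ~2% of its base
```

The large account added more raw followers, but the small one is growing far faster relative to its size - exactly the distinction the two numbers are there to capture.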

Once we've identified times where an account has achieved significant change in growth, we can start to examine activity around that time.

What tools should you use?

NapoleonCat (I promise I’m not getting paid for this) can give you historic follower growth data for accounts you don’t own, although unfortunately it only reports Twitter follower growth since the point an account starts being monitored (other networks seem to backdate).

Socialblade offers historic follower stats for accounts you don’t own. The first time anyone searches for stats on an account, that account is added to Socialblade’s watchlist and it starts gathering stats from that point. If you’re lucky, someone will already have checked; otherwise you can have a look now and check back later.

Impressions

It can be harder to get a comparison of impressions for content, but it’s one of our most foundational business objectives - a way to stay front of mind and, ideally, build towards sales. Everything we’ve covered in terms of follower numbers is a step removed from actual impression numbers, so it’s worth comparing actual impression numbers for recent content where we can.

What business goal does this relate to?

Impressions, but as impressions are the minimum bar to clear for all of our other business goals, this can also be considered top of the funnel for other things.

What numbers should you use?

Impressions per (post*follower) (where you multiply total follower count by total updates posted)

Impressions per post

Total impressions per account/all impressions for competitor accounts during that same period

Once we have collected impression numbers from a range of accounts on the same platform which are targeting the same audiences, we can sum them together and compare total impressions per account against total impressions overall to get a very rough share-of-voice estimate. This number will be heavily impacted by users who view content from one account again and again, but as those users are likely to be the most engaged, it’s a bias we can live with. Again, comparing this over time can give us an idea of trajectory and growth.
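The share-of-voice sum can be sketched like this (account names and figures are invented):

```python
def share_of_voice(impressions_by_account):
    """Each account's impressions as a share of the combined total
    for all tracked accounts over the same period."""
    total = sum(impressions_by_account.values())
    return {account: n / total for account, n in impressions_by_account.items()}

# Hypothetical impression totals for three accounts chasing one audience:
shares = share_of_voice({"us": 40_000, "rival_a": 50_000, "rival_b": 10_000})
print(shares)  # {'us': 0.4, 'rival_a': 0.5, 'rival_b': 0.1}
```

Recomputing this each period, always over the same set of competitor accounts, turns a pile of raw impression counts into a trend you can actually act on.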

Some accounts may try to drive up key metrics by posting a huge number of times a day. There's definitely a law of diminishing returns here, so as with engagements I'd also get an average per-post impression number to gauge comparative economy.

As this is post-specific, I would also recommend breaking these numbers down by post type (whether that be “meme”, “blog post”, or “video”) to spot trends in effectiveness.

What tools should you use?

Fanpage Karma again goes out of its way to give you means of slicing this data. Just like with engagement you can show impressions by post type for one Facebook page, or compare multiple at the same time. It can result in the same information overload but I definitely can’t fault the platform for a lack of granularity. Unlike with engagement, the platform will pretty much only give you impression data for Facebook and unfortunately sometimes it’s patchy (see the SEMrush and Moz graph below).

It’ll also give you YouTube view information: as well as a breakdown of video views and interactions based on when the video was posted, it offers cumulative figures which show how the performance of a video improved over time.

Tweetreach will give estimated reach for hashtags and keywords. By searching for a specific enough phrase, you can get an idea of reach for individual tweets, or for a number of related tweets if you’re smart about it.

Content shares

This is specifically people sharing a page of your site on a social network. It may help us flesh out some of the impressions metrics we’ve been dancing around, particularly in terms of content from your site or competitors’ being shared by site visitors rather than an official account.

What business goal does this relate to?

Impressions, site visits generating ad revenue

What numbers should you use?

To control for volume of content created by different sites, I would look at both total number of shares and shares per blog post, for example, during the same time period. It could also be valuable information to sum total follower count of the accounts that shared the content, to weight shares by reach, but that could be a huge task and also opens us up to the problems of follower count.

What tools should you use?

Buzzsumo will let you search for shared content by domain, and will let you dig in to which accounts shared a particular item. It can give a slightly imbalanced picture because it’s just looking for shares of your website content (so don’t expect the figures to include particularly successful social-only content for example) but it’s an excellent tool to get a quick understanding of what content is doing how well, and for who.

Link clicks

This can be difficult information to gather, but given its potential value to our business goals, it’s worth getting it where we can.

What business goal does this relate to?

Site visits generating ad revenue, event attendance, sales, depending on where the link is pointing.

In my experience it’s usually much harder to get users to click away from a social media platform than it is to get them to take any action within the same platform. Sharing links can also cause a drop in engagement, often because the primary purpose of the content isn’t to encourage engagement - success with a user often won’t be visible at all on the platform.

What numbers should you use?

Clicks per (link post*follower) (where you multiply total follower count by total updates posted)

Clicks per link post

Total link clicks

What tools should you use?

Understandably, this data is fairly locked down. Fanpage Karma again goes out of its way to get you what you need, and offers to plot posts against link clicks in one of those scatter graphs we love. I’ve reached out to them for information on how they collect this data and will update when I hear back. As with impression data, click data can sometimes be patchy - the platform seems to miss data consistently across metrics.

Outside of that, the best trick I’ve found is taking advantage of link shortener tracking. For example, anyone who uses the free service Bit.ly to shorten their links can also get access to link click stats over time. The thing is, those stats aren’t password protected: anyone can access them just by copying the Bit.ly link and putting a + sign at the end before following it.
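Building the public stats URL is just string manipulation - here’s a tiny sketch (the short code below is made up for illustration):

```python
# Bit.ly's public stats trick: appending "+" to a short link takes you to
# its click-statistics page. The example short code here is invented.
def bitly_stats_url(short_link: str) -> str:
    """Return the public stats page URL for a Bit.ly short link."""
    return short_link.rstrip("/") + "+"

print(bitly_stats_url("https://bit.ly/2AbCdEf"))  # -> https://bit.ly/2AbCdEf+
```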

Here are the stats for a link Donald Trump recently shared in a tweet.

Go forth and analyse

Hopefully, some of the metrics and processes I’ve included above prove helpful when you’re next directing your social media strategy. I would never argue that every single one of these numbers should be included in every competitor analysis, and there are a whole host of other factors to include in determining the efficacy of a community - for instance: does the traffic you send convert in the way you want?

That being said, I think these numbers are a great place to start working out what will make the difference, and will hopefully get us away from that frequent focus on follower numbers. If there are any numbers you think I’ve missed or any tips and tricks you know of that you particularly like, I’d love to hear about them in the comments below.

Yes, split testing for SEO is a thing, and a powerful one at that. In How Split Testing Is Changing Consulting, Will sums up why high priority SEO changes linger in developer backlogs, and how we’re addressing these issues with our ODN platform that allows us to test and roll out these recommendations without using our clients’ developer resources: we can substantiate best practices like H1 changes, alterations to internal links, and rendering content with and without Javascript.

Let’s get started with three tests you should try to see if you can increase organic traffic to your site.

1. Do H1 changes still work?

It won’t come as any surprise to SEOs that testing on-page elements can produce significant changes in rankings. That said, I’ve found that folks can put too much stock in on-page elements: we tend to get keyword tunnel vision and chalk up our rankings to keyword targeting alone. As a result, being able to test these assumptions on Google can help (dis)prove our hypotheses (and help us prioritize the right development work).

For iCanvas.com, prioritizing web development work is key: they’re a canvas print company with a robust team of developers, but like most companies, they have limited resources to test technical changes. As a result, dubious SEO-driven changes can’t be prioritized over user experience-driven ones.

We did, however, notice that iCanvas was not targeting product type in their H1 tags. As a result, this is what a typical category page (like this one) looked like.

Here, the H1 tag was simply “Beach Decor.” iCanvas was communicating the style and subject of their products in their title tags - that product being canvas art prints - but that context was lost on a given category page. We hypothesized that if we told the world (and, more specifically, Google) what the products are (canvas prints), we would better meet users’ search intent, resulting in more organic search traffic to our test pages. Here’s what the H1 looked like for the test:

After less than a month, we had our answer: our test pages with canvas prints appended to H1 tags gained significantly more traffic than our control pages. How’d we measure that?

It helps to know how ODN works (also check out Craig’s post, What is SEO Split Testing?). The most important thing to understand about the chart above is that ODN observes the organic traffic your site captures in real time to develop a forecast for the organic traffic we’d expect to receive in the future. That’s how we got to the nice “7.7% uplift if rolled out” estimate. There is of course volatility - forecasts are rarely perfect, and ours is no exception - which is why we also measure statistical significance within the normal range of variance we’d expect.
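To make the idea concrete, here’s a heavily simplified sketch of that measurement logic. A naive flat forecast stands in for ODN’s actual time-series model, and the session numbers are invented:

```python
# Illustrative only - not ODN's real model. Forecast a page's organic
# traffic from its pre-test average, then compare observed post-change
# traffic against that counterfactual to estimate the uplift.
pre_test = [100, 104, 98, 102, 101, 99, 103]   # daily sessions before the change
observed = [110, 108, 112, 109, 111]           # daily sessions after the change

forecast = sum(pre_test) / len(pre_test)       # naive flat forecast per day
actual = sum(observed) / len(observed)         # observed post-change average
uplift = (actual - forecast) / forecast

print(f"Estimated uplift if rolled out: {uplift:.1%}")
```

A real test would also check whether the gap between observed and forecast traffic exceeds the normal day-to-day variance before calling a winner.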

As a result, we were confident that this change would positively impact traffic to their site, so we declared this test a winner and rolled the change out to all of their category pages through ODN. This meant that we didn’t have to hijack our developers’ work queue in order to see an immediate benefit. Additionally, we had evidence we could bring to our devs instead of relying exclusively on the promise of following “best practices” in keyword targeting.

2. Will altering internal links give you a big payoff?

Testing changes to internal links is often an ill-defined endeavor. Do you measure changes to PageRank (dubbed local PageRank by Will Critchlow)? Should you look at your log files to observe changes to Google’s crawling behavior?

In our case, iCanvas had a somewhat simpler internal linking issue we wanted to address: self-referential links. As an art company, it’s essential to attribute the creator’s name to their work of art.

As a result, they had made the decision to include a link to the artist of the work on every product listing.

For instance, in the above screenshot of a category page, you can see that each product has its artist listed, and those artists’ names are linked to pages listing all of their available artworks on iCanvas. While this application made sense for category pages where various artists’ products are featured alongside each other, it resulted in redundant links on those individual artists’ pages.

Each of these artist attributions, on the artist’s own category page, was linking back to that same page (thus: self-referential links). Our hypothesis was that if we removed these redundant links, we’d better consolidate our PageRank. We knew this change could have a dramatic impact on artists’ products, resulting in more organic traffic flowing to their product pages. Our test, however, would measure the impact on organic traffic acquisition to our test group of artist pages. So how did it turn out?

As it turned out, our test was a success: artist pages in our test group received more organic traffic than our control pages. We were again able to test something that would’ve been touted as “best practice”, without either rolling it out sitewide untested or manually setting up test and control groups and measuring the results ourselves. Once we saw the positive impact (less than a month later), we rolled this change out sitewide and had the validation we needed to get the necessary development work prioritized.

3. How good is Google at crawling JavaScript?

If you follow our blog, you’ve already read about how we tested Google’s ability to crawl and render JavaScript. We posited that, because Google wasn’t reliably displaying iCanvas’ products in its Fetch and Render tool, iCanvas’ category and product pages would receive more organic traffic if we used a CSS trigger to load their products instead of relying exclusively on JavaScript.

Above is a screenshot of what we saw (and, presumably, what Googlebot saw) in Fetch and Render of a category page.

After our tweak, however, we plugged one of our test URLs into Fetch and Render, and we could finally produce what users see in their browsers with JS enabled. But did it actually result in additional organic traffic to our test pages?

As you can see above, it did. Based on the performance of our test pages, iCanvas would see an extra 88 pageviews daily with their products triggered through a line of CSS instead of JS. Measuring the impact of this relatively simple change could have taken much longer than this month-long experiment. By the end though, we were ready to roll this out sitewide to ensure that all iCanvas products were crawlable and discoverable.

Split testing something as simple as on-page SEO can produce meaningful traffic changes that’ll allow you to validate best practices and get the evidence your stakeholders (and developers) need to buy into your suggestions. Is it time for you to try SEO split testing?

As we bid farewell to the glorious 35-degree-days of summer, and brace ourselves for the inevitable autumn chill, here’s a look back at the creative content that tickled us this summer.

One of the reasons I love writing these posts each quarter is that, if nothing else, it helps me and the creative team here at Distilled broaden the ways in which we think, and it inspires us to consider the more varied, perhaps less expected executions that we could try.

After a successful run with the same topics and execution style, ideas can become formulaic, and can soon become dull and uninspiring for us and for those experiencing our content. So it’s all the more important that we turn to others for inspiration, and keep ourselves open to new ways of seeing and representing the stories we tell.

Imagery reimagined

Concepts that leave a lasting impression are often those that illustrate something we know well in an alternative way. Each of these campaigns reimagines everyday images differently, whether in a comical or a shocking way.

This ad for KFC certainly makes you look twice. Fire has been replaced with fried chicken. It’s that simple. The organic patterns of the deep fat fried batter really do take on the life of expanding balls of exploding exhaust fire. Cars backfiring, or rockets launching into space – this image ad replacement hits the smart and succinct notes perfectly.

Communicating safe sex to younger people can seem dull, awkward or just something the target audience does not want to think about. This aptly named ‘Unprotected Text’ campaign, however, cleverly taps into how young people communicate, using the not-so-secret alternative emoji meanings. Emoji are innately light-hearted, accessible and simple, and so allow the NHS to talk about a slightly embarrassing topic in an eye-catching and memorable way, specifically targeting the 16-24 age range, where STIs have been on the rise.

This campaign left me really concerned – is that really how many gorillas are left in the wild?! Campaigns about near-extinction and the depressing statistics that face some of the world’s greatest creatures are often not the most creative. They can be quite scarring and off-putting, shocking but not necessarily compelling people to donate. Instead of using gore tactics, this campaign simply shows the number of animals left in a certain species by the number of pixels used to depict them - the more pixelated an animal is, the more endangered. The mechanism is simple, but the impact is everlasting.

It turns out most people don’t know what a bike looks like… it’s something we see every day, but for some reason we can’t remember how those tubes of metal connect to make the frame. Crowdsourcing drawings of a single item means the variety of executions far exceeds what one person could imagine. Compared with the highly polished graphics and computer renders we are used to, an amateur’s naïve drawing has a charm of its own. This artist turned these quick sketches into realistic images of bikes, immediately highlighting how flawed the designs were.

Manipulating Maps

This visualisation shows the touring routes of famous musicians on a map shown on a gig ticket. Different music genres are more prevalent in certain locations and fan bases are not always the same, so the routes on the maps vary wildly. There’s also a noticeable difference in patterns when comparing the route that tour buses take versus cross-state flyers.

We are so used to going underground in cities and popping up in new neighbourhoods that we often give little thought to the ground covered. Tube maps are simplified for easy navigation, so they usually don’t represent the actual routes. These visualisations show us exactly where the lines go, passing over iconic monuments that help us navigate our cities.

This scrolling data visualisation manages to simplify how the US uses its land: only 3.6% is used for urban areas, while a massive 55% feeds the country as crops and pasture land. Having watched Cowspiracy recently, I noticed the piece later goes on to highlight what that documentary tries to hammer home: cattle and cattle-feed production take up a disproportionate amount of the country. Perhaps it’s time to finally give up those beef burgers! The data was gathered using surveys, satellite images and categorisations from various government agencies.

For April Fools’ Day this year, Google hid Wally (or Waldo, depending on where you are from) in Google Maps. You are taken to an area where he is hidden, then zoomed into one of the famous Where’s Wally illustrations to begin your search.

Social Media and Mental Health

Mental health company Sanctus has created a tongue-in-cheek page and video that aims to highlight the effect social media has on our mental health. People are often trying to one-up each other by showing the glamorous holidays they have been on or the delicious meals they are eating, and this often leaves others feeling inadequate if they can’t keep up. Life Faker makes its point by offering a library of images, so you can fake the life that everyone seems to want to show. My favourite quote from the video is ‘I have never seemed happier’, which really highlights the irony of it all.

Real World Visualisation

Living in such a digital landscape, I always get excited when people create physical data visualisations. We all know the length of terms and conditions can be a joke, and this piece highlights the actual length of, and comparison between, the T&Cs of different apps. If you’re interested, Instagram takes the lead, closely followed by Snapchat.

What content have you enjoyed lately? Let us know in the comments.

Announcing Full-Funnel Testing - testing SEO and CRO at the same time

Craig Bradford, 15 October 2018

Until now it’s not been possible to measure the impact of SEO and CRO at the same time. Today we’re proud to announce a new feature of Distilled’s Optimisation Delivery Network that we’re calling full-funnel testing.

Our ODN platform launched with a focus on SEO testing. You may have thought about it by comparison to tools like Optimizely that allow you to do CRO testing. If you want to know more about how SEO testing works and how it’s different from CRO, you can read more in this post on what SEO testing is.

The trouble with just using one or the other is you don’t have any insight into how they impact each other.

That’s a big problem because we know from our testing that a lot of SEO changes impact conversion rate and a lot of CRO changes (even when they increase conversion rate) can negatively impact organic traffic. If you haven’t read it already, you should check out Will’s blog post on the impact of rolling out negative SEO changes but here’s an example of when it goes wrong. This chart shows the search impact of a suggested CRO change on SEO. It decreased organic traffic by 25%.

For that reason, we see the relationship between SEO and CRO like this:

We saw a need to be able to measure SEO and CRO at the same time. For the last few months, we’ve been running a beta version for some of our clients of what we are calling “full-funnel testing”. Today we’re opening that feature up to everyone and we’d like to show you how it works.

How does it work?

Let’s look at CRO first. To run a CRO experiment, we cookie users based on the landing page design they arrive on; they’ll then always see that version when they move between pages.

The result is we know the impact on conversion rate, but we don’t know the impact on SEO.

When we do pure SEO testing, we split pages, not users, and look at the different impacts on search traffic to the control and variant pages:

The result of this framework is that we know the impact on SEO but we don’t know the impact on conversion rate:

A new framework - Full-funnel testing

With full funnel testing, the site is set up initially in the same way as in the pure SEO testing scenario - and then when someone arrives on a landing page, the SEO testing part of the experiment is complete:

We can then pivot into a CRO experiment by dropping a cookie for that user to make sure they see the same template that they first landed on when moving between pages:

Note that, having landed on the Unicorns page initially, they now see the “A” template version on all subsequent pageviews even on pages like Cats and Badgers that would be set up with the “B” template for anyone landing directly on them as a new visitor:

The result is that we are able to measure the impact of changes on SEO and CRO at the same time.
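The assignment logic described above can be sketched roughly as follows. This is illustrative only, not ODN’s actual implementation: pages are split into “A”/“B” buckets for the SEO test, and on a user’s first pageview we record the template of their landing page in a cookie, which then wins on every subsequent pageview.

```python
import hashlib

def page_bucket(path: str) -> str:
    """SEO side: a stable page-level split into control/variant templates."""
    digest = hashlib.md5(path.encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

def template_for(path: str, cookie: dict) -> str:
    """CRO side: the first landing sets the cookie; later views reuse it."""
    if "template" not in cookie:
        cookie["template"] = page_bucket(path)   # the landing page decides
    return cookie["template"]

session = {}                        # stands in for the user's cookie jar
landing = template_for("/unicorns", session)
later = template_for("/cats", session)
print(landing, later)               # same template on both pageviews
```

So a visitor who lands on the Unicorns page in the “A” bucket keeps seeing the “A” template on Cats and Badgers, even if those pages would serve “B” to a fresh visitor landing on them directly.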

Thanks for making it this far, you can expect to hear more about this as we get more examples of full-funnel tests and start to share what we learn. If you’d like to know more or see a demo, reach out to us here.

How to Make a Histogram using Google Sheets

Benjamin Estes, 15 October 2018

You’ve probably summarized data with a number — like an average. For instance, the median is an average. It tells you what number is at the middle of your data, if you were to sort all the numbers from smallest to biggest. But an average says nothing about how those numbers are spread out.

Think about the average transaction value for an online store. Is the average high because there are many sales near that price? Or is it that a few large sales make the average look bigger? A histogram will show you which is the truth.

A histogram is an image summarizing how numbers are spread out. It breaks data into buckets, and shows how many numbers are in each bucket. Usually, each bucket has the same width. The width of a bucket is the range of numbers that go in that bucket. A bucket that would hold any of the numbers 1 2 3 4 5 has the same width as a bucket that holds the numbers 6 7 8 9 10. Both have a width of 5.

Let’s try splitting numbers into these buckets. Call the buckets [1, 5] and [6, 10]. We’ll use the numbers (1, 2, 3, 4, 4, 4, 9). Now count how many numbers are in each bucket: [1, 5] ← 6 and [6, 10] ← 1. As a table:

Bucketing (1, 2, 3, 4, 4, 4, 9):

Bucket   | Count | Numbers
[1, 5]   | 6     | 1 2 3 4 4 4
[6, 10]  | 1     | 9

median = 4

The median and our buckets say different things about the same data. The median says, “the number 4 is at the middle of this data.” Which is true! The histogram says, “most of the data is in [1, 5]... only a single value is in [6, 10].” Which is also true! Bucketing gives us facts about the spread of numbers that averages cannot.

That’s all the theory for today! Let’s walk through the steps necessary to build a histogram from scratch. Then, we’ll analyze the example histogram to understand what it reveals.

These steps are all replicable in Excel, too — right down to the function names.

First, choose how many buckets you need

A histogram lumps similar numbers into buckets, and shows how many fall into each bucket. First, we need to know how many buckets to use. You can choose any number. A good choice will be small enough to summarize the data, but large enough to show interesting facts.

Here’s one approach:

Count how many numbers you have.

Take the square root of the count.

Round the square root up to the nearest whole number. You can’t have half a bucket!

Say you have 48 data points. Then, √48 ≈ 6.93, which rounds up to 7. That’s how many buckets to use.

Choose the width of the buckets

Once you know the number of buckets, you must find a width for the buckets that covers all your data. If you choose a width that is too small, then your buckets might not include every number in your data.

This is an important choice. The point of the histogram is to make the spread of numbers easier for people to see. So, the choice of bucket width should make the data easier to understand. In general, it’s best to use multiples of 2, 5, or 10 for this. You’ll see what I mean in the example.

Here’s the process:

Find the range of your data. The range is the biggest number minus the smallest.

Divide the range by the number of buckets.

Round up to the nearest “nice” number.

In the example sheet, the biggest number is $5,473. The smallest is $20. So the range is $5,453.

The number of buckets is 7, and $5,453 / 7 = $779. This rounds up to $800, which is the nearest “nice” multiple of 2 and 10. We’ll use $800 as our bucket width. If we chose a width of, say, $750, then the total range of our buckets would be 7 ⨉ $750 = $5,250. Since $5,250 is smaller than $5,453, some of our numbers would fall outside of any bucket!
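The two steps above, using the post’s example figures, work out like this. (The “nice” rounding at the end is a human judgement call, so it’s hard-coded here.)

```python
import math

# Step 1: number of buckets = square root of the data count, rounded up.
data_count = 48
num_buckets = math.ceil(math.sqrt(data_count))   # 7 buckets

# Step 2: bucket width = range / number of buckets, rounded up to a
# "nice" number - a judgement call, so we pick $800 by hand.
biggest, smallest = 5473, 20
raw_width = (biggest - smallest) / num_buckets   # 779.0
width = 800

# Sanity check: the buckets must cover the full range of the data.
assert width * num_buckets >= biggest - smallest
print(num_buckets, width)
```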

Regardless of how you wind up drawing the chart, you should choose the width of the buckets yourself. Don’t let a spreadsheet app choose for you.

Sort your data into buckets

Depending on how you chart your data, you may not need this step. But it’s a useful skill, so try it anyway!

First, you need to determine the smallest value to chart. In the example, we’ve chosen to start at zero. Often (at least in marketing) this will be a reasonable and easy choice. Transaction amounts start from free ($0) and get bigger from there. If you need the chart to start above zero, start at the nearest “nice” number below the smallest number in your data.

Now, make a list with the biggest number contained in each bucket. This is easy with a spreadsheet. Start with the first number (the first “top end” of a bucket). Get the rest of the rows by taking the number in the row before it and adding whatever the width of a bucket is:

Note that there is no “0” row in our list of buckets. Each number is the upper bound of a bucket.

Now, let’s count how many numbers are in each bucket. Fortunately, Sheets will do this for us! We’ll use the FREQUENCY function. The result looks like this:

This cell is where we’re using FREQUENCY.

FREQUENCY is counting the number of data points at or below each boundary (and above the one before it). So a transaction of exactly $800 would be counted in the $800 bucket, not the $1,600 one. Don’t worry about this. But if anyone asks — now you know!
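A Python stand-in for that FREQUENCY behaviour looks like this (it assumes every value is at or below the last boundary, and the sample transactions are invented):

```python
import bisect

def frequency(data, boundaries):
    """Count values at or below each boundary (and above the previous one),
    mirroring the Sheets FREQUENCY semantics described above. Assumes no
    value exceeds the last boundary."""
    counts = [0] * len(boundaries)
    for value in data:
        # bisect_left finds the first boundary >= value, i.e. its bucket.
        counts[bisect.bisect_left(boundaries, value)] += 1
    return counts

boundaries = [800, 1600, 2400, 3200, 4000, 4800, 5600]   # bucket upper bounds
transactions = [20, 750, 800, 1500, 1600, 2000, 5473]    # sample values
print(frequency(transactions, boundaries))               # -> [3, 2, 1, 0, 0, 0, 1]
```

Note that $800 exactly lands in the first bucket, matching the boundary-inclusive behaviour described above.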

Create the histogram as a bar chart

Sheets has a built-in histogram tool. Don’t use it. It’s hard to control what charts you get.

The good thing is, you’ve already done all the work you need to draw the histogram without fancy tools. The built-in bar chart is good enough for you! It’s dead simple, and it’s much easier to get a good chart.

Start by selecting the buckets and count:

Then click Insert > Chart:

Sheets should decide to insert a bar chart. If it doesn’t, select “Column” as your chart type in the editor:

...and you’re done! You should have something that looks like this chart:

This is a histogram. You’ve done it!

Now, inspect your data

That was a lot of work for a simple chart. Here’s the payoff — and it’s a good one. What do you see when you look at this histogram?

Here’s what I see, in plain English:

There are two kinds of transactions. A bunch of smaller transactions are in the smallest bucket, and a few bigger ones that are more spread out. (In our business, smaller amounts tend to be subscription payments. Larger amounts are for conference tickets.)

The thicker bar at $1,600 says “lots of folks are buying one ticket for Distilled’s conference”...

...but about half of the “ticket” transactions are for multiple tickets.

If we had used “average transaction value” instead of a histogram, we wouldn’t see that there are two types of transactions!

This isn’t the typical type of post you might find on Distilled’s blog. I’ve always been a glass-half-full person, but the honest answer is that my positive attitude is the result of listening, learning, testing and being open to the world. I’m going to give you my approach to being positive at work.

It didn’t happen overnight; it took time and practice. You’ve got to want to be positive!

The more you feed your mind with positive thoughts, the more you can attract great things into your life.

Roy T. Bennett

I’m going to tell you the things I do, hopefully, you’ll want to try a few!

Meditate

Meditation, gathering your thoughts, visualisation - whatever you want to call it, I’ve found that giving yourself time to consider what you want out of a day, every day, really helps you be productive!

Try this: every morning, spend between 5 and 20 minutes visualising what you need to do, yourself doing it, and completing it - like a dress rehearsal in your head. Think of it as creating a game plan of what you want to achieve that day!

Meditation calms me, enabling me to focus on specific challenges. I’ve found it particularly useful when I’ve got a big challenge on or if something unexpected happens. A few minutes of considered focus breaks the tunnel vision and opens your mind to answers you may not have thought of.

Considering what you want to get out of a day sounds so simple, and it is. The trick is to keep doing it!

Be positively direct!

Don’t beat about the bush: the best way to get things done is to be clear, concise and direct.

Being direct with people will help them respond to your need precisely, saving you both time.

Try this: refine how you ask for help on projects, for the information you need or, in fact, for anything at all. Be brief, to the point and make sure what you’ve requested is understood.

Be respectful of others’ time: understand they have their own priorities, tell them what you need (even offer to send a bullet-point email after you’ve had the chat) and say thank you! This isn’t easy, but I’ve found that being polite and saying “I know you’re busy” shows you understand their position, opening them up to the conversation. Even if they can’t help you there and then, it starts the conversation, and you’ll be surprised how often the person you asked responds without you having to chase them.

Don’t pester people either - that’s just annoying!

Have Manners

This is an obvious one, right? I’ve found a few good manners go a long way, and that doesn’t just apply to the workplace!

I know everyone is always in a rush and has ‘stuff’ to do, but it takes 10 seconds to open a door for a stranger or compliment a barista on their nice shirt. Manners show you have respect for others. Being pleasant, showing you have time for others and doing that nice thing for someone all lead to one outcome: you’ll be liked!

I always go the extra mile for people I like, and if people like you they’ll do the same. Getting things done becomes much easier if people see you as a nice person. Remember to be genuine and authentic - people know when you’re being nice just because you want something.

Try this: ask people how they are and how their day’s going - and listen. You’ll be remembered for the person you were long after your work is forgotten.

Side note: please do your best to be the best version of you - too many people don’t find the time just to say thank you!

How would you feel?

“I feel like X”, “I think Y”, “I need Z”. Notice the ‘I’.

Do you consider things from the other person's point of view?

The ability to show people you understand their viewpoint is invaluable; it helps resolve disagreements, identify quick solutions and gets things done smoothly and efficiently.

Try this: when tackling a problem together, ask the other person what they think is the best way forward and why they think that. And really listen to what they say. You’ll find you become more open to suggestions and discover solutions you may not have thought of. You’re likely to get a positive resolution more quickly.

Looking at life from the other person’s point of view is a fantastic self-education tool and teaches you to build rapport. This leads to you being able to connect with people more easily, in a positive way that encourages others to share with you more freely. If you’re open to others, they’ll be open to you too!

What did I learn from that bullshit?

“I’m so glad that bullshit is done” or “I’m so happy I don’t have to work with that idiot again”. Sound familiar?

In every situation, good or bad, look for a lesson. You may already do this, in which case hats off to you; the ability to look for positives helps you to reinforce a positive mindset. You can learn from your shortcomings.

Try this: ask for feedback on where you went wrong or why you didn’t win a project. If you can identify what you’re doing wrong, then you can take action to correct it. You turn a negative into a positive by taking this lesson on board to improve next time.

Take a break, have a coffee

Ok, you don’t have to have a coffee (although being a caffeine addict I would advise it). Have a biscuit, go for a walk, just do something to take a break.

If you can focus for a solid 8-hour working day, please email me and explain how. Some people can focus for hours, others in bursts of 30 minutes. I focus for 45 minutes to an hour before my mind wanders. By breaking my day up into periods of focus and resetting, I’ve become more efficient and maintain my positive attitude.

When I meditate in the morning, I break my day into sections, e.g. send emails, write a proposal, call with a client and so on. In between these tasks I may walk around the office for a few minutes or make a coffee; this helps me to wind down from my last task. When I return to my computer I’m fresh and ready to go again.

By creating clear breakpoints between tasks, you can generate a sense of achievement (the task is done) and give yourself a mental refresh. If you have a mammoth task, break it down into smaller, manageable tasks - the bonus here is that you’re less likely to procrastinate too.

Try this: the Pomodoro Technique. Break work down into intervals using a timer and take a short break between intervals. My version runs through Spotify: I listen to a set number of tracks while working, then it’s break time!
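As a rough illustration, the interval idea can be sketched in code (a toy example, not a real timer app; the 25/5-minute defaults are just the classic Pomodoro values, and the function name is made up for this sketch):

```python
# Toy sketch of a Pomodoro-style day plan: alternating focus and break
# blocks. Interval lengths are the classic 25-minute work / 5-minute
# break defaults; adjust them to whatever span of focus works for you.

def pomodoro_schedule(total_minutes, work=25, rest=5):
    """Return a list of (label, minutes) blocks filling total_minutes."""
    blocks = []
    remaining = total_minutes
    while remaining > 0:
        focus = min(work, remaining)
        blocks.append(("work", focus))
        remaining -= focus
        if remaining > 0:
            pause = min(rest, remaining)
            blocks.append(("break", pause))
            remaining -= pause
    return blocks

# A two-hour stretch becomes alternating focus and break blocks.
schedule = pomodoro_schedule(120)
```

The point isn’t the code itself, it’s the shape of the day: fixed focus blocks with deliberate breaks between them, whatever lengths suit you.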

You need to have a long-term goal

If you’re planning a trip, you’d be a fool not to consider the route, the time, what you need and where you want to end up, so why would you start a task without knowing what you want to achieve?

If you’re looking at what you want to achieve this year, you need to consider what you need to do each month or week to achieve it. Create a timeline with the actions you need to complete, and by when, to reach your goal. Setting a deadline will give you focus and encourage you to move towards the goal.

Try this: when breaking a goal down into tasks, begin with a mind map and think about everything you need to do. Then set tasks and deadlines. Discipline is key to success here - find the system that works for you!

Remember it’s so easy to slip into short-term priorities rather than long-term goals.

Be prepared for the unexpected

Life never runs smoothly and there is always that thing that appears out of the blue to turn your day into Ragnarok.

Breathe, consider the challenge and don’t get overwhelmed or freaked out - this happens to us all.

If you prioritise tasks (I would advise capturing them somewhere, even if it’s on a scrap of paper) from ‘must do’ through to ‘would be nice to do’, it gives you the flexibility to adapt and re-prioritise. You’re able to identify the tasks you can roll to a later date, freeing up time to deal with that unexpected thing you need to handle today.

This really comes down to creating a well-thought-out system to manage your daily schedule. I would recommend the book Getting Things Done; it’s a labour of love to read, but if you stick with it you’ll get solid direction on putting a great personal management system in place.

I use Trello as my personal management system. It’s super useful for creating to-do lists and helps to organise and prioritise work (you should check it out).

Always have fun and dream

“The creative adult is the child that never grew up!”

It’s easy to fall into the professional mindset where work is work, but work forms a large chunk of your time on planet earth, so enjoy it and have fun.

I’m lucky to work with people who are great, embrace individuality and want to create a work environment that is fun to work in.

Have a way to wind down or switch off from the world. I watch cartoons or go on a funny meme browse. I’ve even got a ‘go to video’ I watch to make me smile (you’ll find it here - it’s Kanye West or Ye related, just to give you a heads up).

Remember work is important but so is your mindset, have a dream.

Everything in life comes down to how much you want it. I’ve made the above activities into habits. The more you practice, the more they will become second nature to you.

Let me know if you already do any or all of the above, if you do anything different and if you found these suggestions worked for you, all feedback is welcome!

Changed language on some issues to make it clear how they fit in the hierarchy.

Removed several redundant lines.

Fixed typos affecting meaning of a couple lines.

Added new section relevant to mobile-first indexing.

Updated September 13, 2017. Changes include:

Made each line easier to understand

Added pointers for going straight to the relevant reports in each tool

Changed which tool to use for some rows

Added more Google references

Removed a couple dubious lines (site speed, HTTP/2)

Removed superfluous timing column

Removed whole sections that made the audit less MECE

Fixed cases where some cells would say “Incomplete” and others wouldn’t

Thanks everyone who has provided feedback over the last year!

Technical audits are one of the activities that define SEO. We’ve all done them. But audits are only as valuable as their impact. Whether you’re a practitioner or an agency partner, your job really begins when you finish the audit. You must take your recommendations and make them a reality. Distilled thrives on this “effecting change” mindset.

Yet the (long, laborious) audit has still got to be done. We sift through crawls, consider best practices, analyze sitemaps—the list goes on.

But we’re committed to the technical audit. So if we’re going to audit a site, why not do the audit in a way that makes the fun part—making change happen—much easier?

The challenge

With that in mind, we asked: “Can we design an audit that helps make real change happen?” The result is a problem-aware technical audit checklist. It considers the underlying problems we’re tackling (or trying to prevent). It makes technical audits faster, more effective, and more impactful.

Read on for more about how to put the checklist to use. Many on our team find it self-explanatory, though, so if you want to get cracking have at it! And then let us know what you think.

Every great audit starts with a checklist!

There are lots of technical checklists out there. A good technical audit inspects many things in many places. Checklists are perfect for keeping track of this complexity. They’re simple tools with lots of benefits. Checklists are:

Comprehensive. Without a checklist, you may still discover the obvious technical problems with a site. Using a checklist ensures you remember to check all the relevant boxes.

Productive. Working without a checklist takes more effort. At each stage you have to decide what to do next. The checklist answers this question for you.

This checklist is better

Technical SEO has one purpose: ensure site implementation won’t hurt search visibility. Everything we uncover leads back to that point. This defines the scope of the audit.

Beyond that, many folks break down technical to-dos by where they need to look or what tool they need to use. They might look at all on-page elements, then move on to all sitemap issues. That’s a valid way of approaching the problem. We’ve got an alternative.

We look ahead to the conversations we’ll have after we’ve done the audit. Consider this (realistic) statement: “We’re concerned that important content isn’t indexed because URLs aren’t discovered by crawlers. Submitting a sitemap to Search Console might help fix the problem.”

This is a coherent technical recommendation. It explains why to make a change. It has 3 parts:

Outcome - important content isn’t indexed.

Cause - URLs aren’t discoverable by crawlers.

Issue - we haven’t uploaded sitemaps to Search Console.

That’s the difference: you’ll see this is exactly how we’ve structured the checklist. Take a moment to jump over and inspect it with this model in mind. By now you’re probably getting the idea—this isn’t just a technical checklist. It’s also a tool for communicating the value of your work.

The structure encourages completeness

Each row of the checklist represents a problem. By including the right problems at each level, we also make it as complete as possible without adding redundancy. The principle of MECE (“Mutually Exclusive, Collectively Exhaustive”) is what makes it work. At each level of analysis, we:

include all possible problems, and

ensure problems don’t overlap.

Let’s illustrate, using the highest level of analysis. The checklist as a whole is investigating whether “we have a technical problem with our site that is reducing search visibility”. There are 3 reasons we could lose search traffic because of a technical issue:

there is a technical reason good content isn’t indexed, or

there is a technical reason indexed content doesn't rank for desired terms, or

there is a technical reason site content isn't well-presented in search.

These represent all the possible problems we could be dealing with (“collectively exhaustive”). They also don’t overlap (“mutually exclusive”).

By applying the same way of thinking recursively, we expose all sub-problems in these areas. Then we list all issues that could be causing these sub-problems. This makes the checklist as thorough as possible, without redundant checks that could slow us down.

A few pointers

Getting started

This checklist template is available to the public. When you open it, you’ll discover that you only have “view” permissions for the master document. To use it, you’ll first want to create a copy.

Marking status

Mark each issue with Pass, OK, or Fail:

Pass means you have no concerns.

OK means the issue doesn’t seem relevant currently.

Fail means something appears to be wrong.

When you update an Issue, the grade for the Cause and Outcome will also be updated. If any Issue’s score is Fail, the Cause and Outcome will also Fail.
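The rollup behaviour can be sketched in code (a minimal illustration of the logic; the real template implements this with spreadsheet formulas, and the Pass/OK precedence shown here is an assumption — the post only specifies that any Fail propagates upward):

```python
# Sketch of the checklist's status rollup. The one rule stated in the
# post: if any child Issue is 'Fail', its Cause and Outcome also Fail.
# Treating 'Pass' as outranking 'OK' for the remaining cases is an
# assumption for this illustration.

def rollup(child_statuses):
    """Combine child statuses ('Pass', 'OK', 'Fail') into a parent grade."""
    if "Fail" in child_statuses:
        return "Fail"          # any failing Issue fails the parent
    if "Pass" in child_statuses:
        return "Pass"
    return "OK"                # nothing relevant was found at this level

# Issues roll up into a Cause, and Causes roll up into an Outcome.
issues = ["Pass", "OK", "Fail"]
cause = rollup(issues)               # the Fail propagates upward
outcome = rollup([cause, "Pass"])    # and on up to the Outcome
```

This is why a single failed Issue surfaces at the top of the sheet: the grade cascades up both levels of the hierarchy.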

Find what you’re looking for quickly

People new to search engine optimization can still start using this sheet. We’ve now added a “Start Here” column to make it faster than ever to get started.

For new users of some of these tools, it might not be clear where to find relevant information. The “Start Here” column points you to the exact place you can find the details you need.

Understand what’s at stake

If you’re the person analyzing the audit after it’s done, you want to get a high-level picture quickly. Use the structure of the sheet to simplify that view by filtering the Issues rows.

Filtering for Outcomes and Causes gives you a quick-and-dirty summary of a site’s strengths and weaknesses. This is the first thing I look at when I see a completed audit!

Filtering related tasks

If you’re the one doing the audit, you want to get it done as quickly as possible. Take advantage of the structure of the sheet by showing only the issues you’re inspecting right now. Try filtering by the “Where” column—for “Google Search Console”, for instance. This will let you grade all Issues for that tool at once.

We want to learn from you, too

This checklist is a living document. We appreciate any feedback you have. Feel free to jump in the comments section here or find me on Twitter: @BenjaminEstes.

Interested in working with us?

This audit is an example of the way Distilled approaches consulting. We aren’t limited to SEO—we also help our clients with marketing strategy, content design and production, paid search, and more. If our approach sounds interesting, please reach out!

Our experience shows that without a structured testing program, most SEO efforts are at best taking two steps forward and one step back, routinely deploying changes that make things worse.

This is true even when the thinking behind a change is solid, is based on correct data, and is part of a well-thought-out strategy. The problem is not that all the changes are bad in theory - it’s that many changes come with inevitable trade-offs, and without testing, it’s impossible to tell whether multiple small downsides outweigh a single large upside or vice versa.

For example: who among us has carried out keyword research into the different ways people search for key content across a site section, determined that there is a form of words that has a better combination of volume vs competitiveness and made a recommendation to update keyword targeting across that site section?

Everyone. Every single SEO has done this. And there’s a good chance you’ve made things worse at least some of the time.

You see, we know that we are modelling the real world when we do this kind of research, and we know we have leaky abstractions in there. When we know that 20-25% of all the queries that Google sees are brand new and never-before-seen, we know that keyword research is never going to capture the whole picture. When we know that the long tail of rarely-searched-for variants adds up to more than the highly-competitive head keywords, we know that no data source is going to represent the whole truth.

So even if we execute the change perfectly we know that we are trading off performance across a certain set of keywords for better performance on a different set - but we don’t know which tail is longer, nor can we model competitiveness perfectly, and nor can we capture all the ways people might search tomorrow.

Without testing, we put it out there and hope. We imagine that we will see if it was a bad idea - because we’ll see the drop and roll it back. While that may be true if we manage a -27% variant (yes, we’ve seen this in the wild with a seemingly-sensible change), there is a lot going on with large sites and even a large drop in performance in a sub-section can be missed until months after the fact, at which point it’s hard to reverse engineer what the change was. The drop has already cost real money, the downside might be obscured by seasonality, and just figuring it all out can take large amounts of valuable analysis time. When the drop is 5%, are you still sure you’re going to catch it?

And what if the change isn’t perfect?

The more black-box-like the Google algorithm becomes, the more we have no choice but to see how our ideas perform in the real world when tested against the actual competition. It’s quite possible that our “updated keyword targeting” version loses existing rankings but fails to gain the desired new ones.

Not only that, but rankings are only a part of the question (see: why you can’t judge SEO tests using only ranking data). A large part of PPC management involves testing advert variations to find versions with better clickthrough rates (CTR). What makes you think you can just rattle off a set of updated meta information that correctly weights ranking against CTR?

Our testing bets that you can’t. My colleague Dominic Woodman discussed our ODN successes and failures at Inbound 2018, and highlighted just how easy it can be to dodge a bullet if you’re testing SEO changes.

We’re talking about small drops here though, right?

Well firstly, no. We have seen updated meta information that looked sensible and was based on real-world keyword data result in a -30% organic traffic drop.

But anyway, small drops can be even more dangerous. As I argued above, big drops are quite likely to be spotted and rolled back. But what about the little ones? If you miss those, are they really that damaging?

Our experience is that a lot of technical and on-page SEO work is all about marginal gains. Of course on large sites with major issues, you can see positive step-changes, but the reality of much of the work is that we are stringing together many small improvements to get significant year-over-year growth via the wonders of compounding.

If you’re rolling out a combination of small wins and small losses and not testing to understand which are which to roll back the losers, you are going to take a big hit on the compounded benefit, and may even find your traffic flatlining or even declining year over year.

You can’t eyeball this stuff - we are finding that it’s hard enough to tell apart small uplifts and small drops in the mix of noisy, seasonal data surrounded by competitors who are also changing things measured against a moving target of Google algorithm changes. So you need to be testing.

No but it won’t happen to me

Well firstly, I think it will. In classroom experiments, we have found that even experienced SEOs can be no better than a coin flip at telling which of two variants will rank better for a specific keyword. Add in the unknown query space and the hard-to-predict human factor of CTR, and I’m going to bet you are getting this wrong.

Still don’t believe me? Here are some sensible-sounding changes we have rolled out and discovered resulted in significant organic traffic drops:

This is a post written by our Seattle intern, Tammy Yu. Tammy is a recent graduate of the University of Washington who majored in Informatics and is soon to be a full-time Distiller.

According to Pinterest, 72% of survey respondents said Pinterest helps them find ideas for their everyday lives, and 1 out of 2 users have made a purchase after seeing a Promoted Pin. With 200 million users monthly, Pinterest is becoming a powerful tool for ecommerce retailers to connect with their current and potential consumers. In this post, I highlight the buyer’s journey and suggest ways to optimize each step in the funnel on Pinterest: awareness, consideration, and decision.

Awareness Stage

At this stage, potential buyers become aware of a problem or opportunity. It’s the light bulb that lights up inside their heads and they realize there’s a problem. This can happen on or off the platform.

On the platform: a user comes across a repin of a nice houseplant. This sparks interest in a houseplant.

Off the platform: A user sees a houseplant while visiting a friend’s house. This creates a need for a houseplant.

Although we can’t account for awareness that takes place off the platform, we can account for awareness that takes place on the platform.

Think of this stage as a potential domino effect: one user who shares your pin might bring about awareness and interest in another user. To make your pins engagement-worthy, refer to the next stage, consideration, for more tips.

The awareness stage focuses on growing your reach and gaining more exposure for your pins so that the domino effect can occur.

Collaborate on group boards

Group boards are great for brands looking to gain more exposure on their pins. They’re “owned” by one Pinner, though the board shows up in the profiles of all the collaborators. As a collaborator, your pin to a group board is shown to the users that follow that board. You have the potential to gain more exposure and increase your odds of getting repinned.

Install the Pinterest “Save” button on your site

If a visitor on your site finds an item interesting, the pre-installed save button makes it easy for them to save your product and image to their Pinterest board(s). When one person saves a pin to their profile, their followers can also see the pin, potentially creating awareness in another user.

Rather than uploading a pin manually, you can pin any image on your site by using the save button. Tip: pin from your site, rather than repinning from others. According to Pinterest, it makes your pin more credible.

Consideration

At this stage, potential buyers are researching, discovering, and evaluating their options.

Furthering the houseplant example: the potential buyer searches on Pinterest for more ideas of what kind of houseplant to buy.

Pinterest users need to be able to find your pins before they can consider your products as a possible solution. Make your products relevant to the potential buyer, give them the information they need about your product, and show them how your product can solve their problem to help drive a decision - “yes, I want that product!”.

Text Overlay

Text overlay can be a faster way to communicate a CTA (call to action) or short descriptive text, because the user only has to look at your image (which they’re most likely doing already) rather than your pin description. This is also a good place to use your keyword research, to make the text overlay more relevant to the user. I’ll talk more about keyword research below!

If you decide to use text overlay, make sure the text doesn’t cover the entire image or the corners, and especially not important elements or products in the image. Doing so will interfere with visual search and its results. For anyone without fancy shmancy photo editing programs, try Canva.

Conduct keyword research

Conduct keyword research to improve your targeting and increase your pins’ chances of being seen.

Visual Search

Using visual search, take a picture of whatever it is you’re pinning and see what other results come up. You can learn from other competitors or the suggested keyword terms that appear in search.

Pinterest Guided Search

Type in your keyword in the search bar and press enter. See what other search terms show up below the search bar. You can also learn from competitors and other Pinners to see what they’re using in their descriptions.

SEO best practices for Pinterest

Pinterest is a platform that relies heavily on its search abilities. Optimize your Pinterest components to ensure your profile, boards, and pins can be found in user searches. Use your keyword research here!

Profile

Business Account & Verification

If you haven’t already done so, make sure you set up a business account and “claim” your website. Business accounts give you access to Pinterest Analytics. “Claiming” your website will ensure your profile picture will show up on any pins made from your site. According to Pinterest, verified/claimed accounts will have their pins appear higher in the search results.

Profile Description & Website Link

Create a descriptive profile description to let readers know what your company is all about. Make sure to link your website on your profile as well!

Boards & Sections Names

Consider naming your boards and sections something specific and unique, yet also descriptive of the pins they contain. In a sea of boards all titled nearly the same thing, you want to stand out.

Board Descriptions

Use CTAs in descriptions to help the potential buyer understand what they can gain from looking at your board.

Pins

Include keywords in your pin descriptions & img alt tags on website

Pin descriptions are also used as the alt tag for images on Pinterest and vice versa. As a result, a pre-populated pin description will appear when a user saves a pin from your site. When it comes to pin descriptions (and img alt tags), implement the following best practices:

Describe the image accurately.

Include keywords towards the beginning of your descriptions so your pin is more likely to appear in search results.

Use CTAs in descriptions to help the potential buyer understand what they can gain from looking at your pin.

Links should go directly to the product page

Make sure to link your pins back to your site! If your pin is focused on a product, link to that specific product. Consider using Google Analytics’ URL Builder to keep track of your referring traffic source.
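As a sketch of what a tagged product link might look like (the URL, parameter values, and helper function here are hypothetical examples — Google’s URL Builder simply appends standard UTM query parameters like these):

```python
# Sketch: building a UTM-tagged product URL so Pinterest referrals are
# attributed to the right source in Google Analytics. The domain and
# parameter values below are made-up examples.
from urllib.parse import urlencode

def tag_url(base_url, source, medium, campaign):
    """Append the standard UTM campaign parameters to a product page URL."""
    params = urlencode({
        "utm_source": source,      # e.g. 'pinterest'
        "utm_medium": medium,      # e.g. 'social'
        "utm_campaign": campaign,  # e.g. 'houseplants'
    })
    return f"{base_url}?{params}"

link = tag_url("https://example.com/plants/monstera",
               "pinterest", "social", "houseplants")
# link now carries utm_source, utm_medium and utm_campaign parameters
```

Using a consistent source/medium/campaign scheme across all your pins makes it easy to isolate Pinterest traffic in your analytics reports later.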

Show how versatile your product is

Have a product that can be used in different ways (e.g. furniture, apparel)? Post several different pins (with different images) to capture the versatility of your product and how it can be used by different people. This will help the potential buyer understand how the product is relevant to them, and ultimately increase your chances of conversion.

Decision

At this stage, your potential buyer has done all the research and has now made the final decision. Here, your focus is to help your buyer purchase your product, should they decide your product best fits their needs.

To continue with the houseplant example: the potential buyer has decided they want to purchase one of your houseplants.

Make purchasing your products as easy as possible.

Shorten the buying process to make purchasing easy and painless.

Make use of buyable pins

Buyable Pins allow you to sell products on Pinterest. The great news for ecommerce businesses: it’s free! Buyable Pins allow users to shop and checkout directly on the platform, without having to leave.

Directly link your pin to the product page, rather than the home page.

Linking directly to your product will ensure your potential buyer can find exactly the item they were viewing on Pinterest, eliminating the need to search your website to find the product.

Make checkout as easy as possible on your site.

Although I won’t touch too much on this subject since it takes place off Pinterest, do consider the user flow of purchasing a product on your site. Think about what can be improved about this process and follow UX best practices.

Use Rich Pins

Rich Pins provide more information on the pin, in comparison to traditional non-Rich Pins. At the moment, there are four types of Rich Pins: app, product, recipe, and article. Product Rich Pins give your potential buyer all the information they need to make a purchase: pricing, availability, and a link to buy the product.

Final Thoughts

Personally, one of my biggest pain points about getting ideas from Pinterest is how incredibly hard it is to find the original source for products. When I come across something I want to buy, I can almost never find where to actually buy the item. Pinterest has made a lot of improvements to help solve this problem through verified businesses, visual search, product/Rich Pins, and Buyable Pins. If you haven’t already, now is the time for ecommerce businesses to market on Pinterest.

Each year we write a round-up of some of the most exciting projects we’ve worked on over the past 12 months. We’ve continued to create a wide variety of content; never restricted by format. This year’s roundup includes photo stories, maps, mind maps, calendars, a long format article, picture quizzes and a one-button game.

Our successes have come through both press mentions and social engagement. One of our favourites was Advisa’s Brexit Bus, which had over 1,600 Twitter mentions - one being from JK Rowling herself.

It’s been said that imitation is the sincerest form of flattery. The Sun was clearly inspired by our 25 Years of Top Flight Footy Moments (8,400 retweets). And of course, when we find a format that works, we’ve got no qualms with copying ourselves.

Image Stories

Balsam Hill is one of the biggest suppliers of artificial Christmas trees in the US, and they’re certainly at the more luxury end of the niche. Christmas content makes perfect sense for them.

With this piece, we also wanted to open people’s eyes to cultural differences worldwide. We contacted over 80 freelance photographers around the world, all with different tastes, styles, religions and traditions, and asked them to photograph their Christmas dinner. It was a year in the making: we had to gather the content one Christmas to be able to launch it the next.

The photographers took a photo of their Christmas dinner and place setting in a flat-lay style to ensure consistency. We also asked for a photo of the family sitting at the table, with the spread of food in front of them.

In the finished piece, an interactive map allows you to choose a specific country to view, or you can scroll down the article and browse. As well as photography, we included details about recipes, the symbolism of the dish and the traditions that the family enjoy together.

We received coverage from some top-tier publications for this, including The Telegraph, Business Insider and The Mail Online.

Taking one topic and gathering a snapshot of what that means across the world had worked for us before with The View From Here which we created for a window coverings client.

Image Stories - Key takeaways

Produce timely content - Does your series hook into an event or certain time of year?

Commission more people than you need - We hired 80 photographers, which enabled us to cherry-pick the most interesting stories and photography for the final 25 (days of Christmas - see what we did there) used within the piece.

Ask freelancers for more than you need - We could have easily just asked for a photo of the dinner alone, but the family portrait added a human element to the story.

Be clear on your tie-in - Journalists will be reluctant to cover stories that are too tangential in their link to a brand.

Create highly visual assets - These give the journalist elements that are easy to write an article around - you help their article look good.

Maps

Crimson Hexagon create tools to analyse social metrics. We used their customer insights platform to identify the most popular foods and drinks in NYC on Instagram according to hashtags.

This innovative design was created by our other designer Vicke. We did user testing on seemingly small details, such as the clock: to see whether a 24-hour or 12-hour clock was more easily understood. We ended up opting for the latter, choosing colour hues to reflect sunset, sunrise and other phases of the day.

Fortunately, the data could be clearly visualised thanks to the plethora of emojis out there. The final effect, with animated emojis, a rainbow of hues and the turning clock, far outweighed the minimal build time required.

Maps - Key takeaways

Use open source data - Use free data sources and visualise the findings in a way that is not currently available.

Be innovative with your design decisions - The clock face movement made this piece feel original in its execution.

User test to inform decisions - If ever a debate starts about different ways of doing things, if possible I’ll throw a design out to user testing to settle arguments.

Picture quizzes

Bad Rhino is a clothing brand for plus-sized men, so for this brand it makes sense to comment on sport. For the 25th anniversary of the Premier League we created a picture quiz to see if people could recognise the players in the top 25 footy moments. This ended up trending on Twitter (8,400 retweets) and getting ripped off by the Sun. We used the Bad Rhino logo on the background advertising boards within the image, which was viewed thousands of times, improving brand sentiment.

Having to guess 25 moments makes it just hard enough and annoying enough to complete that it drums up conversation around the harder moments depicted.

Picture quizzes - Key takeaways

Cost effective - Picture quizzes don’t include much data, perhaps just relatively light research. The build is simple; the majority of the production costs go towards hiring an illustrator.

Work with a well known illustrator - We worked with Bill McConkey whose style is recognisable and works well for fine detail.

Make the interface simple - This quiz was specifically aimed at mass market sporting publications. Making the design as simple and bold as possible limits the barrier to playing.

Mind Maps

Just Eat are a takeaway app. To tie in with National Curry Week, we fully nerded out on curry. We worked with curry publication Curry Life to create a gigantic mind map which linked 30 curries together. The interactive helped the user explore dishes by how hot, sweet, rich, dry, creamy, nutty or tangy they are.

We worked with a food stylist and the Just Eat team to style and photograph the curries. With food photography you need to put much less food on a smaller plate, so that the individual items, e.g. a slice of lemon or a chunk of tomato, are more easily visible. Who knew!

Mind Maps - Takeaways

Nerd out on things - A comprehensive study of a topic has the potential to become reference material or an evergreen piece of content.

Tie in with a national event - Curry week gave us a hook in terms of timeliness for this piece.

Collaborate with your client - Working with the Just Eat photography studio enabled us to create content that was on brand and of a high production value.

Tool

Back in 2013, we created a map showing the cheapest Michelin-starred lunches in the UK. It was very successful at the time, so we decided to give the concept an international spin for our travel client, Traveloka.

To do this, we used the Michelin website to find the cheapest one and two star restaurants in each country covered by the Michelin guide. We kept the execution simple. A sortable table lists where you can find an à la carte or set menu lunch/dinner for as low as $2.20 in places like Singapore.

As restaurant prices and Michelin listings will inevitably change, this is the sort of piece we can update with new data, potentially gathering more coverage, each year. We have only just started outreach on this piece and already have coverage from USA Today, Lonely Planet, Esquire, The Telegraph, MSN, Yahoo, The Standard, The Daily Mail and Harper's Bazaar, to name a few.

Bloomingdale's owns 54 department stores in the US and wanted to drive links to their wedding registry department. There are a lot of nerves surrounding weddings: 'What will I wear?', 'What venue should I hire?' and, most importantly, the question on everyone's lips, 'Will it rain?'

Using open source data, we looked at historical weather records to predict which day offered the optimal temperature for a wedding. We focused on the 1,000 most populous cities in the US. The tool starts with a simple text field prompting you to input your city. Once you submit, you are given a single date that, according to weather records, is the optimum date to have your wedding. Hover states give more detail on humidity, precipitation, temperature and clear skies.

If people were specifically looking for a wedding in an alternative season, e.g. winter or autumn, those dates were also given lower down the page.
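The underlying logic can be boiled down to averaging each calendar day across several years of weather history and scoring it against an "ideal" wedding day. As a rough sketch, assuming an ideal temperature and a rain penalty that are invented here rather than taken from the actual methodology:

```python
# Sketch of a "best wedding date" lookup from historical weather.
# The ideal temperature and the precipitation weight are assumptions.

from collections import defaultdict
from statistics import mean

IDEAL_TEMP_F = 70  # assumed "perfect" wedding temperature

def best_day(records, ideal=IDEAL_TEMP_F):
    """records: iterable of (month, day, temp_f, precip_inches) tuples
    drawn from several years of history for one city. Averages each
    calendar day across years, then scores it by distance from the
    ideal temperature plus a penalty for rain; lowest score wins."""
    by_day = defaultdict(list)
    for month, day, temp, precip in records:
        by_day[(month, day)].append((temp, precip))

    def score(day_key):
        temps, precips = zip(*by_day[day_key])
        return abs(mean(temps) - ideal) + 10 * mean(precips)

    return min(by_day, key=score)

# Toy history: two years of readings for three candidate dates.
history = [
    (6, 14, 68, 0.0), (6, 14, 72, 0.1),
    (6, 21, 74, 0.0), (6, 21, 71, 0.0),
    (12, 25, 35, 0.2), (12, 25, 30, 0.0),
]
```

Restricting `records` to one season before calling `best_day` would produce the alternative winter or autumn dates shown lower down the page.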

Tool - Takeaways

Make it evergreen - Once a tool like this has been made there is no reason why it can’t be updated as new data comes out, making it ever current.

Play to people's paranoia - This sounds cold, but if you can pique someone's interest by putting a concern in their mind and then reassure them, the overriding brand sentiment will be that you were helpful.

Give one key takeaway - Overwhelming people with information usually means they remember nothing. We boiled this tool down to just one date, with the option to explore more.

Long Format

Simple is the perfect bank for millennials: no physical stores, and tools to help you budget and save. During our ideation we stumbled across the concept of side hustles. Everyone seems to have one, especially millennials. But we wanted to drill a little deeper into the motivation behind extra work outside the usual 9-to-5.

Google search trends for 'side hustle' showed that interest in the topic was increasing, so we conducted a survey around it. The survey had both closed and open-ended questions to gather quantitative and qualitative data.

The types of side hustle were strange (selling shark teeth, donating sperm, breeding fish), and the reasons for having one were more unexpected still. Yes, money was a factor, but overwhelmingly people did it to improve their mental wellbeing.

The page is broken up with photography, quotes, graphs, icons and animations. The key here was pulling out the most interesting findings, and turning what could have been a boring report into something easily digestible, and visually engaging.

When you make your first foray into designer fashion, you often remember the item you bought for years afterwards. It’s that personal feeling we wanted to explore for YOOX, an online fashion store.

To do this, we combined video, photography and editorial to create a long format style article. The personal stories and photography were key, so we picked a selection of dedicated fashionistas to interview about their first designer purchase.

Focusing on how the sustainable jackets, belts and garish tracksuit bottoms made people feel gave an insight into the owners' characters and their lives. The feeling is certainly about the luxury of wanting, not needing.

Long Format - Takeaways

Delight with animation - Visual quirks, movement, something that surprises or delights: these make content lift from the page, bringing it to life.

Grab attention from the start - In a similar vein, a looping intro video can really help to hold attention on the page and entice people to start scrolling.

Cherry pick results - When collating information, we never put everything we find into the piece. Be selective about what you choose to publish, the hierarchy, and how it will shape your story.

High production value matters - Invest in a good camera, producer and editor to make sure your content is polished to a high enough standard.

One Button Games

And last but not least, this one-button game for Advisa. Matt Round is our resident one-button game expert. Advisa is a financial company, so as a client it made sense for them to comment on the state of the pound post-Brexit. Some people on Twitter even commented that this game was the best thing to come out of Brexit!

We received top-tier coverage from The Poke and IB Times, and a personal life goal of mine was achieved when it was featured on It's Nice That! Social was where this one really did us proud, with 13,400 Facebook shares and 1,600 mentions on Twitter.

Big news sticks around for a while - Even though Brexit had been a hot topic for some time, news this big has legs. I think that in the doom and gloom of the UK's bleak outlook, this game managed to add a little joviality and light relief to a serious situation.

One button games - Takeaways

Tie into people's passions - Finance is not a light-hearted topic; often we are trying to create something that can entertain someone quickly on their lunch break. Small games can do just that.

Use big news as a hook - Brexit will affect everyone, and the plummeting strength of the pound is shocking to see.

The details make the difference - Two of my favourite aspects of this piece are the sound design and the bus, which crumples as it crashes. These elements delight.

What have you learnt from your content recently?

As well as successes, we have also had failures. We are always reassessing what it takes to make something sticky, and how to spot a good idea among weaker ones. We would love to hear what is working for you, and how you go about validating your ideas.