
At dotmailer we try our best to keep the bad guys out, but if they already have your password, there is very little we can do to detect, and stop them logging in as you…unless, of course, you have already turned on two-factor authentication (2FA). Two-factor in most cases is something you know (your username/password), and something you have (a single use access code or authentication link).

But how can they get my password in the first place?

There are various ways an attacker may have access to your login details, but some of the possible methods include:

Compromised computer

If the computer you use to log in to your online accounts is infected with malware, it is possible that your keystrokes and even screen captures are being logged and sent back to the bad guys… yep, including your passwords and other authentication details.

Snooping on the network

If an attacker has access to the network from which you are logging on to an online service (e.g. a public Wi-Fi hotspot), in some cases it may be possible to capture the data as it passes to the server… yep, including your password and other authentication details. This is where looking for HTTPS in your browser address bar becomes very important. At dotmailer, all authentication data passes over a secure channel, thus protecting you from this sort of attack.

Credential reuse

It’s really important not to use the same password across different services. We’ve seen an awful lot of very big data breaches in the news recently, and the attackers have been using the stolen authentication details from these breaches to try and log on to other online services…with what seems to be a great deal of success! This sadly means that many people are still using the same password everywhere they go online. This is one of the reasons why your dotmailer password is set to expire, and you are asked for a new one every 90 days; and why you should be choosing something completely different every time. Simply incrementing that number at the end of your password is not cool!

Social Engineering

As we get better at using good passwords and preventing malware infections, sometimes the bad guys just find it easier to ask us for our passwords. At dotmailer, our support team will never contact you asking for your password.

If one of the above unfortunate events were to happen, 2FA adds another layer of defense, as the attacker would also need access to the authentication link or SMS code. In reality that would mean having access to your mailbox or mobile phone. We’ve already seen that it’s possible for an attacker to obtain your password via a compromised computer or network, which is why we would always recommend using an “out-of-band” communication channel such as SMS to deliver the 2FA authentication token where possible. dotmailer offers SMS 2FA to all customers. It’s simple to set up, and it’s free!

Without access to the authentication token, the attacker could of course try and brute force the code, but that is where our other controls such as failed login account lockouts kick in.

How to turn on 2FA in dotmailer

Log in to your account, click the user icon in the top right, and select Account.

In the resulting window, click on the “Account Settings” tab and scroll down to the “Security” section. Simply tick the Two-factor authentication box, enter your mobile phone number, and hit Save Settings at the bottom of the page.

Done! Congratulations, you have just gone one step further in protecting your valuable data.

Now you have protected your dotmailer account, check out TurnOn 2FA and see which of your other online services offer a similar feature, and SWITCH IT ON!

Note: If you are a managed user, you will need to ask your account administrator to do this for you. For obvious security reasons, you will not be able to disable this feature without help from our support team.

While the program was recently revamped, dotmailer has enjoyed a 17-year history of working side-by-side with partners like Magento, Salesforce and Microsoft Dynamics. We value these relationships as an opportunity to help deliver the best marketing strategies that lead to more business for you and your clients. Our new partnership program extends these relationships with the right tools, resources and benefits to help you build, run and grow a profitable agency, marketing or technology reseller business.

Here are the top five questions about the partner program answered:

Who is the partner program for?

Our partner program delivers two types of certification for two distinct types of audiences:

Partners: For example, a marketing agency that serves both B2B and B2C clients and integrates, develops, and executes email and marketing campaigns on their behalf. Partners will work closely with dotmailer to sell and grow their client base with a tool that directly impacts client retention.

Referrers: For example, a shop that wants to refer leads directly to the dotmailer team. A referrer doesn’t handle the sale or management of the account, but still collects commission when a referral signs up for dotmailer.

What are the benefits of becoming a dotmailer partner?

dotmailer is a fast, powerful, and easy-to-use marketing automation platform with email at its core. Our world-class integrations make dotmailer extensible, and suitable for both B2C and B2B marketers alike. Here’s what some of our current partners have to say:

“We have found that dotmailer offers a strong solution. Not only do they cater to retail brands, but they also have a distinct B2B focus, which aligns with the more than 60% of our clients that have a B2B component as part of their ecommerce channels. Leveraging the dotmailer solution makes these conversations more relevant when discussing their marketing needs. As Magento’s Premier email marketing automation provider, they have invested heavily in both the technology and the sales enablement tools we need to win over customers.” – Caleb Bryant, Strategic Alliances Manager at Gorilla Group

“dotmailer enhances and extends our opportunity to bring customers a solution that provides highly personalized, automated and measurable email interactions to their customers to further nurture leads and customer engagement. An additional benefit of dotmailer is the pricing flexibility and geographical reach.” – Motti Danino, VP of Operations, Oro Inc.

How much does it cost?

The dotmailer partner program is free to join and benefits are offered in three tiers: Bronze, Silver and Gold. The benefits include commission, guest blogging, partner case studies, co-hosted webinars, event sponsorship and more. Our main aim is to help partners become more successful and rise through the ranks as they become more fluent in offering the dotmailer platform and services.

What’s coming next?

dotmailer is committed to ensuring our agency partners have the tools at their disposal to continue growing service retainers and effectively sell a best-of-breed email marketing automation platform. Our philosophy has always been to innovate, and we still run in bi-weekly development cycles with quarterly releases. We are constantly innovating on both the platform and our integrations, meaning the partner program will continue to evolve as the dotmailer feature set does.

This post was originally in YouMoz, and was promoted to the main blog because it provides great value and interest to our community. The author’s views are entirely his or her own and may not reflect the views of Moz, Inc.

The spam in Google Analytics (GA) is becoming a serious issue. Due to a deluge of referral spam from social buttons, adult sites, and many, many other sources, people are starting to become overwhelmed by all the filters they are setting up to manage the useless data they are receiving.

The good news is, there is no need to panic. In this post, I’m going to focus on the most common mistakes people make when fighting spam in GA, and explain an efficient way to prevent it.

But first, let’s make sure we understand how spam works. A couple of months ago, Jared Gardner wrote an excellent article explaining what referral spam is, including its intended purpose. He also pointed out some great examples of referral spam.

Types of spam

The spam in Google Analytics can be categorized into two types: ghosts and crawlers.

Ghosts

The vast majority of spam is this type. They are called ghosts because they never access your site. It is important to keep this in mind, as it’s key to creating a more efficient solution for managing spam.

As unusual as it sounds, this type of spam doesn’t have any interaction with your site at all. You may wonder how that is possible since one of the main purposes of GA is to track visits to our sites.

They do it by using the Measurement Protocol, which allows people to send data directly to Google Analytics’ servers. Using this method, and probably randomly generated tracking codes (UA-XXXXX-1) as well, the spammers leave a “visit” with fake data, without even knowing who they are hitting.
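To make the mechanics concrete, here is a minimal sketch of how such a ghost hit is assembled. Everything below is a placeholder (the tracking ID is the same illustrative `UA-XXXXX-1` pattern mentioned above, and nothing is actually sent); the point is simply that the payload can be built and delivered without ever touching the target website.

```python
# Illustration only: how a "ghost" hit is constructed. The payload goes
# straight to GA's collection endpoint; the spammer never visits the site.
from urllib.parse import urlencode

payload = urlencode({
    "v": "1",                              # Measurement Protocol version
    "tid": "UA-XXXXX-1",                   # guessed/randomized tracking ID
    "cid": "555",                          # arbitrary client ID
    "t": "pageview",                       # hit type
    "dp": "/fake-page",                    # fabricated page path
    "dr": "http://spam-referrer.example",  # fabricated referrer
})
endpoint = "https://www.google-analytics.com/collect"
# A spammer would POST `payload` to `endpoint`; no real hostname is needed,
# which is exactly why hostname-based filtering (below) works against ghosts.
print(payload)
```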

Crawlers

This type of spam, the opposite of ghost spam, does access your site. As the name implies, these spam bots crawl your pages, ignoring rules like those found in robots.txt that are supposed to stop them from reading your site. When they exit your site, they leave a record on your reports that appears similar to a legitimate visit.

Crawlers are harder to identify because they know their targets and use real data. But it is also true that new ones seldom appear. So if you detect a referral in your analytics that looks suspicious, researching it on Google or checking it against this list might help you answer the question of whether or not it is spammy.

Most common mistakes made when dealing with spam in GA

I’ve been following this issue closely for the last few months. According to the comments people have made on my articles and conversations I’ve found in discussion forums, there are primarily three mistakes people make when dealing with spam in Google Analytics.

Mistake #1. Blocking ghost spam from the .htaccess file

One of the biggest mistakes people make is trying to block Ghost Spam from the .htaccess file.

For those who are not familiar with this file, one of its main functions is to allow/block access to your site. Now we know that ghosts never reach your site, so adding them here won’t have any effect and will only add useless lines to your .htaccess file.

Ghost spam usually shows up for a few days and then disappears. As a result, sometimes people think that they successfully blocked it from here when really it’s just a coincidence of timing.

Then when the spammers later return, they get worried because the solution is not working anymore, and they think the spammer somehow bypassed the barriers they set up.

The truth is, the .htaccess file can only effectively block crawlers such as buttons-for-website.com and a few others since these access your site. Most of the spam can’t be blocked using this method, so there is no other option than using filters to exclude them.

Mistake #2. Using the referral exclusion list to stop spam

Another error is trying to use the referral exclusion list to stop the spam. The name may confuse you, but this list is not intended to exclude referrals in the way we want to for the spam. It has other purposes.

For example, when a customer buys something, sometimes they get redirected to a third-party page for payment. After making a payment, they’re redirected back to your website, and GA records that as a new referral. It is appropriate to use the referral exclusion list to prevent this from happening.

If you try to use the referral exclusion list to manage spam, however, the referral part will be stripped since there is no preexisting record. As a result, a direct visit will be recorded, and you will have a bigger problem than the one you started with: you will still have spam, and direct visits are harder to track.

Mistake #3. Worrying that bounce rate changes will affect rankings

When people see that the bounce rate changes drastically because of the spam, they start worrying about the impact that it will have on their rankings in the SERPs.

This is another mistake commonly made. With or without spam, Google doesn’t take into consideration Google Analytics metrics as a ranking factor. Here is an explanation about this from Matt Cutts, the former head of Google’s web spam team.

And if you think about it, Cutts’ explanation makes sense; because although many people have GA, not everyone uses it.

Assuming your site has been hacked

Another common concern when people see strange landing pages coming from spam on their reports is that they have been hacked.

The page that the spam shows on the reports doesn’t exist, and if you try to open it, you will get a 404 page. Your site hasn’t been compromised.

But you have to make sure the page doesn’t exist, because there are cases (not spam) where some sites have a security breach and get injected with pages full of bad keywords to defame the website.

What should you worry about?

Now that we’ve discarded security issues and their effects on rankings, the only thing left to worry about is your data. The fake trail that the spam leaves behind pollutes your reports.

Small and midsize sites are the most easily impacted – not only because a big part of their traffic can be spam, but also because usually these sites are self-managed and sometimes don’t have the support of an analyst or a webmaster.

Big sites with a lot of traffic can also be impacted by spam, and although the impact can be insignificant, invalid traffic means inaccurate reports no matter the size of the website. As an analyst, you should be able to explain what’s going on even in the most granular reports.

You only need one filter to deal with ghost spam

Usually it is recommended to add the referral to an exclusion filter after it is spotted. Although this is useful for a quick action against the spam, it has three big disadvantages.

Making filters every week for every new spam source detected is tedious and time-consuming, especially if you manage many sites. Plus, by the time you apply the filter and it starts working, you already have some affected data.

Some of the spammers use direct visits along with the referrals.

These direct hits won’t be stopped by the filter, so even if you are excluding the referral you will still be receiving invalid traffic, which explains why some people have seen an unusual spike in direct traffic.

Luckily, there is a good way to prevent all these problems. Most of the spam (ghost) works by hitting GA’s random tracking-IDs, meaning the offender doesn’t really know who is the target, and for that reason either the hostname is not set or it uses a fake one. (See report below)

You can see that they use some weird names or don’t even bother to set one. Although there are some known names in the list, these can be easily added by the spammer.

On the other hand, valid traffic will always use a real hostname. In most cases, this will be the domain. But it can also result from paid services, translation services, or any other place where you’ve inserted your GA tracking code.

Based on this, we can make a filter that will include only hits that use real hostnames. This will automatically exclude all hits from ghost spam, whether it shows up as a referral, keyword, or pageview; or even as a direct visit.

To create this filter, you will need to find the report of hostnames. Here’s how:

Go to the Reporting tab in GA

Click on Audience in the lefthand panel

Expand Technology and select Network

At the top of the report, click on Hostname

You will see a list of all hostnames, including the ones that the spam uses. Make a list of all the valid hostnames you find, as follows:

yourmaindomain.com

blog.yourmaindomain.com

es.yourmaindomain.com

payingservice.com

translatetool.com

anotheruseddomain.com

For small to medium sites, this list of hostnames will likely consist of the main domain and a couple of subdomains. After you are sure you got all of them, create a regular expression similar to this one:
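Based on the example hostname list above, the expression would look something like the pattern below. It is sketched here in Python so you can sanity-check it against your own hostname report before pasting it into GA; the domains are the placeholders from the list, so swap in your own.

```python
import re

# Include-filter pattern built from the example hostnames above.
# Dots are escaped; subdomains match automatically because this is an
# unanchored search for the main domain.
valid_hostname = re.compile(
    r"yourmaindomain\.com|payingservice\.com|translatetool\.com|anotheruseddomain\.com"
)

print(bool(valid_hostname.search("blog.yourmaindomain.com")))    # subdomain of a valid host
print(bool(valid_hostname.search("free-social-buttons.example")))  # ghost spam: no match
```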

You don’t need to put all of your subdomains in the regular expression. The main domain will match all of them. If you don’t have a view set up without filters, create one now.

Then create a Custom Filter.

Make sure you select INCLUDE, then select “Hostname” on the filter field, and copy your expression into the Filter Pattern box.

You might want to verify the filter before saving to check that everything is okay. Once you’re ready, set it to save, and apply the filter to all the views you want (except the view without filters).

This single filter will get rid of future occurrences of ghost spam that use invalid hostnames, and it doesn’t require much maintenance. But it’s important that every time you add your tracking code to any service, you add it to the end of the filter.

Now you should only need to take care of the crawler spam. Since crawlers access your site, you can block them by adding these lines to the .htaccess file:
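As a sketch of what those lines could look like: this uses buttons-for-website.com (named earlier in the post) plus a hypothetical second domain as examples. Adjust the list to the crawler referrers you have actually identified in your reports.

```apache
<IfModule mod_rewrite.c>
  RewriteEngine On
  # One RewriteCond per crawler referrer; keep [OR] on every line except the last.
  RewriteCond %{HTTP_REFERER} buttons-for-website\.com [NC,OR]
  RewriteCond %{HTTP_REFERER} another-crawler-domain\.example [NC]
  # Return 403 Forbidden to matching referrers.
  RewriteRule .* - [F]
</IfModule>
```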

It is important to note that this file is very sensitive, and misplacing a single character in it can bring down your entire site. Therefore, make sure you create a backup copy of your .htaccess file prior to editing it.

If you don’t feel comfortable messing around with your .htaccess file, you can alternatively make an expression with all the crawlers and add it to an exclude filter by Campaign Source.

Implement these combined solutions, and you will worry much less about spam contaminating your analytics data. This will have the added benefit of freeing up more time for you to spend actually analyzing your valid data.

In closing, I am eager to hear your ideas on this serious issue. Please share them in the comments below.

(Editor’s Note: All images featured in this post were created by the author.)


There’s no doubt that quite a bit has changed about SEO, and that the field is far more integrated with other aspects of online marketing than it once was. In today’s Whiteboard Friday, Rand pushes back against the idea that effective modern SEO doesn’t require any technical expertise, outlining a fantastic list of technical elements that today’s SEOs need to know about in order to be truly effective.

For reference, here’s a still of this week’s whiteboard.

Video transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week I’m going to do something unusual. I don’t usually point out these inconsistencies or sort of take issue with other folks’ content on the web, because I generally find that that’s not all that valuable and useful. But I’m going to make an exception here.

There is an article by Jayson DeMers, who I think might actually be here in Seattle — maybe he and I can hang out at some point — called “Why Modern SEO Requires Almost No Technical Expertise.” It was an article that got a shocking amount of traction and attention. On Facebook, it has thousands of shares. On LinkedIn, it did really well. On Twitter, it got a bunch of attention.

Some folks in the SEO world have already pointed out some issues around this. But because of the increasing popularity of this article, and because I think there’s, like, this hopefulness from worlds outside of kind of the hardcore SEO world that are looking to this piece and going, “Look, this is great. We don’t have to be technical. We don’t have to worry about technical things in order to do SEO.”

Look, I completely get the appeal of that. I did want to point out some of the reasons why this is not so accurate. At the same time, I don’t want to rain on Jayson, because I think that it’s very possible he’s writing an article for Entrepreneur, maybe he has sort of a commitment to them. Maybe he had no idea that this article was going to spark so much attention and investment. He does make some good points. I think it’s just really the title and then some of the messages inside there that I take strong issue with, and so I wanted to bring those up.

First off, some of the good points he did bring up.

One, he wisely says, “You don’t need to know how to code or to write and read algorithms in order to do SEO.” I totally agree with that. If today you’re looking at SEO and you’re thinking, “Well, am I going to get more into this subject? Am I going to try investing in SEO? But I don’t even know HTML and CSS yet.”

Those are good skills to have, and they will help you in SEO, but you don’t need them. Jayson’s totally right. You don’t have to have them, and you can learn and pick up some of these things, and do searches, watch some Whiteboard Fridays, check out some guides, and pick up a lot of that stuff later on as you need it in your career. SEO doesn’t have that hard requirement.

And secondly, he makes an intelligent point that we’ve made many times here at Moz, which is that, broadly speaking, a better user experience is well correlated with better rankings.

You make a great website that delivers great user experience, that provides the answers to searchers’ questions and gives them extraordinarily good content, way better than what’s out there already in the search results, generally speaking you’re going to see happy searchers, and that’s going to lead to higher rankings.

But not entirely. There are a lot of other elements that go in here. So I’ll bring up some frustrating points around the piece as well.

First off, there’s no acknowledgment — and I find this a little disturbing — that the ability to read and write code, or even HTML and CSS, which I think are the basic place to start, is helpful or can take your SEO efforts to the next level. I think both of those things are true.

So being able to look at a web page, view source on it, or pull up Firebug in Firefox or something and diagnose what’s going on and then go, “Oh, that’s why Google is not able to see this content. That’s why we’re not ranking for this keyword or term, or why even when I enter this exact sentence in quotes into Google, which is on our page, this is why it’s not bringing it up. It’s because it’s loading it after the page from a remote file that Google can’t access.” These are technical things, and being able to see how that code is built, how it’s structured, and what’s going on there, very, very helpful.

Some coding knowledge also can take your SEO efforts even further. I mean, so many times, SEOs are stymied by the conversations that we have with our programmers and our developers and the technical staff on our teams. When we can have those conversations intelligently, because at least we understand the principles of how an if-then statement works, or what software engineering best practices are being used, or they can upload something into a GitHub repository, and we can take a look at it there, that kind of stuff is really helpful.

Secondly, I don’t like that the article overly reduces all of this information that we have about what we’ve learned about Google. So he mentions two sources. One is things that Google tells us, and others are SEO experiments. I think both of those are true. Although I’d add that there’s sort of a sixth sense of knowledge that we gain over time from looking at many, many search results and kind of having this feel for why things rank, and what might be wrong with a site, and getting really good at that using tools and data as well. There are people who can look at Open Site Explorer and then go, “Aha, I bet this is going to happen.” They can look, and 90% of the time they’re right.

So he boils this down to, one, write quality content, and two, reduce your bounce rate. Neither of those things are wrong. You should write quality content, although I’d argue there are lots of other forms of quality content that aren’t necessarily written — video, images and graphics, podcasts, lots of other stuff.

And secondly, that just doing those two things is not always enough. So you can see, like many, many folks look and go, “I have quality content. It has a low bounce rate. How come I don’t rank better?” Well, your competitors, they’re also going to have quality content with a low bounce rate. That’s not a very high bar.

Also, frustratingly, this really gets in my craw. I don’t think “write quality content” means anything. You tell me. When you hear that, to me that is a totally non-actionable, non-useful phrase that’s a piece of advice that is so generic as to be discardable. So I really wish that there was more substance behind that.

The article also makes, in my opinion, the totally inaccurate claim that modern SEO really is reduced to “the happier your users are when they visit your site, the higher you’re going to rank.”

Wow. Okay. Again, I think broadly these things are correlated. User happiness and rank is broadly correlated, but it’s not a one to one. This is not like a, “Oh, well, that’s a 1.0 correlation.”

I would guess that the correlation is probably closer to like the page authority range. I bet it’s like 0.35 or something correlation. If you were to actually measure this broadly across the web and say like, “Hey, were you happier with result one, two, three, four, or five,” the ordering would not be perfect at all. It probably wouldn’t even be close.

There’s a ton of reasons why sometimes someone who ranks on Page 2 or Page 3 or doesn’t rank at all for a query is doing a better piece of content than the person who does rank well or ranks on Page 1, Position 1.

Then the article suggests five and sort of a half steps to successful modern SEO, which I think is a really incomplete list. So Jayson gives us:

Good on-site experience

Writing good content

Getting others to acknowledge you as an authority

Rising in social popularity

Earning local relevance

Dealing with modern CMS systems (which he notes most modern CMS systems are SEO-friendly)

The thing is there’s nothing actually wrong with any of these. They’re all, generally speaking, correct, either directly or indirectly related to SEO. The one about local relevance, I have some issue with, because he doesn’t note that there’s a separate algorithm for sort of how local SEO is done and how Google ranks local sites in maps and in their local search results. Also not noted is that rising in social popularity won’t necessarily directly help your SEO, although it can have indirect and positive benefits.

I feel like this list is super incomplete. Okay, I brainstormed just off the top of my head in the 10 minutes before we filmed this video a list. The list was so long that, as you can see, I filled up the whole whiteboard and then didn’t have any more room. I’m not going to bother to erase and go try and be absolutely complete.

But there’s a huge, huge number of things that are important, critically important for technical SEO. If you don’t know how to do these things, you are sunk in many cases. You can’t be an effective SEO analyst, or consultant, or in-house team member, because you simply can’t diagnose the potential problems, rectify those potential problems, identify strategies that your competitors are using, be able to diagnose a traffic gain or loss. You have to have these skills in order to do that.

I’ll run through these quickly, but really the idea is just that this list is so huge and so long that I think it’s very, very, very wrong to say technical SEO is behind us. I almost feel like the opposite is true.

We have to be able to understand things like:

Content rendering and indexability

Crawl structure, internal links, JavaScript, Ajax. If something’s post-loading after the page and Google’s not able to index it, or there are links that are accessible via JavaScript or Ajax, maybe Google can’t necessarily see those or isn’t crawling them as effectively, or is crawling them, but isn’t assigning them as much link weight as they might be assigning other stuff, and you’ve made it tough to link to them externally, and so they can’t crawl it.

Disabling crawling and/or indexing of thin or incomplete or non-search-targeted content. We have a bunch of search results pages. Should we use rel=prev/next? Should we robots.txt those out? Should we disallow from crawling with meta robots? Should we rel=canonical them to other pages? Should we exclude them via the protocols inside Google Webmaster Tools, which is now Google Search Console?

Managing redirects, domain migrations, content updates. A new piece of content comes out, replacing an old piece of content, what do we do with that old piece of content? What’s the best practice? It varies by different things. We have a whole Whiteboard Friday about the different things that you could do with that. What about a big redirect or a domain migration? You buy another company and you’re redirecting their site to your site. You have to understand things about subdomain structures versus subfolders, which, again, we’ve done another Whiteboard Friday about that.

Proper error codes, downtime procedures, and not found pages. If your 404 pages turn out to all be 200 pages, well, now you’ve made a big error there, and Google could be crawling tons of 404 pages that they think are real pages, because you’ve made it a status code 200, or you’ve used a 404 code when you should have used a 410, which is a permanently removed, to be able to get it completely out of the indexes, as opposed to having Google revisit it and keep it in the index.

Downtime procedures. So there’s specifically a… I can’t even remember. It’s a 5xx code that you can use. Maybe it was a 503 or something that you can use that’s like, “Revisit later. We’re having some downtime right now.” Google urges you to use that specific code rather than using a 404, which tells them, “This page is now an error.”

Disney had that problem a while ago, if you guys remember, where they 404ed all their pages during an hour of downtime, and then their homepage, when you searched for Disney World, was, like, “Not found.” Oh, jeez, Disney World, not so good.
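The distinction between those status codes can be sketched as a toy handler (this is my illustration, not anything from the video, and the paths are made up): a permanently removed page should answer 410 so it drops out of the index, an unknown URL should be a real 404, and planned downtime should return 503 with a Retry-After header so crawlers come back later instead of treating pages as errors.

```python
from http.server import BaseHTTPRequestHandler

REMOVED = {"/old-campaign"}   # hypothetical pages taken down for good
MAINTENANCE = False           # flip on during planned downtime

class StatusHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if MAINTENANCE:
            # 503: "revisit later" -- crawlers keep the pages indexed.
            self.send_response(503)
            self.send_header("Retry-After", "3600")
        elif self.path in REMOVED:
            self.send_response(410)   # Gone: remove from the index
        elif self.path == "/":
            self.send_response(200)
        else:
            self.send_response(404)   # Not Found: a genuine error page
        self.end_headers()

    def log_message(self, *args):     # silence per-request logging
        pass
```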

International and multi-language targeting issues. I won’t go into that. But you have to know the protocols there. Duplicate content, syndication, scrapers. How do we handle all that? Somebody else wants to take our content, put it on their site, what should we do? Someone’s scraping our content. What can we do? We have duplicate content on our own site. What should we do?

Diagnosing traffic drops via analytics and metrics. Being able to look at a rankings report, being able to look at analytics connecting those up and trying to see: Why did we go up or down? Did we have less pages being indexed, more pages being indexed, more pages getting traffic less, more keywords less?

Understanding advanced search parameters. Today, just today, I was checking out the related parameter in Google, which is fascinating for most sites. Well, for Moz, weirdly, related:oursite.com shows nothing. But for virtually every other site, well, most other sites on the web, it does show some really interesting data, and you can see how Google is connecting up, essentially, intentions and topics from different sites and pages, which can be fascinating, could expose opportunities for links, could expose understanding of how they view your site versus your competition or who they think your competition is.

Then there are tons of parameters, like in URL and in anchor, and da, da, da, da. In anchor doesn’t work anymore, never mind about that one.

I have to go faster, because we’re just going to run out of these. Like, come on. Interpreting and leveraging data in Google Search Console. If you don’t know how to use that, Google could be telling you, you have all sorts of errors, and you don’t know what they are.

Leveraging topic modeling and extraction. Using all these cool tools that are coming out for better keyword research and better on-page targeting. I talked about a couple of those at MozCon, like MonkeyLearn. There’s the new Moz Context API, which will be coming out soon, around that. There’s the Alchemy API, which a lot of folks really like and use.

Identifying and extracting opportunities based on site crawls. You run a Screaming Frog crawl on your site and you’re going, “Oh, here’s all these problems and issues.” If you don’t have these technical skills, you can’t diagnose that. You can’t figure out what’s wrong. You can’t figure out what needs fixing, what needs addressing.

Using rich snippet format to stand out in the SERPs. This is just getting a better click-through rate, which can seriously help your site and obviously your traffic.

Applying Google-supported protocols like rel=canonical, meta description, rel=prev/next, hreflang, robots.txt, meta robots, X-Robots-Tag, NOODP, XML sitemaps, rel=nofollow. The list goes on and on and on. If you’re not technical, if you don’t know what those are and think you just need to write good content and lower your bounce rate, it’s not going to work.
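Most of the protocols in that list are just HTML tags or plain-text files, so you can check them programmatically. As a small sketch (the example.com URLs are made up), this pulls `rel=canonical` and `hreflang` annotations out of a page using only the Python standard library:

```python
from html.parser import HTMLParser

class LinkTagParser(HTMLParser):
    """Collect rel=canonical and hreflang annotations from <link> tags."""

    def __init__(self):
        super().__init__()
        self.canonical = None
        self.hreflang = {}  # language code -> alternate URL

    def handle_starttag(self, tag, attrs):
        if tag != "link":
            return
        a = dict(attrs)
        if a.get("rel") == "canonical":
            self.canonical = a.get("href")
        elif a.get("rel") == "alternate" and "hreflang" in a:
            self.hreflang[a["hreflang"]] = a.get("href")

page = """<head>
  <link rel="canonical" href="https://example.com/widgets/">
  <link rel="alternate" hreflang="en-us" href="https://example.com/widgets/">
  <link rel="alternate" hreflang="de" href="https://example.com/de/widgets/">
</head>"""

parser = LinkTagParser()
parser.feed(page)
print(parser.canonical)       # the URL engines should treat as the original
print(parser.hreflang["de"])  # the German-language alternate
```

Run something like this over a crawl of your own pages and missing or conflicting canonical/hreflang tags show up immediately.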

Using APIs from services like AdWords or MozScape, or Ahrefs or Majestic, or SEMrush or Searchmetrics, or the Alchemy API. Those APIs can do powerful things for your site. There are some powerful problems they could help you solve if you know how to use them. It’s actually not that hard to write something, even inside a Google Doc or Excel, to pull from an API and get some data in there. There’s a bunch of good tutorials out there. Richard Baxter has one, Annie Cushing has one, I think Distilled has some. So really cool stuff there.
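To illustrate how little code that takes: the endpoint, auth scheme, and response fields below are hypothetical (every provider's API differs, so check the docs of the one you use), but pulling JSON from a metrics API and sorting it really is only a few lines:

```python
import json
from urllib.request import Request, urlopen

def fetch_json(url, api_key):
    """Pull one JSON payload from an HTTP API and return it as Python data.

    Bearer-token auth is shown as a common pattern, not any specific
    provider's scheme.
    """
    req = Request(url, headers={"Authorization": f"Bearer {api_key}"})
    with urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))

def top_pages_by_links(rows, n=5):
    """Sort API rows by a (hypothetical) 'links' field, highest first."""
    return sorted(rows, key=lambda r: r.get("links", 0), reverse=True)[:n]

# What a response *might* look like, canned so the example runs offline:
sample = json.loads('[{"url": "/a", "links": 120}, {"url": "/b", "links": 340}]')
print(top_pages_by_links(sample)[0]["url"])  # "/b" has the most links
```

From there it is one more step to dump the rows into a spreadsheet, which is exactly the Google Doc/Excel workflow the tutorials mentioned above walk through.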

Diagnosing page load speed issues, which goes right to what Jayson was talking about. You need that fast-loading page. Well, if you don’t have any technical skills, you can’t figure out why your page might not be loading quickly.
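A first-pass diagnostic can be as simple as timing the server response. This sketch measures a rough time-to-first-byte; a real audit would break out DNS, connection, server, and render time separately (browser dev tools or WebPageTest do that for you):

```python
import time
from urllib.request import urlopen

def time_to_first_byte(url):
    """Seconds from issuing the request until the first body byte arrives.

    A crude single-number diagnostic: a high TTFB usually points at slow
    server-side work rather than heavy page assets.
    """
    start = time.monotonic()
    with urlopen(url) as resp:
        resp.read(1)  # block until the body starts arriving
        return time.monotonic() - start
```

Run it a few times against your own pages and a known-fast site for comparison; anything consistently slow on the server side deserves investigation before you touch images or scripts.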

Diagnosing mobile friendliness issues

Advising app developers on the new protocols around App deep linking, so that you can get the content from your mobile apps into the web search results on mobile devices. Awesome. Super powerful. Potentially crazy powerful, as mobile search is becoming bigger than desktop.

Okay, I’m going to take a deep breath and relax. I don’t know Jayson’s intention, and in fact, if he were in this room, he’d be like, “No, I totally agree with all those things. I wrote the article in a rush. I had no idea it was going to be big. I was just trying to make the broader points around you don’t have to be a coder in order to do SEO.” That’s completely fine.

So I’m not going to try and rain criticism down on him. But I think if you’re reading that article, or you’re seeing it in your feed, or your clients are, or your boss is, or other folks are in your world, maybe you can point them to this Whiteboard Friday and let them know, no, that’s not quite right. There’s a ton of technical SEO that is required in 2015 and will be for years to come, I think, that SEOs have to have in order to be effective at their jobs.

All right, everyone. Look forward to some great comments, and we’ll see you again next time for another edition of Whiteboard Friday. Take care.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

It’s been another wild year in search marketing. Mobilegeddon crushed our Twitter streams, but not our dreams, and Matt Cutts stepped out of the spotlight to make way for an uncertain Google future. Pandas and Penguins continue to torment us, but most days, like anyone else, we were just trying to get the job done and earn a living.

This year, over 3,600 brave souls, each one more intelligent and good-looking than the last, completed our survey. While the last survey was technically “2014”, we collected data for it in late 2013, so the 2015 survey reflects about 18 months of industry changes.

A few highlights

Let’s dig in. Almost half (49%) of our 2015 respondents involved in search marketing were in-house marketers. In-house teams still tend to be small – 71% of our in-house marketers reported only 1-3 people in their company being involved in search marketing at least quarter-time. These teams do have substantial influence, though, with 86% reporting that they were involved in purchasing decisions.

Agency search marketers reported larger teams and more diverse responsibilities. More than one-third (36%) of agency marketers in our survey reported working with more than 20 clients in the previous year. Agencies covered a wide range of services, with the top 5 being:

More than four-fifths (81%) of agency respondents reported providing both SEO and SEM services for clients. Please note that respondents could select more than one service/tool/etc., so the charts in this post will not add up to 100%.

The vast majority of respondents (85%) reported being directly involved with content marketing, which was on par with 2014. Nearly two-thirds (66%) of agency content marketers reported “Content for SEO purposes” as their top activity, although “Building Content Strategy” came in a solid second at 44% of respondents.

Top tools

Where do we get such wonderful toys? We marketers love our tools, so let’s take a look at the Top 10 tools across a range of categories. Please note that this survey was conducted here on Moz, and our audience certainly has a pro-Moz slant.

Up first, here are the Top 10 SEO tools in our survey:

Just like last time, Google Webmaster Tools (now “Search Console”) leads the way. Moz Pro and Majestic slipped a little bit, and Firebug fell out of the Top 10. The core players remained fairly stable.

Here are the Top 10 Content tools in our survey:

Even with its uncertain future, Google Alerts continues to be widely used. There are a lot of newcomers to the content tools world, so year-over-year comparisons are tricky. Expect even more players in this market in the coming year.

Following are our respondents’ Top 10 analytics tools:

For an industry that complains about Google so much, we sure do seem to love their stuff. Google Analytics dominates, crushing the enterprise players, at least in the mid-market. KISSmetrics gained solid ground (from the #10 spot last time), while home-brewed tools slipped a bit. CrazyEgg and WordPress Stats remain very popular since our last survey.

Finally, here are the Top 10 social tools used by our respondents:

Facebook Insights and Hootsuite retained the top spots from last year, but newcomer Twitter Analytics rocketed into the #3 position. LinkedIn Insights emerged as a strong contender, too. Overall usage of all social tools increased. Tweetdeck held the #6 spot in 2014 with 19% usage, but dropped to #10 this year even though its usage bumped up slightly to 20%.

Of course, digging into social tools naturally raises the question of which social networks are at the top of our lists.

The Top 6 are unchanged since our last survey, and it’s clear that the barriers to entry to compete with the big social networks are only getting higher. Instagram doubled its usage (from 11% of respondents last time), but this still wasn’t enough to overtake Pinterest. Reddit and Quora saw steady growth, and StumbleUpon slipped out of the Top 10.

Top activities

So, what exactly do we do with these tools and all of our time? Across all online marketers in our survey, the Top 5 activities were:

For in-house marketers, “Site Audits” dropped to the #6 position and “Brand Strategy” jumped up to the #3 spot. Naturally, in-house marketers have more resources to focus on strategy.

For agencies and consultants, “Site Audits” bumped up to #2, and “Managing People” pushed down social media to take the #5 position. Larger agency teams require more traditional people wrangling.

Here’s a much more detailed breakdown of how we spend our time in 2015:

Demand for CRO is growing at a steady clip, but analytics still leads the way. Both “Content Creation” (#2) and “Content Curation” (#6) showed solid demand increases.

Some categories reported both gains and losses – 30% of respondents reported increased demand for “Link Building”, while 20% reported decreased demand. Similarly, 20% reported increased demand for “Link Removal”, while almost as many (17%) reported decreased demand. This may be a result of overall demand shifts, or it may represent more specialization by agencies and consultants.

What’s in store for 2016?

It’s clear that our job as online marketers is becoming more diverse, more challenging, and more strategic. We have to have a command of a wide array of tools and tactics, and that’s not going to slow down any time soon. On the bright side, companies are more aware of what we do, and they’re more willing to spend the money to have it done. Our evolution has barely begun as an industry, and you can expect more changes and growth in the coming year.

Raw data download

If you’d like to take a look through the raw results from this year’s survey (we’ve removed identifying information like email addresses from all responses), we’ve got that for you here:


Today’s post focuses on a vision for your online presence. This vision outlines what it takes to be the best, both from an overall reputation and visibility standpoint, as well as an SEO point of view. The reason these are tied together is simple: Your overall online reputation and visibility is a huge factor in your SEO. Period. Let’s start by talking about why.

Core ranking signals

For purposes of this post, let’s define three cornerstone ranking signals that most everyone agrees on:

Links

Links remain a huge factor in overall ranking. Both Cyrus Shepard and Marcus Tober re-confirmed this on the Periodic Table of SEO Ranking Factors session at the SMX Advanced conference in Seattle this past June.

On-page content

On-page content remains a huge factor too, but with some subtleties now thrown in. I wrote about some of this in earlier posts I did on Moz about Term Frequency and Inverse Document Frequency. Suffice it to say that on-page content is about a lot more than pure words on the page, but also includes the supporting pages that you link to.
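For the curious, TF-IDF itself is simple to compute. This is one common formulation (raw term frequency times log inverse document frequency); real systems vary in smoothing and normalization, and the toy corpus here is made up:

```python
import math
from collections import Counter

def tf_idf(term, doc_tokens, corpus):
    """Score how important `term` is to one document, discounted by how
    common the term is across the whole corpus.

    doc_tokens: list of words in the document being scored
    corpus: list of token lists, one per document (the term should
            appear in at least one of them)
    """
    tf = Counter(doc_tokens)[term] / len(doc_tokens)
    docs_with_term = sum(1 for d in corpus if term in d)
    idf = math.log(len(corpus) / docs_with_term)
    return tf * idf

docs = [
    "oil filter change guide".split(),
    "oil prices rise again".split(),
    "best oil filter wrench".split(),
]

# "oil" appears in every document, so its IDF (and score) is zero;
# "filter" appears in only two, so it scores higher for the first doc.
print(tf_idf("filter", docs[0], docs) > tf_idf("oil", docs[0], docs))  # True
```

The intuition matches the on-page point above: words that are everywhere carry little topical signal, while words concentrated in one document define what that page is about.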

User engagement with your site

This is not one of the traditional SEO signals from the early days of SEO, but most advanced SEO pros that I know consider it a real factor these days. One of the most popular concepts people talk about is called pogo-sticking, which is illustrated here:

New, lesser-known signals

OK, so these are the more obvious signals, but now let’s look more broadly at the overall web ecosystem and talk about other types of ranking signals. Be warned that some of these signals may be indirect, but that just doesn’t matter. In fact, my first example below is an indirect factor which I will use to demonstrate why whether a signal is direct or indirect is not an issue at all.

Let me illustrate with an example. Say you spend $1 billion building a huge brand around a product that is massively useful to people. Included in this is a sizable $100 million campaign to support a highly popular charitable foundation, and your employees regularly donate time to help out in schools across your country. In short, the great majority of people love your brand.

Do you think this will impact the way people link to your site? Of course it does. Do you think it will impact how likely people are to be satisfied with the quality of the pages of your site? Consider this A/B test scenario of 2 pages from different “brands” (for the one on the left, imagine the image of Coca-Cola or Pepsi Cola, whichever one you prefer):

Do you think that the huge brand will get the benefit of the doubt on their page that the no-name brand does not, even though the pages are identical? Of course they will. Now let’s look at some simpler scenarios that don’t involve a $1 billion investment.

1. Cover major options related to a product or service on “money pages”

Imagine that a user arrives on your auto parts site after searching on the phrase “oil filter” at Google or Bing. Chances are pretty good that they want an oil filter, but here are some other items they may also want:

A guide to picking the right filter for their car

Oil

An oil filter wrench

A drainage pan to drain the old oil into

This is just the basics, right? But you would be surprised by how many sites don’t include links or information on directly related products on their money pages. This type of smart site and page design can have a major impact on user engagement with the money pages of your site.

2. Include other related links on money pages

In the prior item we covered the user’s most directly related needs, but they may have secondary needs as well. Someone who is changing a car’s oil is either a mechanic or a do-it-yourself-er. What else might they need? How about other parts, such as windshield wipers or air filters?

These are other fairly easy maintenance steps for someone who is working on their car to complete. Presence of these supporting products could be one way to improve user engagement with your pages.

3. Offer industry-leading non-commercial content on-site

Publishing world-class content on your site is a great way to produce links to your site. Of course, if you do this on a blog on your site, it may not provide links directly to your money pages, but it will nonetheless lift overall site authority.

In addition, if someone has consumed one or more pieces of great content on your site, the chances of their engaging in a more positive manner with your site overall go way up. Why? Because you’ve earned their trust and admiration.

4. Be everywhere your audiences are with more high-quality, relevant, non-commercial content

Are there major media sites that cover your market space? Do they consider you to be an expert? Will they quote you in articles they write? Can you provide them with guest posts, or will they let you be a guest columnist? Will they collaborate on larger content projects with you?

All of these activities put you in front of their audiences, and if those audiences overlap with yours, this provides a great way to build your overall reputation and visibility. This content that you publish, or collaborate on, that shows up on 3rd-party sites will get you mentions and links. In addition, once again, it will provide you with a boost to your branding. People are now more likely to consume your other content more readily, including on your money pages.

5. Leverage social media

The concept here shares much in common with the prior point. Social media provides opportunities to get in front of relevant audiences. Every person who is an avid follower of yours on a social media site is likely to show very different behavior characteristics when interacting with your site than someone who does not know you well at all.

Note that links from social media sites are nofollowed, but active social media behavior can lead to people implementing “real world” links to your site that are followed, from their blogs and media web sites.

6. Be active in the offline world as well

Think your offline activity doesn’t matter online? Think again. Relationships are still most easily built face-to-face. People you meet and spend time with can well become your most loyal fans online. This is particularly important when it comes to building relationships with influential people.

One great way to do that is to go to public events related to your industry, such as conferences. Better still, obtain speaking engagements at those conferences. This can even impact people who weren’t there to hear you speak, as they become aware that you have been asked to do that. This concept can also work for a small local business. Get out in your community and engage with people at local events.

The payoff here is similar to the payoff for other items: more engaged, highly loyal fans who engage with you across the web, sending more and more positive signals, both to other people and to search engines, that you are the real deal.

7. Provide great customer service/support

Whatever your business may be, you need to take care of your customers as best you can. No one can make everyone happy; that’s unrealistic. But striving for much better than average is a really sound idea. Having satisfied customers saying nice things about you online is a big-impact item in the grand scheme of things.

8. Actively build relationships with influencers too

While this post is not about the value of influencer relationships, I include this in the list for illustration purposes, for two reasons:

Some opportunities are worth extra effort. Know of someone who could have a major impact on your business? Know that they will be at a public event in the near future? Book your plane tickets and get your butt out there. No guarantee that you will get the result you are looking for, or that it will happen quickly, but your chances go WAY up if you get some face time with them.

Influencers are worth special attention and focus, but your relationship-building approach to the web and SEO is not only about influencers. It’s about the entire ecosystem.

It’s an integrated ecosystem

The web provides a level of integrated, real-time connectivity of a kind that the world has never seen before. This is only going to increase. Do something bad to a customer in Hong Kong? Consumers in Boston will know within 5 minutes. That’s where it’s all headed.

Google and Bing (and any future search engine that may emerge) want to measure these types of signals because they tell them how to improve the quality of the experience on their platforms. There are many ways they can perform these measurements.

One simple concept is covered by Rand in this recent Whiteboard Friday video. The discussion is about a recent patent granted to Google that shows how the company can use search queries to detect who is an authority on a topic.

The example he provides is about people who search on “email finding tool”. If Google also finds that a number of people search on “voila norbert email tool”, Google may use that as an authority signal.

Think about that for a moment. How are you going to get people to search on your brand more while putting it together with a non-branded query like that? (OK, please leave Mechanical Turk and other services like that out of the discussion.)

Now you can start to see the bigger picture. Measurements like pogo-sticking and this recent search-behavior patent are just the tip of the iceberg. Undoubtedly, there are many other ways that search engines can measure what people like and engage with the most.

This is all part of SEO now: UX, product breadth, problem solving, engaging in social media, getting face to face, creating great content that you publish in front of other people’s audiences, and more.

For the small local business, you can still win at this game, as your focus just needs to be on doing it better than your competitors. The big brands will never be hyper-local like you are, so don’t think you can’t play the game, because you can.

Whoever you are, get ready, because this new integrated ecosystem is already upon us, and you need to be a part of it.


“The one who follows the crowd will usually go no further than the crowd. Those who walk alone are likely to find themselves in places no one has ever been before.”

While this quote has been credited to everyone from Francis Phillip Wernig, under the pseudonym Alan Ashley-Pitt, to Einstein himself, the powerful message does not lose its substance no matter whom you choose to credit. There is a very important yet often overlooked effect of not heeding this warning, one which can be applied to all aspects of life. From love and happiness to business and marketing, copying what your competitors are doing and failing to forge your own path can be a detrimental mistake.

While as marketers we are all acutely aware of the importance of differentiation, we’ve been trained for the majority of our lives to seek out the norm.

We spend the majority of our adolescent lives trying desperately not to be different. No one has ever been picked on for being too normal or not being different enough. We would beg our parents to buy us the same clothes little Jimmy or little Jamie wore. We’d want the same backpack and the same bike everyone else had. With the rise of the cell phone and later the smartphone, on hands and knees, we begged and pleaded for our parents to buy us the Razr, the StarTAC (bonus points if you didn’t have to Google that one), and later the iPhone. Did we truly want these things? Yes, but not just because they were cutting edge and nifty. We desired them because the people around us had them. We didn’t want to be the last to get these devices. We didn’t want to be different.

Thankfully, as we mature we begin to realize the fallacy that is trying to be normal. We start to become individuals and learn to appreciate that being different is often seen as beautiful. However, while we begin to celebrate being different on a personal level, it does not always translate into our business or professional lives.

We unconsciously and naturally seek out the normal, and if we want to be different—truly different in a way that creates an advantage—we have to work for it.

The truth of the matter is, anyone can be different. In fact, we all are very different. Even identical twins with the same DNA will often have starkly different personalities. As a business, the real challenge lies in being different in a way that is relevant, valuable to your audience, and creates an advantage.

“Strong products and services are highly differentiated from all other products and services. It’s that simple. It’s that difficult.” – Austin McGhie, Brand Is a Four Letter Word

Let’s explore the example of Revel Hotel & Casino. Revel is a 70-story luxury casino in Atlantic City that was built in 2012. There is simply not another casino of the same class in Atlantic City, but there might be a reason for this. Even if you’re not familiar with the city, a quick jump onto Atlantic City’s tourism website reveals that of the five hero banners that rotate, not one specifically mentions gambling, but three reference the boardwalk. This is further illustrated when exploring their internal linking structure. The beaches, boardwalk, and shopping all appear before a single mention of casinos. There simply isn’t as much of a market for high-end gamblers in the Atlantic City area; in the States, Las Vegas serves that role. So while Revel has a unique advantage, their ability to attract customers to their resort has not resulted in profitable earnings reports. In Q2 2012, Revel had a gross operating loss of $35.177M, and in Q3 2012 that increased to $36.838M.

So you need to create a unique selling proposition (also known as unique selling point and commonly referred to as a USP), and your USP needs to be valuable to your audience and create a competitive advantage. Sounds easy enough, right? Now for the kicker. That advantage needs to be as sustainable as physically possible over the long term.

“How long will it take our competitors to duplicate our advantage?”

You really need to explore this question and the possible solutions your competitors could utilize to play catch-up or duplicate what you’ve done. Look no further than Google vs Bing to see this in action. No company out there is going to just give up because your USP is so much better; most will pivot or adapt in some way.

Let’s look at a Seattle-area coffee company with which you may or may not be familiar. Starbucks has tried quite a few times over the years to level up their tea game with limited success, but the markets that Starbucks has really struggled to break into are the pastry, bread, dessert, and food markets.

Other stores had more success in these markets, and they thought that high-quality teas and bakery items were the USPs that differentiated them from the Big Bad Wolf that is Starbucks. And while they were right to think that their brick house would save them from the Big Bad Wolf for some time, this fable doesn’t end with the Big Bad Wolf in a boiling pot.

Never underestimate your competitor’s ability to be agile, specifically when overcoming a competitive disadvantage.

If your competitor can’t beat you by making a better product or service internally, they can always choose to buy someone who can.

After months of courting, on June 4th, 2012 Starbucks announced that they had come to an agreement to purchase La Boulange in order to “elevate core food offerings and build a premium, artisanal bakery brand.” If you’re a small-to-medium sized coffee shop and/or bakery that even indirectly competed with Starbucks, a new challenger approaches. And while those tea shops momentarily felt safe within the brick walls that guarded their USP, on the final day of that same year, the Big Bad Wolf huffed and puffed and blew a stack of cash all over Teavana. Making Teavana a wholly-owned subsidiary of Starbucks for the low, low price of $620M.

Sarcasm aside, this does a great job of illustrating the ability of companies—especially those with deep pockets—to be agile, and demonstrates that they often have an uncanny ability to overcome your company’s competitive advantage. In seven months, Starbucks went from a minor player in these markets to having all the tools they need to dominate tea and pastries. Have you tried their raspberry pound cake? It’s phenomenal.

Why does this matter to me?

Ok, we get it. We need to be different, and in a way that is relevant, valuable, defensible, and sustainable. But I’m not the CEO, or even the CMO. I cannot effect change on a company level; why does this matter to me?

I’m a firm believer that you effect change no matter what the name plate on your desk may say. Sure, you may not be able to call an all-staff meeting today and completely change the direction of your company tomorrow, but you can effect change on the parts of the business you do touch. No matter your title or area of responsibility, you need to know your company’s, client’s, or even a specific piece of content’s USP, and you need to ensure it is applied liberally to all areas of your work.

Look at this example SERP for “Mechanics”:

While yes, this search is very likely to be local-sensitive, that doesn’t mean you can’t stand out. Every single AdWords result, save one, has only the word “Mechanics” in the headline. (While the top-of-page ad is pulling description line 1 into the heading, the actual headline is still only “Mechanics.”) But even the one headline that is different doesn’t do a great job of illustrating the company’s USP. Mechanics at home? Whose home? Mine or theirs? I’m a huge fan of Steve Krug’s “Don’t Make Me Think,” and in this scenario there are too many questions I need answered before I’m willing to click through. “Mechanics; We Come To You” or even “Traveling Mechanics” illustrates this point much more clearly, and still fits within the 25-character limit for the headline.

If you’re an AdWords user, no matter how big or small your monthly spend may be, take a look at your top 10-15 keywords by volume and evaluate how well you’re differentiating yourself from the other brands in your industry. Test ad copy that draws attention to your USP and reap the rewards.

Now while this is simply an AdWords text ad example, the same concept can be applied universally across all of marketing.

Title tags & meta descriptions

As we alluded to above, not only do companies have USPs, but individual pieces of content can, and should, have their own USP. Use your title tag and meta description to illustrate what differentiates your piece of content from the competition, and do so in a way that attracts the searcher’s click. If you have already established a strong brand within a specific niche, great! Use it to your advantage. It’s much more likely, though, that you are competing against a strong brand, and in that scenario ask yourself, “What makes our content different from theirs?” The answer you come up with is your content’s USP. Call attention to that in your title tag and meta description, and watch the CTR climb.

I encourage you to hop into your own site’s analytics and look at your top 10-15 organic landing pages and see how well you differentiate yourself. Even if you’re hesitant to negatively affect your inbound gold mines by changing the title tags, run a test and change up your meta description to draw attention to your USP. In an hour’s work, you just may make the change that pushes you a little further up those SERPs.

Branding

Let’s break outside the world of digital marketing and look at the world of branding. Tom’s Shoes competes against some heavy hitters in Nike, Adidas, Reebok, and Puma just to name a few. While Tom’s can’t hope to compete against the marketing budgets of these companies in a fair fight, they instead chose to take what makes them different, their USP, and disseminate it every chance they get. They have labeled themselves “The One for One” company. It’s in their homepage’s title tag, in every piece of marketing they put out, and it smacks you in the face when you land on their site. They even use the call-to-action “Get Good Karma” throughout their site.

Wherever you stand on this issue, Tom’s Shoes has done a phenomenal job of differentiating their brand from the big hitters in their industry.

Know your USP and disseminate it every chance you get.

This is worth repeating. Know your USP and disseminate it every chance you get, whether that be in title tags, ad copy, on-page copy, branding, or any other segment of your marketing campaigns. Online or offline, be different. And remember the quote that we started with, “The one who follows the crowd will usually go no further than the crowd. Those who walk alone are likely to find themselves in places no one has ever been before.”

The amount of marketing knowledge that can be taken from this one simple statement is astounding. Heed the words, stand out from the crowd, and you will have success.


A lot of fantastic websites (and products, services, ideas, etc.) are in something of a pickle: The keywords they would normally think to target get next to no search volume. It can make SEO seem like a lost cause. In today’s Whiteboard Friday, Rand explains why that’s not the case, and talks about the one extra step that’ll help those organizations create the demand they want.

For reference, here’s a still of this week’s whiteboard. Click on it to open a high resolution image in a new tab!

Video transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat about a particularly challenging problem in the world of SEO, and that is trying to do SEO or trying to do any type of web marketing when your product, service, or idea has no search volume around it. So nobody is already looking for what you offer. It’s a new thing, a new concept.

I’ll use the example here of a website that I’m very fond of, but for which there’s virtually no search volume, called Niice. It’s Niice.co.

It’s great. I searched for things in here. It brings me back all these wonderful visuals from places like Colossus and lots of design portals. I love this site. I use it all the time for inspiration, for visuals, for stuff that I might write about on blogs, for finding new artists. It’s just cool. I love it. I love the discovery aspect of it, and I think it can be really great for finding artists and designers and visuals.

But when I looked at the keyword research — and granted, I didn’t go deep into the keyword research, but let’s imagine that I did — I looked for things like:

- “visual search engine”: almost no volume
- “search engine for designers”: almost no volume
- “graphical search engine”: almost no volume
- “find designer visuals”: nada

So when they look at their keyword research they go, “Man, we don’t even have keywords to target here really.” SEO almost feels like it’s not a channel of opportunity, and I think that’s where many, many companies and businesses make mistakes, because just because you don’t see keyword volume around exactly what you’re offering doesn’t mean that SEO can’t be a great channel. It just means we have to do an extra step of work, and that’s what I want to talk about today.
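That triage step — check volume for the obvious product keywords, and where it’s near zero, pivot to keywords the audience already searches for — can be sketched in a few lines of Python. The keywords and monthly volumes below are made-up illustration data, not real numbers from any keyword research tool:

```python
# Minimal sketch: partition keyword ideas into "has demand" vs "no demand yet".
# All volumes here are hypothetical illustration data.

def split_by_volume(keyword_volumes, min_volume=50):
    """Split keyword ideas into direct content targets vs. awareness-building work."""
    viable = {k: v for k, v in keyword_volumes.items() if v >= min_volume}
    no_demand = {k: v for k, v in keyword_volumes.items() if v < min_volume}
    return viable, no_demand

ideas = {
    "visual search engine": 10,        # what the product is
    "search engine for designers": 0,  # also the product: no demand yet
    "design pattern library": 2400,    # what the audience already searches for
    "ux tool list": 880,
    "photoshop tutorials": 9900,
}

viable, no_demand = split_by_volume(ideas)
# Keywords with demand become content targets; the rest need
# awareness-building (ads, PR, influencer outreach) first.
```

The point is simply that a near-empty “viable” bucket for your product keywords is a signal to work the audience-adjacent keywords, not a signal that SEO is a dead channel.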

So I think when you encounter this type of challenge — and granted, it might not be that there’s no keyword volume at all; it could just be that there’s very little around your business, your organization, or some ideas or products you’re launching, and thus you’re struggling to come up with enough volume to create the quantity of leads, or free trials, or customers that you need — this process really can work.

Key questions to start.

1) Who’s the target audience?

In Niice’s case, that’s going to be a lot of designers. It might be people who are creating presentations. It might be those who are searching out designers or artists. It could be people seeking inspiration for all sorts of things. So they’re going to figure out who that is.

From there, they can look at the job title, interests, demographics of those people, and then you can do some cool stuff where you can figure out things like, “Oh, you know what? We could do some Facebook ad targeting to those right groups to help boost their interests in our product and potentially, well, create branded search volume down the road, attract direct visitors, build brand awareness for ourselves, and potentially get some traffic to the site directly as well. If we can convert some of that traffic, well, that’s fantastic.”

In their case, I think Niice is ad-supported right now, so all they really need is the traffic itself. But regardless, this is that same type of process you’d use.

2) What else do they search for?

What is that target audience searching for? Knowledge, products, tools, services, people, brands, whatever it is, if you know who the audience is, you can figure out what they’re searching for, because they have needs. If they have a job title, if they have interests, if you have those profile features about the audience, you can figure out what else they’re going to be searching for. In this case, knowing what designers are searching for is probably relatively simple; the other parts of their audience might be more complex, but that one is pretty obvious.

From that, we can do content creation. We can do keyword targeting to be in front of those folks when they’re doing search by creating content that may not necessarily be exactly selling our tools, but that’s the idea of content marketing. We’re creating content to target people higher up in the funnel before they need our product.

We can use that, too, for product and feature inspiration in the product itself. So in this case, Niice might consider creating a design pattern library or several, pulling from different places, or hiring someone to come in and build one for them, and then featuring that somewhere on the site for visitors who haven’t done a search yet, and potentially trying to rank for it in the search engines, which then brings qualified visitors, the types of people who, once they got exposed to Niice, would be like, “Wow, this is great and it’s totally free. I love it.”

A UX tool list, so a list of tools for user experience people on the design or UI side, maybe Photoshop tutorials, whatever it is that they feel they’re competent and capable of creating and could potentially rank for. Now you’re attracting the right audience to your site before they need your product.

3) Where do they go?

That audience, where are they going on the web? What do they do when they get there? To whom do they listen? Who are their influencers? How can we be visible in those locations? So from that I can get things like influencer targeting and outreach. I can get ad and sponsorship opportunities. I can figure out places to do partnership or guest content or business development.

In Niice’s case, that might be things like sponsoring or speaking at design events. Maybe they could create an awards project for Dribbble. So they go to Dribbble, they look at what’s been featured there, or they go to Colossus, or some of the other sites that they feature, and they find the best work of the week. At the end of the week, they feature the top 10 projects, and then they call out the designers who put them together.

Wow, that’s terrific. Now you’re getting in front of the audience whose work you’re featuring, which is going to, in turn, make them amplify Niice’s project and product to an audience who’s likely to be in their target audience. It’s sort of a win-win. That’s also going to help them build links, engagement, shares, and all sorts of signals that potentially will help them with their authority, both topically and domain-wide, which then means they can rank for all the content they create, building up this wonderful engine.

4) What types of content have achieved broad or viral distribution?

I think what we can glean from this is not just inspiration for content and keyword opportunities, as we can from many other kinds of content, but also sites to target: sites to target with advertising, sites to target for guest posting or sponsorship, sites to target for business development or partnerships, sites to target in an ad network, sites to target psychographically or demographically for Facebook if we want to run ads like that, and potentially bidding on ads in Google when people search for that website or that brand name in paid search.

So if you’re Niice, you could think about contracting some featured artists to contribute visuals, maybe for a topical news project. Something big is happening in the news or in the design community, so you contract a few of the artists whose work you have featured or are featuring, or people from the communities whose work you’re featuring, and say, “Hey, we might not be able to pay you a lot, but we’re going to get in front of a ton of people. We’re going to build exposure for you, which is something we already do, FYI.” Now you’ve got some wonderful content with the potential to mimic the success of that work.

You could think about, and I love this just generally as a content marketing and SEO tactic, if you go find viral content, content that has had wide sharing success across the web from the past, say two, three, four, or five years ago, you have a great opportunity, especially if the initial creator of that content or project hasn’t continued on with it, to go say, “Hey, you know what? We can do a version of that. We’re going to modernize and update that for current audiences, current tastes, what’s currently going on in the market. We’re going to go build that, and we have a strong feeling that it’s going to be successful because it’s succeeded in the past.”

That, I think, is a great way to get content ideas from viral content and then to potentially overtake it in the search rankings too. If something from three or five years ago that was particularly timely then still ranks today, and you produce an updated version, you’re almost certainly going to come out on top due to Google’s bias for freshness, especially around things that have timely relevance.

5) Should brand advertisement be in our consideration set?

Then the last one: I like to ask about brand advertising in these cases, because when there’s no search volume yet, a lot of times what you have to do is create awareness. I should change this from advertising to brand awareness, because really there are organic ways to do it and advertising ways to do it. You can think about, “Well, where are the places we can target where we could build that awareness? Should we invest in press and public relations?” Not press releases. “Then how do we own the market?” So I think one of the keys here is starting with the name, title, or keyword phrase that encapsulates what the market will call your product, service, or idea.

In the case of Niice, that could be, well, visual search engines. You can imagine the press saying, “Well, visual search engines like Niice have recently blah, blah, blah.” Or it could be designer search engines, or it could be graphical search engines, or it could be designer visual engines, whatever it is. You need to find what that thing is going to be and what’s going to resonate.

In the case of Nest, that was the smart home. In the case of Oculus, it was virtual reality and virtual reality gaming. In the case of Tesla, it was sort of already established. There’s electric cars, but they kind of own that market. If you know what those keywords are, you can own the market before it gets hot, and that’s really important because that means that all of the press and PR and awareness that happens around the organic rankings for that particular keyword phrase will all be owned and controlled by you.

When you search for “smart home,” Nest is going to dominate those top 10 results. When you search for “virtual reality gaming,” Oculus is going to dominate those top 10. They don’t necessarily dominate just with their own site; they dominate all the press and PR articles that are about it, the Wikipedia page about it, etc., etc. You become the brand that’s synonymous with the keyword or concept. From an SEO perspective, that’s a beautiful world to live in.

So, hopefully, for those of you who are struggling around demand for your keywords, for your volume, this process can be something that’s really helpful. I look forward to hearing from you in the comments. We’ll see you again next week for another edition of Whiteboard Friday. Take care.
