Researchers estimate that it’s up to 25 times more expensive for a company to acquire a new customer than to keep an existing one, making ongoing investments in consumer satisfaction a priority. There's nothing more disheartening to a local business owner than receiving a very negative review — and given that as little as 13% of consumers will patronize a business with a 1- or 2-star rating, there may be nothing more important than the owner taking every possible step to resolve negative reviews with speed and skill.

Negative reviews don’t write themselves. While looking at restaurant reviews recently, I came across an owner-consumer interaction that perfectly encapsulates the typical steps that take a transaction from bad to worse. It serves as a diagram of how these costly scenarios begin, proceed, and escalate, ultimately resulting in permanent damage to the company’s reputation.

The blame isn’t one-sided, and my goal here isn't to make the customer or the owner out to be "the villain." Rather, I'd like to point out key elements that actually worsen the situation, rather than improving it. Both owners and consumers sincerely want to feel satisfied, and the good news is that, in most cases, the only thing standing in the way of this is responsible communication.

The key to the "Food Truck Fiasco"

This story begins at a family-owned Philly Cheesesteak food truck that signed up to be a concession at a festival in the Southwest. One customer describes what happened on the day of the event this way, with my interpretation following each excerpt:

Key to Review

Grabbed "The Storm" (cheesesteak with green chiles) for $9 when they were parked outside the bike and brew festival.

Customer sets the scene for his story.

The woman told us it would take 20 minutes, but when we arrived back it took at least an additional 15 minutes to get our food. I'm sorry, but 45 minutes wait for a sandwich simply isn't acceptable. The sandwich was super small for the price, I could've eaten 3 of these things easy and I'm not a big person. I expect more for a $9 sandwich

The legitimate complaints: wait time, improperly set expectations, food portions, and pricing.

These are honest grievances.

from a crappy concession trailer with zero overhead.

The revenge. Customer vents his disappointment with cutting, dismissive language. He insults the business.

EDIT: Like several other Yelpers, I had originally rated them higher, but reduced my rating after I received a nasty email from the owner shaming me for my feedback. Seriously, that is how you treat customers after making them wait 45 minutes for a super overpriced sandwich? If you can't handle honest feedback, then you should probably find another line of work. Keep it classy, [name removed].

The worst possible outcome: owner's response leads to consumer editing his original review to dock stars and complain of a second bad interaction with the business.

The customer is permanently lost, and the world is informed.

The customer’s complaints are certainly understandable: he was honestly disappointed that it took so long for his food to be ready and then felt the portions were overpriced. It didn’t help matters that the staff over-promised and under-delivered in estimating the wait time. Up until this point, the consumer is blameless. But then he makes two mistakes:

He makes no mention of voicing his complaints to the owner or staff in-person, at the time of service.
Upon receiving his small sandwich after 45 minutes of waiting, it would have taken him just one minute more to say, “I really want to speak to the owner about this. I’m not happy with what just happened.” It’s the customer’s responsibility to speak up on his own behalf — to let the owner know there is a problem for him to resolve.

Having failed to take on the responsibility of voicing his complaints directly to the owner at the time of service, the customer then vents his feelings to the world in the form of a negative review. Not only this, but his remarks about a "crappy concession trailer" are mean-spirited, showing zero respect for the reality that this is, in fact, another human being’s livelihood. Being dismissive of someone’s job is uncivil, snobbish, and rude. Language like this is unlikely to win friends, and even less likely to bring out the best in the owner whom he is now, in fact, goading and insulting.

Regardless of the customer’s tone, the owner’s job is to be professional at all times. I’ve seen adept business owners handle even the rudest customers with a skill that leaves me in awe, but in this case, the owner of the food truck went down the worst possible road. Far from remedying the initial negative review, the owner’s response brought the customer back with further negativity, including taking off stars. Here’s how the owner responded (Ed. note: original spelling and grammar intact), with my interpretation following each excerpt:

Owner's Response

Key to Owner's Response

5/23/2016 I'd have to say, I'm shocked and appalled at this "customers" behavior. However, if you look at his profile...it is really not all that shocking. Many businesses have felt the wrath of this poor shmuck. And just so everything is clear and on the table...I'm copying and pasting our so called "nasty" email we sent him. You can be the judge:

Calling your customer names and trying to shame him is the worst possible way to begin an owner response.

Hello,

I am one of the owners of [business name removed]...along with my wife and father in-law. I just wanted to take a minute to help you understand the impact of your publicly posted criticisms.

The owner's job is to apologize, not to correct or instruct the customer.

First, you should understand that we are an 8 by 16 truck that was inundated with 100's of orders all at once. Simply put, we did the best we could to get your sandwich out as fast as possible. With the space we have, grill size, etc...we can only do so much.

It is not the customer's job to be understanding about the business's limitations or problems. He expects to receive service. That is all.

We tried to be as honest and as accurate as possible with regards to wait time.

As for the quality, we certainly want our customers to enjoy our food. This is why we cook to order, and try to maintain a level of freshness that other trucks may not. Given that you didn't like it, we would have preferred to have a chance to make the situation whole. However, you clearly chose not to afford us that opportunity.

An explanation of the business's goal to provide quality product is good, but the real gem here is the owner's pain that the customer didn't complain in person.

This is the owner's honest disappointment.

In regards to "no overhead." Well, that's just so far from accurate it made our heads spin. Perhaps consider: food costs (certified angus beef is not cheap), labor for our employee, propane, the generator rental fee ($200), the city temp license fee, the fire inspection fee, fees to do the event itself, the gas to tow and run the gene, water....I could go on, but hopefully you get the point.

The customer may be ignorant of how the business operates, but this is not the time to explain its costs. You haven't yet earned the customer's friendship or empathy.

He's going to walk for lack of an apology.

Lastly, on a more personal level, you should understand that we worked two 14 hour days for this event. We have a 10 month old son that was at his grandparents, suffered from great separation anxiety, a cried himself to sleep both nights. We finished cleaning at 1 am this morning only to wake up no your negative review about our "crappy food truck."

This is completely over the top. Customers do not want to hear about crying babies. They are paying for service, not sob stories.

At the same time, the owner admits he's stung by rude language. This is real.

Overall, my ultimate goal with this note, is to help you better understand the impact of your words. And to understand that we are hard working people, who genuinely care about our business. When you are unhappy with a level of service, please, by all means, contact them privately and at least give them the opportunity to make the situation right. Criticizing them publicly gives them absolutely no chance to do that, creates a lasting stain on their hard work, and potentially takes food out of their children mouths. Had you contacted us, we would have offered you a free meal at our regular location or refunded your money.

The closing sums up how wrong and how right this owner's mindset is.

He is totally wrong to believe the purpose of an owner response is to correct his customers.

But, he's totally right that the lack of opportunity to respond well to an in-person complaint is a major pain point for his business, and millions of other businesses, too.

Reading between the lines of the owner’s response, a picture emerges of a business that underestimated how busy it would be at an event and did not have adequate cooking facilities or staff to fulfill orders within a normal timeframe. This was the initial mistake that set the stage for all that transpired. Unfortunately, the owner then worsened the scenario by making the following additional mistakes:

He confused his business with himself.
Learning not to take business criticism personally is Reputation Management 101 for all owners and staff, and it can be the hardest instinctive mindset for anyone to overcome. If you’ve put your heart into your business, it is genuinely challenging not to view criticism as a personal attack … but you mustn’t get stuck there. You’re offering goods and services; paying customers expect to receive them. Business is transactional, not personal, and the customer is not signing up to hear about your fatigue or family problems. All this customer wanted was the sandwich he ordered. This is not personal.

He refused to accept responsibility for the customer’s bad experience.
The response penned by the owner is not an admission that the customer’s legitimate dissatisfaction stemmed from poor planning or poor execution on the part of the business. The owner refuses to say, “My fault, I’m sorry.”

He failed to see the owner response function as his last chance to save a bad situation.
He views it as a place to justify himself by correcting the customer’s attitude, expectations, and sentiments. Once the negative review has already been published, the owner response function is likely the only life preserver left.

Given the costliness of replacing a lost customer and the way a negative review can cost a company future business, the owner response field is not a platform for a lecture; it’s a platform for making the greatest possible effort to make amends.

Finally, the owner devolved into personal insults, betraying a fundamental lack of professionalism.
A business is professional. A customer is just a person. Even if your customer cannot utter a single sentence without using colorful expletives, professionals are meant to be trained to communicate in business-appropriate language at all times. What this owner has done is to reveal to the whole world that he refers to his customers in insulting terms if they have a complaint. Once anyone reads that, they know not to expect empathy if they encounter a problem with the business.

Perhaps the most powerful element of the owner response function is that it is not just for a single customer to read, but for all future customers to read. Respond well, and you may not only win a second chance with the customer, but also prove to all future potential customers that they will be treated with respect, empathy, and fairness by your company.

Crafting a powerful owner response

If the food truck owner were my client, this is a sample of how I would have helped him respond, with my key following each excerpt:

Owner's Response

Key to Owner's Response

Dear Jim,

I hope you can find it in your heart to accept my apology for the poor experience you had on the day of the event. This was totally my fault.

Greet customer personally, if possible, and begin with a sincere apology. Take responsibility for your business, as its owner.

I underestimated how swamped we would be and would have hired extra staff for the day if I'd realized 10,000 people were attending. This was our first time doing this event, and my failure to correctly predict the number of orders we'd be filling is what led to you waiting 45 minutes for your sandwich.

I feel really bad thinking of you having to stand around waiting for your lunch when there was so much else to do at the festival. We make each sandwich fresh to order and my staff simply got inundated.

Where possible, explain how the mistake happened.

Validate the customer's experience by expressing empathy for their situation.

Be accountable for any errors.

I wish you'd had the chance to talk to me about this in person at the time, but I realize it may have been too hectic to reach out to me that day. I would gladly have given you a full refund or a free cheesesteak to try to make up for the inconvenience.

Please, accept this as my invitation to stop by our regular location at 123 Main St. for a cheesesteak on me, where I promise you'll receive it within our normal 10-minute timeframe.

If you come, I'd really appreciate you taking a minute to let me know, in person, anything else you feel we can do to improve our food or service. This is our family business and we are so invested in serving our community well.

Encourage all readers to believe that, if a problem occurs, you would love to have them speak directly to you or staff about it right at the time of service.

Since the presence of the negative review means an in-person complaint likely never happened, offer an appropriate means of atonement and a guarantee of a better experience, if the customer will give you a second chance.

Again, please accept my apology, Jim, and please give me a chance to make it right.

Thank you,

Bill Williams
Owner, Philly Cheesesteak Truck

Close with a repeated expression of your sincere regret, your offer to make things right, and an identification of yourself as the owner of the business.

Contrast the owner’s real response with this sample suggested response, and you are likely to come away with a completely different, more positive impression of the business. A few quick suggestions for coming across well:

Keep length reasonable; don’t write a novel

Beware of sounding like you’re on your high horse; use common, neighborly language

Make sure you’ve apologized

Where appropriate, explain what went wrong and describe any steps you’ve taken to correct an issue

Extend your offer of something nice to try to make it better

Welcome further feedback; it could lead to the reviewer updating their review with positive sentiment

Those are quick tips that should immediately help you to improve your reputation in the eyes of all who read your owner responses. Ready to dig deeper into developing a powerful, permanent mindset for all future tough transactions? Read on.

3 empowering tactics for better reputation management

Every business encounters criticism. Meet this reality better prepared with these three tips:

1. In business, we wear the mask.

When your spouse tells you you're inattentive, when your friend points out that you chew with your mouth open, when your children berate you for not letting them adopt another dog, it’s personal. It’s your privilege to respond with tears, embarrassment, a lecture, or whatever you feel you need to express at that moment, reacting to personal criticism in your private life.

In business, it’s different. In a civil society, and particularly in a business setting, it’s simple reality that we tend to suppress strong reactions and strong words for the sake of professionalism.

If you feel the color rising to your face when a customer insinuates that you actually founded your whole company for the purpose of ripping him off for $9.99, try picturing in your mind the image of the most serene, inscrutable face of a statue you’ve ever seen. Perhaps it’s the face of the Buddha, or a classical Greek god, or a Tlingit totem being. Imagine donning that mask, like a zone of safety, between the disgruntled customer’s business complaint and your personal life. It’s cooler behind the mask and you can respond to almost any commercial criticism, knowing your personal feelings are completely safe behind the barrier you’ve established.

2. Muster empathy to integrate as much of yourself into the interaction as you feel comfortable with.

Now that you’ve tried on the mask, and you’ve got your worries, your insecurities, loves, family, and everything else personal safely behind its barrier, see how much of yourself you feel safe putting outside the mask for the world to see.

Your life may feel too divided if your business and personal worlds are kept 100% separate, and you may not be able to pour the full passion of your heart and intellect into the business you are building if you have to be a statue at all times. Some customers may be so irrational in their expectations or conduct that the only way to manage them is with a marble coolness or a wooden face, but hopefully that will be the exception. For most customers, this technique will help you integrate your genuine human feelings into a situation in which distress is being expressed.

Picture a person you not only really love, but also of whom you feel protective. For just a moment, substitute that special person for the complaining customer. Imagine that it is your grandmother who had to wait in line for 45 minutes (she might have gotten heat stroke), or your nephew who was still hungry after being overcharged for lunch (he’s had trouble getting up to a healthy weight), or your spouse who was treated rudely (how dare someone disrespect him/her), or your friend whose product broke after a week of use (she can’t afford to replace it). Suddenly, that customer is transformed from an unknown complainer into an important person who deserves fair, empathetic treatment.

Integrate as much of the empathy you’d feel for a friend or relative as you can for the customer. The health of your local business, and your good feelings about the way you conduct it, depend upon turning as many unknown neighbors as you can into loyal customers and, hopefully, friends.

3. Master catching complaints before they become negative reviews.

It may seem counterintuitive to want to receive as many complaints as possible, but when you consider that they are your best safeguard against the publication of negative reviews, making your business complaint-friendly is incredibly smart! Implement these tips:

Install visible in-store signage detailing options for requesting help with a complaint. Wall signs, window signs, signs on counters, tables, menus, aisles, print materials, and company vehicles can all alert customers to complaint-receptivity.

Signage can include a complaint hotline text message number and phone number, both of which should be regularly monitored for activity.

Instruct all staff who deal with the public to invite complaints with clear language, like “Was there anything you couldn’t find, anything we can do better, etc.?”

Be sure your website is mobile-friendly and includes a visible complaint form.

Gather emails at the time of service and email customers shortly thereafter to request feedback, both positive and negative. Follow up quickly on any negative experiences and make every effort to remedy them.

Assign a staff member for each store who regularly checks popular social media sites for mentions of your business and who is empowered to reach out any time negative sentiment appears.

Document all complaints, identify patterns, and implement solutions. Your complaint document will be an absolute goldmine for resolving common problems before future customers experience them.

Consider purchasing paid products that help you analyze your social media opportunities and manage your reputation. Followerwonk and GetFiveStars are good places to start. Don’t leave things up to chance — know your stats and actively control the conversation that’s happening about your business! Be as connected and engaged with your consumers as you possibly can.

Speaking of GetFiveStars, I highly recommend taking the time to read the series of articles they’ve been publishing regarding the subject of consumer complaints, including some really insightful surveys. My favorite tip from co-founder Mike Blumenthal is this one:

“Make a complainer feel like your most valued customer because, in some ways, they are.”

Happier endings for everybody

The art of customer service is one you’ll be training yourself and your staff in for as long as you serve the public. Even if you’ve made every effort to catch complaints on the spot, no method is foolproof and every business is almost guaranteed to have to deal with a negative review here and there.

Some customers will not speak up for themselves, even when expressly invited to do so, because they are shy, dread confrontation, or are so accustomed to being treated poorly that they don’t believe their voice will be genuinely heard. They may utilize online reviews as a substitute for having to "make a fuss" in person about their dissatisfaction.

Then there are those truly awful customers no business can avoid. They may have entitlement issues, unrealistic expectations, unpleasant personalities, or even have made it a life practice to throw tantrums in hopes of receiving free stuff. They may utilize online reviews as a place to spew rude language and invent false accusations because they have personal problems.

No business is immune to either type of customer, but if you plot out your company’s reputation management course, you can weather most storms and end up looking like one smooth sailor! Your plan might look something like this timeline:

I continue to be amazed at how many negative reviews slip through and sit unanswered on major review platforms, raising doubts in potential customers’ minds and giving a neglectful impression of the business.

With the right mindset that delineates comfortable boundaries between your personal and business worlds, cultivation of empathy, a clear plan, and concentrated devotion to staff training, no business need dread negative feedback; in fact, it can be viewed as a powerful resource for making meaningful improvements that resolve existing issues. And when a negative review does squeak through your process, a beautiful, professional response can write a happy ending, just like this one:

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

Autotrack is a JavaScript library built for use with analytics.js that provides developers with a wide range of plugins to track the most common user interactions relevant to today's modern web.

The first version of autotrack for analytics.js was released on GitHub earlier this year, and since then the response and adoption from developers has been amazing. The project has been starred over a thousand times, and sites using autotrack are sending millions of hits to Google Analytics every single day.

Today I'm happy to announce that we've released autotrack version 1.0, which includes several new plugins, improvements to the existing plugins, and tons of new ways to customize autotrack to meet your needs.

Note: autotrack is not an official Google Analytics product and does not qualify for Google Analytics 360 support. It is maintained by members of the Google Analytics developer platform team and is primarily intended for a developer audience.
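For readers unfamiliar with how autotrack fits into an analytics.js setup, the general pattern is to require plugins on a tracker after creating it. The sketch below simulates the analytics.js command queue so it can run anywhere; the property ID is a placeholder, and the two plugins shown are just examples from the project's plugin list:

```javascript
// Minimal stand-in for the analytics.js command queue. On a real page the
// official analytics.js snippet defines this, and analytics.js replays the
// queued commands once it finishes loading.
var ga = ga || function () { (ga.q = ga.q || []).push(arguments); };

// Create a tracker, then require the autotrack plugins you want to use.
ga('create', 'UA-XXXXX-Y', 'auto');    // placeholder property ID
ga('require', 'eventTracker');         // declarative click/event tracking
ga('require', 'outboundLinkTracker');  // track clicks on outbound links
ga('send', 'pageview');

// Until analytics.js loads, the commands simply sit in the queue:
console.log(ga.q.length); // 4 queued commands
```

On a live page you would also load the autotrack script alongside analytics.js, as described in the project's installation docs.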

New plugins

Based on the feedback and numerous feature requests we received from developers over the past few months, we've added the following new autotrack plugins:

Impression Tracker

The impression tracker plugin allows you to track when an element is visible within the browser viewport. This lets you much more reliably determine whether a particular advertisement or call-to-action button was seen by the user.

Impression tracking has been historically tricky to implement on the web, particularly in a way that doesn't degrade the performance of your site. This plugin leverages new browser APIs that are specifically designed to track these kinds of interactions in a highly performant way.
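As a sketch of what that looks like in practice (the element ids here are hypothetical, the option shapes are taken from the project README, and the command queue is stubbed so the example runs outside a browser):

```javascript
// Stub of the analytics.js command queue so this sketch runs anywhere;
// the official analytics.js snippet provides the real one on a live page.
var ga = ga || function () { (ga.q = ga.q || []).push(arguments); };

// Report an event when the listed elements become visible in the viewport.
ga('require', 'impressionTracker', {
  elements: [
    'hero-cta',                           // fire when any part is visible
    { id: 'sidebar-ad', threshold: 0.5 }  // fire at 50% visibility
  ]
});
```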

Clean URL Tracker

If your analytics implementation sends pageviews to Google Analytics without modifying the URL, then you've probably experienced the problem of seeing multiple different page paths in your reports that all point to the same place. Here's an example:

Note: setting up View Filters in your Google Analytics view settings is another way to modify the URLs sent to Google Analytics.
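A sketch of how the plugin might be configured to collapse those duplicate paths (option names follow the project README; the stubbed `ga` queue just lets the example run outside a browser, and the dimension index is a placeholder):

```javascript
// Command-queue stub; on a live page the analytics.js snippet provides this.
var ga = ga || function () { (ga.q = ga.q || []).push(arguments); };

// Normalize URLs so /index.html, /index.html?utm_source=x, and / are all
// reported as the same page path.
ga('require', 'cleanUrlTracker', {
  stripQuery: true,             // drop the query string from page paths...
  queryDimensionIndex: 1,       // ...but preserve it in custom dimension 1
  indexFilename: 'index.html',  // report /index.html as /
  trailingSlash: 'remove'       // report /about/ as /about
});
```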

Page Visibility Tracker

It's becoming increasingly common for users to visit sites on the web and then leave them open in an inactive browser tab for hours or even days. And when users return to your site, they often won't reload the page, especially if your site fetches new content in the background.

The page visibility tracker plugin takes a more modern approach to what should constitute a pageview. In addition to tracking when a page gets loaded, it also tracks when the visibility state of the page changes (i.e. when the tab goes into or comes out of the background). These additional interaction events give you more insight into how users behave on your site.
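In its simplest form, the plugin just needs to be required on the tracker (a sketch with a stubbed command queue and a placeholder property ID; see the README for the available options):

```javascript
// Command-queue stub so this sketch runs outside a browser.
var ga = ga || function () { (ga.q = ga.q || []).push(arguments); };

ga('create', 'UA-XXXXX-Y', 'auto'); // placeholder property ID

// Track visibility-state changes (tab backgrounded or foregrounded)
// in addition to the normal pageview on load.
ga('require', 'pageVisibilityTracker');
ga('send', 'pageview');
```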

Updates and improvements

In addition to the new plugins added to autotrack, the existing plugins have undergone some significant improvements, most notably in the ability to customize them to your needs.

All plugins that send data to Google Analytics now give you 100% control over precisely what fields get sent, allowing you to set, modify, or remove anything you want. This gives advanced users the ability to set their own custom dimensions on hits or change the interaction setting to better reflect how they choose to measure bounce rate.
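As a sketch of what that control looks like (`fieldsObj` and `hitFilter` are the option names used in the project documentation; the dimension index and value here are placeholders, and the command queue is stubbed so the example runs outside a browser):

```javascript
// Command-queue stub so this sketch runs outside a browser.
var ga = ga || function () { (ga.q = ga.q || []).push(arguments); };

ga('require', 'outboundLinkTracker', {
  // Set fields statically on every hit this plugin sends.
  fieldsObj: {
    nonInteraction: true // keep outbound clicks from affecting bounce rate
  },
  // Or compute fields per hit: `model` exposes get/set for hit fields.
  hitFilter: function (model) {
    model.set('dimension2', 'autotracked', true); // placeholder dimension
  }
});
```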

Users upgrading from previous versions of autotrack should refer to the upgrade guide for a complete list of changes (note: some of the changes are incompatible with previous versions).

Who should use autotrack

Perhaps the most common question we received after the initial release of autotrack was who should use it. This was especially true of Google Tag Manager users who wanted to take advantage of some of the more advanced autotrack features.

Autotrack is a developer project intended to demonstrate and streamline some advanced tracking techniques with Google Analytics, and it's primarily intended for a developer audience. Autotrack will be a good fit for small to medium sized developer teams who already have analytics.js on their website or who prefer to manage their tracking implementation in code.

Large teams and organizations, those with more complex collaboration and testing needs, and those with tagging needs beyond just Google Analytics should instead consider using Google Tag Manager. While Google Tag Manager does not currently support custom analytics.js plugins like those that are part of autotrack, many of the same tracking techniques are easy to achieve with Tag Manager’s built-in triggers, and others may be achieved by pushing data layer events based on custom code on your site or in Custom HTML tags in Google Tag Manager. Read Google Analytics Events in the Google Tag Manager help center to learn more about automatic event tracking based on clicks and form submissions.
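For Tag Manager users, the data-layer approach mentioned above looks roughly like this (the event name and extra key are hypothetical; a Custom Event trigger in GTM matching the event name would fire the corresponding tag):

```javascript
// On a page with Google Tag Manager installed, dataLayer already exists;
// this guard just makes the sketch self-contained.
var dataLayer = dataLayer || [];

// Push a custom event from site code. A GTM Custom Event trigger listening
// for 'outbound-click' can then fire a Google Analytics tag, and extra
// keys become data layer variables available to that tag.
dataLayer.push({
  event: 'outbound-click',
  linkUrl: 'https://example.com/'
});
```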

Next steps

If you're not already using autotrack but would like to, check out the installation and usage section of the documentation. If you already use autotrack and want to upgrade to the latest version, be sure to read the upgrade guide first.

To get a sense of what the data captured by autotrack looks like, the Google Analytics Demos & Tools site includes several reports displaying its own autotrack usage data. If you want to go deeper, the autotrack library is open source and can be a great learning resource. Have a read through the plugin source code to get a better understanding of how some of the advanced analytics.js features work.

Lastly, if you have feedback or suggestions, please let us know. You can report bugs or open issues on GitHub.


Topic Trends, the latest feature added to Moz Content, allows marketers to quickly access a snapshot of the most popular and the most relevant content in any vertical.

By accessing the content in the Content Search index, Topic Trends highlights the topics that were written about most frequently in the previous five days.

Since the presidential election is the hottest thing going at the moment, it's little surprise that election news is dominant:

Topic Trends displaying the most current, most popular topics

This feature is based on the Sharing Trends Graph, which highlights the number of articles matching your search in the Moz Content index, in addition to factoring in the median number of shares per article.

By typing "Featured Snippets" into the search field, for example, you get a two-line graph that's rich in details that can instantly inform your content strategy:

The grey line represents the number of articles published about the topic being searched for over the last five days.

The green line depicts the median shares of those articles.

“We're using the graph as a rough indicator of audience interest in the topic,” says Jay Leary, Moz's senior product manager for Moz Content and one of the lead architects behind the product. “It’s sort of like a Google Trends, but instead of searches for a topic, we're looking at the sharing of articles about the topic."

Below the graph you'll find a list of results along with their content metrics, including Reach, Links, Discovery Date and a host of other metrics, including those associated with social shares:

Created by content marketers, for content marketers

Ever since Matthew J. Brown announced the Beta version of Moz Content at MozCon 2015, we've been focused on designing, creating and delivering a tool that will make it easier for marketers to create the types of content that'll resonate with their audiences.

The Tracked Audits feature is ideal for brands that already have an audience; if you're just getting started, the focus is usually on research, and that's where Content Search comes in.

The Tracked Audits feature, for example, provides marketers with all of the information a normal content audit would, but has the added dimension of an extended and customizable timeline.

Instead of spending hours auditing content manually (which I've done numerous times; it's no fun), you can simply have updates emailed to you detailing everything you need to know to track content performance.

Also, thanks to Content Search, you can find the most popular pieces of content from across the web via a simple topic search:

Results returned for the query "content marketing"

Not only will this allow you to be better informed about the content types and topics you should create, but it also alerts you to who your main competitors are for the content you want to create.

Share your feedback

If you've already been using the product, try out the newest feature and let us know what you think.

Either way, we'd love to hear from users of the product. We're always looking for ways to improve it and welcome your input.

Feel free to share your thoughts in the comments below.



In that post I presented some concepts that, in my personal opinion, we SEOs needed to pay attention to in order to follow the evolution of Google.

Sure, I also presented a theory which ultimately proved incorrect; I was much too confident about things like rel="author", rel="publisher", and the potential decline of the Link Graph influence.

However, the premises of that theory were substantially correct, and they remain correct five years later:

Technical SEO is foundational to the SEO practice;

The user is king, which means that Google will focus more and more on delivering the best user search experience — hence, SEO must evolve from "Search Engine Optimization" into "Search Experience Optimization";

That web performance optimization (site speed), 10X content, and semantics would play a big role in SEO.

Many things have changed in our industry in the past 5 years. The time has come to pause, take a few minutes, and assess what Google is and where it's headed.

I'll explain how I "study" Google and what I strongly believe we, the SEOs, should pay attention to if we want not only to survive, but to anticipate Google's end game, readying ourselves for the future.

Obviously, consider that, while I believe it's backed up by data, facts, and proof, this is my opinion. As such, I kindly ask you not to take what I write for granted, but rather as an incentive for your own investigations and experiments.

Like her, I am a strategist by nature. I love to investigate, to see connections where nobody else seems to see them, and to dig deeper into finding answers to complex questions, then design plans based on my investigations.

This way of being means that, when I look at the mysterious wormhole that is Google, I examine many sources:

The official Google blogs;

The “Office Hours” hangouts;

The sometimes contradictory declarations Googlers make on social media (when they don’t share an infinite loop of GIFs);

The Google Patents and the ones filed by people now working for Google;

The news (and stories) about the companies Google acquires;

The biographies of the people Google employs in key areas;

The “Google Fandom” (aka what we write about it);

Rumors and propaganda.

Now, when examining all these sources, it's easy to create amazing conspiranoiac (conspiracy + paranoia) theories. And I confess: I helped create, believed, and defended some of them, such as AuthorRank.

In my opinion, though, this methodology for finding answers about Google is the best one for understanding the future of our beloved industry of search.

If we don't dig into the "Expanded Universe of Google," what we have is a timeline composed only of updates (Panda 1.N, Penguin 1.N, Pigeon…), which is totally useless in the long term:


Instead, if we create a timeline with all the events related to Google Search (which we can discover simply by being well-informed), we begin to see where Google's heading:


The timeline above confirms what Google itself openly declared:

"Machine Learning is a core, transformative way by which we’re rethinking how we’re doing everything."
– (Sundar Pichai)

Machine learning is becoming so essential to the evolution of Google and search that perhaps we should go beyond listening only to official Google spokespeople like Gary Illyes or John Mueller (nothing personal, just to be clear... for instance, read this enlightening interview of Gary Illyes by Woj Kwasi). Maybe we should start paying more attention to what people like Christine Robson, Greg Corrado, Jeff Dean, and the staff of Google Brain write and say.

The second timeline tells us that, beginning in 2013, Google invested money, intellectual effort, and energy on a sustained scale in:

Machine learning;

Semantics;

Context understanding;

User behavior (or “Signals/Semiotics,” as I like to call it).

2013: The year when everything changed

Google rolled out Hummingbird only three years ago, but it isn't just a figure of speech to say that it feels like decades ago.

Let’s quickly rehash: what's Hummingbird?

Hummingbird is the Google algorithm as a whole. It's composed of four phases:

Crawling, which collects information on the web;

Parsing, which identifies the type of information collected, sorts it, and forwards it to a suitable recipient;

Indexing, which identifies and associates resources in relation to a word and/or a phrase;

Search, which...

Understands the queries of the users;

Retrieves information related to the queries;

Filters and clusters the information retrieved;

Ranks the resources; and

Paints the search result page and so answers the queries.

This last phase, Search, is where we can find the “200+ ranking factors” (RankBrain included) and filters like Panda or anti-spam algorithms like Penguin.

Remember that there are as many Search phases as there are vertical indices (documents, images, news, video, apps, books, maps...).

We SEOs tend to fixate almost exclusively on the Search phase, forgetting that Hummingbird is more than that.

This approach to Google is myopic and does not withstand a very simple logical square exercise.

If Google is able to correctly crawl a website (Crawling);

to understand its meaning (Parsing and Indexing);

and, finally, if the site itself responds positively to the many ranking factors (Search);

then that website will be able to earn the organic visibility it aims to reach.

If even one of the three elements of the logical square is missing, organic visibility is missing; think about non-optimized AngularJS websites, and you’ll understand the logic.

The website on the left in a non-JS enabled browser. On the right, JS enabled reveals all of the content. Credit: Builtvisible.com
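The logical square above can be reduced to a toy boolean check. This is only a sketch; the function and parameter names are invented for illustration:

```python
def organic_visibility(crawlable, understandable, ranks_well):
    """The 'logical square': a page earns organic visibility only if
    Crawling, Parsing/Indexing, and Search all succeed."""
    return crawlable and understandable and ranks_well

# A non-optimized AngularJS site: Google may crawl the URL, but it cannot
# parse content rendered only client-side, so visibility fails.
angular_site = organic_visibility(crawlable=True, understandable=False,
                                  ranks_well=True)
```

However good the page's ranking signals are, one failed phase zeroes out the result.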

Why does entity search matter?

It matters because entity search is the reason Google better understands the personal and almost unique context of a query.

Moreover, thanks to entity search, Google better understands the meaning of the documents it parses. This means it's able to index them better and, finally, to achieve its main purpose: serving the best answers to the users' queries.

This is why semantics is important: semantic search is optimizing for meaning.

It's not a ranking factor, it's not needed to improve crawling, but it is fundamental for Parsing and Indexing, the big forgotten-by-SEOs algorithm phases.

Semantics and SEO

First of all, we must consider that there are different kinds of semantics and that, sometimes, people tend to get them confused.

Logical semantics, which is about the relations between concepts/linguistic elements (e.g.: reference, presupposition, implication, et al)

Lexical semantics, which is about the meaning of words and their relation.

Logical semantics

Structured data is the big guy right now in logical semantics, and Google (both directly and indirectly) is investing a lot in it.

A couple of months ago, when the mainstream marketing gurusphere was discussing the 50 shades of the new Instagram logo or the average SEO was (justifiably) shaking his fists against the green “ads” button in the SERPs, Google released the new version of Schema.org.

Use prerender in order to let the browser begin loading the pages your users may visit after the one they're currently on, anticipating the load of the JSON-LD elements of those pages.

The importance Google gives to Schema.org and structured data is confirmed by the new and radically improved version of the Structured Data Testing Tool, which is now more actionable for identifying mistakes and test solutions thanks to its JSON-LD (again!) and Schema.org contextual autocomplete suggestions.
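As a minimal sketch of what that JSON-LD looks like in practice, here's how a snippet can be generated and embedded. The event details below are invented; only the `@context`/`@type` structure follows Schema.org:

```python
import json

def event_jsonld(name, start_date, city):
    """Build a minimal schema.org Event as a JSON-LD dictionary."""
    return {
        "@context": "https://schema.org",
        "@type": "Event",
        "name": name,
        "startDate": start_date,
        "location": {"@type": "Place", "address": city},
    }

# The <script type="application/ld+json"> wrapper is how the markup
# is embedded in a page's HTML.
snippet = '<script type="application/ld+json">%s</script>' % json.dumps(
    event_jsonld("The Inbounder", "2016-05-20", "Valencia"), indent=2)
```

Pasting the generated snippet into the Structured Data Testing Tool is a quick way to validate it before deploying.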

Semantics is more than structured data #FTW!

One mistake I foresee is thinking that semantic search is only about structured data.

It's the same kind of mistake people make in international SEO, when reducing it to hreflang alone.

The reality is that semantics is present from the very foundations of a website, found in:

Its code, specifically HTML;

Its architecture.

HTML

Its latest version, HTML5, added new semantic elements, the purpose of which is to semantically organize the structure of a web document and, as W3C says, to allow “data to be shared and reused across applications, enterprises, and communities.”

A clear example of how Google is using the semantic elements of HTML are its Featured Snippets or answer boxes.
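Those semantic elements make a page's structure machine-readable. As a small sketch, a parser can recover a page's semantic outline from the tags alone (the markup below is an invented miniature of a Databank-style page):

```python
from html.parser import HTMLParser

# HTML5 sectioning and grouping elements that carry structural meaning
SEMANTIC_TAGS = {"article", "section", "nav", "aside",
                 "header", "footer", "main", "figure"}

class SemanticOutline(HTMLParser):
    """Collect the HTML5 semantic elements that give a document its structure."""
    def __init__(self):
        super().__init__()
        self.found = []

    def handle_starttag(self, tag, attrs):
        if tag in SEMANTIC_TAGS:
            self.found.append(tag)

parser = SemanticOutline()
parser.feed("<main><article><header><h1>TIE fighter</h1></header>"
            "<section>Specs</section></article><nav>Databank</nav></main>")
```

A crawler reading only the tag names already learns which block is the main content, which is navigation, and where the article begins.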

Everything starts with the right ontology

Ontology is a set of concepts and categories in a subject area (or domain) that shows their properties and the relations between them.

If we take the Starwars.com site as example, we can see in the main menu the concepts in the Star Wars subject area:

News/Blog;

Video;

Events;

Films;

TV Shows;

Games/Apps;

Community;

Databank (the Star Wars Encyclopedia).

Ontology leads to taxonomy (because everything can be classified)

If we look at Starwars.com, we see how every concept included in the Star Wars domain has its own taxonomy.

For instance, the Databank presents several categories, like:

Characters;

Creatures;

Locations;

Vehicles;

Et cetera, et cetera.

Ontology and taxonomy, then, lead to context

If we think of Tatooine, we tend to think about the planet where Luke Skywalker lived his youth.

However, if we visit a website about deep space exploration, Tatooine would be one of the many exoplanets that astronomers have discovered in the past few years.

As you can see, ontology (Star Wars vs celestial bodies) and taxonomies (Star Wars planets vs exoplanets) determine context and help disambiguate between similar entities.

Ontology, taxonomy, and context lead to meaning

The better we define the ontology of our website, structure its taxonomy, and offer better context to its elements, the better we explain the meaning of our website — both to our users and to Google.

Starwars.com, again, is very good at doing this.

For instance, if we examine how it structures a page like the one on TIE fighters, we see that every possible kind of content is used to help explain what a TIE fighter is:

Generic description (text);

Appearances of the TIE fighter in the Star Wars movies (internal links with optimized anchor text);

Affiliations (internal links with optimized anchor text);

Dimensions (text);

Videos;

Photo gallery;

Soundboard (famous quotes by characters. In this case, it would be the classic "zzzzeeewww" sound many of us used as the ring tone on our old Nokias :D);

Quotes (text);

History (a substantial article with text, images, and links to other documents);

Related topics (image plus internal links).

In the case of characters like Darth Vader, the information can be even richer.

The effectiveness of the information architecture of the Star Wars website (plus its authority) is such that its Databank is one of the very few non-Wikidata/Wikipedia sources that Google is using as a Knowledge Graph source.

What tool can we use to semantically optimize the structure of a website?

There are, in fact, several tools we can use to semantically optimize the information architecture of a website.

Knowledge Graph Search API

The first one is the Knowledge Graph Search API, because in using it we can get a ranked list of the entities that match given criteria.

This can help us better define the subjects related to a domain (ontology) and can offer ideas about how to structure a website or any kind of web document.
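A minimal sketch of calling it: the snippet below only builds the request URL (the API key is a placeholder, and the actual network call is left commented out):

```python
from urllib.parse import urlencode
# from urllib.request import urlopen  # uncomment to actually call the API

def kg_search_url(query, api_key, limit=5):
    """Build a Knowledge Graph Search API request URL.
    `api_key` is a placeholder for your own Google API key."""
    params = urlencode({"query": query, "key": api_key, "limit": limit})
    return "https://kgsearch.googleapis.com/v1/entities:search?" + params

url = kg_search_url("lightsaber", "YOUR_API_KEY")
# The JSON response lists entities in itemListElement, each with a
# resultScore you can use to rank entities related to your topic.
```

Running the same query for each concept in your ontology gives you a ranked entity list per section of the site.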

RelFinder

A second tool we can use is RelFinder, which is one of the very few free tools for entity research.

As you can see in the screencast below, RelFinder is based on Wikipedia. Its use is quite simple:

Choose your main entity (eg: Star Wars);

Choose the entity you want to see connections with (eg: Star Wars Episode IV: A New Hope);

Click "Find Relations."

RelFinder will detect entities related to both (e.g.: George Lucas or Marcia Lucas), their disambiguating properties (e.g.: George Lucas as director, producer, and writer) and factual ones (e.g.: lightsabers as an entity related to Star Wars and first seen in Episode IV).

RelFinder is very useful if we must do entity research on a small scale, such as when preparing a content piece or a small website.

However, if we need to do entity research on a bigger scale, it's much better to rely on the following tools:

AlchemyAPI and other tools

AlchemyAPI, which was acquired by IBM last year, uses machine and deep learning in order to do natural language processing, semantic text analysis, and computer vision.

AlchemyAPI, which offers a 30-day trial API Key, is based on the Watson technology; it allows us to extract a huge amount of information from text, with concepts, entities, keywords, and taxonomy offered by default.

How do we conduct semantically focused keyword and topical research?

Despite its recent update, Keyword Planner still can be useful for performing semantically focused keyword and topical research.

In fact, that update could even be deemed as a logical choice, from a semantic search point of view.

Terms like "PPC" and "pay-per-click" are synonyms, and even though each one surely has a different search volume, it's evident how Google presents two very similar SERPs if we search for one or the other, especially if our search history already exhibits a pattern of searches related to SEM.

Yet this dimming of keyword data is less helpful for SEOs in that it makes for harder forecasting and prioritization of which keywords to target. This is especially true when we search for head terms, because it exacerbates a problem that Keyword Planner had: combining stemmed keywords that — albeit having "our keyword" as a base — have nothing in common because they mean completely different things and target very different topics.

However (and this is a pro tip), there is a way to discover the most useful keywords, even when they all have the same search volume: how much advertisers bid for them. Trust the market ;-).
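As a minimal sketch of that heuristic (the keyword set, volumes, and bids below are invented for illustration):

```python
def prioritize(keywords):
    """Sort keyword candidates by search volume, then by advertiser bid
    as a tiebreaker: the 'trust the market' heuristic."""
    return sorted(keywords, key=lambda k: (k["volume"], k["bid"]), reverse=True)

candidates = [
    {"kw": "lightsaber replica", "volume": 1000, "bid": 0.85},
    {"kw": "lightsaber toy",     "volume": 1000, "bid": 1.40},
    {"kw": "lightsaber gif",     "volume": 1000, "bid": 0.10},
]
ranked = prioritize(candidates)  # equal volumes, so bids break the tie
```

When volumes tie, the keyword advertisers pay the most for floats to the top, which is usually the one with the clearest commercial intent.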

Keyword Planner for semantic search

Let's say we want to create a site about Star Wars lightsabers (yes, I am a Star Wars geek).

What we could do is this:

Open Keyword Planner / Find new Keywords and get (AH!) search volume data;

Describe our product or service ("News" in the snapshot above);

Use the Wikipedia page about lightsabers as a landing page (if your site were in Spanish, the Wikipedia page should be the Spanish one);

Indicate our product category (Movies & Films above);

Define the target and eventually indicate negative keywords;

Click on "Get Ideas."

Google will offer us these Ad Groups as results:


The Ad Groups are a collection of semantically related keywords. They're very useful for:

Individuating topics;

Creating a dictionary of keywords that can be given to writers, so their text will be both natural and semantically consistent.

Remember, then, that Keyword Planner allows us to do other kinds of analysis too, such as breaking down how the discovered keywords/Ad Groups are used by device or by location. This information is useful for understanding the context of our audience.

If you have one or a few entities for which you want to discover topics and grouped keywords, working directly in Keyword Planner and exporting everything to Google Sheets or an Excel file can be enough.

However, when you have tens or hundreds of entities to analyze, it's much better to use the Adwords API or a tool like SEO Powersuite, which allows you to do keyword research following the method I described above.

Google Suggest, Related Searches, and Moz Keyword Explorer

Alongside Keyword Planner, we can use Google Suggest and Related Searches. Not for simply individuating topics that people search and then writing an instant blog post or a landing page about them, but for reaffirming and perfecting our site's architecture.

Continuing with the example of a site or section specializing in lightsabers, if we look at Google Suggest we can see how "lightsaber replica" is one of the suggestions.

Moreover, amongst the Related Searches for "lightsaber," we see "lightsaber replica" again, which is a clear signal of its relevance to "lightsaber."

Finally, we can click on and discover "lightsaber replica"-related searches, thus creating what I define as the "search landscape" about a topic.
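That expansion can be sketched as a breadth-first walk over suggestions. Note the suggest endpoint below is unofficial and undocumented (an assumption on my part), so the demo runs offline against canned data:

```python
from urllib.parse import urlencode

# Unofficial suggest endpoint; an assumption, as Google doesn't document it.
SUGGEST_ENDPOINT = "https://suggestqueries.google.com/complete/search?"

def suggest_url(query):
    """Build a Google Suggest request URL for a query."""
    return SUGGEST_ENDPOINT + urlencode({"client": "firefox", "q": query})

def search_landscape(seed, fetch_suggestions, depth=2):
    """Breadth-first expansion of suggestions into a 'search landscape'.
    `fetch_suggestions` is any callable mapping a query to related queries,
    so the function also works offline with canned data."""
    seen, frontier = {seed}, [seed]
    for _ in range(depth):
        frontier = [s for q in frontier
                    for s in fetch_suggestions(q) if s not in seen]
        seen.update(frontier)
    return seen

# Offline demo with invented, canned suggestions:
canned = {"lightsaber": ["lightsaber replica"],
          "lightsaber replica": ["lightsaber replica force fx"]}
landscape = search_landscape("lightsaber", lambda q: canned.get(q, []))
```

Two levels of expansion already surface second-degree topics ("lightsaber replica force fx") that never appear in the suggestions for the seed term itself.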

The model above is not scalable if we have many entities to analyze. In that case, a tool like Moz Keyword Explorer can be helpful thanks to the options it offers, as you can see in the snapshot below:

The Suggest function, though, is present in (almost) every website that has a search box (your own site, even, if you have it well-implemented!).

This means that if we're searching for more mainstream and top-of-the-funnel topics, we can use the suggestions of social networks like Pinterest (i.e.: explore the voluptuous universe of "lightsaber cakes" and related topics):

Pinterest, then, is a real topical research goldmine thanks to its tagging system:

On-page

Once we've defined the architecture, the topics, and prepared our keyword dictionaries, we can finally work on the on-page facet of our work.

User search behavior

While these posts are super actionable, present interesting information with original data, and confirm other tests conducted in the past, these so-called user signals (CTR and dwell time) may not be directly related to RankBrain but, instead, to user search behaviors and personalized search.

Even if RankBrain may be included in the semantic search landscape due to its use of Word2Vec technology, I find it better to concentrate on how Google may use user search behaviors to better understand the relevance of the parsed and indexed documents.

Click-through rate

Since Rand Fishkin presented his theory — backed up with tests — that Google may use CTR as a ranking factor more than two years ago, a lot has been written about the importance of click-through rate.

Common sense suggests that if people click more often on one search snippet than another that perhaps ranks in a higher position, then Google should take that users' signal into consideration, and eventually lift the ranking of the page that consistently receives higher CTR.

Common sense, though, is not so easy to apply when it comes to search engines, and repeatedly Googlers have declared that they do not use CTR as a ranking factor (see here and here).

And although Google has long since developed a click fraud detection system for Adwords, it's still not clear if it would be able to scale it for organic search.

What's less discussed is the importance CTR has in personalized search, as we know that Google tends to paint a custom SERP for each of us depending on both our search history and our personal click-through rate history. They're key in helping Google determine which SERPs will be the most useful for us.

For instance:

If we search for something for the first time, and for that search we have no specific history (or not enough to trigger personalized results), then it's thanks to our personal CTR and search history that Google determines which search results related to a given entity to show us or not (amber the stone, or Amber Rose, or Amber Alerts...).

Finally, even if Google does not use CTR as a ranking factor, this doesn't mean it's not an important metric and signal for SEOs. We have years of experience and hundreds of tests proving how important it is to optimize our search snippets (and now Rich Cards) with the appropriate use of structured data in order to earn more organic traffic, even if we rank worse than our competitors.

Watch time

Having good CTR metrics is totally useless if the pages our visitors land on don't fulfill the expectation the search snippet created.

This is similar to the difference between a clickbait headline and a persuasive one. The first will probably cause a click back to the search results page; the second, instead, will retain and engage the visitors.

The ability of a site to retain its users is what we usually call dwell time, but which Google defines as watch time in this patent: Watch Time-Based Ranking (March 2013).

This patent is usually cited in relation to video because the patent itself uses video as content example, but Google doesn't restrict its definition to videos alone:

In general, "watch time" refers to the total time that a user spends watching a video. However, watch times can also be calculated for and used to rank other types of content based on an amount of time a user spends watching the content.

Watch time is indeed a more useful user signal than CTR for understanding the quality of a web document and its content.

Facebook, for instance, has declared something very similar about ranking in its News Feed:

We’re learning that the time people choose to spend reading or watching content they clicked on from News Feed is an important signal that the story was interesting to them.

We are adding another factor to News Feed ranking so that we will now predict how long you spend looking at an article in the Facebook mobile browser or an Instant Article after you have clicked through from News Feed. This update to ranking will take into account how likely you are to click on an article and then spend time reading it. We will not be counting loading time towards this — we will be taking into account time spent reading and watching once the content has fully loaded. We will also be looking at the time spent within a threshold so as not to accidentally treat longer articles preferentially.

With this change, we can better understand which articles might be interesting to you based on how long you and others read them, so you’ll be more likely to see stories you’re interested in reading.
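The watch-time idea can be sketched as a toy re-ranking function. The formula, the baseline, and the cap below are my own invented assumptions for illustration, not Google's actual math:

```python
def watch_time_score(base_relevance, avg_watch_seconds, typical_seconds=60.0):
    """Toy signal in the spirit of the Watch Time-Based Ranking patent:
    boost documents users stay on, dampen quick click-backs.
    The 60s baseline and the 2x cap are illustrative assumptions."""
    engagement = min(avg_watch_seconds / typical_seconds, 2.0)  # cap the boost
    return base_relevance * (0.5 + 0.5 * engagement)

sticky = watch_time_score(1.0, avg_watch_seconds=120)  # well-read page
bounce = watch_time_score(1.0, avg_watch_seconds=5)    # instant click-back
```

Two pages with identical base relevance end up with very different scores once user retention enters the equation.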

Context and the importance of personalized search

I usually joke that the biggest mistake a gang of bank robbers could make is bringing along their smartphones. It'd be quite easy to do PreCrime investigations simply by checking their activity board, which includes their location history on Google Maps.

A conference day in Adelaide.

In order to fulfill its mission of offering the best answers to its users, Google must not only understand the web documents it crawls so as to index them properly, and not only improve its own ranking factors (taking into consideration the signals users give during their search sessions); it also needs to understand the context in which users perform a search.

Here's what Google knows about us:

It's because of this compelling need to understand our context that Google hired the entire Behav.io team back in 2013.

Behav.io, if you don't know already, was a company that developed an alpha test software based on its open source framework Funf (still alive), the purpose of which was to record and analyze the data that smartphones keep track of: location, speed, nearby devices and networks, phone activity, noise levels, et al.

All this information is required in order to better understand the implicit aspects of a query, especially if done from a smartphone and/or via voice search, and to better process what Tom Anthony and Will Critchlow define as compound queries.

However, personalized search is also determined by (again) entity search, specifically by search entities.

The relation between search entities creates a "probability score," which may determine if a web document is shown in a determined SERP or not.

For instance, let's say that someone performs a search about a topic (e.g.: Wookies) for which she never clicked on a search snippet of our site, but on another that had content about that same topic (e.g.: Wookieepedia) and which linked to the page about it on our site (e.g.: "How to distinguish one wookiee from another?").

Those links — specifically their anchor texts — would help our site and page to earn a higher probability score than a competitor site that isn't linked to by those sites present in the user's search history.

This means that our page will have a better probability of appearing in that user's personalized SERP than our competitors'.

You're probably asking: what's the actionable point of this patent?

Link building/earning is not dead at all, because it's relevant not only to the Link Graph, but also to entity search. In other words, link building is semantic search, too.
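As a toy sketch of the search-entities idea described above (the weight of 0.2 per overlapping site is invented purely for illustration; the patent does not publish a formula):

```python
def probability_boost(page_links, user_history):
    """Toy version of the search-entities idea: a page gains probability
    of appearing in a personalized SERP for every linking site the user
    has already visited in their search history."""
    linking_sites = {site for site, anchor in page_links}
    overlap = linking_sites & set(user_history)
    return 1.0 + 0.2 * len(overlap)  # invented weight per overlapping site

# Our page is linked (with a descriptive anchor) from a site in the
# user's search history, so it gets a boost; an unlinked competitor doesn't.
links = [("wookieepedia.example",
          "how to distinguish one wookiee from another")]
history = ["wookieepedia.example", "starwars.com"]
boost = probability_boost(links, history)
```

A competitor with no links from the user's search history keeps the neutral score of 1.0, which is the whole point: links earn personalized visibility, not just PageRank.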

The importance of branding and offline marketing for SEO

One of the classic complaints SEOs have about Google is how it favors brands.

The real question, though, should be this: "Why aren't you working to become a brand?"

Be aware! I am not talking about "vision," "mission," and "values" here — I'm talking about plain and simple semantics.

Throughout this post I spoke of entities (named and search ones), cited Word2Vec (vectors are "vast amounts of written language embedded into mathematical entities"), talked about lexical semantics, meaning, ontology, and personalized search, and implied topics like co-occurrences and knowledge bases.

Branding has a lot to do with all of these things.

I'll try to explain it with a very personal example.

Last May in Valencia I debuted as conference organizer with The Inbounder.

One of the problems I faced when promoting the event was that "inbounder," which I thought was a cool name for an event targeting inbound marketers, is also a basketball term.

The problem was obvious: how do I make Google understand that The Inbounder was not about basketball, but digital marketing?

The strategy we followed from the very beginning was to work on the branding of the event (I explain more about The Inbounder story here on Inbound.org).

We did this:

We created small local events, so as to develop a presence in local newspapers online and offline, a tactic that also led marketers to search on Google for the event using branded keywords (e.g.: "The Inbounder conference," "The Inbounder Inbound Marketing Conference," etc.);

We worked with influencers (the speakers themselves) to trigger branded searches and direct traffic (remember: Chrome stores every URL we visit);

We did outreach and published guest posts about the event on sites visited by our audience (and recorded in its search history).

As a result, right now The Inbounder occupies the entire first page of Google for its brand name and, more importantly in semantic terms, Google presents The Inbounder events as suggested and related searches. It associates the event with all the searches I could ever want:

Another example is Trivago and its global TV advertising campaigns:

Trivago was very smart in constantly showing "Trivago" and "hotel" in the same phrase, even making their motto "Hotel? Trivago."

This is a simple psychological trick for creating word associations.

As a result, people searched on Google for "hotel Trivago" (or "Trivago hotel"), especially just after the ads were broadcast:

One of the results is that now, Google suggests "hotel Trivago" when we start typing "hotel" and, as in the case of The Inbounder, it presents "hotel Trivago" as a related search:

Wake up SEOs, the new new Google is here

Yes, it is. And it's all about better understanding web documents and queries in order to provide the best answers to its users (and make money in the meantime).

Remember, SEO is no longer just about "200 ranking factors." SEO is about making our websites become the sources Google cannot help but use for answering queries.

This is exactly why semantic search is of utmost importance and not just something worth the attention of a few geeks passionate about linguistics, computer science, and patents.

Work on parsing and indexing optimization now, seriously implement semantic search in your SEO strategy, take advantage of the opportunities personalized search offers you, and always put users at the center of everything you do.

In doing so you'll build a solid foundation for your success in the years to come, both via classic search and with Google Assistant/Now.



For what seems like forever, SEOs have operated by a set of best practices that dictate how to best handle redirection of URLs. (This is the practice of pointing one URL to another. If you need a quick refresher, here’s a handy guide on HTTP status codes.)

These tried and true old-school rules included:

301 redirects result in around a 15% loss of PageRank. Matt Cutts confirmed this in 2013 when he explained that a 301 loses the exact same amount of PageRank as a link from one page to another.

302s don’t pass PageRank. By definition, 302s are temporary, so it makes sense for search engines to treat them differently.

HTTPS migrations lose PageRank. This is because they typically involve lots of 301 redirects.

These represent big concerns for anyone who wants to change a URL, deal with an expired product page, or move an entire website.

The risk of losing traffic can mean that making no change at all becomes the lesser of two evils. Many SEOs have delayed site migrations, kept their URLs ugly, and have put off switching to HTTPS because of all the downsides of switching.
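The arithmetic behind that old "15% loss" rule is the classic PageRank damping factor of 0.85: each 301 hop kept roughly 85% of the equity, exactly like one more link in the chain. A quick sketch of how chained redirects compounded the damage:

```python
DAMPING = 0.85  # the classic PageRank damping factor behind the ~15% rule

def equity_after_hops(hops, damping=DAMPING):
    """Under the old rule, each 301 hop kept ~85% of PageRank,
    like one more link in the chain; hops multiply."""
    return damping ** hops

one_hop = equity_after_hops(1)     # the famous ~15% loss
three_hops = equity_after_hops(3)  # e.g. HTTP -> HTTPS -> www -> trailing slash
```

Three chained hops would have shed almost 39% of the original equity, which is exactly why SEOs dreaded migrations under the old rules.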

The New Rules of 3xx Redirection

Perhaps because of the downsides of redirection — especially with HTTPS — Google has worked to chip away at these axioms over the past several months.

In February, Google’s John Mueller announced that no PageRank is lost for 301 or 302 redirects from HTTP to HTTPS. This was largely seen as an effort by Google to increase webmaster adoption of HTTPS.

Google’s Gary Illyes told the SEO world that Google doesn’t care which redirection method you use, be it 301, 302, or 307. He explained Google will figure it out and they all pass PageRank.

Most recently, Gary Illyes cryptically announced on Twitter that 3xx redirects (shorthand for the entire 300 range) no longer lose PageRank at all.

While these are welcome changes from Google, there are still risks and considerations when moving URLs that go way beyond PageRank. We’ll cover these in a moment.

First, here’s a diagram that attempts to explain the old concepts vs. Google’s new announcements.
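To make the contrast concrete, the old rules versus the new announcements can be sketched as a toy model. The 15% figure comes from Matt Cutts's 2013 statement quoted above; none of this is an official Google formula, just an illustration of the article's claims:

```python
def pagerank_passed_old(status: int, pagerank: float) -> float:
    """Old-school rules: 301s lose ~15%, 302s pass nothing."""
    if status == 301:
        return pagerank * 0.85   # ~15% loss, per Matt Cutts (2013)
    if status == 302:
        return 0.0               # temporary redirects passed no PageRank
    return 0.0

def pagerank_passed_new(status: int, pagerank: float) -> float:
    """New rules: all 3xx redirects pass full PageRank."""
    if 300 <= status <= 399:
        return pagerank
    return 0.0
```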

Let’s cover some myths and misconceptions by answering common questions about redirection.

Q: Can I now 301 redirect everything without risk of losing traffic?

A: No

All redirects carry risk.

While it’s super awesome that Google is no longer “penalizing” 301 redirects through loss of PageRank, keep in mind that PageRank is only one signal out of hundreds that Google uses to rank pages.

If you 301 redirect a page to an exact copy of that page, and the only thing that changes is the URL, then in theory you can expect no traffic loss under these new guidelines.

That said, the more moving parts you introduce, the hairier things get. Don’t expect your redirects to non-relevant pages to carry much, if any, weight. Redirecting your popular Taylor Swift fan page to your affiliate marketing page selling protein powder is likely dead in the water.

In fact, Glenn Gabe recently uncovered evidence that Google treats redirects to irrelevant pages as soft 404s. In other words, it's a redirect that loses both link equity and relevance.

Q: Is it perfectly safe to use 302 for everything instead of 301s?

A: Again, no

A while back, we heard that the reason Google started treating 302 (temporary) redirects like 301s (permanent) is that so many websites were implementing the wrong type (302s when they meant 301s) that it caused havoc with how Google ranked pages.

The problem is that while we now know that Google passes PageRank through 302s, we still have a few issues. Namely:

We don’t know if 301s and 302s are equal in every way. In the past, we’ve seen 302s eventually pass PageRank, but only after considerable time had passed. In contrast to 301s, which pass link signals fairly quickly, we don’t yet know how quickly 302s are handled.

302 is a web standard, and Google isn’t the only player on the block. 302s are meant to indicate a temporary redirect, and it’s quite possible that other search engines (Baidu, Bing, DuckDuckGo) and social services (Facebook, Twitter, etc) treat 302s differently than Google.

Rand Fishkin summed it up nicely.

On Google's announcement that "30xs pass pagerank" -- be wary. Test. Don't assume. Pagerank isn't the only or most important ranking signal.
— Rand Fishkin (@randfish) July 26, 2016

Google's made announcements like this before that later showed to work differently in the real world. Pays to be a skeptic in our field.
— Rand Fishkin (@randfish) July 26, 2016

Q: If I migrate my site to HTTPS, will I keep all my traffic?

A: Maybe

A little backstory. Google wants the entire web to switch to HTTPS. To this end, they announced a small rankings boost to encourage sites to make the switch.

The problem was that a lot of webmasters weren’t willing to trade a tiny rankings boost for the 15% loss in link equity they would experience by 301 redirecting their entire site. This appears to be why Google changed its stance so that 301s no longer lose PageRank.

Even without PageRank issues, HTTPS migrations can be incredibly complicated, as Wired discovered to their dismay earlier this year. It’s been over a year since we migrated Moz.com, and we’re glad we did, but there were lots of moving parts in play and the potential for lots of things to go wrong. So as with any big project, be aware of the risks as well as the rewards.

Case study: Does it work?

Unknowingly, I had the chance to test Google’s new 3xx PageRank rules when migrating a small site a few months ago. (While we don’t know when Google made the change, it appears it’s been in place for a while now.)

This particular migration not only moved to HTTPS, but to an entirely new domain as well. Other than the URLs, every other aspect of the site remained exactly the same: page titles, content, images, everything. That made it the perfect test.

Going in, I fully expected to see a drop in traffic due to the 15% loss in PageRank. Below in the image, you can see what actually happened to my traffic.

Instead of a decline as expected, traffic actually saw a boost after the migration. Mind. Blown. This could possibly be from the small boost that Google gives HTTPS sites, though we can’t be certain.

Certainly this one small case isn't enough to prove decisively how 301s and HTTPS migrations work, but it's a positive sign.

The New Best Practices

While it’s too early to write the definitive new best practices, there are a few salient points to keep in mind about Google’s change to how PageRank passes through 3xx redirects.

Keep in mind that PageRank — and other link equity signals — are only a portion of the factors used by Google in ranking web pages.

Beyond PageRank, all other rules about redirection remain. If you redirect to a non-relevant page, or buy a website in order to redirect 1,000 pages to your homepage, you likely won’t see much of a boost.

The best redirect is where every other element stays the same, as much as possible, except for the URL.

Successful migrations to HTTPS are now less prone to lose PageRank, but there are many other crawling and indexing issues that may negatively impact traffic and rankings.

Changing URLs for SEO purposes, including...

Removing multiple query parameters

Improving directory/subfolder structure

Including keywords in the URL

Making URLs human-readable

… is less risky now that 3xx redirects preserve PageRank. That said, always proceed with caution when redirecting.
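As a sketch of that kind of URL cleanup, here's one way to strip tracking query parameters and build a redirect map of only the URLs that actually change. The parameter names in `TRACKING_PARAMS` are hypothetical examples, not a definitive list:

```python
from urllib.parse import urlsplit, urlunsplit

# Query parameters we treat as tracking noise (hypothetical list).
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "sessionid"}

def clean_url(url: str) -> str:
    """Drop tracking query parameters, keeping everything else intact."""
    parts = urlsplit(url)
    kept = [
        pair for pair in parts.query.split("&")
        if pair and pair.split("=")[0] not in TRACKING_PARAMS
    ]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       "&".join(kept), parts.fragment))

def build_redirect_map(urls):
    """Map each old URL to its cleaned form, skipping no-op redirects."""
    return {u: clean_url(u) for u in urls if clean_url(u) != u}
```

Feeding the resulting map into your server's 301 rules keeps every other element the same except the URL, which is exactly the low-risk redirect described above.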

When in doubt, see Best Practice #1.

Happy redirecting!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!


Is the practice of tracking keywords truly dying? There's been a great deal of industry discussion around the topic of late, and some key points have been made. In today's Whiteboard Friday, Rand speaks to the biggest challenges keyword rank tracking faces today and how to solve for them.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we're going to chat about keyword ranking reports. There have been a few articles that have come out recently on a number of big industry sites around whether SEOs should still be tracking their keyword rankings.

I want to be clear: Moz has a little bit of a vested interest here. So the question is: Can you actually trust me? Obviously, I'm a big shareholder in Moz and I'm the founder, so I care a lot about how Moz does as a software business. We help people track rankings. Does that mean I'm biased? I'm going to do my best not to be. So rather than saying you absolutely should track rankings, I'm instead going to address what most of these articles have brought up as the problems of rank tracking and then talk about some solutions.

My suspicion is you should probably be rank tracking. I think that if you turn it off and you don't do it, it's very hard to get a lot of the value that we need as SEOs, a lot of the intelligence. It's true there are challenges with keyword ranking reports, but not true enough to avoid doing it entirely. We still get too much value from them.

The case against — and solutions for — keyword ranking data

A. People, places, and things

So let's start with the case against keyword ranking data. First off, "keyword ranking reports are inaccurate." There's personalization, localization, and device type, and that bias has removed any notion of the "one true ranking." We've done a bunch of analyses of these, and this is absolutely the case.

Personalization, turns out, doesn't change ranking that much on average. For an individual it can change rankings dramatically. If they visited your website before, they could be historically biased to you. Or if they visited your competitor's, they could be biased. Their previous search history might have biased them in a single session, those kinds of things. But with the removal of Google+ from search results, personalization is actually not as dramatically changing as it used to be. Localization, though, still huge, absolutely, and device differences, still huge.

Solution

But we can address this, and the way to do that is by tracking these things separately. So here you can see I've got a ranking report that shows me my mobile rankings versus my desktop rankings. I think this is absolutely essential. Especially if you're getting a lot of traffic from both mobile and desktop search, you need to be tracking those separately. Super smart. Of course we should do that.

We can do the same thing on the local side as well. So I can say, "Here, look. This is how I rank in Seattle. Here's how I rank in Minneapolis. Here's how I rank in the U.S. with no geographic personalization," if Google were to do that. Those types of rankings can also be pretty good.

It is true that local rank tracking has gotten a little more challenging, but folks like Moz itself, STAT (GetStat), SERPs.com, and Search Metrics have all adjusted their rank tracking methodologies in order to have accurate local rank tracking. It's pretty good. Same with device type, pretty darn good.

B. Keyword value estimation

Another big problem that is expressed by a number of folks here is we no longer know how much traffic an individual keyword sends. Because we don't know how much an individual keyword sends, we can't really say, "What's the value of ranking for that keyword?" Therefore, why bother to even track keyword rankings?

I think this is a little bit of spurious logic. The leap there doesn't quite make sense to me. But I will say this. If you don't know which keywords are sending you traffic specifically, you still know which pages are receiving search traffic. That is reported. You can get it in your Google Analytics, your Omniture report, whatever you're using, and then you can tie that back to keyword ranking reports showing which pages are receiving traffic from which keywords.

Almost all of the rank tracking platforms, Moz included, have a report that shows you something like this. It says, "Here are the keywords that we believe are likely to have sent these percentages of traffic to this page based on the keywords that you're tracking, based on the pages that are ranking for them, and how much search traffic those pages receive."

Solution

So let's track that. We can look at pages receiving visits from search, and we can look at which keywords they rank for. Then we can tie those together, which gives us the ability to then make not only a report like this, but a report that estimates the value contributed by content and by pages rather than by individual keywords.

In a lot of ways, this is almost superior to our previous methodology of tracking by keyword. Keyword can still be estimated through AdWords, through paid search, but this can be estimated on a content basis, which means you get credit for how much value the page has created, based on all the search traffic that's flowed to it, and where that's at in your attribution lifecycle of people visiting those pages.
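That tie-together can be sketched in a few lines. The page names, traffic numbers, and per-keyword share estimates below are invented for illustration; a real rank tracker would derive the shares from ranking positions and click-through-rate models:

```python
# Page-level search visits from your analytics platform (hypothetical numbers).
page_visits = {"/flower-delivery": 1200, "/rose-varieties": 300}

# Estimated share of each page's search traffic attributable to a keyword,
# as a rank tracker might model it (hypothetical estimates).
keyword_share = {
    ("flower delivery", "/flower-delivery"): 0.60,
    ("same day flowers", "/flower-delivery"): 0.25,
    ("artisan rose varieties", "/rose-varieties"): 0.70,
}

def estimate_keyword_traffic():
    """Combine page visits with per-keyword share estimates."""
    return {
        kw: round(page_visits[page] * share)
        for (kw, page), share in keyword_share.items()
    }
```

Summing the estimates per page, rather than per keyword, gives the content-level value report described above.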

C. Tracking rankings and keyword relevancy

Pages often rank for keywords that they aren't specifically targeting, because Google has gotten way better with user intent. So it can be hard or even impossible to track those rankings, because we don't know what to look for.

Well, okay, I hear you. That is a challenge. This means basically what we have to do is broaden the set of keywords that we look at and deal with the fact that we're going to have to do sampling. We can't track every possible keyword, unless you have a crazy budget, in which case go talk to Rob Bucci up at STAT, and he will set you up with a huge campaign to track all your millions of keywords.

Solution

If you have a smaller budget, what you have to do is sample, and you sample by sets of keywords. Like these are my high conversion keywords — I'm going to assume I have a flower delivery business — so flower delivery and floral gifts and flower arrangements for offices. My long tail keywords, like artisan rose varieties and floral alternatives for special occasions, and my branded keywords, like Rand's Flowers or Flowers by Rand.

I can create a bunch of different buckets like this, sample the keywords that are in them, and then I can track each of these separately. Now I can see, ah, these are sets of keywords where I've generally been moving up and receiving more traffic. These are sets of keywords where I've generally been moving down. These are sets of keywords that perform better or worse on mobile or desktop, or better or worse in these geographic areas. Right now I can really start to get true intelligence from there.
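The bucketed sampling above might look like this in code. The keywords and rank numbers are invented; the point is that movement is summarized per bucket rather than per keyword:

```python
from statistics import mean

# Sampled keywords grouped into buckets, each with (last month, this month)
# average ranking positions (hypothetical data).
buckets = {
    "high conversion": {"flower delivery": (8, 5), "floral gifts": (12, 9)},
    "long tail": {"artisan rose varieties": (4, 6)},
    "branded": {"rand's flowers": (1, 1)},
}

def average_movement(bucket: dict) -> float:
    """Positive = moved up (rank number decreased); negative = moved down."""
    return mean(old - new for old, new in bucket.values())

for name, kws in buckets.items():
    print(f"{name}: {average_movement(kws):+.1f}")
```

The same grouping works for mobile versus desktop or per-geography samples: just keep a separate bucket dictionary per segment.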

Don't let your keyword targeting — your keyword targeting meaning what keywords you're targeting on which pages — determine what you rank track. Don't let it do that exclusively. Sure, go ahead and take that list and put that in there, but then also do some more expansive keyword research to find those broad sets of search terms and phrases that you should be monitoring. Now we can really solve this issue.

D. Keyword rank tracking with a purpose

This one I think is a pretty insidious problem. But for many organizations ranking reports are more of a historical artifact. We're not tracking them for a particular reason. We're tracking them because that's what we've always tracked and/or because we think we're supposed to track them. Those are terrible reasons to track things. You should be looking for reasons of real value and actionability. Let's give some examples here.

Solution

What I want you to do is identify the goals of rank tracking first, like: What do I want to solve? What would I do differently based on whether this data came back to me in one way or another?

If you don't have a great answer to that question, definitely don't bother tracking that thing. That should be the rule of all analytics.

So if your goal is to say, "Hey, I want to be able to attribute a search traffic gain or a search traffic loss to what I've done on my site or what Google has changed out there," that is crucially important. I think that's core to SEO. If you don't have that, I'm not sure how we can possibly do our jobs.

We attribute search traffic gains and losses by tracking broadly, a broad enough set of keywords, hopefully in enough buckets, to be able to get a good sample set; by tracking the pages that receive that traffic so we can see if a page goes way down in its search visits. We can look at, "Oh, what was that page ranking for? Oh, it was ranking for these keywords. Oh, they dropped." Or, "No, they didn't drop. But you know what? We looked in Google Trends, and the traffic demand for those keywords dropped," and so we know that this is a seasonality thing, or a fluctuation in demand, or those types of things.

And we can track by geography and device, so that we can say, "Hey, we lost a bunch of traffic. Oh, we're no longer mobile-friendly." That is a problem. Or, "Hey, we're tracking and, hey, we're no longer ranking in this geography. Oh, that's because these two competitors came in and they took over that market from us."

Another thing we could look at would be identifying pages that are in need of work, but only require a small amount of work to see a big change in traffic. So we could do things like track pages that rank on page two for given keywords. If we have a bunch of those, we can say, "Hey, maybe just a few on-page tweaks, a few links to these pages, and we could move up substantially." We had a Whiteboard Friday where we talked about how you could do that with internal linking previously and have seen some remarkable results there.

We can track keywords that rank in position four to seven on average. Those are your big wins, because if you can move up from position four, five, six, seven to one, two, three, you can double or triple your search traffic that you're receiving from keywords like that.
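Surfacing those "big win" keywords from a rank report can be a one-line filter. The rankings here are invented sample data:

```python
# Average ranking position per keyword (hypothetical data).
rankings = {
    "flower delivery": 5.2,
    "floral gifts": 1.8,
    "office flower arrangements": 6.9,
    "cheap flowers": 14.0,
}

def big_win_keywords(rankings: dict, lo: float = 4, hi: float = 7) -> list:
    """Keywords ranking in positions 4-7 on average: small pushes, big gains."""
    return sorted(kw for kw, pos in rankings.items() if lo <= pos <= hi)
```

Adjusting `lo` and `hi` (say, 11 to 20) gives you the page-two candidates from the previous tactic with the same function.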

You should also track long tail, untargeted keywords. If you've got a long tail bucket, like we've got up here, I can then say, "Aha, I don't have a page that's even targeting any of these keywords. I should make one. I could probably rank very easily because I have an authoritative website and some good content," and that's really all you might need.

We might look at some up-and-coming competitors. I want to track who's in my space, who might be creeping up there. So I should track the most common domains that rank on page one or two across my keyword sets.

I can track specific competitors. I might say, "Hey, Joel's Flower Delivery Service looks like it's doing really well. I'm going to set them up as a competitor, and I'm going to track their rankings specifically, or I'm going to see..." You could use something like SEMrush and see specifically: What are all the keywords they rank for that you don't rank for?

This type of data, in my view, is still tremendously important to SEO, no matter what platform you're using. But if you're having these problems or if these problems are being expressed to you, now you have some solutions.

I look forward to your comments. We'll see you again next week for another edition of Whiteboard Friday. Take care.

