Website redesign mistakes that destroy SEO

To keep up with user preferences, you have to redesign your website now and then. Learn how to avoid the most common pitfalls when you do.

Redesigning a website, whether it’s your own or a client’s, is an essential part of marketing today. It’s essential because technology, trends, and the expectations of users change over time, and if we want to remain competitive, we must keep pace with these changes.

But this task, while essential, also presents certain risks from an SEO perspective. A number of things can go wrong during the process. These issues can potentially cause search engines to no longer view that website as the authoritative answer to relevant queries. In some cases, certain mistakes can even result in penalties.

No one wants that.

So in this article, we’re going to explore some of the common web design mistakes that can destroy SEO. Knowing the potential risks may help you avoid making the kind of mistakes that tank your organic search traffic.

Leaving the development environment crawlable / indexable

People handle development environments in a lot of different ways. Most simply set up a subfolder under their domain. Some may create a domain strictly for development. Then there are those who take the kind of precautions to hide their development environment that would give a CIA agent a warm fuzzy feeling in that empty spot where their heart should be.

I tend to fall into the latter category.

Search engines are generally going to follow links and index the content they find along the way, sometimes even when you explicitly tell them not to. That creates problems because they could index two versions of the same website, potentially causing issues with both content and links.

Because of that, I place as many roadblocks as possible in the way of search engines trying to access my development environment.

Here’s what I do. The first step is to use a clean URL that has never been used for a live website before. This ensures there are no links pointing to it. Next, disallow all bots using robots.txt, and set up an empty index page so that other folders are not visible. In the past, I’ve even gone as far as setting up password protection, but in most cases, that may be overkill. You can make that call.
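For the robots.txt step, a blanket disallow is all it takes to tell compliant crawlers to stay out of the entire environment. A minimal example:

```
User-agent: *
Disallow: /
```

Keep in mind that robots.txt is advisory. Badly behaved bots can ignore it, which is exactly why the other roadblocks are still worth having.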

From there, I’ll set up a separate folder for each website in development. Typically, the folder name will be a combination of incomplete words so that it’s unlikely to be found randomly. WordPress will then be installed in these folders, and configured to also block bots at this level.

Arbitrarily changing image names on pages that rank well

This isn’t always an issue, but if a web page is ranking well, changing the name of an image on that page may cause a loss of ranking, especially if the web designer doesn’t know what they’re doing.

I’ve seen this happen more than a few times, where a client hires a web designer who doesn’t understand SEO to redesign a website that already ranks well. As part of the redesign process, they replace old images with new, larger images, but, lacking the appropriate experience, they use stupid image names that provide zero SEO value, like image1.jpg.

This takes away a vital piece of context that search engines use to determine where a particular web page should rank.

Deleting pages or changing page URLs without redirecting them

During a redesign, some pages will almost certainly no longer be needed. Less experienced web designers will often simply delete them. Other pages may be moved and/or renamed, which in most cases changes their URL. In these cases, inexperienced web designers often change these URLs and consider the task complete.

This is a big mistake because some of those pages may already rank well. They might have inbound links pointing to them or have been bookmarked by visitors.

When you delete pages that already have inbound links, you’ll lose all of the SEO value from those links. In some cases, this could result in a drastic loss of ranking.

The issue goes even deeper, though. Anyone clicking those links or bookmarks will be greeted by a 404 page. That presents zero value to anyone, and more importantly, it creates a negative user experience. This is important because Google has confirmed that user experience is a ranking factor.

The proper way to delete pages is to redirect them to the most relevant page that currently exists. As for moving pages, which includes anything that changes the URL of that page in any way, it’s equally important to redirect the old URL to the new one.

In both scenarios, a 301 redirect should generally be used. This tells search engines that the old page has been permanently moved to the new location. For most hosting platforms, this is best accomplished by adding the appropriate entry into your .htaccess file.
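For example, on an Apache server, the entries in .htaccess might look like the following (the paths and domain here are purely illustrative):

```
# Deleted page: send visitors to the most relevant remaining page
Redirect 301 /old-services/ https://example.com/services/

# Moved/renamed page: point the old URL at the new one
Redirect 301 /about-us.html https://example.com/about/
```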

If you’re unable to see a .htaccess file on your server, you may need to adjust the settings in your FTP program to view hidden files.

Some specialized hosting platforms may use a different method, so you may need to check with their support team to determine how to accomplish it.

Not performing a full crawl after migration to and from the development environment

Typically, you’ll first migrate the live website into your development environment, and then, after you’ve made and tested changes, send it back to the live server. Regardless of the method you use for migration, you’re bound to run into some errors.

One that I run into frequently is links within content pointing to the wrong place. For example, within a page or post on the live website, you may have a link that points to:

domain.com/services/

Once migrated to the development environment, it may be:

devdomain.com/client123/services/

All well and good so far, right?

But sometimes, while migrating the completed website back over to the live server, the content in pages and posts may still contain links pointing to the pages within the development environment.

This is just one example. There are countless links to content within a website, including links to the essential image, JavaScript, and CSS files.

Fortunately, the solution is simple. A tool like Screaming Frog, which runs from your desktop, or a cloud-based tool like SEMrush, can be used to crawl every single link within your website. This includes the text links visible on the front end, as well as all of the links to image, JavaScript, and CSS files that are tucked away in the HTML of a website.

Be sure to review all links to external sources once the new website has been migrated to the live server, because any links pointing to your development environment will appear as external links. When you find “external links” that should really be internal links, you can make the appropriate corrections.
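If you want a quick sanity check alongside a full crawl, a short script can scan a page’s HTML for any href or src attribute that still points at the development host. This is a minimal sketch using only the Python standard library; DEV_HOST and the sample markup are hypothetical, and a real crawl tool will follow every internal link rather than checking one page at a time:

```python
from html.parser import HTMLParser

# Hypothetical development hostname -- substitute whatever
# your development environment actually uses.
DEV_HOST = "devdomain.com"

class LinkCollector(HTMLParser):
    """Collects URLs from href and src attributes as the page is parsed."""
    def __init__(self):
        super().__init__()
        self.urls = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value:
                self.urls.append(value)

def find_dev_links(html):
    """Return every link in the page that still points at the dev environment."""
    parser = LinkCollector()
    parser.feed(html)
    return [url for url in parser.urls if DEV_HOST in url]

# Sample page: one internal link, one leftover dev-environment link.
page = """
<a href="https://example.com/services/">Services</a>
<img src="https://devdomain.com/client123/wp-content/uploads/logo.png">
<script src="https://example.com/app.js"></script>
"""
print(find_dev_links(page))
# ['https://devdomain.com/client123/wp-content/uploads/logo.png']
```

Anything the function returns is a link that should have been rewritten during migration.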

This step is essential after migrating in either direction, in order to prevent potentially catastrophic errors.

Failing to perform a complete function check on everything

Once a redesigned website has been migrated to the live server, you need to do more than quickly review a few pages to make sure things look OK. Instead, it’s essential to physically test everything to make sure it not only looks right, but also functions properly.

This includes:

Contact forms.

E-commerce functionality.

Search capabilities.

Interactive tools.

Multimedia players.

Analytics.

Google Search Console / Bing Webmaster Tools verification.

Tracking pixels.

Dynamic ads.

Failing to reconfigure WordPress and plugins after migration to the live server

Remember how we talked about the importance of putting up a wall between your development environment and the search engines’ crawlers? Well, it’s even more important to tear that wall down after migrating the website to the live server.

Failing to do this is easy. It’s also devastating. In fact, it’s a mistake I made several years ago.

After migrating a client’s website to their live server, I forgot to uncheck the box in Yoast SEO that told search engines not to crawl or index it. Unfortunately, no one noticed for a few days, at which point the website had been almost completely dropped from Google’s index. Fortunately, they didn’t rely on organic traffic, and, once I unchecked that box, the website was quickly reindexed.
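One way to catch this mistake quickly is to check the live site’s HTML for a robots meta tag containing noindex, which is what WordPress and Yoast output while that box is still checked. A minimal sketch, assuming you already have the page’s HTML as a string:

```python
from html.parser import HTMLParser

class RobotsMetaChecker(HTMLParser):
    """Flags pages containing <meta name="robots" content="...noindex...">."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attr_map = dict(attrs)
        name = (attr_map.get("name") or "").lower()
        content = (attr_map.get("content") or "").lower()
        if name == "robots" and "noindex" in content:
            self.noindex = True

def page_is_noindexed(html):
    """Return True if the page tells search engines not to index it."""
    checker = RobotsMetaChecker()
    checker.feed(html)
    return checker.noindex

blocked = '<head><meta name="robots" content="noindex, nofollow"></head>'
print(page_is_noindexed(blocked))  # True
```

Run a check like this against the live homepage right after migration, before search engines have a chance to act on the wrong signal.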

Because of the impact mistakes like these can have, it’s critical that after migration to the live server, you immediately check the configuration of WordPress as well as any plugins that could affect how search engines treat your website.

This includes plugins for:

SEO.

Redirection.

Sitemaps.

Schema.

Caching.

Neglecting to pay attention to detail

None of these mistakes are particularly complicated or difficult to avoid. You simply need to be aware of them, implement a plan to avoid them, and pay close attention to detail.