This post was promoted from YouMoz. The author’s views are entirely his or her own (excluding an unlikely case of hypnosis) and may not reflect the views of Moz.

Trying to do SEO for a website without full access to its CMS is like trying to win a sword fight with one hand tied behind your back. You can still use your weapon, but there is always going to be a limit to what you can do.

Before this metaphor gets any further out of hand, I should explain. One year ago, the agency I work for was asked to run an SEO campaign for a client. The catch was, it would be impossible for us to gain full access to the CMS that the website was built on. Initially I was doubtful about the results that could be achieved.

Why no CMS?

The reason we couldn’t access the CMS was that the client was part of a global group. All sites within this group were centrally controlled on a third-party CMS, based in another country. If we did want to make any ‘technical changes’, it would have to go through a painfully slow helpdesk process.

We could still add and remove content, edit metadata, and exercise some basic control over the navigation.

Despite this, we took on the challenge. We already had a strong relationship with the client because we handled their PR, and a good understanding of their niche and target audience. With this in mind, we were confident that we could improve the site in a number of ways that would enhance user experience, which we hoped would lead to increased visibility in the SERPs.

What has happened in the last year since we started managing the search marketing campaign has emphasised to me just how important it is to implement well-structured on-page SEO. The client's website is now receiving over 20,000 more visits from organic search per month than it did when we took over the account.

I want to share with you how we achieved this without having full access to the CMS. The following screenshots are a direct comparison of January 2013 and January 2014.

Corresponding figures can be viewed in the summary at the end of the post.

Analytics

When we were granted access to analytics for the website, we got our first real insight into how the site was performing, and what we could do to help it perform better.

By analysing the way visitors were using the site (visitor journeys, drop-off points, most visited pages, which pages had highest avg. time etc.), we could start to structure our on-page strategy.

We identified how we could streamline the navigation to help people find what they were looking for more quickly. We also decided it was necessary to create clearer calls-to-action, which would shorten the distance from popular landing pages to the most valuable pages on the website.

We also looked at the top landing pages, and with what keyword data we had access to, we were able to define more clearly why people were visiting the site, and what they expected when they landed on a page.

For example, the site was receiving a lot of traffic for one of its products, with visitors coming into the site from a range of relevant short and longtail keywords. However, they would almost always land on the product page.

By analysing visitor journeys from this page, we noticed that visitors would leave it to try to find more information on the item, because the majority weren't entering the site at the buying stage of the conversion cycle.

However, where this supporting information lived on the site wasn’t immediately obvious. In fact, it was nearly four clicks away from the product landing page!

It was obvious we’d have to address this, and other similar issues we identified through some fairly straightforward analytics work.

Product Pages

The product pages were generated from a global product catalogue built into the content management system. They weren't great, but because we didn't have access to the catalogue or the CMS, there wasn't much we could do to the product pages directly.

Rewriting content

I don’t necessarily believe that there is such a thing as ‘writing for SEO’. Yes, you can structure a page in a certain formulaic way with keywords in header tags, alt tags and title tags.

You can factor low-competition longtail phrases and target keywords into the copy as well…but if you sacrifice UX in favour of anything that I’ve just mentioned, then I’ll just be honest, you’re doing it wrong.

From looking at the data in Google Analytics (low avg. time on site and a bounce rate that should have been lower), and reading through the website ourselves, it became clear that the content needed to be rewritten.

We did have a list of target keywords, but our main objective was to make the content more valuable to the users.

To do this, we worked closely with the PR team, who had a great understanding of the client’s products and key messages. They had also developed personas about the type of visitor that would come to the client's site.

We were able to use this knowledge as a foundation to rewrite, restructure and streamline sections of the website that we knew could be performing better.

Another thing we noticed from analysing the content is that interlinking was almost non-existent. If a visitor wanted to get to another piece of information or section of the website, they'd be restricted to using the main navigation bar. Not good...

We addressed this in the rewriting process by keeping a spreadsheet of what we were writing and key themes in those pages. We could then use this to structure interlinking on the website in a way that would direct visitors easily to the most relevant resources.
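We actually tracked this in a spreadsheet, but the same idea can be sketched in code. This is purely illustrative — the page paths and themes below are made up — but it shows how mapping each page to its key themes lets you suggest internal links between pages that share a theme:

```javascript
// Illustrative sketch of the interlinking spreadsheet: given a map of
// page -> key themes, suggest internal links between pages that share
// at least one theme. Page names here are hypothetical.
function suggestInternalLinks(pageThemes) {
  const suggestions = {};
  const pages = Object.keys(pageThemes);
  for (const page of pages) {
    // Link candidates are other pages sharing any theme with this one
    suggestions[page] = pages.filter(
      (other) =>
        other !== page &&
        pageThemes[other].some((t) => pageThemes[page].includes(t))
    );
  }
  return suggestions;
}
```

Fed with your real crawl and content notes, the output gives each page a shortlist of relevant pages to link to, which is exactly what our spreadsheet did by hand.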

As a result of this we have seen time on site increase by 14.61% for visitors from organic search:

Working with the PR team

As I have mentioned, we also handled PR for this client. Luckily, the PR team provided brilliant support to the search marketing side of the account.

This has proved integral to the success of this campaign for two reasons:

1) The PR team know the client better than anyone. It might even be fair to say they know more about the products and target audience than the client's own marketing team.

This helped us build a firm understanding of why people would come to the site, what they'd expect to see, and what the client wanted to achieve with its web presence.

This was great in terms of helping us identify what people would search for to find the site, which in turn allowed us to structure the content rewrite more effectively.

2) By working with the PR team, we were able to co-ordinate the on-page and off-page work we were doing, to align with PR campaigns.

For example, if they were pushing a certain product, or raising awareness of a specific campaign, we knew we'd see an increase in search volume in those areas. The SEO team would then also focus efforts on promoting the same product.

When the search volume increased, our site was there to capture the traffic. Unlike in the previous example when the traffic was sent to a product page, we were able to create a fully optimised landing page.

With this approach we knew we'd get a good volume of targeted traffic - we just needed to be there to capture it and give a friendly nudge in the right direction.

Restructuring navigation

The main navigation menu on the site proved to be a source of great frustration. Functionality was extremely limited...we couldn't even create dropdown menus as that wasn't built into the CMS.

That meant we needed to be really tight with our navigation options, as well as making it obvious where each navigation link would lead.

Again, we worked with the PR team and the client, as well as using information from Google Analytics to learn about how visitors were using the site, and how the client wanted them to use the site.

Armed with this information, we streamlined the navigation to support user experience by creating better landing pages for the navigation links and making the most popular and valuable pages of the website more accessible.

The result has been that although people are spending more time on each page than 12 months ago, they are visiting fewer pages. This helped us show the client that the navigation was working better, and that visitors were able to find the information they required more easily:

Valuable content

There’s a vicious rumour circulating at the moment that quality content (no... not 300 word blog posts) can help drive SEO success. Well, we decided to test this for ourselves…

As well as rewriting existing copy, we also created new content that we hoped would drive more organic search traffic to the site.

We created infographics (good ones), product-specific and general FAQs, video and text based tips and advice pages, as well as specific landing pages for the client's three 'hero' products.

We knew from looking at the analytics that there was definitely opportunity to get more longtail traffic, but we wanted to combine this with creating a genuinely useful resource for the visitors.

Nothing we did was hugely resource intensive in terms of content creation, but what we did create was driven by what the data told us people wanted to see.

As a result, the tips and advice pages and FAQs have both pulled in significant volumes of organic search traffic, and given users something of value.

The screenshots below illustrating this are taken from the middle of August 2013, when the pages went live, to the end of January 2014:

Fixing Errors

With the site plugged into Moz, we were pretty shocked to see the crawl diagnostics return 825 errors, 901 warnings and 976 notices. This equated to almost one warning and one error on every single page on the site. The biggest culprits were duplicate page titles, duplicate page content, and missing or non-existent meta tags.

The good news – I got to spend tonnes of time doing what every SEO loves – handcrafting new metadata!
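If you need to triage duplicate titles yourself, a short script over a crawl export (from Screaming Frog or similar) can group them for you. This is a hypothetical sketch, not the tooling we actually used:

```javascript
// Illustrative sketch: given a crawl export of { url, title } pairs,
// group URLs that share a page title so the duplicates can be
// prioritised for rewriting.
function findDuplicateTitles(pages) {
  const byTitle = new Map();
  for (const { url, title } of pages) {
    const key = title.trim().toLowerCase();
    if (!byTitle.has(key)) byTitle.set(key, []);
    byTitle.get(key).push(url);
  }
  // Keep only titles used on more than one URL
  return [...byTitle.entries()].filter(([, urls]) => urls.length > 1);
}
```

Sorting the output by how many URLs share each title gives you a ready-made worklist, worst offenders first.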

The bad news – the majority of errors were caused by the CMS: how it dealt with pagination, the poor integration of the product catalogue, and the way it handled non-public (protected) pages.

As part of our initial audit on the site, we noticed the site didn’t even have a robots.txt. As you know, this meant the search engine bots were crawling every nook and cranny, getting in places that they had no business going in.

So, as well as manually crafting new metadata for many pages, we also had to try and get a robots.txt that we had written onto the site. This meant going through a helpdesk, where they didn’t understand SEO and where English wasn’t their first language.

A gruelling process – but after several months of trying, we got that robots.txt in place, making the site a lot more crawler friendly.
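For reference, a robots.txt along these lines is all it takes to keep crawlers out of the problem areas. The paths here are entirely made up — this is not the client's actual file — and note that wildcard patterns are an extension supported by Google rather than part of the original standard:

```
# Illustrative robots.txt — paths are hypothetical
User-agent: *
Disallow: /search/        # internal search result pages
Disallow: /protected/     # non-public (protected) pages
Disallow: /*?print=1      # printer-friendly duplicates (wildcard: Google extension)

Sitemap: https://www.example.com/sitemap.xml
```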

Now we’re down to 122 errors and 377 warnings. Okay, I know it should be lower than that, but when you can’t change how the CMS works, or add functionality to it, you do the best you can.

Conversions

The client does not sell directly through its website, but through a network of distributors. The quickest way for a customer to learn about their closest distributor is to use the 'Contact Us' page. Again, admittedly, this is far from the best system but unfortunately, it is not something we're able to change at this stage.

Because of this, we made people visiting the 'Contact Us' page a conversion goal that would be a KPI for the campaign. We have seen this increase by over 21% in the last 12 months, which has helped us prove value to the client, as these are the kinds of visits that will have a positive impact on their bottom line. It's good to know you're not only driving a high volume of traffic, but also a good quality of traffic.

Off-page

The reason I’ve saved off-page to last is that I really don’t dwell on it. Yes, we did follow traditional 'best practices'; blogger and influencer outreach, producing quality content for people to link to – but we didn’t do anything revolutionary or game-changing.

The truth is, we had so much work to do on-page, that we kind of let the off-page take care of itself.

I’d in no way advocate this approach all the time, but here we prioritised getting the website working as hard as it could, and it paid dividends. I’ll tell you why.

Conclusions - Play to your strengths

Managing an SEO campaign without full access to a CMS undoubtedly poses a unique set of challenges. But what it also forced us to do was play to our strengths.

Instead of overcomplicating any of the more ‘technical’ SEO issues, we focused on getting the basics right, and using data to structure our strategy. We took an unfocused, poorly structured website, and shaped it into something valuable and user-friendly.

That’s why we’ve seen 20,000 more unique visits per month than when we took over the campaign a year ago – we did what many people would consider 'basic SEO' really well. That is the key takeaway I want you to draw from this case study.

It's probably true that SEOs are experiencing something of an identity crisis, but as Rand eloquently argued in his recent post, we still have a unique skill set that can be incredibly valuable to any business with an online presence. What we may consider 'basic' still has the potential to deliver fantastic results.

Really, all we’re trying to do is make our websites more user-friendly and more crawlable. If you do that, you’ll get the results. Hopefully that’s what I’ve illustrated in this post.

Great article, Rory! I've worked with SMEs and many of them (especially one-man / woman operations) don't have the basics right. The other week, I ran a Google Analytics workshop for SMEs and around 50% of them didn't even have GA installed on their site! When analysing some of their sites, there were other clangers, such as a shady web designer promoting their own business in their client's meta description! It made me angry and sad.

Sometimes we forget (perhaps) that what we know isn't known by approx. 99% of the population. And while your case study presumably deals with a bigger (international) client, the same approach applies to many SMEs: start with the basics and get those right first (Analytics, on-page SEO). Especially in niche markets, on-page can yield great results without ever having to proactively 'build links' (not against building links - but getting the basics right is probably a better way to generate ROI for the client when you first start working with them).

Hi Simone, thanks for the comment - glad you enjoyed the post! You're exactly right, this approach has the potential to work universally, and I don't think it has to be dictated by the size of the client. It sounds like you have a great strategy though, and if you're helping educate business owners through workshops that's even better. Keep up the good work!

I also have some sites without full CMS access and everything takes so long - and I can clearly see things I could change, things with great effects, but I can't... annoying. Agree with Krzysztof again - :)

I would prefer full CMS access, but I am happy to see that I am not the only one without it sometimes.

Thanks for your comment - hope you liked the post! Yes, I'm starting to see I'm not the only person in this situation, but as I've discovered it's far from the end of the world. It really forces you to play to your strengths, and if you take the correct approach you really can do a great job!

I got one website. Awful for SEO, but the client thinks "It's modern and very stylish!" (no parallax). It looks like a big 10x10ft piece of paper (looking good, but...) which I'm moving around to find what I'm searching for, or using navigation links to jump to a specified place on that website. Arrrrghh...! I'm glad the client agreed to add some CMS to it, so I have full control and I'll get traffic to a better, more SEO-friendly place.

Good design can kill SEO if it isn't made right... unless an SEO was part of the team which created the site. I also see some good-looking sites with poor navigation. And I often see sites which say the same things on 3 pages - homepage + product page + about us page = targeting the same topics... Grrrrr

Perfect case study, I would say. This shows what SEO can do for a site. Even if we have great content, if we cannot repurpose it in a better way, it might not yield the juice it ought to. I liked the landing page audit and re-purposing concept. Leveraging a concept with different descriptions and different formats on various platforms would yield far greater results than what it's yielding now. So we all need to strategise on this part for sure, to make the content do the magic in terms of generating loyal customers for a site.

Love this! Our site doesn't have a true CMS and the backend system has created more problems than I ever could have imagined. Been using some of these techniques to increase SEO currently but got some new ideas from reading about your experience. Will definitely use your advice about tracking page themes to create the best interlinking strategy!

Good to know others have tackled similar issues and come out with big wins!

Excellent post and thanks for sharing the clear and specific examples of what you did to turn things around! Duplicate page titles, duplicate page content and missing or non-existent meta tags are relatively easy to correct, but so often get overlooked because of the CMS itself, or because the web resources are moving as fast as possible just to get the content updated and online. Then, I agree with many others here: if you get a boost in consistent traffic, you'd better have great, relevant content and maintain it consistently.

Glad you liked the post! Yeah, I'm really pleased that the comments have progressed and developed the point I was aiming to make, that if you get the basics as close to perfect as you can, have great content and clear, logically defined goals, then you are going to be managing a successful campaign.

My SEO team and I have been working on client websites for almost 10 years now, and the first thing we check is the website UX, in combination with the SEO work done on it before the project came to us. Many times we have found that the biggest problem with the pages is poor navigation, which undermines even good SEO practice. Navigation is a very strong tool that a good website can use to its advantage.

If you want to check how your website fares on navigation, the easiest, simplest and cheapest way is to get some people to analyse your website through tools like uTest or UserTesting.com or some similar tool. You can put in some questions and ask the testers to navigate to certain information. This is one case where even a non-technical person can check his/her website and then plan for getting things fixed accordingly.

This is an awesome breakdown of how to take analytics data and use it to shape your SEO campaign, specifically UX data! Great blog post and good work with this client, Rory.

You mentioned making a visit to the "Contact Us" page a conversion… was there any thought to adding another conversion using Javascript to track an Event on the contact page (click of a certain button, etc.)? That would certainly help separate the random visitor from the prospective customer, no? Or would that change have required technical support from the people abroad?

(P.S. - I've totally worked with the "impossible to get technical changes implemented" client, and it's terrible. Kudos for hanging tough!)

I'm just curious if the client ever questioned you on the accuracy of treating a visit to that page as a goal. Really neat to hear such detailed case studies like this one. Great work, great recap. :)

You make a great point, and I'd have loved to have implemented event tracking on this page, but as you correctly identify, it wasn't something we were able to do. The contact form was in a 'portlet' (kind of like a WordPress widget), but we didn't have access to this, so couldn't implement the code unfortunately. When we asked the 'technical team' to implement it on the form, they said they couldn't because it would apply the same code to all the other global sites in the system that used this particular portlet so would skew the results wildly...

That's right, this was the answer we got from a 'technical team'...**facepalm**

Anyway, after some benchmarking we set the goal to fire once a person had been on the contact page for over 60 seconds, as this was the length of time it took the average person to get in touch with the relevant department. Far from accurate I know, but it was the best we could do at the time! This seemed to satisfy the client, who in fairness, were equally frustrated with the lack of technical support given.
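For anyone wanting to replicate the timed goal, the idea is just a delayed event. Here's a rough sketch, assuming the classic ga.js-era `_gaq` queue that was current at the time — the category and action labels are illustrative, not the ones we actually used:

```javascript
// Hedged sketch: fire an "engaged visit" event once a visitor has spent
// a threshold amount of time on the contact page. `send` is whatever
// dispatches the event on the page, e.g.
//   function (args) { _gaq.push(args); }
function scheduleEngagementEvent(send, thresholdMs) {
  return setTimeout(function () {
    send(['_trackEvent', 'Contact', 'Engaged Visit']);
  }, thresholdMs);
}

// On the contact page you would wire it up something like:
// scheduleEngagementEvent(function (args) { _gaq.push(args); }, 60000);
```

The event then becomes the goal trigger in Analytics, so only visitors who stayed past the threshold count as conversions.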

Interesting read, Rory. Sounds like you learned a hell of a lot from that one client. Big (international) Clients are both a blessing and a curse: Strong domains and good backlink profiles (Once you've removed the crap) but outdated CMS, slow approval processes and excessive bureaucracy. It's all fun and games though :)

Thanks for the great article! I appreciated a lot, but I wanted to thank you for mentioning the piece about "not writing for SEO". Keeping the user experience first, and seeing where they are bouncing will tell where there is room for improvement, and in turn, will help SEO. It is common (and tempting) to write for engines, but I applaud you for clarifying that we need to write for user experience!

Hi Evan, thanks for the comment, glad you enjoyed the article! Awesome that you picked up on that point specifically. It's something I strongly stand by, and I think it's an easy trap to fall into: you believe you have to shape your writing in a way that you perceive will be more beneficial to your rankings, but in the end all you do is alienate your visitors and diminish the quality of their experience! That's not how SEO works anymore :)

Hi, thanks for the comment - really glad you enjoyed the case study. Completely agree, you can't overlook the importance of good content, but it also has to have some kind of strategic function. In this case it was improving user experience on the site, and driving additional search traffic. Glad it paid off for the client though in this case.

There's nothing like having to go through someone else's CMS & help desk to get something done...

Dealing with a bit of that myself at the moment, but on the plus side, we've convinced the client to switch their sites over to Joomla so we can truly manage them. You never truly appreciate the benefits of technical SEO until that's all you can do to a website. :)

Great case study. We experienced something similar with a client where we mainly focused on the fundamentals to optimize their site structure, fix crawl errors and duplicates, and make simple metadata/content updates. Within less than 6 weeks we saw huge improvements, with a 180% increase in organic traffic and much higher engagement with the site compared to last year.

Great post, it was very interesting. Although I do not know Google Analytics too well, I have been working with Webmaster Tools, which I found to be useful too. You mention in your post that "with Moz plugged in" you were able to check all the crawl errors and duplicate title tags etc. How do I get this? Although I do it with Google and Bing Webmaster Tools, I may find some extra useful information using Moz.

Thanks in advance and I may just give you a shout to see if you would do some work on our site in future after we have moved across to bootstrap in the coming weeks

Thanks for the comment - great that you enjoyed the post. Ah starting with Google Analytics can always be a bit tricky - I found it like going down a rabbit hole because you can keep on drilling deeper and deeper into the data until you get lost!

I'd really recommend you read the excellent Occam's Razor blog by Avinash Kaushik - he always shares really actionable, in-depth posts that have improved my use of Analytics no end.

By "with Moz plugged in" what I mean is that I have access to a Moz Pro account. As part of this Mozbot, or Roger to his friends, crawls your website, and identifies the potential issues that could be affecting the crawlability and usability of your site.

While it is true that you can use Google Webmaster tools to collect this data, the Moz dashboard really simplifies things and helps you target the most important issues to resolve first. If you can't get access to a Pro account, I'd recommend you try using 'Screaming Frog', as a tool. It will crawl your website and help you identify potential issues - but requires you to look into the data a bit more than with the Moz tool. I've written about it here.

Anyway, hope that helps - please let me know if you have any other questions :)

For the best results, on-page and off-page factors are incredibly important. A combination of link building and dedicated landing pages for certain products, for example, will help you a lot. Always try to get your content to match the Google criteria and you're good to go. Thanks Rory for this article, a good read.

I totally agree with @RoryT11. Good content is definitely required and to some extent it will boost rankings as well. You will even get a few natural links because of your good content, but you require some strategy for content marketing as well to drive your traffic upwards.

Great post. Content is the main part of SEO. If we want to get good traffic or rankings we should follow the Google updates. We can't say content alone will give a good rank - we should do both on-page and off-page steps. Thanks for your post, Rory.

Hi @RoryT11, great post, but I want to know one thing which is mentioned in your post about "search volume increase". My question is: how will you increase the search volume of any keyword? Please explain.

I believe that as SEOs, we don't just need to chase keywords where the volume already exists. With the budget you would have spent ranking for those keywords, it is sometimes more valuable to the business to invest that money in creating search volume for other keywords that are potentially more likely to convert.

The most obvious example I can give is if a company releases a new product. Without promotion through offline channels, social media, PPC and all the other channels out there, search volume for that product wouldn't exist, or at least it would be minimal. Through intelligent marketing and advertising you can create search volume, and if your site is in a position to convert this newly cultivated search traffic, then you are in a really strong position.

We did actually use video to enhance the general user experience, but not to improve product descriptions. I mention in the post above that we created a 'tips & advice' section of the site - this is where we used a lot of our video content. We found that people would engage a lot more with the video content if it didn't appear to be selling them something directly, so using this type of content to offer advice and guidance around the products on offer actually worked really well. There was a really positive correlation between visitors spending time watching 'tips & advice' videos and then clicking through to the 'contact us' page - which we classed as a conversion, as the site wasn't selling directly.

We didn't use product reviews in this case, however, we are currently in the process of building an ecommerce store for this client. While we don't plan to incorporate product reviews at the launch stage, once the site is off the ground and has a steady stream of traffic flowing in, adding product review functionality is something we're strongly considering (we have full CMS access with this site! :D).

Anyway, after reading your post and of course many others over the last couple of months, I have spent a great deal of time on unique title tags, description lengths, alt tags and all the things mentioned for good SEO, including the OGP stuff on my home page, similar to Yoast.

So my question to you and of course the rest of the professional SEO community is

Why am I barely even being found at all in Google now for the stronger keywords and better title descriptions that I have targeted, such as "cheap hotel prices", "compare car hire" etc.?

I know these keywords have a lot of strong and established competitors but it says in my webmaster tools that everything is up so many percent but I still seem to be getting less clicks.

Q1: Does Google drop your old pages from its indexes altogether when you make changes, and re-rank and re-index accordingly, or does it see the changes and just re-rank?

Q2: Does it take a while for things to happen, would you initially see a drop then a better comeback?

Q3: I have done everything to my knowledge that google & bing webmaster tools requires for their guidelines and of course there are no errors within either of these webmaster accounts but still nothing much of a change

Q4: Finally, it says in Webmaster Tools that I have an average position of 9th for a specific keyword, but when I search in Google I trawl through to the 10th page and still it does not show. Is this information playing catch-up to where it once may have shown?

Any advice would be of great assistance not only to me but I believe other people just like me in the same position.

That is a difficult question, and without being able to look at your site and the sites of your competitors, it is really hard for me to say. If you're an emerging business going after really competitive keywords, such as the ones you've listed above, your chances of ranking highly organically based on modifications to metadata are slim.

Have you done any backlink research on your competitors? Not just to find out where their links are coming from, but the kind of content they are producing that works well to generate good levels of engagement? Have you thought about cultivating traffic streams from other channels that your competitors might not be so active in - social/video search, for example?

When you're trying to break a really competitive niche from scratch, particularly if you're hoping to generate lots of organic search traffic, I think you need to find an 'in' somewhere else. If you can build the visibility of your brand on another digital channel, it makes improving rankings a lot easier and you can be driving traffic in from other sources.

To answer your questions:

1) When Google recrawls your site it will index the latest versions of the pages that it crawls (although old pages can still be found in the cache). You can't expect this to instantly improve rankings though, particularly if it's just the metadata you've changed - there are so many factors that influence a change in ranking position.

2) It depends on how competitive your niche is, and what your competitors are doing. If you are going up against a number of really established competitors, then it will take time and hard work for organic search position to improve.

3) Ref. what I have said above.

4) That's not just organic search position, it could be for image search, for example, or your site listed in another search vertical.

Sorry I can't be more explicit in my answers, but hopefully this information will be of some help to you!

I think this line may have some issues: "The good news – I got to waste spend tonnes time doing what every SEO hates loves – handcrafting new metadata."

Your article is very encouraging. It's not uncommon for an SEO to be denied full access to a website and the fact that you took this challenge head on is awesome. While I do think it's necessary to tell the client up front that less access usually results in less improvement, it doesn't mean that you can't fully utilize the resources that you do have.

I completely agree!! If you stuff your content full of keywords, it's crappy, or solely promotional it won't be shareable or encourage your audience to engage with it which negates the effect great, creative content can have!

What I find so amusing is that people think they need a lot of stuff to get to 20,000 visitors a month, or even to double that.

No, you don't need fancy analytics software, you don't need to be a member of some big-time SEO site, and you don't need to have a team of "professionals" holding your hand.

You do need to add new content regularly that interests your users. Most cannot do this. And that's why they continually look for shortcuts and then come complaining that they're getting Google penalties.

(Sigh)...I just don't understand why so many people can't understand such simple principles.

"I don’t necessarily believe that there is such a thing as ‘writing for SEO’. Yes, you can structure a page in a certain formulaic way with keywords in header tags, alt tags and title tags."

I agree with this fully, and it is for this reason that I never hire an SEO or SEO company to write content unless they are well aware of the differences between writing for people and writing for search engines.

Putting SEO first when writing content is a "cart before the horse" scenario if there ever was one. Of course content should be reviewed for SEO best practices before publishing, but this should happen after the bulk of the writing work is done, in my opinion.

Hi, thanks for the question! I did kind of purposefully skip over it as I wanted to focus more on the on-page side of things, but I honestly think off-page work is still incredibly important, and the results achieved in this case study might not have been achieved without that work.

But honestly, the off-page work we did wasn't driven by the desire to win links. The targeted engagement that our off-page work consisted of was getting our brand in front of the right audience. If they linked back, great! If they didn't, we were confident that we'd see the volume of branded searches increase. We just made sure the site was optimised to convert the traffic that was already interested in our brand. Hope that helps!

"but if you sacrifice UX in favour of anything that I’ve just mentioned, then I’ll just be honest, you’re doing it wrong" truer words have never been written :)

As with most things in life, it's all about balance. First and foremost, figure out with each piece of content what you want to communicate. Second, do keyword research to see how people commonly phrase the concepts/ideas you're writing about. Finally, create your content, taking into account the phraseology you researched, and create something that flows and engages, and that limits cognitive overhead.

We have a lot in common with Google: we want to get people to the information they desire, in a format that they can digest. At the end of the day, it's all about substance.

Hi Salza, thanks for the question. CMS stands for 'Content Management System', and it is the interface that is used to publish, edit and modify the content that appears on a website. Some of the more well-known CMSs include WordPress, Joomla and Magento.

Thanks Rory, some fantastic tips in there. Fun seeing the old SEOmoz crawl errors - it has come some way since then! Though you mentioned it in a comment, also +1 for Screaming Frog for a very quick overview of any easy-fix problems (like 404s). Thanks again for taking the time to write the post.

Use your newly revamped content on a site with an undeveloped CMS to your advantage! Turn it into infographics, do some great research and create remarkable content - things that the audience will find useful, educational, thought provoking, whatever. Then you can start doing some serious link building. People will not link to you just because. They'll link to great content, which will undoubtedly be on different pages, etc. Don't worry too much about the trickle-down effect. Content is king, always. Think of your content as "linkable assets" and get those assets in front of people who might be willing to share.

I think it is a bit risky to assume that once you have created good content, it will instantly yield results in terms of driving traffic to the site. It's certainly an excellent starting point, but often has to work alongside other strategies to deliver the best results!

And I am evidence of it! Seven months later, this post appeared on my Twitter and I found this great content, just as I am currently in the same situation ;) If it had not been shared, I would have kept struggling on my own. Thanks for the hints RoryT11!