Episode Guide

This is the second video in my Ultimate Guide To Tiered Link Building video tutorial series. If you missed part 1 then be sure to check out the episode guide on the right.

Thanks for all the great feedback so far! Please share this video series with others if you find it useful. It takes hours to make each video so taking a few seconds to share it is a great way of saying thank you.

Video Transcript

Hi guys this is Matthew Woodward and welcome to the second part of my ultimate guide to tiered link building.

I would like to say thanks for all the great feedback on the first video so far. It took 8 hours to make, so it's great to see it's been well received by everyone. And don't be scared to share it with others!

In this video you're going to learn:

How to create the perfect tier 1 link profile

How to prepare everything you will need for the campaign

How to spin the content the right way, and I’ve got some crazy stuff for you on this topic.

How to setup a campaign over my shoulder

And most importantly how to schedule it all so you can be doing something else.

I just want to recap exactly what makes a good tier 1 link. I know we covered this in the last video, so I'll keep it very quick.

It should be high quality, contextual and surrounded by relevant content that makes sense. Ideally these links will be built on domains that Google already trusts and carry a low risk of being deleted or removed in the future.

In the last video I said having the ability to remove a tier 1 link is optional, but I really can't stress enough how important it is to be able to remove links from your tier 1 at will. I'm sure some of you watching this have kicked yourselves in the past for not having that level of control.

You also need the ability to get the exact location of each link, as we will need this to build the lower tiers.

And let’s quickly recap on examples of good tier 1 links. You can use hand built web 2.0 sites which I will make a separate video for. You can also use things like press releases, wikis, article directories, social bookmarks, web directories, blog posts and videos.

The entire tiered linking process is made up of 3 parts.

First of all we schedule out our first set of tier 1 links over 2 weeks to create a natural link velocity to the site.

Once those links have all been created over that 2 week period, we start to build out our tiers in week 3. You might think this is where it gets complicated, but setting up the tiers takes less than an hour and they will grow automatically forever. More on that in a future video.

Once you have set up the self-growing tiers at the start of week 3, you can repeat the entire process by building out a new set of tier 1 links. Just rinse and repeat until you hit the top spot.

Before starting any link building campaign I like to map out exactly what I will need and get everything prepared. This makes building the actual campaign very quick and easy.

First of all we need to get some base content for the campaign.
Then we need to spin it and prepare it all properly.
And most importantly we need to scrape a list of target sites to post to. This is a step that most people don't bother with, but it is critical to your success.

To give you a bit of a head start you can download a sample of my personal list of targets underneath this video on my blog.

Our first tier of links must feature high quality content that makes sense. In total you will need:

3x 500 word articles about your keyword or topic

1x set of social bookmarking titles and descriptions

1x set of web directory titles and descriptions

1x set of related tags and keywords

1x video with titles/descriptions

Sourcing the content couldn’t be easier and it should cost you no more than $25 in total.

For the articles I tend to use seogenerals.com. They have a superb interface for ordering and managing your articles, and this is how all article services should be set up if you ask me! You can use coupon 7daysale to bring the cost of a 500 word article down from $7.50 to $4.85.

I have also had good results with 99centarticles.com, but I prefer how SEO Generals is set up.

For the video, just head straight over to Fiverr; there are hundreds of gigs offering video production for just 5 dollars. Find one you like the look of and place an order immediately.

For all of the other titles and descriptions we need, I tend to just write these myself. You need 3 sets in total, each consisting of 1 title and a 3 sentence description, so you should end up with 3 titles and 3 descriptions.

Personally I use Scrapebox to get the related tags and keywords but you can also use Adwords. Let’s take a look at how to do that now.

Getting the related keywords and tags is quite easy. Come over to the Google Adwords keyword tool and type your keyword into the box up here; in this example I'm going to use payday loans. Tick the box that says "only show ideas closely related to my search terms" and hit search. This will return a huge list of keywords for you to use, and to download them just hit "download all search results" there.

The way I do it, though, is with Scrapebox. Just hit the little scrape button here, then keyword scraper, enter payday loans and hit scrape. That will go out and get lots of keywords from many more sources than just Google. Hit OK at the end of the scrape, remove duplicate keywords, and there you go: a nice list of keywords that you can use.

You can take this a step further: put all of these results back in and scrape again, and you will end up with an even bigger list. Once you've got all your keywords together, save them somewhere as we will need them later on.
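If you prefer to combine the two scrape passes outside Scrapebox, merging the lists and dropping duplicates is a one-function job. A minimal sketch in Python (the function name is my own, not part of any tool):

```python
def merge_keywords(*lists):
    """Merge keyword lists in order, dropping case-insensitive duplicates."""
    seen, merged = set(), []
    for keyword_list in lists:
        for keyword in keyword_list:
            key = keyword.strip().lower()
            if key and key not in seen:
                seen.add(key)
                merged.append(keyword.strip())
    return merged
```

Feed it the output of each scrape pass and save the result to a text file for the later steps.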

We need to make sure our content is spun in the right way.

Unfortunately the only way to create high quality spun content is by hand which is time consuming but worth it in the long run.

You need to write 15 alternative titles for everything, including your articles, bookmarks and so on.

Once you have done that, go through everything and write 2 alternative sentences for every sentence.

Finally go through the lot and spin all of the words and phrases but only where it makes sense to do so.

Don’t just rely on the suggested words as there are usually lots of ways you can spin things if you stop and think about it.

I use The Best Spinner to do all of this, and there is a link under this video to download a 7 day trial of the software.
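The alternatives you write end up stored as nested spintax, the `{option 1|option 2}` syntax that tools like The Best Spinner export. To see how a single variation gets generated from it, here is a minimal expander sketch (my own illustration of the common spintax format, not TBS's actual code):

```python
import random
import re

# Matches an innermost {a|b|c} group, i.e. one with no nested braces inside
INNERMOST = re.compile(r"\{([^{}]*)\}")

def spin(text, rng=random):
    """Resolve spintax by repeatedly picking one option from the innermost group."""
    while True:
        match = INNERMOST.search(text)
        if match is None:
            return text
        choice = rng.choice(match.group(1).split("|"))
        text = text[:match.start()] + choice + text[match.end():]
```

Calling `spin` on the same source repeatedly yields the different variations that eventually get posted to each target site.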

Once you have everything spun you have the perfect base set of content for the campaign. When we come to post the articles we will implement some advanced spinning techniques such as spinning in images and video, HTML elements, placement of our backlink, and related links to other sites.

You don’t need to worry about this right now though I’ll show you how to do it all later on in the series.

If you're anything like me you will probably be banging your head against the desk when you're spinning content. Personally I hate doing it, but let me explain why it is so important to take as much care and attention with it as possible.

Most importantly, we don't want Google to flag our tier 1 links as duplicate content, or we might as well not have bothered building them.

Secondly, manual spinning produces very high quality output. This is important so it doesn't look spammy, and it helps ensure our tier 1 links never get removed by a moderator.

And thirdly we can use the content hundreds of times throughout our campaign without a hitch.

The Best Spinner has a handy little tool built in that allows us to compare articles to check how unique they are. If you just click on publish up here and then generate & compare, you can generate as many articles as you want from your spun article and then compare them all to see how unique they are.

Just to show you what it looks like I have generated 50 articles and compared them all to each other and you can see we’ve got some pretty high percentages of uniqueness here and this is before any of the advanced spinning techniques.

We can actually use this article 500 times quite safely; uniqueness is going to sit around the 65% mark at worst, and that's before taking into account all of the advanced spinning as well. So if you're unsure how many times you can use a spun article before it starts becoming duplicate content, just whack it in here and have a play with this number to see what the results are.
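TBS's compare tool is a black box, but you can approximate a "percentage unique" score yourself by comparing word shingles between two generated articles. This is my own rough stand-in, not TBS's actual algorithm:

```python
def shingles(text, n=3):
    """Break text into overlapping n-word sequences."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def uniqueness(article_a, article_b, n=3):
    """Percentage of article_a's n-word shingles not shared with article_b."""
    a, b = shingles(article_a, n), shingles(article_b, n)
    if not a:
        return 100.0
    return 100.0 * len(a - b) / len(a)
```

Generate a batch of variations, score every pair, and the worst pairwise score tells you roughly how far you can stretch one spun article.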

Now let’s take a look at why it is so important to build your own list of target sites to link from.

First of all, you will get your links on loads of sites that other people don't, which helps set you apart from the crowd.

You should also go through your list and make sure the domains you're placing your links on are still indexed in Google. While I can't categorically say building links from deindexed domains will hurt your site, I can say with confidence that it does not benefit your site.

You should go through your target list every month and remove any deindexed domains.

Scraping your own list of targets is pretty easy once you have things set up right. With my tips in the next couple of slides, it really isn't any more complicated than hitting a few buttons.

Whatever you do, do not skip this part of the tutorial. Scraping your own list of targets is essential.

You can find a sample of my personal list under this video on my blog to download to get you started, but you still need to find your own targets.

We are going to target all of the platforms that you can see here.

For article directories we are going to go after Article Beach, Article Dashboard, Article Friendly, Article MS and sites using the WordPress article directory plugin.

For blog posts we are going to rely on the Elgg and Jcow platforms

Bookmarks come from platforms like Pligg, Scuttle, Scuttleplus, PHPDug and Public Bookmarks.

Web directories are provided courtesy of PHPLD and videos will be posted to the Clipshare platform.

Wikis come in the flavours of MediaWiki, WikkaWiki, MoinMoin and TikiWiki.

Scraping link targets is easy when someone else has done all of the hard work for you.

You can download my personal set of footprints for all of the platforms I have just mentioned underneath the video on my blog.

You can copy and paste these into the footprints.ini file in the Scrapebox configuration folder so they will be available in the footprints drop down list.
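For illustration, a footprint is just a search query fragment that matches a platform's boilerplate text or URL structure. The examples below are widely known generic ones I've added for context, not the downloadable list itself:

```ini
; Illustrative footprints only, not the list from the blog download
"Powered by PHPLD"
"Powered By Elgg"
"Powered by Scuttle"
inurl:"story.php?title="
```

Each line becomes one entry in the Scrapebox footprints drop down.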

You will want to merge your list of related keywords with the footprints to try and find as many relevant targets as possible.

You can scrape even more targets by merging in my list of common words and domain extensions, which you can download under the video.

So let me show you how quick it actually is to scrape your own list of targets. First of all, import the list of related keywords that you saved in a previous step. Then come up here, use the merge button, and merge in the raw footprints file, which is available to download underneath this video on my blog.

Then hit the merge button again and merge in the list file of common words and domain extensions. Again, that's available to download under this video on my blog.

What this actually does is generate over 280,000 different queries from our footprints, related keywords and common words, which we can use to compile a huge list of target sites.
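Mechanically, the merge is just a cross product: every footprint combined with every keyword and every common word, which is why the query count explodes to 280,000. A sketch of what Scrapebox is generating (function and names are my own):

```python
from itertools import product

def build_queries(footprints, keywords, extras=("",)):
    """Cross-merge footprints with keywords and optional extra terms into search queries."""
    return [
        " ".join(part for part in (footprint, keyword, extra) if part)
        for footprint, keyword, extra in product(footprints, keywords, extras)
    ]
```

With 50 footprints, 1,000 keywords and 5 or 6 common words, you land in the same hundreds-of-thousands range shown in the video.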

Once we start harvesting we will end up with an awful lot of results; you're probably going to want to leave this running overnight, depending on how many threads you're using and whether you're using private proxies to harvest.

But once it has completed, come up here, remove and filter any duplicate URLs, then export the URL list as text and save it somewhere, as we will be importing it into some software later.
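If you'd rather trim the harvest file outside Scrapebox, keeping one URL per domain is straightforward. A sketch (assumes one URL per line; the function name is mine):

```python
from urllib.parse import urlparse

def dedupe_by_domain(urls):
    """Keep only the first harvested URL from each domain."""
    seen, kept = set(), []
    for url in urls:
        domain = urlparse(url).netloc.lower()
        if domain and domain not in seen:
            seen.add(domain)
            kept.append(url)
    return kept
```

Run it over the exported text file before importing the list anywhere else.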

So let’s talk about my weapon of choice for posting tier 1 links.

I used to use a wide range of software and tools to get the job done. All the usual stuff like Article Demon, Bookmarking Demon, SeNuke X and SliQ submitter.

While all of these tools are fantastic at what they do, running so many different programs was time consuming and difficult to manage. My computer would grind to a halt trying to use them all at the same time.

On top of that, it was a pretty expensive setup. For example, SeNuke X costs $147 every single month, which adds up to nearly $1,800 across a year. That is ridiculously expensive if you ask me.

So to reduce costs, make things easier to manage and improve efficiency I wanted to find a single piece of software that could do everything I need for the first tier.

After trying a few pieces of software, that tool ended up being Ultimate Demon. There are several reasons I settled on it:

It can hit all of the link types we need for the first tier

You can get reports on exactly where your links are in a couple of clicks of the mouse.

It is quick and easy to setup and schedule a campaign

I knew it would be reliable and stable, as I had already used Bookmarking Demon and Article Demon in the past.

It is available for a fixed one-time payment of $347 with the discount code under this video on my blog.

You can get it on a monthly subscription basis, but it's cheaper to buy it outright in the long run.

Once you've got Ultimate Demon installed there are a few things we need to do. First of all, import your list of private proxies into the software.

It’s important that you use private proxies when creating your tier 1 accounts as we don’t want them to get deleted later because of heavily spammed public proxies.

If you need some private proxies check the links under this video on my blog and I’ll point you in the right direction.

Secondly, we need to set up the ping servers; again, just paste your list in here. If you don't have a list you can download mine underneath this video.

Thirdly, you need to set up your captcha solving services. Go with Death By Captcha, simply because it is the cheapest by quite a margin.

Finally, you need to import your sites list. Click on the sites tab over here, click on the green plus and the site detector will launch. Once that opens, select All from the dropdown, paste your sites list in and click start.

Once the import process has finished, open the site detector again and tick this box and then do another run of the import. You’ll probably find that you get a lot more targets added to the software this way.

Once you've done that, come back to your sites list and select them all. Then right click, export the sites to a .csv file, import all of the domains into Scrapebox and run a Google index check.

What you want to do is create a list of the domains that have been deindexed. Once you have that list, come up to the red minus button here, click on the drop down and click on mass deletion. Paste all of the deindexed domains in here and click delete.

We do not want to be building links to domains that have been deindexed, so make sure your list is always clean and only contains indexed domains.
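The monthly cleanup described above is just a set difference between your target domains and the deindexed ones. A sketch (names are my own):

```python
def clean_targets(targets, deindexed):
    """Drop any target domain that appears in the deindexed list (case-insensitive)."""
    banned = {domain.strip().lower() for domain in deindexed}
    return [t for t in targets if t.strip().lower() not in banned]
```

Paste the surviving list back into your tools and archive the deindexed domains so you never re-import them.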

In the next video I'm going to show you how to set up and schedule the entire tier 1 campaign, as well as introduce you to the advanced spinning techniques that will make you stand out from the crowd.

If you have enjoyed this video then please share it with your friends and subscribe to updates on my blog.

If you need them there are links to all of the resources I mentioned underneath the video on my blog.

Sourcing Content

SEOGenerals.com – Fantastic backend for managing your orders and projects. Update: Since publication, the SEO Generals service is not what it once was and the coupon has now expired. Please use 99CentArticles, who offer great service and pricing.

Agreed!
People want the content to learn how to make money online and then some complain when you lead by example.
Look at what Matthew is doing here guys…
High quality content with useful and actionable information, compelling headlines and call to action.
Don’t knock it, learn from it!

I have now found a resource that finally explains tiered link building in a simple way and how to integrate the tools together to make it happen in a Google safe way. Wish I had this before I bought UD. I would certainly have purchased through your affiliate link.

Can't wait for the next videos. Will you be covering the new version of UD with link trees?

The link trees feature looks interesting but it won't be something I use personally; I have a much more efficient and streamlined way of doing things. But it will allow you to create multi-tiered structures within Ultimate Demon easily.

Great videos, do you offer this as a service Matthew? I don't have SB at the moment or Ultimate Demon. I am just building my tier 1 with a Blogger blog and was going to redirect that to my main domain once I had it ranking. It's on page 2 now.

I normally have about 20 unique keyword-specific articles on my sites and follow up with links from article directories, blogs, social networks, etc.

In this section you mention 3 x 500 word articles on a keyword. Does that mean you are linking to all three articles on the website, or are you mass spinning these back to one original copy on the website?

Matt, could you clarify a few points.
1) The 3 x 500 word articles, once spun, will be placed on web 2.0 sites. Does this mean you will have 3 posts on each web 2.0, and the final post will have your link pointing back to your money site?
2) In the video you post your link in the first post to the web 2.0. Isn't it customary to create your profile and wait a few days before posting content, to prevent your account getting deleted by moderators?
3) Would you use the same 3 x 500 word articles, spun, to submit to article directories?
Thanks for your help in advance. I feel I've got most of it down, just need a few clarifications.

I'm having a hard time coming up with the funds for UD. What do you think about manually posting to about 15-20 WEB 2.0 properties…like the "big" ones (wordpress/blogger/etc) for Tier 1 and then using GSA to make the Tier 2 and 3? Would 15-20 be enough? Or would those "big" ones know what I'm doing and shut me down?

Spend some time preparing your content and watching the videos so you're familiar with the tool before you even use it. Then take out the 14 day trial, load your campaigns in and fire them up straight away, using the theory you have learnt in these videos to guide you.

Sorry I haven't got any hands on experience with Magic Submitter but I don't see why you can't use it for tier 1 effectively.

I just see UD as a one time investment without having to worry about anything else, but a few people have asked me to tackle magic submitter so I might start having a play with it and seeing what it can really do.

It's personal preference really! But one-time payment > monthly payment.

Hey Matthew, you are so awesome. My God, since I found you online yesterday I have been shouting your name and telling all my friends about you. I am a total newbie to SEO; I don't know a thing. However I have a good heart to make an impact among students, so I started a blog: http://www.opensourceafrica.com/genius. Can you please kindly look at it and tell me what SEO keyword I should target, so I know where to start in the link building process? Thanks so much already.

Great stuff! Have the downloads been removed? It would be good to be able to get them. I have not used Scrapebox before; do you need to put the footprints in to make sure it finds the right link type?

Ok figured it out now – I should have read it more carefully! That thing with the social buttons is pretty clever.

How does Ultimate Demon tell the difference between the URL platform types?
Does it work it out itself, or do you need to submit them as separate pre-sorted lists based on your footprint criteria?

Hi Matt, again many thanks for the amazing tutorials. I just set up my web 2.0s; I have to admit it did take some time, hopefully I will get a bit faster with practice. Just wanted to ask what you thought of SEO Generals? I just received 3 articles and they were really bad quality. Have you had any similar experiences?

First, this is a great set of videos you've made! I just had a quick question regarding the content and preparation. You had mentioned…

3 x 500 word articles. – Are these spun articles?

Then you went on to mention 2 alternative sentences for each sentence in each article. This confuses me a little bit. Would you mind explaining this so I can set up my content appropriately? Would it be…

3 unique 500 word articles…then write 2 alternative sentences for each of these 500 word articles and then spin them?

Sorry for the double post, but I'm not sure where to use all 3 articles. Or if it's just 1 mass spun article. I watched video 3 and it looks like you used just 1 of the spun articles for the web 2 and article submission directories. Would we just spin it using {Article 1 + Spun content w/ 2 alternative sentences} {article 2 with spun content + 2 alternative sentences} {article 3 w etc}. Appreciate your help Matt!

Just so I’m clear, are you saying here that you use 3 articles prepared for spinning, and use the first of those articles to build links using Web 2.0s, the second of those articles to build links using Wikis, and the third of those articles to build links using article directories and “PRs”?

Is this correct? You use a different article for Web 2.0, Wiki and article directories/PRs?

Hey Matt, maybe you can shed some light on this question/problem and can help some fellow subscribers here. Basically what I'm doing is exactly what you have done here. I first acquired a list of keywords via scrapebox (30 keywords). Then I imported those keywords, the merge list, and the footprints.

I then ended up with around 182,000 keywords/queries. I ended up getting 8 high quality private proxies as well. I imported those proxies and then I hit the harvest button. The problem I'm having is that the cpu usage is fluctuating around 0-2% and it repeatedly stops harvesting at around 9,000 results.

When you say it 'stops harvesting' does that mean the Scrapebox window just pops up showing you the scrape is complete and which keywords were completed and which weren't?

If so the problem is all your proxies are getting banned in Google at which point Scrapebox cannot continue to scrape and ends the process. 8 proxies to scrape 182,000 queries is more than ambitious unless you have threads set to less than 10 or something (which would take forever)

The URLs are getting removed because you have Options > Automatically Remove Duplicate Domains ticked.

I don't see why you would want to try and increase CPU load. If anything you should be looking to reduce it.

Scrapebox doesn't automatically show any kind of complete signal; it just stops harvesting, like a freeze up I guess you could say. I figured proxies would speed up the harvesting rather than it taking the whole 20-some odd hours you explain in the video here. How many proxies do you suggest? I figured CPU output would speed up the whole harvesting process.

I was wondering if you would recommend spinning the words in the Titles. I have created 15 alternative titles and think it may be better to spin the words in there as well. If you disagree, can you please let me know why? I appreciate your help!

Just a quick question. When word spinning your content (after sentence spinning) how many synonyms are you adding? The reason I ask is that I've sentence spun (3 versions of every sentence) and word spun everything in TBS for my article, and TBS still only says 23% unique at the bottom. Not sure if that's an issue. Thanks for your time.

This is actually the best link building guide I've ever read, and it's free. Actually I do the same, just at a smaller scale (until I get all the tools xD), and I never get slapped with those G updates. Thanks for the share, and success!

I have 2 questions on the tiered link building videos (part 2 in particular)….

1. You kindly detail the process of finding your own list of target sites for UD. This is superb information, thank you. However it caused me to wonder about something. You say to merge a list of related keywords, which I understand and have done. However I will be using UD for various projects, which span several very different niches. What I don't understand is that if I do it for one niche, let's say car insurance, then won't my sites list be full of car insurance article directories, bookmarking sites etc? If so, what happens when I come to run a link tree or campaign for a website about fishing bait, for example? Should I scrape again and add those scraped target sites? If so, won't I be submitting hundreds of spun articles to hundreds of sites which won't accept the content, and won't that get my email or IP banned?
Hope I explained this well enough, maybe you can explain where my thinking is going wrong, as I am sure you must be able to run multiple campaigns to different niches in one installation of UD.

2. On your videos you set up a gmail address, which I also did. However on some of my campaigns, already run, I noticed a lot of failures on article directories citing the reason for refusal as something like "Free email addresses are not allowed". Do you get this problem and if so do you no longer use free email addresses? I have many parked domains, I suppose I could just turn on pop3 for some of them and let UD access them. Would you do that, or do you just use free email addresses and accept the failures?

Thanks again for such a brilliant video tutorial, if I had money I would pay you a lot of money to sit with me for a few days and show me first hand how you use UD! (Idea perhaps?)

1. I think you're overthinking it. I just merge in those keywords to find more sites that I can target. As long as the pages your links are coming from have relevant content that makes sense on them, you're good to go.

2. Yes, you can reuse those domains and set up catch-all email addresses so you can make them up on the spot without actually creating accounts for each one. See the UD documentation for more details on how to use this with UD.

Matthew, I have followed your instructions to the letter, for scraping for target sites and importing into UD. However out of several thousand sites, only about 6 were successfully imported into the site detector. Is this normal?
About 80% had "NOT FOUND" and the rest of the failures had server errors either 500 or 403 forbidden.

Similar question here Matt – Seeing a really low success rate with my export from SB as UD runs through the sites and detects what platform they are.

Just wondering if this is down to the footprints, as I would expect a pretty high hit rate if the footprints are specifically digging up sites for the platforms UD supports?

How have you found this, and do you have any tips for improving it?

Also, can't remember if you mentioned this, but I'm glad I've been running antivirus on my VPS, as obviously UD is hitting many sites and AVG is catching the viruses… Just hope it gets them all or my VPS is gonna need rebuilding frequently!

Having purchased Ultimate Demon and Scrapebox, I am trying to set up the proxies and have a general question. I am on BuyProxies and they offer, for $10/m, the 2 subnet semi-dedicated 10 proxies, but which location should be specified: USA, Europe, or USA & Europe? I am not sure if it makes a difference which is chosen, but thought to ask before purchasing these.

Hi Matt, I have a bit of a problem with the UD proxies as well. I have used the format you mentioned above, but as I am using an email directly through HostGator (SSL/TLS), the second part of that fails; see the format below.

Hey Matt getting everything set up right now and transferring the seo software to a VPS. I am checking out the proxies, what package do you use for your seo services? I am sure you do way more than me but just want to make sure I am covered when scraping content. Thanks buddy

Awesome! Thanks for giving all this info for free. By the way, will you make a tutorial explaining how to make SEnuke templates? Or do you know any good link that explains how to do this? I googled but didn't find anything good.

Thank you very much! My problem is that I don't know where tier backlinks should point to, e.g. should web 2.0 profiles link to articles, the money site or something else? I wanted to know if there is a detailed guide on what to link to what.

I purchased Scrapebox and am shocked at the huge list of keywords it has uncovered. Having spun my articles now at SEO Generals, I am about to purchase Ultimate Demon and have a couple of questions.

Normally I use Market Samurai to identify keywords and select 20 for the website articles. So with the list that Scrapebox comes back with, does that mean we use the Scrapebox keywords linking back to the articles on the website, OR can the Scrapebox keywords be used for the website articles as well?

The reason I ask is that many of the 1000 keywords that Scrapebox has come back with are very long tail keywords with very few searches, so I am not sure how it all fits together?

I've been in this business a while and still found the easy way you lay all this out very helpful. One question about spinning: I saw above that you outsource yours.

From what I have found, deep spinning services run around $30 per 500 words plus the article cost. Once you have all that content you still have to spin in links, images and video. You also have to generate all of your titles, descriptions, tags, keywords, etc.

Seems to me that the content generation side of this either costs well over $100 or takes several days to really do right yourself. Even if you buy the articles you likely have to do the descriptions etc. yourself, which combined with the image/video/link insertion, takes several hours. Once you have all of the content it looks to me as though it will take an hour + to set up the campaign in UD.

I guess in the end my question is, does that all sound about right? Obviously if you can get a good oDesk worker to do this at $2-$5/hr you could lower your costs.

The videos do a great job of showing how to set up and run the campaign but it looks like most of the work/cost is on the content side and that is a very involved process. I would think this is a $399+ service just for tier 1, lol.

I am looking to run this whole process on probably around a post a week on just one of my sites, let alone other ventures. Seems to me you really need full time staff if you are working on much more than 1 simple site?

I am getting ready to dive in and set up systems to get this done, just wanted to make sure I am not missing something that would make the content side a lot easier.

Thanks not only for these videos but for this whole blog, it's a great case study!

This design is spectacular! You certainly know how to keep a reader amused. Between your wit and your videos, I was almost moved to start my own blog (well, almost… HaHa!) Excellent job. I really enjoyed what you had to say, and more than that, how you presented it. Too cool!

Hi, Matthew, great videos – really straightforward.
Tweeted (3x) to get access to the Personal Targets, Merge Lists, Footprints & Videos
But not seeing a place to download –
Guessing I’m missing something obvious…
Any help would be greatly appreciated.
Thnx,
A

Got them and just finished the vids – really awesome, cleared up a ton.
One question – I think I missed something obvious or had a brain freeze…
Why the 3 articles?
Is each for a different tier?

Sidenote: I’m a bit paranoid about swiping images, but I was thinking one could take the same image file, make multiple copies of it, then add them to Name Mangler (free, I think) – software that will bulk-change the file names to your keywords while retaining the original extension if you choose – then jump back to what you’ve laid out.
2 cents (pence) from across the pond, I suppose…

Sorry, the 3 articles are for the various tasks. Typically I tend to use 1 article for the web 2.0s, 1 article for the wikis, then the other article for everything else, but mix it up a bit as you go.

I wouldn’t use the same image with a different file name. Given you can search Google by images and not just words, I’m guessing they can detect duplicate images more easily than they can detect duplicate text-based content.

I’m using The Best Spinner and spinning 5 paragraphs and 5 sentences, and getting over 90% unique (without photos/vids) when doing a publish-and-compare with 50 articles. Is there any need to do a word spin if I’m getting that high a uniqueness rate with paragraph and sentence spinning already?
I won’t be using any spun content on my 1st tier or connected to my money site.

Personally I would go the whole hog – even if I was only going to use it those 50 times I would still go through and do the words. While that score is an indicator of uniqueness, it isn’t a stretch to say that Google may pick up on duplicate sentences across 50 articles.

Remember you should be building out your campaigns for the future, not just what works in the here and now!
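The sentence-level duplication being warned about here is easy to measure yourself. A minimal sketch, assuming a naive split on full stops – this is illustrative only and is not taken from TBS or any tool mentioned above:

```python
def sentence_overlap(article_a: str, article_b: str) -> float:
    """Rough Jaccard similarity between the sentence sets of two articles."""
    def split(text):
        # Naive sentence split on full stops; good enough for a quick check.
        return {s.strip().lower() for s in text.split(".") if s.strip()}
    a, b = split(article_a), split(article_b)
    if not (a | b):
        return 0.0
    return len(a & b) / len(a | b)

# Two spins that share one of three distinct sentences score 1/3.
spin1 = "Link building takes time. Quality beats quantity."
spin2 = "Building links is slow work. Quality beats quantity."
print(sentence_overlap(spin1, spin2))
```

A score near 1.0 means two spun outputs share most of their sentences verbatim – exactly the kind of footprint the reply above says word-level spinning helps avoid.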

First of all, nice videos! One of my resolutions for 2013 is to get more involved in IM and SEO in general. I found your blog yesterday through BHW, and THANKS for all the information. I really enjoy your transparency.

Secondly, like other people, I have shared your post over Twitter/Facebook and no link…

Matthew, great tutorial series (found it on BHW)! I’m on my second time through. I’ve got at least 3 large projects I will be applying these techniques to. Now for the cash outlay to purchase the rest of the tools I don’t already have… I’ll be purchasing through your links.

How are you factoring social signals into the mix these days? Seems the Big G is emphasizing them more and more.

That’s very kind of you, thank you, and I’m glad you’ve enjoyed the series!

Social signals are actually covered in video 6. They are supplementary tactics that you can use to bolster your efforts, but I do think the future of link building will be social based. Essentially a ‘link’ is one person saying they approve of something else, which is exactly what social sharing is by definition.

So I think we will be moving to a more socially driven search in the coming year or two. I’ve been giving my key tier 1 properties social signals for a while now in anticipation of that shift.

Well, my link building process has evolved over the years. I usually have 1 main process along with 3/4 variations which I use to refine the main process further – it’s constantly evolving.

Well, you can use the footprints, merge list and your own list of keywords to create your own target list. But I guarantee my personal target list has been used fewer times than any that get posted on forums etc.

I use your two files and a list of keywords I made up. I got a list of 300k keywords, ScrapeBox says. I click on harvest like in your video, remove duplicate URLs and save as a text file… but when I look at the URL list they look very poor and low PR. Would it be better just to download a high PR (4-8) list of article directories, web 2.0s, bookmarking sites etc.? Or have I missed the point? Look forward to your answer. Love the series – I think you need to do one on the new link wheel model or something as well… Thanks Matt, keep up the good work!

Hey Matt,
Just a heads up – I don’t know why this is happening, but ScrapeBox is misreporting indexed links as not indexed. I would manually check some of the output links reported as de-indexed before you remove them from your database.

I’ve got a ticket in with SB support now, but searches online show people complaining about this as far back as 2011.

Matthew, you are awesome. A lot of what you offer in the way of tutorials is better than paid memberships I have participated in. It’s amazing! I know you plug a few products, but I believe your promotion to be of a helping nature, well founded in your experience. I was on the fence about UD but I know I will now purchase it, as well as ScrapeBox, through your link. I just hope there are some tutorials offered by the developers of those programs to help tie everything together. Will be spending a lot of time with you here, my friend. I may need to watch some of these a couple of times for it to completely register, lol. Appreciate it more than you know. THANKS!!!!

Thank you very much, glad you’re enjoying the tutorials. I had to make the decision of releasing them as a product/WSO or just giving them away for free. I actually felt like putting a price on them and associating them with WSOs would devalue them, so I just gave them away for free =D

My tutorials will tie it all together for you, no worries, and thank you for buying through my links – much appreciated!

I’m using ScrapeBox for the first time following your video and have a small problem.

I produced a list of keywords as suggested in ScrapeBox in video 2, then put this into ScrapeBox again to produce a bigger list, which came back with over 1,000,000. However, the harvest has now been running for 2 days and is still going. Should I abort and start again using the first list of keywords ScrapeBox generated?

Update: when I mentioned 1,000,000 above, this was the total queries after the merge files were added as indicated in the video. Perhaps I should have used the original list of 30 scraped keywords rather than the 163 that came back after scraping the list a second time?

My question is: I harvested a list of 1,000,000+ URLs with ScrapeBox using your footprint and merge list and my keywords. After removing duplicates I still end up with a big list of ~200,000 URLs…

Then in UD, I am supposed to add these URLs in the site detector like you mention at the end of the video (UD recommends keeping it below 10,000 items)… so I tried with 10,000 items; it took about 45 minutes and stopped at 99% with only 10-15 successes detected…

So what do you recommend – should I paste my huge list of 200,000 URLs to get the most out of it?

Thanks for your answer. I really appreciate that we can actually communicate with you…

I scraped probably 2 million URLs using your footprints and merge list. I had 2 lists of 300k URLs to check with the UD site detector, and I checked those lists with the tickbox as well… Then at the end I got a target list of only 1,800 sites!!! (most of them forums)

So I ran an index check in ScrapeBox – only 3 are indexed…

So is something wrong with the scraping part…?

Can I use your target list? (But what would be the point of scraping your own list of targets then…) I’m kind of frustrated right now :/

No, sometimes you’ll get lots of new sites, sometimes not many and sometimes none – that’s just how it is. Scraping your site list and refining it is a continuous process that should be repeated monthly at a minimum.

Thank you Matt for these videos. I am on the third round of watching them and each time I seem to pick up another point that I need to address.

I am now ready to get going with my first campaign. I have downloaded your target list but am unsure which sections to upload them into within UD. You have broken them down into what appears to be sub-criteria of the way UD splits them up.

Please can you indicate which type of site each of your folders relates to.

Matt, can you please make a list scraping series? I tried to scrape those footprints. I trimmed URLs to root, removed duplicate URLs and entered them into Ultimate Demon. I got fewer than 500 sites (from 5k unique domains), most with very low PR.
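The trim-to-root and dedupe steps described in this comment can also be sketched outside ScrapeBox. An illustrative snippet, not ScrapeBox’s actual implementation (the `harvested` list is made up):

```python
from urllib.parse import urlparse

def roots(urls):
    """Trim each URL to its root (scheme + host) and drop duplicates,
    preserving first-seen order."""
    seen, out = set(), []
    for url in urls:
        parsed = urlparse(url.strip())
        if not parsed.netloc:
            continue  # skip anything that isn't a parseable URL
        root = f"{parsed.scheme}://{parsed.netloc}/"
        if root not in seen:
            seen.add(root)
            out.append(root)
    return out

harvested = [
    "http://example.com/blog/post-1",
    "http://example.com/about",
    "https://other-site.org/dir/page",
]
print(roots(harvested))
# ['http://example.com/', 'https://other-site.org/']
```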

Hey Matthew – Thanks for the tweet. I can appreciate the time and effort put into the videos.

This is why most people are not fond of SEO. It’s centered around beating the system – especially these advanced methods. It’s centered around beating Google.

The problem is Google is getting smart, and their search technology is evolving. There are some clues out there that links are becoming less important to SERPs. I wonder if, in the not too distant future, all of this work that you suggest will be worthless.

Spinning, scraping, and link building are not foreign to Google. They’ve learned that great content is not at the end of this rainbow. These techniques are tried and true, and very powerful over time. However, anyone using these techniques needs to be careful, and realize that the roof can cave in at any moment.

Yes, Google are at long last getting smarter – a lot of the updates in the past few years haven’t really changed anything (use HQ content, diversify keywords – standard advice for years), but with the introduction of AuthorRank and social signals this year it will be an interesting time.

But that is why the tiered system is so good: as long as people ensure they have the ability to remove any tier 1 links, you can instantly disconnect your site from any links Google may decide are troublesome in the future that aren’t now.

Matt… I followed your videos through and now I have a two-week wait. So I’ve started on a second site… and followed videos 1-3 all the way through again. When I go to each project in UD, the sites it’s got in the database that I scraped for each project are exactly the same according to UD… is this right?

I assumed that with the different keywords in ScrapeBox it would have found different sites?

Matthew, when you say you “wouldn’t build links to a new site”, do you mean to say that your tiered link building method should not be used on a newly created website/domain? Do we have to wait for a site to be of a certain age or have a certain amount of existing authority or traffic before we can use this method on it? Please elaborate as I would really like to know whether I can use this on a new website.

I’ve used them. The only thing I like about iWriter is the fast turnaround. The quality is unique too, but don’t expect any life-changing content. I did use them for quite a while but stopped because the quality isn’t what I wanted.

Thanks for all the awesome tutorials. I tweeted and liked the video, but got no links to download your lists. I scanned this page up and down to make sure I didn’t miss it, but I still don’t see it.
Could you help me out?

Hi Matthew, I love the tutorials. I am only starting out trying SEO – I usually outsource it to someone else. I am going to try to put your tutorials into action with a site I am working on at the minute. It is an OpenCart site. I have one very basic question before I start:
should I submit the site to Google through Webmaster Tools, or just jump straight into what you are doing in the tutorials?

Preparing for the campaign, you said I need 3 articles, and then I must write 2 alternatives for each sentence of each of the 3 articles… but why not actually buy 9 unique articles? It’s faster. Or maybe I’m not getting this right…?
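For context on why sentence alternatives beat buying more articles: with 2 alternatives per sentence, a 10-sentence article in spintax gives 3^10 (around 59,000) possible combinations from one source article, versus exactly 9 variants from 9 bought articles. A toy spintax expander, just to illustrate the {a|b|c} format – this is an assumption-laden sketch, not how TBS or UD actually implement spinning:

```python
import random
import re

def spin(text: str, rng=random) -> str:
    """Resolve {a|b|c} groups (innermost first) by picking one option each."""
    pattern = re.compile(r"\{([^{}]*)\}")  # match a group with no nested braces
    while True:
        match = pattern.search(text)
        if match is None:
            return text
        choice = rng.choice(match.group(1).split("|"))
        text = text[:match.start()] + choice + text[match.end():]

spun = "{Link building|Building links} takes {time|patience|effort}."
print(spin(spun))  # one of six possible sentences
```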

Hey Matt, I’m having a hell of a time getting these lists of targets imported into my Ultimate Demon software. On the first run, only 14 stuck, and 23,000 could not download or were not found. Do you have any tips here? Am I doing something wrong?

For the social bookmark, web directory and video titles, do we need 3 titles and 3 descriptions for each, or one for each? Also, can you give an example of what makes a good title and description for each one?

Great tutorial! Very well put together, and I agree with the previous comments that there isn’t anyone, anywhere on the web, who publishes content as invaluable as this. My question for you Matthew is…

When uploading your target sites from ScrapeBox into UD, what confirms that the sites being uploaded are valid web 2.0s, article directories, social bookmarks, etc.? And if UD has no problem sorting them out, should we use semantically related keywords in ScrapeBox to scrape for additional target sites – ones that weren’t picked up from the original set of keywords?

Wow, what a site. This video tutorial is superb. I know you said it took you 8 hours to create it, but let me tell you that you have easily saved me a month or more. I have been spending the week hunting through poorly written, garbage descriptions and advertisements to figure out some Multi-Tier marketing best practices.

Your site rocks, your videos rock, and I will be viewing all of them tonight.

Hi, like I said earlier, fantastic resource. I am excited about using ScrapeBox with the keywords and footprints you created, only I cannot locate them. It says on this page that if you tweet, you will get those resources. I did that, but for some reason I am unable to download them. Please help.

What determines the number of private proxies I need? I have the general concept of them, and I read that a 20,000 URL blast a day with 10 would be good, but can you give me the rundown on what determines how many I would need?
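One rough way to reason about proxy counts: divide your daily query volume by how many queries a single proxy can safely make in a day. A back-of-the-envelope sketch – the 15-second "safe delay" is an assumption for illustration, not a figure from ScrapeBox or any proxy provider:

```python
import math

def proxies_needed(queries_per_day: int, safe_delay_seconds: int = 15) -> int:
    """Proxies required so no single proxy queries more often than the delay."""
    queries_per_proxy_per_day = 24 * 60 * 60 // safe_delay_seconds  # 5760 at 15s
    return math.ceil(queries_per_day / queries_per_proxy_per_day)

# 20k queries/day at one query per 15s per proxy:
print(proxies_needed(20_000))
```

The real number also depends on how aggressively the search engine captcha-blocks, so treat this as a lower bound.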

I’m working on my backlink strategy and your tutorial is helping me a lot.
But I’m French, and I would like to know if writing a post on sites like EzineArticles in English has the same effect, or do I need to find some other French sites?
I can find them, but they have less authority.

Hi Matthew, I am new to this so please bear with me. Whilst going through your tiered link videos I lost track a little. What threw me was how to do tier 1 in SEnuke, because the SEnuke video is an addition and not core to the tiered linking videos. Once you have 3 unique articles, is it these that I should spin by following your expert spinning video and have them all linking direct to the money site? Is that safe in your opinion?
And then link wheel tiers 2 and 3 as per your video series?

Yes, that is right – just substitute video 3 in the series with the SEnuke tutorial, but then see the advanced spinning tutorial for a more detailed look at how to prepare the content as seen in video 2.

Hi Matthew,
In video 1 we get the huge list of sites from ScrapeBox. Is it this list you paste into UD in video 2? I’m not so sure, because those won’t be tier 1 quality links – or did I miss a beat?
Thanks
Nicolas

I was just wondering: I prepared my tier 1 with SEnuke XCr social network links, some PDFs and some high PR wikis, then created a campaign in GSA Search Engine Ranker later that night. 7 hours later I have submitted around 400 tier 2 links with GSA, but I only have around 20 LIVE links at the moment – is that normal?

Thanks for all of the awesome videos. Extreme newbie here just starting out. One step is confusing me. After you scrape your URLs and export them as text, you import them to UD. You then export them again back to ScrapeBox to check the index status of the URLs. Why would you not just check the index status of the URLs straight away, after you’ve scraped them in ScrapeBox?

Because when you do the first load into UD, only a very small percentage of them will be added to the main UD database. So instead of doing an index check on ~50,000 URLs, you only need to check a few hundred.

Hi there! I could have sworn I’ve been to this website before, but after browsing through some of the posts I realized it’s new to me. Anyways, I’m definitely happy I found it and I’ll be book-marking and checking back often!

Hey Matthew, Gerry here from Ireland. I’ve been following you for a while now, and I can tell from your writing and your tutorials that you have vast experience with a no-BS approach. Is there another way to contact you other than here? I gave your vids a share as I thought it was the least I could do. Also, as these were made quite a while back, is this still a viable method now?

I just love your tutorial videos and all your blog in general. I still have some noobie doubts though – sorry if you already answered them:

1. When you say “scrape your own target lists for automated software”, what makes the difference between Pligg, JCow and all of those platforms? Aren’t many people using the same footprints? So how could I avoid adding my content and links on platforms that are too well known (spammed)?

2. What is the best way to find new platforms for our tiered link building? I mean, Pligg, JCow, Drupal, etc. are famous platforms, so is it better to post on less popular platforms to avoid posting where everybody else is doing it?

Both questions are very similar, but I’m about to start my tiered link building, and I don’t want to do it the wrong way.

Thank you for spending the time to help others in this crazy, confusing business. I had a question – hope I didn’t miss it somewhere, but here goes.
So when building the initial tier 1 level, should I direct it towards http://mysite.c0m, or should I build out a separate set of tier 1s for every page, i.e. mysite.com/how-to-skin-a-cat, etc., or should I mix them all up? Hope that makes sense.

Would each of the campaigns need 3 more articles and their various titles, descriptions, etc., or can you use the original 3 articles? If I had a website with 5 URLs I want to target, would I then need 15 articles or just the 3 original ones? Thanks.

I’m an agency SEO and the majority of the work I do for my clients is white and grey hat. I’ve decided to buy Ultimate Demon, not for any of my current clients but for a side project I’m doing.

I haven’t done any major black hat campaigns before, so I’m actually looking forward to it!

However, I’ve had a problem buying Ultimate Demon through MyCommerce. Long story short, they had an issue processing the order and I’ve had to pay twice! I’ve supposedly got a refund/cancelled transaction, but if you get double commission you owe me a pint! Haha

1. In this tutorial you built them with UD or Licorne, but on the other hand you stress they should be hand built. What is the best way in your opinion then?

2. Granted you have built a few web 2.0 sites manually, can you easily add those URLs to a T1 link campaign of the kind you set up in this tutorial, or do you have to create a different task/campaign for existing web 2.0 accounts? I recently bought Licorne and therefore I’m mainly interested in how to manage this in Licorne if possible.

3. Referring a bit to question 2: can you generally use existing T1 accounts or sites (created by UD/Licorne or just manually) to create new content and backlinks on these properties with Licorne and UD, and build them out? This question may sound a little stupid, but I didn’t find this issue described in the tutorial videos yet. You only recommend repeating the process and creating completely new T1 accounts (if I understood this right).

1) They are very different things. The ones that are built with UD are not comparable to the hand built ones. What UD calls a web 2.0 site isn’t always a web 2.0 site with a dedicated subdomain like we create by hand.

Hey Matt, I could really do with your Personal Targets, Merge Lists, Footprints & Videos, but I don’t have a FB, Twitter or G+ account :( Can you sort me out please, as these downloads will be a great starter for me?

Hey Matthew, I have downloaded all of your videos and they’re great. But I have one question: when I asked at WarriorForum, people were saying that article submission direct to the money site is dead now. So can we use article submission in tier 1? Or does it have a negative impact on rankings?

I get what you mean by a “set” and that we need to “spin” all the titles and descriptions.

But I want to be sure I know what those things are supposed to look like. Could you please give me an example of each of those three types of set that you would consider to be of suitable quality?

I have tried looking at these on various other websites, but they seem to look different from one site to another, and I would like examples of ones done the way you do them in this process, so that I can feel certain I’m doing them right.

I would really appreciate you giving me an example of these three things, or just posting links to three pages that have ones that are done right.

Just a quick question about the 3 base articles. Do you have these written in a specific way? What I mean is, does the article you use for a press release have to be a certain style, i.e. read like an actual press release?
Or do you just have three articles written about a niche-related topic?

I wonder how to manage the target URLs the right way in terms of already used targets versus targets I have not used yet. Let me give an example: say I scraped 1,000 target sites (bookmark sites) but only really posted to 200 of them in a bookmarking campaign (because I limited this T1 campaign to 200 operations). When I use the same target list (1,000) again later on, UD, Licorne and so on will target those 200 sites again. How do you avoid this problem in your daily work? Do you scrape completely new sets of target sites for each run, or do you delete the used ones? In other words: how often do you use a list of target sites for your link building campaigns within one project?

I’m curious what you would say if I offered to buy you a pint and asked how you would change your tier one campaign if you intended to use it to diversify anchor text. Would you substitute domain-type anchor text for some of the keywords scraped via ScrapeBox?

I want to scrape for article directories in the German language which are supported by Licorne AIO. I know you prefer UD, but maybe there’s no difference. Can you give an example of footprints to scrape these targets? Thanks in advance.

I did so, but there are no article directories in the results that Licorne supports. Instead I discovered that many German article directories run on WordPress. I’m just trying to find out if you can submit to them using Licorne, and I posted this as a question inside the Licorne forums. By the way, do you know if submitting to WordPress sites running as article directories works with Licorne and UD (I don’t mean blog comments)?

I’ve just tried using ScrapeBox to harvest 4,120 keywords with 4 shared private proxies, and all 4 got banned straight after the harvest, which probably took about 10 minutes. I’ve even deselected multi-threading. They still seem to be okay if I test them in the proxy manager or use the keyword scraper.

I just began scraping with different keywords for my current niche site project. Since I have other projects in different niches, I wonder if I should scrape new targets for each project or if I can use this target list for all my projects. What is your opinion on this issue, and if I am allowed to ask: do you manage different target lists for your projects? I think it could make sense for certain platforms, but article directories, for example, are good for almost every link building campaign because they mostly have lots of categories. Please tell me if I’m wrong or not in your opinion. Replies from others would be nice too.

This is pretty good info – not new, but it’s pretty good… but I think you have to be a crazy 20-something young guy to actually spend so much time doing this mind-numbing stuff – SEO.

Life is too short. I do spend some time doing SEO of sorts, but mostly I pay for services that I have found to work. I just don’t have the time, or the desire, to ever do this kind of stuff again…

And, actually, SEO is NOT easy and it IS complicated, despite what SEO courses/guys selling SEO courses will tell you. So if you don’t have the time, resources, knowledge etc. to spend time testing and keeping up on things… well, good luck… but to each their own.

Sure, and it is good info, and I have learned much of it as you have… not a lot has changed in the basics. One thing I won’t be doing is sending thousands of spammy links to any tier, no matter if it’s the “3rd tier” – I believe there may be a few footprints that G can follow sooner or later.

I actually do spend some time doing my own SEO :) … but for some parts, I find people to do certain tasks – not the whole enchilada – so I keep some control.

I am currently following your tiered link tutorial and have nearly completed spinning the articles.

SEO is new to me, but this tutorial is helping me learn fast, for which I thank you.

I am, however, a little unsure about preparing the social bookmarking and web directory titles and descriptions.

Are these titles and descriptions targeted towards the articles, i.e. is each set of titles in reference to one of the 3 article subjects, or do you point the directories/bookmarks to your money site with the title and link referring to there?

I’m having a bit of a nightmare with ScrapeBox proxies – wanted to get your opinion.

I have purchased 10 dedicated proxies from BuyProxies, and also tried 10 semi-dedicated from Squid. I’m using only 1 connection to harvest, but it’s going extremely slow, e.g. an average of 1 URL per second, sometimes none. The proxies test fine for Google, but then seem to lock up when harvesting?

When I try with public proxies it goes fine, and when I tested with my own IP address (no proxy) it went super fast. The problem is it’s tough maintaining the public proxies, so I was really hoping to use the private ones I bought!

One thing I have learnt is that the ScrapeBox proxy tester is not really accurate when it comes to Google. It checks if the proxy can access Google, but doesn’t check if it’s been captcha-blocked. So it can sometimes say a proxy is G-passed when it’s not. Using Google Proxy Checker (BHW forum) has helped assess these more accurately.

Here is my testing from several proxy providers:

Purchased 50 dedicated from Squid: had to contact support to get the proxies changed for Google/ScrapeBox, then got them working to a degree. 20 were already captcha-blocked and 30 are currently working with 2 connections.

10 dedicated from BuyProxies: these seem to have been captcha-blocked from the beginning. Giving them the benefit of the doubt and will test when they are unblocked.

Proxy-Hub: proxies not working at all in ScrapeBox – support said they should be fine but they are not. Support slow to respond.

1. I get a list of over 2,000 target sites after harvesting in ScrapeBox, which I paste into UD; however, I only get about 8 successful sites detected. When I then tick the subfolder option I get zero sites detected. Am I doing anything wrong? (I did change the dropdown list to All.)

2. Also, there is a huge list of sites already pre-loaded in UD. Is it safe to create links on all of them plus the target sites, or do you have any tips you can share on this?

I’ve gone through your tutorial several times now. You mention the importance of having the ability to remove tier one links, but I’ve been unable to find a place in the tutorial where you show how to do that.

Hi Matt, 2 questions:
1. When merging your ‘merge-list’ and ‘footprints-raw’ list with my keywords, I get an error message every time after finishing or aborting, and ScrapeBox shuts down. Do you have an idea why this happens? Scraping related websites, for example, works fine. This is the first time I’ve seen this error.

2. You say the following 2 things:
– write yourself 1 title with a 3-sentence description for every type (so 3 sets in total, meaning you end up with 3 titles and 3×3 sentence descriptions)
– you should write 15 alternative titles for everything, including your articles, bookmarks and so on…

Hi Matt, thanks for your reply. I also made this comment yesterday – I think it disappeared because I made 2 comments, not sure about it. But these were the questions:

1) You said in the tutorial that you have to make a 3-sentence description for every title. So how many descriptions do I need to make exactly? You said 3×3 sentence descriptions in the tutorial, but I’m not so sure about this anymore, because you also first said only 3 titles. But I can’t imagine we have to do 45 descriptions.

So are 3×3 descriptions, with 2 alternative sentences for every sentence and the sentences also spun, enough? Or how many do you do exactly? Hopefully you can explain it in a very clear way.

2) I’m using SquidProxies, but my proxies got banned after an hour. I also read here in the comments that you don’t advocate SquidProxies, because you have had problems with them too.

So I think it’s best to buy 30 semi-dedicated proxies from BuyProxies. But which settings in ScrapeBox are best for this, so I will not get banned when scraping 24/7? These are my settings now for 25 private SquidProxies:

Hi Matt, great stuff. I’m confused about something: you talk about “pasting your personal set of footprints into the footprints.ini file in the ScrapeBox configuration folder” so that it will be available in a drop-down menu, and yet in the video we only see you pressing the button in ScrapeBox to merge in the raw footprints and then pressing it again to merge in the list of common words.

If we can just do it that way, can we refrain from modifying the footprints.ini file and still get the same results? Or do we need to make that modification to the footprints.ini file in order to use your footprints?

Hey Matt, I am not sure how to thank you for the tutorials… it’s mind blowing. This write-up inspired me to try and do SEO across our 5 eCommerce sites in-house instead of spending hundreds of dollars every month contracting it out. I also started buying up old domains with the intention of starting a PBN… thank you.
If you’re ever in Texas, drinks are on me!

I also used your link to sign up for the monthly UD subscription.. will likely buy the full version next month.

After I scraped 78k keywords and merged them with your footprint and common word files, I got over 2 million target URLs, and when I loaded them in ScrapeBox, it kept crashing. I ended up using only my keywords merged with your footprints (skipped the common words), which yielded 200k target URLs.
I loaded those into UD and have been running the site detector since 1pm ET Saturday – 49 hrs ago.

These are the current stats:
32,601 domains in queue
Success: 395
Undetected: 19,302
Duplicates: 32
Connection errors: 894
Estimated time left: 71 hrs

I am using 50 semi-dedicated proxies from BuyProxies.
This means it checked 20k domains in 48 hrs… still over 30k to go…

1. Does it usually take this long?
2. Apparently it loaded only about 50k – is there an easier way to know which 150k domains were skipped?
3. I have yet to uncheck the folder option to run the detection a 2nd time… is this necessary at this point? I do not have another week to run the same 50k domains.

When I load your footprints file and merge file against 800 keywords in ScrapeBox, I get a total of 7,296,000 queries. But when I press the harvest button to get a list of related target sites, it keeps crashing.

I have emailed ScrapeBox support about this, but I wanted your thoughts. Have you ever encountered anything similar, or do you have any possible solution?
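The crash described here is likely just combinatorial growth: each merge multiplies the query list. A quick sketch of the arithmetic, with a simple chunking helper – the footprint and merge-word counts below are hypothetical, chosen only to reproduce the commenter's total, and the chunk size is an arbitrary assumption:

```python
def total_queries(keywords: int, footprints: int, merge_words: int) -> int:
    """Each merge step multiplies the list: keyword x footprint x merge word."""
    return keywords * footprints * merge_words

def chunks(items, size):
    """Yield fixed-size slices so a harvester only ever sees one manageable batch."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

# 800 keywords merged twice blows up fast (hypothetical file sizes):
print(total_queries(800, 30, 304))  # 7,296,000
```

Splitting the merged query list into chunks and harvesting them one batch at a time is one pragmatic way to keep memory use down instead of loading millions of queries at once.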

Thanks for these amazingly helpful tutorials Matthew. I have a question about sourcing a video gig from Fiverr which you mention here. What kind of video are we looking for? If the money site in question is for an affiliate product, can this video be a testimonial? Let me know if I’m on the right track, thanks!

In the UD software, after I add the sites from the site detector to the site list, you say to select all the sites and then run a Google index check in ScrapeBox. But in UD there are already around 100 sites that were preloaded with Ultimate Demon. Is it okay to keep those as part of your tier one, or do you recommend deleting them and only using the target URLs from the site detector?

As with any approach to link building you need to diversify and mix things up. If you combine this with replicating competitor links, building a private network, creating good content and driving social signals, you are winning!

I currently use Licorne AIO but I have some difficulties with it, and the same goes for Scrapebox. But back to the topic: do you know if you can create Google, FB, Twitter and Pinterest accounts and submissions with Licorne? And if not, is it possible to do that with UD?

Unfortunately it is over my head in some areas and definitely out of my budget – I calculated over $700 worth of tools.

I am so new to backlinking and struggling to set up my link building platform. I have found that many of the web 2.0 properties that allow do-follow links are sites where I have to create a site (a mini site, I guess).

I definitely was not expecting that! I thought that I could just put my article up and be done. No one seems to teach the best way to QUICKLY set these mini sites up.

I don’t want to spend days trying to set up all of these sites.

I thought you said you had a post about that but I haven’t found it yet.

Long story short – I can't employ your method right now because it is so out of my league, mostly financially, and in other areas I'm just befuddled, like proxies and all that.

Why do you use 99centarticles? The prices and the whole "feel" of the place are all wrong. I feel like I'm being scammed from the home page on. The site must have been designed before computers were invented!

Hi Matthew,
Thanks for the tutorial. I managed to follow the steps and, using the footprints you provided, scraped more than 14,000 links with Scrapebox. As I only have GSA SER at the moment, can I import them into GSA? Are there any additional steps, like filtering out unusable links, or can GSA work that out itself? Have a good day.
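On cleaning a scraped list before importing it into another tool: one common filtering step is de-duplicating the URLs down to one per domain, so the same site isn't targeted repeatedly. A minimal stdlib sketch (the URLs are made-up examples; nothing here is specific to GSA SER):

```python
from urllib.parse import urlparse

def unique_domains(urls):
    """Keep the first URL seen for each domain and drop the rest."""
    seen, result = set(), []
    for url in urls:
        domain = urlparse(url).netloc.lower()
        if domain and domain not in seen:
            seen.add(domain)
            result.append(url)
    return result

scraped = [
    "http://example.com/page1",
    "http://example.com/page2",      # duplicate domain, dropped
    "https://blog.example.org/post",
]
print(unique_domains(scraped))
```

The deduplicated list can then be saved back to a text file and imported normally.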

Hey Matt,
Thanks for the videos, they have been very helpful in breaking everything down. I have a basic question about the 3 x 500-word articles when implementing them in UD. Currently I have one of them spun properly in TBS (by the way, it is mind-numbing at times).

My question is: when submitting them in UD, are you submitting just one of them to the 100 web 2.0 directories in your video, or are you incorporating all three at the same time somehow? I may have missed that in your video.

Also, I have yet to pick up UD; so far I just have TBS and Scrapebox, and I'm getting everything set up before I buy it. Do you just put in your article in spun format and UD does all the spinning like TBS does? And after you launch the article once in UD to the 100 directories, is it then useless to re-use, since it could produce duplicates if submitted later, or does UD remember what it has spun previously and re-spin it differently down the line?

Hope you can answer some of these questions, and thanks again for your videos.

You’ve recommended 99centarticles in your tutorials most of the time, and it seems that you’re quite satisfied with their quality.

My question is: can I use their service to have 1,000-word articles written for my own (primary) site, not for link building? Assuming that I’ll tweak the articles to add more detail and value, should I go to them for 10 x 1,000-word articles for my site?

Hi Matthew Woodward,
I’ve built about 10 web 2.0 sites (3-5 articles per site, with 1 to 2 articles containing backlinks to my money site). Google has indexed these sites, but when I check the backlinks of my money site, I don’t see any of them. Why is this?
Thanks a lot

In your example from video 2 (and with the raw footprints), when you scrape, do you also include websites without PR (e.g. showing N/A)?
And when commenting/scraping, it needs to be niche related, right?

And do you go over the websites manually to see if they are related?

No, I’m not familiar with manual approval in WordPress yet. But thanks for taking the time to answer every question/comment, even when I’m not the one asking. I’m still reading to learn from other people’s questions!

Thank you for all this wonderful knowledge you are sharing with us. I have a few questions:
1. The link to Linkwheelbandit doesn’t seem to work. Is it no longer on the market?
2. Is there any particular reason why you don’t use GSA for tier 1?
3. What tools, services or software (hidden costs) are involved for someone who buys your premium tutorial?
I already own GSA, Scrapebox, KontentMachine, WAC and The Best Spinner.

Matthew, I’m sure this is a dumb question but I’m a noob, and want to cover my bases. Just to clarify, when you say “hand-spun”, are you referring to using The Best Spinner, or literally spinning all of my titles, descriptions, and sentences manually, by hand. I’m sitting here imagining doing everything by hand and am like, Damn, that’s going to be a lot of work! Thanks in advance for not laughing at me too hard.

By the way, great job on reorganizing your navigation bar. I like the categories; it’s easier to find things.

Question: I noticed when you scraped for URLs in Scrapebox you used a broad match. Do you think it’s better to scrape your keyword without quotes, e.g. “article dashboard” dog training, or with quotes, e.g. “article dashboard” “dog training”?
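For anyone unsure what the quoting difference means: wrapping a phrase in quotes asks Google for an exact-match phrase, while leaving it unquoted lets the individual words match anywhere on the page. A tiny sketch of the two query shapes (the function names are mine, purely for illustration):

```python
# Broad vs exact-match query strings for a footprint plus keyword.
def broad(footprint, keyword):
    return f'"{footprint}" {keyword}'    # only the footprint is exact-match

def exact(footprint, keyword):
    return f'"{footprint}" "{keyword}"'  # both phrases must match exactly

print(broad("article dashboard", "dog training"))
print(exact("article dashboard", "dog training"))
```

Broad match generally returns more (but less targeted) results; exact match returns fewer, more niche-relevant ones.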

So let’s say I take on an attorney. I then create a PBN for tier ones. I then have content created for the tier ones and post the new content linking to the money site. Then tier 2’s and 3’s.

So when I repeat the process again and again, adding content to my tier ones linking to the money site, does it look weird for all the tier ones to link to just one site? Do you add other content to the tier ones linking to sites that aren’t your client’s money site, so each tier one doesn’t just link to one domain?

Please fill me in on this. This is really the one thing in the process I am unclear on.

So… I didn’t know much about tiers 1, 2, 3 and so on. Thanks a lot man. I think most bloggers (especially me) are very afraid of doing all that work. I will invest my time in doing what I’ve learnt in the videos. You should have a donate button somewhere!