This time he has spent many days compiling data for sites affected by the Penguin update and has come up with some startling conclusions about what Penguin actually is!

The biggest surprise for most people will be the fact that Penguin has NOTHING to do with your backlinks, contrary to whatever rumors you might have heard! It only targets on-page factors! Watch the video for a complete explanation.

For those wondering how to achieve the suggested percentage amounts for varying your anchor text (as mentioned in the video) from inside the SEnuke X wizard, here’s how:

The SEnuke X wizard now lets you specify more than 3 URLs/keywords. Simply repeat a particular keyword multiple times to create the desired percentage. So if we were trying to rank senuke.com for “SEO software”, our wizard setup would look like so:

There are a total of 10 keywords in that list. The keyword we want to rank for (SEO software) is mentioned 3 times, so it will be used approximately 30% of the time. Similarly the keyword “click here” is used once, so it will be used 10% of the time.
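To make the percentage arithmetic concrete, here is a minimal sketch in Python (not part of SEnuke; apart from "SEO software" and "click here", the keywords below are hypothetical placeholders) showing how repeating an entry in the list translates into its share of the anchor text:

```python
from collections import Counter

# Hypothetical keyword list mirroring the wizard setup described above:
# "SEO software" (the anchor we want to rank for) entered 3 times out of
# 10 entries, "click here" once; the rest are placeholder variations.
keywords = [
    "SEO software", "SEO software", "SEO software",
    "click here", "senuke.com", "www.senuke.com",
    "SEO tools", "best SEO software", "here", "this site",
]

counts = Counter(keywords)
total = len(keywords)
for anchor, count in counts.most_common():
    print(f"{anchor}: {count}/{total} = {count / total:.0%}")
# "SEO software" comes out at 30%, "click here" (and each other entry) at 10%.
```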

Note: The latest update (2.6.10) handles repeated keywords much better than older versions, so make sure to update your version before trying this!

In the “Money Site URL List” that is auto-generated by the wizard, the anchor text will look like:

Bluewater. It's not like your site goes down just because of the Penguin update; there are various factors which affect your website ranking and PageRank, including link exchange strategies, etc.

Great post, but saying that Panda + Penguin have nothing to do with backlinks is outright incorrect, seeing as, to coincide with the release of Penguin, many affected webmasters received a corresponding Webmaster Tools warning specifically stating an unnatural backlink profile. You have not factored that into your presentation at all.

Great work with the onsite analysis and link devaluation summary, though.


The weird thing is that I had been blasting with SEnuke on tier 2s and linking back to my pages with tier 1 social networking sites. Why is it that my overall site disappeared, while this particular inner page still topped for a competitive keyword? The process used is still the same.

On a separate note, I am guilty of meta keywords stuffing (I am not sure whether this constitutes keyword stuffing) on the main site. But it is the same on the internal pages. Here's what I did for meta keywords for each of my internal pages:

I am also interested in this as well. I created my site a while ago and have 1000+ pages, and most money pages would definitely be considered stuffed in the meta keywords section. Could this alone bring the penalty?

You will find your inner pages ranking because you have lost links to your main homepage. This has happened to lots of my sites where I was doing deep internal linking. From what I see it is volume of links and consistency that still works. If you lose a lot of links in penguin then you need to get them back before you rank again. Also relevancy has taken a bigger role in this now. But as usual diversification with anchor text and sources should still do the job. But build links daily consistently – if you drop off so will your rank.

1) Don't use meta keywords – Google does not count them and may penalize you for them.

2) You said: "Why is it that my overall site disappeared, while this particular inner page still topped for a competitive keyword? The process used is still the same." If my hypothesis is correct, then it is because the subpage (a) did not have very many spam backlink pages pointing to it that are now discounted, and/or (b) is filling the vacuum left by pages that did and therefore fell from ranking.

Good luck to us fixing things with regards to Penguin. It seems this video complements the Penguinar webinar by Dan Thies. If you want to watch it, it's at this link as an additional resource: seobraintrust.com/penguinar/

Thanks, all this is quite interesting and makes sense. The problem seems to be: how to create quality backlinks if you do not want to end up with a blog network of 20 blogs and the nightmare of inventing content for all of them.

I know all the bullshit SEOs talk about it loads, etc., however if you can make guest blog posting scalable and turn it into a process, then I think this is the winning link building strategy for 2012, or at least for now.

I love Josh. But I love all men that confirm I am right. 😉 I refused to delete links even though I was asked to. Google is not going to get into bed with competitor link onslaught.

I have case studies over dozens of blogs with different link strategies. It's simple: the page is devalued (spammed), the links it has are devalued, and the less value you have going to your site, the more your site drops.

My question is – how do you deal with a spammy page? I have old pages on my sites which act as "hubs". For instance I have "information by state": you go to your state and there are links to each important topic. (I did this for pageviews – whoops.) But this could be seen as thin content and flagged.

Do I get rid of a three year old URL or try to fix the URL? Does a page ever come back from being spammed?

I came up with this same theory a couple of weeks ago, but added in my own unique method that is DESTROYING other pages right now. I am not spamming, as I am not selling anything, but I am going to be posting about it very soon at Social-Signal.com. I just want to share my method here pretty soon.

Best explanation of Penguin yet, and I desperately want to agree with you. But there are still many black hat techniques working post Panda and Penguin. We have tested everything on the Black-White Hat Spectrum for close to a decade, and since the Panda update in May of 2011 we have gone all out in our experiments. From Absolute Black Hat to Ultra Vegan White Hat. The Absolute Black Hat things still work very well: keyword stuffing, highly spun content that is unreadable, hiding text, cloaking, redirecting, and a lot more tactics that are supposedly being "flagged" by these algos.

So as much as I would love to agree with everything Josh is saying, I have empirical data that is contrary to what is being said here and by most people. And it makes me crazy, because I can't come to a conclusion due to the real conflicting data that exists.

Let's look at it like this. If everyone agrees that hiding text is a huge no-no and will easily be picked up by one of the algos, then how can we have over 100 sites built before, during and after the Panda/Penguin updates that do very well in the SERPs??? Would that mean hiding text is OK? Yes it would be. Sorry, no one can tell me otherwise when we have sustained rankings with text hidden in hidden divs for years and continue to make sites that place with it.

What about highly spun nonsensical content? We can all agree that using this as your primary way to build a site would be foolish. But what if I could show you a couple dozen sites that are page one for hundreds of thousands of keywords and only use unreadable spun content? Wouldn't we have to question our thoughts on highly spun text being used?

We can't TURN A BLIND EYE to these facts. Assuming you believe what I'm saying, then how do we explain this?

Josh, if you email me I can show you some of the things that we are doing so you can see for yourself. I would love to get a clear idea of how the algo is really working, but it seems to be very inconsistent. And sadly, I can show you that many of your points are simply not always happening. I consider myself an SEO scientist, and we can't ignore conflicting data that our long-time experiments have proven to us.

(Please note, I still think what Josh is saying makes sense, but I would love to know how to incorporate these facts into the discussion. I almost wish none of these black hat techniques worked, but they do. And white hat sites get flagged more frequently than our black and grey sites. Go figure!)

It's a Bayesian filter; no one thing on that list I posted will get you unless it is egregious, but a number of them together will. Cutts admits it's not perfect – meaning it is not finding all instances of the things I said. They will be working on it so that it does.

I have a blog that has a lot of very high quality content on it. There isn’t anything spammy going on with it because it targets mainly social media sites instead of Google.

There are pages on the site which have the main keyword once on the page, and each page is 1000 words long with a picture and no hidden text. Many of the pages aren't even targeting a specific keyword, so they can't be over-optimized.

Before April 25th it got about 300-500 visitors from Google a day, after April 25th it gets 120-250 visitors from Google a day.

The only thing I can think of is that it has links from blog networks. I haven’t gotten a message from Google, but right around the time of penguin it took a hit. I’m working on taking those links down to see what happens because I honestly can’t think of any on page thing it could be.

Why remove your links? Just as Josh said in the video, Google would have already devalued them, and I would certainly not be removing them if you did not receive a WMT warning. How will this help recover your rankings?

That was very informative; the only thing I don't understand is the non-visible text. Are you saying don't use keywords in the meta tags, i.e. the title tag, description tag and keyword tag?
Because not using keywords in the title and description tags seems a bit radical?

Using meta keywords is an outdated practice. There is no need to use meta keywords, and Google said some time ago that it no longer uses this tag in its algorithm. Less work! Plus, using meta keywords gives your competition an idea of what you're going after. Not that they can't figure it out, but… well, who wants to give a strategy road map to the enemy?

The meta title and description are still very important regardless of on-page visibility. They are displayed in the SERPs and can determine whether you get a click or not.

I do like Josh's recommendation to use the meta title as a call to action. "Keyword 1 | Keyword 2 | site.com" is boring to the webmaster and the user. An exciting call to action will appeal to a user and should achieve a higher CTR.

No, you can still use meta keywords, as some search engines (albeit not the main players) still use this to check page relevancy. I wouldn't bother using it any more; however, if you want to continue, I would advise you to only enter keywords that the page is relevant to. Don't go stuffing in keywords that your whole site is about. You won't (or shouldn't) get penalised for it if you use it sensibly.

Why I do not 100% agree with you is this: I have a site ranked based only on social bookmarking links. Certain pages had a fall of about 30 spots, others were unaffected, while the homepage fell out of the top 1000. The effect seems to come with higher monthly searches. The homepage had only 10 links.

I have another site in the same niche, which has the same bookmarking links as Site #1, plus Senuke links, AMR, directories, and some PR5+ editorial links. That site has fallen out completely, the highest rank it has is around #300, while before Penguin it ranked top 10 for a bunch of keywords.

If it is safe to say that the social bookmarking links to BOTH of my sites are coming from the same spammed social bookmarking sites, then a devaluation would cause a smaller drop for the second site than the first one, because the second site has the bookmarking links PLUS a bunch of others.

In my opinion it is not a devaluation, it is a penalty. It is not an integral part of the Google algorithm; it is a filter, like Panda.

You are probably right in that it is not the links, but the content on the linking pages. In my opinion what they did is go out and check the content of every linking page. They got good at recognizing duplicate content with Panda, and they do the same with your linking pages, as if they were an extension of your site. So Penguin recognizes that 100 links point to a site and have the same poorly spun, unreadable article around them. Penalty.

You said: "I have a site ranked based only on social bookmarking links. Certain pages had a fall of about 30 spots, others were unaffected, while the homepage fell out of the top 1000. The effect seems to come with higher monthly searches. The homepage had only 10 links."

This correlates with my hypothesis perfectly. Send me the site and I will confirm – Twitter or email.

Okay, I'm confused about something. Are you saying we shouldn't be putting anything into the meta keyword section of our SEO plugins, etc.? For instance, my Thesis theme has a meta keyword area in the SEO section. Should I stop putting meta keywords in it?

Great efforts Josh, but the above comments are right. It is all theory. When you finally figure out how to cheat Google, they come at it from a different angle, only for you to discover two general truths:
1. Make something useful and contribute to the web with quality pages on your site.
2. You can do SEO on other sites, but again, make it contributory and useful. That, I think, is the general and utmost message that Google is sending with these updates.
Beat this today, yeah, but the next time you wake up they'll use their billions of profit to catch you. Just play fair.

I prefer to syndicate some of my content to about 10 other sites to point to my quality ranking content. Since the content is the same for all 100 off-site pages, will that trigger a flag on those pages?

Completely wrong. I have pages with absolutely no content on them – a blank page with no backlinks to it whatsoever. These are competitive search terms too. All it has is the keyword in the title and the URL. So you're completely wrong.

Great video; however, I do not understand why everyone wastes their time talking about Penguin and Panda. It is not a big deal – you could have summed up this 30 min video by saying do not use duplicated content and have high quality content, it is that simple. SEO is not rocket science like everyone tries to make it seem; it is all about marketing. Have a good link building plan and that is all you need for SEO.

2) Just b/c everyone else is saying it (and not everyone is actually, Dan Thies and Leslie Rhodes are not – they think it is onpage too) does not make it true.

3) And I ask for your proof as well. There are anti-correlations. Show me 100 sites with "red widgets" in their title, url, meta kws, and majority of links that is penguin-ified and I will show you 100 more with the EXACT same profile that is not.

I have proof that discounts that theory. The reason I came to my conclusions is that I had a client site (that I cannot out here) that was hit by Penguin. I checked analytics and it:

a) had a number of pages that were penalized on keywords – meaning it is not sitewide. To do the same in analytics for your site, check Content > Landing Pages > add secondary dimension: Keywords, tailor for Apr 25 and compare to the past, and you will see the Penguin effect is a PER page, PER keyword effect. You will also notice that if you are penalized along a phrase like "XY", then "best XY" or "get XY here" may also be down on some pages and up on others! This means it CANNOT be a link percentage penalty, or you would be down on ALL instances of XY. It means that some links pointing to some pages are directly or indirectly bad.

I repeat, and you can check yourself: some pages went down on the same keywords and some went up!!! This means it is not a sitewide penalty from link percentages.
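If you want to run that per-page, per-keyword comparison outside the Analytics interface, here is a rough sketch of the idea in Python/pandas. It assumes you have exported the landing page + keyword report twice (once for a window before April 25, once for after); the file and column names are placeholders, not Google Analytics' exact export format:

```python
import pandas as pd

# Assumed exports: landing_page, keyword, visits columns in each CSV.
before = pd.read_csv("before_apr25.csv")
after = pd.read_csv("after_apr25.csv")

merged = before.merge(
    after, on=["landing_page", "keyword"],
    how="outer", suffixes=("_before", "_after"),
).fillna(0)
merged["delta"] = merged["visits_after"] - merged["visits_before"]

# If the effect really is per page / per keyword, the same keyword will show
# a large negative delta on some landing pages and be flat or up on others.
print(merged.sort_values("delta").head(20))
```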

b) So I looked deeper with that in mind. The site in question had a bunch of comment links with the name "Thomas Hobbes" as the anchor text (nofollow as well, btw – nofollows pass juice, or at least pass penalty, so watch out). Now, that is the name of a 17th-century English philosopher, and this site has nothing to do with any of that ;p So out of curiosity I did a search like this:

“Thomas Hobbes” domainname

and a subpage of the site that was heavily linked with that name came up in 8th position. This is interesting because of course old Tommy Hobbes does not appear ANYWHERE on the site, and the site has nothing to do with 17th-century philosophy or anything remotely related. Then I used the (now turned off) Penguin query hack like this:

“Thomas Hobbes” domainname -dsfsdfsdf.org

and it popped up to the 1st position!! This means that page was affected by Penguin on that particular keyword, "Thomas Hobbes". That means those links were directly or indirectly causing the issue.

But I realized it had to be indirect, because the Tom links were WAY low in the total link percentages – only like 5% or something like that for that page!!!

So this told me right away that I needed to look at the backlinking pages. There was something wrong with those links or their linking pages; they were being discounted. So I looked at the backlinking pages. Once I did, I noticed that a lot of them had the things I mention in the video (dup content, keyword stuffing, garbage pages, etc.). I then did my standard penalty test: ***These pages did not rank for their own title tag in the top 100. Any page that does not rank in the top 100 for a verbatim search of its own title tag is usually penalized.***

This is also exactly what Cutts said Penguin was looking for: spammy backlink pages. This also coheres with why it needs to run offline like Panda (whereas PageRank and linking penalties happen and lift every day). This also explains why they reported a "keyword classifier filter" adjustment for April – there were no linking algorithm changes reported for April.

All the direct and anecdotal data fits with my hypothesis. I can find anti-correlations for all the other Penguin theories I have seen. Penguin is not the only algo. It works on thresholds and is, by Google's own admission, not perfect.

I repeated this process for all penalized queries and noticed the pattern held – the backlinking pages that did not rank for their title tags were no longer passing significant link juice in the anchor string, and those were the exact phrases each page was penalized on. There was a 1-to-1 correlation, with no variation. (I had to write a program to scrape Google to do this, of course.)

I said it was only a hypothesis, but one that correlates perfectly with the evidence, with no anti-correlations. No other Penguin hypothesis that I have run across does this.

Hey dude you can call bullshit on me but then I ask for YOUR proof / reasoning process as to why. I don’t see any here from you.


Josh, I have to ask you then: what would your approach be now to entering our header meta information? I have read to still include keywords in the title, optional for the description… and you are the first to talk about meta keywords.

Could you summarise here, just quickly, what exactly you think we should be doing in our All In One SEO fields (meta fields) on our WP blogs??!!??

What I'm confused about is why you say Penguin isn't about links when you then mention links multiple times!?

At the end, with the query match percentage counts, and most notably at 8:00, you discuss how linked pages that fall will also cause your pages to fall in the SERPs – while right next to that (on your board) you have "Penguin & Panda have NOTHING to do w/ links".

Dafuq?

I really appreciate that someone has really taken the time to figure out how penguin has hit, but please explain why links don’t matter, because clearly, they do.

It is by definition a losing battle. Yes, you may temporarily develop a method whereby you can take advantage of the system, but eventually they are going to change everything up again. The only good thing here is this: people are leaving Google because they are sick of them. The other day I searched for a product and got a wiki instead – really, I don't want a freezing wiki, I want to buy something. Anyway, xoutg.com

Josh, as much as your theory sounds interesting, I can't find support for it in the data I've looked at. I link build and track 200+ sites, and a portion of them (probably 30%) have been hit by Penguin, which has given me a lot of data to work on.

Here is what I've found:

1) Sites with a similar on-site approach have been treated differently by Penguin.

Many of the sites are similar in content and were built in bulk (similar industry, similar page layout, similar title tags, similar on-site optimisation approach). Generally I try not to overuse keywords in copy, I avoid using META keywords tags, and I avoid stuffing keywords in stupid places like image ALT text.

Despite looking hard I cannot find any correlation between onsite and Penguin.

2) Many sites were receiving links PRIMARILY from link-networks.

Let’s assume that Penguin is an on-site related penalty, but has not hit my sites directly, but rather has hit the pages that my back-links are on.

What I should see then is rankings sliding (various degrees) depending on how many backlinking pages have been hit by penalties.

Instead what I’m seeing is either drastic drops or no drops (even though the SOURCE of the links remains the same).

3) This is a page-based penalty.

I've been building links to the home page quite heavily on most of my sites. On affected sites I'm seeing the home page get penalised (and drop to #100+), while often a sub-page that hasn't been linked to now outranks the home page (as it most likely hasn't been hit with a penalty).

Now it could be that the sub-page is ranking because it’s not “over-optimised”. Going back to 1) however, home-pages that are optimised to the same level have been treated by Penguin differently.

4) The only difference I've been able to find between several affected and unaffected sites has up to now been the anchor text patterns.

Your theory does sound interesting, but I'm seeing a whole lot of contrary evidence. I'd be willing to share some examples privately if you wish.

Nice video and good food for thought. The one thing you lacked here that would have made your case convincing is empirical evidence. You should have given specific instances of websites that you analyzed that led you to this conclusion. You could have used generic urls for illustrating your point instead of the actual ones that you analyzed.

So, all in all – a pretty good video – Just lacked the convincing real world examples in order to convince me. I am still of the belief that Penguin has everything to do with backlinks and more specifically – anchor text.

Interesting view on the various updates. Looking through the rest of the comments here, it seems that no one really has their mind made up. Let me tell you how we do it and have not suffered in any of the updates. We provide well-written and informed comment about the concerns our customers have (what problems they need solving), and then we litter this with a range of keywords which automatically get used because of the focused content.

We then take those posts and content and ask/beg/allow links to be built up from those that are interested in the article.

I have lots of blogs, most with authority, that lost SERPs. Whenever I try to do a backlink campaign that doesn't consist of real quality backlinks – for example links from related sites, one-way links, sites with low OBL, gov and EDU domains, high PR, etc. – my site will lose positions.

I'm pretty sure these new algo changes focus on backlinking, which is the main factor today in getting positions. If you don't believe this, just search Google for 1 hour and you will see lots of sites in the top 3 that don't follow any of your guidelines; they just have a bunch of quality backlinks and that's all.

If you are going to go out and say you have reverse-engineered Google, at least have some evidence to support it instead of putting a heap of big words together and trying to sound like you know what's happening.

I pretty much agree with the content in the video, apart from one small part where it states that the use of keywords in the URL, title and headings may jeopardise your ranking. This, in my experience, is slightly off-base: from the recent 15-page expansion of one of the sites that I manage, it has had no bearing on the rankings. Those new pages jumped right onto the 1st page of Google results using the desired keyword (phrases) in the URL (somewhere), title tag & H1 tags… some of those pages actually went into the #1 slot.

In addition to this, those new pages also had ZERO external backlinks, relying completely on internal structure and good on-page SEO practices… including images, unique content (approx 30-50%, this may be higher), social interaction and, most importantly, a human-friendly interface.

Penguin is all about the links… If Penguin has nothing to do with links, then why have you released an update for SEnuke that handles multiple keywords better, and an explanation of how to get the right percentages??? Unnatural linking has also been known about for a while, so if your theory is true, why release this new update only now???

Penguin is not the only algo – the unnatural linking message seems to relate to unnatural exact-match percentages. It is prudent to look natural in all you do, so that in the future, if/when they check for that, you are already covered.

This cannot be true sir, Google is too smart for that; they know old sites from the early 2000s would have used meta tags to cater not only for Google, but for other engines as well. Are you saying that all old sites are now hit for spamming? If so sir… why after Penguin did we see the rise of so many 'old and dated sites'? Sir, I have sites that used meta keywords extensively, and in the description and title, that were unaffected by these updates. They also had clean link profiles and an even spread of anchor text. I'm afraid sir I must stand by my original thought… that you are a horse bandit and a buffoon.

I should also add sir that I have sites that used no meta keywords, and neither did they have a keyword density of above 1% but ranked on the merits of 95% exact match anchor text. The day before penguin they ranked #1, the day after gone. I must upgrade you sir to a complete idiot. You really are talking so much shit and I have the evidence.

2) Just b/c everyone else is saying it (and not everyone is actually, Dan Thies and Leslie Rhodes are not – they think it is onpage too) does not make it true.

3) And I ask for your proof as well. There are anti-correlations. Show me 100 sites with “red widgets” in their title, url, meta kws, and majority of links that is penguin-ified and I will show you 100 more with the EXACT same profile that is not.

I have proof that discounts that theory. The reason I came to my conclusions is that I had a client site (that I cannot out here) that was hit by penguin. I checked analytics and it

a) had a number of pages that were penalized on keywords – meaning it is not sitewide. To do the same in analytics for your site, check Content > Landing Pages > add secondary dimension: Keywords, tailor for Apr 25 and compare to the past, and you will see the Penguin effect is a PER page, PER keyword effect. You will also notice that if you are penalized along a phrase like "XY", then "best XY" or "get XY here" may also be down on some pages and up on others! This means it CANNOT be a link percentage penalty, or you would be down on ALL instances of XY. It means that some links pointing to some pages are directly or indirectly bad.

I repeat, and you can check yourself, some pages went down on the same keywords and some went up!!! This means it is not a sitewide penalty from link percentages.

b) So I looked deeper with that in mind. The site in question had a bunch of comment links with the name "Thomas Hobbes" as the anchor text (nofollow as well, btw – nofollows pass juice, or at least pass penalty, so watch out). Now, that is the name of a 17th-century English philosopher, and this site has nothing to do with any of that ;p So out of curiosity I did a search like this:

“Thomas Hobbes” domainname

and a subpage of the site that was heavily linked with that name came up in 8th position. This is interesting because of course old Tommy Hobbes does not appear ANYWHERE on the site, and the site has nothing to do with 17th-century philosophy or anything remotely related. Then I used the (now turned off) Penguin query hack like this:

“Thomas Hobbes” domainname -dsfsdfsdf.org

and it popped up to the 1st position!! This means that page was affected by Penguin on that particular keyword, "Thomas Hobbes". That means those links were directly or indirectly causing the issue.

But I realized it had to be indirect, because the Tom links were WAY low in the total link percentages – only like 5% or something like that for that page!!!

Think about what this means. So this told me right away that I needed to look at the backlinking pages. There was something wrong with those links or their linking pages; they were being discounted. So I looked at the backlinking pages. Once I did, I noticed that a lot of them had the things I mention in the video (dup content, keyword stuffing, garbage pages, etc.). I then did my standard penalty test: ***These pages did not rank for their own title tag in the top 100. Any page that does not rank in the top 100 for a verbatim search of its own title tag is usually penalized.***

This is also exactly what Cutts said Penguin was looking for: spammy backlink pages. This also coheres with why it needs to run offline like Panda (whereas PageRank and linking penalties happen and lift every day). This also explains why they reported a "keyword classifier filter" adjustment for April – there were no linking algorithm changes reported for April.

All the direct and anecdotal data fits with my hypothesis.

I repeated this process for all penalized queries and noticed the pattern held – the backlinking pages that did not rank for their title tags were no longer passing significant link juice in the anchor string and those were the exact phrases each page was penalized on – there was a 1 to 1 correlation, with no variation. (I had to write a program to scrape google to do this of course).
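For anyone who wants to reproduce that test, here is a rough sketch of the idea in Python. It is not the original program: `requests` is assumed to be installed, the title extraction is deliberately naive, and `top_100_results` is a placeholder for whatever scraper or search API you actually use to pull the top 100 results for a query:

```python
import re
import requests  # third-party; pip install requests

def page_title(url):
    """Fetch a backlinking page and pull out its <title> text (very naive)."""
    try:
        html = requests.get(url, timeout=10).text
    except requests.RequestException:
        return None
    match = re.search(r"<title[^>]*>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    return match.group(1).strip() if match else None

def top_100_results(query):
    """Placeholder: return the top-100 result URLs for a verbatim query.
    Plug in your own scraper or search API here."""
    raise NotImplementedError

def looks_penalized(url):
    """Rule of thumb from the comment above: a page that does not rank in the
    top 100 for a verbatim search of its own title tag is usually penalized."""
    title = page_title(url)
    if not title:
        return None  # could not fetch or no <title>; inconclusive
    return url not in top_100_results('"%s"' % title)

# Usage idea: run looks_penalized() over every backlinking page behind a
# penalized anchor phrase and see whether the flagged pages line up with
# the exact keywords that dropped.
```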

I said it was only a hypothesis, but one that correlates perfectly with the evidence with no anti-correlations. No other penguin hypothesis that I have run across does this.

I respectfully claim that you have no idea what you are talking about, in terms of science, scientific method, or seo.

This does make sense, and obviously onpage is a big factor at play here. I’m at a loss for what to do about things like alt tags now.

What do you do if you have an ecommerce store, and want to put the product name in the alt tag for screen readers? This is a valid use case for disabled people with poor sight who use your website, but will Google now see this as spam?

When it comes to backlinks, I can’t help thinking that it’s not just devaluation of links. At the SEOMoz blog they’re saying that WPMU.org recovered by having links removed. If it were simply devaluation, removing them should have had no effect right?

Sorry, just thought of something else. I often organise images on my computer by giving them a descriptive name – often the name of the product in the image. I suppose image file names could look manipulative to Google too?

yes, I read that article too – I thought oh shit my hypothesis could be wrong. But ross hudgens admits in that article that 1) it could be an algorithmic change to allow wpmu specifically to rank – google has fixed things before (viagra, make money online, for example) if it is high profile, 2) that removing the links shouldn’t make a difference UNLESS there is some kind of “too many spam flagged pages linking to you” penalty – which I DO speculate about in my video, but I cannot prove yet – this could be some evidence for it, 3) they ALSO changed onpage stuff to their own site as well, so it could be that too, alone, or in combination with removing some links – some of which sounded like paid or sponsored links

speaking of which, there are other algos that look for paid and sponsored links and they may hit at the same time as they are rolling algos, but that is statistically unlikely

Also, all they show is a general traffic report – my question is what keywords dropped on what pages, and then what keywords are now ranking again on what pages? That will tell you better what's going on.

all in all there is too much going on with that site and it is too high profile to tell for sure

Thanks for sharing Josh. So many people in the SEO community do no research or testing. They simply read opinions and give opinions….many examples can be seen in the comments on here lol. You should stop wasting your time responding and asking people to provide evidence. They don’t have any and are just on here to flame and cry that their sites got penalized.

On a beneficial note – I have noticed a random element to drops in SEO performance. I can’t get similar sites with the same link profiles and on-site elements to react the same to algo changes. It’s possible Google is randomizing things on purpose so it is harder to pinpoint exactly what is causing the issues. What do you think?

Also – The drop in rankings seems to primarily be on a URL level. So new pages with better linking strategies can still rank. I have experienced whole sites going down, but those were especially full of shit.

What people need to understand is that on-site changes dramatically affect off-site. Every page you get a link from is being run through the on-site filter. The value that link passes depends on the on-site rating.

Josh, this was a decent video, but I have to disagree that this has nothing to do with backlinks, because even in your video you stated that if pages that we link to are poorly created in the eyes of Giggles, it will affect our money site.
Therefore, your argument is contradictory.

Doesn’t it have more to do with where the links come from and whether,or not, it matches the content of your site?

IE: A basket weaving site that has backlinks from a medical forum (which has nothing in common with basket weaving) vs backlinks from a knitting forum? The common element being that they are both hobbies.

Links that have zero in common with the site are being targeted from what I have read.

Josh, I have data on 20+ sites that goes against what you're saying. It's clear from my own data and that of thousands of other webmasters that Penguin is at least out to get link building (probably too much exact match). I cannot recommend that anyone take this man seriously (although what he says about anchor text diversity should be taken seriously).

You sir are indeed a horse bandit and should open a mobile burger bar!! (people love to talk crap whilst they wait for their burgers)

Hi Josh, I have emailed you regarding our site, because if what you say is true and you can find these issues on our site and fix the Penguin crush, I'll be happy for you to use our site to prove your theory.

Right now, though, it seems you have no proof, and while most of what you say is true as general SEO advice, it just does not make sense based on what has happened to us and many other sites; it's links that have caused the damage.

Bob, you're so funny – you made me laugh at least half a dozen times today in just this one blog post.

I suspect you have a full-time job somewhere on a comedy script – which might pay way better than SEO. Anyway, if you decide to open the mobile burger joint, line me up for a burger, Joshua – hopefully something a bit kosher with more beef-like content.

Joshua great post and work. Don’t let the naysaying negative commenters get you down. They are just frustrated that they can’t figure out anything for themselves and no one will give them a button to push to fix it all!

I have data that leads me to believe you are on to the answer or at least a major part of it. The filter or penalty is definitely looking like it is page specific and also keyword per page specific from what I have found. I have one site that has not recovered from the original 3 I was tracking that got hit. I can’t figure out why this one has gotten no relief and the other 2 are recovering very well.

Stop using web 2.0 and forum profiles, especially when you are spreading a campaign's submissions over a few days. Google dislikes such pages containing dozens of anchor texts with little or no content.

I have been using SEnuke X for a year; it worked for me and my site is really doing well.
Since SEnuke did well with most of my sites, I launched a big portal. Naturally the domain was new, and as usual I started building backlinks using SEnuke. I did it carefully because it is an important project for me. I used several techniques in SEnuke to make my blogs and content appear more natural, but it took time.

I got good PR for a new domain, but at a later stage my site became a victim of Penguin. I was afraid at that time and I stopped using SEnuke.

After some research – and yes, I even scraped something like you did – I concluded that it was not because of SEnuke, and even today my portal is working well, and well, and well.

But I also recommend users participate in social media marketing, which is important nowadays alongside SEnuke. Getting backlinks from other domains or related sites is also important.

I was just waiting for this update, and I am using it again for myself and for my clients. I know Google's game very well.

Now I am comfortable and can F**K Google. Google forgot how it became popular, and I will make them remember those days.

Thank you for this update, which is necessary for most of the SEnuke experts.

Thanks for the video. While I’m not sure you are 100% correct, I sincerely appreciate an alternate theory on Penguin. A lot of the other theories out there seem to be developed in an echo chamber. Frankly, I’m not buying some of the stuff I’ve heard thus far.

While there isn’t a lot of solid “proof,” merely observations we’ve all had with our various sites, the prevailing theories on backlink penalties aren’t explaining the movement (up and down) I’ve seen since Penguin. Some combination of your ideas into the mix may go a long way to further clarifying what is going on.

I think you're like everyone else… partially correct. I do appreciate your perspective and will be trying to fix any on-page SEO issues I might have. However, I had 2 sites drop due to either Penguin or Panda that had been fine through other Panda updates. I have no doubt that spammy backlinks are getting sites penalized. Both of my sites use GWT and neither of them received any notice of unnatural link patterns. Both got whacked, but my money site got nailed hard.

My money site won’t rank above the hundreds, even when people search specifically for my site (which has a unique domain name), it doesn’t rank. It was very obviously penalized. I’ll try the on page SEO and I’ll respond if it works but I highly highly doubt it will do anything. Can’t hurt though, right?

No, it can't hurt. And if your domain name is not ranking for its own name – i.e., the search http://www.yourdomain.com is not ranking for itself – then you have been banned and have a bigger problem than Panda/Penguin, which (to my knowledge) don't do that. That is likely a manual penalty.

That is an interesting thought and entirely possible. But I have seen high authority pages get hit as well as low authority ones, on highly relevant, high authority terms. In each case they had many spam pages linking to them with that exact-match keyword. High profile, high authority does not seem to matter; they get hit too.

So what does this mean for e-commerce type sites, where much of the canvas is the same (the cart), with only the product information being different? Same question for exact same product, but in, for example, different sizes? Thanks for a helpful video!

Sir I wish it was that easy but I have evidence to confirm that you are blowing hot gas from your sweet behind. I must stress sir that links have been penalized and it is now very difficult to rank in Google. I fear you are living in a fantasy world and hoping to bring others with you so that you can dream with them that you can still make money from ranking in Google.

Sir Matthew Cutts appears a lovely man, but we all know he is a bastard at heart and has ruined the lives of many hardworking men and women.

Recovering a site from the recent updates following the rules of on page de-optimization was unsuccessful. A 301 redirect on 3 sites also confirmed the link juice was alive and well, however after just a few short days the new site was penalized from the incoming link juice being provided with almost 100% exact match anchor and from blog networks and known link building article directories.

I am afraid sir that I must tell you that you are talking nonsense in an already confused and beaten-down community. I would, however, like to thank you for your input, invalid as it is.

You must stop giving people false hope sir in the hope they will buy your software and continue fighting a lost battle. You must stop!

I would like to add a comment to this: I have been using GSA Search Engine Ranker, and I've seen a lot of hype and 'I've cracked the code' rah-rah regarding the Google update, and quite frankly, it's always been the same for me. I've used 4 to 5 different anchors and never had a problem… good on-page SEO and great content… and the fact of the matter is it still works. It's good to see SEnuke doing their thing 🙂

Not sure if this is enough to make the conclusion, or if I would ever agree with it. But any SEO worth a cent takes the time to compare and test and comes up with his own formula, since the actual formula will never get leaked. Right or wrong, he took the time, formed an opinion, and will apply and use it. When he sees he is wrong (or right) he will adapt and adjust, but actually learn and actually create new data for the community at large to consider. Hats off for actually trying to make a map while 99% of those with our jobs just read blogs or take 2-day seminars of old information. There is no wrong information in the scientific approach, just "wrong" so far or yet to be proven wrong… so even if it is wrong, I'm glad someone opens the conversation that invites people to prove it wrong and helps us all learn.

And he stands by it and defends it, so…I hope it’s on page, easy to control and test, doubt it, but I hope so.

And if nothing else, look at all the links this post has likely gotten him.

And I'm sure he doesn't mind that we all link with the exact same keyword, since it won't matter? 🙂

Google are some smart motherf***as. I assume your hypothesis is correct to a degree. And I also assume Google throws in a good amount of randomization to throw all of us deep thinkers off. I say to the SEO community, **** em, do best practices, and you will succeed in the long run. If you’re in it for the short run….switch to something that doesn’t depend on Google.

I have one question for you, and it might be something that you missed in your analysis.

Have you noticed that a VAST number of affiliate sites have been thrown out of the rankings with this update? Searches for major terms show that only a very small number of the affiliate sites that were once there remain.

What are your thoughts on this? I remember a few years ago when Google attacked CB links and sites promoting CB products – almost none were appearing on the 1st page of Google, especially if they had raw CB links.

I do not know whether Josh is right or wrong, as I simply do not have enough experience to say either way. I will simply take his research, think about whether it makes sense in light of Cutts's statements, and simply say thank you to Josh.

A wise man once told me: "If what you're doing isn't working, TRY something else! If that ain't working, TRY something else!" We all have numerous sites and can create them for next to nothing… the moral of this post: test, test, test. Find out what works for YOU!

Great video. My observation is that the Google Penguin update looked PRIMARILY at low-quality PAGES that violated its published quality guidelines. LINKS are a SECONDARY result of penalized PAGES.

Any low-quality page linking to a website got penalized, either by being de-indexed from Google or by plummeting way down in PageRank. The result of these penalties affected LINK juice to the parent website.

In other words, less link juice pours into the parent website and it too suffers as a result of it.

I have websites that are in total compliance with Google's quality guidelines. However, I have many low-quality websites linking back to them which were severely hit by Penguin. As a result of those websites being in violation, I paid the price as well.

So people who say that Penguin is about links are not getting it. Of course Penguin is related to links, but not PRIMARILY and DIRECTLY.

Penguin IS primarily about low-quality pages (CONTENT) and removing them from the Google index. LINKS are an indirect and secondary result of the removal of those low-quality pages.

If your website LINKS TO low-quality web pages that violate Google’s guidelines, you will be impacted.

We all must now be just as picky and anal as Google is when it comes to our own website property. Seek quality tenants (content), evict the bad tenants, and do not associate our web pages with undesirables – Google Guideline Breakers.

Fab video, answered a lot of questions. I'm not too sure about the duplication of design – I personally haven't seen any evidence of how the design has influenced a site. I'm pretty certain at the end of the day it comes down to duplicated and low-value content, from the results I've seen. Really enjoyed your video, SEnuke. You go guys x

I have one site that completely tanked after Penguin. This is a ten year old domain that ranked highly on page 1 for a variety of search phrases in its niche. It’s not a money making site, but a hobby site of mine that has hundreds of pages of good quality.

After spending time correcting onsite issues, I am slowly seeing it come back. I do believe that some sites that have been hit are due to their backlink profile, but onsite issues are definitely a factor with Penguin.

Obviously Josh you have no clue whatsoever. I can tell now that you run 0 sites of your own, because if you did you would not make misleading videos like this one. This update is 100% about keyword anchor text. PERIOD.

Both Josh & Matthew Cutts appear to be kind, knowledgeable men with the best interests of Google users at heart. But we must remember sir that they are both out for their own evil gains. Matthew Cutts wishes to be hailed as the saviour of the internet, and Josh wishes you to buy SEnuke in the false hope you can get your rankings back. They are both evil bastards and are flat out lying to our entire community. Someone PLEASE stop these bastards.

Thank you sir, but I must say that it is YOU who are the idiot. This was a very public case and one that required Google manual intervention to correct and save face. Very much like the ‘Viagra’ & ‘Make Money Online’ fiasco when Penguin launched. You must remember sir that when Matthew Cutts returns home to his wife at night he does not want to have to say ‘honey, I broke the internet!’ You sir Tim Polt are indeed an idiot.

My site has been hit hard since April 24. I didn't think I did any B-A-D stuff, but after I watched Joshua's video there are a ton of things I can correct – a ton. Especially on the sites that provide links for my money site: most of the errors and bad stuff is there.
I had absolutely no keyword linking, just plain links with "continue reading", etc., but there was a ton of duplicate content, meta and hidden keywords, etc.

I will provide another post as soon as I am done pulling the crap from those sites and tightening up my main (money) site… I am sure that Joshua is right on.

Josh I really appreciate your work. You are getting a lot of hate but some of us recognize real testing. Doesn’t mean I agree or disagree just saying I appreciate it. I also don’t believe it has to do with links.

The funny thing is these so-called SEO experts, or as I like to call them, trolls, think it has to do with the ratio of anchor text. The reason I find that so funny is that this information has been out for 8 months and they are only just now recognizing the importance of anchor text variation.

This has nothing to do with that; I believe that was always an issue. To think that a room full of PhDs sat around and the only solution to web spam they could come up with was a certain percentage of anchor text is a joke.

Hey BX, I have to agree the PhD theory of finding SEO'd sites based on anchor text percentage is kind of funny – but then again not funny, since it's a clear winner: I could determine whether nearly anyone's site in this forum is SEO'd or not simply by looking at the ratio of keywords to domain name in the backlink anchor text.

I used to think that a properly SEO'd site could have primary anchor text variations of up to 30% and still 'look good' to Google.

But that was before I had a chance to analyze the natural backlinks of some rather large sites that couldn't care less about SEO.
And sometimes they hire an SEO team, but the team is clueless; the site advances for other reasons, so management keeps the SEO team on, lol.

In at least one glaring case, the public had ensured that the anchor text used in the external links was, in 99% of cases, variations on and misspellings of the domain name. The site ranked for over 6300 keywords because the backlinks provided the domain trust and the internal links provided the proper product-related anchor text. Since the last update, the number of keywords in the top 20 (SEMrush) has dropped from 6300 to 6100+, but the value of those keywords has gone up from $600,000 to just under a million.

My personal belief is that the Google algorithm is so complicated that, with the exception of obviously spammy sites, even the Google engineers don't know anymore why one site ranks and another doesn't. I think it's like trying to decipher the interior of a girl's mind. Yeah, guys, we know our own heads are often empty, but a girl's head is something else.

What works on Monday fails on Tuesday because you tried too hard Monday night and popped a warning, but on Wednesday, having gained some additional credibility, none of Monday/Tuesday's warnings will trigger at all because you're now above that petty stuff, right?

Well, on Thursday, thinking that you were above that pettiness, you hit your girl directly with 50k worth of sweet Xrumer arguments, and oh boy did she ever like that, huh? Not!

No disrespect meant to the women in here, just saying that too often we think these algos are linear and easily deciphered. Too many posters in these comments appear to have examples where Josh's theories seriously didn't work, but few are willing to let the rest of us really have a *look* at them.

I must say sir that you have displayed yourself as a complete fool. To say that the Google engineers have no idea why one site ranks and another does not is clearly misguided. Also, that you compare the Google algorithm to the mind of a girl shows that you are of an inferior IQ.

You sir are a complete fool & buffoon. No one would wish to provide their URLs for you to 'have a look at'. The very fact that you mention Xrumer-blasting a site with 50K links clearly shows you are an SEO amateur trying to look knowledgeable in our currently downbeat community. You sir are indeed a total fool and should not engage in further SEO tactics. Your 'theory' is as random as the shit which spills from your mouth on a daily basis.

Thanks for the video; pity about the lack of data – I mean exact numeric data to validate each theory.
Difficult without a proper case and a showcased potential recovery.
Now, your comments at 10:09 open the door to the reality of negative SEO.
What are your thoughts about negative SEO?

Thank you sir for providing this video. I found it very scary but I am also now thankful that I can harm all my competitors. I would also like to try and harm the website of Sir Matthew Cutts, but I fear it could be difficult.

First, thank you very much for your video and the lot of work behind it. I like your video and I am happy that you share it with us for free. Continue your work, and let the other guys who do not respect your work think what they like.

All of my sites tanked but one. That was a Spanish site that only had a few article links – just a handful. Now, yesterday, even that site tanked. After the 1st Penguin update it doubled in traffic, but now it's dead. This cannot be just about links.

Google is doing wholesale destruction of anything that is organic affiliate.

You must remember sir that Google does not wish for, nor require, websites that are only there to make money. They have billions of pages within their index to choose from; they will happily discard any they wish. You sir must think like a professional business person if you wish to rank in Google. You must also ensure your website appears as a proper business, which unfortunately will discount most of the people who read this page. It is time to think smart sir, protect yourself, and become socially popular and acceptable rather than trying to be a SERPs scavenger. Become a business professional sir, instead of a 'bang me a buck affiliate'. You sir must also, on the surface, appear to respect and abide by the rules that Matthew Cutts has laid down. If you do this sir, I believe it is possible to rank in Google. You sir must also have many ranking tricks up your sleeve and implement them on a daily basis. You sir must also have a way to 'reverse out' of your strategy should it be required in the future. We all know that Matthew is a devious bastard and will turn on us at any moment he chooses.

Josh – great analysis, and I'm making changes on my sites now. Your theory clearly explains both the onpage issues and how they impact links in a cascading manner. The problems with the SEO blog sites were always so transparent it was amazing it worked while it did. Google has always had numerous inconsistencies in its algos, and those inconsistencies could be exploited, but you always had to know there would be an end to that. The unfortunate responses to your message were also highly predictable; the majority of the SEO industry lack publishing skills and are forced to focus on the off-site factors that helped create the need for Panda and Penguin.

I do think you downplayed the SPAM flag factor, as many of my sites fell from 1 to 500+ in a single day. I also think there is a lot of leeway on the duplicate content – it can go more towards 90% when you have recognized variables, such as geos – and that the ratios on anchor text of backlinks can be wildly different, even 100% keyphrases, with the exception of exact-match anchor text and domain names, although I've seen no penalty in that – it just offers no benefit. Overall, this is sage advice that I hope my competitors completely ignore.

I do not usually post on these forums, only read. I would just like to say Joshua that I appreciate the effort and the insight. These debates are important and thought provoking. The people on here who are just bashing you are the real problem, as they contribute nothing.

Here are a couple of my thoughts:

1. People get up in arms over these new algos every time they come out. They really never do much more than improve what Google is already trying to do. There is no big secret, you just have to get better with your linking, keep up link velocity, and quality quality quality.

2. Don't link junk to your money site, it's that simple. If you do, you are taking a risk. It's much better to link to your blocker sites and make sure they have quality content. If Google discounts the junk links, the quality of your "blocker" sites should not be affected, and link juice will still trickle down if these updates do not catch them.

3. I think negative SEO is the biggest myth on the internet. I know many many people claim to have done it, but did they ever track the site they hit long term? I have never found a concrete example with someone who was willing to name names….even privately. When I first began SEO, and had no clue what I was doing, I actually “negative SEOed” my own site. However, the drop lasted only for a week or so before I saw them return (with a slight increase). It makes much more sense to have a positive system where you get a cookie for a good link, and nothing for a bad one. Google is smart enough to know this too.

4. I don’t know how accurate your numbers and formulas are, or if they are even intended to be exact. I also think Google allows more keyword “stuffing” for certain terms because it knows the particular phrase is more common and appropriate for that industry and website. I think people get too caught up in 4% density and 30% exact-match anchor text. These are complicated algorithms that are intended to catch things that are unnatural (see the rough calculator sketched below). So my advice to everyone is “Quick! Act natural!” Lol
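
Here is a rough Python sketch of that kind of sanity check: on-page keyword density plus the anchor-text breakdown of a backlink profile. The 4% and 30% figures are the thread’s rules of thumb, not confirmed limits, and the sample page text and anchors below are hypothetical.

```python
# Rough sanity-check for the "4% density / 30% exact match" rules of thumb
# discussed above. These thresholds are the thread's estimates, not
# confirmed Google values; sample text and anchors are hypothetical.
from collections import Counter

def keyword_density(page_text, phrase):
    """Share of the page's words taken up by occurrences of `phrase`."""
    words = page_text.lower().split()
    target = phrase.lower().split()
    hits = sum(words[i:i + len(target)] == target
               for i in range(len(words) - len(target) + 1))
    return hits * len(target) / len(words) if words else 0.0

def anchor_distribution(anchors):
    """Percentage breakdown of anchor texts in a backlink profile."""
    counts = Counter(a.lower() for a in anchors)
    total = sum(counts.values())
    return {text: count / total for text, count in counts.items()}

sample_page = ("we review the best running shoes for road and trail and share "
               "tips on fit sizing and care so your shoes last longer")
sample_anchors = (["best running shoes"] * 3
                  + ["click here", "example.com", "running gear", "shoes",
                     "this site", "homepage", "read more"])

print(f"density: {keyword_density(sample_page, 'best running shoes'):.1%}")
for text, share in anchor_distribution(sample_anchors).items():
    print(f"{share:5.0%}  {text}")
```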

Thanks for the comment. Of course the numbers are rough estimates from experience and experimentation. Unfortunately, I also know that some negative SEO works long term; I have a client right now who is being attacked and there is definitely an effect, although the healthier a site is, the harder it is to do.

Josh – I understand you advise against including meta-keywords, as they don’t count for much in Google’s eyes. However, what about including them for the benefit of Bing and Yahoo? Do you know if meta-keywords are still relevant for Bing and Yahoo rankings? Thanks.

The discussion on this subject is good, but I wonder if we are focused too much on one SE and not caring about the other 40% of search traffic. If that is true, then how should we try to improve our sites in those SEs?

Here is my experience with contextual backlinks in a blog network. The sites that I had fewer links to are ranking #2-#7 in Google, whereas the ones I was “over optimizing” with off-page SEO took a huge hit… The sites that stayed ranked for good keywords are aged domains with not much done to them, since I was focused on the other sites…

Matthew, I feel exactly the same! After researching numerous sites that rank very well after every animal update, the only pertinent conclusion is that ANYTHING works – even the old link exchange on /links.html URLs! The problem with that is that it makes any SEO campaign look more and more like a gambling game! I think all we can do now is throw it at the wall and see what sticks…

Maaaan! I feel bad for you Josh. I’ve taken a few minutes to read most of the comments and you are taking a beating!

In the video you reiterated multiple times that this is your “HYPOTHESIS” and you clearly are supporting it with actual data. It is still in the testing phase. So, there will be more findings to come. People need to relax.

I’m curious to see what you discover from more extensive testing. I look forward to these videos and have learned a lot from them.

Keep your head up and your chin tucked…there are a lot of people throwing punches at you in the comment thread.

Excellent video Josh – and also something that I can do for myself without too much trouble – I just need to rework my pages and see if it makes any difference, that makes life a lot easier ! Thanks for taking the time to explain.
Can you also explain why I get much better results using DuckDuckGo rather than Google? I’ve got a finance blog that used to do quite well, but now I see that on Google I’ve got keywords that are ranked 380, whereas on DuckDuckGo they are ranked no. 3 on page 1!
And this applies to multiple keywords – on DuckDuckGo they are at the top of page 1 – on Google they are at the bottom of page 1 (if I’m lucky) or out on page 25. If only everyone would use DuckDuckGo I would be rich!

Sorry, I also say “bullshit” Joshua. If you want to read my analysis please see http://seo-mentoring.ca/panda-penguin-linking-anchor-text-a-13.html
You are looking at the wrong factors.
Correlation is not always causation.
In fact, from what I can see in discussing anchor text and links in general, it is not.
The per page / per keyword effect is correct, but not for the reasons you propose.
First you have to factor in the effect that links USED TO HAVE, which was a high weight in deciding SERPs.
Then you have to factor in the changes made to linking.

Along with this you have to factor in the change to the basic PageRank calculation, which removed the PR of the linking page as the primary factor and substituted relevance between the linking and linked pages.
This was a big change and was done in the Mayday update.

If you consider that Google does not rescan link profiles as a matter of course, then algo changes apply only to future link discovery/indexing; SERPs reached via profiles built when those links still counted are not affected UNLESS the profile is “resampled”. (Something that Google mentions in most updates.)
What would happen if Google resampled your pages and removed links that were on non-relevant (usually high-PR) pages? Your SERPs would suffer.
It is NOT a link percentage but a rechecking of relevance as seen by PageRank.
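
As a toy illustration of that claim – the relevance between the linking and linked page mattering more than the raw PR of the linker – here is a small Python sketch of a PageRank-style iteration where each link’s contribution is scaled by a relevance score. This is not Google’s formula, and the sites and scores below are hypothetical.

```python
# Toy illustration of the claim above: each link's contribution is damped
# by how topically relevant the linking page is to the linked page.
# NOT Google's formula; this simplification does not even conserve total
# rank, it only shows the qualitative effect of relevance weighting.

def relevance_weighted_rank(links, relevance, iterations=20, damping=0.85):
    """links: {page: [pages it links to]}; relevance: {(src, dst): 0.0-1.0}."""
    pages = set(links) | {dst for targets in links.values() for dst in targets}
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for src, targets in links.items():
            share = rank[src] / len(targets)
            for dst in targets:
                # An off-topic link (low relevance) passes almost nothing on.
                new_rank[dst] += damping * share * relevance.get((src, dst), 0.0)
        rank = new_rank
    return rank

# Hypothetical example: a big but off-topic linker vs. a relevant niche blog.
links = {
    "bignews.example": ["mysite.example"],
    "plumbingblog.example": ["mysite.example"],
}
relevance = {
    ("bignews.example", "mysite.example"): 0.1,
    ("plumbingblog.example", "mysite.example"): 0.9,
}
print(relevance_weighted_rank(links, relevance))
```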

You talk about domainname -dsfsdfsdf.org, but oddly enough I have never seen this on any other pages discussing SEO.
“There was something wrong with those links or their linking pages. They were being discounted. So I looked at the backlinking pages.”
Did you look at relevance? You did not say so.

Where did you see: “reported a ‘keyword classifier filter’ adjustment for april”?
I can find no mention of this in April.

You also state “there were no linking algorithm changes reported for april.”
Google says:
“Anchors bug fix. [launch codename “Organochloride”, project codename “Anchors”] This change fixed a bug related to our handling of anchors.”
Anchors are part of linking.

A “Bayesian filter” is for email, not webpages.
The text algos are way beyond Bayesian. Think LSI (quick sketch below).
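
For readers unfamiliar with LSI, here is a minimal Python sketch using scikit-learn: TF-IDF followed by a truncated SVD (classic latent semantic indexing), so documents about the same topic score as similar even with little exact keyword overlap. Purely illustrative; nobody outside Google knows what text models they actually run.

```python
# Minimal LSI sketch: TF-IDF + truncated SVD (latent semantic indexing).
# Illustrative only; this is the textbook technique, not Google's system.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "emergency plumber fixing a burst pipe and a leaking tap",
    "hire a plumbing contractor to repair pipes and unblock drains",
    "best chocolate cake recipe with dark cocoa frosting",
]

tfidf = TfidfVectorizer().fit_transform(docs)
lsi = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)

# Docs 0 and 1 end up close in the reduced space despite little word
# overlap; doc 2 stays separate.
print(cosine_similarity(lsi).round(2))
```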

@ Matthew
“But what if I could show you a couple dozen sites that are page one for hundreds of thousands of keywords and only use un-readable spun content?”
So, show us at least one.

Back @ Josh.
“don’t use meta keywords – google does not count it and may penalize you for it”
They will only penalize you if you use it incorrectly – think keyword stuffing. Used properly, it is a valid indicator of page content.

So I have a website that automatically generates pages based on Country/State/City plus one other factor, and we have over 2 million URLs (considering all those factors). Another site owner (I know this for 100% sure) did exactly the same thing as we did, BUT he has had his domain name since 1997 and his pages are also generated from the same factors. Now, after all the updates his site hasn’t been hit, but mine has (my site is 1+ years old, which is a little baby compared to his 15+). So my question is: how the hell am I supposed to generate over 2 million pages manually? Currently only the keywords change in the text based on the Country/State/City, which is clearly not enough. (A rough sketch of the setup is below.)
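
To illustrate the setup being described, and one possible way to avoid publishing two million near-identical pages, here is a hypothetical Python sketch of a geo-template generator that skips any page lacking enough location-specific content. All names, fields, and thresholds are invented for the example.

```python
# Hypothetical sketch of a geo-template site where only the place names
# change. The helper refuses to emit a page unless it has enough
# location-specific material to differ meaningfully from the template.
# Every name, field, and threshold here is made up for illustration.

TEMPLATE = ("Find trusted {service} providers in {city}, {state}, {country}. "
            "Compare quotes and book online today.")

def build_page(service, city, state, country, local_facts, min_facts=3):
    """Return page text, or None if the page would be a thin near-duplicate."""
    if len(local_facts) < min_facts:
        return None  # skip thin pages instead of publishing millions of clones
    body = TEMPLATE.format(service=service, city=city, state=state, country=country)
    return body + " " + " ".join(local_facts)

page = build_page(
    "plumbing", "Austin", "Texas", "USA",
    local_facts=[
        "Placeholder fact #1 specific to Austin (e.g. a local statistic).",
        "Placeholder fact #2 specific to Austin (e.g. a neighbourhood note).",
        "Placeholder fact #3 specific to Austin (e.g. a licensing detail).",
    ],
)
print(page)
```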

So, are you saying that if a site gets flagged by Penguin or Panda, the sites it links to are going to get NEGATIVE link juice? Or are they going to go down in rank because they have less of a foundation?

Very informative and love the video! It is nice to see individuals helping others out with the recent Penguin update. I am also glad to see so many people voicing their opinions and concerns. Readjustments are needed, but these tips and explanations will equip website owners with some answers. This will help many with understanding the new update. Thank you for your post.

Josh … any chance you can help me out with a site? I would greatly appreciate and value your expert opinion. I am trying to dig out a site that looks like it has been hit with a Google penalty, dropping it to page 86. I have been working on it for about 40 days now with no movement in the SERPs, which would be impossible unless it were being held down by the Penguin algorithm. What is more interesting is that it is #1 for one keyword, but on page 86 for the main keyword the owner wants to rank for, which it used to be #1 for. I think it would be interesting for you to take a look at. I am following you on Twitter now @stevesnyder101 find me. Let’s talk. Thanks for all the advice. Your videos are great.