What do you think about generating content to make a webpage pass a duplicate content filter?

So here is an example.

I have a website with hundreds of auto-generated pages.

All use the same template and the same content; only one word will be different (and that word repeats a few times).

Ex: Best SEO service in France
Best SEO service in US
Best SEO service in China
Best SEO service in India

So only the name of the country will be different; the rest will be identical.
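The generation step described above can be sketched roughly like this (the template text and country list here are illustrative placeholders, not the actual site's content):

```python
# Minimal sketch of the auto-generation scheme: one template,
# one substituted word (the country name) per page.
TEMPLATE = """<html>
<head><title>Best SEO service in {country}</title></head>
<body>
<h1>Best SEO service in {country}</h1>
<p>Looking for the best SEO service in {country}? Example body text.</p>
</body>
</html>"""

# Hypothetical country list; a real site would have hundreds.
countries = ["France", "US", "China", "India"]

# One page per country, differing only in the substituted word.
pages = {c: TEMPLATE.format(country=c) for c in countries}
```

Every page comes out byte-identical except for the occurrences of the country name, which is exactly why a duplicate content filter is a concern.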

To avoid duplicate content, I was thinking of auto-generating 300-400 words placed in a <p> tag, but with a very small font, so they appear in the page but the visitor will not be able to read them.

So each webpage will have its content split into 2 parts:
1. One part that is readable and delivers some useful information
2. A part that is only auto-generated random words (related to the niche), not readable or visible, but that will count in a content analysis.
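Part 2 above could be sketched as follows (the word list is a hypothetical stand-in for a real niche vocabulary, and the 1px inline style is one way of making text effectively invisible, as described in the idea above):

```python
import random

# Hypothetical niche-related vocabulary; a real list would be
# scraped or curated from the site's niche.
NICHE_WORDS = ["seo", "ranking", "backlink", "keyword", "serp",
               "optimization", "traffic", "indexing", "anchor", "crawl"]

def filler_paragraph(n_words=350, seed=None):
    """Return n_words random niche words wrapped in a tiny-font <p> tag."""
    rng = random.Random(seed)  # seed only for reproducible output
    words = " ".join(rng.choice(NICHE_WORDS) for _ in range(n_words))
    return '<p style="font-size:1px">{}</p>'.format(words)
```

Note that detecting exactly this pattern (text styled to be unreadable) is what the question is about: Google's quality guidelines explicitly list hidden text of this kind as a violation, and it can be found by rendering the page, not just by reading the HTML.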

Is there any way that G will see that and penalize the website?

My opinion is that this will work with no problem, but I would also like to hear your opinion.
