authenticity of web content?

are there any rules or regulations in place to ensure the quality of web content? can we trust wikipedia and other websites? does google or other search engines have any mechanisms in place?

2 Answers

Relevance

Anonymous

7 years ago

Favorite Answer

The 'delivery' of content (in an unadulterated form) from a website can be protected by using SSL/TLS encryption, a security protocol that runs on top of TCP/IP.

This helps ensure content is not replaced or modified in transit across the Internet.
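To sketch that point about SSL/TLS protecting delivery: Python's standard-library `ssl` module builds contexts that verify the server's certificate and hostname by default, which is what prevents content from being silently swapped in transit. This is a generic illustration of TLS verification defaults, not something taken from the answer itself.

```python
import ssl

# A default TLS context: Python turns on certificate and hostname
# verification out of the box, so a connection made with this context
# fails if the server cannot prove its identity.
context = ssl.create_default_context()

print(context.verify_mode == ssl.CERT_REQUIRED)  # True: peer cert is required
print(context.check_hostname)                    # True: hostname is checked

# Wrapping a socket with this context (e.g. when connecting to port 443)
# would refuse any server whose certificate chain or hostname does not
# check out -- which is how TLS guards content integrity in transit.
```

Note that this only authenticates the *channel*; as the rest of the answer says, it says nothing about whether the content itself is accurate.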

The actual content of any website is ungoverned, and indeed, the Internet is for all intents and purposes 'the Wild West'...and anything goes.

Mix into that hodge-podge the possibility that a site (even a well-known and widely used one) can be hacked and its genuine content revised by persons unknown, and you have yet another element of doubt to deal with.

Trust in a website is just that (as with so much on the Internet): you trust someone else to provide reliable and competent data or facts.

Largely it's a matter of reputation and a 'track record' of reliable information.

Content quality accounts for something like 80% of a successful website. Recently http://thehostbay.com published an article on "The new rules of content quality". Google and Bing have started to look more closely at website content quality. You can read the full article here: http://thehostbay.com/new-rules-content-quality/ . There are also more articles there you could read.