Anyone who comments online has a responsibility to act with decency and consideration of the consequences.

It can be as simple as taking a moment to put yourself on the receiving end and asking how you'd feel.

Calling someone a "dumb c---" or suggesting they "f--- off and die" is hardly constructive or kind.

Now that we know "ordinary" users are capable of trolling given the right environment, banning the worst offenders won't fix the problem.

We need to create an improved online culture for all, and discussion platforms should be redesigned to encourage this.

The Stanford and Cornell researchers suggest limiting the number of comments a person can make after participating in a heated debate; allowing users to retract comments, to minimise regret; and reducing other sources of user frustration, such as poor interface design.

Altering the context of a discussion - by hiding troll comments and prioritising positive comments - may also promote the perception of civility and encourage more thoughtful behaviour.

Existing regulations governing Facebook give it 48 hours to delete content deemed to be offensive. Facebook reckons this is adequate. Politicians and police disagree, and I'm with them. Two days of offensive content is two days too long.

When it is painfully obvious that self-regulation isn't working, and that thinking before venting is considered old hat, then stronger action must be taken to stamp out trolling.

Kylie Lang is an associate editor of The Courier-Mail

If you are experiencing depression or are suicidal, or know someone who is, help is available.