The BBC reports that the chief scientific advisor to the British government, Professor Sir David King, has set out an ethics code of “seven principles aimed at building trust between scientists and society”.

The aim, [King] said, was to outline responsibilities and values in order to encourage researchers to reflect on the impact their work would have on wider society. …

The code has been adopted by scientists working in the UK government – and Professor King has invited researchers in UK universities and industry to join them. Next year it will be launched internationally.

“The seven points in this code are part of what separates researchers from charlatans, medicine from quackery and science from supposition,” he said.

They seem like quite sensible principles — so sensible, in fact, that you might ask why they need to be formalized in a code of ethics. Don’t scientists already know that they should be honest, be fair to their fellow scientists, avoid conflicts of interest, keep up with the literature in their field, and all that good stuff?

Surely they do, but we’ve noted before that knowing what you ought to do and actually doing it are two different things. The question then becomes, how exactly does having a code of ethics help?

Professor King mentions a couple of tangible ways that a recognized code of ethics might help. If an employer asked a scientist-employee to advance a claim counter to good evidence, the scientist-employee could say, “Boss, I’d love to help, but my hands are tied by this code.” Also, King suggests that having an explicit code will remind scientists of their relationship to the larger society.

But don’t scientists already know that what they do has implications for the public? Certainly, they know that a minimal level of public support is necessary for the public monies that support scientific research.

What do you think the effects of adopting such a code will (or could) be for scientists in the UK? Do you see any of the seven points in the code as problematic? Do you think there are any important ethical requirements on scientists that should be in the code but are not captured by the seven points?

Comments

Based on what we have witnessed over the past two decades regarding unethical behavior of scientists and their institutions, I believe that a code of ethics will have only a minimal effect on reducing misconduct. The ethical scientists know perfectly well what is acceptable and what is unacceptable in their practice of science. The fraudsters will continue to commit their unethical deeds. Unfortunately, the only measure that may have some teeth is a recognized book of rules that every scientist, when hired for a research job, must sign, agreeing to abide by them with the understanding that any infraction could lead to termination of employment.

The thing is, any perceived problem between science and society is rarely the fault of the scientists themselves. The majority of ‘science’ literature consumed by the general public comes through the press (largely newspapers and television), who have their own agendas and often misrepresent what is actually going on. This happens both accidentally — journalists aren’t usually scientists and so frequently misunderstand the science itself — and by means of ‘sensationalisation’. For instance, look at what happens any time someone releases something new in the field of quantum mechanics: we get press headlines of “Scientists close to Star Trek teleportation” when nothing of the sort is true.

Of course this isn’t always the case; the success of ‘A Brief History of Time’ is a great example of science being reported to the layman.

One way in which this code of ethics may help is in educating the public as to what standards they should expect from scientists. In the UK this seems especially relevant following the MMR vaccination scare that was started by a scientist/doctor ignoring several of the principles.

In fact I would not be at all surprised if Andrew Wakefield and his antics were central to this idea.

One way a code of ethics helps is if it can be used for legal protection from firing or demotion because of whistleblowing or for refusing to do something that would be in breach of the code. Engineers and lawyers in the United States do receive some limited protection because of their codes of professional conduct.

This kind of code of ethics is completely useless. All it says is, “Behave ethically”.

There is a reason why the “Model Rules of Professional Conduct” for attorneys run to tens of thousands of words and contain detailed proscriptions and prescriptions relating to situations attorneys regularly find themselves in.

> Minimise impacts on people, animals and the environment
Should be
Minimise involuntary and unintentional effects on people, animals and the environment

> Ensure that research is justified and lawful
Should be (wow, justified is one weaselly word. It can mean _anything_)
Ensure that your research is funded and lawful, and if not, ensure that it will be.

I think one way that a code like this could be an advantage is not for scientists, but for the general public. Much like the general public has a basic understanding of medical ethics through their understanding of at least some parts of the Hippocratic oath, a widely publicized, agreed-upon set of ethical principles for scientists could be helpful.

I agree, though, that it will have minimal effect on practicing scientists.

Having been in two labs in which scientists sabotaged others’ work, I think this bears some emphasis.

Should be:
Minimise involuntary and unintentional effects on people, animals and the environment

A PI in grad school told me he’d disposed of a half gallon of our phenol/chloroform waste by pouring it down the drain. A stream that ran through campus regularly changed color as it passed the chemistry building. Maybe “voluntary and involuntary” would be appropriate.

A written code of ethics can be quite a good thing even if it is generally ‘obvious’. How about the Hippocratic Oath as an example? It serves more to clarify the expectations and role to others than to govern the behavior of the practitioners (which it also does, but they should already know what is within their role and what is unethical because it breaches that role).

This is related to another aspect of having an explicit ethical code… it helps define what an adherent of that code ‘is’. We could say that while there are others who may adhere to these ethical principles (I would certainly hope so), being a scientist in good standing requires it. This partially defines what it means to be a scientist… and allows non-scientists to know something scientists have in common.

I went to a school with an honor code. We also had take-home exams and all sorts of other things that simply would not work if the honor code didn’t have a real effect on how students (and faculty) acted. It was short and sweet and obvious… “Don’t take unfair advantage of other members of the community”. It also hinges on ‘weaselly’ words like “unfair advantage”… which is not really a weakness. An honor code or ethics code is not a law. How those vague words are interpreted is actually part of it. (Another example would be “harm” in the Hippocratic oath.)

That all being said, this particular code has one bit that bugs me:
“Minimise impacts on people, animals and the environment”

We know what it is trying to say, but it doesn’t actually say it. We scientists don’t want to have minimal “impact” on people or the environment… quite the opposite. We try to improve the world (or at least we should). How about “Minimize negative impacts on people…” instead.

Other than that, while it could be significantly shorter (and more pithy), it is pretty good.

The key points IMO:
1: Try to maintain expertise so you know what you are talking about.
2: Be open and honest… unfortunately too many scientists actually do have difficulty with this, especially the “open” part.
3: Engage in honest and open debate.
4: Try to do research which will make the world a better place, and don’t make a mess along the way.
…. 1-3 are actually necessary for science (as a collective intellectual endeavor) to actually work. The 4th is blatantly obvious “do good”, but nice to say anyway.

The ethical codes I have seen before are either regulation for medical or biological enterprises, or provided by unions (to avoid conflicts, help raise the status of a profession, et cetera – and yes, against effects of whistle blowing as well).

Those haven’t been especially detailed. They are framing the issues, not specifying them, which can be problematic. (Not everyone lives in a fixed working situation.) Of course, that is also helpful for communicating the codes within and without the profession.

> Ensure that research is justified and lawful

This one bothers me. I agree that, as a general rule, research should be lawful, but it’s offensive to emphasize that: scientists have exactly the same obligation to obey the law that everyone else has. Asking that research be justified is equally bad. If someone values it enough to be willing to pay for it, then that is justification enough. If nobody’s willing to pay for it, but it is being done on a volunteer basis, then who is in a position to complain that “unjustified research” is being done? Anybody is entitled to waste their own time.

> Act with skill and care, keep skills up to date

This is a professional obligation, in the sense that it is necessary if one is to do good work, but I’d hardly call it an “ethical” obligation. In some areas, like medicine, it’s unethical to even attempt to treat patients if one’s skills aren’t up to date, but that doesn’t apply here. If a scientist’s skills are rusting and they’re unable to do good research, then their employer may have a legitimate complaint about their incompetence, but I wouldn’t call it an ethical failing (unless they tried to cover up the problem, and that’s a different issue).

> Discuss issues science raises for society

I’d replace “discuss” with “acknowledge”. It’s not right to pretend there are no issues, but not everybody has to devote their time to discussing them.

> Prevent corrupt practice and declare conflicts of interest

What the heck is a conflict of interest? I assume this means financial conflicts of interest, but there are many other sorts. The originator of an idea may have a conflict of interest in work that challenges it. Anybody may have a conflict of interest regarding people they are related to, in love with, a mentor to, etc. Anybody who’s ever been on an NSF review panel knows how many conflicts arise there, and that’s just the tip of the iceberg. I don’t know how to handle these conflicts in everyday life, but I doubt we’ll ever have to declare them all in papers. Either this rule should specify that it means only financial conflicts of interest, or it needs to be a lot more detailed.

> Respect and acknowledge the work of other scientists

“Respect” should be dropped. One is required to acknowledge the work of other scientists, and it might be rude not to respect it, but it’s hardly unethical to be rude.

> Do not mislead; present evidence honestly

This is so vague as to be almost useless. Is it dishonest to use Photoshop to highlight details in a photo (if one never claims the photo was unmodified, and the details are real)? I think this is ethically questionable, but it is common practice in some areas. A good code of ethics helps resolve tough questions; it’s not enough to have a handful of vague, feel-good statements.
