Posted
by
Soulskill
on Saturday May 30, 2009 @11:28AM
from the don't-steal-plutonium-from-terrorists dept.

Hugh Pickens writes "The image of scientists as objective seekers of truth is periodically jeopardized by the discovery of a major scientific fraud. Recent scandals like Hwang Woo-Suk's fake stem-cell lines or Jan Hendrik Schön's duplicated graphs showed how easy it can be for a scientist to publish fabricated data in the most prestigious journals. Daniele Fanelli has an interesting paper on PLoS ONE where he performs a meta-analysis synthesizing previous surveys to determine the frequency with which scientists fabricate and falsify data, or commit other forms of scientific misconduct. A pooled, weighted average of 1.97% of scientists admitted to having fabricated, falsified or modified data or results at least once — a serious form of misconduct by any standard — and up to 33.7% admitted other questionable research practices. In surveys asking about the behavior of colleagues, admission rates were 14.12% for falsification, and up to 72% for other questionable research practices. Misconduct was reported more frequently by medical/pharmacological researchers than others. 'Considering that these surveys ask sensitive questions and have other limitations, it appears likely that this is a conservative estimate of the true prevalence of scientific misconduct,' writes Fanelli. 'It is likely that, if on average 2% of scientists admit to have falsified research at least once and up to 34% admit other questionable research practices, the actual frequencies of misconduct could be higher than this.'"
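For readers curious what a "pooled, weighted average" means in practice: Fanelli's actual pooling method is described in the paper, but a minimal sketch of the idea weights each survey's admission rate by its sample size. The survey numbers below are made up purely for illustration.

```python
# Hypothetical surveys: (number of respondents, fraction admitting fabrication).
# Fanelli's real pooling is more sophisticated; this only illustrates the idea.
surveys = [(500, 0.012), (320, 0.031), (1100, 0.018)]

def pooled_weighted_average(surveys):
    """Sample-size-weighted average of survey proportions."""
    total_n = sum(n for n, _ in surveys)
    return sum(n * p for n, p in surveys) / total_n

print(f"{100 * pooled_weighted_average(surveys):.2f}%")  # pooled admission rate
```

Larger surveys pull the pooled estimate toward their own rate, which is why a single big survey with an unusually low (or high) admission rate can dominate the headline number.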

It is often observed that crappy, broken or incomplete code gets shoved out the door by business in order to meet deadlines. Quality, or even truth, is sacrificed for business reasons.

Why would R&D be any different? Big businesses often have quotas and other incentives for patent filings and the like. Outside funding sources put pressure even on pure research activities, so that they can get their hands on new technology, or even for silly things like having a name recorded as "first to" do something.

If we accept that scientists are human like anyone else, we accept that they, like others, will make mistakes that get bigger and go more wrong than they anticipated. Some may intentionally commit fraud.

How common scientific misconduct is relative to other types of misconduct seems the more relevant question.

I wonder if this refers to shortcuts taken because it's common knowledge. For example, if you use water as a control lubricant, you might test its wetness, density, purity, viscosity, etc., to compare against water with a slippery polymer in it. I wonder if these "questionable" practices involved taking distilled water, making sure it's pure distilled water, and then pulling the other factors off of charts for distilled water, or if "questionable" means something far worse.

The reason I bring this up is that hindsight is 20/20, and everybody knows every mistake they've made, if they're smart; that's what they're fessing up to.

True, giving a certain "spin" to your interpretation of correctly presented data is common, but not necessarily a terrible thing. As you said, it will be scrutinized and filed in the big "misinterpretation" folder. As for active misconduct: it probably happens more often than reported, but thankfully gets caught internally most of the time before it is published. I can only offer anecdotal evidence, but while doing my PhD work, one of my colleagues tried to get away with made-up results. The head of department smelled a rat, checked the data and promptly fired the guy without hesitation. PhD student one day, unemployed with a revoked visa the next day....

I respectfully disagree. Falsified data has real-world effects. Sooner or later, someone will try to reproduce or modify your experiment, fail to do so, and properly mock you at the next conference. I've seen it happen; not a pretty sight. Science is mostly self-correcting, although some crap can always fall through the system and stick around a long time until it is corrected.

Scientists will behave much better as soon as society (or perhaps at least the government) understands that if you want reliable information, you actually have to treat your scientists well.

Now, don't get me wrong: some countries, especially the US, invest quite a lot in science. But the problem is that the whole system is rotten to the core. It makes almost no sense at all for a young graduate to stay in a university or institute. Pay will be low, and you have (in most countries) no job security. In Europe you either get a nice job at a company, or you go around taking post-docs for 5-10 years, hoping to get lucky, working crazy hours with no holidays. In the end, most go to a company anyway (having lost quite a lot of money in the process).

Often you are expected to go abroad, and unless you are lucky this leaves you with no good way to take care of your pension. Then, if you want to return, somebody else has taken your place at the university.

There are two ways to stay in the system: either you are lucky, or you lie like hell.

Now, people may say that if you're good you don't need luck. But remember that for high-impact publications you need a lot more than good ideas and good skills. In research it is perfectly normal to conclude after two years that your hypothesis is false. This is great science, but it is hardly publishable in a good journal. People like positive results, and the reviewer system actually encourages you to confirm generally accepted ideas, not to falsify them.

Well, I could go on but I am sure others will.

To be honest, I do not even get angry anymore when I suspect someone may have done something "questionable". It's just sad.

The truth is, the way scientific institutions are set up isn't very scientific. There is definitely an attempt at oversight and impartiality, but it's very easily corrupted by a wide variety of people with a wide variety of interests and ulterior motives. There aren't nearly enough checks and balances.

There are many things wrong with the system. Some include:

- Almost anyone can commission a study, write a book, etc., and it's left to the scientific community to place value on that work. Viewed on its own, without knowledge of the scientific community's opinion, it can be difficult to tell how valid the work is. For example, Wolfram's "A New Kind of Science" has been largely debunked as mostly a rehash of old ideas (minus accreditation), but it took some time for this to become clear, and in the meantime it was popularized in the press as a breakthrough work.

- The only real form of moderation is whether or not work has made it into a respected journal. Other scientists are then expected to publish corroborating work etc. However, until this is done, it is very difficult to judge the validity of the work, and papers get published that are later discredited. (Cold fusion anyone?) Likewise, work that should be published is often initially rejected. The primary motivation of a lot of the scientific journals is financial gain. In fact the entire publishing system is an antiquated remnant of the last 2 centuries and doesn't belong in an Internet connected world, yet publication is still the primary tool by which a scientist's work gets recognized.

- Speaking of antiquated: the institutions, committees and governing bodies of science are about as scientific as a mothers' group; it's all professional bitching and posturing for status. Real monkey-hierarchy stuff. A lot of decisions get made on the basis of status. It's particularly bad for applied-science professions like medicine, where you hear stories about doctors who should have been prevented from practicing continuing for many years before being disciplined or quietly removed. At the senior level, scientists are often more politician than anything else, as they need to secure funding and approval from political bodies. Then you see students who have to work their way up in status being treated like crap, "paying their dues," as noted in a story posted a few days ago about a student who died in a chemical fire.

- Speaking of status, there is an emphasis on using scientific jargon to exclude the community at large. Some scientific ideas are complex and genuinely require specialized language and postgraduate mathematics to understand. However, even simple concepts must be described in overly complex specialized language to be accepted for journal publication. This is absolutely backward. We should have a system that requires simplified language where possible, with a layman's overview attached early in the document. Instead, reading a scientific paper when you're not a specialist in the field is an art you learn when you do postgraduate work. If you assess a published article for readability, the statistics you generate will tell you that it's dense and difficult to understand. There are journals and subjects that allow simpler and more informal language, but they are the exception rather than the rule, and usually apply as addendum publications for applied fields. (Again I'm thinking of medicine. My own postgrad work is in astronomy, so I'm very much a lay reader when it comes to medicine, and when I've tried to read medical papers it's usually been an interesting exercise.) Any real simplified content seems to get presented in slide form at conferences, and presentations are often a better way of getting an overview.
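The readability claim is easy to check for yourself. Here is a rough sketch of a Flesch reading-ease score (higher = easier to read), using a crude vowel-group syllable heuristic rather than a real pronunciation dictionary; the two sample sentences are invented for illustration.

```python
import re

def syllables(word: str) -> int:
    # Crude heuristic: count groups of consecutive vowels (y included).
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text: str) -> float:
    """Flesch score: 206.835 - 1.015*(words/sentence) - 84.6*(syllables/word)."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    n = max(1, len(words))
    syl = sum(syllables(w) for w in words)
    return 206.835 - 1.015 * (n / sentences) - 84.6 * (syl / n)

dense = ("The methodological heterogeneity of the included investigations "
         "necessitated stratification by operationalization of misconduct.")
plain = "Some studies asked about fraud in different ways, so we grouped them."

print(flesch_reading_ease(dense), flesch_reading_ease(plain))
```

Run this on almost any journal abstract and a newspaper paragraph covering the same result, and the gap is usually dramatic.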

I could go on about the shortcomings of various scientific institutions but I won't.

My point is that when you have a system that is so open to corruption, with so few checks and balances, and so much baggage inherited from institutions that began in the dark ages, it's no surprise that you end up with science that's much less than perfect.

Definitely they sometimes fudge their data so that it will support their theories. Scientists are human and not perfect; it's part of human nature. That is where peer review comes in. A true scientist's work has to stand up to peer review, and this is where the fudging of data is often uncovered. The problem is that much of the research going on is cloaked in secrecy by governments and corporations, and proper peer review doesn't happen.

This brings to mind an incident in history where the scientist was right but his data was just too good. I'm talking about Gregor Mendel and his work on genetics. Later statistical analysis of his data indicates that it was very unlikely he actually got those numbers. He probably got very close to the result he predicted, but it was not good enough, so he fudged his results. It wasn't until long after that this inconsistency in the data was uncovered. Was he right? Absolutely, but his data is suspect nonetheless.
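The "too good" argument (usually credited to R. A. Fisher) can be illustrated with a chi-square goodness-of-fit test. A minimal sketch using Mendel's commonly cited F2 seed-colour counts against the expected 3:1 ratio; the closed-form probability below relies on the chi-square distribution with one degree of freedom:

```python
from math import erfc, sqrt

def chi2_3to1(dominant: int, recessive: int) -> float:
    """Chi-square statistic of observed counts against an expected 3:1 ratio."""
    total = dominant + recessive
    exp_d, exp_r = 0.75 * total, 0.25 * total
    return (dominant - exp_d) ** 2 / exp_d + (recessive - exp_r) ** 2 / exp_r

def p_at_least_this_good(chi2: float) -> float:
    """P(a fit at least this close arises by chance), chi-square with 1 df."""
    # CDF of chi-square with 1 df: erf(sqrt(x/2)) = 1 - erfc(sqrt(x/2)).
    return 1.0 - erfc(sqrt(chi2 / 2.0))

# Mendel's commonly cited F2 seed-colour counts: 6022 yellow, 2001 green.
x = chi2_3to1(6022, 2001)
print(f"chi-square = {x:.3f}, P(agreement this close) = {p_at_least_this_good(x):.2f}")
```

For this one experiment, roughly a one-in-ten chance of agreeing with theory that closely; Fisher's point was that across all of Mendel's experiments the combined probability becomes vanishingly small.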

A number of my friends are scientists and some have told me they bodge the results now and again to match what they were expecting.

In that case, they're not scientists. If they fudge results, they are simply invalidating their experimental data by repeating their initial hypothesis as a result without bothering to challenge it.

I can understand commercial pressures for funding and so forth may be important to the researcher, but in many cases it saves everybody a lot of time if negative results are published to start with. Sure, they will rarely earn anyone a Nobel Prize, but we have to accept that a lot of what science is about is repetitious or tedious donkey-work.

Aye, I think I go with your interpretation here. I personally would confess to "questionable practices" of that kind: not thoroughly testing each and every factor that might have influenced your experiment, because it is "common knowledge" that the factors in question won't matter. Deadlines looming ahead, supervisor chewing your ass, you take the shortcut. No research is perfect. In hindsight you always find some things you should have tested to be really sure, but real life is not perfect. I'd file that 33.7% under "maybe questionable, but not malicious." Scientists tend to be overly critical of themselves. I personally could not state with a straight face that my research was always impeccable and perfect. Who could? We are humans, too.

At least in science there is a built-in way of self-correction. Publish all the made-up crap you want, but when no one can duplicate the feat, don't be surprised when the community calls you out on it. Tell me where you go to find the guy double-checking the work of the corrupt police officer or judge when they perjure themselves to ruin your life and your ability to defend yourself. Find me the people replicating every aspect of your graft-prone mayor's work to make sure he's not full of shit...

I can't think of anywhere else in life with as many checks, double checks and accountability as the field of scientific research. That no one catches it immediately means nothing. If it was fake, no one will be able to replicate it. A single study proves very little and likewise does very little damage, so if no one cares enough to replicate it, chances are slim that it will cause harm.

The primary motivation of a lot of the scientific journals is financial gain. In fact the entire publishing system is an antiquated remnant of the last 2 centuries and doesn't belong in an Internet connected world, yet publication is still the primary tool by which a scientist's work gets recognized.

Let's not go there, lest i shall rant all evening. I am due for a pub-crawl, don't wanna miss it...

Short version of the evil socialist scientist's rant: I do government-funded research, then have to PAY a private enterprise to publish my data; peer review is done for free by other scientists; and then I have to PAY again for reprints, and the money-grubbing bastards charge through the nose for the subscription too, so that the local library can't even afford online access to the journal I published in. Flog the bastards!! Freedom for scientific publication! To the sun, to freedom, comrades!

More seriously, the major flaw with the current publication system, is that you need positive data to even have a chance to publish. I always wished for a kind of "Journal of Negative Results", which basically gives you summaries on "see, we tried this, it did NOT work"-attempts. All the valuable work that did not work out as expected has no chance of getting published today, forcing you to repeat countless mistakes, because you have no chance of reading about previous failures.

Moreover, you rarely become a professor at a major university or some other distinguished position only on the basis of being talented; it is much more important that you are skilled at writing and inter-personal politics, manipulative both in terms of being able to sell your research and in terms of luring grad students, junior researchers and funding agencies to work for you or to pay you. Unfortunately, the same manipulative skills you need to acquire to become successful make you potentially more capable of cheating. I don't mean to insult anyone here by implying that it will actually make you more likely to cheat; only that it's easier for you to cheat because you are skilled at manipulating others (this being said, arguably the line between skilled manipulation and outright cheating is not as crisp and well-defined as one might hope). Indeed, sometimes cheating happens unwillingly; I have witnessed it on multiple occasions, when a famous professor would write a pile of outright bullshit in a paper; not intentionally, but because his bullshitting skills and confidence were orders of magnitude above his raw technical competence.

It is often cited that crappy, broken or incomplete code is often shoved out the door by business in order to meet deadlines.

The reason R&D differs from software development is that the latter usually doesn't need to present conclusions or premises to the community at large. It can (and often does) hide the source and get away with saying "no warranty, yada yada..."

By presenting your research in reputable journals, you are exposing it to the examination and criticism of your peers. Thus in theory anyone else can pick up your work and reproduce it. One aspect of Hwang Woo-Suk's work that brought about his demise was that others failed to be able to reproduce his work. Unfortunately for him, his claims were so grandiose that alarm bells rang and people started looking at his work more closely.

The eventual fallout can be seen as evidence that the system works. We have little way of knowing how much dodgy work slips under the radar in the short term, since people don't get paid much for reproducing other scientists' work, but at least there is a mechanism where it CAN happen.

So let me get this straight. You wanted to be a high-level scientist, but felt it was an unreasonable burden to prove you could understand the theory, could perform the practical, and could explain findings to others? And now it's the system's fault you're doing nothing worthwhile?

Scientists are humans too, and a job won't stop some humans from being cheats.

I see about half a dozen comments along those lines, but giving up and saying "c'est la vie" isn't constructive. Our scientific systems and institutions should have better checks and balances. Many jobs and professions include monitoring and auditing to prevent corruption as standard; some are better, some are worse. Regardless, the checks and balances on scientists exist but are antiquated and ineffective. The institutions and traditions are outdated. We can do better!

Our scientific systems and institutions should have better checks and balances.

They do: science. While you can game the system (grants, publications, fame and fortune) you can't game science forever. If it's real, it's repeatable. Somebody can do it (if it's important enough). If it's not important enough and the information gets stuffed in some hard drive somewhere - no big deal.

Sure, money can be wasted. People can be injured. Reputations can be trashed. But in the end if it's real and important someone else will look into it and either confirm or deny it. It may take years or decades, but it will happen.

I don't quite see how you came to this conclusion, especially given the text of this article. The authors were specifically looking at misconduct in research published in peer-reviewed journals. The vast majority of material published in these journals originates from universities, not industrial research and development.

I would suggest, in fact, that misconduct is probably at least as common if not more so in a university environment than in an industrial one. Tenure-track professors are under enormous pressure to publish and their research projects are operated in an essentially unsupervised environment. The graduate students and post-doctoral researchers who actually do the lab work are generally in no position to correct or even be aware of misconduct by a professor, and are also under the same kinds of pressure to produce results in order to succeed. Couple this with the fact that much research is esoteric and funding, time, and interest to reproduce others' results is nearly non-existent and you have an environment ripe for scientific misconduct.

At the very least, in industry, you're constrained by reality. If you say you can make a product and you can't, there is an economic penalty (and potential loss of employment), which encourages conservatism and honesty in research. In academics, a paper containing falsified data published in an obscure journal which no one reads is still a publication that you can add to your c.v., and really, who will ever notice?

Regardless, the checks and balances on scientists exist but are antiquated and ineffective. The institutions and traditions are outdated.

And...?

I'd be more interested in what you think would fix it rather than another statement that the problem exists, because that's not all that constructive either. I take it that you are saying that peer review isn't a sufficient means of monitoring and auditing?

Ever heard the phrase "publish or perish?" Trust me, there's just as much pressure in academia to produce results within a specified time frame as there is in industry. The organization's measurement is different -- publications vs. ROI -- but the situation of the individual researcher is much the same.

There's also the case, common among undergrads, where the results of the experiment are already known. Suppose your experiment is testing the law of conservation of momentum. You have a few hours of lab time and they won't let you have more if you make a mistake. You get home and start analyzing your data only to find that *gasp* the law of conservation of momentum appears to have been suspended at your lab bench for a few hours. Neither claiming that you've cracked physics nor that this is obviously a case of human error makes an acceptable lab report.

In the "real" scientific world, maybe scientists aren't under quite so much pressure to find the "right" results, but often, only just a little bit less. They've incorporated a mindset throughout school that the "right" results are important to their superiours and, like the undergrad, their access to lab time is limited.

So you find an excuse to dismiss an inconvenient outlier, you apply a magical fudge factor which you can't explain, or you guess about what human error could have done to the data and you try to compensate for it. It's intellectual dishonesty, no doubt, but it's inevitable in a system where our method of laboratory education emphasizes confirmation of "known" science and punishes students whose data appears to deviate.
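How much silently dropping an outlier distorts a result is easy to demonstrate. A toy sketch with hypothetical measurements of a momentum ratio, where 1.0 would mean momentum was conserved (all numbers invented for illustration):

```python
import statistics

# Hypothetical measurements of p_after / p_before; 1.0 means perfect conservation.
trials = [0.98, 1.01, 0.99, 1.02, 0.74]  # the last trial is the "inconvenient" one

honest = statistics.mean(trials)        # report everything you measured
fudged = statistics.mean(trials[:-1])   # silently drop the outlier
print(f"all trials: {honest:.3f}, outlier dropped: {fudged:.3f}")
```

The honest move is to report the outlier and investigate it, or apply (and document) a principled rejection criterion; dropping it silently manufactures exactly the agreement the grader expects.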

It is often observed that crappy, broken or incomplete code gets shoved out the door by business in order to meet deadlines. Quality, or even truth, is sacrificed for business reasons.

Why would R&D be any different?

In a sense R&D is worse, in that it's farther removed from corrective mechanisms. If you sell consumer tech that doesn't work, chances are fairly good that it will harm your business. Depending somewhat on your field, if you publish research that is arguably correct but meaningless or highly misleading, nobody will care. Your funding source doesn't even care, as long as it looks enough like real science that they can get away with continuing to support it.

One could also argue that if someone is manipulative and political in many aspects of their life, they are most likely manipulative and political in the others as well. If somebody were always drunk at home, drunk at work, and drunk in public... why the hell would you think they're always sober while driving? If a scientist is always manipulative with their family, their co-workers, and while attending social events... why the hell would you expect they're always forthright while doing research?

Leaving aside your questionable assertion that being a factory worker is far worse than being a scientist, the more relevant comparison would be to jobs that require similar levels of education and competence. A compelling case can be made that the education and competence required to be a scientist is similar to that of a medical doctor, lawyer or high level engineer.

The problem with a career in science is that it is like a career in acting. Sure, there's the super stars at the top who are doing extremely well for themselves but then pretty much everyone else is struggling just to feed their families.

Of course, there are struggling actors who obviously don't have what it takes to be actors and there are struggling scientists who obviously don't have what it takes to be scientists. There are also, however, huge numbers of actors and scientists who are doing everything right and who are just as talented as the guys at the top but who somehow just didn't get their big break - and who, as a result, are struggling to feed their families.

So, what's the problem? Well, a lot of young people are encouraged to embark on the long and arduous path to becoming scientists with the belief that they will eventually command salaries on par with careers that require similar levels of education and competence (medical doctors, lawyers, etc.). Unfortunately for them, when they finally complete the dozen or so years of training to become scientists, they realize that they are overwhelmingly likely to command a salary on par with mid-level factory workers.

Eventually as knowledge about expected science salaries becomes more widespread, "the market" will probably adjust and young people who are considering careers in science will have enough information about expected salaries to choose other careers such as medicine, law, or management.

If the USA, for example, doesn't want to be a world leader in scientific research then that's totally fine. It's unfortunate for people who have already committed to a career in science but, with any luck, today's young people will choose other careers and complaints about low pay for scientists will go away because there won't be any scientists left to complain.

So what do you suggest? To accept the report of your undergrad claiming he violated the law of conservation of momentum? Or of another who violated the conservation of energy?

Deviations from the laws are interesting, in that you can go find out why your experiment deviated, and learn a lot in the process. You don't learn anything if you just assume your experiment went wrong because of x, and leave it that way. You do the experiment again to check if it was what you imagined.

I'm talking as someone who has plenty of times repeated his experiments. I don't know what rock you live under, but in my college it was just unacceptable to turn in a speculative report. Sure, there were kids that faked data so they wouldn't have to redo the experiment, and most weren't caught. Fuck them, they're just hurting themselves.

And I don't buy the limited lab time either. The basic lab sciences are always empty, you just go there out of class time. If you are in the advanced course, it's rarer that you commit such basic mistakes, but if you do, you can prepare yourself to sleep a little less that day.

Peer review may not catch the journal article, but it eventually catches the faker.

The problem is, the public seems to think that one paper published in a journal translates into "this is true." It doesn't. Far more common than outright misconduct are studies that are preliminary, contain an honest error, or are a statistical fluke.

Journal papers are about sharing information, NOT about laying down Truth on the Record. When all the studies start consistently showing the same thing, THEN you can start thinking about believing it.

But scientists, like doctors, are supposed to be trustworthy. They are experts whose opinions carry more weight than the average person's. So what mechanism is in place to check up on and verify everything they do? Where is the regulation, and the punishment for breaking regulations?
Take construction, for example. If I build a house, there are electrical, plumbing, foundation, insulation and final inspections. Why? Because people cheat, and someone has to ensure no one is cheating. If the rules are not followed, someone could get hurt (electrical fire), someone could get sick (mold), someone could get screwed out of a lot of money (shady contractor). This is why construction is regulated. Who is regulating scientists?
Scientists can hurt a lot more people than a shady contractor. They play around with deadly diseases, nuclear reactors, decisions that affect the planet. Accidents happen, but what happens when scientists intentionally do something wrong, or take money for a job they don't do? Who is looking over their shoulder, watching what they do? Self-regulation doesn't happen, which is why most industries are regulated and monitored, with penalties for not following the rules. What makes scientists so much better than the average person that they don't have to be accountable like everyone else?

The issue here is, when you're doing things like stem-cell research, the future of humankind is in your hands. This is like saying "we shouldn't put people in prisons, because they're not animals, and being killers or thieves doesn't make them animals." Unfortunately, you're right, because almost anyone can do "research" today.

I don't really understand your point about people and prisons. I suspect there is a typo or something in there but I would like to address the first sentence.

The level of importance of a task doesn't make people more or less likely to cheat. And when I say "people" I mean a sample community en masse (in this case the research community). I suspect some people would take a great deal to push into cheating while others cheat quite easily, so I'm really talking about the statistical chance of a random community member cheating.

The chance they'll get caught and the penalty for getting caught versus the reward they'll experience if they succeed is what matters and rewards don't have to be monetary (think notoriety, vindication, etc.). The work being important affects both of these usually but it doesn't directly affect the cheater. I suspect despite our protestations of being reasoning creatures you'll find that cheating or not cheating when modeled at a group level looks a lot like every other risk taking decision we make and even more primitive ones like do I drink from the water hole while predators are around.
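That risk-reward framing can be written down as a toy expected-payoff model; all numbers here are made up purely for illustration, and "payoff" stands in for money, notoriety, vindication, or anything else the cheater values.

```python
def expected_payoff(p_caught: float, penalty: float, reward: float) -> float:
    """Expected payoff of cheating: lose `penalty` if caught, gain `reward` if not."""
    return (1 - p_caught) * reward - p_caught * penalty

# Toy numbers: even a harsh penalty doesn't deter if detection is unlikely.
print(expected_payoff(0.05, penalty=100.0, reward=10.0))  # cheating "pays"
print(expected_payoff(0.50, penalty=100.0, reward=10.0))  # cheating doesn't
```

On this view, raising the detection rate (replication, audits) matters more than raising the penalty, because the penalty is discounted by how rarely it is applied.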

you rarely become a professor at a major university or some other distinguished position only on the basis of being talented

I assume you mean "book-smart at science," in which case, you're right.

it is much more important that you are skilled at writing

Being able to effectively communicate your results is critical for scientists. That isn't a bad thing. There's no point in doing science if you don't or can't tell anybody what you did and why it matters.

and inter-personal politics, manipulative both in terms of being able to sell your research and in terms of luring grad students, junior researchers and funding agencies to work for you or to pay you.

You're putting a bad spin on this with "manipulative." Most science nowadays involves teams and collaborations; very few discoveries are made by the lone guy in his garage with a bunch of test tubes. If you are working in any area where you cannot go it completely alone, you need to be something that's an even dirtier word on Slashdot than "manipulative." On top of knowing your science, you need to be an effective... wait for it... manager (gasp!).

As for the funding... most funding is peer reviewed. What is wrong with telling scientists that they cannot have scarce resources unless they can convince experts in their field that the research is worth funding? Can you think of a better way to fund science?

Unfortunately, the same manipulative skills you need to acquire to become successful make you potentially more capable of cheating.

Do you have any evidence to back this up? Good people skills and Machiavellian manipulation are not the same thing.

It seems more plausible to me that if you're a scientist who works in a highly collaborative team environment and regularly gets funding from the bigs (NSF, NIH, etc.), it would be harder to last as a successful cheat. Somebody who works mostly solo or with just a couple of grad students can send off their results to a journal, and they just have to look plausible to the editor and journal referees. The socially skilled scientist who has a big team has to slip their cheating past the grad students who did the hands-on work. If they're attracting lots of funding, they are going to get close scrutiny, and it's going to be hard to keep getting grants if nobody can replicate their work. And if they are well networked and therefore well known, there are going to be lots of people trying to replicate the results so they can build on them.

I have witnessed it on multiple occasions: a famous professor writing a pile of outright bullshit in a paper; not intentionally, but because his bullshitting skills and confidence were orders of magnitude above his raw technical competence.

I don't know about your field, but in my experience these are the people with enormous targets on their backs. Good scientists are smart enough to recognize bullshit, or at least suspect it. And the young upstarts, who haven't been around long enough to be impressed by Professor X's reputation, see an opportunity to make their bones by taking down a famous blowhard. The system ends up self-correcting pretty well.

You talk about working two years on an experiment to find out your hypothesis is wrong? Cry me a river. There are tons of people who work for two years, five years, ten years, pitching in to build up a business, and then get bumped out on the street because some jackass guy in bufukistan can do it cheaper.

I think that you are missing the gp's point.

AFAICT he is saying that good research jobs *are* cushy (which they should be - it's important to reward competent researchers) but that we don't reward good research properly.

Working 2 years and producing a strong negative result is good science, but it doesn't get you published in a good journal. So, when you embark on a two-year project as a postdoc to test a hypothesis and get a negative result, what do you do? Get another postdoc, and be severely underpaid for another 2 years? Leave science altogether? Or fabricate results? None of those are good options for a good researcher and, until we as a society start rewarding people for good science and not just exciting results, we will continue to have people inflating the excitement of their work.

As far as your analogy goes, I think it would be better to say that someone works 2, 5, 10 years to develop a *profitable* business and then is kicked out on the street when someone else develops a less profitable one. Does that happen? Probably, but I'll bet that it's pretty rare.

Have you got any idea how difficult it is to refute an experimental outcome, at least in the less exact sciences? It's not only that you can create a gazillion possible deviations between your set-up and the one from the article (making direct comparison difficult), you will also need to run it with a pretty large subject group if you want to have enough power (making it expensive and time consuming), and then you're going to have problems publishing your article (reviewers and editors don't like null effects). In short, there is no profit in it. Most people, and researchers are people, are in it for the money, prestige, whatever, and replicating a study generally doesn't get you funding, prestige, publications. So guess what happens? The world, at least the part that does experimental psychology, gets stuck with 90% junk publications. And that's being conservative.
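To put rough numbers on the "pretty large subject group" point: under the usual normal approximation, the sample size a replication needs per group grows with the inverse square of the effect size, which is why replicating modest effects gets expensive fast. A minimal sketch (the effect sizes, alpha, and power targets below are illustrative defaults, not figures from this thread):

```python
from math import ceil
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Normal-approximation sample size for a two-sided two-sample test.

    effect_size is Cohen's d; returns subjects needed in *each* group.
    """
    z = NormalDist().inv_cdf
    z_alpha = z(1 - alpha / 2)  # critical value for the two-sided test
    z_beta = z(power)           # quantile matching the desired power
    # n per group = 2 * (z_alpha + z_beta)^2 / d^2
    return ceil(2 * (z_alpha + z_beta) ** 2 / effect_size ** 2)

# A "small" effect (d = 0.3) needs ~175 subjects per group at 80% power;
# even a "medium" effect (d = 0.5) still needs ~63 per group.
print(n_per_group(0.3))  # 175
print(n_per_group(0.5))  # 63
```

Halving the effect size quadruples the required sample, so a sloppy original study reporting an inflated effect can make an honest replication look "underpowered" by comparison.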

The problem with a career in science is that it is like a career in acting. Sure, there are the superstars at the top who are doing extremely well for themselves, but pretty much everyone else is struggling just to feed their families.

You've got chemists coming out of the gate making almost 70k a year, moving up to 120k a year as their career progresses.

You're right that certain applied scientists do OK financially - not as well as medical doctors or lawyers but enough to feed their families. The "geologist" salaries you linked to were for petroleum geologists. The "chemist" salaries you linked to showed large variation. For example, the chemistry post-doc salaries were down at $40,000. It's also worth noting that many of the "chemist" jobs (particularly the high paying ones) were almost certainly primarily management jobs.

I'll agree that a few scientists are doing very well for themselves financially and that certain other classes of scientists are doing OK financially (particularly those working in applied science and in management positions). What you'll find, though, is that the scientists who are trying to make a career out of actual basic science research are far from financially secure.

It may even be that at some level we agree. If you were to claim that PhD scientists (even those doing basic science) should earn a minimum salary of $70K per year then I would say, sure, problem solved. As it is, though, I personally know plenty of talented hard-working PhD scientists making only about half that ($30,000-$40,000 per year).

Maybe $35K is a fair salary for a PhD scientist and maybe it's not - but young people considering a career in science need to be aware of the reality that most hard-working PhD scientists are only earning $30K-$40K per year.

I don't think you understand the reasoning behind the welfare system, nor the word 'welfare' itself.

To address your post: Welfare is an imperfect system. Most systems are. The questions that are actually useful to consider are:

1) Is it doing more harm than good? Will that change?
2) Is it reformable? Does it need constant reform, or does it tend to get better over time?
3) How fair are the current policies? Are they more unfair to some parties than others?

Etc., etc.

I'm sure you're just trolling, and I guess I'm falling for it, but I hear this crap repeated way too much. Think before you type -- the Internet allows mature adults to act like children, but by the same token allows anyone to act like a mature adult... (why do more people not take advantage of that?!)

The cracks I see in the welfare system remind me greatly of the eugenics policies of the early 20th century. Modern medical research has indicated that synaptic plasticity is far more common than previously thought, and very important for long-term, measurable traits (such as IQ). The data that support this claim put eugenics in the trashcan where it should have gone long before; unfortunately, a lot of people were sterilized or killed in the interim...

Look at the biological sciences under the Soviets. Or, for a far less sinister example, look at the luminiferous ether. It was accepted as fact for a long time that light needed a medium to flow through as a wave, until someone finally did the research and the experiment to "prove" the fact and disproved it instead...