Posted
by
timothy
on Monday August 27, 2012 @07:02AM
from the leaven-is-the-best-part dept.

Hugh Pickens writes "Biologist David P. Barash writes in the LA Times that as a scientist he has been participating in a deception for more than four decades — a benevolent and well-intentioned deception, but a deception nonetheless. 'When scientists speak to the public or to students, we talk about what we know, what science has discovered,' writes Barash. 'After all, we work hard deciphering nature's secrets and we're proud whenever we succeed. But it gives the false impression that we know pretty much everything, whereas the reality is that there's a whole lot more that we don't know.' Teaching and writing only about what is known risks turning science into a mere catalog of established facts, suggesting that 'knowing' science is a matter of memorizing, says Barash. 'It is time, therefore, to start teaching courses, giving lectures and writing books about what we don't know about biology, chemistry, geology, physics, mathematics.' Barash isn't talking about the obvious unknowns, such as 'Is there life on other planets?' Looking just at his field, evolutionary biology, the unknowns are immense: How widespread are nonadaptive traits? To what extent does evolution proceed by very small, gradual steps versus larger, quantum jumps? What is the purpose of all that 'junk DNA'? Did human beings evolve from a single lineage, or many times, independently? Why does homosexuality persist? According to Barash, scientists need to keep celebrating and transmitting what they know, but they also need to keep their eyes on what science doesn't know if the scientific enterprise is to continue attracting new adherents who will keep pushing the envelope of our knowledge rather than resting satisfied within its cozy boundaries."

I guess I must have gone to a fundamentally different kind of college. Nearly every single professor I encountered wasn't excited about what was already known in their respective field but got disturbingly excited about untestable theories, suspected areas of interest and tantalizingly unknowable facts. My computer science professors would treat P=NP in an almost religious fashion -- treating that solution like the face of god. Sometimes it was just a numbers game, like natural language parsing and part-of-speech tagging: here's the best accuracy to date, can you beat it? Ask my physics professors about entropy in space or, worse, string theory and they'd shortly be speaking in tongues. My philosophy instructors, even, loved to ask questions that had no clear answer: would you murder one person to save thousands? Why did Charles-Henri Sanson, the executioner of 3,000 lives in Paris, survive the revolution, and what were the moral implications of his executing his former boss, the king?

And that sort of makes sense to me, because what are you going to publish about if your field is dead? What is going to drive you to keep studying your field if it's a dead field? I will say I don't remember many exciting things coming out of my advanced math courses. I know the field isn't dead, but my instructors there were abysmal. Even the statistics professor had more fire. And I think the reason is that math is a very deep field, with so many before us who have pushed it so far. In order to make original progress, it appears to me that you almost have to become a hermit. You've got to become some sort of phantasmal waif like the great Grigori Perelman.

And I think that's the essence of where this article becomes misaligned. The author is complaining about learning by rote, but there are few other ways to accelerate young minds quickly up to the modern frontier of each field. I feel polymaths become much rarer as each field deepens in knowledge, and that's because the fields are all rapidly becoming very deep rabbit holes (like mathematics). For me, grade school and high school contained the teachers this guy is complaining about, and that's because they had no choice. I wasn't ready for the real questions that remain when I was learning about derivatives and integrals in high school. I probably would not have comprehended P=NP very well at that time, let alone the proof of the Poincaré conjecture.

It is time, therefore, to start teaching courses, giving lectures and writing books about what we don't know about biology, chemistry, geology, physics, mathematics.

I think there's a healthy balance: if you're teaching what you don't know, then what could the students possibly be learning? Instead, I think teaching by rote and by example what we do know, while using what we don't know as a carrot, is the best methodology. If you can make your students excited about the unknown possibilities while at the same time conveying the boring, known, but pragmatic information, then you hit that sweet spot of teaching at a college level.

As to the particular field discussed in the article: Yeah, evolutionary biology is a relatively young field with a lot to be learned yet. I realized only a fraction of what I don't know when I read and reviewed The Logic of Chance [slashdot.org].

I guess I must have gone to a fundamentally different kind of college. Nearly every single professor I encountered wasn't excited about what was already known in their respective field but got disturbingly excited about untestable theories, suspected areas of interest and tantalizingly unknowable facts. My computer science professors would treat P=NP in an almost religious fashion -- treating that solution like the face of god. Sometimes it was just a numbers game, like natural language parsing and part-of-speech tagging: here's the best accuracy to date, can you beat it? Ask my physics professors about entropy in space or, worse, string theory and they'd shortly be speaking in tongues. My philosophy instructors, even, loved to ask questions that had no clear answer: would you murder one person to save thousands? Why did Charles-Henri Sanson, the executioner of 3,000 lives in Paris, survive the revolution, and what were the moral implications of his executing his former boss, the king?

The tasks of most professors I met were reduced to management stuff. They only appear as authors on papers because of things they did as postdocs, or because they want to be added to a student's paper (to get their citation counts up). They had more up-to-date knowledge about the issues of the faculty's politics and the mechanical problems of the coffee machine than their (former) field.

They had more up-to-date knowledge about the issues of the faculty's politics and the mechanical problems of the coffee machine than their (former) field.

Oh, I don't know if it's that bad. To the best of my knowledge, I'm the only person I've ever met who always asked any post-secondary educator about their PhD dissertation. Two observations:

1) On topic, virtually all of them spent the last 10% of their discussion talking about very recent work in the field. Apparently my favorite calc teacher tells people he takes credit for inventing how pretty much every kid learned algebraic equation multiplication in the '80s, based on an enormous number of teaching experiments and lots of early computer-based statistical analysis, but that was superseded by a more recent fad / trend / research around 1990, blah blah blah. I never fact-checked these people, but even on something now irrelevant to them, they pretty much all keep up with their old field.

2) Off topic, at least a small percentage of PhDs are achieved on a non-dissertation track. Maybe 5% of my PhD-level instructors talked about submitting a large quantity of research papers with their name on them. Maybe it's luck, dunno, but this seemed more prevalent outside the hard sciences. My pre-Civil War history prof got his PhD based on lots and lots of published research papers (some fairly interesting-sounding historical economic analysis of England, or something very similar to this story), but he claimed he never wrote "a" dissertation; he just turned in stacks of research papers and did his written and oral exams.

TL;DR: if you think your prof is clueless about modern research, motivate your prof by asking about their PhD dissertation, and you'll probably get a pretty interesting speech about modern developments in the field, both during and since the dissertation.

I don't think this is all that surprising... J. Random Luser walks up to me and asks what's new in the modern world of computing, and I'll probably tell them to F off, I'm busy; but if they have a good conversation starter about something from my past, maybe we'll have an interesting discussion instead.

In the sciences, the dissertation is typically published. Each chapter is usually a separate paper. In fact it's common to take all your published (or at least submitted) research papers and staple them together (or combine into a single LaTeX file), and call that the dissertation.

It is time, therefore, to start teaching courses, giving lectures and writing books about what we don't know about biology, chemistry, geology, physics, mathematics.

I think there's a healthy balance: if you're teaching what you don't know, then what could the students possibly be learning? Instead, I think teaching by rote and by example what we do know, while using what we don't know as a carrot, is the best methodology.

I think problem solving and deductive reasoning should be the primary things taught in school. In Japan, many lessons start with a question to answer or a problem to solve that the students don't yet know how to handle. The students are put to the task of coming up with a solution or finding an answer in whatever way they think best. Then the teacher presents the established answer or solution and discusses how the students' own attempts compare and contrast with the known method. Doing so reveals things such as mathematical principles as obvious, not mysterious, and gives young minds the tools to go forth and explore.

I wish my schooling had been like that in the USA. When I was 10 I was creating a 2D vector-graphics space game in BASIC (moveto, lineto, rotate). I only understood linear equations, but I needed to find the angle from one ship to the other so the CPU player could turn towards the player's ship. I understood slopes, and made a drawing of line slopes and their corresponding angles. I spent the rest of the summer inventing trigonometry. There were sin() and cos() functions, but their documentation didn't explain what they were used for -- I ended up making my own slopeAngle() program.
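The slopeAngle() problem described above is what atan2 solves today. Here's a minimal sketch (the function names are made up for illustration, and this is modern Python rather than the original BASIC) of how a CPU ship might find the angle toward the player and decide which way to turn:

```python
import math

def angle_to_target(x1, y1, x2, y2):
    """Angle in degrees from point 1 to point 2, measured
    counterclockwise from the positive x-axis."""
    return math.degrees(math.atan2(y2 - y1, x2 - x1))

def turn_toward(heading, target):
    """Shortest signed turn in degrees to face the target angle:
    positive means turn counterclockwise, negative clockwise."""
    return (target - heading + 180) % 360 - 180

# CPU ship at the origin, player at (10, 10): the bearing is
# 45 degrees, so a ship currently heading 90 degrees turns clockwise.
bearing = angle_to_target(0, 0, 10, 10)
print(bearing, turn_toward(90, bearing))
```

The `% 360 - 180` trick is what keeps the AI from spinning the long way around the circle, the kind of thing that's easy to get wrong when reinventing trig from a table of slopes.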

The next school year was more long division and ratios... When I presented my 3D distance equations, and what I would soon learn were proofs of the Pythagorean Theorem, to my mathematics teacher, she was unimpressed. "You'll learn about Trigonometry in high school," she said. That was the keyword I needed to continue my education; I soon discovered calculus at my local library. When we did start learning trig, I was just as unimpressed with the "geniuses" of old as my math teacher had been with me. I found it odd that these old dead bastards were so highly praised for what would be obvious to any 10-year-old.

I dropped out of high school as soon as was legally possible and started a career in software development. "School" was utterly useless to me, and college remains even more so: it would cost so much just to prove that I know what I know, and would waste so much time in the proving... I would be forever in debt. My customers like results; they couldn't care less about my mental upbringing, only my experience and accomplishments. We should do away with "final exams" and instead place "entrance exams" at job entry points, thus freeing our minds to learn however we think best without punishing us for doing so.

YOU may not have been ready for P != NP or the Poincaré conjecture, but why should your slower development be a limiting factor for others?! I've been using unit-sphere quaternions and integration for near-polynomial-time inverse kinematics since junior high school -- I'm not bragging; I don't feel superior at all. I'm just trying to drill it in that everyone develops at different rates, and the current establishment completely ignores this to the detriment of our race.
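For readers wondering what unit-sphere quaternions actually buy you, here is a small self-contained sketch (not the poster's IK code; the function names are hypothetical) of the core operation underneath quaternion-based kinematics: rotating a vector by a unit quaternion without ever building a rotation matrix.

```python
import math

def quat_from_axis_angle(axis, angle):
    """Unit quaternion (w, x, y, z) for a rotation of `angle`
    radians about the unit vector `axis`."""
    ax, ay, az = axis
    s = math.sin(angle / 2)
    return (math.cos(angle / 2), ax * s, ay * s, az * s)

def rotate(q, v):
    """Rotate vector v by unit quaternion q (computes q v q^-1
    using the expanded cross-product form)."""
    w, x, y, z = q
    vx, vy, vz = v
    # t = 2 * (q_vec cross v)
    tx = 2 * (y * vz - z * vy)
    ty = 2 * (z * vx - x * vz)
    tz = 2 * (x * vy - y * vx)
    # v' = v + w * t + (q_vec cross t)
    return (vx + w * tx + y * tz - z * ty,
            vy + w * ty + z * tx - x * tz,
            vz + w * tz + x * ty - y * tx)

# A 90-degree turn about the z-axis sends (1, 0, 0) to (0, 1, 0).
q = quat_from_axis_angle((0, 0, 1), math.pi / 2)
print(rotate(q, (1, 0, 0)))
```

Because unit quaternions live on the 4D unit sphere, interpolating between joint orientations stays well-behaved where Euler angles would gimbal-lock, which is why IK solvers favor them.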

I think he has a point where the public is concerned - except for a few gifted science popularizers like Sagan and deGrasse Tyson, public exposure to science seems to be too heavily tilted to the "look what we found" side, and it's ruined public perception. Even on Slashdot science stories inevitably have several "wake me up when I can buy it at the corner store" comments.

Teaching can be done without pouring facts into kids' heads too. The best teachers I had would teach well known concepts by first posin

What THE FUCK could possibly be the point of the Poincaré conjecture anyway? From a pathfinding point of view it seems obvious that any ring can be contracted to a point in any sphere-like object. Could you explain it in terms a software engineer could understand?

The author is complaining about learning by rote, but there are few other ways to accelerate young minds quickly up to the modern frontier of each field.

But that's just it: you've done nothing for them if all they have done is learn by rote. They won't understand a thing, and everything you taught them will be easily forgettable. You do a disservice to people by making everything boring and assuming that they can't truly understand it.

Okay well somebody modded you up so let's take the example from the article:

In my first college-level biology course, I was required to memorize all of the digestive enzymes and what they do. Even today, I can't stomach those darned chemicals, and I fear the situation is scarcely much better at most universities today.

I'm not a biologist, but here's how I'd teach this: 1) here's the methodology and a brief history of how these enzymes were found; 2) here's the list of all the known enzymes and their functions; 3) this is why we suspect there might be more we don't know about, or why we suspect we have discovered all of them. (Keep in mind I have no idea which of those is reality.)

So you teach that to the class, and you tell them that they will be expected to know the full list of enzymes from number two. Okay, so how do you propose we teach them that? Give them a cow's stomach and tell them to get to work? I mean, at the end of the day you only have so much time, and you cannot give the students the opportunity to discover in a class period what took some well-funded researchers many man-months. You're best off giving them these enzymes "by rote" and, should they want more information, letting them approach you about it outside of class.

I'm more comfortable speaking about computer science, so a comparison might be telling students about the evolution of memory-management systems in operating systems "by rote" instead of forcing them to code each iteration of what Unix, Minix, Solaris, Linux, Windows 1, etc. did to manage memory or schedule threads. There's only so much time, and while this information is valuable in some contexts, it's not as valuable as being able to move forward to the more pragmatic frontiers of the field in question.

I'm totally open to hearing how you think biology is supposed to teach enzymes. A lot of memorizing and teaching by rote in biology has to do with just coming to agreement on what you're going to call the bones of the body, or tissues in the body, or fragments of the skull, or whatever you want to agree on in your area of focus. How do you make naming the bones of the human body fun, and then expect students who came up with their own "better" name for the metatarsals to read a paper on metatarsals and know that that's what the paper is talking about?

My problem with that is that we're still working on the assumption that you need to memorize those enzymes.

Why?

For the vast majority of people taking general biology classes, knowing those by heart won't be of any use. Furthermore, for just about everyone, they'll be forgotten hours after the test.

TFA is right for some courses: they're becoming memorization courses. Sciences where there are a lot of things to recall, like chemistry or biology, seem particularly affected, and I think it's the premise that's wrong, not merely the execution. To give an example, in one of my college chemistry courses we had to remember the orbitals of the hydrogen atom (1s, 2s, 2p, 3s, 3p, 3d, etc.). Now this isn't a particularly hard thing to memorize, but we didn't have much context for it. It was merely "these are the orbitals," and you'd need to regurgitate them on a test. Later, I went through numerous physics courses and those orbitals naturally popped up. We were never asked to memorize them, but we did, because we actually needed them. We understood what the symbols meant and had to use them to get the answer.
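The irony of that example is that the orbital list isn't even arbitrary: it falls out of one rule, that the angular momentum number l runs from 0 to n-1 for each principal quantum number n. Once you know the rule, there's nothing to memorize. A small illustrative sketch:

```python
def orbitals(n_max):
    """List subshell labels up to principal quantum number n_max,
    using the rule that l runs from 0 to n-1 for each n."""
    letters = "spdfghi"  # spectroscopic letters for l = 0, 1, 2, ...
    return [f"{n}{letters[l]}" for n in range(1, n_max + 1)
            for l in range(n)]

print(orbitals(3))  # → ['1s', '2s', '2p', '3s', '3p', '3d']
```

Which is exactly the point being made: given the underlying structure, the "list to memorize" regenerates itself on demand.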

So I say: have students solve problems. If something needs to be memorized along the way, it will be, and probably far more efficiently and in a far more durable fashion than if the question were strictly about memorization.

My dad used to teach college-level general chem to students, including medical students.

Every year he would get at least one med student (soon to be former med student) who could not balance a redox equation to save his/her life. Each of these students had somehow gotten an A in high school chemistry (or they would not be in this medical school).

Each of these morons would demand they get an A. They never got one. They were all very good at memorizing. Hence: MD = Memorized Degree.

He is quite proud that these idiots are not physicians. One absolute, concrete product of his years of work.

Conceptually you are correct. The problem is in the ease and efficacy of handling certain kinds of thoughts. Sudoku is just pattern matching. But until you begin to develop the ability to see perhaps a dozen standard patterns, there are puzzles which are really hard. Until you're burning in the synapses responsible for doing the heavy intellectual lifting, understanding the concept(s) and applying them with grace and velocity are two different things. There's a story of a student posing a question to a Math

Question: Are there real situations in the field of biology where a non-specialist would significantly benefit from having all the enzymes memorized? Understand that if it's your research field, you should probably know them. And I understand that you should probably know where to look them up and how to interpret data about them. But is it really and absolutely necessary to have all the information available to memory? If it isn't, then who cares how you teach the list of enzymes? And if it is, I'

So you teach that to the class and you tell them that they will be expected to know the full list of enzymes from number two.

Why would you expect your students to know a list of arbitrary names ascribed to chemicals used in digestion? How could memorizing such a list possibly be important or useful to students? Requiring that makes no sense at all. People can look up trivial details like that on their own, should they need them. The vast majority (dare I say all of it!) is immediately forgotten by studen

Because what I did in college was write software which among other things was used to model digestive processes. As I recall, the medium sized model was a fairly non-sparse system of more than 200 differential equations, mostly linear but with a few key highly nonlinear terms. And in case it isn't obvious, 200+ equations translates into a hell of a lot more chemicals involved than you could reasonably memorize.

Difficulties of solving the thing aside (the system turns out to be stiffer than you'd expect), th
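For readers who haven't hit stiffness before, here is a toy stand-in (a made-up two-equation system, nothing like the 200-equation digestion model above) showing why stiff systems push you toward implicit solvers like backward Euler:

```python
# Two coupled linear rate equations y' = A y with
# A = [[-1000, 0], [1, -1]]: one fast eigenvalue (-1000) and one
# slow (-1), the classic signature of stiffness. Backward Euler
# stays stable at step sizes that would make an explicit method
# (stable only for h below roughly 2/1000) blow up.

def backward_euler_step(y, h):
    """One implicit step, solving (I - h A) y_new = y by hand."""
    a, b = y
    # Row 1: (1 + 1000 h) a_new = a
    a_new = a / (1 + 1000 * h)
    # Row 2: -h a_new + (1 + h) b_new = b
    b_new = (b + h * a_new) / (1 + h)
    return (a_new, b_new)

y = (1.0, 0.0)
h = 0.1  # ~50x larger than the explicit stability limit
for _ in range(50):
    y = backward_euler_step(y, h)
print(y)  # both components decay toward zero; no blow-up
```

At 200+ equations you would of course use a library stiff solver rather than hand-solving the implicit system, but the principle is the same: the fast chemistry dictates the step size unless the method is implicit.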

The Socratic Method actually requires a good bit of that "lowly rote learning" that people like to be so dismissive of around here. It's a necessary prerequisite so that you know what everyone is talking about.

It's not glamorous but you can't skip lifting your head, rolling over, learning how to crawl and then how to walk.

As my geology prof exclaimed when the class complained about the amount of memorization required: "welcome to college."

Memorization is probably 50% of the work. The other 50% is knowing how to learn. I never learned to do homework until my third year in college. I learned it because of Dr. Tripp's analytical mechanics class: "Lambert, problem 4, up on the board".

One of the worst things about most educational formalism is the unwritten rules that let smart people get away without having to learn how to learn, or at least work to a schedule. The one thing that was the same for all my classes up to that point was that you

Actually, some people do skip learning to crawl. I found out recently that I did. But point taken! I think even those who skip rote learning from lists would eventually pick up all the terminology that the roters use anyway.

Actually I don't really think there is any point forcing kids to memorise stuff that they'd be able to look up in a textbook in the real world. Focus on teaching them how to analyse problems and then look up the information they need, rather than trying to know everything.

I didn't say that memory isn't useful, just that there's no need to fill it up with rote learning of digestive enzymes when your desired field of study is Computer Science for example. When I don't know something, I look it up. I remember things that interest me more easily than if I have to force myself to read through a textbook.

I'm not trying to say that people should try not to remember things either. I think it will be easier to absorb relevant information if you're not also trying to cram with irrelev

The Socratic method also requires that we learn numerous fields of study. This is something else often complained about by many. In my ever so humble (hah!) opinion, the real meaning of that requirement is lost today. For example few in college study Philosophy, Logic, Symbolic Logic, Rhetoric, and Ethics. Most people take "Humanities" as their Liberal Arts objective, which teaches no real skills but is glorified "Social Studies". People whine about taking Communications and Algebra if it's not related

Teaching and writing only about what is known risks turning science into a mere catalog of established facts,

Science is about explaining things, not cataloging facts. If the guy thinks that the facts are the important bit, he's lost his way somewhere. Facts are the questions, theories are their answers, and "science" is really the process of creating theories and disproving them. Hopefully replacing old theories with better or more refined ones. It's not about being able to recite the properties of a given thing, person or animal (those can be looked up).

People think science tells them what is "true" or "false" or "real" or "unreal". This is my biggest beef with pop skeptics.

The notion that science can "prove" something is an 18th century conceit that does not have much currency among scientists today. We have models that seem to be supported by observation and we find them useful and we have models that are not supported by observation and we (hopefully) discard those to a shoebox which someone will someday open to write a book about the ridiculous things scientists once said.

I get this all the time regarding what pop skeptics would call "woo", such as Qi Gong or the concept of Qi. I try to explain that it's just a model, a way of describing something, and one that has held up pretty well to observation (yin and yang, the way a diagram of the channels and vessels of Qi is amazingly similar to the nervous and circulatory system). OK, it's a philosophical model, rather than an engineering model, but a model all the same.

Hopefully replacing old theories with better or more refined ones.

Models have different purposes. For the purposes of neurosurgery, the model of the circulation of Qi in the body is insufficient. For the purpose of maintaining and promoting health, martial arts, etc, the model of circulation of Qi is appropriate, precise, extremely useful.

Science is a funny thing. I occasionally play music with a guy who's been part of the Committee on the Conceptual Foundations of Science at the Univ of Chicago and he's a bona fide scientist. His view of "science" is very surprising, very...mutable. I find that the further up the food chain in Physics, in Math, you go, the less you'll find pop skeptics. The less you'll find the concept of "real".

I get this all the time regarding what pop skeptics would call "woo", such as Qi Gong or the concept of Qi. I try to explain that it's just a model, a way of describing something, and one that has held up pretty well to observation (yin and yang, the way a diagram of the channels and vessels of Qi is amazingly similar to the nervous and circulatory system). OK, it's a philosophical model, rather than an engineering model, but a model all the same.

Except that it isn't. Qi is presented by those selling it as ancient fundamental truth, not some model of truth.

I think the key word there is "selling" rather than "Qi". Anytime anyone is selling an idea (whether they want your money, your faith, or whatever) they're probably going to present it as Truth. If you've only encountered Qi through salesmen, then I'd say you've never really encountered Qi.

I think the key word there is "selling" rather than "Qi". Anytime anyone is selling an idea (whether they want your money, your faith, or whatever) they're probably going to present it as Truth. If you've only encountered Qi through salesmen, then I'd say you've never really encountered Qi.

Except that your parent is completely right and you are completely wrong.

You are not a Qi scientist, so you don't qualify anyway; however, it would be interesting to hear how you came to your opinion. If you had ever done a ten-minute qi gong course, or tai chi or kung fu, you would realize: oh, that is funny, that is what they call qi. Interesting.

There is no one "selling an ancient fundamental truth". It is only you who does not believe (because you never tried), just like people did not believe the sun is the center

This was largely my experience up through high school. Science was taught as a body of facts, and less so taught as a process. When process was mentioned, it was taught as THE scientific method...which is not exactly how research is done! The whole body-of-facts approach makes it boring to most people.

Beginning in undergraduate courses, it was somewhat better. Mainly, the beginning undergraduate courses were all about getting one up to date on a few centuries of research, and there just wasn't time to discuss the frontiers of the field. Really good teachers made time for it, and stressed that there is much more to be learned. I don't think any graduate-school science course, at least among the physics ones I've taken, has treated the field that way. The underlying assumption was that there is much more to be learned. But that's why there is graduate school.

This was largely my experience up through high school. Science was taught as a body of facts, and less so taught as a process.

I'd agree with you but give it a different spin, in that all education begins with little kids being taught the virtues of authoritarianism and a class-based society (the classes being educational achievement which equals authoritarianism of course). Eventually branching out.

It doesn't help that an education major probably doesn't know much about science anyway. Odd how you can separate our educational system along the lines of "this group has teachers with education majors," who are almost universally failing, and "this group has teachers with a major in the field they're teaching," who are almost universally the shining jewel of the entire world's educational system. I don't know anything about the French language; I'm sure I could study educational techniques and eventually do an awful job of teaching to the test, but my students would probably end up with some pretty bizarre ideas about the French language by the time I'm done with them.

/. has a subculture of science types, but the mainstream /.er is a comp-sci type. So who out there started their comp-sci path with a learned debate about the virtues of functional vs. object vs. procedural programming, or the virtues of ye olde waterfall vs. agile strategies, or the beauty of the Codd normal forms of database design, versus the vast majority who probably started their comp-sci educational adventure with "don't copy that floppy" and "I shall teach you the one true language, because it's the only one I know"?

I'll go out on a limb where I don't really know anything (as you can see by my writing) and make the wild bet that English majors started their educational path with hyper-authoritarian vocabulary lists, weekly spelling lists, and those PITA grammar flow charts (what is the technical term for where you break a single line of prose into a very strictly formatted flowchart / graph thing? I hated drawing those Fing things, so I'm blanking it out). Then, after 15 to 20 years of indoctrination / training to get a job / or even, god forbid, education to teach them how to think, the students are allowed to debate the relative ranking of metaphors in Beowulf, or whatever it is English lit majors do all day?

If a silo of hard-science geeks only hang out with other hard-science geeks and try to reverse engineer the 1800s-era educational system they grew up in thru observation and analysis, they're going to come up with whoppers like this, such as thinking only evolutionary biologists are taught this way and surely the English lit majors are taught some other way.

Hm... that's quite different from how I remember my learning days. There was always the authoritative learning, but the topics of this type of learning were about expectations from others: to say "please" and "thank you", not to eat too many sweets, and when to stay quiet for a moment. And then there was the learning about the nature of things: how the blade of a knife is sharp, and thus it might be better to act carefully when wielding a knife. That a sky full of dark clouds is a warning about rain, and thu

I recall being told, in high school, that one's body was made up of cells. That made me uncomfortable because it simply made no sense. I could not have said why it made no sense to me. I just figured I was incapable of understanding. All anyone needed to say was 'Though the body is made up of fluids, non-cell tissue and cells, we will focus on the role of cells here.' Now we know all those symbiotic organisms in our body are crucial too, but 50 years ago nobody knew that. So I never got interested in biolo

When process was mentioned, it was taught as THE scientific method...which is not exactly how research is done!

This. And it's a misconception that persists among a surprisingly large number of very smart people, well into adulthood. How many /. arguments have we seen in which people casually dismiss rigorous, well-founded scientific results because the process by which those results were produced doesn't fall into their high-school-science-class idea of how "the scientific method" works? It's very comforting to think that there is a single, fixed process which scientists in all fields can follow, and if they fill

As a physicist, I would like to read a book on why people outside the field consistently refer to large things as quantum. It means 'the smallest discrete amount possible,' not large, composite chunks.

Regarding the article, science would be more honest about research if we emphasized what we don't know and what we're doing to learn new things in the field. Also, I might emphasize how science has changed, so students can see that the taxonomy charts they are filling out had less useful predecessors (kind of like making your C++ class learn how to type "Hello World" in Assembly or Fortran halfway through the year).

As a physicist, I would like to read a book on why people outside the field consistently refer to large things as quantum. It means 'the smallest discrete amount possible,' not large, composite chunks.

I believe (although I'm not an etymologist) that the source of your frustration is the irksome fact that Scott Bakula [wikipedia.org] is better known in American households than Max Planck.

Regarding the article, science would be more honest about research if we emphasized what we don't know and what we're doing to learn new things in the field. Also, I might emphasize how science has changed, so students can see that the taxonomy charts they are filling out had less useful predecessors (kind of like making your C++ class learn how to type "Hello World" in Assembly or Fortran halfway through the year).

I think the key problem is that there's only so much time. Why did you pick Assembly or Fortran? Why not force computer science students to start out on punch cards or a PDP-6? In physics, better models have been developed, and while I learned of the less correct models (like combining the Rutherford and Bohr models), we never truly delved into their original states or why their failings drove them to something better. I think that's great stuff to preserve, but ultimately, when you're teaching high school physics, there's just not enough time and students only retain so much. So I think sometimes we're forced to teach it by rote rather than as a process or journey that the student embarks upon.

I think the key problem is that there's only so much time. Why did you pick Assembly or Fortran? Why not force computer science students to start out on punch cards or a PDP-6?

I think the GP gave a bad example, because C++, Assembly, and Fortran are engineering products, not discoveries. The focus shouldn't be the language, but the paradigms (like functional, procedural, or object-oriented). And yes, all of these should be taught.

I also have a bone to pick with your punch card comparison. You are implying that since we have modern technology we shouldn't look at the basics. No, I don't think we need to learn punch cards anymore. But I do think that anyone with a CS degree t

"Why did you pick Assembly or Fortran? Why not force computer science students to start out on punch cards or a PDP-6?"

Wires. Make them use wires. My CS profs did. Once you made a gate using transistors you could use IC gates and you appreciated them. Once you made an adder out of gates you could use ALUs. Once you made a processor out of ALUs and gates, you could use processors. Once you programmed your processor using machine code entered with DIP switches you could use machine code. Once you wrote
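The ladder the parent describes (transistors → gates → adders → ALUs → processors) can be sketched in a few lines of code. This is an illustrative simulation only, not anything from the original comment: every gate below is built from a single NAND primitive, mirroring the "build it once, then earn the right to use the abstraction" progression.

```python
# Build everything from one primitive, NAND, just as the parent's course
# built everything up from transistors.

def nand(a, b):
    return 0 if (a and b) else 1

# Gates derived from NAND alone.
def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor(a, b):  return and_(or_(a, b), nand(a, b))

def full_adder(a, b, carry_in):
    """One-bit full adder assembled from the gates above."""
    partial = xor(a, b)
    total = xor(partial, carry_in)
    carry_out = or_(and_(a, b), and_(partial, carry_in))
    return total, carry_out

def ripple_add(x, y, bits=4):
    """Ripple-carry adder: chain full adders bit by bit, as ALUs do."""
    carry, result = 0, 0
    for i in range(bits):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result

print(ripple_add(5, 6))  # 11
```

Once you have written `ripple_add` by hand, using a built-in `+` ever after feels like the earned abstraction the parent is describing.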

The earlier models that survive do so because they are useful. Newtonian mechanics is simpler than general relativity and is highly accurate for many applications. There are other, obsolete and obscure models which we only hear about in passing (e.g. phlogiston, aether, Lamarckian evolution, miasma, classical elements) since they aren't as useful.

As a physicist, I would like to read a book on why people outside the field consistently refer to large things as quantum. It means 'the smallest discrete amount possible,' not large, composite chunks.

It's not about size. It's used to denote a sudden, discontinuous transition as opposed to a continuous one, like the "jump" of an electron from one energy level to the next.

FWIW I was thinking the same as you until this was explained to me, and as with all sayings it's not always used appropriately.
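To make the discreteness concrete (a standard textbook example, not from the original comment): in the Bohr model of hydrogen, the allowed electron energies form a ladder rather than a continuum,

```latex
E_n = -\frac{13.6\ \text{eV}}{n^2}, \qquad n = 1, 2, 3, \ldots
```

so a "quantum jump" from $n = 2$ to $n = 1$ releases exactly $E_2 - E_1 = 13.6\,(1 - \tfrac{1}{4}) \approx 10.2$ eV, with no intermediate values possible. That all-or-nothing character, not the size of the step, is what the word is pointing at.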

As a physicist, I would like to read a book on why people outside the field consistently refer to large things as quantum. It means 'the smallest discrete amount possible,' not large, composite chunks.

Thanks for bringing that up. Would you agree with me that if you insist on twisting up what quantum physics is, a better way of twisting it up than saying it's the biggest possible change (which is ridiculously wrong) would be to say quantum physics is a way to deal with what amounts to negative probabilities, and also with stuff that has underlying rules that are more complicated and less random than you'd expect at the (non-quantum) level? (I guess I'm aiming more at, for example, the shapes of ele

I think we are getting a little too detailed for "the ideal one-line popular-science explanation of quantum mechanics".

For now I'm sticking with something like this one-liner: "all kinds of small stuff is not smooth like a ramp, it's surprisingly stair-steppy, and that leads to behavior that would look pretty weird at large scale, like what amounts to something like negative probabilities, which leads to using complex numbers for amplitudes, which leads eventually to all kinds of complicated math".

As a physicist, I would like to read a book on why people outside the field consistently refer to large things as quantum. It means 'the smallest discrete amount possible,' not large, composite chunks.

When used properly - as in the evolution example - it refers to a sudden change between two states, without any intermediate steps, like an electron that can only jump between two "orbits" rather than gradually change energy. It may look small to you, buster, but that's one hell of a jump for an electron.

When used in an advert for dishwasher tablets (sad but true) it has the same meaning as "fantastic", "incredible", "ultimate" - i.e. "hey! sucker!"

The phrase (as it referred to stuff outside of physics) originally was used (reasonably accurately) for discrete rather than continuous (or imperceptibly small) changes. I believe it entered the common lexicon from popularizations of the quantum model of bound electrons. This spread to anything about getting from state A to state B without spending much or any time in between.
I agree with the sentiment that it's a bit odd; thinking about it from a physical point of view, it seems that it should refer to a sing

Exactly! "Quantum" has nothing to do with size, but with discreteness. And the physicist grandparent should know about Bose-Einstein condensates, superconductors, superfluids and other big things that display quantum effects.

As a physicist, I would like to read a book on why people outside the field consistently refer to large things as quantum. It means 'the smallest discrete amount possible,' not large, composite chunks.

Maybe it has to do with the way the discovery of quantum mechanics totally changed science (at least physics). I once read an anecdote about Max Planck. When he started studying physics, his professor told him: "It is nice that you are interested in science, young man, but unfortunately you are a bit late. We have uncovered almost everything. All we need to do is fill the last little corners here and there. So there won't be much interesting left for you." (quoted from memory) He couldn't have been more wrong.

>As a physicist, I would like to read a book on why people outside the field consistently refer to large things as quantum

Probably because the word basically just meant "quantity" before physicists decided they wanted to use it to mean "smallest possible amount". Sort of like "force", "power", and "work" were words in English before physicists decided to use them, too.

As a physicist, I would like to read a book on why people outside the field consistently refer to large things as quantum

The same reason they refer to cyber-burglars and cyber-vandals as "hackers". The popular press latched on to "quantum leap" without understanding what the phrase meant, and it has been perverted. Just as "hacker" used to mean "someone who repurposed hardware or wrote quick and dirty code" but has changed to "bit burglar".

I guess it has to do with the fact that quanta are being compared with a continuum (infinitesimal steps), not with bigger things. In that case a quantum, a discrete step change, is more dramatic than a smooth, continuous change.

Ever read a research paper? The stereotypical conclusion is something similar to "if I only had more grant money, the logical next step would be to..." There are exceptions, but like most stereotypes this one is based on a lot of evidence.

In the hard sciences you could do worse than just go to arXiv in your field, cut and paste those "if I only had more money" lines from about 50 articles onto about two pages of paper, and gin up a speech about where the frontier of your field is currently located. It would be a

It's all "what we don't know" which is why it's so neat. I remember the following quote, I just don't remember the source:

"The difference between an old scientific theory and a new one is that the old theory is wrong in more subtle ways."

Science is the process by which we work together to collectively improve our explanations and predictions about the world over time. It's how we develop, test, and explain/record our best guesses. Our current best guesses are likely to be improved in the future (i.e. t

The "unanswered questions" are critical for stimulating interest, but from the standpoint of accurate portrayal of science (the author's main point), what is more important is portraying the evolution of knowledge discovered thus far.

The most glaring example is the periodic table. Bam! There it is. It is knowledge in its most reductionist form. How were the elements separated and identified? Heck, how would you even go about separating elements today? (This would lead into the beginnings of materials science, a subject important for everyday and political life but which much less than 1% of college students touch on, let alone grade school and high school students.)

I was really confused in all my science classes, because I was a Math/CS major. I would have been a lot less confused if someone had explained the philosophy of science -- not just the "scientific method" (and I don't think I even got that explicitly -- labs seemed to be more about showing how bad we were at taking measurements than about the process of discovery), but that the "laws" of physics were merely the best known model of observed phenomena, and that furthermore the models tended to break down at the extremes. I.e., it was never explained to me that science works backwards of math and computer science.

That's one reason I favor classical education for schools. Classical education covers the "great books" from the beginning of recorded human history to the modern era, in chronological order. Mortimer Adler, editor of Great Books of the Western World, called it the "Great Conversation".

A conversation that reveals the evolution of human knowledge is comprehensible, interesting in the way drama is, cross-disciplinary, and leads to holistic and lasting knowledge.

That's one reason I favor classical education for schools. Classical education covers the "great books" from the beginning of recorded human history to the modern era, in chronological order. Mortimer Adler, editor of Great Books of the Western World, called it the "Great Conversation".

A conversation that reveals the evolution of human knowledge is comprehensible, interesting in the way drama is, cross-disciplinary, and leads to holistic and lasting knowledge.

That's pretty much my education; strongly recommended.

You missed mentioning the big problem with that strategy, which is the spectacular impedance jump when you go from modern translations of ancient foreign languages, which are pretty easy reads, to original but very old texts in your own native language (assuming a native English reader). For example, I know from personal experience that a good modern translation of Herodotus makes a hell of a lot more sense than suddenly having a foot of Gibbon dropped in your lap. Gibbon's actually pretty modern compared to Shakespeare. A modern Herodotus is a fun, easy read, but Gibbon is like a part-time job. A modern English translation of Nietzsche is easy vs. John Locke in his 17th-century original glory. You get a twisted view of the past where everything made sense until 1600 or so, then it's all incomprehensible until 1850 or so, very roughly.

So, then why not a translation of Gibbon, etc. into modern language? There's nothing sacred about the fact that the writings can be deciphered without knowing a different language. In fact I'd venture that a direct literal translation of most older writing would for the most part be similarly impenetrable, full of cultural context and usages of language that are no longer relevant.

The problem is there's a long tradition of "translating" Shakespeare into the modern era; in fact about every 5 years we have to suffer through yet another agonizing, utterly awful "Romeo and Juliet, the hip hop years" or "Romeo and Juliet in the 1950s" etc. The pretty accurate stereotype is that a translation from a foreign language, say, Plutarch's Lives, will be done pretty well, but a "translation" from Ye Olde English into modern American English is just going to be awful.

I definitely agree with the article, it's not so much what we know about the universe, but what we don't know that is really interesting.

My biggest wonder is consciousness. What is it? How does it work? If I am conscious, does this mean the universe is conscious? Am I conscious? Is consciousness only available in higher-order complex physical structures (like higher-order mammals), or is it possible in lower-order structures too, like rocks? I have to say that there is not a big effort to solve this question. For me it's the most important question to answer, and the most interesting. Where do you start to answer such a question? Of course many great thinkers have tried to answer it, but at the moment it's little more than just philosophy.

Another interesting question is: How the heck does the universe exist?

Consciousness occurs when a being begins introspection of its own thoughts and allows that thought process to modify its own behaviour. Most life forms that navigate through their environment have learned this trait on some level. Where to draw the dividing line is more of an issue.

As to why the universe exists, that has nothing to do with consciousness. Clearly it existed before humans came into existence to observe anything.

Before you get that far, one must look to Descartes' work (also Aquinas') to know that one exists without question. If you do not know you exist, the question of being conscious is not relevant. If you know you exist, being conscious is just an extrapolation of that same principle.

This is the foundation for critical thought, which in my opinion everyone should be trained in.

Consciousness is an illusion, along with free will. It doesn't actually exist, but your brain perceives the world and the actions you are taking in the context of consciousness.

Hmmm, do you have the scientific papers to back this up? I believe you are just making an assumption, like people used to assume the Earth was flat. I agree that this is currently the easiest explanation, but no-one has actually come up with a reasonable theory for it. It's the "I feeling" that I refer to as consciousness, to my knowledge there is no physical explanation of its mechanism. How does the universe allow for this awareness, the "I feeling"?

While this may sound interesting and logical to someone uneducated, it is simply a nice-sounding fallacy with no scientific backing, one that could be countered by other scientific data. Simply put, you are full of shit.

While numerous studies show that we often make decisions by calculation, there is also a tremendous amount of evidence for things like instinct, precognition, and premonition. We don't understand it, and it's extremely difficult to measure, but it is most definitely real.

I'm a HS science teacher [bio and chem] and he seems out of touch. Sure, he's right that there is a tonne of shit we don't know. Great. We also know there is a tonne of stuff we DO know. I constantly attempt to draw attention to BOTH. My students are regularly attempting to verify the 'what we know' and investigate the 'what we don't'. The latter is always a challenge at the HS level. A constant difficulty is that science 'stands on the shoulders of giants' and therefore to move forward we need to appreciate the past. Again, there is nothing new here. Lastly, I attempt to focus on concepts I HOPE my students move towards mastering. The fact is, many concepts require years of scaffolding, spiraling and application to truly understand. You really think you knew Newton's laws in grade 8 or 9? Memorizing the statements is fine, but applying the concepts to authentic scenarios is challenging. I don't only teach facts; I ATTEMPT to teach a way of thinking and problem solving and wondering and all the other more interesting stuff.

Right, because according to the author, there is no scientific progress at all since everyone is "satisfied". The scientific method works as is. It's up to any decent scientist to review the work of others when postulating a new hypothesis, to see if the question has already been answered by someone. That's part of the steps of the scientific method. That way you automatically find out what is known about a subject. However, teaching "what we don't know" is ridiculous because there are plenty of things that w

There is scientific progress. But basically every scientific discipline oscillates between times of fast, fascinating discoveries and getting bogged down in detail work, because "everything has been discovered". The way to get from the latter into the former is shaking things up, and this may be what the author is attempting. Of course not all such attempts are justified. For example, in CompSci, unless we do a lot of detail work to solidify the foundations for actually using what we know, no real prog

It's a critically important point he is making, and that just makes it all the more frustrating that his examples are mostly really poor ones. It's been a few years since my biology classes, but "Why does sexuality occur at all, since it is fully one-half as efficient in projecting genes into the future compared with its asexual alternative?" seems adequately explained: assortment of genes has significant benefits despite its inferior efficiency in a very narrow sense. And of course it's not like once sexua

But the worst of it is probably "What is the purpose of all that "junk DNA"?" That is not a scientific question. It's a teleological question.

While that's true, I think there's a scientific question in there; it's just difficult to word the question in a non-teleological way. I suppose you could say, "Does 'junk DNA' have a practical function (to either the individual organism or to the species) and if so, what is it?"

While that's true, I think there's a scientific question in there; it's just difficult to word the question in a non-teleological way. I suppose you could say, "Does 'junk DNA' have a practical function (to either the individual organism or to the species) and if so, what is it?"

If I could I'd give you a +1 insightful for that. You understand. I have to hope the gentleman quoted in the article does too, but if he does he's guilty of simply atrocious semantic hygiene. Or egregiously misquoted perhaps.

Finding this out is part of any reasonable scientific education. It is something people have to find out for themselves; otherwise they would not believe it. But it is hardly the only scientific fact in that class (although most are more specific to individual disciplines), and calling it a "deception" is grandstanding on the part of this person. There is no deception, at least not by scientists. Just people who are incapable of understanding, usually because they want a simple, clear (and wrong) picture of the

There is some justice to this criticism. It was certainly my experience in high school, but by the time I got to college there was plenty of discussion of what we don't know. One problem is that in many fields, it requires considerable knowledge of what we do know to make sense of what we don't know and why it matters. Of course, there are some areas where what we don't know is sufficiently straightforward that it could be easily discussed at the high school level--the origin of life, for example, or the na

Sure, you can discuss endlessly all that you don't know and the endless possibilities, but why not simply teach what we know, and what is probable? I don't know for sure whether aliens exist, but based on what we know, and the vastness of the universe, it seems quite probable that other intelligent beings exist. It doesn't make much sense to me why we should contemplate things that are unlikely to be true.

Looking just at his field, evolutionary biology, the unknowns are immense:

The parts that a nonscientist actually cares about are generally solved. To take just one example, evolution's "small" and "large" jumps are all large on a human scale. As far as everyone else is concerned, evolution takes a long time and scientists are just arguing exactly how long.

It is obvious that in the daily news feed no one is ever going to say, "Hey, by the way, did you know that today no one discovered a solution to Frankl's union-closed sets conjecture." What we never hear is the foundation on which new discoveries stand. Today there are many fundamentalist or just uninformed people who don't "believe" in evolution and geology. If the press, when covering the discovery of, say, a new medicine for cancer, included the fact that evolutionary theory underlies our understanding of the what and why of the genetics that led to the discovery, maybe people would see that biology today is the study of the evolutionary process.

People have strange notions of what the quantum uncertainty principle means. I have heard people say that "anything can happen" and "scientists can't say for certain that gravity will work". The truth that should be told when we smash particles to find the Higgs boson or the like is that quantum physics, for all of its uncertainty, makes better predictions than any mathematical scientific framework ever previously invented. It may rely on probability, but it is still very exact.

I guess I mean that if we are talking about informing the uninformed about science I think telling them how much we know and how we got there is more important than saying what exactly we still don't know.

Well, of course. He's a biologist. A full understanding of biology, to the level of understanding that physics provides to mechanical and electrical engineering, is a ways off.

On the genomic side, full genome sequencing for complex organisms is only about ten years old. That just provides the raw bits. Now the meaning and the expressive mechanisms have to be figured out. We're nowhere near a tool like SPICE, where you put in a DNA sequence and an organism simulation runs. People are still trying to figure out

Yes, but many of them are the worst of both worlds. Speculative and unproven whilst being presented as dogmatic fact. This increases the public perception that science is both certain/absolute and changes its mind frequently/frivolously, and makes it even harder to explain how it really works.

Compared to other systems of "truth", science does indeed change its mind frequently and frivolously. That's not a bad thing. It's nothing to fear. It's also not something to gloss over just because half of the population is going to get a panic attack over it.

People like to believe we have a good handle on everything. It makes them feel safe.

There's a simpler non-psychological / non-political / non-philosophical explanation: your average J6P knows we really do have an excellent handle on most anything we can mass produce, which for J6P is pretty much everything, or everything J6P knows about, anyway. Therefore we must have a good handle on everything.

On the other hand, there's a heck of a lot of things we can imagine that we can't mass produce, and history shows we're pretty good at inventing and later mass producing some unimaginab

Indeed. Aristotle wrote a book 2400 years ago called, appropriately enough, "Questions". It's 400 pages of questions without answers, things he'd like to know but didn't, most if not all of them biology-related. As of today we have about 25% of them answered. At this rate, in 7,000 years we'll get answers for the remaining ones (much less if things proceed exponentially, but a noticeable amount of time nonetheless). And that's not taking into account the tons upon tons of additional unanswered questions added since...
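Taking the comment's figures at face value (a linear rate, which the parenthetical itself hedges against), the back-of-the-envelope extrapolation checks out:

```python
# Linear extrapolation of the figures in the comment above:
# ~25% of Aristotle's questions answered in ~2400 years.
years_elapsed = 2400
fraction_answered = 0.25

rate = fraction_answered / years_elapsed           # fraction answered per year
years_remaining = (1 - fraction_answered) / rate   # years to answer the rest

print(years_remaining)  # 7200.0, i.e. roughly the "7,000 years" quoted
```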

..except there were many centuries of 'anti-science' in there as well. I'm sure more questions will be answered more quickly; probably a large portion of that 25% was answered pretty recently as well. (But unfortunately, some of that anti-science culture is making a comeback in portions of the US.)

Not really. This is a popular misconception, popularized by some authors in the 16th century and then again in the 18th, which entered the public consciousness and stuck. If you actually go and study the history of ideas over the period you'll see lots of quite interesting stuff happening all over the place during the whole period, especially in math and logic, but also in engineering, chemistry, metallurgy and many other fields, all of which became quite useful down the line and without which post-Galilean stuff wouldn't have been possible. On the other hand, it is quite accurate that a few decades, spread over the last 500 years or so, were difficult for scientists, but those periods were by far the exception, not the rule.

As for recent developments in the US, looking from afar (I'm in Brazil) it doesn't seem that bad. You guys still do most of the important research around. What happens in a few schools around is hardly enough to cause major impacts. Besides, these things come and go following the generations. If the current one moves one direction, the next one moves the other, if for no other reason than being rebellious. Provided the net result is positive, and so far it's been, the risk of things coming full stop is quite low.

Science was an anti-rationalist movement (before I'm flooded with nonsense: not irrational, a reaction to rationalism. History, people!) that was indeed unique to Europe. Nowhere else in the world at the time do we find anything like the process of science. (See Whitehead for a nice history.)

I sure as hell expect to see large portions of modern science overturned in a generation or two.

I don't, I expect fantastic technologies and new science, but well established science will remain well established. Mankind's knowledge has evolved and grown enormously in the last few centuries, in the beginning of the modern scientific era there were huge leaps as people grabbed the "low hanging fruit" using simple tools, those leaps have become less frequent but have enabled us to build new and better tools (such as the LHC and Hubble) that allow us to try and find things we think are there (Higgs), or

You seem to be confused. If you think you are horny about something, then you are horny about something. If you think you like something, then you like something. These conditions exist so far as you think they exist, and need no other objective consideration.

I think if people truly had a choice over their sexual orientation, most homosexuals would opt not to be part of a category vilified and marginalized by ignorant halfwits such as yourself. Considering they have to put up with dumbasses such as yourself, I am more inclined to believe it's a biological imperative rather than simply a choice about continuing to do something they just got horny about as a teen.

You don't understand homosexuality, that is obvious, most likely you are fearful and obsessed by it, b