The jump from high school to university was a big leap. My marks were better, but it took a LOT more studying to do well and left a lot less time for social life and extracurriculars. I was not prepared for the difference at all.

And then there's the jump from undergrad to grad school. In undergrad, 60% was a pass. Here, everything needs to be perfect. People will nitpick every line; papers get rejected because one line is out of place. Everyone expects perfection. Getting 90% to get into grad school doesn't prepare you for the perfectionism here. I think I have impostor syndrome: I just don't notice most of the nitpicky things other people see, and I don't have their breadth of knowledge.

And it's like this in the real world. One mistake and the power goes out. Why is 60% a pass again? I think undergrad teaches you to take tests rather than the skills that actually prepare you.

Thoughts?

(I think I may have started typing up opinions on one thing and making a different point by the end. Oh well.)

The high school you go to does seem to make a pretty big difference (and probably which courses people take there, if they have much choice). I had a pretty awesome high school, and the subjects I took advanced courses in felt like they practically qualified me to teach the first-year university courses in question.

On the other hand, I didn't take biology in high school and as part of general science reqs at my uni I found myself taking first year bio, and I definitely felt woefully deficient compared to some of my peers who already knew various cycles and stuff, and I barely scraped through.

I've opted not to go to grad school unless I can find a career-related incentive to do so, but one of my professors taught an undergrad course pretty much as if it were a grad course, and it was pretty harsh (and unlike bio, this is in a subject I went on to get half my honours degree in). I did OK in the end, but it was pretty jarring to go from being able to do assignments drunk at the last second and show up to tests completely unprepared and still do well, to fighting tooth and nail for every single point and actually needing to take things seriously.

I imagine the undergrad->grad transition would be similarly dependent on where you came from and what courses you took in undergrad. People from different schools may have covered slightly different things that prepared them better for certain aspects, but overall I'm led to believe everyone feels like an impostor in academia by that point. If you do poorly, or even average, you feel like you're in over your head; and even people who are doing great and leading the class feel like it's a fluke and that everyone is just waiting to find out they're not as smart as their marks indicate.

Also if you're in grad land and don't already know about it, embrace phdcomics which I'm too tired to link to properly.

GTM wrote:The jump from high school to university was a big leap. My marks were better, but it took a LOT more studying to do well and left a lot less time for social life and extracurriculars. I was not prepared for the difference at all.

And then there's the jump from undergrad to grad school. In undergrad, 60% was a pass. Here, everything needs to be perfect. People will nitpick every line; papers get rejected because one line is out of place. Everyone expects perfection. Getting 90% to get into grad school doesn't prepare you for the perfectionism here. I think I have impostor syndrome: I just don't notice most of the nitpicky things other people see, and I don't have their breadth of knowledge.

And it's like this in the real world. One mistake and the power goes out. Why is 60% a pass again? I think undergrad teaches you to take tests rather than the skills that actually prepare you.

Thoughts?

(I think I may have started typing up opinions on one thing and making a different point by the end. Oh well.)

Speaking mostly from the sciences here, because that's really all I know. I think the big difference between undergrad and postgrad studies is that in postgrad, you are more or less expected to think for yourself. You have to take courses, but you're expected to do well in them, and then nobody cares that you took them. I spent a month studying 8+ hours a day for my qualifying exam, did the exam, took a day off, then was back at work like nothing had happened. There are hoops that you have to jump through, but everyone knows they're hoops, so nobody actually cares. Rather, you live and die by your research, which necessarily requires higher standards than a course.

Remember that when you send a paper to a journal, you are, in a way, competing against submissions by distinguished professors with maybe 40 years' experience. Your paper has to be near-perfect because you're competing against papers that are near-perfect. You don't get off easy because you're a student (at conferences, you do get a little slack). That said, you will also find that some journals take papers that are in every way absolute shit, and you can't understand how they got through review when papers you've been agonizing over word by word for months end up getting rejected. It happens.

I think all grad students feel a bit like imposters. It's part of the process, especially when confronted by people who really do have a great deal more knowledge and experience in the field than you. By the time you get out, hopefully you will have grown into it. Best advice is to read a lot of papers. It was suggested to me when I started to aim for one paper a day. Can't say it exactly worked out that way, but it was a worthy goal at least. Talk to everyone about everything, too. The nitpicky stuff, likewise, comes with time. When you have a better depth of knowledge in your field, then it's easier to spot mistakes because you know what things should look like. The more papers that you write, the better you will get at this as well.

As far as the education part goes, yeah, it's mostly bullshit. There are lots of problems. Probably the biggest is that until you reach grad school, you aren't really expected to learn things on your own. The professors/teachers give you all of the information, and you spit it back out at them, or apply it to a very narrowly defined set of applications and model systems the professor has chosen for you. In grad school, you really do have to start figuring things out for yourself. Which is a lot harder, but much more rewarding as well.

Izawwlgood wrote:In grad school, the material for classes was way more difficult, but they were far less interested in our ability to memorize it.

I feel the opposite! They expect you to remember things on the spot when asked!

Dopefish wrote:The high school you go to does seem to make a pretty big difference (and probably which courses people take there, if they have much choice). I had a pretty awesome high school, and the subjects I took advanced courses in felt like they practically qualified me to teach the first-year university courses in question.

On the other hand, I didn't take biology in high school and as part of general science reqs at my uni I found myself taking first year bio, and I definitely felt woefully deficient compared to some of my peers who already knew various cycles and stuff, and I barely scraped through.

I was talking less about content and more about behaviour and lifestyle.

Dopefish wrote:Also if you're in grad land and don't already know about it, embrace phdcomics which I'm too tired to link to properly.

Already do! Have since undergrad.

The rest of the posts seem like good advice. Just note that I feel like the real world is like this as well. Small mistakes can affect millions of people, so 99% isn't good enough there either!

GTM wrote:Just note that I feel like the real world is like this as well. Small mistakes can affect millions of people, so 99% isn't good enough there either!

Except it's extremely rare in the real world that you're going to be the only person checking something. For example, a lot of the software I work on has strict requirements from the FAA/EASA/Transport Canada that mandate multiple independent reviewers throughout the design process. Even where it's not government-mandated, no sane company is going to have only one person looking at something that could affect millions of people. Expecting any one person to be right 99% of the time is pretty unrealistic in any industry.
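To put a rough number on why redundant review works: if each reviewer independently misses a given defect with some probability, the chance it slips past everyone shrinks geometrically with the number of reviewers. A toy sketch (the 10% miss rate is a made-up number, and real reviewers are never fully independent):

```python
# If each reviewer misses a given defect with probability `miss_rate`,
# and reviews are independent (a strong assumption in practice), the
# probability that the defect slips past all of them is miss_rate**n.

def escape_probability(miss_rate, reviewers):
    """Chance a defect gets past every reviewer."""
    return miss_rate ** reviewers

for n in (1, 2, 3):
    print(n, escape_probability(0.10, n))  # roughly 0.1, 0.01, 0.001
```

So three merely decent reviewers can, in this idealized model, beat one near-perfect one, which is the whole argument for process over individual heroics.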

LaserGuy wrote:You have to take courses, but you're expected to do well in them, and then nobody cares that you took them.

When I was in grad school, no one even seemed to expect people to do particularly well in them. I also recall grad school courses as being a great deal fuzzier than most of my undergrad courses, with their straightforward, closed-ended exam questions. I recall doing Powerpoint presentations more than anything else.

Most of my graduate classwork was focused on conceptual understanding of the material. I distinctly remember getting one of the highest scores on an exam without once naming a protein. Most of my professors simply weren't interested in our ability to memorize specifics. Our first exam in MCB required that we know a handful of parts of the replication machinery, but saying something like "one of the helicases nicks and relieves torsion in the nascent strand during replication" was sufficient.


Speaking as a former university math teacher, yes, yes, yes, a thousand times, yes. I was very frustrated and often amazed by what my students didn't already know.

But it also depends greatly on their high school as well as their own individual work ethic. Many people come out of high school without even a basic understanding of algebra, but some have a good grasp of calculus.

I think, though I'm not really ready to commit to this, that the problem is primarily a sense of entitlement. "I studied hard, so I deserve to pass!" Well, requirements for passing have nothing to do with how hard you studied, but with how well you understand the material. But teachers have a hard time constantly fighting students, so they pass them anyway. Then the students aren't ready for the next class. (When I was tutoring, I had a lot of students in multivariable calculus who didn't understand basic algebra.)

adanedhel728 wrote:Many people come out of high school without even a basic understanding of algebra, but some have a good grasp of calculus.

Wha..? I'm curious; how are you defining algebra?

How is this confusing? He's saying that some people get out of high school knowing more math than other people, not that some who don't know algebra do know calculus.

I think Jorpho is thinking about groups, rings, homomorphisms, that kind of stuff, while adanedhel728 meant the more menial skill of "calculating with letters". It is really uncommon to learn the former before calculus. (The latter: Always)

Confusingly, both things are called algebra, with outsiders thinking of the latter and mathematicians thinking of the former, and neither commonly specifying which is meant. The first can be called "abstract algebra" or "modern algebra" as well.

adanedhel728 wrote:I think, though I'm not really ready to commit to this, that the problem is primarily a sense of entitlement. "I studied hard, so I deserve to pass!" Well, requirements for passing have nothing to do with how hard you studied, but how well you understand the material.

I'm in my 5th year of undergrad now, and for the past 4 I've been helping my classmates study (read: teaching them most of it) and by far the most common complaint from people is "I spent all day every day studying and did terribly!" I try to convince them that that's a terrible way to learn anything but they "got an 85 once, so it must work". They're almost all still stuck in the "study for the test" mentality, too. I've been helping two friends with organic chemistry (taking it over the summer), and I'm having to re-teach them *super* basic things from general chemistry, and we took the general chemistry classes together!

The conflict between having little remaining sympathy and wanting your friends to do well!

I've been getting into the habit of reading more papers of my own accord while I have time. Which I think brings up a big aspect of learning: that enjoying the material is crazy helpful.


flownt wrote:I think Jorpho is thinking about groups, rings, homomorphisms, that kind of stuff, while adanedhel728 meant the more menial skill of "calculating with letters". It is really uncommon to learn the former before calculus. (The latter: Always)

No, students with calculus very often can't do basic algebra fluently and reliably. It's worse if you want them to use variables, like "There's a pulley with masses m1 and m2 on either side; find the acceleration a." Setting up, reading, and solving equations are all harder than calc-1-style derivatives or integrals, especially if you actually want the students to do those first two things.
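For the curious, that pulley setup is the classic Atwood machine, and it's a nice test of exactly the "set up equations with variables" skill in question. A minimal sketch in Python (function names are mine, not from anyone's course):

```python
# Atwood machine: masses m1 and m2 hang from an ideal (massless,
# frictionless) pulley. Newton's second law on each mass:
#   m1*g - T = m1*a    (taking m1 moving down as positive)
#   T - m2*g = m2*a
# Adding the equations eliminates the tension T:
#   (m1 - m2)*g = (m1 + m2)*a   =>   a = (m1 - m2)*g / (m1 + m2)

def atwood_acceleration(m1, m2, g=9.81):
    """Acceleration of the system; positive means m1 descends."""
    return (m1 - m2) * g / (m1 + m2)

def atwood_tension(m1, m2, g=9.81):
    """Rope tension, from T = m2*(g + a)."""
    return m2 * (g + atwood_acceleration(m1, m2, g))

print(atwood_acceleration(3.0, 1.0))  # g/2, about 4.9 m/s^2 for a 3:1 ratio
print(atwood_tension(2.0, 2.0))       # equal masses: a = 0, so T = m*g
```

The hard part for students isn't the final formula; it's writing the first two comment lines down correctly, which is exactly the setting-up-equations step being described.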

LE4dGOLEM: What's a Doug?Noc: A larval Doogly. They grow the tail and stinger upon reaching adulthood.

One of the biggest problems, in my opinion, with Western education systems (I know much less about learning in eastern Asia) is that learning/teaching methods only really get taught to people who want to become teachers/professors.

While you can't necessarily teach higher level concepts to 5 year olds, I believe that teaching them basic methods for learning should be just as important, if not more so, as reading, writing, and basic math concepts that young school children learn. I sort of view this as an extension to Active Learning. If the student knows a handful of methods for learning, they can try them out and decide for themselves which work best for which types of topics.

As school progresses, they could learn more and more methods, and build on their theory of learning, so they can become better students.

It's the age-old "learn how to learn" issue that so many people don't figure out until late high school or college/university, because "well, that's when the material gets hard" and they're forced to. How many kids drop out of high school (or don't attend college), though, because the high school material is hard for them and they lose interest?

"Learning How to Learn" should be among the first things you ever learn in school.

I felt like this when I went from studying computers to going into freelancing. But college/university aren't geared to giving students these tools. I might have a swimming pool if it weren't for that!

Yes. I wish that someone had told me in elementary school (or junior high, where I might have been more receptive to the point) that the study of mathematics is cumulative. Even through college, I somehow never grasped that I had to hold on to everything that I'd learned about math before to have any hope of solving new problems. Math is annoyingly cumulative (and, by consequence, difficult) in that regard. To perform at the highest level requires perfect recall of everything that one has learned before.

That's weird though - in a sense then the previous levels do prepare you well, you just have to actually retain what was in them, because it's all cumulative. That's a very different complaint than if someone said "abstract algebra is so totally different from the linear algebra course I was unprepared why did they throw me in this pool," for example.

But I think that somewhere along the line people have de-emphasized the cumulative nature of math. In my classes when I was in school this was always presented as huge, but when teaching I had a colleague who had previously taught AP calc and pointed out that on that exam students were explicitly told not to simplify fractions or other such bits of arithmetic or simple algebra. That kind of thing may have become more common in classes, but the requirements of actual math haven't changed, so yeah, it would be a huge problem if students are getting that message.


doogly wrote:That's weird though - in a sense then the previous levels do prepare you well, you just have to actually retain what was in them, because it's all cumulative. That's a very different complaint than if someone said "abstract algebra is so totally different from the linear algebra course I was unprepared why did they throw me in this pool," for example.

But I think that somewhere along the line people have de-emphasized the cumulative nature of math. In my classes when I was in school this was always presented as huge, but when teaching I had a colleague who had previously taught AP calc and pointed out that on that exam students were explicitly told not to simplify fractions or other such bits of arithmetic or simple algebra. That kind of thing may have become more common in classes, but the requirements of actual math haven't changed, so yeah, it would be a huge problem if students are getting that message.

True. I'm 30 years old, and the cumulative nature of math was never expressly emphasized to me in junior high or high school, which is when such emphasis would've helped. Maybe math teachers assume that the principle is obvious, but I distinctly recall using textbooks that (as you've pointed out) said not to apply certain principles that I'd learned before.