Posted
by
Soulskill
on Friday August 24, 2012 @02:26PM
from the cellphones-going-off-in-lectures dept.

dougled writes "A survey of 4,500 college professors (and campus technology administrators) reveals what faculty members think of digital publishing (they like it, but don't do it very much), how much they use their campus learning management systems (not nearly as much as their bosses think), and how digital communication has changed their work lives (they're more productive, but far more stressed)."

I can tell you, working with some very smart profs, that they fall into the exact same classes that you find anywhere else.

You have people who are unreasonable (wanting things to be perfect in an imperfect world), people who can't apply basic common sense to using their computer (someone today, for instance, believed they could have unlimited disk space and had magical thinking about the whole situation), people with poor problem-solving skills, oldsters the world changed around who can't deal with it, people who can't use Google, etc., etc...

So I guess what I am saying is that sometimes I wonder if singling them out as a class has any use at all. They're simply people.

Interestingly enough, Max Planck said the same thing back in 1948 about the dogma and institutions of Science:
"A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it."

Or, paraphrased:
"Science advances one funeral at a time."

The old want things to remain the way they always have been. The young want things the way they will be. Society is a balance of these two diametrically opposed ideologies.

...to that I'll add that things shouldn't always change. For every new correct idea there are 100 new incorrect ones that sound reasonable, and among the ones that people think are correct, half are wrong.

Actually, all of them are wrong. Some are just less wrong than others.

That is indeed true. Some people like to criticize an idea [simply] because it is "old." Yet these same people ignore that the concept of the "wheel" has been around for at least a few *thousand* years. Old does not imply obsolete. If it works, why break it? ;-)

It's not true, but I can tell you from firsthand experience that learning new things requires a little more effort when you get older. The unfortunate result is that because people are lazy (myself included), they tend not to put in the effort. I still learn new things pretty quickly, but it's not the effortless sponging it was as a kid and in my 20s and 30s. I'm a little more choosy now about the things I spend that effort on; it's usually reserved for work things (My job is to support scientists

Wow. You're saying that once people are no longer young they can't change or learn. Almost makes me want to die.

Of course older people can change and learn, but it's somewhat more difficult than for the young. (Even young adults have a harder time learning than children, who soak up knowledge like sponges.) Perhaps an even larger factor is that older people are more likely to have entrenched positions that allow them to refuse to update their knowledge. Young adults have to keep up or risk being left out.

People that don't live in the real world. I have a physics prof that I visited one day, and after we caught up, he asked for some computer help. He didn't even understand the concept of gigabytes or megahertz. He is so buried in the world of the very small (superstrings) that he doesn't understand the basics of computer science. And he's not that old either; he used computers in college but it's as if his knowledge stopped in 1985.

I never understand why people cite professors as if they are the end-all

I have a physics prof that I visited one day, and after we caught up, he asked for some computer help. He didn't even understand the concept of gigabytes or megahertz.

Maybe he didn't know, but if you explained it and he wanted to learn it (I admit, some professors like that can be arrogant and condescending), a physicist should have no problem grasping the concepts of frequency and of information. "Bytes" are a strange invention, but bits and information theory may be an important part of quantum theory. I have met physics professors who are old enough to have trouble seeing, but who still have an amazing ability to work with computing. It helps that many physicists (experim

The administrators appear to be out of touch too (see below). Frankly, I don't understand the obsession with posting video lectures. I've found copied handouts of the prof's notes (and also homework solutions) much more useful than a meandering talk. I can scan the notes far, far faster than I can scan a 50-minute video.

"Administrators believed that 73 percent of the professors at their institutions used data logged by the LMS either "regularly" or "occasionally" to identify students who need extra help... In fact, only 51 percent of faculty reported doing so. About half of the administrators estimated that professors regularly or occasionally posted video-recorded lectures into the LMS, but just 25 percent of the faculty respondents actually do. Nearly 80 percent of administrators said their faculty members regularly or occasionally used the LMS to track student attendance; the professors clocked in at 44 percent."

It's pretty much standard at my university (in Australia) to post video lectures and slides online, and I'd say it's invaluable. The slides are great if you need to look something up or fill in something you missed during the lecture, but the videos are better if you missed the lecture entirely (you could have had a clash, been ill, etc.), since the lecturer often goes into additional details and explanations beyond what's in the slides, or does a derivation in the margins.

As for "flipping the classroom" -- that is, banishing the lecture and focusing precious class time on group projects and other forms of active learning

Man..glad they didn't have this crap when I was in school....I just wanted to get in there, listen, take notes....and GTFO. I just need enough interaction to take the test and make the grade and get out to get a job.

Strange tho...I'm actually quite a sociable person...outside of the class and work, I have lots of friends and go out, have fun, I have no problem talking to strangers and making new friends.

But
at school, and usually at worksites... I'm there to go in, get a job done... and get out. I'm not there to make friends. I hardly ever socialize with co-workers. I didn't ever want to really socialize with anyone in my classes; hell, I never really knew anyone's name in the classes (unless it was a good looking girl I'd like to meet and bang)...

I dunno... I guess to me, work is work... get in, get it done, get out... and then go into "real life" mode, where I have my friends and my fun.

Man..glad they didn't have this crap when I was in school....I just wanted to get in there, listen, take notes....and GTFO. I just need enough interaction to take the test and make the grade and get out to get a job.

To put it bluntly, professors don't care about students with this attitude. Nor should they. If the student has no interest in learning (but just wants to do the minimal amount of work to get a grade and pass the test), a professor isn't going to put any kind of effort into teaching them. Why should they? College isn't elementary school where it is the teacher's job to force kids to learn.

Newer ideas like "flipping the classroom" are for students who actually want to learn about the subject. Studies sh

People talk about this "flipping the classroom" thing as if it were a brand new concept. I guess they're the ones who, in school when asked to read such and such a chapter and come to class prepared, didn't do it. They probably skipped the seminars and non-mandatory labs in university too.

To put it bluntly, professors don't care about undergraduate students with this attitude, or any other.

FTFY.

And I would say that that goes double for professors teaching undergraduate required classes (chemistry, biology, physics, etc.) at universities that have medical schools. At the U.S. university I attended (a large state school with more than 30,000 students at the time), people interested in learning chemistry need not apply, because the goal was to avoid teaching freshmen at all costs -- too many students might pass, and then how would they fund the sophomore organic chemistry classes? Besides, the

At Penn State (graduated last year) we had some GREAT introductory science profs. I'll never forget my introductory physics prof -- every other lecture he'd have some big demo of the concept he was trying to teach. From "killing" Kenny from South Park (I think he hung him to demo something about pendulums) to shooting himself into the next room on a swivel chair with a fire extinguisher (Newton's laws)... great class. And during lectures he'd usually wander through the lecture hall (we're talking two hundred

I'm not sure that you're understanding me. By professors not caring, I'm talking about the guys who don't actually teach anything and then just post the entire exam -- along with the solutions -- on their website a week before the exam, and call it a "study guide" (No, it wasn't 'similar' to the exam, it WAS the exam; he just printed that out and gave it to us for the test. And it was multiple choice and fill in the blank.) The professors who don't answer questions; who take 30 minutes out of a 70-minute class just getting the computer booted up; who do nothing more than read off of slides prepared by somebody else; and who refuse to accept obviously correct answers because they're not phrased in exactly the same way as on the answer sheet.

Doesn't sound like any professor that I've encountered in my life... I guess you went to a tough college. :) The worst I've seen was a professor reading from a textbook (one section == one lecture).

It's an attempt to help you improve your ability to work with others, something that your job expects you to know how to do upon entry into a bottom-level position.

Well, I know it sounds like I'm not a terribly social person, but I do get out and interact with others. I'm friendly at work... I just leave them AT work when I walk out the door. Work has 0% to do with my private and true social life.

I learned people skills... by working, starting in HS... through college. I worked in

My classical mechanics teacher used the technique described in TFA, or at least a derivative of it. It was the single most interesting class I've taken at university, and that's despite the subject matter being rather ordinary.

The funny thing is that it would actually have been perfect for everyone, you included! If you actually want to learn and understand, you attend the classes and interact with other students all while actually solving problems instead of being a biological xerox. If all you want is "ta

Something more hands-on like Linux Administration, do you think you'd learn better with straight lecture or with some lecture and a lot of lab work, projects, etc?

True... but then again, it isn't like labs doing computer work, as described, require interaction with other students in the same course. The admin life IS kind of a lonely one; you don't have to deal face to face with people very often.

That is a good way of phrasing the question: what does it do? There is a huge push in teacher education of integrating more technology into classroom instruction for its own sake. It is my opinion that if the teacher has to find a way to integrate the technology, then it's already pointless.

Buying books for class? You'd better buy the new book, with the online access code. That way you can access your online assignments and do your homework. God forbid you buy the used book and fail the class.

The book racket has reached a new level of thievery. How much for the access code, you ask? That depends. It could be as little as $75, but it could be a real value at $150.

When you are a computer science student and you get your grade by manipulating the computer system the school uses (whether it is remote access to a laptop or anything else), you should automatically pass. Isn't that the point of computer science, to understand how computers work well enough to make them do what you want?

If your class is about engineering your way into someone else's computer, then yes. If it is about writing code that is, say, useful for whole teams of developers and support people, or code that needs to run longer than 2 seconds in a production environment that is regulated by various laws and rules, then no, you should fail because you didn't learn shit.

Just the other day, I was asked by a post-grad researcher to help them get SSH working on their server. They were unable to remote login to their server. So they come into my office, I open a command line, and have them type in the credentials to get into this server. I log in just fine, no problems. Then this person, with a Master's degree in computer science, tells me they are using OpenVPN to log in to their gateway server, so they can SSH into the rest of their se

I think the problem you're describing is a bit more complex than you make it. I used to know this kid who, in elementary school, was doing DirectX 3D games in C++. Brilliant coder...but he knew nothing else about computers. He probably wouldn't have been able to install an operating system on his own. Universities are starting to divide things up a bit better, but it's still very vague.

The way I see it, you've got computer science, which really should be theoretical things -- not about how to program an app

If your class is about engineering your way into someone else's computer, then yes.

No, not even then. Do you think a professor that teaches a security class has any control over the LMS that the school selects or the security built around it? Even so, being able to manipulate a computer system that way is merely one aspect of computers and does not constitute mastery of the subject.

Using your same logic, if the student who breaks into the computer system and changes his grade also fails another student, that's OK too, because hey, that's what the class is about and if the now-failing st

I welcome any computer science major to write programs without any understanding of how to exploit those same programs. It makes for good reads on Slashdot when [insert large company here] gets hacked because some "programmer" forgot the usefulness of a regular expression to prevent a SQL injection. I, in agreement with every CISSP and cybersecurity major, welcome these programmers into the workplace, because they keep us employed.
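For what it's worth, the standard defense here is parameterized queries rather than input regexes. A minimal sketch of the difference (my own toy example using Python's built-in sqlite3, with a made-up table and data):

```python
import sqlite3

# Toy in-memory database for illustration only
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, grade TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'A')")

def unsafe_lookup(name):
    # Vulnerable: attacker-controlled input is spliced into the SQL text
    return conn.execute(
        "SELECT grade FROM users WHERE name = '%s'" % name
    ).fetchall()

def safe_lookup(name):
    # Parameterized query: the driver treats input as data, never as SQL
    return conn.execute(
        "SELECT grade FROM users WHERE name = ?", (name,)
    ).fetchall()

# A classic injection payload defeats the naive version...
print(unsafe_lookup("x' OR '1'='1"))   # returns every row
# ...but is harmless against the parameterized one
print(safe_lookup("x' OR '1'='1"))     # returns []
```

The payload rewrites the WHERE clause in the string-spliced version; the bound parameter stays an inert string.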

Grades would probably be focused on projects that require an understanding of the materials and real problem solving skills rather than on tests that usually just require that you regurgitate a bunch of facts. In other words, learning by doing. Of course, with 200 people in a class that would be hard to grade. I think in the future we will be using computers to learn the facts and concepts and then go into learning centers to interact with others intellectually and complete projects that are actually graded

I hope this is what we end up with. It would be a huge gain over what we have now, and it would definitely give a more accurate view of whether someone has learned the material. So many are good at regurgitating facts but can't apply them to save their lives.

I never understood the point of any kind of exams in a comp sci class, really. Some of the more theoretical ones (like the logic classes that are 'comp sci' but are really math classes), sure, it makes sense... but for an actual programming class? You want me to hand-write code? What the hell is the point of that? I'm never going to need to have these things memorized -- at worst I'll have an IDE to help, at best the internet (and hell, this is coming from the guy who does all his coding in text editors...) I

My schools rely on Blackboard too, and most of my classmates hate it. I find it to be quite useful, when used correctly. Grades go on BB, as should homework submission, quizzes and tests for most subjects (watch out for formatting issues with code snippets, profs who give CS quizzes on BB), and course documents (assignment spec sheets, syllabuses, etc.). What shouldn't go on BB? Discussions. Seriously. BB's "forums" are shit. They're a pain in the ass to read, a pain in the ass to post, and a pain in the as

The problem isn't so much that 'using the internet to exchange documents' is a bad plan(because it isn't) but that Blackboard's specific offering in that area blows goats through capillary tubing. At least it's expensive and buggy, though.

I used to feel like you, until I TA'd a class. Oh my god. It's so staggeringly awful to input grades in anything conceivably approaching an efficient manner that most people download the grades as an Excel file, edit that, and re-upload it. It worked alright, as long as you didn't fiddle with the structure. Half of my CS classes have grading scripts that read files and batch email grades to everybody because it's less painful.

Basically, Blackboard makes some things easy for the student... and some things ea
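A hypothetical sketch of the kind of batch grade-mailing script described above (the CSV layout, column names, and subject line are all my own assumptions, not any particular course's setup):

```python
import csv
from email.message import EmailMessage

def build_grade_emails(csv_path):
    """Read a grades CSV exported from the LMS and build one email per
    student. Columns assumed: name, email, grade (an assumption)."""
    messages = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            msg = EmailMessage()
            msg["To"] = row["email"]
            msg["Subject"] = "Your grade for Assignment 1"
            msg.set_content(
                f"Hi {row['name']}, your grade is {row['grade']}."
            )
            messages.append(msg)
    return messages
```

In real use you'd hand each message to `smtplib.SMTP(...).send_message(msg)`; the sketch only builds them so it stays side-effect free.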

Me too. Blackboard sucks hairy alpaca balls. For all the reasons mentioned here, but here's one that is SO obvious, it hurts:

You can't upload a folder of documents directly. You have to MAKE a "Folder" and then upload each file individually. A complete time-wasting pain in the ass. Fucking retarded. Drag and drop has been around HOW freakin' long? This isn't rocket science - it's just Blackboard being retarded.

Oh god, don't even get me started on ANGEL. I basically rewrote half of its functionality for a couple of clubs I was in because nobody could stand using that piece of crap. I never saw it improving -- and I can't tell if it was actually getting worse or if that was just my perception due to finding a new bug nearly every time I logged into the damn thing. So glad I no longer have to deal with that, although my new employer's internal system almost makes me miss the days of ANGEL...

of use and understanding of classroom technologies among my professors. Some are very skeptical and perhaps a little afraid of using the management software (we use CTools which is open source and pretty awesome). The biggest difference in adoption that I notice is between colleges. The professors in the school of education use way more technology and with much more confidence than my liberal arts professors.

Technology is awesome, but teaching is an art form no matter what the discipline. It involves interacting with people and helping them to develop and learn. If a teacher is not interacting directly with the student then they are doing that student a disservice. Management software is great for holding grades, averaging, and printing reports, but it doesn't tell me which student studied four hours to get the same grade as a student that studied one hour. That information can be found in their eyes and their

I couldn't agree more. I'm studying education, and there is a pretty sizable push for teachers to adopt technology in the classroom seemingly for its own sake. I'm not in favor of this. If the teacher (or professor) has to figure out some way of integrating the technology, then it is already pointless.

I used to teach home schooling for elementary-aged students with developmental challenges, but never actually at the college level. Using technology in teaching was never something that was thrust on us; interacting with the students and giving them the personal attention they needed was our number one goal. If you have the opportunity to try something like that, try it, but don't expect good pay.

I easily get more than 100 a day too, but it drops quite a bit after I delete the anonymous cowards and all the emails from that guy who is CERTAIN that he'd better forward everything to everybody, just in case they missed it the first three times from various student lists, staff lists and concerned secretaries.

I'd like to know why the medical profession isn't embracing technology. They still use antiquated 20th-century tech: i.e., the fax machine. It would be nice if you could email your doctor and save yourself the time and money of a follow-up visit. The doctors could determine from the email whether patients needed to physically come in, or whether they knew enough to prescribe the next step without a visit. If it is about wanting you to come in for a follow-up visit so they

This is a follow-up visit, not an initial visit. For example, I just saw a doctor and told him everything that I could have told him by email or a question-assisted form. He didn't touch me and didn't ask me any questions. Plus, he had the benefit of seeing a report from a physical therapist, so he could see a third party's assessment of my situation. During these kinds of follow-up visits, there is no reason he couldn't have done one of 3 things if I had simply emailed him: 1) told me to stay the course and refill

> I'd like to know why the medical profession isn't embracing technology.

I'd love to know that too. The only thing that seems to make sense is that they dislike change and/or don't see the benefit in adapting to the customer to give them what they want/need.

> The doctors could determine from the email

Sometimes face-to-face conversation is more efficient for the *doctor* in terms of time for *conveying* information, but yeah, from a scheduling point of view it is terribly inefficient.

I can't wait until we live in a world where we can electronically contact a virtual doctor (i.e. IBM's Watson) and describe our problem, take pictures, webchat, etc., and the virtual doctor could triage and determine if we needed to see an actual doctor, or if it was a simple problem that a supercomputer could recommend some treatment for. Obviously, even if the supercomputer thought we had something serious (i.e. cancer), then the supercomputer would recommend we see a real doctor and the supercomputer coul

Every situation is different. Some situations can safely be assessed over email, while others cannot.
I had a situation where I had to pay out of pocket instead of relying on health insurance. In that situation, the doctor embraced email. So I don't think it is a legal restriction. I think it is because doctors don't see us patients as the customer; they see our health insurance plans as the customers. In that case, when I was the customer and there was no middleman like health insurance, the doctor cate

As a doctor who has been involved in the start of multiple electronic records systems in multiple clinics and hospitals, I can answer your question partially. Really it's two reasons:

1. Privacy. In some ways it's easier to lock down paper charts than networked records systems. You have a chart, one person has that chart at a time, and it's in one physical location. Networks get hacked, electronic charts can be viewed by multiple people at the same time, can be copied and pasted into emails readily, etc.

*shrugs* The higher the degree, the greater the hubris. If you have a lot of power, money, and/or authority, you are well-insulated from the small bumps, but also somewhat deaf to the need to change. This is why some offices are so badly run -> paper filing cabinets, calling someone as opposed to texting them, and Visual Basic 6 apps with Access backends refuse to die, and also why a big change, even with plenty of notice, can wipe out a large company.

The professors I know say that "technology" has had a bigger effect on their students than it has on themselves -- specifically, their lack of concern with plagiarism. Having grown up with Google and the Internet, when asked to write a paper discussing, say, the contributions to Twentieth-Century culture of recently-deceased Lithuanian tennis champions, the students' normal way of research is to Google the topic, find a relevant web site, copy the material, and present it.

They're often shocked when the plagiarism is noted and they fail the assignment because, after all, the paper is on-topic and factually true (let's suppose); what's the issue? The concept that one needs to come up with one's own ideas and opinions is often a foreign one to someone who has grown up using the web as an immediate source of all the world's knowledge. I suspect, but of course cannot prove, that developing one's own opinions was an easier and more natural thing when one had to search multiple libraries for bits and pieces of the subject matter here and there; often your opinion developed over time, based on the facts you were able to find, and the order in which you found them.

Students (and professors) have been plagiarizing since the second piece of paper was made, of course; the new issue is that many students today do not see a problem with it. Because of this, the highest level of technology some professors use is their plagiarism-detect software.
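For the curious, the core idea behind that plagiarism-detection software is surprisingly simple. A toy sketch (my own illustration; commercial tools are far more sophisticated) of flagging shared word 5-grams between a submission and a known source:

```python
def ngrams(text, n=5):
    # Break a text into overlapping n-word sequences, lowercased
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap_score(submission, source, n=5):
    # Fraction of the submission's n-grams that also appear in the source;
    # long verbatim runs produce many shared n-grams and a high score
    a, b = ngrams(submission, n), ngrams(source, n)
    return len(a & b) / len(a) if a else 0.0

source = "the quick brown fox jumps over the lazy dog near the river bank"
copied = "as we know the quick brown fox jumps over the lazy dog today"
fresh  = "completely original prose with no shared phrasing at all here"

print(overlap_score(copied, source))  # well above 0 -- copying flagged
print(overlap_score(fresh, source))   # 0.0 -- no shared 5-grams
```

Real detectors add stemming, fuzzy matching, and enormous source corpora, but shared n-grams are the basic signal.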

Paraphrasing does not free you from plagiarism; paraphrasing without attributing the source is plagiarism.

You can legitimately create a work consisting mostly of (properly cited) paraphrases and quotes, while completely avoiding personal opinion or analysis. This is called a literature review, and there are times when they are completely appropriate (in the introduction to a graduate thesis, for example). Where plagiarism comes into play is when you state or imply that an i

The thing that's always bothered me about the "analyzing information for yourself" bit is that the paper subjects are in well-studied fields, such that any analysis you might do has already been done anyway. A half-competent researcher will then find that analysis, and they are stuck with ONLY literature review as an option, because someone else already wrote down an analysis the way they were going to analyze it!

I understand and agree with what you're saying... I just think that "different ways to say something" is shallow and trivial compared to actually having new ideas. Requiring novel expression as a tool to force students to develop as writers works really well, and banning plagiarism as immoral is correct, but there is no connection between the two: learning to write a phrase that sounds only half as bad as the perfected one you found in a review does not improve your knowledge of the subject or skills at anyth

Some professors clearly blame "technology" for all the ills of the world. That is, they blame greater access to smartphones for reducing attention spans and limiting the ability of students to read college-level texts. Before the blaming of smartphones, many blamed MTV, or action movies, then drugs, sex, and rock and roll. Going back further, I suppose they either did not try to educate or blamed the bogeyman, or whatever. In any case, these professors are crazy. In any case, in the last 50 years we

I'm an adjunct professor at three local colleges, so I get to experience a variety of educational technologies and IT departments. My frustrations don't come from the technology itself, but from the policies administrators and the IT staff implement. All three schools have a campus email system for students, faculty, and staff. But two of them are web-based systems that do not allow auto-forwarding. I have to manually log in to the clunky web-based system and sift through a mountain of intra-spam. The feature exists on these platforms, according to my research; it's just been disabled. I guess they want to make sure we're all using the outsourced webmail system they spent millions of dollars importing from the late '90s.

When it comes time to submit my grades, one school's system flips a coin each semester to decide whether it supports Mac users. Not whether it supports Safari, not "the Mac version of Firefox" or even "the Mac version of IE" but logging in from a Mac computer at all. When I call the registrar's office, they claim to have never supported Mac. Except, they did. Last semester.

One school has a laptop loan program for faculty and students. We can request to borrow a laptop to run our classes with. For one month. Then we have to return the computer and resubmit the request. The same school installed 3M Smart Boards in many of the classrooms. They have loads of cool features, but the remote controls and digital pen devices you need to use them all disappeared within months of installation. Now they serve as very expensive white boards.

The list goes on . . . None of these are failings of technology, but how technology is implemented. I often get the impression that the people in charge of acquiring, installing, and managing tech at my schools are being brought in from the business sector. They are attempting to implement methodologies and policies suited to smaller, homogeneous work environments. Classrooms aren't office buildings; faculty and students use tech differently from the office staff.

Me too. We have lock boxes for the remotes, but whenever I actually remember the darn thing, what I want is not in the box.

IT is underfunded, and they seem to use temp student workers as filler for real IT. I had to fight to get external email access, which they do not advertise, but I knew they had it. They will not turn on IMAP or POP even though I know their server supports it. They had Unix systems and now it is all Windows crap, including the incorrectly implemented MS DNS server and the occasional issu

Oh good, I'm not the only one who hates whiteboards. Expensive markers that constantly walk out the door, special erasers, special cleaning solution because the markers aren't all that erasable...

I love technology, and I genuinely have no idea how people got through college without the web and email (lots of camping out in front of professors' offices, I guess). But sometimes, if it's not broke, don't fix it. Chalkboards have been cheap and effective for a few thousand years. It also helps remind you t

Just change how your browser reports itself. If your website locks out the Android browser, tell it to identify as an iPhone. If Macs are locked out, tell FF to pretend it's Windows Firefox. I love this crap. I mean I just love it. My school doesn't support Chrome, but our Blackboard only supports Chrome for several key features. And not Chrome from last year, but Chrome from umpteen versions back. If you try to use it, and it doesn't work, and you're on a newer Chrome, the IT people say "that's not supported." If you
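The reason spoofing works: these lock-outs typically just pattern-match the User-Agent string, which the client fully controls. A toy sketch (my own guess at the shape of such a check, not any school's actual code):

```python
def is_supported(user_agent):
    # Naive server-side allow-list of the kind described above (assumption):
    # only "Windows + Chrome" User-Agent strings pass
    return "Chrome" in user_agent and "Windows" in user_agent

mac_firefox = "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15) Gecko/20100101 Firefox/104.0"
spoofed     = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/105.0 Safari/537.36"

print(is_supported(mac_firefox))  # False -- the real browser is locked out
print(is_supported(spoofed))      # True -- same browser, spoofed UA string
```

Since the header is self-reported, any "supported browser" gate built this way is advisory at best.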

As a CS instructor, I use Blackboard for homework and program submission, for posting solutions and for recording grades. Nothing else. Making a full-fledged web site out of Blackboard is too terrible to think about.

This is really interesting, as there is some anxiety within the public university system about tenure and LMSes. Private institutions have the freedom to implement them, whereas at public universities there is a lot more resistance to anything the faculty sees as wasteful.

Also, to run a really good flipped class, the time investment is rather insane. You might be spending less time working on PowerPoint or whatnot, but you've got an email queue to deal with.