Friday, August 29, 2014

In my last installment, I explained why for-profit colleges like the one I work at exist and why, at least at a basic level, they meet the needs of our students in ways that more traditional post-secondary educational institutions (a fancy way of saying "universities and public community colleges") usually fail. Today, I'm going to go into greater detail while exploring what it's like to work at a for-profit institution. Before I do so, however, I want to point out that I'm a graduate of a traditional university - I received my Bachelor of Science in Computer Science from the University of Nevada, Reno almost a decade ago. That degree opened doors for my career in IT that I'm reasonably certain wouldn't exist otherwise, even including my current job. I mention this because, in my last installment, I was pretty harsh on traditional universities and I'm going to remain a little harsh, but I want to be clear about something - there's room for both approaches in this world, and it's important that we have opportunities for everyone to pick the path that's appropriate for them. I am in no way trying to imply that anyone is a fool if they pursue a university education, unless they're going to graduate school in a liberal arts program, in which case they're only fools if they're expecting a job in their field after graduation.

Having said all that, I want to point out something that should be really obvious - there's absolutely no reason an auto mechanic must learn the Nevada Constitution.

Don't get me wrong, I'm not saying that learning the Nevada Constitution isn't a worthwhile endeavor. I'm not saying an auto mechanic, bricklayer, welder, medical assistant, or whatever wouldn't benefit from a better understanding of Nevada's Constitution or English literature. However, if someone could explain to me, using simple, declarative sentences, why it's vitally necessary that all tradespeople in the State of Nevada absolutely must learn the contents of the Nevada Constitution before they receive a certificate from a post-secondary school that says, yes, they're nominally qualified to perform work that pays halfway decently, I am all ears.

Ears and, well, shucks, maybe a kernel of corn or two as well.

Point being, some people just want to learn how to get the job done without a giant pile of imposed distractions so they can go out and get jobs done. So, how do for-profit schools do that?

It's all about... speed.
Remember Summer Break? And Winter Break? And Spring Break? Remember how, when you were in college or high school, you spent about half the year actually in class while, the other half of the year, you partied with your friends, or played video games, or read Emily Dickinson poetry or something? Well, for-profit schools have breaks, too, only instead of 180 days' worth of them each year, we take more like 28.

Total.

Okay, I exaggerate slightly. We also get snow days, at least where I live, plus probably another week or so if you add up the days we get off for three-day weekends like Labor Day, Memorial Day, and so on, but the point is, once you're in school, you don't leave until you're done. Consequently, the longest program where I work is an 18-month program which, at its conclusion, gives you an Associate's degree. Remember, at a public community college, it would normally take you at least two years to get one. That's six extra months of earning a higher wage after graduation right there.

In order to make that possible, though, there's a catch...

Academic independence? What's that?
Instructors don't write their course curriculum where I work. They don't write their homework assignments. They don't write their syllabuses. They don't pick their textbooks. Instead, this is all the responsibility of their department chairs, who, in turn, are accountable to the Academic Dean and must submit any and all changes in their classes to the Dean before they're allowed to implement them in class.

Any professors reading this post right now just felt the hair on the back of their necks stand on end and started muttering profanities in Sumerian.

Thing is, this is an absolute necessity for two reasons - first, we don't employ professors. Our instructors are people that used to work in the field. They know how to do one thing - whatever job it is they're training students how to do. What they don't know, at least not right off the bat, is how to write homework that makes sense, how to write fair tests, how to set grading criteria, how to write up a syllabus, how to pick a textbook, or any of that. Consequently, all of the academic paperwork has to be taken off their hands so they can do what they do best, which is show someone else how to do what they know how to do. Another advantage of this approach is that, if an instructor decides they've had enough and it's time to go back to the field - and that does happen - another qualified instructor can pick up right where the syllabus left off without drastically changing what's taught in the class.

Oh, don't get me wrong, we have part-time instructors too, but, for the most part, they're part-time because they already have full-time jobs in the field and don't want to leave. It's not that we're not willing to hire them full-time - in fact, given our rather aggressive schedule, we usually prefer instructors we can hire full-time because they're much more flexible. However, there are certain topics we teach that make far more money in the field than any college can afford to pay without effectively doubling or tripling tuition. Since it seems a little unfair to charge students who make less than $15/hour enough in tuition to pay an instructor the low- to mid-six figures they're capable of earning for the same skill set in the private sector, we take what we can get when and where we can get it.

Oh, and graduate students? Students don't teach students at our school. Period, end of discussion. For what we charge in tuition, students deserve to be taught by people that actually know what they're talking about.

Wait, flexibility?
Yes, flexibility. A lot of our students have day jobs. What does that mean for us? It means we teach at night. Until 10:30. Some students, however, work in the afternoons, so we also teach in the mornings, starting at 8:00. Some instructors end up working split shifts, teaching a class for a few hours in the early morning, then heading home for a nap and coming back in the evening to teach the same class to a different group of students. We try to avoid that, but it happens from time to time. We're even looking into weekend courses.

Student schedules must be a mess, then.
Quite the contrary. Students pick a block upon admission - morning, afternoon, or evening. Each block is filled with classes and takes about 4.5 hours to complete from start to finish. Assuming they pass their classes and don't take a leave of absence or request a block change, they'll stay in that block from start to finish and have their schedule automatically chosen for them all the way through. No fuss, no drama, no "I hope I can get into that class," none of that.

If they fail a class and have to retake it, things start to get a little complicated, but only a little - it happens often enough that we're prepared for it. At worst, they'll come back in a phase or two, when they can hop back onto their original block schedule.

Phase?
Oh, right - you know those "semesters", "quarters", or whatever that traditional schools have? Yeah, they don't exist where I work. We have six week phases. You start class, you do your homework, you sit in lectures, you participate in some labs, and six weeks later you take your final. Then, the following week, you start another phase with another set of classes.

No rest for the weary.

Didn't you say you were going to talk about what it's like to work there?
Indeed I did, and this is as good a time as any to bring up a rather interesting point - because of the constant, grueling schedule, working as an IT Manager here is more like working in IT at a casino or a retail establishment than at a school. Maintenance is done in very tight windows that assiduously avoid any downtime for students or instructors as much as possible. Scripting is used religiously - if something can be done repeatedly, it can be done automatically at 0300 on a Sunday. Network upgrades are done very carefully, usually during one of the four weeks a year that we're not holding classes. Even then, though, the building is open and staff are working.
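The "script everything into the quiet hours" approach boils down to a simple guard: check whether you're inside the maintenance window before doing anything disruptive. Here's a minimal sketch - the Sunday 03:00-05:00 window and the task placeholder are hypothetical illustrations, not our actual schedule:

```python
from datetime import datetime

# Hypothetical quiet window: Sunday morning, 03:00-05:00, when no
# classes are in session. Monday=0 ... Sunday=6 in Python's weekday().
MAINTENANCE_DAY = 6
WINDOW_START_HOUR = 3
WINDOW_END_HOUR = 5

def in_maintenance_window(now: datetime) -> bool:
    """Return True if `now` falls inside the weekly quiet window."""
    return (now.weekday() == MAINTENANCE_DAY
            and WINDOW_START_HOUR <= now.hour < WINDOW_END_HOUR)

def run_maintenance(now: datetime) -> str:
    """Run disruptive tasks only when nobody is in the building."""
    if not in_maintenance_window(now):
        return "skipped: outside maintenance window"
    # ... placeholder: reboot servers, apply patches, rotate logs ...
    return "maintenance performed"
```

A scheduler can then fire the script every night and let the guard decide whether anything actually happens, which keeps the schedule logic in one place.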

Incidentally, as awkward as this is for IT, remember that instructors and department heads are operating under the same constraints. There's no time to completely overhaul a class between one phase and the next, so all class changes and improvements must be done iteratively. If you'd like, you can think of each phase as a six-week scrum.

Back up a second. If the people that are teaching the students aren't in charge of what's in the curriculum, how do you know the students are learning the right material?
Oh, that's easy - we ask employers. In fact, even department chairs don't get to make arbitrary changes to the curriculum without asking a committee of employers about the possible change or without soliciting input from employers about other changes they'd rather see instead.

A good example of the difference between our approach and a university's would be the use of Linux in our curriculum. At UNR, Linux is all over the place, especially in the Computer Science program, which makes sense - it costs nothing to install and the code is open source, which makes it really easy to show future programmers what's going on, where, and why. Where I work, however, our curriculum is determined by our students' future employers, and the truth is, at least in Reno, employers think it would be "nice" if our students knew more about Linux; on the other hand, they think it's "imperative" that our students know something about Cisco and Windows. Consequently, when we teach Linux, we have to be kind of sneaky about it - if it doesn't fit into a class we already teach that has already been approved by the employers, we can't use it.

It's not that we don't want to teach Linux - it's just that the people that hire our students would rather they learn just about anything else instead first. Believe me, I asked.

So, how do you pick which programs to teach?
This is where I'll pause and point out I'm an IT Manager, not an Academic Dean, so my understanding of the details is square in the middle of what I like to affectionately think of as the "Dunning-Kruger Zone". However, from what I've been able to piece together, it looks something like this:

What skills are employers looking for that they're not getting enough of from the community right now?

Can we find people in that field that are willing to teach these skills?

Are students willing to learn those skills?

The last point is actually a pretty big issue. Our best-paying and highest-placing programs after graduation are also frequently our least popular. Why? Because a lot of students think those programs look too hard for them. Maybe the program takes too long, maybe the program promises to teach a set of skills that the prospective student can't even conceptualize. Which actually brings something else up...

We have to start from scratch.

So, what, like "not fast food?"

No, scratchier.

Oh, okay! Scratch!

Still not scratchy enough.

Now?

Almost there...

Uh...

Perfect.

A couple of months back, I had to take over an Introduction to Computers course at work. The class itself is pretty straightforward - the first half of it focuses on typing exercises, with the latter half focusing on using Microsoft Word to write a basic letter. Based on my previous IT consulting experience, I knew there was a better-than-even chance I might go too fast, so I encouraged the students to raise their hands and stop me if that happened. It's not because the class was stupid or I was so smart - it's just that, well, my father's a programmer who got the family a computer when I was three, which was over thirty years ago. Long story short, I've been eating, drinking, breathing, and sleeping computers almost since birth, while being partially raised by someone who'd been making a living doing the same for longer than I'd been alive. When you're exposed to something that long and that early, there are a lot of assumptions you take for granted, like "double-clicking on something opens it", "Delete and Backspace do different things", "the left mouse button does something different from the right", and "you really don't need a mouse to use a computer", among other things. Trouble is - and I knew this from working in the field - most people haven't spent literally decades internalizing these lessons.

Don't get what I mean? Perhaps this will help:

Bear in mind that the computer they were struggling with is not only the same computer I had at home (more or less - ours was an Apple ][e, not the Apple ][+ shown in the video), it was also the same computer that I and my classmates saw in every single computer lab from 1985 until the early '90s. Seriously, those things were the Volkswagen Beetles of the early educational computing world. It's hard to overstate how ubiquitous those things were in schools. If you put one in front of me right now and told me to use it, I'd be halfway to finishing this blog post on it in no time. The kids in the video, however, never had to deal with a computer like that, so all the things I took for granted growing up - flipping a disk over, inserting a disk in general, typing simple commands at a prompt to make something happen (yes, I remember PR#6) - are completely lost on them. They never had to deal with it.

Okay, maybe that's not a fair example. How about a Sony Walkman? That's easy enough, right?

With a little bit of trial and error, they eventually figure it out... sort of. Again, though, when you were raised around these things and everyone had one, it's easy to forget that there's really nothing particularly intuitive or obvious about a cassette tape player if you've never held a cassette in your hand in your entire life.

Now, pretend those weren't naturally inquisitive kids figuring out a Walkman but were instead adults - many of them somewhat older - trying to figure out a computer for the first time.

Wait, what's that? The first time?

How is it possible for any American to use a computer for the first time in the 21st century, you ask? Does Northern Nevada have a surprisingly large Amish population or something? No, and that's not the crazy part. Most people in the class I was teaching who hadn't meaningfully used a computer before were younger than me. A lot of them were students in their early 20s. Maybe a little younger. The older students had at least used a computer or two at work at some point in their lives.

But, what about Computer Literacy courses? Computer labs in elementary schools? Did their parents sign permission slips exempting them along with the sex ed waivers or something?

No, but here's the thing - and, if you're the type of person to read blogs, this is going to be near-impossible to believe - a lot of people, and I mean a lot of people, don't actually have computers at home. Even now. Or, if they do have a computer at home, they rarely power it on and never, ever let their children touch it. After all, if you're a financially struggling parent with minimal computer literacy, spending a few hundred dollars on anything is a stretch, much less something you barely understand how to use. Add in a few moral panics about social media and viruses and you'll be firmly convinced that, if your child touches your computer, they'll break your expensive computer and get kidnapped on the way to school by a sexual predator they met online. Naturally, after being told by their parents that "computers are forbidden" over and over again, when children actually get a chance to touch one at school, they'll either be too afraid to do anything or, just as likely, they'll simply ignore it at school, just like they learned to at home.

So, what we have to do where I work is not only teach people who have never had a real chance to use a computer before how to use one - all jobs these days require computers in some capacity or another - we also have to overcome an initial fear of computers that has been purposefully instilled in them their entire lives. Adding insult to injury, we have exactly six weeks to somehow accomplish this educational and psychological breakthrough; if we don't succeed, we have to fail them and they'll have to retake the course, only next time they'll be carrying the added internal shame of failing one of their first courses. Not surprisingly, that really doesn't help.

This, coincidentally, comes back to why some of our best-paying programs are also our least popular - the ones that pay the most use computers extensively, whether to keep networks running, manipulate robots, or something else entirely. The most popular programs are the ones that use computers in their curriculum as little as humanly possible.

This looks like as good a stopping point as any - next time, I'll discuss what happens to the students when they're done and some of the things we have to be careful about along the way, along with a little more detail about the experience of actually working at a for-profit college.

Thursday, August 28, 2014

Last night, the local news in Reno announced that Morrison University, a for-profit university, was closing its doors due to Anthem Education, its corporate parent, filing for Chapter 11. This follows the extended meltdown of Corinthian Colleges, which is currently undergoing a criminal probe, ongoing financial issues at ITT Tech, and increasing pressure on the University of Phoenix over how it conducts its operations and the programs it offers to veterans. Since I've been working at a for-profit college for almost a year now, I'm understandably... not alarmed, but concerned about the direction my employer's industry is going. While reading about the state of the industry I work in, however, I noticed something missing - our side of the story.

So what's it like working at a for-profit school, anyway?

Before I begin, I'll point out that I'm not a cheerleader. If you want to hear stories about how for-profit schools change students' lives for the better, lead them to better opportunities, and so on, contact the admissions department of any nearby school. They'll be happy to chat your ear off about all of the success stories they've seen, tell you all about our placement rate, and so on. That's not why I'm writing this today. I'll also point out that I'm rather fond of my job and rather fond of my employer, so if you're looking for a harsh exposé on the for-profit college sector, that's also not why I'm here. There are plenty of those already anyway. So, with that, let's begin.

Why do for-profit colleges exist and why do students enroll in them?
[NOTE: When I first started this post, I expected this section to be a couple of paragraphs at most. As I started writing, I realized this section deserved its own post, so I'll pick up on what it's like to work at a for-profit college in subsequent posts.]

In many countries, students get the Sorting Hat placed upon them fairly early in life - in their early teens, if they're academically inclined, they'll be placed on a university track, where students experience demanding college prep courses. If they're not, they're placed on a vocational track, where students receive training in various careers, ultimately focusing on a particular career of their choice by graduation.

The United States, however, doesn't do this for a variety of cultural and financial reasons. Culturally, we're highly allergic to telling people what they can and can't do, and this allergy turns anaphylactic when we talk about our children. No parent wants some faceless bureaucrat (okay, highly accessible public school teacher or school administrator, but they're all government workers and government workers in America are automatically "faceless others" by default, right?) telling them that their child isn't good enough to go to the best schools and become cowboy astronauts, even if - no, especially if - that's true. Financially, meanwhile, it's a lot less expensive to hand a student a book or two and place them in a desk than it is to buy expensive shop equipment and hire instructors at halfway competitive wages away from industry jobs to teach students how to use the shop equipment without inadvertently lighting themselves on fire. Put the two factors together and you get a muddled mess of an education system where, theoretically, every student is placed on a "college bound" track, but where in practice, a majority of American college students have to take remedial education courses after high school to prepare themselves for college-level coursework and absolutely nobody comes out of high school with any sort of useful trade skills.

Why didn't school teach me what these white plastic things are and how to open them?

Once students graduate and become adults, however, we're more than happy to let them sort themselves as they wish. Trouble is, most companies would rather spend as little time as possible training employees and more time getting useful, productive, profitable work out of them. Consequently, most jobs that pay tolerably well require some sort of post-secondary training, either in the form of a college diploma, an accepted industry certification, or some other proof that the prospective employee has spent some of their own time and money on learning how to do the job they're applying for.

This job requires a Master's in Computer Information Systems, CompTIA A+ and Security+ certifications, and at least five years of previous job experience.

To meet this need, the United States has two parallel tracks of post-secondary education available. The first track is the traditional university system, which descended from the old universities in Europe and New England and has since been radically expanded; the second track is vocational schools. Both tracks have been around for quite some time and have interbred considerably; land-grant universities, for example, were originally conceived as vocational-oriented universities. After World War II ended and the Cold War started in earnest, however, the United States, noting that people with university educations were generally paid better than their vocational counterparts and noting that we needed more scientists to blow stuff up, collectively decided to throw as many people into the university system as possible.

There have been a few problems with this, though.

For starters, universities were originally designed to teach students how to embrace a "life of the mind". This had roots in universities' original role as "finishing schools" - they were more about the connections made between fellow aristocratic and wealthy classmates and instilling similar interests than actual educational attainment or (the horror!) learning how to work. Since the Enlightenment and the Industrial Revolution, universities have largely transitioned from a philosophical "life of the mind" to a focus on theory - a scientific "life of the mind" - over practice, educating students on how the world should be, at least from the view of the Ivory Tower. In order for this focus on the theoretical over the practical to be possible, universities, which were already naturally insular to protect their aristocratic clientele, increasingly became worlds unto themselves, as separated as possible from the practical world outside; this separation, when taken too far, has led to several far-reaching consequences.

Another issue with university educations is cost - not just money, but time. Since university educations focus on theory, a student doesn't just need to understand the theoretical underpinnings of whatever field of study they choose to follow; they also have to understand some of the theoretical underpinnings of those theoretical underpinnings. For example, a Computer Science student, who would normally end up as some sort of programmer after college, doesn't just need to know the theory of how programming languages work - they also need to know the mathematical theories that underlie the theories that underpin programming languages. Otherwise, there are large sets of programming problems that are simply unsolvable for them; linear algebra, for example, is used extensively when dealing with large, sparse data sets, like simulating whether there are dust particles in a particular cube of air, or running large demographic simulations. For most students, this takes a minimum of 4-5 years, with many Computer Science students these days opting to spend 2-3 more years in school to get a post-graduate degree of some sort. This sometimes means professionals aren't even starting their careers until their mid-to-late twenties; for that to be financially possible, either someone in the family needs to support the student, the student needs to receive funding (usually student loans) from another source for several years, and/or the student has to pick up part-time employment somewhere to make ends meet.
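To make the linear-algebra point a little more concrete, here's a toy illustration (my own, not from any curriculum): a sparse matrix stored as a dictionary of only its nonzero entries, which is the kind of representation that makes enormous, mostly-empty data sets tractable in the first place.

```python
# A toy dictionary-of-keys sparse matrix: only nonzero entries are stored,
# so a million-cell matrix with three nonzero values costs three dict slots.
def sparse_matvec(entries, vector):
    """Multiply a sparse matrix {(row, col): value} by a dense vector."""
    result = [0.0] * len(vector)
    for (row, col), value in entries.items():
        result[row] += value * vector[col]
    return result

# A 1000x1000 matrix with just three nonzero entries out of a million.
matrix = {(0, 0): 2.0, (0, 999): 1.0, (500, 500): -3.0}
vector = [1.0] * 1000
print(sparse_matvec(matrix, vector)[0])  # 2.0*1.0 + 1.0*1.0 = 3.0
```

Without the linear-algebra background, it isn't obvious why (or when) this trick works; with it, the dense and sparse forms are visibly the same object.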

Finally, there's the simple matter of aptitude. Not everyone is interested in going to school for several years as an adult, no matter what the financial end result might presumably be. Not everyone possesses the necessary academic talent. Not everyone has the required time management or study skills. Some people are just practical-minded people and have no interest in living a "life of the mind" - they want to go out and do something without spending years of their lives dedicated toward establishing why that something did anything in the first place. There's nothing wrong with this - though it certainly doesn't hurt for, say, a well driller to understand the underlying geology that they're drilling into, it's usually not imperative for them to understand it as fully as a geologist.

So what's the alternative? In a theoretically ideal world, public community colleges. They're cheaper than universities and, frankly, cheaper than for-profit vocational schools. An Associate's degree usually requires only two years, which is considerably shorter than the four or more required for a Bachelor's or a Master's. They're usually a little smaller, which keeps them from getting quite as insulated as larger universities. The academic coursework is usually considerably more relaxed. So what's the problem?

In Nevada, the community colleges are run by the same system that governs the universities, and that's a problem for a couple of reasons. First, the people running universities are going to be more interested in running universities than in making sure a bunch of students who wouldn't make the cut at their carefully manicured and groomed institution of higher learning actually get a decent education. Secondly, when the people running universities actually take an interest in running community colleges, they'll do what anyone else who spent their lives not running community colleges would do - they'll run the community college like a university. Consequently, if you want to study to become a bricklayer, you'll need 3 Credits of "Diversity", 6 Credits of Communications/English, 3 Credits of Social Science, 3 Credits of Human Relations, 3 Credits of US and Nevada Constitutions, 3 Credits of Science, and 3 Credits of Mathematics, most of which need to be taken before you can even think about taking a class that lets you touch an actual brick and lay it on something. The goal, of course, is to provide a "well-rounded education" for someone who just wants to learn how to lay bricks; that these "General Education" requirements happen to provide teaching jobs to otherwise employment-challenged liberal arts university graduates is just a nice side effect (or, if you're feeling cynical, a textbook example of regulatory capture).

This is where for-profit vocational schools step in.

Unlike a public community college, a good for-profit vocational school is accountable to two groups of people - the prospective students that want to get in, get the skills they want and only those skills, and get out, and the people that are interested in hiring those students. The result, when everything goes well, is a faster paced, far more focused education that provides the students exactly with the skills they need to find work in the field and provides employers with fresh labor with precisely the skills they're looking for. Where things go wrong is when the school stops listening to one or both of those two groups, usually because the school is busy listening to a third group. In the case of the schools listed above, all of them share one common trait - they are also accountable to these guys:

What's better than a public school? How about a publicly traded school, preferably funded through piles of government debt? What could possibly go wrong that hasn't gone wrong with the economy since, oh, 2008 or so?

The trouble with public stockholders is most of them are interested in making a profit right now, not after a few years of steady growth. Since most stocks don't pay dividends anymore, this puts pressure on management of any publicly traded company to pursue business decisions that boost profitability immediately, thus quickly boosting the stock price, regardless of what that'll do to the company or its customers in the long term. For a publicly traded college, this means letting anyone with a pulse into any program they want, regardless of whether they have an aptitude for that program or whether that program will lead to a job, and jacking the tuition as high as someone is willing to pay. Sure, it'll lead to the college's reputation in the employment community becoming more radioactive than Pripyat and it'll lead to most of the college's graduates' credit scores becoming more radioactive than Mayak after they refuse to pay back loans for a college education that didn't do them an ounce of good, but it'll make the stock look good right now, which will let the stockholders sell right now, which will let them profit...

Right.

Luckily, my employer isn't publicly traded.

Tomorrow I'll dig into what it's like to actually work in a for-profit college, along with how it's different from a more traditional university approach.

Friday, August 15, 2014

Ah, the joys of late nights at work. It's quiet enough to hear a server bearing spin, at least until you walk outside to go to the bathroom and the alarm goes off. It's good to work with smart, conscientious, security-minded people who do things like arm building alarms while IT is still working in the building. That's all right, though - at least I know nobody's going to take me by surprise while I'm here. The mad sprint down the stairs to the security panel so I can move around the building without the police showing up in short order (and I mean short - I could walk to the police station in under 15 minutes if I was so inclined) is just a nice exercise-inspiring perk of the job.

What made tonight particularly interesting was the maintenance I decided to perform. We have a small VMware ESX infrastructure at work and, for the past few weeks, it's been acting up a bit. While digging into the problem, I noticed that there were several discarded inbound packets being registered by our stack switch; some Google-fu suggested that the problem could be due to a combination of factors, but most of them pointed at the network drivers for both the guests and the Broadcom-equipped ESX hosts. Since the virtual machines were configured with bog-standard Intel E1000 cards, and since VMware's documentation suggested that the VMXNET3 virtual NIC has a higher performance envelope, I decided to swap virtual NICs as well. Having more than a little experience with hardware-independent restores, I knew that changing virtual NICs in SQL and Active Directory servers was non-trivial - in my personal experience, I've found the following instructions useful:

Reboot the machine into either Directory Services Restore Mode or Safe Mode with Networking.

So that's what I did on our VMs this time as well. Upon rebooting, however, I noticed that my servers - especially the SQL servers, for whatever reason; this problem was considerably less common among file servers and domain controllers - wouldn't keep their static IP addresses. Instead, the server would arbitrarily assign itself a 169.254.x.x (APIPA) address. Interestingly, I wasn't alone - in fact, this has apparently been a problem with ESX-hosted Windows servers for a while, if the 2006 date stamp on the start of that thread is any indication. Even weirder, on other servers, it would keep the assigned IP address but randomly drop the gateway.
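For anyone unfamiliar with "169 addresses": Windows falls back to a self-assigned link-local (APIPA) address in 169.254.0.0/16 when it can't use its configured one, which makes the failure easy to spot programmatically. A quick illustrative check (not a script we actually run):

```python
import ipaddress

# APIPA/link-local range: Windows self-assigns an address here when the
# configured static address doesn't take effect.
APIPA_NET = ipaddress.ip_network("169.254.0.0/16")

def is_apipa(addr: str) -> bool:
    """Return True if `addr` is a self-assigned link-local address."""
    return ipaddress.ip_address(addr) in APIPA_NET

print(is_apipa("169.254.33.7"))   # True  -> the VM lost its static IP
print(is_apipa("192.168.10.5"))   # False -> configured address survived
```

A monitoring job could run a check like this against each server's reported address and flag any VM that silently fell back to APIPA after a reboot.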

The good news, if you want to call it that, is that these issues are common enough for VMware to offer KB articles on them (KB 1016878 and KB 2012646).

What I ended up doing on the affected machines was closer to the spirit of the second article (2012646) than the first. Following VMware's suggestion, I went to HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\services\Tcpip\Parameters\Interfaces on a working physical server with a static IP address and compared it against the same key on a misbehaving VM. What I found, once I isolated which GUID corresponded to the network interface on the VM that needed to be configured, was that the following values were missing:

Name: IPAddress
Type: Multi-String Value (REG_MULTI_SZ)
Data: Corresponds to each IP address used by the server; appears to be comma-delimited.

Name: DefaultGateway
Type: Multi-String Value (REG_MULTI_SZ)
Data: Corresponds to the IP address of the default gateway for the NIC. I don't use multiple gateways on any of my NICs, but NameServer and IPAddress appear to be comma-delimited, so I would assume this one would be as well.
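If you wanted to script this repair instead of doing it by hand, a sketch might look like the following. Note the assumptions: Python's winreg module exposes REG_MULTI_SZ data as a list of strings, so the values are modeled that way here; the addresses are made-up placeholders, and the actual Windows registry write is shown only as a comment since it requires the affected server's interface GUID.

```python
# Sketch of the repair described above: rebuild the missing IPAddress and
# DefaultGateway values for a NIC's Tcpip\Parameters\Interfaces\{GUID} key.
# REG_MULTI_SZ data is represented as a list of strings, which is how
# Python's winreg module models it. Addresses below are placeholders.

def build_interface_values(ip_addresses, gateways):
    """Return the two REG_MULTI_SZ values as {value_name: list_of_strings}."""
    return {
        "IPAddress": list(ip_addresses),
        "DefaultGateway": list(gateways),
    }

values = build_interface_values(["10.0.0.25"], ["10.0.0.1"])
print(values)

# On an actual Windows server, one might then write the values with winreg:
#   import winreg
#   path = (r"SYSTEM\CurrentControlSet\services\Tcpip\Parameters"
#           r"\Interfaces\{GUID-of-the-affected-NIC}")
#   with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, path, 0,
#                       winreg.KEY_SET_VALUE) as key:
#       for name, data in values.items():
#           winreg.SetValueEx(key, name, 0, winreg.REG_MULTI_SZ, data)
```

Keeping the value construction separate from the registry write makes it easy to sanity-check the data before touching a production server.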

After manually assigning the values above, I installed the hotfix recommended in KB 1016878 and rebooted each affected VM a few times to make sure the changes stuck. I'm happy to report that, at least so far, everything appears to be more or less stable.