
I recently heard a news commentary that referenced a story about a professor who taught his class about socialism by applying it to their grades. Supposedly, the outcome was progressively diminishing grades until everyone essentially gave up and failed the class. Disbelieving the account, I looked it up on Snopes.com.

Apparently the account is a legend, probably written as an object lesson by a staunch capitalist and circulated via email for years before being posted on various websites in various forms. In further research, I found postings where hot debate ensued between socialists and capitalists about the meaning of the story, its validity (assuming it was even true), and whether that evil professor wasn’t breaking a number of rules by grading that way. I also found an amusing reversal in which the professor graded capitalistically: the wealthy kids ended up buying their way to all the A’s, the middle-class kids had to drop out to do the work for the wealthy, and the poor kids got neither A’s nor money for their work.

Among the primary criticisms I read of the socialist grading story was that it wasn’t true socialism. Those making this point argue that those with greater ability should (or would) tutor the kids who struggled until everyone got an A, or at least a B. Other criticisms hold that this wasn’t real socialism for one reason or another, often invoking that old “means of production” line.

But let’s be clear. If you boil socialism down to its core principle, you get the following: equal apportioning of the outcome, contributed to by the whole, to each individual. Since socialism is intended to work in a society, the ideal is that everyone would get equivalent pay for equivalent work, whether they’re farmers, laborers, businessmen, scientists, or whatever. Whatever people produced would be taken by the whole (be that the government or some other organization), divided up fairly and equally, and each would get his or her share.

As an ideal, I like socialism. Under socialism we’d pay anyone who worked in the basest jobs, those necessary for our society to exist, the same thing we pay everyone else. After all, we need them to do those jobs so we can do these jobs right? Of course, right.

The socialist grading story actually does that, but in one dimension. We can debate whether the students should have helped one another or not, but in the story they didn’t. More importantly, in historical attempts at socialism, they didn’t, so that aspect of the story is valid. The point is this: some students were able to produce a lot (i.e., get an A), while other students weren’t so able (i.e., B through F). Likewise, in a society under socialism, some will not be able or willing to work and must necessarily be subsidized by those who produce more than they do. By averaging the grades, the professor in the story does in one dimension what socialism would do in all dimensions: take from all what they produce, average it, and return equal parts to each.
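The professor’s scheme reduces to a few lines of Python (the scores here are invented for illustration):

```python
# Hypothetical scores: what five students earned on a test.
grades = [95, 88, 72, 60, 45]

# The professor's scheme: pool what everyone produced, then hand
# every student back an identical share -- the class average.
average = sum(grades) / len(grades)
shared = [average] * len(grades)  # everyone receives a 72
```

The student who earned 95 and the student who earned 45 both walk away with the same 72, which is exactly the redistribution the story describes.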

So far so good? The story could have continued with the smartest students continuing to perform well. Other students, buoyed by not getting failing grades, might then have worked harder to bring up the average and the class would have taught that mean old professor a lesson. Of course, the story doesn’t go that way, and neither does society.

Welfare has been a significant cost to our country for decades, and it wasn’t until limitations were put on it that the rolls of those receiving it finally diminished. Why is that? Welfare is a little like socialism: we take from those who have and give to those who don’t. On the surface that’s good. But what happened with welfare? While, yes, many people are ashamed to be on welfare or want to support themselves and get off it as quickly as possible, there was an increasing number of people who just wanted free money so they could do nothing. The same is true for all of society. If you tell everyone that when they don’t work they’ll be supported by those who do, a certain fraction of the population will drop out and live on the work of others.

So, the writer of the story was not wrong to project that the students who did poorly but got good grades (relative to what they deserved) wouldn’t work harder. But what about those students who earned an A on the first test but got a B?

Here we get to incentive. Some people are naturally hard working; they do as much as they can because that’s just what they do. Most of us, though, need incentive. In school, I worked hard because I wanted that A in every possible class. If just learning stuff had been good enough, I probably wouldn’t have worked as hard. The same is true for society at large. There are a number of professions that are extremely demanding and hard, and that require incentives for there to be enough practitioners to benefit society.

Doctors, for example. They have many years of training beyond what most of us get, just to be able to do what they do. Then they have to deal with difficult problems, difficult people, long days, limited personal and family life, and so on. Without incentive (i.e., more money to be made), most doctors wouldn’t have bothered. Yes, some would have, but there’d be a severe limitation on availability. So, the writer of the story was correct to depict the students who initially performed well backing off and performing progressively worse on the tests.

These two aspects of socialism, the free ride and the lack of incentive, are the backbone of the argument against it. True, socialism doesn’t espouse these qualities in its ideal definition, but they come as part of human nature.

So my conclusion here is this: The story is a good object lesson about socialism, not executed according to the ideal, but according to history and human nature.

Recently, at work I was given the task of improving a piece of software that ran slowly, consumed far too many resources, and was far too rigid to perform any more than the single task for which it was originally created. As I began looking into the code, I discovered that, worse than all that, the code was an utter mess and there was absolutely no way I could make even the most marginal of improvements without starting from scratch.

So I did. While I worked on recreating one part of the software, a colleague of mine worked on another. Eventually, I finished my parts and got to testing and tuning his. All along, I had continually tested my new code and ideas against the existing system to ensure the new product was better than the old one. With few exceptions, my new code ran at least twice as fast as the old, consumed fewer resources, and was flexible enough to handle the original task as well as a wide range of current and potential future tasks. I was pleased.

Then I got into the code my colleague had worked on. There didn’t seem to be anything wrong with it on the surface. Given the ideal, and even the least ideal, circumstances we imagined the software might face, it appeared it would run faster than the old code, perhaps even two or three times faster. Yes, we had to admit it would consume much more memory, but the speed improvement would be worth it, as the old software was far too slow.

But once I got into testing against the old software (as I had done for the parts I worked on), I came to a stunning and altogether disappointing realization. First, we were consuming far too many resources, bringing the hardware to its knees even more than the old software did. Second, and far worse, we were running as much as seven to ten times slower.

So what happened? Even in far-from-ideal circumstances, the new software should have easily outperformed the old. The answer was simple: circumstances were even farther from ideal than we had thought. This made our software (which was smart enough to deal with even the worst case) take much more time and many more resources dealing with the terrible cases it faced. Unfortunately, the terrible case was the standard case, and our beautifully crafted software was far from adequate for the task.

So I worked tirelessly (then tiredly) for weeks trying to squeeze the performance we needed out of the code my colleague had written. After all, we couldn’t go back to the old software, since it was a nightmare to maintain and too rigid for even current tasks. But finally, I had to admit there was nothing I could do.

Nothing, that is, short of rewriting it yet again myself. So I finally threw away the code my colleague wrote and started from scratch a second time. Carefully considering how the old software did the task and comparing that to the flexibility of our attempted solution’s design, in the space of just two days I wrote a solution that not only consumed fewer resources but ended up running two to three times faster than the old software.

Yay for me, right? Sort of. If it hadn’t been for the original implementation, I wouldn’t have known what works well. If it hadn’t been for the second implementation, I wouldn’t have known how bad the circumstances the software faces really were, and wouldn’t have seen how much they can affect performance. While I’d like to claim a stroke of genius here, I pretty much have to say it was the process of elimination: we had already done everything that wouldn’t work, then I did what would.

This, it seems to me, is like the health care debate and bill in the United States Congress today. Clearly, the current system isn’t working for everyone, and increasingly isn’t working for anyone. It is slow, cumbersome, expensive, and getting more expensive at a rate no one can afford in the long term. The federal budget already devotes a huge percentage to health-care-related programs and will have to devote still more money to them, or else abolish or diminish them. So, President Obama rightly made it his agenda to “fix” it, asking Congress to create a piece of legislation that would somehow make things better.

There are probably a lot of good things in the bill Congress eventually came up with. I hear there’s some sort of tax incentive, and rules against insurance companies dropping patients when they actually become sick. However, there’s enough that is, let’s call it “less good,” about the bill that public opinion has turned sharply against it. Examples of its “less good” qualities include throwing money at states wholesale just to win votes from their senators and representatives.

Does that mean we should keep the current system as is? Absolutely not. Many people still want reform, just not this reform. Like my software system, the original health care system isn’t working like we want and consumes far too many resources. But if the new system is only being approved by a slim margin (if indeed it gets that much) because votes were effectively purchased for it, how much better can we hope it will actually be? Reform yes, for the better? Maybe not.

I don’t expect us (by “us” I mean Congress) to just throw that bill away and let it go to waste. I expect us to start from scratch again, taking the lessons learned from the current system as well as from the bill, and try to make something beautiful. I don’t pretend to know what that new bill will look like, but whatever it is, it had better not resort to pork, special interests, or any other means of purchasing votes to get through either house. Health care is far too important a subject for any bill to be voted for or accepted on anything other than its merits.

Game developers long suffered the stigma of the lone, smelly, poorly groomed, overweight, pizza-eating bachelor who spends countless late-night hours in his (yes, his, not his/her) basement cranking out the next big thing in games. While this stigma may have been well earned in the 1980s, and even a little in the 1990s, the industry has since pulled away from it.

Still, game developers, like most software engineers, tend to be male, and most are required to work many hours under tight schedules with little if any additional compensation. As a result, many tend to have little or no family life (though that is far from all of them). Those who don’t sacrifice their personal or family lives for whatever project they’re on at the time find themselves first in line for layoffs when the inevitable budget crunch comes around.

But game developers have something that many software engineers lack: passion. It is their passion that calls them into game development, and it is their passion that sees them through the hard times and long work hours. Most of all, their passion helps them learn more about software engineering in a single project than many software engineers learn in a career.

Why? Because games are among the hardest software problems in existence. They must make use of the latest hardware for graphics, sound, and user input, and squeeze every drop of performance out of memory, CPU, file system, network, and other resources. They deal with concurrency, complex algorithms, artificial intelligence, and security. They are often distributed, run intricate simulations, predict user actions, and involve dozens of developers working cooperatively.

But on top of being extremely hard to create (relative to many types of software), games are fun, exciting, and sometimes even educational. Think of the last game you played. Do you remember how fun it was to play? Now imagine creating that game. I find my enjoyment of writing a game is at least tenfold greater than my enjoyment of playing one.

That is why I would recommend to everyone out there who is attempting to teach or learn computer science, software engineering, or any form of computer programming: create games. Students will learn more, they’ll enjoy it more, and they’ll be more driven to keep learning if they are making games. The best part is that games can be simple or complex. Basic games can involve only a little knowledge, take less than a day to write, and be tons of fun both to write and to play. More complex games will obviously take more time to create but will also teach the student more software principles. The learning here won’t just be an act of regurgitation, as so many homework assignments and exams are; these are true learning experiences.

So, go play a game and appreciate the hard work of those who made it. Software people, go write a game. You’ll learn more than you think, even if you’ve been in the industry your whole career.

It never fails to elicit an eye roll and a heavy sigh. The days of deities descending in a chariot to save the protagonist may be over, but “deus ex machina” has taken on a new (if you can call it new) twist and become the cliché that refuses to die.

I speak, of course, of how computers (and in particular software in any form) are portrayed in television and movies. You know what I’m talking about: the whodunit crime shows where the right computer software can enhance an image, clean it up, and add almost infinite detail if the right protagonist is manipulating it. Other examples include hackers and viruses that can break into any system, even the most secure, even those that have no external connectivity, and sometimes those of other times, races, worlds, and even galaxies. Yes, these computer programs are almost deified in their own right by screenwriters (and other writers too, I’m sure).

One “classic” example comes from Independence Day, when the lowly Earthlings uploaded a virus to the computers of a never-before-encountered alien race, dropping their impenetrable shields and allowing them to be destroyed by mere missiles and nuclear weapons. One might ask: how did this little virus some guy wrote know how to affect the alien systems? Is there any reason to believe the virus wouldn’t look like static noise to the alien system, if it saw it at all? How did they get the alien ship to even interface with the human technology? Well, fear not: it was a Mac!

The problem here is this: as a software engineer, programmer, and computer scientist myself, I find these plots and devices so horrifically fantastical that they completely destroy my ability to suspend disbelief. Whether the genre is basic fiction, science fiction, mystery, or any other, I see how poorly the writers understand computers, and it makes me wonder how badly they must understand law, police or military protocol, other sciences, and anything else that may be in the story.

If a computer-centric episode that gets it wrong occurs only occasionally, I can usually still enjoy the series (e.g., Stargate SG-1 and Atlantis seem to throw these in now and then), but when the dependence on utterly wrong and impossible computing takes place once or twice per episode (e.g., CSI and similar shows), I lose all ability to watch. The cringe factor becomes too large.

For all of you writers out there, I have only one plea: PLEASE STOP! Stop writing super viruses, hackers, and otherwise magical programs that can’t possibly exist. Unless you’re writing science fiction (and even then, if you don’t have a good explanation), just say no. Just as you can’t extract blood from a stone (for the simple reason that a stone doesn’t have blood), you can’t extract additional detail from a digital photo! It just doesn’t have detail beyond the original pixels. Yes, you can enlarge it, blur it a bit, then sharpen the image, but that’s not more detail; that’s less, with increased uncertainty, and you certainly can’t do it to the extent done on TV on a weekly basis to solve crimes. All of these things are blatantly wrong. They show ignorance. They mislead and misinform the public. But worst of all, they’re cliché.
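A toy example makes the point. Here is a naive nearest-neighbor “enhance” in pure Python (a grid of made-up pixel values stands in for an image, no real image library involved). The output has four times the pixels, but every value is copied straight from the original; no new detail appears:

```python
def upscale_2x(image):
    """Nearest-neighbor 2x upscale: each pixel becomes a 2x2 block."""
    out = []
    for row in image:
        wide = [p for p in row for _ in range(2)]  # duplicate each pixel horizontally
        out.append(wide)
        out.append(list(wide))                     # duplicate the row vertically
    return out

tiny = [[10, 200],
        [30,  90]]
big = upscale_2x(tiny)
# big is 4x4, but it contains only the values 10, 200, 30, and 90 --
# enlargement, not "enhancement."
```

Fancier resampling filters interpolate between those values instead of copying them, but interpolation is a guess derived from existing pixels, never recovered detail.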

I regularly find myself facing peers (ever since my college days) who are decidedly misguided, if not moronic, about the practical application of object-oriented programming. For those of you lay folks out there who don’t know what that means, it basically means programming where you model objects. Instead of data and functionality being separate in the code, you can simplify many things by putting them together. It sounds simple on the surface (trust me, it is simple), but some people get it so terribly wrong it makes me sad.

The premise, while an easy one for me, seems to elude many. You see, this coupling of data and functionality gives us programmer types several powerful tools. For one, it allows us to hide the actual data and algorithms we’re using so we can change things in one piece of code without hurting other parts of the software. That alone is HUGE. We can also reuse the capabilities of existing objects (we’ll call them “classes” from here on) through something known as inheritance. Inheritance is a mechanism through which a class can get the capabilities and data of another class completely for free! It can then specialize something in order to do a specific job better. There are lots of reasons for wanting to do this.

Let’s look at the example that makes me sad. It makes me sad because it comes sooooo close to describing how to use inheritance and why you’d want it, but instead of driving the point home, it kind of drives it to the neighbor’s house, where they have loud parties and sleep till noon.

The example is of cars, or automobiles. It often starts with “suppose we have a class called ‘Automobile’ that describes all the generic capabilities of an automobile.” With me so far? “Then suppose we want to describe a ‘Car’ as inheriting from ‘Automobile’ all those attributes, plus the idea that it’s small, has four wheels, and so on.” So far, so good. “Then we want a class for ‘Truck’…” Here’s where I start twitching. If you follow this example to its logical conclusion, you basically end up having a class for every type of thing out there.

So, what’s wrong with that? Well, for one thing, code tends to be easy for students to develop, so they’ll take this line of thought off the bridge, past the signs that say “warning,” without realizing they’re doing it wrong. They don’t realize that once that code is deployed, you can’t so easily go in and change it. Instead, for this car example, we should pretty much just parameterize the class “Automobile” and leave it there without any inheritance at all. That way, when Ford invents a new model of car, we don’t have to redeploy our software. If instead of “Automobile” we had “Vehicle,” we could do something else. For example, we could have a class “Boat” that described any kind of seafaring vehicle in terms of seafaring things, a “Car” class that did the same for land-bound vehicles, a “Plane” class for airborne ones, and so on. It’s still not a clean-cut example, however, since we may have vehicles capable of land and water use, or water and air, or all three. See? Examples are not so easy.
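To make this concrete, here is a minimal Python sketch (the class names and attributes are mine, invented for illustration, not from any real codebase). The parameterized class absorbs new models as data; inheritance is saved for genuinely different behavior:

```python
# Parameterized "Automobile": a new Ford model is a new object,
# not a new class, so nothing has to be redeployed.
class Automobile:
    def __init__(self, make, model, wheels=4, seats=5):
        self.make = make
        self.model = model
        self.wheels = wheels
        self.seats = seats

# Inheritance reserved for behavior that actually differs:
# land, sea, and air vehicles travel in different ways.
class Vehicle:
    def travel(self):
        raise NotImplementedError

class Car(Vehicle):
    def travel(self):
        return "drives on roads"

class Boat(Vehicle):
    def travel(self):
        return "sails on water"

# When Ford invents a new model, we just construct it.
new_model = Automobile("Ford", "Fusion")
```

The amphibious-vehicle wrinkle mentioned above is exactly why even this hierarchy is not clean-cut; a design built on composition (a vehicle holding a list of travel capabilities) handles that case better than inheritance does.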

This is where the student of object-oriented programming needs to make that intuitive leap from swallowing and regurgitating facts and ideas to actually understanding the intent behind them. The intent of these examples isn’t to show you where to use inheritance; it’s to show you how. And unless you can learn the how, you’ll never understand the where. Unfortunately, so many people I have come in contact with are stuck in regurgitate mode and have yet to make that leap.

I am continually astounded by what seems to be a very common practice among content providers, be they broadcast television, cable, radio, or internet video and music services. The practice is to play the main content (show, news clip, video, or song) at one volume and the advertisements at another. Usually the commercials are louder than the show, sometimes uncomfortably so.

To put this in context, I listen to a lot of content while at work, in the car, doing homework, exercising, or just spending an evening with my wife. I often find that I have to turn the sound up on the computer, radio or television to hear the show, particularly during any quiet parts, on phone interviews, etc. Then the commercials come and I’m forced to quickly reduce the volume or risk hearing loss.

So, why do they do this? Do they think through the consequences of this practice? Now, I suppose the various channels, stations, and websites don’t actually control the volume of the advertisements, and possibly not of the shows either, but it seems to me they should. I, for one, am a hundred times more likely to change the channel, mute the volume, or otherwise make an advertisement go away if the volume needed to hear the show makes the commercial uncomfortably loud, or even just much louder while still in my comfort zone. Compare that to commercials played at a reasonable volume, which I am likely to listen to at least passively while waiting for my show to come back or my video to start.

Add to this the fact that some internet music and video services now disable volume and playback controls while commercials are playing, and you get one (or possibly millions) of disgruntled viewers and listeners. I don’t quite see the purpose behind blocking pause or mute for an advertisement. If viewers can’t get to the desired content without viewing the ad, let them pause. If they think the ad is too loud, let them change the volume.

In fact, websites are in the unique position of being able to take feedback on this sort of thing. Hulu provides the option to give an ad a “thumbs up” or “thumbs down,” and I routinely give a thumbs down precisely because an ad is too loud. More feedback could be gathered, however. For example, sites could monitor when users mute or change the volume on specific commercials. By aggregating results from hundreds or thousands of users, they could automatically adjust the volume of a commercial to be less offensive to ears that are straining just to hear the show.
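As a sketch of what that aggregation might look like (the event format, ad names, and threshold are all invented, purely illustrative): collect one record per viewer per ad noting whether the viewer turned the volume down, then flag any ad where too many viewers did.

```python
def flag_loud_ads(events, threshold=0.5):
    """events: iterable of (ad_id, turned_down) pairs, one per viewing.
    Return the ads where more than `threshold` of viewers lowered the volume."""
    totals, lowered = {}, {}
    for ad_id, turned_down in events:
        totals[ad_id] = totals.get(ad_id, 0) + 1
        if turned_down:
            lowered[ad_id] = lowered.get(ad_id, 0) + 1
    return [ad for ad in totals
            if lowered.get(ad, 0) / totals[ad] > threshold]

events = [("ad1", True), ("ad1", True), ("ad1", False),
          ("ad2", False), ("ad2", False), ("ad2", True)]
# ad1: 2/3 of viewers lowered the volume -> flagged for adjustment
# ad2: 1/3 of viewers lowered the volume -> left alone
```

A flagged ad could then be served at a reduced gain the next time it plays, closing the loop without any viewer filing an explicit complaint.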

Last week my wife had a scary experience. We had some old bananas she decided to turn into banana bread. All was well; she mixed the batter, preheated the oven, and put the bread pans, filled with sweet batter, into the oven. Normally, a small loaf of banana bread bakes for 25-30 minutes before it should even be checked, and probably needs 35 to be done. Around 30 minutes into the baking, my wife, who had gone downstairs and away from the kitchen, noticed the smell of smoke. She ran upstairs to discover smoke pouring out of the oven.

Turning the oven off hastily and pulling the now blackened bread out and setting it aside, my wife couldn’t help but wonder what went wrong. She gave me a call at work to let me know what happened and get my insight as to what may have gone wrong. I double checked with her that she set the right temperature and time on the oven. She confirmed she did.

Then she noticed the top element of the oven was still on, glowing brightly with intense heat. She tried turning the oven off again, but the control panel already thought it was off. Yet the element continued to heat. I instructed my wife to go to the breaker box and switch off the breaker to the range. At our house, as I am sure is the case for many others, the range has its own breaker switch, so the only thing affected by the loss of power would be the range itself.

She complied, and we waited a few minutes, breathing a sigh of relief together. Hoping the computer in the range had just gotten confused, we thought we might switch the power back on and see what happened. Much to our disappointment, the top element of the oven came on again and began glowing brightly before my wife switched the breaker off once more.

After all that, we looked up the make and model of our range online to see what might be wrong. We discovered that many other people have experienced the same or similar failures with many models from our range’s maker. Some stories included fires, near fires, and owners not even being home when the oven came on. So, those of you out there looking for ranges: don’t buy Frigidaire or Kenmore (same company).

The problem, it seems, is either the computer, a switch in the control panel, or some temperature sensor somewhere. Either way, the range clearly doesn’t fail safely. I imagine the total cost of fixing it would be somewhere between $200 and $500, depending on the number and cost of the parts that need replacing (plus the cost of parts that don’t need replacing but that the technician wants us to believe do).

The cost of a new range, meanwhile, starts at $400 and goes up to $1,000 for the type of range we’d want (though ranges can cost well over that). The big question becomes: replace or repair? Normally, if something like this broke, I’d just want to fix what is wrong. But in this case I’m left to wonder: will this range, with whatever replacement parts, fail again? If it does fail, will it be as dangerous as this time? Will we even be home? Will we be asleep? Will it get so hot it starts a fire or causes a burn injury?

I’m sorry, Frigidaire and Kenmore. I just can’t trust my family’s safety, or that of my home, to you, not only with this range but with your entire line. I’ve read too many stories online about this problem and your denial of its existence. You should have learned to fix your mistakes early and to design safety into your system, even in failure. If your range immediately turned off and refused to turn back on at the first sign of trouble, I’d be more confident choosing you in the future. But a range that stays on, overheats, and becomes a fire hazard even when commanded to turn off is one I cannot trust.

So here is the true cost of failure, especially when it comes to safety. I can afford to replace my range, and that is a cost to me. But the true cost is that of confidence. Not only have I lost confidence in the safety rating this range supposedly met, but I have lost confidence in Kenmore to meet safety ratings on any of its models. I only hope I can still place confidence in another company to provide a range that won’t give me nightmares.

In President Obama’s State of the Union address last week, there were a number of statements that irked me, but the one that most clearly shows the liberal/progressive mindset was this: “And let’s tell another one million students that when they graduate, they will be required to pay only ten percent of their income on student loans, and all of their debt will be forgiven after twenty years – and forgiven after ten years if they choose a career in public service. Because in the United States of America, no one should go broke because they chose to go to college.”

Now, the idle or idealistic mind might look at this and think, “Isn’t this a great idea!” I must admit I sort of agree with some of the arguments that might come out in favor of this plan; however, I still believe it falls flat. Student loans are not a terribly lucrative business compared to other forms of loans. Currently they are often government subsidized, such that the interest rates are low, and students who get them don’t have to pay anything while they’re in school and can even get payments deferred without penalty if they come upon hard times. To me, this is already a pretty good deal. As a student, I don’t even have to have stellar credit, large quantities of disposable income, or even wonderful grades to get such a loan. I just go get a loan and pay it back in good times.

What’s the problem with that? I don’t know the statistics on people who actually go bankrupt because of student loans, but my gut feeling tells me that the majority of those entering bankruptcy who have student loans are doing so because of a mortgage, car loan, or credit card debt that pushed them to the brink, not their student loans. In fact, a brief search of the Internet seems to indicate that even in bankruptcy, student loans cannot be discharged unless the candidate can show that paying them off would cause “undue hardship” on the candidate and the candidate’s family and dependents.

So if bankruptcy isn’t the reason for this “free money,” what is? I almost shudder to think, but for now let’s assume it’s both noble and justified. First, allow me to explain why I call this “free money.”

Suppose a bright student applies and is accepted to a prestigious university in another state where the annual cost of tuition, room and board, fees, etc. is on the order of $35,000 per year. Let us further assume that the student is indeed able to graduate in precisely four years, paying a rough total of $140,000 for school. Let us then suppose that all of that was paid for by student loans.

Upon graduation (or a few months after), those loans begin to demand payments. Usually it is a small amount relative to the amount owed, in part because of the low interest rate. Under Obama’s proposal however, that amount would be capped at 10% of the student’s income per year, regardless of how high the interest might be or how large the debt.

Now suppose that student makes $50,000 per year (let’s use this as the average over the 10-20 years before the debt is forgiven). Further, let’s suppose the student never goes without a job or hits any hardship that would warrant deferment of payments.

Over the course of one year, the student would pay no more than (and possibly less than) $5,000 toward this debt. That means in 10 years, should the student choose a career in public service, the student will have paid $50,000 of the $140,000 debt, roughly 36% (forgetting interest). If the student chooses some career not under the “public service” label, $100,000 will have been paid over the 20 years, still less than 72%.
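The arithmetic above checks out in a few lines of Python, using the example’s round numbers and, like the example, ignoring interest:

```python
# Figures from the hypothetical above: $35,000/year for four years,
# a $50,000 average salary, and a 10% cap on annual payments.
debt = 35_000 * 4                # total borrowed: $140,000
income = 50_000                  # assumed average annual salary
annual_payment = 0.10 * income   # capped at 10% of income: $5,000

paid_10yr = annual_payment * 10  # public service: forgiven after 10 years
paid_20yr = annual_payment * 20  # everyone else: forgiven after 20 years

share_10yr = paid_10yr / debt    # about 0.36 -- roughly 36% repaid
share_20yr = paid_20yr / debt    # about 0.71 -- less than 72% repaid
```

Everything beyond those shares, plus all the accrued interest, is the “free money” someone else has to cover.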

So, what’s the problem here? It sounds like a nice deal for the student. Simply go get a huge loan to go to a fancy expensive school and not have to pay all of it back, let alone the interest. That’s free money for every student out there who can get these loans!

The problem is: where does that money come from? If it’s coming from the government, and only these loans are subject to the 10% cap on payments and the 10-20 year forgiveness policy, it means we, the taxpayers, are subsidizing the education of every student who gets these loans. I actually don’t have a problem with subsidizing education, but this isn’t the way to do it. Instead, our governments (state and federal) should be giving the money to schools directly, or to students as grants to help pay the costs of education, not in phony loans that act as grants in the long term.

If, however, this policy applies to loans made in the private sector, we have a bigger problem. Instead of creating a boon for students who just need a chance to get an education so they can finally get a leg up on life, the policy will dry up the very business of student loans that’s already helping people do that. If investors see that for every $1 they invest in student loans they can expect to get back (at best) about $0.75, that’s not an investment many people will be making. Loans will dry up, and students won’t be getting them.

That is where the liberal policy of “free money” takes us every time. Either the taxpayers subsidize an industry, practice, or “right” via roundabout means (again, I’m okay with the direct approach, depending on what we want to subsidize), or we create some kind of bubble that eventually pops and creates a crisis (sub-prime mortgage crisis, anyone?).

So, let’s not adopt this new “free money” policy. Let’s instead invest in education the way we used to: by giving scholarships and grants to deserving students (defined by academic or athletic ability, or just economic need); let’s fund our public universities and colleges sufficiently so average kids can afford to attend them; and let’s encourage universities and colleges to get their costs under control so they don’t grow at two to three times the rate of inflation, by regulation if necessary. Finally, let’s dispel the notion that money ever was, or can be, “free.”