
An anonymous reader writes "The Security Ledger writes that the expulsion of Ahmed Al-Khabaz, a 20-year-old computer sciences major at Dawson College in Montreal, has exposed a yawning culture gap between academic computer science programs and the contemporary marketplace for software engineering talent. In an opinion piece in the Montreal Gazette on Tuesday, Dawson computer science professor Alex Simonelis said his department forbids hacking as an 'extreme example' of 'behavior that is unacceptable in a computing professional.' And, in a news conference on Tuesday, Dawson's administration stuck to that line, saying that Al-Khabaz's actions show he is 'no longer suited for the profession.' In the meantime, Al-Khabaz has received more than one job offer from technology firms, including Skytech, the company that makes Omnivox. Chris Wysopal, the CTO of Veracode, said that the incident shows that 'most computer science departments are still living in the pre-Internet era when it comes to computer security.' 'Computer Science is taught in this idealized world separate from reality. They're not dealing with the reality that software has to run in a hostile environment,' he said. 'Teaching students how to write applications without taking into account the hostile environment of the Internet is like teaching architects how to make buildings without taking into account environmental conditions like earthquakes, wind and rain,' Wysopal said."

That doesn't really hold at the university level, where research is required in conjunction with teaching. In fact, it serves a twin purpose - research forces people who just want to teach to stay current in their discipline. Teaching forces people who just want to research to focus and order their knowledge so it can be understood by novices. High school teachers get out of date pretty quickly, but university professors (certainly in my experience) have to be on the ball.

Perhaps the real question here is "Is the field of academic computer science out of touch?"

Full disclosure: I am a robotics researcher ('lecturer', equiv. to an assistant professor) at a university; I'm on a fellowship, though, so I don't have to teach much!

I've never found that to be the case with university professors. In fact, most of the ones I ever knew did no research at all. They wrote textbooks and taught classes.

They still weren't useless. They knew the material they were meant to teach. But they were horribly out of touch. I still remember having these bizarre arguments with one professor who was sure open source was a brief fad, that it couldn't catch on in any meaningful way, but that if it did, it would be poison for innovation in the tech industry. I'd like to go back and do an obnoxious, "I told you so."

My experience was the exact opposite... I guess it depends on your university's priorities. I had professors teaching undergraduate courses who were not only doing serious research, but were often leading their field. Off the top of my head (it's been a while, but jeez looking at it in hindsight it is humbling):

Should a "typical university" have a CS department at all? Speaking as someone who works in a CS department where the academic staff have to produce research output as well as teach, it sounds like there are places which just ought to stop the pretense and to actually call themselves "Visual Basic Training Schools" or something. (Disclosure: I mostly don't teach, and instead do software engineering to turn the CS research into practical tools to support other research areas.)

As someone who recently used my knowledge of the 1920's Nyquist limit on a project, I'm pretty skeptical of this claim. I don't think the fundamentals of computer science change nearly as fast as you assume.
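For anyone who hasn't run into it: the Nyquist limit says you must sample at more than twice a signal's frequency, or different frequencies become literally indistinguishable. A quick Python sketch of the aliasing it predicts (frequencies chosen purely for illustration):

```python
import math

def sample(freq_hz, rate_hz, n=8):
    """Sample cos(2*pi*f*t) at rate_hz, returning n samples."""
    return [math.cos(2 * math.pi * freq_hz * k / rate_hz) for k in range(n)]

# Sampling a 5 Hz tone at 8 Hz violates the Nyquist criterion (you'd need
# a rate above 10 Hz), so its samples alias onto a 3 Hz tone (8 - 5 = 3):
five_hz = sample(5, 8)
three_hz = sample(3, 8)
# The two sequences are identical, sample for sample: from the samples
# alone you cannot tell which tone you recorded.
```

Nyquist worked that out in the 1920s, and it constrains every ADC, modem, and audio codec shipped today. That's what "fundamentals" means.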

And also a very good explanation. How on earth did they produce such a hopelessly stupid system? It was designed by people who weren't ready to engineer systems for real-world use.

I am a big fan of not blaming the victim, as a matter of moral principle. That's a great policy. But it's really crappy engineering design; building something that is designed to rely on the assumption that society can reliably provide perfect enforcement is stupid.

There's another layer of difficulty, which is that it is not always obvious whether something is a security hole or a permissive feature...

You know, we blame civil engineers when their buildings collapse, maybe it's time to start blaming computer "engineers" when their systems do. Now, I know first-hand how hard it is to design secure computer systems, and I'm well aware there's a fine line between "holding to account" and a witchhunt, but we're nowhere near that line as it stands.

In every single one of these stories I hear the mainstream media gasp about the "dangerous hacker." I see /. complain about morons who treat technical curiosity as an attack. But those comments outnumber, 10:1, the most important question, which you just asked.

How on earth did they produce such a hopelessly stupid system?

Maybe if we could get everyone asking this question, the conversation would shift.

The problem is not just in Software Engineering. Any applied field faces the same problem.

Think about it: in any university or college, NONE of your teachers actually possess the skill you are trying to acquire. Unless, of course, you want to become a teacher or an academic scientist.

Say you want to become a Software Engineer and you go to a college. There, a general-algorithms professor teaches you algorithms. A text-processing professor teaches you compilers. The same for operating systems, programmin

I think it was some time in the 1980s when there was a very strong push to get academics in applied fields to do some outside consulting or perish. Then there's academics such as the head of R&D at the company I work for - two days a week at university and the other three designing and improving equipment and techniques that are used in a commercial venture.

Get ready to have no free (gratis) software, as it would be ridiculous to donate one's time to write code for free if you could be held liable for mistakes. Get ready for your paid software to cost 10X more to cover the extra development "hardening" time it would all require to be less penetrable, and to cover the insurance policies software companies would have to take out to shield themselves.

You know, we blame civil engineers when their buildings collapse, maybe it's time to start blaming computer "engineers" when their systems do.

But we don't blame civil engineers when their buildings collapse after they get blown up by dynamite. It's not like these computer systems are just falling over from nature. They're under malicious attack.

Civil engineers do design for resiliency against severe failures and/or attacks. The World Trade Center [wikipedia.org] was designed to survive being hit by a Boeing 707. The terrorists used a 767, and simulations say the buildings should have survived the impact itself. The engineers did not design against a 767 filled with a full fuel load; in the 1960s, when the towers were designed, the 767 did not exist yet.

To be fair, the software people in the SCADA software industry have the same safety issues. SCADA systems are designed to fail gracefully in the event of

There is no such thing as a secure system. This applies to both physical and information security. There's always a way in. So that's a bad analogy to life-safety engineering, or at least a subtle one.

When it comes to security, there's no "secure" or "insecure", and the threats are rarely well understood, let alone well described. The important questions are "how much will it cost an attacker to gain access" and "how much will it cost an authorized user to gain access" and "how valuable is this anyway" and "what's the tradeoff in making this more secure". Sure, there are also just stupid, terrible designs when it comes to security, but the mere fact that an attacker gains access means little.

When it comes to life safety, the parameters are thoroughly described. The levee must withstand the winds and storm surge from a class 3 hurricane, this building must survive impact from a 707, whatever. If they fail under far worse conditions than they were specced for, that's not an engineering failure. It's rarely so clear when it comes to security (though, of course, sometimes the password is sent as part of a URL or whatever, and it is quite clear).

We blame civil engineers if their buildings collapse under normal use. We do not blame them if someone plants a bomb in the building. More to the point, we don't blame the architect if someone successfully breaks into your home.

In the physical world, there is NO SUCH THING as perfect security. You can't design a setup that someone else cannot overcome. All you can do is make it so hard that nobody would try, and multi-layered so you hopefully catch something if there is a failure at one level. There's no perfect security, no magic bullet.

Likewise there is nothing that is invincible, nothing that can withstand any and all attacks without problems. Everything has failure points, everything can be broken. You have to use things properly or they WILL fail.

We all accept this as part of everyday life. But when it comes to the virtual world, to computers, geeks seem to think things should be perfect. No system should ever have any security flaw, ever. No system should break or fail, even when subjected to deliberate attack. Everything should be built flawlessly.

Nope, sorry, doesn't work that way. While it is a lot easier to make things more resilient than in the physical world, you still have to assume that failure is possible, that flaws are present and not known. That is just life.

We blame civil engineers if their buildings collapse under normal use. We do not blame them if someone plants a bomb in the building. More to the point, we don't blame the architect if someone successfully breaks into your home.

If you are going to use an analogy use the right one. A computer security equivalent in relationship to a building wouldn't be the architect or engineer, it would be the company hired to provide security for the building such as the alarm or security company. And if there was a break-in

Go ahead and show me the home/business alarm you think will stop me. Go ahead. I can more or less guarantee you can't do it. The reason is I know quite a bit about how they work, since my grandpa has been in the business of selling them all his life, and how they can be defeated. Particularly if you are talking something public where you can look around innocuously and find out what is there. Ultimately they are at their core just a circuit board in a box that connects to sensors, sirens, and maybe a phone line. Break the board, they stop working. If you have one in your house open it up and see what's inside. It is simplistic, and not at all attack resistant other than the thin metal box it lives in.

For that matter, defeating an alarm really isn't necessary if taking something, like say physical data (files and so on) is your objective. All they do is make noise and if they are good ones, call a security company who will eventually call the police who will eventually respond (they aren't that fast, false alarms happen often). That doesn't stop people with guns from kicking in your door, grabbing what they want, and leaving.

Same shit with security guards. You ever have a look at the security that public places like office buildings and malls use? They are unarmed, and low paid. Their job is to call the police if shit happens. It doesn't take much to out-class them, you bring a pistol with you, you've already got them hopelessly outgunned. You think they are going to throw their life on the line if someone holds them at gunpoint? Hell no. For that matter there usually aren't very many. The mall near me has one car that patrols their parking lot at night (I overlook the parking lot). That is it for perimeter security. I don't know what they have inside, but you can bet it isn't much more (maybe not even anyone).

Physical security at homes and businesses keeps out the casual crooks, nothing more. Now, that's all they really face; people wouldn't bother with a targeted, planned attack, since there just isn't enough of value. They face low-level thugs who do vandalism, smash and grabs, that kind of shit. And oh, by the way, it DOES happen. The mall near me gets broken into at least once a year, usually dumbass teens just causing trouble, and the fact that they got in means security failed to stop them.

They don't get fired, their job isn't to stop everything, it is to report anything they see, and to drive around and look conspicuous (their car is marked, and has a flashing yellow light) so as to scare troublemakers off.

If your house has never been broken into, it isn't because you have amazing security. A burglar alarm and a crap lock do not make great security. It is because nobody has tried. The good news is most of us don't face much in the way of threats to security in the physical world. Nobody tries to break in, or attack us, or the like. It is quite uncommon.

Now that doesn't mean we should just be all lax with computer security, but it does mean that this silly demand of perfection needs to stop. Nothing is perfectly secure.

Well, in this case the programming failed under normal use. That is, it failed to keep people out.

We can easily secure systems such that the bad guys can't ever get in. Really. It's easy to do even. What is much harder is doing this while allowing the authorized users easy access. In the limit case of security, we just disconnect the system from the network and power it down: nobody will hack it then, and it is ever so easy to get right! But this is immensely inconvenient for people who are supposed to use the system. (To be fair, there are systems that have data so valuable that at least keeping them o

You don't, however, blame them when someone helpfully demonstrates that taking out support pillar 3A with TNT causes catastrophic failure of the building. I mean, yeah, maybe you blame them a little, but generally you get pissed at the guy holding the detonator.

You know, we blame civil engineers when their buildings collapse, maybe it's time to start blaming computer "engineers" when their systems do. Now, I know first-hand how hard it is to design secure computer systems, and I'm well aware there's a fine line between "holding to account" and a witchhunt, but we're nowhere near that line as it stands.

The problem is what happens when I design an application that's fine, but it must run over an insecure OS or insecure hardware, and something else with higher permissions compromises my application's data?

Though the real issue is that with a civil engineering failure, something big falls down. With a computer failure, someone sees something they shouldn't have. "no harm, no foul" is uttered way too much, but is how people treat a problem they can't see or understand. At least with the World Trade Center

Wow! What a creative comeback. Really, that was SO impressive!! "Dumbfuck!" Such poetry, and you managed an actual two-syllable word. Most impressive, can I use that? Whatever you're paying your writers, double their salary and give them 2 weeks in Hawaii. That was, dare I say, creative genius! Yes, yes it was.

I may never post again, there's no reason to now, for I have read the ultimate in rebuttals. Someone call the Fox channel!

While there are always outstanding mavericks, a lot of engineering departments are primarily staffed by brainy people who would make third tier engineers in the real world. Most people who are passionate about a subject area are itching to go out and DO IT. Yes, there are a few amazing brainy oddballs out there that have to be in academia. Yes, there are 5 or 6 CS departments like Stanford or UC Berkeley or Carnegie Mellon that probably do not fit that mold.

But Dawson College? A top notch computer scientist could be racking up six figures with a BS or MS. Who do you think works there and what are they paid?

You're making a very bad assumption that only poor professionals work in minor colleges.

There are countless reasons for working at one university rather than another, the simplest being that it's a place you like or where you have family. Another might be that it provides good promotion prospects rather than only dead man's shoes. And another big one is that it's not a place infested with prima donnas where the only option is to play second fiddle.

Computer science programs became trade schools for programmers when idiot HR departments made a CS degree a requirement for every coding monkey position. The fact that a computer science degree does not give its holder any knowledge of actual computers or real world programming does not bother HR drones because they do not have that knowledge either.

But that doesn't mean we need to deliberately hamstring ourselves either. No more than we should be asking people to work on a Wang for everything. We need to keep pace with the technology and culture (and the challenges those pose) like any other field. My first computer networks course was still teaching token ring and FDDI, because they were still relevant as in place technologies at the time. Today, if we talk about them at all, we gloss over them as historical concepts.

All that happened was some young hotshot did something the dept forbids. He paid for that, end of story. How you go from there to "CS depts out of touch with today's world" is beyond me, but then again I'm not some CTO either.

Because the young hot shot wasn't doing anything nefarious, and when he first reported the vulnerability he was praised. It's only when he determined that no one was doing a fucking thing about the vulnerability that he got kicked out.

Dawson is not a university. In Quebec, "College" and "University" mean different things. Dawson is a CEGEP, which is a mandatory level of education between high school and university.

CEGEPs in Quebec have two kinds of programs. Two-year pre-university programs can be considered to replace the final year of high school and the first year of university (as in, high school and university are both one year shorter in Quebec). They also have three-year programs (like the computer science program Al-Khabaz was in), which are vocational degrees intended to prepare a student for the job market rather than university. Graduating from either type of program grants you a degree called a DEC ("Diploma of College Studies" in English), which also happens to be required for admission to any university.

Many students, however, do what I did, and get a three-year vocational compsci DEC and then go to university and get their BCompSc. Yeah, it takes you an extra year (as compared to the pre-uni DEC), but CEGEP is the first time as a student that you get to study what YOU want instead of what the government says you must take, and I had a fantastic time.

From the article: "Two days later, Mr. Al-Khabaz decided to run a software program called Acunetix, designed to test for vulnerabilities in websites... A few minutes later, the phone rang... It was Edouard Taza, the president of Skytech. He said that this was the second time they had seen me in their logs, and what I was doing was a cyber attack."

Yea, see, this is why insecure.org has warnings to not run nmap against resources that you do not own: It is generally considered nefarious, ill-advised, and possibly illegal. Yes, pen-testing other people's stuff will land you in trouble. Should he have been expelled? Maybe not, since he was clearly trying to expose a vulnerability, but he should have known better and hopefully now he does.

Probably also should not have signed that NDA and then gone on to break it, but then I'm no lawyer. Probably should have just said "yeah, I sign nothing till I have representation."

If you do not have a job / contract with someone to pen-test, act as a "tiger team," check for physical security breaches, etc., DON'T.

Unfortunately for Ahmed his method of verifying that the problem had been fixed was de facto a DOS attack:
“The attack made the College portal extremely unresponsive for its thousands of users. Had it not been countered, it would have put the College portal out of order for the entire students and teachers population of Dawson...."

I have a real problem with this quote. If I were running a relatively high-volume service (and apparently College Portal's user base is somewhere around 250,000) and a single user were capable of DOSing the entire service, I'd characterise that as a flaw in my service.

I might not thank the kid who brought it to my attention, but I sure wouldn't trumpet the fact that my service is as brittle as all that.
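For what it's worth, the standard mitigation for that class of brittleness is per-user rate limiting, so one account can't soak up the whole service. A minimal token-bucket sketch in Python (the class, names, and numbers are my own illustration, not anything Omnivox actually runs):

```python
import time

class TokenBucket:
    """Per-user token bucket: each user may make `rate` requests per
    second on average, with bursts of up to `capacity` requests."""

    def __init__(self, rate, capacity):
        self.rate = rate
        self.capacity = capacity
        self.buckets = {}  # user -> (tokens_remaining, last_refill_time)

    def allow(self, user, now=None):
        """Return True if this user's request should be served."""
        now = time.monotonic() if now is None else now
        tokens, last = self.buckets.get(user, (self.capacity, now))
        # Refill tokens in proportion to elapsed time, capped at capacity.
        tokens = min(self.capacity, tokens + (now - last) * self.rate)
        if tokens >= 1:
            self.buckets[user] = (tokens - 1, now)
            return True
        self.buckets[user] = (tokens, now)  # throttled: no token spent
        return False

limiter = TokenBucket(rate=1.0, capacity=3)
# Three rapid-fire requests pass, the fourth is throttled,
# and after two seconds of refill the user is served again.
```

A scanner like Acunetix hammering an endpoint would hit the bucket's ceiling almost immediately instead of taking the portal down for everyone else.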

I think the CEGEP's action - the decision to punish the kid for poking his proverbial nose in where it was

According to Dawson College's [dawsoncollege.qc.ca] site, they had tried a tongue-lashing: "He was expelled for other reasons. Despite receiving clear directives not to, he attempted repeatedly to intrude into areas of College information systems that had no relation with student information systems."

So this was not the first time he was testing things he should not have been, and it was not the first time they'd told him to cut it out.

Some kids need to be shown that rules apply to them. Apparently Ahmed Al-Khabaz was one. He'll d

They're not making any theatre out of this incident. They did not do a big press release announcing that Evil Hacker Ahmed had been apprehended. He used Acunetix to bring their system to a halt, they threw him out, he lost all his appeals, and then he went to the media in an apparent attempt to get another appeal. The "theatre" in this incident is all Ahmed.

I have no idea whether Skytech has fixed the actual bug. I know he's been offered an actual paid job by them, so if he actually wants to fix the bug (as

Apropos of nothing, and not really related: at university, I got into trouble for running a MUD (unauthorized!) out of my home directory. It went undetected until a friend wrote a buggy bit of lpmud code that caused the daemon to write a lot of log messages, and overnight it filled the filesystem where my home directory lived.

The next day (after having my account locked), I was being torn a new asshole by the sysadmin (who certainly did have some bastard operator from hell traits) and got the lecture about

The CTO said what he said because the department TRULY is out of touch with the real world if it believes that hacking is an 'extreme example' of 'behavior that is unacceptable in a computing professional'.

Hackathons, which involve unusual solutions to problems, often using hidden, undiscovered features of various products, are becoming increasingly popular, and often you'll have BIG companies sponsoring these same competitions.

Moreover, the dept is wrong in its comment because CS as a profession is

True, but putting that statement here is very misleading because it actually has nothing to do with the situation. In this case the person was a legal user of the software and was authorised up to a point. It looks like they stepped over that line, but where the line lies comes down to fine print in licence agreements and not in criminal law IMHO. It's very different to "Running penetration tests on random companies' resources without prior authoriz

'Computer Science is taught in this idealized world separate from reality. They're not dealing with the reality that software has to run in a hostile environment,'

That's because if schools taught people how to properly test security, the government would label them terrorist breeding grounds. Anyone remember Steve Jackson Games? They released a game where one of the roles you could play was a computer hacker. The Secret Service called it a "handbook for computer crime" and the "anarchist's cookbook of cybercrime." No charges were ever filed. It was a work of fiction. It still nearly bankrupted them and took many years to resolve.

Schools do not want to teach students because they're afraid of government reprisal if they show a generation just how crappy our national infrastructure really is. As one recent net celebrity put it, "Our security posture is like a dog waiting for its belly to be rubbed." They don't wanna teach people how to find these problems, because it'll embarrass the crap out of The Powers That Be.

They don't wanna teach people how to find these problems, because it'll embarrass the crap out of The Powers That Be.

Don't blame professors for this. Look higher

Your explanation sounds a bit too tin-foil-hat. The reality is that the market just wants keyboard jockeys who can code a working product quickly and cheaply. The security (and I'd also say quality) of the product is way down on the priority list of most employers. If you want to fix that, you need to figure out how to get people to demand high-quality software. Not the buggy, security-flawed crap we see from major companies like Adobe, Oracle, and Microsoft.

But I do agree most of the graduating "computer engineers" I've interviewed barely knew how to code and had a few canned routines like bubble-sorting memorized. The ones claiming to be Microsoft certified were even more embarrassing.

I'm not sure you're aware, but, depending upon the school, an S.B. in computer engineering can be much more akin to an S.B. in electrical engineering than one in computer science. To elaborate, some computer engineering programs are part of a joint department that focuses almost entirely on circuit analysis and design, solid-state theory, (non-)linear/stochastic control, architecture design, electromagnetics, and much more, with very little, if any, emphasis on programming.

Computer engineering at Texas A&M (early 1990s) was part of the EE program, and was for EEs who wanted to design chips rather than design wiring for commercial buildings (the two most popular things EEs do). There was almost nothing on programming, and what programming there was stayed close to the hardware (so maybe it was good for firmware writers or people writing device drivers).

That's because if schools taught people how to properly test security, the government would label them terrorist breeding grounds.

Not if step one in the process is: 1) get permission from the system operator/administrator/owner. That's where this guy failed.

Many years ago I knew of a problem in a web server I was running. Certain operations would cause it to hang. You know how I found out this issue? By running a script-kiddy scanner. It wasn't in a place I could easily fix, and the chance of it happening was rare. Except for the script kiddies who thought they were doing me a favor by scanning my system without my permission so the

Why should I get permission from someone to check if my data is being mishandled by him? It is absurd. A scan, as he did it, is very far from breaking into the system and accessing information you shouldn't have access to.

Go ahead and walk into a bank, or better yet a government agency. Barge into the "Employees Only" area, open up a confidential filing cabinet, and start rifling through for your data. See how well you do in court with the "I was just checking to see if my data was being mishandled!"

You learned these rules in kindergarten: don't touch what isn't yours, don't break someone else's stuff. You don't get to go and try to bust into systems you don't own, or have permission from the owner. It isn't just the law, i

Not if step one in the process is: 1) get permission from the system operator/administrator/owner. That's where this guy failed.

I'm not talking about this guy: I'm replying to the comments of the OP talking about how schools today don't teach security, and they don't. They don't because they're afraid -- teaching someone security is like teaching them how to use a firearm. In the process of learning that, you learn how to disarm it safely, etc. Learning is a two-edged sword, but unfortunately our government has demonized learning that could lead to political fallout. Look at how the press reacted to the gun control debate -- by publ

I'm not talking about this guy: I'm replying to the comments of the OP talking about how schools today don't teach security, and they don't. They don't because they're afraid --

And my first sentence dealt with that concern. If they make step one of the process: GET PERMISSION then they don't have an issue. That statement applies to more than just this one case.

People can't have an open dialog about computer security right now because it's too political.

That's nonsense. Of course you can have an "open dialog," as long as you aren't doing it as part of breaking into someone else's computer without permission. It happens all the time.

You shouldn't have to risk your career just to show some kids how to do something that might actually help them and their community,

You don't. I've already described the dual course admin series that taught people exactly this without costing anyone any careers or getting them expelled. How did they do this magic? They used systems that they had permission to test. They put the systems together to learn how to do that; they broke into them to learn what was possible and how to prevent it.

There have even been cases of commercial outfits that have made public challenges -- and none of the participants have been hung or shot or had their careers ruined. More magic? No, just the simple part about having permission.

There's even a competition run by the government that deals with cyber security, which involves teaching kids how to break into systems. But then, they aren't doing it without permission.

Students learn much more from professors who have backbones than those from the family of invertebrates.

Yes, it's totally reasonable to expect someone who has spent close to six figures earning their degrees and certifications, and finally managed to earn tenure, to risk it all to satisfy your idea of morality. Dude, that's bullshit. It's bullshit on an epic why-the-hell-did-even-two-other-people-agree-with-you scale.

College professors do have integrity. Well, many of them, anyway. It's mean-spirited and flat-out wrong to accuse the people responsible for ensuring that the next generation is trained at least well enough to know which way to hold the mouse before sending them out into the world... of lacking integrity simply because they don't want to be jailed and have their lives ruined to uphold an arbitrary moral value that I suspect even you yourself only sometimes adhere to.

Don't blame the victim! Put the responsibility on the asshats that created the problem: The government. Oh wait, they're the giant 3000 ton gorilla! Probably easier then to go after the wimpy guy with glasses next to it, huh? That's exactly what you've just done, while demanding others have a backbone. Pathetic.

That's because if schools taught people how to properly test security, the government would label them terrorist breeding grounds.

Not really. My team has a great track record of our products passing security scans. We've never used mock hacking to find security issues in our code. We simply do rigorous code reviews against solid security principles. Some teams around us do the whole code-hack-fix thing, and they have a lot of security fix work every time the pen-testing tool is updated or changed.

I laugh every time I hear a colleague come back from some security class they were sent to and I find out that they spent five days runn

If we want CS students to learn what's really involved in creating a secure system, how about a mandatory "intro to hacking" course?

Using systems intended for such purposes and not someone else's production systems, of course.

Many years ago our Uni had such a course, run in two parts. Part 1: Unix system administration 2: How to break into improperly administered Unix systems. Nobody went to jail. Nobody was branded a terrorist. Many (some?) people learned how to be system admins.

People doing application development do need to know about making secure code, but other parts fall to the server and web guys, who don't really need the full CS load of application development and theory classes. There are also parts of the theory that application developers don't really need, other than at a very high level.

However, I don't buy that what this student did was hacking (in the cracking sense).

Targeting a system you don't own, or aren't responsible for, and trying to break into it is almost always not a good thing to be doing, and should be considered unprofessional (and unethical) conduct.

Noticing a problem while you are setting something else up, notifying the appropriate people, and checking to see if that problem is gone are very reasonable things to do.

I have been working in Computer Security in Internet Banking for the last 15 years, and while I have had many co-workers who measure their worth by how good they are at breaking in to things, very few of those people have been nearly as good at defending those same things.

Figuring out how to hack a site takes finding one vulnerability.

Figuring out how to defend a site takes thinking about all types of vulnerabilities.
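A toy illustration of that asymmetry, using a hypothetical blocklist sanitizer invented for this example: the defender has to anticipate every injection vector, while the attacker needs only the one that was missed.

```python
import re

# A naive blocklist sanitizer: it thought about ONE vulnerability
# (script tags), so an attacker only needs a vector it never considered.
def naive_sanitize(html):
    return re.sub(r"<\s*script.*?>.*?<\s*/\s*script\s*>", "", html,
                  flags=re.IGNORECASE | re.DOTALL)

blocked = naive_sanitize("<script>steal()</script>")       # stripped
bypass = naive_sanitize('<img src=x onerror="steal()">')   # sails through
```

One working bypass is a successful attack; a sound defense has to enumerate and close every such vector, which is why allowlist approaches and proper output encoding beat blocklists.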

The computer science department is not teaching their students to write code without consideration of the environment of the Internet. At least nothing in this situation says they are.

What they are teaching is that it is unethical to run penetration testing against a system without permission. This philosophy is embodied in the ACM Code of Ethics [acm.org], in section 2.8:

2.8 Access computing and communication resources only when authorized to do so.

Theft or destruction of tangible and electronic property is prohibited by imperative 1.2 - "Avoid harm to others." Trespassing and unauthorized use of a computer or communication system is addressed by this imperative. Trespassing includes accessing communication networks and computer systems, or accounts and/or files associated with those systems, without explicit authorization to do so. Individuals and organizations have the right to restrict access to their systems so long as they do not violate the discrimination principle (see 1.4). No one should enter or use another's computer system, software, or data files without permission. One must always have appropriate approval before using system resources, including communication ports, file space, other system peripherals, and computer time.

He got thanked for finding the flaw. He got expelled for pen testing someone else's system. Two different acts, two different issues.

It's obvious that the testing was done for the right reasons, he just went about it in the wrong manner. He was smart enough to find the flaw, and morally sound enough to report the flaw. It doesn't fit to make the punishment so extreme in such a case.

They should be teaching how to deal with stuff like this, but all they did was let him do it on his own and then say: you did it wrong, and we're not just giving you a C or even a D, and you're not just getting an F, no, you are getting a

He did something wrong, sure. But what he did was not bad enough to justify completely destroying his future from an academic and professional standpoint.

He's lucky that this story has attracted as much international attention as it has (and it certainly is strange to be reading about local news stories on international sites like Slashdot, when I work across the street from Al-Khabaz's school). If it hadn't attracted all this attention, he wouldn't have had all these job offers, and would have been screwed.

Dawson tried to leave him in debt, unable to enter any other CEGEP, unable to enter any university (you're required to graduate from CEGEP to get into university in Quebec), and with severely diminished job prospects.

Should he have been punished? Yes. Should Dawson have tried to destroy his life? Certainly not.

He might have done something wrong, but the real problem is that nobody properly taught him it was wrong. They are running a computer science course there; they should have taught him CS ethics. When his CS instincts were wrong, they should have fired themselves for failing to teach.

Yes, it's true that he was actually testing somebody else's system... however, it's not unreasonable to conclude, given what kind of software he was evidently trying to develop, that it would need to be fixed before he released his application or else the vulnerability might be exploited by anybody who used his app and happened to also discover it, as he originally did.

Does no one ever read the article anymore? It was on a test server... using credentials given by the vendor, Skytech Communications.

...the software vulnerability scan that got him expelled from school was conducted on a test server only, and using credentials provided to him by the company that makes Omnivox: Skytech Communications.

The mere fact that Skytech supposedly gave him a job offer is enough to think that the department has their collective heads up....well..you get the point.

There's a reason why the legendary Weld Pond [wikipedia.org] would be so vocal and would even say "These kind of people right out of college are the kinds of people we want to hire."

This isn't really about Al-Khabaz. It's about policing the boundaries of the profession. The problem - the reason that there is a culture clash - is that despite attempts for over 40 years, no one has succeeded in transforming computer programming into a profession. To be more precise, whether programmers professionalized remains a serious question for debate.

Look at the quotes from Simonelis, Dawson, and the ACM:

behavior that is unacceptable in a computing professional (Simonelis)

no longer suited for the profession (Dawson)

The Code and its supplemented Guidelines are intended to serve as a basis for ethical decision making in the conduct of professional work. Secondarily, they may serve as a basis for judging the merit of a formal complaint pertaining to violation of professional ethical standards. (ACM code of ethics)

If programming were a profession like medicine or law or engineering, programmers would acquire higher status, as would organizations like the ACM. From the point of view of managers, programmers are often seen as unmanageable craftspeople with little respect for standard practices of business. For them, professionalization is about controlling and assessing programmers and their work. The rise of computer science, the creation of software engineering, and the creation of the ACM were all driven in large part by efforts to professionalize the field: sometimes more in the interests of programmers, sometimes more in the interests of management.

This comes up again and again on Slashdot. Should there be a standard curriculum or test or other criteria that all programmers should meet? Should we have to belong to professional associations? Should programmers be obliged to follow codes or take legal responsibility for flaws in software? How much should formal education and credentials be valued? Should self-taught programmers be excluded?

These are contentious issues. Clearly Dawson College and Mr Simonelis have an interest in defining and policing the boundaries of the profession. This would enhance their status. But as nearly a half century of debate and ongoing discussion here demonstrate, there is no professional consensus for them to uphold. This is a real cultural divide. Al-Khabaz got caught in the middle, used by Dawson in their efforts to define the profession and their own status. I think that's terribly unfortunate.

For an excellent book on the history of programming and efforts to professionalize it, see The Computer Boys Take Over by Nathan Ensmenger. He argues that programmers are more like technicians than professionals. Like other technicians, their work is often threatening to the organizations that depend on them. And despite the best attempts of computer science and software engineering, much of it is guided more by craft principles than by rigorous scientific or engineering methods.

Well yeah, 'Computer Science is taught in this idealized world separate from reality' has always been the case. Just like Math is taught in an idealized world separate from reality. If you want to learn to be a coder in the real world, don't waste your time with a CompSci degree, get a 2 year programming certificate at a vocational training school. I never really thought of computer science as preparing anyone for a real job as a coder.

Expecting a computer science graduate to know how to be an application

Here's UC Berkeley's CompSci overview... what part of this makes you think that CompSci is preparing graduates to be application developers?

UC Berkeley construes computer science broadly to include the theory of computation, the design and analysis of algorithms, the architecture and logic design of computers, programming languages, compilers, operating systems, scientific computation, computer graphics, databases, artificial intelligence and natural language processing. The Electrical Engineering and Computer Science Department's goal is to prepare students for both a possible research career and long-term technical leadership in industry.

That's not to say that a CompSci student can't become a developer, but the curriculum is not designed to teach that - there are far easier ways to learn application development if that's all you're interested in.

Maybe there should be a slightly different attitude towards breaking into computer systems, or attempting to break into them. However, it needs to be mentioned that if you are learning to skydive, the first lesson isn't "what if your chute doesn't open." Similarly, the first project in a chemistry class isn't making dynamite.

What this case showed was a student with some skills could break into a university system. Great. One problem is that the student had little grounding in what consequences might pile up if this skill was used. Like the chemistry student making dynamite the knowledge might be there but no judgement about what to do with that knowledge.

Unfortunately, I don't think the proper response is for companies to hire people like this. They need a lot more work before they really can be expected to use their skills in a responsible manner - and today's corporate environment is hardly the place where people are going to get that. Would a person with the skill to break into computer systems and zero reasons not to do so willy-nilly (especially at the direction of lower level management with all kinds of reasons of their own) be a quality employee? More importantly, would such skills misused result in a good reference on down the road?

We are setting these people up to be unemployable in the future, right after they are exploited.

That's like working somewhere, say adding cameras or new sensors, or even upgrading the fire alarm on a bank's security system, and finding there is a very easy way to bypass parts of the system. You report it, and then a few days later you are back doing more work and find that no fix has been made.

and this guy is standing near the parked bikes. He comes up to me and says: you know, I could easily open that lock. I ignore him and walk away, but I look back and he is standing there right beside my bike, not breaking any laws. So I have a few alternatives. I can walk away and hope he doesn't damage it or rip it off. I can call the cops, but no laws have been broken. Or I can unlock my bike and go elsewhere.

My point is that the statement "I found a security problem with your system" can be interpreted in different ways. Let's assume that the recipient of that message gets ten extortion attempts per week. An innocent message about a security hole would just go on the list to be reported to the police.

The college system is kind of out of date and comes with the full load of fluff and filler classes. Tech schools are roped into the college system as well.

There is lots of stuff that is a poor fit for a 2-year or 4-year plan, and other stuff that needs a lot more hands-on training and is a poor fit for a college classroom, where more of a community college setting is better. Yes, community colleges offer non-degree classes.

Yep, and the only way to realize just -how- vulnerable your systems are is to test them out yourself (or have someone do it for you). I'm afraid that many CS graduates know nothing about how the "bad guys" are going to get into your system. They might have vague ideas about how a DDoS works, but it's unlikely they have ever experienced one first hand. To an average person, indeed even an average CS graduate, hacking (in the black or grey hat usage) either consists of just pressing a button or involves many crazy

I obtained a Masters in CS a few years ago. Security was a big topic for the department. We had a dedicated network and set of servers to learn, test and use the type of software that Al-Khabaz used. We do not use it on live networks against production servers. You never do that without knowledge of everyone involved. Same way where I work. Doing what he did would get you fired. If you find a security hole, point it out to the appropriate people. Then let them fix it, don't keep poking it with a stick. If y

Dawson computer science professor Alex Simonelis said his department forbids hacking as an 'extreme example' of 'behavior that is unacceptable in a computing professional.' And, in a news conference on Tuesday, Dawson's administration stuck to that line, saying that Al-Khabaz's actions show he is 'no longer suited for the profession.'

The geek's encounters with the law --- with society as a whole --- have not been ending well for him. The Internet is not his private playground anymore. Intrusions into other people's systems and software may end in a felony charge.

I've no doubt that the geek can still find shelter and support in his own community when things go south, but the climate outside is not so warm and welcoming anymore.

A felony would almost certainly require real damages to the target, DoS or theft or the like.

Furthermore, the kid already told the school about his find and made no attempt to conceal his identity on his second connection. He had gotten no negative feedback on his first connection. For a website, any connection outside of the provided user interface is outside of the normal terms of service.

Neither of his connections to the site resulted in any action by the actual owner of the site besides perhaps a call t

At the university I go to, I recall a computer architecture teacher that used handouts/slides from when the Pentium 4 was the highest-end CPU available.

Basic computer architecture is basic computer architecture. The specifics may change, the number of bits may change, but the basics are still the same. I learned on 8080s and 6502s and PDP-8s and an odd CDC 6500, and they all shared the same concepts. When I pick up a datasheet for a modern processor, I see a lot of the same old stuff.

Once you have the basics, then you can expand. "How can we improve on X? By doing Y...". You don't know why Y is better unless you know what X is. And more important, it is hard to see the potential parallels for future improvement unless you know the past. "If we did A to improve X into Y, maybe we can do A to help this other thing, too..."

The NDA likely said not to tell how to get into their system, and they seem OK with him talking about what happened. Even if it did cover that, they are not making a big deal about it, as they did not want him to get kicked out of school.