I'm a big fan of the writings of Paul Graham, a fairly famous LISP hacker. In his latest essay he talks about the characteristics of truly great programmers, and even makes a peripheral reference to Perl:

Indeed, these statistics about Cobol or Java being the most popular language can be misleading. What we ought to look at, if we want to know what tools are best, is what hackers choose when they can choose freely-- that is, in projects of their own. When you ask that question, you find that open source operating systems already have a dominant market share, and the number one language is probably Perl.

In typical Paul Graham style he covers a variety of topics, ranging from programming language choice to programmer motivation to the tyranny of cubicles. If you've never read any of his material before, I highly recommend it; I find it very intellectually stimulating.

A quick visit to Super Search reveals that this isn't the first time that Paul Graham has spoken favorably of Perl:

As one of the more visible proponents of functional programming languages, I think that Paul has managed to do a great job developing independent thoughts on the relative values of programming languages. Most LISP people that I know are very sheltered academics who pine for the old "glory days" of functional programming, and who certainly wouldn't touch a weird language like Perl with a ten-foot pole.

There are certainly some things that I don't entirely agree with in the essay, particularly a section on how he instantly predicted the failure of a startup that based its infrastructure around Windows NT and a Windows NT guru. He claimed that any person who voluntarily used NT multiple times couldn't be a first-rate hacker. He may have been right in this case, but I think it's a little silly to automatically assume that there can't exist a truly talented hacker who chooses to work on a Windows variant (regardless of what the preponderance of evidence suggests). :)

I'd be interested to hear what other monks around the monastery think of this essay. Did any of it jump out at you as really insightful? Horribly false? Subversively un-American? :P Post back here; this type of discussion is very interesting to me.

Edit: Thinking about this more reminded me why many of my CS professors detest Perl. One professor of mine stated that Perl was a low-level language, because the following code snippet for reading lines from a file was fairly slow compared to other languages:
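(The snippet itself was lost from this post; judging from the replies, it was the canonical `while (<FILE>)` loop, so here is a hedged reconstruction with an illustrative filename:)

```perl
# Canonical Perl line-by-line read -- reconstructed for illustration.
# "lines.txt" is a placeholder filename.
open(FILE, '<', 'lines.txt') or die "Can't open lines.txt: $!";
while (<FILE>) {
    chomp;            # strip the trailing newline
    print "$_\n";     # do something with each line
}
close(FILE);
```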

When I pointed out to him that Perl was doing exactly what he was asking of it, he proceeded to explain to me that the point of a "high-level" language is to insulate the programmer from dealing with "low-level" ideas such as hard drives, buffers and other things. I disagree with him, but perhaps it's just a matter of semantics. As a peripheral point to the discussion, is Perl a higher-level or lower-level language than Java? COBOL? C++? LISP? Is it even comparable to PROLOG? Graham's essay centers on the idea of using the most powerful tool for the job; is this abstract concept of "levelness" even related to programming language power?

This essay was based on the talk he gave at OSCON; I was sitting there listening to it (yesterday, as a matter of fact), and I really appreciated much of what he had to say. I do think that he overgeneralizes in some places, but the core of much of what he was saying was interesting, if you could distill it down. What I heard him say, ultimately, was that great hackers enjoy scratching their personal itches more than they enjoy searching for a back scratcher. As a result, they tend to be less inclined to look for tools that don't allow them to immediately dive into the problem.

To a certain extent, I think this is due to his love of the Lisp programming language and other dynamically typed languages that let programmers focus on the problem rather than the fiddly bits (bad stretch of a pun intended). If I'm trying to track down that memory leak, I'm wasting time. If I really have to care about "hard drives, buffers and other things", then I'm not solving my actual problem. I'm solving problems that the language imposes.
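The examples themselves were lost from this post; as a hedged reconstruction of the first, verbose one (Java, judging by the later mention of a "Java example"; the filename is illustrative):

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class ReadLines {
    public static void main(String[] args) {
        BufferedReader reader = null;
        try {
            // Every resource and failure mode handled explicitly.
            reader = new BufferedReader(new FileReader("lines.txt"));
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
        } catch (IOException e) {
            System.err.println("Can't read lines.txt: " + e.getMessage());
        } finally {
            if (reader != null) {
                try { reader.close(); } catch (IOException e) { /* ignore */ }
            }
        }
    }
}
```

versus the second example, which in Perl is just `while (<>) { print }`.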

If you're in a corporate culture, the first example might seem more appropriate because you can't forget to see if the open was successful and everything is this really cool OO stuff. The second example, though, lets you focus on the task at hand and clears away a lot of cruft. If I have an itch to scratch, I don't reach for a diesel-powered backscratcher. That, I think, is what Paul Graham was trying to say.

And no, Perl's not comparable to Prolog. The latter is a wonderful language, but it has such a limited problem domain that comparing it to Perl is comparing cheese and Wednesday. Different tools solve different problems and comparing them without a problem to contrast them against is an exercise in navel gazing :)

Side note: does anyone else see the weird newlines in the Java example? I've been seeing that a lot recently, but I don't know if it's a quirk of Firefox, my using FC2, or something else entirely.

Update: It's been pointed out that the "weird newlines" are from a bunch of whitespace at the end of some lines of code. Curiously, they don't show up in the editing textarea and they didn't come from me. Weird.

You have three lines in your first code block that contain several dozen trailing spaces each (and these spaces get wrapped by PM, which makes the "weird newlines").

I'll go out on a limb and guess that you are using Opera. Although I like a lot of things about Opera, it certainly has the most stupid bugs. This probably replaces the previous Opera bug where they'd add an extra blank line per non-blank line each time you preview.

Since this is affecting several people, it would be nice to track down the source and any work-arounds.

Well that's completely nuts, and so out of touch with reality. For a start, the underlying IO system of any decent OS will fetch the disk blocks of the file into memory in advance, especially for a sequential read. And the C library will then fetch the file in chunks of 4096 bytes (or so) at a time, and dole out the bytes line-by-line.

And regardless of any fanciness the OS and C library and Perl IO layers try to do, I/O IS SLOW! It doesn't matter if your snippet is sub-optimal, because its slowness will be lost in the noise. I really fail to see the professor's point: that snippet abstracts away the concepts of hard disks and buffers quite well, in my opinion. I must be missing something.

Indeed. That was exactly my reaction. In fact, I completely fail to see the connection between "slow I/O loop" (whatever that means) and "low-level language". If one implies the other, then because I can write a tight I/O loop in ASM, does that mean assembler is not a low-level language?

I always believed high- and low-level refer to the ease of expressing complex ideas in a programming language, not to I/O speed -- or any other kind of speed, for that matter. On that score, Perl is squarely in the high-level language camp.

A high level language: the language goes to great lengths to do what the programmer wants.

A low level language: the programmer goes to great lengths to do what the compiler wants.

A structured language: the language goes out of its way to make the programmer do what some programming professor wants.

The idea that Perl is a low level language is laughable. I wonder if this professor ever did Z80 assembly, for instance. That's so low you're practically setting the bits with a screwdriver and a refrigerator magnet.

I'm lucky to have experienced that at work a few times, and it is really cool to have, in the next room, someone who is a fantastic programmer and a Linux whiz, who knows C++ to the core while still doing stuff in scripting languages, gets things done, is meticulous, and can write well enough to tell others about what he does.

People like that are productive by themselves, but more importantly they make the entire team more productive by being around, making big problems small and show-stoppers go away, so that we less talented hangers-on can continue working, hopefully learning something in the process.

If you can, make sure you're located within earshot of a person like this, to pick up on all the small everyday things.

If you can, make sure you're located within earshot of a person like this, to pick up on all the small everyday things.

If you can't, then hang out here a bit; there's quite a few folks in that category, and if you hang out in the CB you'll find that it represents a sort of dynamic, ever-changing uber-guru. Hell, Larry has even been known to pop up there once in a while :-)

---
demerphq

First they ignore you, then they laugh at you, then they fight you, then you win.
-- Gandhi

He thinks while (<FILE>) is low level? Then I guess you didn't show him sysread. Or XS. :-)

My guess is that this guy is a functional purist, and therefore thinks a HLL shouldn't permit (or, at least, rely on) side effects. You should show him how Perl supports functional programming as well, including such nifty things as closures.
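A minimal sketch of the sort of thing I mean (`make_counter` is an illustrative name of my own):

```perl
# A closure: an anonymous sub that captures the lexical $count,
# which lives on after make_counter returns.
sub make_counter {
    my $count = shift;
    return sub { return $count++ };
}

my $c = make_counter(10);
print $c->(), "\n";   # 10
print $c->(), "\n";   # 11
```

Each call to `make_counter` produces an independent counter with its own captured `$count`.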

Truly one of the greatest things about Perl is that it is a HLL, a LLL, and everything in between. It's one-stop shopping. You can do everything you need to do, without having to cobble together pieces written in different languages. ("Yeah, I use DHTML for the UI, PL/SQL for the database stuff, and Prolog for the business logic. And I drop down to C for the fiddly bits.")

When functional language purists talk about how a language can completely lack side effects, they tend to shrink back when you ask them about I/O. Yes, there are ways of dealing with it (monads, which are beyond my current understanding, so I mention them only for completeness), but there is no getting around the fact that I/O is a side effect.

Update: One grammour mistake fixed. 1.56 * 10^7 more to go.

----send money to your kernel via the boot loader.. This and more wisdom available from Markov Hardburn.

When I hear people speaking badly about Windows programmers, I remember the '70s, when we used to say that Hollywood movies were stupid films compared to European or Asian ones.

It may be true on the whole, but one should never generalize from it.

Several American film directors make real masterpieces, and their mere existence should dispel any such generalization.

What I recognize is that most good programmers learn the basics from other operating systems. And it has nothing to do with a commercial idea. It happens because no one is able to make every type of user happy, and Windows serves the consumer audience very well. And when people want something more than a product, they have to look to other operating systems.

One professor of mine ... proceeded to explain to me that the point of a "high-level" language is to insulate the programmer from dealing with "low-level" ideas such as hard drives

I totally agree with your professor here. IMHO, Perl should have shielded this from us.
I always felt that in Perl, everything is a scalar except a file(handle). Now, if I suggest here that a file should be a scalar too, some of you might tell me that that's unwise for performance reasons, but that's exactly what Perl should do for me: shield it and make it fast enough.

It's a bit like the primitive types discussion in Java. They should be objects.

A file handle is already an abstraction. Sure, it's usually attached to physical bits on a magnetic disk, but Unix is explicitly designed so that a "file handle" could also be many other things. It could be a pipe to another program. It could be a network socket.

The professor in question seems to have a problem making arguments. A file handle is in no way tied to magnetic-based storage, and hasn't been for 30+ years. If you want to say that a file handle is a poor abstraction, you must at least show a better one. I can see how "it's just a big string" could be handy in some places, but it would be no good in others.

For instance, on a network socket, the "big string" would be constantly changing as data gets passed back and forth. You couldn't treat it as a "big string" because a typical string isn't affected by external systems in typical operation. In other words, you can't completely abstract the underlying operation. OTOH, the only difference (from my program's point of view) between a network handle and a file handle is how it's opened. After that, it's all the same operations.
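For example (a hedged sketch; the filename and the piped `ls` command are illustrative):

```perl
# Only the open differs; the read loop is identical for both handles.
open(my $file, '<',  'notes.txt') or die "open notes.txt: $!";
open(my $pipe, '-|', 'ls')        or die "open pipe: $!";

for my $fh ($file, $pipe) {
    while (<$fh>) {
        print;    # same operations, whatever the handle is attached to
    }
}
```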

----send money to your kernel via the boot loader.. This and more wisdom available from Markov Hardburn.

That is absolutely absurd. You may as well say that an entire database (table space) should appear in perl as one big string. No? Then where do you draw the line?

It is possible (via modules such as Tie::File and Tie::MmapArray) to get behavior like what you're talking about, but it shouldn't be the only way to work with files. Sometimes a file is too large to load all at once. Sometimes a file is really a stream/pipe/socket. Sometimes a file is being written by another process at the same time. If you're a programmer, you need to know about, and handle, these situations. Any proposal to require that files be abstracted as strings is naive at best.
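A hedged sketch of the Tie::File route (the filename is illustrative; Tie::File ships with recent perls):

```perl
use Tie::File;

# Each array element is one line of the file, read lazily rather than
# slurped all at once; assignments write back to the file.
tie my @lines, 'Tie::File', 'data.txt' or die "Can't tie data.txt: $!";

print "last line: $lines[-1]\n";   # fetches only what it needs
$lines[0] = 'new first line';      # rewrites just that record
untie @lines;
```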

Do you mean totally abstract out the concept of a file handle, and only handle files as "big strings" (as was suggested on the perl6 language list recently), or do you mean just implementing a file-handle as a scalar? You can do that already since perl 5.6:
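A sketch of the lexical-filehandle form (the filename is illustrative):

```perl
# Since perl 5.6, open() can autovivify a filehandle into an ordinary
# lexical scalar, which you can then pass around like any other value.
open(my $fh, '<', 'data.txt') or die "Can't open data.txt: $!";
while (my $line = <$fh>) {
    print $line;
}
close($fh);
```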