Posted by Soulskill on Tuesday May 08, 2012 @11:22PM
from the stop-trying-to-code-me-and-code-me dept.

Aciel writes "Ruby has long been popular in the web/business community, while Python dominates the scientific community. One new project seeks to bring balance to the force: SciRuby. We've already introduced a linear algebra library called NMatrix (currently alpha status). There's at least one fellowship available for students interested in working on the project this summer."

(1) The languages at the top of the Tiobe Index (even given that we can assume it has some claim to validity... I'm not so sure) are all compiled languages, or at least compilable to bytecode. Except PHP, which dominated the Web world for a long time but is sliding, and for good reasons.

(2) The languages at the top of the Tiobe index will always have distorted figures because they represent the majority of code that is already installed a

"Sure, we can try different methodologies. Here we see that Ruby comes behind PHP, Javascript and Perl on the normalized comparison."

Not really. First, it's a Web survey of discussions, not a measure of how popular these languages are in actual use. One might infer a relationship, but it ain't necessarily so, and even if there is one, who knows how strong that correlation is?

Second, being a web survey, web-oriented languages (like JavaScript; a rather glaring example) are over-represented. Perl (another good example) is a legacy language that was very popular in its day, although it is almost universally rejected (with a good deal of

"lol ok fine, no doubt you are right, and Ruby is much more popular than any metric we can think of indicates."

That isn't what I was saying at all. What I was saying is that it all depends on what you mean by "popular".

"Legacy languages" (like Perl, and even Java to some extent) remain popular for a long time for several reasons, among them the sheer size of the established code base. I mean, sheesh... there is still a demand, although a small one, for COBOL programmers, and that was one of the first "high-level languages" ever invented!

I'm not saying that Ruby is more popular than the charts indicate. What I

lol we can look at something like github [github.com], which has mainly new projects. That has Ruby as the #2 language for projects.

Predicting that a language will become popular in the future is tenuous at best. Ruby doesn't deal with multi-threading as well as some other languages. Will it matter? I don't know, but lack of a compiler isn't the only thing involved in a language's popularity.

"Predicting that a language will become popular in the future is tenuous at best."

Will you get off the Ruby kick? I was referring to a class of languages, not necessarily a language itself:

"... scripting languages like Ruby..."

And all I said was that will make them more popular. And that stands to reason, because then they will be useful for a wider range of applications. It hardly takes a Nostradamus to make that prediction.

Tell me: do you know of anybody working on compilers for Perl or PHP? When you do, let me know...

lol your Google-fu needs to improve. There is perlcc [cpan.org], available for a while although I have no clue if anyone is using it, but I do know Facebook is quite proud of their PHP compiler [slashdot.org] and they definitely use it. Yet another reason to not work at Facebook.

Also, it's not like compiling your code hopelessly obfuscates it. Several times over the last year I've had to disassemble compiled C or C++ code from an Android system to figure out how something works.

Okay, I knew there was a Perl compiler, but I wasn't aware that anybody actually used it. I was not aware of the PHP compiler.

But you don't need code to be "hopelessly" obfuscated. Even bytecode-compiled code is obfuscated enough for it to be commercially useful. Decompiled code is much harder to read and decipher than plain source code. Another example is the obfuscator that Microsoft used to supply (and maybe still does, I don't know) with Visual Studio, without which many people would not have used it.

lol be nice!! You're the one who can't use Google, and I'm the troll??

I guess I can agree that having the code obfuscated makes it harder for a competitor to steal your code completely and wrap it up as their own product. If you're trying to hide your super-secret algorithm, or keep people from cracking your DRM, having the code compiled is a false sense of security. Although I admit some people are afraid of reading assembly.

I didn't bother to look at Google. I didn't feel my basic argument needed that to support it. And I still don't.

"I guess I can agree that having the code obfuscated makes it harder for a competitor to steal your code completely and wrap it up as their own product."

Which in turn makes the language more popular. That was the essence of what I was saying. Java would not have enjoyed anywhere near the popularity it has had if everybody could see all the source code. Even though decompiling is still possible. And the lack of same has been the deciding factor in my not distributing a couple of programs written in Ruby.

"If you're trying to hide your super-secret algorithm, or keep people from cracking your DRM, having the code compiled is a false sense of security."

I'm not so sure that many companies are actually so worried about people looking at their source code, especially for a language such as Ruby which is mainly used on the backend of websites, but I could be wrong.

"I'm not so sure that many companies are actually so worried about people looking at their source code, especially for a language such as Ruby which is mainly used on the backend of websites, but I could be wrong."

Then you've missed my whole point.

Java is used on the back-end of some websites too. And that is probably where it would have stayed, if it were not compilable to bytecode.

It makes a BIG difference to a LOT of people. Repeat: that's the whole reason why Microsoft included an obfuscator with Visual Studio: because (a) their CLR bytecode compiler did not sufficiently obfuscate the source, and (b) without that, nobody wanted to distribute desktop apps. It wasn't just an extra, it was considered to be essential.

I should add that Ruby is used for a lot more than just websites. You are probably thinking about Rails. Ruby is suitable (and is used for) all the same kinds of things as Python. It's just that Ruby got a later start, because it wasn't really known outside of Japan until some years after it was created.

You know that and I know that. But I've worked for physicists, chemists and biologists, and believe me, that little detail doesn't stop them one little bit. A little birdie tells me that it's even worse in the social sciences.

MatLab might beat Python, but it's been losing ground.

Very slowly, and in the fields I've worked in, invariably to R.

R? I love R, but it's not a general purpose language and very few scientists know how to use it.

Those two little details don't stop scientists either.

In my experience, scientists will do just about anything to convince themselves that they're not actually programming, if only to avoid pesky annoyances like source code control. The less it looks like a programming language, the better.

"In my experience, scientists will do just about anything to convince themselves that they're not actually programming, if only to avoid pesky annoyances like source code control. The less it looks like a programming language, the better."

Oh god. That would explain why none of their code looks like it's written in a programming language.*

* I work with biologists. By 'they,' I mean biologists. I know you physicists and quantum chemists have it lucky. Stop bragging. You're making me feel bad.

The ugliest stuff I have seen was from mathematicians working in medicine. As a naive student, one is inclined to think that MATLAB code, like any programming language that isn't Perl, line noise, INTERCAL, bf, or TECO, has a minimum floor of ugliness. One would be wrong.

It cuts both ways. I'm a geneticist, and it's painful having to work with tech guys who don't know the first thing about even basic biology, never mind genetics.

It's the same here on Slashdot. I always cringe every time I load up the comments on a story about genetics or evolution, because I know there'll be a slew of ignorant comments modded up to +5 insightful. At least most scientists know their limitations at programming, but the same cannot be said with regard to non-engineering subjects for many engineers, who feel themselves qualified to comment on just about any topic under the sun, regardless of their lack of knowledge.

Physicists are perhaps more famous for the same folly; one example I recently stumbled upon was David Bohm [wikipedia.org], who, in conjunction with a psychologist, developed a completely nonsensical theory of higher brain function and the emergence of independent thought based on nothing more than the appeal of physics concepts to a biological problem.

If it consoles you any, rectifying such misconceptions is one of the reasons I make a habit of posting here. Experience suggests, however, that not reading the article

To be fair, there are a few notable physicists who have made major contributions to other fields. Dijkstra is the one that most computer scientists know. They are, of course, notable because they're the exceptions.

When I was an undergrad, biology was the science you did if you liked science but didn't like maths. That's the reason why I was turned off biology back in the day. Boy how things have changed.

Having said that, it's my experience that it's easier to teach a computer scientist (not just a "tech guy") enough biology than it is to teach a biologist enough software engineering, where "enough" means sufficient to form a productive working partnership. There are, of course, notable exceptions in both directions.

MATLAB wins in the engineering community, and that's by a long shot.
But everybody keeps forgetting the unsung hero called FORTRAN. Several decades later it's still alive and kicking, and it's still used on a daily basis in the scientific community. We don't need another language for scientific computing. And the last thing we need is a language like Ruby. Ruby is a memory hog and inefficient at the best of times. And I sure as hell wouldn't want to be tasked with writing a good optimizing Ruby compiler. FORTR

Yes, but we're talking about the SCIENTIFIC community. Engineers use MatLab because that's what they learn in school, and that's mostly because that's what their professors are familiar with.

Some of the people doing heavy duty number crunching still write Fortran code, but they're a small minority. Yes, most scientists use libraries that were written in Fortran all the time, but very few write code in it. It's not a very good language for modern general use.

Young Postdocs and graduate students may use Python. The established scientists use Perl, Fortran and IDL.

This is because scientists get paid to think about their field of specialty, not to learn a new language. Whatever language was popular during their education will most likely be the language they use throughout their careers.

Established scientists use Word to write grant applications and edit papers their students have written.

When I was a grad student my supervisor was a computer scientist who used IDL in his PhD. He hadn't written code, of any kind, for ten years. He wanted to learn some Python (because he wanted to write a hockey pool calculator) but couldn't find the time.

First of all, believe it or not, engineers make up a large part of the scientific community. And you're very mistaken there, though. MATLAB is often the best tool for the job. Go and try to prototype any form of digital signal processing system or algorithm in anything other than MATLAB and they'll point and laugh at you. If you do any of that in python or another language you'll find yourself reinventing the wheel, simply because all those functions already exist in MATLAB. And at least you can be certain tho

I do a fair bit of work in IDL (writing software for the scientists to use)... but there are days that I want to stab myself repeatedly with a fork. (I've asked for ages for native SOAP/WSDL support... they insist they have it, because they can do the OGC services... but those aren't the SOAP calls I'm trying to make)

If you don't interact with their XML *just* right, it can cause some horrible problems where the cleanup time increases exponentially with the number of elements. And try to do it all with

but not a single damned person that I know of using Ruby for science in our group

Actually I like Ruby, but I haven't used it much for linear algebra (I have compiled code for that). I've been looking at JBLAS (a Java JNI package that uses BLAS and LAPACK) and may incorporate it in some Ruby running on JRuby. However, every time I have the free time to try, I go WTF? and go get a beer. ;)

Let's not forget Mathematica (my personal favorite) and LabVIEW (used for programming National Instruments cards, but some people start using it as a general programming language because that's what they know -- visual interface, more like circuit design, quite interesting actually).

can be a lot more concise and readable than a forced indentation method. Flexibility is a good thing. It is ok if you don't like the style I just demonstrated, but you should be able to understand that other people do.

The most "Pythonic" way to handle your scenario is both efficient and chimes well with your readability standards:

    # Assuming you've got a threading.Lock or multiprocessing.Lock
    # object called 'my_lock'...
    with my_lock:
        code()
        moreCode()
        yetMoreCode()
    # ...and that's it!

Note that there's no need to call something like releaseLock() since Lock() objects support the 'with statement' (aka context manager). The indentation rules me
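For comparison (my own sketch, not anything from the thread): Ruby gets the same guarantee through Mutex#synchronize, which releases the lock on the way out of the block even if the block raises.

```ruby
# Mutex and Thread are part of Ruby core; no require needed.
my_lock = Mutex.new
shared  = []

threads = 4.times.map do |i|
  Thread.new do
    # synchronize acquires the mutex, runs the block, and releases
    # the mutex however the block exits -- even on an exception.
    my_lock.synchronize { shared << i }
  end
end
threads.each(&:join)

puts shared.sort.inspect  # => [0, 1, 2, 3]
```

Like Python's with statement, the release is lexically tied to the block, so there is no releaseLock() to forget.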

I don't think there is a problem with signals, http://linux.die.net/man/3/pthread_mutex_lock [die.net]: "If a signal is delivered to a thread waiting for a mutex, upon return from the signal handler the thread shall resume waiting for the mutex as if it was not interrupted."

And you've given an example of why exceptions suck. There are good things about exceptions, but that's one of the bad things.

I don't think there is a problem with signals, http://linux.die.net/man/3/pthread_mutex_lock [die.net]: "If a signal is delivered to a thread waiting for a mutex, upon return from the signal handler the thread shall resume waiting for the mutex as if it was not interrupted."

I don't know what acquireLock() is, but the function to acquire a pthread mutex is called pthread_mutex_lock().

"And you've given an example of why exceptions suck. There are good things about exceptions, but that's one of the bad things."

"I don't know what acquireLock() is, but the function to acquire a pthread mutex is called pthread_mutex_lock()."

Yes it is, which is why I linked to the manpage of pthread_mutex_lock(). You were wrong, are you going to keep digging deeper, arguing that you weren't? Or are you ready to accept that pthread_mutex_lock() is not going to return on an interrupt?

The control flow implemented by exceptions occurs in C programs as well, it is simply not made explicit. By making it explicit, constructs like with: and try:finally: can then ensure cleanup in a consistent, lexically apparent way. In C, there is no consistent or lexically apparent way in which non-local exits happen, and there is no consistent or lexically apparent way in which resources are released on those cases.
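Ruby spells the same idea with ensure, which runs on every exit path, normal return or raise alike. A small sketch of my own (names are made up, not from the thread):

```ruby
# A toy resource wrapper: acquisition, use, guaranteed cleanup.
def with_resource(log)
  log << "acquired"
  yield
ensure
  # This runs however the block exits, so cleanup is lexically
  # apparent in exactly one place.
  log << "released"
end

log = []
begin
  with_resource(log) do
    log << "working"
    raise "boom"   # a non-local exit
  end
rescue => e
  log << "rescued #{e.message}"
end

puts log.inspect  # => ["acquired", "working", "released", "rescued boom"]
```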

Oh my friend, let me introduce you to if statements. They are quite ingenious, and allow you to handle conditions, like error states. You may not be familiar with them.

"Yes it is, which is why I linked to the manpage of pthread_mutex_lock()."

You can link to whatever you want to, the example didn't use pthread_mutex_lock(), hence whatever that function does with signals was and is totally irrelevant.

And, actually, if you think that pthread_mutex_lock()'s automatic release of resources is a good thing and you rely on it, you are endorsing exception handling, because that's the kind of thing exception handling does for you.

"You can link to whatever you want to, the example didn't use pthread_mutex_lock(), hence whatever that function does with signals was and is totally irrelevant."

Oh, so you are incapable of reading documentation?

Oh, I'm familiar with them, like when shitty programmers produce reams of code

The shitty programmer is you, dude. A good programmer learns when he is wrong, he doesn't keep arguing like some arrogant guy who thinks the argument actually matters. You've lost, you're wrong, and your coding style reflects the fact that you can't read documentation.

Does that use pthread_mutex_lock() anywhere? No. From your code, one can't tell what kind of resource is being acquired, what non-local exits are possible, or how they are handled. It's code like yours that led to the introduction of exceptions and handlers.

"A good programmer learns when he is wrong, he doesn't keep arguing like some arrogant guy who thinks the argument actually matters. You've lost, you're wrong,"

That's not a problem with indentation; Python just doesn't have a full lambda construct.

As far as I'm concerned, that's fine. Python has a number of alternative constructs (iterators, with, explicitly nested functions, list comprehensions, etc.) that are individually more limited, but end up encouraging people to write clearer code. Functionally, it's not a limitation.

1. Because it'll break code when copypasting from a website.
2. Because you can't use matching braces feature of your IDE if you don't have braces, and it's harder to see when blocks start or end when you have a method that's longer than your screen height.
3. Because your IDE could autoindent your code if you used braces.

A valid issue. This presupposes that either the poster was an idiot who couldn't be bothered to use a code block, or the website is overly limited. It does happen, but coders worth listening to will usually cushion their code well.

"2. Because you can't use matching braces feature of your IDE if you don't have braces, and it's harder to see when blocks start or end when you have a method that's longer than your screen height."

Try using more than 2 spaces to equal a tab. Yeah, that's sacrilege, I know. Other benefits of brace matching (such as collapsing code blocks) are still entirely feasible, and some editors will do it for you. Besides, without the brace clutter, it's easier to see where blocks

This is pretty rare these days, since everyone uses code-pasting sites like codepad.org and github's gist, but it is a valid point.

"2. Because you can't use matching braces feature of your IDE if you don't have braces, and it's harder to see when blocks start or end when you have a method that's longer than your screen height."

What the heck?!? You've obviously never coded Python in an IDE that supports Python. Python has braces, parens, and brackets for a lot of things (dicts, tuples, and lists, respectively) and the matching braces feature works fine. Why would it be different? Not only that, but most editors that support Python also offer code collapsing, and indentation works better for this purpose.

The 1970s called and they want their whitespace dependencies back. Everything is cyclical. Freeform languages became the rage after years of being shackled to languages based on punch cards, where position is everything (e.g., Fortran 77). Fast forward to today, and we have people singing the praises of whitespace significance in Python.

Next thing you know, we'll be talking about the merits of running applications on a centralized machine and having users access their applications with thin clients.

That principle allows you to have line noise that parses in other languages; it is harder to do that in Python, so I don't know if it should really be followed to such dogmatic ends. There is still a lot of freedom in presentation in Python, after all. The cstyle and later astyle commands are a matter of taste; some C coders liked them, others didn't. I always like to use whitespace to help me and others understand code, and to have it in a consistent style from the get-go, not have a script or program munge it for

This seems like a terrible idea. What could scientific computing with Ruby possibly offer that SciPy doesn't already? Way to split the potential work force guys. If you want to develop a scientific computing library for a rich dynamic language, then contribute to SciPy. What a wasted effort.

No, seriously, that's the most significant thing it brings to the table. With a minimal amount of functionality implemented, far more scientific problems can be addressed in ruby and integrated into a larger, thriving ecosystem (>38,000 ruby gems and counting). Ruby already has strong web applications; complemented with better science libraries a lot of great science can be accomplished (opentox [opentox.org] is an interesting example of this kind of integration)

But more importantly, numerical computation and visualization can be done much better in Ruby, for a number of reasons:

Everything returns a value. Ruby's better object model means better chaining of computation.
Iterators are way better than for loops (each_slice, each_cons, etc.).
Readability. Ruby is incredibly readable, which makes it uber-maintainable.
Metaprogramming. Sometimes the simplest solution is to write a code generator [github.com]. Sometimes, eigenclasses are the cleanest.
Integration into Rails. The influence of Rails on Ruby is undeniable. Web-based visualization for scientific projects is the future.
R is nice but clunky. The learning curve is enormous. It does some things very well, and others not very well at all (try doing a multi-line plot in R).
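To illustrate the chaining and iterator points above with plain core Ruby (no SciRuby needed; the data is made up):

```ruby
readings = [3.0, 4.0, 5.0, 9.0, 6.0]

# each_cons(2) yields a sliding window of adjacent pairs; since
# everything returns a value, the steps chain without temporaries.
deltas   = readings.each_cons(2).map { |a, b| b - a }
smoothed = readings.each_cons(2).map { |a, b| (a + b) / 2.0 }

puts deltas.inspect    # => [1.0, 1.0, 4.0, -3.0]
puts smoothed.inspect  # => [3.5, 4.5, 7.0, 7.5]
```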

Unfortunately it ignores the alternative of just using Python.

There's also a (I'll be charitable) silly discussion in this vein on the same page:

Ruby has no equivalent to the beautifully constructed numpy, scipy, and matplotlib libraries for Python. We believe that the time for a Ruby science and visualization package has come and gone. Sometimes when a solution of sugar and water becomes super-saturated, from it precipitates a pure, delicious, and diabetes-inducing crystal of sweetness, induced by no more than the tap of a finger. So it is, we believe, with the need for numeric and visualization libraries in Ruby.

IMO, this is a misguided waste of time and it's nearly inactive anyway.

The option of using python is implicitly rejected. Why would the contributors spend time on sciruby when they clearly know scipy exists? Speaking for myself, I've used python and scipy (along with numpy and matplotlib) for several projects, but I much prefer coding in ruby to python. Not all the functionality of scipy is going to be duplicated in sciruby, but if the most common use cases are implemented, then I can use ruby for most projects.

"this is a misguided waste of time"

Why? It's easier to re-implement (i.e., borrow from scipy) than to implement in the first place, so it doesn't take all that much time. And, as pointed out, this is currently a minor project compared to scipy, so if it is a waste of time, it's not a lot of it.

Python has a "there is only one true way" mentality, so there isn't a lot of room to try and innovate within scipy. Perhaps sciruby will innovate in significant ways and scipy will draw a little from it one day.

I agree that since this doesn't have a lot of contributors it isn't a huge waste of time, though that's not a good argument. The question is whether it's worth the resources to reinvent the wheel (even just porting entails a lot of work, and I'm not sure that's what they're after here) just so people can use their preferred language. I would say no, since the difference between the languages isn't enormous. It's unfortunate, but tradeoffs need to be made and not every language can have nice numerical and algebraic libraries.

I think "had" would be a better way of putting it. Python set out to be the anti-Perl. This required making sure there was no bizarre behavior that coders could exploit to make their code more unreadable, and easier to break. There are a few kinks that are more limiting than I would like, but the language has gained a lot more flexibility over the years.

Sorry if I caused any offense. I don't know them; I'm sure they are great. I was just replying to the list of reasons, which I thought was weak. Sometimes I get cynical and snarky; again, I apologize. I should have stopped before all that you quoted; the words I wrote before that were a reasonable response. Basically, not much has kept up with visualization, especially in a browser. I worked on that as my first job after graduation (wow, now more than fifteen years ago), it s

Well, Ruby & Python are just like Islam & Judaism, or vim & Emacs. They're very similar, yet distinct, and once you've chosen one path, it's hard to take another.

I'm really good at using Ruby, know the inner workings, wrote many plug-ins and applications, and contributed to the project. There are many cool projects for Python that I'd love to understand, use and contribute to.

I tried some tutorials for Python, but it feels like a waste of time when I already know an OO high-level scripting language.

This Ruby/Python thing has gone on long enough. Here we have two languages with identical use cases, identical advantages/disadvantages, and (in the grand scheme of things) almost identical properties in every way. The practical differences between them stem almost entirely from the fact that they happen to be used by different communities, so certain modules in each are much better developed than in the other.

The fact that they both exist has split development effort to the detriment of both. For example,

I've been writing Ruby for a couple of years now and a couple of weeks ago I went for an interview about a position with heavy Ruby coding.
During one of the interviews, the guy hit the nail on the head:

A lot of people don't write Ruby when they're coding in Ruby.

In reality, Ruby practices make a lot of sense under the context of the language and when you get comfortable with them then you've reached a position when you can take advantage of what Ruby has to offer -- mainly brevity (without sacrificing clarity) and flexibility (crazy introspection and meta-programming capabilities).

There are a lot of things in the Ruby universe that suck (the interpreter can be quirky, gem management can be a bit flaky, documentation could be better), but the language proper is not one of them.
However, (surprise, surprise) like with anything else in life, you've got to weigh the pros and cons and find something that suits you and/or your needs.
Ruby suits a lot of people's needs, that's why it exists, obviously -- so no, we're not where we were 20 years ago, not by a looong shot.

A Ruby method either returns explicitly with "return expression" or just returns the last expression it evaluates. That's similar to a number of other languages. There are also languages that return the value assigned to the name of the function (Matlab, almost, in a slightly more bizarre way; Pascal, if I remember correctly). Ruby gives you the choice.
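Concretely (a trivial sketch of my own, not from the thread):

```ruby
# Explicit return...
def double_explicit(x)
  return x * 2
end

# ...or implicit: the method's value is the last expression evaluated.
def double_implicit(x)
  x * 2
end

# Even 'if' is an expression, so branches yield values directly.
def sign(x)
  if x >= 0
    :nonneg
  else
    :neg
  end
end

puts double_explicit(21)  # => 42
puts double_implicit(21)  # => 42
puts sign(-3).inspect     # => :neg
```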

My biggest complaint with the language is that it does not remain consistent across even minor language versions. You haven't really lived until you've had to make two libraries that require two different and incompatible versions of Ruby work together. I've been burned a few times in the gem system around object versions. New library required by some program introduces an incompatibility with some other library that other things depend on, makes a huge mess.

"My biggest complaint with the language is that it does not remain consistent across even minor language versions."

Amen! Nothing gets my goat like all the revisions of code I have to do between the versions of Python... wait we are talking about Ruby? Sorry.

I find this a little hard to swallow since I haven't experienced too much inconvenience between minor versions of Ruby. Of course both Python and Ruby are relatively young and prone to these minor inconveniences.

The fellowship is a summer long with only a $1,500 stipend. The most recent commit [github.com] is from December 1st, 2011. The wiki and issue tracker appear to be similarly inactive. Even if the project does something, it probably won't do much; contrast it with numpy commits [github.com] which are recent and numerous.

This story should never have been accepted. There are a million minor projects like this that similarly aren't newsworthy enough to discuss.

The slowdown in commits to sciruby proper is due to the recent efforts on nmatrix (the subject of the story). The github commit history [github.com] is easily accessible and shows a flurry of activity. Many projects associated with sciruby are also not housed directly under the sciruby name (e.g., rubyvis [rubyforge.org]).

"There are a million minor projects like this that similarly aren't newsworthy enough to discuss"

... yet here we are...

The lack of comparable scientific libraries is the primary reason many folks choose python over ruby.

Really? I see one active contributor on these five SciRuby repos [github.com] (who is responsible for the "flurry of activity" you describe) with an overall trend downwards. The port you linked is also essentially inactive, though it may simply be complete. By contrast there's half a dozen different contributors on the first numpy commit page I linked alone.

I've been using Octave (an open source version of Matlab) in Stanford's online PGM course [coursera.org]. My first reaction was "great matrix manipulation library, extremely bad language". It's like time travelling to the 70's and discarding all the progress CS has made in the last forty years. Actually Matlab has object-oriented classes now, but somebody commented in the PGM forums that they're not so good. (Octave uses an older Matlab OO syntax I'll be merciful enough not to comment on.) I don't have any direct experience with R, but on the PGM forum I read that its status is not so different.

My suggestion to the scientific community is to work on replacing those old languages with something modern, even Python, which I cordially hate because of that white space thing. Obviously you need a fast (written in C) scientific library, and an interactive prompt is extremely handy. Python and Ruby are sensible choices IMHO. Matlab and R won't disappear, Cobol didn't go away, but there is no reason why a 20-year-old student shouldn't start coding with a modern language, if it's on par with the old ones (a big if, I know).

What people fail to understand is that OOP is a bad paradigm for numerical computing. It's ill-suited to vectorization and parallelization.

What people fail to understand is that >95% of any big scientific code is not performance critical, parallel, or vectorized. It's that code that needs to be written in OOP. That's why a good scientific language needs both good OOP support and good vectorization/parallelization support these days.
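For the non-critical glue, even Ruby's standard-library Matrix class (stdlib, not SciRuby's NMatrix) shows what the OOP side buys you: operators are ordinary methods, so numeric code composes naturally. A minimal sketch:

```ruby
require 'matrix'  # Ruby standard library

a = Matrix[[1, 2], [3, 4]]
b = Matrix[[0, 1], [1, 0]]  # column-swap permutation

# Operators are just methods on Matrix objects, so expressions chain.
product    = a * b
transposed = a.transpose

puts product.to_a.inspect     # => [[2, 1], [4, 3]]
puts transposed.to_a.inspect  # => [[1, 3], [2, 4]]
```

The performance-critical inner loops still belong in compiled code (which is what NMatrix wraps); this is only the object-model side of the argument.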

Funny you should mention Fortran. Actually, this Ruby thing might actually work, but only if numerical analysts, scientists, engineers, etc. (non-CS types) like it better than Python. Many such domain experts seem to prefer Fortran or C/C++, perhaps because they learned it in school and/or because everyone in their field before them used it to build the existing code bases. Python does have a hell of a head start, though.

With FORTRUBY, scientists and engineers can jump right in and start programming in what they're most comfortable with, Fortran, but the Fortran code isn't actually compiled; it's interpreted by a Fortran interpreter written in Ruby.

This allows scientists and engineers to transition gracefully to a modern language like Ruby. At first, they can just write in Fortran and insert calls to the Ruby library.

RIIIGHT... You know that in most cases this FORTRAN code runs for days on clusters composed of tens, if not thousands, of nodes? You know that they try to shave every little bit of performance out of it, and that is the reason why they use FORTRAN instead of the nicer C. (Yes, it might not be 100% true that FORTRAN is faster.) And then you want to interpret exactly this code... You can argue as much as you like about languages and semantics... but unless an efficient executable falls out at the end, it is