Posted by timothy on Sunday March 20, 2011 @05:37PM, from the user-friendly-is-hard dept.

An anonymous reader writes "Java is performant, widely adopted, and eminently portable; however, its syntax is largely inherited from C++, along with some of that language's esoteric unfriendliness. Mirah aims to put a friendly face on Java by implementing a syntax whose primary concerns are developer friendliness (think Ruby/Python/Groovy) and the route of least surprise. The result is a truly cogent alternative syntax delivering readability, expressiveness, and some compelling new language features."

So far, Mirah looks to me like a waste of effort. It has somewhat nice syntax, granted, but if you really want to use Ruby syntax on the JVM, there is already something that does that: JRuby.

If you just want simplified syntax, Groovy is just as simple and looks more familiar to Java programmers.

If you want simplified syntax and powerful new programming tricks, Scala and Clojure do this far better. If you ignore the Scala libraries and half its features, you get everything that Mirah was designed to do.

The language designers should do a better job explaining why this is worth paying attention to.

Well... it depends on what languages you started with. Looking at that, the C++-based code is perfectly readable, but I can't make heads or tails of the other. In fact, it reminds me of Perl and Objective-C: just start hitting all those shifted characters, since they each signify something special.

This I don't get. If you want "developer friendliness" and "least surprise", use a syntax with the (relatively) minimal set of keywords/tokens to accomplish your task. Ruby has basically incorporated the syntax and conventions of every major programming language of the last 30 years...

And I guess you could call it "developer friendliness" if you want to let people freeform-program in whatever style they want, with no two developers tending to use the same syntax for the same implementation. But at this point in my career (i.e., having worked for half a dozen companies and realized that what you write now may exist for decades), I consider a major component of "developer friendliness" to be "easily comprehensible and maintainable by the next developer."

If C/C++ were so developer-unfriendly, you would not find them under the hood of pretty much every new language claiming to be "developer friendly, easier, etc." that came out in the last 20 or so years. Take Java, .NET, and the rest: all have some sort of C under the hood. It's not that C/C++ is not developer friendly; it's that a lot of developers (apparently) are simply not that good to begin with. While Java, C#, JRuby, etc. make things "easier" (according to some people, at least), they ALSO hide a lot of things that a GOOD developer _should_ know in order to understand what is going on under the hood.

I see and hear a lot of kids fresh out of university who have never even used C/C++ (which frankly baffles me to this day) ask questions like "What is a reference?" or say "Oh, I thought the GC would take care of that."

He did share them... Stop trying to "solve" the problem of programmers needing to know how to program by writing languages that try to cover for dummies. We have plenty of languages that dummies can use safely. That problem has been solved.

Computer languages are like power tools. They can only be made "so safe, but no safer" before they start losing function. They just _don't_ make a band saw that any 12-year-old can use in complete safety. The same goes for a jointer, circular saw, planer, or just a good old knife.

Tools are _not_ supposed to be _safe_. They are only supposed to be "no more dangerous than necessary". We make cheap plastic toy versions of tools for kids to practice with, but eventually they either go away or they have to learn to use the real thing.

I will believe in these "for dummies" versions of languages the day a contractor shows up at my house with a Fisher-Price nail gun that will actually hold up sheetrock but is safe enough to hand to my neighbor's 5-year-old.

The "clever new idea" is that people don't _deserve_ to operate in fields where they are unwilling or unable to learn the skills required. This is just as true of my profession (programming, etc.) as of any other: my programming acumen doesn't mean I should be able to walk into CERN and have a go at the LHC, since I know squat about high-energy physics.

Problem is: when I look at your Python code, I don't know if I'm looking at spaces, or tabs, or some combination of both; not without a hex dump or something. And with one invisible character out of place, God only knows what sort of unexpected results I get.

Also, I probably cannot cut-and-paste your code into mine and have it work without substantial modification.

Then there is the serious issue of emailing code, or cutting-and-pasting it from a website.
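The tab/space complaint above can be made concrete with a minimal Python sketch (the snippet strings and names here are my own, not from the thread): two function bodies that render identically in most editors, where the one mixing a tab into space-based indentation is rejected by Python 3 outright.

```python
# Two versions of the same function body. On screen both look identically
# indented; only a hex dump reveals that the second version sneaks a tab
# into the last line.
src_spaces = "def f():\n    x = 1\n    return x\n"  # spaces throughout
src_mixed = "def f():\n    x = 1\n\treturn x\n"     # tab on the last line

compile(src_spaces, "<spaces>", "exec")             # compiles fine

try:
    compile(src_mixed, "<mixed>", "exec")
except TabError as err:
    # Python 3 refuses indentation whose meaning depends on tab width
    print("rejected:", err)
```

Python 3 at least fails loudly here (a `TabError`); Python 2 would silently interpret the tab as eight columns, which is exactly the "god-only-knows what results" scenario.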

Too bad that practically everybody on Slashdot thinks of BASIC as GW-BASIC. Most versions of BASIC that are less than 30 years old actually get it right: no curly brackets, and no counting spaces and tabs either.

Really? I mean, you're missing such basic questions as: do you want the ghosts to catch Pacman every time, or do you want him to be able to escape? Do you want Pacman to keep moving after the joystick is returned to center, or do you want him to stop? The problem with every simplified language I've seen so far is that you still need to be able to express what you want precisely, and that adds complexity to your language. You have to figure out what you want the computer to do in the corner cases.

Unless you have AI behind your language, but then it's not really the same.
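The joystick corner case above can be spelled out in a few lines; this is a hypothetical Python sketch (the `step` function and its parameters are invented for illustration) showing that even "move Pacman" hides a design decision that no simplified syntax can decide for you.

```python
def step(pos, stick, last_dir, keep_moving=True):
    """Advance one grid cell. stick is (dx, dy); (0, 0) means centered."""
    if stick != (0, 0):
        dx, dy = stick        # player is actively steering
    elif keep_moving:
        dx, dy = last_dir     # arcade rule: keep drifting in the last direction
    else:
        dx, dy = 0, 0         # alternative rule: stop dead when centered
    return (pos[0] + dx, pos[1] + dy), (dx, dy)

# Identical input, two deliberately different designs:
print(step((5, 5), (0, 0), (1, 0), keep_moving=True))   # ((6, 5), (1, 0))
print(step((5, 5), (0, 0), (1, 0), keep_moving=False))  # ((5, 5), (0, 0))
```

Either behavior is one `if` branch; the point is that somebody still has to choose, and a "friendlier" syntax doesn't remove that choice.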

I don't remember where I read it, but I read somewhere that Ruby works as a "programming skill amplifier." As in, if you're a great programmer, Ruby allows you to write beautiful code, but if you're a poor programmer, Ruby will allow you to write the most hideous thing that your processor has ever seen.

And I agree. For better or worse, I think it's a testament to the power that lies in the language.

Apple didn't choose to use it because C++ wasn't different enough; they chose it because that is what NeXTSTEP was written in, back when Objective-C and C++ were both still in their infancy.

There were better reasons than this. C++ is not well suited as a systems programming language. That's why there's still not a single major OS that uses the C++ object system at the system level. BeOS was the single failed experiment in this regard - remember the loveliness of "reserved slots"?

Objective-C's dynamic features make it a much better foundation for systems programming. Its relative simplicity is also attractive.