
First-time accepted submitter Gavin King writes with news that the Ceylon language hit 1.0: "Ceylon 1.0 is a modern, modular, statically typed programming language for the Java and JavaScript virtual machines. The language features an emphasis upon readability and a strong bias toward omission or elimination of potentially harmful constructs; an extremely powerful type system combining subtype and parametric polymorphism with declaration-site variance, including first-class union and intersection types, and using principal types for local type inference and flow-dependent typing; a unique treatment of function and tuple types, enabling powerful abstractions; first-class constructs for defining modules and dependencies between modules; a very flexible syntax including comprehensions and support for expressing tree-like structures; fully-reified generic types on both the JVM and JavaScript virtual machines; and a unique typesafe metamodel. More information may be found in the feature list and quick introduction."
If you think Ceylon is cool, you might find Ur/Web interesting too.

I'd like to see a type system that can help enforce units (like mass * distance = force). If it were really lightweight (in typing) to create types with meaningful units, it could protect you from accidentally adding things like Mbits with MBytes.

I can see enforcing types for, say, current and resistance, but what about the fact that current times resistance is voltage? I can see having a voltage type too, but do you have to write the rules for every combination of units that results in a different unit, or is there a more clever way to handle that?

Actually, you're probably better off implementing the whole job lot of the physical sciences in scalars, vectors, tensors, yada, yada. Then it wouldn't be anywhere near as difficult to figure out which type you get out the other side of the equations, especially when dealing with cross products and phasing.

Haskell allows types to be in a hierarchy: A => X(A). So, for example, you could have acceleration defined as an adjustment type time(distance). And then you could define a function to take A(B) and Y(A) to Y(B) in an abstract way.

map, for example, takes a function from a to b and an array of a (i.e. a structured type on top of a), and returns an array of b. Haskell's number classes are a good example of this sort of hierarchical layering.

So yes, you can do it to some extent in Haskell. If you want to go further Q

I tried to do that once; it turns out to get pretty complicated if you don't have a complete symbolic-manipulation library at your disposal. A simple unit multiplied or divided by another unit was easy, but the results became hard to simplify when one unit was raised to the (1/3) power; my C++ class system couldn't handle it.

I ended up simply requiring our group to always put units on their variable names so we could follow the code easily.

I find one of the most effective techniques is just to enforce the rule that any variable representing a physical quantity must have its units appended, using standard SI notation, like massCannonBall_kg or distanceToOuthouse_m. It also helps to insist that all units be base units. With floating point, why use _kOhm or _MOhm when you can just use _Ohm for everything? Sticking to base units eliminates confusion and errors, and you can easily convert for convenient user I/O.

I'd like to see a type system that can help enforce units (like mass * distance = force). If it were really lightweight (in typing) to create types with meaningful units, it could protect you from accidentally adding things like Mbits with MBytes.

C++11 has very good support for this. You could do it in C++98 with operator overloading and templates, but the syntax was horrid. Combine that with auto type declarations and user-defined literals and you're golden.

From the above link: One of the distinguishing characteristics of Frink is that it tracks units of measure through all calculations. This allows all values to contain a quantity and its units of measure. Frink understands how different units of measure interrelate, such as a length cubed is a volume, or power multiplied by time is energy. Different units of measure can be mixed in calculations, and Frink automatically ensures that the calculations lead to a result wi

Wow, wish I'd known that when I wrote my aurora-prediction system, my demonstration database, my point of sale system, and my meter generation software. I didn't know it was supposed to be difficult, so I committed the ultimate sin of writing, debugging and using these things quickly and easily. I just didn't know I was supposed to struggle with something more difficult. Can you ever forgive me?

No; I've always written code in an indent-structured manner, it's both natural to me and an aid, not a detriment, to my coding. I've even written (and had published) technical magazine articles on it, back in the day when that actually meant something. If your style (and I use that word loosely) is K&R or some other inherently indent-unfriendly method where braces or other forms of structure are visually unbalanced and code that is structurally dependent isn't visual

Well, technically it would have been better altered to a colon if the semicolon-separated list were to parse correctly, though a human brain would just issue a warning rather than an error here.

I'm sorry, I'd really like to be interested in this, but the second and last sentences are just the usual marketing bullshit that every newest language has been serving us in recent years.

So, here comes the usual question we always end up asking when such a thread shows up: could you give us a simple and clear explanation of what is so good about it, as well as the traditional comparison of the advantages/disadvantages of this language versus other similar ones?

To the extent I've looked at it, D seems like a great language. Maybe it doesn't get much attention because it's not one of the new cool (i.e. rehashed) languages, but it'd fill an important role. In many ways it's C++ done right. That fills the need for a language that's higher level than C, without all the historical baggage and minefield complexity of C++, and compiles to fast binary. I know lots of people these days say Java (or whatever your favorite slower language is) is fast enough, programming cos

If I were making a language that was C with a few more features, I wouldn't stop at just better string handling. I would also include a method of inspecting the heap so a function passed a pointer knows just how much storage is available at that pointer. I would also steal C++'s single-line comment delimiter (//) and pass-by-reference mechanic. I'd also build a first-class regex utility into stdlib, but that's just me.

How many new languages really have anything new or interesting? Most just seem like whatever struck the author as a "best of" list. I'm learning Haskell out of curiosity, because it's genuinely different from other languages I know. However, knowing Perl and Python, learning Ruby or Lua strikes me as about as interesting as watching paint dry. You can argue until hell freezes over about the pros and cons of different ones, but is there really that much difference? I'm not worried about saying that, because I'm

The one really useful "new" language that I've seen this past decade has been Scratch, something that I think could be turned into a production language (as opposed to the tutorial language that MIT uses it for). There have been some attempts to do just that by some other third-party developers, but it certainly isn't widespread, and they largely take cues from what MIT is doing with the base language.

The GUI development environment is definitely useful, and I like how Scratch does multi-threaded applications

Reading the language description, I don't see anything notably distinct from Scala. If anything, Ceylon seems a bit clunkier. The one upside appears to be baked-in translation to JS, but others have already provided a Scala-to-JS compiler.

Serendib is the Arabic name for the island, and Serendip the Persian. A Persian folk tale, "The Three Princes of Serendip," became known in England through a 16th-century Italian translation. The protagonists of that story were in the habit of discovering things they were not seeking. In 1754 Horace Walpole, English writer and parliamentarian, coined the English word "serendipity" to capture that happy characteristic.

Taprobane, a Greek name for the island, would be another literary allusion.

Since Ceylon seems to be Java inspired, I looked for a comparison of the two but didn't find one. Wikipedia's entry about it [wikipedia.org] says, "The project is described to be what a language and SDK for business computing would look like if it were designed today, keeping in mind the successes and failures of the Java language and Java SE SDK." That's nice, but can anyone here supply some details?

>combining subtype and parametric polymorphism with declaration-site variance, including first-class union and intersection types, and using principal types for local type inference and flow-dependent typing; a unique treatment of function and tuple types, enabling powerful abstractions; first-class constructs for defining modules and dependencies between modules; a very flexible syntax including comprehensions and support for expressing tree-like structures; and fully-reified generic types, on both the

Why build a better type system, or any type system at all, when I can just call everything an Object? A better type system makes it easier for developers to create correctly working, efficient code. Coding for concurrency is very difficult, and a proper concurrency approach (a la Go or Rust) makes it much easier for developers to develop correctly working, efficient code. Add-on libraries like Akka make things a little easier, but can't help nearly as much as a properly designed concurrency architecture.

If they could use Ceylon over Python for the system tools, that'd be a godsend. I know there are some good Python programmers out there, but not everybody writing Red Hat system tools is a good Python programmer. Node.js is coming along nicely, and that'd be a great combination.

A bit late, but Ceylon creator Gavin King answered the following question in an interview [jaxenter.com] posted today:

Finally, do you think that, going forward, Red Hat will start coding more in Ceylon?

The first step for us will be to bring some of our pieces that we have in the JBoss ecosystem that we delivered as pieces of the application server, and repackage them, and make them modular, and make those modules for the Ceylon platform.

At the same time as that, we're taking Ceylon, and we're enabling deployment to Openshift. Once we have then the capabilities that we have in JBoss, also for Ceylon, then it's going to be a lot more interesting - what can we do in Ceylon that we can currently do in JBoss?

People often ask me, does Red Hat use Ceylon to build internal projects, and I'm always kind of like, I don't quite understand, we don't have internal projects, we're a product company!

We're sitting here bitching about how much PHP sucks while Mark Zuckerberg sits on billions of dollars from an application still mostly built on the language.

I love language wars, but the truth is while fanatics like me are looking for the perfect blend of our favorite features and abstractions from Lisp/Haskell/Scala/C++/APL/Ceylon/Clojure/Python/Ruby/Perl6/J, the people using PHP, Python, Perl, Java, and especially C and C++ rule the world.

We debate the merits of programming languages because we seek the most efficient tool for writing useful software with acceptable performance and quality. That's the core concept of the discussion - blast or praise features of language X because you think it makes the development process slower, or the likelihood of high quality better, or the runtime performance unacceptably low.

And it's a fun discussion to have, I really enjoy it. But eventually I realized I was spending so much effort seeking the per

PHP didn't even start out as a programming language. Even now, you can happily look at it as a bunch of stuff that makes it easy for people to make dynamic websites. It is clearly quite successful. It's successful because it was really good at doing the job it was, er, "designed" to do.

PHP, then, would fall neatly under the "successful languages" category.

Hate it all you want. Bitch and moan on Slashdot 'till your fingers bleed. It's not going away any time soon. There is no alternative that is even half as easy to set up and use. There is no suitable replacement. That "unusable" language just happens to be the best thing around.

I point out this obvious fact because it drives morons incapable of forming their own opinions crazy. With any luck, they'll stop polluting every programming related thread with their miserable whining.

We should recognize a new class of Language Wars for functional languages. Allow me the honor of firing the opening salvo: Erlang sucks, Haskell rulez! (not that I know Erlang, but why should I bother to learn it when it's such a lousy language).

And what would that have to do with anything anyway? Actually, the popularity of the phrase "You're comparing apples to oranges!" shows that people do think in types, and get annoyed when others treat one type as another unthinkingly.

Consider:
int x;
What happens here? x = 3.1415; Does x hold the correct value? No. But if x were a scalar, then x = int( 3.1415 ); says exactly what is happening. Type safety places restrictions on what can be done, and the programmer has to memorize the rules because they're not natural to his thinking. It's because of all the things that programmers have to memorize that programming is hard. The more things a programmer has to remember, the more bugs he will create. Having to write int() every time he want

BTW, you rounded wrong. Pi to 4 decimal places is 3.1416. However, in order to round 3.1415926 to 4 decimal places "correctly", you should specify the exact type of rounding you want to use. In this case it probably wouldn't matter, but there are plenty where it does. The cumulative effect of rounding a series of numbers can be significant (pun intended).

It's because of all the things that programmers have to memorize that programming is hard.

People learn the difference between integers and reals in grade school.

Having to write int() every time he wants an integer: 1. decreases bugs, and 2. makes the code more readable.

In programming, our challenges include some tightly coupled issues: identifying and removing errors in large programs, and keeping code complexity to a minimum. Strict typing usually eliminates a number of possible errors; unexpected autoboxing is one of them, depending on the strength of the type system. You describe adding a 0 to a string, which is how people think for a simple domain. A type is a domain. An int will (in most languages) not contain 0.01, nor 1i, nor 'A', representative values notwithstanding. So