Posted
by
Unknown Lamer
on Wednesday September 07, 2011 @10:25AM
from the and-you-thought-java-was-slow dept.

snydeq writes "Ruby creator Yukihiro Matsumoto discusses the past, present, and future of the popular programming language, calling mobile the next target for Ruby: 'I'm currently working on an alternative subset or dialect of Ruby for the small devices. I'm going to make it public early next year. Of course, mobile computing is the way to go, so that's one of the reasons I focus on the Ruby dialect working on the smaller devices.'"

Most of those smaller devices don't require subsets of functionality or features. Any general purpose language that doesn't require a super heavy runtime environment or a bazillion linked libraries should get along fine with an API to interact with events generated by the device. See Android, etc.

Can't speak for the GP, but I happen to like JS... I don't see that Ruby really offers much over Perl, except that neither has had as much effort put into optimization as JS has seen in the past few years. As Node is increasingly used for highly IO-driven workloads, it's proving to be a good platform for a lot of things. It's the default language of in-browser development (which is arguably where most development happens these days, though not always the most high-profile development).

Obviously, rather than compiling to JS and running on the preinstalled JIT, we should have runtimes for Ruby, Python, Lua, PHP, Perl, and every other scripting language under the sun installed on our mobile devices. Makes no sense from an engineering perspective, but at least then people who are irrationally and emotionally invested in a pet language would STFU!

You seem irrationally and emotionally invested in javascript.

If javascript is the only runtime preinstalled, using javascript seems like a pretty rational thing to do to me...

Android's version of _Java_ is quite complete. It's certainly not a MIDP 1.0-type subset on the VM side; you're not lacking things like floats, for example. APIs are another matter, of course. But what would you do with a Mouth class if you had no Face to scream from?

If he's talking about things like hardware abstraction APIs, then that's another matter, and highly platform-specific. UI classes are also another matter.

There's nothing really wrong with TFA, but there's nothing there either. It's so bland. The questions are just "Why did you create Ruby? What's next for Ruby?" I mean, seriously? If you were interviewing someone for a high school newspaper that might be OK, but they really can't do anything better? There's nothing more interesting you could ask Matz?

Of course, mobile computing is the way to go, so that's one of the reasons I focus on the Ruby dialect working on the smaller devices.

While I do have Ruby on my N900, I wish him the best of luck in his goals. Between the attacks from the Apple and MS camps on Android, and little to no attention being paid to real solutions like MeeGo, all we'll be left with in short order is anti-geek platforms like Windows Phone and iOS, where running things like Ruby (or Python) is expressly verboten.

Because my experience with Ruby on mobile devices is no indication of what the mobile space will look like in the very near future. I expect in the near term, being able to put a scripting language like Ruby on a mobile device will be hard if not impossible. And I am not optimistic.

The real question is why in the world would any non-developer (and I use the term lightly) run a CPU-burning, battery-draining, GC-requiring interpreted language on a mobile device in the first place? Python / Ruby / Perl / Tcl: all of these are prototyping / utility languages. They're not designed to build low-overhead / low-cycle apps. I can see why developers who aren't proficient in C / C++ / Objective-C want them, but in the context of user experience, there's absolutely no net benefit to using these languages in creating apps for mobile devices.

Actually, there is. Most mobile devices are not rebooted frequently and have a relatively small (compared to desktops) amount of RAM. Also, many mobile apps are long-running; an app might run for weeks before being closed or before the device is rebooted. This is a recipe for heap fragmentation. Garbage-collected language runtimes don't fragment memory, or if they do, they can clean it up very easily. Once a C malloc hands you a block of RAM, the OS and language runtime cannot move it around to reduce fragmentation.
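The fragmentation point can be sketched with a toy first-fit allocator (illustrative only; the arena size, function names, and first-fit policy are mine, and a real malloc is far more sophisticated). Because live blocks can't be moved once handed out, a large request can fail even when enough total memory is free:

```c
#include <stdio.h>
#include <string.h>

/* Toy first-fit allocator over a 12-cell arena: 1 = used, 0 = free. */
#define ARENA 12
static char arena[ARENA];

/* Return the start index of a run of n free cells, or -1 if none exists. */
static int toy_alloc(int n) {
    for (int start = 0; start + n <= ARENA; start++) {
        int run = 0;
        while (run < n && arena[start + run] == 0)
            run++;
        if (run == n) {
            memset(arena + start, 1, n); /* mark the cells used */
            return start;
        }
    }
    return -1; /* free cells exist, but no contiguous run is big enough */
}

/* Free a previously allocated block; the holes it leaves are never compacted. */
static void toy_free(int start, int n) {
    memset(arena + start, 0, n);
}
```

After filling the arena with four 3-cell blocks and freeing two non-adjacent ones, 6 cells are free yet a 4-cell request fails: that is fragmentation. A compacting GC could slide the live blocks together; this allocator, like malloc, cannot.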

all we'll be left with in short order is anti-geek platforms like Windows Phone and iOS, where running things like Ruby (or Python) is expressly verboten.

I'm not aware of any restrictions on writing apps in Python (or any other language) for Windows Phone. The problem is that it doesn't have a Python implementation targeting it: CPython requires a native code compiler, which is not available, and IronPython requires the DLR, which relies on certain .NET features not available on WP. It is still quite possible to implement something less efficient than IronPython on the WP subset of the CLR, and then write apps in it.

I am not really a professional coder. I don't even work in computer-science-related jobs. I just spent my limited brain power somewhat successfully learning a portable language like C, with GL and such. Because of that, I am not as passionate about coding as many users on Slashdot. For me it's a means to an end (game dev), and a pretty painful one at that.

Without time and resources ($$$) to buy books it was a pretty slow and painful experience (especially because asking for coding stuff usually has 2/3 pe

There are books on Android programming; several of them (I like "Hello Android") have great code examples. The Android SDK (and emulator) is free, and uses the (also free) Eclipse IDE. The book isn't free, but the SDK (with code samples and tutorials from Google) and IDE cost nothing but your time to download them. Pick something and DO it. You can! You will mainly get better at coding by writing code, so make up a toy application.

Actually the promise came about because someone, years ago, was concerned about starting a game because of the costs in software and hardware. Writing my first game was painful but proved that it was possible; it took a few months to get the engine rolling (C89 + GL + SDL +

What I do to make it interesting for myself is to decide on one particular application, and then build that application in every language I know, and every language I learn.

It teaches me the differences between the languages, and some of their relative strengths and weaknesses. (plus it gives a portfolio of my work, since the programs I work on for my job are not showcaseable).

Lots of people would be bored to death by that approach, but I bet they each have their own.

> I feel like all the training I did to be able to code games in a PC is going to be obsolete before I know it.

Oh, there it is!

But really, it's not that mobile computing is "the way to go", just that desktop computing is no longer "the only way to go".

People who've been closed-mindedly programming their Windows apps for years are scared that they now have to think in different terms, be those architectural terms like "do I need to think about 64-bit processors" or "do I need t

Actually I develop cross-platform FOSS games, so I'm not really stuck with Windows apps, you know ;) That I code games doesn't mean I don't take it seriously, despite (as pointed out in another post) not being as passionate about it as other people here.

To be honest, I find portable devices expensive and not powerful at all. In my lunch break, with my cheap netbook, I can do everything I can do at home (2D work, 3D work, coding, tracker music, compiling, debugging, etc.). It's the only way to develop a game while havi

Speaking of cross-platform, I wrote a program in Perl that works on OSX, Solaris, OpenSolaris, AIX, and Linux (Ubuntu, RedHat, etc.). The main problem I'm facing is that Linux distros don't generally include libraries that allow my program to easily make https connections (http works).

Compiling the https stuff for each and every distro (and significant version) and bundling them sucks.

I'm not aware of many ways around it. I need the stuff to be able to run on as many OSes and OS versions as possible, and the downl

Well, yes, I actually do different builds for and on my target platforms. It's an extremely brute-force, simple approach, but I've never had a complaint from users. For the rest I provide the sources and detailed install instructions (download this, open this, type this, done), plus a dummy build script in bash for those without autotools, and a more precise script for those on Debian systems (it fetches and builds everything).

Thanks. How do you deal with different versions/releases of SUSE, openSUSE, Red Hat, Ubuntu, Debian, Mint, Fedora, etc.? Do you find that in practice they tend to be backward compatible within the distro? So you just build for the oldest supported distro (with the latest backported patches of libs if possible... ugh) and it works on the newer distros?

Or do you find you end up having to build for each distro version? If so it probably means that we might not bother with supporting "Lin

I feel like all the training I did to be able to code games in a PC is going to be obsolete before I know it.

Training rapidly becomes useless. Education never becomes useless.

Memorize how to use a linked list library in Pascal = rapidly useless

Learn what a linked list is, why and when you'd use it = useful forever
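The parent's distinction can be made concrete. A linked list is just nodes pointing at nodes; that concept outlives any particular library's API. A minimal C sketch (names are mine):

```c
#include <stdlib.h>

/* A singly linked list node: a value plus a pointer to the next node. */
typedef struct Node {
    int value;
    struct Node *next;
} Node;

/* Prepend a value; O(1), which is the classic reason to reach for a list. */
static Node *push(Node *head, int value) {
    Node *n = malloc(sizeof *n);
    n->value = value;
    n->next = head; /* new node points at the old head */
    return n;       /* new node becomes the head */
}

/* Walk the chain of next pointers, counting nodes. */
static int list_length(const Node *head) {
    int count = 0;
    for (; head != NULL; head = head->next)
        count++;
    return count;
}
```

Memorizing some Pascal library's calls for this is the "rapidly useless" training; understanding why prepending is O(1) while indexing is O(n) is the education that transfers to every language.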

Also, much like human languages (supposedly), the first three languages are pretty tough, but once you learn a bit of ten or so, it's pretty simple, since all the concepts are the same. The hard part is knowing how to index through an array without a picket-fence mistake, and figuring out how to troubleshoot it. The easy part is remembering (or googling) the syntax.

And you just committed the error in public. A 10 m section of fence with posts every metre requires 11 posts, unless you ignore that pesky last metre. If this is too hard to grasp, consider a 2 m long fence: you'll have a post at the beginning, one in the middle, and one at the end. Total: 3 posts, 2 spaces.

As a better answer to the two others who posted, I will give an example. C and C++ (and others) start array indexing from zero. If you have a 10-element array, in C/C++ you access the last element as myarray[9]. You know that you have 10 elements, and if you are not careful you might try to index the last element as myarray[10]. This will generally cause errors in your program, and you hope the compiler can catch them for you. In languages like C/C++, this could even be a hidden error, since if you are no
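The off-by-one described above can be shown in a few lines (a minimal sketch; the function and array names are mine):

```c
/* Sum the first n elements of an array. Valid indices are 0..n-1,
 * so the loop condition must be i < n, not i <= n. */
static int sum_first_n(const int *a, int n) {
    int total = 0;
    for (int i = 0; i < n; i++) /* i runs 0..n-1: exactly n elements */
        total += a[i];
    return total;
    /* The picket-fence variant, `for (int i = 0; i <= n; i++)`, would
     * also read a[n], one element past the end. C does not check bounds,
     * so the program may silently compute garbage rather than crash,
     * which is exactly the "hidden error" the parent describes. */
}
```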

The stationary computers aren't going to vanish. The mobile devices (pagers, portable gaming devices, phones, laptops, tablets, prosthetic brains) are the ones that will be squeezed into one device. And that's where the extinctions will be.

Most people don't want to carry so many extra portable devices.

So if you had a prosthetic brain that did everything a phone did (and more: virtual telepathy, virtual telekinesis), you would be tempted to not carry a phone - especially if the antenna wasn't passing near an

I've compiled and installed plenty of "unofficial" stuff on OSX and Linux. My experience with the Ruby thing was the first time I've encountered syntax errors in libraries that were part of the official distribution. I intend to stay away.

Ruby is a nice little language. There are a few oddities and the C Ruby interpreter, as Matz admitted, is not very efficient.

Rails, which is what everyone thinks of when they hear Ruby, on the other hand -- well, I'll stay far away from it thanks.

Count me among those developers who never thought that the way to build robust and flexible applications is to first define some database tables, then write a little CRUD code to generate screens. Maybe back in 1989, when SQL databases were new and there were a lo

Matsumoto, when you created Ruby you did not have facial hair. But shortly after growing some, Rails came out and changed everything. Now Ruby is popular. Do you feel this is connected to your facial hair?

It's more complicated than that. For you see, when he started growing facial hair, he started waxing his cock, balls, taint and asshole. (It's some sort of yin/yang thing.) Hairy crotch, Ruby languishes. Bald balls, Ruby flourishes.