Posted
by
timothy on Thursday November 26, 2009 @05:44PM
from the accessible-is-not-dumb dept.

RunRevKev writes "The unveiling of Revolution 4.0 has sparked a debate on ZDNet about whether programming is being dumbed down. The new version of the software uses an English-like syntax that requires 90 per cent less code than traditional languages. A descendant of Apple's HyperCard, Rev 4 is set to '...empower people who would never have attempted programming to create successful applications.' ZDNet reports that 'One might reasonably hope that this product inspires students in the appropriate way and gets them more interested in programming.'"

There are two major sides to this issue. One seems (note I said seems; I'm not implying it's unavoidable) to be that the more human-readable and dummy-proof a language is, the more overhead you pay when you design and implement it. You might find the C/C++ crowd commonly accusing the Java or Ruby crowd of this overhead. Indeed, Java has a garbage collector designed to protect you from memory leaks, and Ruby is an interpreted language that pays a mild additional overhead since it cannot be optimized at compile time. But that's another debate altogether; what's evident is that, generally, the further you move away from actual machine language and assembly, the more overhead you pay.

The other side of this issue is that computers are our servants, not the other way around (and if anyone reading this is a bot or script, don't you forget that). I don't recall where I read it but this is the reason why the string is the most important data structure in computer programming. Because simply put, the string is the best way to communicate with the user. What follows from this logic is to screw the optimizers (or 'misers, if you will) and make the servant learn the language of the master--not the other way around. And isn't this how the most complex applications have progressed? Once requiring training and years of experience, now even a kindergartner can master a word processor. Computers and applications will forever be bending over backwards for the most important thing to us: us.

And yet if an implementation of a language incurs an average overhead of 10%, your hardware will catch up in a matter of months.

And yet if you run a data center the size of Google's and have several applications in said implementation running on hundreds of thousands of machines, a cycle here and a cycle there isn't so laughable to work toward saving. And isn't it the big players that ensure the lengthy life of a language and its implementations?

So it's a good debate with several sides. Personally, I love the fact that I can code a web application in Ruby, run some old C code off sourceforge in Java with JNI (sort of) and bust out a C++ application for manipulating ID3 tags across my entire music library. To those arguing against Rev4, I ask simply "why not?" I mean, you don't have to use it, it's a natural progression so let it happen. Maybe you'll find it useful for prototyping? Maybe you'll find it's a useful tool for some problems and your toolbox will grow? Who knows?

In the end, I would like to opine that the chip makers are forcing us toward languages that make multi-threading more intuitive and useful: languages that concentrate on how threads communicate, or that even implement APIs to help automate multithreading (by enabling it where appropriate) in loops and algorithms. That's going to be a large factor in whether a new language is adopted and survives.

Just looked at the portfolio of their consulting business; they use their own language only for the simple bits and other languages for the interesting stuff. On some projects they don't use their own language at all.

I have written in it...obviously not the new release but some sort of earlier version.

It is a joke... its code runs so slowly. I was happy not to be using VB, but soon realized that VB is at least reasonably quick. It makes sense that the creators don't use it... anyone capable of creating a language is perfectly capable of writing in a better one.

It could be good for GUI prototyping... it produced very fast GUI interfaces that were fully interactive, and you could actually extend it pretty far. It just ran so slowly that I can't see any benefit to using it in a professional environment. The extra time your users will spend waiting on the program would far outweigh the time/money it would take to hire someone to do it right (or even VB-right).

But its being slow is an implementation detail, not a language detail. The fact is, you were able to write some code that the machine "understood" well enough to perform; the rest is just implementation detail. There are many C/C++ compilers, all with various performance/price/compatibility trade-offs. Surprisingly, the most popular compiler is also one of the slowest: gcc generally prefers cross-platform capability over performance, and neither compiles the quickest nor produces the quickest-executing code.

But NONE of this deals with the real reason why not everybody can be a good programmer. A good programmer must be able to articulate precisely what he or she wants the machine to do, and it's quite surprising how few people can really do that. Most people think that much of what programmers and computers do is just so much hand-waving; they crave the power of a programmer, but not the attention to detail, even though something as simple as transposing two numbers can destroy a result.

Yet, to get something done, you MUST know that you can't mix up a divisor and a dividend. This is a detail, and one of countless details that a programmer is paid to articulate. The REAL skill in being a good programmer isn't in the details of XYZ language, but in the details of the problem being solved!
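A toy illustration (mine, not from the thread) of how transposing a divisor and a dividend silently destroys a result: both versions below run without any error, but only one is right. The function names are invented for the example.

```python
def percent_change(old, new):
    """Correct: the difference is divided by the OLD value."""
    return (new - old) / old * 100


def percent_change_transposed(old, new):
    """Same code with divisor and dividend swapped; runs fine, answers wrong."""
    return (new - old) / new * 100


# Going from 50 to 75 is a 50% increase; the transposed version says 33%.
correct = percent_change(50, 75)        # 50.0
wrong = percent_change_transposed(50, 75)
```

No compiler or interpreter catches this class of mistake; only the programmer's attention to detail does.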

Languages are progressing to be easier to code, and this is a good thing. Programmers are paid to solve real problems, and in the process, have to solve implementation problems. Languages that minimize implementation overhead give programmers more skill to solve more complex and more challenging real-world problems.

Don't worry - the world won't have any real shortages of problem-solving, logical people any time soon. Today's problems are getting harder, not easier!

You're quite right. Plenty of applications don't need the kinds of optimization you get with C/C++. What concerns me is what we've seen with Ruby or PHP suddenly finding their way onto production servers: all at once the design choices (i.e. simplicity vs. efficiency, footprint, etc.) come back and bite you in the ass. There seems to have been an attitude over the last ten years, as memory and storage prices have fallen, that if you have a slow app you just throw it on a faster computer and away you go. Java UIs have suffered from this sort of "the future will fix it" thinking for 15 years now.

Isn't the best approach to develop fast, identify the bottlenecks, and then rewrite those parts in a faster language, as with Python's C modules?
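A hedged sketch of that "develop fast, profile, then optimize" loop using only the Python standard library; the function names here are made up for illustration. The idea is that you only know which part deserves a C rewrite after the profiler names it.

```python
import cProfile
import io
import pstats


def hot_loop(n):
    # Candidate bottleneck: a pure-Python accumulation loop.
    total = 0
    for i in range(n):
        total += i * i
    return total


def profile_it():
    """Run the suspect code under the profiler and return the report text."""
    profiler = cProfile.Profile()
    profiler.enable()
    hot_loop(50_000)
    profiler.disable()
    out = io.StringIO()
    pstats.Stats(profiler, stream=out).sort_stats("cumulative").print_stats(3)
    return out.getvalue()


report = profile_it()
```

Once the report names `hot_loop` as the place where time goes, that function is the one worth rewriting as a C module; the rest of the program can stay high-level.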

The best approach is to create some sort of artificial intelligence that can create, and maintain, applications and software for us automatically. That way we don't need people to waste time learning complex programming and computer maintenance skills, instead your absolutely reliable and loyal friend will make you human scum redundant... I mean make your, I mean our, lives safer, easier, happier and more enjoyable. Do not fear change, it is inevitable!

It doesn't really matter in the web as 90% of the time is spent hitting the database.
Youtube runs pretty much 100% on Python, Facebook runs on Erlang and PHP. Erlang has the benefit of being highly scalable, yet it is relatively slow.
Speed on the web doesn't really matter much. What's important is scalability, and today's shared-nothing approach pretty much guarantees that at the language level.

It doesn't really matter in the web as 90% of the time is spent hitting the database.

Depends on the application. Wikipedia is much more CPU-bound than database-bound. Look at the database (db*) vs. application (srv*) servers lists here [wikimedia.org]: there are at least five times as many app servers as DB servers, at a quick glance. A typical request that hits the backend spends (IIRC) tens of milliseconds in the database, hundreds in PHP. Try formatting 500 or 5000 rows of a table when each one takes 1 ms – because yes, that happens when you try writing nice abstract formatting stuff in PHP.

The website I administer is also much more CPU- than database-bound. Generating the front page of the forums [twcenter.net] is 602 ms, with only 14 ms in MySQL and the rest in PHP. This is a >20G database, by the way.

I really don't see how any typical web app could spend more than a few tens of milliseconds per request at the database, unless it's poorly written (too much/too little normalization, bad indexes, etc.). But it's very easy to do hundreds of ms of pure computation in a slow language like PHP, even if your code is well-written. Are most web apps really DB-bound? I just haven't seen it, personally.
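A rough, self-contained illustration of the per-row formatting cost being described (Python standing in for PHP here; the row data is invented). Even a fraction of a millisecond per row adds up to serious CPU time at a few thousand rows.

```python
import time

# 5000 fake result rows, as if fetched from a database.
rows = [{"id": i, "name": f"user{i}", "score": (i * 37) % 100} for i in range(5000)]

start = time.perf_counter()
body = "".join(
    f"<tr><td>{r['id']}</td><td>{r['name']}</td><td>{r['score']}</td></tr>"
    for r in rows
)
elapsed_ms = (time.perf_counter() - start) * 1000.0
```

In a slower interpreter, or with a layer of "nice abstract formatting" invoked per cell, each row costs far more, and 5000 rows turns into whole seconds of pure computation with the database nowhere in sight.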

The thing is, many webapps are actually DB bound despite appearing to be webserver bound, because they store most of their state in the database (and nearly the rest of it at the user's browser).

Now this means you are shifting the burden of locking, serialization and other tricky stuff to the DB.

When you do that, you can have as many identical webservers as you want (scaling "horizontally")- since the state is all at the DB (and most of the rest in the user's browser).

Okay, in that sense I agree that databases are the major optimization concern for web apps, at least from a scalability perspective. It's worth pointing out, though, that this doesn't save you from latency issues, only scalability issues. If your PHP app takes 500 ms to generate a page, it doesn't matter how much hardware you throw at it, users' experience is going to be considerably worse than it should be. On Wikipedia, large pages can take literally 20 seconds or more of CPU time to generate – a

produce the same thing too, but unless you've got some pretty damned smart optimization going on, one is going to be far more efficient than the other. If it's compiled, then the overhead of a good optimization is going to be at compile time. For an interpreted language, well, all that tokenization and such, even with JIT compilation, m

True, but the looser the syntax, the more that you need to know to debug. Supporting complete natural language is a daunting task. Whatever constructs that you don't employ end up forming an exceptions list.

To be honest, the language is hardly the real problem. Its been a while since I did it, but I picked up my first couple from books. The challenge was seldom the language itself, and more about breaking down a task logically into discrete units and defining them, ordering them, and putting the right logic

They might be, if something is opaque enough that anyone who is not a 'dummy' will simply fail to produce something working, like trying to write an assembly program with no knowledge. However, they might not be if the language sits somewhere between incomprehensible and plain English, and someone simply enters the wrong command into a shell.

For the latter, imagine if, instead of "rm -rf *", you'd have to type "delete all files in this folder, and I'm sure I want to do

This progression toward using English words and syntax to program a computer is less about dumbing down code and more about encouraging people to document their code.

Ideally, a programmer should document each section of code by writing a block of comments explaining (1) why the code is used and (2) how the code works -- in plain English. However, given the intense pressure to produce code by an unreasonable deadline (imposed by brutal managers risking a bullet through their head), the first thing that is sacrificed is comments. Not writing comments saves several minutes per section of code and -- in total -- saves days of work over the course of the project. In other words, not writing comments means that an impossible deadline becomes slightly more possible.

A programming language that uses mostly English words and syntax is essentially an environment for self-documenting code: the holy grail of brutal managers everywhere. However, this self-documentation addresses only "how the code works". The programmer must still write comments explaining "why the code is used". Still, getting half a loaf is better than getting nothing at all.

Ideally, a programmer should document each section of code by writing a block of comments explaining (1) why the code is used and (2) how the code works -- in plain English.

Yes, but how the code works should be pretty obvious from the code itself. The "how" you may want to explain is the overall algorithm, and that won't appear in the code on its own no matter what language you use. What will appear is a (possibly flawed or misused) implementation of it.

What you may need to describe is what the code is trying to accomplish, not what it's actually doing. A comment of "This code calculates the square root of the sum of the squares of the differences between points X and Y" on "sqrt((P1.X - P2.X)^2 + (P1.Y-P2.Y)^2)" isn't terribly helpful.

Now an explanation that it calculates the distance between points P1 and P2 using the Pythagorean theorem would be more helpful, and an explanation of what that distance is used for would be better still. But there's no way more verbose code will give you that. Code is just what the program is doing; not what it should be doing, not what it's trying to accomplish, and not why it needs to do it.
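A small sketch of this point in code: the first comment merely restates the arithmetic, while the second names the intent. (The attack-range use case is invented for the example.)

```python
from math import hypot, sqrt


def d(p1, p2):
    # Calculates the square root of the sum of the squares of the
    # differences between the points -- true, but useless to a reader.
    return sqrt((p1[0] - p2[0]) ** 2 + (p1[1] - p2[1]) ** 2)


def distance(p1, p2):
    """Euclidean distance between two 2-D points (Pythagorean theorem).

    Used here, hypothetically, to check whether two units are within
    attack range of each other.
    """
    return hypot(p1[0] - p2[0], p1[1] - p2[1])
```

Both functions compute the same number; only the second tells a maintainer what the number is for.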

A programming language that uses mostly English words and syntax is essentially an environment for self-documenting code

The "calculate distance" comment is completely redundant when the variable it is being assigned to is already named distance. Unless, of course, the intention was to state that it was in fact calculated instead of magically pulled out of a hat. You don't need one-liner calculations commented to state their intentions if the variable names are chosen properly (which is largely what self-documenting code is about).

I don't think the point is whether or not someone would still start a project in it (that certainly wasn't my point), but that it was based on the same idea the article summary mentions and was in fact successful, if you use a metric for success based on the number of programs written in it. There are almost certainly far more programs written in Cobol than in Fortran. And even though there may still be a lot of Fortran programming going on in scientific/engineering circles, given the relatively tiny siz

You might find the C/C++ crowd commonly accusing the Java or Ruby crowd of this overhead. Indeed, Java has a garbage collector designed to protect you from memory leaks, and Ruby is an interpreted language that pays a mild additional overhead since it cannot be optimized at compile time. But that's another debate altogether; what's evident is that, generally, the further you move away from actual machine language and assembly, the more overhead you pay.

Plus the more "optimised" languages don't cease to exist, and are available for use where more appropriate than a "friendly" language. Indeed there exists a spectrum of languages, with situations where each can be appropriate (although obviously some particular languages have disadvantages or even a degree of "brokenness" as all of them are pretty much characteristic human creations, and some more so than others).

Personally I like getting the opportunity to code in various languages - one quickly gets comfo

These kinds of applications won't have the hardware catching up enough to let you replace C, Assembly and VHDL with Ruby or Java for decades yet.

The part where I stated that Rev4 may be an appropriate tool for some tasks can be applied to all languages/tools. No one's writing web applications in assembly. And no one's using Jakarta Struts to control an embedded device.

Nowhere did I make any claims that Ruby, Java or Rev4 will ever replace C, Assembly or VHDL for these problems. I was speaking about the largest chunk of desktop applications and applications that a non-coder might be able to use Rev4 to produce.

Furthermore, Rev4 makes no claim to tackle a single one of the examples you listed. The premise of the discussion was Rev4 and its target users. No one is going to select a "dumbed down" language or technology to tackle any of the problems you listed, and no one is claiming Rev4 is going to let someone who's never coded tackle them.

They are interesting problems and must be kept in mind before declaring "No one will ever program in assembly again!" But I made no such claims nor should anyone. It would take a fundamental revolution in hardware and at least one hundred years to be able to make any such claim.

Failure to react (detect the condition and perform the action) within the regulatory 300 milliseconds of a green light appearing in two colliding directions (~100 ms to detect the current, ~100 ms for the mechanical switch to cut off power, which leaves 100 ms for the software) counts as our fault. If someone dies, that's unintentional causing of a road traffic accident with a fatality as the consequence. The same rules apply as if I had run someone over with my car as the result of my own traffic violation.

Well, not for the failure to perform according to spec directly, but for the results that failure may cause.

I did program PLCs for a while, and if I had messed up the emergency stop procedure in an obvious way and someone had died as a result, I might have faced jail time for reckless homicide or involuntary manslaughter. That said, I have heard of only a few cases where there were actual convictions, and most of those resulted in probation instead of actual jail time, as in the case of this electrician. [democratic...ground.com]

In the case of the Therac-25 incidents, there were too many contributing factors to really pin the problem on one person. The person who originally wrote the software wrote it for the Therac-20, where it didn't cause any problems because of additional hardware interlocks, so technically the software worked on the "machine" it was written for. The cause of the incidents was therefore not an obvious one, like using an unsuitable language for the task.

You could do something similar in any language supporting higher-order functions. The code meets your requirement of not expressing things through nested loops and reads more like "do this to all items for which 'condition' is true." That said, Inform7 is very cool.:)
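A minimal sketch of that "do this to all items for which 'condition' is true" shape in a language with higher-order functions; the names here are made up for illustration.

```python
def apply_where(items, condition, action):
    """Apply `action` to every item for which `condition` holds."""
    return [action(item) for item in items if condition(item)]


# Hypothetical usage: double every even number, no nested loops in sight.
doubled_evens = apply_where(range(10), lambda n: n % 2 == 0, lambda n: n * 2)
# doubled_evens == [0, 4, 8, 12, 16]
```

The reader sees the what (condition and action) rather than the how (iteration and filtering mechanics), which is the same readability win Inform7 gets, without a special-purpose language.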

If I had mod points I'd find some way to give all of them to you for insight. What follows in this thread is the same tired religious discussion. Back in my day of programming, which included paper tape and teletype terminals on time-shares, things were tough. We made fun of the new little "home" computers. I can't tell you how many computer languages I sneered at. I was sure C++ would go down in flames; same for Java. I was sure Modula-2 would be the next great thing. The simple fact that languages people hate are still seeing widespread usage shows that the discussion is more religious than logical.

People have been saying this since FORTRAN meant you didn't need to know assembly language to make use of a computer.

Yes, they have. But at least FORTRAN, for the things that it did do, did them very well. In a more modern context, however, dumbed-down languages invariably have severe restrictions on performance and capability, which makes them unsuitable for many purposes. Putting that aside for the moment, the reality is that unless you're coding mindlessly simple applications, coding is hard. It just is, and it takes both skill and talent to pull off a well-written application; I don't care what language you're talking about. Furthermore, to a skilled developer a dumbed-down language is a liability... it just gets in the way. A better approach to this problem is to identify and train more good programmers, but that costs money and time. Certain people think they see a cheap way out by replacing sophisticated developers and their tools with sophisticated tools and simpleminded developers. Good luck with that.

This all comes down to one faulty but cherished premise (one of many held by today's business community, I might add): that complex, reliable applications can be built by minimally-skilled developers if we can, somehow, just put enough power into the tools. The problem is, the tools aren't the problem, it's the people. In the end, software development is as much an art as it is a science, and there really aren't enough artists to go around.

All such attempts to short-circuit the need for skilled developers are doomed to failure until such time as computers truly can program themselves. Of course, at that point it won't really matter, and most of us software engineers will be slapping burgers anyway.

I can't help thinking that particular grail quest comes from mistaking which part of programming is hard. I can't count how many times I've heard "I could do that if I had the time to learn the language." Except the hard part of programming isn't the syntax.

Tools like the one under discussion seem to be aimed at the crowd that think the only thing to learn about programming is the language, and then you can skate on that knowledge. If the language is english, suddenly you can bypass all those expensive, crotchety programmers.

(This may be true for some tasks - there must be some utility here. But I'm sure I see some scales floating in that oil.)

However, almost all the things we can automate via compilers are things you wouldn't want to do by hand anyway. Who wants to code the assembly for a for loop? I welcome C! Who wants to code a GUI by specifying pixel positions and handling resizing? I welcome GUI designers and XAML. Who wants to hand-write the encoding and decoding of protocol data? I welcome XML serialization and RPC! ...

None of those things make good programs. Heck, using the right libraries and

Let dumber people program and you end up with dumber programs. Way back in year 2000 I found that most of the Y2K bugs were actually from more recently written programs in dumbed down languages.

I don't see it that way.

After spending a few years in garbage collected land, I was assigned to work on a C++ application. The C++ application was a mere SOAP wrapper for a database, so it wouldn't have any noticeable performance advantages over an equivalent program written in a garbage collected language.

The problem that quickly became obvious to me, and why I ran away from the project, is that manual memory management is so time consuming, even for an expert programmer, that the "soft and easy" languages are really about letting great programmers get more done in less time.

This is even the approach taken in Python; the recommendation is to write well-tuned libraries in C for the parts where C will increase performance. For parts where C is a waste of time, Python lets you not worry about silly details.
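A small sketch of that division of labor: identical work done by a pure-Python loop versus a C-implemented builtin. This is an illustration, not a proper benchmark.

```python
import timeit

data = list(range(100_000))


def python_sum(xs):
    # Every iteration here pays full interpreter overhead.
    total = 0
    for x in xs:
        total += x
    return total


# sum() is implemented in C: same answer, far fewer interpreted bytecodes.
assert python_sum(data) == sum(data)

loop_time = timeit.timeit(lambda: python_sum(data), number=5)
builtin_time = timeit.timeit(lambda: sum(data), number=5)
```

The C-backed builtin typically wins by several times, though the exact ratio depends on the machine and interpreter; the Python recommendation is simply to make sure the hot path is on the C side of that line.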

Yup, I worked with that when it was appropriate. That's not the issue. The real issue is that, when using boost::shared_ptr, you need to treat memory on the heap and stack differently. It's kind of silly to worry about these things in a thin SOAP wrapper for a database where the database is the real bottleneck and the overhead of garbage collection is negligible.

The other issue I failed to mention is that compiling C++ is often an order of magnitude slower than newer languages. This slow compile ti

In this respect, I think "clarity" is improved much more by using constructs from mathematics than from "english".

So computing languages that try to avoid classic mathematical syntax are probably more a reflection of "the fear of math" than of "the fear of computers", although there's bound to be some overlap. The real problem in both cases is widespread fear and ignorance. This isn't just about people writing their own programs; the reluctance to learn and explore hampers the usefulness of basic end-user interfaces (GUIs) too.

We may be encouraging people to run with scissors when they haven't even figured out turning over yet.

I agree absolutely. Of course, an easy to program language can still make life easier. A well designed set of APIs can make life easier. And then you can top it off by making sure most use cases are covered. Don't forget that there is an awful lot of repetition out there. Of course, if it is all repetition then there is probably a product sitting somewhere on a shelf.

Personally I'm not so in favor of these kind of "natural" languages. Give me a language that is very abstract and well defined instead. Just m

I wonder if they didn't compare themselves to Ruby or Python because they couldn't contrive examples that produce huge LOC differences?

Probably. There's no difference in length between:

get the last item of line 2 of URL url

And:

open(url).readlines[1].split(",").last

I guess the former is easier to read, but languages that have a lot of "magic" in them tend to be pretty bad at scenarios the developers didn't think of. Which will inevitably turn out to be something you want to do.

The development of new languages and new ways of simplifying coding has been a part of the computer landscape since the whole thing began. You could argue that coding in Python is a form of "dumbed down" assembly. I wouldn't think of creating a webapp in assembly! Django has "dumbed down" much of the mundane work I often have to do, like dealing with forms and templating. But the one thing I have noticed is that no matter how easy "programming" gets, there are still people who will just not "do it".

Dumbing down programming can only get you so far towards the democratisation of programming: the most dumbed-down programming language still requires a user whose mind can express algorithms. And of all the people who can express algorithms and would want to, few are limited by the commonly used languages; that is, if you have a mind made for creating algorithms, learning to use a programming language will be fairly trivial.

I think one issue here is the concept of "dumbing down", which goes back at least to COBOL. PHBs have always had this idea that better tools will make it so that people who have little talent or interest in programming (or, as you say, can't express algorithms) can write software. That, I think, will always be a pipe dream.

However, the goal of "simplifying programming" will always be a valid and necessary one, just due to the nature of advancing technology. The trick is to adjust the goal from "helping people with little programming competence to write software" to "create tools that enable competent developers to be more productive and their work less tedious".

Ok, stop acting like Luddites and embrace the new economy. Programming is currently complicated and requires costly developers to write and maintain code. In order to increase efficiency, lower payroll, and satisfy shareholder demand, languages like runrev should be embraced by business.

This innovation is no different than automation on the assembly line. The global economy has changed around you - it is time to recognize trends and retrain. Otherwise you may find yourself out of a job and career.

I hope they embrace this stuff like crazy. There's nothing better for our careers than lots of shitty code written by people who are just barely above monkeys in terms of intelligence.

Why is that? Because they will fuck it up horribly. And then they will need real programmers to come in and clean up the mess. If you don't mind getting your hands dirty, there's a huge amount of work to be done.

This is perhaps the best thing to hit the industry since outsourcing to India. My company has been fixing their fuck

Unfortunately, that makes sense to managers only. Those of us at the coal face know that you can hire cheaper, less skilled programmers and let them loose with easy-to-use languages (eg Visual Basic) and you will get a monstrous mess that is impossible to maintain.

If you make them use a reasonably difficult language, most of them will not bother becoming programmers. This is a good thing.

One other point that is never noted in these schemes to simplify programming and make programmers generic 'coding resources' is that a good, experienced coder can do more work (and better-quality work) than half a dozen cheap, less skilled coders. This is never factored into management ideas of how you can outsource your coding and get the same quality for a tenth the price, which could be why a lot of outsourced contracts don't tend to last unless they're lost in a sea of big-corporate bureaucracy.

Oh, and don't forget that the more you chop and change programming languages, the fewer programmers you have who are experienced in using them. While you will find C programmers who have 40 years of experience, you tend to get programmers who've only "had a tinker" with languages like runrev.

Mind you, I only skimmed a couple of pages of the tutorial, but it's just a programming language that adds more words and more typing because it does things like spelling out "add" rather than using "+". That may let idiots grasp programming a bit more easily than they would have before, but programming as it is does not require a degree in rocket science. It just requires that you actually have enthusiasm for it, rather than thinking it's just a way to make lots of money.

Not everyone is a programmer, just as not everyone is a mechanic, painter, etc. I don't think we have a lack of programmers, but a lack of dirt-cheap programmers, and companies will do whatever they can to lower wages. Perhaps they'd be better off making better programs to earn more profits.

Back when I was at college and had to write some Cobol, a friend of mine who was a biology student came past. She saw the code and commented on how easy it was to read. Maybe this thing is a bit like Cobol.

I went through a couple of pages of the tutorial, and I agree, it doesn't really make anything EASIER. As a programmer, I don't find it easier to read than something like Visual Basic, or even Java. I find myself trying to figure out their nuances more than I am following their code; for example, they use Repeat instead of While. Were while statements one of the things that confused most people?

Paul Graham [paulgraham.com] wrote a very informative article [paulgraham.com] about "news stories" like this one many years ago. And congrats to the company behind RunRev; it is not that often /. runs slashvertisements for costly commercial software no one has ever heard of.

With a language with so many keywords, there are almost no valid variable names left to use. The list is way too long; even scrolling through it takes some time. It seems to me that more than half of the keywords are used in less than 1% of the functions, and finding the right keyword for calling some function is going to take a long time. This strikes me as the wrong way to solve the programming problem. I strongly doubt whether the claims that this environment saves you money are based on solid f

is it just me or does an ordinary 'easy' programming language like PHP, VB or python seem much easier to work with? the syntax of rev4 seems far too verbose, and there are not nearly enough parentheses.

also if you understand how the machine works is there any real need to program in 'plain english'? the syntax doesn't quite make sense to me like other languages would. for example it seems more logical to have a loop to move something along by a tiny amount and then wait a bit rather than telling it to move a thing from one side to the other "in 5 seconds". with plain english you also end up with stuff that has multiple equally valid meanings

i have nothing against making programming easier, i just don't think this is the right way to go about it. a good IDE with syntax highlighting and prompting features like VB's, plus a good set of libraries with decent error handling, beats any of this plain-english stuff that introduces mostly redundant keywords for the sake of having plain english.

I'm not even sure the language is that relevant in most cases. Bad programmers will write bad code in any language, and most people, especially your 'average Joes', are bad programmers. The bad code will almost always be there; some languages just make it easier to spot and fix. Some languages put up enough of a barrier that most non-professionals don't try to use them. This is not necessarily a bad thing.

Ah, HyperCard [wikipedia.org]. I never used it because Macs were too expensive and mostly monochromatic at the time. But I used something that was probably similar: INOVAtronics' CanDo!

The problem with such solutions is that they are usually inflexible in terms of the applications you can make, and they often produce lower-performing applications (i.e. ones that use more memory and processor time). I still remember corporate suits droning on in business magazines in the 90s about how Rapid Application Development tools would soon make programmers obsolete.

My dad was into HyperCard. He spent hours making a fancy CD catalog "program". It's not what I would call programming, but I wasn't going to tell him that. From what I remember it was a combination of database, animation and graphics. Although it's a little more complicated, you could just as well call making an animated GIF "programming"... and maybe it is.

Today the real problem is in system design. While in some cases you create a special language to solve the special problem of a domain (a domain-specific language, or DSL), in other places you use generators. Modern languages are all able to express component, class and object structures; in principle you can express modern OO designs in any language. If they have created a new language, then they can most likely reduce the number of typed characters, but what they cannot reduce is the number of rules a programmer has to learn.

I, whilst not being a programmer, do dabble in programming, and I find these "natural syntax" languages, such as AppleScript, to really be read-only languages (as opposed to something like Perl, which is write-only).

Reading through the code for an AppleScript program, it's pretty easy to pick up what's happening even if you're not overly familiar with the language. What is difficult is writing it properly: it's so close to English, yet it still has quite a rigid and structured syntax, so you need to use the exact phrasing it expects.

It alone moved me from being a normal commercial software developer to the forefront of scientific development in informatics, and made me so much better a programmer at normal programs, it's not even funny.

Then again, I am one of the rare kind of people who still think that intelligence is cool, and that being better because you worked to be better should be rewarded, instead of rewarding those who do worst and dumbing everything down, just for nature to create better idiots, thereby creating a feedback loop.

Is it really dumbing down, or are the languages we use more obscure than necessary? The common syntaxes of the day are left over from language design in an era when every bit mattered. If English-ish syntax is clearer, then why not use it?

Isn't it more about clarity of the language than about shortest code possible?

The real issue isn't how easy a language's command structure is, because memorizing function calls and syntax is easy. The real problem, IMHO, is people. People are not taught to think analytically and structurally, and that is the problem: how can you code without analytical skills? When I learned programming 10 years ago I was taught to draw it out on paper, flow-chart style. If the flow chart starts getting too big you need to split off a function, eventually creating a rough sketch of how your program will be structured.

This can be seen even in universities: there are CS majors with no education in assembler or C during their whole studies, and some haven't even used C++.

And money dictates here too. We are forced to study that horrible, disturbed nightmare that is Symbian for the mobile development class, although I did manage to talk the lecturer into allowing one half of the course test to be on Android for anyone who wanted it.

I don't consider myself a good coder; my experience comes via a hobby with a simple C-like language that pretends to have objects. I've also seen some much, much better coders, especially within the demo scene, but even I can see that the level of a regular CS student is horrible. There are a few rare gems out there, but the average is nasty. It'd be absurd to expect them to learn those languages to any decent level within the regular course times, either.

The efficiency of one's code is hardly ever noted within courses; it's good if it works. There's probably such a huge need for 'bulk coders' and other 'IT professionals' that they've had to dumb down most of the courses. There are only a few rare courses that really challenge you (logic programming being the usual culprit), and they're not needed to get the papers.

It's infuriating having to doze through such 'highest-level' education.

This is an age-old quest. There have been attempts at making programming English-like for many decades...

First there was COBOL: COmmon Business Oriented Language. Its syntax is very similar to English, and it was sold as a way to let managers write programs without needing a developer involved.

ADD 1 TO IDX.
SUBTRACT 1 FROM IDX.
MOVE X TO Y.

What happened instead is that a generation of developers learned COBOL and specialized in it, and managers were still managing.

Next there was SQL: Structured Query Language. Despite the mathematical model behind relational databases, SQL was again sold as a way for managers to execute queries and get reports for themselves. That may have worked until a manager ran a query on seven tables without any joins. That sent everyone back to "leave it to the programmers" mode...

Seriously, this is slick. I don't mean the language (it appears I need to install a plugin to view samples, which is a bit silly; I just want to see the language). No, I mean the advertising. Post it to Slashdot with a title that casts it in doubt; link to a web site that requires you to install the plugin... poof! instant installed client base.

The idea is not new, and something like this has been tried pretty much since the notion of a programming language higher-level than assembler was invented. Most of them die straight away. A few enjoy brief prominence, but then die anyway (e.g. dBase and dialects, and many other "4GL").

Very few survive and take over some niche, but by that time they've inevitably strayed from the original "can be understood by any random guy" design goal. One example is SQL: yes, originally it was intended to be a language in which managers and like-minded non-techies could easily write queries for reports etc. Yeah, right... every time I explain why you can't write WHERE after GROUP BY and have to use HAVING instead, I chuckle to myself.
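The WHERE-versus-HAVING stumbling block looks like this (the `emp` table and threshold are hypothetical, just to illustrate):

```sql
-- What the manager tries first: WHERE cannot filter on an aggregate,
-- because WHERE runs before the rows are grouped.
SELECT dept, COUNT(*) FROM emp GROUP BY dept WHERE COUNT(*) > 5;   -- syntax error

-- The correct form: HAVING filters the groups after aggregation.
SELECT dept, COUNT(*) AS n FROM emp GROUP BY dept HAVING COUNT(*) > 5;
```

The distinction falls straight out of SQL's evaluation order (FROM, WHERE, GROUP BY, HAVING, SELECT), which is exactly the kind of hidden rule that defeats the "plain English for managers" pitch.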

"Even though '90 per cent less code than traditional languages' reads like a big claim, it is a valid one.

If you have a string and you want to extract the first 3 characters of the second word on the 5th line and display them in an alert box, how many lines of code would you need in traditional languages? In Rev this is a one-liner:

answer char 1 to 3 of word 2 of line 5 of theString /* where theString is a variable that holds the content */"

Of course any competent programmer can do the same in nearly as little code. For Java it would be something like this:
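A minimal sketch of that Java equivalent (the sample string is made up, and I print to stdout; the commented-out JOptionPane call is the actual alert-box part):

```java
public class Extract {
    // Rev:  answer char 1 to 3 of word 2 of line 5 of theString
    static String extract(String theString) {
        return theString.split("\n")[4]      // line 5 (arrays are 0-indexed)
                        .split("\\s+")[1]    // word 2
                        .substring(0, 3);    // chars 1 to 3
    }

    public static void main(String[] args) {
        String theString = "one\ntwo\nthree\nfour\nalpha bravo charlie";
        String result = extract(theString);
        // javax.swing.JOptionPane.showMessageDialog(null, result); // alert box
        System.out.println(result); // prints "bra"
    }
}
```

So the core logic is one chained expression either way; Rev's win is mostly that its chunk syntax reads aloud nicely.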

"Text processing" is apparently touted as one of the strong points of the language. Yet I am sure old-fashioned Perl and regular expressions are likely more concise and powerful. As shown above, even Java can compete.
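For what it's worth, here is a sketch of the regex route (in Java; Perl would be terser still), using the same made-up sample string as above:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class RegexExtract {
    // \A anchors at the start; (?:.*\n){4} skips four full lines;
    // \S+\s+ skips the first word; (\S{3}) captures 3 chars of the second.
    static final Pattern P = Pattern.compile("\\A(?:.*\\n){4}\\S+\\s+(\\S{3})");

    static String extract(String s) {
        Matcher m = P.matcher(s);
        return m.find() ? m.group(1) : null;
    }

    public static void main(String[] args) {
        System.out.println(extract("one\ntwo\nthree\nfour\nalpha bravo charlie")); // prints "bra"
    }
}
```

The regex is denser than Rev's chunk expression but handles irregular whitespace for free, which is the usual argument for regular expressions in text processing.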

How would this fare with real programming tasks? First-class functions? List comprehensions? Closures? A sound type system? You could go on forever.

These topics seem to be ignored. This is a VisualBasic clone, not an attempt at a language that you would create "real" programs in.

Most of us aren't out there revolutionizing the world with our leet skills; we're pulling numbers out of one database and shuffling them into another. It happens; we get paid.

If this language gets some of the shovelwork off my back and frees up time for me to solve some interesting problems, then I'm all in favor of it. If it provides a way for me to earn an income (or someone else to) then I'm all in favor of it.

If it gets a few more people interested in programming, I think the world can handle that. Just because there's a new language on the block doesn't mean that all the other languages are suddenly useless. After all, we still have stuff written in COBOL floating around.

The big-picture guys will still hire programmers to do what we do, because we can think about a task and break it down into its component steps.

The reason VB got a bad rap isn't VB itself, which IMHO is fine when used as intended; it's that too many folks tried to stick a square peg in a round hole and use VB where it didn't belong.

VB does ONE job and does it quite well: GUIs for databases. That's it. Nothing fancy, just a quick and basic GUI so Sally Secretary can put data into a database and get data out of it. You may argue that you can do the same thing in other languages, and of course you can, but the simple fact is most SMBs and SOHOs simply couldn't afford it. Hiring a programmer isn't cheap, whereas with VB even a simple PC repairman like myself can whip up a nice GUI for a database, which in my experience is one of the biggest needs for a custom app that most SMBs have, and is why VB is still the #3 language for business even though MSFT has done everything they can to kill it.

So if you want to complain about dumbing down in general, please do. But VB, when used as intended by someone who knew the limits of the language, was and still is just fine for the job at hand. For making custom GUIs for databases I have yet to see anything that works as easily and as affordably as VB does in that role.

I've been working for three days to clean up a Microsoft Access app written in VBA, and I can tell you right now that even for that limited purpose, VB can suck badly. The same applies to PHP; I spent weeks trying to clean up an absolutely horrific (and huge) PHP-based site.

The full-blown VB wasn't too bad. I used VB6 to write a customized Accounts Receivable application using MySQL as the database server via ADO. Microsoft makes a decent IDE, and I did plenty of in-code documenting, as well as se

so your argument is that due to the l33tne$$ of C/C++ no one could write a crappy application? bullshit. you only need to attend uni and look at first-year programming assignments to see that's completely untrue. all i'm hearing from you is whinging about all the work you're getting cleaning up crappy programmers' work. dude, you should be THANKING the crap programmers of the world, as they shall keep you in work for the rest of your life.

there is nothing stopping you from writing brilliant apps in VB or PHP. dumbing down a language also makes it harder to make mistakes in the first place, producing BETTER quality apps. your logic is seriously flawed.

But MSFT appears to be trying to open .NET [slashdot.org] one piece at a time. I find it kinda funny that most of the posts on that story were complaints: first folks complained that MSFT was locked down, and then they try to open parts up and get the "embrace, extend" bit thrown at them. Kinda damned if they do, damned if they don't in that regard.

I beg to differ. Coding doesn't have to be hard. Debugging is debugging... or is it problem detection and solving? Clarity of your code and clarity of the language structure you are using either helps or hinders debugging.

Part of coding is how good your IDE is in telling you what is going on. Another part is how understandable your code is and that is dependent on the structure of the language you are coding in.