Posted
by
timothy
on Saturday December 22, 2012 @02:58AM
from the crisis-averted-you-underground-adaians dept.

hypnosec writes "The Ada Resource Association (ARA) announced that the Ada 2012 programming language has been approved and published as a standard by the International Organization for Standardization (ISO). Announcing the development, ARA and Ada-Europe said that the new version brings with it contract-based programming, concurrency and multicore support, increased expressiveness, and container enhancements."

Seen it used for some logic in a gateway that converted one network medium to another (and did a bunch of other not-so-trivial things while it was at it...).

I wouldn't really have classified it as an "algorithm" language. If anything it's a logic language with a major focus on reliability (lots of strong typing and compile time error detection) and fault handling (really good run time error handling).

I looked at it briefly in college. Everyone I've seen in the "real world" that needs that level of reliability uses C++. You catch a lot of stuff at compile time in C++ that would make run time errors in other languages. This was one of Ada's big selling points too. Strong typing might be a pain in the ass for your average programmer, but it's still around for a reason.

Big thing with Ada is the focus on reducing errors. Very strongly typed with a lot of compile-time checking and strong run-time checks (and the ability to handle them gracefully). It seems to be used in environments where reliability and error-free execution are critical: defense, aviation (the planes and air traffic control), etc. I've never heard of it used in the medical field but it would make sense.
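
A rough sketch of the kind of thing I mean (my own toy example, not from any real system): derived numeric types the compiler refuses to mix, plus a run-time range check handled gracefully:

    with Ada.Text_IO; use Ada.Text_IO;

    procedure Checks_Demo is
       --  Two structurally identical but incompatible numeric types.
       type Metres is new Float;
       type Feet   is new Float;

       subtype Percent is Integer range 0 .. 100;

       Altitude : Metres := 100.0;
       Span     : Feet   := 30.0;
       Load     : Percent := 0;
       Input    : Integer := 101;  --  pretend this arrived at run time
    begin
       --  Altitude := Altitude + Span;       --  rejected at compile time
       Altitude := Altitude + Metres (Span);  --  conversion must be explicit

       Load := Input;                         --  range check fails here
       Put_Line (Percent'Image (Load));
    exception
       when Constraint_Error =>
          Put_Line ("out-of-range value caught and handled");
    end Checks_Demo;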

Like anything else, though, there are of course trade-offs to support this. Plus, Ada tooling is expensive (yes, yes, I know, GNAT...), and people who know it are rare and expensive. This seems to have turned it into a niche language.

Also, the Ada community in general is an unusual bunch. There is almost an Apple-level fanboyism going on... it's weird.

I've used it in years past. The languages that I've used that help me produce code that just works, no matter how inclined I am to screw things up, are F#, Scala, Ada -- basically your strongly-typed languages keep me out of trouble. If you can handle the pricing and/or license issues, I would still recommend Ada as the best fit for procedural/OO programmers who want to work with tools that sustain quality. I'd expect that a very highly skilled team that wrote F#, Scala, Lisp, or Haskell could beat the dog out of most Ada teams for productivity over a few months or even a few years, but that over a period of many years, a good Ada team would be hard to beat for reliability and maintainability.

In my experience Ada catches things at compile time that other languages leave you to catch at run time, and catches things at run time that other languages leave you to discover when you find out you've been getting erroneous results for bog-knows how long.

Let me add a note for the community: yes, they are almost as bad as Apple fanbois when it comes to advocating Ada, but aside from that it's the friendliest community I've ever met on the Internet.

When I, as a hobbyist who dabbles a bit in Ada, can ask questions and get answers from professional industrial programmers with multiple large mission-critical systems to their name, and get these answers in a friendly, supportive tone, I think I am justified in saying that this is a nice community.

..But most products are created by small teams of programmers, have a lifetime of a few years, don't need to be anywhere as reliable..

And that, dear readers, is why I really hate what passes for programmers nowadays. The oldest code I've written that I know of which is still in daily use is now at the 22-year-old mark; it was done in assembly and was only really intended to 'work' for a year at most (it was done for an R&D prototype, and they kept the code in for the production models).

None of Ada's facilities have any bearing on the common kinds of bugs that cause web applications to leak private information. And the time you spend satisfying Ada's pointless static checks is time you don't have for developing meaningful security audits and test cases. So, given the same budget and time constraints, it is pretty much a given that an Ada based web application would be worse, not better, in terms of privacy and security.

Ada tried to address a problem that really existed in the 1980's: the lack of an efficient, strongly typed language. That ceased to be a problem long ago, and Ada is kind of a relic now.

I really cannot agree. Historically, one of the most common ways to pwn a machine was to exploit buffer overruns. Languages such as Ada and Java are virtually immune to that sort of exploit.
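
A toy sketch of what that immunity looks like in practice (my own example, not from any real exploit writeup): the out-of-range index raises Constraint_Error instead of silently scribbling past the end of the buffer.

    with Ada.Text_IO; use Ada.Text_IO;

    procedure Bounds_Demo is
       Buffer : String (1 .. 8) := (others => ' ');
       Index  : Integer := 9;   --  pretend this came from untrusted input
    begin
       Buffer (Index) := 'x';   --  checked access: no silent memory corruption
       Put_Line (Buffer);
    exception
       when Constraint_Error =>
          Put_Line ("index check stopped the overrun");
    end Bounds_Demo;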

You are also assuming that time spent making the compiler happy is time "lost". I came to quite a different conclusion long ago. I have worked with a large number of languages, and my observations are that with the possible exception of assembly language, the amount of time and effort to make a secure, robust application

I did it at university because they thought it would teach people good programming habits. In fact it just made us hate Ada and look for ways to subvert it, like redefining "-" to add values together.

Ada is extremely pedantic. The idea is that it enforces good coding practices and prevents the kinds of subtle errors that can creep into more flexible languages like C. Supposedly some people in the aviation and space industries use it for mission critical stuff.

I did it at university because they thought it would teach people good programming habits. In fact it just made us hate Ada and look for ways to subvert it

That's a shame. I was also introduced to Ada at university and, while I accept that the syntax is pedantic, it was demonstrably extremely useful. Now I just see it as another tool which I could competently apply to problems where high reliability is mandatory and distributed/concurrent algorithms are involved.

The choice of Ada for that language by the faculty was stupid as hell - it's practically the only course in that university that uses Ada, so the course becomes an Ada course rather than a parallel programming course. The practice work is really trivial except for the fact that it's Ada, and it's a bitch to find Ada information that isn't loaded with "IT'S MILITARY GRADE, YO!!!" bullshit for the first 10 pages of the text, making it a bitch to find out the simplest things about string manipulation and output.

On the other hand, Ada had such sweet parallel programming mechanisms (rendezvous etc.) that you didn't have to learn much of anything about parallel programming; last I heard they even dropped semaphores from the course work.
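
For those who never saw one, a rough sketch of a rendezvous (my own toy example): the caller blocks on Logger.Log until the task accepts it, so the synchronization is the call itself.

    with Ada.Text_IO; use Ada.Text_IO;

    procedure Rendezvous_Demo is
       task Logger is
          entry Log (Msg : String);  --  callers block until the task accepts
       end Logger;

       task body Logger is
       begin
          loop
             select
                accept Log (Msg : String) do
                   Put_Line (Msg);   --  runs during the rendezvous
                end Log;
             or
                terminate;           --  exit cleanly when no callers remain
             end select;
          end loop;
       end Logger;
    begin
       Logger.Log ("first");
       Logger.Log ("second");
    end Rendezvous_Demo;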

It's supposed to be really reliable though, but if every fucking book about it has to justify its existence with that for half the book... it just starts feeling fishy.

Isn't the original strict type checking language Algol 60? And didn't that effort lead to Algol 68, which, somewhat along the lines of the Orange Catholic Bible in the Dune novels, ended up "recanted" by its creators or some such thing as working against error reduction in programming, either by becoming too complicated semantically or too difficult to implement a compiler for?

And didn't the failure of Algol 68 lead to Niklaus Wirth's Pascal, as a simpler, better Algol 60? And for whatever Pascal's limitations

"The more I ponder the principles of language design, and the techniques that put them into practice, the more is my amazement at and admiration of ALGOL 60. Here is a language so far ahead of its time that it was not only an improvement on its predecessors but also on nearly all its successors".- C.A.R. Hoare, "Hints on Programming Language Design", 1973

So I guess Ada has its "community" of "dudes with crew cuts, clean shaven, and with pocket protectors working for Defense contractors", does Ada have any "street cred" with the academic "software theorem proof" or "software reliability" communities?

I believe I had to complete one project in Ada as part of my Computer Engineering degree, in a CompSci course that basically went through the different kinds of languages available back then: C, Prolog, Lisp, Pascal, Ada, probably a couple of others.

There was much rejoicing among the two Ada programmers left on the planet for the approval of this standard. Ooops.

You know you are doing poorly when the major user of your product, in this case the military, has been turning away from Ada to C/C++ for the last decade. The F-35 is one example. Even parts of the F-16 software have been rewritten in C/C++ by now. There are still some use cases, but all newly written software seems to be C++. Another example is SpaceX, who used C/C++ to write their flight con

Absolutely. Ada was designed to be like mainstream languages, so it has a small and gradual learning curve. I don't understand why the sorts of companies that want to use a mainstream language and want exceptional reliability don't use it.

First language I learned at Uni back in '00. Haven't used it since, but I recall it being far more pleasant than the Java I had to learn next. Just no opportunities for me to use it since, unless I wanted to work for the ADF (Australian Defence Force).

Ada is used in lots of places where the application is safety critical or there is a need for high reliability.

It got a lot of bad press, mostly because it originated from a request from the US DoD for a single language to do "all things". Its original MIL-STD designation was 1815 (Ada Lovelace's birth year).

Many aircraft have their flight control systems built in Ada, eg: Boeing 777, Apache Helicopter.

From memory, the GPS Block II satellites; not sure about Block III.

The original Ada 83 compilers were pretty awful - slow and produced horrible code. But they got way better. Because the language required a "program library" to store information about the compiled units (yes, GNAT showed that it wasn't really necessary), the optimizers in the middle pass and back-end had access to more information than normal, giving scope for some very effective optimizations. I've seen generated code that was as good as the best hand-coded assembly - multiple levels of inlining, removal of subroutine preamble and postamble code, delayed branch slot filling, value propagation used to remove implicit and explicit tests, etc.

My biggest challenge was always the lack of support for the latest and greatest libraries, eg: X11, MOTIF and their Windows equivalents.

Anybody here using Ada, or has used Ada? Not implying anything, but genuinely interested. Isn't Ada one of the most crazy complex algorithm languages ever invented? Just my impression.

I've used it a lot, but not lately. Its syntax is Pascal-like rather than C-like. However, I think Ada 2005 introduced the C++-style syntax for methods.

It does have a lot of complex features, e.g. rendezvous for distributed programming, but you can get started by ignoring most of the unusual stuff and using a subset that is very much like Pascal, then learning the advanced features as needed.

Supposedly when it first came out they had to invent new compiler technology to implement it, but things don't look so exotic now. Lots of integral support for real-time and distributed systems, and, as others have said, verbose, with an emphasis on reliability. Those last two are related: it makes you say what you mean and mean what you say. Ada programmers laugh when they hear someone describe C++ as "strongly typed".

However, in my experience the more I worked on it the leaner my code got and the more I was able to think on the level of abstractions rather than details, e.g. by using the 'range attribute when looping over arrays.
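
A rough sketch of the 'range style I mean (my own toy example): the loop bounds come from the array itself, so they can't drift out of sync with the declaration.

    with Ada.Text_IO; use Ada.Text_IO;

    procedure Range_Demo is
       Readings : array (1 .. 8) of Float := (others => 0.0);
    begin
       --  'Range tracks the array's actual bounds; resizing the array
       --  never invalidates the loop.
       for I in Readings'Range loop
          Readings (I) := Float (I) * 0.5;
          Put_Line (Integer'Image (I) & " =>" & Float'Image (Readings (I)));
       end loop;
    end Range_Demo;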

C# went Ada 95 one better on pragmas and attributes, but I don't know what Ada did in 2005 or 2012. I found them *really* helpful.

Can't give much more comparison, because I'm not up on the latest features of more familiar languages either.

Supposedly the space shuttle's on-board system was written in a subset of Ada.

This was the only annoying thing I found when using it at university, having already learnt Pascal: the number of times I wrote what I thought was valid Ada and wondering why it didn't work only to realise I'd written Pascal instead. Purely my fault, there was nothing wrong with the language.

I'm using it for time-critical stuff. It's complex and has some arcane syntax features, but it's still easier than C++ and a lot safer than C and C++. One problem is the lack of free or affordable compilers: there is not enough competition for GNAT, and AdaCore makes sure that the *really* free FSF version with the MGPL lacks essential features, whereas their own free versions are useless for anything but GPL'ed software.

Overall it's a very powerful and fast language that I would recommend for real software devel

No FUD at all. Ada is worthless without the runtime. The license of the GNAT Ada runtime forces you to put programs compiled with the GPL version (mainly developed by AdaCore) under the GPL, too. The FSF version is licensed under the MGPL, a modified version of the GPL that allows you to use the runtime in non-GPL'ed software. However, the MGPL version lacks some essential libraries.

The restrictive licensing of GNAT is probably one of the major reasons why Ada never became very popular for general purpose p

Contrary to what people tell you, it is not a complex language at all; it's a fairly standard Algol descendant. It does have an intricate type system that you must get used to before you can get anything done, and it is a bit wordy, using actual words as identifiers for everything except some operators, but aside from that it's a very elegant language, and surprisingly fun to work with.

I've spent two years porting Ada code from VMS to Linux. Overall it was a nice experience, but compile times were horrible on our VMS system. Getting a syntax error after 15 minutes of waiting is kind of frustrating. :)

A LONG time ago. This was in the period long enough after its release that people were getting used to what worked and what didn't. Like C, Ada has any number of constructs that you don't really use because they tend to make things harder. If you used its base functionality you had what was basically Pascal with very strong error checking, and that I didn't mind.

But throughout my (limited) use, I couldn't help but shake the feeling that Ada *really* wanted to be an O

I use it all the time. The complexity assertion is a bit confusing; I am not sure by what measure you'd rate it more complex than languages like Java.
I've hired lots of engineers out of college and none has ever had a problem learning it. There are certainly some bad habits from other languages that carry over into their early work, but generally it's not that big of a deal.

At one point, the DoD required that software written for the military be written in Ada. So a decent number of schools could jack up the placement rates of their computer science or software engineering majors by teaching it and buddying up with the defense industry recruiters. I went to one of those colleges, and the majority of the early programming classes were all Ada.

At some point, though, the DoD decided to get with the times and dropped the Ada requirement. My first job out of college was most

Yes Ada is very big in air traffic control and defence. On the system I worked on it was used for all the back end processing, pretty much right up to the user interfaces which are in C and Java. Another system I heard about has the UI in Ada as well.

At the university I attended, it was the standard language for all the courses that involved programming or software design. That may have been because the head of the computer science department was on one of the committees that designed it. This was about 20 years ago. I was aware that it's still used for real-time applications, particularly military and aerospace, but I'm mildly surprised that anyone still cares about it enough to develop a new version of the standard. I applied for a programming job at

This very argument happens in chip design, where we have two languages, VHDL & Verilog. VHDL is very much done in the Ada model and is a direct descendant, where Verilog comes by way of C. There was a language shoot-out back in the late 90s, when this argument had more life, with similar results. Verilog lets you get things done in about half the amount of text and about twice as fast as writing the equivalent in VHDL. Verilog gives you every chance to mess up, i.e. it doesn't hold your hand. VHDL smothers

Amusingly, yes. The naming standard adhered to is to capitalize each word AND separate words by underscores. To make matters a little more "interesting", the language is case-insensitive, so the capitalization is only an (unnecessary, imo) luxury for the reader.

Microsoft bought off all the national bodies of ISO when they ramrodded through their undocumented and impossible to implement "document standard". If ISO knew anything about business processes or standards development they could have prevented that panel-stuffing result. And yet one of the standards they set is business processes for just this situation. I don't trust them any more and I don't think you should either. They are too easily swayed by corporate interests.

Ada's cool in the esoteric nerd sort of way. Like SNOBOL or APL. I shared a girlfriend with one of the Strawman implementors of Ada and knew him moderately well. She was hot (she's probably a great-grandmother now) and he was cool, it was fun, and I'm still fond of Ada. I'm not fond of overloading operators and keywords in an RTOS in the practical sense, but as an artistic exploration of tech potentials I'm all for it, as long as you don't make it the OS/language for a drone or something similar. I've never been a fan of garbage collection. It solved some problems I'd rather work around. Sadly, Ada went rather overboard in the dynamic re-purposing of symbols, resulting in some unfortunate but predictable side effects and plain code that had indeterminate use based on context.

If you can't count on a word symbol to mean a quite specific and limited thing, you can't anticipate what your app will do. In my own mind, that was the problem with Ada. By subtexting and repurposing everything - including the "=" operator and keywords like "if" - they created a thing that was useful for mapping and mimicking human intellectual processes but not for doing useful stuff.

Keywords do have well defined behaviors, so I don't see where 'repurposing everything - including the "=" operator and keywords like "if"' happened. And you actually do need to permit overloading of operators (otherwise you end up with nasty code, e.g. in complex arithmetic).
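
A toy sketch of the complex arithmetic point (my own example, using Ada 2012 expression functions): with a user-defined "+" and "*", the call sites read like the math instead of a nest of function calls.

    with Ada.Text_IO; use Ada.Text_IO;

    procedure Complex_Demo is
       type Complex is record
          Re, Im : Float;
       end record;

       --  User-defined operators for a user-defined numeric type.
       function "+" (L, R : Complex) return Complex is
         ((Re => L.Re + R.Re, Im => L.Im + R.Im));

       function "*" (L, R : Complex) return Complex is
         ((Re => L.Re * R.Re - L.Im * R.Im,
           Im => L.Re * R.Im + L.Im * R.Re));

       A : constant Complex := (Re => 1.0, Im => 2.0);
       B : constant Complex := (Re => 3.0, Im => -1.0);
       C : constant Complex := A + B * B;  --  vs. Add (A, Mul (B, B))
    begin
       Put_Line (Float'Image (C.Re) & " +" & Float'Image (C.Im) & "i");
    end Complex_Demo;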

Microsoft bought off all the national bodies of ISO when they ramrodded through their undocumented and impossible to implement "document standard".

Cite please. I'm not asking for a check number, just someone else even making this accusation instead of the usual idiotic assumption that everyone in the standards bodies is all clamoring for a handout from Microsoft.

Operator overloading reached its peak in the mid-80s at the time of Ada and C++. Since then most languages have pulled back from there and today it is considered mostly a source of bugs and heavily discouraged even in languages that allow it, e.g. Python.

Operator overloading is not "discouraged" in Python. Rather, you're encouraged to use it to make your objects "Pythonic" (i.e. support a similar range of features to a built-in python object) and to make sure the semantics are basically the same. Eliminating operator overloading from languages entirely is one of the things that makes Java suck so much and so frickin' verbose, in my opinion.

Yes, I should have said something like "heavily restricted to straightforward overloads", such as straightforward extensions of comparators and iterators with no unexpected side effects. Any other use is heavily discouraged.

No, I define "abusing" as doing things like overloading an addition operator which doesn't perform an additive operation. (Which would be just as abusive if you used a non-operator function with a misleading name.)

Can you provide an example which is non-abusive, but is a source of bugs? Or are we just expected to accept the veracity of your unsubstantiated claim that "today it is considered mostly a source of bugs"?

I define "abusing" as doing things like overloading an addition operator which doesn't perform an additive operation.

There are other issues, such as side effects. Say I add/merge two complex data structures, or assign them: does it create deep copies through and through, or does it create a new reference to the object?

It also makes errors harder to catch, since it weakens the type signatures of the overloaded functions.

Or are we just expected to accept the veracity of your unsubstantiated claim that "today it is considered mostly a source of bugs"?

No, I expect you to already know this, just like I expect you to know that there is such a thing as Java, without me having to give a citation.

I am so tired of runtime failure. I am tired of feeling that which has been labeled as 'shipped code' was always shipped too early, and that there's some embarrassing bug that's going to come back and bite me (us) on the ass. Perhaps programming in Ada would allow me to sleep well at night, knowing that I didn't prematurely ship code. Sure, there's still the problems of fundamental logic errors based upon the programmer(s) misunderstanding the problem, but knowing that we're clever idiots is better than bein

I've been involved with Ada since the DoD-1 "Strawman" requirements document. A couple of points I haven't seen mentioned in these comments:

1. Ada lets me say clearly what my code is supposed to be doing in the code itself; I don't have to write coding tricks, I don't have to write comments that explain what the code really does. I can write code that I can read a year from now or some other coder can read ten years from now and we'll both know what it does. Sure, I can still write obscure, obfusticated c

require checks the input, and is thus able to detect a bug in the client. ensure checks the output before sending, and is thus able to detect a bug in the supplier; that is, an input which gave an unexpected/unreliable/unusable output was in some way unexpected in its effects. The ensure code prevents the situation of the caller's code being more flexible and handling a weird input, but creating a call with variables the responder's code is having tr

What's the difference between this and using the existing assert mechanism in C, Python, or the like? Is it able to perform some sort of automated reasoning on require and ensure between functions that allows some assertions to be optimized out?

Yes, this is similar to the assert mechanism of C, Python, etc., but because it's built into the language, it can do some extras. Often these are optimized out, because the compiler can determine that they're always met. You can also define a "Type_Invariant" once, and then the invariant is checked every time the value can be changed from the point of view of a user (e.g., after initialization, conversion, or return
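
A rough sketch of what the Ada 2012 aspects look like (my own toy package, not from the standard): with checks enabled (e.g. GNAT's -gnata), a violated precondition points at the caller, a violated postcondition at the implementation.

    package Stacks is
       type Stack is private;
       function Is_Empty (S : Stack) return Boolean;
       function Is_Full  (S : Stack) return Boolean;

       procedure Push (S : in out Stack; Item : Integer)
         with Pre  => not Is_Full (S),   --  the "require": caller's obligation
              Post => not Is_Empty (S);  --  the "ensure": supplier's promise
    private
       type Int_Array is array (1 .. 16) of Integer;
       type Stack is record
          Top  : Natural := 0;
          Data : Int_Array := (others => 0);
       end record;
       function Is_Empty (S : Stack) return Boolean is (S.Top = 0);
       function Is_Full  (S : Stack) return Boolean is (S.Top = Int_Array'Length);
    end Stacks;

    package body Stacks is
       procedure Push (S : in out Stack; Item : Integer) is
       begin
          S.Top := S.Top + 1;
          S.Data (S.Top) := Item;
       end Push;
    end Stacks;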

1) There is a lot of additional syntactic sugar which makes this much easier to use in practice. In theory, of course, you don't need syntactic sugar for anything, but it helps. Making this easy and uniform for developers is different from bare-bones tools

2) The calling functions know how to respond. So a system can come back with "failed require" or "failed ensure" and the caller should know how to respond. Assert just throws an error to the end user. Which means you can embed these dee

Ada had generics long before VHDL - from the first version of Ada, in fact. Stepanov and Musser wrote the Standard Template Library [wikipedia.org] for Ada in 1987; the C++ Standard Template Library was first developed in Ada and then ported to C++ in the early 90s.
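
For reference, a minimal sketch of an Ada generic (my own example, not Stepanov and Musser's library):

    --  Generic specification: Element is filled in at instantiation time.
    generic
       type Element is private;
    procedure Generic_Swap (X, Y : in out Element);

    --  Generic body.
    procedure Generic_Swap (X, Y : in out Element) is
       Tmp : constant Element := X;
    begin
       X := Y;
       Y := Tmp;
    end Generic_Swap;

    --  Instantiation, as you would use it elsewhere:
    --  procedure Swap_Integers is new Generic_Swap (Integer);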

Just to complete the trivia, Oracle's PL/SQL has a syntax that was based somewhat on Ada.