A language that attempts to be everything to everybody, or at least to address frequently recurring concerns such as concurrency, security, collection-handling, persistence, scaling, distributed programming, exception handling, etc.

For comparison, the opposite is generally separate services that attempt to be app-language neutral.
Top's term for his personal experience of BlubParadox. {And for any language that isn't ExBase or BrainfuckLanguage...}

Paul Graham's GodLanguage would probably be a new form of Lisp with lots of libraries. I don't entirely disagree with that, but I doubt it fits well with developers' existing human nature. And I doubt he'd buy the view that heavy types and objects will save the universe.

I've told you a million times, stop exaggerating! "Save the universe," indeed! However, when the requirements dictate "heavy types" (whatever those are) and objects, it's nice to have them. A GodLanguage, as described at the top of this page, sounds great. So does a new form of Lisp with lots of libraries.

The "requirements"? And is exaggeration different than HandWaving? I can't keep your sin classification system strait.

Don't you mean "straight"? Exaggeration is different from HandWaving. Exaggeration is "objects will save the universe", i.e., gross hyperbole. HandWaving is "Paul Graham's GodLanguage would probably be a new form of Lisp with lots of libraries", i.e., idle speculation without evidence.

Yes, "requirements". Y'know, those things that users are asking for, i.e., the reasons we write software.

Usually, it's an implementation decision, not a fundamental requirement.

Implementation decisions may be arbitrary, but they shouldn't be. They should be based on requirements.

That's the million-dollar issue: how to turn software design from an art into a practical science. When there are too many variables involved, people tend to pick their favorite variables and HobbyHorse them.

Why can't it be both? Since every new application is a unique creation, software development is obviously a craft. Crafts -- which range from making ornate cutlery to constructing skyscrapers -- are equal parts art, engineering, and science.

The majority of the skyscraper project has to be tested via the laws of physics. For the most part those are not a matter of opinion. (Disputes about the physics of buildings can usually be settled by using math and models. In some cases the economics of testing certain scenarios to their fullest may be an issue.) Different designs can be tested against the laws of physics in a fairly straightforward way. Physics is the elephant in the room as far as dictating options. There's nothing of comparable size in software engineering. Machine performance is the closest, but it has gradually been shrinking over time such that the "art" side of things is a larger and larger component, percentage-wise. (Generally I see the mid-1970s as the break-even point where software issues appeared to pass hardware issues in importance. This is based on the "software crisis" sort of articles that started popping up around that time.) Physical engineering doesn't have the kind of great holy-wars and fad-cycles that software engineering does. The variety of different solution paths appears far greater in software engineering than in physical engineering.

To be useful, software is ultimately constrained by functional and non-functional requirements, and perhaps aesthetics. This also applies to most craft objects. The variety of different solution paths for these is infinite, but frequently constrained in practice by tradition. The construction of skyscrapers owes as much to tradition as it does to physics.

Some of this comes from fear of getting sued (RealProfessionalsGetSued). If you try something without a history and it fails, you are far more likely to lose in a court case. In software we can take far greater risks with new tools and approaches in comparison because people are less likely to die. (Imagine being paid to build software for a medical safety device: you'd be far more likely to select Ada than RubyOnRails. In other words, something that's been used in the field for a decade or two.)

Catering to HobbyHorses is not necessarily a bad thing. Fitting the WetWare of the developer can result in productivity gains, I contend. But I don't claim that my HobbyHorse will fit every mind. I don't claim One True Language/Paradigm/Methodology. You, on the other hand, seem to imply a logically-discoverable Single Right Way. But your justification and demonstration is very round-about and devoid of examinable scenarios/tests. Related: TooManyVariablesForScience.

{Your view of the role of science is naive and, frankly, incorrect. You think the role of science and logic is 'discovery' of concepts - things that can be demonstrated. What science does is 'kill' models and concepts. Science is a rather negative discipline that forever says "you're wrong" and never, ever says "you're right". The best you can hope for science to say is: "I haven't figured out why your hypothesis is wrong. Yet.", and for it to say the same thing for long enough - under enough new observations - that the hypothesis becomes a theory. Further, from GoedelsIncompletenessTheorem and RicesTheorem, much the same can be said of mathematics and computation: math can prove some true things true, and some false things false, but there will forever be a gap where one cannot prove anything at all. That you imagine logical models derived from science would find a "Single Right Way" (or establish any "Right Way") mostly tells me that you really, truly, don't grok science.}

{The person you assert as claiming a "Single Right Way" has never claimed any such thing, and is rather insulted by your fabricated lies and insinuations. There is a metaphysical universe of difference between achieving a "Single Right Way" and simply rejecting the "Known Wrong Ways".}

Well, all stated "Known Wrong Ways" seem to punch out a pattern in the paper that strongly resembles the type-centric GodLanguage concept when held up to the light. Technically you may be right. In practice it's merely a round-about version of "Single Right Way".

{In practice, it's an attempt to systematically and incrementally achieve progress. Your insinuations about a "Single Right Way" are malicious and vile attempts to associate a reasonable approach with something absurd that you may then attack - a StrawMan. Your entire line of sophistry on this issue is insulting.}

If I mischaracterized it, it's because you failed to supply sufficient details, not because I am evil. Show it being better so that my description of it doesn't matter. You could say, "code snippet 5 shows that you are wrong because line 7 doesn't have to wait for message X to make an estimate...etc..." RaceTheDamnedCar. Show your grand language/tool kicking P/R's butt in semi-realistic scenarios. Show you understand working with the real world instead of merely diddling with idealistic ivory-tower toys. Show Don't Talk.

{And your comment on type-centrism is ridiculous, but is more likely because in your skewed perceptions you imagine any mention of 'types' in any context whatsoever is equivalent to raising them to a central position. TypeSafety (in the broad sense: proving programs won't have undefined behavior - a practice that might not involve "types" at all) is important to almost every other language feature, but TypefulProgramming and ManifestTyping are not; of those, the former is of questionable value (not enough data on FeatureInteraction), and the latter is (if required) very problematic.}
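The distinction being drawn - TypeSafety as a broad property versus types as a central stylistic commitment - can be illustrated with a minimal Python sketch (the function and names here are my own illustration, not from the discussion). Optional annotations let a static checker such as mypy reject erroneous calls before the program runs, without the annotations dominating how the program is written:

```python
from typing import List

def total_length(words: List[str]) -> int:
    # A static checker (e.g. mypy) can reject calls that would fail at
    # runtime, using the annotations; the code itself reads the same
    # as ordinary, un-"typeful" Python.
    return sum(len(w) for w in words)

print(total_length(["god", "language"]))  # 11
# total_length(42)   # flagged by a checker before the program ever runs
```

The point is only that safety checking can sit at the margins of a program rather than at its center; whether that counts as "type-centric" is exactly what the two voices above dispute.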

I suggest you create a handle and PersonalPage so that I don't confuse you with bracket guy (if you are different from bracket guy... bracket guys?).

I have a PersonalPage, but I believe in an EgolessWiki. I suggest you address the text in front of you, within the discussion(s) in which it appears, rather than make sweeping statements about who you assume I am. AttackIdeasNotPeople. I am sometimes bracket guy, sometimes italics guy, sometimes plain text guy, and sometimes all on one page, solely as needed to disambiguate points within a thread.

There are implications of prior debates floating around. I'll try to point them out if they appear again.

High-End of Each Idiom

Those proposing a GodLanguage on this wiki generally seem to want to build in the "best known" versions of common computing idioms, such as concurrency, security, collection-handling, persistence, scaling, distributed programming, exception handling, etc. For example, the language would come built in with the best (alleged) security techniques known.

I see at least two potential problems with this. First, such a language will have a high learning curve. Even if one only wants the basics of a single aspect for a given project, say basic concurrency, one still has to learn how the top-of-the-line security system works. It's like being forced to learn how to use a tractor when a shovel would be sufficient.

Second, a new "best known" idiom may come along, making existing code obsolete. Using the newcomer effectively may even require a complete change of the language.

I disagree. I don't need to know how top-of-the-line GarbageCollection works in order to add numbers in a language possessing a garbage collector, and I won't need to know how top-of-the-line security works to learn and use standard concurrency and concurrency-control idioms. Admittedly, someone probably needs to think about security properties when developing concurrency-control patterns (lest they introduce high risk of PriorityInversion or DeadLock as a DenialOfService attack), but that burden should be on the standard library or language designer rather than on individual users. And though a particular means of expressing concurrency or collection-handling might be considered 'idioms' and become obsolete, the concurrency and collection concepts will be around for as long as we have at least 2 CPUs in this world. Sure, if a language becomes obsolete, some people might need to learn new language skills - but they'll never need to relearn everything. Such is the nature of progress.
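The claim above - that one can use built-in idioms without studying their internals - can be sketched in a few lines of Python (chosen arbitrarily; the queue and threading modules are standard library). The user relies on thread-safe collection-handling and garbage collection without knowing how either is implemented:

```python
import queue
import threading

# A thread-safe queue: the user gets correct concurrency control
# "for free", with the burden on the standard library designer rather
# than on the individual programmer.
tasks = queue.Queue()
results = []

def worker():
    while True:
        item = tasks.get()
        if item is None:          # sentinel value: stop the worker
            break
        results.append(item * 2)  # list.append is atomic in CPython
        tasks.task_done()

t = threading.Thread(target=worker)
t.start()
for n in range(5):
    tasks.put(n)
tasks.put(None)
t.join()
print(sorted(results))  # [0, 2, 4, 6, 8]
```

Nothing here requires understanding the interpreter's locking, its garbage collector, or any security machinery - which is the "I don't need to know how top-of-the-line GarbageCollection works in order to add numbers" argument in miniature.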

The basic concepts of security may still be there, but not necessarily the details of the language interface. There are usually many ways to express the same things.

Huh?

Also, language design isn't BigDesignUpFront. If it were truly BDUF, you'd never hear of something like "algol-derived" languages, and you would not be able to trace the lineages of languages. The only reasonable way to understand language design is as an iterative process, with versioning and forks and plenty of history and experimentation to study.

But the iterative path may be much larger for a language that attempts to integrate so many concepts.

Sure. Why is that a problem?

There is only one God language.

Well. That explains a few things....

{I classify Lisp as a meta-language, not a language. However, I'm sure we can LaynesLaw this up the wazoo.}

Under which conditions is a meta-language not also a language?

[Like many general-purpose programming languages, Lisp can be used to define sublanguages within its own environment. It shares this capability with FORTH, C/C++ with macro pre-processing, and many others. However, these are all regarded to be languages that have meta-language capability. None would be considered to be "a meta-language, not a language". In that category (assuming by "language" you mean "programming language") are specialists like BackusNaurForm. You seem to be emphasising Lisp's MetaLanguage capability to the point of excluding everything else it does. In typical practice, its MetaLanguage capability is of relatively minor importance.]

{One gets a different answer depending on which Lisp aficionado one asks.}

[Maybe, but the rest of my point holds true. Categorising Lisp as "a meta-language, not a language" is incorrect regardless of which Lisp aficionado anyone asks.]