Philwelch wrote:Not at all. You have to start from physics and understand enough EM to know how to build a transistor, and then you can learn logic circuits, and then maybe, just maybe, you can understand how the CPU works if you write in machine code.

Then, maybe, if you know how to build an assembler in machine code, writing in assembly will make you understand how it works.

You're talking about "understanding how it works". Assembly isn't how it works--it's an abstraction layer over a profoundly sophisticated piece of electronics. To understand how assembly works you need to understand how machine code works, and machine code works using logic circuits. "MOV BP AX" is actually a hexadecimal number, but that hexadecimal number is really a set of voltages in a memory circuit that the CPU switches to the right logic circuits to change the voltages of the memory circuits in a particular register.
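That claim can be made concrete. As a sketch (assuming 16-bit x86 encoding, where `mov bp, ax` in standard comma syntax assembles to the two bytes `89 C5`), a few lines of Python can pull the register fields back out of the instruction's ModRM byte:

```python
# Sketch: decoding the 16-bit x86 instruction "mov bp, ax",
# which assembles to the two bytes 0x89 0xC5.
# 0x89 is the opcode (MOV r/m16, r16); 0xC5 is the ModRM byte.
opcode, modrm = 0x89, 0xC5

mod = (modrm >> 6) & 0b11   # 0b11 = register-to-register form
reg = (modrm >> 3) & 0b111  # source register field:      0b000 = AX
rm  = modrm & 0b111         # destination register field: 0b101 = BP

REGS16 = ["ax", "cx", "dx", "bx", "sp", "bp", "si", "di"]
print(f"mov {REGS16[rm]}, {REGS16[reg]}")  # mov bp, ax
```

So the mnemonic really is just a human-readable rendering of a number, as the post says; everything below that is voltages and gates.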

And to understand how that's even possible you have to know how to build a logic gate out of a transistor. It just goes down from there.

If you talk about "computation" like it has nothing to do with its physical implementation, there's no special privilege assembly has. You're just as well off thinking in Lisp or Python and pretending you're interacting with an abstract Python machine. If you think the implementation does matter, then why stop at the assembly code?

Fascism: If you're not with us you're against us.Leftism: If you're not part of the solution you're part of the problem.

Philwelch wrote:Not at all. You have to start from physics and understand enough EM to know how to build a transistor, and then you can learn logic circuits, and then maybe, just maybe, you can understand how the CPU works if you write in machine code.

Then, maybe, if you know how to build an assembler in machine code, writing in assembly will make you understand how it works.

That wouldn't give you any insight on computation, unfortunately.

And now we come to the fundamental confusion. You're stuck on a particular definition of 'computation' such that everything else is voodoo magic. This fails on two fronts.

For one, as Philwelch points out, assembly is *still* an abstraction over circuitry. Digital logic (itself an abstraction over physical circuitry) is *fascinating*, and it's a level or two below assembly. You want to understand how your programs work? Go build yourself a toy CPU out of logic gates (Digital Works is a good program for doing this). Then you'll understand how datapaths work, what pipelining really means, etc., which is vital for writing fast assembly these days. Frex, changing the order of instructions in seemingly useless ways can have noticeable effects if it can mesh better with the CPU's pipelining procedure. Hell, adding *noops* to your assembly can actually speed the program up sometimes. (This is, by the way, why I keep saying that writing modern assembly is too complex for humans. Because it's true.)

For two, the fact that our computers are built on von Neumann architectures means nothing. It's not some fundamental axiom of the universe that this is the most ideal way to build computers. C is efficient not by any special quality of itself, but because it happens to sit close to the processing models of current computers. Back in the day, though, we had Lisp Machines, which had hardware custom-built to efficiently handle Lisp code. Lisp ended up losing the language wars, though, so those machines died. The point is that computation has little to do with hardware. You want to gain insight on computation, *study computation*, not assembly. Learn to program well in as many languages as possible, so you can distill the essence of a program into computational atoms and execute them efficiently. Someday we may finally dump C-like languages and move to a machine architecture that's better suited for another language. We'll only find out if that's worthwhile, however, if we actually *try* other languages and apply them to real problems.

Philwelch wrote:You're talking about "understanding how it works". Assembly isn't how it works--it's an abstraction layer over a profoundly sophisticated piece of electronics. To understand how assembly works you need to understand how machine code works, and machine code works using logic circuits.

Xanthir wrote:For one, as Philwelch points out, assembly is *still* an abstraction over circuitry. Digital logic (itself an abstraction over physical circuitry) is *fascinating*, and it's a level or two below assembly. You want to understand how your programs work? Go build yourself a toy CPU out of logic gates (Digital Works is a good program for doing this).

Digital logic is neither fascinating nor challenging. Not even sequential logic. You don't get any insight from that. Designing CPUs taking into consideration only digital logic is tedious, but nonetheless easy. You can minimize the number of logic operations needed to evaluate a particular logic function using the same approach that I use to reduce a functional specification to its assembly or machine code implementation.

Philwelch wrote:"MOV BP AX" is actually a hexadecimal number, but that hexadecimal number is really a set of voltages in a memory circuit that the CPU switches to the right logic circuits to change the voltages of the memory circuits in a particular register.

Yes, the physics is where it actually gets interesting. While that's not programming itself, it's a prerequisite for learning how to program, because we all program physical devices. But the topic is called "Best (programming, I assume) language for n00bs".

Xanthir wrote:Then you'll understand how datapaths work, what pipelining really means, etc., which is vital for writing fast assembly these days. Frex, changing the order of instructions in seemingly useless ways can have noticeable effects if it can mesh better with the CPU's pipelining procedure. Hell, adding *noops* to your assembly can actually speed the program up sometimes. (This is, by the way, why I keep saying that writing modern assembly is too complex for humans. Because it's true.)

What's the use of learning assembly if you aren't going to take these details into consideration???!!! I said newbies should start from assembly so that they could be exposed to all that!!!

Digital logic is neither fascinating nor challenging. Not even sequential logic. You don't get any insight from that. Designing CPUs taking into consideration only digital logic is tedious, but nonetheless easy. You can minimize the number of logic operations needed to evaluate a particular logic function using the same approach that I use to reduce a functional specification to its assembly or machine code implementation.

Lolwut.

Assembly is *nothing more* than an abstraction layer over the *real* programming in digital logic. Saying digital logic isn't programming... Gah, that breaks my brain. What, was ENIAC just an electrical engineering project? The original computer scientists *were* programmers, even though they were doing all their programming directly in hardware.

What's the use of learning assembly if you aren't going to take these details into consideration???!!! I said newbies should start from assembly so that they could be exposed to all that!!!

...

You can't take those details into account. Maybe a handful of people in the entire world can do so properly for a single CPU. Nobody can do so generally - it's just *far* too complex for our human minds. We don't have the correct kind of intelligence to do that. (That's why we *invented* computers - they're immensely more intelligent than us in particular ways.)

So this brings us to one of two conclusions. (1) You're claiming that you can accurately model the CPU in your head so you can write efficient assembly, which is a lie, or (2) You're putting assembly on a special pedestal where the exact details of how it gets translated from abstraction to reality don't matter, but those details *do* matter for higher-level languages.

If the former I can stop arguing, because there's never any point trying to argue with a godmoder. If the latter, then I can also stop arguing, because we honestly disagree and I know that I'll never come around to your point of view. I believe that anyone who thinks that CPU pipelining belongs in CS101 is fundamentally misguided.

EduardoLeon wrote:You can minimize the number of logic operations needed to evaluate a particular logic function using the same approach that I use to reduce a functional specification to its assembly or machine code implementation.

And that is the final nail in your coffin--the advances we're seeing in semiconductors in recent years are entirely due to the fact that we have CAD tools that can minimize logical expressions algorithmically. Rather than having expensive, skilled, trained engineers poring over Karnaugh maps for a living, we have gotten machines to do that work for us. You can't find an EE who minimizes logic expressions by hand anymore. There's no more call for that.
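As a toy illustration of what those tools automate (the function and its minimized form below are my own invented example, not anything from the thread): verifying that a candidate minimization agrees with the original over the entire truth table is a purely mechanical loop, exactly the kind of work machines took over:

```python
from itertools import product

# Invented example: f(a, b) = (a AND b) OR (a AND NOT b), which a
# Karnaugh map (or any CAD tool) would collapse to just "a".
def original(a, b):
    return (a and b) or (a and not b)

def minimized(a, b):
    return a

# Exhaustive equivalence check over the full truth table.
assert all(
    bool(original(a, b)) == bool(minimized(a, b))
    for a, b in product([False, True], repeat=2)
)
print("equivalent")
```

Real tools use far cleverer algorithms than brute force, of course, but the point stands: equivalence checking and minimization are mechanizable.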


SummerGlau wrote:Game Maker Language. It's the language for this program called Game Maker. It actually sets you up for coding really easily, it has everything you need, it's also more motivational cause you can create your own computer games. So it prepared me for C++ a lot.

Seconded. I learned to program in Game Maker (just the drag-and-drop to begin with, to get my head around the idea of program flow), before moving on to GML and then C. I'll warn you now - I only have basic C knowledge (no pun intended) and really use it for scripting where e.g. Python would be more suitable. I have found though that with my fairly basic C knowledge I can pick up other languages easily - for my course I had to learn Pascal/Delphi, I also taught myself C# for a project in a day, and I'm trying to do stuff in Python I'd naturally do in C.

tl;dr: I think it's important to get your head around what you want to happen conceptually before you try to learn the intricacies of any one language.

22/Male/Female/Single/Yep (Age/Gender/Interested in/Status/Don't you hate people who just jump on the bandwagon)

Wow, having not checked up on this thread since a while ago I totally missed the EduardoLeon vs. The World flamewar but I must say it's been a lot of fun catching up.

Anyways, to add my 2.3 cents to the matter...

EduardoLeon reminds me a lot of a guy I knew at University and still contact now and then. He's a very clever guy and a great graphics coder but back at uni he was completely priggish about anything that wasn't C++ because it didn't allow him to produce the results he was used to in the way he was accustomed to. Catching up with him recently, it seems like working an actual job has mellowed his views a great deal. Or maybe they haven't. I actually can't recall for sure.

Anyway, pointless anecdote aside, EduardoLeon's ideas, while amusing, fly in the face of reality and a whole lot of study done on the matter of Men vs. Machines (See the annals of the 75th Men vs. Machines Conference held just last year for various reports on research done on everything from Men vs. Machines in the Field of Compiling C Code to Assembler to Men vs. Machines in the Field of Cleaning Toilets with Toothbrushes).

Wait, that was another pointless aside.

I guess what I really want to do is thank EduardoLeon for his amusing posts and thanks all the other people who have replied to him for sharing a whole lot of really cool knowledge.

I, for one, am certain that as soon as EduardoLeon gets a real world programming job that involves working with a team of other developers (Most of whom will be less technically competent than him, make no mistake) he will come to realise the numerous and various flaws in his arguments.

OOPMan wrote:Wow, having not checked up on this thread since a while ago I totally missed the EduardoLeon vs. The World flamewar but I must say it's been a lot of fun catching up.

The (flame)war is already over. The World has kind of won, methinks.

OOPMan wrote:Anyway, pointless anecdote aside, EduardoLeon's ideas, while amusing, fly in the face of reality and a whole lot of study done on the matter of Men vs. Machines (See the annals of the 75th Men vs. Machines Conference held just last year for various reports on research done on everything from Men vs. Machines in the Field of Compiling C Code to Assembler to Men vs. Machines in the Field of Cleaning Toilets with Toothbrushes).

Men vs. Machines Conference?

OOPMan wrote:I, for one, am certain that as soon as EduardoLeon gets a real world programming job that involves working with a team of other developers (Most of whom will be less technically competent than him, make no mistake) he will come to realise the numerous and various flaws in his arguments.

I do work with other developers. (Parse the last sentence taking "do" as an auxiliary verb and "work" as a full verb, not taking "do" as the full verb and "work" as the object.) I have a day job in which I have to temporarily forget my principles and just go with the flow, because I'm just a grunt.

At work I've done apps like this a hundred times using Visual Studio .NET, with ASP as the front end and C# as the back end. At home I lack the luxury of such tools, and I don't want to get into IP debates if I develop a web app at work.

I've been mucking with PHP, but I don't truly feel comfortable with a scripting language in a web-based setting. I have some experience with Java and Perl, but I don't know how to move forward. Any advice on where to start making web applications, and what would be easiest to fit my needs?

What I'd like to create: a web application that allows players in a LARP I run to log in, view their character sheet and spend accrued experience points. Future goals are an administrative back end to override changes, and maybe approve changes before they are finalized.

I can handle Database Schema, but I'm floundering with regards to actually building the front end system to interact with the DB.

Any pointers?

Always strive to do the right thing. If you fail, try harder. Never waver in this, not even in the face of the apocalypse.

If you are already familiar with Visual Studio, get Visual Web Developer Express, it's free. If you want to learn something new, I would go with Java Server Pages. You can get free IDEs like NetBeans to develop web pages (although I am not very familiar with what Java tools are best). I would avoid PHP or Perl for web development these days; Perl is pretty much dead when it comes to webpages, and I think PHP is on its way out. A lot of people seem to like Ruby on Rails as well. I personally prefer C# in ASHX pages myself.

Started with BASIC (NOT VISUAL). Then I learnt C myself. Continued to C++, Python, Perl, Java, PHP and Scheme. I've even given an esoteric language like Brainfuck a shot, but it's just annoying. Now messing around with asm.

I find the opposite is true. At least languages with wimpy type systems like C and Java seem to be more of a crutch.

I started with static, went to dynamic, and when I came back, I found I no longer needed static typing. I no longer made type errors with any regularity.

Really, I think Dynamic (+strong) typing is the real BDSM system. It forces you to keep all the types in your head, and punishes you severely for failure to do so. Static says "That's fine. Here's where you made your mistake, hon. Hold my hand next time, OK?" and then gives you a lollipop.

Not only that, but static typing allows IDEs to coddle you in some pretty ridiculous ways. Have you ever used C# with VS2010? Yeesh. (not to say this is a downside. It's awesome, but it IS coddling.)

And the "discipline" is hardly that. From some languages' point of view, the only invariants you could ever want to assert on a number are whether it's floating point or int, or whether it's signed or unsigned. What if you have a function that doesn't make sense except on arguments that, when floored, are odd, and are greater than 2*PI? At some level, if you want to get anything done, you have to give up on proving your program correct and do a run-time assert. And I think we can all agree that a language like Python and a language like Java don't actually draw the line too far from one another when you consider that the real problem is proving your entire program won't crash before running it.
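That kind of invariant really does end up as a run-time check. A minimal Python sketch of the exact example above (the function name and body are arbitrary, chosen only for illustration):

```python
import math

def weird(x):
    # Run-time check of the invariant from the post: floor(x) must be
    # odd, and x must exceed 2*pi. No mainstream static type system
    # can express this, so it becomes a run-time assertion.
    if not (math.floor(x) % 2 == 1 and x > 2 * math.pi):
        raise ValueError("x must floor to an odd number and exceed 2*pi")
    return math.sqrt(x)  # arbitrary body, just for illustration

weird(7.3)   # fine: floor(7.3) = 7 is odd, and 7.3 > 6.283...
```

Calling `weird(1.0)` raises `ValueError`, which is precisely the "give up and assert at run time" move the post describes; a static checker in C, Java, or Python alike never sees this constraint.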

And, in practice, do programs written in static languages crash less, statistically? I can believe it of something like Ada or Haskell, but not C++.

Xanthir wrote:You can't take those details into account. Maybe a handful of people in the entire world can do so properly for a single CPU. Nobody can do so generally - it's just *far* too complex for our human minds. We don't have the correct kind of intelligence to do that. (That's why we *invented* computers - they're immensely more intelligent than us in particular ways.)

This is a wonderfully optimistic (overly so) view of the current state of compilers. I'll be the first to say that the vast majority of software should absolutely not be written in assembly, but I can run rings around any compiler with regards to micro-optimization of idiomatic C code for several CPUs of multiple architectures (any recent ARM or x86 core). I can say the same of three co-workers in my immediate group. It requires a certain mindset, and a lot of practice, but it's absolutely possible (in fact, I'd go so far as to say anyone with sufficient patience could learn these skills).

For straight-ahead integer computation, I typically manage 10-15% over the best compilers available on a platform. For complicated algorithms with non-trivial data dependencies and ILP, that number increases. In situations where vectorization is possible but severely non-obvious, I have occasionally beaten quite respectable compilers (ICC, LLVM, XLC) by entire orders of magnitude.

Compilers win on efficiency of compilation, not flat-out performance of the end product. I might spend a week or more on a performance-critical routine. The compiler spends what, 10ms maybe (a huge portion of which is blocking on filesystem access)? If compilers were designed to spend weeks mulling over a problem, they would probably give me a run for my money, but that's not the way they're designed. For the foreseeable future, my job is quite secure.

GENERATION -16 + 31i: The first time you see this, copy it into your sig on any forum. Square it, and then add i to the generation.

the best noob language is iPhone OS (jokes, jokes, they couldn't even handle that)

If you want power for a high-level project (game, application), consider Cython (the evolution of Pyrex). It has been proven, with just static variables and C'd functions (commonly achieved by putting the letter c in front of them), to be AS FAST AS C.

You could do like the engineering department at my university, which lets Matlab (or Octave, if you prefer FOSS) scripting be many students' first programming language. The eye roll means I don't recommend that, just so we're clear. Matlab/Octave is an extremely useful language to know, especially if you're dealing with large matrices of data, but teaching implicit-looping array operations without first teaching explicit looping just seems counterproductive to me...
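The implicit-vs-explicit distinction the poster is drawing, sketched in plain Python rather than Matlab (the data below is invented): Matlab's `2*A` loops over the whole array for you, where a beginner would arguably first be taught the explicit loop.

```python
data = [1.5, 2.0, 3.25]

# Explicit looping: what the poster argues should be taught first.
scaled_loop = []
for x in data:
    scaled_loop.append(2 * x)

# Implicit looping: the Matlab-style "2 * A", expressed here as a
# comprehension -- one operation applied across the whole array.
scaled_implicit = [2 * x for x in data]

assert scaled_loop == scaled_implicit  # same result, different mental model
```

The complaint in the post is that starting from the second form hides the loop that a new programmer still needs to understand.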

In light of the impermanence and absurdity of existence, I surmise that nothing is better for us than to rejoice and to do good in our lives, and that everyone should eat and drink and enjoy the good of his/her labor. Such enjoyment is a gift from God.

(And, yes, I read the whole thread. Including that epic flamewar with that one dude who thinks that "programming should be made as hard as possible to learn so that as many students as possible will be turned off of the subject as early as possible"...and who still evidently believes his pedagogical opinions have any validity whatsoever after that.)

I think for learning the simple ideas of computer programming—loops, variables, conditions, etc.—it would probably be simpler to start out with pure text processing.

sed has been referred to as "The assembly language of text processing", and is perfectly suited for learning simple concepts such as conditional branching and "goto"s, and it requires extremely minimal work to make it do something. Also, unlike actual assembly language, sed has a very real day-to-day use. And within its niche it is unlikely ever to be supplanted by anything else. It makes a nice investment for future use on the job (unlike assembly).

The next step up from sed would be awk. In awk you have loops, variables, associative arrays, functions, if-then-else constructs, etc., etc., etc. You're not going to program any video games with it, but it's simple—and it contains all the vital basic elements that can be stumbling blocks for "non-computer people" to really dig. And as a plus side, its syntax for conditionals, loops and code blocks very closely matches that of C, so you'll get a basic "leg up" on reading C code in the future.
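For readers without awk to hand, the classic awk idiom (an implicit per-line loop, fields, and an associative array) looks roughly like this Python analogue; the sample text is invented:

```python
# Rough Python analogue of the awk word-frequency idiom
#   { for (i = 1; i <= NF; i++) count[$i]++ }
# i.e. a per-line loop over fields feeding an associative array.
text = "the cat sat\nthe cat ran"

count = {}                      # awk's associative array
for line in text.splitlines():  # awk's implicit per-line loop
    for word in line.split():   # awk's fields $1..$NF
        count[word] = count.get(word, 0) + 1

print(count)  # {'the': 2, 'cat': 2, 'sat': 1, 'ran': 1}
```

Which also illustrates the C-like control flow the post mentions: the loops and blocks transfer almost directly.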

My counterargument would be that sed/awk are extremely limited in what they can do. Text processing is only one tiny slice of what's possible. They're not even any good at a lot of realistic text processing; a lot of extraction and manipulation of semi-structured formats like XML/HTML or JSON really needs a full parser. sed certainly can't do that in anything like a reasonable way, and looking around it doesn't look like awk provides anything better either.
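The point about structured formats is easy to demonstrate: extracting a field from even a trivially nested JSON document is one line with a real parser, while a line-oriented regex approach falls apart as soon as delimiters appear inside string values. The document below is invented:

```python
import json

# Invented sample document; note the string value containing "}," --
# exactly the kind of thing that trips up a naive line-oriented regex
# but is trivial for a real parser.
doc = '{"users": [{"name": "alice"}, {"name": "b},ob"}], "count": 2}'

names = [u["name"] for u in json.loads(doc)["users"]]
print(names)  # ['alice', 'b},ob']
```

sed and awk operate on flat lines of text; recovering the tree structure of JSON, XML, or HTML from them requires a parser that tracks nesting, which is why they are the wrong tools for that slice of text processing.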

I know that those were your languages, and more power to you; if anyone is happy with what they started with, then it was at least a pretty good language for them. At the same time, if someone wanted to get into programming (as opposed to "I want to get better at using my Linux system" or something like that), I have a hard time envisioning a scenario where I'd consider recommending it.

Missed this discussion the first time around. I'm not sure there really is an ideal newbie/learning language, but I think the criteria I'd use are:

1. It is interactive. Write-compile-test is okay for those of us who already know more or less what we're doing, but when you're starting out you need something more immediate and hands-on.

2. It doesn't hide too many lower-level programming concepts completely (gotta learn your pointers sometime, and new programmers should at least be aware that there's a difference in how computers handle integer and floating-point numbers), but doesn't require them for most of the basic stuff.

3. It doesn't introduce too many of the more esoteric/high-level concepts as fundamentals. Stuff like OOP is interesting and potentially meritorious, but trying to get someone who has yet to even get the hang of standard procedural programming to wrap their head around it is not a great idea.

4. It doesn't have too many arbitrary weirdnesses for the newbie to come to grips with. It's probably inevitable that it will have some, but it's easy for the bogosities of a language, large or small, to ingrain themselves in the new programmer semi-permanently, so the fewer of them the better.

So with that in mind, my thoughts on some of the languages put forward:

C is right out, and I say that with it being my favored implementation language. It's good for what it is, but it fails on pretty much all counts but #3 - non-interactive, ubiquitous low-level concepts seeping up into what is ostensibly a high-level language, and plenty of odd little quirks and gotchas in both the language and the standard library.

Java is also right out. It was never a good teaching language even when it was the go-to teaching language. (Is it still, or have the .NET zealots taken over in educational settings?) Like C, it fails on pretty much all counts (non-interactive, arbitrary weirdness abounds, plus the standard library is fairly ugly and extremely verbose, and the whole Java philosophy of programming-as-assembly-of-prefab-parts is just horrible,) but trades succeeding on #2 for failing on #3 (why do so many OO languages insist on wrapping main() in a class? What does that accomplish? Especially when it doesn't even take advantage of it in any way, like, say, making the arguments a member of the class instead of an argument to the function?)

Python is pretty alright, really. I can see why people consider it the BASIC of the 21st century, and it's generally quite featureful without pushing too many complex features on people who don't need them. All the same, it brings in a little too much of OOP concepts where they're not really needed, arrays are kind of a hack compared to basically any other HLL, and literal whitespace is awful, arbitrary weirdness of the first order and belongs back in the '60s with JCL.

Pascal is...eugh. It's much, much better than it was back when it was introduced, to be fair, having been extended to the point of being a perfectly usable language for general applications, but no language that contains something as demented as making the size of an array be part of its type should ever see use as a teaching language, even if there is a sane workaround now. You wouldn't tell students to do their homework in a library that's just a little haunted by insane cannibal ghosts.

Lisp...really? Lisp? Don't get me wrong, it's a fascinating and cool language and more than a little beautiful in its simplicity, but the mind boggles to consider what kind of programmer would result from using Lisp as an actual first-introduction-to-programming teaching language. Still, I suppose it does satisfy pretty much all the criteria I laid out.

Smalltalk I really want to consider as a good teaching language. It has a beautifully simple programming model to rival Lisp, but takes less work to wrap one's head around, and it's hands-down the best design for an object-oriented language I've ever encountered. But...well, it's pervasively, inescapably object-oriented, and while it's beautiful about it, I don't think that's probably the best choice for teaching/learning, because OOP is not fundamental to programming. Plus the conflation of language interpreter and computer operating environment is just not great for most purposes, however interesting it was back in the '70s. My favorite Smalltalk is actually the command-line GNU Smalltalk, just because it allows me to enjoy the beauty of the language without having to put up with the weirdness of the operating environment.

Really, in spite of everything, I think I'd have to recommend Basic as a learning-language-of-choice, with the very explicit caveat that it be a modern, sane, structured Basic (FreeBasic would be my recommendation). It's not remotely perfect (for instance, as far as I've seen every Basic still makes you use those goddamn left$/right$/mid$ functions for string manipulation, even if you just want to check one character at a specific index - why can't we just index the string like an array? And they really should've moved the main program flow out of global scope and into a main() equivalent when they devised structured Basic), but it's vastly improved over the BASIC of old, and it more or less meets my criteria - it provides both lower- and higher-level concepts while allowing the novice to sit comfortably in the middle until they're ready to delve into the more esoteric stuff, it's relatively free of bogosity and weirdness, and it's easy to learn. The main downside is that I don't know of a free Basic that's both highly capable and structured and also interactive, but I haven't really looked in some time.

"'Legacy code' often differs from its suggested alternative by actually working and scaling." - Bjarne Stroustrup

www.commodorejohn.com - in case you were wondering, which you probably weren't.