
BNF

By: Lars Marius Garshol

Introduction: What is this?

This is a short article that attempts to explain what BNF is. It is based on message <wkwwagbizn.fsf@ifi.uio.no>, posted to comp.text.sgml on 16.Jun.98, and because of that it is a little rough, so if it leaves you with any unanswered questions, email me and I'll try to explain as best I can. It has been filled out substantially since then and has grown quite large.

What is BNF?

Backus-Naur notation (more commonly known as BNF or Backus-Naur Form) is a formal, mathematical way to describe a language. It was developed by John Backus (and possibly Peter Naur as well) to describe the syntax of the Algol 60 programming language. (Legend has it that it was primarily developed by John Backus, based on earlier work by the mathematician Emil Post, but adopted and slightly improved by Peter Naur for Algol 60, which made it well-known.) Because a BNF grammar is completely precise, a program can read it and automatically generate a parser for the language it describes; programs that do this are commonly called "compiler compilers".
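To make this concrete, here is an illustration of my own (not from the original article): a toy grammar written in BNF style, and a minimal recursive-descent parser for it, sketched in Python. All names are illustrative.

```python
# A minimal sketch of a recursive-descent parser for the toy BNF grammar:
#   <expr>  ::= <term> "+" <expr> | <term>
#   <term>  ::= <digit> | <digit> <term>
#   <digit> ::= "0" | "1" | ... | "9"
# Each nonterminal in the grammar becomes one parsing function.

def parse_expr(s, i=0):
    """Parse <expr>; return (value, next_index)."""
    value, i = parse_term(s, i)
    if i < len(s) and s[i] == "+":
        rhs, i = parse_expr(s, i + 1)   # recursion mirrors the grammar rule
        value += rhs
    return value, i

def parse_term(s, i):
    """Parse <term>: one or more digits, interpreted as an integer."""
    if i >= len(s) or not s[i].isdigit():
        raise ValueError(f"expected digit at position {i}")
    j = i
    while j < len(s) and s[j].isdigit():
        j += 1
    return int(s[i:j]), j

def evaluate(s):
    """Parse a whole string as an <expr> and compute its sum."""
    value, i = parse_expr(s)
    if i != len(s):
        raise ValueError(f"unexpected input at position {i}")
    return value

print(evaluate("12+30+8"))  # 50
```

Note how each grammar production maps directly to one function; this one-to-one correspondence is exactly what compiler compilers automate.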

You know that Content-Type tag you're supposed to put in your HTML, the one you never quite know what it should be? Did you ever get an email from your friends in Bulgaria with the subject line "???? ??????"? I've been dismayed to discover just how many software developers aren't really completely up to speed on the mysterious world of character sets, encodings, Unicode, all that stuff. You might hope the problem would just go away, but it won't. So I have an announcement to make: if you are a programmer working in 2003 and you don't know the basics of characters, character sets, encodings, and Unicode, and I catch you, I'm going to punish you by making you peel onions for 6 months in a submarine. And one more thing: it's not that hard. In this article I'll fill you in on exactly what every working programmer should know.
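As a small illustration of my own (not from the article) of where that "???? ??????" subject line comes from: Cyrillic text forced through a character set that cannot represent it turns into question marks.

```python
# "Hello" in Bulgarian -- every character is outside ASCII's range.
subject = "Здравей"

# Encoding to ASCII fails outright...
try:
    subject.encode("ascii")
except UnicodeEncodeError as e:
    print("ASCII cannot represent this text:", e.reason)

# ...and software that substitutes a placeholder for each character it
# cannot encode produces exactly the familiar garbage:
print(subject.encode("ascii", errors="replace").decode("ascii"))  # ???????
```

Any link in the chain (mail client, web server, database) that performs this kind of lossy substitution destroys the text permanently; declaring the right encoding is what prevents it.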

A Historical Perspective

The easiest way to understand this stuff is to go chronologically: plain ASCII came first, and all was good, assuming you were an English speaker.

Create Your Own Programming Language

What's a Programming Language?
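To show what "assuming you were an English speaker" means in practice, here is a quick illustration of my own: ASCII assigns codes 0-127, enough for English letters, digits, and punctuation, and nothing else.

```python
# ASCII maps each character to a number in the range 0-127.
print(ord("A"))           # 65
print(chr(65))            # 'A'

# English text fits; anything with an accent or a non-Latin script does not.
print("Hello".isascii())  # True
print("Héllo".isascii())  # False: é has no ASCII code point
```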

Why build one? In reality, a programming language is just a vocabulary and a set of grammatical rules for instructing a computer to perform specific tasks. Regardless of which language we use, our program must eventually be converted into machine language so that the computer can understand it. There are two main ways to do that:

- Compile the program (like C/C++)
- Interpret the program (like Perl)

In this article we take the second way and build an interpreted language, in the spirit of Perl or Ruby, called "St4tic" for demonstration, using JavaCC. Ready!? Fight!
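Before diving in, here is a minimal sketch of my own (in Python, not St4tic's actual JavaCC implementation) of what "interpreting" means: read source text, break it into tokens, and execute it directly instead of translating it to machine code first.

```python
# A toy interpreter for a two-instruction language. All names here are
# illustrative; the real St4tic tutorial builds its interpreter with JavaCC.

def interpret(source):
    """Run lines of `print <integer>` or `add <a> <b>`; return the outputs."""
    output = []
    for line in source.strip().splitlines():
        op, *args = line.split()       # trivial "tokenizer": split on spaces
        if op == "print":
            output.append(int(args[0]))
        elif op == "add":
            output.append(int(args[0]) + int(args[1]))
        else:
            raise SyntaxError(f"unknown instruction: {op}")
    return output

program = """
print 42
add 2 3
"""
print(interpret(program))  # [42, 5]
```

A compiler would instead translate those lines into another language once, ahead of time; an interpreter pays the translation cost on every run but is much simpler to build, which is why this tutorial takes that route.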