Posted
by
timothy on Tuesday April 29, 2014 @11:34AM
from the goto-10*5 dept.

harrymcc (1641347) writes "On May 1, 1964 at 4 a.m. in a computer room at Dartmouth University, the first programs written in BASIC ran on the university's brand-new time-sharing system. With these two innovations, John Kemeny and Thomas Kurtz didn't just make it easier to learn how to program a computer: They offered Dartmouth students a form of interactive, personal computing years before the invention of the PC. Over at TIME.com, I chronicle BASIC's first 50 years with a feature with thoughts from Kurtz, Microsoft's Paul Allen and many others."

I mean, BASIC isn't difficult either, but I really don't understand the perspective at the time that FORTRAN was so complex that BASIC and COBOL were really needed for their syntax changes alone. All of the explanations I've read about them invariably include a line somewhere about FORTRAN being so difficult to understand that only scientists could master it. I understand they were all invented for different problem domains, and that's a good reason in and of itself, but sheesh, it's not like it was brain fudge.

It isn't hard. I think the only issue that anyone has had with it was the column restrictions, and the important development there was interactive computing and a decent text editor, not a new language.

Bear in mind, also, that most people have never attempted to write in original FORTRAN. Most have seen nothing earlier than FORTRAN 77, which was tremendously easier and far less irritating than FORTRAN IV. I am old enough to have written FORTRAN IV on card decks; the "good old days" sucked for the most part.

I always found BASIC far more irritating to program in for anything more than trivial applications.

I don't think the language is the problem; it was all the other things that were difficult, and those were mostly solved in the '60s and '70s.

FORTRAN is certainly still in use for Real Programming. I haven't seen a way to use BASIC, in the way it was originally conceived, in 25 years.

I haven't seen a way to use BASIC, in the way it was originally conceived, in 25 years.

To be fair, a lot of people were still using VB6 for Windows frontend programming as recently as 10 years ago. Now it looks like it is finally dying off, although I'm pretty sure a lot of legacy projects are still in use.

BASIC really took off when it was ported to microcomputers. On a minicomputer like Dartmouth's it was just one of many languages, and wasn't even the only interpreted language. However, most languages were still rapidly evolving at the time and there was a lot more experimenting going on. The intent of BASIC really was to be a "beginner's" language, for the undergraduates rather than the grad students. It was also introduced when time sharing was brand new, so the idea of interacting dire

No, it is more a case that access to any computing resources was very limited.
Twenty years later, universities still had old time-sharing machines running FORTRAN, BASIC, and COBOL that first-year students trained on, while PL/I on Sperry-Univac machines was the new thing for the seniors. A typical university only had two to four computers in total.
I also started by learning BASIC with punch cards on a mainframe, since that was the only thing that was available.
The small computer population explosion only really happened in the 1980s.

FORTRAN wasn't the language in 1964 that you think of as FORTRAN today.

Most people's concept of FORTRAN is FORTRAN 77 or its descendants, which was 13 years in the future from BASIC's introduction.

At the time of BASIC's introduction, FORTRAN IV was the current version.

FORTRAN wouldn't be ANSI-fied for another two years as FORTRAN 66, so every version had machine-specific features. Also, because FORTRAN's development was largely driven by IBM until FORTRAN 66, all the non-IBM versions were "nonstandard." Imagine if, today, every computer came with a C compiler and there were no ANSI or ISO standard to constrain its behavior. The last common reference would have been K&R '78.

Another fun feature of early FORTRAN was fixed column layout [wikipedia.org], common among languages invented in the punched card era. That is, you had to do things like start all statements in column 7 or later, because the first 6 columns had other meaning.

Early FORTRANs also had very primitive program-structuring concepts, hardly raised above the level of assembly language.

Read through the Wikipedia article [wikipedia.org]. You'll probably be shocked at how primitive FORTRAN was in the early 1960s.

That's all very useful information, thank you for sharing it. It doesn't change my opinion, really, as I've worked with early FORTRAN, but it's pretty interesting for those who were not aware of that history.

Well, there were a few key advantages. BASIC was an interpreted language, vs. FORTRAN or COBOL, which required compiling. This gave beginner programmers quicker response: write code, run it, vs. compile, find a slew of cryptic errors, try to find and fix them, and try again. Secondly, BASIC was a line-driven language vs. procedural:

10 PRINT "HELLO"
20 GOTO 10

Yes, it created a lot of bad programming habits. But it really explained how the computer processed the stuff better than the procedural style did. But I am sure som
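To make that concrete, here's a toy Python sketch (names and structure are mine, not any historical implementation) of how a line-numbered interpreter might run that two-line program:

```python
# Toy sketch of a line-numbered interpreter: statements live in a dict keyed
# by line number, and GOTO simply changes which line runs next.
def run(program, max_steps=100):
    lines = sorted(program)                # execution order = line-number order
    out, i, steps = [], 0, 0
    while i < len(lines) and steps < max_steps:
        op = program[lines[i]]
        if op[0] == "PRINT":
            out.append(op[1])
            i += 1
        elif op[0] == "GOTO":
            i = lines.index(op[1])         # jump to the target line number
        steps += 1                         # cap steps so 'GOTO 10' stays finite
    return out

# 10 PRINT "HELLO" / 20 GOTO 10, capped at six steps:
print(run({10: ("PRINT", "HELLO"), 20: ("GOTO", 10)}, max_steps=6))
# ['HELLO', 'HELLO', 'HELLO']
```

The point of the sketch is the dispatch model: execution order is just line-number order, and control flow is nothing more than moving the cursor, which is exactly what made the model easy to explain to beginners.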

I can't speak to Fortran, having no direct experience with it, but every Cobol program I saw in the youth of my career was interpreted. I administered a multiuser Cobol-based accounting system on a Xenix box in the early 1990s (still probably the best accounting program I've ever seen, BTW), and it was all i-code that ran on a Cobol interpreter. I imagine that for most of Cobol's history the majority of its software has run this way.

I stopped programming COBOL about 5 years ago, after having professionally used COBOL for about 13 years. I've never even encountered a COBOL interpreter; it was all compiled.

IMHO, interpreted COBOL makes no sense at all; you use COBOL because you want raw performance. If performance isn't top priority, you'd be better off using Java, C++, or most other languages. Perhaps if you need backwards compatibility for obsolete hardware running legacy code.

All the shops I've worked for are actually pretty hardcore about performance, testing against other languages whenever a compiler is updated. Nobody wants to program COBOL; it's just that it produces very good binaries for the specific (financial) tasks it was designed for. Or do you really think all those COBOL programmers still active are only doing it for legacy support?

Are you sure? College-age geeks, 18-24, were born in the 1990s! When they were old enough to start doing geeky stuff with a computer, we were well into the late '90s and early 2000s. There is a new generation of "kids" out there who never had to experience the world pre-GUI, with 5 1/4" or bigger floppy disks, or without high-speed internet. The actual kids today who are able to start doing more than basic stuff with a computer don't comprehend a world without internet connectivity with cellphones

It's not so much that FORTRAN was horrible, but AFAIK it did not lend itself to a simple interpreter, and so the gratification was not instant. Compiling is an extra step and it is not a fun step. In addition (again relying on very faint memory), BASIC had easier string-handling functions built in until FORTRAN 77, and most of us were dealing with strings when we were learning to program.

FORTRAN isn't that complex. Originally, it only had 7 statements (or was it 10?). The only halfway complex thing about it was the expression-compiler part.

But BASIC had several advantages. It was intended for interactive use, at a time when most FORTRANs were batch-only. It originally supported only a very limited set of variable names like "A", "B", "C", and so forth, meaning that you didn't have to implement symbol-table logic and the associated storage on machines where 16K of RAM might be considered a lot; instead, the variable name was the hash code into a value table. Refinements such as the rich set of built-in functions and extensive string services were later additions.
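The "variable name as index into a value table" trick can be sketched in a few lines of Python (a hypothetical illustration, not Dartmouth's actual implementation):

```python
# Sketch: when variable names can only be A..Z, "symbol lookup" is just
# arithmetic on the character code; no symbol table is needed.
values = [0.0] * 26                        # one slot per possible variable

def store(name, value):
    values[ord(name) - ord("A")] = value   # the name itself is the index

def load(name):
    return values[ord(name) - ord("A")]

store("A", 3.5)
store("Z", 7.0)
print(load("A"), load("Z"))                # 3.5 7.0
```

With the name space fixed in advance, lookup is a constant-time array index and costs essentially no memory, which matters a great deal on a 16K machine.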

The original BASIC was so minimalist that even the first effort from Gates & Co. exceeded it. But it introduced a lot of people to "instant gratification" programming, and thus its influence can be felt in many places to this day.

I've actually programmed in Fortran and BASIC way back in the day (late '70s, early '80s). From a language point of view, early dialects of Fortran (e.g. Fortran IV, ca. 1961 and still in widespread use in the '70s) and BASIC were in fact *very* similar. What was different was that Fortran was *compiled* and BASIC was *interpreted*.

It was common until the mid 1970s for Fortran programmers to physically drop off a deck of punched cards at the operator's window. They'd get their results some hours later, if not the next day, after the operators got around to running the job. Most of the time those results wouldn't be the desired computation, but a compilation error. So to be productive in Fortran you had to think about your *entire* problem in advance, carefully preparing your deck to get as much as possible correct before handing the job off.

BASIC was an interpreted language initially. That meant you could type in little snippets of your program, even individual expressions, to see how, or if, they did what was expected. If you typed in a program and there was a syntax error, you'd know as soon as you hit "return". This allowed a more exploratory approach to programming and learning to program. Of course, you could get the *same* interactive experience in a much more sophisticated language by using Lisp.

I started programming in C in the 1980s, and this use-style distinction between compilation and interpretation remained. A full compile and link of our modest application took something like 30 minutes on the minicomputer we were using, which had a clock speed in the single digit MHz range. So we prepared our source changes *very* carefully, and used our down time to read the full 8 volume Unix manual from cover to cover, over and over again. There was something to be said for such an approach to programming, but it was not for the faint hearted.

By the '90s this had changed. Compilers were orders of magnitude faster; you'd actually hit "compile" and *watch* the compiler do its thing. A decade earlier that would have been like watching paint dry. Editors became syntax-aware too, and debuggers went from requiring mad voodoo skills to being interactive and usable by ordinary mortals. So now compilation vs. interpretation is a packaging and runtime issue; there's not much to choose between them with respect to how *hard* a language is to use. Naturally someone who cut their teeth in the modern era looks at BASIC and Fortran as they were in the '60s and wonders what the big deal was. But it *was* a big deal, at least for people who weren't going to learn Lisp.

FORTRAN at the time had string handling capabilities that sucked so bad, they caused local black holes. Not a problem for techies solving differential equations, but somewhat more of a problem for business and hobbyists (the latter of whom often fancied imaginative string handling to create the illusion that the computer was actually holding a conversation).

I grew up with a little TRS-80 on which you had to learn BASIC to so much as load a file. In Grade Three I was learning things like coordinate geometry and algebra, while my peers were struggling with their multiplication tables. I remember when my peers were introduced to algebra for the first time, some of them had difficulty understanding how x could be a number, while I was busy making adventure games at home. Thanks to this head start in life, I now have a job in IT. BASIC gave me a great head start in computer literacy!

That isn't true. The CS 101 class should have components about following good structure. Give assignments where the students need to build off their own code; if they start doing it the BASIC way, make assignments that show how much harder it is to maintain over time. I went into CS with a strong knowledge of BASIC, and it was actually very helpful. As the other students were struggling with loops and getting the darn thing to work, I was focusing more on trying to follow the guidelines for style and seeing if

I agree, but this is actually an old tongue-in-cheek essay; in context it makes more sense, perhaps:

"FORTRAN --"the infantile disorder"--, by now nearly 20 years old, is hopelessly inadequate for whatever computer application you have in mind today: it is now too clumsy, too risky, and too expensive to use.

PL/I --"the fatal disease"-- belongs more to the problem set than to the solution set.

It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration.

The use of COBOL cripples the mind; its teaching should, therefore, be regarded as a criminal offence.

APL is a mistake, carried through to perfection. It is the language of the future for the programming techniques of the past: it creates a new generation of coding bums."

Sure, I got a talking-to when, in a lecture, I brought up using a GOTO to solve a problem. However, that was about it.

Crappy professor.

GOTO is a perfectly reasonable solution for a very small set of problems. Dijkstra's famous essay has a less-famous rebuttal (that I'm far too lazy to find at the moment), with which Dijkstra agreed: Code should be readable first, well-formed second, and functional third.

Using a GOTO is usually only functional. It doesn't preserve the structure of the code like the two ends of a loop, and being one-ended, it's not easily followed when reading. Other mechanisms are usually better.

I learned to program on 8-bit computers (Tandy and Commodore), all of which were running BASIC interpreters, and I had no problem moving on to structured programming like Pascal, and ultimately OOP languages like Java.

/sarcasm Riiiiiight, because programmers "never" dissect and disassemble Basic to see how it was implemented, learning such topics as Taylor series, Horner's evaluation for polynomials, how to use memory-mapped I/O, how parsers work, or any other advanced topic in interpreters.

Guess what: a great programmer is STILL great in spite of, or independent of, brain-dead languages such as Basic or Javascript. At the time, Basic freed programmers from the mundane tedium of assembly language.

I wouldn't contend for an instant that the kids I grew up around were 'retards'. 8-year-olds can't magically know things without experience.

How many kids have the chance to sit down in front of a computer and learn that the reason a ball goes across the screen comes down to something as simple as x=x+1? Schools won't teach them that until the end of primary.
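That "ball goes across the screen because x=x+1" idea can be shown in a few lines of Python (a made-up text-mode sketch, not actual BASIC):

```python
# Each "frame" draws the ball one column further right; the whole trick
# really is x = x + 1.
def frames(steps=4):
    x, shots = 0, []
    for _ in range(steps):
        shots.append(" " * x + "o")        # ball at column x
        x = x + 1                          # move it one step right
    return shots

for row in frames():
    print(row)
# o
#  o
#   o
#    o
```

That one assignment quietly teaches variables, iteration, and coordinates all at once, which is exactly the head start being described.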

BASIC does probably teach some bad programming habits but at the same time it's accessible to an 8-year-old, and you're learning concepts that are applicable for life: file management, how to store and retrieve data, syntax, etc etc. If the goal is to introduce kids to ehmm.. basic computing concepts, it worked admirably.

Compare to someone with no knowledge of programming concepts at all whatsoever trying to grasp how to call a function for the first time in their life.

BASIC does probably teach some bad programming habits but at the same time it's accessible to an 8-year-old, and you're learning concepts that are applicable for life: file management, how to store and retrieve data, syntax, etc etc.

Also access to source code; before BBSes, that was about the only tutorial/learning material available.

I'll bite. Grading curves. Bad teachers [or rather, bad test writers] grade on a curve so they are sure to get at least a certain bell curve of A's, B's, and C's. So, if you are surrounded by idiots as early as middle school, you'll get better grades. Then, because of quotas, you'll get into a magnet school, which will have better teachers, more interesting classes, and smaller class sizes. That is a recipe for success [for someone like me, anyway].

So, if you are surrounded by idiots as early as middle school, you'll get better grades.

When I was a kid, I wanted to be a machinist. I love working with my hands.

In middle school and high school, there were these "idiots" who took shop, barely passed algebra, and took jobs that gave them credit to graduate.

My parents didn't want me to be a blue collar worker and demanded I go to college. Part of it was that they wanted something more for me - blue collar jobs were being sent down South (Carolinas, GA, FL, etc..) at the time and the "college boys" had their cushy salaried jobs and were the on

Yes, and then from there it's really easy to understand how to calculate things like velocity and gravity, and understand vectors. If I ever have a kid, I'd totally want to get them learning some form of basic coding at an early age. Nowadays it doesn't have to be BASIC, it could just as easily be Lua. There are so many useful/abstract concepts you pick up naturally while figuring these things out, even outside of basic computing.

The BBC Model B equipped with BBC BASIC was released in 1981. As well as the usual litany of BASIC-like features (e.g. GOTO), it had proper named procedures and functions with local variables, which allowed structured programming. It didn't have a proper block-structured IF, though.

It also had dynamic memory allocation and pointer indirection (not that wretched peek and poke stuff).

It was still tied to line numbers, though, but in practice you (a) didn't need them except for computed GOTO and jump tables, and (b) it had a proper renumbering command if you needed to insert space, which corrected all the GOTOs, GOSUBs and jump tables (though obviously not computed GOTOs).
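What such a RENUMBER command has to do can be sketched in Python (a simplified illustration; the function names are mine, and a real implementation handles more statement forms):

```python
import re

# Sketch of what RENUMBER must do: assign fresh line numbers in steps of 10
# and rewrite every literal GOTO/GOSUB target to match. Computed GOTOs are
# built at run time, so no renumberer can fix those.
def renumber(lines, start=10, step=10):
    mapping = {old: start + i * step for i, old in enumerate(sorted(lines))}
    fixed = {}
    for old, text in lines.items():
        text = re.sub(r"\b(GOTO|GOSUB)\s+(\d+)",
                      lambda m: f"{m.group(1)} {mapping[int(m.group(2))]}",
                      text)
        fixed[mapping[old]] = text
    return fixed

print(renumber({5: 'PRINT "HI"', 7: "GOTO 5"}))
# {10: 'PRINT "HI"', 20: 'GOTO 10'}
```

The two-pass shape (build the old-to-new map first, then rewrite targets) is the essential design; it's also why a computed GOTO, whose target only exists at run time, can't be corrected.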

It had a 5-byte floating point system built in too, which, while slow, was pretty decent.

Was quite powerful. It also had graphics and sound built in, which made it very nice to play with.

And then I graduated on to QB when I switched to a PC. Mostly QBasic then a pirated version of QuickBasic. Actually my dad was very against piracy but relented when we phoned a Microsoft sales office and they denied all knowledge of such a product and tried to hawk us an early version of Visual Basic.

QBasic was a fantastic system, especially given it was free with PCs, and I challenge anyone to claim otherwise with good justification.

The BBC Model B equipped with BBC BASIC was released in 1981. As well as the usual litany of BASIC-like features (e.g. GOTO), it had proper named procedures and functions with local variables, which allowed structured programming. It didn't have a proper block-structured IF, though.

Yes indeed. I initially learned to program on a BBC, and I learned a number of good habits in the process.

You're absolutely correct. It did have those, and it was awesome. It meant you could freely mix ASM and BASIC code. Handy, so one could do inner loops in ASM and the more complex outer logic in BASIC without sacrificing anything. I seem to remember that the special variables A%, X% and Y% would be used to set the A, X and Y registers on entry and be set back on exit for easy communication.

The BBC was, and still is, far ahead of anything else as a teaching machine. Simple enough to understand, complex enough to be useful, and enough I/O to put a Pi with a Gertboard to shame even today.

Good point. The built-in assembler was excellent too. The whole BBC Micro project was designed to educate people about the computer as a powerful tool they could use, and not just a games machine. And, as you say, they did a damn good job.

Richard Garriott (of Ultima fame) is running an interesting challenge to port his very first RPG computer game, written in BASIC on a teletype connected to a PDP-11, into a web-friendly or Unity version. https://www.shroudoftheavatar.... [shroudoftheavatar.com]

If IBM had gone to Chuck Moore [slashdot.org] instead of Bill Gates (or rather, his mom) for their 4.77MHz 8088 PC, your title might have been "50 years of FORTH, the Language That Made Computers Personal".

If IBM had gone to Chuck Moore instead of Bill Gates (or rather, his mom) for their 4.77MHz 8088 PC

Microsoft was selling microcomputer BASIC to Fortune 500 clients as early as 1976. It became the most visible --- and the most successful --- developer of programming languages for the micro in the eight bit era and well positioned to move into system software and applications.

No doubt BASIC was the path of least resistance, but if you click through my link to 1983, you'll notice that Smalltalk was positioned to execute on a FORTH stack VM, which was then reduced to hardware in the Novix chip. Moreover, the technology you see in the JavaScript V8 engine had already been published in relation to Smalltalk in 1983.

This was technically feasible at that time. The fact that it wasn't the path of least resistance hardly qualifies BASIC for the credit accorded it by the title of this a

During the late 1980s to mid-1990s, Smalltalk environments ---including support, training and add-ons --- were sold by two competing organizations: ParcPlace Systems and Digitalk, both California based. ParcPlace Systems tended to focus on the Unix/Sun Microsystems market, while Digitalk focused on Intel-based PCs running Microsoft Windows or IBM's OS/2. Both firms struggled to take Smalltalk mainstream due to Smalltalk's substantial memory needs, limited run-time performance, and initial lack of supported connectivity to SQL-based relational database servers.... While the high price of ParcPlace Smalltalk limited its market penetration to mid-sized and large commercial organizations, the Digitalk products initially tried to reach a wider audience with a lower price.

The reason I promoted Forth as a graphics communications protocol alternative to NAPLPS is the Western Electric Videotex terminals for the Viewtron service were so limited in RAM and ROM (far more limited than the first 4.77MHz IBM PC) that it needed a highly compressed representation of the firmware for decent graphics performance. Forth provided that and it would have further allowed dynamically downloading tokenized Forth stack functions (called 'words'). I talked about this with the guys at PARC late

Long before Lisp or Perl, Basic made it much, much easier to deal with text. C (and its children) had pointers and allocation to deal with. Cobol, Fortran and Pascal, by default, dealt with fixed-length strings (yes, later versions improved this).

I bought a VIC-20 in 1982 to use in my woodworking business. I learned BASIC on it by trying to key in the Tank vs UFO game that was printed in the manual. I don't know if it was all of my typos or errors in the printed listing (both likely), but through debugging that ASCII character game, I got started in the direction that took me to working in IT.

Hmmm... now that I come to think of it, proving him wrong actually requires a significant number of programmers to start with BASIC and then go on to become "good programmers". Or perhaps, another way of looking at it is for a significant number of teachers to notice that un-teaching bad habits previously learned isn't all that hard. I think that is indeed the case. Once again, it's subjective. He leaves himself some wiggle room in that quote; but I stand firm in my conviction: It's wrong.

Lisp had gone through a lot of experimentation, and it really didn't settle down into what we recognize as Lisp until later. Lisp 1 was somewhat crude. Lisp 1.5 was a lot nicer, but still clumsy. MACLISP really made things a lot more modern, but that wasn't until 1966, I think.

Oh and the big difference I think, is that BASIC was explicitly intended for all the students and faculty at a liberal arts university, whereas Lisp was initially intended for mathematicians and researchers at a science and technology university.

I started working on computers in the early '80s... The first one I used was a TI-99/4A. It had tape drives and a TV set as a monitor, and a horrific keypad (note: not keyboard). Then my brother got a PCjr and I started hacking with that, and then went off to college. As an engineering major, I learned FORTRAN on punched cards. I hated it! Swore I'd never have a job where I used computers.

Then my brother got the family to chip in and buy me a Tandy 1000a. It came with DOS, Deskmate, and Basic. I started programming in Basic using the concepts I had learned in FORTRAN. By the end, I think I had dumped about $5,000 into that computer. Printers, memory upgrades, floppy upgrades, hard drive, monitor, etc. And still was able to do amazing things with Basic and with BAT files.

My first job was with Arthur Andersen. COBOL. Batch COBOL. 2.5 years of it. Learned it in 6 weeks, and spent the rest of my career there either coding it or writing tech specs for it.

Went to work at an insurance company coding SqlWindows, a now obscure 4th gen programming language. But hey, it was Windows programming. Spent 10 years there in a variety of roles.

After that I set up my own web development shop... Wrote classic ASP, which is essentially Basic for the web. And then went to work at another insurance company, writing, you guessed it, Microsoft VB.NET. Granted, VB.NET was a far cry from the original Basic, and I probably would have been better off learning C#. But that was Microsoft's strategy with .NET: recycle old VB programmers and old C programmers using the CLR. At the end of the day, there's not much difference between C# and VB.NET. Now I don't code anymore; I'm a VP at that insurance company. But I owe a lot of my career to having a tool like Basic available to me in my formative years. Sure, it teaches you some bad coding habits. But just like anything else, you learn from that, and others, and classes (and objects, for those who like puns). Those who say that you can't be a good programmer after having learned Basic are either elitist snobs or idiots. Sometimes you have to do it wrong first to see how doing it right makes all the difference. So Happy Birthday, Basic - I love ya', baby.

I would have liked to hear from Ric Weiland but it's not possible since he died in 2006. He was responsible for the BASIC that I learned on: The Microsoft BASIC-in-ROM that came with my family's Ohio Scientific Challenger 4P (a 6502-based system from 1978 that had hardware similarities to Commodore systems). It also featured the first "Easter Egg" I remember: The system's boot prompt was "C/W/M?" (i.e. cold boot, warm boot, monitor). If you selected "A", it responded with "WRITTEN BY RICHARD W. WEILAND."

I recognize that message. I have one of those in my garage and, the last time I checked, it still worked. I finally wrote an emulator for the thing, copied the roms to a modern computer, and play with that, occasionally.

Basic was my first programming language, and I actually spent almost 10 years using it before moving on to more structured languages like C. It wasn't too long after I learned Basic that I found that my favorite features of the language were the ones that enabled me to extend it with my own customizations, which I would have to write in assembler. If I remember correctly, the relevant Basic keywords in the implementation that I used were 'usr' and '&... practically turning it into another language with all of the extensions that I would throw in.

It was Conway's Game of Life, recently explained in Scientific American's Mathematical Games column. I wrote it on a teletype (text terminals awaited cheap 512-bit ROMs for character sets in 1975) connected from my school to a local university computer (PDP-8?). Numbered lines were convenient in early BASIC, when you could only edit one line at a time. The output was a grid of asterisks and blanks. I think the printing took about 30 seconds, which was longer than it took to compute the next generation. I found a listing a

There's still something awesome about the idea of sitting at a Teletype Model 33ASR playing Star Trek. Yes, you'd go through a lot of paper, but it was still fun. Of course what was even more fun was a version of BASIC I had on my 2nd computer, which allowed for INPUT statements that had a timeout feature on them. I was then able to write a version of Star Trek that would have the Klingons be able to attack you if you sat at any command prompt too long. Added an entirely new element to the game, you couldn'

Now seriously. Pascal was published some 2 years before Kernighan and Ritchie released their masterpiece. Having the opportunity to have a long look at Pascal and yet coming up with something like C shows a very strong character.

No, it just shows habit. C was descended from B, which descended from BCPL. They just did more of the same, instead of going to someone else's syntax.

And, having programmed in Pascal for 15 years, I can say that Pascal as defined was not suitable for large projects, whereas C was. Every Pascal compiler had to have some non-standard add-ons to handle modularity. And they were all different. Obviously, the Borland model came to have the status of a de facto standard, but that was not till some years later. You could not have written Unix in standard Pascal; it was written in standard C. Wirth acknowledged the modularity failings of Pascal in his Modula language family, but by that time he had missed the bus.

I do not deny that Knuth, and Wirth, created other, very cool, programming languages later. But I stand by my statement that Pascal, as originally defined, was not suitable for large projects, a failing that Wirth himself recognised.

Knuth had to write a custom pre-processing system to deal with modularity and portability in Pascal when he did TeX and WEB. He explicitly stated that Pascal was not his preferred language for that task, but he used it anyway because it was widespread among his target audience.

In 1962, he sat down to write one book of 12 chapters. More than 50 years later, he hasn't finished the project, which has scope-crept to seven books. In the process, he created several new languages for use in his book project, and became famous.

Yes, I think Knuth may simultaneously be the world's best programmer and the world's worst project manager.

I haven't coded in Pascal since the good old DOS days, when I was about 17, but at the time was writing TSR apps, picture viewers (GIF, PCX, BMP, TIF), some graphic-mode UI, including mouse support, even some VGA graphic demos. I can't think of anything that I couldn't do in Pascal (and some ASM, I give you that). In fact, it's the reason I never got too heavily into C...

I agree there's the same problem of having to assume gcc will turn it into an sbi instruction, but at least it is a little clearer and more likely this way.

My personal feeling is that there should be a C-like language where every single global keyword not in a namespace is reserved for internal use. I.e., max() and min() and sin() and sdi() and so on are all reserved for direct implementation by the compiler. Currently the solution is to pretty much implement __foo() as a built-

It's been a while since I messed with it but I believe you could do a union of a byte with a bit array (or some other bitwise structure) and set the bits individually. At least that's the way I remember doing it for the A/D board I built.
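For the curious, Python's ctypes can mimic that Pascal/C-style union of a byte with individual bit fields (a sketch with made-up type names; the b0-is-least-significant numbering assumes the common LSB-first bit allocation):

```python
import ctypes

# The "union a byte with a bit array" trick: the same storage viewed either
# as one whole byte or as eight 1-bit fields.
class Bits(ctypes.Structure):
    _fields_ = [(f"b{i}", ctypes.c_uint8, 1) for i in range(8)]

class ByteBits(ctypes.Union):
    _fields_ = [("byte", ctypes.c_uint8), ("bits", Bits)]

v = ByteBits()
v.bits.b3 = 1              # flip a single bit...
print(v.byte)              # ...and read it back as a whole byte (8, LSB-first)
```

Because the union's members share storage, writing one bit field is immediately visible in the whole-byte view, which is exactly what you want when one register on an A/D board packs eight independent flags.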

Depending on the machine, the BASIC could be awful. On a Commodore 64, there were no commands for sounds and graphics. You had to get elbow-deep into peeks and pokes. Might as well go to assembler in that case. The C128 had BASIC 7.0, which had a much improved set of commands for graphics and sound.