Friday Hack Chat: Circuit Python

Back in the olden days, if you wanted to learn how to program a computer, you used the BASIC interpreter stored in ROM. This is how an entire generation of devs learned how to program. Now, home computers do not exist, there is no programming language stored in ROM, and no one should inflict JavaScript on 8-year-olds. What is the default, My First Programming Language™ today? Python. And now it’s on microcontrollers.

For this week’s Hack Chat on hackaday.io, we’re going to be talking all about Circuit Python. Circuit Python is based on the open source MicroPython, a Python 3 interpreter that implements a subset of the Python language on microcontrollers and other constrained environments. It is the spiritual successor to the BASIC found on every computer: MicroPython has an interactive prompt, arbitrary-precision integers, closures, lists, and more. All of this fits on a microcontroller with 256 kB of code space and 16 kB of RAM.
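Those features are ordinary Python 3, so you can try the same lines at a desktop interpreter before moving to a board. A minimal sketch of what that feature list means in practice (no board-specific modules assumed — this runs under CPython or at a CircuitPython REPL alike):

```python
# Arbitrary-precision integers: no overflow, at any size
big = 2 ** 100
print(big)  # 1267650600228229401496703205376

# Closures: step() captures `count` from its enclosing scope
def make_counter():
    count = 0
    def step():
        nonlocal count
        count += 1
        return count
    return step

tick = make_counter()
print(tick(), tick(), tick())  # 1 2 3

# Lists and comprehensions
squares = [n * n for n in range(5)]
print(squares)  # [0, 1, 4, 9, 16]
```

That whole interactive, batteries-included experience is what made ROM BASIC approachable, now with a real modern language.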

Our guests for this week’s Hack Chat will be [Scott Shawcroft] and [Dan Halbert] from Adafruit. [Scott] started working on MicroPython with Adafruit in September 2016 and has led the renamed CircuitPython effort ever since. [Dan] started working on CircuitPython in early 2017 and joined Adafruit in August of that year. [Dan], by the way, is the original author of the ‘more’ command in UNIX.

For this Hack Chat, we’re going to be talking about CircuitPython, its history, current boards that support the project, and the end goals for CircuitPython. We’ll be talking about future plans, what will be supported in the future, and asking any technical questions about CircuitPython.

I think there are several advantages to using simpler systems than PCs or even Raspberry Pis for teaching programming.
There are no distractions, like Facebook.
It’s difficult to “corrupt” the OS image; you simply power off and start again.
The environment is necessarily simple, no complex window-management frameworks and so on.
You get a taste of “what lies below”, the low level operation of a computer, which is often abstracted away in more complex environments.

I think it’s more like ‘the house computer’ doesn’t exist. There’s no ‘home computer’ or default that everyone is using. Just like there most likely isn’t a phone tethered to your kitchen wall anymore. IDK that’s how I understood it.

I guess because there are so many modules/packages. That, and for GUI stuff it sure beats C/C++ with one of the cross-platform widget libraries (wx, Qt, …), and you can still easily code the number-crunching parts in C/C++ for native code and performance.

If you think wxWidgets is a proper cross-platform Python GUI toolkit, you are mistaken. I’ve maintained a Python wxWidgets application for years, and there are lots and lots of subtle platform-dependent bugs, to the point where it becomes unmaintainable if OSX is also your target. And you need to test everything on all the platforms you want to support.

Not to forget that Python 3 support was never really finished for wxPython; there is “Project Phoenix”, but after years, still no stable release.

And that is even more why it isn’t a real programming language. When your program consists of a couple dozen lines of “code” where you’re pretty much performing some if then else logic and passing variables into the 300 modules you have loaded, you aren’t programming. You are sticking lego blocks together.

I knew an EE who felt that way, guy never programmed in his life ’til he needed to write a test suite for a product. A vendor told him to try Python and the guy fell head over heels. The rest of us who had been programming for quite a while in more low-level languages could see the benefits.

Long story short, we had to port all of our C code to Python. I don’t care for it much now.

I disagree with much of this article..
When teaching my children to program I start with Scratch – it is great for a 6 to 8 year old. If you start an 8-year-old with Python they are not going to be interested…
Once they are about 9 it’s time to start them on something else – I found Arduino C++ (in later days running on an 8266) to be ideal (NOT THE IDE), as they could really get to understand a procedural language and get things done. And there is enough basic OO in it to let them use some of the concepts.
From there it depended on what they were interested in, the two main choices were –
1) Lazarus. Runs on Windows/Linux and on a Pi… Easy to write graphical programs with.
2) JavaScript, writing mods for Minecraft. You wouldn’t believe how easy it is to get a 12-year-old to want to be able to do this…

Where was python or BASIC in that? Yep, nowhere…

One of the strengths of BASIC was that you didn’t need to buy/install/wait forever to compile a compiler. Most of those issues have gone, and with modern computers compiling small programs is extremely quick. Thus I think the entire interpreter/compiler argument is semi-dead.

The language I’d most like to teach kids with is a cross between the simple Arduino C++ and the Lazarus ability to work with a GUI. That would be fantastic – i.e. a simplified clone of the old Borland Builder… :-)

(and if anyone is interested, the language I was using as an 8-year-old was FORTRAN 66…)

I learned to program on a KIM-1, with 1K of RAM and a hex readout and keyboard, no room for BASIC. So it was machine code; I had to hand-assemble since there was no room for an assembler. But it had a really good monitor for learning.

I got a C compiler about 1988, and soon gave up. It took so long to compile a tiny program (with 64K of RAM, two 5.25 inch floppy drives, and a 1MHz 6809 CPU), and it was hard for a beginner to follow the long string of error messages. With a 1GHz computer, simple programs compile in a flash.

BASIC was seen as the future, even before home computers. For hardware people, it made sense to propagate a “simple” high level language, hence Tiny BASIC from Dr Dobbs and BASIC from Microsoft. It was common to read statements like “BASIC is much simpler to learn than assembly language”. Including it in ROM meant a selling point, out of the box you could do something with your new computer, not even needing a floppy drive.

Now, any home computer can run endless languages. I understand the concept of the Raspberry Pi, a separate computer so you can’t destroy important stuff as you play. But that’s not the only path. You don’t need a separate computer; run it on an older computer if you don’t want to worry. I bought a used netbook with 4 gigs of RAM last year for twenty dollars, and got two ten-year-old laptops the year before for $2.50 each.

As for language, Python probably makes sense now; BASIC is less common. But one can certainly get BASIC for Linux. There was a time when Pascal was favoured by many. Other languages have come and gone, with varying impact, but generally touted as the next thing.

The talk page for the PDP-11 at Wikipedia has a debate about whether it qualifies as a “home computer”. For some, it would seem “home computer” means things like the Commodore 64: “cheap”, and suddenly a lot of people from a wider background bought them.

I think “home computer” was used from the early days; I looked a few years back. It is an ambiguous term, yet when “small computers” came along, they were all used at home, by people dying to get any access to computers. Practical uses came later, with expensive CP/M systems (i.e. loaded with floppy drives or later a hard drive, 64K of RAM, a daisy wheel printer, etc.) used by some businesses, but likely with a hobby background. But it varied: Byte covered programmable calculators almost from the start (but only for a few years). Not that long after the Altair 8800 arrived, one California club did a group buy of LSI-11 bits, Byte did have articles about the LSI-11, and of course when Heathkit added digital computers, one was based on the LSI-11. So I’d say “home computer” was not about the computer, but where you used it.

But still the misinterpretation persists. They don’t make computers like they used to, so for some “there aren’t home computers”. The rest of us get way better home computers than in 1984, for the price of a C64 back then.

I remember the old TRS-80, where BASIC was the whole operating system. While there were load and save commands (to tape), there was only ever one file in memory at a time. Each line had a number, allowing GOTO commands to work without labels.
If you cut back on multi-tasking, or even multi-threaded programs, you simplify your OS significantly. If necessary, you can add external RAM to microcontrollers (typically used as a graphics buffer for an LCD or similar, due to access-speed issues).

The BASIC on the TRS-80 was not an operating system. It was a programming environment that was booted directly into. There were versions of DOS that ran on the TRS-80 if you had floppy drives.

Speaking of computers booting directly into BASIC, it was once a thing in the IBM PC world as well. I actually had an old Pentium-era computer where, if you had no HD or floppies attached, it would try to load BASIC out of ROM. However, there was no BASIC ROM or even a socket on the board to insert one, so it would just end up displaying NO BASIC ROM FOUND.

Since there was nowhere to add a BASIC ROM on the board, I can only assume that was some legacy BIOS stuff that was left in the computers of the time.

A few years ago I discovered an inexplicable law of nature: at any given time there exists a technology that is far superior to the popular one. In the case of Python, it actually resembles a giant bubble blown up by a bunch of idiots, as just about any competing technology is far superior.

It is perfectly good for most applications. But if your program runs too slowly, you might need to use C or even assembly language. In some cases you must – if you are developing for a microcontroller.

There are people here who actually programmed computers using switches, so watch out :)

Learning Python. I’m really enjoying it as a language and as a scripting language as well. Out of curiosity, how complex can you make these micro devices with code alone? I’m getting ready to host a Coder Dojo and we had this idea as a stair-step project. Could you program, say, a Circuit Playground from Adafruit to send Morse code with one and have a second one listening? When the second device hears the code, it could flash an LED signaling it translated it correctly and then send a reply based on the message. (e.g. device A sends S.O.S., device B flashes its LED red and responds “help is on the way”)

If possible we thought this might be a great group project or something that the students advancing could work on together.
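For what it’s worth, the encode/decode half of that project can be prototyped in plain Python before touching hardware. This is only a sketch under my own assumptions: the `MORSE` table and the `encode`/`decode` helper names are made up here, word spaces are skipped for simplicity, and on a real Circuit Playground the dots and dashes would be blinked out on the on-board LED via the `board`/`digitalio` modules with `time.sleep` timings rather than handled as strings.

```python
# Letter -> Morse lookup table (International Morse, letters only)
MORSE = {
    "A": ".-", "B": "-...", "C": "-.-.", "D": "-..", "E": ".",
    "F": "..-.", "G": "--.", "H": "....", "I": "..", "J": ".---",
    "K": "-.-", "L": ".-..", "M": "--", "N": "-.", "O": "---",
    "P": ".--.", "Q": "--.-", "R": ".-.", "S": "...", "T": "-",
    "U": "..-", "V": "...-", "W": ".--", "X": "-..-", "Y": "-.--",
    "Z": "--..",
}
# Reverse table for the listening device
DECODE = {code: letter for letter, code in MORSE.items()}

def encode(message):
    """Turn a message into Morse, one space between letters."""
    return " ".join(MORSE[ch] for ch in message.upper() if ch in MORSE)

def decode(signal):
    """Turn space-separated Morse back into text."""
    return "".join(DECODE.get(code, "?") for code in signal.split())

# Device A sends; device B recognises the message and replies.
sent = encode("SOS")            # "... --- ..."
if decode(sent) == "SOS":
    reply = encode("HELP IS ON THE WAY")
```

The remaining work for the group project would be the physical layer: flashing the dots/dashes on one board and timing light-sensor or button pulses on the other, which is exactly the kind of thing the students could split up between them.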

Side note: I’ve had more fun playing with these $150 worth of microcontrollers and Python than I’ve experienced with technology in a long time. I hope someone can give me their two cents’ worth.