>>> Personally, I'd say there's been precious little new in programming
>>> languages since Simula gave us OOP in the late 1960s.

I wouldn't say so. Advanced type systems (bounded polymorphism and
linear types, to name a few) have entered the picture since.
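As a minimal sketch of what bounded polymorphism looks like in a modern language (the class names here are made up for illustration), Python's type hints let a type variable range only over subtypes of a given bound:

```python
from typing import TypeVar

# Hypothetical example classes, not from the original post.
class Animal:
    def speak(self) -> str:
        return "..."

class Dog(Animal):
    def speak(self) -> str:
        return "woof"

# Bounded polymorphism: T may only be instantiated with subtypes of Animal.
T = TypeVar("T", bound=Animal)

def echo(creature: T) -> T:
    # The bound lets the checker know .speak() exists, while the
    # return type preserves the argument's precise subtype.
    print(creature.speak())
    return creature

d = echo(Dog())  # a type checker infers d as Dog, not just Animal
```

The point is that `echo` is more precise than a function typed `Animal -> Animal`: callers get back the exact subtype they passed in, something plain subtype polymorphism cannot express.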

> The ASCII character set has been a limiting factor for programming
> language design for decades. Here I'm talking about the interface that
> faces the programmer, not "language features" that enable buzzword
> compliant programming.

As John mentioned, APL has been around for ages and used a lot of
non-ASCII symbols. Algol was originally designed to use several
non-ASCII symbols that could be encoded in different ways depending on
the local symbol set. ASCII was by no means a standard then --
FIELDATA and EBCDIC were common alternatives, so the choice was either
to limit the language to use the common subset (which was rather
small) or to use an ideal set of symbols and allow these to be
encoded.

ASCII certainly has the advantage of being easy to type on a
standard keyboard, but with touch screens it is now not so difficult
to provide a soft keyboard with various extensions. And if I were to
go outside ASCII for a programming language, I would also use an
extended layout: subscripts, superscripts and more. Using a subset of
HTML for program layout would work fine: programs can be displayed in
any browser, and you can use HTML editors (and even ASCII editors) to
edit programs if you don't have access to a dedicated IDE.
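A small sketch of the HTML-subset idea above (the token format is an invented illustration, not an actual proposal from the thread): a renderer that turns styled program tokens into a fragment any browser can display.

```python
def to_html(tokens):
    """Render (text, style) token pairs as an HTML fragment.

    style is one of "plain", "sub", "sup" -- an assumed convention
    for marking subscripts and superscripts in program text.
    """
    parts = []
    for text, style in tokens:
        if style == "sub":
            parts.append(f"<sub>{text}</sub>")
        elif style == "sup":
            parts.append(f"<sup>{text}</sup>")
        else:
            parts.append(text)
    return "".join(parts)

# Example: the assignment  x₁ := y²  laid out for a browser.
line = [("x", "plain"), ("1", "sub"), (" := ", "plain"),
        ("y", "plain"), ("2", "sup")]
print(to_html(line))  # x<sub>1</sub> := y<sup>2</sup>
```

Because the output is plain HTML, the same source remains editable in an ordinary text editor, which is the fallback the paragraph above describes.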