Here's an interesting article arguing that using C to write low-level, fast code that operates close to the bare metal is no longer a straightforward task, and that the mapping from C to the hardware is becoming increasingly virtualized.

One of the key attributes of a low-level language is that programmers can easily understand how the language's abstract machine maps to the underlying physical machine. This was certainly true on the PDP-11, where each C expression mapped trivially to one or two instructions. Since then, implementations of C have had to become increasingly complex to maintain the illusion that C maps easily to the underlying hardware and gives fast code... In light of such issues, it is difficult to argue that a programmer can be expected to understand exactly how a C program will map to an underlying architecture.

Yes, quite so. As a longtime pedant and onetime assembler programmer, I could never really see that it was correct to call C "a low-level language" and was skeptical of the terminology.

/Rant on: Similarly, I was extremely skeptical of what looked like an orchestrated obfuscation and fake-news narrative around the so-called "Meltdown" and "Spectre" "threats" ("The sky is falling down!"). They are not "threats" per se, but built-in potential security weaknesses that were knowingly designed into the CPUs in the early days by Intel engineers to optimise throughput. That was probably justifiable as a perfectly valid trade-off at the time - i.e., sacrificing some isolation to maximise efficiency.

However, following the Snowdengate revelations, it became apparent that any such potential security weakness was likely to be exploited at some point - that's what made them "threats" - e.g., back-doors deliberately coded into Cisco router firmware, maybe (?) as a provision for anticipated remote maintenance/support needs, etc.

Post-Snowdengate, there's been a sort of FUD-induced insecurity and an almost total cessation of trust (add to that the Bitcoin scams) - so at this rate we'll all be wanting encrypted/blockchained everything and Tandem NonStop hardware and operating systems (or similar). Then all our social chats can be "secure" and even the most banal data - "Hey! Cool pix of your cat/dog/twerk!" - will be "secured" (except from the NSA, of course). We'll probably get it too, but only at a cost. Some people are going to make a lot of $money out of all that FUD. /Rant off.

The other day, I was trying out a proggie called Domain Name Speed Benchmark <https://www.grc.com/dns/benchmark.htm>. It seemed to be a very fast, nifty, and small proggie. Then I saw a bullet-point in the features list that said:

Around 1980 I was working for Sperry Univac. Their model 1100/60 had recently been introduced, and I was tasked with doing the software installation for a new system at Williams College in northwest Massachusetts.

We couldn't get the system booted up. The code in the status lights on the CPU cabinet indicated an "Instruction set error"!?! It took a while to determine this was caused by a failure of the SSP (System Support Processor) to load the microcode that told the CPU how to perform the instructions represented by the values in the instruction register. The microcode was loaded by the SSP from an 8" floppy.

We'd heard about the microcode as a feature that allowed loading a sub-instruction set that was optimized for running COBOL programs. This episode rubbed my nose in the realization that the assembler code I wrote was not what the hardware was running -- it was running an emulation of 1100 Assembler.

Out-of-order execution is governed by microcode that tells the architecturally visible accumulators, registers, and instruction pointer what to do. I'm not aware of any contemporary architectures that have loadable microcode, though.

Neither Intel nor AMD x86 processors can be fully redefined with microcode, though. The companies don't want to provide much information on what is possible, and the firmware blobs are undocumented, encrypted, and digitally signed - but the general idea is that some of the more complex instructions can be tweaked, whereas a lot of the more common stuff is pretty much hardwired.
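For what it's worth, on a Linux box you can at least confirm that patchable microcode sits underneath the "hardwired" instruction set: the kernel reports the revision it loaded at boot (the `microcode` field assumes an x86 system; field names vary on other architectures).

```shell
# The kernel exposes the currently loaded microcode revision per CPU.
# On x86 this appears as a "microcode" line in /proc/cpuinfo.
grep -m1 microcode /proc/cpuinfo

# Distributions ship the (signed, opaque) vendor blobs as packages
# such as intel-microcode or amd64-microcode and apply them early at boot.
```

So while you can't see inside the blobs, you can see that they exist and get swapped out under you.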