Jeff Epler's blog

About me

I've been a computer programmer since I first started typing in program
listings on a Commodore VIC-20 when I was about 8. My hobbies include
electronics, CNC manufacturing, photography, and beer- and winemaking.

I live with my wife, cat, and lots of left-over parts from unfinished
projects in Lincoln, Nebraska, USA.

It remains unclear to me whether effective quantum computing -- which I think of as something that can implement algorithms like Grover's and Shor's -- is a mere matter of engineering, or whether it requires one or more scientific breakthroughs. Many companies doing public-facing research in the area act like it's the former, but authors like Mikhail Dyakonov, who know much more than I do, act like it's the latter. The thing about scientific breakthroughs is that it's hard to put timelines on them; the hoped-for breakthrough might never come.

As bad as C/C++ can be, particularly for systems that have to be secure in the face of untrusted input, it's simply not financially possible for companies to transition their legacy systems to anything else. Take some random and unnamed commercial codebase for example -- while 'sloccount' isn't the Word of God, here's a big application that it says would cost just north of $100 million to write from scratch. But it also depends on any number of file format libraries; just one of them, sloccount reports, is $43 million in its own right. There are maybe 3 or 4 such libraries, and they're all in C++. And then there's the one that the company buys in binary-only form, so you're stuck with a single C++ ABI unless you fork over a 5- or 6-digit quantity of money for a fresh build with a different toolchain!

You simply can't rewrite this in Rust or some other "safe" language, not even your core codebase. You also have to grow 5 or 6 people who are such domain experts that they can write, from scratch, file format libraries for formats where the (complicated × niche) product is huge. And again, while 'sloccount' is not the Word of God, it might say your schedule for just the two subtasks above is 10 years for a staff of 100, which already dwarfs your current development staff. May the address-space layout randomizer have mercy on us all.
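(For anyone wondering where those dollar figures come from: sloccount uses the basic COCOMO model. The coefficients and the default loaded salary below are sloccount's documented defaults; the 2 MSLOC input is a made-up stand-in for the unnamed codebase, not a real measurement.)

```python
# A sketch of how sloccount turns a line count into a dollar figure,
# using the basic COCOMO model. The coefficients (2.4, 1.05, 2.5, 0.38),
# the $56,286/yr salary, and the 2.4 overhead factor are sloccount's
# defaults; the 2 MSLOC figure is illustrative, not a real measurement.

def cocomo_estimate(sloc, salary=56286, overhead=2.4):
    kloc = sloc / 1000.0
    effort_pm = 2.4 * kloc ** 1.05              # person-months of effort
    schedule_m = 2.5 * effort_pm ** 0.38        # elapsed calendar months
    cost = (effort_pm / 12.0) * salary * overhead  # person-years x loaded salary
    return effort_pm, schedule_m, cost

effort, schedule, cost = cocomo_estimate(2_000_000)  # a 2 MSLOC application
print(f"{effort:.0f} person-months over {schedule:.0f} months, "
      f"~${cost / 1e6:.0f} million")
```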

The Gaia spacecraft is pretty amazing. Run a gigapixel camera, in space, for 5 years and somehow get the data on billions of stars home over a mere 3 Mbit/s link. "only a few dozen pixels around each object can be downlinked". The in-space processing must be pretty sophisticated and high-performance.
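(A back-of-envelope sketch of why onboard selection is unavoidable: the 3 Mbit/s rate and 5-year mission are from above, but the 8-hours-a-day contact time and 16-bit pixel depth are my guesses for illustration, not published figures.)

```python
# Back-of-envelope: how much data a 3 Mbit/s link can move in 5 years,
# versus the size of raw gigapixel frames. The ~8 h/day contact time and
# 16 bits/pixel are assumptions for illustration only.
link_bps = 3e6
mission_seconds = 5 * 365.25 * 24 * 3600
duty_cycle = 8 / 24                     # assumed ground-station contact
total_bytes = link_bps * mission_seconds * duty_cycle / 8

frame_bytes = 1e9 * 2                   # one gigapixel readout at 16 bits/pixel
print(f"~{total_bytes / 1e12:.0f} TB total downlink")
print(f"~{total_bytes / frame_bytes:.0f} raw full-frame equivalents in 5 years")
```

On those assumptions the whole mission's downlink is only on the order of ten thousand raw frames, which is why only a few dozen pixels around each detected object can be sent home.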

"[T]here’s the AI trained to identify toxic and edible mushrooms, which simply picked up on the fact that it was presented with the two types in alternating order. This ended up being an unreliable model in the real world."

I thoroughly disagree with the author's assertion of the equal epistemic(?) status of the two fields "date of birth" and "sex/gender" on a birth certificate. I am at home with a world where a 5-second or even 50-year investigation of the shape of a body can't accurately reveal this (once assumed to be objective and unchanging) characteristic. Just think of it like pronouncing a baby a habitual criminal based on the debunked science of phrenology! On the other hand, the truth of passing days and years seems just about as objective as anything, and I find nothing particularly sinister in the way we codify it into a civil calendar, which in turn enables legal contracts like "the term of the lease shall be 12 months from November 8, 2018". Hopefully we some day arrive in a world where even if there's some reason to write down quick notes on the shape of baby genitals (one weird trick for telling babies apart with ~P(0.5)!), nobody insists on printing anything about it on our everyday ID cards, or imagines it should inform our use of pronouns or whether we should prefer white wine or lite beer.

At a first guess I'm at or above that 321-hour average: 8,000 miles of driving per year at a 25 mph average rate gives 320 hours, not including whatever I do in rental cars. On the other hand, we have made a choice to do driving vacations the last few years, racking up 1,200 miles at a go; that driving at least brings much greater rewards than the drive to work! ETA: Average people get 120 hours of vacation?

With $100 terabyte hard disks but without inexpensive backup media and
drives, it is often necessary to mark parts of a filesystem 'nodump' to get
useful backups on a reasonable number of pieces of media.

turd is a program I cooked up to help with this: it is like the
standard du utility, except that it consults the filesystem nodump flag.
This means the output of turd approximates the amount of tape space
required to dump the named directory. Like du -x, turd never
recurses into a different filesystem.
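A minimal sketch of the traversal turd performs, with the nodump test left as a stub (the real flag check is platform-specific -- the FS_IOC_GETFLAGS ioctl on Linux, the UF_NODUMP stat flag on the BSDs -- and turd itself presumably does that part properly):

```python
import os

def nodump(path):
    """Stub: return True if `path` carries the nodump flag.
    A real implementation would use the FS_IOC_GETFLAGS ioctl on Linux
    or check UF_NODUMP in st_flags on BSD; this placeholder counts
    everything."""
    return False

def turd(root):
    """du-like byte total of what a dump of `root` would include:
    skip anything marked nodump, and never cross into a different
    filesystem (like du -x)."""
    root_dev = os.lstat(root).st_dev
    total = 0
    for dirpath, dirnames, filenames in os.walk(root):
        # Prune nodump subdirectories and other filesystems before descending.
        dirnames[:] = [d for d in dirnames
                       if not nodump(os.path.join(dirpath, d))
                       and os.lstat(os.path.join(dirpath, d)).st_dev == root_dev]
        for f in filenames:
            p = os.path.join(dirpath, f)
            if not nodump(p):
                total += os.lstat(p).st_size
    return total
```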

tar0 is a program that takes a list of NUL-terminated filenames and
directory names on stdin and writes a tar file to stdout. For each filename or
directory the file or directory itself is written; the contents of directories
are not recursively written. To create a tar of only files changed since the
last level-1 dump, use

cd /mount-point; turd -F -1 | tar0 > level1.tar
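For the curious, tar0's described behavior can be sketched with Python's tarfile module; this is a guess at the semantics from the description above, not the real program:

```python
import os
import sys
import tarfile

def tar0(infile=sys.stdin.buffer, outfile=sys.stdout.buffer):
    """Read NUL-terminated file and directory names from `infile` and
    write each named file or directory itself -- never the contents of
    directories -- as a tar stream to `outfile`."""
    with tarfile.open(fileobj=outfile, mode="w|") as tar:
        for name in infile.read().split(b"\0"):
            if name:
                # recursive=False: store the directory entry only,
                # without descending into it, per tar0's description.
                tar.add(os.fsdecode(name), recursive=False)
```

The key design point is recursive=False, which writes a directory's own entry without its children; recursion is the caller's job, which is why turd feeds it the explicit list of changed names.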

turd and tar0 are released under the terms of the GNU General
Public License, version 2 or later.