The Weird and The Wonderful

The Weird and The Wonderful forum is a place to post Coding Horrors,
Worst Practices, and the occasional flash of brilliance.

We all come across code that simply boggles the mind. Lazy kludges, embarrassing mistakes, horrid
workarounds, and developers just not quite getting it. And then some days we come across - or write -
the truly sublime.

Post your best, your worst, and your most interesting. But please - no
programming questions. This forum is purely for amusement and discussion of code snippets. All
actual programming questions will be removed.

The Master said, 'Am I indeed possessed of knowledge? I am not knowing. But if a mean person, who appears quite empty-like, ask anything of me, I set it forth from one end to the other, and exhaust it.'
― Confucian Analects

Back in 2000, I encountered "delimeter", but the code had nothing to do with the length of a subway sandwich.

".45 ACP - because shooting twice is just silly" - JSOP, 2010
"You can never have too much ammo - unless you're swimming, or on fire." - JSOP, 2010
"When you pry the gun from my cold dead hands, be careful - the barrel will be very hot." - JSOP, 2013

It was not exactly a typo that went into a small document browser which was hurriedly shipped for the U.N.

The day before, I had added a toolbar and a menu item with the text 'Toolbar ein' or 'Toolbar aus' (= show toolbar, hide toolbar). At least, that's what I should have written. Instead, I wrote 'Einbartool' and 'Ausbartool', just for fun.

Then, the next day, it was released in a hurry and my nonsense texts were still in it. Within an hour we got an email with the following question: 'Qu'est-ce que un Einbartool?' ('What is an Einbartool?')

I have lived with several Zen masters - all of them were cats.

His last invention was an evil Lasagna. It didn't kill anyone, and it actually tasted pretty good.

If you do native iOS (Swift), native Android (Kotlin these days), and web dev (JavaScript), you will find that Swift, Kotlin, and JavaScript will drive you crazy with their conflicting uses of let, var, and val.
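To make the collision concrete, here is a quick TypeScript/JavaScript sketch, with Swift's and Kotlin's (different!) meanings noted in comments:

```typescript
// JavaScript/TypeScript meanings:
var a = 1;    // function-scoped, hoisted - the legacy keyword
let b = 2;    // block-scoped and MUTABLE
const c = 3;  // block-scoped and not reassignable
b = 20;       // fine: `let` bindings can be reassigned

// Swift:  `let x = 2` declares a CONSTANT (what JS calls `const`),
//         `var x = 2` declares a mutable variable.
// Kotlin: `val x = 2` is read-only (again, JS `const`),
//         `var x = 2` is mutable, and `let` isn't a declaration keyword
//         at all (Kotlin's `let` is a scope function: x?.let { ... }).
// So the same keyword `let` means mutable in JS, immutable in Swift,
// and something else entirely in Kotlin.
console.log(a, b, c);
```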

Recently, I stumbled over the fact that C# and C++ mean rather different things by "array". While I may seem rather snarky here, I think that language designers, more often than not, deliberately diverge from existing languages purely for the sake of differentiation.

Wait until you see VB.NET's array declarations. They still use the upper bound instead of the length because that's what VB6 did (VB6 let you pick the lower bound, so stating the upper bound made sense), despite the fact that that reason no longer applies to VB.NET, where arrays always start at zero!

"These people looked deep within my soul and assigned me a number based on the order in which I joined." - Homer

I haven't used an upper bound in a VB.NET array declaration for several years. The upper bound is optional, and only really useful if you know ahead of time how big your array will be. Almost all the .NET Framework collection classes support dynamic sizing as well; there are times when a fixed bound is useful, but generally you don't need it.

I think that language designers, more often than not, deliberately diverge from existing languages purely for the sake of differentiation.

I think you may be correct.
Two languages that are extremely similar are Java and C#.
I wrote an app as a C# desktop app and then wrote it as an Android app (Java), and there was a lot of code that translated without any changes - the pure language stuff / syntax.

When I learned the way C# understands arrays, it gave me déjà vu, back to Algol 68.

In those days (the language definition was published in 1968), hardware wasn't fast enough to make such a complex language very useful, and experience with optimizing such constructs was limited, too. So Algol 68 never made it into the mainstream. When I learned about it 10-12 years later (we didn't have a compiler, but studied its concepts in the university compiler course), we saw it as a collection of great ideas that couldn't be realized in practice in any useful way.

So when I saw C# actually realizing those array concepts, it was sort of like a dream from my student days coming true.

What concepts are those? Because automatically managed dynamic generic type-safe arrays by themselves don't wow me as much, I've been using them extensively in Delphi as well and they're just as comfortable as in C#.

First and foremost, ragged arrays - a vector of vectors (of vectors of ...), each of a different size. That requires a radically different memory allocation strategy and address calculation strategy. (Algol 68 also allowed arbitrary lower bounds, which unfortunately has not been carried over to C#.)
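The shape is easy to picture in any language with reference-typed rows; here is a rough TypeScript sketch of the idea (a C# jagged array like int[][] behaves the same way, with each row allocated separately):

```typescript
// A ragged (jagged) array: an array of independently allocated rows,
// each free to have its own length. Indexing takes two steps: fetch the
// row reference, then index within that row - there is no single
// row*width+col formula, because there is no fixed width.
const ragged: number[][] = [
  [1],
  [1, 2],
  [1, 2, 3, 4, 5],
];

function rowLengths(a: number[][]): number[] {
  return a.map(row => row.length);
}

console.log(rowLengths(ragged)); // every row has a different size
```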

Also, a function argument could be an array of any dimension; it was passed via a descriptor stating the dimensions of the actual argument. I am not sure if the actual argument could be dimensioned at runtime - with all the other mechanisms required for array handling, dynamic sizing wouldn't add that much extra.

Lots of the things you see in Algol 68 (in addition to ragged, dynamically sized arrays) are present / common in languages of today. But Algol 68 preceded Pascal by two years, and it appeared when Fortran IV was the state of the art. While we were using named COMMON in Fortran, Algol 68 offered dynamic, ragged arrays with index checks. And pointers. And user-defined operators. And threads with semaphore synchronization. And ...

It was like a brainstorming language: all the crazy ideas thrown onto the table at the same time. It took quite a few years to mold those ideas into something useful, and to learn how they could be realized efficiently. Today we know how. But fifty-plus years ago, the ideas seemed rather crazy to those whose big worry was how the address calculation differs when you lay out fixed-size, rectangular arrays by row (Pascal) or by column (Fortran). Algol 68 was in a completely different world at that time.
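For anyone who hasn't had to care: the "different address calculation" is just which index strides across memory. A small illustrative sketch:

```typescript
// A fixed-size ROWS x COLS rectangular array flattened into one buffer.
// Row-major (Pascal/C style):   element (r, c) lives at r * COLS + c
// Column-major (Fortran style): element (r, c) lives at c * ROWS + r
const ROWS = 3, COLS = 4;
const rowMajor = (r: number, c: number): number => r * COLS + c;
const colMajor = (r: number, c: number): number => c * ROWS + r;

console.log(rowMajor(1, 2)); // 6
console.log(colMajor(1, 2)); // 7
```

Both formulas address the same elements; the compilers just disagreed on which index should vary fastest in memory, which is exactly the kind of detail Algol 68's descriptors abstracted away.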

Remember the old times of BASIC and Visual Basic, where Let assigned the result of an arithmetic expression (luckily it became optional), but if you needed to assign an object reference in VB you'd better remember the Set keyword, or it would crash on first execution.

It was also needed with custom VB classes and objects with no default property. A "nice" syntactic requirement in a language that did not have references or pointers (but then the hidden VarPtr, StrPtr, and another function whose name I forgot came to help).

Which is why using C# for native iOS and Android apps, as well as macOS, tvOS, Linux, and web apps (via WebAssembly), makes so much sense. No Java, no Kotlin, no Swift, no JavaScript (or the myriad of JS libraries). Just C# for them all.