cloud

There is a set of modules in the Perl universe that I’ve been looking for an excuse to use for a while: the MCE family of modules, which purportedly enable easy concurrency and parallelism, exploiting many-core CPUs through a number of techniques. Sure enough, I recently had a task to handle that … Read more: Finally got to use MCE::* in a project

Way back in the pre-history of the internet (really the DARPA-net/BITNET days), while dinosaur programming languages frolicked freely on servers with “modern” programming systems and data sets, there was a push to move from statically linked programs to more modular dynamic linking. The thinking was that it would save precious memory, not having … Read more: What is old, is new again

While the day job builds (hyperconverged) appliances for big data analytics and storage, our partners build the tools that enable users to work easily with astounding quantities of data, and to do so very rapidly, without a great deal of code. I’ve always been amazed at the raw power in this tool. Think of a … Read more: Raw Unapologetic Firepower: kdb+ from @Kx

An article appeared on HN this morning. In it, the author noted that all was not well with the universe, as their backup, using Amazon’s Glacier product, wound up being quite expensive for a small backup/restore. The OP discovered some of the issues with Glacier when they began the restore (not commenting on performance, merely … Read more: "Unexpected" cloud storage retrieval charges, or "RTFM"

In Feb 2013, I opined (with some measure of disgust) that people were looking at various software packages as silver bullets: magical bits of a stack which could suddenly transform massive steaming piles of bits (big … uh … “data”?) into golden nuggets of actionable information. Many of the “solutions” marketed these days … Read more: There are no silver bullets, 2015 edition

This is a very exciting hyper-converged system, representing our next generation of time-series and big data analytical systems: tremendous internal bandwidth coupled with massive internal parallelism, and minimal-latency network design. This unit has been designed to deliver the maximum possible performance in a minimal footprint … both rack based … Read more: The world's fastest hyper-converged appliance is faster and more affordable than ever

I had hinted at this last week, so I figured I had better finish working on this and get it posted already. The previous post on language-choice wakeup was about the cost of Foreign Function Interfaces (FFIs), and how well they were implemented. For many years I honestly had not looked as closely at Python as I should have. I’ve done some work in it, but Perl has been my go-to language. For me, the brevity of the interface and the ease of use of its FFI were what made me rethink some things.
I look at languages as tools to an end: a way to implement an algorithm, which will not always be expressed in the same language. I’ve been doing one manner or another of FFI (not called that back then) since the mid-to-late 1980s. Usually Fortran and C, but as often as not Assembler, BASIC, Fortran, C, etc. One of the first codes I wrote for hire, in 1984, was a terminate-and-stay-resident driver for an experiment control card. The main code was in BASIC (I kid you not), and I exposed interrupt service routines to handle control functions. It was a fun thing to do, and later the experiment became famous. Regardless, the ability to understand where code was spending its time, and what it was doing, became very tightly ingrained in me from that experience. Not that BASIC was fast … it wasn’t. I had to do all the hard/fast stuff in assembler. If you don’t understand your code, chances are your measurements of it, and your ability to effect changes, will be … limited.
I look at rapid development languages (Perl, Python, etc.) as high-level mechanisms for bridging together a variety of “fast” lower-level bits (libraries, etc.). These tools often have good internal support for things that I frankly don’t want to waste my time implementing.
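As a minimal sketch of that bridging idea (my illustration, not code from the original posts), Python’s ctypes FFI can call straight into the system C math library in a handful of lines. The library name resolution is assumption-laden and platform-dependent:

```python
import ctypes
import ctypes.util

# Locate the C math library (libm). find_library("m") works on most
# Unix-like systems; the "libm.so.6" fallback is a Linux assumption.
libm_name = ctypes.util.find_library("m") or "libm.so.6"
libm = ctypes.CDLL(libm_name)

# Declare the C signature so ctypes marshals correctly:
#   double sqrt(double);
libm.sqrt.restype = ctypes.c_double
libm.sqrt.argtypes = [ctypes.c_double]

print(libm.sqrt(2.0))  # ~1.4142135623730951
```

The point isn’t that you’d call sqrt this way in practice; it’s that the “fast lower-level bit” (a compiled C library) is reachable from the high-level language with almost no ceremony, which is exactly the brevity that made me rethink things.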