What is a National Research Computing Platform For in 2019? Computers are everywhere now, but computing is still hard. Canada should build on its competitive advantage by strengthening existing efforts to provide expertise, skills, and training to researchers and scholars across the country, and let others provide the increasingly commoditized hardware. The result will be a generation of trainees with deep research and cloud experience, and a critical mass of talent at centres focussed on building enabling technologies. As R&D becomes increasingly intertwined with computational...

( Note: This is a bit of a work in progress; even more so than usual, comments/criticisms/additions welcome ) The Stages of Research Software Development Research software development covers a lot of ground — it’s the development of software for research, and research is a broad endeavour spanning many use cases. The part of research software development that I find the most interesting is the part that is a research effort itself; the creation of new simulation methods, new data analysis techniques,...

I was invited to speak at this past weekend’s fourth annual Chapel Implementers and Users Workshop (CHIUW 2017). It was a great meeting, with lots of extremely high-quality talks on work being done with and on Chapel. The slides from the presentations will be up shortly, and I recommend them - the libfabric, KNL, use-after-free tracking, and GraphBLAS work was of particular interest to me. The Code Camp on the next day, working with members of the Chapel team on individual projects, was also a...

Canada is a federated nation, and this is particularly visible in areas of research funding, where both the federal and provincial orders of government play a role. In building a successful digital research infrastructure to support Canadian science and scholarship, we must recognize that reality, and rely on the successful examples of many organizations in Canada and around the world that embrace such a federated approach. In this discussion paper, my colleague Jill Kowalchuck and I lay out what we hope to be the beginnings...

Julia and Chapel are both newish languages aimed at productive scientific computing, with parallel computing capabilities baked in from the start. There’s lots of information about both online, but not much comparing the two. If you are starting a new scientific computing project and are willing to try something new, which should you choose? What are their strengths and weaknesses, and how do they compare? Here we walk through a comparison, focusing on distributed-memory parallelism of the sort one would want for HPC-style simulation. Both...
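The distributed-memory model both languages target can be sketched in a few lines. This is a toy illustration in Python (standing in for the Julia and Chapel versions in the post itself): each worker process owns its own chunk of the data and communicates results back only by explicit message passing, as in an HPC-style partial-sum reduction.

```python
# Toy distributed-memory-style reduction: each process owns its own
# chunk and sends back only its partial result. (Python analogue for
# illustration; the post compares the Julia and Chapel equivalents.)
from multiprocessing import Process, Queue

def worker(rank, chunk, results):
    # Each process sees only its own chunk of the data.
    results.put((rank, sum(chunk)))

if __name__ == "__main__":
    data = list(range(100))                 # global problem: sum 0..99
    nworkers = 4
    chunks = [data[i::nworkers] for i in range(nworkers)]
    results = Queue()
    procs = [Process(target=worker, args=(r, chunks[r], results))
             for r in range(nworkers)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    total = sum(results.get()[1] for _ in procs)  # combine partial sums
    print(total)                             # 4950
```

The essential point the comparison turns on is how much of this decomposition-and-communication bookkeeping each language handles for you.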

I was asked recently to do a short presentation for the Greater Toronto R Users Group on parallel computing in R; my slides can be seen below or on github, where the complete materials can be found. I covered some of the same material from a half-day workshop a couple of years earlier (though, obviously, without the hands-on component): How to think about parallelism and scalability in data analysis The standard parallel package, including what were formerly the snow and multicore facilities, using airline data as...
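The core pattern in R's parallel package — a parallel map over chunks of data, as provided by the former multicore and snow interfaces (mclapply, parLapply) — translates directly to other languages. A rough Python analogue (the talk itself used R; the per-chunk summary function and the fake delay numbers here are illustrative stand-ins for the airline-data examples):

```python
# Parallel map over data chunks, analogous to R's parallel::mclapply
# or parLapply: apply one function to many inputs across worker
# processes. The "delay" data here is made up for illustration.
from multiprocessing import Pool

def mean_delay(delays):
    # Per-chunk summary, standing in for the airline-data examples.
    return sum(delays) / len(delays)

if __name__ == "__main__":
    chunks = [[5, 10, 15], [0, 20], [30, 30, 30]]   # fake delay chunks
    with Pool(processes=3) as pool:
        means = pool.map(mean_delay, chunks)         # the parallel map
    print(means)                                     # [10.0, 10.0, 30.0]
```

As in R, the main design question is whether the work per chunk is large enough to amortize the cost of spawning workers and shipping data to them.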