Workshop on Concepts - WoC 2009

Formal specifications have long been considered beneficial for software development and software quality. A central motivation has been the possibility of formal verification, though other uses, such as testing, have also been on the agenda. Until recently, most specifications had to be kept separate from the code, and often required extra notation to couple the specifications with the code. This is on the verge of changing, with several attempts at integrating specifications as annotations or testing frameworks, following the lead of languages like Eiffel and Extended ML. With the proposed inclusion of Concepts into C++0x, formal specifications would for the first time be integrated into a mainstream language.

Early experiments with concept-based software development have shown significant productivity increases, and the advent of tools for concepts is opening up new areas such as high-level software optimisation, software testing, and software evolution tools.

However, concepts have now been removed from C++0x. This opens up a series of research questions, hopefully leading to a better understanding of concepts and of their implications for current and future software practices and code bases.

The purpose of this workshop is to get an overview of the various proposals for Concepts in C++, the main issues leading to their removal from the C++0x standard, and related matters such as added-value benefits and tooling problems. The workshop is to form a backdrop for a European research proposal investigating concepts. One aim of the research would be to come up with a Concepts proposal acceptable to the C++ community.

Abstracts of WoC position papers

An obvious use of axioms in concepts is as a basis for automated testing. Axioms are used to generate test oracle code, and then test cases are generated for each set of types that model a concept. Generating test code is surprisingly straightforward.
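To illustrate the idea, here is a minimal, hypothetical sketch of a test oracle derived from monoid axioms; the names `check_monoid_axioms`, `op`, and `identity` are our own invention, not the code actually generated by the approach described above:

```cpp
#include <cassert>
#include <vector>

// Hypothetical oracle derived from the monoid axioms:
//   (a op b) op c == a op (b op c)      -- associativity
//   e op a == a  and  a op e == a       -- identity
// For any type T and operation claimed to model Monoid, the oracle
// checks the axioms over a set of concrete test values.
template <typename T, typename Op>
bool check_monoid_axioms(Op op, T identity, const std::vector<T>& data) {
    for (const T& a : data)
        for (const T& b : data)
            for (const T& c : data)
                if (op(op(a, b), c) != op(a, op(b, c)))
                    return false;                      // associativity fails
    for (const T& a : data)
        if (op(identity, a) != a || op(a, identity) != a)
            return false;                              // identity fails
    return true;
}
```

For instance, `int` with `+` and `0` passes such an oracle, while `-` fails the associativity check, so the oracle doubles as a check that a set of types really models the concept.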

There are still some open issues in concept-based testing, though, such
as how to deal with auto concepts, and how best to express
axioms for object-oriented code and exceptions.

Another use of axioms is as rewrite rules for the optimisation of
programs. This is limited to (conditional) equational axioms,
but even such axioms may not be immediately useful for optimisation: a
programmer has no control over how and when rewriting is done, and the
compiler may not be able to see on its own whether an axiom is useful for
optimisation or not. Attaching strategies to axioms, or employing some
kind of classification scheme, can alleviate this problem and make axioms
useful in optimisation.
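For reference, in the (since removed) C++0x concepts proposal such equational axioms were written inside concepts roughly as follows; the fragment is illustrative only, does not compile with any shipping compiler, and the concept and axiom names are our own:

```cpp
concept Monoid<typename T> {
    T operator+(T, T);
    T identity();
    axiom Identity(T x) {
        x + identity() == x;            // usable as the rewrite  x + e -> x
    }
    axiom Associativity(T x, T y, T z) {
        (x + y) + z == x + (y + z);     // reassociation rewrite
    }
}
```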

We sketch a hardware-independent parallel programming model based on the theory of Data Dependency Algebras (DDAs) and their embeddings.
Using axiom-based concepts to define and implement DDAs, one gains not only an elegant programming style for efficient many-core programming,
but also the tool of axiom-based testing to ensure quality.

The presentation briefly illustrates the enormous potential of C++ concepts, especially of
semantic concepts. We give a short overview of what can already be achieved with the
current concept proposal and sketch new language features that would augment this
potential further.

Formal methods in general have been around for a long time, but even though they promise to improve software quality, they have never become part of mainstream software development. Concepts may change that. Here we give an account of the SAGA project, where we are exploring the use of concepts for the development of C++ software. We are actively using algebraic specification methods for domain engineering, resulting in a radically different design of the software based on new domain-specific abstractions; for software testing, generating reusable test oracles; and for code optimisation, removing the computational overhead induced by the heavy use of abstractions in the software. An important point is that we are reusing the same specifications for all these purposes. This comes on top of the significant programming-productivity gains resulting from the improved software design.

Various areas of scientific computing, such as algebraic or differential
topology, differential geometry, or geometric algebra, each have different notations and
requirements. Our work focuses on the extraction of all the necessary (mathematical) concepts to
enable fully multi-dimensional and multi-topological programming. We are currently
working on realizing traversal capabilities which transcend current implementations.

Inge Norum

Potential use of C++ Concepts in Petrel, a large-scale software product

Petrel is a commercial large-scale scientific software application largely
programmed in C++. A brief overview of Petrel's usage of C++ (templates) and
the associated tool set is provided. From the perspective of an industrial user
of C++, some envisaged benefits of a C++ Concepts extension are presented in
light of our experience with C++ templates. The importance of tool support is
raised, and is viewed as a deciding factor for our use of generic programming.

In this talk we review how various specification formalisms can be
expressed as institutions: equational logic, process algebra, and
finally a formalism combining processes and data. We conclude with a
brief speculation on how these ideas can be useful for giving concepts a
formal semantics and for re-using tools for reasoning about concepts.

This is a position paper giving our views on the uses and makeup of module interfaces.
The position espoused is inspired by our work on the Extended ML (EML)
formal software development framework and by ideas in the algebraic foundations
of specification and formal development. The present state of interfaces in EML is
outlined and set in the context of plans for a more general EML-like framework with
axioms in interfaces taken from an arbitrary logical system formulated as an institution.
Some more speculative plans are sketched concerning the simultaneous use
of multiple institutions in specification and development.

In this note, I will report on the results of a small
empirical study that assesses performance, expressivity, and
convenience of a function concept. The study shows that the
function concept is faster and at least as expressive as the
best function datatype, but also less convenient to use.

The presentation is based on the 2003 paper and on what we have learned since.
What do we really want? Logically? Performance-wise (runtime and compile time)?
How can we scale use? How close can we stay to current C++?

The Origin C++0x Libraries are a framework for experimenting with generic programming and library
design in C++0x. The Origin.Concepts library provides features for emulating (via templates and
programming idioms) nearly all of the features of the concepts proposal for C++0x. The implementation
of this library leverages many of the new features found in the current C++ Draft Standard including
variadic templates, type deduction, extended SFINAE and forwarding. The intent of the effort is to
provide a framework for experimenting with the underlying semantics of concepts and their use within
generic libraries. In the process of developing this library, we have been able to reproduce virtually every
problem addressed in the WG21 publications regarding the syntax, semantics, and use of concepts,
making it a viable resource for experimenting with the underlying technologies in terms of both language
and library development.
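A minimal sketch of the kind of emulation idiom involved (our own illustration, not code from Origin.Concepts): SFINAE can detect at compile time whether a type models a simple syntactic concept, here less-than comparability.

```cpp
#include <type_traits>
#include <utility>

// Local void_t (std::void_t only arrived in C++17); the indirection
// through a struct sidesteps an old compiler defect with alias templates.
template <typename...> struct make_void { typedef void type; };
template <typename... Ts> using void_t = typename make_void<Ts...>::type;

// Emulated syntactic concept: does the expression "t < t" compile for T?
template <typename T, typename = void>
struct LessThanComparable : std::false_type {};

template <typename T>
struct LessThanComparable<T,
    void_t<decltype(std::declval<T>() < std::declval<T>())>>
    : std::true_type {};

struct NoLess {};  // deliberately lacks operator<

static_assert(LessThanComparable<int>::value, "int models the concept");
static_assert(!LessThanComparable<NoLess>::value, "NoLess does not");
```

Real emulation libraries layer concept maps, refinement, and archetype checking on top of this basic detection mechanism.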

Issues addressed (and yet to be addressed) during the development of the Origin.Concepts library
include deductive and adaptive typename declarations, strong and weak interface checks, explicit and
automatic concepts, negative concept maps, the implicit duality of requirements (assertion and
elimination), and the definition of archetype systems. In the implementation of concept systems, we
have identified use cases for requiring member variables, static members, checks on the visibility of
members, and the need to syntactically differentiate abstractions.

The optimizations in modern compilers are constructed for a
predetermined set of primitive types. As a result, programmers are
unable to exploit optimizations for user-defined types where these
optimizations would be correct and beneficial. Moreover, because the
set of optimizations is also fixed, programmers are unable to
incorporate new optimizations into the compiler. To address these
limitations, we apply the reuse methodologies from generic programming
to compiler analyses and optimizations. To enable compilers to apply
optimizations to classes of types rather than particular types, we
define optimizations in terms of generic interface descriptions (similar
to C++ concepts or Haskell type classes). By extending these interface
descriptions to include associated program analysis and transformation
fragments, we enable compilers to incorporate user-defined
transformations and analyses. Since these transformations are
explicitly associated with interface descriptions, they can be applied
in generic fashion by the compiler. We demonstrate that classical
compiler optimizations, when generalized using this framework, can apply
to a broad range of types, both built-in and user-defined. Finally, we
present an initial implementation, the principles of which are
generalizable to other compilers.
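As a rough illustration of the idea (not the implementation described above), the following hypothetical sketch keys a simplification on an opt-in interface description, so the rewrite a + e -> a applies to any type that declares its identity element, built-in or user-defined; `MonoidTraits` and `add_simplified` are invented names, and the rewrite is applied at run time here purely for illustration, whereas in the work above the compiler performs such transformations.

```cpp
#include <cassert>
#include <string>

// Hypothetical generic interface description: a type opts in by
// declaring its identity element for +.
template <typename T> struct MonoidTraits;  // no primary definition: opt-in only

template <> struct MonoidTraits<int> {
    static int identity() { return 0; }
};
template <> struct MonoidTraits<std::string> {
    static std::string identity() { return std::string(); }
};

// The axiom  a + e == a  used as a rewrite, keyed to the interface
// description rather than to a fixed set of primitive types.
template <typename T>
T add_simplified(const T& a, const T& b) {
    if (b == MonoidTraits<T>::identity())
        return a;          // fold a + e away
    return a + b;
}
```

The point is that the same rule serves `int` and `std::string` alike, which is exactly what a fixed, primitive-type-only optimizer cannot do.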

I am an observer of C++ rather than a researcher evolving it. As a consultant and analyst, I am often
asked by clients which programming language should be used for various of their projects, so keeping abreast of the
directions of languages is an important activity. Hence my interest in C++0x and concepts. I don't
actively use C++ these days, being more in the Java/Scala/Groovy and Python arenas.

The introduction of templates to C++ removed the need to "hack" with macros, and allowed the compiler to
perform checking as well as analysed code generation. This was a good thing. However, template parameters
have always been a bit of an enigma -- seemingly untyped and yet strongly typed. In principle, concepts
bring an explicit type system to template parameters. In practice, it seems compilers would get hugely bigger
to handle concepts, arguably too big.
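One way current C++ can approximate such explicit typing of template parameters, assuming only standard library type traits, is a static_assert at the interface; `max_of` is an invented example:

```cpp
#include <cassert>
#include <type_traits>

// With an unconstrained template parameter, any T is accepted and
// errors surface only deep inside the instantiated body.  A
// static_assert acts as a poor man's concept: it moves the check to
// the interface and states the requirement explicitly.
template <typename T>
T max_of(T a, T b) {
    static_assert(std::is_arithmetic<T>::value,
                  "max_of requires an arithmetic type");  // emulated constraint
    return a < b ? b : a;
}
```

Concepts would make such constraints part of the signature itself, checked at the call site rather than buried in the body.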

It is unsurprising, at least from the outside, that concepts were withdrawn from the C++0x standard. They
were too new, and there was insufficient experience with which to standardize them. Conversely, the
introduction of a threads standard comes far too late. C++0x has threads and futures, but shared-memory
multi-threading is the problem, not the solution, in parallel programming. Moreover, concepts do not really
add anything to programming in a post-Multicore-Revolution world.

The main problems for C++ are the complexity of the C++ language, the huge uptake of Java, and (more
recently) the arrival of the functional languages (most notably OCaml and Haskell) as commercially viable
languages. By most metrics, Java and C are "kings of the hill", though Scala, C#, Python, Jython, Ruby,
JRuby, Perl, Haskell, OCaml, PHP, and JavaScript are also (and increasingly) important.

Will concepts be missed from C++0x? Unlikely, as most people using C++ don't really know what they are.

Marcin Zalewski

Value Types, Computational Bases, and Concepts: What is the Best Way to Glue Code Together?

In this talk I will briefly summarize the taxonomy of ideas introduced by
Stepanov and McJones in their recent book "Elements of Programming." Then I
will ask how these ideas impact generic programming, and how they should be
reflected in programming constructs. In particular, I will talk about value
types, computational bases, and concepts, and about the possibility of
corresponding language mechanisms that are of "higher kind" than concepts.