Satellite events on the weekend of 2-3 April and on Friday 8 April take place in the Auditorium (building n.1), rooms CZ 1 to 8 (ground floor) and rooms CZ 9 to 16 (second level). The meeting of IFIP WG 1.3 will take place on 31 March-1 April 2016. Location: MultiMedia Paviljoen (building n.74), MMP 3.

Bidirectional transformations (Bx) are a mechanism for maintaining the consistency of two or more related sources of information. Such sources can be software models and code, graphs, relational databases, or any other documents following standard or ad hoc formats. Bx are studied in a wide range of research areas, including databases, programming languages, software engineering, and graph transformation. However, results in one field often get limited exposure in the others. BX 2016 is a dedicated venue for Bx in all relevant fields, and is part of a workshop series created to promote cross-disciplinary research and awareness in the area. We welcome theoretical, practical and tool-oriented work.
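One common way to formalise such a consistency-maintaining pair of transformations is the asymmetric "lens", a pair of functions get and put satisfying round-trip laws. The following is a minimal illustrative sketch; the names and data types are ours, not taken from any particular Bx framework.

```python
# A minimal sketch of an asymmetric "lens", one common formalisation of a
# bidirectional transformation (Bx). All names are illustrative.
from dataclasses import dataclass

@dataclass
class Person:
    name: str
    age: int

# get: extract a view (here, the name) from the source.
def get(src: Person) -> str:
    return src.name

# put: push an updated view back, keeping the rest of the source intact.
def put(src: Person, view: str) -> Person:
    return Person(name=view, age=src.age)

# The round-trip laws that make (get, put) well-behaved:
p = Person("Ada", 36)
assert put(p, get(p)) == p              # GetPut: restoring an unchanged view is a no-op
assert get(put(p, "Grace")) == "Grace"  # PutGet: get retrieves what was put
```

The laws are exactly what "maintaining consistency" means operationally: an edit on either side can be propagated to the other without losing unrelated information.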

Cassting is a European research project funded by the European Commission under FP7 for Information and Communication Technology. The objective of Cassting is to develop a novel approach for analysing and designing collective adaptive systems in their totality, by setting up a game-theoretic framework. Most games played on graphs studied so far are two-player zero-sum; Cassting aims to extend them to multi-player non-zero-sum games, and to show the applicability of the new theory to the analysis and synthesis of interactive computational systems.

This workshop is the final event of the Cassting project. It aims to bring together researchers working on topics related (in a broad sense) to formal methods for the automatic verification and synthesis of complex systems. The workshop will be composed of four invited talks, together with contributed talks (presenting either original or already published work).

During the last few years, it has become increasingly clear that a great variety of state-based dynamical systems, like transition systems, automata, process calculi and class-based systems, can be captured uniformly as coalgebras. Coalgebra is developing into a field in its own right, with a deep mathematical foundation, a growing range of applications, and interactions with various other fields such as reactive and interactive system theory, object-oriented and concurrent programming, formal system specification, modal logic, dynamical systems, control systems, category theory, algebra, and analysis. The aim of the workshop is to bring together researchers with a common interest in the theory of coalgebras and its applications.

Today's IT systems, and the interactions among them, are becoming more and more complex. Power grid blackouts, airplane crashes, failures of medical devices, and cruise control devices going out of control are just a few examples of incidents due to component failures and unexpected interactions of subsystems under conditions that were not anticipated during system design and testing. The failure of one component may entail a cascade of failures in other components; several components may also fail independently. In such cases, determining the root cause(s) of a system-level failure and elucidating the exact scenario that led to the failure is today a complex and tedious task that requires significant expertise. Formal approaches for automated causality analysis, fault localization, explanation of events, and blaming have been proposed independently by several communities (in particular AI, concurrency, model-based diagnosis, and formal methods), and work on these topics has gained significant momentum in recent years. The goals of this workshop are to bring together researchers from these different communities, foster exchange between them, and present and discuss recent advances in the field.

The area of implicit computational complexity (ICC) grew out of several proposals to use logic and formal methods to provide languages for complexity-bounded computation (e.g. Ptime, Logspace computation). It aims at studying the computational complexity of programs without referring to external measuring conditions or a particular machine model, but only by considering language restrictions or logical/computational principles entailing complexity properties. Several approaches have been explored for that purpose, such as restrictions on primitive recursion and ramification, rewriting systems, linear logic, types and lambda calculus, and interpretations of functional and imperative programs. The workshop is intended to be a forum for researchers interested in ICC to present new results and discuss recent developments in this area.

In recent years, the growing importance of functional correctness and the increased relevance of system quality properties (e.g. performance, reliability, security) have stimulated the emergence of analytical and modelling techniques for the design and development of software systems. With the increasing complexity and utilization of today's software systems, FESCA aims at addressing two research questions: (1) what role is played by the software design phase in systematically addressing the analytical and modelling challenges, and (2) how can formal and semi-formal techniques be effectively applied to make these issues easier to address automatically, with less human intervention.

Software product line engineering (SPLE) aims to develop a family of systems via systematic, large-scale reuse in order to reduce time-to-market and costs and to increase product quality. Formal methods and analyses offer promising techniques towards realizing these goals. While some analysis approaches (e.g., for feature modeling and variant configuration management) and formal methods (e.g., SMT/SAT solvers, model checkers, and formal semantics of variability models) have been applied to SPLE, a considerable potential remains to be exploited. The FMSPLE workshop series has proven to be an effective forum for discussing the adaptation of existing formal techniques and the development of new techniques, and their efficient and effective application to families of (highly configurable) systems.

GaLoP is an annual international workshop on game-semantic models for logics and programming languages and their applications. This is an informal workshop that welcomes work in progress, overviews of more extensive work, programmatic or position papers and tutorials.

Graphs are used as models in all areas of computer science: examples are state space graphs, control flow graphs, syntax graphs, UML-type models of all kinds, network layouts, social networks, dependency graphs, and so forth. Once such graphical models are constructed, they can be analysed and transformed to verify their correctness within a domain, discover new properties, or produce new equivalent and/or optimised versions.

Graphs as Models combines the strengths of two pre-existing workshop series: GT-VMT (Graph Transformation and Visual Modelling Techniques) and GRAPHITE (Graph Inspection and Traversal Engineering), but also solicits research from other related areas, such as Social Network Analysis.

Most program verification and synthesis problems of interest can be modeled directly using Horn clauses, and many recent advances in the CLP and CAV communities have centered around efficiently solving problems presented as Horn clauses. This workshop aims to bring together researchers working in the two communities of constraint/logic programming (e.g., ICLP and CP) and program verification (e.g., CAV, TACAS, and VMCAI) on the topic of Horn clause based analysis, verification and synthesis. Horn clauses for verification and synthesis have been advocated by these two communities at different times and from different perspectives, and this workshop is organized to stimulate interaction and a fruitful exchange and integration of experiences.
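To illustrate the idea, a safety property of a loop can be encoded as constrained Horn clauses over a loop invariant. The toy example below is our own, not from any particular solver: over a finite domain, the least model of the clauses can be computed by naive fixpoint iteration, which is the finite-state analogue of what SMT-based CHC solvers do symbolically.

```python
# Toy example (ours): safety of "x := 0; while x < 5: x := x + 2" encoded
# as Horn clauses over an invariant predicate Inv:
#   Inv(0).                          (initial state)
#   Inv(x) /\ x < 5  ->  Inv(x+2)    (loop body preserves Inv)
#   Safety query:  Inv(x) -> x != 7
# We compute the least model of the first two clauses by fixpoint iteration.

def solve_inv():
    inv = {0}                      # facts derived so far
    changed = True
    while changed:                 # iterate the clauses to a least fixpoint
        changed = False
        for x in list(inv):
            if x < 5 and x + 2 not in inv:
                inv.add(x + 2)
                changed = True
    return inv

inv = solve_inv()
print(sorted(inv))     # → [0, 2, 4, 6]: the reachable counter values
assert 7 not in inv    # the safety property holds: x never equals 7
```

A CHC solver answers the same question without enumerating states, by searching for a symbolic interpretation of Inv (e.g. an arithmetic formula) satisfying all three clauses.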

This workshop is intended to be a less formal counterpart to the Principles of Security and Trust (POST) conference at ETAPS, with an emphasis on "hot topics", both in security itself and in its theoretical foundations and analysis. It is part of the activities of the IFIP WG 1.7 on Theoretical Foundations of Security Analysis and Design. Like POST, the themes are: theory of computer security; formal specification, analysis and design of security systems; automated reasoning for security analysis.

The aim of the working group is to support and promote the systematic development of the fundamental mathematical theory of systems specification, and to investigate the theory of formal models for systems specification, development, transformation and verification. The meeting will address the theoretical aspects of the specification and development of computing systems that are based on algebraic and logical concepts, and can be studied systematically within a theory of systems specification. Participation is by invitation only.

MSFP is devoted to the derivation of functionality from structure. It is a celebration of the direct impact of theoretical computer science on programs as we write them today. Modern programming languages, and in particular functional languages, support the direct expression of mathematical structures, equipping programmers with tools of remarkable power and abstraction. Where would Haskell be without monads? Functional reactive programming without arrows? Call-by-push-value without adjunctions? The list goes on. This workshop is a forum for researchers who seek to reflect mathematical phenomena in data and control.

Applications today are built using numerous interacting services; soon off-the-shelf CPUs will host thousands of cores, and sensor networks will be composed from a large number of processing units. Many applications need to make effective use of thousands of computing nodes. At some level of granularity, computation in such systems is inherently concurrent and communication-centred. PLACES aims to offer a forum where researchers from different fields exchange new ideas on one of the central challenges for programming in the near future, the development of programming methodologies and infrastructures where concurrency and distribution are the norm rather than a marginal concern.

Quantitative aspects of computation refer to the use of physical quantities (time, bandwidth, etc.) as well as mathematical quantities (for example, probabilities) for characterising the behaviour and determining the properties of systems. Such quantities play a central role in defining both the model of systems (architecture, language design, semantics) and the methodologies and tools for the analysis and verification of system properties. The aim of the QAPL workshop series is to discuss the explicit use of time, probability, and general quantities, either directly in the model or as a tool for the analysis or synthesis of systems.

The area of Resource Aware Computing grew out of research on using modelling and static analysis techniques to derive upper bounds on the consumption of resources such as time, space and energy. The RAC workshop serves as a forum for presenting original research results that are relevant to resource aware computing and the analysis of resource consumption by computer programs. The workshop aims to bring together researchers who work on foundational issues with researchers who focus more on practical results.

In the FP7 project Sensation (www.sensation-project.eu), work is being performed on extending a variety of design support methods, including Markovian analysis, dataflow methods, timed automata and (statistical) model checking, to take system energy constraints into account. In doing so, design support tools are being developed that help in designing embedded systems that work well in energy-constrained circumstances. Within the project, case studies are performed on embedded video and audio streaming, solar- and battery-powered nano-satellites and the energy bus. The aim of the workshop, which takes place at the end of the project, is to present methods and techniques, tools, design flows and applications that focus on energy-aware and energy-constrained embedded systems. Submissions reporting on work performed within the project, as well as in other projects, are invited.

SynCoP aims at bringing together researchers working on verification and parameter synthesis for systems with discrete or continuous parameters, in which the parameters influence the behavior of the system in ways that are complex and difficult to predict. Such problems may arise for real-time, hybrid or probabilistic systems in a large variety of application domains. The parameters can be continuous (e.g., timing, probabilities, costs) or discrete (e.g., number of processes). The goal can be to identify suitable parameters to achieve desired behavior, or to verify the behavior for a given range of parameter values.

Sharing of subterms is an obvious way to represent terms more efficiently, yielding term graphs. This is ubiquitous in implementations of programming languages: many functional, logic, object-oriented and concurrent calculi are implemented using term graphs. Research in term and graph rewriting ranges from theoretical questions to practical implementation issues. Research areas and topics for the workshop include: modelling of first- and higher-order and infinitary term rewriting by (acyclic or cyclic) graph rewriting; the use of graphical frameworks such as interaction nets and sharing graphs (optimal reduction); rewrite calculi for the semantics and analysis of functional programs; graph reduction implementations of programming languages; graphical calculi modelling concurrent and mobile computations; graphs as models of biological or chemical abstract machines; and automated reasoning and symbolic computation systems working on shared structures.
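The basic idea of subterm sharing can be sketched with hash-consing: structurally equal subterms are allocated only once, so a term tree becomes a term graph. The following minimal example is illustrative only; the class and method names are ours, not from any particular system.

```python
# Illustrative sketch of subterm sharing via hash-consing: structurally
# equal subterms map to a single node, turning a term tree into a term
# graph. All names here are our own.

class TermGraph:
    def __init__(self):
        self._nodes = {}            # (operator, child ids) -> shared node id

    def mk(self, op, *children):
        key = (op, children)
        if key not in self._nodes:  # allocate each distinct subterm only once
            self._nodes[key] = len(self._nodes)
        return self._nodes[key]

    def size(self):
        return len(self._nodes)

g = TermGraph()
x = g.mk("x")
# f(x + x, x + x): as a tree this term has 7 nodes; as a graph only 3.
term = g.mk("f", g.mk("+", x, x), g.mk("+", x, x))  # second "+" is shared
assert g.size() == 3
```

Beyond saving space, sharing means a rewrite step applied to a shared subterm is performed once for all its occurrences, which is the starting point of graph reduction and optimal-reduction techniques mentioned above.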

VerifyThis is a series of program verification competitions previously held at FoVeOOS 2011, FM 2012, Dagstuhl (April 2014), and ETAPS 2015. Its aims are: to bring together those interested in formal verification; to provide an engaging, hands-on, and fun opportunity for discussion; and to evaluate the usability of logic-based program verification tools in a controlled experiment that can easily be repeated. The competition will offer a number of challenges presented in natural language. Participants have to formalize the requirements, implement a solution, and formally verify the implementation for adherence to the specification. There are no restrictions on the programming language and verification technology used. The correctness properties posed in the problems will focus on the input-output behavior of programs, though participants may choose to address only some aspects of the requirements. Solutions will be judged for correctness, completeness, degree of automation, and elegance.

The workshop aims at bringing together researchers working in the fields of Program Verification and Program Transformation. Recent research in these areas has shown a great potential for beneficial interactions. On the one hand, methods and tools for Program Transformation, such as partial evaluation, fold/unfold transformations, and supercompilation, have all been applied with success to the verification of infinite-state and parameterized systems. On the other hand, model checking, abstract interpretation, SAT and SMT solvers, and automated theorem provers have been used to enhance program transformation techniques. Moreover, the formal certification of tools, such as compilers and automated refactoring tools, has recently attracted considerable interest and posed new challenges. The workshop will provide a forum where researchers from these fields can interact and foster new developments.

This workshop focuses on verification of software changes and upgrades, as well as on synthesizing new program versions and automated bug fixing. Software continuously evolves to meet rapidly changing human needs. Each evolved transformation of a program is expected to preserve important correctness and security properties. Thus there is a clear need for techniques that formally verify program correctness after a change, and either provide a proof of correctness or use the results of the analysis to fix detected bugs. This workshop will bring together researchers in formal verification and program synthesis with the ultimate goal of making the process of software evolution safe. The workshop program will consist of invited talks.

Rewriting logic (RWL) is a natural model of computation and an expressive semantic framework for concurrency, parallelism, communication, and interaction. It can be used for specifying a wide range of systems and languages in various application fields. It also has good properties as a meta-logical framework for representing logics. In recent years, several languages based on RWL (ASF+SDF, CafeOBJ, ELAN, K, Maude) have been designed and implemented. The aim of the workshop is to bring together researchers with a common interest in rewriting logic and its applications, and to give them the opportunity to present their recent work, discuss future research directions, and exchange ideas.
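The core mechanism shared by these languages is rewriting a term to normal form by repeatedly applying rules. The toy rewriter below, for Peano-style addition, is our own sketch of that idea, not the API of Maude or any other RWL system.

```python
# A minimal illustrative rewriter for ground terms, in the spirit of
# rewriting-logic languages (our own toy sketch, not a real system's API).
# Terms are nested tuples, e.g. s(s(0)) is ("s", ("s", ("0",))).

ZERO = ("0",)
def s(t): return ("s", t)
def plus(a, b): return ("+", a, b)

def step(t):
    """Apply one rewrite rule at the root, if any matches."""
    if t[0] == "+":
        a, b = t[1], t[2]
        if a == ZERO:            # rule: 0 + b  ->  b
            return b
        if a[0] == "s":          # rule: s(a) + b  ->  s(a + b)
            return s(plus(a[1], b))
    return None                  # no rule applies at the root

def normalise(t):
    """Rewrite innermost-first until no rule applies (normal form)."""
    if t[0] in ("s", "+"):
        t = (t[0],) + tuple(normalise(c) for c in t[1:])
    r = step(t)
    return normalise(r) if r is not None else t

two, three = s(s(ZERO)), s(s(s(ZERO)))
assert normalise(plus(two, three)) == s(s(s(s(s(ZERO)))))  # 2 + 3 = 5
```

Full rewriting logic goes well beyond such confluent, terminating equational rewriting: rules may be non-deterministic and applied concurrently, which is what makes RWL a semantic framework for concurrent systems rather than just a functional evaluator.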