Call for Papers

Workshop Scope

The DIFTS (Design and Implementation of Formal Tools and Systems) workshop
emphasizes insightful experiences in the design of formal tools and systems.
The workshop provides an opportunity to discuss engineering aspects and the various design decisions
required to put formal tools and systems into practical use. In the past,
we have invited speakers who shared their deep insights
and discussed the practices followed in industry for adopting formal methods.
The workshop provides a forum for sharing original challenges and solutions with groundbreaking results.

Topics of Interest

DIFTS takes a broad view of the formal tools/systems area, and solicits
contributions from domains including, but not restricted to, decision
procedures, verification, testing, validation, diagnosis, debugging, and
synthesis.
This workshop encourages and appreciates system development activities and
facilitates transparency in experimentation. It will also serve as a
platform to discuss open problems and future challenges in practicing formal
methods.

Invited Speakers

Rolf Drechsler, University of Bremen, Germany

Formal verification is a powerful and effective method to ensure the correctness of electronic systems. In order to check whether enough properties have been written, coverage metrics have been proposed, along with methods to automatically increase coverage by determining new properties for missed scenarios. In the invited talk we present how such coverage methods can be leveraged at higher levels in the design flow of electronic systems. We first review existing coverage metrics and demonstrate state-of-the-art formal verification techniques for correctness checking at the Formal Specification Level (FSL). Based on those, we describe how coverage techniques can help to automatically derive missed scenarios at the FSL, where structure and behavior are given in terms of UML class and sequence diagrams, respectively.

Rolf Drechsler received the Diploma and Dr. Phil. Nat. degrees in computer science from J.W. Goethe University Frankfurt am Main, Frankfurt am Main, Germany, in 1992 and 1995, respectively. He was with the Institute of Computer Science, Albert-Ludwigs University, Freiburg im Breisgau, Germany, from 1995 to 2000, and with the Corporate Technology Department, Siemens AG, Munich, Germany, from 2000 to 2001. Since October 2001, he has been with the University of Bremen, Bremen, Germany, where he is currently a Full Professor and the Head of the Group for Computer Architecture, Institute of Computer Science. In 2011, he additionally became the Director of the Cyber-Physical Systems group at the German Research Center for Artificial Intelligence (DFKI) in Bremen. His current research interests include the development and design of data structures and algorithms with a focus on circuit and system design.
Rolf Drechsler has served on the program committees of numerous conferences, including DAC, ICCAD, DATE, ASP-DAC, FDL, MEMOCODE, and FMCAD. He was Symposium Chair of ISMVL in 1999 and 2014, and Topic Chair for “Formal Verification” at DATE 2004, DATE 2005, DAC 2010, and DAC 2011. He is a co-founder of the Graduate School of Embedded Systems and the coordinator of the Graduate School “System Design” funded within the German Excellence Initiative. He received best paper awards at the Haifa Verification Conference (HVC) in 2006, the Forum on specification & Design Languages (FDL) in 2007 and 2010, the IEEE Symposium on Design and Diagnostics of Electronic Circuits and Systems (DDECS) in 2010, and the IEEE/ACM International Conference on Computer-Aided Design (ICCAD) in 2013.

Wolfgang Kunz, Technische Universität Kaiserslautern, Germany

Decades of research in the field of formal verification have resulted in a number of breakthrough successes. In many applications, the scalability of formal techniques no longer causes concern. Adequate specification languages such as PSL and SVA have been developed that match well with powerful proof algorithms. Standardization of these languages has facilitated their adoption in industry. Moreover, sophisticated methodologies have emerged in industry that proved to be highly productive. Yet, when deploying tools and methodologies for formal verification in industrial practice, a number of hurdles have to be faced. Only fully automatic techniques such as equivalence checking and automatic linting enjoy widespread popularity in SoC design. Property checking, on the other hand, is only moderately successful. As a result, simulation has continued to be the predominant verification technique.
Based on observations of industrial practices over many years, this talk will identify a set of hurdles that impede formal techniques in gaining further ground over simulation-based verification. After inspecting these hurdles, new lines of research will be proposed for overcoming them. In particular, it will be argued that the future success of formal techniques may depend on how these techniques are integrated into system-level design flows. It is proposed to leverage existing formal technology as a contribution to close the “semantic gap” between the system level and the register-transfer level. This has a strong impact on how to use and design formal verification tools both for hardware and for low-level software. The talk will describe recent results and new practices already explored in industrial case studies. In conclusion, a new system design methodology including formal techniques is envisioned to support effective design procedures based on high-level synthesis as well as manual design refinements at the system level.

Wolfgang Kunz received the Dipl.-Ing. degree in Electrical Engineering from the University of Karlsruhe, Germany, in 1989 and the Dr.-Ing. degree in Electrical Engineering from the University of Hannover, Germany, in 1992.
From 1993 to 1998, he was with the Max Planck Society, Fault-Tolerant Computing Group, at the University of Potsdam, Germany. From 1998 to 2001 he was a professor of Computer Science at the University of Frankfurt/Main. Since 2001 he has been a professor in the Department of Electrical & Computer Engineering at Technische Universität Kaiserslautern.
Wolfgang Kunz conducts research in the area of System-on-Chip design and verification collaborating with several industrial partners including AbsInt, Alcatel-Lucent, Atrenta, Audi, Bosch, Infineon and OneSpin Solutions.
For his research activities Wolfgang Kunz has received several awards including the Berlin Brandenburg Academy of Science Award and the Award of the German IT Society. Wolfgang Kunz is a Fellow of the IEEE.

Fahim Rahim, Atrenta, France

Title: Efficiently using formal verification techniques to reduce power

Atrenta's Spyglass Power Platform allows the designer to reduce SoC power at the RT level. Spyglass detects fine-grain power saving opportunities to reduce register and memory power consumption and automatically modifies the original RTL to include the newly identified power saving opportunities. In addition, the tool detects potential coarse-grain power optimizations in the SoC and guides the designer in exploring the power consumption of their SoC. Spyglass employs formal verification techniques to explore the design space for power saving opportunities and to prove that the proposed opportunities do not modify the design functionality. The talk will illustrate how formal verification techniques are being used to drive Spyglass technology and outline future challenges in the field of power reduction.

Fahim Rahim is the managing director of Atrenta in Europe. He has over 20 years of experience in the field of Electronic Design Automation and has worked on all aspects of synthesis, ESL design, and formal verification. He holds a PhD in Formal Verification from the University of Paris (UPMC) and is the author of over 10 conference papers and journal articles presented at IEEE conferences. He has previously worked as a senior researcher at the Synopsys Advanced Research Lab, California, and was one of the early members of the Mentor Graphics Advanced Research Lab.

Paper Submission

The workshop specifically solicits contributions with substantial engineering
details that often do not get published but have significant practical impact.

In the system category, we invite papers that have original ideas accompanied
with novel integration techniques, adequate design/implementation details,
important design choices made and explored, and good experimental results.

In the tool category, we invite papers that focus primarily on the engineering
aspects of some known/popular algorithm, with significant emphasis on the
design/implementation details, and various design choices made to advance
current state-of-the-art approaches.

The page limit for submissions in the system category is 10 pages in double
column format and for submissions in the tool category is 8 pages in double
column format.

Papers should be submitted electronically in PDF format via the EasyChair submission page.

Evaluation

To maintain uniformity and fairness in the reviewing process, the program
committee will evaluate the technical contribution of each submission based on
the following guidelines: the paper should provide enough detail for others to
reproduce the results; it should solve a clearly stated problem that is
significant and of wide interest; and it should provide sufficient motivation
for the design choices made. Overall, the paper should also clearly identify
its main contributions.

Publication

All accepted contributions will be included in informal proceedings.
High quality submissions will be considered for a special issue
of journals such as FMSD (Formal Methods in System Design) or
IEEE TC (Transactions on Computers).


10:00-10:30

Coffee Break


System

10:30-11:00

Alberto Griggio and Marco Roveri. Comparing Different Variants of the IC3 Algorithm for Hardware Model Checking. (Paper) (Slides)

IC3 is one of the most successful algorithms
for hardware model checking. Several variants of the
original IC3 algorithm have been proposed in the literature,
each differing in the approach used to implement the
different steps of the algorithm, e.g. pre-processing,
clause generalization, the SAT solver used, and simplification.
In this paper we perform a thorough practical comparison of
the different variants by implementing each of them in the
same tool, tuning the corresponding implementations as much
as possible to achieve the best performance. This enabled
flexible experimentation and yielded new insights, both about
the variants' most important differences and commonalities
and about their performance characteristics. We performed the
experiments using as benchmark set the problems from
the last three hardware model checking competitions. The
analysis helped to identify the best-performing variant.
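The core question every IC3 variant answers is whether any "bad" state is reachable from the initial states of a transition system. As a point of reference only (this is not IC3 and not from the paper; the counter model and names are hypothetical), the same question can be posed with a naive explicit-state reachability check:

```python
from collections import deque

def reachable(init, transitions, bad, max_states=10_000):
    """Naive explicit-state reachability: True iff a bad state is
    reachable from an initial state via the transition relation.
    IC3 answers the same question symbolically, without enumeration."""
    frontier = deque(init)
    seen = set(init)
    while frontier:
        s = frontier.popleft()
        if bad(s):
            return True
        for t in transitions(s):
            if t not in seen and len(seen) < max_states:
                seen.add(t)
                frontier.append(t)
    return False

# Toy model: a counter that wraps at 6, so only states 0..5 are reachable.
init = [0]
step = lambda s: [(s + 1) % 6]
print(reachable(init, step, lambda s: s == 7))  # False: property holds
print(reachable(init, step, lambda s: s == 5))  # True: counterexample exists
```

The explicit approach blows up on real hardware state spaces, which is precisely why symbolic algorithms such as IC3, and careful engineering of their variants, matter.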

This paper addresses the issue of property grouping,
property decomposition, and property coverage in model
checking problems.
Property grouping, i.e., clustering, is a valuable solution
whenever (very) large sets of properties have to be proven
for a given model. As such sets often include many “easy-to-prove”
and/or “similar” properties, property grouping can reduce
overheads and avoid reiteration of common sub-tasks.
On the other end of the spectrum, property decomposition
can be effective whenever a given property turns out (or is
expected) to be “hard-to-prove”. Decomposition of properties into
“sub-properties” follows the divide-and-conquer paradigm, and
it shares with this paradigm its advantages and disadvantages.
Our contribution is to present a heuristic property manager,
running on top of a multi-engine model checking portfolio, with
the specific target of optimizing productivity. We discuss, and
compare, different clustering heuristics, and we exploit circuit decomposition
strategies for property sub-setting. We also consider
the problem of evaluating a coverage measure for properties,
where the “coverage” is used to monitor the “advancement”
during the (partial) verification of a given property. We finally
consider estimates of the bound of a property as a measure of
confidence in BMC runs.
We include preliminary experimental data indicating that the
proposed approaches can provide improvements over state-of-the-art
methods, potentially enhancing productivity in industrial
environments.
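One plausible clustering heuristic of the kind the abstract alludes to (purely illustrative; the paper's actual heuristics are not specified here) is to group properties whose support variables, e.g. cones of influence, overlap strongly:

```python
def cluster_by_support(props, min_overlap=0.5):
    """Greedy clustering sketch: a property joins the first cluster
    whose combined support overlaps it by at least `min_overlap`
    (Jaccard similarity); otherwise it starts a new cluster."""
    clusters = []  # list of (combined_support, [property names])
    for name, support in props.items():
        for i, (csup, members) in enumerate(clusters):
            inter = len(support & csup)
            union = len(support | csup)
            if union and inter / union >= min_overlap:
                clusters[i] = (csup | support, members + [name])
                break
        else:
            clusters.append((set(support), [name]))
    return [members for _, members in clusters]

# Hypothetical properties with their support-variable sets.
props = {
    "p1": {"a", "b", "c"},
    "p2": {"a", "b"},
    "p3": {"x", "y"},
}
print(cluster_by_support(props))  # [['p1', 'p2'], ['p3']]
```

Properties sharing most of their logic end up in one group, so a model checker can reuse reachability information across them instead of restarting from scratch.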

One of the main challenges in software development
for safety-critical Cyber-Physical Systems (CPS)
is in achieving a certain level of confidence in the system
correctness and robustness. In order to perform formal
monitoring, testing and verification of CPS, the fully modular
tool S-TALIRO is presented. The tool is designed for
seamless integration with the Model Based Design (MBD)
process in Matlab/Simulink™. S-TALIRO performs robustness
guided Metric Temporal Logic (MTL) testing and
monitoring. Since writing specifications in MTL is an
error-prone task that requires expert temporal logic users, a
graphical formalism for the development and visualization
of specifications is presented. The article provides an
up-to-date overview of S-TALIRO. It includes a discussion
on the benefits of the fully modular architecture and the
challenges encountered in its development.
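The idea behind robustness-guided testing is that a trace gets a real-valued margin rather than a pass/fail verdict. A minimal sketch for the simplest case, the invariant "always (x < c)", where robustness is the worst-case margin over the trace (standard MTL robustness semantics; the trace values below are made up for illustration):

```python
def robustness_always_lt(trace, c):
    """Robustness of 'always (x < c)' over a sampled trace:
    positive  => property satisfied, with that much margin to spare;
    negative  => property violated by that amount.
    The value is the minimum margin (c - x) over all samples."""
    return min(c - x for x in trace)

safe = [0.25, 0.5, 0.75]    # stays below the bound c = 1.0
unsafe = [0.25, 1.25, 0.75]  # crosses the bound once
print(robustness_always_lt(safe, 1.0))    # 0.25
print(robustness_always_lt(unsafe, 1.0))  # -0.25
```

A test generator can then minimize this value over input stimuli: driving robustness toward (and below) zero steers the search toward falsifying traces, which is what makes the metric useful for guided testing.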

In modern chip architectures, the increase in parallelisation
brings about highly complex on-chip communication
fabrics. We present WickedXmas, a tool that facilitates the design
and formal verification of such interconnects. The tool is based
on the language xMAS, which is a high level design language
for communication fabrics, originally proposed by Intel. The
use of xMAS ensures that many common modelling errors such
as unintended loss of data or dangling wires are prevented
by construction. Therefore, the major challenge in verifying
xMAS models is establishing deadlock freedom. WickedXmas
can automatically detect deadlocks or prove their absence. If
a deadlock is found, it is presented to the user for further
analysis. Experimental evaluation on a range of interconnects
shows good performance and scalability of WickedXmas in
contrast to verification from scratch, or using existing model
checking techniques. Using WickedXmas, a user can draw a
communication fabric and formally verify it automatically.
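In a heavily simplified form (this is not the xMAS analysis, just a classical illustration with hypothetical component names), the deadlock-freedom question reduces to asking whether the fabric's wait-for dependencies contain a cycle:

```python
def has_deadlock(waits_for):
    """Cycle detection in a wait-for graph (node -> nodes it waits on)
    via depth-first search with three colors. A back edge to a GRAY
    node means a circular dependency, i.e. a potential deadlock."""
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {n: WHITE for n in waits_for}

    def dfs(n):
        color[n] = GRAY
        for m in waits_for[n]:
            if color[m] == GRAY:
                return True  # back edge: cycle found
            if color[m] == WHITE and dfs(m):
                return True
        color[n] = BLACK
        return False

    return any(color[n] == WHITE and dfs(n) for n in waits_for)

# Queue A waits on B and B waits on A: circular dependency.
print(has_deadlock({"A": ["B"], "B": ["A"]}))           # True
print(has_deadlock({"A": ["B"], "B": ["C"], "C": []}))  # False
```

Real xMAS verification is harder than this sketch because dependencies are conditional on channel states, but the cycle intuition is the starting point for understanding what the tool proves.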


The memory subsystem of a microprocessor is
responsible for scheduling memory accesses as efficiently as
possible, hiding latency costs from the Data Processing Unit
and, ideally, minimizing power costs. For ARM processors,
requirements on ordering and completion of memory accesses
from the ARM Architecture and relevant bus protocols specify
an envelope of acceptable behavior that must be maintained by
the schedule. Consequently, any given sequence of instructions
can be satisfied by a range of different schedules. Verifying the
correctness of the memory subsystem is therefore a complex task,
often on the critical path to release.
Inspired by earlier work on specifying end-to-end properties,
we developed a language of Directed Observer Graphs
(DOGReL) capable of expressing implementation independent
properties at a level of abstraction between architecture and
micro-architecture that specify the behavior of the memory
subsystem. These properties are then translated into System
Verilog monitors which can be applied for bug-hunting at the
RTL level using commercial Model Checking tools. During a
recent micro-controller design project at ARM these properties
proved highly valuable in detecting bugs earlier in the design
cycle than simulation. The process of writing an unambiguous
specification also exposed early design flaws and provided clearer
documentation of the required behaviors for engineers.
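The observer-monitor style the abstract describes can be illustrated with a toy example (hypothetical protocol rule and event encoding, not the actual DOGReL properties): a monitor tracking outstanding transactions and flagging ordering violations in an event stream.

```python
def monitor_ordering(events):
    """Toy observer: every ('req', id) must receive a ('resp', id)
    before another 'req' with the same id is issued. Returns the
    index of the first violating event, or None if the trace is clean."""
    pending = set()  # transaction ids with an outstanding request
    for i, (kind, tid) in enumerate(events):
        if kind == "req":
            if tid in pending:
                return i  # second request while the first is outstanding
            pending.add(tid)
        elif kind == "resp":
            if tid not in pending:
                return i  # response without a matching request
            pending.discard(tid)
    return None

ok = [("req", 1), ("req", 2), ("resp", 1), ("resp", 2)]
bad = [("req", 1), ("req", 1)]
print(monitor_ordering(ok))   # None
print(monitor_ordering(bad))  # 1
```

A SystemVerilog monitor generated from a DOGReL-style specification plays the same role at RTL: it observes the interface traffic and raises an assertion at the first event that falls outside the specified envelope.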

Verification techniques for analyzing a design or its
requirements at early development stages have been used since the
beginning of the model-based design paradigm. Most of these
analyses are focused on a single purpose, like safety, real-time,
or geometry. This separation of concerns leads to the introduction
of so-called aspects that describe these properties of a system.
Nevertheless, these aspects are not necessarily independent. In
this paper we use the fault tolerance time interval, the maximum
time to recover from faults, as an example to state the need for a
multi-aspect analysis. We present how a virtual integration test
can be performed covering safety and real-time properties to
prove the correct refinement of requirements. Our requirements
formalization approach using contracts, a pattern language and
the internal representation as timed automata are described. The
presented technique is applied to an automotive lane-keeping
support system.
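At its simplest, the fault tolerance time interval check amounts to a budget argument: the worst-case end-to-end reaction chain must fit within the FTTI. A minimal sketch (the stage names and latencies are invented for illustration; the paper's analysis uses contracts and timed automata, not this arithmetic shortcut):

```python
def ftti_ok(stage_latencies_ms, ftti_ms):
    """Check that the worst-case end-to-end reaction time, taken as
    the sum of worst-case stage latencies, stays within the fault
    tolerance time interval (FTTI)."""
    return sum(stage_latencies_ms) <= ftti_ms

# Hypothetical chain: sensor acquisition -> fault detection -> actuation.
print(ftti_ok([10, 25, 15], 100))  # True: 50 ms within a 100 ms FTTI
print(ftti_ok([40, 45, 30], 100))  # False: 115 ms exceeds the budget
```

The timed-automata formulation in the paper is needed precisely because real chains are not a simple sum: stages overlap, trigger conditionally, and interact, which a multi-aspect analysis must capture.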


16:00-16:15

Coffee Break

16:15

MEMOCODE final keynote by Joseph Sifakis: A framework for modeling architectures and their properties

Architectures are common means for organizing coordination between components in order to build complex systems and to make them manageable. Despite the progress of the state of the art over the past decades, there are still a lot of foundational issues that remain unsolved. In this talk we present a general framework for modeling architectures and their properties.
The framework considers that systems are obtained as the composition of atomic components by using glue operators. These are composition operators on components. Their meaning can be specified by operational semantics rules giving the global behavior of a composite component as a function of the behaviors of its constituents. They implicitly define connectors between components, with each connector involving a multiparty interaction and an associated atomic computation modifying the state of the composed components. We present results characterizing the expressiveness of glue operators and show that universal expressiveness can be achieved by using a limited and well-defined set of types of connectors. We also propose a powerful notation for connectors encompassing both static and dynamic coordination.
Architectures can be considered as generic operators parameterized by types of coordinator components and glue operators. Applied to a set of components they enforce a characteristic property on the resulting global behavior. Coordinators are needed only when the types of glue operators are not expressive enough for achieving the desired coordination. The proposed formalization of architectures can be used for building component-based systems that are correct by construction. We present composability results, which guarantee preservation of the characteristic behavioral properties of a set of architectures when they are jointly applied to the same operands.
We present logics for the description of architectural properties. Formulas of these logics characterize architectural styles as the description of component types from which a system is built, and their interconnection. The presented logics include a propositional logic and two higher-order extensions of this logic. We provide an axiomatization of the propositional logic as well as results on the expressiveness of the higher-order logics and their effective application for the description of architectural styles.