Entries for month: November 2013

To deliver enhanced integrated warfighting capability at lower cost across the enterprise and over the lifecycle, the Department of Defense (DoD) must move away from stove-piped solutions and toward a limited number of technical reference frameworks based on reusable hardware and software components and services. There have been previous efforts in this direction, but in an era of sequestration and austerity, the DoD has reinvigorated its efforts to identify effective methods of creating more affordable acquisition choices and reducing the cycle time for initial acquisition and new technology insertion. This blog posting is part of an ongoing series on how acquisition professionals and system integrators can apply Open Systems Architecture (OSA) practices to decompose large monolithic business and technical designs into manageable, capability-oriented frameworks that can integrate innovation more rapidly and lower total ownership costs. The focus of this posting is on the evolution of DoD combat systems from ad hoc stovepipes to more modular and layered architectures.

The verification and validation of requirements are a critical part of systems and software engineering. The importance of verification and validation (especially testing) is a major reason that the traditional waterfall development cycle was modified to create the V model, which links early development activities to their corresponding testing activities later in the lifecycle. This blog post introduces three variants of the V model of system or software development that make it more useful to testers, quality engineers, and other stakeholders interested in using testing as a verification and validation method.
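The V model's defining feature is that pairing of left-side development activities with right-side test activities. As a rough illustration (the phase names below are the generic ones commonly used to describe the V model, not terms taken from this post), the pairing can be expressed as a simple lookup:

```python
# Generic V-model pairing: each development activity on the left side
# of the V is verified by a corresponding test activity on the right.
V_MODEL_PAIRS = {
    "requirements analysis": "acceptance testing",
    "system design": "system testing",
    "architecture design": "integration testing",
    "module design": "unit testing",
}

def verification_activity(dev_phase: str) -> str:
    """Return the test activity that verifies a given development phase."""
    return V_MODEL_PAIRS[dev_phase]

print(verification_activity("system design"))  # system testing
```

The point of the model is exactly this traceability: every early design artifact has a named later activity responsible for verifying it.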

In early 2012, a backdoor Trojan malware named Flame was discovered in the wild. When fully deployed, Flame proved very hard for malware researchers to analyze. In December of that year, Wired magazine reported that before Flame had been unleashed, samples of the malware had been lurking, undiscovered, in repositories for at least two years. As Wired also reported, this was not an isolated event. Every day, major anti-virus companies and research organizations are inundated with new malware samples. Although estimates vary, an article in the October 2013 issue of IEEE Spectrum put the number of new malware strains released each day at approximately 150,000. Analysts lack the manpower to manually examine the sheer volume of new samples that arrive daily in their queues. They instead need an approach for triaging samples so they can assign priority to the most malicious binary files. This blog post describes research I am conducting with fellow researchers at the Carnegie Mellon University (CMU) Software Engineering Institute (SEI) and CMU's Robotics Institute, aimed at developing an approach to prioritizing malware samples in an analyst's queue (allowing them to home in on the most destructive malware first) based on each file's execution behavior.
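To make the idea of behavior-based prioritization concrete, here is a minimal sketch of such a triage queue. The feature names, weights, and scoring function are purely illustrative assumptions for this example; they are not the scoring model developed by the SEI/CMU researchers.

```python
import heapq

# Hypothetical execution-behavior features and weights (illustrative only).
WEIGHTS = {
    "writes_system_files": 3.0,
    "modifies_registry_run_key": 2.0,
    "opens_network_socket": 1.5,
}

def maliciousness_score(features: dict) -> float:
    """Weighted sum of the execution behaviors observed for one sample."""
    return sum(WEIGHTS.get(name, 0.0) for name, seen in features.items() if seen)

def prioritize(samples: dict) -> list:
    """Order sample names so the highest-scoring (most suspicious) come first.

    Uses a min-heap on negated scores so heappop yields the worst offender.
    """
    heap = [(-maliciousness_score(f), name) for name, f in samples.items()]
    heapq.heapify(heap)
    return [heapq.heappop(heap)[1] for _ in range(len(heap))]

queue = {
    "sample_a": {"writes_system_files": True, "opens_network_socket": True},
    "sample_b": {"opens_network_socket": True},
}
print(prioritize(queue))  # ['sample_a', 'sample_b']
```

Under this kind of scheme, an analyst works the queue from the top, spending scarce manual effort on the samples whose observed runtime behavior looks most destructive.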