Events for Knowledge Transfer

Active Contribution

Abstract

Modern software development practices, such as DevOps and Microservice architectures, decrease the time from development to production. Integrating performance testing into this context is important, and several techniques for continuously verifying software performance have already been introduced. However, these techniques usually provide only the specification and automation of basic performance tests, and do not leverage the software performance knowledge generated in continuous lifecycles to speed up performance testing. During my PhD, I have been working on automated benchmarking of Workflow Management Systems, developing a declarative domain-specific language (DSL) in which users can specify their performance testing goals, as well as a model-driven framework for automated performance testing of such systems. In this proposal, we plan to extend both the DSL and the framework to enable automated benchmarking of Microservices. We intend to enhance the expressiveness of the DSL by adding new performance goals suited to the new domain. These goals are to be automated by the framework, which we are going to integrate with the DevOps continuous development lifecycle. By leveraging this integration, we also plan to enable the framework to collect, analyze, and model the relevant lifecycle data, so that it can be used to speed up and enhance performance testing. To assess to what extent the framework can perform automated goal-driven performance testing for Microservices in DevOps, we will evaluate it on real-world use cases provided by our hosts at the University of Stuttgart.