Bachelor Thesis BCLR-2018-98

Bibliography: Zorn, Christoph: Concern-driven reporting of declarative performance analysis results using natural language and visualization.
University of Stuttgart, Faculty of Computer Science, Electrical Engineering, and Information Technology, Bachelor Thesis No. 98 (2018).
107 pages, English.
Abstract

Recent trends in the development of software architectures show an increasing interest in the quality assurance of software solutions. To sustain the value of these programs, continuous testing and performance analysis have to be performed in every layer of the application. This process is usually carried out by performance analysts, who are aware of pitfalls and unsuspected problems. Moreover, software development is shifting towards the culture of DevOps, the practice of managing all steps of software construction in one department. This requires developers to be flexible and to perform a diverse range of tasks. In the case of performance analysis, a developer has to take the place of a performance analyst, which requires profound expertise in the field of software performance evaluation. With an ever-growing number of analysis technologies, it is impossible to be an expert in every analysis solution. This suggests that the process of performance analysis has to be altered and simplified. On the one hand, people outside the performance analysis community have to be provided with tools for easier access; on the other hand, such tools should also support advanced expert configurations. Declarative Performance Engineering (DPE) aims to abstract the definition of performance-relevant questions from the execution of the performance evaluation. The idea behind DPE is to hide complex configuration from inexperienced users while providing existing use cases for both simple and advanced performance analysis. This thesis aims to provide users of performance evaluation with such a declarative approach. Users can formulate questions in natural language and receive a range of evaluation options derived from the format of the question. The results of these evaluation methods are processed independently and presented to the user based on the question asked. This is done with declarative methods and advanced visualization technologies.
To explore the possibilities of declarative performance reporting, a prototype is created and evaluated by two user groups in a user study. Experts in performance analysis show what level of detail is necessary to make accurate predictions, while inexperienced users with no background in performance analysis report which level of support is needed to perform basic evaluations. The outcome of the study is examined for differences in user behavior between the two groups, reflecting on the possible future use of the prototype. The collected data are discussed and set into context with existing projects in the area of declarative performance analysis.

Full text and other links: Volltext (full text)
Department(s): University of Stuttgart, Institute of Software Technology, Software Reliability and Security
Supervisor(s): van Hoorn, Dr. André; Okanovic, Dr. Dušan; Ferme, Vincenzo; Beck, Jun.-Prof. Fabian
Entry date: May 20, 2019