Bibliography
Hashad, Nabila: Context-Based Load Shedding in Complex Event Processing. University of Stuttgart, Faculty of Computer Science, Electrical Engineering, and Information Technology, Master Thesis No. 74 (2020). 63 pages, English.
Abstract
Complex event processing (CEP) systems process event streams in real time. These systems need to maintain a certain latency bound during event processing, especially at times of sudden increases in input rates. Load shedding (LS) has proven popular among researchers as a method of relieving the system from detected overload. While dropping events from the input stream reduces the computational load on the operator, it can negatively affect the system's ability to detect all complex events and, in turn, degrades the Quality of Results (QoR). In this thesis, we propose an LS approach that uses an event's context to indicate its utility; in overload situations, the events with the lowest utilities are dropped. To obtain event utilities during LS with minimal overhead, we store them in a data structure that enables the load shedder to retrieve them efficiently. However, a large event context can considerably enlarge this data structure, possibly beyond the available memory. Therefore, we propose two methods to summarize event contexts and minimize the storage overhead required to store event utilities. The first method (zSPICE) uses the Zobrist algorithm to encode the context of an event and map it to its utility value. The second method (aeSPICE) uses autoencoder models, employing their encoders to produce a substantially smaller encoded representation of the context, which is then used to index the event's utility. Our performance evaluation shows that our LS approach with the zSPICE method outperforms most comparable state-of-the-art LS approaches.
Keywords: Context-based load shedding, Load shedding, Zobrist-key, Complex event processing, Dimensionality reduction, Undercomplete autoencoder, Variational autoencoder, Context modeling, Event’s context, Quality of results
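To make the zSPICE idea concrete, the sketch below is a minimal Python illustration (our own, not the thesis implementation) of indexing event utilities by a Zobrist key of the event's context. It assumes a context is modeled as a set of (attribute, value) pairs, each assigned a random 64-bit code; XOR-ing these codes yields a fixed-size key under which the utility is stored. The class name, the 0.5 dropping threshold, and the example attributes are hypothetical.

```python
import random
from collections import defaultdict

class ZobristUtilityTable:
    """Illustrative sketch: maps an event's context (a set of
    (attribute, value) pairs) to a utility value via a Zobrist key,
    so the load shedder can look up utilities in a single access."""

    def __init__(self, seed=42):
        self._rng = random.Random(seed)
        # Lazily assign a random 64-bit code to every (attribute, value) pair.
        self._codes = defaultdict(lambda: self._rng.getrandbits(64))
        self._utilities = {}  # Zobrist key -> utility estimate

    def _key(self, context):
        # XOR the codes of all pairs in the context; the key size stays
        # fixed no matter how many attributes the context contains.
        key = 0
        for pair in context:
            key ^= self._codes[pair]
        return key

    def update(self, context, utility):
        self._utilities[self._key(context)] = utility

    def lookup(self, context, default=0.0):
        return self._utilities.get(self._key(context), default)


# Hypothetical usage during shedding: drop events whose utility falls
# below a threshold derived from the amount of load to shed.
table = ZobristUtilityTable()
table.update({("sensor", "s1"), ("state", "overload")}, utility=0.9)
table.update({("sensor", "s2"), ("state", "idle")}, utility=0.1)

event_context = {("sensor", "s2"), ("state", "idle")}
if table.lookup(event_context) < 0.5:  # threshold chosen for illustration only
    pass  # shed (drop) this low-utility event
```

The appeal of the Zobrist key in this setting is that the lookup cost and the key size stay constant regardless of how many attributes make up the context, which keeps the per-event shedding decision cheap.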