In the Internet of Things, physical objects – the things – are connected through a network and actively exchange information about themselves and their surroundings. This paradigm enables so-called smart environments, in which numerous context-aware applications can be deployed. Such applications can have a significant impact on everyday life (e.g., smart homes, smart cities). Context-awareness allows applications to recognize situations of interest and react to them appropriately when necessary. However, deriving higher-level knowledge from the large amounts of raw, low-level sensor data is a challenging task. In recent years, Complex Event Processing (CEP) has emerged as an important trend for applications that recognize situations in real or near-real time. CEP can be employed to process sensor data in a continuous and timely fashion, in order to recognize situations as soon as they occur. Within the scope of this master's thesis, a Situation Recognition System based on sensor data is developed using a CEP engine. This system can monitor many situations in parallel based on the perceived surroundings of things that send context information, i.e., sensor values, to the system through the Internet. The recognition of situations is based on a non-executable model called Situation Template, which offers a means to easily describe the conditions under which situations occur. Furthermore, this thesis presents a sensor push approach so that sensor data is available to the Situation Recognition System as soon as possible. Moreover, this work analyzes three different CEP engines and motivates the choice of an engine that matches the expressiveness of Situation Templates. To execute the situation recognition using CEP, this work implements mappings from Situation Templates onto executable representations, i.e., CEP queries, which are deployed into the chosen CEP engine.
Finally, a prototypical implementation of the Situation Recognition System is presented and evaluated via runtime measurements.