Research Projects

The following research projects are currently being carried out by the HCI group.

Designing and Evaluating Scalable Behavioral Biometrics Systems for Pervasive Computing Environments

Funded by DFG (Deutsche Forschungsgemeinschaft)

Duration: 3 years

Knowledge-based authentication mechanisms that require users to remember a login and password are still among the most popular means of authentication. The average user is expected to access sensitive information through about 200 different password-protected accounts in 2020. The caveat of such mechanisms is that they require people to use more passwords than they can remember and that entering passwords takes a significant amount of time. The number of authentications will further increase as more and more pervasive computing devices are used in smart public spaces. These devices will no longer be recognizable as computing devices and will predominantly use interaction techniques that are ill-suited for entering knowledge-based passwords (e.g., gestures, speech). Examples include personal devices (e.g., smart glasses, smartphones, HMDs, smart clothes, wearables) as well as devices in the environment (interactive displays, pressure-sensitive floors, etc.), which will jointly create a pervasive computing environment.

In recent years, behavioral biometrics, that is, the ability to identify users implicitly from their behavior, has received considerable attention in the research community. This approach does not require users to remember a secret; instead, authentication can seamlessly slide into the background. In particular, researchers have shown that numerous behavioral traits (gait, typing behavior, touch targeting, gaze) can be used for identification. At the same time, behavioral biometrics has so far mainly been investigated in the lab with single users, since assessing the biometric value of different features requires precise measurements. Thus, it remains unclear how these approaches scale to the novel challenges of pervasive computing environments.

In this project, we examine how pervasive computing environments can leverage behavioral biometrics for identifying and authenticating users. The main challenge the project addresses is how behavioral biometric approaches scale to different pervasive computing environments containing multiple users with changing behavior, different physicalities, and changing sensing and interaction capabilities. From this objective, several questions emerge: (1) How is users’ behavior influenced by other people in the vicinity, by the characteristics of a space, and by the novel interaction techniques emerging as more and more computers become part of our everyday life? (2) How does this influence the way in which we design and develop behavioral biometrics? (3) What does this mean for behavioral-biometrics-based authentication concepts? We envision this project enabling a significant leap towards behavioral biometrics becoming a powerful means of identifying and authenticating users in future pervasive computing environments, combining high usability with strong security. The project’s outcomes are valuable beyond security, enabling novel user interfaces that adapt to users’ behavior.
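To make the idea of identification from behavioral traits more concrete, the following is a minimal, hypothetical sketch of a keystroke-dynamics pipeline in Python. The feature choice, the scikit-learn classifier, and the rejection threshold are illustrative assumptions, not the project's actual method.

```python
# Hypothetical sketch: identifying users from keystroke-timing behavior.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def extract_features(key_events):
    """key_events: list of (key, press_time, release_time) tuples (two or more events assumed).
    Returns a fixed-length vector of dwell- and flight-time statistics."""
    dwell = [r - p for _, p, r in key_events]                      # hold duration per key
    flight = [key_events[i + 1][1] - key_events[i][2]              # gap between consecutive keys
              for i in range(len(key_events) - 1)]
    return np.array([np.mean(dwell), np.std(dwell),
                     np.mean(flight), np.std(flight)])

def enroll(samples_by_user):
    """Enrollment: each user provides several labeled typing samples."""
    X = [extract_features(s) for _, samples in samples_by_user.items() for s in samples]
    y = [user for user, samples in samples_by_user.items() for _ in samples]
    return RandomForestClassifier(n_estimators=100).fit(X, y)

def identify(clf, key_events, threshold=0.6):
    """Identification: pick the most probable enrolled user, or reject if uncertain."""
    probs = clf.predict_proba([extract_features(key_events)])[0]
    best = int(np.argmax(probs))
    if probs[best] < threshold:
        return None, probs[best]        # unknown or ambiguous behavior
    return clf.classes_[best], probs[best]
```

In a pervasive setting, the open question raised above is precisely how such a classifier behaves when behavior changes with context, co-located people, and interaction technique.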

Functional Biometrics

Funded by DFG (Deutsche Forschungsgemeinschaft)

Duration: 3 years

The number of systems requiring user authentication is increasing every year. Classical authentication approaches such as Personal Identification Numbers (PINs) and passwords overwhelm users, since they have to remember dozens of them. Research has also shown that user-generated passwords are predictable. To counteract these weaknesses, devices such as smartphones and laptops are increasingly equipped with biometric authentication. The most common biometric approaches are fingerprint and face recognition, which use the user's body as a physical token to identify the user. While these approaches currently provide a sufficient level of security, they have two inherent drawbacks: (1) the user cannot change biometric passwords, and (2) the user leaves them essentially everywhere (e.g., fingerprints left by touching the environment, video surveillance cameras recording the user's head and body).

To tackle these challenges, we introduce a novel class of biometrics, called Functional Biometrics. In contrast to physical-token-based and behavior-based biometrics, Functional Biometrics exploits the user's body as a function. Thus, this novel class of biometrics relies not only on the body itself but also on a specific input signal generated by the system the user wants to authenticate to. This signal (e.g., an audio, electrical, or haptic stimulus) is modified by the user's body, which in turn generates a characteristic response through the body's unique reflection of the signal (e.g., a propagated audio response, a muscle reaction). This characteristic response is used as a biometric password: it is measured by the system and compared to a pre-stored response (i.e., the password) to authenticate the user.

Functional Biometrics combines the advantages of common knowledge-based authentication approaches such as alphanumeric passwords (i.e., changeable, multiple passwords per user) with the advantages of biometric authentication approaches such as fingerprints (i.e., no cognitive load and no memorization required). In this project, we will systematically explore the design space of Functional Biometrics. We will investigate the feasibility of different sensing and actuation technologies, develop and adapt models and algorithms for automatically authenticating users based on the measured body reflection, and develop research probes and an authentication framework to synthesize the gained knowledge.
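As a minimal illustration of the stimulus–response loop described above, the sketch below assumes two hypothetical callbacks (emit_stimulus, record_response) and a simple correlation-based matcher; the actual signals, models, and decision thresholds are exactly what the project will investigate.

```python
# Hypothetical sketch of a functional-biometrics loop: emit a known stimulus,
# record the body-modified response, and compare it to the enrolled template.
import numpy as np

def normalized_correlation(a, b):
    """Similarity between two equal-length signals, in roughly [-1, 1]."""
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return float(np.dot(a, b) / len(a))

def enroll(emit_stimulus, record_response, n_samples=5):
    """Store the average body response to the stimulus as the template
    (responses are assumed to be equal-length numpy arrays)."""
    responses = []
    for _ in range(n_samples):
        emit_stimulus()                       # e.g., play an audio chirp or apply a haptic pulse
        responses.append(record_response())   # e.g., propagated audio / measured muscle reaction
    return np.mean(responses, axis=0)

def authenticate(template, emit_stimulus, record_response, threshold=0.85):
    """Accept the user if the fresh response matches the stored template."""
    emit_stimulus()
    return normalized_correlation(record_response(), template) >= threshold
```

Unlike a fingerprint, the template here can be changed by changing the stimulus, which is the core advantage the project aims to exploit.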

IoTAssist

Funded by the BMBF (German Federal Ministry of Education and Research)

Duration: 03/2020–02/2023

The fundamental goal of the project is to develop a platform that enables interoperability between devices and services in the IoT and wearable domain and, building on this, makes intelligent assistance systems easy and intuitive to implement. The focus is placed in particular on integrating existing device ecosystems in everyday and domestic environments. A particular opportunity lies in integrating the rapidly growing number of mobile devices and wearables with existing ecosystems (e.g., Fitbit, Philips Hue). Since mobile devices (e.g., smartphones) and wearables (e.g., activity trackers, smartwatches) are often worn on the body throughout the day, they can act as intelligent, context-sensitive interfaces between people and their networked environments. Building on these interfaces, individual assistance systems can be created that support users in coping with everyday tasks. Within this project, we focus on individualized assistance systems for health promotion that combine IoT devices and wearables.

For a future in which hundreds of intelligent, networked devices and sensors are embedded in our everyday environment, the question arises of how to program and connect these devices effectively in order to offer users the best possible intelligent assistance systems. To address this question, the IoTAssist platform is being developed within this project. A technology abstraction layer, combined with a cross-platform, web-based integrated development environment (IDE), can bring together the functionalities and capabilities of individual devices and orchestrate intelligent services. The individual devices in an environment expose their capabilities (sensors and actuators); it is this knowledge that allows an intelligent environment to combine the capabilities of the networked devices and turn them into "team players". It is particularly important that end users can use the platform to browse the capabilities and services of their devices and combine them intuitively into intelligent assistance systems without knowledge of a programming language. To this end, a building-block system is being developed in which the available subsystems and services are provided as blocks. When the user connects individual blocks, the platform automatically generates the corresponding code internally. To enable such behavior, standardized device profiles are being developed and implemented, allowing available services to be communicated and new devices to be integrated into an existing system in a standardized way. To give users the ability to control and adapt the behavior of their devices, appropriate interaction mechanisms and visualizations have to be developed that incorporate several interaction modalities in parallel (e.g., gestures, speech input, visual interfaces). Thanks to the technology abstraction layer, a device can be replaced by one from a different vendor or one using a different communication technology; as long as the devices' capabilities are identical, the service can continue to perform its task.
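To illustrate the idea of standardized device profiles and user-composed building blocks, here is a small hypothetical sketch in Python; the profile fields, capability names, and example devices are illustrative assumptions rather than the actual IoTAssist profile format.

```python
# Hypothetical sketch: a vendor-independent device profile plus one
# "building block" rule connecting a wearable sensor to a lamp actuator.
from dataclasses import dataclass, field
from typing import Callable, Dict

@dataclass
class DeviceProfile:
    device_id: str
    vendor: str
    sensors: Dict[str, Callable[[], float]] = field(default_factory=dict)        # name -> read()
    actuators: Dict[str, Callable[[float], None]] = field(default_factory=dict)  # name -> set(value)

@dataclass
class Rule:
    """One block connection: when a sensor reading satisfies a condition,
    drive an actuator. Vendor-independent as long as capability names match."""
    source: DeviceProfile
    sensor: str
    condition: Callable[[float], bool]
    target: DeviceProfile
    actuator: str
    value: float

    def evaluate(self):
        if self.condition(self.source.sensors[self.sensor]()):
            self.target.actuators[self.actuator](self.value)

# Example (illustrative devices): dim a lamp as a nudge when the step count is low.
tracker = DeviceProfile("tracker-1", "fitbit", sensors={"steps_last_hour": lambda: 42.0})
lamp = DeviceProfile("lamp-1", "hue", actuators={"brightness": lambda v: print(f"brightness={v}")})
Rule(tracker, "steps_last_hour", lambda s: s < 100, lamp, "brightness", 0.3).evaluate()
```

Swapping the tracker for another vendor's device would only require a profile exposing the same capability name, which is the role the technology abstraction layer plays in the platform.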


Computing for Intercultural Competence

ComIC

The ComIC (Computing for Intercultural Competence) project promotes the intercultural dialog between Islamic and European countries, using "computing" and "research" as the common language to tackle emerging challenges. Intercultural dialog refers to proactive awareness of, and preparedness to integrate within, other societies in social and professional respects. This is particularly important when considering the IT industries in Western and Islamic countries. In Western countries, there is a gap between the number of available computing jobs and the number of qualified computing professionals: there are more computer-science-related job openings than computer science students to fill them. In Egypt, on the other hand, there is a misconception that there are no jobs for computer science graduates.

Motivated by this misconception, and to create more awareness and a better understanding of the opportunities between both worlds, we propose the ComIC project. The project consists of 12 activities that will foster cultural dialog, modernize teaching, empower young and, in particular, female researchers, create research networks between Germany and Egypt, and support cultural development. Within these activities, we will conduct workshops and hackathons with participants from both institutions. We particularly focus on using computing technology to bridge the gap between both worlds and create common ground. We strongly believe that computing technology has the potential to ease the mutual understanding of cultural differences. We will use computing technology (e.g., virtual reality or serious games) and research methodologies (e.g., from Human-Computer Interaction) to address relevant societal challenges the Islamic world is currently facing. We will offer young researchers and teaching assistants from the GUC the opportunity to gain experience at the UDE and vice versa. Moreover, the project expands the strong research network of the two universities through their regional partners. Furthermore, it serves as a platform for nourishing core scientific values and ethics, such as promoting gender equality.

Funded by the DAAD (German Academic Exchange Service)