IMDEA Software



Thursday, March 12, 2015

10:45 AM, Lecture hall 1, level B

Ben Livshits, Research Scientist, Microsoft Research

PrePose: Security and Privacy for Gesture-Based Programming

Abstract:

With the rise of sensors such as the Microsoft Kinect, the Leap Motion, and hand-motion sensors in phones (e.g., the Samsung Galaxy S5), gesture-based interfaces have become practical. Unfortunately, today, to recognize such gestures, applications must have access to depth and video of the user, exposing sensitive data about the user and her environment. Beyond these privacy concerns, sensor-based applications also face security threats, such as multiple applications registering the same gesture, leading to a conflict (akin to clickjacking on the web).

We address these security and privacy threats with PrePose, a novel domain-specific language (DSL) for easily building gesture recognizers, combined with a system architecture that protects user privacy against untrusted applications. We run PrePose code in a trusted core and return only specific gesture events to applications. PrePose is specifically designed to enable precise and sound static analysis using SMT solvers, allowing the system to check security and privacy properties before running a gesture recognizer. We demonstrate that PrePose is expressive by creating a total of 28 gestures in three representative domains: physical therapy, tai chi, and ballet. We further show that runtime gesture matching in PrePose is fast enough to introduce no noticeable lag, measured on traces obtained from Microsoft Kinect runs.
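The trusted-core matching described above can be illustrated with a small sketch. This is not the PrePose implementation; the class, the pose predicates, and the joint-coordinate frames are all hypothetical. The idea is that a gesture is an ordered sequence of pose conditions, the matcher advances one step each time the current condition holds on an incoming skeleton frame, and only the final completion event ever leaves the trusted core:

```python
# Illustrative sketch (hypothetical names, not the PrePose API):
# a gesture is an ordered sequence of pose predicates over skeleton
# frames; the matcher advances one step each time the current
# predicate is satisfied and reports completion at the last step.

class GestureMatcher:
    def __init__(self, name, steps):
        self.name = name          # gesture name, e.g. "raise_right_hand"
        self.steps = steps        # list of predicates: frame -> bool
        self.position = 0         # index of the step we are waiting for

    def on_frame(self, frame):
        """Feed one skeleton frame; return True only when the gesture completes."""
        if self.steps[self.position](frame):
            self.position += 1
            if self.position == len(self.steps):
                self.position = 0
                return True       # only this event is visible to applications
        return False

# Toy example: frames map joint names to (x, y) positions, y grows upward.
raise_hand = GestureMatcher("raise_right_hand", [
    lambda f: f["right_hand"][1] < f["head"][1],   # hand below head
    lambda f: f["right_hand"][1] > f["head"][1],   # hand above head
])

trace = [
    {"right_hand": (0.3, 0.8), "head": (0.0, 1.6)},
    {"right_hand": (0.3, 1.9), "head": (0.0, 1.6)},
]
events = [raise_hand.on_frame(f) for f in trace]   # [False, True]
```

Because the untrusted application only ever receives the boolean gesture events, the raw depth and video data never cross the trust boundary.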

To show that gesture checking at the time of submission to a gesture store is fast, we developed four Z3-based static analyses that test for basic gesture safety and internal validity, ensure that so-called protected gestures are not overridden, and check for inter-gesture conflicts. Our static analysis scales well in practice: safety checking takes under 0.5 seconds per gesture; average validity-checking time is only 188 ms; and in 97% of cases, conflict detection completes in under 5 seconds, with only one query taking longer than 15 seconds.
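The conflict check above asks a satisfiability question: do two gestures admit a common pose? The real analysis encodes this for the Z3 SMT solver; the following sketch replaces that with a deliberately simplified model, where each gesture's constraints are hypothetical intervals on joint angles and a conflict exists exactly when the intervals intersect on every shared joint:

```python
# Simplified stand-in for the SMT-based conflict check (the actual
# analysis uses Z3; the joint names and angle intervals here are
# illustrative). Two gestures conflict if the conjunction of their
# constraints is satisfiable, i.e. some pose satisfies both.

def intervals_intersect(a, b):
    """True if closed intervals a = (lo, hi) and b = (lo, hi) overlap."""
    return max(a[0], b[0]) <= min(a[1], b[1])

def gestures_conflict(g1, g2):
    """g1, g2: dicts mapping joint name -> (lo, hi) angle interval.
    Conflict means intervals overlap on every joint constrained by both."""
    for joint in g1.keys() & g2.keys():
        if not intervals_intersect(g1[joint], g2[joint]):
            return False   # jointly unsatisfiable: no pose triggers both
    return True

# Hypothetical gesture constraints (angles in degrees):
salute = {"right_elbow": (80, 100), "right_shoulder": (0, 45)}
wave   = {"right_elbow": (90, 150)}
point  = {"right_elbow": (160, 180)}

gestures_conflict(salute, wave)    # True: elbow angles 90..100 satisfy both
gestures_conflict(salute, point)   # False: 80..100 and 160..180 are disjoint
```

An SMT solver generalizes this beyond interval constraints to arbitrary arithmetic formulas over joint positions, which is why the paper's conflict queries can occasionally take seconds rather than milliseconds.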