Visuo-Motor Coordination, Interactive Perception, and Goal-Directed Problem-Solving
As a member of the Knowledge Technology group in Hamburg, I work on various problems concerning neuro-symbolic integration, intelligent cognitive robotics, and language understanding. In a recent project, we integrated an object detection and tracking system based on convolutional neural networks with a deep network for grasping. Our implementation platform is the Neurally Inspired COmpanion (NICO) robot, which uses the method to distinguish and grasp a specific object in sparsely cluttered environments.
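The overall detect-then-grasp pipeline can be sketched in miniature. The following is a toy illustration, not our NICO implementation: `detect` and `predict_grasp` are hypothetical stand-ins for the convolutional detector and the deep grasping network, operating on a synthetic image.

```python
import numpy as np

def detect(image):
    # Stand-in for the CNN detector: locate the brightest pixel and return
    # a fixed-size bounding box (x, y, w, h) around it.
    y, x = np.unravel_index(np.argmax(image), image.shape)
    half = 5
    return (x - half, y - half, 2 * half, 2 * half)

def predict_grasp(crop):
    # Stand-in for the deep grasping network: propose a grasp point at the
    # center of the crop, in crop-local (x, y) coordinates.
    h, w = crop.shape
    return (w // 2, h // 2)

def detect_and_grasp(image):
    # Pipeline: detect the target, crop around it, predict a grasp on the
    # crop, and map the grasp point back into image coordinates.
    x, y, w, h = detect(image)
    crop = image[y:y + h, x:x + w]
    gx, gy = predict_grasp(crop)
    return (int(x + gx), int(y + gy))

image = np.zeros((64, 64))
image[30, 40] = 1.0  # a single bright "object"
print(detect_and_grasp(image))  # → (40, 30)
```

The key design point is the interface between the two networks: the detector narrows the scene to a region of interest, so the grasping network only ever reasons about a single candidate object.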
We also use the same robot to realize interactive perception, i.e., the execution of an action for the purpose of generating a signal that the robot can analyze. In this edutainment video, the robot shakes visually indistinguishable objects to generate audio signals that it can classify.
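The audio-classification step behind this idea can be sketched as follows. This is a minimal, self-contained toy (not our actual classifier): `spectral_features` and `nearest_centroid` are invented helpers, and the "shaking sounds" are synthetic sine waves standing in for recorded audio.

```python
import numpy as np

def spectral_features(signal, n_bands=8):
    # Summarize a clip by the relative energy in a few frequency bands.
    spectrum = np.abs(np.fft.rfft(signal))
    bands = np.array_split(spectrum, n_bands)
    energies = np.array([b.sum() for b in bands])
    return energies / energies.sum()

def nearest_centroid(features, centroids):
    # Assign the clip to the class whose feature centroid is closest.
    labels = list(centroids)
    dists = [np.linalg.norm(features - centroids[l]) for l in labels]
    return labels[int(np.argmin(dists))]

# Synthetic stand-ins for shaking sounds: a rattling object (high-pitched)
# vs. a heavy, dull object (low-pitched), both with a little sensor noise.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 8000)
rattle = np.sin(2 * np.pi * 1500 * t) + 0.01 * rng.standard_normal(t.size)
thud = np.sin(2 * np.pi * 80 * t) + 0.01 * rng.standard_normal(t.size)

centroids = {"rattle": spectral_features(rattle),
             "thud": spectral_features(thud)}

probe = np.sin(2 * np.pi * 1500 * t)  # an unseen clip of the rattling object
print(nearest_centroid(spectral_features(probe), centroids))  # → rattle
```

The point of interactive perception is that the discriminative signal only exists because the robot acted: visually identical objects become separable once they are shaken.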
Natural Language Understanding for Human Robot Interaction
At ICSI, Berkeley, I collaborate with Jerry Feldman and colleagues to integrate a framework that captures the deep semantics of natural language using Embodied Construction Grammar with robotic systems, in order to facilitate natural Human-Robot Interaction. It outperforms current natural language understanding systems for robotics, which are mostly based on simple context-free grammars, in that it allows for advanced linguistic reference resolution, conditional commands, and clarification dialogues. The video below illustrates its capabilities.
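To give a flavor of what reference resolution means in this setting, consider the following toy sketch. It is in no way the ECG-based analysis itself, just a crude recency heuristic with an invented `resolve_references` helper, shown to make the capability concrete.

```python
def resolve_references(commands):
    # Replace a pronoun with the most recently mentioned object: a crude
    # stand-in for the anaphora resolution that deep semantic analysis enables.
    last_object = None
    resolved = []
    for verb, obj in commands:
        if obj in ("it", "that"):
            obj = last_object
        else:
            last_object = obj
        resolved.append((verb, obj))
    return resolved

dialogue = [("grab", "the red block"), ("move", "it")]
print(resolve_references(dialogue))
# → [('grab', 'the red block'), ('move', 'the red block')]
```

A context-free grammar can parse "move it" but cannot, by itself, say what "it" denotes; resolving such references against discourse context is exactly where deeper semantic frameworks pay off.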
Conceptual Blending and Computational Creativity
In Cognitive Science, Conceptual Blending is considered a fundamental and uniquely human mechanism of creativity. In Barcelona, I worked on the COINVENT project, in which we computationalized the conceptual blending theory formulated by Gilles Fauconnier and Mark Turner, and developed a framework that is capable of blending two existing concepts to invent a novel one. It currently supports the Common Algebraic Specification Language (CASL) to specify concepts in first-order logic. As example domains, we investigate music and mathematics. For illustration, consider the following video by my colleague Ewen Maclean from the University of Edinburgh, UK, which shows how the system is used for music harmonization and to facilitate mathematical proofs.
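The core idea of blending can be sketched with Fauconnier and Turner's classic houseboat example. This toy Python version (not the CASL-based COINVENT framework, and with invented attribute dictionaries) shows the two ingredients: a generic space of shared structure, and selective projection of the remaining attributes from each input.

```python
def generic_space(c1, c2):
    # Attributes on which both input concepts agree (the shared generic space).
    return {k: v for k, v in c1.items() if c2.get(k) == v}

def blend(c1, c2, projections):
    # Keep the generic space, then selectively project the remaining
    # attributes: the caller names which input (1 or 2) contributes each one.
    result = dict(generic_space(c1, c2))
    for attr, src in projections.items():
        result[attr] = (c1 if src == 1 else c2)[attr]
    return result

house = {"kind": "artifact", "function": "dwelling", "medium": "land"}
boat = {"kind": "artifact", "function": "transport", "medium": "water"}

# "Houseboat": a dwelling (from 'house') that floats (from 'boat').
houseboat = blend(house, boat, {"function": 1, "medium": 2})
print(houseboat)
# → {'kind': 'artifact', 'function': 'dwelling', 'medium': 'water'}
```

The creative difficulty, which the real framework addresses with logic and consistency checking rather than hand-written projections, is deciding automatically which attributes to project so that the blend is both novel and coherent.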
Action Planning for Cyber-Physical Systems
At the University of Bremen, I worked on epistemic action planning for cyber-physical systems. Specifically, I developed the HPX online planning system, which employs the online Answer Set Programming solver oclingo to dynamically generate and repair plans while a robot is executing them. Due to its comparably low computational complexity in domains with many unknown contingencies, HPX is particularly well suited for failure diagnosis and dynamic plan repair in robotic environments.

The following video shows how the HPX planner is applied to Smart Home control in the Bremen Ambient Assisted Living Lab (BAALL), in combination with autonomous robotic wheelchairs. The BAALL is equipped with automatic doors, illumination control, adjustable furniture, and many more actuators and sensors. The autonomous wheelchair “Rolland” is designed to act autonomously in narrow indoor environments. The video illustrates a scenario where two autonomous wheelchairs are used to bring a person to the bathroom. Once the goal is received and interpreted via speech recognition, the HPX system generates a plan in which the first wheelchair brings the person to the bathroom. However, an abnormality occurs: the wheelchair's passage is blocked by a small box. This failure is postdicted by the system, and the plan is repaired. The new plan has the other wheelchair drive to the person and bring them to the bathroom.
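The closed-loop generate/diagnose/repair cycle from this scenario can be sketched in miniature. The following toy loop is not HPX or oclingo; the wheelchair domain, action names, and `plan_and_repair` helper are invented for illustration.

```python
WHEELCHAIRS = ["wheelchair1", "wheelchair2"]

def make_plan(known_blocked):
    # Pick the first wheelchair not known to be blocked and send it.
    for w in WHEELCHAIRS:
        if w not in known_blocked:
            return [("drive_to_person", w), ("bring_to_bathroom", w)]
    return None  # no plan exists given current knowledge

def execute(plan, actually_blocked):
    # Simulate execution; return the failing step, or None on success.
    for action, w in plan:
        if w in actually_blocked:
            return (action, w)  # expected free passage, observed a blockage
    return None

def plan_and_repair(actually_blocked):
    # Closed loop: on failure, add the diagnosed abnormality to the
    # knowledge base and repair the plan from the current situation.
    known_blocked = set()
    while True:
        plan = make_plan(known_blocked)
        if plan is None:
            return None, known_blocked
        failure = execute(plan, actually_blocked)
        if failure is None:
            return plan, known_blocked
        known_blocked.add(failure[1])  # postdict: that passage was blocked

final_plan, diagnosed = plan_and_repair({"wheelchair1"})
print(final_plan)
# → [('drive_to_person', 'wheelchair2'), ('bring_to_bathroom', 'wheelchair2')]
```

In the real system, the diagnosis step is far richer: HPX reasons epistemically about *when* and *why* execution diverged from expectation (postdiction), rather than merely blacklisting a failed actor as this sketch does.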