Current Projects

Brain-Computer Interaction using functional Near-Infrared Spectroscopy (fNIRS): [Project]

Our research group currently focuses on a new generation of brain-computer interfaces. Brain-computer interaction has made dramatic progress in recent years, but its main application to date has been for physically disabled users. Our research in real-time measurement and machine-learning classification of functional near-infrared spectroscopy (fNIRS) brain data leads us to develop, use, and evaluate brain measurement as input to adaptive user interfaces for the broader population.

We use brain input to obtain information about the user and their context directly and effortlessly from their brain activity, and we then adapt the user interface in real time. We are creating and studying these new user interfaces, with emphasis on domains where we can measure their efficacy, such as a multi-modal dual-task human-robot interface. We are also broadening this work toward more general types of real-time adaptive user interfaces.
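The measure-classify-adapt loop described above can be sketched in miniature. The following is a hypothetical illustration, not the group's actual system: the feature (mean oxygenated-hemoglobin level per time window), the nearest-centroid classifier, the workload labels, and the UI adaptation rule are all illustrative assumptions standing in for the real fNIRS processing and machine-learning pipeline.

```python
import random
import statistics

# Hypothetical sketch only: each "window" stands in for a few seconds of
# oxygenated-hemoglobin (HbO) readings from one fNIRS channel, and the
# classifier and adaptation rule are illustrative placeholders.

def extract_feature(window):
    """Mean HbO level over the window -- a simple summary feature."""
    return statistics.mean(window)

def train_centroids(low_windows, high_windows):
    """Nearest-centroid training: one centroid per workload class."""
    return {
        "low": statistics.mean(extract_feature(w) for w in low_windows),
        "high": statistics.mean(extract_feature(w) for w in high_windows),
    }

def classify(window, centroids):
    """Label a new window by the closest class centroid."""
    f = extract_feature(window)
    return min(centroids, key=lambda label: abs(f - centroids[label]))

def adapt_interface(workload):
    """Map the classified mental workload to a real-time UI adaptation."""
    return "suppress notifications" if workload == "high" else "show full detail"

# Simulated calibration data: high workload modeled as elevated HbO.
random.seed(0)
low = [[random.gauss(0.2, 0.05) for _ in range(50)] for _ in range(20)]
high = [[random.gauss(0.8, 0.05) for _ in range(50)] for _ in range(20)]
centroids = train_centroids(low, high)

# Classify a fresh window and adapt the interface accordingly.
new_window = [random.gauss(0.75, 0.05) for _ in range(50)]
workload = classify(new_window, centroids)
action = adapt_interface(workload)
```

In a real system the calibration data would come from controlled high- and low-workload tasks during a per-user training phase, and classification would run continuously on the streaming signal so the interface can adapt as the user's state changes.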

Past Projects

Tangible Programming Languages: A practical approach to computer programming in educational settings [Project]

Reality-based Interaction: Understanding the Next Generation of User Interfaces [Project] [Closed Wiki]

TUIMS: Tangible User Interface Management System [Orit Share] [Paper]

New Human-computer Interaction Techniques for the Digital Library [Project]

Virtual Markets and Wireless Grids (retargetable user interfaces, high-level specification of user interaction) [Summary]

Visual Understanding Environment (interaction techniques for viewing and manipulating concept maps for learning) [Project]

Senseboard: A Tangible Interface for Manipulating and Organizing Abstract Information [Project] [Paper] [pdf] Tangible Media Group, MIT Media Lab

Models and Abstractions for Next-Generation User Interface Software [Project]