Jeremy Cooperstock, McGill University

Associate Professor, Electrical and Computer Engineering
Montreal, Quebec
jer@cim.mcgill.ca
Office: (514) 398-5992

Bio/Research

My lab is broadly concerned with human-computer interaction technologies, emphasizing multimodal sensory augmentation for communication in both co-present and distributed contexts. Our research tackles the full pipeline of sensory input, analysis, encoding, data distribution, and rendering, as well as interaction capabilities and quality of user experience. Applications of these efforts include distributed training of medical and music students, augmented environmental awareness for the blind community, treatment of lazy eye (amblyopia), ocean science observation, highly accurate face and affective-state recognition, low-latency uncompressed HD videoconferencing, and a variety of multimodal immersive simulation experiences. Most of our research takes place within the Shared Reality Environment, a facility that includes two different configurations of multi-projector displays, camera and loudspeaker arrays, and a high-fidelity floor with vibrotactile sensing and actuation.

BIO:

Jeremy Cooperstock (Ph.D., University of Toronto, 1996) is an associate professor in the Department of Electrical and Computer Engineering, a member of the Centre for Intelligent Machines, and a founding member of the Centre for Interdisciplinary Research in Music Media and Technology at McGill University. He directs the Shared Reality Lab, which focuses on computer mediation to facilitate high-fidelity human communication and the synthesis of perceptually engaging, multimodal, immersive environments, and he leads the Enabling Technologies theme of the Graphics, Animation, and New Media (GRAND) Network of Centres of Excellence. Cooperstock's accomplishments include the Intelligent Classroom; the world's first Internet streaming demonstrations of Dolby Digital 5.1, uncompressed 12-channel 96 kHz/24-bit audio, multichannel DSD audio, and multiple simultaneous streams of uncompressed high-definition video; and a simulation environment that renders graphic, audio, and vibrotactile effects in response to footsteps. His work on the Ultra-Videoconferencing system was recognized with an award for Most Innovative Use of New Technology from ACM/IEEE Supercomputing and a Distinction Award from the Audio Engineering Society. Cooperstock has worked with IBM at the Haifa Research Center in Israel and the T.J. Watson Research Center in Yorktown Heights, New York, and with the Sony Computer Science Laboratory in Tokyo, Japan. He was also a visiting professor at Bang & Olufsen, Denmark, where he conducted research on telepresence technologies as part of the World Opera Project. He chaired the Audio Engineering Society (AES) Technical Committee on Network Audio Systems from 2001 to 2009 and is currently an associate editor of the Journal of the AES.

Links